first commit

brotherhobo
2024-03-24 23:30:39 -04:00
commit 4bb3a5ad61
128 changed files with 17902 additions and 0 deletions

.github/CONTRIBUTING.md

@@ -0,0 +1,68 @@
# Contributing to the FTC SDK
The following is a set of guidelines for contributing to the FIRST FTC SDK. The FTC Technology Team welcomes suggestions for improvements to core software, ideas for new features, requests for built-in support of new sensors, and well-written bug reports.
## How can I contribute?
### Pull requests
__STOP!__ If you are new to git and do not understand the mechanics of forks, branches, and pulls, or if what you just read is confusing, __do not__ push this button. Most likely it won't do what you think it will.
![Pull Button](../doc/media/PullRequest.PNG)
If you are looking at this button then you've pushed some changes to your team's fork of ftctechnh/ftc_app. Congratulations! You are almost certainly finished.
The vast majority of pull requests seen on the ftctechnh/ftc_app repository are not intended to be merged into the official SDK. Team software is just that, your team's. It's specific to the tasks you are trying to accomplish, the testing you are doing, and goals your team has. You don't want that pushed into the official SDK.
If what you've read so far makes little sense, there are some very good git learning resources online.
[Git Book](https://git-scm.com/book/en/v2)
[Interactive Git Tutorial](https://try.github.io)
### Guidelines for experienced Git users
If you are absolutely certain that you want to push the big green button above, read on. Otherwise, back _slowly away from the keyboard_.
The real intent for advanced users is often to issue a pull request from the [branch](https://www.atlassian.com/git/tutorials/using-branches/git-branch) on a local fork back to master on either the same local fork or a child of the team fork and not on the parent ftctechnh/ftc_app. See [Creating a Pull Request](https://help.github.com/articles/creating-a-pull-request-from-a-fork/).
If that is indeed the intent, you can merge your [topic branch](https://git-scm.com/book/en/v2/Git-Branching-Branching-Workflows#Topic-Branches) into master locally by hand before pushing it up to GitHub. Alternatively, if you want a pull request between branches on the same repository (say, so team members can review your software before it is merged into master), select the base fork from the dropdown on the "Open a pull request" page and choose your team repo instead of ftctechnh's.
Alternatively, if you have a team repository forked from ftctechnh/ftc_app, and then team members individually fork from your team repository, then pull requests from the individual team member's forks will have the main team repository automatically selected as the base fork for the pull. And you won't inadvertently request to pull your team software into ftctechnh's repository.
The latter is arguably the best way to manage software among a large team, but as with all things git, there are many options.
Pull requests that do not fall into the category above are evaluated by the FTC Technology Team on a case-by-case basis. Please note however that the deployment model of the SDK does not support direct pulls into ftctechnh/ftc_app.
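The local merge workflow described above can be sketched as follows. This is a minimal illustration in a throwaway repository; the branch name `topic` and the file `feature.txt` are made up for the example, and your real workflow would end with a `git push` to your team's fork.

```shell
# Sketch: merge a topic branch into the default branch locally before pushing.
set -e
repo=$(mktemp -d)
cd "$repo"
git init -q
git config user.email demo@example.com
git config user.name demo
git commit -q --allow-empty -m "initial commit"

# Record the default branch name (master or main, depending on your git config).
base=$(git symbolic-ref --short HEAD)

# Do the work on a topic branch.
git checkout -q -b topic
echo "feature" > feature.txt
git add feature.txt
git commit -q -m "add feature"

# Merge the topic branch back into the default branch by hand.
git checkout -q "$base"
git merge -q --no-edit topic

echo "topic branch merged into $base"
```

After this, pushing `$base` to your team's fork (not to ftctechnh/ftc_app) shares the merged work without opening any pull request against the parent repository.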
### Report bugs
This section guides you through filing a bug report. The better the report, the more likely it is to be root-caused and fixed. Please refrain from filing feature requests or software enhancements as new issues; see Suggesting Enhancements below.
#### Before submitting a bug report
- Check the [forums](http://ftcforum.firstinspires.org/forum.php) to see if someone else has run into the problem and whether there is an official solution that doesn't require a new SDK.
- Perform a search of current [issues](https://github.com/FIRST-Tech-Challenge/FtcRobotController/issues) to see if the problem has already been reported. If so, add a comment to the existing issue instead of creating a new one.
#### How Do I Submit A (Good) Bug Report?
Bugs are tracked as GitHub issues. Create an issue on ftctechnh/ftc_app and provide the following information.
Explain the problem and include additional details to help maintainers reproduce the problem:
- Use a clear and descriptive title for the issue to identify the problem.
- Describe the exact steps which reproduce the problem in as much detail as possible.
- Provide specific examples to demonstrate the steps.
- Describe the behavior you observed after following the steps and point out what exactly is the problem with that behavior. Explain which behavior you expected to see instead and why. If applicable, include screenshots which show you following the described steps and clearly demonstrate the problem.
- If you're reporting that the RobotController crashed, include the logfile with a stack trace of the crash. [Example of good bug report with stack trace](https://github.com/ftctechnh/ftc_app/issues/224)
- If the problem wasn't triggered by a specific action, describe what you were doing before the problem happened and share more information using the guidelines below.
### Suggesting Enhancements
FIRST volunteers are awesome. You all have great ideas and we want to hear them.
Enhancements should be broadly applicable to a large majority of teams, should not force teams to change their workflow, and should provide real value to the mission of FIRST as it relates to engaging youth in engineering activities.
The best way to build momentum behind new features is to post a description of your idea in the discussions section of this repository and build community support for it. The FTC Technology Team monitors the discussions; we'll hear you, and if there's a large enough call for the feature, it's very likely to be put on the list for a future release.

.github/PULL_REQUEST_TEMPLATE.md

@@ -0,0 +1 @@
Before issuing a pull request, please see the contributing page.

.gitignore

@@ -0,0 +1,81 @@
# Built application files
*.apk
*.aar
*.ap_
*.aab
!libs/*.aar
# Files for the ART/Dalvik VM
*.dex
# Java/JDK files
*.class
*.hprof
# Generated files
bin/
gen/
out/
# Uncomment the following line if you do not want release build type files checked in to your app
# release/
# Gradle files
.gradle/
build/
# Local configuration file (sdk path, etc)
local.properties
# Proguard folder generated by Eclipse
proguard/
# Log Files
*.log
# Android Studio Navigation editor temp files
.navigation/
# Android Studio captures folder
captures/
# IntelliJ
*.iml
.idea/
# For Mac users
.DS_Store
# Keystore files
# Uncomment the following lines if you do not want to check your keystore files in.
#*.jks
#*.keystore
# External native build folder generated in Android Studio 2.2 and later
.externalNativeBuild
.cxx/
# Google Services (e.g. APIs or Firebase)
# google-services.json
# Freeline
freeline.py
freeline/
freeline_project_description.json
# fastlane
fastlane/report.xml
fastlane/Preview.html
fastlane/screenshots
fastlane/test_output
fastlane/readme.md
# Version control
vcs.xml
# lint
lint/intermediates/
lint/generated/
lint/outputs/
lint/tmp/
# lint/reports/


@@ -0,0 +1,26 @@
import java.text.SimpleDateFormat
//
// build.gradle in FtcRobotController
//
apply plugin: 'com.android.library'
android {
defaultConfig {
minSdkVersion 24
//noinspection ExpiredTargetSdkVersion
targetSdkVersion 28
buildConfigField "String", "APP_BUILD_TIME", '"' + (new SimpleDateFormat("yyyy-MM-dd'T'HH:mm:ss.SSSZ", Locale.ROOT).format(new Date())) + '"'
}
compileSdkVersion 29
compileOptions {
sourceCompatibility JavaVersion.VERSION_1_8
targetCompatibility JavaVersion.VERSION_1_8
}
namespace = 'com.qualcomm.ftcrobotcontroller'
}
apply from: '../build.dependencies.gradle'


@@ -0,0 +1,79 @@
<?xml version="1.0" encoding="utf-8"?>
<manifest xmlns:android="http://schemas.android.com/apk/res/android"
xmlns:tools="http://schemas.android.com/tools"
android:versionCode="53"
android:versionName="9.1">
<uses-permission android:name="android.permission.RECEIVE_BOOT_COMPLETED" />
<application
android:allowBackup="true"
android:largeHeap="true"
android:icon="@drawable/ic_launcher"
android:label="@string/app_name"
android:theme="@style/AppThemeRedRC"
android:usesCleartextTraffic="true">
<!-- Indicates to the ControlHubUpdater the latest Control Hub version that this app supports -->
<meta-data
android:name="org.firstinspires.latestSupportedControlHubVersion"
android:value="1" />
<!-- The main robot controller activity -->
<activity android:name="org.firstinspires.ftc.robotcontroller.internal.PermissionValidatorWrapper"
android:screenOrientation="fullUser"
android:configChanges="orientation|screenSize"
android:label="@string/app_name"
android:launchMode="singleTask" >
<intent-filter>
<category android:name="android.intent.category.LAUNCHER" />
<action android:name="android.intent.action.MAIN" />
</intent-filter>
</activity>
<activity
android:name="org.firstinspires.ftc.robotcontroller.internal.FtcRobotControllerActivity"
android:screenOrientation="fullUser"
android:configChanges="orientation|screenSize"
android:label="@string/app_name"
android:launchMode="singleTask" >
<intent-filter>
<action android:name="android.hardware.usb.action.USB_DEVICE_ATTACHED" />
</intent-filter>
<meta-data
android:name="android.hardware.usb.action.USB_DEVICE_ATTACHED"
android:resource="@xml/device_filter" />
<!--org.firstinspires.main.entry indicates that this app is compatible with the Dragonboard Control Hub-->
<meta-data
android:name="org.firstinspires.main.entry"
android:value="true" />
</activity>
<!-- The robot controller service in which most of the robot functionality is managed -->
<service
android:name="com.qualcomm.ftccommon.FtcRobotControllerService"
android:enabled="true" />
<!-- Assistant that autostarts the robot controller on android boot (if it's supposed to) -->
<receiver
android:enabled="true"
android:exported="true"
android:name="org.firstinspires.ftc.ftccommon.internal.RunOnBoot"
android:permission="android.permission.RECEIVE_BOOT_COMPLETED">
<intent-filter>
<category android:name="android.intent.category.DEFAULT" />
<action android:name="android.intent.action.BOOT_COMPLETED" />
<action android:name="android.intent.action.QUICKBOOT_POWERON" />
</intent-filter>
</receiver>
</application>
</manifest>


@@ -0,0 +1,167 @@
/* Copyright (c) 2021 FIRST. All rights reserved.
*
* Redistribution and use in source and binary forms, with or without modification,
* are permitted (subject to the limitations in the disclaimer below) provided that
* the following conditions are met:
*
* Redistributions of source code must retain the above copyright notice, this list
* of conditions and the following disclaimer.
*
* Redistributions in binary form must reproduce the above copyright notice, this
* list of conditions and the following disclaimer in the documentation and/or
* other materials provided with the distribution.
*
* Neither the name of FIRST nor the names of its contributors may be used to endorse or
* promote products derived from this software without specific prior written permission.
*
* NO EXPRESS OR IMPLIED LICENSES TO ANY PARTY'S PATENT RIGHTS ARE GRANTED BY THIS
* LICENSE. THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
* "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO,
* THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE
* ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR CONTRIBUTORS BE LIABLE
* FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL
* DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR
* SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER
* CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY,
* OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
* OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
*/
package org.firstinspires.ftc.robotcontroller.external.samples;
import com.qualcomm.robotcore.eventloop.opmode.Disabled;
import com.qualcomm.robotcore.eventloop.opmode.LinearOpMode;
import com.qualcomm.robotcore.eventloop.opmode.TeleOp;
import com.qualcomm.robotcore.hardware.DcMotor;
import com.qualcomm.robotcore.util.ElapsedTime;
/*
* This file contains an example of a Linear "OpMode".
* An OpMode is a 'program' that runs in either the autonomous or the teleop period of an FTC match.
* The names of OpModes appear on the menu of the FTC Driver Station.
* When a selection is made from the menu, the corresponding OpMode is executed.
*
* This particular OpMode illustrates driving a 4-motor Omni-Directional (or Holonomic) robot.
* This code will work with either a Mecanum-Drive or an X-Drive train.
* Both of these drives are illustrated at https://gm0.org/en/latest/docs/robot-design/drivetrains/holonomic.html
* Note that a Mecanum drive must display an X roller-pattern when viewed from above.
*
* Also note that it is critical to set the correct rotation direction for each motor. See details below.
*
* Holonomic drives provide the ability for the robot to move in three axes (directions) simultaneously.
* Each motion axis is controlled by one Joystick axis.
*
* 1) Axial: Driving forward and backward Left-joystick Forward/Backward
* 2) Lateral: Strafing right and left Left-joystick Right and Left
* 3) Yaw: Rotating Clockwise and counter clockwise Right-joystick Right and Left
*
* This code is written assuming that the right-side motors need to be reversed for the robot to drive forward.
* When you first test your robot, if it moves backward when you push the left stick forward, then you must flip
* the direction of all 4 motors (see code below).
*
* Use Android Studio to Copy this Class, and Paste it into your team's code folder with a new name.
* Remove or comment out the @Disabled line to add this OpMode to the Driver Station OpMode list
*/
@TeleOp(name="Basic: Omni Linear OpMode", group="Linear OpMode")
@Disabled
public class BasicOmniOpMode_Linear extends LinearOpMode {
// Declare OpMode members for each of the 4 motors.
private ElapsedTime runtime = new ElapsedTime();
private DcMotor leftFrontDrive = null;
private DcMotor leftBackDrive = null;
private DcMotor rightFrontDrive = null;
private DcMotor rightBackDrive = null;
@Override
public void runOpMode() {
// Initialize the hardware variables. Note that the strings used here must correspond
// to the names assigned during the robot configuration step on the DS or RC devices.
leftFrontDrive = hardwareMap.get(DcMotor.class, "left_front_drive");
leftBackDrive = hardwareMap.get(DcMotor.class, "left_back_drive");
rightFrontDrive = hardwareMap.get(DcMotor.class, "right_front_drive");
rightBackDrive = hardwareMap.get(DcMotor.class, "right_back_drive");
// ########################################################################################
// !!! IMPORTANT Drive Information. Test your motor directions. !!!!!
// ########################################################################################
// Most robots need the motors on one side to be reversed to drive forward.
// The motor reversals shown here are for a "direct drive" robot (the wheels turn the same direction as the motor shaft)
// If your robot has additional gear reductions or uses a right-angled drive, it's important to ensure
// that your motors are turning in the correct direction. So, start out with the reversals here, BUT
// when you first test your robot, push the left joystick forward and observe the direction the wheels turn.
// Reverse the direction (flip FORWARD <-> REVERSE ) of any wheel that runs backward
// Keep testing until ALL the wheels move the robot forward when you push the left joystick forward.
leftFrontDrive.setDirection(DcMotor.Direction.REVERSE);
leftBackDrive.setDirection(DcMotor.Direction.REVERSE);
rightFrontDrive.setDirection(DcMotor.Direction.FORWARD);
rightBackDrive.setDirection(DcMotor.Direction.FORWARD);
// Wait for the game to start (driver presses PLAY)
telemetry.addData("Status", "Initialized");
telemetry.update();
waitForStart();
runtime.reset();
// run until the end of the match (driver presses STOP)
while (opModeIsActive()) {
double max;
// POV Mode uses left joystick to go forward & strafe, and right joystick to rotate.
double axial = -gamepad1.left_stick_y; // Note: pushing stick forward gives negative value
double lateral = gamepad1.left_stick_x;
double yaw = gamepad1.right_stick_x;
// Combine the joystick requests for each axis-motion to determine each wheel's power.
// Set up a variable for each drive wheel to save the power level for telemetry.
double leftFrontPower = axial + lateral + yaw;
double rightFrontPower = axial - lateral - yaw;
double leftBackPower = axial - lateral + yaw;
double rightBackPower = axial + lateral - yaw;
// Normalize the values so no wheel power exceeds 100%
// This ensures that the robot maintains the desired motion.
max = Math.max(Math.abs(leftFrontPower), Math.abs(rightFrontPower));
max = Math.max(max, Math.abs(leftBackPower));
max = Math.max(max, Math.abs(rightBackPower));
if (max > 1.0) {
leftFrontPower /= max;
rightFrontPower /= max;
leftBackPower /= max;
rightBackPower /= max;
}
// This is test code:
//
// Uncomment the following code to test your motor directions.
// Each button should make the corresponding motor run FORWARD.
// 1) First make sure each motor is mapped to the correct position on the robot
// by adjusting your Robot Configuration if necessary.
// 2) Then make sure each motor runs in the correct direction by modifying
// the setDirection() calls above.
// Once the correct motors move in the correct direction, re-comment this code.
/*
leftFrontPower = gamepad1.x ? 1.0 : 0.0; // X gamepad
leftBackPower = gamepad1.a ? 1.0 : 0.0; // A gamepad
rightFrontPower = gamepad1.y ? 1.0 : 0.0; // Y gamepad
rightBackPower = gamepad1.b ? 1.0 : 0.0; // B gamepad
*/
// Send calculated power to wheels
leftFrontDrive.setPower(leftFrontPower);
rightFrontDrive.setPower(rightFrontPower);
leftBackDrive.setPower(leftBackPower);
rightBackDrive.setPower(rightBackPower);
// Show the elapsed game time and wheel power.
telemetry.addData("Status", "Run Time: " + runtime.toString());
telemetry.addData("Front left/Right", "%4.2f, %4.2f", leftFrontPower, rightFrontPower);
telemetry.addData("Back left/Right", "%4.2f, %4.2f", leftBackPower, rightBackPower);
telemetry.update();
}
}
}


@@ -0,0 +1,140 @@
/* Copyright (c) 2017 FIRST. All rights reserved.
*
* Redistribution and use in source and binary forms, with or without modification,
* are permitted (subject to the limitations in the disclaimer below) provided that
* the following conditions are met:
*
* Redistributions of source code must retain the above copyright notice, this list
* of conditions and the following disclaimer.
*
* Redistributions in binary form must reproduce the above copyright notice, this
* list of conditions and the following disclaimer in the documentation and/or
* other materials provided with the distribution.
*
* Neither the name of FIRST nor the names of its contributors may be used to endorse or
* promote products derived from this software without specific prior written permission.
*
* NO EXPRESS OR IMPLIED LICENSES TO ANY PARTY'S PATENT RIGHTS ARE GRANTED BY THIS
* LICENSE. THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
* "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO,
* THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE
* ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR CONTRIBUTORS BE LIABLE
* FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL
* DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR
* SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER
* CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY,
* OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
* OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
*/
package org.firstinspires.ftc.robotcontroller.external.samples;
import com.qualcomm.robotcore.eventloop.opmode.Disabled;
import com.qualcomm.robotcore.eventloop.opmode.OpMode;
import com.qualcomm.robotcore.eventloop.opmode.TeleOp;
import com.qualcomm.robotcore.hardware.DcMotor;
import com.qualcomm.robotcore.util.ElapsedTime;
import com.qualcomm.robotcore.util.Range;
/*
* This file contains an example of an iterative (Non-Linear) "OpMode".
* An OpMode is a 'program' that runs in either the autonomous or the teleop period of an FTC match.
* The names of OpModes appear on the menu of the FTC Driver Station.
* When a selection is made from the menu, the corresponding OpMode
* class is instantiated on the Robot Controller and executed.
*
* This particular OpMode just executes a basic Tank Drive Teleop for a two wheeled robot
* It includes all the skeletal structure that all iterative OpModes contain.
*
* Use Android Studio to Copy this Class, and Paste it into your team's code folder with a new name.
* Remove or comment out the @Disabled line to add this OpMode to the Driver Station OpMode list
*/
@TeleOp(name="Basic: Iterative OpMode", group="Iterative OpMode")
@Disabled
public class BasicOpMode_Iterative extends OpMode
{
// Declare OpMode members.
private ElapsedTime runtime = new ElapsedTime();
private DcMotor leftDrive = null;
private DcMotor rightDrive = null;
/*
* Code to run ONCE when the driver hits INIT
*/
@Override
public void init() {
telemetry.addData("Status", "Initialized");
// Initialize the hardware variables. Note that the strings used here as parameters
// to 'get' must correspond to the names assigned during the robot configuration
// step (using the FTC Robot Controller app on the phone).
leftDrive = hardwareMap.get(DcMotor.class, "left_drive");
rightDrive = hardwareMap.get(DcMotor.class, "right_drive");
// To drive forward, most robots need the motor on one side to be reversed, because the axles point in opposite directions.
// Pushing the left stick forward MUST make robot go forward. So adjust these two lines based on your first test drive.
// Note: The settings here assume direct drive on left and right wheels. Gear Reduction or 90 Deg drives may require direction flips
leftDrive.setDirection(DcMotor.Direction.REVERSE);
rightDrive.setDirection(DcMotor.Direction.FORWARD);
// Tell the driver that initialization is complete.
telemetry.addData("Status", "Initialized");
}
/*
* Code to run REPEATEDLY after the driver hits INIT, but before they hit PLAY
*/
@Override
public void init_loop() {
}
/*
* Code to run ONCE when the driver hits PLAY
*/
@Override
public void start() {
runtime.reset();
}
/*
* Code to run REPEATEDLY after the driver hits PLAY but before they hit STOP
*/
@Override
public void loop() {
// Setup a variable for each drive wheel to save power level for telemetry
double leftPower;
double rightPower;
// Choose to drive using either Tank Mode, or POV Mode
// Comment out the method that's not used. The default below is POV.
// POV Mode uses left stick to go forward, and right stick to turn.
// - This uses basic math to combine motions and is easier to drive straight.
double drive = -gamepad1.left_stick_y;
double turn = gamepad1.right_stick_x;
leftPower = Range.clip(drive + turn, -1.0, 1.0);
rightPower = Range.clip(drive - turn, -1.0, 1.0);
// Tank Mode uses one stick to control each wheel.
// - This requires no math, but it is hard to drive forward slowly and keep straight.
// leftPower = -gamepad1.left_stick_y ;
// rightPower = -gamepad1.right_stick_y ;
// Send calculated power to wheels
leftDrive.setPower(leftPower);
rightDrive.setPower(rightPower);
// Show the elapsed game time and wheel power.
telemetry.addData("Status", "Run Time: " + runtime.toString());
telemetry.addData("Motors", "left (%.2f), right (%.2f)", leftPower, rightPower);
}
/*
* Code to run ONCE after the driver hits STOP
*/
@Override
public void stop() {
}
}


@@ -0,0 +1,115 @@
/* Copyright (c) 2017 FIRST. All rights reserved.
*
* Redistribution and use in source and binary forms, with or without modification,
* are permitted (subject to the limitations in the disclaimer below) provided that
* the following conditions are met:
*
* Redistributions of source code must retain the above copyright notice, this list
* of conditions and the following disclaimer.
*
* Redistributions in binary form must reproduce the above copyright notice, this
* list of conditions and the following disclaimer in the documentation and/or
* other materials provided with the distribution.
*
* Neither the name of FIRST nor the names of its contributors may be used to endorse or
* promote products derived from this software without specific prior written permission.
*
* NO EXPRESS OR IMPLIED LICENSES TO ANY PARTY'S PATENT RIGHTS ARE GRANTED BY THIS
* LICENSE. THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
* "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO,
* THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE
* ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR CONTRIBUTORS BE LIABLE
* FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL
* DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR
* SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER
* CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY,
* OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
* OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
*/
package org.firstinspires.ftc.robotcontroller.external.samples;
import com.qualcomm.robotcore.eventloop.opmode.Disabled;
import com.qualcomm.robotcore.eventloop.opmode.LinearOpMode;
import com.qualcomm.robotcore.eventloop.opmode.TeleOp;
import com.qualcomm.robotcore.hardware.DcMotor;
import com.qualcomm.robotcore.util.ElapsedTime;
import com.qualcomm.robotcore.util.Range;
/*
* This file contains a minimal example of a Linear "OpMode". An OpMode is a 'program' that runs in either
* the autonomous or the teleop period of an FTC match. The names of OpModes appear on the menu
* of the FTC Driver Station. When a selection is made from the menu, the corresponding OpMode
* class is instantiated on the Robot Controller and executed.
*
* This particular OpMode just executes a basic Tank Drive Teleop for a two wheeled robot
* It includes all the skeletal structure that all linear OpModes contain.
*
* Use Android Studio to Copy this Class, and Paste it into your team's code folder with a new name.
* Remove or comment out the @Disabled line to add this OpMode to the Driver Station OpMode list
*/
@TeleOp(name="Basic: Linear OpMode", group="Linear OpMode")
@Disabled
public class BasicOpMode_Linear extends LinearOpMode {
// Declare OpMode members.
private ElapsedTime runtime = new ElapsedTime();
private DcMotor leftDrive = null;
private DcMotor rightDrive = null;
@Override
public void runOpMode() {
telemetry.addData("Status", "Initialized");
telemetry.update();
// Initialize the hardware variables. Note that the strings used here as parameters
// to 'get' must correspond to the names assigned during the robot configuration
// step (using the FTC Robot Controller app on the phone).
leftDrive = hardwareMap.get(DcMotor.class, "left_drive");
rightDrive = hardwareMap.get(DcMotor.class, "right_drive");
// To drive forward, most robots need the motor on one side to be reversed, because the axles point in opposite directions.
// Pushing the left stick forward MUST make robot go forward. So adjust these two lines based on your first test drive.
// Note: The settings here assume direct drive on left and right wheels. Gear Reduction or 90 Deg drives may require direction flips
leftDrive.setDirection(DcMotor.Direction.REVERSE);
rightDrive.setDirection(DcMotor.Direction.FORWARD);
// Wait for the game to start (driver presses PLAY)
waitForStart();
runtime.reset();
// run until the end of the match (driver presses STOP)
while (opModeIsActive()) {
// Setup a variable for each drive wheel to save power level for telemetry
double leftPower;
double rightPower;
// Choose to drive using either Tank Mode, or POV Mode
// Comment out the method that's not used. The default below is POV.
// POV Mode uses left stick to go forward, and right stick to turn.
// - This uses basic math to combine motions and is easier to drive straight.
double drive = -gamepad1.left_stick_y;
double turn = gamepad1.right_stick_x;
leftPower = Range.clip(drive + turn, -1.0, 1.0);
rightPower = Range.clip(drive - turn, -1.0, 1.0);
// Tank Mode uses one stick to control each wheel.
// - This requires no math, but it is hard to drive forward slowly and keep straight.
// leftPower = -gamepad1.left_stick_y ;
// rightPower = -gamepad1.right_stick_y ;
// Send calculated power to wheels
leftDrive.setPower(leftPower);
rightDrive.setPower(rightPower);
// Show the elapsed game time and wheel power.
telemetry.addData("Status", "Run Time: " + runtime.toString());
telemetry.addData("Motors", "left (%.2f), right (%.2f)", leftPower, rightPower);
telemetry.update();
}
}
}


@@ -0,0 +1,217 @@
/* Copyright (c) 2023 FIRST. All rights reserved.
*
* Redistribution and use in source and binary forms, with or without modification,
* are permitted (subject to the limitations in the disclaimer below) provided that
* the following conditions are met:
*
* Redistributions of source code must retain the above copyright notice, this list
* of conditions and the following disclaimer.
*
* Redistributions in binary form must reproduce the above copyright notice, this
* list of conditions and the following disclaimer in the documentation and/or
* other materials provided with the distribution.
*
* Neither the name of FIRST nor the names of its contributors may be used to endorse or
* promote products derived from this software without specific prior written permission.
*
* NO EXPRESS OR IMPLIED LICENSES TO ANY PARTY'S PATENT RIGHTS ARE GRANTED BY THIS
* LICENSE. THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
* "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO,
* THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE
* ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR CONTRIBUTORS BE LIABLE
* FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL
* DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR
* SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER
* CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY,
* OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
* OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
*/
package org.firstinspires.ftc.robotcontroller.external.samples;
import android.util.Size;
import com.qualcomm.robotcore.eventloop.opmode.Disabled;
import com.qualcomm.robotcore.eventloop.opmode.LinearOpMode;
import com.qualcomm.robotcore.eventloop.opmode.TeleOp;
import org.firstinspires.ftc.robotcore.external.hardware.camera.BuiltinCameraDirection;
import org.firstinspires.ftc.robotcore.external.hardware.camera.WebcamName;
import org.firstinspires.ftc.vision.VisionPortal;
import org.firstinspires.ftc.vision.apriltag.AprilTagDetection;
import org.firstinspires.ftc.vision.apriltag.AprilTagProcessor;
import java.util.List;
/*
* This OpMode illustrates the basics of AprilTag recognition and pose estimation,
* including Java Builder structures for specifying Vision parameters.
*
* For an introduction to AprilTags, see the FTC-DOCS link below:
* https://ftc-docs.firstinspires.org/en/latest/apriltag/vision_portal/apriltag_intro/apriltag-intro.html
*
* In this sample, any visible tag ID will be detected and displayed, but only tags that are included in the default
* "TagLibrary" will have their position and orientation information displayed. This default TagLibrary contains
* the current Season's AprilTags and a small set of "test Tags" in the high number range.
*
* When an AprilTag in the TagLibrary is detected, the SDK provides location and orientation of the tag, relative to the camera.
* This information is provided in the "ftcPose" member of the returned "detection", and is explained in the ftc-docs page linked below.
* https://ftc-docs.firstinspires.org/apriltag-detection-values
*
* To experiment with using AprilTags to navigate, try out these two driving samples:
* RobotAutoDriveToAprilTagOmni and RobotAutoDriveToAprilTagTank
*
* There are many "default" VisionPortal and AprilTag configuration parameters that may be overridden if desired.
* These default parameters are shown as comments in the code below.
*
* Use Android Studio to Copy this Class, and Paste it into your team's code folder with a new name.
* Remove or comment out the @Disabled line to add this OpMode to the Driver Station OpMode list.
*/
@TeleOp(name = "Concept: AprilTag", group = "Concept")
@Disabled
public class ConceptAprilTag extends LinearOpMode {
private static final boolean USE_WEBCAM = true; // true for webcam, false for phone camera
/**
* The variable to store our instance of the AprilTag processor.
*/
private AprilTagProcessor aprilTag;
/**
* The variable to store our instance of the vision portal.
*/
private VisionPortal visionPortal;
@Override
public void runOpMode() {
initAprilTag();
// Wait for the DS start button to be touched.
telemetry.addData("DS preview on/off", "3 dots, Camera Stream");
telemetry.addData(">", "Touch Play to start OpMode");
telemetry.update();
waitForStart();
if (opModeIsActive()) {
while (opModeIsActive()) {
telemetryAprilTag();
// Push telemetry to the Driver Station.
telemetry.update();
// Save CPU resources; can resume streaming when needed.
if (gamepad1.dpad_down) {
visionPortal.stopStreaming();
} else if (gamepad1.dpad_up) {
visionPortal.resumeStreaming();
}
// Share the CPU.
sleep(20);
}
}
// Save more CPU resources when camera is no longer needed.
visionPortal.close();
} // end method runOpMode()
/**
* Initialize the AprilTag processor.
*/
private void initAprilTag() {
// Create the AprilTag processor.
aprilTag = new AprilTagProcessor.Builder()
// The following default settings are available to un-comment and edit as needed.
//.setDrawAxes(false)
//.setDrawCubeProjection(false)
//.setDrawTagOutline(true)
//.setTagFamily(AprilTagProcessor.TagFamily.TAG_36h11)
//.setTagLibrary(AprilTagGameDatabase.getCenterStageTagLibrary())
//.setOutputUnits(DistanceUnit.INCH, AngleUnit.DEGREES)
// == CAMERA CALIBRATION ==
// If you do not manually specify calibration parameters, the SDK will attempt
// to load a predefined calibration for your camera.
//.setLensIntrinsics(578.272, 578.272, 402.145, 221.506)
// ... these parameters are fx, fy, cx, cy.
.build();
// Adjust Image Decimation to trade-off detection-range for detection-rate.
// eg: Some typical detection data using a Logitech C920 WebCam
// Decimation = 1 .. Detect 2" Tag from 10 feet away at 10 Frames per second
// Decimation = 2 .. Detect 2" Tag from 6 feet away at 22 Frames per second
// Decimation = 3 .. Detect 2" Tag from 4 feet away at 30 Frames Per Second (default)
// Decimation = 3 .. Detect 5" Tag from 10 feet away at 30 Frames Per Second (default)
// Note: Decimation can be changed on-the-fly to adapt during a match.
//aprilTag.setDecimation(3);
// Create the vision portal by using a builder.
VisionPortal.Builder builder = new VisionPortal.Builder();
// Set the camera (webcam vs. built-in RC phone camera).
if (USE_WEBCAM) {
builder.setCamera(hardwareMap.get(WebcamName.class, "Webcam 1"));
} else {
builder.setCamera(BuiltinCameraDirection.BACK);
}
// Choose a camera resolution. Not all cameras support all resolutions.
//builder.setCameraResolution(new Size(640, 480));
// Enable the RC preview (LiveView). Set "false" to omit camera monitoring.
//builder.enableLiveView(true);
// Set the stream format; MJPEG uses less bandwidth than default YUY2.
//builder.setStreamFormat(VisionPortal.StreamFormat.YUY2);
// Choose whether or not LiveView stops if no processors are enabled.
// If set "true", monitor shows solid orange screen if no processors enabled.
// If set "false", monitor shows camera view without annotations.
//builder.setAutoStopLiveView(false);
// Set and enable the processor.
builder.addProcessor(aprilTag);
// Build the Vision Portal, using the above settings.
visionPortal = builder.build();
// Disable or re-enable the aprilTag processor at any time.
//visionPortal.setProcessorEnabled(aprilTag, true);
} // end method initAprilTag()
/**
* Add telemetry about AprilTag detections.
*/
private void telemetryAprilTag() {
List<AprilTagDetection> currentDetections = aprilTag.getDetections();
telemetry.addData("# AprilTags Detected", currentDetections.size());
// Step through the list of detections and display info for each one.
for (AprilTagDetection detection : currentDetections) {
if (detection.metadata != null) {
telemetry.addLine(String.format("\n==== (ID %d) %s", detection.id, detection.metadata.name));
telemetry.addLine(String.format("XYZ %6.1f %6.1f %6.1f (inch)", detection.ftcPose.x, detection.ftcPose.y, detection.ftcPose.z));
telemetry.addLine(String.format("PRY %6.1f %6.1f %6.1f (deg)", detection.ftcPose.pitch, detection.ftcPose.roll, detection.ftcPose.yaw));
telemetry.addLine(String.format("RBE %6.1f %6.1f %6.1f (inch, deg, deg)", detection.ftcPose.range, detection.ftcPose.bearing, detection.ftcPose.elevation));
} else {
telemetry.addLine(String.format("\n==== (ID %d) Unknown", detection.id));
telemetry.addLine(String.format("Center %6.0f %6.0f (pixels)", detection.center.x, detection.center.y));
}
} // end for() loop
// Add "key" information to telemetry
telemetry.addLine("\nkey:\nXYZ = X (Right), Y (Forward), Z (Up) dist.");
telemetry.addLine("PRY = Pitch, Roll & Yaw (XYZ Rotation)");
telemetry.addLine("RBE = Range, Bearing & Elevation");
} // end method telemetryAprilTag()
} // end class
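Once a tag from the TagLibrary is detected, teams typically convert the ftcPose range and bearing values into drive commands, as the RobotAutoDriveToAprilTag samples mentioned above do. The helper below is a minimal, self-contained sketch of that idea, not part of this sample: the `TagAimHelper` name and both gain constants are illustrative assumptions that must be tuned on a real robot.

```java
/** Hypothetical helper: turn ftcPose range/bearing into proportional drive powers. */
class TagAimHelper {
    // Assumed proportional gains; real values must be tuned on the robot.
    static final double RANGE_GAIN   = 0.02;  // power per inch of range error
    static final double BEARING_GAIN = 0.01;  // power per degree of bearing error

    /** Forward power to close the gap to a desired standoff distance (inches). */
    static double drivePower(double rangeInches, double desiredInches) {
        return clip((rangeInches - desiredInches) * RANGE_GAIN, -0.5, 0.5);
    }

    /** Turn power to center the tag (bearing in degrees, positive = tag to the left). */
    static double turnPower(double bearingDeg) {
        return clip(-bearingDeg * BEARING_GAIN, -0.3, 0.3);
    }

    // Clamp a value into [lo, hi], like the SDK's Range.clip().
    static double clip(double v, double lo, double hi) {
        return Math.max(lo, Math.min(hi, v));
    }
}
```

In an OpMode loop, the outputs of these two methods would be combined into motor powers each cycle while a target tag is visible.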


/* Copyright (c) 2023 FIRST. All rights reserved.
*
* Redistribution and use in source and binary forms, with or without modification,
* are permitted (subject to the limitations in the disclaimer below) provided that
* the following conditions are met:
*
* Redistributions of source code must retain the above copyright notice, this list
* of conditions and the following disclaimer.
*
* Redistributions in binary form must reproduce the above copyright notice, this
* list of conditions and the following disclaimer in the documentation and/or
* other materials provided with the distribution.
*
* Neither the name of FIRST nor the names of its contributors may be used to endorse or
* promote products derived from this software without specific prior written permission.
*
* NO EXPRESS OR IMPLIED LICENSES TO ANY PARTY'S PATENT RIGHTS ARE GRANTED BY THIS
* LICENSE. THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
* "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO,
* THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE
* ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR CONTRIBUTORS BE LIABLE
* FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL
* DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR
* SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER
* CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY,
* OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
* OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
*/
package org.firstinspires.ftc.robotcontroller.external.samples;
import com.qualcomm.robotcore.eventloop.opmode.Disabled;
import com.qualcomm.robotcore.eventloop.opmode.LinearOpMode;
import com.qualcomm.robotcore.eventloop.opmode.TeleOp;
import org.firstinspires.ftc.robotcore.external.hardware.camera.BuiltinCameraDirection;
import org.firstinspires.ftc.robotcore.external.hardware.camera.WebcamName;
import org.firstinspires.ftc.vision.VisionPortal;
import org.firstinspires.ftc.vision.apriltag.AprilTagDetection;
import org.firstinspires.ftc.vision.apriltag.AprilTagProcessor;
import java.util.List;
/*
* This OpMode illustrates the basics of AprilTag recognition and pose estimation, using
* the easy way.
*
* For an introduction to AprilTags, see the FTC-DOCS link below:
* https://ftc-docs.firstinspires.org/en/latest/apriltag/vision_portal/apriltag_intro/apriltag-intro.html
*
* In this sample, any visible tag ID will be detected and displayed, but only tags that are included in the default
* "TagLibrary" will have their position and orientation information displayed. This default TagLibrary contains
* the current Season's AprilTags and a small set of "test Tags" in the high number range.
*
* When an AprilTag in the TagLibrary is detected, the SDK provides location and orientation of the tag, relative to the camera.
* This information is provided in the "ftcPose" member of the returned "detection", and is explained in the ftc-docs page linked below.
* https://ftc-docs.firstinspires.org/apriltag-detection-values
*
* To experiment with using AprilTags to navigate, try out these two driving samples:
* RobotAutoDriveToAprilTagOmni and RobotAutoDriveToAprilTagTank
*
* Use Android Studio to Copy this Class, and Paste it into your team's code folder with a new name.
* Remove or comment out the @Disabled line to add this OpMode to the Driver Station OpMode list.
*/
@TeleOp(name = "Concept: AprilTag Easy", group = "Concept")
@Disabled
public class ConceptAprilTagEasy extends LinearOpMode {
private static final boolean USE_WEBCAM = true; // true for webcam, false for phone camera
/**
* The variable to store our instance of the AprilTag processor.
*/
private AprilTagProcessor aprilTag;
/**
* The variable to store our instance of the vision portal.
*/
private VisionPortal visionPortal;
@Override
public void runOpMode() {
initAprilTag();
// Wait for the DS start button to be touched.
telemetry.addData("DS preview on/off", "3 dots, Camera Stream");
telemetry.addData(">", "Touch Play to start OpMode");
telemetry.update();
waitForStart();
if (opModeIsActive()) {
while (opModeIsActive()) {
telemetryAprilTag();
// Push telemetry to the Driver Station.
telemetry.update();
// Save CPU resources; can resume streaming when needed.
if (gamepad1.dpad_down) {
visionPortal.stopStreaming();
} else if (gamepad1.dpad_up) {
visionPortal.resumeStreaming();
}
// Share the CPU.
sleep(20);
}
}
// Save more CPU resources when camera is no longer needed.
visionPortal.close();
} // end method runOpMode()
/**
* Initialize the AprilTag processor.
*/
private void initAprilTag() {
// Create the AprilTag processor the easy way.
aprilTag = AprilTagProcessor.easyCreateWithDefaults();
// Create the vision portal the easy way.
if (USE_WEBCAM) {
visionPortal = VisionPortal.easyCreateWithDefaults(
hardwareMap.get(WebcamName.class, "Webcam 1"), aprilTag);
} else {
visionPortal = VisionPortal.easyCreateWithDefaults(
BuiltinCameraDirection.BACK, aprilTag);
}
} // end method initAprilTag()
/**
* Add telemetry about AprilTag detections.
*/
private void telemetryAprilTag() {
List<AprilTagDetection> currentDetections = aprilTag.getDetections();
telemetry.addData("# AprilTags Detected", currentDetections.size());
// Step through the list of detections and display info for each one.
for (AprilTagDetection detection : currentDetections) {
if (detection.metadata != null) {
telemetry.addLine(String.format("\n==== (ID %d) %s", detection.id, detection.metadata.name));
telemetry.addLine(String.format("XYZ %6.1f %6.1f %6.1f (inch)", detection.ftcPose.x, detection.ftcPose.y, detection.ftcPose.z));
telemetry.addLine(String.format("PRY %6.1f %6.1f %6.1f (deg)", detection.ftcPose.pitch, detection.ftcPose.roll, detection.ftcPose.yaw));
telemetry.addLine(String.format("RBE %6.1f %6.1f %6.1f (inch, deg, deg)", detection.ftcPose.range, detection.ftcPose.bearing, detection.ftcPose.elevation));
} else {
telemetry.addLine(String.format("\n==== (ID %d) Unknown", detection.id));
telemetry.addLine(String.format("Center %6.0f %6.0f (pixels)", detection.center.x, detection.center.y));
}
} // end for() loop
// Add "key" information to telemetry
telemetry.addLine("\nkey:\nXYZ = X (Right), Y (Forward), Z (Up) dist.");
telemetry.addLine("PRY = Pitch, Roll & Yaw (XYZ Rotation)");
telemetry.addLine("RBE = Range, Bearing & Elevation");
} // end method telemetryAprilTag()
} // end class
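A common next step after easyCreateWithDefaults() is to pick a single tag of interest out of the list returned by getDetections(). The sketch below shows that selection logic on a plain list of IDs so it stands alone; the `TagSelector` and `findTagIndex` names and the `DESIRED_TAG_ID` value are illustrative assumptions, not SDK API.

```java
import java.util.List;

/** Hypothetical selection logic: find the first detection matching a desired tag ID. */
class TagSelector {
    static final int DESIRED_TAG_ID = 5;  // assumed ID; use your season's tag of interest

    /** Returns the index of the first matching ID, or -1 if the tag is not visible. */
    static int findTagIndex(List<Integer> detectedIds, int desiredId) {
        for (int i = 0; i < detectedIds.size(); i++) {
            if (detectedIds.get(i) == desiredId) {
                return i;
            }
        }
        return -1;
    }
}
```

With real detections, the same loop would compare `detection.id` and return the matching AprilTagDetection instead of an index.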

/* Copyright (c) 2023 FIRST. All rights reserved.
*
* Redistribution and use in source and binary forms, with or without modification,
* are permitted (subject to the limitations in the disclaimer below) provided that
* the following conditions are met:
*
* Redistributions of source code must retain the above copyright notice, this list
* of conditions and the following disclaimer.
*
* Redistributions in binary form must reproduce the above copyright notice, this
* list of conditions and the following disclaimer in the documentation and/or
* other materials provided with the distribution.
*
* Neither the name of FIRST nor the names of its contributors may be used to endorse or
* promote products derived from this software without specific prior written permission.
*
* NO EXPRESS OR IMPLIED LICENSES TO ANY PARTY'S PATENT RIGHTS ARE GRANTED BY THIS
* LICENSE. THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
* "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO,
* THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE
* ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR CONTRIBUTORS BE LIABLE
* FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL
* DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR
* SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER
* CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY,
* OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
* OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
*/
package org.firstinspires.ftc.robotcontroller.external.samples;
import com.qualcomm.robotcore.eventloop.opmode.Disabled;
import com.qualcomm.robotcore.eventloop.opmode.LinearOpMode;
import com.qualcomm.robotcore.eventloop.opmode.TeleOp;
import com.qualcomm.robotcore.util.Range;
import org.firstinspires.ftc.robotcore.external.hardware.camera.WebcamName;
import org.firstinspires.ftc.robotcore.external.hardware.camera.controls.ExposureControl;
import org.firstinspires.ftc.robotcore.external.hardware.camera.controls.GainControl;
import org.firstinspires.ftc.vision.VisionPortal;
import org.firstinspires.ftc.vision.apriltag.AprilTagDetection;
import org.firstinspires.ftc.vision.apriltag.AprilTagProcessor;
import java.util.List;
import java.util.concurrent.TimeUnit;
/*
 * This OpMode determines the best exposure setting for minimizing image motion-blur on a webcam.
 * Note that it is not possible to control the exposure of a phone camera, so if you are using a phone
 * as the Robot Controller, this OpMode/feature only applies to an externally connected webcam.
*
* The goal is to determine the smallest (shortest) Exposure value that still provides reliable Tag Detection.
* Starting with the minimum Exposure and maximum Gain, the exposure is slowly increased until the Tag is
* detected reliably from the likely operational distance.
 *
* The best way to run this optimization is to view the camera preview screen while changing the exposure and gains.
*
 * To do this, you need to view the Robot Controller screen directly (not from the Driver Station).
 * This can be done on an RC phone's screen (if you are using an external webcam), but for a Control Hub you must
 * either plug an HDMI monitor into the Control Hub's HDMI port or use an external viewer program such as scrcpy (https://scrcpy.org/).
*
* Use Android Studio to Copy this Class, and Paste it into the TeamCode/src/main/java/org/firstinspires/ftc/teamcode folder.
* Remove or comment out the @Disabled line to add this OpMode to the Driver Station OpMode list.
*/
@TeleOp(name="Optimize AprilTag Exposure", group = "Concept")
@Disabled
public class ConceptAprilTagOptimizeExposure extends LinearOpMode
{
private VisionPortal visionPortal = null; // Used to manage the video source.
private AprilTagProcessor aprilTag; // Used for managing the AprilTag detection process.
private int myExposure ;
private int minExposure ;
private int maxExposure ;
private int myGain ;
private int minGain ;
private int maxGain ;
boolean thisExpUp = false;
boolean thisExpDn = false;
boolean thisGainUp = false;
boolean thisGainDn = false;
boolean lastExpUp = false;
boolean lastExpDn = false;
boolean lastGainUp = false;
boolean lastGainDn = false;
@Override public void runOpMode()
{
// Initialize the Apriltag Detection process
initAprilTag();
// Establish Min and Max Gains and Exposure. Then set a low exposure with high gain
getCameraSetting();
myExposure = Math.max(5, minExposure);   // start low, but never below the camera's minimum
myGain = maxGain;
setManualExposure(myExposure, myGain);
// Wait for the match to begin.
telemetry.addData("Camera preview on/off", "3 dots, Camera Stream");
telemetry.addData(">", "Touch Play to start OpMode");
telemetry.update();
waitForStart();
while (opModeIsActive())
{
telemetry.addLine("Find lowest Exposure that gives reliable detection.");
telemetry.addLine("Use Left bump/trig to adjust Exposure.");
telemetry.addLine("Use Right bump/trig to adjust Gain.\n");
// Display how many Tags Detected
List<AprilTagDetection> currentDetections = aprilTag.getDetections();
int numTags = currentDetections.size();
if (numTags > 0)
telemetry.addData("Tag", "####### %d Detected #######", numTags);
else
telemetry.addData("Tag", "----------- none -----------");
telemetry.addData("Exposure","%d (%d - %d)", myExposure, minExposure, maxExposure);
telemetry.addData("Gain","%d (%d - %d)", myGain, minGain, maxGain);
telemetry.update();
// check to see if we need to change exposure or gain.
thisExpUp = gamepad1.left_bumper;
thisExpDn = gamepad1.left_trigger > 0.25;
thisGainUp = gamepad1.right_bumper;
thisGainDn = gamepad1.right_trigger > 0.25;
// look for clicks to change exposure
if (thisExpUp && !lastExpUp) {
myExposure = Range.clip(myExposure + 1, minExposure, maxExposure);
setManualExposure(myExposure, myGain);
} else if (thisExpDn && !lastExpDn) {
myExposure = Range.clip(myExposure - 1, minExposure, maxExposure);
setManualExposure(myExposure, myGain);
}
// look for clicks to change the gain
if (thisGainUp && !lastGainUp) {
myGain = Range.clip(myGain + 1, minGain, maxGain );
setManualExposure(myExposure, myGain);
} else if (thisGainDn && !lastGainDn) {
myGain = Range.clip(myGain - 1, minGain, maxGain );
setManualExposure(myExposure, myGain);
}
lastExpUp = thisExpUp;
lastExpDn = thisExpDn;
lastGainUp = thisGainUp;
lastGainDn = thisGainDn;
sleep(20);
}
}
/**
* Initialize the AprilTag processor.
*/
private void initAprilTag() {
// Create the AprilTag processor by using a builder.
aprilTag = new AprilTagProcessor.Builder().build();
// Create the WEBCAM vision portal by using a builder.
visionPortal = new VisionPortal.Builder()
.setCamera(hardwareMap.get(WebcamName.class, "Webcam 1"))
.addProcessor(aprilTag)
.build();
}
/*
Manually set the camera gain and exposure.
Can only be called AFTER calling initAprilTag();
Returns true if controls are set.
*/
private boolean setManualExposure(int exposureMS, int gain) {
// Ensure Vision Portal has been setup.
if (visionPortal == null) {
return false;
}
// Wait for the camera to be open
if (visionPortal.getCameraState() != VisionPortal.CameraState.STREAMING) {
telemetry.addData("Camera", "Waiting");
telemetry.update();
while (!isStopRequested() && (visionPortal.getCameraState() != VisionPortal.CameraState.STREAMING)) {
sleep(20);
}
telemetry.addData("Camera", "Ready");
telemetry.update();
}
// Set camera controls unless we are stopping.
if (!isStopRequested())
{
// Set exposure. Make sure we are in Manual Mode for these values to take effect.
ExposureControl exposureControl = visionPortal.getCameraControl(ExposureControl.class);
if (exposureControl.getMode() != ExposureControl.Mode.Manual) {
exposureControl.setMode(ExposureControl.Mode.Manual);
sleep(50);
}
exposureControl.setExposure((long)exposureMS, TimeUnit.MILLISECONDS);
sleep(20);
// Set Gain.
GainControl gainControl = visionPortal.getCameraControl(GainControl.class);
gainControl.setGain(gain);
sleep(20);
return (true);
} else {
return (false);
}
}
/*
Read this camera's minimum and maximum Exposure and Gain settings.
Can only be called AFTER calling initAprilTag();
*/
private void getCameraSetting() {
// Ensure Vision Portal has been setup.
if (visionPortal == null) {
return;
}
// Wait for the camera to be open
if (visionPortal.getCameraState() != VisionPortal.CameraState.STREAMING) {
telemetry.addData("Camera", "Waiting");
telemetry.update();
while (!isStopRequested() && (visionPortal.getCameraState() != VisionPortal.CameraState.STREAMING)) {
sleep(20);
}
telemetry.addData("Camera", "Ready");
telemetry.update();
}
// Get camera control values unless we are stopping.
if (!isStopRequested()) {
ExposureControl exposureControl = visionPortal.getCameraControl(ExposureControl.class);
minExposure = (int)exposureControl.getMinExposure(TimeUnit.MILLISECONDS) + 1;
maxExposure = (int)exposureControl.getMaxExposure(TimeUnit.MILLISECONDS);
GainControl gainControl = visionPortal.getCameraControl(GainControl.class);
minGain = gainControl.getMinGain();
maxGain = gainControl.getMaxGain();
}
}
}
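The bumper/trigger handling in runOpMode() above uses a rising-edge ("click") pattern: act only when the current press reads true and the previous sample read false (the this*/last* variable pairs). Factored out, the pattern looks like the sketch below; the `EdgeDetector` name is an assumption chosen for illustration.

```java
/** Hypothetical helper: fires once per press, matching the this*/last* pattern above. */
class EdgeDetector {
    private boolean last = false;

    /** Returns true only on the sample where the input transitions false -> true. */
    boolean risingEdge(boolean current) {
        boolean fired = current && !last;
        last = current;   // remember this sample for the next call
        return fired;
    }
}
```

One EdgeDetector instance per button replaces each this/last variable pair, so holding a bumper adjusts the setting once rather than every 20 ms loop.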

/* Copyright (c) 2023 FIRST. All rights reserved.
*
* Redistribution and use in source and binary forms, with or without modification,
* are permitted (subject to the limitations in the disclaimer below) provided that
* the following conditions are met:
*
* Redistributions of source code must retain the above copyright notice, this list
* of conditions and the following disclaimer.
*
* Redistributions in binary form must reproduce the above copyright notice, this
* list of conditions and the following disclaimer in the documentation and/or
* other materials provided with the distribution.
*
* Neither the name of FIRST nor the names of its contributors may be used to endorse or
* promote products derived from this software without specific prior written permission.
*
* NO EXPRESS OR IMPLIED LICENSES TO ANY PARTY'S PATENT RIGHTS ARE GRANTED BY THIS
* LICENSE. THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
* "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO,
* THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE
* ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR CONTRIBUTORS BE LIABLE
* FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL
* DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR
* SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER
* CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY,
* OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
* OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
*/
package org.firstinspires.ftc.robotcontroller.external.samples;
import com.qualcomm.robotcore.eventloop.opmode.Disabled;
import com.qualcomm.robotcore.eventloop.opmode.LinearOpMode;
import com.qualcomm.robotcore.eventloop.opmode.TeleOp;
import org.firstinspires.ftc.robotcore.external.ClassFactory;
import org.firstinspires.ftc.robotcore.external.hardware.camera.CameraName;
import org.firstinspires.ftc.robotcore.external.hardware.camera.WebcamName;
import org.firstinspires.ftc.vision.VisionPortal;
import org.firstinspires.ftc.vision.VisionPortal.CameraState;
import org.firstinspires.ftc.vision.apriltag.AprilTagDetection;
import org.firstinspires.ftc.vision.apriltag.AprilTagProcessor;
import java.util.List;
/*
* This OpMode illustrates the basics of AprilTag recognition and pose estimation, using
* two webcams.
*
* Use Android Studio to Copy this Class, and Paste it into your team's code folder with a new name.
* Remove or comment out the @Disabled line to add this OpMode to the Driver Station OpMode list.
*/
@TeleOp(name = "Concept: AprilTag Switchable Cameras", group = "Concept")
@Disabled
public class ConceptAprilTagSwitchableCameras extends LinearOpMode {
/*
* Variables used for switching cameras.
*/
private WebcamName webcam1, webcam2;
private boolean oldLeftBumper;
private boolean oldRightBumper;
/**
* The variable to store our instance of the AprilTag processor.
*/
private AprilTagProcessor aprilTag;
/**
* The variable to store our instance of the vision portal.
*/
private VisionPortal visionPortal;
@Override
public void runOpMode() {
initAprilTag();
// Wait for the DS start button to be touched.
telemetry.addData("DS preview on/off", "3 dots, Camera Stream");
telemetry.addData(">", "Touch Play to start OpMode");
telemetry.update();
waitForStart();
if (opModeIsActive()) {
while (opModeIsActive()) {
telemetryCameraSwitching();
telemetryAprilTag();
// Push telemetry to the Driver Station.
telemetry.update();
// Save CPU resources; can resume streaming when needed.
if (gamepad1.dpad_down) {
visionPortal.stopStreaming();
} else if (gamepad1.dpad_up) {
visionPortal.resumeStreaming();
}
doCameraSwitching();
// Share the CPU.
sleep(20);
}
}
// Save more CPU resources when camera is no longer needed.
visionPortal.close();
} // end runOpMode()
/**
* Initialize the AprilTag processor.
*/
private void initAprilTag() {
// Create the AprilTag processor by using a builder.
aprilTag = new AprilTagProcessor.Builder().build();
webcam1 = hardwareMap.get(WebcamName.class, "Webcam 1");
webcam2 = hardwareMap.get(WebcamName.class, "Webcam 2");
CameraName switchableCamera = ClassFactory.getInstance()
.getCameraManager().nameForSwitchableCamera(webcam1, webcam2);
// Create the vision portal by using a builder.
visionPortal = new VisionPortal.Builder()
.setCamera(switchableCamera)
.addProcessor(aprilTag)
.build();
} // end method initAprilTag()
/**
* Add telemetry about camera switching.
*/
private void telemetryCameraSwitching() {
if (visionPortal.getActiveCamera().equals(webcam1)) {
telemetry.addData("activeCamera", "Webcam 1");
telemetry.addData("Press RightBumper", "to switch to Webcam 2");
} else {
telemetry.addData("activeCamera", "Webcam 2");
telemetry.addData("Press LeftBumper", "to switch to Webcam 1");
}
} // end method telemetryCameraSwitching()
/**
* Add telemetry about AprilTag detections.
*/
private void telemetryAprilTag() {
List<AprilTagDetection> currentDetections = aprilTag.getDetections();
telemetry.addData("# AprilTags Detected", currentDetections.size());
// Step through the list of detections and display info for each one.
for (AprilTagDetection detection : currentDetections) {
if (detection.metadata != null) {
telemetry.addLine(String.format("\n==== (ID %d) %s", detection.id, detection.metadata.name));
telemetry.addLine(String.format("XYZ %6.1f %6.1f %6.1f (inch)", detection.ftcPose.x, detection.ftcPose.y, detection.ftcPose.z));
telemetry.addLine(String.format("PRY %6.1f %6.1f %6.1f (deg)", detection.ftcPose.pitch, detection.ftcPose.roll, detection.ftcPose.yaw));
telemetry.addLine(String.format("RBE %6.1f %6.1f %6.1f (inch, deg, deg)", detection.ftcPose.range, detection.ftcPose.bearing, detection.ftcPose.elevation));
} else {
telemetry.addLine(String.format("\n==== (ID %d) Unknown", detection.id));
telemetry.addLine(String.format("Center %6.0f %6.0f (pixels)", detection.center.x, detection.center.y));
}
} // end for() loop
// Add "key" information to telemetry
telemetry.addLine("\nkey:\nXYZ = X (Right), Y (Forward), Z (Up) dist.");
telemetry.addLine("PRY = Pitch, Roll & Yaw (XYZ Rotation)");
telemetry.addLine("RBE = Range, Bearing & Elevation");
} // end method telemetryAprilTag()
/**
* Set the active camera according to input from the gamepad.
*/
private void doCameraSwitching() {
if (visionPortal.getCameraState() == CameraState.STREAMING) {
// If the left bumper is pressed, use Webcam 1.
// If the right bumper is pressed, use Webcam 2.
boolean newLeftBumper = gamepad1.left_bumper;
boolean newRightBumper = gamepad1.right_bumper;
if (newLeftBumper && !oldLeftBumper) {
visionPortal.setActiveCamera(webcam1);
} else if (newRightBumper && !oldRightBumper) {
visionPortal.setActiveCamera(webcam2);
}
oldLeftBumper = newLeftBumper;
oldRightBumper = newRightBumper;
}
} // end method doCameraSwitching()
} // end class

/* Copyright (c) 2023 FIRST. All rights reserved.
*
* Redistribution and use in source and binary forms, with or without modification,
* are permitted (subject to the limitations in the disclaimer below) provided that
* the following conditions are met:
*
* Redistributions of source code must retain the above copyright notice, this list
* of conditions and the following disclaimer.
*
* Redistributions in binary form must reproduce the above copyright notice, this
* list of conditions and the following disclaimer in the documentation and/or
* other materials provided with the distribution.
*
* Neither the name of FIRST nor the names of its contributors may be used to endorse or
* promote products derived from this software without specific prior written permission.
*
* NO EXPRESS OR IMPLIED LICENSES TO ANY PARTY'S PATENT RIGHTS ARE GRANTED BY THIS
* LICENSE. THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
* "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO,
* THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE
* ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR CONTRIBUTORS BE LIABLE
* FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL
* DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR
* SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER
* CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY,
* OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
* OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
*/
package org.firstinspires.ftc.robotcontroller.external.samples;
import com.qualcomm.robotcore.eventloop.opmode.Disabled;
import com.qualcomm.robotcore.eventloop.opmode.LinearOpMode;
import com.qualcomm.robotcore.eventloop.opmode.TeleOp;
import org.firstinspires.ftc.robotcore.external.hardware.camera.BuiltinCameraDirection;
import org.firstinspires.ftc.robotcore.external.hardware.camera.WebcamName;
import org.firstinspires.ftc.robotcore.external.tfod.Recognition;
import org.firstinspires.ftc.vision.VisionPortal;
import org.firstinspires.ftc.vision.apriltag.AprilTagDetection;
import org.firstinspires.ftc.vision.apriltag.AprilTagProcessor;
import org.firstinspires.ftc.vision.tfod.TfodProcessor;
import java.util.List;
/*
* This OpMode illustrates the basics of using both AprilTag recognition and TensorFlow
* Object Detection.
*
* Use Android Studio to Copy this Class, and Paste it into your team's code folder with a new name.
* Remove or comment out the @Disabled line to add this OpMode to the Driver Station OpMode list.
*/
@TeleOp(name = "Concept: Double Vision", group = "Concept")
@Disabled
public class ConceptDoubleVision extends LinearOpMode {
private static final boolean USE_WEBCAM = true; // true for webcam, false for phone camera
/**
* The variable to store our instance of the AprilTag processor.
*/
private AprilTagProcessor aprilTag;
/**
* The variable to store our instance of the TensorFlow Object Detection processor.
*/
private TfodProcessor tfod;
/**
* The variable to store our instance of the vision portal.
*/
private VisionPortal myVisionPortal;
@Override
public void runOpMode() {
initDoubleVision();
// This OpMode loops continuously, allowing the user to switch between
// AprilTag and TensorFlow Object Detection (TFOD) image processors.
while (!isStopRequested()) {
if (opModeInInit()) {
telemetry.addData("DS preview on/off","3 dots, Camera Stream");
telemetry.addLine();
telemetry.addLine("----------------------------------------");
}
if (myVisionPortal.getProcessorEnabled(aprilTag)) {
// User instructions: Dpad left or Dpad right.
telemetry.addLine("Dpad Left to disable AprilTag");
telemetry.addLine();
telemetryAprilTag();
} else {
telemetry.addLine("Dpad Right to enable AprilTag");
}
telemetry.addLine();
telemetry.addLine("----------------------------------------");
if (myVisionPortal.getProcessorEnabled(tfod)) {
telemetry.addLine("Dpad Down to disable TFOD");
telemetry.addLine();
telemetryTfod();
} else {
telemetry.addLine("Dpad Up to enable TFOD");
}
// Push telemetry to the Driver Station.
telemetry.update();
if (gamepad1.dpad_left) {
myVisionPortal.setProcessorEnabled(aprilTag, false);
} else if (gamepad1.dpad_right) {
myVisionPortal.setProcessorEnabled(aprilTag, true);
}
if (gamepad1.dpad_down) {
myVisionPortal.setProcessorEnabled(tfod, false);
} else if (gamepad1.dpad_up) {
myVisionPortal.setProcessorEnabled(tfod, true);
}
sleep(20);
} // end while loop
} // end method runOpMode()
/**
* Initialize AprilTag and TFOD.
*/
private void initDoubleVision() {
// -----------------------------------------------------------------------------------------
// AprilTag Configuration
// -----------------------------------------------------------------------------------------
aprilTag = new AprilTagProcessor.Builder()
.build();
// -----------------------------------------------------------------------------------------
// TFOD Configuration
// -----------------------------------------------------------------------------------------
tfod = new TfodProcessor.Builder()
.build();
// -----------------------------------------------------------------------------------------
// Camera Configuration
// -----------------------------------------------------------------------------------------
if (USE_WEBCAM) {
myVisionPortal = new VisionPortal.Builder()
.setCamera(hardwareMap.get(WebcamName.class, "Webcam 1"))
.addProcessors(tfod, aprilTag)
.build();
} else {
myVisionPortal = new VisionPortal.Builder()
.setCamera(BuiltinCameraDirection.BACK)
.addProcessors(tfod, aprilTag)
.build();
}
} // end initDoubleVision()
/**
* Add telemetry about AprilTag detections.
*/
private void telemetryAprilTag() {
List<AprilTagDetection> currentDetections = aprilTag.getDetections();
telemetry.addData("# AprilTags Detected", currentDetections.size());
// Step through the list of detections and display info for each one.
for (AprilTagDetection detection : currentDetections) {
if (detection.metadata != null) {
telemetry.addLine(String.format("\n==== (ID %d) %s", detection.id, detection.metadata.name));
telemetry.addLine(String.format("XYZ %6.1f %6.1f %6.1f (inch)", detection.ftcPose.x, detection.ftcPose.y, detection.ftcPose.z));
telemetry.addLine(String.format("PRY %6.1f %6.1f %6.1f (deg)", detection.ftcPose.pitch, detection.ftcPose.roll, detection.ftcPose.yaw));
telemetry.addLine(String.format("RBE %6.1f %6.1f %6.1f (inch, deg, deg)", detection.ftcPose.range, detection.ftcPose.bearing, detection.ftcPose.elevation));
} else {
telemetry.addLine(String.format("\n==== (ID %d) Unknown", detection.id));
telemetry.addLine(String.format("Center %6.0f %6.0f (pixels)", detection.center.x, detection.center.y));
}
} // end for() loop
} // end method telemetryAprilTag()
/**
* Add telemetry about TensorFlow Object Detection (TFOD) recognitions.
*/
private void telemetryTfod() {
List<Recognition> currentRecognitions = tfod.getRecognitions();
telemetry.addData("# Objects Detected", currentRecognitions.size());
// Step through the list of recognitions and display info for each one.
for (Recognition recognition : currentRecognitions) {
            double x = (recognition.getLeft() + recognition.getRight()) / 2;
            double y = (recognition.getTop() + recognition.getBottom()) / 2;
telemetry.addData(""," ");
telemetry.addData("Image", "%s (%.0f %% Conf.)", recognition.getLabel(), recognition.getConfidence() * 100);
telemetry.addData("- Position", "%.0f / %.0f", x, y);
telemetry.addData("- Size", "%.0f x %.0f", recognition.getWidth(), recognition.getHeight());
} // end for() loop
} // end method telemetryTfod()
} // end class
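The dpad enable/disable logic above reduces to a portal that keeps a per-processor enabled flag and consults it each loop. The standalone sketch below mimics that bookkeeping in plain Java so it runs without the SDK; `ToyPortal` and its method names are illustrative stand-ins for (not the actual API of) the SDK's `VisionPortal`.

```java
import java.util.LinkedHashMap;
import java.util.Map;

// Toy model of a vision portal that tracks which processors are enabled.
// Hypothetical class for illustration only; not part of the FTC SDK.
public class ToyPortal {
    private final Map<String, Boolean> processors = new LinkedHashMap<>();

    void addProcessor(String name)                    { processors.put(name, true); }
    void setProcessorEnabled(String name, boolean on) { processors.put(name, on); }
    boolean getProcessorEnabled(String name)          { return processors.getOrDefault(name, false); }

    public static void main(String[] args) {
        ToyPortal portal = new ToyPortal();
        portal.addProcessor("aprilTag");
        portal.addProcessor("tfod");
        portal.setProcessorEnabled("aprilTag", false); // like pressing dpad_left in the sample
        System.out.println(portal.getProcessorEnabled("aprilTag"));
        System.out.println(portal.getProcessorEnabled("tfod"));
    }
}
```

As in the sample, disabling one processor leaves the other untouched, which is what lets the OpMode toggle AprilTag and TFOD independently.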

/*
Copyright (c) 2022 REV Robotics, FIRST
All rights reserved.
Redistribution and use in source and binary forms, with or without modification,
are permitted (subject to the limitations in the disclaimer below) provided that
the following conditions are met:
Redistributions of source code must retain the above copyright notice, this list
of conditions and the following disclaimer.
Redistributions in binary form must reproduce the above copyright notice, this
list of conditions and the following disclaimer in the documentation and/or
other materials provided with the distribution.
Neither the name of REV Robotics nor the names of its contributors may be used to
endorse or promote products derived from this software without specific prior
written permission.
NO EXPRESS OR IMPLIED LICENSES TO ANY PARTY'S PATENT RIGHTS ARE GRANTED BY THIS
LICENSE. THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
"AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO,
THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE
ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR CONTRIBUTORS BE LIABLE
FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL
DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR
SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER
CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR
TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF
THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
*/
package org.firstinspires.ftc.robotcontroller.external.samples;
import com.qualcomm.hardware.rev.RevHubOrientationOnRobot;
import com.qualcomm.robotcore.eventloop.opmode.Disabled;
import com.qualcomm.robotcore.eventloop.opmode.LinearOpMode;
import com.qualcomm.robotcore.eventloop.opmode.TeleOp;
import com.qualcomm.robotcore.hardware.IMU;
import org.firstinspires.ftc.robotcore.external.navigation.AngleUnit;
import org.firstinspires.ftc.robotcore.external.navigation.AngularVelocity;
import org.firstinspires.ftc.robotcore.external.navigation.YawPitchRollAngles;
/*
* This OpMode demonstrates the impact of setting the IMU orientation correctly or incorrectly. This
* code assumes there is an IMU configured with the name "imu".
*
* Note: This OpMode is more of a tool than a code sample. The User Interface portion of this code
* goes beyond simply showing how to interface to the IMU.<br>
* For a minimal example of interfacing to an IMU, please see the SensorIMUOrthogonal or SensorIMUNonOrthogonal sample OpModes.
*
* This OpMode enables you to re-specify the Hub Mounting orientation dynamically by using gamepad controls.
* While doing so, the sample will display how Pitch, Roll and Yaw angles change as the hub is moved.
*
* The gamepad controls let you change the two parameters that specify how the Control/Expansion Hub is mounted. <br>
* The first parameter specifies which direction the printed logo on the Hub is pointing. <br>
* The second parameter specifies which direction the USB connector on the Hub is pointing. <br>
* All directions are relative to the robot, and left/right is as viewed from behind the robot.
*
* How will you know if you have chosen the correct Orientation? With the correct orientation
* parameters selected, pitch/roll/yaw should act as follows:
*
* Pitch value should INCREASE as the robot is tipped UP at the front. (Rotation about X) <br>
* Roll value should INCREASE as the robot is tipped UP at the left side. (Rotation about Y) <br>
* Yaw value should INCREASE as the robot is rotated Counter Clockwise. (Rotation about Z) <br>
*
* The Yaw can be reset (to zero) by pressing the Y button on the gamepad (Triangle on a PS4 controller)
*
* The rotational velocities should follow the change in corresponding axes.
*/
@TeleOp(name="Concept: IMU Orientation", group="Concept")
@Disabled
public class ConceptExploringIMUOrientation extends LinearOpMode {
static RevHubOrientationOnRobot.LogoFacingDirection[] logoFacingDirections
= RevHubOrientationOnRobot.LogoFacingDirection.values();
static RevHubOrientationOnRobot.UsbFacingDirection[] usbFacingDirections
= RevHubOrientationOnRobot.UsbFacingDirection.values();
static int LAST_DIRECTION = logoFacingDirections.length - 1;
static float TRIGGER_THRESHOLD = 0.2f;
IMU imu;
int logoFacingDirectionPosition;
int usbFacingDirectionPosition;
boolean orientationIsValid = true;
@Override public void runOpMode() throws InterruptedException {
imu = hardwareMap.get(IMU.class, "imu");
logoFacingDirectionPosition = 0; // Up
usbFacingDirectionPosition = 2; // Forward
updateOrientation();
boolean justChangedLogoDirection = false;
boolean justChangedUsbDirection = false;
// Loop until stop requested
while (!isStopRequested()) {
// Check to see if Yaw reset is requested (Y button)
if (gamepad1.y) {
telemetry.addData("Yaw", "Resetting\n");
imu.resetYaw();
} else {
telemetry.addData("Yaw", "Press Y (triangle) on Gamepad to reset.\n");
}
// Check to see if new Logo Direction is requested
if (gamepad1.left_bumper || gamepad1.right_bumper) {
if (!justChangedLogoDirection) {
justChangedLogoDirection = true;
if (gamepad1.left_bumper) {
logoFacingDirectionPosition--;
if (logoFacingDirectionPosition < 0) {
logoFacingDirectionPosition = LAST_DIRECTION;
}
} else {
logoFacingDirectionPosition++;
if (logoFacingDirectionPosition > LAST_DIRECTION) {
logoFacingDirectionPosition = 0;
}
}
updateOrientation();
}
} else {
justChangedLogoDirection = false;
}
// Check to see if new USB Direction is requested
if (gamepad1.left_trigger > TRIGGER_THRESHOLD || gamepad1.right_trigger > TRIGGER_THRESHOLD) {
if (!justChangedUsbDirection) {
justChangedUsbDirection = true;
if (gamepad1.left_trigger > TRIGGER_THRESHOLD) {
usbFacingDirectionPosition--;
if (usbFacingDirectionPosition < 0) {
usbFacingDirectionPosition = LAST_DIRECTION;
}
} else {
usbFacingDirectionPosition++;
if (usbFacingDirectionPosition > LAST_DIRECTION) {
usbFacingDirectionPosition = 0;
}
}
updateOrientation();
}
} else {
justChangedUsbDirection = false;
}
// Display User instructions and IMU data
telemetry.addData("logo Direction (set with bumpers)", logoFacingDirections[logoFacingDirectionPosition]);
telemetry.addData("usb Direction (set with triggers)", usbFacingDirections[usbFacingDirectionPosition] + "\n");
if (orientationIsValid) {
YawPitchRollAngles orientation = imu.getRobotYawPitchRollAngles();
AngularVelocity angularVelocity = imu.getRobotAngularVelocity(AngleUnit.DEGREES);
telemetry.addData("Yaw (Z)", "%.2f Deg. (Heading)", orientation.getYaw(AngleUnit.DEGREES));
telemetry.addData("Pitch (X)", "%.2f Deg.", orientation.getPitch(AngleUnit.DEGREES));
telemetry.addData("Roll (Y)", "%.2f Deg.\n", orientation.getRoll(AngleUnit.DEGREES));
telemetry.addData("Yaw (Z) velocity", "%.2f Deg/Sec", angularVelocity.zRotationRate);
telemetry.addData("Pitch (X) velocity", "%.2f Deg/Sec", angularVelocity.xRotationRate);
telemetry.addData("Roll (Y) velocity", "%.2f Deg/Sec", angularVelocity.yRotationRate);
} else {
telemetry.addData("Error", "Selected orientation on robot is invalid");
}
telemetry.update();
}
}
// apply any requested orientation changes.
void updateOrientation() {
RevHubOrientationOnRobot.LogoFacingDirection logo = logoFacingDirections[logoFacingDirectionPosition];
RevHubOrientationOnRobot.UsbFacingDirection usb = usbFacingDirections[usbFacingDirectionPosition];
try {
RevHubOrientationOnRobot orientationOnRobot = new RevHubOrientationOnRobot(logo, usb);
imu.initialize(new IMU.Parameters(orientationOnRobot));
orientationIsValid = true;
} catch (IllegalArgumentException e) {
orientationIsValid = false;
}
}
}
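Both the bumper and trigger handlers above implement the same wrap-around cycling through an array of enum values. A minimal sketch of that pattern, with the direction count and helper name chosen here for illustration (they are not SDK code):

```java
// Wrap-around cycling through indices 0..last, as used for the
// logo/USB facing-direction arrays in the sample above.
public class DirectionCycler {
    // Move an index by +1 or -1, wrapping past either end.
    static int cycle(int position, int delta, int last) {
        int next = position + delta;
        if (next < 0)    return last; // wrapped below zero
        if (next > last) return 0;    // wrapped past the end
        return next;
    }

    public static void main(String[] args) {
        int last = 5; // e.g. six LogoFacingDirection values, indices 0..5
        System.out.println(cycle(0, -1, last)); // wraps to 5
        System.out.println(cycle(5, +1, last)); // wraps to 0
        System.out.println(cycle(2, +1, last)); // 3
    }
}
```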

/* Copyright (c) 2022 FIRST. All rights reserved.
*
* Redistribution and use in source and binary forms, with or without modification,
* are permitted (subject to the limitations in the disclaimer below) provided that
* the following conditions are met:
*
* Redistributions of source code must retain the above copyright notice, this list
* of conditions and the following disclaimer.
*
* Redistributions in binary form must reproduce the above copyright notice, this
* list of conditions and the following disclaimer in the documentation and/or
* other materials provided with the distribution.
*
* Neither the name of FIRST nor the names of its contributors may be used to endorse or
* promote products derived from this software without specific prior written permission.
*
* NO EXPRESS OR IMPLIED LICENSES TO ANY PARTY'S PATENT RIGHTS ARE GRANTED BY THIS
* LICENSE. THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
* "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO,
* THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE
* ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR CONTRIBUTORS BE LIABLE
* FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL
* DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR
* SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER
* CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY,
* OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
* OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
*/
package org.firstinspires.ftc.robotcontroller.external.samples;
import com.qualcomm.robotcore.eventloop.opmode.Disabled;
import com.qualcomm.robotcore.eventloop.opmode.LinearOpMode;
import com.qualcomm.robotcore.eventloop.opmode.TeleOp;
import com.qualcomm.robotcore.util.Range;
/*
* This OpMode illustrates how to use an external "hardware" class to modularize all the robot's sensors and actuators.
* This approach is very efficient because the same hardware class can be used by all of your teleop and autonomous OpModes
* without requiring many copy & paste operations. Once you have defined and tested the hardware class with one OpMode,
* it is instantly available to other OpModes.
*
* The real benefit of this approach is that as you tweak your robot hardware, you only need to make changes in ONE place (the Hardware Class).
 * So, to be effective you should put as much of your hardware setup and access code as possible in the hardware class.
* Essentially anything you do with hardware in BOTH Teleop and Auto should likely go in the hardware class.
*
* The Hardware Class is created in a separate file, and then an "instance" of this class is created in each OpMode.
* In order for the class to do typical OpMode things (like send telemetry data) it must be passed a reference to the
* OpMode object when it's created, so it can access all core OpMode functions. This is illustrated below.
*
* In this concept sample, the hardware class file is called RobotHardware.java and it must accompany this sample OpMode.
* So, if you copy ConceptExternalHardwareClass.java into TeamCode (using Android Studio or OnBotJava) then RobotHardware.java
* must also be copied to the same location (maintaining its name).
*
* For comparison purposes, this sample and its accompanying hardware class duplicates the functionality of the
 * RobotTeleopPOV_Linear OpMode. It assumes three motors (left_drive, right_drive and arm) and two servos (left_hand and right_hand)
*
* View the RobotHardware.java class file for more details
*
* Use Android Studio to Copy this Class, and Paste it into your team's code folder with a new name.
* Remove or comment out the @Disabled line to add this OpMode to the Driver Station OpMode list
*
* In OnBot Java, add a new OpMode, select this sample, and select TeleOp.
* Also add another new file named RobotHardware.java, select the sample with that name, and select Not an OpMode.
*/
@TeleOp(name="Concept: Robot Hardware Class", group="Robot")
@Disabled
public class ConceptExternalHardwareClass extends LinearOpMode {
// Create a RobotHardware object to be used to access robot hardware.
// Prefix any hardware functions with "robot." to access this class.
RobotHardware robot = new RobotHardware(this);
@Override
public void runOpMode() {
double drive = 0;
double turn = 0;
double arm = 0;
double handOffset = 0;
// initialize all the hardware, using the hardware class. See how clean and simple this is?
robot.init();
// Send telemetry message to signify robot waiting;
// Wait for the game to start (driver presses PLAY)
waitForStart();
// run until the end of the match (driver presses STOP)
while (opModeIsActive()) {
// Run wheels in POV mode (note: The joystick goes negative when pushed forward, so negate it)
// In this mode the Left stick moves the robot fwd and back, the Right stick turns left and right.
// This way it's also easy to just drive straight, or just turn.
drive = -gamepad1.left_stick_y;
turn = gamepad1.right_stick_x;
// Combine drive and turn for blended motion. Use RobotHardware class
robot.driveRobot(drive, turn);
// Use gamepad left & right Bumpers to open and close the claw
// Use the SERVO constants defined in RobotHardware class.
// Each time around the loop, the servos will move by a small amount.
// Limit the total offset to half of the full travel range
if (gamepad1.right_bumper)
handOffset += robot.HAND_SPEED;
else if (gamepad1.left_bumper)
handOffset -= robot.HAND_SPEED;
handOffset = Range.clip(handOffset, -0.5, 0.5);
// Move both servos to new position. Use RobotHardware class
robot.setHandPositions(handOffset);
// Use gamepad buttons to move arm up (Y) and down (A)
// Use the MOTOR constants defined in RobotHardware class.
if (gamepad1.y)
arm = robot.ARM_UP_POWER;
else if (gamepad1.a)
arm = robot.ARM_DOWN_POWER;
else
arm = 0;
robot.setArmPower(arm);
// Send telemetry messages to explain controls and show robot status
telemetry.addData("Drive", "Left Stick");
telemetry.addData("Turn", "Right Stick");
telemetry.addData("Arm Up/Down", "Y & A Buttons");
telemetry.addData("Hand Open/Closed", "Left and Right Bumpers");
telemetry.addData("-", "-------");
telemetry.addData("Drive Power", "%.2f", drive);
telemetry.addData("Turn Power", "%.2f", turn);
telemetry.addData("Arm Power", "%.2f", arm);
telemetry.addData("Hand Position", "Offset = %.2f", handOffset);
telemetry.update();
// Pace this loop so hands move at a reasonable speed.
sleep(50);
}
}
}
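The comment block above describes passing the OpMode into the hardware class so the latter can reach core OpMode functions. A minimal standalone sketch of that ownership pattern, with the SDK types replaced by stand-ins so it runs anywhere (class and method names here are illustrative, not the sample's actual RobotHardware API):

```java
// Stand-in for a LinearOpMode: owns the hardware object and hands it
// a reference to itself, mirroring "new RobotHardware(this)" above.
class MyOpMode {
    final MyRobotHardware robot = new MyRobotHardware(this);
    String lastMessage = "";

    void log(String msg) { lastMessage = msg; } // stand-in for telemetry
}

// Stand-in for the external hardware class: keeps the OpMode reference
// so init() can use OpMode facilities (hardwareMap, telemetry) later.
class MyRobotHardware {
    private final MyOpMode opMode;

    MyRobotHardware(MyOpMode opMode) { this.opMode = opMode; }

    void init() {
        // A real hardware class would map motors/servos here via opMode.hardwareMap.
        opMode.log("Hardware Initialized");
    }
}

public class HardwareClassSketch {
    public static void main(String[] args) {
        MyOpMode op = new MyOpMode();
        op.robot.init();
        System.out.println(op.lastMessage);
    }
}
```

The key design point is that the hardware class holds a reference back to its owning OpMode, so every OpMode that instantiates it gets telemetry and hardware mapping for free.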

package org.firstinspires.ftc.robotcontroller.external.samples;
import com.qualcomm.robotcore.eventloop.opmode.Disabled;
import com.qualcomm.robotcore.eventloop.opmode.LinearOpMode;
import com.qualcomm.robotcore.eventloop.opmode.TeleOp;
import com.qualcomm.robotcore.hardware.Gamepad;
import com.qualcomm.robotcore.util.ElapsedTime;
/*
* This OpMode illustrates using the rumble feature of many gamepads.
*
* Note: Some gamepads "rumble" better than others.
* The Xbox & PS4 have a left (rumble1) and right (rumble2) rumble motor.
 * These two gamepads have two distinct rumble modes: large on the left, and small on the right.
* The Etpark gamepad may only respond to rumble1, and may only run at full power.
* The Logitech F310 gamepad does not have *any* rumble ability.
*
* Moral: You should use this sample to experiment with your specific gamepads to explore their rumble features.
*
* The rumble motors are accessed through the standard gamepad1 and gamepad2 objects.
* Several new methods were added to the Gamepad class in FTC SDK Rev 7
* The key methods are as follows:
*
* .rumble(double rumble1, double rumble2, int durationMs)
* This method sets the rumble power of both motors for a specific time duration.
* Both rumble arguments are motor-power levels in the 0.0 to 1.0 range.
* durationMs is the desired length of the rumble action in milliseconds.
* This method returns immediately.
* Note:
* Use a durationMs of Gamepad.RUMBLE_DURATION_CONTINUOUS to provide a continuous rumble
* Use a power of 0, or duration of 0 to stop a rumble.
*
* .rumbleBlips(int count) allows an easy way to signal the driver with a series of rumble blips.
* Just specify how many blips you want.
* This method returns immediately.
*
* .runRumbleEffect(customRumbleEffect) allows you to run a custom rumble sequence that you have
* built using the Gamepad.RumbleEffect.Builder()
* A "custom effect" is a sequence of steps, where each step can rumble any of the
* rumble motors for a specific period at a specific power level.
* The Custom Effect will play as the (un-blocked) OpMode continues to run
*
* .isRumbling() returns true if there is a rumble effect in progress.
* Use this call to prevent stepping on a current rumble.
*
* .stopRumble() Stop any ongoing rumble or custom rumble effect.
*
* .rumble(int durationMs) Full power rumble for fixed duration.
*
* Note: Whenever a new Rumble command is issued, any currently executing rumble action will
* be truncated, and the new action started immediately. Take these precautions:
* 1) Do Not SPAM the rumble motors by issuing rapid fire commands
* 2) Multiple sources for rumble commands must coordinate to avoid tromping on each other.
*
 * This can be achieved in several ways:
* 1) Only having one source for rumble actions
* 2) Issuing rumble commands on transitions, rather than states.
* e.g. The moment a touch sensor is pressed, rather than the entire time it is being pressed.
* 3) Scheduling rumble commands based on timed events. e.g. 10 seconds prior to endgame
* 4) Rumble on non-overlapping mechanical actions. e.g. arm fully-extended or fully-retracted.
* 5) Use isRumbling() to hold off on a new rumble if one is already in progress.
*
 * The examples shown here are representative of how to invoke a gamepad rumble.
* It is assumed that these will be modified to suit the specific robot and team strategy needs.
*
* ######## Read the telemetry display on the Driver Station Screen for instructions. ######
*
* Ex 1) This example shows a) how to create a custom rumble effect, and then b) how to trigger it based
* on game time. One use for this might be to alert the driver that half-time or End-game is approaching.
*
* Ex 2) This example shows tying the rumble power to a changing sensor value.
* In this case it is the Gamepad trigger, but it could be any sensor output scaled to the 0-1 range.
* Since it takes over the rumble motors, it is only performed when the Left Bumper is pressed.
* Note that this approach MUST include a way to turn OFF the rumble when the button is released.
*
* Ex 3) This example shows a simple way to trigger a 3-blip sequence. In this case it is
* triggered by the gamepad A (Cross) button, but it could be any sensor, like a touch or light sensor.
* Note that this code ensures that it only rumbles once when the input goes true.
*
* Ex 4) This example shows how to trigger a single rumble when an input value gets over a certain value.
* In this case it is reading the Right Trigger, but it could be any variable sensor, like a
* range sensor, or position sensor. The code needs to ensure that it is only triggered once, so
* it waits till the sensor drops back below the threshold before it can trigger again.
*
* Use Android Studio to Copy this Class, and Paste it into your team's code folder with a new name.
* Remove or comment out the @Disabled line to add this OpMode to the Driver Station OpMode list.
*/
@Disabled
@TeleOp(name="Concept: Gamepad Rumble", group ="Concept")
public class ConceptGamepadRumble extends LinearOpMode
{
boolean lastA = false; // Use to track the prior button state.
boolean lastLB = false; // Use to track the prior button state.
boolean highLevel = false; // used to prevent multiple level-based rumbles.
boolean secondHalf = false; // Use to prevent multiple half-time warning rumbles.
Gamepad.RumbleEffect customRumbleEffect; // Use to build a custom rumble sequence.
ElapsedTime runtime = new ElapsedTime(); // Use to determine when end game is starting.
final double HALF_TIME = 60.0; // Wait this many seconds before rumble-alert for half-time.
final double TRIGGER_THRESHOLD = 0.75; // Squeeze more than 3/4 to get rumble.
@Override
public void runOpMode()
{
// Example 1. a) start by creating a three-pulse rumble sequence: right, LEFT, LEFT
customRumbleEffect = new Gamepad.RumbleEffect.Builder()
.addStep(0.0, 1.0, 500) // Rumble right motor 100% for 500 mSec
.addStep(0.0, 0.0, 300) // Pause for 300 mSec
.addStep(1.0, 0.0, 250) // Rumble left motor 100% for 250 mSec
.addStep(0.0, 0.0, 250) // Pause for 250 mSec
.addStep(1.0, 0.0, 250) // Rumble left motor 100% for 250 mSec
.build();
telemetry.addData(">", "Press Start");
telemetry.update();
waitForStart();
runtime.reset(); // Start game timer.
// Loop while monitoring buttons for rumble triggers
while (opModeIsActive())
{
// Read and save the current gamepad button states.
boolean currentA = gamepad1.a ;
boolean currentLB = gamepad1.left_bumper ;
// Display the current Rumble status. Just for interest.
telemetry.addData(">", "Are we RUMBLING? %s\n", gamepad1.isRumbling() ? "YES" : "no" );
// ----------------------------------------------------------------------------------------
// Example 1. b) Watch the runtime timer, and run the custom rumble when we hit half-time.
// Make sure we only signal once by setting "secondHalf" flag to prevent further rumbles.
// ----------------------------------------------------------------------------------------
if ((runtime.seconds() > HALF_TIME) && !secondHalf) {
gamepad1.runRumbleEffect(customRumbleEffect);
secondHalf = true;
}
// Display the time remaining while we are still counting down.
if (!secondHalf) {
telemetry.addData(">", "Halftime Alert Countdown: %3.0f Sec \n", (HALF_TIME - runtime.seconds()) );
}
// ----------------------------------------------------------------------------------------
// Example 2. If Left Bumper is being pressed, power the rumble motors based on the two trigger depressions.
// This is useful to see how the rumble feels at various power levels.
// ----------------------------------------------------------------------------------------
if (currentLB) {
// Left Bumper is being pressed, so send left and right "trigger" values to left and right rumble motors.
gamepad1.rumble(gamepad1.left_trigger, gamepad1.right_trigger, Gamepad.RUMBLE_DURATION_CONTINUOUS);
// Show what is being sent to rumbles
telemetry.addData(">", "Squeeze triggers to control rumbles");
telemetry.addData("> : Rumble", "Left: %.0f%% Right: %.0f%%", gamepad1.left_trigger * 100, gamepad1.right_trigger * 100);
} else {
// Make sure rumble is turned off when Left Bumper is released (only one time each press)
if (lastLB) {
gamepad1.stopRumble();
}
// Prompt for manual rumble action
telemetry.addData(">", "Hold Left-Bumper to test Manual Rumble");
telemetry.addData(">", "Press A (Cross) for three blips");
telemetry.addData(">", "Squeeze right trigger slowly for 1 blip");
}
lastLB = currentLB; // remember the current button state for next time around the loop
// ----------------------------------------------------------------------------------------
// Example 3. Blip 3 times at the moment that A (Cross) is pressed. (look for pressed transition)
// BUT !!! Skip it altogether if the Gamepad is already rumbling.
// ----------------------------------------------------------------------------------------
if (currentA && !lastA) {
if (!gamepad1.isRumbling()) // Check for possible overlap of rumbles.
gamepad1.rumbleBlips(3);
}
lastA = currentA; // remember the current button state for next time around the loop
// ----------------------------------------------------------------------------------------
// Example 4. Rumble once when gamepad right trigger goes above the THRESHOLD.
// ----------------------------------------------------------------------------------------
if (gamepad1.right_trigger > TRIGGER_THRESHOLD) {
if (!highLevel) {
gamepad1.rumble(0.9, 0, 200); // 200 mSec burst on left motor.
highLevel = true; // Hold off any more triggers
}
} else {
highLevel = false; // We can trigger again now.
}
// Send the telemetry data to the Driver Station, and then pause to pace the program.
telemetry.update();
sleep(10);
}
}
}
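Examples 3 and 4 above both rely on acting only when an input *transitions*, not while it is held. The rising-edge bookkeeping (the `lastA`/`currentA` pair) can be factored into a tiny reusable helper; this sketch is self-contained and the `EdgeDetector` name is ours, not an SDK class.

```java
// Rising-edge detector: fires exactly once per press, on the
// false -> true transition, matching Example 3's lastA/currentA logic.
public class EdgeDetector {
    private boolean last = false;

    boolean risingEdge(boolean current) {
        boolean fired = current && !last;
        last = current;
        return fired;
    }

    public static void main(String[] args) {
        EdgeDetector button = new EdgeDetector();
        System.out.println(button.risingEdge(true));  // true  (new press)
        System.out.println(button.risingEdge(true));  // false (still held)
        System.out.println(button.risingEdge(false)); // false (released)
        System.out.println(button.risingEdge(true));  // true  (pressed again)
    }
}
```

Calling `risingEdge(gamepad1.a)` once per loop would replace the manual `currentA && !lastA` check and the trailing `lastA = currentA` assignment.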

package org.firstinspires.ftc.robotcontroller.external.samples;
import com.qualcomm.robotcore.eventloop.opmode.Disabled;
import com.qualcomm.robotcore.eventloop.opmode.LinearOpMode;
import com.qualcomm.robotcore.eventloop.opmode.TeleOp;
import org.firstinspires.ftc.robotcore.external.Telemetry;
/*
* This OpMode illustrates using the touchpad feature found on some gamepads.
*
* The Sony PS4 gamepad can detect two distinct touches on the central touchpad.
* Other gamepads with different touchpads may provide mixed results.
*
* The touchpads are accessed through the standard gamepad1 and gamepad2 objects.
* Several new members were added to the Gamepad class in FTC SDK Rev 7
*
* .touchpad_finger_1 returns true if at least one finger is detected.
* .touchpad_finger_1_x finger 1 X coordinate. Valid if touchpad_finger_1 is true
* .touchpad_finger_1_y finger 1 Y coordinate. Valid if touchpad_finger_1 is true
*
* .touchpad_finger_2 returns true if a second finger is detected
* .touchpad_finger_2_x finger 2 X coordinate. Valid if touchpad_finger_2 is true
* .touchpad_finger_2_y finger 2 Y coordinate. Valid if touchpad_finger_2 is true
*
 * Finger touches are reported with an X and Y coordinate in the following coordinate system.
*
* 1) X is the Horizontal axis, and Y is the vertical axis
* 2) The 0,0 origin is at the center of the touchpad.
* 3) 1.0, 1.0 is at the top right corner of the touchpad.
* 4) -1.0,-1.0 is at the bottom left corner of the touchpad.
*
* Use Android Studio to Copy this Class, and Paste it into your team's code folder with a new name.
* Remove or comment out the @Disabled line to add this OpMode to the Driver Station OpMode list.
*/
@Disabled
@TeleOp(name="Concept: Gamepad Touchpad", group ="Concept")
public class ConceptGamepadTouchpad extends LinearOpMode
{
@Override
public void runOpMode()
{
telemetry.setDisplayFormat(Telemetry.DisplayFormat.MONOSPACE);
telemetry.addData(">", "Press Start");
telemetry.update();
waitForStart();
while (opModeIsActive())
{
boolean finger = false;
// Display finger 1 x & y position if finger detected
if(gamepad1.touchpad_finger_1)
{
finger = true;
telemetry.addLine(String.format("Finger 1: x=%5.2f y=%5.2f\n", gamepad1.touchpad_finger_1_x, gamepad1.touchpad_finger_1_y));
}
// Display finger 2 x & y position if finger detected
if(gamepad1.touchpad_finger_2)
{
finger = true;
telemetry.addLine(String.format("Finger 2: x=%5.2f y=%5.2f\n", gamepad1.touchpad_finger_2_x, gamepad1.touchpad_finger_2_y));
}
if(!finger)
{
telemetry.addLine("No fingers");
}
telemetry.update();
sleep(10);
}
}
}

/* Copyright (c) 2019 Phil Malone. All rights reserved.
*
* Redistribution and use in source and binary forms, with or without modification,
* are permitted (subject to the limitations in the disclaimer below) provided that
* the following conditions are met:
*
* Redistributions of source code must retain the above copyright notice, this list
* of conditions and the following disclaimer.
*
* Redistributions in binary form must reproduce the above copyright notice, this
* list of conditions and the following disclaimer in the documentation and/or
* other materials provided with the distribution.
*
* Neither the name of FIRST nor the names of its contributors may be used to endorse or
* promote products derived from this software without specific prior written permission.
*
* NO EXPRESS OR IMPLIED LICENSES TO ANY PARTY'S PATENT RIGHTS ARE GRANTED BY THIS
* LICENSE. THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
* "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO,
* THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE
* ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR CONTRIBUTORS BE LIABLE
* FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL
* DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR
* SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER
* CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY,
* OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
* OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
*/
package org.firstinspires.ftc.robotcontroller.external.samples;
import com.qualcomm.hardware.lynx.LynxModule;
import com.qualcomm.robotcore.eventloop.opmode.Disabled;
import com.qualcomm.robotcore.eventloop.opmode.LinearOpMode;
import com.qualcomm.robotcore.eventloop.opmode.TeleOp;
import com.qualcomm.robotcore.hardware.DcMotorEx;
import com.qualcomm.robotcore.util.ElapsedTime;
import java.util.List;
/*
* This OpMode illustrates how to use the Expansion Hub's Bulk-Read feature to speed up control cycle times.
 * In this example there are 4 motors whose encoder positions and velocities need to be read.
* The sample is written to work with one or two expansion hubs, with no assumption as to where the motors are located.
*
* Three scenarios are tested:
* Cache Mode = OFF This is the normal default, where no cache is used, and every read produces a discrete transaction with
 * an expansion hub, which is the slowest approach, but guarantees that the value is as fresh (recent) as possible.
*
* Cache Mode = AUTO This mode will attempt to minimize the number of discrete read commands, by performing bulk-reads
* and then returning values that have been cached. The cache is updated automatically whenever any specific encoder is re-read.
* This mode will always return new data, but it may perform more bulk-reads than absolutely required.
* Extra reads will be performed if multiple encoder/velocity reads are performed on the same encoder in one control cycle.
* This mode is a good compromise between the OFF and MANUAL modes.
* Note: If there are significant user-program delays between encoder reads, the cached value may not be fresh (recent).
 * You can issue a clearBulkCache() call at any time to force a fresh bulk-read on the next encoder read.
*
* Cache Mode = MANUAL This mode requires the user's code to determine the best time to clear the cached bulk-read data.
* Well organized code will reset the cache once at the beginning of the control cycle, and then immediately read and store all the encoder values.
* This approach will produce the shortest cycle times, but it does require the user to manually clear the cache.
* Since NO automatic Bulk-Reads are performed, neglecting to clear the bulk cache will result in the same values being returned
* each time an encoder read is performed.
*
* -------------------------------------
*
* General tip to speed up your control cycles:
*
* No matter what method you use to read encoders and other inputs, you should try to
* avoid reading the same encoder input multiple times around a control loop.
 * Re-reading the same input forces extra hardware transactions, which slow down the control loop.
* The preferred method is to read all the required inputs ONCE at the beginning of the loop,
 * and save the values in variables that can be used by other parts of the control code.
*
* eg: if you are sending encoder positions to your telemetry display, putting a getCurrentPosition()
* call in the telemetry statement will force the code to go and get another copy which will take time.
 * It's much better to read the position into a variable once, and use that variable for control AND display.
* Reading saved variables takes no time at all.
*
* Once you put all your sensor reads at the beginning of the control cycle, it's very easy to use
* the bulk-read AUTO mode to streamline your cycle timing.
*/
@TeleOp (name = "Motor Bulk Reads", group = "Tests")
@Disabled
public class ConceptMotorBulkRead extends LinearOpMode {
final int TEST_CYCLES = 500; // Number of control cycles to run to determine cycle times.
private DcMotorEx m1, m2, m3, m4; // Motor Objects
private long e1, e2, e3, e4; // Encoder Values
private double v1, v2, v3, v4; // Velocities
// Cycle Times
double t1 = 0;
double t2 = 0;
double t3 = 0;
@Override
public void runOpMode() {
int cycles;
// Important Step 1: Make sure you use DcMotorEx when you instantiate your motors.
m1 = hardwareMap.get(DcMotorEx.class, "m1"); // Configure the robot to use these 4 motor names,
m2 = hardwareMap.get(DcMotorEx.class, "m2"); // or change these strings to match your existing Robot Configuration.
m3 = hardwareMap.get(DcMotorEx.class, "m3");
m4 = hardwareMap.get(DcMotorEx.class, "m4");
// Important Step 2: Get access to a list of Expansion Hub Modules to enable changing caching methods.
List<LynxModule> allHubs = hardwareMap.getAll(LynxModule.class);
ElapsedTime timer = new ElapsedTime();
telemetry.addData(">", "Press play to start tests");
telemetry.addData(">", "Test results will update for each access method.");
telemetry.update();
waitForStart();
// --------------------------------------------------------------------------------------
// Run control loop using legacy encoder reads
// In this mode, a single read is done for each encoder position, and a bulk read is done for each velocity read.
// This is the worst case scenario.
// This is the same as using LynxModule.BulkCachingMode.OFF
// --------------------------------------------------------------------------------------
displayCycleTimes("Test 1 of 3 (Wait for completion)");
timer.reset();
cycles = 0;
while (opModeIsActive() && (cycles++ < TEST_CYCLES)) {
e1 = m1.getCurrentPosition();
e2 = m2.getCurrentPosition();
e3 = m3.getCurrentPosition();
e4 = m4.getCurrentPosition();
v1 = m1.getVelocity();
v2 = m2.getVelocity();
v3 = m3.getVelocity();
v4 = m4.getVelocity();
// Put Control loop action code here.
}
// calculate the average cycle time.
t1 = timer.milliseconds() / cycles;
displayCycleTimes("Test 2 of 3 (Wait for completion)");
// --------------------------------------------------------------------------------------
// Run test cycles using AUTO cache mode
// In this mode, only one bulk read is done per cycle, UNLESS you read a specific encoder/velocity item AGAIN in that cycle.
// --------------------------------------------------------------------------------------
// Important Step 3: Option A. Set all Expansion hubs to use the AUTO Bulk Caching mode
for (LynxModule module : allHubs) {
module.setBulkCachingMode(LynxModule.BulkCachingMode.AUTO);
}
timer.reset();
cycles = 0;
while (opModeIsActive() && (cycles++ < TEST_CYCLES)) {
e1 = m1.getCurrentPosition(); // Uses 1 bulk-read for all 4 encoder/velocity reads,
e2 = m2.getCurrentPosition(); // but don't do any `get` operations more than once per cycle.
e3 = m3.getCurrentPosition();
e4 = m4.getCurrentPosition();
v1 = m1.getVelocity();
v2 = m2.getVelocity();
v3 = m3.getVelocity();
v4 = m4.getVelocity();
// Put Control loop action code here.
}
// calculate the average cycle time.
t2 = timer.milliseconds() / cycles;
displayCycleTimes("Test 3 of 3 (Wait for completion)");
// --------------------------------------------------------------------------------------
// Run test cycles using MANUAL cache mode
// In this mode, only one block read is done each control cycle.
// This is the MOST efficient method, but it does require that the cache is cleared manually each control cycle.
// --------------------------------------------------------------------------------------
// Important Step 3: Option B. Set all Expansion hubs to use the MANUAL Bulk Caching mode
for (LynxModule module : allHubs) {
module.setBulkCachingMode(LynxModule.BulkCachingMode.MANUAL);
}
timer.reset();
cycles = 0;
while (opModeIsActive() && (cycles++ < TEST_CYCLES)) {
// Important Step 4: If you are using MANUAL mode, you must clear the BulkCache once per control cycle
for (LynxModule module : allHubs) {
module.clearBulkCache();
}
e1 = m1.getCurrentPosition(); // Uses 1 bulk-read to obtain ALL the motor data
e2 = m2.getCurrentPosition(); // There is no penalty for doing more `get` operations in this cycle,
e3 = m3.getCurrentPosition(); // but they will return the same data.
e4 = m4.getCurrentPosition();
v1 = m1.getVelocity();
v2 = m2.getVelocity();
v3 = m3.getVelocity();
v4 = m4.getVelocity();
// Put Control loop action code here.
}
// calculate the average cycle time.
t3 = timer.milliseconds() / cycles;
displayCycleTimes("Complete");
// wait until op-mode is stopped by user, before clearing display.
while (opModeIsActive()) ;
}
// Display three comparison times.
void displayCycleTimes(String status) {
telemetry.addData("Testing", status);
telemetry.addData("Cache = OFF", "%5.1f mS/cycle", t1);
telemetry.addData("Cache = AUTO", "%5.1f mS/cycle", t2);
telemetry.addData("Cache = MANUAL", "%5.1f mS/cycle", t3);
telemetry.update();
}
}
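The "read inputs once per cycle" tip in the comment block above can be checked mechanically: route all hardware access through a single read at the top of the cycle and reuse the stored value everywhere else. A minimal sketch in plain Java, using a hypothetical Encoder interface as a stand-in for DcMotorEx (this is not the FTC API, just an illustration of the pattern):

```java
// Hypothetical stand-in for a motor encoder (NOT the FTC DcMotorEx API).
interface Encoder {
    int getCurrentPosition();
}

// Test double that counts how many hardware reads actually happen.
class CountingEncoder implements Encoder {
    int reads = 0;
    private final int position;
    CountingEncoder(int position) { this.position = position; }
    public int getCurrentPosition() { reads++; return position; }
}

class ReadOnceCycle {
    // One control cycle: read the encoder ONCE, then reuse the saved value
    // for both the control calculation and the telemetry display.
    static String run(Encoder enc) {
        int pos = enc.getCurrentPosition();      // single hardware read
        int command = pos / 2;                   // placeholder control math
        return "pos=" + pos + " cmd=" + command; // placeholder telemetry line
    }
}
```

Because the cycle touches the hardware exactly once, switching the hubs to AUTO or MANUAL bulk caching then collapses that one read per device into a single bulk transaction per cycle.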

/* Copyright (c) 2017 FIRST. All rights reserved.
*
* Redistribution and use in source and binary forms, with or without modification,
* are permitted (subject to the limitations in the disclaimer below) provided that
* the following conditions are met:
*
* Redistributions of source code must retain the above copyright notice, this list
* of conditions and the following disclaimer.
*
* Redistributions in binary form must reproduce the above copyright notice, this
* list of conditions and the following disclaimer in the documentation and/or
* other materials provided with the distribution.
*
* Neither the name of FIRST nor the names of its contributors may be used to endorse or
* promote products derived from this software without specific prior written permission.
*
* NO EXPRESS OR IMPLIED LICENSES TO ANY PARTY'S PATENT RIGHTS ARE GRANTED BY THIS
* LICENSE. THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
* "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO,
* THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE
* ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR CONTRIBUTORS BE LIABLE
* FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL
* DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR
* SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER
* CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY,
* OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
* OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
*/
package org.firstinspires.ftc.robotcontroller.external.samples;
import com.qualcomm.robotcore.eventloop.opmode.Disabled;
import com.qualcomm.robotcore.eventloop.opmode.OpMode;
import com.qualcomm.robotcore.eventloop.opmode.TeleOp;
import com.qualcomm.robotcore.util.ElapsedTime;
/*
* Demonstrates an empty iterative OpMode
*/
@TeleOp(name = "Concept: NullOp", group = "Concept")
@Disabled
public class ConceptNullOp extends OpMode {
private ElapsedTime runtime = new ElapsedTime();
/**
* This method will be called once, when the INIT button is pressed.
*/
@Override
public void init() {
telemetry.addData("Status", "Initialized");
}
/**
* This method will be called repeatedly during the period between when
* the init button is pressed and when the play button is pressed (or the
* OpMode is stopped).
*/
@Override
public void init_loop() {
}
/**
* This method will be called once, when the play button is pressed.
*/
@Override
public void start() {
runtime.reset();
}
/**
* This method will be called repeatedly during the period between when
* the play button is pressed and when the OpMode is stopped.
*/
@Override
public void loop() {
telemetry.addData("Status", "Run Time: " + runtime.toString());
}
/**
* This method will be called once, when this OpMode is stopped.
* <p>
* Your ability to control hardware from this method will be limited.
*/
@Override
public void stop() {
}
}

/* Copyright (c) 2017 FIRST. All rights reserved.
*
* Redistribution and use in source and binary forms, with or without modification,
* are permitted (subject to the limitations in the disclaimer below) provided that
* the following conditions are met:
*
* Redistributions of source code must retain the above copyright notice, this list
* of conditions and the following disclaimer.
*
* Redistributions in binary form must reproduce the above copyright notice, this
* list of conditions and the following disclaimer in the documentation and/or
* other materials provided with the distribution.
*
* Neither the name of FIRST nor the names of its contributors may be used to endorse or
* promote products derived from this software without specific prior written permission.
*
* NO EXPRESS OR IMPLIED LICENSES TO ANY PARTY'S PATENT RIGHTS ARE GRANTED BY THIS
* LICENSE. THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
* "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO,
* THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE
* ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR CONTRIBUTORS BE LIABLE
* FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL
* DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR
* SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER
* CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY,
* OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
* OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
*/
package org.firstinspires.ftc.robotcontroller.external.samples;
import com.qualcomm.robotcore.eventloop.opmode.Disabled;
import com.qualcomm.robotcore.eventloop.opmode.LinearOpMode;
import com.qualcomm.robotcore.eventloop.opmode.TeleOp;
import com.qualcomm.robotcore.hardware.DcMotor;
/*
* This OpMode ramps a single motor speed up and down repeatedly until Stop is pressed.
* The code is structured as a LinearOpMode
*
* This code assumes a DC motor configured with the name "left_drive" as is found on a Robot.
*
* INCREMENT sets how much to increase/decrease the power each cycle
* CYCLE_MS sets the update period.
*
* Use Android Studio to Copy this Class, and Paste it into your team's code folder with a new name.
* Remove or comment out the @Disabled line to add this OpMode to the Driver Station OpMode list
*/
@TeleOp(name = "Concept: Ramp Motor Speed", group = "Concept")
@Disabled
public class ConceptRampMotorSpeed extends LinearOpMode {
static final double INCREMENT = 0.01; // amount to ramp motor each CYCLE_MS cycle
static final int CYCLE_MS = 50; // period of each cycle
static final double MAX_FWD = 1.0; // Maximum FWD power applied to motor
static final double MAX_REV = -1.0; // Maximum REV power applied to motor
// Define class members
DcMotor motor;
double power = 0;
boolean rampUp = true;
@Override
public void runOpMode() {
// Connect to motor (Assume standard left wheel)
// Change the text in quotes to match any motor name on your robot.
motor = hardwareMap.get(DcMotor.class, "left_drive");
// Wait for the start button
telemetry.addData(">", "Press Start to run Motors." );
telemetry.update();
waitForStart();
// Ramp motor speeds till stop pressed.
while(opModeIsActive()) {
// Ramp the motors, according to the rampUp variable.
if (rampUp) {
// Keep stepping up until we hit the max value.
power += INCREMENT ;
if (power >= MAX_FWD ) {
power = MAX_FWD;
rampUp = !rampUp; // Switch ramp direction
}
}
else {
// Keep stepping down until we hit the min value.
power -= INCREMENT ;
if (power <= MAX_REV ) {
power = MAX_REV;
rampUp = !rampUp; // Switch ramp direction
}
}
// Display the current value
telemetry.addData("Motor Power", "%5.2f", power);
telemetry.addData(">", "Press Stop to end test." );
telemetry.update();
// Set the motor to the new power and pause;
motor.setPower(power);
sleep(CYCLE_MS);
idle();
}
// Turn off motor and signal done;
motor.setPower(0);
telemetry.addData(">", "Done");
telemetry.update();
}
}
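The ramp logic above (step up to MAX_FWD, flip direction, step down to MAX_REV) is a pure state update, so it can be pulled out of the OpMode and unit-tested on its own. A minimal sketch in plain Java (TriangleRamp is a hypothetical helper name, not part of the FTC SDK):

```java
// Hypothetical helper (NOT part of the FTC SDK): steps a value back and
// forth between min and max in fixed increments, reversing at each limit.
class TriangleRamp {
    private double power = 0;
    private boolean rampUp = true;
    private final double increment, min, max;

    TriangleRamp(double increment, double min, double max) {
        this.increment = increment;
        this.min = min;
        this.max = max;
    }

    // Advance one cycle and return the new power value.
    double step() {
        if (rampUp) {
            power += increment;
            if (power >= max) { power = max; rampUp = false; } // hit top: reverse
        } else {
            power -= increment;
            if (power <= min) { power = min; rampUp = true; }  // hit bottom: reverse
        }
        return power;
    }
}
```

With an increment of 0.5 the sequence is 0.5, 1.0, 0.5, 0.0, -0.5, -1.0, -0.5, ... mirroring what the INCREMENT/MAX_FWD/MAX_REV loop above sends to the motor each CYCLE_MS.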

/* Copyright (c) 2018 FIRST. All rights reserved.
*
* Redistribution and use in source and binary forms, with or without modification,
* are permitted (subject to the limitations in the disclaimer below) provided that
* the following conditions are met:
*
* Redistributions of source code must retain the above copyright notice, this list
* of conditions and the following disclaimer.
*
* Redistributions in binary form must reproduce the above copyright notice, this
* list of conditions and the following disclaimer in the documentation and/or
* other materials provided with the distribution.
*
* Neither the name of FIRST nor the names of its contributors may be used to endorse or
* promote products derived from this software without specific prior written permission.
*
* NO EXPRESS OR IMPLIED LICENSES TO ANY PARTY'S PATENT RIGHTS ARE GRANTED BY THIS
* LICENSE. THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
* "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO,
* THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE
* ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR CONTRIBUTORS BE LIABLE
* FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL
* DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR
* SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER
* CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY,
* OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
* OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
*/
package org.firstinspires.ftc.robotcontroller.external.samples;
import com.qualcomm.robotcore.eventloop.opmode.Disabled;
import com.qualcomm.robotcore.eventloop.opmode.LinearOpMode;
import com.qualcomm.robotcore.eventloop.opmode.TeleOp;
import com.qualcomm.robotcore.hardware.DcMotorSimple;
import com.qualcomm.robotcore.util.ElapsedTime;
import com.qualcomm.robotcore.util.Range;
/*
* This OpMode executes a basic Tank Drive Teleop for a two wheeled robot using two REV SPARKminis.
* To use this example, connect two REV SPARKminis into servo ports on the Expansion Hub. On the
* robot configuration, use the drop down list under 'Servos' to select 'REV SPARKmini Controller'
* and name them 'left_drive' and 'right_drive'.
*
* Use Android Studio to Copy this Class, and Paste it into your team's code folder with a new name.
* Remove or comment out the @Disabled line to add this OpMode to the Driver Station OpMode list
*/
@TeleOp(name="REV SPARKmini Simple Drive Example", group="Concept")
@Disabled
public class ConceptRevSPARKMini extends LinearOpMode {
// Declare OpMode members.
private ElapsedTime runtime = new ElapsedTime();
private DcMotorSimple leftDrive = null;
private DcMotorSimple rightDrive = null;
@Override
public void runOpMode() {
telemetry.addData("Status", "Initialized");
telemetry.update();
// Initialize the hardware variables. Note that the strings used here as parameters
// to 'get' must correspond to the names assigned during the robot configuration
// step (using the FTC Robot Controller app on the phone).
leftDrive = hardwareMap.get(DcMotorSimple.class, "left_drive");
rightDrive = hardwareMap.get(DcMotorSimple.class, "right_drive");
// Most robots need the motor on one side to be reversed to drive forward
// Reverse the motor that runs backward when connected directly to the battery
leftDrive.setDirection(DcMotorSimple.Direction.FORWARD);
rightDrive.setDirection(DcMotorSimple.Direction.REVERSE);
// Wait for the game to start (driver presses PLAY)
waitForStart();
runtime.reset();
// run until the end of the match (driver presses STOP)
while (opModeIsActive()) {
// Setup a variable for each drive wheel to save power level for telemetry
double leftPower;
double rightPower;
// Choose to drive using either Tank Mode, or POV Mode
// Comment out the method that's not used. The default below is POV.
// POV Mode uses left stick to go forward, and right stick to turn.
// - This uses basic math to combine motions and is easier to drive straight.
double drive = -gamepad1.left_stick_y;
double turn = gamepad1.right_stick_x;
leftPower = Range.clip(drive + turn, -1.0, 1.0) ;
rightPower = Range.clip(drive - turn, -1.0, 1.0) ;
// Tank Mode uses one stick to control each wheel.
// - This requires no math, but it is hard to drive forward slowly and keep straight.
// leftPower = -gamepad1.left_stick_y ;
// rightPower = -gamepad1.right_stick_y ;
// Send calculated power to wheels
leftDrive.setPower(leftPower);
rightDrive.setPower(rightPower);
// Show the elapsed game time and wheel power.
telemetry.addData("Status", "Run Time: " + runtime.toString());
telemetry.addData("Motors", "left (%.2f), right (%.2f)", leftPower, rightPower);
telemetry.update();
}
}
}

/* Copyright (c) 2017 FIRST. All rights reserved.
*
* Redistribution and use in source and binary forms, with or without modification,
* are permitted (subject to the limitations in the disclaimer below) provided that
* the following conditions are met:
*
* Redistributions of source code must retain the above copyright notice, this list
* of conditions and the following disclaimer.
*
* Redistributions in binary form must reproduce the above copyright notice, this
* list of conditions and the following disclaimer in the documentation and/or
* other materials provided with the distribution.
*
* Neither the name of FIRST nor the names of its contributors may be used to endorse or
* promote products derived from this software without specific prior written permission.
*
* NO EXPRESS OR IMPLIED LICENSES TO ANY PARTY'S PATENT RIGHTS ARE GRANTED BY THIS
* LICENSE. THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
* "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO,
* THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE
* ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR CONTRIBUTORS BE LIABLE
* FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL
* DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR
* SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER
* CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY,
* OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
* OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
*/
package org.firstinspires.ftc.robotcontroller.external.samples;
import com.qualcomm.robotcore.eventloop.opmode.Disabled;
import com.qualcomm.robotcore.eventloop.opmode.LinearOpMode;
import com.qualcomm.robotcore.eventloop.opmode.TeleOp;
import com.qualcomm.robotcore.hardware.Servo;
/*
* This OpMode scans a single servo back and forward until Stop is pressed.
* The code is structured as a LinearOpMode
* INCREMENT sets how much to increase/decrease the servo position each cycle
* CYCLE_MS sets the update period.
*
* This code assumes a Servo configured with the name "left_hand" as is found on a Robot.
*
* NOTE: When any servo position is set, ALL attached servos are activated, so ensure that any other
* connected servos are able to move freely before running this test.
*
* Use Android Studio to Copy this Class, and Paste it into your team's code folder with a new name.
* Remove or comment out the @Disabled line to add this OpMode to the Driver Station OpMode list
*/
@TeleOp(name = "Concept: Scan Servo", group = "Concept")
@Disabled
public class ConceptScanServo extends LinearOpMode {
static final double INCREMENT = 0.01; // amount to slew servo each CYCLE_MS cycle
static final int CYCLE_MS = 50; // period of each cycle
static final double MAX_POS = 1.0; // Maximum rotational position
static final double MIN_POS = 0.0; // Minimum rotational position
// Define class members
Servo servo;
double position = (MAX_POS + MIN_POS) / 2; // Start at the position midway between MIN_POS and MAX_POS
boolean rampUp = true;
@Override
public void runOpMode() {
// Connect to servo (Assume Robot Left Hand)
// Change the text in quotes to match any servo name on your robot.
servo = hardwareMap.get(Servo.class, "left_hand");
// Wait for the start button
telemetry.addData(">", "Press Start to scan Servo." );
telemetry.update();
waitForStart();
// Scan servo till stop pressed.
while(opModeIsActive()){
// slew the servo, according to the rampUp (direction) variable.
if (rampUp) {
// Keep stepping up until we hit the max value.
position += INCREMENT ;
if (position >= MAX_POS ) {
position = MAX_POS;
rampUp = !rampUp; // Switch ramp direction
}
}
else {
// Keep stepping down until we hit the min value.
position -= INCREMENT ;
if (position <= MIN_POS ) {
position = MIN_POS;
rampUp = !rampUp; // Switch ramp direction
}
}
// Display the current value
telemetry.addData("Servo Position", "%5.2f", position);
telemetry.addData(">", "Press Stop to end test." );
telemetry.update();
// Set the servo to the new position and pause;
servo.setPosition(position);
sleep(CYCLE_MS);
idle();
}
// Signal done;
telemetry.addData(">", "Done");
telemetry.update();
}
}

/* Copyright (c) 2018 FIRST. All rights reserved.
*
* Redistribution and use in source and binary forms, with or without modification,
* are permitted (subject to the limitations in the disclaimer below) provided that
* the following conditions are met:
*
* Redistributions of source code must retain the above copyright notice, this list
* of conditions and the following disclaimer.
*
* Redistributions in binary form must reproduce the above copyright notice, this
* list of conditions and the following disclaimer in the documentation and/or
* other materials provided with the distribution.
*
* Neither the name of FIRST nor the names of its contributors may be used to endorse or
* promote products derived from this software without specific prior written permission.
*
* NO EXPRESS OR IMPLIED LICENSES TO ANY PARTY'S PATENT RIGHTS ARE GRANTED BY THIS
* LICENSE. THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
* "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO,
* THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE
* ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR CONTRIBUTORS BE LIABLE
* FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL
* DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR
* SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER
* CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY,
* OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
* OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
*/
package org.firstinspires.ftc.robotcontroller.external.samples;
import com.qualcomm.ftccommon.SoundPlayer;
import com.qualcomm.robotcore.eventloop.opmode.Disabled;
import com.qualcomm.robotcore.eventloop.opmode.LinearOpMode;
import com.qualcomm.robotcore.eventloop.opmode.TeleOp;
/*
* This OpMode demonstrates how to play simple sounds on both the RC and DS phones.
* It illustrates how to build sounds into your application as a resource.
* This technique is best suited for use with Android Studio since it assumes you will be creating a new application
*
* If you are using OnBotJava, please see the ConceptSoundsOnBotJava sample
*
* Use Android Studio to Copy this Class, and Paste it into your team's code folder with a new name.
* Remove or comment out the @Disabled line to add this OpMode to the Driver Station OpMode list
*
* Operation:
*
* Gamepad X & B buttons are used to trigger sounds in this example, but any event can be used.
* Note: Time should be allowed for sounds to complete before playing other sounds.
*
* For sound files to be used as a compiled-in resource, they need to be located in a folder called "raw" under your "res" (resources) folder.
* You can create your own "raw" folder from scratch, or you can copy the one from the FtcRobotController module.
*
* Android Studio coders will ultimately need a folder in your path as follows:
* <project root>/TeamCode/src/main/res/raw
*
* Copy any .wav files you want to play into this folder.
* Make sure that your files ONLY use lower-case characters, and have no spaces or special characters other than underscore.
*
* The name you give your .wav files will become the resource ID for these sounds.
* eg: gold.wav becomes R.raw.gold
*
* If you wish to use the sounds provided for this sample, they are located in:
* <project root>/FtcRobotController/src/main/res/raw
* You can copy and paste the entire 'raw' folder using Android Studio.
*
*/
@TeleOp(name="Concept: Sound Resources", group="Concept")
@Disabled
public class ConceptSoundsASJava extends LinearOpMode {
// Declare OpMode members.
private boolean goldFound; // Sound file present flags
private boolean silverFound;
private boolean isX = false; // Gamepad button state variables
private boolean isB = false;
private boolean wasX = false; // Gamepad button history variables
private boolean wasB = false;
@Override
public void runOpMode() {
// Determine Resource IDs for sounds built into the RC application.
int silverSoundID = hardwareMap.appContext.getResources().getIdentifier("silver", "raw", hardwareMap.appContext.getPackageName());
int goldSoundID = hardwareMap.appContext.getResources().getIdentifier("gold", "raw", hardwareMap.appContext.getPackageName());
// Determine if sound resources are found.
// Note: Preloading is NOT required, but it's a good way to verify all your sounds are available before you run.
if (goldSoundID != 0)
goldFound = SoundPlayer.getInstance().preload(hardwareMap.appContext, goldSoundID);
if (silverSoundID != 0)
silverFound = SoundPlayer.getInstance().preload(hardwareMap.appContext, silverSoundID);
// Display sound status
telemetry.addData("gold resource", goldFound ? "Found" : "NOT found\n Add gold.wav to /src/main/res/raw" );
telemetry.addData("silver resource", silverFound ? "Found" : "NOT found\n Add silver.wav to /src/main/res/raw" );
// Wait for the game to start (driver presses PLAY)
telemetry.addData(">", "Press Start to continue");
telemetry.update();
waitForStart();
telemetry.addData(">", "Press X, B to play sounds.");
telemetry.update();
// run until the end of the match (driver presses STOP)
while (opModeIsActive()) {
// say Silver each time gamepad X is pressed (This sound is a resource)
if (silverFound && (isX = gamepad1.x) && !wasX) {
SoundPlayer.getInstance().startPlaying(hardwareMap.appContext, silverSoundID);
telemetry.addData("Playing", "Resource Silver");
telemetry.update();
}
// say Gold each time gamepad B is pressed (This sound is a resource)
if (goldFound && (isB = gamepad1.b) && !wasB) {
SoundPlayer.getInstance().startPlaying(hardwareMap.appContext, goldSoundID);
telemetry.addData("Playing", "Resource Gold");
telemetry.update();
}
// Save last button states
wasX = isX;
wasB = isB;
}
}
}
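The X/B handling above keeps a current-state flag and a last-state flag per button so each press fires a sound exactly once. That rising-edge pattern can be factored into a small reusable helper; a minimal sketch in plain Java (EdgeDetector is a hypothetical name, not part of the FTC SDK):

```java
// Hypothetical helper (NOT part of the FTC SDK): reports true only on the
// cycle where a button transitions from released (false) to pressed (true).
class EdgeDetector {
    private boolean last = false;

    boolean risingEdge(boolean current) {
        boolean edge = current && !last; // pressed now, but not last cycle
        last = current;                  // save state for the next cycle
        return edge;
    }
}
```

In a TeleOp loop, one detector per button replaces each current/history flag pair, e.g. `if (xEdge.risingEdge(gamepad1.x)) { /* play sound */ }`.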

/* Copyright (c) 2018 FIRST. All rights reserved.
*
* Redistribution and use in source and binary forms, with or without modification,
* are permitted (subject to the limitations in the disclaimer below) provided that
* the following conditions are met:
*
* Redistributions of source code must retain the above copyright notice, this list
* of conditions and the following disclaimer.
*
* Redistributions in binary form must reproduce the above copyright notice, this
* list of conditions and the following disclaimer in the documentation and/or
* other materials provided with the distribution.
*
* Neither the name of FIRST nor the names of its contributors may be used to endorse or
* promote products derived from this software without specific prior written permission.
*
* NO EXPRESS OR IMPLIED LICENSES TO ANY PARTY'S PATENT RIGHTS ARE GRANTED BY THIS
* LICENSE. THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
* "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO,
* THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE
* ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR CONTRIBUTORS BE LIABLE
* FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL
* DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR
* SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER
* CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY,
* OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
* OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
*/
package org.firstinspires.ftc.robotcontroller.external.samples;
import com.qualcomm.ftccommon.SoundPlayer;
import com.qualcomm.robotcore.eventloop.opmode.Disabled;
import com.qualcomm.robotcore.eventloop.opmode.LinearOpMode;
import com.qualcomm.robotcore.eventloop.opmode.TeleOp;
import java.io.File;
/*
* This OpMode demonstrates how to play simple sounds on both the RC and DS phones.
* It illustrates how to play sound files that have been copied to the RC phone.
* This technique is best suited for use with OnBotJava since it does not require the app to be modified.
*
* Operation:
*
* Gamepad X & B buttons are used to trigger sounds in this example, but any event can be used.
* Note: Time should be allowed for sounds to complete before playing other sounds.
*
* To play a new sound, you will need to copy the .wav files to the phone, and then provide the full path to them in your OpMode.
* This is done in this sample for the two sound files, silver.wav and gold.wav.
*
* You can put the files in a variety of locations, but we recommend the /FIRST/blocks/sounds folder.
* Your OpModes have guaranteed access to this folder, and you can transfer files into it using the BLOCKS web page.
* -- There is a link called "sounds" on the right hand side of the color bar on the BLOCKS page that sends sound files to this folder by default.
* Alternatively, you can use Windows File Explorer or ADB to transfer the sound files.
*
* To get full use of THIS sample, you will need to copy two sound files, silver.wav and gold.wav, to /FIRST/blocks/sounds on the RC phone.
* They can be located here:
* https://github.com/ftctechnh/ftc_app/tree/master/FtcRobotController/src/main/res/raw/gold.wav
* https://github.com/ftctechnh/ftc_app/tree/master/FtcRobotController/src/main/res/raw/silver.wav
*/
@TeleOp(name="Concept: Sound Files", group="Concept")
@Disabled
public class ConceptSoundsOnBotJava extends LinearOpMode {
// Point to sound files on the phone's drive
private String soundPath = "/FIRST/blocks/sounds";
private File goldFile = new File("/sdcard" + soundPath + "/gold.wav");
private File silverFile = new File("/sdcard" + soundPath + "/silver.wav");
// Declare OpMode members.
private boolean isX = false; // Gamepad button state variables
private boolean isB = false;
private boolean wasX = false; // Gamepad button history variables
private boolean WasB = false;
@Override
public void runOpMode() {
// Make sure that the sound files exist on the phone
boolean goldFound = goldFile.exists();
boolean silverFound = silverFile.exists();
// Display sound status
telemetry.addData("gold sound", goldFound ? "Found" : "NOT Found \nCopy gold.wav to " + soundPath );
telemetry.addData("silver sound", silverFound ? "Found" : "NOT Found \nCopy silver.wav to " + soundPath );
// Wait for the game to start (driver presses PLAY)
telemetry.addData(">", "Press Start to continue");
telemetry.update();
waitForStart();
telemetry.addData(">", "Press X or B to play sounds.");
telemetry.update();
// run until the end of the match (driver presses STOP)
while (opModeIsActive()) {
// say Silver each time gamepad X is pressed (This sound is a file)
if (silverFound && (isX = gamepad1.x) && !wasX) {
SoundPlayer.getInstance().startPlaying(hardwareMap.appContext, silverFile);
telemetry.addData("Playing", "Silver File");
telemetry.update();
}
// say Gold each time gamepad B is pressed (This sound is a file)
if (goldFound && (isB = gamepad1.b) && !WasB) {
SoundPlayer.getInstance().startPlaying(hardwareMap.appContext, goldFile);
telemetry.addData("Playing", "Gold File");
telemetry.update();
}
// Save last button states
wasX = isX;
WasB = isB;
}
}
}
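ConceptSoundsOnBotJava above builds each sound's full path from three pieces: a storage root ("/sdcard"), the recommended sound folder, and the file name. A minimal sketch of that path construction (SoundPaths is an illustrative helper, not part of the SDK):

```java
// Builds the full path to a .wav sound file the way the OpMode above does:
// storage root + sound folder + "/" + base name + ".wav".
class SoundPaths {
    static String soundFilePath(String root, String soundPath, String name) {
        return root + soundPath + "/" + name + ".wav";
    }
}
```

The OpMode then wraps the resulting path in a java.io.File and calls exists() before attempting playback, so a missing file produces a telemetry message instead of a silent failure.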

/* Copyright (c) 2018 FIRST. All rights reserved.
*
* Redistribution and use in source and binary forms, with or without modification,
* are permitted (subject to the limitations in the disclaimer below) provided that
* the following conditions are met:
*
* Redistributions of source code must retain the above copyright notice, this list
* of conditions and the following disclaimer.
*
* Redistributions in binary form must reproduce the above copyright notice, this
* list of conditions and the following disclaimer in the documentation and/or
* other materials provided with the distribution.
*
* Neither the name of FIRST nor the names of its contributors may be used to endorse or
* promote products derived from this software without specific prior written permission.
*
* NO EXPRESS OR IMPLIED LICENSES TO ANY PARTY'S PATENT RIGHTS ARE GRANTED BY THIS
* LICENSE. THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
* "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO,
* THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE
* ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR CONTRIBUTORS BE LIABLE
* FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL
* DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR
* SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER
* CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY,
* OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
* OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
*/
package org.firstinspires.ftc.robotcontroller.external.samples;
import android.content.Context;
import com.qualcomm.ftccommon.SoundPlayer;
import com.qualcomm.robotcore.eventloop.opmode.Disabled;
import com.qualcomm.robotcore.eventloop.opmode.LinearOpMode;
import com.qualcomm.robotcore.eventloop.opmode.TeleOp;
/*
* This OpMode demonstrates how to play one of the several SKYSTONE/Star Wars sounds loaded into the SDK.
* It does this by creating a simple "chooser" controlled by the gamepad Up Down buttons.
* This code also prevents sounds from stacking up by setting a "playing" flag, which is cleared when the sound finishes playing.
*
* Use Android Studio to Copy this Class, and Paste it into your team's code folder with a new name.
* Remove or comment out the @Disabled line to add this OpMode to the Driver Station OpMode list
*
* Operation:
* Use the DPAD to change the selected sound, and the Right Bumper to play it.
*/
@TeleOp(name="SKYSTONE Sounds", group="Concept")
@Disabled
public class ConceptSoundsSKYSTONE extends LinearOpMode {
// List of available sound resources
String[] sounds = {"ss_alarm", "ss_bb8_down", "ss_bb8_up", "ss_darth_vader", "ss_fly_by",
"ss_mf_fail", "ss_laser", "ss_laser_burst", "ss_light_saber", "ss_light_saber_long", "ss_light_saber_short",
"ss_light_speed", "ss_mine", "ss_power_up", "ss_r2d2_up", "ss_roger_roger", "ss_siren", "ss_wookie" };
boolean soundPlaying = false;
@Override
public void runOpMode() {
// Variables for choosing from the available sounds
int soundIndex = 0;
int soundID = -1;
boolean was_dpad_up = false;
boolean was_dpad_down = false;
Context myApp = hardwareMap.appContext;
// create a sound parameter that holds the desired player parameters.
SoundPlayer.PlaySoundParams params = new SoundPlayer.PlaySoundParams();
params.loopControl = 0;
params.waitForNonLoopingSoundsToFinish = true;
// In this sample, we will skip waiting for the user to press play, and start displaying sound choices right away
while (!isStopRequested()) {
// Look for DPAD presses to change the selection
if (gamepad1.dpad_down && !was_dpad_down) {
// Go to next sound (with list wrap) and display it
soundIndex = (soundIndex + 1) % sounds.length;
}
if (gamepad1.dpad_up && !was_dpad_up) {
// Go to previous sound (with list wrap) and display it
soundIndex = (soundIndex + sounds.length - 1) % sounds.length;
}
// Look for trigger to see if we should play sound
// Only start a new sound if we are currently not playing one.
if (gamepad1.right_bumper && !soundPlaying) {
// Determine Resource IDs for the sounds you want to play, and make sure it's valid.
if ((soundID = myApp.getResources().getIdentifier(sounds[soundIndex], "raw", myApp.getPackageName())) != 0){
// Signal that the sound is now playing.
soundPlaying = true;
// Start playing, and also create a callback that will clear the playing flag when the sound is complete.
SoundPlayer.getInstance().startPlaying(myApp, soundID, params, null,
() -> soundPlaying = false);
}
}
// Remember the last state of the dpad to detect changes.
was_dpad_up = gamepad1.dpad_up;
was_dpad_down = gamepad1.dpad_down;
// Display the current sound choice, and the playing status.
telemetry.addData("", "Use DPAD up/down to choose sound.");
telemetry.addData("", "Press Right Bumper to play sound.");
telemetry.addData("", "");
telemetry.addData("Sound >", sounds[soundIndex]);
telemetry.addData("Status >", soundPlaying ? "Playing" : "Stopped");
telemetry.update();
}
}
}
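The DPAD handling above steps through the sounds[] array with wrap-around modulo arithmetic: down moves to the next entry, up moves to the previous one, and both wrap past the ends of the list. A standalone sketch of that navigation (Chooser is an illustrative name, not an SDK class):

```java
// Wrap-around navigation through a fixed-length list of choices,
// matching the DPAD up/down handling in the OpMode above.
class Chooser {
    // Next index, wrapping from the last entry back to 0.
    static int next(int index, int length) {
        return (index + 1) % length;
    }

    // Previous index, wrapping from 0 back to the last entry.
    // Adding length first keeps the operand non-negative for Java's % operator.
    static int previous(int index, int length) {
        return (index + length - 1) % length;
    }
}
```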

/* Copyright (c) 2017 FIRST. All rights reserved.
*
* Redistribution and use in source and binary forms, with or without modification,
* are permitted (subject to the limitations in the disclaimer below) provided that
* the following conditions are met:
*
* Redistributions of source code must retain the above copyright notice, this list
* of conditions and the following disclaimer.
*
* Redistributions in binary form must reproduce the above copyright notice, this
* list of conditions and the following disclaimer in the documentation and/or
* other materials provided with the distribution.
*
* Neither the name of FIRST nor the names of its contributors may be used to endorse or
* promote products derived from this software without specific prior written permission.
*
* NO EXPRESS OR IMPLIED LICENSES TO ANY PARTY'S PATENT RIGHTS ARE GRANTED BY THIS
* LICENSE. THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
* "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO,
* THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE
* ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR CONTRIBUTORS BE LIABLE
* FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL
* DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR
* SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER
* CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY,
* OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
* OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
*/
package org.firstinspires.ftc.robotcontroller.external.samples;
import com.qualcomm.robotcore.eventloop.opmode.Disabled;
import com.qualcomm.robotcore.eventloop.opmode.LinearOpMode;
import com.qualcomm.robotcore.eventloop.opmode.TeleOp;
import com.qualcomm.robotcore.hardware.VoltageSensor;
import com.qualcomm.robotcore.util.ElapsedTime;
import org.firstinspires.ftc.robotcore.external.Func;
import org.firstinspires.ftc.robotcore.external.Telemetry;
/*
* This OpMode illustrates various ways in which telemetry can be
* transmitted from the robot controller to the driver station. The sample illustrates
* numeric and text data, formatted output, and optimized evaluation of expensive-to-acquire
* information. The telemetry log is illustrated by scrolling a poem
* to the driver station.
*
* Also see the Telemetry javadocs.
*/
@TeleOp(name = "Concept: Telemetry", group = "Concept")
@Disabled
public class ConceptTelemetry extends LinearOpMode {
/** Keeps track of the line of the poem which is to be emitted next */
int poemLine = 0;
/** Keeps track of how long it's been since we last emitted a line of poetry */
ElapsedTime poemElapsed = new ElapsedTime();
static final String[] poem = new String[] {
"Mary had a little lamb,",
"Its fleece was white as snow,",
"And everywhere that Mary went,",
"The lamb was sure to go.",
"",
"He followed her to school one day,",
"Which was against the rule,",
"It made the children laugh and play",
"To see a lamb at school.",
"",
"And so the teacher turned it out,",
"But still it lingered near,",
"And waited patiently about,",
"Till Mary did appear.",
"",
"\"Why does the lamb love Mary so?\"",
"The eager children cry.",
"\"Why, Mary loves the lamb, you know,\"",
"The teacher did reply.",
"",
""
};
@Override public void runOpMode() {
/* we keep track of how long it's been since the OpMode was started, just
* to have some interesting data to show */
ElapsedTime opmodeRunTime = new ElapsedTime();
// We show the log in oldest-to-newest order, as that's better for poetry
telemetry.log().setDisplayOrder(Telemetry.Log.DisplayOrder.OLDEST_FIRST);
// We can control the number of lines shown in the log
telemetry.log().setCapacity(6);
// The interval between lines of poetry, in seconds
double sPoemInterval = 0.6;
/*
* Wait until we've been given the ok to go. For something to do, we emit the
* elapsed time as we sit here and wait. If we didn't want to do anything while
* we waited, we would just call waitForStart().
*/
while (!isStarted()) {
telemetry.addData("time", "%.1f seconds", opmodeRunTime.seconds());
telemetry.update();
idle();
}
// Ok, we've been given the ok to go
/*
* As an illustration, the first line on our telemetry display will display the battery voltage.
* The idea here is that it's expensive to compute the voltage (at least for purposes of illustration)
* so you don't want to do it unless the data is _actually_ going to make it to the
* driver station (recall that telemetry transmission is throttled to reduce bandwidth use).
* Note that getBatteryVoltage() below returns 'Infinity' if there's no voltage sensor attached.
*
* @see Telemetry#getMsTransmissionInterval()
*/
telemetry.addData("voltage", "%.1f volts", new Func<Double>() {
@Override public Double value() {
return getBatteryVoltage();
}
});
// Reset to keep some timing stats for the post-'start' part of the OpMode
opmodeRunTime.reset();
int loopCount = 1;
// Go go gadget robot!
while (opModeIsActive()) {
// Emit poetry if it's been a while
if (poemElapsed.seconds() > sPoemInterval) {
emitPoemLine();
}
// As an illustration, show some loop timing information
telemetry.addData("loop count", loopCount);
telemetry.addData("ms/loop", "%.3f ms", opmodeRunTime.milliseconds() / loopCount);
// Show joystick information as some other illustrative data
telemetry.addLine("left joystick | ")
.addData("x", gamepad1.left_stick_x)
.addData("y", gamepad1.left_stick_y);
telemetry.addLine("right joystick | ")
.addData("x", gamepad1.right_stick_x)
.addData("y", gamepad1.right_stick_y);
/*
* Transmit the telemetry to the driver station, subject to throttling.
* See the documentation for Telemetry.getMsTransmissionInterval() for more information.
*/
telemetry.update();
// Update loop info
loopCount++;
}
}
// emits a line of poetry to the telemetry log
void emitPoemLine() {
telemetry.log().add(poem[poemLine]);
poemLine = (poemLine+1) % poem.length;
poemElapsed.reset();
}
// Computes the current battery voltage
double getBatteryVoltage() {
double result = Double.POSITIVE_INFINITY;
for (VoltageSensor sensor : hardwareMap.voltageSensor) {
double voltage = sensor.getVoltage();
if (voltage > 0) {
result = Math.min(result, voltage);
}
}
return result;
}
}
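getBatteryVoltage() above folds every voltage sensor reading into a single value: the minimum positive reading, or positive infinity when no sensor reports a usable voltage. A standalone sketch of that fold over plain doubles (Battery is an illustrative name, not an SDK class):

```java
// Computes a battery voltage the way getBatteryVoltage() above does:
// the minimum positive reading across all sensors, or +Infinity if none report.
class Battery {
    static double minPositive(double[] readings) {
        double result = Double.POSITIVE_INFINITY;
        for (double v : readings) {
            if (v > 0) {
                result = Math.min(result, v);
            }
        }
        return result;
    }
}
```

Taking the minimum is a deliberately conservative choice: with several hubs reporting, the lowest reading is the safest estimate of available battery voltage, and readings of zero (a disconnected sensor) are ignored.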

/* Copyright (c) 2019 FIRST. All rights reserved.
*
* Redistribution and use in source and binary forms, with or without modification,
* are permitted (subject to the limitations in the disclaimer below) provided that
* the following conditions are met:
*
* Redistributions of source code must retain the above copyright notice, this list
* of conditions and the following disclaimer.
*
* Redistributions in binary form must reproduce the above copyright notice, this
* list of conditions and the following disclaimer in the documentation and/or
* other materials provided with the distribution.
*
* Neither the name of FIRST nor the names of its contributors may be used to endorse or
* promote products derived from this software without specific prior written permission.
*
* NO EXPRESS OR IMPLIED LICENSES TO ANY PARTY'S PATENT RIGHTS ARE GRANTED BY THIS
* LICENSE. THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
* "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO,
* THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE
* ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR CONTRIBUTORS BE LIABLE
* FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL
* DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR
* SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER
* CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY,
* OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
* OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
*/
package org.firstinspires.ftc.robotcontroller.external.samples;
import android.util.Size;
import com.qualcomm.robotcore.eventloop.opmode.Disabled;
import com.qualcomm.robotcore.eventloop.opmode.LinearOpMode;
import com.qualcomm.robotcore.eventloop.opmode.TeleOp;
import org.firstinspires.ftc.robotcore.external.hardware.camera.BuiltinCameraDirection;
import org.firstinspires.ftc.robotcore.external.hardware.camera.WebcamName;
import org.firstinspires.ftc.robotcore.external.tfod.Recognition;
import org.firstinspires.ftc.vision.VisionPortal;
import org.firstinspires.ftc.vision.tfod.TfodProcessor;
import java.util.List;
/*
* This OpMode illustrates the basics of TensorFlow Object Detection,
* including Java Builder structures for specifying Vision parameters.
*
* Use Android Studio to Copy this Class, and Paste it into your team's code folder with a new name.
* Remove or comment out the @Disabled line to add this OpMode to the Driver Station OpMode list.
*/
@TeleOp(name = "Concept: TensorFlow Object Detection", group = "Concept")
@Disabled
public class ConceptTensorFlowObjectDetection extends LinearOpMode {
private static final boolean USE_WEBCAM = true; // true for webcam, false for phone camera
// TFOD_MODEL_ASSET points to a model file stored in the project Asset location,
// this is only used for Android Studio when using models in Assets.
private static final String TFOD_MODEL_ASSET = "MyModelStoredAsAsset.tflite";
// TFOD_MODEL_FILE points to a model file stored onboard the Robot Controller's storage,
// this is used when uploading models directly to the RC using the model upload interface.
private static final String TFOD_MODEL_FILE = "/sdcard/FIRST/tflitemodels/myCustomModel.tflite";
// Define the labels recognized in the model for TFOD (must be in training order!)
private static final String[] LABELS = {
"Pixel",
};
/**
* The variable to store our instance of the TensorFlow Object Detection processor.
*/
private TfodProcessor tfod;
/**
* The variable to store our instance of the vision portal.
*/
private VisionPortal visionPortal;
@Override
public void runOpMode() {
initTfod();
// Wait for the DS start button to be touched.
telemetry.addData("DS preview on/off", "3 dots, Camera Stream");
telemetry.addData(">", "Touch Play to start OpMode");
telemetry.update();
waitForStart();
if (opModeIsActive()) {
while (opModeIsActive()) {
telemetryTfod();
// Push telemetry to the Driver Station.
telemetry.update();
// Save CPU resources; can resume streaming when needed.
if (gamepad1.dpad_down) {
visionPortal.stopStreaming();
} else if (gamepad1.dpad_up) {
visionPortal.resumeStreaming();
}
// Share the CPU.
sleep(20);
}
}
// Save more CPU resources when camera is no longer needed.
visionPortal.close();
} // end runOpMode()
/**
* Initialize the TensorFlow Object Detection processor.
*/
private void initTfod() {
// Create the TensorFlow processor by using a builder.
tfod = new TfodProcessor.Builder()
// With the following lines commented out, the default TfodProcessor Builder
// will load the default model for the season. To define a custom model to load,
// choose one of the following:
// Use setModelAssetName() if the custom TF Model is built in as an asset (AS only).
// Use setModelFileName() if you have downloaded a custom team model to the Robot Controller.
//.setModelAssetName(TFOD_MODEL_ASSET)
//.setModelFileName(TFOD_MODEL_FILE)
// The following default settings are available to un-comment and edit as needed to
// set parameters for custom models.
//.setModelLabels(LABELS)
//.setIsModelTensorFlow2(true)
//.setIsModelQuantized(true)
//.setModelInputSize(300)
//.setModelAspectRatio(16.0 / 9.0)
.build();
// Create the vision portal by using a builder.
VisionPortal.Builder builder = new VisionPortal.Builder();
// Set the camera (webcam vs. built-in RC phone camera).
if (USE_WEBCAM) {
builder.setCamera(hardwareMap.get(WebcamName.class, "Webcam 1"));
} else {
builder.setCamera(BuiltinCameraDirection.BACK);
}
// Choose a camera resolution. Not all cameras support all resolutions.
//builder.setCameraResolution(new Size(640, 480));
// Enable the RC preview (LiveView). Set "false" to omit camera monitoring.
//builder.enableLiveView(true);
// Set the stream format; MJPEG uses less bandwidth than default YUY2.
//builder.setStreamFormat(VisionPortal.StreamFormat.YUY2);
// Choose whether or not LiveView stops if no processors are enabled.
// If set "true", monitor shows solid orange screen if no processors enabled.
// If set "false", monitor shows camera view without annotations.
//builder.setAutoStopLiveView(false);
// Set and enable the processor.
builder.addProcessor(tfod);
// Build the Vision Portal, using the above settings.
visionPortal = builder.build();
// Set confidence threshold for TFOD recognitions, at any time.
//tfod.setMinResultConfidence(0.75f);
// Disable or re-enable the TFOD processor at any time.
//visionPortal.setProcessorEnabled(tfod, true);
} // end method initTfod()
/**
* Add telemetry about TensorFlow Object Detection (TFOD) recognitions.
*/
private void telemetryTfod() {
List<Recognition> currentRecognitions = tfod.getRecognitions();
telemetry.addData("# Objects Detected", currentRecognitions.size());
// Step through the list of recognitions and display info for each one.
for (Recognition recognition : currentRecognitions) {
double x = (recognition.getLeft() + recognition.getRight()) / 2 ;
double y = (recognition.getTop() + recognition.getBottom()) / 2 ;
telemetry.addData(""," ");
telemetry.addData("Image", "%s (%.0f %% Conf.)", recognition.getLabel(), recognition.getConfidence() * 100);
telemetry.addData("- Position", "%.0f / %.0f", x, y);
telemetry.addData("- Size", "%.0f x %.0f", recognition.getWidth(), recognition.getHeight());
} // end for() loop
} // end method telemetryTfod()
} // end class
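telemetryTfod() above reports each recognition's position as the center of its bounding box, computed as the midpoint of the left/right and top/bottom edges. A standalone sketch of that computation (BoxCenter is an illustrative name, not an SDK class):

```java
// Computes a bounding box's center point, as telemetryTfod() above does
// from a recognition's left, top, right, and bottom edges (in pixels).
class BoxCenter {
    static double[] center(double left, double top, double right, double bottom) {
        double x = (left + right) / 2;
        double y = (top + bottom) / 2;
        return new double[] { x, y };
    }
}
```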

/* Copyright (c) 2019 FIRST. All rights reserved.
*
* Redistribution and use in source and binary forms, with or without modification,
* are permitted (subject to the limitations in the disclaimer below) provided that
* the following conditions are met:
*
* Redistributions of source code must retain the above copyright notice, this list
* of conditions and the following disclaimer.
*
* Redistributions in binary form must reproduce the above copyright notice, this
* list of conditions and the following disclaimer in the documentation and/or
* other materials provided with the distribution.
*
* Neither the name of FIRST nor the names of its contributors may be used to endorse or
* promote products derived from this software without specific prior written permission.
*
* NO EXPRESS OR IMPLIED LICENSES TO ANY PARTY'S PATENT RIGHTS ARE GRANTED BY THIS
* LICENSE. THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
* "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO,
* THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE
* ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR CONTRIBUTORS BE LIABLE
* FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL
* DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR
* SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER
* CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY,
* OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
* OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
*/
package org.firstinspires.ftc.robotcontroller.external.samples;
import com.qualcomm.robotcore.eventloop.opmode.Disabled;
import com.qualcomm.robotcore.eventloop.opmode.LinearOpMode;
import com.qualcomm.robotcore.eventloop.opmode.TeleOp;
import org.firstinspires.ftc.robotcore.external.hardware.camera.BuiltinCameraDirection;
import org.firstinspires.ftc.robotcore.external.hardware.camera.WebcamName;
import org.firstinspires.ftc.robotcore.external.tfod.Recognition;
import org.firstinspires.ftc.vision.VisionPortal;
import org.firstinspires.ftc.vision.tfod.TfodProcessor;
import java.util.List;
/*
* This OpMode illustrates the basics of TensorFlow Object Detection,
* using the simplest possible setup.
*
* Use Android Studio to Copy this Class, and Paste it into your team's code folder with a new name.
* Remove or comment out the @Disabled line to add this OpMode to the Driver Station OpMode list.
*/
@TeleOp(name = "Concept: TensorFlow Object Detection Easy", group = "Concept")
@Disabled
public class ConceptTensorFlowObjectDetectionEasy extends LinearOpMode {
private static final boolean USE_WEBCAM = true; // true for webcam, false for phone camera
/**
* The variable to store our instance of the TensorFlow Object Detection processor.
*/
private TfodProcessor tfod;
/**
* The variable to store our instance of the vision portal.
*/
private VisionPortal visionPortal;
@Override
public void runOpMode() {
initTfod();
// Wait for the DS start button to be touched.
telemetry.addData("DS preview on/off", "3 dots, Camera Stream");
telemetry.addData(">", "Touch Play to start OpMode");
telemetry.update();
waitForStart();
if (opModeIsActive()) {
while (opModeIsActive()) {
telemetryTfod();
// Push telemetry to the Driver Station.
telemetry.update();
// Save CPU resources; can resume streaming when needed.
if (gamepad1.dpad_down) {
visionPortal.stopStreaming();
} else if (gamepad1.dpad_up) {
visionPortal.resumeStreaming();
}
// Share the CPU.
sleep(20);
}
}
// Save more CPU resources when camera is no longer needed.
visionPortal.close();
} // end runOpMode()
/**
* Initialize the TensorFlow Object Detection processor.
*/
private void initTfod() {
// Create the TensorFlow processor the easy way.
tfod = TfodProcessor.easyCreateWithDefaults();
// Create the vision portal the easy way.
if (USE_WEBCAM) {
visionPortal = VisionPortal.easyCreateWithDefaults(
hardwareMap.get(WebcamName.class, "Webcam 1"), tfod);
} else {
visionPortal = VisionPortal.easyCreateWithDefaults(
BuiltinCameraDirection.BACK, tfod);
}
} // end method initTfod()
/**
* Add telemetry about TensorFlow Object Detection (TFOD) recognitions.
*/
private void telemetryTfod() {
List<Recognition> currentRecognitions = tfod.getRecognitions();
telemetry.addData("# Objects Detected", currentRecognitions.size());
// Step through the list of recognitions and display info for each one.
for (Recognition recognition : currentRecognitions) {
double x = (recognition.getLeft() + recognition.getRight()) / 2 ;
double y = (recognition.getTop() + recognition.getBottom()) / 2 ;
telemetry.addData(""," ");
telemetry.addData("Image", "%s (%.0f %% Conf.)", recognition.getLabel(), recognition.getConfidence() * 100);
telemetry.addData("- Position", "%.0f / %.0f", x, y);
telemetry.addData("- Size", "%.0f x %.0f", recognition.getWidth(), recognition.getHeight());
} // end for() loop
} // end method telemetryTfod()
} // end class

/* Copyright (c) 2020 FIRST. All rights reserved.
*
* Redistribution and use in source and binary forms, with or without modification,
* are permitted (subject to the limitations in the disclaimer below) provided that
* the following conditions are met:
*
* Redistributions of source code must retain the above copyright notice, this list
* of conditions and the following disclaimer.
*
* Redistributions in binary form must reproduce the above copyright notice, this
* list of conditions and the following disclaimer in the documentation and/or
* other materials provided with the distribution.
*
* Neither the name of FIRST nor the names of its contributors may be used to endorse or
* promote products derived from this software without specific prior written permission.
*
* NO EXPRESS OR IMPLIED LICENSES TO ANY PARTY'S PATENT RIGHTS ARE GRANTED BY THIS
* LICENSE. THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
* "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO,
* THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE
* ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR CONTRIBUTORS BE LIABLE
* FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL
* DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR
* SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER
* CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY,
* OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
* OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
*/
package org.firstinspires.ftc.robotcontroller.external.samples;
import com.qualcomm.robotcore.eventloop.opmode.Disabled;
import com.qualcomm.robotcore.eventloop.opmode.LinearOpMode;
import com.qualcomm.robotcore.eventloop.opmode.TeleOp;
import org.firstinspires.ftc.robotcore.external.ClassFactory;
import org.firstinspires.ftc.robotcore.external.hardware.camera.CameraName;
import org.firstinspires.ftc.robotcore.external.hardware.camera.WebcamName;
import org.firstinspires.ftc.robotcore.external.tfod.Recognition;
import org.firstinspires.ftc.vision.VisionPortal;
import org.firstinspires.ftc.vision.VisionPortal.CameraState;
import org.firstinspires.ftc.vision.tfod.TfodProcessor;
import java.util.List;
/*
* This OpMode illustrates the basics of TensorFlow Object Detection, using
* two webcams.
*
* Use Android Studio to Copy this Class, and Paste it into your team's code folder with a new name.
* Remove or comment out the @Disabled line to add this OpMode to the Driver Station OpMode list.
*/
@TeleOp(name = "Concept: TensorFlow Object Detection Switchable Cameras", group = "Concept")
@Disabled
public class ConceptTensorFlowObjectDetectionSwitchableCameras extends LinearOpMode {
/**
* Variables used for switching cameras.
*/
private WebcamName webcam1, webcam2;
private boolean oldLeftBumper;
private boolean oldRightBumper;
/**
* The variable to store our instance of the TensorFlow Object Detection processor.
*/
private TfodProcessor tfod;
/**
* The variable to store our instance of the vision portal.
*/
private VisionPortal visionPortal;
@Override
public void runOpMode() {
initTfod();
// Wait for the DS start button to be touched.
telemetry.addData("DS preview on/off", "3 dots, Camera Stream");
telemetry.addData(">", "Touch Play to start OpMode");
telemetry.update();
waitForStart();
if (opModeIsActive()) {
while (opModeIsActive()) {
telemetryCameraSwitching();
telemetryTfod();
// Push telemetry to the Driver Station.
telemetry.update();
// Save CPU resources; can resume streaming when needed.
if (gamepad1.dpad_down) {
visionPortal.stopStreaming();
} else if (gamepad1.dpad_up) {
visionPortal.resumeStreaming();
}
doCameraSwitching();
// Share the CPU.
sleep(20);
}
}
// Save more CPU resources when camera is no longer needed.
visionPortal.close();
} // end runOpMode()
/**
* Initialize the TensorFlow Object Detection processor.
*/
private void initTfod() {
// Create the TensorFlow processor by using a builder.
tfod = new TfodProcessor.Builder().build();
webcam1 = hardwareMap.get(WebcamName.class, "Webcam 1");
webcam2 = hardwareMap.get(WebcamName.class, "Webcam 2");
CameraName switchableCamera = ClassFactory.getInstance()
.getCameraManager().nameForSwitchableCamera(webcam1, webcam2);
// Create the vision portal by using a builder.
visionPortal = new VisionPortal.Builder()
.setCamera(switchableCamera)
.addProcessor(tfod)
.build();
} // end method initTfod()
/**
* Add telemetry about camera switching.
*/
private void telemetryCameraSwitching() {
if (visionPortal.getActiveCamera().equals(webcam1)) {
telemetry.addData("activeCamera", "Webcam 1");
telemetry.addData("Press RightBumper", "to switch to Webcam 2");
} else {
telemetry.addData("activeCamera", "Webcam 2");
telemetry.addData("Press LeftBumper", "to switch to Webcam 1");
}
} // end method telemetryCameraSwitching()
/**
* Add telemetry about TensorFlow Object Detection (TFOD) recognitions.
*/
private void telemetryTfod() {
List<Recognition> currentRecognitions = tfod.getRecognitions();
telemetry.addData("# Objects Detected", currentRecognitions.size());
// Step through the list of recognitions and display info for each one.
for (Recognition recognition : currentRecognitions) {
double x = (recognition.getLeft() + recognition.getRight()) / 2 ;
double y = (recognition.getTop() + recognition.getBottom()) / 2 ;
telemetry.addData(""," ");
telemetry.addData("Image", "%s (%.0f %% Conf.)", recognition.getLabel(), recognition.getConfidence() * 100);
telemetry.addData("- Position", "%.0f / %.0f", x, y);
telemetry.addData("- Size", "%.0f x %.0f", recognition.getWidth(), recognition.getHeight());
} // end for() loop
} // end method telemetryTfod()
/**
* Set the active camera according to input from the gamepad.
*/
private void doCameraSwitching() {
if (visionPortal.getCameraState() == CameraState.STREAMING) {
// If the left bumper is pressed, use Webcam 1.
// If the right bumper is pressed, use Webcam 2.
boolean newLeftBumper = gamepad1.left_bumper;
boolean newRightBumper = gamepad1.right_bumper;
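// Rising-edge detection: compare the new bumper state against the saved old state so a
// camera switch fires only at the moment the button is pressed, not while it is held down.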
if (newLeftBumper && !oldLeftBumper) {
visionPortal.setActiveCamera(webcam1);
} else if (newRightBumper && !oldRightBumper) {
visionPortal.setActiveCamera(webcam2);
}
oldLeftBumper = newLeftBumper;
oldRightBumper = newRightBumper;
}
} // end method doCameraSwitching()
} // end class


@ -0,0 +1,187 @@
/* Copyright (c) 2017 FIRST. All rights reserved.
*
* Redistribution and use in source and binary forms, with or without modification,
* are permitted (subject to the limitations in the disclaimer below) provided that
* the following conditions are met:
*
* Redistributions of source code must retain the above copyright notice, this list
* of conditions and the following disclaimer.
*
* Redistributions in binary form must reproduce the above copyright notice, this
* list of conditions and the following disclaimer in the documentation and/or
* other materials provided with the distribution.
*
* Neither the name of FIRST nor the names of its contributors may be used to endorse or
* promote products derived from this software without specific prior written permission.
*
* NO EXPRESS OR IMPLIED LICENSES TO ANY PARTY'S PATENT RIGHTS ARE GRANTED BY THIS
* LICENSE. THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
* "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO,
* THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE
* ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR CONTRIBUTORS BE LIABLE
* FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL
* DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR
* SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER
* CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY,
* OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
* OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
*/
package org.firstinspires.ftc.robotcontroller.external.samples;
import com.qualcomm.robotcore.eventloop.opmode.Autonomous;
import com.qualcomm.robotcore.eventloop.opmode.Disabled;
import com.qualcomm.robotcore.eventloop.opmode.LinearOpMode;
import com.qualcomm.robotcore.hardware.DcMotor;
import com.qualcomm.robotcore.util.ElapsedTime;
/*
* This OpMode illustrates the concept of driving a path based on encoder counts.
* The code is structured as a LinearOpMode
*
* The code REQUIRES that you DO have encoders on the wheels,
* otherwise you would use: RobotAutoDriveByTime;
*
* This code ALSO requires that the drive Motors have been configured such that a positive
* power command moves them forward, and causes the encoders to count UP.
*
* The desired path in this example is:
* - Drive forward for 48 inches
* - Spin right for 12 Inches
* - Drive Backward for 24 inches
* - Stop and close the claw.
*
* The code is written using a method called: encoderDrive(speed, leftInches, rightInches, timeoutS)
* that performs the actual movement.
* This method assumes that each movement is relative to the last stopping place.
* There are other ways to perform encoder based moves, but this method is probably the simplest.
* This code uses the RUN_TO_POSITION mode to enable the Motor controllers to generate the run profile
*
* Use Android Studio to Copy this Class, and Paste it into your team's code folder with a new name.
* Remove or comment out the @Disabled line to add this OpMode to the Driver Station OpMode list
*/
@Autonomous(name="Robot: Auto Drive By Encoder", group="Robot")
@Disabled
public class RobotAutoDriveByEncoder_Linear extends LinearOpMode {
/* Declare OpMode members. */
private DcMotor leftDrive = null;
private DcMotor rightDrive = null;
private ElapsedTime runtime = new ElapsedTime();
// Calculate the COUNTS_PER_INCH for your specific drive train.
// Go to your motor vendor website to determine your motor's COUNTS_PER_MOTOR_REV
// For external drive gearing, set DRIVE_GEAR_REDUCTION as needed.
// For example, use a value of 2.0 for a 12-tooth spur gear driving a 24-tooth spur gear.
// This is gearing DOWN for less speed and more torque.
// For gearing UP, use a gear ratio less than 1.0. Note this will affect the direction of wheel rotation.
static final double COUNTS_PER_MOTOR_REV = 1440 ; // eg: TETRIX Motor Encoder
static final double DRIVE_GEAR_REDUCTION = 1.0 ; // No External Gearing.
static final double WHEEL_DIAMETER_INCHES = 4.0 ; // For figuring circumference
static final double COUNTS_PER_INCH = (COUNTS_PER_MOTOR_REV * DRIVE_GEAR_REDUCTION) /
(WHEEL_DIAMETER_INCHES * 3.1415);
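// Worked example with the values above: (1440 * 1.0) / (4.0 * 3.1415) is about 114.6 counts per inch,
// so a 48 inch leg of the path corresponds to roughly 5,500 encoder counts.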
static final double DRIVE_SPEED = 0.6;
static final double TURN_SPEED = 0.5;
@Override
public void runOpMode() {
// Initialize the drive system variables.
leftDrive = hardwareMap.get(DcMotor.class, "left_drive");
rightDrive = hardwareMap.get(DcMotor.class, "right_drive");
// To drive forward, most robots need the motor on one side to be reversed, because the axles point in opposite directions.
// When run, this OpMode should start both motors driving forward. So adjust these two lines based on your first test drive.
// Note: The settings here assume direct drive on left and right wheels. Gear Reduction or 90 Deg drives may require direction flips
leftDrive.setDirection(DcMotor.Direction.REVERSE);
rightDrive.setDirection(DcMotor.Direction.FORWARD);
leftDrive.setMode(DcMotor.RunMode.STOP_AND_RESET_ENCODER);
rightDrive.setMode(DcMotor.RunMode.STOP_AND_RESET_ENCODER);
leftDrive.setMode(DcMotor.RunMode.RUN_USING_ENCODER);
rightDrive.setMode(DcMotor.RunMode.RUN_USING_ENCODER);
// Send telemetry message to indicate successful Encoder reset
telemetry.addData("Starting at", "%7d :%7d",
leftDrive.getCurrentPosition(),
rightDrive.getCurrentPosition());
telemetry.update();
// Wait for the game to start (driver presses PLAY)
waitForStart();
// Step through each leg of the path,
// Note: Reverse movement is obtained by setting a negative distance (not speed)
encoderDrive(DRIVE_SPEED,  48,  48, 5.0);  // S1: Forward 48 Inches with 5 Sec timeout
encoderDrive(TURN_SPEED, 12, -12, 4.0); // S2: Turn Right 12 Inches with 4 Sec timeout
encoderDrive(DRIVE_SPEED, -24, -24, 4.0); // S3: Reverse 24 Inches with 4 Sec timeout
telemetry.addData("Path", "Complete");
telemetry.update();
sleep(1000); // pause to display final telemetry message.
}
/*
* Method to perform a relative move, based on encoder counts.
* Encoders are not reset as the move is based on the current position.
* Move will stop if any of three conditions occur:
* 1) Move gets to the desired position
* 2) Move runs out of time
* 3) Driver stops the OpMode running.
*/
public void encoderDrive(double speed,
double leftInches, double rightInches,
double timeoutS) {
int newLeftTarget;
int newRightTarget;
// Ensure that the OpMode is still active
if (opModeIsActive()) {
// Determine new target position, and pass to motor controller
newLeftTarget = leftDrive.getCurrentPosition() + (int)(leftInches * COUNTS_PER_INCH);
newRightTarget = rightDrive.getCurrentPosition() + (int)(rightInches * COUNTS_PER_INCH);
leftDrive.setTargetPosition(newLeftTarget);
rightDrive.setTargetPosition(newRightTarget);
// Turn On RUN_TO_POSITION
leftDrive.setMode(DcMotor.RunMode.RUN_TO_POSITION);
rightDrive.setMode(DcMotor.RunMode.RUN_TO_POSITION);
// reset the timeout time and start motion.
runtime.reset();
leftDrive.setPower(Math.abs(speed));
rightDrive.setPower(Math.abs(speed));
// keep looping while we are still active, and there is time left, and both motors are running.
// Note: We use (isBusy() && isBusy()) in the loop test, which means that when EITHER motor hits
// its target position, the motion will stop.  This is "safer" because the robot will
// always end the motion as soon as possible.
// However, if you require that BOTH motors have finished their moves before the robot continues
// onto the next step, use (isBusy() || isBusy()) in the loop test.
while (opModeIsActive() &&
(runtime.seconds() < timeoutS) &&
(leftDrive.isBusy() && rightDrive.isBusy())) {
// Display it for the driver.
telemetry.addData("Running to", " %7d :%7d", newLeftTarget, newRightTarget);
telemetry.addData("Currently at", " at %7d :%7d",
leftDrive.getCurrentPosition(), rightDrive.getCurrentPosition());
telemetry.update();
}
// Stop all motion;
leftDrive.setPower(0);
rightDrive.setPower(0);
// Turn off RUN_TO_POSITION
leftDrive.setMode(DcMotor.RunMode.RUN_USING_ENCODER);
rightDrive.setMode(DcMotor.RunMode.RUN_USING_ENCODER);
sleep(250); // optional pause after each move.
}
}
}


@ -0,0 +1,429 @@
/* Copyright (c) 2022 FIRST. All rights reserved.
*
* Redistribution and use in source and binary forms, with or without modification,
* are permitted (subject to the limitations in the disclaimer below) provided that
* the following conditions are met:
*
* Redistributions of source code must retain the above copyright notice, this list
* of conditions and the following disclaimer.
*
* Redistributions in binary form must reproduce the above copyright notice, this
* list of conditions and the following disclaimer in the documentation and/or
* other materials provided with the distribution.
*
* Neither the name of FIRST nor the names of its contributors may be used to endorse or
* promote products derived from this software without specific prior written permission.
*
* NO EXPRESS OR IMPLIED LICENSES TO ANY PARTY'S PATENT RIGHTS ARE GRANTED BY THIS
* LICENSE. THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
* "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO,
* THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE
* ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR CONTRIBUTORS BE LIABLE
* FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL
* DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR
* SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER
* CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY,
* OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
* OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
*/
package org.firstinspires.ftc.robotcontroller.external.samples;
import com.qualcomm.hardware.rev.RevHubOrientationOnRobot;
import com.qualcomm.robotcore.eventloop.opmode.Autonomous;
import com.qualcomm.robotcore.eventloop.opmode.Disabled;
import com.qualcomm.robotcore.eventloop.opmode.LinearOpMode;
import com.qualcomm.robotcore.hardware.DcMotor;
import com.qualcomm.robotcore.hardware.IMU;
import com.qualcomm.robotcore.util.ElapsedTime;
import com.qualcomm.robotcore.util.Range;
import org.firstinspires.ftc.robotcore.external.navigation.AngleUnit;
import org.firstinspires.ftc.robotcore.external.navigation.YawPitchRollAngles;
/*
* This OpMode illustrates the concept of driving an autonomous path based on Gyro (IMU) heading and encoder counts.
* The code is structured as a LinearOpMode
*
* The path to be followed by the robot is built from a series of drive, turn or pause steps.
* Each step on the path is defined by a single function call, and these can be strung together in any order.
*
* The code REQUIRES that you have encoders on the drive motors, otherwise you should use: RobotAutoDriveByTime;
*
* This code uses the Universal IMU interface so it will work with either the BNO055, or BHI260 IMU.
* To run as written, the Control/Expansion hub should be mounted horizontally on a flat part of the robot chassis.
* The REV Logo should be facing UP, and the USB port should be facing forward.
* If this is not the configuration of your REV Control Hub, then the code should be modified to reflect the correct orientation.
*
* This sample requires that the drive Motors have been configured with names : left_drive and right_drive.
* It also requires that a positive power command moves both motors forward, and causes the encoders to count UP.
* So please verify that both of your motors move the robot forward on the first move. If not, make the required correction.
* See the beginning of runOpMode() to set the FORWARD/REVERSE option for each motor.
*
* This code uses RUN_TO_POSITION mode for driving straight, and RUN_USING_ENCODER mode for turning and holding.
* Note: This code implements the requirement of calling setTargetPosition() at least once before switching to RUN_TO_POSITION mode.
*
* Notes:
*
* All angles are referenced to the coordinate-frame that is set whenever resetHeading() is called.
* In this sample, the heading is reset when the Start button is touched on the Driver station.
* Note: It would be possible to reset the heading after each move, but this would accumulate steering errors.
*
* The angle of movement/rotation is assumed to be a standardized rotation around the robot Z axis,
* which means that a Positive rotation is Counter Clockwise, looking down on the field.
* This is consistent with the FTC field coordinate conventions set out in the document:
* https://ftc-docs.firstinspires.org/field-coordinate-system
*
* Control Approach.
*
* To reach, or maintain a required heading, this code implements a basic Proportional Controller where:
*
* Steering power = Heading Error * Proportional Gain.
*
* "Heading Error" is calculated by taking the difference between the desired heading and the actual heading,
* and then "normalizing" it by converting it to a value in the +/- 180 degree range.
*
* "Proportional Gain" is a constant that YOU choose to set the "strength" of the steering response.
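 *
 * As a worked example using the gains defined below: a 10 degree heading error with
 * P_TURN_GAIN = 0.02 produces a steering power of 10 * 0.02 = 0.2, and that power
 * tapers toward zero as the error closes.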
*
* Use Android Studio to Copy this Class, and Paste it into your "TeamCode" folder with a new name.
* Remove or comment out the @Disabled line to add this OpMode to the Driver Station OpMode list
*/
@Autonomous(name="Robot: Auto Drive By Gyro", group="Robot")
@Disabled
public class RobotAutoDriveByGyro_Linear extends LinearOpMode {
/* Declare OpMode members. */
private DcMotor leftDrive = null;
private DcMotor rightDrive = null;
private IMU imu = null; // Control/Expansion Hub IMU
private double headingError = 0;
// These variables are declared here (as class members) so they can be updated in various methods,
// but still be displayed by sendTelemetry()
private double targetHeading = 0;
private double driveSpeed = 0;
private double turnSpeed = 0;
private double leftSpeed = 0;
private double rightSpeed = 0;
private int leftTarget = 0;
private int rightTarget = 0;
// Calculate the COUNTS_PER_INCH for your specific drive train.
// Go to your motor vendor website to determine your motor's COUNTS_PER_MOTOR_REV
// For external drive gearing, set DRIVE_GEAR_REDUCTION as needed.
// For example, use a value of 2.0 for a 12-tooth spur gear driving a 24-tooth spur gear.
// This is gearing DOWN for less speed and more torque.
// For gearing UP, use a gear ratio less than 1.0. Note this will affect the direction of wheel rotation.
static final double COUNTS_PER_MOTOR_REV = 537.7 ; // eg: GoBILDA 312 RPM Yellow Jacket
static final double DRIVE_GEAR_REDUCTION = 1.0 ; // No External Gearing.
static final double WHEEL_DIAMETER_INCHES = 4.0 ; // For figuring circumference
static final double COUNTS_PER_INCH = (COUNTS_PER_MOTOR_REV * DRIVE_GEAR_REDUCTION) /
(WHEEL_DIAMETER_INCHES * 3.1415);
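// Worked example with the values above: (537.7 * 1.0) / (4.0 * 3.1415) is about 42.8 counts per inch.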
// These constants define the desired driving/control characteristics
// They can/should be tweaked to suit the specific robot drive train.
static final double DRIVE_SPEED = 0.4; // Max driving speed for better distance accuracy.
static final double TURN_SPEED = 0.2; // Max Turn speed to limit turn rate
static final double HEADING_THRESHOLD = 1.0 ; // How close must the heading get to the target before moving to next step.
// Requiring more accuracy (a smaller number) will often make the turn take longer to get into the final position.
// Define the Proportional control coefficient (or GAIN) for "heading control".
// We define one value when Turning (larger errors), and the other is used when Driving straight (smaller errors).
// Increase these numbers if the heading does not correct strongly enough (eg: a heavy robot or using tracks)
// Decrease these numbers if the heading does not settle on the correct value (eg: very agile robot with omni wheels)
static final double P_TURN_GAIN = 0.02; // Larger is more responsive, but also less stable
static final double P_DRIVE_GAIN = 0.03; // Larger is more responsive, but also less stable
@Override
public void runOpMode() {
// Initialize the drive system variables.
leftDrive = hardwareMap.get(DcMotor.class, "left_drive");
rightDrive = hardwareMap.get(DcMotor.class, "right_drive");
// To drive forward, most robots need the motor on one side to be reversed, because the axles point in opposite directions.
// When run, this OpMode should start both motors driving forward. So adjust these two lines based on your first test drive.
// Note: The settings here assume direct drive on left and right wheels. Gear Reduction or 90 Deg drives may require direction flips
leftDrive.setDirection(DcMotor.Direction.REVERSE);
rightDrive.setDirection(DcMotor.Direction.FORWARD);
/* The next two lines define Hub orientation.
* The Default Orientation (shown) is when a hub is mounted horizontally with the printed logo pointing UP and the USB port pointing FORWARD.
*
* To Do: EDIT these two lines to match YOUR mounting configuration.
*/
RevHubOrientationOnRobot.LogoFacingDirection logoDirection = RevHubOrientationOnRobot.LogoFacingDirection.UP;
RevHubOrientationOnRobot.UsbFacingDirection usbDirection = RevHubOrientationOnRobot.UsbFacingDirection.FORWARD;
RevHubOrientationOnRobot orientationOnRobot = new RevHubOrientationOnRobot(logoDirection, usbDirection);
// Now initialize the IMU with this mounting orientation
// This sample expects the IMU to be in a REV Hub and named "imu".
imu = hardwareMap.get(IMU.class, "imu");
imu.initialize(new IMU.Parameters(orientationOnRobot));
// Ensure the robot is stationary. Reset the encoders and set the motors to BRAKE mode
leftDrive.setMode(DcMotor.RunMode.STOP_AND_RESET_ENCODER);
rightDrive.setMode(DcMotor.RunMode.STOP_AND_RESET_ENCODER);
leftDrive.setZeroPowerBehavior(DcMotor.ZeroPowerBehavior.BRAKE);
rightDrive.setZeroPowerBehavior(DcMotor.ZeroPowerBehavior.BRAKE);
// Wait for the game to start (Display Gyro value while waiting)
while (opModeInInit()) {
telemetry.addData(">", "Robot Heading = %4.0f", getHeading());
telemetry.update();
}
// Set the encoders for closed loop speed control, and reset the heading.
leftDrive.setMode(DcMotor.RunMode.RUN_USING_ENCODER);
rightDrive.setMode(DcMotor.RunMode.RUN_USING_ENCODER);
imu.resetYaw();
// Step through each leg of the path,
// Notes: Reverse movement is obtained by setting a negative distance (not speed)
// holdHeading() is used after turns to let the heading stabilize
// Add a sleep(2000) after any step to keep the telemetry data visible for review
driveStraight(DRIVE_SPEED, 24.0, 0.0); // Drive Forward 24"
turnToHeading( TURN_SPEED, -45.0); // Turn CW to -45 Degrees
holdHeading( TURN_SPEED, -45.0, 0.5); // Hold -45 Deg heading for a 1/2 second
driveStraight(DRIVE_SPEED, 17.0, -45.0); // Drive Forward 17" at -45 degrees (12"x and 12"y)
turnToHeading( TURN_SPEED, 45.0); // Turn CCW to 45 Degrees
holdHeading( TURN_SPEED, 45.0, 0.5); // Hold 45 Deg heading for a 1/2 second
driveStraight(DRIVE_SPEED, 17.0, 45.0); // Drive Forward 17" at 45 degrees (-12"x and 12"y)
turnToHeading( TURN_SPEED, 0.0); // Turn CW to 0 Degrees
holdHeading( TURN_SPEED, 0.0, 1.0); // Hold 0 Deg heading for 1 second
driveStraight(DRIVE_SPEED,-48.0, 0.0);    // Drive in Reverse 48" (should return to approx. starting position)
telemetry.addData("Path", "Complete");
telemetry.update();
sleep(1000); // Pause to display last telemetry message.
}
/*
* ====================================================================================================
* Driving "Helper" functions are below this line.
* These provide the high and low level methods that handle driving straight and turning.
* ====================================================================================================
*/
// ********** HIGH Level driving functions. ********************
/**
* Drive in a straight line, on a fixed compass heading (angle), based on encoder counts.
* Move will stop if either of these conditions occur:
* 1) Move gets to the desired position
* 2) Driver stops the OpMode running.
*
* @param maxDriveSpeed MAX Speed for forward/rev motion (range 0 to +1.0) .
* @param distance Distance (in inches) to move from current position. Negative distance means move backward.
* @param heading Absolute Heading Angle (in Degrees) relative to last gyro reset.
* 0 = fwd. +ve is CCW from fwd. -ve is CW from forward.
* If a relative angle is required, add/subtract from the current robotHeading.
*/
public void driveStraight(double maxDriveSpeed,
double distance,
double heading) {
// Ensure that the OpMode is still active
if (opModeIsActive()) {
// Determine new target position, and pass to motor controller
int moveCounts = (int)(distance * COUNTS_PER_INCH);
leftTarget = leftDrive.getCurrentPosition() + moveCounts;
rightTarget = rightDrive.getCurrentPosition() + moveCounts;
// Set Target FIRST, then turn on RUN_TO_POSITION
leftDrive.setTargetPosition(leftTarget);
rightDrive.setTargetPosition(rightTarget);
leftDrive.setMode(DcMotor.RunMode.RUN_TO_POSITION);
rightDrive.setMode(DcMotor.RunMode.RUN_TO_POSITION);
// Set the required driving speed (must be positive for RUN_TO_POSITION)
// Start driving straight, and then enter the control loop
maxDriveSpeed = Math.abs(maxDriveSpeed);
moveRobot(maxDriveSpeed, 0);
// keep looping while we are still active, and BOTH motors are running.
while (opModeIsActive() &&
(leftDrive.isBusy() && rightDrive.isBusy())) {
// Determine required steering to keep on heading
turnSpeed = getSteeringCorrection(heading, P_DRIVE_GAIN);
// if driving in reverse, the motor correction also needs to be reversed
if (distance < 0)
turnSpeed *= -1.0;
// Apply the turning correction to the current driving speed.
moveRobot(driveSpeed, turnSpeed);
// Display drive status for the driver.
sendTelemetry(true);
}
// Stop all motion & Turn off RUN_TO_POSITION
moveRobot(0, 0);
leftDrive.setMode(DcMotor.RunMode.RUN_USING_ENCODER);
rightDrive.setMode(DcMotor.RunMode.RUN_USING_ENCODER);
}
}
/**
* Spin on the central axis to point in a new direction.
* <p>
* Move will stop if either of these conditions occur:
* <p>
* 1) Move gets to the heading (angle)
* <p>
* 2) Driver stops the OpMode running.
*
* @param maxTurnSpeed Desired MAX speed of turn. (range 0 to +1.0)
* @param heading Absolute Heading Angle (in Degrees) relative to last gyro reset.
* 0 = fwd. +ve is CCW from fwd. -ve is CW from forward.
* If a relative angle is required, add/subtract from current heading.
*/
public void turnToHeading(double maxTurnSpeed, double heading) {
// Run getSteeringCorrection() once to pre-calculate the current error
getSteeringCorrection(heading, P_DRIVE_GAIN);
// keep looping while we are still active, and not on heading.
while (opModeIsActive() && (Math.abs(headingError) > HEADING_THRESHOLD)) {
// Determine required steering to keep on heading
turnSpeed = getSteeringCorrection(heading, P_TURN_GAIN);
// Clip the speed to the maximum permitted value.
turnSpeed = Range.clip(turnSpeed, -maxTurnSpeed, maxTurnSpeed);
// Pivot in place by applying the turning correction
moveRobot(0, turnSpeed);
// Display drive status for the driver.
sendTelemetry(false);
}
// Stop all motion;
moveRobot(0, 0);
}
/**
* Obtain & hold a heading for a finite amount of time
* <p>
* Move will stop once the requested time has elapsed
* <p>
* This function is useful for giving the robot a moment to stabilize its heading between movements.
*
* @param maxTurnSpeed Maximum differential turn speed (range 0 to +1.0)
* @param heading Absolute Heading Angle (in Degrees) relative to last gyro reset.
* 0 = fwd. +ve is CCW from fwd. -ve is CW from forward.
* If a relative angle is required, add/subtract from current heading.
* @param holdTime Length of time (in seconds) to hold the specified heading.
*/
public void holdHeading(double maxTurnSpeed, double heading, double holdTime) {
ElapsedTime holdTimer = new ElapsedTime();
holdTimer.reset();
// keep looping while we have time remaining.
while (opModeIsActive() && (holdTimer.time() < holdTime)) {
// Determine required steering to keep on heading
turnSpeed = getSteeringCorrection(heading, P_TURN_GAIN);
// Clip the speed to the maximum permitted value.
turnSpeed = Range.clip(turnSpeed, -maxTurnSpeed, maxTurnSpeed);
// Pivot in place by applying the turning correction
moveRobot(0, turnSpeed);
// Display drive status for the driver.
sendTelemetry(false);
}
// Stop all motion;
moveRobot(0, 0);
}
// ********** LOW Level driving functions. ********************
/**
* Use a Proportional Controller to determine how much steering correction is required.
*
* @param desiredHeading The desired absolute heading (relative to last heading reset)
* @param proportionalGain Gain factor applied to heading error to obtain turning power.
* @return Turning power needed to get to required heading.
*/
public double getSteeringCorrection(double desiredHeading, double proportionalGain) {
targetHeading = desiredHeading; // Save for telemetry
// Determine the current heading error
headingError = targetHeading - getHeading();
// Normalize the error to be within +/- 180 degrees
while (headingError > 180) headingError -= 360;
while (headingError <= -180) headingError += 360;
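// e.g. a raw error of 350 degrees normalizes to -10 degrees, producing a short clockwise
// turn instead of a nearly full counter-clockwise rotation.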
// Multiply the error by the gain to determine the required steering correction.  Limit the result to +/- 1.0
return Range.clip(headingError * proportionalGain, -1, 1);
}
/**
* Take separate drive (fwd/rev) and turn (right/left) requests,
* combines them, and applies the appropriate speed commands to the left and right wheel motors.
* @param drive forward motor speed
* @param turn clockwise turning motor speed.
*/
public void moveRobot(double drive, double turn) {
driveSpeed = drive; // save this value as a class member so it can be used by telemetry.
turnSpeed = turn; // save this value as a class member so it can be used by telemetry.
leftSpeed = drive - turn;
rightSpeed = drive + turn;
// Scale speeds down if either one exceeds +/- 1.0;
double max = Math.max(Math.abs(leftSpeed), Math.abs(rightSpeed));
if (max > 1.0)
{
leftSpeed /= max;
rightSpeed /= max;
}
leftDrive.setPower(leftSpeed);
rightDrive.setPower(rightSpeed);
}
/**
* Display the various control parameters while driving
*
* @param straight Set to true if we are driving straight, and the encoder positions should be included in the telemetry.
*/
private void sendTelemetry(boolean straight) {
if (straight) {
telemetry.addData("Motion", "Drive Straight");
telemetry.addData("Target Pos L:R", "%7d:%7d", leftTarget, rightTarget);
telemetry.addData("Actual Pos L:R", "%7d:%7d", leftDrive.getCurrentPosition(),
rightDrive.getCurrentPosition());
} else {
telemetry.addData("Motion", "Turning");
}
telemetry.addData("Heading- Target : Current", "%5.2f : %5.0f", targetHeading, getHeading());
telemetry.addData("Error : Steer Pwr", "%5.1f : %5.1f", headingError, turnSpeed);
telemetry.addData("Wheel Speeds L : R", "%5.2f : %5.2f", leftSpeed, rightSpeed);
telemetry.update();
}
/**
* read the Robot heading directly from the IMU (in degrees)
*/
public double getHeading() {
YawPitchRollAngles orientation = imu.getRobotYawPitchRollAngles();
return orientation.getYaw(AngleUnit.DEGREES);
}
}


@ -0,0 +1,128 @@
/* Copyright (c) 2017 FIRST. All rights reserved.
*
* Redistribution and use in source and binary forms, with or without modification,
* are permitted (subject to the limitations in the disclaimer below) provided that
* the following conditions are met:
*
* Redistributions of source code must retain the above copyright notice, this list
* of conditions and the following disclaimer.
*
* Redistributions in binary form must reproduce the above copyright notice, this
* list of conditions and the following disclaimer in the documentation and/or
* other materials provided with the distribution.
*
* Neither the name of FIRST nor the names of its contributors may be used to endorse or
* promote products derived from this software without specific prior written permission.
*
* NO EXPRESS OR IMPLIED LICENSES TO ANY PARTY'S PATENT RIGHTS ARE GRANTED BY THIS
* LICENSE. THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
* "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO,
* THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE
* ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR CONTRIBUTORS BE LIABLE
* FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL
* DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR
* SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER
* CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY,
* OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
* OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
*/
package org.firstinspires.ftc.robotcontroller.external.samples;
import com.qualcomm.robotcore.eventloop.opmode.Autonomous;
import com.qualcomm.robotcore.eventloop.opmode.Disabled;
import com.qualcomm.robotcore.eventloop.opmode.LinearOpMode;
import com.qualcomm.robotcore.hardware.DcMotor;
import com.qualcomm.robotcore.util.ElapsedTime;
/*
* This OpMode illustrates the concept of driving a path based on time.
* The code is structured as a LinearOpMode
*
* The code assumes that you do NOT have encoders on the wheels,
* otherwise you would use: RobotAutoDriveByEncoder;
*
* The desired path in this example is:
* - Drive forward for 3 seconds
* - Spin right for 1.3 seconds
* - Drive Backward for 1 Second
*
* The code is written in a simple form with no optimizations.
* However, there are several ways that this type of sequence could be streamlined.
*
* Use Android Studio to Copy this Class, and Paste it into your team's code folder with a new name.
* Remove or comment out the @Disabled line to add this OpMode to the Driver Station OpMode list
*/
@Autonomous(name="Robot: Auto Drive By Time", group="Robot")
@Disabled
public class RobotAutoDriveByTime_Linear extends LinearOpMode {
/* Declare OpMode members. */
private DcMotor leftDrive = null;
private DcMotor rightDrive = null;
private ElapsedTime runtime = new ElapsedTime();
static final double FORWARD_SPEED = 0.6;
static final double TURN_SPEED = 0.5;
@Override
public void runOpMode() {
// Initialize the drive system variables.
leftDrive = hardwareMap.get(DcMotor.class, "left_drive");
rightDrive = hardwareMap.get(DcMotor.class, "right_drive");
// To drive forward, most robots need the motor on one side to be reversed, because the axles point in opposite directions.
// When run, this OpMode should start both motors driving forward. So adjust these two lines based on your first test drive.
// Note: The settings here assume direct drive on left and right wheels. Gear Reduction or 90 Deg drives may require direction flips
leftDrive.setDirection(DcMotor.Direction.REVERSE);
rightDrive.setDirection(DcMotor.Direction.FORWARD);
// Send telemetry message to signify the robot is waiting.
telemetry.addData("Status", "Ready to run");
telemetry.update();
// Wait for the game to start (driver presses PLAY)
waitForStart();
// Step through each leg of the path, ensuring that the Auto mode has not been stopped along the way
// Step 1: Drive forward for 3 seconds
leftDrive.setPower(FORWARD_SPEED);
rightDrive.setPower(FORWARD_SPEED);
runtime.reset();
while (opModeIsActive() && (runtime.seconds() < 3.0)) {
telemetry.addData("Path", "Leg 1: %4.1f S Elapsed", runtime.seconds());
telemetry.update();
}
// Step 2: Spin right for 1.3 seconds
leftDrive.setPower(TURN_SPEED);
rightDrive.setPower(-TURN_SPEED);
runtime.reset();
while (opModeIsActive() && (runtime.seconds() < 1.3)) {
telemetry.addData("Path", "Leg 2: %4.1f S Elapsed", runtime.seconds());
telemetry.update();
}
// Step 3: Drive Backward for 1 Second
leftDrive.setPower(-FORWARD_SPEED);
rightDrive.setPower(-FORWARD_SPEED);
runtime.reset();
while (opModeIsActive() && (runtime.seconds() < 1.0)) {
telemetry.addData("Path", "Leg 3: %4.1f S Elapsed", runtime.seconds());
telemetry.update();
}
// Step 4: Stop
leftDrive.setPower(0);
rightDrive.setPower(0);
telemetry.addData("Path", "Complete");
telemetry.update();
sleep(1000);
}
}
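The timed-leg pattern above can be reduced to its data, independent of robot hardware. The following is a minimal sketch in plain Java (no FTC SDK required); the class name, field names, and power values are illustrative only:

```java
// Hypothetical standalone sketch (plain Java, no FTC SDK): the timed-leg pattern
// used by RobotAutoDriveByTime_Linear, reduced to its data. Each leg is a pair of
// wheel powers held for a fixed duration; names and values are illustrative only.
public class TimedLegs {
    // Sum the per-leg durations to get the total path time.
    static double totalTime(double[] legSeconds) {
        double total = 0;
        for (double s : legSeconds) total += s;
        return total;
    }

    public static void main(String[] args) {
        double[] legSeconds = {3.0, 1.3, 1.0};                       // forward, spin right, backward
        double[][] power = {{0.6, 0.6}, {0.5, -0.5}, {-0.6, -0.6}};  // {left, right} motor powers
        for (int i = 0; i < legSeconds.length; i++) {
            System.out.printf("Leg %d: power L=%.1f R=%.1f for %.1f s%n",
                    i + 1, power[i][0], power[i][1], legSeconds[i]);
        }
        System.out.printf("Total path time: %.1f s%n", totalTime(legSeconds));
    }
}
```

Because each leg is just a (powers, duration) pair, a sequence like this can be tabulated and iterated rather than written as repeated while-loops.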

/* Copyright (c) 2023 FIRST. All rights reserved.
*
* Redistribution and use in source and binary forms, with or without modification,
* are permitted (subject to the limitations in the disclaimer below) provided that
* the following conditions are met:
*
* Redistributions of source code must retain the above copyright notice, this list
* of conditions and the following disclaimer.
*
* Redistributions in binary form must reproduce the above copyright notice, this
* list of conditions and the following disclaimer in the documentation and/or
* other materials provided with the distribution.
*
* Neither the name of FIRST nor the names of its contributors may be used to endorse or
* promote products derived from this software without specific prior written permission.
*
* NO EXPRESS OR IMPLIED LICENSES TO ANY PARTY'S PATENT RIGHTS ARE GRANTED BY THIS
* LICENSE. THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
* "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO,
* THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE
* ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR CONTRIBUTORS BE LIABLE
* FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL
* DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR
* SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER
* CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY,
* OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
* OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
*/
package org.firstinspires.ftc.robotcontroller.external.samples;
import com.qualcomm.robotcore.eventloop.opmode.Disabled;
import com.qualcomm.robotcore.eventloop.opmode.LinearOpMode;
import com.qualcomm.robotcore.eventloop.opmode.TeleOp;
import com.qualcomm.robotcore.hardware.DcMotor;
import com.qualcomm.robotcore.util.Range;
import org.firstinspires.ftc.robotcore.external.hardware.camera.BuiltinCameraDirection;
import org.firstinspires.ftc.robotcore.external.hardware.camera.WebcamName;
import org.firstinspires.ftc.robotcore.external.hardware.camera.controls.ExposureControl;
import org.firstinspires.ftc.robotcore.external.hardware.camera.controls.GainControl;
import org.firstinspires.ftc.vision.VisionPortal;
import org.firstinspires.ftc.vision.apriltag.AprilTagDetection;
import org.firstinspires.ftc.vision.apriltag.AprilTagProcessor;
import java.util.List;
import java.util.concurrent.TimeUnit;
/*
* This OpMode illustrates using a camera to locate and drive towards a specific AprilTag.
* The code assumes a Holonomic (Mecanum or X Drive) Robot.
*
* For an introduction to AprilTags, see the ftc-docs link below:
* https://ftc-docs.firstinspires.org/en/latest/apriltag/vision_portal/apriltag_intro/apriltag-intro.html
*
* When an AprilTag in the TagLibrary is detected, the SDK provides location and orientation of the tag, relative to the camera.
* This information is provided in the "ftcPose" member of the returned "detection", and is explained in the ftc-docs page linked below.
* https://ftc-docs.firstinspires.org/apriltag-detection-values
*
* The drive goal is to rotate to keep the Tag centered in the camera, while strafing to be directly in front of the tag, and
* driving towards the tag to achieve the desired distance.
* To reduce any motion blur (which will interrupt the detection process), the camera exposure is reduced to a very low value (6 ms).
* You can determine the best Exposure and Gain values by using the ConceptAprilTagOptimizeExposure OpMode in this Samples folder.
*
* The code assumes a Robot Configuration with motors named: leftfront_drive and rightfront_drive, leftback_drive and rightback_drive.
* The motor directions must be set so a positive power goes forward on all wheels.
* This sample assumes that the current game AprilTag Library (usually for the current season) is being loaded by default,
* so you should choose to approach a valid tag ID (usually starting at 0)
*
* Under manual control, the left stick will move forward/back & left/right. The right stick will rotate the robot.
* Manually drive the robot until it displays Target data on the Driver Station.
*
* Press and hold the *Left Bumper* to enable the automatic "Drive to target" mode.
* Release the Left Bumper to return to manual driving mode.
*
* Under "Drive To Target" mode, the robot has three goals:
* 1) Turn the robot to always keep the Tag centered on the camera frame. (Use the Target Bearing to turn the robot.)
* 2) Strafe the robot towards the centerline of the Tag, so it approaches directly in front of the tag. (Use the Target Yaw to strafe the robot)
* 3) Drive towards the Tag to get to the desired distance. (Use Tag Range to drive the robot forward/backward)
*
* Use DESIRED_DISTANCE to set how close you want the robot to get to the target.
* Speed and Turn sensitivity can be adjusted using the SPEED_GAIN, STRAFE_GAIN and TURN_GAIN constants.
*
* Use Android Studio to Copy this Class, and Paste it into the TeamCode/src/main/java/org/firstinspires/ftc/teamcode folder.
* Remove or comment out the @Disabled line to add this OpMode to the Driver Station OpMode list.
*
*/
@TeleOp(name="Omni Drive To AprilTag", group = "Concept")
@Disabled
public class RobotAutoDriveToAprilTagOmni extends LinearOpMode
{
// Adjust these numbers to suit your robot.
final double DESIRED_DISTANCE = 12.0; // this is how close the camera should get to the target (inches)
// Set the GAIN constants to control the relationship between the measured position error, and how much power is
// applied to the drive motors to correct the error.
// Drive = Error * Gain.  Make these values smaller for smoother control, or larger for a more aggressive response.
final double SPEED_GAIN = 0.02 ; // Forward Speed Control "Gain". eg: Ramp up to 50% power at a 25 inch error. (0.50 / 25.0)
final double STRAFE_GAIN = 0.015 ; // Strafe Speed Control "Gain". eg: Ramp up to 25% power at a 25 degree Yaw error. (0.25 / 25.0)
final double TURN_GAIN = 0.01 ; // Turn Control "Gain". eg: Ramp up to 25% power at a 25 degree error. (0.25 / 25.0)
final double MAX_AUTO_SPEED = 0.5; // Clip the approach speed to this max value (adjust for your robot)
final double MAX_AUTO_STRAFE= 0.5; // Clip the approach speed to this max value (adjust for your robot)
final double MAX_AUTO_TURN = 0.3; // Clip the turn speed to this max value (adjust for your robot)
private DcMotor leftFrontDrive = null; // Used to control the left front drive wheel
private DcMotor rightFrontDrive = null; // Used to control the right front drive wheel
private DcMotor leftBackDrive = null; // Used to control the left back drive wheel
private DcMotor rightBackDrive = null; // Used to control the right back drive wheel
private static final boolean USE_WEBCAM = true; // Set true to use a webcam, or false for a phone camera
private static final int DESIRED_TAG_ID = -1; // Choose the tag you want to approach or set to -1 for ANY tag.
private VisionPortal visionPortal; // Used to manage the video source.
private AprilTagProcessor aprilTag; // Used for managing the AprilTag detection process.
private AprilTagDetection desiredTag = null; // Used to hold the data for a detected AprilTag
@Override public void runOpMode()
{
boolean targetFound = false; // Set to true when an AprilTag target is detected
double drive = 0; // Desired forward power/speed (-1 to +1)
double strafe = 0; // Desired strafe power/speed (-1 to +1)
double turn = 0; // Desired turning power/speed (-1 to +1)
// Initialize the Apriltag Detection process
initAprilTag();
// Initialize the hardware variables. Note that the strings used here as parameters
// to 'get' must match the names assigned during the robot configuration
// step (using the FTC Robot Controller app on the phone).
leftFrontDrive = hardwareMap.get(DcMotor.class, "leftfront_drive");
rightFrontDrive = hardwareMap.get(DcMotor.class, "rightfront_drive");
leftBackDrive = hardwareMap.get(DcMotor.class, "leftback_drive");
rightBackDrive = hardwareMap.get(DcMotor.class, "rightback_drive");
// To drive forward, most robots need the motor on one side to be reversed, because the axles point in opposite directions.
// When run, this OpMode should start both motors driving forward. So adjust these two lines based on your first test drive.
// Note: The settings here assume direct drive on left and right wheels. Gear Reduction or 90 Deg drives may require direction flips
leftFrontDrive.setDirection(DcMotor.Direction.REVERSE);
leftBackDrive.setDirection(DcMotor.Direction.REVERSE);
rightFrontDrive.setDirection(DcMotor.Direction.FORWARD);
rightBackDrive.setDirection(DcMotor.Direction.FORWARD);
if (USE_WEBCAM)
setManualExposure(6, 250); // Use low exposure time to reduce motion blur
// Wait for driver to press start
telemetry.addData("Camera preview on/off", "3 dots, Camera Stream");
telemetry.addData(">", "Touch Play to start OpMode");
telemetry.update();
waitForStart();
while (opModeIsActive())
{
targetFound = false;
desiredTag = null;
// Step through the list of detected tags and look for a matching tag
List<AprilTagDetection> currentDetections = aprilTag.getDetections();
for (AprilTagDetection detection : currentDetections) {
// Look to see if we have size info on this tag.
if (detection.metadata != null) {
// Check to see if we want to track towards this tag.
if ((DESIRED_TAG_ID < 0) || (detection.id == DESIRED_TAG_ID)) {
// Yes, we want to use this tag.
targetFound = true;
desiredTag = detection;
break; // don't look any further.
} else {
// This tag is in the library, but we do not want to track it right now.
telemetry.addData("Skipping", "Tag ID %d is not desired", detection.id);
}
} else {
// This tag is NOT in the library, so we don't have enough information to track to it.
telemetry.addData("Unknown", "Tag ID %d is not in TagLibrary", detection.id);
}
}
// Tell the driver what we see, and what to do.
if (targetFound) {
telemetry.addData("\n>","HOLD Left-Bumper to Drive to Target\n");
telemetry.addData("Found", "ID %d (%s)", desiredTag.id, desiredTag.metadata.name);
telemetry.addData("Range", "%5.1f inches", desiredTag.ftcPose.range);
telemetry.addData("Bearing","%3.0f degrees", desiredTag.ftcPose.bearing);
telemetry.addData("Yaw","%3.0f degrees", desiredTag.ftcPose.yaw);
} else {
telemetry.addData("\n>","Drive using joysticks to find valid target\n");
}
// If the Left Bumper is being pressed AND we have found the desired target, drive to the target automatically.
if (gamepad1.left_bumper && targetFound) {
// Determine heading, range and Yaw (tag image rotation) error so we can use them to control the robot automatically.
double rangeError = (desiredTag.ftcPose.range - DESIRED_DISTANCE);
double headingError = desiredTag.ftcPose.bearing;
double yawError = desiredTag.ftcPose.yaw;
// Use the speed and turn "gains" to calculate how we want the robot to move.
drive = Range.clip(rangeError * SPEED_GAIN, -MAX_AUTO_SPEED, MAX_AUTO_SPEED);
turn = Range.clip(headingError * TURN_GAIN, -MAX_AUTO_TURN, MAX_AUTO_TURN) ;
strafe = Range.clip(-yawError * STRAFE_GAIN, -MAX_AUTO_STRAFE, MAX_AUTO_STRAFE);
telemetry.addData("Auto","Drive %5.2f, Strafe %5.2f, Turn %5.2f ", drive, strafe, turn);
} else {
// Drive using manual POV joystick mode. Slow things down to make the robot more controllable.
drive = -gamepad1.left_stick_y / 2.0; // Reduce drive rate to 50%.
strafe = -gamepad1.left_stick_x / 2.0; // Reduce strafe rate to 50%.
turn = -gamepad1.right_stick_x / 3.0; // Reduce turn rate to 33%.
telemetry.addData("Manual","Drive %5.2f, Strafe %5.2f, Turn %5.2f ", drive, strafe, turn);
}
telemetry.update();
// Apply desired axes motions to the drivetrain.
moveRobot(drive, strafe, turn);
sleep(10);
}
}
/**
* Move robot according to desired axes motions
* <p>
* Positive X is forward
* <p>
* Positive Y is strafe left
* <p>
* Positive Yaw is counter-clockwise
*/
public void moveRobot(double x, double y, double yaw) {
// Calculate wheel powers.
double leftFrontPower  = x - y - yaw;
double rightFrontPower = x + y + yaw;
double leftBackPower   = x + y - yaw;
double rightBackPower  = x - y + yaw;
// Normalize wheel powers to be less than 1.0
double max = Math.max(Math.abs(leftFrontPower), Math.abs(rightFrontPower));
max = Math.max(max, Math.abs(leftBackPower));
max = Math.max(max, Math.abs(rightBackPower));
if (max > 1.0) {
leftFrontPower /= max;
rightFrontPower /= max;
leftBackPower /= max;
rightBackPower /= max;
}
// Send powers to the wheels.
leftFrontDrive.setPower(leftFrontPower);
rightFrontDrive.setPower(rightFrontPower);
leftBackDrive.setPower(leftBackPower);
rightBackDrive.setPower(rightBackPower);
}
/**
* Initialize the AprilTag processor.
*/
private void initAprilTag() {
// Create the AprilTag processor by using a builder.
aprilTag = new AprilTagProcessor.Builder().build();
// Adjust Image Decimation to trade-off detection-range for detection-rate.
// eg: Some typical detection data using a Logitech C920 WebCam
// Decimation = 1 .. Detect 2" Tag from 10 feet away at 10 Frames per second
// Decimation = 2 .. Detect 2" Tag from 6 feet away at 22 Frames per second
// Decimation = 3 .. Detect 2" Tag from 4 feet away at 30 Frames Per Second
// Decimation = 3 .. Detect 5" Tag from 10 feet away at 30 Frames Per Second
// Note: Decimation can be changed on-the-fly to adapt during a match.
aprilTag.setDecimation(2);
// Create the vision portal by using a builder.
if (USE_WEBCAM) {
visionPortal = new VisionPortal.Builder()
.setCamera(hardwareMap.get(WebcamName.class, "Webcam 1"))
.addProcessor(aprilTag)
.build();
} else {
visionPortal = new VisionPortal.Builder()
.setCamera(BuiltinCameraDirection.BACK)
.addProcessor(aprilTag)
.build();
}
}
/*
Manually set the camera gain and exposure.
This can only be called AFTER calling initAprilTag(), and only works for Webcams.
*/
private void setManualExposure(int exposureMS, int gain) {
// Wait for the camera to be open, then use the controls
if (visionPortal == null) {
return;
}
// Make sure camera is streaming before we try to set the exposure controls
if (visionPortal.getCameraState() != VisionPortal.CameraState.STREAMING) {
telemetry.addData("Camera", "Waiting");
telemetry.update();
while (!isStopRequested() && (visionPortal.getCameraState() != VisionPortal.CameraState.STREAMING)) {
sleep(20);
}
telemetry.addData("Camera", "Ready");
telemetry.update();
}
// Set camera controls unless we are stopping.
if (!isStopRequested())
{
ExposureControl exposureControl = visionPortal.getCameraControl(ExposureControl.class);
if (exposureControl.getMode() != ExposureControl.Mode.Manual) {
exposureControl.setMode(ExposureControl.Mode.Manual);
sleep(50);
}
exposureControl.setExposure((long)exposureMS, TimeUnit.MILLISECONDS);
sleep(20);
GainControl gainControl = visionPortal.getCameraControl(GainControl.class);
gainControl.setGain(gain);
sleep(20);
}
}
}
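The proportional "Drive = Error * Gain" law used above, together with the clipping step, can be checked with pencil-and-paper numbers. The following is a standalone sketch in plain Java (no FTC SDK; the class name and sample error values are illustrative, and clip mirrors what Range.clip does in the SDK):

```java
// Hypothetical standalone sketch (plain Java, no FTC SDK): the proportional
// control law "output = error * gain", clipped to a maximum, applied to sample
// AprilTag pose errors. Gains match the constants used in the sample above.
public class DriveToTagMath {
    // Constrain v to the range [lo, hi], mirroring the SDK's Range.clip().
    static double clip(double v, double lo, double hi) {
        return Math.max(lo, Math.min(hi, v));
    }

    public static void main(String[] args) {
        final double SPEED_GAIN = 0.02, TURN_GAIN = 0.01, STRAFE_GAIN = 0.015;
        final double MAX_SPEED = 0.5, MAX_TURN = 0.3, MAX_STRAFE = 0.5;

        double rangeError   = 30.0;  // inches past the desired distance
        double headingError = 10.0;  // degrees of bearing
        double yawError     = -5.0;  // degrees of tag yaw

        double drive  = clip(rangeError * SPEED_GAIN,  -MAX_SPEED,  MAX_SPEED);   // 0.6 clipped to 0.5
        double turn   = clip(headingError * TURN_GAIN, -MAX_TURN,   MAX_TURN);    // 0.10
        double strafe = clip(-yawError * STRAFE_GAIN,  -MAX_STRAFE, MAX_STRAFE);  // 0.075
        System.out.printf("drive=%.3f turn=%.3f strafe=%.3f%n", drive, turn, strafe);
    }
}
```

Note how the 30-inch range error would produce 0.6 power at the stated gain, but the clip step caps it at MAX_SPEED, which is what keeps large errors from commanding full-speed lunges.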

/* Copyright (c) 2023 FIRST. All rights reserved.
*
* Redistribution and use in source and binary forms, with or without modification,
* are permitted (subject to the limitations in the disclaimer below) provided that
* the following conditions are met:
*
* Redistributions of source code must retain the above copyright notice, this list
* of conditions and the following disclaimer.
*
* Redistributions in binary form must reproduce the above copyright notice, this
* list of conditions and the following disclaimer in the documentation and/or
* other materials provided with the distribution.
*
* Neither the name of FIRST nor the names of its contributors may be used to endorse or
* promote products derived from this software without specific prior written permission.
*
* NO EXPRESS OR IMPLIED LICENSES TO ANY PARTY'S PATENT RIGHTS ARE GRANTED BY THIS
* LICENSE. THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
* "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO,
* THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE
* ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR CONTRIBUTORS BE LIABLE
* FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL
* DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR
* SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER
* CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY,
* OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
* OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
*/
package org.firstinspires.ftc.robotcontroller.external.samples;
import com.qualcomm.robotcore.eventloop.opmode.Disabled;
import com.qualcomm.robotcore.eventloop.opmode.LinearOpMode;
import com.qualcomm.robotcore.eventloop.opmode.TeleOp;
import com.qualcomm.robotcore.hardware.DcMotor;
import com.qualcomm.robotcore.util.Range;
import org.firstinspires.ftc.robotcore.external.hardware.camera.BuiltinCameraDirection;
import org.firstinspires.ftc.robotcore.external.hardware.camera.WebcamName;
import org.firstinspires.ftc.robotcore.external.hardware.camera.controls.ExposureControl;
import org.firstinspires.ftc.robotcore.external.hardware.camera.controls.GainControl;
import org.firstinspires.ftc.vision.VisionPortal;
import org.firstinspires.ftc.vision.apriltag.AprilTagDetection;
import org.firstinspires.ftc.vision.apriltag.AprilTagProcessor;
import java.util.List;
import java.util.concurrent.TimeUnit;
/*
* This OpMode illustrates using a camera to locate and drive towards a specific AprilTag.
* The code assumes a basic two-wheel (Tank) Robot Drivetrain
*
* For an introduction to AprilTags, see the ftc-docs link below:
* https://ftc-docs.firstinspires.org/en/latest/apriltag/vision_portal/apriltag_intro/apriltag-intro.html
*
* When an AprilTag in the TagLibrary is detected, the SDK provides location and orientation of the tag, relative to the camera.
* This information is provided in the "ftcPose" member of the returned "detection", and is explained in the ftc-docs page linked below.
* https://ftc-docs.firstinspires.org/apriltag-detection-values
*
* The driving goal is to rotate to keep the tag centered in the camera, while driving towards the tag to achieve the desired distance.
* To reduce any motion blur (which will interrupt the detection process), the camera exposure is reduced to a very low value (6 ms).
* You can determine the best exposure and gain values by using the ConceptAprilTagOptimizeExposure OpMode in this Samples folder.
*
* The code assumes a Robot Configuration with motors named left_drive and right_drive.
* The motor directions must be set so a positive power goes forward on both wheels;
* This sample assumes that the default AprilTag Library (usually for the current season) is being loaded by default
* so you should choose to approach a valid tag ID (usually starting at 0)
*
* Under manual control, the left stick will move forward/back, and the right stick will rotate the robot.
* This is called POV Joystick mode, as opposed to Tank Drive (where each joystick controls a wheel).
*
* Manually drive the robot until it displays Target data on the Driver Station.
* Press and hold the *Left Bumper* to enable the automatic "Drive to target" mode.
* Release the Left Bumper to return to manual driving mode.
*
* Under "Drive To Target" mode, the robot has two goals:
* 1) Turn the robot to always keep the Tag centered on the camera frame. (Use the Target Bearing to turn the robot.)
* 2) Drive towards the Tag to get to the desired distance. (Use Tag Range to drive the robot forward/backward)
*
* Use DESIRED_DISTANCE to set how close you want the robot to get to the target.
* Speed and Turn sensitivity can be adjusted using the SPEED_GAIN and TURN_GAIN constants.
*
* Use Android Studio to Copy this Class, and Paste it into the TeamCode/src/main/java/org/firstinspires/ftc/teamcode folder.
* Remove or comment out the @Disabled line to add this OpMode to the Driver Station OpMode list.
*
*/
@TeleOp(name="Tank Drive To AprilTag", group = "Concept")
@Disabled
public class RobotAutoDriveToAprilTagTank extends LinearOpMode
{
// Adjust these numbers to suit your robot.
final double DESIRED_DISTANCE = 12.0; // this is how close the camera should get to the target (inches)
// Set the GAIN constants to control the relationship between the measured position error, and how much power is
// applied to the drive motors to correct the error.
// Drive = Error * Gain.  Make these values smaller for smoother control, or larger for a more aggressive response.
final double SPEED_GAIN = 0.02 ; // Speed Control "Gain". eg: Ramp up to 50% power at a 25 inch error. (0.50 / 25.0)
final double TURN_GAIN = 0.01 ; // Turn Control "Gain". eg: Ramp up to 25% power at a 25 degree error. (0.25 / 25.0)
final double MAX_AUTO_SPEED = 0.5; // Clip the approach speed to this max value (adjust for your robot)
final double MAX_AUTO_TURN = 0.25; // Clip the turn speed to this max value (adjust for your robot)
private DcMotor leftDrive = null; // Used to control the left drive wheel
private DcMotor rightDrive = null; // Used to control the right drive wheel
private static final boolean USE_WEBCAM = true; // Set true to use a webcam, or false for a phone camera
private static final int DESIRED_TAG_ID = -1; // Choose the tag you want to approach or set to -1 for ANY tag.
private VisionPortal visionPortal; // Used to manage the video source.
private AprilTagProcessor aprilTag; // Used for managing the AprilTag detection process.
private AprilTagDetection desiredTag = null; // Used to hold the data for a detected AprilTag
@Override public void runOpMode()
{
boolean targetFound = false; // Set to true when an AprilTag target is detected
double drive = 0; // Desired forward power/speed (-1 to +1) +ve is forward
double turn = 0; // Desired turning power/speed (-1 to +1) +ve is CounterClockwise
// Initialize the Apriltag Detection process
initAprilTag();
// Initialize the hardware variables. Note that the strings used here as parameters
// to 'get' must match the names assigned during the robot configuration
// step (using the FTC Robot Controller app on the phone).
leftDrive = hardwareMap.get(DcMotor.class, "left_drive");
rightDrive = hardwareMap.get(DcMotor.class, "right_drive");
// To drive forward, most robots need the motor on one side to be reversed because the axles point in opposite directions.
// When run, this OpMode should start both motors driving forward. So adjust these two lines based on your first test drive.
// Note: The settings here assume direct drive on left and right wheels. Single Gear Reduction or 90 Deg drives may require direction flips
leftDrive.setDirection(DcMotor.Direction.REVERSE);
rightDrive.setDirection(DcMotor.Direction.FORWARD);
if (USE_WEBCAM)
setManualExposure(6, 250); // Use low exposure time to reduce motion blur
// Wait for the driver to press Start
telemetry.addData("Camera preview on/off", "3 dots, Camera Stream");
telemetry.addData(">", "Touch Play to start OpMode");
telemetry.update();
waitForStart();
while (opModeIsActive())
{
targetFound = false;
desiredTag = null;
// Step through the list of detected tags and look for a matching tag
List<AprilTagDetection> currentDetections = aprilTag.getDetections();
for (AprilTagDetection detection : currentDetections) {
// Look to see if we have size info on this tag.
if (detection.metadata != null) {
// Check to see if we want to track towards this tag.
if ((DESIRED_TAG_ID < 0) || (detection.id == DESIRED_TAG_ID)) {
// Yes, we want to use this tag.
targetFound = true;
desiredTag = detection;
break; // don't look any further.
} else {
// This tag is in the library, but we do not want to track it right now.
telemetry.addData("Skipping", "Tag ID %d is not desired", detection.id);
}
} else {
// This tag is NOT in the library, so we don't have enough information to track to it.
telemetry.addData("Unknown", "Tag ID %d is not in TagLibrary", detection.id);
}
}
// Tell the driver what we see, and what to do.
if (targetFound) {
telemetry.addData("\n>","HOLD Left-Bumper to Drive to Target\n");
telemetry.addData("Found", "ID %d (%s)", desiredTag.id, desiredTag.metadata.name);
telemetry.addData("Range", "%5.1f inches", desiredTag.ftcPose.range);
telemetry.addData("Bearing","%3.0f degrees", desiredTag.ftcPose.bearing);
} else {
telemetry.addData("\n>","Drive using joysticks to find valid target\n");
}
// If the Left Bumper is being pressed AND we have found the desired target, drive to the target automatically.
if (gamepad1.left_bumper && targetFound) {
// Determine heading and range error so we can use them to control the robot automatically.
double rangeError = (desiredTag.ftcPose.range - DESIRED_DISTANCE);
double headingError = desiredTag.ftcPose.bearing;
// Use the speed and turn "gains" to calculate how we want the robot to move. Clip it to the maximum
drive = Range.clip(rangeError * SPEED_GAIN, -MAX_AUTO_SPEED, MAX_AUTO_SPEED);
turn = Range.clip(headingError * TURN_GAIN, -MAX_AUTO_TURN, MAX_AUTO_TURN) ;
telemetry.addData("Auto","Drive %5.2f, Turn %5.2f", drive, turn);
} else {
// Drive using manual POV joystick mode.
drive = -gamepad1.left_stick_y / 2.0; // Reduce drive rate to 50%.
turn = -gamepad1.right_stick_x / 4.0; // Reduce turn rate to 25%.
telemetry.addData("Manual","Drive %5.2f, Turn %5.2f", drive, turn);
}
telemetry.update();
// Apply desired axes motions to the drivetrain.
moveRobot(drive, turn);
sleep(10);
}
}
/**
* Move robot according to desired axes motions
* <p>
* Positive X is forward
* <p>
* Positive Yaw is counter-clockwise
*/
public void moveRobot(double x, double yaw) {
// Calculate left and right wheel powers.
double leftPower = x - yaw;
double rightPower = x + yaw;
// Normalize wheel powers to be less than 1.0
double max = Math.max(Math.abs(leftPower), Math.abs(rightPower));
if (max > 1.0) {
leftPower /= max;
rightPower /= max;
}
// Send powers to the wheels.
leftDrive.setPower(leftPower);
rightDrive.setPower(rightPower);
}
/**
* Initialize the AprilTag processor.
*/
private void initAprilTag() {
// Create the AprilTag processor by using a builder.
aprilTag = new AprilTagProcessor.Builder().build();
// Adjust Image Decimation to trade-off detection-range for detection-rate.
// eg: Some typical detection data using a Logitech C920 WebCam
// Decimation = 1 .. Detect 2" Tag from 10 feet away at 10 Frames per second
// Decimation = 2 .. Detect 2" Tag from 6 feet away at 22 Frames per second
// Decimation = 3 .. Detect 2" Tag from 4 feet away at 30 Frames Per Second
// Decimation = 3 .. Detect 5" Tag from 10 feet away at 30 Frames Per Second
// Note: Decimation can be changed on-the-fly to adapt during a match.
aprilTag.setDecimation(2);
// Create the vision portal by using a builder.
if (USE_WEBCAM) {
visionPortal = new VisionPortal.Builder()
.setCamera(hardwareMap.get(WebcamName.class, "Webcam 1"))
.addProcessor(aprilTag)
.build();
} else {
visionPortal = new VisionPortal.Builder()
.setCamera(BuiltinCameraDirection.BACK)
.addProcessor(aprilTag)
.build();
}
}
/*
Manually set the camera gain and exposure.
This can only be called AFTER calling initAprilTag(), and only works for Webcams.
*/
private void setManualExposure(int exposureMS, int gain) {
// Wait for the camera to be open, then use the controls
if (visionPortal == null) {
return;
}
// Make sure camera is streaming before we try to set the exposure controls
if (visionPortal.getCameraState() != VisionPortal.CameraState.STREAMING) {
telemetry.addData("Camera", "Waiting");
telemetry.update();
while (!isStopRequested() && (visionPortal.getCameraState() != VisionPortal.CameraState.STREAMING)) {
sleep(20);
}
telemetry.addData("Camera", "Ready");
telemetry.update();
}
// Set camera controls unless we are stopping.
if (!isStopRequested())
{
ExposureControl exposureControl = visionPortal.getCameraControl(ExposureControl.class);
if (exposureControl.getMode() != ExposureControl.Mode.Manual) {
exposureControl.setMode(ExposureControl.Mode.Manual);
sleep(50);
}
exposureControl.setExposure((long)exposureMS, TimeUnit.MILLISECONDS);
sleep(20);
GainControl gainControl = visionPortal.getCameraControl(GainControl.class);
gainControl.setGain(gain);
sleep(20);
telemetry.addData("Camera", "Ready");
telemetry.update();
}
}
}

/* Copyright (c) 2017 FIRST. All rights reserved.
*
* Redistribution and use in source and binary forms, with or without modification,
* are permitted (subject to the limitations in the disclaimer below) provided that
* the following conditions are met:
*
* Redistributions of source code must retain the above copyright notice, this list
* of conditions and the following disclaimer.
*
* Redistributions in binary form must reproduce the above copyright notice, this
* list of conditions and the following disclaimer in the documentation and/or
* other materials provided with the distribution.
*
* Neither the name of FIRST nor the names of its contributors may be used to endorse or
* promote products derived from this software without specific prior written permission.
*
* NO EXPRESS OR IMPLIED LICENSES TO ANY PARTY'S PATENT RIGHTS ARE GRANTED BY THIS
* LICENSE. THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
* "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO,
* THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE
* ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR CONTRIBUTORS BE LIABLE
* FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL
* DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR
* SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER
* CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY,
* OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
* OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
*/
package org.firstinspires.ftc.robotcontroller.external.samples;
import com.qualcomm.robotcore.eventloop.opmode.Autonomous;
import com.qualcomm.robotcore.eventloop.opmode.Disabled;
import com.qualcomm.robotcore.eventloop.opmode.LinearOpMode;
import com.qualcomm.robotcore.hardware.DcMotor;
import com.qualcomm.robotcore.hardware.NormalizedColorSensor;
import com.qualcomm.robotcore.hardware.NormalizedRGBA;
import com.qualcomm.robotcore.hardware.SwitchableLight;
/*
* This OpMode illustrates the concept of driving up to a line and then stopping.
* The code is structured as a LinearOpMode
*
* The Sensor used here can be a REV Color Sensor V2 or V3. Make sure the white LED is turned on.
* The sensor can be plugged into any I2C port, and must be named "sensor_color" in the active configuration.
*
* Depending on the height of your color sensor, you may want to set the sensor "gain".
* The higher the gain, the greater the reflected light reading will be.
* Use the SensorColor sample in this folder to determine the minimum gain value that provides an
* "Alpha" reading of 1.0 when you are on top of the white line. In this sample, we use a gain of 15
* which works well with a Rev V2 color sensor
*
* Setting the correct WHITE_THRESHOLD value is key to stopping correctly.
* This should be set halfway between the bare-tile, and white-line "Alpha" values.
* The reflected light value can be read on the screen once the OpMode has been INIT, but before it is STARTED.
* Move the sensor on and off the white line and note the min and max readings.
* Edit this code to make WHITE_THRESHOLD halfway between the min and max.
*
* Use Android Studio to Copy this Class, and Paste it into your team's code folder with a new name.
* Remove or comment out the @Disabled line to add this OpMode to the Driver Station OpMode list
*/
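/*
 * Worked example (hypothetical readings): if telemetry shows roughly 0.2 over a
 * bare tile and roughly 0.8 over the white line, set the threshold halfway between:
 * WHITE_THRESHOLD = (0.2 + 0.8) / 2 = 0.5.
 */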
@Autonomous(name="Robot: Auto Drive To Line", group="Robot")
@Disabled
public class RobotAutoDriveToLine_Linear extends LinearOpMode {
/* Declare OpMode members. */
private DcMotor leftDrive = null;
private DcMotor rightDrive = null;
/** The variable to store a reference to our color sensor hardware object */
NormalizedColorSensor colorSensor;
static final double WHITE_THRESHOLD = 0.5; // spans between 0.0 - 1.0 from dark to light
static final double APPROACH_SPEED = 0.25;
@Override
public void runOpMode() {
// Initialize the drive system variables.
leftDrive = hardwareMap.get(DcMotor.class, "left_drive");
rightDrive = hardwareMap.get(DcMotor.class, "right_drive");
// To drive forward, most robots need the motor on one side to be reversed, because the axles point in opposite directions.
// When run, this OpMode should start both motors driving forward. So adjust these two lines based on your first test drive.
// Note: The settings here assume direct drive on left and right wheels. Gear Reduction or 90 Deg drives may require direction flips
leftDrive.setDirection(DcMotor.Direction.REVERSE);
rightDrive.setDirection(DcMotor.Direction.FORWARD);
// If there are encoders connected, switch to RUN_USING_ENCODER mode for greater accuracy
// leftDrive.setMode(DcMotor.RunMode.RUN_USING_ENCODER);
// rightDrive.setMode(DcMotor.RunMode.RUN_USING_ENCODER);
// Get a reference to our sensor object. It's recommended to use NormalizedColorSensor over
// ColorSensor, because NormalizedColorSensor consistently gives values between 0 and 1, while
// the values you get from ColorSensor are dependent on the specific sensor you're using.
colorSensor = hardwareMap.get(NormalizedColorSensor.class, "sensor_color");
// If necessary, turn ON the white LED (if there is no LED switch on the sensor)
if (colorSensor instanceof SwitchableLight) {
((SwitchableLight)colorSensor).enableLight(true);
}
// Some sensors allow you to set your light sensor gain for optimal sensitivity...
// See the SensorColor sample in this folder for how to determine the optimal gain.
// A gain of 15 causes a Rev Color Sensor V2 to produce an Alpha value of 1.0 at about 1.5" above the floor.
colorSensor.setGain(15);
// Wait for the driver to press PLAY.
// Abort this loop if the OpMode is started or stopped.
while (opModeInInit()) {
// Send telemetry message to signify robot waiting;
telemetry.addData("Status", "Ready to drive to white line.");
// Display the light level while we are waiting to start
getBrightness();
}
// Start the robot moving forward, and then begin looking for a white line.
leftDrive.setPower(APPROACH_SPEED);
rightDrive.setPower(APPROACH_SPEED);
// run until the white line is seen OR the driver presses STOP;
while (opModeIsActive() && (getBrightness() < WHITE_THRESHOLD)) {
sleep(5);
}
// Stop all motors
leftDrive.setPower(0);
rightDrive.setPower(0);
}
// To obtain the reflected light level, read the normalized values from the color sensor and return the Alpha channel.
double getBrightness() {
NormalizedRGBA colors = colorSensor.getNormalizedColors();
telemetry.addData("Light Level (0 to 1)", "%4.2f", colors.alpha);
telemetry.update();
return colors.alpha;
}
}

/* Copyright (c) 2022 FIRST. All rights reserved.
*
* Redistribution and use in source and binary forms, with or without modification,
* are permitted (subject to the limitations in the disclaimer below) provided that
* the following conditions are met:
*
* Redistributions of source code must retain the above copyright notice, this list
* of conditions and the following disclaimer.
*
* Redistributions in binary form must reproduce the above copyright notice, this
* list of conditions and the following disclaimer in the documentation and/or
* other materials provided with the distribution.
*
* Neither the name of FIRST nor the names of its contributors may be used to endorse or
* promote products derived from this software without specific prior written permission.
*
* NO EXPRESS OR IMPLIED LICENSES TO ANY PARTY'S PATENT RIGHTS ARE GRANTED BY THIS
* LICENSE. THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
* "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO,
* THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE
* ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR CONTRIBUTORS BE LIABLE
* FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL
* DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR
* SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER
* CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY,
* OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
* OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
*/
package org.firstinspires.ftc.robotcontroller.external.samples;
import com.qualcomm.robotcore.eventloop.opmode.LinearOpMode;
import com.qualcomm.robotcore.hardware.DcMotor;
import com.qualcomm.robotcore.hardware.Servo;
import com.qualcomm.robotcore.util.Range;
/*
* This file works in conjunction with the External Hardware Class sample called: ConceptExternalHardwareClass.java
* Please read the explanations in that Sample about how to use this class definition.
*
* This file defines a Java Class that performs all the setup and configuration for a sample robot's hardware (motors and sensors).
* It assumes three motors (left_drive, right_drive and arm) and two servos (left_hand and right_hand)
*
* This one file/class can be used by ALL of your OpModes without having to cut & paste the code each time.
*
* Where possible, the actual hardware objects are "abstracted" (or hidden) so the OpMode code just makes calls into the class,
* rather than accessing the internal hardware directly. This is why the objects are declared "private".
*
* Use Android Studio to Copy this Class, and Paste it into your team's code folder with *exactly the same name*.
*
* Or... In OnBot Java, add a new file named RobotHardware.java, select this sample, and select Not an OpMode.
* Also add a new OpMode, select the sample ConceptExternalHardwareClass.java, and select TeleOp.
*
*/
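/*
 * Typical usage from a team OpMode (a sketch; assumes this class has been copied
 * into your team's code folder unchanged):
 *
 *   RobotHardware robot = new RobotHardware(this);  // pass a reference to the OpMode
 *   robot.init();                                   // map and configure all hardware
 *   robot.driveRobot(-gamepad1.left_stick_y, gamepad1.right_stick_x);
 */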
public class RobotHardware {
/* Declare OpMode members. */
private LinearOpMode myOpMode = null; // gain access to methods in the calling OpMode.
// Define Motor and Servo objects (Make them private so they can't be accessed externally)
private DcMotor leftDrive = null;
private DcMotor rightDrive = null;
private DcMotor armMotor = null;
private Servo leftHand = null;
private Servo rightHand = null;
// Define Drive constants. Make them public so they CAN be used by the calling OpMode
public static final double MID_SERVO = 0.5 ;
public static final double HAND_SPEED = 0.02 ; // sets rate to move servo
public static final double ARM_UP_POWER = 0.45 ;
public static final double ARM_DOWN_POWER = -0.45 ;
// Define a constructor that allows the OpMode to pass a reference to itself.
public RobotHardware (LinearOpMode opmode) {
myOpMode = opmode;
}
/**
* Initialize all the robot's hardware.
* This method must be called ONCE when the OpMode is initialized.
* <p>
* All of the hardware devices are accessed via the hardware map, and initialized.
*/
public void init() {
// Define and Initialize Motors (note: need to use reference to actual OpMode).
leftDrive = myOpMode.hardwareMap.get(DcMotor.class, "left_drive");
rightDrive = myOpMode.hardwareMap.get(DcMotor.class, "right_drive");
armMotor = myOpMode.hardwareMap.get(DcMotor.class, "arm");
// To drive forward, most robots need the motor on one side to be reversed, because the axles point in opposite directions.
// Pushing the left stick forward MUST make robot go forward. So adjust these two lines based on your first test drive.
// Note: The settings here assume direct drive on left and right wheels. Gear Reduction or 90 Deg drives may require direction flips
leftDrive.setDirection(DcMotor.Direction.REVERSE);
rightDrive.setDirection(DcMotor.Direction.FORWARD);
// If there are encoders connected, switch to RUN_USING_ENCODER mode for greater accuracy
// leftDrive.setMode(DcMotor.RunMode.RUN_USING_ENCODER);
// rightDrive.setMode(DcMotor.RunMode.RUN_USING_ENCODER);
// Define and initialize ALL installed servos.
leftHand = myOpMode.hardwareMap.get(Servo.class, "left_hand");
rightHand = myOpMode.hardwareMap.get(Servo.class, "right_hand");
leftHand.setPosition(MID_SERVO);
rightHand.setPosition(MID_SERVO);
myOpMode.telemetry.addData(">", "Hardware Initialized");
myOpMode.telemetry.update();
}
/**
* Calculates the left/right motor powers required to achieve the requested
* robot motions: Drive (Axial motion) and Turn (Yaw motion).
* Then sends these power levels to the motors.
*
* @param Drive Fwd/Rev driving power (-1.0 to 1.0) +ve is forward
* @param Turn Right/Left turning power (-1.0 to 1.0) +ve is CW
*/
public void driveRobot(double Drive, double Turn) {
// Combine drive and turn for blended motion.
double left = Drive + Turn;
double right = Drive - Turn;
// Scale the values so neither exceed +/- 1.0
double max = Math.max(Math.abs(left), Math.abs(right));
if (max > 1.0)
{
left /= max;
right /= max;
}
// Use existing function to drive both wheels.
setDrivePower(left, right);
}
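/*
 * Worked example (hypothetical inputs): Drive = 0.8, Turn = 0.5 gives
 * left = 1.3 and right = 0.3. Since max = 1.3 exceeds 1.0, both values are
 * divided by 1.3, yielding left = 1.0 and right of about 0.23, which keeps
 * the requested turn ratio while staying within motor power limits.
 */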
/**
* Pass the requested wheel motor powers to the appropriate hardware drive motors.
*
* @param leftWheel Fwd/Rev driving power (-1.0 to 1.0) +ve is forward
* @param rightWheel Fwd/Rev driving power (-1.0 to 1.0) +ve is forward
*/
public void setDrivePower(double leftWheel, double rightWheel) {
// Output the values to the motor drives.
leftDrive.setPower(leftWheel);
rightDrive.setPower(rightWheel);
}
/**
* Pass the requested arm power to the appropriate hardware drive motor
*
* @param power driving power (-1.0 to 1.0)
*/
public void setArmPower(double power) {
armMotor.setPower(power);
}
/**
* Send the two hand-servos to opposing (mirrored) positions, based on the passed offset.
*
* @param offset servo offset from MID_SERVO; clipped to the range -0.5 to 0.5
*/
public void setHandPositions(double offset) {
offset = Range.clip(offset, -0.5, 0.5);
leftHand.setPosition(MID_SERVO + offset);
rightHand.setPosition(MID_SERVO - offset);
}
}

/* Copyright (c) 2017 FIRST. All rights reserved.
*
* Redistribution and use in source and binary forms, with or without modification,
* are permitted (subject to the limitations in the disclaimer below) provided that
* the following conditions are met:
*
* Redistributions of source code must retain the above copyright notice, this list
* of conditions and the following disclaimer.
*
* Redistributions in binary form must reproduce the above copyright notice, this
* list of conditions and the following disclaimer in the documentation and/or
* other materials provided with the distribution.
*
* Neither the name of FIRST nor the names of its contributors may be used to endorse or
* promote products derived from this software without specific prior written permission.
*
* NO EXPRESS OR IMPLIED LICENSES TO ANY PARTY'S PATENT RIGHTS ARE GRANTED BY THIS
* LICENSE. THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
* "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO,
* THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE
* ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR CONTRIBUTORS BE LIABLE
* FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL
* DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR
* SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER
* CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY,
* OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
* OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
*/
package org.firstinspires.ftc.robotcontroller.external.samples;
import com.qualcomm.robotcore.eventloop.opmode.Disabled;
import com.qualcomm.robotcore.eventloop.opmode.LinearOpMode;
import com.qualcomm.robotcore.eventloop.opmode.TeleOp;
import com.qualcomm.robotcore.hardware.DcMotor;
import com.qualcomm.robotcore.hardware.Servo;
import com.qualcomm.robotcore.util.Range;
/*
* This OpMode executes a POV Game style TeleOp for a direct drive robot.
* The code is structured as a LinearOpMode
*
* In this mode the left stick moves the robot FWD and back, the Right stick turns left and right.
* It raises and lowers the arm using the Gamepad Y and A buttons respectively.
* It also opens and closes the claws slowly using the left and right Bumper buttons.
*
* Use Android Studio to Copy this Class, and Paste it into your team's code folder with a new name.
* Remove or comment out the @Disabled line to add this OpMode to the Driver Station OpMode list
*/
@TeleOp(name="Robot: Teleop POV", group="Robot")
@Disabled
public class RobotTeleopPOV_Linear extends LinearOpMode {
/* Declare OpMode members. */
public DcMotor leftDrive = null;
public DcMotor rightDrive = null;
public DcMotor leftArm = null;
public Servo leftClaw = null;
public Servo rightClaw = null;
double clawOffset = 0;
public static final double MID_SERVO = 0.5 ;
public static final double CLAW_SPEED = 0.02 ; // sets rate to move servo
public static final double ARM_UP_POWER = 0.45 ;
public static final double ARM_DOWN_POWER = -0.45 ;
@Override
public void runOpMode() {
double left;
double right;
double drive;
double turn;
double max;
// Define and Initialize Motors
leftDrive = hardwareMap.get(DcMotor.class, "left_drive");
rightDrive = hardwareMap.get(DcMotor.class, "right_drive");
leftArm = hardwareMap.get(DcMotor.class, "left_arm");
// To drive forward, most robots need the motor on one side to be reversed, because the axles point in opposite directions.
// Pushing the left stick forward MUST make robot go forward. So adjust these two lines based on your first test drive.
// Note: The settings here assume direct drive on left and right wheels. Gear Reduction or 90 Deg drives may require direction flips
leftDrive.setDirection(DcMotor.Direction.REVERSE);
rightDrive.setDirection(DcMotor.Direction.FORWARD);
// If there are encoders connected, switch to RUN_USING_ENCODER mode for greater accuracy
// leftDrive.setMode(DcMotor.RunMode.RUN_USING_ENCODER);
// rightDrive.setMode(DcMotor.RunMode.RUN_USING_ENCODER);
// Define and initialize ALL installed servos.
leftClaw = hardwareMap.get(Servo.class, "left_hand");
rightClaw = hardwareMap.get(Servo.class, "right_hand");
leftClaw.setPosition(MID_SERVO);
rightClaw.setPosition(MID_SERVO);
// Send telemetry message to signify robot waiting;
telemetry.addData(">", "Robot Ready. Press Play.");
telemetry.update();
// Wait for the game to start (driver presses PLAY)
waitForStart();
// run until the end of the match (driver presses STOP)
while (opModeIsActive()) {
// Run wheels in POV mode (note: The joystick goes negative when pushed forward, so negate it)
// In this mode the Left stick moves the robot fwd and back, the Right stick turns left and right.
// This way it's also easy to just drive straight, or just turn.
drive = -gamepad1.left_stick_y;
turn = gamepad1.right_stick_x;
// Combine drive and turn for blended motion.
left = drive + turn;
right = drive - turn;
// Normalize the values so neither exceed +/- 1.0
max = Math.max(Math.abs(left), Math.abs(right));
if (max > 1.0)
{
left /= max;
right /= max;
}
// Output the safe values to the motor drives.
leftDrive.setPower(left);
rightDrive.setPower(right);
// Use gamepad left & right Bumpers to open and close the claw
if (gamepad1.right_bumper)
clawOffset += CLAW_SPEED;
else if (gamepad1.left_bumper)
clawOffset -= CLAW_SPEED;
// Move both servos to new position. Assume servos are mirror image of each other.
clawOffset = Range.clip(clawOffset, -0.5, 0.5);
leftClaw.setPosition(MID_SERVO + clawOffset);
rightClaw.setPosition(MID_SERVO - clawOffset);
// Use gamepad buttons to move arm up (Y) and down (A)
if (gamepad1.y)
leftArm.setPower(ARM_UP_POWER);
else if (gamepad1.a)
leftArm.setPower(ARM_DOWN_POWER);
else
leftArm.setPower(0.0);
// Send telemetry message to signify robot running;
telemetry.addData("claw", "Offset = %.2f", clawOffset);
telemetry.addData("left", "%.2f", left);
telemetry.addData("right", "%.2f", right);
telemetry.update();
// Pace this loop so the claw action moves at a reasonable speed.
sleep(50);
}
}
}

/* Copyright (c) 2017 FIRST. All rights reserved.
*
* Redistribution and use in source and binary forms, with or without modification,
* are permitted (subject to the limitations in the disclaimer below) provided that
* the following conditions are met:
*
* Redistributions of source code must retain the above copyright notice, this list
* of conditions and the following disclaimer.
*
* Redistributions in binary form must reproduce the above copyright notice, this
* list of conditions and the following disclaimer in the documentation and/or
* other materials provided with the distribution.
*
* Neither the name of FIRST nor the names of its contributors may be used to endorse or
* promote products derived from this software without specific prior written permission.
*
* NO EXPRESS OR IMPLIED LICENSES TO ANY PARTY'S PATENT RIGHTS ARE GRANTED BY THIS
* LICENSE. THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
* "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO,
* THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE
* ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR CONTRIBUTORS BE LIABLE
* FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL
* DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR
* SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER
* CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY,
* OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
* OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
*/
package org.firstinspires.ftc.robotcontroller.external.samples;
import com.qualcomm.robotcore.eventloop.opmode.Disabled;
import com.qualcomm.robotcore.eventloop.opmode.OpMode;
import com.qualcomm.robotcore.eventloop.opmode.TeleOp;
import com.qualcomm.robotcore.hardware.DcMotor;
import com.qualcomm.robotcore.hardware.Servo;
import com.qualcomm.robotcore.util.Range;
/*
* This OpMode executes a Tank Drive control TeleOp for a direct drive robot.
* The code is structured as an Iterative OpMode
*
* In this mode, the left and right joysticks control the left and right motors respectively.
* Pushing a joystick forward will make the attached motor drive forward.
* It raises and lowers the claw using the Gamepad Y and A buttons respectively.
* It also opens and closes the claws slowly using the left and right Bumper buttons.
*
* Use Android Studio to Copy this Class, and Paste it into your team's code folder with a new name.
* Remove or comment out the @Disabled line to add this OpMode to the Driver Station OpMode list
*/
@TeleOp(name="Robot: Teleop Tank", group="Robot")
@Disabled
public class RobotTeleopTank_Iterative extends OpMode{
/* Declare OpMode members. */
public DcMotor leftDrive = null;
public DcMotor rightDrive = null;
public DcMotor leftArm = null;
public Servo leftClaw = null;
public Servo rightClaw = null;
double clawOffset = 0;
public static final double MID_SERVO = 0.5 ;
public static final double CLAW_SPEED = 0.02 ; // sets rate to move servo
public static final double ARM_UP_POWER = 0.50 ; // Run arm motor up at 50% power
public static final double ARM_DOWN_POWER = -0.25 ; // Run arm motor down at -25% power
/*
* Code to run ONCE when the driver hits INIT
*/
@Override
public void init() {
// Define and Initialize Motors
leftDrive = hardwareMap.get(DcMotor.class, "left_drive");
rightDrive = hardwareMap.get(DcMotor.class, "right_drive");
leftArm = hardwareMap.get(DcMotor.class, "left_arm");
// To drive forward, most robots need the motor on one side to be reversed, because the axles point in opposite directions.
// Pushing the left and right sticks forward MUST make robot go forward. So adjust these two lines based on your first test drive.
// Note: The settings here assume direct drive on left and right wheels. Gear Reduction or 90 Deg drives may require direction flips
leftDrive.setDirection(DcMotor.Direction.REVERSE);
rightDrive.setDirection(DcMotor.Direction.FORWARD);
// If there are encoders connected, switch to RUN_USING_ENCODER mode for greater accuracy
// leftDrive.setMode(DcMotor.RunMode.RUN_USING_ENCODER);
// rightDrive.setMode(DcMotor.RunMode.RUN_USING_ENCODER);
// Define and initialize ALL installed servos.
leftClaw = hardwareMap.get(Servo.class, "left_hand");
rightClaw = hardwareMap.get(Servo.class, "right_hand");
leftClaw.setPosition(MID_SERVO);
rightClaw.setPosition(MID_SERVO);
// Send telemetry message to signify robot waiting;
telemetry.addData(">", "Robot Ready. Press Play.");
}
/*
* Code to run REPEATEDLY after the driver hits INIT, but before they hit PLAY
*/
@Override
public void init_loop() {
}
/*
* Code to run ONCE when the driver hits PLAY
*/
@Override
public void start() {
}
/*
* Code to run REPEATEDLY after the driver hits PLAY but before they hit STOP
*/
@Override
public void loop() {
double left;
double right;
// Run wheels in tank mode (note: The joystick goes negative when pushed forward, so negate it)
left = -gamepad1.left_stick_y;
right = -gamepad1.right_stick_y;
leftDrive.setPower(left);
rightDrive.setPower(right);
// Use gamepad left & right Bumpers to open and close the claw
if (gamepad1.right_bumper)
clawOffset += CLAW_SPEED;
else if (gamepad1.left_bumper)
clawOffset -= CLAW_SPEED;
// Move both servos to new position. Assume servos are mirror image of each other.
clawOffset = Range.clip(clawOffset, -0.5, 0.5);
leftClaw.setPosition(MID_SERVO + clawOffset);
rightClaw.setPosition(MID_SERVO - clawOffset);
// Use gamepad buttons to move the arm up (Y) and down (A)
if (gamepad1.y)
leftArm.setPower(ARM_UP_POWER);
else if (gamepad1.a)
leftArm.setPower(ARM_DOWN_POWER);
else
leftArm.setPower(0.0);
// Send telemetry message to signify robot running;
telemetry.addData("claw", "Offset = %.2f", clawOffset);
telemetry.addData("left", "%.2f", left);
telemetry.addData("right", "%.2f", right);
}
/*
* Code to run ONCE after the driver hits STOP
*/
@Override
public void stop() {
}
}

/*
* Copyright (c) 2018 Craig MacFarlane
*
* All rights reserved.
*
* Redistribution and use in source and binary forms, with or without modification, are permitted
* (subject to the limitations in the disclaimer below) provided that the following conditions are
* met:
*
* Redistributions of source code must retain the above copyright notice, this list of conditions
* and the following disclaimer.
*
* Redistributions in binary form must reproduce the above copyright notice, this list of conditions
* and the following disclaimer in the documentation and/or other materials provided with the
* distribution.
*
* Neither the name of Craig MacFarlane nor the names of its contributors may be used to
* endorse or promote products derived from this software without specific prior written permission.
*
* NO EXPRESS OR IMPLIED LICENSES TO ANY PARTY'S PATENT RIGHTS ARE GRANTED BY THIS LICENSE. THIS
* SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS" AND ANY EXPRESS OR IMPLIED
* WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS
* FOR A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR CONTRIBUTORS BE
* LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES
* (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA,
* OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN
* CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF
* THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
*/
package org.firstinspires.ftc.robotcontroller.external.samples;
import com.qualcomm.hardware.rev.RevBlinkinLedDriver;
import com.qualcomm.robotcore.eventloop.opmode.Disabled;
import com.qualcomm.robotcore.eventloop.opmode.OpMode;
import com.qualcomm.robotcore.eventloop.opmode.TeleOp;
import org.firstinspires.ftc.robotcore.external.Telemetry;
import org.firstinspires.ftc.robotcore.internal.system.Deadline;
import java.util.concurrent.TimeUnit;
/*
* This OpMode demonstrates use of the REV Robotics Blinkin LED Driver.
* AUTO mode cycles through all of the patterns.
* MANUAL mode allows the user to manually change patterns using the
* left and right bumpers of a gamepad.
*
* Configure the driver on a servo port, and name it "blinkin".
*
* Displays the first pattern upon init.
*/
@TeleOp(name="BlinkinExample")
@Disabled
public class SampleRevBlinkinLedDriver extends OpMode {
/*
* Change the pattern every 10 seconds in AUTO mode.
*/
private final static int LED_PERIOD = 10;
/*
* Rate limit gamepad button presses to every 500ms.
*/
private final static int GAMEPAD_LOCKOUT = 500;
RevBlinkinLedDriver blinkinLedDriver;
RevBlinkinLedDriver.BlinkinPattern pattern;
Telemetry.Item patternName;
Telemetry.Item display;
DisplayKind displayKind;
Deadline ledCycleDeadline;
Deadline gamepadRateLimit;
protected enum DisplayKind {
MANUAL,
AUTO
}
@Override
public void init()
{
displayKind = DisplayKind.AUTO;
blinkinLedDriver = hardwareMap.get(RevBlinkinLedDriver.class, "blinkin");
pattern = RevBlinkinLedDriver.BlinkinPattern.RAINBOW_RAINBOW_PALETTE;
blinkinLedDriver.setPattern(pattern);
display = telemetry.addData("Display Kind: ", displayKind.toString());
patternName = telemetry.addData("Pattern: ", pattern.toString());
ledCycleDeadline = new Deadline(LED_PERIOD, TimeUnit.SECONDS);
gamepadRateLimit = new Deadline(GAMEPAD_LOCKOUT, TimeUnit.MILLISECONDS);
}
@Override
public void loop()
{
handleGamepad();
if (displayKind == DisplayKind.AUTO) {
doAutoDisplay();
} else {
/*
* MANUAL mode: Nothing to do, setting the pattern as a result of a gamepad event.
*/
}
}
/*
* handleGamepad
*
* Responds to a gamepad button press. Demonstrates rate limiting for
* button presses. If loop() is called every 10ms and you don't rate
* limit, then any given button press may register as multiple button presses,
* which in this application is problematic.
*
* A: Manual mode, Right bumper displays the next pattern, left bumper displays the previous pattern.
* B: Auto mode, pattern cycles, changing every LED_PERIOD seconds.
*/
protected void handleGamepad()
{
if (!gamepadRateLimit.hasExpired()) {
return;
}
if (gamepad1.a) {
setDisplayKind(DisplayKind.MANUAL);
gamepadRateLimit.reset();
} else if (gamepad1.b) {
setDisplayKind(DisplayKind.AUTO);
gamepadRateLimit.reset();
} else if ((displayKind == DisplayKind.MANUAL) && (gamepad1.left_bumper)) {
pattern = pattern.previous();
displayPattern();
gamepadRateLimit.reset();
} else if ((displayKind == DisplayKind.MANUAL) && (gamepad1.right_bumper)) {
pattern = pattern.next();
displayPattern();
gamepadRateLimit.reset();
}
}
protected void setDisplayKind(DisplayKind displayKind)
{
this.displayKind = displayKind;
display.setValue(displayKind.toString());
}
protected void doAutoDisplay()
{
if (ledCycleDeadline.hasExpired()) {
pattern = pattern.next();
displayPattern();
ledCycleDeadline.reset();
}
}
protected void displayPattern()
{
blinkinLedDriver.setPattern(pattern);
patternName.setValue(pattern.toString());
}
}

/* Copyright (c) 2017 FIRST. All rights reserved.
*
* Redistribution and use in source and binary forms, with or without modification,
* are permitted (subject to the limitations in the disclaimer below) provided that
* the following conditions are met:
*
* Redistributions of source code must retain the above copyright notice, this list
* of conditions and the following disclaimer.
*
* Redistributions in binary form must reproduce the above copyright notice, this
* list of conditions and the following disclaimer in the documentation and/or
* other materials provided with the distribution.
*
* Neither the name of FIRST nor the names of its contributors may be used to endorse or
* promote products derived from this software without specific prior written permission.
*
* NO EXPRESS OR IMPLIED LICENSES TO ANY PARTY'S PATENT RIGHTS ARE GRANTED BY THIS
* LICENSE. THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
* "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO,
* THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE
* ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR CONTRIBUTORS BE LIABLE
* FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL
* DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR
* SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER
* CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY,
* OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
* OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
*/
package org.firstinspires.ftc.robotcontroller.external.samples;
import com.qualcomm.hardware.bosch.BNO055IMU;
import com.qualcomm.hardware.bosch.JustLoggingAccelerationIntegrator;
import com.qualcomm.robotcore.eventloop.opmode.Disabled;
import com.qualcomm.robotcore.eventloop.opmode.LinearOpMode;
import com.qualcomm.robotcore.eventloop.opmode.TeleOp;
import org.firstinspires.ftc.robotcore.external.Func;
import org.firstinspires.ftc.robotcore.external.navigation.Acceleration;
import org.firstinspires.ftc.robotcore.external.navigation.AngleUnit;
import org.firstinspires.ftc.robotcore.external.navigation.AxesOrder;
import org.firstinspires.ftc.robotcore.external.navigation.AxesReference;
import org.firstinspires.ftc.robotcore.external.navigation.Orientation;
import org.firstinspires.ftc.robotcore.external.navigation.Position;
import org.firstinspires.ftc.robotcore.external.navigation.Velocity;
import java.util.Locale;
/*
 * This OpMode gives a short demo of how to use the BNO055 Inertial Motion Unit (IMU) from Adafruit.
*
* Note: this is a Legacy example that will not work with newer Control/Expansion Hubs that use a different IMU
* Please use the new SensorIMUOrthogonal or SensorIMUNonOrthogonal samples for a more universal IMU interface.
*
* Use Android Studio to Copy this Class, and Paste it into your team's code folder with a new name.
* Remove or comment out the @Disabled line to add this OpMode to the Driver Station OpMode list
*
* @see <a href="http://www.adafruit.com/products/2472">Adafruit IMU</a>
*/
@TeleOp(name = "Sensor: BNO055 IMU", group = "Sensor")
@Disabled // Comment this out to add to the OpMode list
public class SensorBNO055IMU extends LinearOpMode
{
//----------------------------------------------------------------------------------------------
// State
//----------------------------------------------------------------------------------------------
// The IMU sensor object
BNO055IMU imu;
// State used for updating telemetry
Orientation angles;
Acceleration gravity;
//----------------------------------------------------------------------------------------------
// Main logic
//----------------------------------------------------------------------------------------------
@Override public void runOpMode() {
// Set up the parameters with which we will use our IMU. Note that integration
// algorithm here just reports accelerations to the logcat log; it doesn't actually
// provide positional information.
BNO055IMU.Parameters parameters = new BNO055IMU.Parameters();
parameters.angleUnit = BNO055IMU.AngleUnit.DEGREES;
parameters.accelUnit = BNO055IMU.AccelUnit.METERS_PERSEC_PERSEC;
parameters.calibrationDataFile = "BNO055IMUCalibration.json"; // see the calibration sample OpMode
parameters.loggingEnabled = true;
parameters.loggingTag = "IMU";
parameters.accelerationIntegrationAlgorithm = new JustLoggingAccelerationIntegrator();
// Retrieve and initialize the IMU. We expect the IMU to be attached to an I2C port
// on a Core Device Interface Module, configured to be a sensor of type "AdaFruit IMU",
// and named "imu".
imu = hardwareMap.get(BNO055IMU.class, "imu");
imu.initialize(parameters);
// Set up our telemetry dashboard
composeTelemetry();
// Wait until we're told to go
waitForStart();
// Start the logging of measured acceleration
imu.startAccelerationIntegration(new Position(), new Velocity(), 1000);
// Loop and update the dashboard
while (opModeIsActive()) {
telemetry.update();
}
}
//----------------------------------------------------------------------------------------------
// Telemetry Configuration
//----------------------------------------------------------------------------------------------
void composeTelemetry() {
// At the beginning of each telemetry update, grab a bunch of data
// from the IMU that we will then display in separate lines.
telemetry.addAction(new Runnable() { @Override public void run()
{
// Acquiring the angles is relatively expensive; we don't want
// to do that in each of the three items that need that info, as that's
// three times the necessary expense.
angles = imu.getAngularOrientation(AxesReference.INTRINSIC, AxesOrder.ZYX, AngleUnit.DEGREES);
gravity = imu.getGravity();
}
});
telemetry.addLine()
.addData("status", new Func<String>() {
@Override public String value() {
return imu.getSystemStatus().toShortString();
}
})
.addData("calib", new Func<String>() {
@Override public String value() {
return imu.getCalibrationStatus().toString();
}
});
telemetry.addLine()
.addData("heading", new Func<String>() {
@Override public String value() {
return formatAngle(angles.angleUnit, angles.firstAngle);
}
})
.addData("roll", new Func<String>() {
@Override public String value() {
return formatAngle(angles.angleUnit, angles.secondAngle);
}
})
.addData("pitch", new Func<String>() {
@Override public String value() {
return formatAngle(angles.angleUnit, angles.thirdAngle);
}
});
telemetry.addLine()
.addData("grvty", new Func<String>() {
@Override public String value() {
return gravity.toString();
}
})
.addData("mag", new Func<String>() {
@Override public String value() {
return String.format(Locale.getDefault(), "%.3f",
Math.sqrt(gravity.xAccel*gravity.xAccel
+ gravity.yAccel*gravity.yAccel
+ gravity.zAccel*gravity.zAccel));
}
});
}
//----------------------------------------------------------------------------------------------
// Formatting
//----------------------------------------------------------------------------------------------
String formatAngle(AngleUnit angleUnit, double angle) {
return formatDegrees(AngleUnit.DEGREES.fromUnit(angleUnit, angle));
}
String formatDegrees(double degrees){
return String.format(Locale.getDefault(), "%.1f", AngleUnit.DEGREES.normalize(degrees));
}
}
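The "mag" telemetry item above reports the magnitude of the gravity vector, i.e. the Euclidean norm of its three acceleration components. In isolation (class name is illustrative):

```java
public class GravityMath {
    // Euclidean norm of a 3-component acceleration vector, the same
    // arithmetic used for the "mag" telemetry item in the sample above.
    public static double magnitude(double x, double y, double z) {
        return Math.sqrt(x * x + y * y + z * z);
    }
}
```

When the sensor is at rest and well calibrated, this magnitude should sit near 9.81 m/s² regardless of orientation, which makes it a quick sanity check on the accelerometer.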

@@ -0,0 +1,230 @@
/* Copyright (c) 2017 FIRST. All rights reserved.
*
* Redistribution and use in source and binary forms, with or without modification,
* are permitted (subject to the limitations in the disclaimer below) provided that
* the following conditions are met:
*
* Redistributions of source code must retain the above copyright notice, this list
* of conditions and the following disclaimer.
*
* Redistributions in binary form must reproduce the above copyright notice, this
* list of conditions and the following disclaimer in the documentation and/or
* other materials provided with the distribution.
*
* Neither the name of FIRST nor the names of its contributors may be used to endorse or
* promote products derived from this software without specific prior written permission.
*
* NO EXPRESS OR IMPLIED LICENSES TO ANY PARTY'S PATENT RIGHTS ARE GRANTED BY THIS
* LICENSE. THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
* "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO,
* THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE
* ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR CONTRIBUTORS BE LIABLE
* FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL
* DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR
* SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER
* CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY,
* OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
* OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
*/
package org.firstinspires.ftc.robotcontroller.external.samples;
import com.qualcomm.hardware.bosch.BNO055IMU;
import com.qualcomm.robotcore.eventloop.opmode.Disabled;
import com.qualcomm.robotcore.eventloop.opmode.LinearOpMode;
import com.qualcomm.robotcore.eventloop.opmode.TeleOp;
import com.qualcomm.robotcore.util.ReadWriteFile;
import org.firstinspires.ftc.robotcore.external.Func;
import org.firstinspires.ftc.robotcore.external.navigation.AngleUnit;
import org.firstinspires.ftc.robotcore.external.navigation.AxesOrder;
import org.firstinspires.ftc.robotcore.external.navigation.AxesReference;
import org.firstinspires.ftc.robotcore.external.navigation.Orientation;
import org.firstinspires.ftc.robotcore.internal.system.AppUtil;
import java.io.File;
import java.util.Locale;
/*
* This OpMode calibrates a BNO055 IMU per
* "Section 3.11 Calibration" of the BNO055 specification.
*
* Note: this is a Legacy example that will not work with newer Control/Expansion Hubs that use a different IMU
* Please use the new SensorIMUOrthogonal or SensorIMUNonOrthogonal samples for a more universal IMU interface.
*
* Manual calibration of the IMU is definitely NOT necessary: except for the magnetometer (which is not used by the
* default "IMU" SensorMode), the BNO055 is internally self-calibrating and thus can be very successfully used without
* manual intervention. That said, performing a one-time calibration, saving the results persistently, then loading them
* again at each run can help reduce the time that automatic calibration requires.
*
* This summary of the calibration process from Intel is informative:
* http://iotdk.intel.com/docs/master/upm/classupm_1_1_b_n_o055.html
*
* "This device requires calibration in order to operate accurately. [...] Calibration data is
* lost on a power cycle. See one of the examples for a description of how to calibrate the device,
* but in essence:
*
* There is a calibration status register available [...] that returns the calibration status
* of the accelerometer (ACC), magnetometer (MAG), gyroscope (GYR), and overall system (SYS).
* Each of these values range from 0 (uncalibrated) to 3 (fully calibrated). Calibration [ideally]
* involves certain motions to get all 4 values at 3. The motions are as follows (though see the
* datasheet for more information):
*
 * 1. GYR: Simply let the sensor sit flat for a few seconds.
 * 2. ACC: Move the sensor in various positions. Start flat, then rotate slowly by 45
 *      degrees, hold for a few seconds, then continue rotating another 45 degrees and
 *      hold, etc. 6 or more movements of this type may be required. You can move through
 *      any axis you desire, but make sure that the device is lying at least once
 *      perpendicular to the x, y, and z axes.
 * 3. MAG: Move slowly in a figure 8 pattern in the air, until the calibration value reaches 3.
 * 4. SYS: This will usually reach 3 when the other items have also reached 3. If not, continue
 *      slowly moving the device through various axes until it does."
*
* To calibrate the IMU, run this sample OpMode with a gamepad attached to the driver station.
* Once the IMU has reached sufficient calibration as reported on telemetry, press the 'A'
* button on the gamepad to write the calibration to a file. That file can then be indicated
* later when running an OpMode which uses the IMU.
*
* Note: if your intended uses of the IMU do not include use of all its sensors (for example,
* you might not use the magnetometer), then it makes little sense for you to wait for full
* calibration of the sensors you are not using before saving the calibration data. Indeed,
* it appears that in a SensorMode that doesn't use the magnetometer (for example), the
* magnetometer cannot actually be calibrated.
*
* References:
* The AdafruitBNO055IMU Javadoc
* The BNO055IMU.Parameters.calibrationDataFile Javadoc
* The BNO055 product page: https://www.bosch-sensortec.com/bst/products/all_products/bno055
* The BNO055 datasheet: https://www.bosch-sensortec.com/media/boschsensortec/downloads/datasheets/bst-bno055-ds000.pdf
*/
@TeleOp(name = "Sensor: BNO055 IMU Calibration", group = "Sensor")
@Disabled // Comment this out to add to the OpMode list
public class SensorBNO055IMUCalibration extends LinearOpMode
{
//----------------------------------------------------------------------------------------------
// State
//----------------------------------------------------------------------------------------------
// Our sensors, motors, and other devices go here, along with other long term state
BNO055IMU imu;
// State used for updating telemetry
Orientation angles;
//----------------------------------------------------------------------------------------------
// Main logic
//----------------------------------------------------------------------------------------------
@Override public void runOpMode() {
telemetry.log().setCapacity(12);
telemetry.log().add("");
telemetry.log().add("Please refer to the calibration instructions");
telemetry.log().add("contained in the Adafruit IMU calibration");
telemetry.log().add("sample OpMode.");
telemetry.log().add("");
telemetry.log().add("When sufficient calibration has been reached,");
telemetry.log().add("press the 'A' button to write the current");
telemetry.log().add("calibration data to a file.");
telemetry.log().add("");
// We are expecting the IMU to be attached to an I2C port on a Core Device Interface Module and named "imu".
BNO055IMU.Parameters parameters = new BNO055IMU.Parameters();
parameters.loggingEnabled = true;
parameters.loggingTag = "IMU";
imu = hardwareMap.get(BNO055IMU.class, "imu");
imu.initialize(parameters);
composeTelemetry();
telemetry.log().add("Waiting for start...");
// Wait until we're told to go
while (!isStarted()) {
telemetry.update();
idle();
}
telemetry.log().add("...started...");
while (opModeIsActive()) {
if (gamepad1.a) {
// Get the calibration data
BNO055IMU.CalibrationData calibrationData = imu.readCalibrationData();
// Save the calibration data to a file. You can choose whatever file
// name you wish here, but you'll want to indicate the same file name
// when you initialize the IMU in an OpMode in which it is used. If you
// have more than one IMU on your robot, you'll of course want to use
// different configuration file names for each.
String filename = "AdafruitIMUCalibration.json";
File file = AppUtil.getInstance().getSettingsFile(filename);
ReadWriteFile.writeFile(file, calibrationData.serialize());
telemetry.log().add("saved to '%s'", filename);
// Wait for the button to be released
while (gamepad1.a) {
telemetry.update();
idle();
}
}
telemetry.update();
}
}
void composeTelemetry() {
// At the beginning of each telemetry update, grab a bunch of data
// from the IMU that we will then display in separate lines.
telemetry.addAction(new Runnable() { @Override public void run()
{
// Acquiring the angles is relatively expensive; we don't want
// to do that in each of the three items that need that info, as that's
// three times the necessary expense.
angles = imu.getAngularOrientation(AxesReference.INTRINSIC, AxesOrder.ZYX, AngleUnit.DEGREES);
}
});
telemetry.addLine()
.addData("status", new Func<String>() {
@Override public String value() {
return imu.getSystemStatus().toShortString();
}
})
.addData("calib", new Func<String>() {
@Override public String value() {
return imu.getCalibrationStatus().toString();
}
});
telemetry.addLine()
.addData("heading", new Func<String>() {
@Override public String value() {
return formatAngle(angles.angleUnit, angles.firstAngle);
}
})
.addData("roll", new Func<String>() {
@Override public String value() {
return formatAngle(angles.angleUnit, angles.secondAngle);
}
})
.addData("pitch", new Func<String>() {
@Override public String value() {
return formatAngle(angles.angleUnit, angles.thirdAngle);
}
});
}
//----------------------------------------------------------------------------------------------
// Formatting
//----------------------------------------------------------------------------------------------
String formatAngle(AngleUnit angleUnit, double angle) {
return formatDegrees(AngleUnit.DEGREES.fromUnit(angleUnit, angle));
}
String formatDegrees(double degrees){
return String.format(Locale.getDefault(), "%.1f", AngleUnit.DEGREES.normalize(degrees));
}
}
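formatDegrees() above relies on AngleUnit.DEGREES.normalize() to fold any angle into the [-180, 180] range before display. The underlying arithmetic can be sketched as follows (hypothetical class; the SDK's normalize() may treat the ±180 boundary slightly differently):

```java
public class AngleMath {
    // Fold an angle in degrees into the range (-180, 180].
    public static double normalizeDegrees(double degrees) {
        double d = degrees % 360.0;       // Java's % keeps the sign of the dividend
        if (d > 180.0)   d -= 360.0;
        if (d <= -180.0) d += 360.0;
        return d;
    }
}
```

Normalizing matters because intrinsic ZYX angles read from the IMU can wrap; comparing or displaying raw values like 359° and -1° as-is would suggest a large difference where there is almost none.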

@@ -0,0 +1,221 @@
/* Copyright (c) 2017-2020 FIRST. All rights reserved.
*
* Redistribution and use in source and binary forms, with or without modification,
* are permitted (subject to the limitations in the disclaimer below) provided that
* the following conditions are met:
*
* Redistributions of source code must retain the above copyright notice, this list
* of conditions and the following disclaimer.
*
* Redistributions in binary form must reproduce the above copyright notice, this
* list of conditions and the following disclaimer in the documentation and/or
* other materials provided with the distribution.
*
* Neither the name of FIRST nor the names of its contributors may be used to endorse or
* promote products derived from this software without specific prior written permission.
*
* NO EXPRESS OR IMPLIED LICENSES TO ANY PARTY'S PATENT RIGHTS ARE GRANTED BY THIS
* LICENSE. THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
* "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO,
* THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE
* ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR CONTRIBUTORS BE LIABLE
* FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL
* DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR
* SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER
* CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY,
* OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
* OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
*/
package org.firstinspires.ftc.robotcontroller.external.samples;
import android.app.Activity;
import android.graphics.Color;
import android.view.View;
import com.qualcomm.robotcore.eventloop.opmode.Disabled;
import com.qualcomm.robotcore.eventloop.opmode.LinearOpMode;
import com.qualcomm.robotcore.eventloop.opmode.TeleOp;
import com.qualcomm.robotcore.hardware.DistanceSensor;
import com.qualcomm.robotcore.hardware.NormalizedColorSensor;
import com.qualcomm.robotcore.hardware.NormalizedRGBA;
import com.qualcomm.robotcore.hardware.SwitchableLight;
import org.firstinspires.ftc.robotcore.external.navigation.DistanceUnit;
/*
* This OpMode shows how to use a color sensor in a generic
* way, regardless of which particular make or model of color sensor is used. The OpMode
* assumes that the color sensor is configured with a name of "sensor_color".
*
* There will be some variation in the values measured depending on the specific sensor you are using.
*
* You can increase the gain (a multiplier to make the sensor report higher values) by holding down
* the A button on the gamepad, and decrease the gain by holding down the B button on the gamepad.
*
* If the color sensor has a light which is controllable from software, you can use the X button on
* the gamepad to toggle the light on and off. The REV sensors don't support this, but instead have
* a physical switch on them to turn the light on and off, beginning with REV Color Sensor V2.
*
* If the color sensor also supports short-range distance measurements (usually via an infrared
* proximity sensor), the reported distance will be written to telemetry. As of September 2020,
* the only color sensors that support this are the ones from REV Robotics. These infrared proximity
* sensor measurements are only useful at very small distances, and are sensitive to ambient light
* and surface reflectivity. You should use a different sensor if you need precise distance measurements.
*
* Use Android Studio to Copy this Class, and Paste it into your team's code folder with a new name.
* Remove or comment out the @Disabled line to add this OpMode to the Driver Station OpMode list
*/
@TeleOp(name = "Sensor: Color", group = "Sensor")
@Disabled
public class SensorColor extends LinearOpMode {
/** The colorSensor field will contain a reference to our color sensor hardware object */
NormalizedColorSensor colorSensor;
/** The relativeLayout field is used to aid in providing interesting visual feedback
* in this sample application; you probably *don't* need this when you use a color sensor on your
* robot. Note that you won't see anything change on the Driver Station, only on the Robot Controller. */
View relativeLayout;
/*
* The runOpMode() method is the root of this OpMode, as it is in all LinearOpModes.
 * Our implementation here, though, is a bit unusual: we've decided to put all the actual work
* in the runSample() method rather than directly in runOpMode() itself. The reason we do that is
* that in this sample we're changing the background color of the robot controller screen as the
* OpMode runs, and we want to be able to *guarantee* that we restore it to something reasonable
* and palatable when the OpMode ends. The simplest way to do that is to use a try...finally
* block around the main, core logic, and an easy way to make that all clear was to separate
* the former from the latter in separate methods.
*/
@Override public void runOpMode() {
// Get a reference to the RelativeLayout so we can later change the background
// color of the Robot Controller app to match the hue detected by the RGB sensor.
int relativeLayoutId = hardwareMap.appContext.getResources().getIdentifier("RelativeLayout", "id", hardwareMap.appContext.getPackageName());
relativeLayout = ((Activity) hardwareMap.appContext).findViewById(relativeLayoutId);
try {
runSample(); // actually execute the sample
} finally {
// On the way out, *guarantee* that the background is reasonable. It doesn't actually start off
// as pure white, but it's too much work to dig out what actually was used, and this is good
// enough to at least make the screen reasonable again.
// Set the panel back to the default color
relativeLayout.post(new Runnable() {
public void run() {
relativeLayout.setBackgroundColor(Color.WHITE);
}
});
}
}
protected void runSample() {
        // You can give the sensor a gain value, which will be multiplied by the sensor's raw value before the
        // normalized color values are calculated. Color sensors (especially the REV Color Sensor V3)
// can give very low values (depending on the lighting conditions), which only use a small part
// of the 0-1 range that is available for the red, green, and blue values. In brighter conditions,
// you should use a smaller gain than in dark conditions. If your gain is too high, all of the
// colors will report at or near 1, and you won't be able to determine what color you are
// actually looking at. For this reason, it's better to err on the side of a lower gain
// (but always greater than or equal to 1).
float gain = 2;
// Once per loop, we will update this hsvValues array. The first element (0) will contain the
// hue, the second element (1) will contain the saturation, and the third element (2) will
// contain the value. See http://web.archive.org/web/20190311170843/https://infohost.nmt.edu/tcc/help/pubs/colortheory/web/hsv.html
// for an explanation of HSV color.
final float[] hsvValues = new float[3];
// xButtonPreviouslyPressed and xButtonCurrentlyPressed keep track of the previous and current
// state of the X button on the gamepad
boolean xButtonPreviouslyPressed = false;
boolean xButtonCurrentlyPressed = false;
// Get a reference to our sensor object. It's recommended to use NormalizedColorSensor over
// ColorSensor, because NormalizedColorSensor consistently gives values between 0 and 1, while
// the values you get from ColorSensor are dependent on the specific sensor you're using.
colorSensor = hardwareMap.get(NormalizedColorSensor.class, "sensor_color");
// If possible, turn the light on in the beginning (it might already be on anyway,
// we just make sure it is if we can).
if (colorSensor instanceof SwitchableLight) {
((SwitchableLight)colorSensor).enableLight(true);
}
// Wait for the start button to be pressed.
waitForStart();
// Loop until we are asked to stop
while (opModeIsActive()) {
// Explain basic gain information via telemetry
telemetry.addLine("Hold the A button on gamepad 1 to increase gain, or B to decrease it.\n");
telemetry.addLine("Higher gain values mean that the sensor will report larger numbers for Red, Green, and Blue, and Value\n");
// Update the gain value if either of the A or B gamepad buttons is being held
if (gamepad1.a) {
// Only increase the gain by a small amount, since this loop will occur multiple times per second.
gain += 0.005;
} else if (gamepad1.b && gain > 1) { // A gain of less than 1 will make the values smaller, which is not helpful.
gain -= 0.005;
}
// Show the gain value via telemetry
telemetry.addData("Gain", gain);
// Tell the sensor our desired gain value (normally you would do this during initialization,
// not during the loop)
colorSensor.setGain(gain);
// Check the status of the X button on the gamepad
xButtonCurrentlyPressed = gamepad1.x;
// If the button state is different than what it was, then act
if (xButtonCurrentlyPressed != xButtonPreviouslyPressed) {
// If the button is (now) down, then toggle the light
if (xButtonCurrentlyPressed) {
if (colorSensor instanceof SwitchableLight) {
SwitchableLight light = (SwitchableLight)colorSensor;
light.enableLight(!light.isLightOn());
}
}
}
xButtonPreviouslyPressed = xButtonCurrentlyPressed;
// Get the normalized colors from the sensor
NormalizedRGBA colors = colorSensor.getNormalizedColors();
/* Use telemetry to display feedback on the driver station. We show the red, green, and blue
* normalized values from the sensor (in the range of 0 to 1), as well as the equivalent
* HSV (hue, saturation and value) values. See http://web.archive.org/web/20190311170843/https://infohost.nmt.edu/tcc/help/pubs/colortheory/web/hsv.html
* for an explanation of HSV color. */
// Update the hsvValues array by passing it to Color.colorToHSV()
Color.colorToHSV(colors.toColor(), hsvValues);
telemetry.addLine()
.addData("Red", "%.3f", colors.red)
.addData("Green", "%.3f", colors.green)
.addData("Blue", "%.3f", colors.blue);
telemetry.addLine()
.addData("Hue", "%.3f", hsvValues[0])
.addData("Saturation", "%.3f", hsvValues[1])
.addData("Value", "%.3f", hsvValues[2]);
telemetry.addData("Alpha", "%.3f", colors.alpha);
/* If this color sensor also has a distance sensor, display the measured distance.
* Note that the reported distance is only useful at very close range, and is impacted by
* ambient light and surface reflectivity. */
if (colorSensor instanceof DistanceSensor) {
telemetry.addData("Distance (cm)", "%.3f", ((DistanceSensor) colorSensor).getDistance(DistanceUnit.CM));
}
telemetry.update();
// Change the Robot Controller's background color to match the color detected by the color sensor.
relativeLayout.post(new Runnable() {
public void run() {
relativeLayout.setBackgroundColor(Color.HSVToColor(hsvValues));
}
});
}
}
}
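Color.colorToHSV() above is Android's RGB-to-HSV conversion: hue is a position on a 360° color wheel, saturation is chroma divided by the largest channel, and value is the largest channel itself. The math it performs can be sketched in plain Java (class name is illustrative):

```java
public class RgbToHsv {
    // Convert normalized RGB components (each 0..1) to HSV:
    // hue in degrees [0, 360), saturation and value in [0, 1].
    // Sketch of the arithmetic behind Android's Color.colorToHSV().
    public static float[] toHsv(float r, float g, float b) {
        float max = Math.max(r, Math.max(g, b));
        float min = Math.min(r, Math.min(g, b));
        float delta = max - min;
        float h;
        if (delta == 0f)   h = 0f;                             // grey: hue undefined, use 0
        else if (max == r) h = 60f * (((g - b) / delta) % 6f); // between yellow and magenta
        else if (max == g) h = 60f * (((b - r) / delta) + 2f); // between cyan and yellow
        else               h = 60f * (((r - g) / delta) + 4f); // between magenta and cyan
        if (h < 0f) h += 360f;
        float s = (max == 0f) ? 0f : delta / max;
        return new float[] { h, s, max };
    }
}
```

Hue is what the sample keys the background color on because it is largely independent of brightness, which is also why it is more robust than raw RGB for classifying game elements under varying field lighting.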

@@ -0,0 +1,78 @@
/* Copyright (c) 2024 FIRST. All rights reserved.
*
* Redistribution and use in source and binary forms, with or without modification,
* are permitted (subject to the limitations in the disclaimer below) provided that
* the following conditions are met:
*
* Redistributions of source code must retain the above copyright notice, this list
* of conditions and the following disclaimer.
*
* Redistributions in binary form must reproduce the above copyright notice, this
* list of conditions and the following disclaimer in the documentation and/or
* other materials provided with the distribution.
*
* Neither the name of FIRST nor the names of its contributors may be used to endorse or
* promote products derived from this software without specific prior written permission.
*
* NO EXPRESS OR IMPLIED LICENSES TO ANY PARTY'S PATENT RIGHTS ARE GRANTED BY THIS
* LICENSE. THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
* "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO,
* THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE
* ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR CONTRIBUTORS BE LIABLE
* FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL
* DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR
* SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER
* CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY,
* OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
* OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
*/
package org.firstinspires.ftc.robotcontroller.external.samples;
import com.qualcomm.robotcore.eventloop.opmode.Disabled;
import com.qualcomm.robotcore.eventloop.opmode.LinearOpMode;
import com.qualcomm.robotcore.eventloop.opmode.TeleOp;
import com.qualcomm.robotcore.hardware.DigitalChannel;
/*
* This OpMode demonstrates how to use a digital channel.
*
* The OpMode assumes that the digital channel is configured with a name of "digitalTouch".
*
* Use Android Studio to Copy this Class, and Paste it into your team's code folder with a new name.
* Remove or comment out the @Disabled line to add this OpMode to the Driver Station OpMode list.
*/
@TeleOp(name = "Sensor: digital channel", group = "Sensor")
@Disabled
public class SensorDigitalTouch extends LinearOpMode {
DigitalChannel digitalTouch; // Digital channel Object
@Override
public void runOpMode() {
// get a reference to our touchSensor object.
digitalTouch = hardwareMap.get(DigitalChannel.class, "digitalTouch");
digitalTouch.setMode(DigitalChannel.Mode.INPUT);
telemetry.addData("DigitalTouchSensorExample", "Press start to continue...");
telemetry.update();
// wait for the start button to be pressed.
waitForStart();
// while the OpMode is active, loop and read the digital channel.
// Note we use opModeIsActive() as our loop condition because it is an interruptible method.
while (opModeIsActive()) {
// button is pressed if value returned is LOW or false.
// send the info back to driver station using telemetry function.
            if (!digitalTouch.getState()) {
telemetry.addData("Button", "PRESSED");
} else {
telemetry.addData("Button", "NOT PRESSED");
}
telemetry.update();
}
}
}
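The loop above re-reports the button state on every iteration, which is fine for telemetry. To act exactly once per press (the pattern the SensorColor sample uses for its X-button light toggle), track the previous reading and look for a transition. A minimal sketch (class name hypothetical):

```java
public class EdgeDetector {
    private boolean previous = false;

    // Returns true exactly once for each false -> true transition of the input.
    public boolean risingEdge(boolean current) {
        boolean edge = current && !previous;
        previous = current;
        return edge;
    }
}
```

For an active-low touch sensor like the one above, you would feed it the inverted channel state, since a LOW (false) reading means "pressed".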

@@ -0,0 +1,149 @@
/*
Copyright (c) 2023 FIRST
All rights reserved.
Redistribution and use in source and binary forms, with or without modification,
are permitted (subject to the limitations in the disclaimer below) provided that
the following conditions are met:
Redistributions of source code must retain the above copyright notice, this list
of conditions and the following disclaimer.
Redistributions in binary form must reproduce the above copyright notice, this
list of conditions and the following disclaimer in the documentation and/or
other materials provided with the distribution.
Neither the name of FIRST nor the names of its contributors may be used to
endorse or promote products derived from this software without specific prior
written permission.
NO EXPRESS OR IMPLIED LICENSES TO ANY PARTY'S PATENT RIGHTS ARE GRANTED BY THIS
LICENSE. THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
"AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO,
THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE
ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR CONTRIBUTORS BE LIABLE
FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL
DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR
SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER
CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR
TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF
THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
*/
package org.firstinspires.ftc.robotcontroller.external.samples;
import com.qualcomm.hardware.dfrobot.HuskyLens;
import com.qualcomm.robotcore.eventloop.opmode.Disabled;
import com.qualcomm.robotcore.eventloop.opmode.LinearOpMode;
import com.qualcomm.robotcore.eventloop.opmode.TeleOp;
import org.firstinspires.ftc.robotcore.internal.system.Deadline;
import java.util.concurrent.TimeUnit;
/*
* This OpMode illustrates how to use the DFRobot HuskyLens.
*
* The HuskyLens is a Vision Sensor with a built-in object detection model. It can
* detect a number of predefined objects and AprilTags in the 36h11 family, can
* recognize colors, and can be trained to detect custom objects. See this website for
* documentation: https://wiki.dfrobot.com/HUSKYLENS_V1.0_SKU_SEN0305_SEN0336
*
* This sample illustrates how to detect AprilTags, but can be used to detect other types
* of objects by changing the algorithm. It assumes that the HuskyLens is configured with
* a name of "huskylens".
*
* Use Android Studio to Copy this Class, and Paste it into your team's code folder with a new name.
* Remove or comment out the @Disabled line to add this OpMode to the Driver Station OpMode list
*/
@TeleOp(name = "Sensor: HuskyLens", group = "Sensor")
@Disabled
public class SensorHuskyLens extends LinearOpMode {
private static final int READ_PERIOD = 1; // seconds
private HuskyLens huskyLens;
@Override
public void runOpMode()
{
huskyLens = hardwareMap.get(HuskyLens.class, "huskylens");
/*
* This sample rate limits the reads solely to allow a user time to observe
* what is happening on the Driver Station telemetry. Typical applications
* would not likely rate limit.
*/
Deadline rateLimit = new Deadline(READ_PERIOD, TimeUnit.SECONDS);
/*
* Immediately expire so that the first time through we'll do the read.
*/
rateLimit.expire();
/*
* Basic check to see if the device is alive and communicating. This is not
* technically necessary here as the HuskyLens class does this in its
* doInitialization() method which is called when the device is pulled out of
* the hardware map. However, sometimes it's unclear why a device reports as
* failing on initialization. In the case of this device, it's because the
* call to knock() failed.
*/
if (!huskyLens.knock()) {
telemetry.addData(">>", "Problem communicating with " + huskyLens.getDeviceName());
} else {
telemetry.addData(">>", "Press start to continue");
}
/*
* The device uses the concept of an algorithm to determine what types of
* objects it will look for and/or what mode it is in. The algorithm may be
* selected using the scroll wheel on the device, or via software as shown in
* the call to selectAlgorithm().
*
* The SDK itself does not assume that the user wants a particular algorithm on
* startup, and hence does not set an algorithm.
*
* Users should, in general, explicitly choose the algorithm they want to use
* within the OpMode by calling selectAlgorithm() and passing it one of the values
* found in the enumeration HuskyLens.Algorithm.
*/
huskyLens.selectAlgorithm(HuskyLens.Algorithm.TAG_RECOGNITION);
telemetry.update();
waitForStart();
/*
* Looking for AprilTags per the call to selectAlgorithm() above. A handy grid
* for testing may be found at https://wiki.dfrobot.com/HUSKYLENS_V1.0_SKU_SEN0305_SEN0336#target_20.
*
* Note again that the device only recognizes the 36h11 family of tags out of the box.
*/
while(opModeIsActive()) {
if (!rateLimit.hasExpired()) {
continue;
}
rateLimit.reset();
/*
* All algorithms, except for LINE_TRACKING, return a list of Blocks where a
* Block represents the outline of a recognized object along with its ID number.
* ID numbers allow you to identify what the device saw. See the HuskyLens documentation
* referenced in the header comment above for more information on IDs and how to
* assign them to objects.
*
* Returns an empty array if no objects are seen.
*/
HuskyLens.Block[] blocks = huskyLens.blocks();
telemetry.addData("Block count", blocks.length);
for (int i = 0; i < blocks.length; i++) {
telemetry.addData("Block", blocks[i].toString());
}
telemetry.update();
}
}
}
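The read-throttling pattern above relies on the SDK's `Deadline` class. A plain-Java sketch of the same expire()/hasExpired()/reset() behavior (the class name `SimpleDeadline` is hypothetical and not part of the SDK; shown only to illustrate the pattern):

```java
import java.util.concurrent.TimeUnit;

// SimpleDeadline is a hypothetical stand-in for the SDK's Deadline class,
// illustrating the rate-limit pattern used in the HuskyLens sample above.
class SimpleDeadline {
    private final long periodNanos;
    private long startNanos;

    SimpleDeadline(long duration, TimeUnit unit) {
        this.periodNanos = unit.toNanos(duration);
        reset();
    }

    // Restart the timing window from "now".
    void reset() { startNanos = System.nanoTime(); }

    // Force the deadline to read as already expired, so the first
    // loop iteration performs a read immediately (as the sample does).
    void expire() { startNanos = System.nanoTime() - periodNanos; }

    boolean hasExpired() { return System.nanoTime() - startNanos >= periodNanos; }
}
```

Calling `expire()` before the loop, then `reset()` after each read, yields at most one read per period without ever blocking the loop.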

/* Copyright (c) 2022 FIRST. All rights reserved.
*
* Redistribution and use in source and binary forms, with or without modification,
* are permitted (subject to the limitations in the disclaimer below) provided that
* the following conditions are met:
*
* Redistributions of source code must retain the above copyright notice, this list
* of conditions and the following disclaimer.
*
* Redistributions in binary form must reproduce the above copyright notice, this
* list of conditions and the following disclaimer in the documentation and/or
* other materials provided with the distribution.
*
* Neither the name of FIRST nor the names of its contributors may be used to endorse or
* promote products derived from this software without specific prior written permission.
*
* NO EXPRESS OR IMPLIED LICENSES TO ANY PARTY'S PATENT RIGHTS ARE GRANTED BY THIS
* LICENSE. THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
* "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO,
* THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE
* ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR CONTRIBUTORS BE LIABLE
* FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL
* DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR
* SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER
* CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY,
* OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
* OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
*/
package org.firstinspires.ftc.robotcontroller.external.samples;
import com.qualcomm.hardware.rev.RevHubOrientationOnRobot;
import com.qualcomm.robotcore.eventloop.opmode.Disabled;
import com.qualcomm.robotcore.eventloop.opmode.LinearOpMode;
import com.qualcomm.robotcore.eventloop.opmode.TeleOp;
import com.qualcomm.robotcore.hardware.IMU;
import org.firstinspires.ftc.robotcore.external.navigation.AngleUnit;
import org.firstinspires.ftc.robotcore.external.navigation.AngularVelocity;
import org.firstinspires.ftc.robotcore.external.navigation.Orientation;
import org.firstinspires.ftc.robotcore.external.navigation.YawPitchRollAngles;
import static com.qualcomm.hardware.rev.RevHubOrientationOnRobot.xyzOrientation;
/*
* This OpMode shows how to use the new universal IMU interface. This
* interface may be used with the BNO055 IMU or the BHI260 IMU. It assumes that an IMU is configured
* on the robot with the name "imu".
*
* The sample will display the current Yaw, Pitch and Roll of the robot.<br>
* With the correct orientation parameters selected, pitch/roll/yaw should act as follows:
* Pitch value should INCREASE as the robot is tipped UP at the front. (Rotation about X)
* Roll value should INCREASE as the robot is tipped UP at the left side. (Rotation about Y)
* Yaw value should INCREASE as the robot is rotated Counter Clockwise. (Rotation about Z)
*
* The yaw can be reset (to zero) by pressing the Y button on the gamepad (Triangle on a PS4 controller)
*
* This specific sample DOES NOT assume that the Hub is mounted on one of the three orthogonal
* planes (X/Y, X/Z or Y/Z) OR that the Hub has only been rotated in a range of 90 degree increments.
*
* Note: if your Hub is mounted Orthogonally (on an orthogonal surface, angled at some multiple of
* 90 Degrees) then you should use the simpler SensorImuOrthogonal sample in this folder.
*
* But... If your Hub is mounted Non-Orthogonally, you must specify one or more rotational angles
* that transform a "Default" Hub orientation into your desired orientation. That is what is
* illustrated here.
*
* Use Android Studio to Copy this Class, and Paste it into your team's code folder with a new name.
* Remove or comment out the @Disabled line to add this OpMode to the Driver Station OpMode list.
*
* Finally, edit this OpMode to use at least one angle around an axis to orient your Hub.
*/
@TeleOp(name = "Sensor: IMU Non-Orthogonal", group = "Sensor")
@Disabled // Comment this out to add to the OpMode list
public class SensorIMUNonOrthogonal extends LinearOpMode
{
// The IMU sensor object
IMU imu;
//----------------------------------------------------------------------------------------------
// Main logic
//----------------------------------------------------------------------------------------------
@Override public void runOpMode() throws InterruptedException {
// Retrieve and initialize the IMU.
// This sample expects the IMU to be in a REV Hub and named "imu".
imu = hardwareMap.get(IMU.class, "imu");
/* Define how the hub is mounted to the robot to get the correct Yaw, Pitch and Roll values.
*
* You can apply up to three axis rotations to orient your Hub according to how it's mounted on the robot.
*
* The starting point for these rotations is the "Default" Hub orientation, which is:
* 1) Hub laying flat on a horizontal surface, with the Printed Logo facing UP
* 2) Rotated such that the USB ports are facing forward on the robot.
*
* The order that the rotations are performed matters, so this sample shows doing them in the order X, Y, then Z.
* For specifying non-orthogonal hub mounting orientations, we must temporarily use axes
* defined relative to the Hub itself, instead of the usual Robot Coordinate System axes
* used for the results the IMU gives us. In the starting orientation, the Hub axes are
* aligned with the Robot Coordinate System:
*
* X Axis: Starting at Center of Hub, pointing out towards I2C connectors
* Y Axis: Starting at Center of Hub, pointing out towards USB connectors
* Z Axis: Starting at Center of Hub, pointing Up through LOGO
*
* Positive rotation is defined by right-hand rule with thumb pointing in +ve direction on axis.
*
* Some examples.
*
* ----------------------------------------------------------------------------------------------------------------------------------
* Example A) Assume that the hub is mounted on a sloped plate at the back of the robot, with the USB ports coming out the top of the hub.
* The plate is tilted UP 60 degrees from horizontal.
*
* To get the "Default" hub into this configuration you would just need a single rotation.
* 1) Rotate the Hub +60 degrees around the X axis to tilt up the front edge.
* 2) No rotation around the Y or Z axes.
*
* So the X,Y,Z rotations would be 60,0,0
*
* ----------------------------------------------------------------------------------------------------------------------------------
* Example B) Assume that the hub is laying flat on the chassis, but it has been twisted 30 degrees towards the right front wheel to make
* the USB cable accessible.
*
* To get the "Default" hub into this configuration you would just need a single rotation, but around a different axis.
* 1) No rotation around the X or Y axes.
* 2) Rotate the Hub -30 degrees (Clockwise) around the Z axis, since a positive angle would be Counter Clockwise.
*
* So the X,Y,Z rotations would be 0,0,-30
*
* ----------------------------------------------------------------------------------------------------------------------------------
* Example C) Assume that the hub is mounted on a vertical plate on the right side of the robot, with the Logo facing out, and the
* Hub rotated so that the USB ports are facing down 30 degrees towards the back wheels of the robot.
*
* To get the "Default" hub into this configuration will require several rotations.
* 1) Rotate the hub +90 degrees around the X axis to get it standing upright with the logo pointing backwards on the robot
* 2) Next, rotate the hub +90 around the Y axis to get it facing to the right.
* 3) Finally rotate the hub +120 degrees around the Z axis to take the USB ports from vertical to sloping down 30 degrees and
* facing towards the back of the robot.
*
* So the X,Y,Z rotations would be 90,90,120
*/
// The next three lines define the desired axis rotations.
// To Do: EDIT these values to match YOUR mounting configuration.
double xRotation = 0; // enter the desired X rotation angle here.
double yRotation = 0; // enter the desired Y rotation angle here.
double zRotation = 0; // enter the desired Z rotation angle here.
Orientation hubRotation = xyzOrientation(xRotation, yRotation, zRotation);
// Now initialize the IMU with this mounting orientation
RevHubOrientationOnRobot orientationOnRobot = new RevHubOrientationOnRobot(hubRotation);
imu.initialize(new IMU.Parameters(orientationOnRobot));
// Loop and update the dashboard
while (!isStopRequested()) {
telemetry.addData("Hub orientation", "X=%.1f, Y=%.1f, Z=%.1f \n", xRotation, yRotation, zRotation);
// Check to see if heading reset is requested
if (gamepad1.y) {
telemetry.addData("Yaw", "Resetting\n");
imu.resetYaw();
} else {
telemetry.addData("Yaw", "Press Y (triangle) on Gamepad to reset\n");
}
// Retrieve Rotational Angles and Velocities
YawPitchRollAngles orientation = imu.getRobotYawPitchRollAngles();
AngularVelocity angularVelocity = imu.getRobotAngularVelocity(AngleUnit.DEGREES);
telemetry.addData("Yaw (Z)", "%.2f Deg. (Heading)", orientation.getYaw(AngleUnit.DEGREES));
telemetry.addData("Pitch (X)", "%.2f Deg.", orientation.getPitch(AngleUnit.DEGREES));
telemetry.addData("Roll (Y)", "%.2f Deg.\n", orientation.getRoll(AngleUnit.DEGREES));
telemetry.addData("Yaw (Z) velocity", "%.2f Deg/Sec", angularVelocity.zRotationRate);
telemetry.addData("Pitch (X) velocity", "%.2f Deg/Sec", angularVelocity.xRotationRate);
telemetry.addData("Roll (Y) velocity", "%.2f Deg/Sec", angularVelocity.yRotationRate);
telemetry.update();
}
}
}
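The comment block above stresses that the X, Y, Z rotation order matters. That can be checked with ordinary rotation matrices. The sketch below (not SDK code; it assumes right-hand-rule rotations about fixed axes, with +Y toward the USB ports and +Z up through the logo) confirms that a +90 degree X rotation sends the logo's "up" vector to "backward" (-Y), matching step 1 of Example C, and that swapping the order of an X and a Z rotation gives a different result:

```java
// Minimal axis-rotation helpers (right-hand rule, angles in degrees).
// Illustrative only; the SDK's xyzOrientation() does the real work.
class AxisRotations {
    static double[] rotX(double[] v, double deg) {
        double r = Math.toRadians(deg), c = Math.cos(r), s = Math.sin(r);
        return new double[] { v[0], c * v[1] - s * v[2], s * v[1] + c * v[2] };
    }

    static double[] rotZ(double[] v, double deg) {
        double r = Math.toRadians(deg), c = Math.cos(r), s = Math.sin(r);
        return new double[] { c * v[0] - s * v[1], s * v[0] + c * v[1], v[2] };
    }
}
```

Rotating the logo vector (0, 0, 1) by +90 degrees about X yields (0, -1, 0), i.e. pointing toward the back of the robot, while applying the same X and Z rotations in opposite orders to the USB vector (0, 1, 0) lands it in two different places.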

/* Copyright (c) 2022 FIRST. All rights reserved.
*
* Redistribution and use in source and binary forms, with or without modification,
* are permitted (subject to the limitations in the disclaimer below) provided that
* the following conditions are met:
*
* Redistributions of source code must retain the above copyright notice, this list
* of conditions and the following disclaimer.
*
* Redistributions in binary form must reproduce the above copyright notice, this
* list of conditions and the following disclaimer in the documentation and/or
* other materials provided with the distribution.
*
* Neither the name of FIRST nor the names of its contributors may be used to endorse or
* promote products derived from this software without specific prior written permission.
*
* NO EXPRESS OR IMPLIED LICENSES TO ANY PARTY'S PATENT RIGHTS ARE GRANTED BY THIS
* LICENSE. THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
* "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO,
* THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE
* ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR CONTRIBUTORS BE LIABLE
* FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL
* DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR
* SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER
* CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY,
* OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
* OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
*/
package org.firstinspires.ftc.robotcontroller.external.samples;
import com.qualcomm.hardware.rev.RevHubOrientationOnRobot;
import com.qualcomm.robotcore.eventloop.opmode.Disabled;
import com.qualcomm.robotcore.eventloop.opmode.LinearOpMode;
import com.qualcomm.robotcore.eventloop.opmode.TeleOp;
import com.qualcomm.robotcore.hardware.IMU;
import org.firstinspires.ftc.robotcore.external.navigation.AngleUnit;
import org.firstinspires.ftc.robotcore.external.navigation.AngularVelocity;
import org.firstinspires.ftc.robotcore.external.navigation.YawPitchRollAngles;
/*
* This OpMode shows how to use the new universal IMU interface. This
* interface may be used with the BNO055 IMU or the BHI260 IMU. It assumes that an IMU is configured
* on the robot with the name "imu".
*
* The sample will display the current Yaw, Pitch and Roll of the robot.<br>
* With the correct orientation parameters selected, pitch/roll/yaw should act as follows:
* Pitch value should INCREASE as the robot is tipped UP at the front. (Rotation about X) <br>
* Roll value should INCREASE as the robot is tipped UP at the left side. (Rotation about Y) <br>
* Yaw value should INCREASE as the robot is rotated Counter Clockwise. (Rotation about Z) <br>
*
* The yaw can be reset (to zero) by pressing the Y button on the gamepad (Triangle on a PS4 controller)
*
* This specific sample assumes that the Hub is mounted on one of the three orthogonal planes
* (X/Y, X/Z or Y/Z) and that the Hub has only been rotated in a range of 90 degree increments.
*
* Note: if your Hub is mounted on a surface angled at some non-90 Degree multiple (like 30) look at
* the alternative SensorImuNonOrthogonal sample in this folder.
*
* This "Orthogonal" requirement means that:
*
* 1) The Logo printed on the top of the Hub can ONLY be pointing in one of six directions:
* FORWARD, BACKWARD, UP, DOWN, LEFT and RIGHT.
*
* 2) The USB ports can only be pointing in one of the same six directions:<br>
* FORWARD, BACKWARD, UP, DOWN, LEFT and RIGHT.
*
* So, to fully define how your Hub is mounted to the robot, you must simply specify:<br>
* logoFacingDirection<br>
* usbFacingDirection
*
* Use Android Studio to Copy this Class, and Paste it into your team's code folder with a new name.
* Remove or comment out the @Disabled line to add this OpMode to the Driver Station OpMode list.
*
* Finally, choose the two correct parameters to define how your Hub is mounted and edit this OpMode
* to use those parameters.
*/
@TeleOp(name = "Sensor: IMU Orthogonal", group = "Sensor")
@Disabled // Comment this out to add to the OpMode list
public class SensorIMUOrthogonal extends LinearOpMode
{
// The IMU sensor object
IMU imu;
//----------------------------------------------------------------------------------------------
// Main logic
//----------------------------------------------------------------------------------------------
@Override public void runOpMode() throws InterruptedException {
// Retrieve and initialize the IMU.
// This sample expects the IMU to be in a REV Hub and named "imu".
imu = hardwareMap.get(IMU.class, "imu");
/* Define how the hub is mounted on the robot to get the correct Yaw, Pitch and Roll values.
*
* Two input parameters are required to fully specify the Orientation.
* The first parameter specifies the direction the printed logo on the Hub is pointing.
* The second parameter specifies the direction the USB connector on the Hub is pointing.
* All directions are relative to the robot, and left/right is as-viewed from behind the robot.
*/
/* The next two lines define Hub orientation.
* The Default Orientation (shown) is when a hub is mounted horizontally with the printed logo pointing UP and the USB port pointing FORWARD.
*
* To Do: EDIT these two lines to match YOUR mounting configuration.
*/
RevHubOrientationOnRobot.LogoFacingDirection logoDirection = RevHubOrientationOnRobot.LogoFacingDirection.UP;
RevHubOrientationOnRobot.UsbFacingDirection usbDirection = RevHubOrientationOnRobot.UsbFacingDirection.FORWARD;
RevHubOrientationOnRobot orientationOnRobot = new RevHubOrientationOnRobot(logoDirection, usbDirection);
// Now initialize the IMU with this mounting orientation
// Note: if you choose two conflicting directions, this initialization will cause a code exception.
imu.initialize(new IMU.Parameters(orientationOnRobot));
// Loop and update the dashboard
while (!isStopRequested()) {
telemetry.addData("Hub orientation", "Logo=%s USB=%s\n ", logoDirection, usbDirection);
// Check to see if heading reset is requested
if (gamepad1.y) {
telemetry.addData("Yaw", "Resetting\n");
imu.resetYaw();
} else {
telemetry.addData("Yaw", "Press Y (triangle) on Gamepad to reset\n");
}
// Retrieve Rotational Angles and Velocities
YawPitchRollAngles orientation = imu.getRobotYawPitchRollAngles();
AngularVelocity angularVelocity = imu.getRobotAngularVelocity(AngleUnit.DEGREES);
telemetry.addData("Yaw (Z)", "%.2f Deg. (Heading)", orientation.getYaw(AngleUnit.DEGREES));
telemetry.addData("Pitch (X)", "%.2f Deg.", orientation.getPitch(AngleUnit.DEGREES));
telemetry.addData("Roll (Y)", "%.2f Deg.\n", orientation.getRoll(AngleUnit.DEGREES));
telemetry.addData("Yaw (Z) velocity", "%.2f Deg/Sec", angularVelocity.zRotationRate);
telemetry.addData("Pitch (X) velocity", "%.2f Deg/Sec", angularVelocity.xRotationRate);
telemetry.addData("Roll (Y) velocity", "%.2f Deg/Sec", angularVelocity.yRotationRate);
telemetry.update();
}
}
}

/* Copyright (c) 2017 FIRST. All rights reserved.
*
* Redistribution and use in source and binary forms, with or without modification,
* are permitted (subject to the limitations in the disclaimer below) provided that
* the following conditions are met:
*
* Redistributions of source code must retain the above copyright notice, this list
* of conditions and the following disclaimer.
*
* Redistributions in binary form must reproduce the above copyright notice, this
* list of conditions and the following disclaimer in the documentation and/or
* other materials provided with the distribution.
*
* Neither the name of FIRST nor the names of its contributors may be used to endorse or
* promote products derived from this software without specific prior written permission.
*
* NO EXPRESS OR IMPLIED LICENSES TO ANY PARTY'S PATENT RIGHTS ARE GRANTED BY THIS
* LICENSE. THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
* "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO,
* THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE
* ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR CONTRIBUTORS BE LIABLE
* FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL
* DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR
* SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER
* CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY,
* OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
* OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
*/
package org.firstinspires.ftc.robotcontroller.external.samples;
import com.qualcomm.hardware.kauailabs.NavxMicroNavigationSensor;
import com.qualcomm.robotcore.eventloop.opmode.Disabled;
import com.qualcomm.robotcore.eventloop.opmode.LinearOpMode;
import com.qualcomm.robotcore.eventloop.opmode.TeleOp;
import com.qualcomm.robotcore.hardware.Gyroscope;
import com.qualcomm.robotcore.hardware.IntegratingGyroscope;
import com.qualcomm.robotcore.util.ElapsedTime;
import org.firstinspires.ftc.robotcore.external.navigation.AngleUnit;
import org.firstinspires.ftc.robotcore.external.navigation.AngularVelocity;
import org.firstinspires.ftc.robotcore.external.navigation.AxesOrder;
import org.firstinspires.ftc.robotcore.external.navigation.AxesReference;
import org.firstinspires.ftc.robotcore.external.navigation.Orientation;
/*
* This OpMode shows how to use Kauai Labs navX Micro Robotics Navigation
* Sensor. It assumes that the sensor is configured with a name of "navx".
*
* Use Android Studio to Copy this Class, and Paste it into your team's code folder with a new name.
* Remove or comment out the @Disabled line to add this OpMode to the Driver Station OpMode list
*/
@TeleOp(name = "Sensor: KL navX Micro", group = "Sensor")
@Disabled
public class SensorKLNavxMicro extends LinearOpMode {
/** In this sample, for illustration purposes we use two interfaces on the one gyro object.
* That's likely atypical: you'll probably use one or the other in any given situation,
* depending on what you're trying to do. {@link IntegratingGyroscope} (and its base interface,
* {@link Gyroscope}) are common interfaces supported by possibly several different gyro
* implementations. {@link NavxMicroNavigationSensor}, by contrast, provides functionality that
* is unique to the navX Micro sensor.
*/
IntegratingGyroscope gyro;
NavxMicroNavigationSensor navxMicro;
// A timer helps provide feedback while calibration is taking place
ElapsedTime timer = new ElapsedTime();
@Override public void runOpMode() throws InterruptedException {
// Get a reference to the navX Micro sensor object. We use several interfaces
// on this object to illustrate which interfaces support which functionality.
navxMicro = hardwareMap.get(NavxMicroNavigationSensor.class, "navx");
gyro = (IntegratingGyroscope)navxMicro;
// If you're only interested in the IntegratingGyroscope interface, the following will suffice.
// gyro = hardwareMap.get(IntegratingGyroscope.class, "navx");
// The gyro automatically starts calibrating. This takes a few seconds.
telemetry.log().add("Gyro Calibrating. Do Not Move!");
// Wait until the gyro calibration is complete
timer.reset();
while (navxMicro.isCalibrating()) {
telemetry.addData("calibrating", "%s", Math.round(timer.seconds())%2==0 ? "|.." : "..|");
telemetry.update();
Thread.sleep(50);
}
telemetry.log().clear(); telemetry.log().add("Gyro Calibrated. Press Start.");
telemetry.clear(); telemetry.update();
// Wait for the start button to be pressed
waitForStart();
telemetry.log().clear();
while (opModeIsActive()) {
// Read dimensionalized data from the gyro. This gyro can report angular velocities
// about all three axes. Additionally, it internally integrates the Z axis to
// be able to report an absolute angular Z orientation.
AngularVelocity rates = gyro.getAngularVelocity(AngleUnit.DEGREES);
Orientation angles = gyro.getAngularOrientation(AxesReference.INTRINSIC, AxesOrder.ZYX, AngleUnit.DEGREES);
telemetry.addLine()
.addData("dx", formatRate(rates.xRotationRate))
.addData("dy", formatRate(rates.yRotationRate))
.addData("dz", "%s deg/s", formatRate(rates.zRotationRate));
telemetry.addLine()
.addData("heading", formatAngle(angles.angleUnit, angles.firstAngle))
.addData("roll", formatAngle(angles.angleUnit, angles.secondAngle))
.addData("pitch", "%s deg", formatAngle(angles.angleUnit, angles.thirdAngle));
telemetry.update();
idle(); // Always call idle() at the bottom of your while(opModeIsActive()) loop
}
}
String formatRate(float rate) {
return String.format("%.3f", rate);
}
String formatAngle(AngleUnit angleUnit, double angle) {
return formatDegrees(AngleUnit.DEGREES.fromUnit(angleUnit, angle));
}
String formatDegrees(double degrees){
return String.format("%.1f", AngleUnit.DEGREES.normalize(degrees));
}
}
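formatDegrees() above leans on the SDK's AngleUnit.DEGREES.normalize() to wrap an accumulated angle back into a single turn. A plain-Java sketch of that wrap-around (assuming an output range of [-180, 180); the SDK's exact boundary convention may differ):

```java
// Wrap any angle in degrees into the range [-180, 180).
// Illustrative stand-in for AngleUnit.DEGREES.normalize().
class Angles {
    static double normalizeDegrees(double degrees) {
        double d = degrees % 360.0;   // Java's % keeps the sign of the dividend
        if (d >= 180.0)  d -= 360.0;  // fold the upper half-turn back
        if (d < -180.0)  d += 360.0;  // fold the lower half-turn forward
        return d;
    }
}
```

For example, 450 degrees wraps to 90 and -190 wraps to 170, so a heading that integrates past a full turn still reads as a small signed angle.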

/* Copyright (c) 2017 FIRST. All rights reserved.
*
* Redistribution and use in source and binary forms, with or without modification,
* are permitted (subject to the limitations in the disclaimer below) provided that
* the following conditions are met:
*
* Redistributions of source code must retain the above copyright notice, this list
* of conditions and the following disclaimer.
*
* Redistributions in binary form must reproduce the above copyright notice, this
* list of conditions and the following disclaimer in the documentation and/or
* other materials provided with the distribution.
*
* Neither the name of FIRST nor the names of its contributors may be used to endorse or
* promote products derived from this software without specific prior written permission.
*
* NO EXPRESS OR IMPLIED LICENSES TO ANY PARTY'S PATENT RIGHTS ARE GRANTED BY THIS
* LICENSE. THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
* "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO,
* THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE
* ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR CONTRIBUTORS BE LIABLE
* FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL
* DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR
* SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER
* CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY,
* OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
* OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
*/
package org.firstinspires.ftc.robotcontroller.external.samples;
import android.app.Activity;
import android.graphics.Color;
import android.view.View;
import com.qualcomm.robotcore.eventloop.opmode.Disabled;
import com.qualcomm.robotcore.eventloop.opmode.LinearOpMode;
import com.qualcomm.robotcore.eventloop.opmode.TeleOp;
import com.qualcomm.robotcore.hardware.ColorSensor;
/*
*
* This OpMode shows how to use
* a Modern Robotics Color Sensor.
*
* The OpMode assumes that the color sensor
* is configured with a name of "sensor_color".
*
* You can use the X button on gamepad1 to toggle the LED on and off.
*
* Use Android Studio to Copy this Class, and Paste it into your team's code folder with a new name.
* Remove or comment out the @Disabled line to add this OpMode to the Driver Station OpMode list
*/
@TeleOp(name = "Sensor: MR Color", group = "Sensor")
@Disabled
public class SensorMRColor extends LinearOpMode {
ColorSensor colorSensor; // Hardware Device Object
@Override
public void runOpMode() {
// hsvValues is an array that will hold the hue, saturation, and value information.
float[] hsvValues = {0F, 0F, 0F};
// values is a reference to the hsvValues array.
final float[] values = hsvValues;
// get a reference to the RelativeLayout so we can change the background
// color of the Robot Controller app to match the hue detected by the RGB sensor.
int relativeLayoutId = hardwareMap.appContext.getResources().getIdentifier("RelativeLayout", "id", hardwareMap.appContext.getPackageName());
final View relativeLayout = ((Activity) hardwareMap.appContext).findViewById(relativeLayoutId);
// bPrevState and bCurrState represent the previous and current state of the button.
boolean bPrevState = false;
boolean bCurrState = false;
// bLedOn represents the state of the LED.
boolean bLedOn = true;
// get a reference to our ColorSensor object.
colorSensor = hardwareMap.get(ColorSensor.class, "sensor_color");
// Set the LED in the beginning
colorSensor.enableLed(bLedOn);
// wait for the start button to be pressed.
waitForStart();
// while the OpMode is active, loop and read the RGB data.
// Note we use opModeIsActive() as our loop condition because it is an interruptible method.
while (opModeIsActive()) {
// check the status of the x button on gamepad1.
bCurrState = gamepad1.x;
// check for button state transitions.
if (bCurrState && (bCurrState != bPrevState)) {
// button is transitioning to a pressed state. So Toggle LED
bLedOn = !bLedOn;
colorSensor.enableLed(bLedOn);
}
// update previous state variable.
bPrevState = bCurrState;
// convert the RGB values to HSV values.
Color.RGBToHSV(colorSensor.red() * 8, colorSensor.green() * 8, colorSensor.blue() * 8, hsvValues);
// send the info back to driver station using telemetry function.
telemetry.addData("LED", bLedOn ? "On" : "Off");
telemetry.addData("Clear", colorSensor.alpha());
telemetry.addData("Red ", colorSensor.red());
telemetry.addData("Green", colorSensor.green());
telemetry.addData("Blue ", colorSensor.blue());
telemetry.addData("Hue", hsvValues[0]);
// change the background color to match the color detected by the RGB sensor.
// pass a reference to the hue, saturation, and value array as an argument
// to the HSVToColor method.
relativeLayout.post(new Runnable() {
public void run() {
relativeLayout.setBackgroundColor(Color.HSVToColor(0xff, values));
}
});
telemetry.update();
}
// Set the panel back to the default color
relativeLayout.post(new Runnable() {
public void run() {
relativeLayout.setBackgroundColor(Color.WHITE);
}
});
}
}

/* Copyright (c) 2017 FIRST. All rights reserved.
*
* Redistribution and use in source and binary forms, with or without modification,
* are permitted (subject to the limitations in the disclaimer below) provided that
* the following conditions are met:
*
* Redistributions of source code must retain the above copyright notice, this list
* of conditions and the following disclaimer.
*
* Redistributions in binary form must reproduce the above copyright notice, this
* list of conditions and the following disclaimer in the documentation and/or
* other materials provided with the distribution.
*
* Neither the name of FIRST nor the names of its contributors may be used to endorse or
* promote products derived from this software without specific prior written permission.
*
* NO EXPRESS OR IMPLIED LICENSES TO ANY PARTY'S PATENT RIGHTS ARE GRANTED BY THIS
* LICENSE. THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
* "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO,
* THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE
* ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR CONTRIBUTORS BE LIABLE
* FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL
* DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR
* SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER
* CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY,
* OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
* OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
*/
package org.firstinspires.ftc.robotcontroller.external.samples;
import com.qualcomm.hardware.modernrobotics.ModernRoboticsI2cGyro;
import com.qualcomm.robotcore.eventloop.opmode.Disabled;
import com.qualcomm.robotcore.eventloop.opmode.LinearOpMode;
import com.qualcomm.robotcore.eventloop.opmode.TeleOp;
import com.qualcomm.robotcore.hardware.IntegratingGyroscope;
import com.qualcomm.robotcore.util.ElapsedTime;
import org.firstinspires.ftc.robotcore.external.navigation.AngleUnit;
import org.firstinspires.ftc.robotcore.external.navigation.AngularVelocity;
import org.firstinspires.ftc.robotcore.external.navigation.AxesOrder;
import org.firstinspires.ftc.robotcore.external.navigation.AxesReference;
/*
* This OpMode shows how to use the Modern Robotics Gyro.
*
* The OpMode assumes that the gyro sensor is attached to a Device Interface Module
* I2C channel and is configured with a name of "gyro".
*
* Use Android Studio to Copy this Class, and Paste it into your team's code folder with a new name.
* Remove or comment out the @Disabled line to add this OpMode to the Driver Station OpMode list
*/
@TeleOp(name = "Sensor: MR Gyro", group = "Sensor")
@Disabled
public class SensorMRGyro extends LinearOpMode {
/* In this sample, for illustration purposes we use two interfaces on the one gyro object.
* That's likely atypical: you'll probably use one or the other in any given situation,
   * depending on what you're trying to do. {@link IntegratingGyroscope} (and its base interface,
* {@link Gyroscope}) are common interfaces supported by possibly several different gyro
* implementations. {@link ModernRoboticsI2cGyro}, by contrast, provides functionality that
* is unique to the Modern Robotics gyro sensor.
*/
IntegratingGyroscope gyro;
ModernRoboticsI2cGyro modernRoboticsI2cGyro;
// A timer helps provide feedback while calibration is taking place
ElapsedTime timer = new ElapsedTime();
@Override
public void runOpMode() {
boolean lastResetState = false;
boolean curResetState = false;
// Get a reference to a Modern Robotics gyro object. We use several interfaces
// on this object to illustrate which interfaces support which functionality.
modernRoboticsI2cGyro = hardwareMap.get(ModernRoboticsI2cGyro.class, "gyro");
gyro = (IntegratingGyroscope)modernRoboticsI2cGyro;
    // If you're only interested in the IntegratingGyroscope interface, the following will suffice.
// gyro = hardwareMap.get(IntegratingGyroscope.class, "gyro");
// A similar approach will work for the Gyroscope interface, if that's all you need.
// Start calibrating the gyro. This takes a few seconds and is worth performing
// during the initialization phase at the start of each OpMode.
telemetry.log().add("Gyro Calibrating. Do Not Move!");
modernRoboticsI2cGyro.calibrate();
// Wait until the gyro calibration is complete
timer.reset();
while (!isStopRequested() && modernRoboticsI2cGyro.isCalibrating()) {
telemetry.addData("calibrating", "%s", Math.round(timer.seconds())%2==0 ? "|.." : "..|");
telemetry.update();
sleep(50);
}
    telemetry.log().clear();
    telemetry.log().add("Gyro Calibrated. Press Start.");
    telemetry.clear();
    telemetry.update();
// Wait for the start button to be pressed
waitForStart();
telemetry.log().clear();
telemetry.log().add("Press A & B to reset heading");
// Loop until we're asked to stop
while (opModeIsActive()) {
// If the A and B buttons are pressed just now, reset Z heading.
curResetState = (gamepad1.a && gamepad1.b);
if (curResetState && !lastResetState) {
modernRoboticsI2cGyro.resetZAxisIntegrator();
}
lastResetState = curResetState;
// The raw() methods report the angular rate of change about each of the
// three axes directly as reported by the underlying sensor IC.
int rawX = modernRoboticsI2cGyro.rawX();
int rawY = modernRoboticsI2cGyro.rawY();
int rawZ = modernRoboticsI2cGyro.rawZ();
int heading = modernRoboticsI2cGyro.getHeading();
int integratedZ = modernRoboticsI2cGyro.getIntegratedZValue();
// Read dimensionalized data from the gyro. This gyro can report angular velocities
// about all three axes. Additionally, it internally integrates the Z axis to
// be able to report an absolute angular Z orientation.
AngularVelocity rates = gyro.getAngularVelocity(AngleUnit.DEGREES);
float zAngle = gyro.getAngularOrientation(AxesReference.INTRINSIC, AxesOrder.ZYX, AngleUnit.DEGREES).firstAngle;
// Read administrative information from the gyro
int zAxisOffset = modernRoboticsI2cGyro.getZAxisOffset();
int zAxisScalingCoefficient = modernRoboticsI2cGyro.getZAxisScalingCoefficient();
telemetry.addLine()
.addData("dx", formatRate(rates.xRotationRate))
.addData("dy", formatRate(rates.yRotationRate))
.addData("dz", "%s deg/s", formatRate(rates.zRotationRate));
telemetry.addData("angle", "%s deg", formatFloat(zAngle));
telemetry.addData("heading", "%3d deg", heading);
telemetry.addData("integrated Z", "%3d", integratedZ);
telemetry.addLine()
.addData("rawX", formatRaw(rawX))
.addData("rawY", formatRaw(rawY))
.addData("rawZ", formatRaw(rawZ));
telemetry.addLine().addData("z offset", zAxisOffset).addData("z coeff", zAxisScalingCoefficient);
telemetry.update();
}
}
String formatRaw(int rawValue) {
return String.format("%d", rawValue);
}
String formatRate(float rate) {
return String.format("%.3f", rate);
}
String formatFloat(float rate) {
return String.format("%.3f", rate);
}
}

/* Copyright (c) 2017 FIRST. All rights reserved.
*
* Redistribution and use in source and binary forms, with or without modification,
* are permitted (subject to the limitations in the disclaimer below) provided that
* the following conditions are met:
*
* Redistributions of source code must retain the above copyright notice, this list
* of conditions and the following disclaimer.
*
* Redistributions in binary form must reproduce the above copyright notice, this
* list of conditions and the following disclaimer in the documentation and/or
* other materials provided with the distribution.
*
* Neither the name of FIRST nor the names of its contributors may be used to endorse or
* promote products derived from this software without specific prior written permission.
*
* NO EXPRESS OR IMPLIED LICENSES TO ANY PARTY'S PATENT RIGHTS ARE GRANTED BY THIS
* LICENSE. THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
* "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO,
* THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE
* ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR CONTRIBUTORS BE LIABLE
* FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL
* DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR
* SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER
* CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY,
* OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
* OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
*/
package org.firstinspires.ftc.robotcontroller.external.samples;
import com.qualcomm.robotcore.eventloop.opmode.Disabled;
import com.qualcomm.robotcore.eventloop.opmode.LinearOpMode;
import com.qualcomm.robotcore.eventloop.opmode.TeleOp;
import com.qualcomm.robotcore.hardware.OpticalDistanceSensor;
/*
* This OpMode shows how to use a Modern Robotics Optical Distance Sensor
* It assumes that the ODS sensor is configured with a name of "sensor_ods".
*
* Use Android Studio to Copy this Class, and Paste it into your team's code folder with a new name.
* Remove or comment out the @Disabled line to add this OpMode to the Driver Station OpMode list
*/
@TeleOp(name = "Sensor: MR ODS", group = "Sensor")
@Disabled
public class SensorMROpticalDistance extends LinearOpMode {
OpticalDistanceSensor odsSensor; // Hardware Device Object
@Override
public void runOpMode() {
    // get a reference to our Optical Distance Sensor object.
odsSensor = hardwareMap.get(OpticalDistanceSensor.class, "sensor_ods");
// wait for the start button to be pressed.
waitForStart();
// while the OpMode is active, loop and read the light levels.
// Note we use opModeIsActive() as our loop condition because it is an interruptible method.
while (opModeIsActive()) {
// send the info back to driver station using telemetry function.
telemetry.addData("Raw", odsSensor.getRawLightDetected());
telemetry.addData("Normal", odsSensor.getLightDetected());
telemetry.update();
}
}
}

/* Copyright (c) 2017 FIRST. All rights reserved.
*
* Redistribution and use in source and binary forms, with or without modification,
* are permitted (subject to the limitations in the disclaimer below) provided that
* the following conditions are met:
*
* Redistributions of source code must retain the above copyright notice, this list
* of conditions and the following disclaimer.
*
* Redistributions in binary form must reproduce the above copyright notice, this
* list of conditions and the following disclaimer in the documentation and/or
* other materials provided with the distribution.
*
* Neither the name of FIRST nor the names of its contributors may be used to endorse or
* promote products derived from this software without specific prior written permission.
*
* NO EXPRESS OR IMPLIED LICENSES TO ANY PARTY'S PATENT RIGHTS ARE GRANTED BY THIS
* LICENSE. THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
* "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO,
* THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE
* ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR CONTRIBUTORS BE LIABLE
* FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL
* DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR
* SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER
* CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY,
* OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
* OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
*/
package org.firstinspires.ftc.robotcontroller.external.samples;
import com.qualcomm.hardware.modernrobotics.ModernRoboticsI2cRangeSensor;
import com.qualcomm.robotcore.eventloop.opmode.Disabled;
import com.qualcomm.robotcore.eventloop.opmode.LinearOpMode;
import com.qualcomm.robotcore.eventloop.opmode.TeleOp;
import org.firstinspires.ftc.robotcore.external.navigation.DistanceUnit;
/*
* This OpMode illustrates how to use the Modern Robotics Range Sensor.
*
* The OpMode assumes that the range sensor is configured with a name of "sensor_range".
*
* Use Android Studio to Copy this Class, and Paste it into your team's code folder with a new name.
* Remove or comment out the @Disabled line to add this OpMode to the Driver Station OpMode list
*
* @see <a href="http://modernroboticsinc.com/range-sensor">MR Range Sensor</a>
*/
@TeleOp(name = "Sensor: MR range sensor", group = "Sensor")
@Disabled // comment out or remove this line to enable this OpMode
public class SensorMRRangeSensor extends LinearOpMode {
ModernRoboticsI2cRangeSensor rangeSensor;
@Override public void runOpMode() {
    // get a reference to our range sensor
rangeSensor = hardwareMap.get(ModernRoboticsI2cRangeSensor.class, "sensor_range");
// wait for the start button to be pressed
waitForStart();
while (opModeIsActive()) {
telemetry.addData("raw ultrasonic", rangeSensor.rawUltrasonic());
telemetry.addData("raw optical", rangeSensor.rawOptical());
telemetry.addData("cm optical", "%.2f cm", rangeSensor.cmOptical());
telemetry.addData("cm", "%.2f cm", rangeSensor.getDistance(DistanceUnit.CM));
telemetry.update();
}
}
}

/*
Copyright (c) 2018 FIRST
All rights reserved.
Redistribution and use in source and binary forms, with or without modification,
are permitted (subject to the limitations in the disclaimer below) provided that
the following conditions are met:
Redistributions of source code must retain the above copyright notice, this list
of conditions and the following disclaimer.
Redistributions in binary form must reproduce the above copyright notice, this
list of conditions and the following disclaimer in the documentation and/or
other materials provided with the distribution.
Neither the name of FIRST nor the names of its contributors may be used to
endorse or promote products derived from this software without specific prior
written permission.
NO EXPRESS OR IMPLIED LICENSES TO ANY PARTY'S PATENT RIGHTS ARE GRANTED BY THIS
LICENSE. THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
"AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO,
THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE
ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR CONTRIBUTORS BE LIABLE
FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL
DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR
SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER
CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR
TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF
THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
*/
package org.firstinspires.ftc.robotcontroller.external.samples;
import com.qualcomm.hardware.rev.Rev2mDistanceSensor;
import com.qualcomm.robotcore.eventloop.opmode.Disabled;
import com.qualcomm.robotcore.eventloop.opmode.LinearOpMode;
import com.qualcomm.robotcore.eventloop.opmode.TeleOp;
import com.qualcomm.robotcore.hardware.DistanceSensor;
import org.firstinspires.ftc.robotcore.external.navigation.DistanceUnit;
/*
* This OpMode illustrates how to use the REV Robotics 2M Distance Sensor.
*
* The OpMode assumes that the sensor is configured with a name of "sensor_distance".
*
* Use Android Studio to Copy this Class, and Paste it into your team's code folder with a new name.
* Remove or comment out the @Disabled line to add this OpMode to the Driver Station OpMode list
*
* See the sensor's product page: https://www.revrobotics.com/rev-31-1505/
*/
@TeleOp(name = "Sensor: REV2mDistance", group = "Sensor")
@Disabled
public class SensorREV2mDistance extends LinearOpMode {
private DistanceSensor sensorDistance;
@Override
public void runOpMode() {
// you can use this as a regular DistanceSensor.
sensorDistance = hardwareMap.get(DistanceSensor.class, "sensor_distance");
// you can also cast this to a Rev2mDistanceSensor if you want to use added
// methods associated with the Rev2mDistanceSensor class.
Rev2mDistanceSensor sensorTimeOfFlight = (Rev2mDistanceSensor) sensorDistance;
telemetry.addData(">>", "Press start to continue");
telemetry.update();
waitForStart();
while(opModeIsActive()) {
// generic DistanceSensor methods.
telemetry.addData("deviceName", sensorDistance.getDeviceName() );
telemetry.addData("range", String.format("%.01f mm", sensorDistance.getDistance(DistanceUnit.MM)));
telemetry.addData("range", String.format("%.01f cm", sensorDistance.getDistance(DistanceUnit.CM)));
telemetry.addData("range", String.format("%.01f m", sensorDistance.getDistance(DistanceUnit.METER)));
telemetry.addData("range", String.format("%.01f in", sensorDistance.getDistance(DistanceUnit.INCH)));
// Rev2mDistanceSensor specific methods.
telemetry.addData("ID", String.format("%x", sensorTimeOfFlight.getModelID()));
telemetry.addData("did time out", Boolean.toString(sensorTimeOfFlight.didTimeoutOccur()));
telemetry.update();
}
}
}

/* Copyright (c) 2017 FIRST. All rights reserved.
*
* Redistribution and use in source and binary forms, with or without modification,
* are permitted (subject to the limitations in the disclaimer below) provided that
* the following conditions are met:
*
* Redistributions of source code must retain the above copyright notice, this list
* of conditions and the following disclaimer.
*
* Redistributions in binary form must reproduce the above copyright notice, this
* list of conditions and the following disclaimer in the documentation and/or
* other materials provided with the distribution.
*
* Neither the name of FIRST nor the names of its contributors may be used to endorse or
* promote products derived from this software without specific prior written permission.
*
* NO EXPRESS OR IMPLIED LICENSES TO ANY PARTY'S PATENT RIGHTS ARE GRANTED BY THIS
* LICENSE. THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
* "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO,
* THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE
* ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR CONTRIBUTORS BE LIABLE
* FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL
* DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR
* SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER
* CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY,
* OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
* OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
*/
package org.firstinspires.ftc.robotcontroller.external.samples;
import com.qualcomm.robotcore.eventloop.opmode.Disabled;
import com.qualcomm.robotcore.eventloop.opmode.LinearOpMode;
import com.qualcomm.robotcore.eventloop.opmode.TeleOp;
import com.qualcomm.robotcore.hardware.TouchSensor;
/*
* This OpMode demonstrates how to use a REV Robotics Touch Sensor, REV Robotics Magnetic Limit Switch, or other device
* that implements the TouchSensor interface. Any touch sensor that connects its output to ground when pressed
* (known as "active low") can be configured as a "REV Touch Sensor". This includes REV's Magnetic Limit Switch.
*
* The OpMode assumes that the touch sensor is configured with a name of "sensor_touch".
*
* A REV Robotics Touch Sensor must be configured on digital port number 1, 3, 5, or 7.
* A Magnetic Limit Switch can be configured on any digital port.
*
* Use Android Studio to Copy this Class, and Paste it into your team's code folder with a new name.
* Remove or comment out the @Disabled line to add this OpMode to the Driver Station OpMode list.
*/
@TeleOp(name = "Sensor: REV touch sensor", group = "Sensor")
@Disabled
public class SensorTouch extends LinearOpMode {
TouchSensor touchSensor; // Touch sensor Object
@Override
public void runOpMode() {
// get a reference to our touchSensor object.
touchSensor = hardwareMap.get(TouchSensor.class, "sensor_touch");
// wait for the start button to be pressed.
waitForStart();
// while the OpMode is active, loop and read whether the sensor is being pressed.
// Note we use opModeIsActive() as our loop condition because it is an interruptible method.
while (opModeIsActive()) {
// send the info back to driver station using telemetry function.
if (touchSensor.isPressed()) {
telemetry.addData("Touch Sensor", "Is Pressed");
} else {
telemetry.addData("Touch Sensor", "Is Not Pressed");
}
telemetry.update();
}
}
}

/*
* Copyright (c) 2023 FIRST
*
* All rights reserved.
*
* Redistribution and use in source and binary forms, with or without modification,
* are permitted (subject to the limitations in the disclaimer below) provided that
* the following conditions are met:
*
* Redistributions of source code must retain the above copyright notice, this list
* of conditions and the following disclaimer.
*
* Redistributions in binary form must reproduce the above copyright notice, this
* list of conditions and the following disclaimer in the documentation and/or
* other materials provided with the distribution.
*
* Neither the name of FIRST nor the names of its contributors may be used to
* endorse or promote products derived from this software without specific prior
* written permission.
*
* NO EXPRESS OR IMPLIED LICENSES TO ANY PARTY'S PATENT RIGHTS ARE GRANTED BY THIS
* LICENSE. THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
* "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO,
* THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE
* ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR CONTRIBUTORS BE LIABLE
* FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL
* DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR
* SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER
* CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR
* TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF
* THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
*/
package org.firstinspires.ftc.robotcontroller.external.samples;
import android.util.Size;
import com.qualcomm.robotcore.eventloop.opmode.Disabled;
import com.qualcomm.robotcore.eventloop.opmode.LinearOpMode;
import com.qualcomm.robotcore.eventloop.opmode.TeleOp;
import org.firstinspires.ftc.robotcore.external.hardware.camera.BuiltinCameraDirection;
import org.firstinspires.ftc.robotcore.external.hardware.camera.WebcamName;
import org.firstinspires.ftc.vision.VisionPortal;
import java.util.Locale;
/*
* This OpMode helps calibrate a webcam or RC phone camera, useful for AprilTag pose estimation
* with the FTC VisionPortal. It captures a camera frame (image) and stores it on the Robot Controller
* (Control Hub or RC phone), with each press of the gamepad button X (or Square).
* Full calibration instructions are here:
*
* https://ftc-docs.firstinspires.org/camera-calibration
*
* In Android Studio, copy this class into your "teamcode" folder with a new name.
* Remove or comment out the @Disabled line to add this OpMode to the Driver Station OpMode list.
*
* In OnBot Java, use "Add File" to add this OpMode from the list of Samples.
*/
@TeleOp(name = "Utility: Camera Frame Capture", group = "Utility")
@Disabled
public class UtilityCameraFrameCapture extends LinearOpMode
{
/*
* EDIT THESE PARAMETERS AS NEEDED
*/
final boolean USING_WEBCAM = false;
final BuiltinCameraDirection INTERNAL_CAM_DIR = BuiltinCameraDirection.BACK;
final int RESOLUTION_WIDTH = 640;
final int RESOLUTION_HEIGHT = 480;
// Internal state
boolean lastX;
int frameCount;
long capReqTime;
@Override
public void runOpMode()
{
VisionPortal portal;
if (USING_WEBCAM)
{
portal = new VisionPortal.Builder()
.setCamera(hardwareMap.get(WebcamName.class, "Webcam 1"))
.setCameraResolution(new Size(RESOLUTION_WIDTH, RESOLUTION_HEIGHT))
.build();
}
else
{
portal = new VisionPortal.Builder()
.setCamera(INTERNAL_CAM_DIR)
.setCameraResolution(new Size(RESOLUTION_WIDTH, RESOLUTION_HEIGHT))
.build();
}
while (!isStopRequested())
{
boolean x = gamepad1.x;
if (x && !lastX)
{
portal.saveNextFrameRaw(String.format(Locale.US, "CameraFrameCapture-%06d", frameCount++));
capReqTime = System.currentTimeMillis();
}
lastX = x;
telemetry.addLine("######## Camera Capture Utility ########");
telemetry.addLine(String.format(Locale.US, " > Resolution: %dx%d", RESOLUTION_WIDTH, RESOLUTION_HEIGHT));
telemetry.addLine(" > Press X (or Square) to capture a frame");
telemetry.addData(" > Camera Status", portal.getCameraState());
if (capReqTime != 0)
{
telemetry.addLine("\nCaptured Frame!");
}
if (capReqTime != 0 && System.currentTimeMillis() - capReqTime > 1000)
{
capReqTime = 0;
}
telemetry.update();
}
}
}

## Caution
No Team-specific code should be placed or modified in this ``.../samples`` folder.
Samples should be copied from here, and then pasted into the team's
[/TeamCode/src/main/java/org/firstinspires/ftc/teamcode](../../../../../../../../../../TeamCode/src/main/java/org/firstinspires/ftc/teamcode)
folder, using the Android Studio copy and paste commands. This automatically changes all file and
class names to be consistent. From there, the sample can be modified to suit the team's needs.
For more detailed instructions see the /teamcode readme.
### Naming of Samples
To gain a better understanding of how the samples are organized, and how to interpret the
naming system, it will help to understand the conventions that were used during their creation.
These conventions are described (in detail) in the sample_conventions.md file in this folder.
To summarize: a range of different sample classes resides in the java/external/samples folder.
The class names will follow a naming convention which indicates the purpose of each class.
The prefix of the name will be one of the following:
Basic: This is a minimally functional OpMode used to illustrate the skeleton/structure
of a particular style of OpMode. These are bare bones examples.
Sensor: This is a Sample OpMode that shows how to use a specific sensor.
It is not intended to drive a functioning robot, it is simply showing the minimal code
required to read and display the sensor values.
Robot: This is a Sample OpMode that assumes a simple two-motor (differential) drive base.
It may be used to provide a common baseline driving OpMode, or
to demonstrate how a particular sensor or concept can be used to navigate.
Concept: This is a sample OpMode that illustrates performing a specific function or concept.
These may be complex, but their operation should be explained clearly in the comments,
or the comments should reference an external doc, guide or tutorial.
Each OpMode should try to only demonstrate a single concept so they are easy to
locate based on their name. These OpModes may not produce a drivable robot.
After the prefix, other conventions will apply:
* Sensor class names are constructed as: Sensor - Company - Type
* Robot class names are constructed as: Robot - Mode - Action - OpModetype
* Concept class names are constructed as: Concept - Topic - OpModetype

## Sample Class/OpMode conventions
#### V 1.1.0 8/9/2017
This document defines the FTC Sample OpMode and Class conventions.
### OpMode Name
To gain a better understanding of how the samples are organized, and how to interpret the
naming system, it will help to understand the conventions that were used during their creation.
To summarize: a range of different sample classes resides in the java/external/samples folder.
The class names will follow a naming convention which indicates the purpose of each class.
The prefix of the name will be one of the following:
Basic: This is a minimally functional OpMode used to illustrate the skeleton/structure
of a particular style of OpMode. These are bare bones examples.
Sensor: This is a Sample OpMode that shows how to use a specific sensor.
It is not intended to drive a functioning robot, it is simply showing the minimal code
required to read and display the sensor values.
Robot: This is a Sample OpMode that assumes a simple two-motor (differential) drive base.
It may be used to provide a common baseline driving OpMode, or
to demonstrate how a particular sensor or concept can be used to navigate.
Concept: This is a sample OpMode that illustrates performing a specific function or concept.
These may be complex, but their operation should be explained clearly in the comments,
or the comments should reference an external doc, guide or tutorial.
Each OpMode should try to only demonstrate a single concept so they are easy to
locate based on their name. These OpModes may not produce a drivable robot.
After the prefix, other conventions will apply:
* Sensor class names should be constructed as:    Sensor - Company - Type
* Robot class names should be constructed as: Robot - Mode - Action - OpModetype
* Concept class names should be constructed as: Concept - Topic - OpModetype
### Sample OpMode Content/Style
Code is formatted as per the Google Style Guide:
https://google.github.io/styleguide/javaguide.html
With “Sensor” and “Hardware” samples, the code should demonstrate the essential function,
and not be embellished with too much additional “clever” code. If a sensor has special
addressing needs, or has a variety of modes or outputs, these should be demonstrated as
simply as possible.
Special programming methods, or robot control techniques should be reserved for “Concept” Samples,
and where possible, Samples should strive to only demonstrate a single concept,
eg: State machine coding, or a User Menu system, and not combine them into a single “all inclusive”
sample. This will prevent an “all inclusive” Sample being deleted just because one part of it
becomes obsolete.
### Device Configuration Names
The following device names are used in the external samples
** Motors:
left_drive
right_drive
left_arm
** Servos:
left_hand
right_hand
arm
claw
** Sensors:
sensor_color
sensor_ir
sensor_light
sensor_ods
sensor_range
sensor_touch
sensor_color_distance
sensor_digital
digin
digout
** Localization:
compass
gyro
imu
navx
### Device Object Names
Device Object names should use the same words as the device's configuration name, but they
should be re-structured to be a suitable Java variable name. This should keep the same word order,
but adopt the style of beginning with a lower case letter, and then each subsequent word
starting with an upper case letter.
E.g., from the configuration names above: sensorColor, leftDrive, sensorTouch.
Note: Sometimes it's helpful to put the device type first, followed by the variant,
e.g. motorLeft and motorRight, but this should only be done if the same word order
is used in the device configuration name.
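The configuration-name-to-variable-name convention above amounts to a simple snake_case to camelCase conversion. A minimal sketch of that mapping (the helper class and method names here are hypothetical, for illustration only; they are not part of the SDK):

```java
/** Hypothetical helper illustrating the naming convention described above. */
public class DeviceNames {
    /**
     * Converts a device configuration name like "sensor_color" into the
     * matching Java member name, "sensorColor": the same words in the same
     * order, with underscores dropped and each subsequent word capitalized.
     */
    public static String toMemberName(String configName) {
        StringBuilder sb = new StringBuilder();
        boolean upperNext = false;
        for (char c : configName.toCharArray()) {
            if (c == '_') {
                upperNext = true;   // next letter starts a new word
                continue;
            }
            sb.append(upperNext ? Character.toUpperCase(c) : c);
            upperNext = false;
        }
        return sb.toString();
    }

    public static void main(String[] args) {
        System.out.println(toMemberName("sensor_color")); // sensorColor
        System.out.println(toMemberName("left_drive"));   // leftDrive
    }
}
```

In practice, the conversion is done by hand when declaring the field, e.g. `colorSensor = hardwareMap.get(ColorSensor.class, "sensor_color");` as shown in the samples in this folder.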
### OpMode code Comments
Sample comments should read like normal code comments, that is, as an explanation of what the
sample code is doing. They should NOT be directives to the user,
like: “insert your joystick code here”, as these comments typically aren't
detailed enough to be useful. They also often get left in the code and become garbage.
Instead, an example of the joystick code should be shown with a comment describing what it is doing.
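The preferred comment style can be sketched as follows (the class and method names here are hypothetical, and the drive math is a generic arcade-drive mix, not a specific SDK API):

```java
/** Hypothetical example of descriptive (rather than directive) comments. */
public class JoystickCommentExample {
    /**
     * Combines the forward stick input and the turn stick input into
     * left/right drive powers (arcade drive), clipping each result to
     * the legal motor power range.
     */
    public static double[] arcadeDrive(double drive, double turn) {
        // A positive turn value swings the robot to the right: the turn
        // component is added to the left wheel and subtracted from the right.
        double left  = drive + turn;
        double right = drive - turn;
        // Motor power must stay within [-1.0, 1.0], so clip both outputs.
        left  = Math.max(-1.0, Math.min(1.0, left));
        right = Math.max(-1.0, Math.min(1.0, right));
        return new double[] { left, right };
    }

    public static void main(String[] args) {
        double[] powers = arcadeDrive(0.8, 0.5);
        System.out.println(powers[0] + " " + powers[1]);
    }
}
```

Each comment states what the adjacent code does and why, which remains accurate even after a user adapts the sample.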
/* Copyright (c) 2014, 2015 Qualcomm Technologies Inc
All rights reserved.
Redistribution and use in source and binary forms, with or without modification,
are permitted (subject to the limitations in the disclaimer below) provided that
the following conditions are met:
Redistributions of source code must retain the above copyright notice, this list
of conditions and the following disclaimer.
Redistributions in binary form must reproduce the above copyright notice, this
list of conditions and the following disclaimer in the documentation and/or
other materials provided with the distribution.
Neither the name of Qualcomm Technologies Inc nor the names of its contributors
may be used to endorse or promote products derived from this software without
specific prior written permission.
NO EXPRESS OR IMPLIED LICENSES TO ANY PARTY'S PATENT RIGHTS ARE GRANTED BY THIS
LICENSE. THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
"AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO,
THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE
ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR CONTRIBUTORS BE LIABLE
FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL
DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR
SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER
CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY,
OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE. */
package org.firstinspires.ftc.robotcontroller.internal;
import com.qualcomm.robotcore.eventloop.opmode.OpModeManager;
import com.qualcomm.robotcore.eventloop.opmode.OpModeRegister;
import org.firstinspires.ftc.robotcontroller.external.samples.ConceptNullOp;
/**
* {@link FtcOpModeRegister} is responsible for registering OpModes for use in an FTC game.
* @see #register(OpModeManager)
*/
public class FtcOpModeRegister implements OpModeRegister {
/**
* {@link #register(OpModeManager)} is called by the SDK game in order to register
* OpMode classes or instances that will participate in an FTC game.
*
* There are two mechanisms by which an OpMode may be registered.
*
* 1) The preferred method is by means of class annotations in the OpMode itself.
* See, for example the class annotations in {@link ConceptNullOp}.
*
* 2) The other, retired, method is to modify this {@link #register(OpModeManager)}
* method to include explicit calls to OpModeManager.register().
* This method of modifying this file directly is discouraged, as it
* makes updates to the SDK harder to integrate into your code.
*
* @param manager the object which contains methods for carrying out OpMode registrations
*
* @see com.qualcomm.robotcore.eventloop.opmode.TeleOp
* @see com.qualcomm.robotcore.eventloop.opmode.Autonomous
*/
public void register(OpModeManager manager) {
/**
* Any manual OpMode class registrations should go here.
*/
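// For example (MyTeleOp is a hypothetical OpMode class; annotation-based
// registration is preferred over manual calls like this):
// manager.register("My TeleOp", MyTeleOp.class);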
}
}
/* Copyright (c) 2014, 2015 Qualcomm Technologies Inc
All rights reserved.
Redistribution and use in source and binary forms, with or without modification,
are permitted (subject to the limitations in the disclaimer below) provided that
the following conditions are met:
Redistributions of source code must retain the above copyright notice, this list
of conditions and the following disclaimer.
Redistributions in binary form must reproduce the above copyright notice, this
list of conditions and the following disclaimer in the documentation and/or
other materials provided with the distribution.
Neither the name of Qualcomm Technologies Inc nor the names of its contributors
may be used to endorse or promote products derived from this software without
specific prior written permission.
NO EXPRESS OR IMPLIED LICENSES TO ANY PARTY'S PATENT RIGHTS ARE GRANTED BY THIS
LICENSE. THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
"AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO,
THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE
ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR CONTRIBUTORS BE LIABLE
FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL
DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR
SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER
CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY,
OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE. */
package org.firstinspires.ftc.robotcontroller.internal;
import android.app.ActionBar;
import android.app.Activity;
import android.app.ActivityManager;
import android.content.ComponentName;
import android.content.Context;
import android.content.Intent;
import android.content.ServiceConnection;
import android.content.SharedPreferences;
import android.content.res.Configuration;
import android.hardware.usb.UsbDevice;
import android.hardware.usb.UsbManager;
import android.net.wifi.WifiManager;
import android.os.Bundle;
import android.os.IBinder;
import android.preference.PreferenceManager;
import androidx.annotation.NonNull;
import androidx.annotation.Nullable;
import androidx.annotation.StringRes;
import android.view.Menu;
import android.view.MenuItem;
import android.view.MotionEvent;
import android.view.View;
import android.view.WindowManager;
import android.webkit.WebView;
import android.widget.ImageButton;
import android.widget.LinearLayout;
import android.widget.LinearLayout.LayoutParams;
import android.widget.PopupMenu;
import android.widget.TextView;
import com.google.blocks.ftcrobotcontroller.ProgrammingWebHandlers;
import com.google.blocks.ftcrobotcontroller.runtime.BlocksOpMode;
import com.qualcomm.ftccommon.ClassManagerFactory;
import com.qualcomm.ftccommon.FtcAboutActivity;
import com.qualcomm.ftccommon.FtcEventLoop;
import com.qualcomm.ftccommon.FtcEventLoopIdle;
import com.qualcomm.ftccommon.FtcRobotControllerService;
import com.qualcomm.ftccommon.FtcRobotControllerService.FtcRobotControllerBinder;
import com.qualcomm.ftccommon.FtcRobotControllerSettingsActivity;
import com.qualcomm.ftccommon.LaunchActivityConstantsList;
import com.qualcomm.ftccommon.LaunchActivityConstantsList.RequestCode;
import com.qualcomm.ftccommon.Restarter;
import com.qualcomm.ftccommon.UpdateUI;
import com.qualcomm.ftccommon.configuration.EditParameters;
import com.qualcomm.ftccommon.configuration.FtcLoadFileActivity;
import com.qualcomm.ftccommon.configuration.RobotConfigFile;
import com.qualcomm.ftccommon.configuration.RobotConfigFileManager;
import com.qualcomm.ftcrobotcontroller.BuildConfig;
import com.qualcomm.ftcrobotcontroller.R;
import com.qualcomm.hardware.HardwareFactory;
import com.qualcomm.robotcore.eventloop.EventLoopManager;
import com.qualcomm.robotcore.eventloop.opmode.FtcRobotControllerServiceState;
import com.qualcomm.robotcore.eventloop.opmode.OpModeRegister;
import com.qualcomm.robotcore.hardware.configuration.LynxConstants;
import com.qualcomm.robotcore.hardware.configuration.Utility;
import com.qualcomm.robotcore.robot.Robot;
import com.qualcomm.robotcore.robot.RobotState;
import com.qualcomm.robotcore.util.ClockWarningSource;
import com.qualcomm.robotcore.util.Device;
import com.qualcomm.robotcore.util.Dimmer;
import com.qualcomm.robotcore.util.ImmersiveMode;
import com.qualcomm.robotcore.util.RobotLog;
import com.qualcomm.robotcore.util.WebServer;
import com.qualcomm.robotcore.wifi.NetworkConnection;
import com.qualcomm.robotcore.wifi.NetworkConnectionFactory;
import com.qualcomm.robotcore.wifi.NetworkType;
import org.firstinspires.ftc.ftccommon.external.SoundPlayingRobotMonitor;
import org.firstinspires.ftc.ftccommon.internal.AnnotatedHooksClassFilter;
import org.firstinspires.ftc.ftccommon.internal.FtcRobotControllerWatchdogService;
import org.firstinspires.ftc.ftccommon.internal.ProgramAndManageActivity;
import org.firstinspires.ftc.onbotjava.ExternalLibraries;
import org.firstinspires.ftc.onbotjava.OnBotJavaHelperImpl;
import org.firstinspires.ftc.onbotjava.OnBotJavaProgrammingMode;
import org.firstinspires.ftc.robotcore.external.navigation.MotionDetection;
import org.firstinspires.ftc.robotcore.internal.hardware.android.AndroidBoard;
import org.firstinspires.ftc.robotcore.internal.network.DeviceNameManagerFactory;
import org.firstinspires.ftc.robotcore.internal.network.PreferenceRemoterRC;
import org.firstinspires.ftc.robotcore.internal.network.StartResult;
import org.firstinspires.ftc.robotcore.internal.network.WifiDirectChannelChanger;
import org.firstinspires.ftc.robotcore.internal.network.WifiMuteEvent;
import org.firstinspires.ftc.robotcore.internal.network.WifiMuteStateMachine;
import org.firstinspires.ftc.robotcore.internal.opmode.ClassManager;
import org.firstinspires.ftc.robotcore.internal.opmode.OnBotJavaHelper;
import org.firstinspires.ftc.robotcore.internal.system.AppAliveNotifier;
import org.firstinspires.ftc.robotcore.internal.system.AppUtil;
import org.firstinspires.ftc.robotcore.internal.system.Assert;
import org.firstinspires.ftc.robotcore.internal.system.PreferencesHelper;
import org.firstinspires.ftc.robotcore.internal.system.ServiceController;
import org.firstinspires.ftc.robotcore.internal.ui.ThemedActivity;
import org.firstinspires.ftc.robotcore.internal.ui.UILocation;
import org.firstinspires.ftc.robotcore.internal.webserver.RobotControllerWebInfo;
import org.firstinspires.ftc.robotserver.internal.programmingmode.ProgrammingModeManager;
import org.firstinspires.inspection.RcInspectionActivity;
import org.threeten.bp.YearMonth;
import org.xmlpull.v1.XmlPullParserException;
import java.io.FileNotFoundException;
import java.util.List;
import java.util.Queue;
import java.util.concurrent.ConcurrentLinkedQueue;
@SuppressWarnings("WeakerAccess")
public class FtcRobotControllerActivity extends Activity
{
public static final String TAG = "RCActivity";
public String getTag() { return TAG; }
private static final int REQUEST_CONFIG_WIFI_CHANNEL = 1;
private static final int NUM_GAMEPADS = 2;
protected WifiManager.WifiLock wifiLock;
protected RobotConfigFileManager cfgFileMgr;
private OnBotJavaHelper onBotJavaHelper;
protected ProgrammingModeManager programmingModeManager;
protected UpdateUI.Callback callback;
protected Context context;
protected Utility utility;
protected StartResult prefRemoterStartResult = new StartResult();
protected StartResult deviceNameStartResult = new StartResult();
protected PreferencesHelper preferencesHelper;
protected final SharedPreferencesListener sharedPreferencesListener = new SharedPreferencesListener();
protected ImageButton buttonMenu;
protected TextView textDeviceName;
protected TextView textNetworkConnectionStatus;
protected TextView textRobotStatus;
protected TextView[] textGamepad = new TextView[NUM_GAMEPADS];
protected TextView textOpMode;
protected TextView textErrorMessage;
protected ImmersiveMode immersion;
protected UpdateUI updateUI;
protected Dimmer dimmer;
protected LinearLayout entireScreenLayout;
protected FtcRobotControllerService controllerService;
protected NetworkType networkType;
protected FtcEventLoop eventLoop;
protected Queue<UsbDevice> receivedUsbAttachmentNotifications;
protected WifiMuteStateMachine wifiMuteStateMachine;
protected MotionDetection motionDetection;
private static boolean permissionsValidated = false;
private WifiDirectChannelChanger wifiDirectChannelChanger;
protected class RobotRestarter implements Restarter {
public void requestRestart() {
requestRobotRestart();
}
}
protected boolean serviceShouldUnbind = false;
protected ServiceConnection connection = new ServiceConnection() {
@Override
public void onServiceConnected(ComponentName name, IBinder service) {
FtcRobotControllerBinder binder = (FtcRobotControllerBinder) service;
onServiceBind(binder.getService());
}
@Override
public void onServiceDisconnected(ComponentName name) {
RobotLog.vv(FtcRobotControllerService.TAG, "%s.controllerService=null", TAG);
controllerService = null;
}
};
@Override
protected void onNewIntent(Intent intent) {
super.onNewIntent(intent);
if (UsbManager.ACTION_USB_DEVICE_ATTACHED.equals(intent.getAction())) {
UsbDevice usbDevice = intent.getParcelableExtra(UsbManager.EXTRA_DEVICE);
if (usbDevice != null) { // paranoia
RobotLog.vv(TAG, "ACTION_USB_DEVICE_ATTACHED: %s", usbDevice.getDeviceName());
// We might get attachment notifications before the event loop is set up, so
// we hold on to them and pass them along only when we're good and ready.
if (receivedUsbAttachmentNotifications != null) { // *total* paranoia
receivedUsbAttachmentNotifications.add(usbDevice);
passReceivedUsbAttachmentsToEventLoop();
}
}
}
}
protected void passReceivedUsbAttachmentsToEventLoop() {
if (this.eventLoop != null) {
for (;;) {
UsbDevice usbDevice = receivedUsbAttachmentNotifications.poll();
if (usbDevice == null)
break;
this.eventLoop.onUsbDeviceAttached(usbDevice);
}
}
else {
// Paranoia: we don't want the pending list to grow without bound when we don't
// (yet) have an event loop
while (receivedUsbAttachmentNotifications.size() > 100) {
receivedUsbAttachmentNotifications.poll();
}
}
}
/**
* There are cases where a permission may be revoked and the system restart will restart the
* FtcRobotControllerActivity, instead of the launch activity. Detect when that happens, and throw
* the device back to the permission validator activity.
*/
protected boolean enforcePermissionValidator() {
if (!permissionsValidated) {
RobotLog.vv(TAG, "Redirecting to permission validator");
Intent permissionValidatorIntent = new Intent(AppUtil.getDefContext(), PermissionValidatorWrapper.class);
startActivity(permissionValidatorIntent);
finish();
return true;
} else {
RobotLog.vv(TAG, "Permissions validated already");
return false;
}
}
public static void setPermissionsValidated() {
permissionsValidated = true;
}
@Override
protected void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
if (enforcePermissionValidator()) {
return;
}
RobotLog.onApplicationStart(); // robustify against onCreate() following onDestroy() but using the same app instance, which apparently does happen
RobotLog.vv(TAG, "onCreate()");
ThemedActivity.appAppThemeToActivity(getTag(), this); // do this way instead of inherit to help AppInventor
// Oddly, sometimes after a crash & restart the root activity will be something unexpected, like one from before the crash; we don't yet understand why
RobotLog.vv(TAG, "rootActivity is of class %s", AppUtil.getInstance().getRootActivity().getClass().getSimpleName());
RobotLog.vv(TAG, "launchActivity is of class %s", FtcRobotControllerWatchdogService.launchActivity());
Assert.assertTrue(FtcRobotControllerWatchdogService.isLaunchActivity(AppUtil.getInstance().getRootActivity()));
Assert.assertTrue(AppUtil.getInstance().isRobotController());
// Quick check: should we pretend we're not here, and so allow the Lynx to operate as
// a stand-alone USB-connected module?
if (LynxConstants.isRevControlHub()) {
// Double-sure check that we can talk to the DB over the serial TTY
AndroidBoard.getInstance().getAndroidBoardIsPresentPin().setState(true);
}
context = this;
utility = new Utility(this);
DeviceNameManagerFactory.getInstance().start(deviceNameStartResult);
PreferenceRemoterRC.getInstance().start(prefRemoterStartResult);
receivedUsbAttachmentNotifications = new ConcurrentLinkedQueue<UsbDevice>();
eventLoop = null;
setContentView(R.layout.activity_ftc_controller);
preferencesHelper = new PreferencesHelper(TAG, context);
preferencesHelper.writeBooleanPrefIfDifferent(context.getString(R.string.pref_rc_connected), true);
preferencesHelper.getSharedPreferences().registerOnSharedPreferenceChangeListener(sharedPreferencesListener);
// Check if this RC app is from a later FTC season than what was installed previously
int ftcSeasonYearOfPreviouslyInstalledRc = preferencesHelper.readInt(getString(R.string.pref_ftc_season_year_of_current_rc), 0);
int ftcSeasonYearOfCurrentlyInstalledRc = AppUtil.getInstance().getFtcSeasonYear(AppUtil.getInstance().getLocalSdkBuildMonth()).getValue();
if (ftcSeasonYearOfCurrentlyInstalledRc > ftcSeasonYearOfPreviouslyInstalledRc) {
preferencesHelper.writeIntPrefIfDifferent(getString(R.string.pref_ftc_season_year_of_current_rc), ftcSeasonYearOfCurrentlyInstalledRc);
// Since it's a new FTC season, we should reset certain settings back to their default values.
preferencesHelper.writeBooleanPrefIfDifferent(getString(R.string.pref_warn_about_2_4_ghz_band), true);
preferencesHelper.writeBooleanPrefIfDifferent(getString(R.string.pref_warn_about_obsolete_software), true);
preferencesHelper.writeBooleanPrefIfDifferent(getString(R.string.pref_warn_about_mismatched_app_versions), true);
preferencesHelper.writeBooleanPrefIfDifferent(getString(R.string.pref_warn_about_incorrect_clocks), true);
}
entireScreenLayout = (LinearLayout) findViewById(R.id.entire_screen);
buttonMenu = (ImageButton) findViewById(R.id.menu_buttons);
buttonMenu.setOnClickListener(new View.OnClickListener() {
@Override
public void onClick(View v) {
PopupMenu popupMenu = new PopupMenu(FtcRobotControllerActivity.this, v);
popupMenu.setOnMenuItemClickListener(new PopupMenu.OnMenuItemClickListener() {
@Override
public boolean onMenuItemClick(MenuItem item) {
return onOptionsItemSelected(item); // Delegate to the handler for the hardware menu button
}
});
popupMenu.inflate(R.menu.ftc_robot_controller);
AnnotatedHooksClassFilter.getInstance().callOnCreateMenuMethods(
FtcRobotControllerActivity.this, popupMenu.getMenu());
popupMenu.show();
}
});
updateMonitorLayout(getResources().getConfiguration());
BlocksOpMode.setActivityAndWebView(this, (WebView) findViewById(R.id.webViewBlocksRuntime));
ExternalLibraries.getInstance().onCreate();
onBotJavaHelper = new OnBotJavaHelperImpl();
/*
* Paranoia as the ClassManagerFactory requires EXTERNAL_STORAGE permissions
* and we've seen on the DS where the finish() call above does not short-circuit
* the onCreate() call for the activity and then we crash here because we don't
* have permissions. So...
*/
if (permissionsValidated) {
ClassManager.getInstance().setOnBotJavaClassHelper(onBotJavaHelper);
ClassManagerFactory.registerFilters();
ClassManagerFactory.processAllClasses();
}
cfgFileMgr = new RobotConfigFileManager(this);
// Clean up 'dirty' status after a possible crash
RobotConfigFile configFile = cfgFileMgr.getActiveConfig();
if (configFile.isDirty()) {
configFile.markClean();
cfgFileMgr.setActiveConfig(false, configFile);
}
textDeviceName = (TextView) findViewById(R.id.textDeviceName);
textNetworkConnectionStatus = (TextView) findViewById(R.id.textNetworkConnectionStatus);
textRobotStatus = (TextView) findViewById(R.id.textRobotStatus);
textOpMode = (TextView) findViewById(R.id.textOpMode);
textErrorMessage = (TextView) findViewById(R.id.textErrorMessage);
textGamepad[0] = (TextView) findViewById(R.id.textGamepad1);
textGamepad[1] = (TextView) findViewById(R.id.textGamepad2);
immersion = new ImmersiveMode(getWindow().getDecorView());
dimmer = new Dimmer(this);
dimmer.longBright();
programmingModeManager = new ProgrammingModeManager();
programmingModeManager.register(new ProgrammingWebHandlers());
programmingModeManager.register(new OnBotJavaProgrammingMode());
updateUI = createUpdateUI();
callback = createUICallback(updateUI);
PreferenceManager.setDefaultValues(this, R.xml.app_settings, false);
WifiManager wifiManager = (WifiManager) getApplicationContext().getSystemService(Context.WIFI_SERVICE);
wifiLock = wifiManager.createWifiLock(WifiManager.WIFI_MODE_FULL_HIGH_PERF, "");
hittingMenuButtonBrightensScreen();
wifiLock.acquire();
callback.networkConnectionUpdate(NetworkConnection.NetworkEvent.DISCONNECTED);
readNetworkType();
ServiceController.startService(FtcRobotControllerWatchdogService.class);
bindToService();
RobotLog.logAppInfo();
RobotLog.logDeviceInfo();
AndroidBoard.getInstance().logAndroidBoardInfo();
if (preferencesHelper.readBoolean(getString(R.string.pref_wifi_automute), false)) {
initWifiMute(true);
}
FtcAboutActivity.setBuildTimeFromBuildConfig(BuildConfig.APP_BUILD_TIME);
// Check to see if there is a preferred Wi-Fi channel to use.
checkPreferredChannel();
AnnotatedHooksClassFilter.getInstance().callOnCreateMethods(this);
}
protected UpdateUI createUpdateUI() {
Restarter restarter = new RobotRestarter();
UpdateUI result = new UpdateUI(this, dimmer);
result.setRestarter(restarter);
result.setTextViews(textNetworkConnectionStatus, textRobotStatus, textGamepad, textOpMode, textErrorMessage, textDeviceName);
return result;
}
protected UpdateUI.Callback createUICallback(UpdateUI updateUI) {
UpdateUI.Callback result = updateUI.new Callback();
result.setStateMonitor(new SoundPlayingRobotMonitor());
return result;
}
@Override
protected void onStart() {
super.onStart();
RobotLog.vv(TAG, "onStart()");
entireScreenLayout.setOnTouchListener(new View.OnTouchListener() {
@Override
public boolean onTouch(View v, MotionEvent event) {
dimmer.handleDimTimer();
return false;
}
});
}
@Override
protected void onResume() {
super.onResume();
RobotLog.vv(TAG, "onResume()");
// In case the user just got back from fixing their clock, refresh ClockWarningSource
ClockWarningSource.getInstance().onPossibleRcClockUpdate();
}
@Override
protected void onPause() {
super.onPause();
RobotLog.vv(TAG, "onPause()");
}
@Override
protected void onStop() {
// Note: this gets called even when the configuration editor is launched. That is, it gets
// called surprisingly often. So, we don't actually do much here.
super.onStop();
RobotLog.vv(TAG, "onStop()");
}
@Override
protected void onDestroy() {
super.onDestroy();
RobotLog.vv(TAG, "onDestroy()");
shutdownRobot(); // Ensure the robot is put away to bed
if (callback != null) callback.close();
PreferenceRemoterRC.getInstance().stop(prefRemoterStartResult);
DeviceNameManagerFactory.getInstance().stop(deviceNameStartResult);
unbindFromService();
// If the app is stopped manually, we no longer need the watchdog's auto-restart behavior
ServiceController.stopService(FtcRobotControllerWatchdogService.class);
if (wifiLock != null) wifiLock.release();
if (preferencesHelper != null) preferencesHelper.getSharedPreferences().unregisterOnSharedPreferenceChangeListener(sharedPreferencesListener);
RobotLog.cancelWriteLogcatToDisk();
AnnotatedHooksClassFilter.getInstance().callOnDestroyMethods(this);
}
protected void bindToService() {
readNetworkType();
Intent intent = new Intent(this, FtcRobotControllerService.class);
intent.putExtra(NetworkConnectionFactory.NETWORK_CONNECTION_TYPE, networkType);
serviceShouldUnbind = bindService(intent, connection, Context.BIND_AUTO_CREATE);
}
protected void unbindFromService() {
if (serviceShouldUnbind) {
unbindService(connection);
serviceShouldUnbind = false;
}
}
protected void readNetworkType() {
// Control hubs are always running the access point model. Everything else, for the time
// being always runs the Wi-Fi Direct model.
if (Device.isRevControlHub()) {
networkType = NetworkType.RCWIRELESSAP;
} else {
networkType = NetworkType.fromString(preferencesHelper.readString(context.getString(R.string.pref_pairing_kind), NetworkType.globalDefaultAsString()));
}
// update the app_settings
preferencesHelper.writeStringPrefIfDifferent(context.getString(R.string.pref_pairing_kind), networkType.toString());
}
@Override
public void onWindowFocusChanged(boolean hasFocus) {
super.onWindowFocusChanged(hasFocus);
if (hasFocus) {
immersion.hideSystemUI();
getWindow().setFlags(WindowManager.LayoutParams.FLAG_TRANSLUCENT_NAVIGATION, WindowManager.LayoutParams.FLAG_TRANSLUCENT_NAVIGATION);
}
}
@Override
public boolean onCreateOptionsMenu(Menu menu) {
getMenuInflater().inflate(R.menu.ftc_robot_controller, menu);
AnnotatedHooksClassFilter.getInstance().callOnCreateMenuMethods(this, menu);
return true;
}
private boolean isRobotRunning() {
if (controllerService == null) {
return false;
}
Robot robot = controllerService.getRobot();
if ((robot == null) || (robot.eventLoopManager == null)) {
return false;
}
RobotState robotState = robot.eventLoopManager.state;
return robotState == RobotState.RUNNING;
}
@Override
public boolean onOptionsItemSelected(MenuItem item) {
int id = item.getItemId();
if (id == R.id.action_program_and_manage) {
if (isRobotRunning()) {
Intent programmingModeIntent = new Intent(AppUtil.getDefContext(), ProgramAndManageActivity.class);
RobotControllerWebInfo webInfo = programmingModeManager.getWebServer().getConnectionInformation();
programmingModeIntent.putExtra(LaunchActivityConstantsList.RC_WEB_INFO, webInfo.toJson());
startActivity(programmingModeIntent);
} else {
AppUtil.getInstance().showToast(UILocation.ONLY_LOCAL, context.getString(R.string.toastWifiUpBeforeProgrammingMode));
}
} else if (id == R.id.action_inspection_mode) {
Intent inspectionModeIntent = new Intent(AppUtil.getDefContext(), RcInspectionActivity.class);
startActivity(inspectionModeIntent);
return true;
} else if (id == R.id.action_restart_robot) {
dimmer.handleDimTimer();
AppUtil.getInstance().showToast(UILocation.BOTH, context.getString(R.string.toastRestartingRobot));
requestRobotRestart();
return true;
}
else if (id == R.id.action_configure_robot) {
EditParameters parameters = new EditParameters();
Intent intentConfigure = new Intent(AppUtil.getDefContext(), FtcLoadFileActivity.class);
parameters.putIntent(intentConfigure);
startActivityForResult(intentConfigure, RequestCode.CONFIGURE_ROBOT_CONTROLLER.ordinal());
}
else if (id == R.id.action_settings) {
// historical: this once erroneously used FTC_CONFIGURE_REQUEST_CODE_ROBOT_CONTROLLER
Intent settingsIntent = new Intent(AppUtil.getDefContext(), FtcRobotControllerSettingsActivity.class);
startActivityForResult(settingsIntent, RequestCode.SETTINGS_ROBOT_CONTROLLER.ordinal());
return true;
}
else if (id == R.id.action_about) {
Intent intent = new Intent(AppUtil.getDefContext(), FtcAboutActivity.class);
startActivity(intent);
return true;
}
else if (id == R.id.action_exit_app) {
//Clear backstack and everything to prevent edge case where VM might be
//restarted (after it was exited) if more than one activity was on the
//backstack for some reason.
finishAffinity();
//For lollipop and up, we can clear ourselves from the recents list too
if (android.os.Build.VERSION.SDK_INT >= android.os.Build.VERSION_CODES.LOLLIPOP) {
ActivityManager manager = (ActivityManager) getSystemService(ACTIVITY_SERVICE);
List<ActivityManager.AppTask> tasks = manager.getAppTasks();
for (ActivityManager.AppTask task : tasks) {
task.finishAndRemoveTask();
}
}
// Allow the user to use the Control Hub operating system's UI, instead of relaunching the app
AppAliveNotifier.getInstance().disableAppWatchdogUntilNextAppStart();
//Finally, nuke the VM from orbit
AppUtil.getInstance().exitApplication();
return true;
}
return super.onOptionsItemSelected(item);
}
@Override
public void onConfigurationChanged(Configuration newConfig) {
super.onConfigurationChanged(newConfig);
// don't destroy assets on screen rotation
updateMonitorLayout(newConfig);
}
/**
* Updates the orientation of monitorContainer (which contains cameraMonitorView)
* based on the given configuration. Makes the children split the space.
*/
private void updateMonitorLayout(Configuration configuration) {
LinearLayout monitorContainer = (LinearLayout) findViewById(R.id.monitorContainer);
if (configuration.orientation == Configuration.ORIENTATION_LANDSCAPE) {
// When the phone is landscape, lay out the monitor views horizontally.
monitorContainer.setOrientation(LinearLayout.HORIZONTAL);
for (int i = 0; i < monitorContainer.getChildCount(); i++) {
View view = monitorContainer.getChildAt(i);
view.setLayoutParams(new LayoutParams(0, LayoutParams.MATCH_PARENT, 1 /* weight */));
}
} else {
// When the phone is portrait, lay out the monitor views vertically.
monitorContainer.setOrientation(LinearLayout.VERTICAL);
for (int i = 0; i < monitorContainer.getChildCount(); i++) {
View view = monitorContainer.getChildAt(i);
view.setLayoutParams(new LayoutParams(LayoutParams.MATCH_PARENT, 0, 1 /* weight */));
}
}
monitorContainer.requestLayout();
}
@Override
protected void onActivityResult(int request, int result, Intent intent) {
if (request == REQUEST_CONFIG_WIFI_CHANNEL) {
if (result == RESULT_OK) {
AppUtil.getInstance().showToast(UILocation.BOTH, context.getString(R.string.toastWifiConfigurationComplete));
}
}
// was some historical confusion about launch codes here, so we err safely
if (request == RequestCode.CONFIGURE_ROBOT_CONTROLLER.ordinal() || request == RequestCode.SETTINGS_ROBOT_CONTROLLER.ordinal()) {
// We always do a refresh, whether it was a cancel or an OK, for robustness
shutdownRobot();
cfgFileMgr.getActiveConfigAndUpdateUI();
updateUIAndRequestRobotSetup();
}
}
public void onServiceBind(final FtcRobotControllerService service) {
RobotLog.vv(FtcRobotControllerService.TAG, "%s.controllerService=bound", TAG);
controllerService = service;
updateUI.setControllerService(controllerService);
controllerService.setOnBotJavaHelper(onBotJavaHelper);
updateUIAndRequestRobotSetup();
programmingModeManager.setState(new FtcRobotControllerServiceState() {
@NonNull
@Override
public WebServer getWebServer() {
return service.getWebServer();
}
@Nullable
@Override
public OnBotJavaHelper getOnBotJavaHelper() {
return service.getOnBotJavaHelper();
}
@Override
public EventLoopManager getEventLoopManager() {
return service.getRobot().eventLoopManager;
}
});
AnnotatedHooksClassFilter.getInstance().callWebHandlerRegistrarMethods(this,
service.getWebServer().getWebHandlerManager());
}
private void updateUIAndRequestRobotSetup() {
if (controllerService != null) {
callback.networkConnectionUpdate(controllerService.getNetworkConnectionStatus());
callback.updateRobotStatus(controllerService.getRobotStatus());
// Only show this first-time toast on headless systems: what we have now on non-headless suffices
requestRobotSetup(LynxConstants.isRevControlHub()
? new Runnable() {
@Override public void run() {
showRestartRobotCompleteToast(R.string.toastRobotSetupComplete);
}
}
: null);
}
}
private void requestRobotSetup(@Nullable Runnable runOnComplete) {
if (controllerService == null) return;
RobotConfigFile file = cfgFileMgr.getActiveConfigAndUpdateUI();
HardwareFactory hardwareFactory = new HardwareFactory(context);
try {
hardwareFactory.setXmlPullParser(file.getXml());
} catch (FileNotFoundException | XmlPullParserException e) {
RobotLog.ww(TAG, e, "Unable to set configuration file %s. Falling back on noConfig.", file.getName());
file = RobotConfigFile.noConfig(cfgFileMgr);
try {
hardwareFactory.setXmlPullParser(file.getXml());
cfgFileMgr.setActiveConfigAndUpdateUI(false, file);
} catch (FileNotFoundException | XmlPullParserException e1) {
RobotLog.ee(TAG, e1, "Failed to fall back on noConfig");
}
}
OpModeRegister userOpModeRegister = createOpModeRegister();
eventLoop = new FtcEventLoop(hardwareFactory, userOpModeRegister, callback, this);
FtcEventLoopIdle idleLoop = new FtcEventLoopIdle(hardwareFactory, userOpModeRegister, callback, this);
controllerService.setCallback(callback);
controllerService.setupRobot(eventLoop, idleLoop, runOnComplete);
passReceivedUsbAttachmentsToEventLoop();
AndroidBoard.showErrorIfUnknownControlHub();
AnnotatedHooksClassFilter.getInstance().callOnCreateEventLoopMethods(this, eventLoop);
}
protected OpModeRegister createOpModeRegister() {
return new FtcOpModeRegister();
}
private void shutdownRobot() {
if (controllerService != null) controllerService.shutdownRobot();
}
private void requestRobotRestart() {
AppUtil.getInstance().showToast(UILocation.BOTH, AppUtil.getDefContext().getString(R.string.toastRestartingRobot));
//
RobotLog.clearGlobalErrorMsg();
RobotLog.clearGlobalWarningMsg();
shutdownRobot();
requestRobotSetup(new Runnable() {
@Override public void run() {
showRestartRobotCompleteToast(R.string.toastRestartRobotComplete);
}
});
}
private void showRestartRobotCompleteToast(@StringRes int resid) {
AppUtil.getInstance().showToast(UILocation.BOTH, AppUtil.getDefContext().getString(resid));
}
private void checkPreferredChannel() {
// For P2P network, check to see what preferred channel is.
if (networkType == NetworkType.WIFIDIRECT) {
int prefChannel = preferencesHelper.readInt(getString(com.qualcomm.ftccommon.R.string.pref_wifip2p_channel), -1);
if (prefChannel == -1) {
prefChannel = 0;
RobotLog.vv(TAG, "pref_wifip2p_channel: No preferred channel defined. Will use a default value of %d", prefChannel);
} else {
RobotLog.vv(TAG, "pref_wifip2p_channel: Found existing preferred channel (%d).", prefChannel);
}
// attempt to set the preferred channel.
RobotLog.vv(TAG, "pref_wifip2p_channel: attempting to set preferred channel...");
wifiDirectChannelChanger = new WifiDirectChannelChanger();
wifiDirectChannelChanger.changeToChannel(prefChannel);
}
}
protected void hittingMenuButtonBrightensScreen() {
ActionBar actionBar = getActionBar();
if (actionBar != null) {
actionBar.addOnMenuVisibilityListener(new ActionBar.OnMenuVisibilityListener() {
@Override
public void onMenuVisibilityChanged(boolean isVisible) {
if (isVisible) {
dimmer.handleDimTimer();
}
}
});
}
}
protected class SharedPreferencesListener implements SharedPreferences.OnSharedPreferenceChangeListener {
@Override public void onSharedPreferenceChanged(SharedPreferences sharedPreferences, String key) {
if (key.equals(context.getString(R.string.pref_app_theme))) {
ThemedActivity.restartForAppThemeChange(getTag(), getString(R.string.appThemeChangeRestartNotifyRC));
} else if (key.equals(context.getString(R.string.pref_wifi_automute))) {
if (preferencesHelper.readBoolean(context.getString(R.string.pref_wifi_automute), false)) {
initWifiMute(true);
} else {
initWifiMute(false);
}
}
}
}
protected void initWifiMute(boolean enable) {
if (enable) {
wifiMuteStateMachine = new WifiMuteStateMachine();
wifiMuteStateMachine.initialize();
wifiMuteStateMachine.start();
motionDetection = new MotionDetection(2.0, 10);
motionDetection.startListening();
motionDetection.registerListener(new MotionDetection.MotionDetectionListener() {
@Override
public void onMotionDetected(double vector)
{
wifiMuteStateMachine.consumeEvent(WifiMuteEvent.USER_ACTIVITY);
}
});
} else {
wifiMuteStateMachine.stop();
wifiMuteStateMachine = null;
motionDetection.stopListening();
motionDetection.purgeListeners();
motionDetection = null;
}
}
@Override
public void onUserInteraction() {
if (wifiMuteStateMachine != null) {
wifiMuteStateMachine.consumeEvent(WifiMuteEvent.USER_ACTIVITY);
}
}
}


@@ -0,0 +1,91 @@
/*
* Copyright (c) 2018 Craig MacFarlane
*
* All rights reserved.
*
* Redistribution and use in source and binary forms, with or without modification, are permitted
* (subject to the limitations in the disclaimer below) provided that the following conditions are
* met:
*
* Redistributions of source code must retain the above copyright notice, this list of conditions
* and the following disclaimer.
*
* Redistributions in binary form must reproduce the above copyright notice, this list of conditions
* and the following disclaimer in the documentation and/or other materials provided with the
* distribution.
*
* Neither the name of Craig MacFarlane nor the names of its contributors may be used to
* endorse or promote products derived from this software without specific prior written permission.
*
* NO EXPRESS OR IMPLIED LICENSES TO ANY PARTY'S PATENT RIGHTS ARE GRANTED BY THIS LICENSE. THIS
* SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS" AND ANY EXPRESS OR IMPLIED
* WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS
* FOR A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR CONTRIBUTORS BE
* LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES
* (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA,
* OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN
* CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF
* THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
*/
package org.firstinspires.ftc.robotcontroller.internal;
import android.Manifest;
import android.os.Bundle;
import com.qualcomm.ftcrobotcontroller.R;
import org.firstinspires.ftc.robotcore.internal.system.Misc;
import org.firstinspires.ftc.robotcore.internal.system.PermissionValidatorActivity;
import java.util.ArrayList;
import java.util.List;
public class PermissionValidatorWrapper extends PermissionValidatorActivity {
private final String TAG = "PermissionValidatorWrapper";
/*
* The list of dangerous permissions the robot controller needs.
*/
protected List<String> robotControllerPermissions = new ArrayList<String>() {{
add(Manifest.permission.WRITE_EXTERNAL_STORAGE);
add(Manifest.permission.READ_EXTERNAL_STORAGE);
add(Manifest.permission.CAMERA);
add(Manifest.permission.ACCESS_COARSE_LOCATION);
add(Manifest.permission.ACCESS_FINE_LOCATION);
add(Manifest.permission.READ_PHONE_STATE);
}};
private final static Class startApplication = FtcRobotControllerActivity.class;
public String mapPermissionToExplanation(final String permission) {
if (permission.equals(Manifest.permission.WRITE_EXTERNAL_STORAGE)) {
return Misc.formatForUser(R.string.permRcWriteExternalStorageExplain);
} else if (permission.equals(Manifest.permission.READ_EXTERNAL_STORAGE)) {
return Misc.formatForUser(R.string.permRcReadExternalStorageExplain);
} else if (permission.equals(Manifest.permission.CAMERA)) {
return Misc.formatForUser(R.string.permRcCameraExplain);
} else if (permission.equals(Manifest.permission.ACCESS_COARSE_LOCATION)) {
return Misc.formatForUser(R.string.permAccessLocationExplain);
} else if (permission.equals(Manifest.permission.ACCESS_FINE_LOCATION)) {
return Misc.formatForUser(R.string.permAccessLocationExplain);
} else if (permission.equals(Manifest.permission.READ_PHONE_STATE)) {
return Misc.formatForUser(R.string.permReadPhoneState);
}
return Misc.formatForUser(R.string.permGenericExplain);
}
@Override
protected void onCreate(Bundle savedInstanceState)
{
super.onCreate(savedInstanceState);
permissions = robotControllerPermissions;
}
protected Class onStartApplication()
{
FtcRobotControllerActivity.setPermissionsValidated();
return startApplication;
}
}

Binary file not shown.


Binary file not shown.



@@ -0,0 +1,184 @@
<!--
Copyright (c) 2014, 2015 Qualcomm Technologies Inc
All rights reserved.
Redistribution and use in source and binary forms, with or without modification,
are permitted (subject to the limitations in the disclaimer below) provided that
the following conditions are met:
Redistributions of source code must retain the above copyright notice, this list
of conditions and the following disclaimer.
Redistributions in binary form must reproduce the above copyright notice, this
list of conditions and the following disclaimer in the documentation and/or
other materials provided with the distribution.
Neither the name of Qualcomm Technologies Inc nor the names of its contributors
may be used to endorse or promote products derived from this software without
specific prior written permission.
NO EXPRESS OR IMPLIED LICENSES TO ANY PARTY'S PATENT RIGHTS ARE GRANTED BY THIS
LICENSE. THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
"AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO,
THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE
ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR CONTRIBUTORS BE LIABLE
FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL
DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR
SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER
CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY,
OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
-->
<LinearLayout xmlns:android="http://schemas.android.com/apk/res/android"
xmlns:tools="http://schemas.android.com/tools"
android:layout_width="match_parent"
android:layout_height="match_parent"
xmlns:style="http://schemas.android.com/apk/res-auto"
tools:context="org.firstinspires.ftc.robotcontroller.internal.FtcRobotControllerActivity"
android:focusable="true"
android:id="@+id/entire_screen"
android:orientation="vertical">
<!-- black bar on top -->
<RelativeLayout
android:id="@+id/top_bar"
android:layout_width="fill_parent"
android:layout_height="80dp"
android:background="@color/background_black">
<ImageView
android:id="@+id/robotIcon"
android:src="@drawable/icon_robotcontroller"
android:layout_width="wrap_content"
android:layout_height="fill_parent"
android:adjustViewBounds="true"
android:layout_margin="1dp"/>
<TextView
android:id="@+id/textDeviceName"
android:layout_toEndOf="@id/robotIcon"
android:layout_width="wrap_content"
android:layout_height="wrap_content"
android:textColor="?attr/textWhite"
android:textAppearance="?android:attr/textAppearanceMedium"
android:padding="8dp"
android:textStyle="bold"/>
<ImageButton
android:id="@+id/menu_buttons"
android:scaleType="fitXY"
android:layout_width="wrap_content"
android:layout_height="wrap_content"
android:src="@drawable/icon_menu"
android:background="@android:color/transparent"
android:padding="15dp"
android:adjustViewBounds="true"
android:layout_alignParentEnd="true"
android:layout_centerInParent="true"
android:layout_centerHorizontal="true"
android:layout_margin="10dp"/>
</RelativeLayout>
<!-- end of black bar -->
<include layout="@layout/header"
android:id="@+id/included_header"/>
<RelativeLayout
android:background="@color/background_very_light_grey"
android:id="@+id/RelativeLayout"
android:layout_width="match_parent"
android:layout_height="match_parent"
android:keepScreenOn="true"
android:paddingBottom="@dimen/activity_vertical_margin"
android:paddingLeft="@dimen/activity_horizontal_margin"
android:paddingRight="@dimen/activity_horizontal_margin"
android:paddingTop="@dimen/activity_vertical_margin" >
<TextView
android:id="@+id/textNetworkConnectionStatus"
android:layout_width="match_parent"
android:layout_height="wrap_content"
android:visibility="invisible"
android:text="" />
<TextView
android:id="@+id/textRobotStatus"
android:layout_width="match_parent"
android:layout_height="wrap_content"
android:layout_below="@+id/textNetworkConnectionStatus"
android:visibility="invisible"
android:text="" />
<TextView
android:id="@+id/textOpMode"
android:layout_width="match_parent"
android:layout_height="wrap_content"
android:layout_below="@+id/textRobotStatus"
android:visibility="invisible"
android:text="" />
<FrameLayout
android:layout_width="match_parent"
android:layout_height="wrap_content"
android:layout_below="@+id/textOpMode"
android:layout_above="@+id/textGamepad1">
<LinearLayout
android:id="@+id/monitorContainer"
android:layout_width="match_parent"
android:layout_height="match_parent"
android:orientation="vertical">
<LinearLayout
android:id="@+id/cameraMonitorViewId"
android:visibility="gone"
android:layout_width="match_parent"
android:layout_height="0dp"
android:layout_weight="1"
android:orientation="vertical"
/>
</LinearLayout>
<!-- When visible, the error message will overlay and thus partially
obscure part of the camera monitor. We make this trade off so as to
allow for a bigger camera monitor view in the common case when the
error is not in fact present
-->
<TextView style="@style/FtcTextViewStyleBold"
android:id="@+id/textErrorMessage"
android:layout_width="match_parent"
android:layout_height="wrap_content"
android:text=""
android:visibility="invisible"
android:textColor="?attr/textMediumDark" />
</FrameLayout>
<TextView
android:id="@+id/textGamepad1"
android:layout_width="match_parent"
android:layout_height="wrap_content"
android:layout_above="@+id/textGamepad2"
android:visibility="invisible"
android:text="" />
<TextView
android:id="@+id/textGamepad2"
android:layout_width="match_parent"
android:layout_height="wrap_content"
android:layout_alignParentBottom="true"
android:visibility="invisible"
android:text="" />
</RelativeLayout>
<WebView
android:id="@+id/webViewBlocksRuntime"
android:layout_width="1dp"
android:layout_height="1dp" />
</LinearLayout>


@@ -0,0 +1,78 @@
<!--
Copyright (c) 2014, 2015 Qualcomm Technologies Inc
All rights reserved.
Redistribution and use in source and binary forms, with or without modification,
are permitted (subject to the limitations in the disclaimer below) provided that
the following conditions are met:
Redistributions of source code must retain the above copyright notice, this list
of conditions and the following disclaimer.
Redistributions in binary form must reproduce the above copyright notice, this
list of conditions and the following disclaimer in the documentation and/or
other materials provided with the distribution.
Neither the name of Qualcomm Technologies Inc nor the names of its contributors
may be used to endorse or promote products derived from this software without
specific prior written permission.
NO EXPRESS OR IMPLIED LICENSES TO ANY PARTY'S PATENT RIGHTS ARE GRANTED BY THIS
LICENSE. THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
"AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO,
THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE
ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR CONTRIBUTORS BE LIABLE
FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL
DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR
SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER
CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY,
OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
-->
<menu xmlns:android="http://schemas.android.com/apk/res/android"
xmlns:app="http://schemas.android.com/apk/res-auto">
<item
android:id="@+id/action_settings"
android:orderInCategory="100"
app:showAsAction="never"
android:title="@string/settings_menu_item"/>
<item
android:id="@+id/action_restart_robot"
android:orderInCategory="200"
app:showAsAction="never"
android:title="@string/restart_robot_menu_item"/>
<item
android:id="@+id/action_configure_robot"
android:orderInCategory="300"
app:showAsAction="never"
android:title="@string/configure_robot_menu_item"/>
<item
android:id="@+id/action_program_and_manage"
android:orderInCategory="550"
app:showAsAction="never"
android:title="@string/program_and_manage_menu_item"/>
<item
android:id="@+id/action_inspection_mode"
android:orderInCategory="600"
app:showAsAction="never"
android:title="@string/inspection_mode_menu_item"/>
<item
android:id="@+id/action_about"
android:orderInCategory="999"
android:icon="@android:drawable/ic_menu_info_details"
android:title="@string/about_menu_item"/>
<item
android:id="@+id/action_exit_app"
android:orderInCategory="1000"
android:icon="@android:drawable/ic_menu_info_details"
android:title="@string/exit_menu_item"/>
</menu>

Binary file not shown.

Binary file not shown.


@@ -0,0 +1,40 @@
<!--
Copyright (c) 2014, 2015 Qualcomm Technologies Inc
All rights reserved.
Redistribution and use in source and binary forms, with or without modification,
are permitted (subject to the limitations in the disclaimer below) provided that
the following conditions are met:
Redistributions of source code must retain the above copyright notice, this list
of conditions and the following disclaimer.
Redistributions in binary form must reproduce the above copyright notice, this
list of conditions and the following disclaimer in the documentation and/or
other materials provided with the distribution.
Neither the name of Qualcomm Technologies Inc nor the names of its contributors
may be used to endorse or promote products derived from this software without
specific prior written permission.
NO EXPRESS OR IMPLIED LICENSES TO ANY PARTY'S PATENT RIGHTS ARE GRANTED BY THIS
LICENSE. THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
"AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO,
THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE
ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR CONTRIBUTORS BE LIABLE
FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL
DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR
SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER
CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY,
OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
-->
<resources>
<!-- Default screen margins, per the Android Design guidelines. -->
<dimen name="activity_horizontal_margin">16dp</dimen>
<dimen name="activity_vertical_margin">5dp</dimen>
</resources>


@@ -0,0 +1,72 @@
<?xml version="1.0" encoding="utf-8"?>
<!--
strings.xml in FtcRobotController
Copyright (c) 2014, 2015 Qualcomm Technologies Inc
All rights reserved.
Redistribution and use in source and binary forms, with or without modification,
are permitted (subject to the limitations in the disclaimer below) provided that
the following conditions are met:
Redistributions of source code must retain the above copyright notice, this list
of conditions and the following disclaimer.
Redistributions in binary form must reproduce the above copyright notice, this
list of conditions and the following disclaimer in the documentation and/or
other materials provided with the distribution.
Neither the name of Qualcomm Technologies Inc nor the names of its contributors
may be used to endorse or promote products derived from this software without
specific prior written permission.
NO EXPRESS OR IMPLIED LICENSES TO ANY PARTY'S PATENT RIGHTS ARE GRANTED BY THIS
LICENSE. THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
"AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO,
THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE
ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR CONTRIBUTORS BE LIABLE
FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL
DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR
SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER
CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY,
OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
-->
<resources>
<string name="app_name">FTC Robot Controller</string>
<!-- Menu -->
<string name="inspection_mode_menu_item">Self Inspect</string>
<string name="program_and_manage_menu_item">Program &amp; Manage</string>
<string name="blocks_menu_item">Blocks</string>
<string name="settings_menu_item">Settings</string>
<string name="restart_robot_menu_item">Restart Robot</string>
<string name="configure_robot_menu_item">Configure Robot</string>
<string name="about_menu_item">About</string>
<string name="exit_menu_item">Exit</string>
<!-- Toast messages -->
<string name="toastWifiConfigurationComplete">Configuration Complete</string>
<string name="toastRestartingRobot">Restarting Robot</string>
<string name="toastWifiUpBeforeProgrammingMode">The Robot Controller must be fully up and running before entering Program and Manage Mode.</string>
<!-- For interpreting pref_app_theme contents. May be overridden in merged resources. -->
<integer-array name="app_theme_ids">
<item>@style/AppThemeRedRC</item>
<item>@style/AppThemeGreenRC</item>
<item>@style/AppThemeBlueRC</item>
<item>@style/AppThemePurpleRC</item>
<item>@style/AppThemeOrangeRC</item>
<item>@style/AppThemeTealRC</item>
</integer-array>
<string translatable="false" name="pref_ftc_season_year_of_current_rc">pref_ftc_season_year_of_current_rc_new</string>
<string name="packageName">@string/packageNameRobotController</string>
</resources>


@@ -0,0 +1,23 @@
<?xml version="1.0" encoding="utf-8"?>
<!-- This is a placeholder for adding future style-specific content in the robot controller only -->
<resources>
<style name="AppThemeRedRC" parent="AppTheme.Red">
</style>
<style name="AppThemeGreenRC" parent="AppTheme.Green">
</style>
<style name="AppThemeBlueRC" parent="AppTheme.Blue">
</style>
<style name="AppThemePurpleRC" parent="AppTheme.Purple">
</style>
<style name="AppThemeOrangeRC" parent="AppTheme.Orange">
</style>
<style name="AppThemeTealRC" parent="AppTheme.Teal">
</style>
</resources>


@@ -0,0 +1,93 @@
<?xml version="1.0" encoding="utf-8"?>
<!--
app_settings.xml in FtcRobotController
Copyright (c) 2014, 2015 Qualcomm Technologies Inc
All rights reserved.
Redistribution and use in source and binary forms, with or without modification,
are permitted (subject to the limitations in the disclaimer below) provided that
the following conditions are met:
Redistributions of source code must retain the above copyright notice, this list
of conditions and the following disclaimer.
Redistributions in binary form must reproduce the above copyright notice, this
list of conditions and the following disclaimer in the documentation and/or
other materials provided with the distribution.
Neither the name of Qualcomm Technologies Inc nor the names of its contributors
may be used to endorse or promote products derived from this software without
specific prior written permission.
NO EXPRESS OR IMPLIED LICENSES TO ANY PARTY'S PATENT RIGHTS ARE GRANTED BY THIS
LICENSE. THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
"AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO,
THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE
ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR CONTRIBUTORS BE LIABLE
FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL
DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR
SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER
CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY,
OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
See https://developer.android.com/guide/topics/ui/settings.html
-->
<PreferenceScreen xmlns:tools="http://schemas.android.com/tools"
xmlns:android="http://schemas.android.com/apk/res/android"
xmlns:app="http://schemas.android.com/apk/res-auto"
tools:context=".FtcRobotControllerActivity">
<PreferenceCategory
android:title="@string/prefcat_configure_robot">
<EditTextPreference
android:title="@string/prefedit_device_name_rc"
android:summary="@string/prefedit_device_name_summary_rc"
android:key="@string/pref_device_name"
android:defaultValue=""
/>
<org.firstinspires.ftc.robotcore.internal.ui.ColorListPreference
android:title="@string/prefedit_app_theme_rc"
android:summary="@string/prefedit_app_theme_summary_rc"
android:key="@string/pref_app_theme"
android:entries="@array/app_theme_names"
android:entryValues="@array/app_theme_tokens"
app:colors="@array/app_theme_colors"
/>
<SwitchPreference
android:title="@string/prefedit_sound_on_off"
android:summary="@string/prefedit_sound_on_off_summary_rc"
android:key="@string/pref_sound_on_off"
android:defaultValue="true"
/>
<PreferenceScreen
android:title="@string/titleAdvancedRCSettings"
android:summary="@string/summaryAdvancedRCSettings"
android:key="@string/pref_launch_advanced_rc_settings">
<intent
android:targetPackage="@string/packageName"
android:targetClass="com.qualcomm.ftccommon.FtcAdvancedRCSettingsActivity"
/>
</PreferenceScreen>
<PreferenceScreen
android:title="@string/prefedit_view_logs"
android:summary="@string/prefedit_view_logs_summary"
android:key="@string/pref_launch_viewlogs">
<intent
android:targetPackage="@string/packageName"
android:targetClass="com.qualcomm.ftccommon.ViewLogsActivity" />
</PreferenceScreen>
</PreferenceCategory>
</PreferenceScreen>


@@ -0,0 +1,49 @@
<!--
Copyright (c) 2014, 2015 Qualcomm Technologies Inc
All rights reserved.
Redistribution and use in source and binary forms, with or without modification,
are permitted (subject to the limitations in the disclaimer below) provided that
the following conditions are met:
Redistributions of source code must retain the above copyright notice, this list
of conditions and the following disclaimer.
Redistributions in binary form must reproduce the above copyright notice, this
list of conditions and the following disclaimer in the documentation and/or
other materials provided with the distribution.
Neither the name of Qualcomm Technologies Inc nor the names of its contributors
may be used to endorse or promote products derived from this software without
specific prior written permission.
NO EXPRESS OR IMPLIED LICENSES TO ANY PARTY'S PATENT RIGHTS ARE GRANTED BY THIS
LICENSE. THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
"AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO,
THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE
ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR CONTRIBUTORS BE LIABLE
FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL
DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR
SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER
CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY,
OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
-->
<!--
https://developer.android.com/guide/topics/connectivity/usb/host
-->
<!-- see also RobotUsbDevice.getUsbIdentifiers() -->
<resources>
<usb-device vendor-id="1027" product-id="24597" /> <!-- FT232 Lynx: 0x0403/0x6015 -->
<!-- cameras -->
<!-- We don't currently auto-connect to UVC cameras, instead relying
on the app itself to poll. But we could change that if we wished -->
<!-- Update: turns out we need that if we are to get onNewIntent() notifications
in our activity when cameras attach. See FtcRobotControllerActivity. -->
<usb-device class="14" subclass="2"/>
</resources>

LICENSE Normal file

@@ -0,0 +1,29 @@
Copyright (c) 2014-2022 FIRST. All rights reserved.
Redistribution and use in source and binary forms, with or without modification,
are permitted (subject to the limitations in the disclaimer below) provided that
the following conditions are met:
Redistributions of source code must retain the above copyright notice, this list
of conditions and the following disclaimer.
Redistributions in binary form must reproduce the above copyright notice, this
list of conditions and the following disclaimer in the documentation and/or
other materials provided with the distribution.
Neither the name of FIRST nor the names of its contributors
may be used to endorse or promote products derived from this software without
specific prior written permission.
NO EXPRESS OR IMPLIED LICENSES TO ANY PARTY'S PATENT RIGHTS ARE GRANTED BY THIS
LICENSE. THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS
IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE
IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE
DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR CONTRIBUTORS BE LIABLE FOR
ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES
(INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES;
LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON
ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT
(INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS
SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.

README.md Normal file (1481 lines)

File diff suppressed because it is too large.

TeamCode/build.gradle Normal file

@@ -0,0 +1,33 @@
//
// build.gradle in TeamCode
//
// Most of the definitions for building your module reside in a common, shared
// file 'build.common.gradle'. Being factored in this way makes it easier to
// integrate updates to the FTC into your code. If you really need to customize
// the build definitions, you can place those customizations in this file, but
// please think carefully as to whether such customizations are really necessary
// before doing so.
// Custom definitions may go here
// Include common definitions from above.
apply from: '../build.common.gradle'
apply from: '../build.dependencies.gradle'
android {
namespace = 'org.firstinspires.ftc.teamcode'
packagingOptions {
jniLibs.useLegacyPackaging true
}
}
dependencies {
implementation project(':FtcRobotController')
annotationProcessor files('lib/OpModeAnnotationProcessor.jar')
implementation 'org.apache.commons:commons-math3:3.6.1'
implementation 'com.fasterxml.jackson.core:jackson-databind:2.12.7'
implementation 'com.acmerobotics.roadrunner:core:0.5.6'
}

Binary file not shown.


@@ -0,0 +1,11 @@
<?xml version="1.0" encoding="utf-8"?>
<!-- Note: the actual manifest file used in your APK merges this file with contributions
from the modules on which your app depends (such as FtcRobotController, etc).
So it won't ultimately be as empty as it might here appear to be :-) -->
<!-- The package name here determines the package for your R class and your BuildConfig class -->
<manifest
xmlns:android="http://schemas.android.com/apk/res/android">
<application/>
</manifest>


@@ -0,0 +1,117 @@
## Basic Ideas
Pedro Pathing is a reactive, vector-based follower. This means the robot dynamically
calculates the set of vectors required both to correct error and to move forward, and applies them.
The robot calculates:
* centripetal force correction
* translational correction
* heading correction
* drive vector
These are then applied to the robot in this order until either the robot's power is maxed out or all
the vectors are applied.
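The prioritized summation described above can be sketched as follows. This is an illustrative sketch only: `Vec2` and `VectorCombiner` are hypothetical names, not the actual Pedro Pathing API. Vectors are accumulated in priority order, and the first vector that would push the total past 1 motor power is scaled down so the total lands exactly at magnitude 1; any remaining lower-priority vectors are dropped.

```java
// Illustrative sketch -- class and method names are hypothetical,
// not the real Pedro Pathing API.
final class Vec2 {
    final double x, y;
    Vec2(double x, double y) { this.x = x; this.y = y; }
    Vec2 plus(Vec2 o) { return new Vec2(x + o.x, y + o.y); }
    Vec2 times(double k) { return new Vec2(x * k, y * k); }
    double magnitude() { return Math.hypot(x, y); }
}

final class VectorCombiner {
    /**
     * Combine correction vectors given in priority order (centripetal,
     * translational, heading, drive). If adding a vector would push the
     * total past 1 motor power, scale that vector so the total has
     * magnitude exactly 1, then stop applying lower-priority vectors.
     */
    static Vec2 combine(Vec2... prioritized) {
        Vec2 total = new Vec2(0, 0);
        for (Vec2 v : prioritized) {
            Vec2 candidate = total.plus(v);
            if (candidate.magnitude() <= 1.0) {
                total = candidate;
            } else {
                total = total.plus(v.times(scaleToUnit(total, v)));
                break; // remaining lower-priority vectors are dropped
            }
        }
        return total;
    }

    // Solve |base + t * v| = 1 for the positive root t via the quadratic
    // formula: |v|^2 t^2 + 2(base . v) t + (|base|^2 - 1) = 0.
    private static double scaleToUnit(Vec2 base, Vec2 v) {
        double a = v.x * v.x + v.y * v.y;
        double b = 2 * (base.x * v.x + base.y * v.y);
        double c = base.x * base.x + base.y * base.y - 1.0;
        return (-b + Math.sqrt(b * b - 4 * a * c)) / (2 * a);
    }
}
```

For example, combining a 0.8-power correction with a 0.6-power drive vector along the same axis scales the drive contribution down so the combined magnitude stays at 1.0 rather than 1.4.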
## Why Pedro Pathing?
Why use Pedro Pathing? Why not something else like Road Runner or Pure Pursuit?
* Why not Pure Pursuit?
* Pure Pursuit searches for the farthest point on the path that's within a certain radius from the robot. Pure Pursuit will then go in a straight line to that point. This poses several problems, as a small search radius will cause some oscillations on corners, and a large search radius will cut corners on paths, which makes the paths inaccurate to real life.
* Pedro Pathing instead corrects to the closest point on the path while still following the path. This ensures that the follower will stay on the path while still being able to move forward along the path without cutting corners or encountering oscillation issues.
* Why not Road Runner?
* Road Runner is a motion profile based follower, which means that a set of instructions for motor powers are calculated for each path beforehand and then run. After reaching the end of this motion profile, Road Runner corrects. This can be sufficient for most situations, but if the robot encounters an obstacle or wheel slippage, it may be unable to correct in time.
* Pedro Pathing instead dynamically corrects throughout the path. The movement vectors are calculated at every point along the path, and because of this, the path can even be changed midway through and Pedro Pathing will still be able to correct. Since correction occurs throughout the path, the error correction isn't concentrated on the end of the path and therefore the robot is able to better minimize error.
## How Does Pedro Path?
As mentioned in the *Basic Ideas* section, Pedro Pathing calculates a set of vectors to move the
robot along a path, which is defined with Bezier curves. Here, we'll go more in-depth on how these
vectors are calculated and used in path following.
### The Hierarchy
While following paths, the motion vectors sometimes demand more power from the robot than it
actually has. How do we deal with this?
Our motion vectors are applied in order of importance, which is the order they appear in within the
list in *Basic Ideas*. The centripetal force vector is the highest importance, since it ensures the
robot sticks to the path. If the robot is far off the path, then the robot will not drive along the
path, and so the centripetal force vector will be reduced in magnitude. This is why ranking the
centripetal force correction above the translational correction doesn't produce issues. The next
highest, of course, is the translational correction. This corrects strictly the robot's position to
the closest point on the path. The reasoning behind this is that it is usually much more important
that the robot be on the path, and so avoid potential obstacles, rather than facing the correct
direction. The third most important vector is the heading correction, which turns the robot to
the correct angle. This is higher than the drive vector since we don't want to drive forward if the
robot isn't facing the correct direction. Finally, the drive vector is applied. This ensures that
the robot only continues on the path when there aren't any major issues with the following.
As each vector is applied, the robot checks to see if the sum of the applied vectors is greater than
the power that the drivetrain can produce, which would be 1 motor power. If the magnitude of the
combined vectors is greater than 1, then the most recently added vector is scaled down until the
combined vectors' magnitude is equal to 1. If all vectors are able to be applied without exceeding
the power limit, then all the vectors can just be applied without issue.
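The allocation rule above can be sketched as follows. This is a simplified illustration, not the SDK's actual code: vectors are plain `{x, y}` arrays and all names are hypothetical. The scaling step solves for the largest fraction of the newest vector that keeps the running sum's magnitude at 1.

```java
// Hypothetical sketch of the power-allocation hierarchy described above.
// Vectors are {x, y} doubles; the names are illustrative, not the SDK's API.
class PowerAllocator {
    static double magnitude(double[] v) {
        return Math.hypot(v[0], v[1]);
    }

    // Solves for the scale t such that |total + t * next| = 1, i.e. the
    // largest fraction of `next` that still fits under the power limit.
    static double normalizingScale(double[] total, double[] next) {
        double a = next[0] * next[0] + next[1] * next[1];
        double b = total[0] * next[0] + total[1] * next[1];
        double c = total[0] * total[0] + total[1] * total[1] - 1.0;
        return (-b + Math.sqrt(b * b - a * c)) / a;
    }

    // Applies vectors in priority order (centripetal, translational, heading,
    // drive), scaling the most recently added vector down if the running sum
    // would exceed a magnitude of 1.
    static double[] allocate(double[][] vectorsInPriorityOrder) {
        double[] total = {0.0, 0.0};
        for (double[] v : vectorsInPriorityOrder) {
            double[] sum = {total[0] + v[0], total[1] + v[1]};
            if (magnitude(sum) <= 1.0) {
                total = sum;
            } else {
                double t = normalizingScale(total, v);
                total = new double[]{total[0] + t * v[0], total[1] + t * v[1]};
                break; // power is maxed out; lower-priority vectors are dropped
            }
        }
        return total;
    }
}
```

For example, two collinear vectors of magnitude 0.6 sum to magnitude 1.2, so the second is scaled so the result has magnitude exactly 1.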
### Centripetal Force Correction
Have you ever noticed that your robot seems to want to swing outwards when taking corners? This is
due to a lack of centripetal force correction. In order to take curves effectively, your robot must
accelerate towards the inside of the curve. If we can approximate the region of the path the robot
is at with a circle, then we can use the formula for centripetal force to calculate how much power
we need to allocate to approximate a centripetal force.
Because paths are defined with Bezier curves, we can easily take the first and second derivative of
the path, expressed as vectors. We can use that to calculate the curvature of the path, which is the
inverse of the length of the radius of the circle we can use to approximate the path. The actual
formula for the calculations of the curvature is the cross product of the first derivative and
second derivative, divided by the magnitude of the first derivative raised to the power of 3.
With this, along with the weight of the robot and the velocity of the robot along the path, we can
calculate the force necessary to keep the robot on the path, and then tune a scaling factor to turn
that force into a corresponding power for the robot.
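In code, the curvature and force computation described above might look like this. This is a simplified sketch, not the library's actual API: the derivative vectors are plain `{x, y}` arrays, and `mass` and `scalingFactor` stand in for the tuned robot mass and centripetal scaling constant.

```java
// Hypothetical sketch of the centripetal correction math described above.
// d1 and d2 are the first and second derivatives of the path at the robot's
// closest point, expressed as {x, y} vectors.
class Centripetal {
    // curvature = (x' * y'' - y' * x'') / |d1|^3
    static double curvature(double[] d1, double[] d2) {
        double cross = d1[0] * d2[1] - d1[1] * d2[0];
        double speed = Math.hypot(d1[0], d1[1]);
        return cross / (speed * speed * speed);
    }

    // F = m * v^2 * kappa, scaled into a motor power by a tuned factor.
    static double correctionPower(double mass, double velocity,
                                  double[] d1, double[] d2,
                                  double scalingFactor) {
        return scalingFactor * mass * velocity * velocity * curvature(d1, d2);
    }
}
```

As a sanity check, on a circle of radius 2 the derivatives at any point give a curvature of 0.5, the inverse of the radius.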
### Translational Correction
This is as simple as it sounds: this corrects error in the robot's position only. However, the
robot's translational error is actually used within two different PIDs to move the robot. A large
PID and a small PID are used. The large PID is used when error exceeds a certain limit, and the small
PID is used when the error is within that limit.
When the robot encounters error, the large PID, if applicable, is used to bring the robot within the
small PID error range without much overshoot to avoid oscillations. Then, the small PID is used to
bring the robot to within acceptable error ranges. The reason for this double PID is to allow for
aggressive correction to bring the robot within tight tolerances, while not correcting too
aggressively at larger ranges, which would've caused oscillations.
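The large/small handoff described above can be sketched as a simple selector. This is an illustration only (P-only, with made-up gains and names); the real controllers are full PIDFs with tuned coefficients.

```java
// Hypothetical sketch of the large/small PID handoff described above.
// Gains and the switch threshold are illustrative, not the tuned values.
class DualPID {
    final double kPLarge, kPSmall, switchThreshold;

    DualPID(double kPLarge, double kPSmall, double switchThreshold) {
        this.kPLarge = kPLarge;
        this.kPSmall = kPSmall;
        this.switchThreshold = switchThreshold;
    }

    // Hands off between the two controllers based on how large the error is
    // (P-only here for brevity; the real controllers are full PIDFs).
    double correction(double error) {
        double kP = Math.abs(error) > switchThreshold ? kPLarge : kPSmall;
        return kP * error;
    }
}
```

Here the large-error gain is the gentler of the two, matching the goal of avoiding aggressive correction at long range while still hitting tight tolerances up close.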
### Heading Correction
The heading correction operates very similarly to the translational correction, except this corrects
the direction the robot is facing. Like the translational correction, it uses a small and a large
PID rather than a single PID. The heading correction will turn in the closest direction from the
robot's current heading to the target heading.
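"Turning in the closest direction" amounts to wrapping the heading error into the range (-pi, pi]. A minimal sketch of that wrap (the name is hypothetical, not the library's actual helper):

```java
class HeadingMath {
    // Wraps the heading error into (-pi, pi] so the correction always turns
    // the shorter way around from the current heading to the target.
    static double smallestAngleDifference(double target, double current) {
        double diff = (target - current) % (2 * Math.PI);
        if (diff > Math.PI) diff -= 2 * Math.PI;
        if (diff <= -Math.PI) diff += 2 * Math.PI;
        return diff;
    }
}
```

For instance, turning from a heading of 6.2 radians to 0.1 radians produces a small positive error (turn counterclockwise through zero) rather than a nearly full clockwise rotation.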
### Drive Vector
The drive vector points in the direction of the tangent of the path and it is responsible for moving
the robot along the path. Similar to the translational and heading corrections, the drive vector
uses a large and small PID. Using basic kinematics equations, we can use the velocity of the robot
along the path, the length of path left, and a specified target rate of deceleration to calculate
the velocity we should be moving at. Additionally, after finding out the rate of deceleration of the
robot under 0 power, we can compensate for that with another kinematics equation. Combining these
two lets us control our velocity to both move along the path quickly and brake without overshooting.
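The kinematics behind the braking behavior reduce to one equation: from v^2 = 2ad, the fastest speed the robot can be moving while still able to decelerate to rest over the remaining path length is sqrt(2ad). A minimal sketch (names are illustrative, not the library's API):

```java
class DriveKinematics {
    // From v^2 = 2 * a * d: the fastest speed from which the robot can still
    // decelerate to rest over the remaining path length at deceleration a.
    static double targetVelocity(double decel, double distanceLeft) {
        return Math.sqrt(2.0 * decel * Math.max(0.0, distanceLeft));
    }
}
```

With 2 inches of path left and a target deceleration of 9 in/s^2, the target velocity is 6 in/s; as the remaining distance shrinks, the target velocity falls toward zero.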
## Additional Capabilities
In addition to following paths, Pedro Pathing can be used for a few other purposes.
### Holding Points
Pedro Pathing is also capable of holding a specified position and direction. This can be useful for
improving on the end accuracy of paths, providing additional correction time if possible. It can
also be useful in cases where contact with other robots might occur. For instance, in the 2022-2023
FTC season challenge, Power Play, robots might come into contact when scoring on a contested middle
junction. Pedro Pathing would be able to recover and correct from a robot collision, allowing for
more consistent scoring.
### TeleOp Enhancements
Finally, Pedro Pathing can be used in TeleOp to enhance driving. With regular mecanum drive, robots
will tend to swing out when taking corners. Pedro Pathing can account for that, allowing the robot
to take corners more smoothly and efficiently. Using the same localizer as is used in autonomous, a
first and second derivative can be estimated from previous positions. Then, with a modified version
of the curvature formula, we can estimate a centripetal force correction and apply it under driver
control.
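The derivative estimates mentioned above can be sketched with simple finite differences. This is an illustrative approximation (the library averages several samples for smoothness, and these names are hypothetical):

```java
class DerivativeEstimator {
    // Velocity estimate from two consecutive localizer positions over dt seconds.
    static double[] firstDerivative(double[] prevPos, double[] currPos, double dt) {
        return new double[]{(currPos[0] - prevPos[0]) / dt,
                            (currPos[1] - prevPos[1]) / dt};
    }

    // Acceleration estimate from two consecutive velocity estimates over dt seconds.
    static double[] secondDerivative(double[] prevVel, double[] currVel, double dt) {
        return new double[]{(currVel[0] - prevVel[0]) / dt,
                            (currVel[1] - prevVel[1]) / dt};
    }
}
```

These estimated derivatives can then be fed into the same curvature formula used in autonomous to compute the centripetal correction under driver control.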
## Questions?
If you still have more questions, feel free to contact us at `scottsbots10158@gmail.com`.
## Welcome!
This is the Pedro Pathing path following program developed by FTC team 10158 Scott's Bots in the
2023-2024 Centerstage season.
## Installation
The quickest way to get started is with the quickstart [here](https://github.com/brotherhobo/Pedro-Pathing-Quickstart).
Otherwise, take the `pedroPathing` folder and put it under the `teamcode` folder in your project.
You can do this by downloading the project from either the quickstart link above or the 10158
CENTERSTAGE repository [here](https://github.com/brotherhobo/10158-Centerstage).
For this version of Pedro Pathing, the localizer used is the Road Runner localizer. To install its
dependencies:
1. Find `build.dependencies.gradle` in the main folder of your project.
2. Add the following code to the end of the `repositories` block:
```
maven { url = 'https://maven.brott.dev/' }
```
3. Then, add the following code to the end of your `dependencies` block:
```
implementation 'com.acmerobotics.dashboard:dashboard:0.4.5'
```
4. Find the `build.gradle` file under the `teamcode` folder.
5. In this gradle file, add the following dependencies:
```
implementation 'org.apache.commons:commons-math3:3.6.1'
implementation 'com.fasterxml.jackson.core:jackson-databind:2.12.7'
implementation 'com.acmerobotics.roadrunner:core:0.5.6'
```
## Prerequisites
Obviously, you have to have a robot to use Pedro Pathing. Also, Pedro Pathing is only able to work
with omnidirectional drives, like mecanum drive. There is currently no support for swerve drives.
You must also have a localizer of some sort. Pedro Pathing currently has the three-wheel localizer
from Road Runner, but we have plans to create our own localizer soon. Additionally, using
[FTC Dashboard](http://192.168.43.1:8080/dash) will help a lot in tuning.
## Tuning
* To start with, we need the mass of the robot in kg. This is used for the centripetal force
correction, and the mass should be put on line `114` in the `FollowerConstants` class under the
`tuning` package.
* Next, we need to find the preferred mecanum drive vectors. The rollers on mecanum wheels point at a
45-degree angle from the forward direction, but the direction the force is actually output is
closer to forward. To find the direction your wheels will go, you will need to run the
`Forward Velocity Tuner` and `Strafe Velocity Tuner` OpModes. These will run your robot at full
power for 40 inches forward and to the right, respectively. The distance can be changed through FTC
Dashboard under the dropdown for each respective class, but higher distances work better. After the
distance has finished running, the end velocity will be output to telemetry. The robot may continue
to drift a little bit after the robot has finished running the distance, so make sure you have
plenty of room. Once you're done, put the velocity for the `Forward Velocity Tuner` on line `25` in
the `FollowerConstants` class, and the velocity for the `Strafe Velocity Tuner` on line `26` in the
`FollowerConstants` class.
* The last set of automatic tuners you'll need to run are the zero power acceleration tuners. These
find the rate at which your robot decelerates when power is cut from the drivetrain. This is used to
get a more accurate estimation of the drive vector. To find this, you will need to run the
`Forward Zero Power Acceleration Tuner` and the `Lateral Zero Power Acceleration Tuner` OpModes.
These will run your robot until it hits a velocity of 10 inches/second forward and to the right,
respectively. The velocity can be changed through FTC Dashboard under the dropdown for each
respective class, but higher velocities work better. After the velocity has been reached, power will
be cut from the drivetrain and the robot's deceleration will be tracked until the robot stops, at
which point it will display the deceleration in telemetry. The robot will need to drift to a stop
for this to work properly, and the higher the velocity, the greater the drift distance, so make sure you have
enough room. Once you're done, put the zero power acceleration for the
`Forward Zero Power Acceleration Tuner` on line `122` in the `FollowerConstants` class and the zero
power acceleration for the `Lateral Zero Power Acceleration Tuner` on line `127` in the
`FollowerConstants` class.
* After this, we will want to tune the translational PIDs. Go to FTC Dashboard and disable all but
the `useTranslational` checkboxes under the `Follower` tab. Then, run `StraightBackAndForth`. Make
sure you disable the timer on autonomous OpModes. There are two different PIDs for translational
error, the `smallTranslationalPIDF` and `largeTranslationalPIDF`. If error is larger than a certain
amount, the `largeTranslationalPIDF` will be used, and if error is smaller than that amount the
`smallTranslationalPIDF` will be used. If you need to add a feedforward value, use the
`smallTranslationalPIDFFeedForward` and `largeTranslationalPIDFFeedForward` since those will add the
feedforward in the direction the robot is trying to move, rather than the feedforward in the PIDFs
themselves, since those will only add the feedforward one way. You can change the amount of error
required to switch PIDFs, as well as the PIDF constants and feedforward values, under the
`FollowerConstants` tab in FTC Dashboard. To tune the PIDs, push the robot off the path and see how
it corrects. You will want to alternate the sides you push from to reduce field wear and tear, as
well as push with varying power and distance. I would recommend tuning the large PID first, and tuning it so that
the PID is capable of correcting to the point where the PIDs switch with little momentum. Then, tune
the small PID to minimize oscillations while still achieving a satisfactory level of accuracy.
Overall, try to tune for fewer oscillations rather than higher speeds or perfect accuracy, since
this will make the robot run more smoothly under actual pathing conditions.
* Next, we will tune the heading PIDs. The process is essentially the same as above, except you will
want to only enable `useHeading` under `Follower` on FTC Dashboard, as well as turn the robot from
opposing corners instead of pushing the robot. Naturally, instead of changing the values with
"translational" in the name, you will want to look for the ones with "heading" in the name.
Otherwise, these two PIDs are functionally very similar. The same tips from above will apply to this.
* Afterwards, we will tune the drive PIDs. Before we continue, we will need to set the
`zeroPowerAccelerationMultiplier`. This determines how fast your robot will decelerate as a factor
of how fast your robot will coast to a stop. Honestly, this is up to you. I personally used 4, but
what works best for you is most important. Higher numbers will cause a faster brake, but increase
oscillations at the end. Lower numbers will do the opposite. This can be found on line `136` in
`FollowerConstants`. There are once again two PIDs for the drive vector, but these PIDs are much,
much more sensitive than the others. For reference, my P values were in the hundredths and
thousandths place values, and my D values were in the hundred thousandths and millionths place
values. To tune this, enable `useDrive`, `useHeading`, and `useTranslational` in the `Follower`
dropdown in FTC Dashboard. Next, run `StraightBackAndForth` and don't forget to turn off the timer on
the OpMode. Then, tune the large PID and then the small PID following the tips from earlier. For
this, it is very important to try to reduce oscillations. Additionally, I would absolutely not
recommend using the I, or integral, part of the PID for this. Using integral in drivetrain PIDs is
already not ideal, but it will continuously build up error in this PID, causing major issues when
it gets too strong. So, just don't use it. P and D are enough.
* Finally, we will want to tune the centripetal force correction. This is a pretty simple tune. Open
up FTC Dashboard and enable everything under the `Follower` tab. Then, run `CurvedBackAndForth`
and turn off its timer. If you notice the robot is correcting towards the inside of the curve
as/after running a path, then increase `centripetalScaling`, which can be found on line `117` of
`FollowerConstants`. If the robot is correcting towards the outside of the curve, then decrease
`centripetalScaling`.
* Once you've found satisfactory tunings for everything, run the robot around in
`StraightBackAndForth`, `CurvedBackAndForth`, or some paths of your own making. There's also
`Circle`, but that's more so for fun than anything else. If you notice something could be improved,
feel free to mess around more with your PIDs. That should be all! If you have any more questions,
refer to the OVERVIEW or README files. Best of luck to your team this season! :)
package org.firstinspires.ftc.teamcode.pedroPathing.examples;
import com.qualcomm.robotcore.eventloop.opmode.OpMode;
import com.qualcomm.robotcore.eventloop.opmode.TeleOp;
import com.qualcomm.robotcore.hardware.DcMotor;
import com.qualcomm.robotcore.hardware.DcMotorEx;
import org.firstinspires.ftc.teamcode.pedroPathing.follower.Follower;
import org.firstinspires.ftc.teamcode.pedroPathing.pathGeneration.MathFunctions;
import org.firstinspires.ftc.teamcode.pedroPathing.pathGeneration.Vector;
/**
* This is the TeleOpEnhancements OpMode. It is an example usage of the TeleOp enhancements that
* Pedro Pathing is capable of.
*
* @author Anyi Lin - 10158 Scott's Bots
* @author Aaron Yang - 10158 Scott's Bots
* @author Harrison Womack - 10158 Scott's Bots
* @version 1.0, 3/21/2024
*/
@TeleOp(name = "Pedro Pathing TeleOp Enhancements", group = "Test")
public class TeleOpEnhancements extends OpMode {
private Follower follower;
private DcMotorEx leftFront;
private DcMotorEx leftRear;
private DcMotorEx rightFront;
private DcMotorEx rightRear;
private Vector driveVector;
private Vector headingVector;
/**
* This initializes the drive motors as well as the Follower and motion Vectors.
*/
@Override
public void init() {
follower = new Follower(hardwareMap, false);
leftFront = hardwareMap.get(DcMotorEx.class, "leftFront");
leftRear = hardwareMap.get(DcMotorEx.class, "leftRear");
rightRear = hardwareMap.get(DcMotorEx.class, "rightRear");
rightFront = hardwareMap.get(DcMotorEx.class, "rightFront");
leftFront.setZeroPowerBehavior(DcMotor.ZeroPowerBehavior.BRAKE);
leftRear.setZeroPowerBehavior(DcMotor.ZeroPowerBehavior.BRAKE);
rightRear.setZeroPowerBehavior(DcMotor.ZeroPowerBehavior.BRAKE);
rightFront.setZeroPowerBehavior(DcMotor.ZeroPowerBehavior.BRAKE);
driveVector = new Vector();
headingVector = new Vector();
}
/**
* This runs the OpMode. This is only drive control with Pedro Pathing live centripetal force
* correction.
*/
@Override
public void loop() {
driveVector.setOrthogonalComponents(-gamepad1.left_stick_y, -gamepad1.left_stick_x);
driveVector.setMagnitude(MathFunctions.clamp(driveVector.getMagnitude(), 0, 1));
driveVector.rotateVector(follower.getPose().getHeading());
headingVector.setComponents(-gamepad1.right_stick_x, follower.getPose().getHeading());
follower.setMovementVectors(follower.getCentripetalForceCorrection(), headingVector, driveVector);
follower.update();
}
}
package org.firstinspires.ftc.teamcode.pedroPathing.follower;
import org.firstinspires.ftc.teamcode.pedroPathing.pathGeneration.MathFunctions;
import org.firstinspires.ftc.teamcode.pedroPathing.pathGeneration.Vector;
/**
* This is the DriveVectorScaler class. This class takes in inputs Vectors for driving, heading
* correction, and translational/centripetal correction and returns an array with wheel powers.
*
* @author Anyi Lin - 10158 Scott's Bots
* @author Aaron Yang - 10158 Scott's Bots
* @author Harrison Womack - 10158 Scott's Bots
* @version 1.0, 3/4/2024
*/
public class DriveVectorScaler {
// This is ordered left front, left back, right front, right back. These are also normalized.
private Vector[] mecanumVectors;
/**
* This creates a new DriveVectorScaler, which takes in various movement vectors and outputs
* the wheel drive powers necessary to move in the intended direction, given the true movement
* vector for the front left mecanum wheel.
*
* @param frontLeftVector this is the front left mecanum wheel's preferred drive vector.
*/
public DriveVectorScaler(Vector frontLeftVector) {
Vector copiedFrontLeftVector = MathFunctions.normalizeVector(frontLeftVector);
mecanumVectors = new Vector[]{
new Vector(copiedFrontLeftVector.getMagnitude(), copiedFrontLeftVector.getTheta()),
new Vector(copiedFrontLeftVector.getMagnitude(), 2*Math.PI-copiedFrontLeftVector.getTheta()),
new Vector(copiedFrontLeftVector.getMagnitude(), 2*Math.PI-copiedFrontLeftVector.getTheta()),
new Vector(copiedFrontLeftVector.getMagnitude(), copiedFrontLeftVector.getTheta())};
}
/**
* This takes in vectors for corrective power, heading power, and pathing power and outputs
* an Array of four doubles, one for each wheel's motor power.
*
* IMPORTANT NOTE: all vector inputs are clamped between 0 and 1 inclusive in magnitude.
*
* @param correctivePower this Vector includes the centrifugal force scaling Vector as well as a
* translational power Vector to correct onto the Bezier curve the Follower
* is following.
* @param headingPower this Vector points in the direction of the robot's current heading, and
* the magnitude tells the robot how much it should turn and in which
* direction.
* @param pathingPower this Vector points in the direction the robot needs to go to continue along
* the Path.
* @param robotHeading this is the current heading of the robot, which is used to calculate how
* much power to allocate to each wheel.
* @return this returns an Array of doubles with a length of 4, which contains the wheel powers.
*/
public double[] getDrivePowers(Vector correctivePower, Vector headingPower, Vector pathingPower, double robotHeading) {
// clamps down the magnitudes of the input vectors
if (correctivePower.getMagnitude() > 1) correctivePower.setMagnitude(1);
if (headingPower.getMagnitude() > 1) headingPower.setMagnitude(1);
if (pathingPower.getMagnitude() > 1) pathingPower.setMagnitude(1);
// the powers for the wheel vectors
double [] wheelPowers = new double[4];
// This contains a copy of the mecanum wheel vectors
Vector[] mecanumVectorsCopy = new Vector[4];
// this contains the pathing vectors, one for each side (heading control requires 2)
Vector[] truePathingVectors = new Vector[2];
if (correctivePower.getMagnitude() == 1) {
// checks for corrective power equal to 1 in magnitude. if equal to one, then set pathing power to that
truePathingVectors[0] = MathFunctions.copyVector(correctivePower);
truePathingVectors[1] = MathFunctions.copyVector(correctivePower);
} else {
// corrective power did not take up all the power, so add on heading power
Vector leftSideVector = MathFunctions.subtractVectors(correctivePower, headingPower);
Vector rightSideVector = MathFunctions.addVectors(correctivePower, headingPower);
if (leftSideVector.getMagnitude() > 1 || rightSideVector.getMagnitude() > 1) {
//if the combined corrective and heading power is greater than 1, then scale down heading power
double headingScalingFactor = Math.min(findNormalizingScaling(correctivePower, headingPower), findNormalizingScaling(correctivePower, MathFunctions.scalarMultiplyVector(headingPower, -1)));
truePathingVectors[0] = MathFunctions.subtractVectors(correctivePower, MathFunctions.scalarMultiplyVector(headingPower, headingScalingFactor));
truePathingVectors[1] = MathFunctions.addVectors(correctivePower, MathFunctions.scalarMultiplyVector(headingPower, headingScalingFactor));
} else {
// if we're here then we can add on some drive power but scaled down to 1
Vector leftSideVectorWithPathing = MathFunctions.addVectors(leftSideVector, pathingPower);
Vector rightSideVectorWithPathing = MathFunctions.addVectors(rightSideVector, pathingPower);
if (leftSideVectorWithPathing.getMagnitude() > 1 || rightSideVectorWithPathing.getMagnitude() > 1) {
// too much power now, so we scale down the pathing vector
double pathingScalingFactor = Math.min(findNormalizingScaling(leftSideVector, pathingPower), findNormalizingScaling(rightSideVector, pathingPower));
truePathingVectors[0] = MathFunctions.addVectors(leftSideVector, MathFunctions.scalarMultiplyVector(pathingPower, pathingScalingFactor));
truePathingVectors[1] = MathFunctions.addVectors(rightSideVector, MathFunctions.scalarMultiplyVector(pathingPower, pathingScalingFactor));
} else {
// just add the vectors together and you get the final vector
truePathingVectors[0] = MathFunctions.copyVector(leftSideVectorWithPathing);
truePathingVectors[1] = MathFunctions.copyVector(rightSideVectorWithPathing);
}
}
}
truePathingVectors[0] = MathFunctions.scalarMultiplyVector(truePathingVectors[0], 2.0);
truePathingVectors[1] = MathFunctions.scalarMultiplyVector(truePathingVectors[1], 2.0);
for (int i = 0; i < mecanumVectorsCopy.length; i++) {
// this copies the vectors from mecanumVectors but creates new references for them
mecanumVectorsCopy[i] = MathFunctions.copyVector(mecanumVectors[i]);
mecanumVectorsCopy[i].rotateVector(robotHeading);
}
wheelPowers[0] = (mecanumVectorsCopy[1].getXComponent()*truePathingVectors[0].getYComponent() - truePathingVectors[0].getXComponent()*mecanumVectorsCopy[1].getYComponent()) / (mecanumVectorsCopy[1].getXComponent()*mecanumVectorsCopy[0].getYComponent() - mecanumVectorsCopy[0].getXComponent()*mecanumVectorsCopy[1].getYComponent());
wheelPowers[1] = (mecanumVectorsCopy[0].getXComponent()*truePathingVectors[0].getYComponent() - truePathingVectors[0].getXComponent()*mecanumVectorsCopy[0].getYComponent()) / (mecanumVectorsCopy[0].getXComponent()*mecanumVectorsCopy[1].getYComponent() - mecanumVectorsCopy[1].getXComponent()*mecanumVectorsCopy[0].getYComponent());
wheelPowers[2] = (mecanumVectorsCopy[3].getXComponent()*truePathingVectors[1].getYComponent() - truePathingVectors[1].getXComponent()*mecanumVectorsCopy[3].getYComponent()) / (mecanumVectorsCopy[3].getXComponent()*mecanumVectorsCopy[2].getYComponent() - mecanumVectorsCopy[2].getXComponent()*mecanumVectorsCopy[3].getYComponent());
wheelPowers[3] = (mecanumVectorsCopy[2].getXComponent()*truePathingVectors[1].getYComponent() - truePathingVectors[1].getXComponent()*mecanumVectorsCopy[2].getYComponent()) / (mecanumVectorsCopy[2].getXComponent()*mecanumVectorsCopy[3].getYComponent() - mecanumVectorsCopy[3].getXComponent()*mecanumVectorsCopy[2].getYComponent());
double wheelPowerMax = Math.max(Math.max(Math.abs(wheelPowers[0]), Math.abs(wheelPowers[1])), Math.max(Math.abs(wheelPowers[2]), Math.abs(wheelPowers[3])));
if (wheelPowerMax > 1) {
wheelPowers[0] /= wheelPowerMax;
wheelPowers[1] /= wheelPowerMax;
wheelPowers[2] /= wheelPowerMax;
wheelPowers[3] /= wheelPowerMax;
}
return wheelPowers;
}
/**
* This takes in two Vectors, one static and one variable, and returns the scaling factor that,
* when multiplied to the variable Vector, results in magnitude of the sum of the static Vector
* and the scaled variable Vector being 1.
*
* IMPORTANT NOTE: I did not intend for this to be used for anything other than the method above
* this one in this class, so there will be errors if you input Vectors of length greater than 1,
* and it will scale up the variable Vector if the magnitude of the sum of the two input Vectors
* isn't greater than 1. So, just don't use this elsewhere. There's gotta be a better way to do
* whatever you're trying to do.
*
* I know that this is used outside of this class, however, I created this method so I get to
* use it if I want to. Also, it's only used once outside of the DriveVectorScaler class, and
* it's used to scale Vectors, as intended.
*
* @param staticVector the Vector that is held constant.
* @param variableVector the Vector getting scaled to make the sum of the input Vectors have a
* magnitude of 1.
* @return returns the scaling factor for the variable Vector.
*/
public double findNormalizingScaling(Vector staticVector, Vector variableVector) {
double a = Math.pow(variableVector.getXComponent(), 2) + Math.pow(variableVector.getYComponent(), 2);
double b = staticVector.getXComponent() * variableVector.getXComponent() + staticVector.getYComponent() * variableVector.getYComponent();
double c = Math.pow(staticVector.getXComponent(), 2) + Math.pow(staticVector.getYComponent(), 2) - 1.0;
return (-b + Math.sqrt(Math.pow(b, 2) - a*c))/(a);
}
}
package org.firstinspires.ftc.teamcode.pedroPathing.follower;
import static org.firstinspires.ftc.teamcode.pedroPathing.tuning.FollowerConstants.drivePIDFSwitch;
import static org.firstinspires.ftc.teamcode.pedroPathing.tuning.FollowerConstants.forwardZeroPowerAcceleration;
import static org.firstinspires.ftc.teamcode.pedroPathing.tuning.FollowerConstants.headingPIDFSwitch;
import static org.firstinspires.ftc.teamcode.pedroPathing.tuning.FollowerConstants.largeDrivePIDFFeedForward;
import static org.firstinspires.ftc.teamcode.pedroPathing.tuning.FollowerConstants.largeHeadingPIDFFeedForward;
import static org.firstinspires.ftc.teamcode.pedroPathing.tuning.FollowerConstants.largeTranslationalPIDFFeedForward;
import static org.firstinspires.ftc.teamcode.pedroPathing.tuning.FollowerConstants.lateralZeroPowerAcceleration;
import static org.firstinspires.ftc.teamcode.pedroPathing.tuning.FollowerConstants.smallDrivePIDFFeedForward;
import static org.firstinspires.ftc.teamcode.pedroPathing.tuning.FollowerConstants.smallHeadingPIDFFeedForward;
import static org.firstinspires.ftc.teamcode.pedroPathing.tuning.FollowerConstants.smallTranslationalPIDFFeedForward;
import static org.firstinspires.ftc.teamcode.pedroPathing.tuning.FollowerConstants.translationalPIDFSwitch;
import com.acmerobotics.dashboard.config.Config;
import com.acmerobotics.dashboard.telemetry.MultipleTelemetry;
import com.acmerobotics.roadrunner.geometry.Pose2d;
import com.qualcomm.robotcore.hardware.DcMotor;
import com.qualcomm.robotcore.hardware.DcMotorEx;
import com.qualcomm.robotcore.hardware.DcMotorSimple;
import com.qualcomm.robotcore.hardware.HardwareMap;
import com.qualcomm.robotcore.hardware.configuration.typecontainers.MotorConfigurationType;
import org.firstinspires.ftc.robotcore.external.Telemetry;
import org.firstinspires.ftc.teamcode.pedroPathing.localization.PoseUpdater;
import org.firstinspires.ftc.teamcode.pedroPathing.pathGeneration.BezierPoint;
import org.firstinspires.ftc.teamcode.pedroPathing.pathGeneration.MathFunctions;
import org.firstinspires.ftc.teamcode.pedroPathing.pathGeneration.Path;
import org.firstinspires.ftc.teamcode.pedroPathing.pathGeneration.PathBuilder;
import org.firstinspires.ftc.teamcode.pedroPathing.pathGeneration.PathCallback;
import org.firstinspires.ftc.teamcode.pedroPathing.pathGeneration.PathChain;
import org.firstinspires.ftc.teamcode.pedroPathing.pathGeneration.Vector;
import org.firstinspires.ftc.teamcode.pedroPathing.tuning.FollowerConstants;
import org.firstinspires.ftc.teamcode.pedroPathing.util.PIDFController;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;
/**
* This is the Follower class. It handles the actual following of the paths and all the on-the-fly
* calculations that are relevant for movement.
*
* @author Anyi Lin - 10158 Scott's Bots
* @author Aaron Yang - 10158 Scott's Bots
* @author Harrison Womack - 10158 Scott's Bots
* @version 1.0, 3/4/2024
*/
@Config
public class Follower {
private HardwareMap hardwareMap;
private DcMotorEx leftFront;
private DcMotorEx leftRear;
private DcMotorEx rightFront;
private DcMotorEx rightRear;
private List<DcMotorEx> motors;
private DriveVectorScaler driveVectorScaler;
private PoseUpdater poseUpdater;
private Pose2d closestPose;
private Path currentPath;
private PathChain currentPathChain;
private final int BEZIER_CURVE_BINARY_STEP_LIMIT = FollowerConstants.BEZIER_CURVE_BINARY_STEP_LIMIT;
private final int AVERAGED_VELOCITY_SAMPLE_NUMBER = FollowerConstants.AVERAGED_VELOCITY_SAMPLE_NUMBER;
private int chainIndex;
private long[] pathStartTimes;
private boolean followingPathChain;
private boolean holdingPosition;
private boolean isBusy;
private boolean auto = true;
private boolean reachedParametricPathEnd;
private boolean holdPositionAtEnd;
private double maxPower = 1;
private double previousSmallTranslationalIntegral;
private double previousLargeTranslationalIntegral;
private double holdPointTranslationalScaling = FollowerConstants.holdPointTranslationalScaling;
private double holdPointHeadingScaling = FollowerConstants.holdPointHeadingScaling;
public double driveError;
public double headingError;
private long reachedParametricPathEndTime;
private double[] drivePowers;
private Vector[] teleOpMovementVectors = new Vector[]{new Vector(), new Vector(), new Vector()};
private ArrayList<Vector> velocities = new ArrayList<>();
private ArrayList<Vector> accelerations = new ArrayList<>();
private Vector averageVelocity;
private Vector averagePreviousVelocity;
private Vector averageAcceleration;
private Vector smallTranslationalIntegralVector;
private Vector largeTranslationalIntegralVector;
public Vector driveVector;
public Vector headingVector;
public Vector translationalVector;
public Vector centripetalVector;
public Vector correctiveVector;
private PIDFController smallTranslationalPIDF = new PIDFController(FollowerConstants.smallTranslationalPIDFCoefficients);
private PIDFController smallTranslationalIntegral = new PIDFController(FollowerConstants.smallTranslationalIntegral);
private PIDFController largeTranslationalPIDF = new PIDFController(FollowerConstants.largeTranslationalPIDFCoefficients);
private PIDFController largeTranslationalIntegral = new PIDFController(FollowerConstants.largeTranslationalIntegral);
private PIDFController smallHeadingPIDF = new PIDFController(FollowerConstants.smallHeadingPIDFCoefficients);
private PIDFController largeHeadingPIDF = new PIDFController(FollowerConstants.largeHeadingPIDFCoefficients);
private PIDFController smallDrivePIDF = new PIDFController(FollowerConstants.smallDrivePIDFCoefficients);
private PIDFController largeDrivePIDF = new PIDFController(FollowerConstants.largeDrivePIDFCoefficients);
public static boolean useTranslational = true;
public static boolean useCentripetal = true;
public static boolean useHeading = true;
public static boolean useDrive = true;
/**
* This creates a new Follower given a HardwareMap.
*
* @param hardwareMap HardwareMap required
*/
public Follower(HardwareMap hardwareMap) {
this.hardwareMap = hardwareMap;
initialize();
}
/**
* This creates a new Follower given a HardwareMap and sets whether the Follower is being used
* in autonomous or teleop.
*
* @param hardwareMap HardwareMap required
* @param setAuto sets whether or not the Follower is being used in autonomous or teleop
*/
public Follower(HardwareMap hardwareMap, boolean setAuto) {
this.hardwareMap = hardwareMap;
setAuto(setAuto);
initialize();
}
/**
* This initializes the follower.
* In this, the DriveVectorScaler and PoseUpdater are instantiated, the drive motors are
* initialized and their behavior is set, and the variables involved in approximating first and
* second derivatives for teleop are set.
*/
public void initialize() {
driveVectorScaler = new DriveVectorScaler(FollowerConstants.frontLeftVector);
poseUpdater = new PoseUpdater(hardwareMap);
leftFront = hardwareMap.get(DcMotorEx.class, "leftFront");
leftRear = hardwareMap.get(DcMotorEx.class, "leftRear");
rightRear = hardwareMap.get(DcMotorEx.class, "rightRear");
rightFront = hardwareMap.get(DcMotorEx.class, "rightFront");
// TODO: Make sure that this is the direction your motors need to be reversed in.
leftFront.setDirection(DcMotorSimple.Direction.REVERSE);
leftRear.setDirection(DcMotorSimple.Direction.REVERSE);
motors = Arrays.asList(leftFront, leftRear, rightFront, rightRear);
for (DcMotorEx motor : motors) {
MotorConfigurationType motorConfigurationType = motor.getMotorType().clone();
motorConfigurationType.setAchieveableMaxRPMFraction(1.0);
motor.setMotorType(motorConfigurationType);
}
for (DcMotorEx motor : motors) {
motor.setZeroPowerBehavior(DcMotor.ZeroPowerBehavior.FLOAT);
}
for (int i = 0; i < AVERAGED_VELOCITY_SAMPLE_NUMBER; i++) {
velocities.add(new Vector());
}
for (int i = 0; i < AVERAGED_VELOCITY_SAMPLE_NUMBER / 2; i++) {
accelerations.add(new Vector());
}
calculateAveragedVelocityAndAcceleration();
}
/**
* This sets the maximum power the motors are allowed to use.
*
* @param set the maximum motor power; it is clamped to [0, 1].
*/
public void setMaxPower(double set) {
maxPower = MathFunctions.clamp(set, 0, 1);
}
/**
* This handles the limiting of the drive powers array to the max power.
*/
public void limitDrivePowers() {
for (int i = 0; i < drivePowers.length; i++) {
if (Math.abs(drivePowers[i]) > maxPower) {
drivePowers[i] = maxPower * MathFunctions.getSign(drivePowers[i]);
}
}
}
/**
* This returns the current pose from the PoseUpdater.
*
* @return returns the pose
*/
public Pose2d getPose() {
return poseUpdater.getPose();
}
/**
* This sets the current pose in the PoseUpdater using Road Runner's setPoseEstimate() method.
*
* @param pose The pose to set the current pose to.
*/
public void setPose(Pose2d pose) {
poseUpdater.setPose(pose);
}
/**
* This returns the current velocity of the robot as a Vector.
*
* @return returns the current velocity as a Vector.
*/
public Vector getVelocity() {
return poseUpdater.getVelocity();
}
/**
* This returns the current acceleration of the robot as a Vector.
*
* @return returns the current acceleration as a Vector.
*/
public Vector getAcceleration() {
return poseUpdater.getAcceleration();
}
/**
* This returns the magnitude of the current velocity. For when you only need the magnitude.
*
* @return returns the magnitude of the current velocity.
*/
public double getVelocityMagnitude() {
return poseUpdater.getVelocity().getMagnitude();
}
/**
* This sets the starting pose. Do not run this after moving at all.
*
* @param pose the pose to set the starting pose to.
*/
public void setStartingPose(Pose2d pose) {
poseUpdater.setStartingPose(pose);
}
/**
* This sets the current pose using offsets, so there is no reset time delay. This is generally
* better than the Road Runner reset. Think of using offsets as setting trim in an aircraft.
* These offsets can be cleared, so beware of using the resetOffset() method.
*
* @param set The pose to set the current pose to.
*/
public void setCurrentPoseWithOffset(Pose2d set) {
poseUpdater.setCurrentPoseWithOffset(set);
}
/**
* This sets the offset for only the x position.
*
* @param xOffset This sets the offset.
*/
public void setXOffset(double xOffset) {
poseUpdater.setXOffset(xOffset);
}
/**
* This sets the offset for only the y position.
*
* @param yOffset This sets the offset.
*/
public void setYOffset(double yOffset) {
poseUpdater.setYOffset(yOffset);
}
/**
* This sets the offset for only the heading.
*
* @param headingOffset This sets the offset.
*/
public void setHeadingOffset(double headingOffset) {
poseUpdater.setHeadingOffset(headingOffset);
}
/**
* This returns the x offset.
*
* @return returns the x offset.
*/
public double getXOffset() {
return poseUpdater.getXOffset();
}
/**
* This returns the y offset.
*
* @return returns the y offset.
*/
public double getYOffset() {
return poseUpdater.getYOffset();
}
/**
* This returns the heading offset.
*
* @return returns the heading offset.
*/
public double getHeadingOffset() {
return poseUpdater.getHeadingOffset();
}
/**
* This resets all offsets set to the PoseUpdater. If you have reset your pose using the
* setCurrentPoseWithOffset(Pose2d set) method, then your pose will be returned to what the
* PoseUpdater thinks your pose would be, not the pose you reset to.
*/
public void resetOffset() {
poseUpdater.resetOffset();
}
/**
 * This makes the Follower hold a specified Point while facing a specified heading.
 *
 * @param point the Point to stay at.
 * @param heading the heading to face.
 */
public void holdPoint(BezierPoint point, double heading) {
breakFollowing();
holdingPosition = true;
isBusy = false;
followingPathChain = false;
currentPath = new Path(point);
currentPath.setConstantHeadingInterpolation(heading);
closestPose = currentPath.getClosestPoint(poseUpdater.getPose(), 1);
}
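/*
 * A minimal usage sketch. The coordinates, heading, and the Point constructor with a
 * CARTESIAN type flag are assumptions for illustration, not values defined by this class:
 *
 *   follower.holdPoint(new BezierPoint(new Point(36.0, 60.0, Point.CARTESIAN)), Math.toRadians(90));
 *   // keep calling follower.update() every loop so the holding PIDFs stay active
 */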
/**
 * This follows a Path.
 * If holdEnd is set, this also makes the Follower hold the last Point on the Path once done.
 *
 * @param path the Path to follow.
 * @param holdEnd whether the Follower should hold its position at the end of the Path.
 */
public void followPath(Path path, boolean holdEnd) {
breakFollowing();
holdPositionAtEnd = holdEnd;
isBusy = true;
followingPathChain = false;
currentPath = path;
closestPose = currentPath.getClosestPoint(poseUpdater.getPose(), BEZIER_CURVE_BINARY_STEP_LIMIT);
}
/**
* This follows a Path.
*
* @param path the Path to follow.
*/
public void followPath(Path path) {
followPath(path, false);
}
/**
 * This follows a PathChain. Drive vector projection is only done on the last Path.
 * If holdEnd is set, this also makes the Follower hold the last Point on the PathChain once done.
 *
 * @param pathChain the PathChain to follow.
 * @param holdEnd whether the Follower should hold its position at the end of the PathChain.
 */
public void followPath(PathChain pathChain, boolean holdEnd) {
breakFollowing();
holdPositionAtEnd = holdEnd;
pathStartTimes = new long[pathChain.size()];
pathStartTimes[0] = System.currentTimeMillis();
isBusy = true;
followingPathChain = true;
chainIndex = 0;
currentPathChain = pathChain;
currentPath = pathChain.getPath(chainIndex);
closestPose = currentPath.getClosestPoint(poseUpdater.getPose(), BEZIER_CURVE_BINARY_STEP_LIMIT);
}
/**
* This follows a PathChain. Drive vector projection is only done on the last Path.
*
* @param pathChain the PathChain to follow.
*/
public void followPath(PathChain pathChain) {
followPath(pathChain, false);
}
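/*
 * A minimal autonomous usage sketch. opModeIsActive() and the pre-built myPathChain are
 * assumptions from a typical LinearOpMode setup, not part of this class:
 *
 *   Follower follower = new Follower(hardwareMap);
 *   follower.followPath(myPathChain, true); // hold the final Point once the chain finishes
 *   while (opModeIsActive() && follower.isBusy()) {
 *       follower.update(); // must run every loop to refresh the pose and motor powers
 *   }
 */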
/**
* This calls an update to the PoseUpdater, which updates the robot's current position estimate.
* This also updates all the Follower's PIDFs, which updates the motor powers.
*/
public void update() {
poseUpdater.update();
if (auto) {
if (holdingPosition) {
closestPose = currentPath.getClosestPoint(poseUpdater.getPose(), 1);
drivePowers = driveVectorScaler.getDrivePowers(MathFunctions.scalarMultiplyVector(getTranslationalCorrection(), holdPointTranslationalScaling), MathFunctions.scalarMultiplyVector(getHeadingVector(), holdPointHeadingScaling), new Vector(), poseUpdater.getPose().getHeading());
limitDrivePowers();
for (int i = 0; i < motors.size(); i++) {
motors.get(i).setPower(drivePowers[i]);
}
} else {
if (isBusy) {
closestPose = currentPath.getClosestPoint(poseUpdater.getPose(), BEZIER_CURVE_BINARY_STEP_LIMIT);
if (followingPathChain) updateCallbacks();
drivePowers = driveVectorScaler.getDrivePowers(getCorrectiveVector(), getHeadingVector(), getDriveVector(), poseUpdater.getPose().getHeading());
limitDrivePowers();
for (int i = 0; i < motors.size(); i++) {
motors.get(i).setPower(drivePowers[i]);
}
}
if (currentPath.isAtParametricEnd()) {
if (followingPathChain && chainIndex < currentPathChain.size() - 1) {
// Not at last path, keep going
breakFollowing();
chainIndex++;
// Record the start time for the Path that is about to be followed.
pathStartTimes[chainIndex] = System.currentTimeMillis();
isBusy = true;
followingPathChain = true;
currentPath = currentPathChain.getPath(chainIndex);
closestPose = currentPath.getClosestPoint(poseUpdater.getPose(), BEZIER_CURVE_BINARY_STEP_LIMIT);
} else {
// At last path, run some end detection stuff
// set isBusy to false if at end
if (!reachedParametricPathEnd) {
reachedParametricPathEnd = true;
reachedParametricPathEndTime = System.currentTimeMillis();
}
if ((System.currentTimeMillis() - reachedParametricPathEndTime > currentPath.getPathEndTimeoutConstraint()) || (poseUpdater.getVelocity().getMagnitude() < currentPath.getPathEndVelocityConstraint() && MathFunctions.distance(poseUpdater.getPose(), closestPose) < currentPath.getPathEndTranslationalConstraint() && MathFunctions.getSmallestAngleDifference(poseUpdater.getPose().getHeading(), currentPath.getClosestPointHeadingGoal()) < currentPath.getPathEndHeadingConstraint())) {
if (holdPositionAtEnd) {
holdPositionAtEnd = false;
holdPoint(new BezierPoint(currentPath.getLastControlPoint()), currentPath.getHeadingGoal(1));
} else {
breakFollowing();
}
}
}
}
}
} else {
// Insert the newest velocity sample at the front and drop the oldest from the back so the
// list acts as a fixed-size rolling window (newest-first ordering matches the averaging below).
velocities.add(0, poseUpdater.getVelocity());
velocities.remove(velocities.size() - 1);
calculateAveragedVelocityAndAcceleration();
drivePowers = driveVectorScaler.getDrivePowers(teleOpMovementVectors[0], teleOpMovementVectors[1], teleOpMovementVectors[2], poseUpdater.getPose().getHeading());
limitDrivePowers();
for (int i = 0; i < motors.size(); i++) {
motors.get(i).setPower(drivePowers[i]);
}
}
}
/**
* This calculates an averaged approximate velocity and acceleration. This is used for a
* real-time correction of centripetal force, which is used in teleop.
*/
public void calculateAveragedVelocityAndAcceleration() {
averageVelocity = new Vector();
averagePreviousVelocity = new Vector();
for (int i = 0; i < velocities.size() / 2; i++) {
averageVelocity = MathFunctions.addVectors(averageVelocity, velocities.get(i));
}
averageVelocity = MathFunctions.scalarMultiplyVector(averageVelocity, 1.0 / ((double) velocities.size() / 2));
for (int i = velocities.size() / 2; i < velocities.size(); i++) {
averagePreviousVelocity = MathFunctions.addVectors(averagePreviousVelocity, velocities.get(i));
}
averagePreviousVelocity = MathFunctions.scalarMultiplyVector(averagePreviousVelocity, 1.0 / ((double) velocities.size() / 2));
// Push the newest finite-difference sample to the front and drop the oldest from the back,
// keeping the acceleration window a fixed size.
accelerations.add(0, MathFunctions.subtractVectors(averageVelocity, averagePreviousVelocity));
accelerations.remove(accelerations.size() - 1);
averageAcceleration = new Vector();
for (int i = 0; i < accelerations.size(); i++) {
averageAcceleration = MathFunctions.addVectors(averageAcceleration, accelerations.get(i));
}
averageAcceleration = MathFunctions.scalarMultiplyVector(averageAcceleration, 1.0 / accelerations.size());
}
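/*
 * In effect, the velocity window is split in half: one half averages to averageVelocity, the
 * other to averagePreviousVelocity, and their difference is pushed into the acceleration
 * window, so averageAcceleration is a doubly-smoothed finite difference of the measured
 * velocity.
 */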
/**
* This checks if any PathCallbacks should be run right now, and runs them if applicable.
*/
public void updateCallbacks() {
for (PathCallback callback : currentPathChain.getCallbacks()) {
if (!callback.hasBeenRun()) {
if (callback.getType() == PathCallback.PARAMETRIC) {
// parametric call back
if (chainIndex == callback.getIndex() && (getCurrentTValue() >= callback.getStartCondition() || MathFunctions.roughlyEquals(getCurrentTValue(), callback.getStartCondition()))) {
callback.run();
}
} else {
// time based call back
if (chainIndex >= callback.getIndex() && System.currentTimeMillis() - pathStartTimes[callback.getIndex()] > callback.getStartCondition()) {
callback.run();
}
}
}
}
}
/**
* This resets the PIDFs and stops following the current Path.
*/
public void breakFollowing() {
holdingPosition = false;
isBusy = false;
reachedParametricPathEnd = false;
smallDrivePIDF.reset();
largeDrivePIDF.reset();
smallHeadingPIDF.reset();
largeHeadingPIDF.reset();
smallTranslationalPIDF.reset();
smallTranslationalIntegral.reset();
smallTranslationalIntegralVector = new Vector();
previousSmallTranslationalIntegral = 0;
largeTranslationalPIDF.reset();
largeTranslationalIntegral.reset();
largeTranslationalIntegralVector = new Vector();
previousLargeTranslationalIntegral = 0;
driveVector = new Vector();
headingVector = new Vector();
translationalVector = new Vector();
centripetalVector = new Vector();
correctiveVector = new Vector();
driveError = 0;
headingError = 0;
for (int i = 0; i < motors.size(); i++) {
motors.get(i).setPower(0);
}
}
/**
* This returns if the Follower is currently following a Path or a PathChain.
*
* @return returns if the Follower is busy.
*/
public boolean isBusy() {
return isBusy;
}
/**
* Sets the correctional, heading, and drive movement vectors for teleop enhancements.
* The correctional Vector only accounts for an approximated centripetal correction.
*/
public void setMovementVectors(Vector correctional, Vector heading, Vector drive) {
teleOpMovementVectors = new Vector[]{correctional, heading, drive};
}
/**
* This returns a Vector in the direction the robot must go to move along the path. This Vector
* takes into account the projected position of the robot to calculate how much power is needed.
* <p>
* Note: This vector is clamped to be at most 1 in magnitude.
*
* @return returns the drive vector.
*/
public Vector getDriveVector() {
if (!useDrive) return new Vector();
if (followingPathChain && chainIndex < currentPathChain.size() - 1) {
return new Vector(1, currentPath.getClosestPointTangentVector().getTheta());
}
driveError = getDriveVelocityError();
if (Math.abs(driveError) < drivePIDFSwitch) {
smallDrivePIDF.updateError(driveError);
driveVector = new Vector(MathFunctions.clamp(smallDrivePIDF.runPIDF() + smallDrivePIDFFeedForward * MathFunctions.getSign(driveError), -1, 1), currentPath.getClosestPointTangentVector().getTheta());
return MathFunctions.copyVector(driveVector);
}
largeDrivePIDF.updateError(driveError);
driveVector = new Vector(MathFunctions.clamp(largeDrivePIDF.runPIDF() + largeDrivePIDFFeedForward * MathFunctions.getSign(driveError), -1, 1), currentPath.getClosestPointTangentVector().getTheta());
return MathFunctions.copyVector(driveVector);
}
/**
 * This returns the velocity error between the robot's current velocity and the velocity it
 * needs to be at to reach the end of the Path under some specified deceleration (technically
 * just a negative acceleration).
 *
 * @return returns the signed magnitude of the velocity error along the path tangent.
 */
public double getDriveVelocityError() {
double distanceToGoal;
if (!currentPath.isAtParametricEnd()) {
distanceToGoal = currentPath.length() * (1 - currentPath.getClosestPointTValue());
} else {
Vector offset = new Vector();
offset.setOrthogonalComponents(getPose().getX() - currentPath.getLastControlPoint().getX(), getPose().getY() - currentPath.getLastControlPoint().getY());
distanceToGoal = MathFunctions.dotProduct(currentPath.getEndTangent(), offset);
}
Vector distanceToGoalVector = MathFunctions.scalarMultiplyVector(MathFunctions.normalizeVector(currentPath.getClosestPointTangentVector()), distanceToGoal);
Vector velocity = new Vector(MathFunctions.dotProduct(getVelocity(), MathFunctions.normalizeVector(currentPath.getClosestPointTangentVector())), currentPath.getClosestPointTangentVector().getTheta());
Vector forwardHeadingVector = new Vector(1.0, poseUpdater.getPose().getHeading());
double forwardVelocity = MathFunctions.dotProduct(forwardHeadingVector, velocity);
double forwardDistanceToGoal = MathFunctions.dotProduct(forwardHeadingVector, distanceToGoalVector);
double forwardVelocityGoal = MathFunctions.getSign(forwardDistanceToGoal) * Math.sqrt(Math.abs(-2 * currentPath.getZeroPowerAccelerationMultiplier() * forwardZeroPowerAcceleration * forwardDistanceToGoal));
double forwardVelocityZeroPowerDecay = forwardVelocity - MathFunctions.getSign(forwardDistanceToGoal) * Math.sqrt(Math.abs(Math.pow(forwardVelocity, 2) + 2 * forwardZeroPowerAcceleration * forwardDistanceToGoal));
Vector lateralHeadingVector = new Vector(1.0, poseUpdater.getPose().getHeading() - Math.PI / 2);
double lateralVelocity = MathFunctions.dotProduct(lateralHeadingVector, velocity);
double lateralDistanceToGoal = MathFunctions.dotProduct(lateralHeadingVector, distanceToGoalVector);
double lateralVelocityGoal = MathFunctions.getSign(lateralDistanceToGoal) * Math.sqrt(Math.abs(-2 * currentPath.getZeroPowerAccelerationMultiplier() * lateralZeroPowerAcceleration * lateralDistanceToGoal));
double lateralVelocityZeroPowerDecay = lateralVelocity - MathFunctions.getSign(lateralDistanceToGoal) * Math.sqrt(Math.abs(Math.pow(lateralVelocity, 2) + 2 * lateralZeroPowerAcceleration * lateralDistanceToGoal));
Vector forwardVelocityError = new Vector(forwardVelocityGoal - forwardVelocityZeroPowerDecay - forwardVelocity, forwardHeadingVector.getTheta());
Vector lateralVelocityError = new Vector(lateralVelocityGoal - lateralVelocityZeroPowerDecay - lateralVelocity, lateralHeadingVector.getTheta());
Vector velocityErrorVector = MathFunctions.addVectors(forwardVelocityError, lateralVelocityError);
return velocityErrorVector.getMagnitude() * MathFunctions.getSign(MathFunctions.dotProduct(velocityErrorVector, currentPath.getClosestPointTangentVector()));
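/*
 * The velocity goals above follow from the constant-acceleration relation v^2 = v0^2 + 2*a*d:
 * the speed that decays to zero over the remaining distance d under deceleration a is
 * |v0| = sqrt(2*a*d), signed by the direction of d. The ZeroPowerDecay terms estimate how much
 * speed the drivetrain would shed on its own at zero power, so only the remaining error is
 * returned for the drive PIDFs to correct.
 */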
}
/**
* This returns a Vector in the direction of the robot that contains the heading correction
* as its magnitude. Positive heading correction turns the robot counter-clockwise, and negative
* heading correction values turn the robot clockwise. In other words, Pedro Pathing uses a
* right-handed coordinate system.
* <p>
* Note: This vector is clamped to be at most 1 in magnitude.
*
* @return returns the heading vector.
*/
public Vector getHeadingVector() {
if (!useHeading) return new Vector();
headingError = MathFunctions.getTurnDirection(poseUpdater.getPose().getHeading(), currentPath.getClosestPointHeadingGoal()) * MathFunctions.getSmallestAngleDifference(poseUpdater.getPose().getHeading(), currentPath.getClosestPointHeadingGoal());
if (Math.abs(headingError) < headingPIDFSwitch) {
smallHeadingPIDF.updateError(headingError);
headingVector = new Vector(MathFunctions.clamp(smallHeadingPIDF.runPIDF() + smallHeadingPIDFFeedForward * MathFunctions.getTurnDirection(poseUpdater.getPose().getHeading(), currentPath.getClosestPointHeadingGoal()), -1, 1), poseUpdater.getPose().getHeading());
return MathFunctions.copyVector(headingVector);
}
largeHeadingPIDF.updateError(headingError);
headingVector = new Vector(MathFunctions.clamp(largeHeadingPIDF.runPIDF() + largeHeadingPIDFFeedForward * MathFunctions.getTurnDirection(poseUpdater.getPose().getHeading(), currentPath.getClosestPointHeadingGoal()), -1, 1), poseUpdater.getPose().getHeading());
return MathFunctions.copyVector(headingVector);
}
/**
* This returns a combined Vector in the direction the robot must go to correct both translational
* error as well as centripetal force.
* <p>
* Note: This vector is clamped to be at most 1 in magnitude.
*
* @return returns the corrective vector.
*/
public Vector getCorrectiveVector() {
Vector centripetal = getCentripetalForceCorrection();
Vector translational = getTranslationalCorrection();
Vector corrective = MathFunctions.addVectors(centripetal, translational);
if (corrective.getMagnitude() > 1) {
return MathFunctions.addVectors(centripetal, MathFunctions.scalarMultiplyVector(translational, driveVectorScaler.findNormalizingScaling(centripetal, translational)));
}
correctiveVector = MathFunctions.copyVector(corrective);
return corrective;
}
/**
* This returns a Vector in the direction the robot must go to account for only translational
* error.
* <p>
* Note: This vector is clamped to be at most 1 in magnitude.
*
* @return returns the translational correction vector.
*/
public Vector getTranslationalCorrection() {
if (!useTranslational) return new Vector();
Vector translationalVector = new Vector();
double x = closestPose.getX() - poseUpdater.getPose().getX();
double y = closestPose.getY() - poseUpdater.getPose().getY();
translationalVector.setOrthogonalComponents(x, y);
if (!(currentPath.isAtParametricEnd() || currentPath.isAtParametricStart())) {
translationalVector = MathFunctions.subtractVectors(translationalVector, new Vector(MathFunctions.dotProduct(translationalVector, MathFunctions.normalizeVector(currentPath.getClosestPointTangentVector())), currentPath.getClosestPointTangentVector().getTheta()));
smallTranslationalIntegralVector = MathFunctions.subtractVectors(smallTranslationalIntegralVector, new Vector(MathFunctions.dotProduct(smallTranslationalIntegralVector, MathFunctions.normalizeVector(currentPath.getClosestPointTangentVector())), currentPath.getClosestPointTangentVector().getTheta()));
largeTranslationalIntegralVector = MathFunctions.subtractVectors(largeTranslationalIntegralVector, new Vector(MathFunctions.dotProduct(largeTranslationalIntegralVector, MathFunctions.normalizeVector(currentPath.getClosestPointTangentVector())), currentPath.getClosestPointTangentVector().getTheta()));
}
if (MathFunctions.distance(poseUpdater.getPose(), closestPose) < translationalPIDFSwitch) {
smallTranslationalIntegral.updateError(translationalVector.getMagnitude());
smallTranslationalIntegralVector = MathFunctions.addVectors(smallTranslationalIntegralVector, new Vector(smallTranslationalIntegral.runPIDF() - previousSmallTranslationalIntegral, translationalVector.getTheta()));
previousSmallTranslationalIntegral = smallTranslationalIntegral.runPIDF();
smallTranslationalPIDF.updateError(translationalVector.getMagnitude());
translationalVector.setMagnitude(smallTranslationalPIDF.runPIDF() + smallTranslationalPIDFFeedForward);
translationalVector = MathFunctions.addVectors(translationalVector, smallTranslationalIntegralVector);
} else {
largeTranslationalIntegral.updateError(translationalVector.getMagnitude());
largeTranslationalIntegralVector = MathFunctions.addVectors(largeTranslationalIntegralVector, new Vector(largeTranslationalIntegral.runPIDF() - previousLargeTranslationalIntegral, translationalVector.getTheta()));
previousLargeTranslationalIntegral = largeTranslationalIntegral.runPIDF();
largeTranslationalPIDF.updateError(translationalVector.getMagnitude());
translationalVector.setMagnitude(largeTranslationalPIDF.runPIDF() + largeTranslationalPIDFFeedForward);
translationalVector = MathFunctions.addVectors(translationalVector, largeTranslationalIntegralVector);
}
translationalVector.setMagnitude(MathFunctions.clamp(translationalVector.getMagnitude(), 0, 1));
this.translationalVector = MathFunctions.copyVector(translationalVector);
return translationalVector;
}
/**
* This returns the raw translational error, or how far off the closest point the robot is.
*
* @return This returns the raw translational error as a Vector.
*/
public Vector getTranslationalError() {
Vector error = new Vector();
double x = closestPose.getX() - poseUpdater.getPose().getX();
double y = closestPose.getY() - poseUpdater.getPose().getY();
error.setOrthogonalComponents(x, y);
return error;
}
/**
* This returns a Vector in the direction the robot must go to account for only centripetal
* force.
* <p>
* Note: This vector is clamped to be between [0, 1] in magnitude.
*
* @return returns the centripetal force correction vector.
*/
public Vector getCentripetalForceCorrection() {
if (!useCentripetal) return new Vector();
double curvature;
if (auto) {
curvature = currentPath.getClosestPointCurvature();
} else {
double yPrime = averageVelocity.getYComponent() / averageVelocity.getXComponent();
double yDoublePrime = averageAcceleration.getYComponent() / averageVelocity.getXComponent();
// Curvature of a plane curve y(x): kappa = y'' / (1 + y'^2)^(3/2).
curvature = yDoublePrime / Math.pow(Math.sqrt(1 + Math.pow(yPrime, 2)), 3);
}
if (Double.isNaN(curvature)) return new Vector();
centripetalVector = new Vector(MathFunctions.clamp(FollowerConstants.centripetalScaling * FollowerConstants.mass * Math.pow(MathFunctions.dotProduct(poseUpdater.getVelocity(), MathFunctions.normalizeVector(currentPath.getClosestPointTangentVector())), 2) * curvature, -1, 1), currentPath.getClosestPointTangentVector().getTheta() + Math.PI / 2 * MathFunctions.getSign(currentPath.getClosestPointNormalVector().getTheta()));
return centripetalVector;
}
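/*
 * The magnitude computed above is the classical centripetal force F = m * v^2 * kappa, where
 * kappa is the curvature (1 / turn radius) and v is the velocity component along the path
 * tangent, scaled by the tunable centripetalScaling constant and clamped to [-1, 1].
 */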
/**
* This returns the closest pose to the robot on the Path the Follower is currently following.
* This closest pose is calculated through a binary search method with some specified number of
* steps to search. By default, 10 steps are used, which should be more than enough.
*
* @return returns the closest pose.
*/
public Pose2d getClosestPose() {
return closestPose;
}
/**
* This sets whether or not the Follower is being used in auto or teleop.
*
* @param set sets auto or not
*/
public void setAuto(boolean set) {
auto = set;
}
/**
* This returns whether the follower is at the parametric end of its current Path.
* The parametric end is determined by if the closest Point t-value is greater than some specified
* end t-value.
* If running a PathChain, this returns true only if at parametric end of last Path in the PathChain.
*
* @return returns whether the Follower is at the parametric end of its Path.
*/
public boolean atParametricEnd() {
if (followingPathChain) {
if (chainIndex == currentPathChain.size() - 1) return currentPath.isAtParametricEnd();
return false;
}
return currentPath.isAtParametricEnd();
}
/**
* This returns the t value of the closest point on the current Path to the robot.
* In the absence of a current Path, it returns 1.0.
*
* @return returns the current t value.
*/
public double getCurrentTValue() {
if (isBusy) return currentPath.getClosestPointTValue();
return 1.0;
}
/**
* This returns the current path number. For following Paths, this will return 0. For PathChains,
* this will return the current path number. For holding Points, this will also return 0.
*
* @return returns the current path number.
*/
public double getCurrentPathNumber() {
if (!followingPathChain) return 0;
return chainIndex;
}
/**
* This returns a new PathBuilder object for easily building PathChains.
*
* @return returns a new PathBuilder object.
*/
public PathBuilder pathBuilder() {
return new PathBuilder();
}
/**
* This writes out information about the various motion Vectors to the Telemetry specified.
*
* @param telemetry this is an instance of Telemetry or the FTC Dashboard telemetry that this
* method will use to output the debug data.
*/
public void telemetryDebug(MultipleTelemetry telemetry) {
telemetry.addData("follower busy", isBusy());
telemetry.addData("heading error", headingError);
telemetry.addData("heading vector magnitude", headingVector.getMagnitude());
telemetry.addData("corrective vector magnitude", correctiveVector.getMagnitude());
telemetry.addData("corrective vector heading", correctiveVector.getTheta());
telemetry.addData("translational error magnitude", getTranslationalError().getMagnitude());
telemetry.addData("translational error direction", getTranslationalError().getTheta());
telemetry.addData("translational vector magnitude", translationalVector.getMagnitude());
telemetry.addData("translational vector heading", translationalVector.getTheta());
telemetry.addData("centripetal vector magnitude", centripetalVector.getMagnitude());
telemetry.addData("centripetal vector heading", centripetalVector.getTheta());
telemetry.addData("drive error", driveError);
telemetry.addData("drive vector magnitude", driveVector.getMagnitude());
telemetry.addData("drive vector heading", driveVector.getTheta());
telemetry.addData("x", getPose().getX());
telemetry.addData("y", getPose().getY());
telemetry.addData("heading", getPose().getHeading());
telemetry.addData("velocity magnitude", getVelocity().getMagnitude());
telemetry.addData("velocity heading", getVelocity().getTheta());
telemetry.update();
}
/**
* This writes out information about the various motion Vectors to the Telemetry specified.
*
* @param telemetry this is an instance of Telemetry or the FTC Dashboard telemetry that this
* method will use to output the debug data.
*/
public void telemetryDebug(Telemetry telemetry) {
telemetryDebug(new MultipleTelemetry(telemetry));
}
}

package org.firstinspires.ftc.teamcode.pedroPathing.localization;
import com.acmerobotics.roadrunner.geometry.Pose2d;
import com.qualcomm.hardware.lynx.LynxModule;
import com.qualcomm.robotcore.hardware.HardwareMap;
import com.qualcomm.robotcore.hardware.IMU;
import org.firstinspires.ftc.robotcore.external.navigation.AngleUnit;
import org.firstinspires.ftc.teamcode.pedroPathing.pathGeneration.MathFunctions;
import org.firstinspires.ftc.teamcode.pedroPathing.pathGeneration.Vector;
import java.util.ArrayList;
import java.util.List;
/**
* This is the PoseUpdater class. This class handles getting pose data from the localizer and returning
* the information in a useful way to the Follower.
*
* @author Anyi Lin - 10158 Scott's Bots
* @author Aaron Yang - 10158 Scott's Bots
* @author Harrison Womack - 10158 Scott's Bots
* @version 1.0, 3/4/2024
*/
public class PoseUpdater {
private HardwareMap hardwareMap;
private IMU imu;
private ThreeWheelLocalizer localizer;
private Pose2d startingPose = new Pose2d(0,0,0);
private Pose2d currentPose = startingPose;
private Pose2d previousPose = startingPose;
private Vector currentVelocity = new Vector();
private Vector previousVelocity = new Vector();
private Vector currentAcceleration = new Vector();
private double xOffset = 0;
private double yOffset = 0;
private double headingOffset = 0;
private long previousPoseTime;
private long currentPoseTime;
/**
* Creates a new PoseUpdater from a HardwareMap.
*
* @param hardwareMap the HardwareMap
*/
public PoseUpdater(HardwareMap hardwareMap) {
this.hardwareMap = hardwareMap;
for (LynxModule module : hardwareMap.getAll(LynxModule.class)) {
module.setBulkCachingMode(LynxModule.BulkCachingMode.AUTO);
}
List<Integer> lastTrackingEncPositions = new ArrayList<>();
List<Integer> lastTrackingEncVels = new ArrayList<>();
localizer = new ThreeWheelLocalizer(hardwareMap, lastTrackingEncPositions, lastTrackingEncVels);
}
/**
* This updates the robot's pose, as well as updating the previous pose, velocity, and
* acceleration. The cache for the current pose, velocity, and acceleration is cleared, and
* the time stamps are updated as well.
*/
public void update() {
previousVelocity = getVelocity();
previousPose = applyOffset(getRawPose());
currentPose = null;
currentVelocity = null;
currentAcceleration = null;
previousPoseTime = currentPoseTime;
currentPoseTime = System.nanoTime();
localizer.update();
}
/**
* This sets the starting pose. Do not run this after moving at all.
*
* @param set the Pose to set the starting pose to.
*/
public void setStartingPose(Pose2d set) {
startingPose = set;
previousPose = startingPose;
previousPoseTime = System.nanoTime();
currentPoseTime = System.nanoTime();
localizer.setPoseEstimate(set);
}
/**
* This sets the current pose using offsets, so there is no reset time delay. This is generally
* better than the Road Runner reset. Think of using offsets as setting trim in an aircraft.
* These offsets can be cleared, so beware of using the resetOffset() method.
*
* @param set The pose to set the current pose to.
*/
public void setCurrentPoseWithOffset(Pose2d set) {
Pose2d currentPose = getRawPose();
setXOffset(set.getX() - currentPose.getX());
setYOffset(set.getY() - currentPose.getY());
setHeadingOffset(MathFunctions.getTurnDirection(currentPose.getHeading(), set.getHeading()) * MathFunctions.getSmallestAngleDifference(currentPose.getHeading(), set.getHeading()));
}
/**
* This sets the offset for only the x position.
*
* @param offset This sets the offset.
*/
public void setXOffset(double offset) {
xOffset = offset;
}
/**
* This sets the offset for only the y position.
*
* @param offset This sets the offset.
*/
public void setYOffset(double offset) {
yOffset = offset;
}
/**
* This sets the offset for only the heading.
*
* @param offset This sets the offset.
*/
public void setHeadingOffset(double offset) {
headingOffset = offset;
}
/**
* This returns the x offset.
*
* @return returns the x offset.
*/
public double getXOffset() {
return xOffset;
}
/**
* This returns the y offset.
*
* @return returns the y offset.
*/
public double getYOffset() {
return yOffset;
}
/**
* This returns the heading offset.
*
* @return returns the heading offset.
*/
public double getHeadingOffset() {
return headingOffset;
}
/**
* This applies the offset to a specified Pose.
*
* @param pose The pose to be offset.
* @return This returns a new Pose with the offset applied.
*/
public Pose2d applyOffset(Pose2d pose) {
return new Pose2d(pose.getX()+xOffset, pose.getY()+yOffset, pose.getHeading()+headingOffset);
}
/**
* This resets all offsets set to the PoseUpdater. If you have reset your pose using the
* setCurrentPoseWithOffset(Pose2d set) method, then your pose will be returned to what the
* PoseUpdater thinks your pose would be, not the pose you reset to.
*/
public void resetOffset() {
setXOffset(0);
setYOffset(0);
setHeadingOffset(0);
}
/**
* This returns the current pose, with offsets applied. If this is called multiple times in
* a single update, the current pose is cached so that subsequent calls don't have to repeat
* localizer calls or calculations.
*
* @return returns the current pose.
*/
public Pose2d getPose() {
if (currentPose == null) {
currentPose = localizer.getPoseEstimate();
}
return applyOffset(currentPose);
}
/**
* This returns the current raw pose, without any offsets applied. If this is called multiple times in
* a single update, the current pose is cached so that subsequent calls don't have to repeat
* localizer calls or calculations.
*
* @return returns the raw pose.
*/
public Pose2d getRawPose() {
if (currentPose == null) {
currentPose = localizer.getPoseEstimate();
}
return currentPose;
}
/**
* This sets the current pose using the Road Runner pose reset. This is slow.
*
* @param set the pose to set the current pose to.
*/
public void setPose(Pose2d set) {
resetOffset();
localizer.setPoseEstimate(set);
}
/**
* Returns the robot's pose from the previous update.
*
* @return returns the robot's previous pose.
*/
public Pose2d getPreviousPose() {
return previousPose;
}
/**
* Returns the robot's change in pose from the previous update.
*
* @return returns the robot's delta pose.
*/
public Pose2d getDeltaPose() {
return getPose().minus(previousPose);
}
/**
* This returns the velocity of the robot as a Vector. If this is called multiple times in
* a single update, the velocity Vector is cached so that subsequent calls don't have to repeat
* localizer calls or calculations.
*
* @return returns the velocity of the robot.
*/
public Vector getVelocity() {
if (currentVelocity == null) {
currentVelocity = new Vector();
currentVelocity.setOrthogonalComponents(getPose().getX() - previousPose.getX(), getPose().getY() - previousPose.getY());
currentVelocity.setMagnitude(MathFunctions.distance(getPose(), previousPose) / ((currentPoseTime - previousPoseTime) / Math.pow(10.0, 9)));
}
return MathFunctions.copyVector(currentVelocity);
}
/**
* This returns the angular velocity of the robot as a double.
*
* @return returns the angular velocity of the robot.
*/
public double getAngularVelocity() {
return MathFunctions.getTurnDirection(previousPose.getHeading(), getPose().getHeading()) * MathFunctions.getSmallestAngleDifference(getPose().getHeading(), previousPose.getHeading()) / ((currentPoseTime-previousPoseTime)/Math.pow(10.0, 9));
}
/**
* This returns the acceleration of the robot as a Vector. If this is called multiple times in
* a single update, the acceleration Vector is cached so that subsequent calls don't have to
* repeat localizer calls or calculations.
*
* @return returns the acceleration of the robot.
*/
public Vector getAcceleration() {
if (currentAcceleration == null) {
currentAcceleration = MathFunctions.subtractVectors(getVelocity(), previousVelocity);
currentAcceleration.setMagnitude(currentAcceleration.getMagnitude() / ((currentPoseTime - previousPoseTime) / Math.pow(10.0, 9)));
}
return MathFunctions.copyVector(currentAcceleration);
}
/**
* This resets the heading of the robot to the IMU's heading, using Road Runner's pose reset.
*/
public void resetHeadingToIMU() {
localizer.resetHeading(getNormalizedIMUHeading() + startingPose.getHeading());
}
/**
* This resets the heading of the robot to the IMU's heading, using offsets instead of Road
* Runner's pose reset. This way, it's faster, but the change can be wiped by the
* resetOffset() method.
*/
public void resetHeadingToIMUWithOffsets() {
setCurrentPoseWithOffset(new Pose2d(getPose().getX(), getPose().getY(), getNormalizedIMUHeading() + startingPose.getHeading()));
}
/**
* This returns the IMU heading normalized to be between [0, 2 PI] radians.
*
* @return returns the normalized IMU heading.
*/
public double getNormalizedIMUHeading() {
return MathFunctions.normalizeAngle(-imu.getRobotYawPitchRollAngles().getYaw(AngleUnit.RADIANS));
}
}
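The offset-based reset above amounts to storing per-axis deltas between the desired pose and the raw localizer pose, then adding them back on every read. A standalone sketch of that arithmetic (class and method names here are illustrative, not part of the SDK; the real setCurrentPoseWithOffset also computes the heading offset via the smallest angle difference, which this sketch omits):

```java
// Minimal sketch of offset-based pose "reset": instead of re-seeding the
// localizer, store desired-minus-raw deltas and add them back on every read.
public class OffsetPoseDemo {
    private double xOffset, yOffset, headingOffset;
    private final double rawX, rawY, rawHeading; // stands in for the localizer estimate

    public OffsetPoseDemo(double rawX, double rawY, double rawHeading) {
        this.rawX = rawX;
        this.rawY = rawY;
        this.rawHeading = rawHeading;
    }

    // Analogue of setCurrentPoseWithOffset: offsets are desired minus raw.
    public void setPoseWithOffset(double x, double y, double heading) {
        xOffset = x - rawX;
        yOffset = y - rawY;
        headingOffset = heading - rawHeading;
    }

    // Analogue of applyOffset(getRawPose()).
    public double[] getPose() {
        return new double[] {rawX + xOffset, rawY + yOffset, rawHeading + headingOffset};
    }

    public static void main(String[] args) {
        OffsetPoseDemo demo = new OffsetPoseDemo(10, 5, 0.5);
        demo.setPoseWithOffset(0, 0, 0); // "reset" to the origin, no localizer call needed
        double[] pose = demo.getPose();
        System.out.println(pose[0] + " " + pose[1] + " " + pose[2]); // 0.0 0.0 0.0
    }
}
```

This is why the reset is instantaneous: the localizer keeps integrating undisturbed, and only the reported pose shifts.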

package org.firstinspires.ftc.teamcode.pedroPathing.localization;
import com.acmerobotics.roadrunner.util.NanoClock;
import com.qualcomm.robotcore.hardware.DcMotorEx;
import com.qualcomm.robotcore.hardware.DcMotorSimple;
/**
* Wraps a motor instance to provide corrected velocity counts and allow reversing independently of the corresponding
* slot's motor direction
*/
public class RoadRunnerEncoder {
private final static int CPS_STEP = 0x10000;
private static double inverseOverflow(double input, double estimate) {
// convert to uint16
int real = (int) input & 0xffff;
// initial, modulo-based correction: it can recover the remainder of 5 of the upper 16 bits
// because the velocity is always a multiple of 20 cps due to Expansion Hub's 50ms measurement window
real += ((real % 20) / 4) * CPS_STEP;
// estimate-based correction: it finds the nearest multiple of 5 to correct the upper bits by
real += Math.round((estimate - real) / (5 * CPS_STEP)) * 5 * CPS_STEP;
return real;
}
public enum Direction {
FORWARD(1),
REVERSE(-1);
private int multiplier;
Direction(int multiplier) {
this.multiplier = multiplier;
}
public int getMultiplier() {
return multiplier;
}
}
private DcMotorEx motor;
private NanoClock clock;
private Direction direction;
private int lastPosition;
private int velocityEstimateIdx;
private double[] velocityEstimates;
private double lastUpdateTime;
public RoadRunnerEncoder(DcMotorEx motor, NanoClock clock) {
this.motor = motor;
this.clock = clock;
this.direction = Direction.FORWARD;
this.lastPosition = 0;
this.velocityEstimates = new double[3];
this.lastUpdateTime = clock.seconds();
}
public RoadRunnerEncoder(DcMotorEx motor) {
this(motor, NanoClock.system());
}
public Direction getDirection() {
return direction;
}
private int getMultiplier() {
return getDirection().getMultiplier() * (motor.getDirection() == DcMotorSimple.Direction.FORWARD ? 1 : -1);
}
/**
* Allows you to set the direction of the counts and velocity without modifying the motor's direction state
* @param direction either reverse or forward depending on if encoder counts should be negated
*/
public void setDirection(Direction direction) {
this.direction = direction;
}
/**
* Gets the position from the underlying motor and adjusts for the set direction.
* Additionally, this method updates the velocity estimates used for compensated velocity
*
* @return encoder position
*/
public int getCurrentPosition() {
int multiplier = getMultiplier();
int currentPosition = motor.getCurrentPosition() * multiplier;
if (currentPosition != lastPosition) {
double currentTime = clock.seconds();
double dt = currentTime - lastUpdateTime;
velocityEstimates[velocityEstimateIdx] = (currentPosition - lastPosition) / dt;
velocityEstimateIdx = (velocityEstimateIdx + 1) % 3;
lastPosition = currentPosition;
lastUpdateTime = currentTime;
}
return currentPosition;
}
/**
* Gets the velocity directly from the underlying motor and compensates for the direction
* See {@link #getCorrectedVelocity} for high (>2^15) counts per second velocities (such as on REV Through Bore)
*
* @return raw velocity
*/
public double getRawVelocity() {
int multiplier = getMultiplier();
return motor.getVelocity() * multiplier;
}
/**
* Uses velocity estimates gathered in {@link #getCurrentPosition} to estimate the upper bits of velocity
* that are lost in overflow due to velocity being transmitted as 16 bits.
* CAVEAT: must regularly call {@link #getCurrentPosition} for the compensation to work correctly.
*
* @return corrected velocity
*/
public double getCorrectedVelocity() {
double median = velocityEstimates[0] > velocityEstimates[1]
? Math.max(velocityEstimates[1], Math.min(velocityEstimates[0], velocityEstimates[2]))
: Math.max(velocityEstimates[0], Math.min(velocityEstimates[1], velocityEstimates[2]));
return inverseOverflow(getRawVelocity(), median);
}
}
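To see the overflow recovery in isolation: the hub transmits velocity as a signed 16-bit value, so anything above 2^15 counts per second wraps around. The standalone sketch below reuses the inverseOverflow logic from the class above to reconstruct a wrapped reading, given only a rough estimate (in the real class, the estimate is the median of recent position-based velocities):

```java
// Demonstrates the two-stage overflow recovery: a modulo-based partial fix
// (velocities are multiples of 20 cps), then snapping to the estimate's window.
public class OverflowDemo {
    static final int CPS_STEP = 0x10000;

    static double inverseOverflow(double input, double estimate) {
        int real = (int) input & 0xffff;                              // reinterpret as uint16
        real += ((real % 20) / 4) * CPS_STEP;                         // modulo-based correction
        real += Math.round((estimate - real) / (5 * CPS_STEP)) * 5 * CPS_STEP; // estimate-based correction
        return real;
    }

    public static void main(String[] args) {
        double trueVelocity = 100000;          // cps, above the 2^15 wrap point
        short wrapped = (short) trueVelocity;  // what a 16-bit report would contain
        double recovered = inverseOverflow(wrapped, 98000); // rough estimate suffices
        System.out.println(recovered);         // 100000.0
    }
}
```

Note the caveat from the Javadoc still applies: the estimate only exists if getCurrentPosition() is called regularly.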

package org.firstinspires.ftc.teamcode.pedroPathing.localization;
import androidx.annotation.NonNull;
import com.acmerobotics.dashboard.config.Config;
import com.acmerobotics.roadrunner.geometry.Pose2d;
import com.acmerobotics.roadrunner.localization.ThreeTrackingWheelLocalizer;
import com.qualcomm.robotcore.hardware.DcMotorEx;
import com.qualcomm.robotcore.hardware.HardwareMap;
import java.util.Arrays;
import java.util.List;
/*
* Sample tracking wheel localizer implementation assuming the standard configuration:
*
* left on robot is y pos
*
* front of robot is x pos
*
* /--------------\
* | ____ |
* | ---- |
* | || || |
* | || || |
* | |
* | |
* \--------------/
*
*/
/**
* This class is adapted from the Road Runner StandardTrackingWheelLocalizer class. Later, this will
* be replaced with a custom localizer.
*/
@Config
public class ThreeWheelLocalizer extends ThreeTrackingWheelLocalizer {
public static double TICKS_PER_REV = 8192;
public static double WHEEL_RADIUS = 1.37795; // in
public static double GEAR_RATIO = 1; // output (wheel) speed / input (encoder) speed
public static double X_MULTIPLIER = 0.5008239963;
public static double Y_MULTIPLIER = 0.5018874659;
public static double leftX = -18.5/25.4 - 0.1, leftY = 164.4/25.4, rightX = -18.4/25.4 - 0.1, rightY = -159.6/25.4, strafeX = -107.9/25.4+0.25, strafeY = -1.1/25.4-0.23;
private RoadRunnerEncoder leftEncoder, rightEncoder, strafeEncoder;
private List<Integer> lastEncPositions, lastEncVels;
public ThreeWheelLocalizer(HardwareMap hardwareMap, List<Integer> lastTrackingEncPositions, List<Integer> lastTrackingEncVels) {
super(Arrays.asList(
new Pose2d(leftX, leftY, 0), // left
new Pose2d(rightX, rightY, 0), // right
new Pose2d(strafeX, strafeY, Math.toRadians(90)) // strafe
));
lastEncPositions = lastTrackingEncPositions;
lastEncVels = lastTrackingEncVels;
// TODO: redo the configs here
leftEncoder = new RoadRunnerEncoder(hardwareMap.get(DcMotorEx.class, "leftRear"));
rightEncoder = new RoadRunnerEncoder(hardwareMap.get(DcMotorEx.class, "rightFront"));
strafeEncoder = new RoadRunnerEncoder(hardwareMap.get(DcMotorEx.class, "strafeEncoder"));
// TODO: reverse any encoders using Encoder.setDirection(Encoder.Direction.REVERSE)
leftEncoder.setDirection(RoadRunnerEncoder.Direction.REVERSE);
rightEncoder.setDirection(RoadRunnerEncoder.Direction.REVERSE);
strafeEncoder.setDirection(RoadRunnerEncoder.Direction.FORWARD);
}
public void resetHeading(double heading) {
setPoseEstimate(new Pose2d(getPoseEstimate().getX(), getPoseEstimate().getY(), heading));
}
public static double encoderTicksToInches(double ticks) {
return WHEEL_RADIUS * 2 * Math.PI * GEAR_RATIO * ticks / TICKS_PER_REV;
}
@NonNull
@Override
public List<Double> getWheelPositions() {
int leftPos = leftEncoder.getCurrentPosition();
int rightPos = rightEncoder.getCurrentPosition();
int frontPos = strafeEncoder.getCurrentPosition();
lastEncPositions.clear();
lastEncPositions.add(leftPos);
lastEncPositions.add(rightPos);
lastEncPositions.add(frontPos);
return Arrays.asList(
encoderTicksToInches(leftPos) * X_MULTIPLIER,
encoderTicksToInches(rightPos) * X_MULTIPLIER,
encoderTicksToInches(frontPos) * Y_MULTIPLIER
);
}
@NonNull
@Override
public List<Double> getWheelVelocities() {
int leftVel = (int) leftEncoder.getCorrectedVelocity();
int rightVel = (int) rightEncoder.getCorrectedVelocity();
int frontVel = (int) strafeEncoder.getCorrectedVelocity();
lastEncVels.clear();
lastEncVels.add(leftVel);
lastEncVels.add(rightVel);
lastEncVels.add(frontVel);
return Arrays.asList(
encoderTicksToInches(leftVel) * X_MULTIPLIER,
encoderTicksToInches(rightVel) * X_MULTIPLIER,
encoderTicksToInches(frontVel) * Y_MULTIPLIER
);
}
}
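As a sanity check on encoderTicksToInches, one full revolution of the 8192-tick encoder should come out to exactly one wheel circumference. A standalone sketch with the same constants:

```java
// Verifies the ticks-to-inches conversion: TICKS_PER_REV ticks = one circumference.
public class TicksDemo {
    static final double TICKS_PER_REV = 8192;
    static final double WHEEL_RADIUS = 1.37795; // in
    static final double GEAR_RATIO = 1;         // output (wheel) / input (encoder) speed

    static double encoderTicksToInches(double ticks) {
        return WHEEL_RADIUS * 2 * Math.PI * GEAR_RATIO * ticks / TICKS_PER_REV;
    }

    public static void main(String[] args) {
        double oneRev = encoderTicksToInches(TICKS_PER_REV);
        System.out.println(oneRev); // one wheel circumference, about 8.66 inches
    }
}
```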

package org.firstinspires.ftc.teamcode.pedroPathing.pathGeneration;
import org.firstinspires.ftc.teamcode.pedroPathing.tuning.FollowerConstants;
import java.util.ArrayList;
/**
* This is the BezierCurve class. This class handles the creation of Bezier curves, which are used
* as the basis of the path for the Path class. Bezier curves are parametric curves defined by a set
* of control points. So, they take in one input, t, that ranges from [0, 1] and that returns a point
* on the curve. Essentially, Bezier curves are a way of defining a parametric line easily. You can
* read more on Bezier curves here: https://en.wikipedia.org/wiki/Bézier_curve
*
* @author Anyi Lin - 10158 Scott's Bots
* @author Aaron Yang - 10158 Scott's Bots
* @author Harrison Womack - 10158 Scott's Bots
* @version 1.0, 3/5/2024
*/
public class BezierCurve {
// This contains the coefficients for the curve points
private ArrayList<BezierCurveCoefficients> pointCoefficients = new ArrayList<>();
// This contains the control points for the Bezier curve
private ArrayList<Point> controlPoints = new ArrayList<>();
private Vector endTangent = new Vector();
private final int APPROXIMATION_STEPS = FollowerConstants.APPROXIMATION_STEPS;
private double UNIT_TO_TIME;
private double length;
/**
* This creates an empty BezierCurve.
* IMPORTANT NOTE: Only use this for the constructors of classes extending this. If you try to
* run the robot on a Path containing an empty BezierCurve, then it will explode.
*/
public BezierCurve() {
}
/**
* This creates a new BezierCurve with an ArrayList of control points and generates the curve.
* IMPORTANT NOTE: The order of the control points is important. That's the order the code will
* process them in, with the 0 index being the start point and the final index being the end point
*
* @param controlPoints This is the ArrayList of control points that define the BezierCurve.
*/
public BezierCurve(ArrayList<Point> controlPoints) {
if (controlPoints.size() < 3) {
throw new IllegalArgumentException("A BezierCurve requires at least 3 control points; use BezierLine or BezierPoint for fewer.");
}
this.controlPoints = controlPoints;
generateBezierCurve();
length = approximateLength();
UNIT_TO_TIME = 1/length;
endTangent.setOrthogonalComponents(controlPoints.get(controlPoints.size()-1).getX()-controlPoints.get(controlPoints.size()-2).getX(), controlPoints.get(controlPoints.size()-1).getY()-controlPoints.get(controlPoints.size()-2).getY());
endTangent = MathFunctions.normalizeVector(endTangent);
}
/**
* This creates a new Bezier curve with some specified control points and generates the curve.
* IMPORTANT NOTE: The order of the control points is important. That's the order the code will
* process them in, with the 0 index being the start point and the final index being the end point.
*
* @param controlPoints This is the specified control points that define the BezierCurve.
*/
public BezierCurve(Point... controlPoints) {
for (Point controlPoint : controlPoints) {
this.controlPoints.add(controlPoint);
}
if (this.controlPoints.size() < 3) {
throw new IllegalArgumentException("A BezierCurve requires at least 3 control points; use BezierLine or BezierPoint for fewer.");
}
generateBezierCurve();
length = approximateLength();
UNIT_TO_TIME = 1/length;
endTangent.setOrthogonalComponents(this.controlPoints.get(this.controlPoints.size()-1).getX()-this.controlPoints.get(this.controlPoints.size()-2).getX(), this.controlPoints.get(this.controlPoints.size()-1).getY()-this.controlPoints.get(this.controlPoints.size()-2).getY());
endTangent = MathFunctions.normalizeVector(endTangent);
}
/**
* This generates the Bezier curve, assuming the ArrayList of control points has been set.
* More precisely, it generates the coefficients for each control point on the Bezier curve.
* These coefficients can then be used to calculate a position, velocity, or acceleration on the
* Bezier curve on the fly without much computational expense.
*
* See https://en.wikipedia.org/wiki/Bézier_curve for the explicit formula for Bezier curves.
*/
public void generateBezierCurve() {
int n = controlPoints.size()-1;
for (int i = 0; i <= n; i++) {
pointCoefficients.add(new BezierCurveCoefficients(n, i));
}
}
/**
* This returns the unit tangent Vector at the end of the BezierCurve.
*
* @return returns the end tangent Vector.
*/
public Vector getEndTangent() {
return MathFunctions.copyVector(endTangent);
}
/**
* This approximates the length of the BezierCurve in APPROXIMATION_STEPS number of steps. It's
* like a Riemann sum, but for a parametric function's arc length.
*
* @return returns the approximated length of the BezierCurve.
*/
public double approximateLength() {
Point previousPoint = getPoint(0);
Point currentPoint;
double approxLength = 0;
for (int i = 1; i <= APPROXIMATION_STEPS; i++) {
currentPoint = getPoint(i/(double)APPROXIMATION_STEPS);
approxLength += previousPoint.distanceFrom(currentPoint);
previousPoint = currentPoint;
}
return approxLength;
}
/**
* This returns the point on the Bezier curve that is specified by the parametric t value. A
* Bezier curve is a parametric function that returns points along it with t ranging from [0, 1],
* with 0 being the beginning of the curve and 1 being at the end. The Follower will follow
* BezierCurves from 0 to 1, in terms of t.
*
* @param t this is the t value of the parametric curve. t is clamped to be between 0 and 1 inclusive.
* @return this returns the point requested.
*/
public Point getPoint(double t) {
t = MathFunctions.clamp(t, 0, 1);
double xCoordinate = 0;
double yCoordinate = 0;
// calculates the x coordinate of the point requested
for (int i = 0; i < controlPoints.size(); i++) {
xCoordinate += pointCoefficients.get(i).getValue(t) * controlPoints.get(i).getX();
}
// calculates the y coordinate of the point requested
for (int i = 0; i < controlPoints.size(); i++) {
yCoordinate += pointCoefficients.get(i).getValue(t) * controlPoints.get(i).getY();
}
return new Point(xCoordinate, yCoordinate, Point.CARTESIAN);
}
/**
* This returns the curvature of the Bezier curve at a specified t-value.
*
* @param t the parametric t input.
* @return returns the curvature.
*/
public double getCurvature(double t) {
t = MathFunctions.clamp(t, 0, 1);
Vector derivative = getDerivative(t);
Vector secondDerivative = new Vector(getSecondDerivative(t).getMagnitude(), getApproxSecondDerivative(t).getTheta());
if (derivative.getMagnitude() == 0) return 0;
return (MathFunctions.crossProduct(derivative, secondDerivative))/Math.pow(derivative.getMagnitude(),3);
}
/**
* This returns the derivative on the BezierCurve that is specified by the parametric t value.
* This is returned as a Vector, and this Vector is the tangent to the BezierCurve.
*
* @param t this is the t value of the parametric curve. t is clamped to be between 0 and 1 inclusive.
* @return this returns the derivative requested.
*/
public Vector getDerivative(double t) {
t = MathFunctions.clamp(t, 0, 1);
double xCoordinate = 0;
double yCoordinate = 0;
Vector returnVector = new Vector();
// calculates the x coordinate of the point requested
for (int i = 0; i < controlPoints.size()-1; i++) {
xCoordinate += pointCoefficients.get(i).getDerivativeValue(t) * (MathFunctions.subtractPoints(controlPoints.get(i+1), controlPoints.get(i)).getX());
}
// calculates the y coordinate of the point requested
for (int i = 0; i < controlPoints.size()-1; i++) {
yCoordinate += pointCoefficients.get(i).getDerivativeValue(t) * (MathFunctions.subtractPoints(controlPoints.get(i+1), controlPoints.get(i)).getY());
}
returnVector.setOrthogonalComponents(xCoordinate, yCoordinate);
return returnVector;
}
/**
* This returns the second derivative on the BezierCurve that is specified by the parametric t value.
* This is returned as a Vector, and this Vector is the acceleration on the BezierCurve.
*
* @param t this is the t value of the parametric curve. t is clamped to be between 0 and 1 inclusive.
* @return this returns the second derivative requested.
*/
public Vector getSecondDerivative(double t) {
t = MathFunctions.clamp(t, 0, 1);
double xCoordinate = 0;
double yCoordinate = 0;
Vector returnVector = new Vector();
// calculates the x coordinate of the point requested
for (int i = 0; i < controlPoints.size()-2; i++) {
xCoordinate += pointCoefficients.get(i).getSecondDerivativeValue(t) * (MathFunctions.addPoints(MathFunctions.subtractPoints(controlPoints.get(i+2), new Point(2*controlPoints.get(i+1).getX(), 2*controlPoints.get(i+1).getY(), Point.CARTESIAN)), controlPoints.get(i)).getX());
}
// calculates the y coordinate of the point requested
for (int i = 0; i < controlPoints.size()-2; i++) {
yCoordinate += pointCoefficients.get(i).getSecondDerivativeValue(t) * (MathFunctions.addPoints(MathFunctions.subtractPoints(controlPoints.get(i+2), new Point(2*controlPoints.get(i+1).getX(), 2*controlPoints.get(i+1).getY(), Point.CARTESIAN)), controlPoints.get(i)).getY());
}
returnVector.setOrthogonalComponents(xCoordinate, yCoordinate);
return returnVector;
}
/**
* Because, for whatever reason, the second derivative returned by the getSecondDerivative(double t)
* method doesn't return the correct heading of the second derivative, this gets an approximate
* second derivative essentially using the limit method. I use this for its heading only.
*
* @param t this is the t value of the parametric curve. t is clamped to be between 0 and 1 inclusive.
* @return this returns the approximated second derivative.
*/
public Vector getApproxSecondDerivative(double t) {
double current = getDerivative(t).getTheta();
double deltaCurrent = getDerivative(t + 0.0001).getTheta();
return new Vector(1, deltaCurrent - current);
}
/**
* Returns the ArrayList of control points for this BezierCurve.
*
* @return This returns the control points.
*/
public ArrayList<Point> getControlPoints() {
return controlPoints;
}
/**
* Returns the first control point for this BezierCurve.
*
* @return This returns the Point.
*/
public Point getFirstControlPoint() {
return controlPoints.get(0);
}
/**
* Returns the second control point, or the one after the start, for this BezierCurve.
*
* @return This returns the Point.
*/
public Point getSecondControlPoint() {
return controlPoints.get(1);
}
/**
* Returns the second to last control point for this BezierCurve.
*
* @return This returns the Point.
*/
public Point getSecondToLastControlPoint() {
return controlPoints.get(controlPoints.size()-2);
}
/**
* Returns the last control point for this BezierCurve.
*
* @return This returns the Point.
*/
public Point getLastControlPoint() {
return controlPoints.get(controlPoints.size()-1);
}
/**
* Returns the approximate length of this BezierCurve.
*
* @return This returns the length.
*/
public double length() {
return length;
}
/**
* Returns the conversion factor of one unit of distance into t-value. Since parametric functions
* are defined by t, which can mean time, I use "time" in some method names for conciseness.
*
* @return returns the conversion factor.
*/
public double UNIT_TO_TIME() {
return UNIT_TO_TIME;
}
/**
* Returns the type of path. This is used in case we need to identify the type of BezierCurve
* this is.
*
* @return returns the type of path.
*/
public String pathType() {
return "curve";
}
}
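The getPoint(double t) method above evaluates the explicit Bernstein form, B(t) = sum over i of nCr(n, i) * (1-t)^(n-i) * t^i * P_i. A standalone sketch evaluating one coordinate this way (helper names here are illustrative, not the class's own):

```java
// Evaluates one coordinate of a degree-n Bezier curve via the Bernstein form,
// mirroring the per-control-point coefficient scheme used by BezierCurve.getPoint.
public class BezierDemo {
    // Binomial coefficient, computed multiplicatively to stay exact for small n.
    static double nCr(int n, int r) {
        double result = 1;
        for (int i = 0; i < r; i++) result = result * (n - i) / (i + 1);
        return result;
    }

    static double bezier(double[] coords, double t) {
        int n = coords.length - 1;
        double value = 0;
        for (int i = 0; i <= n; i++) {
            value += nCr(n, i) * Math.pow(1 - t, n - i) * Math.pow(t, i) * coords[i];
        }
        return value;
    }

    public static void main(String[] args) {
        double[] xs = {0, 1, 2}; // quadratic: three control points
        System.out.println(bezier(xs, 0));   // 0.0 (the curve interpolates the first point)
        System.out.println(bezier(xs, 1));   // 2.0 (and the last point)
        System.out.println(bezier(xs, 0.5)); // 1.0 (midpoint of this symmetric curve)
    }
}
```

Precomputing nCr(n, i) once per control point, as BezierCurveCoefficients does, avoids recomputing the binomials on every call.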

package org.firstinspires.ftc.teamcode.pedroPathing.pathGeneration;
/**
* This is the BezierCurveCoefficients class. This class handles holding the coefficients for each
* control point for the BezierCurve class to allow for faster on the fly calculations of points,
* derivatives, and second derivatives on Bezier curves.
*
* @author Anyi Lin - 10158 Scott's Bots
* @author Aaron Yang - 10158 Scott's Bots
* @author Harrison Womack - 10158 Scott's Bots
* @version 1.0, 3/8/2024
*/
public class BezierCurveCoefficients {
private double coefficient;
private double derivativeCoefficient;
private double secondDerivativeCoefficient;
private int n;
private int i;
/**
* This creates the coefficients within the summation equations for calculating positions,
* derivatives, and second derivatives on Bezier curves.
*
* @param n this is the degree of the Bezier curve function.
* @param i this is the i within the summation equation, so basically which place it is in the
* summation.
*/
public BezierCurveCoefficients(int n, int i) {
this.n = n;
this.i = i;
coefficient = MathFunctions.nCr(n, i);
derivativeCoefficient = MathFunctions.nCr(n - 1, i);
secondDerivativeCoefficient = MathFunctions.nCr(n - 2, i);
}
/**
* This returns the coefficient for the summation to calculate a position on BezierCurves.
*
* @param t this is the t value within the parametric equation for the Bezier curve.
* @return this returns the coefficient.
*/
public double getValue(double t) {
return coefficient * Math.pow(1 - t, n - i) * Math.pow(t, i);
}
/**
* This returns the coefficient for the summation to calculate a derivative on BezierCurves.
*
* @param t this is the t value within the parametric equation for the Bezier curve.
* @return this returns the coefficient.
*/
public double getDerivativeValue(double t) {
return n * derivativeCoefficient * Math.pow(t, i) * Math.pow(1 - t, n - i - 1);
}
/**
* This returns the coefficient for the summation to calculate a second derivative on BezierCurves.
*
* @param t this is the t value within the parametric equation for the Bezier curve.
* @return this returns the coefficient.
*/
public double getSecondDerivativeValue(double t) {
return n * (n - 1) * secondDerivativeCoefficient * Math.pow(t, i) * Math.pow(1 - t, n - i - 2);
}
}
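One property worth knowing about the position coefficients above: for any t in [0, 1], the Bernstein basis values sum to 1 (a partition of unity), which is why a point on a Bezier curve is always a weighted average of its control points and stays inside their convex hull. A standalone check:

```java
// Checks the partition-of-unity property of the Bernstein basis used by getValue(t).
public class CoefficientDemo {
    static double nCr(int n, int r) {
        double result = 1;
        for (int i = 0; i < r; i++) result = result * (n - i) / (i + 1);
        return result;
    }

    // Mirrors BezierCurveCoefficients.getValue for one (n, i) pair.
    static double basis(int n, int i, double t) {
        return nCr(n, i) * Math.pow(1 - t, n - i) * Math.pow(t, i);
    }

    public static void main(String[] args) {
        int n = 4; // quartic curve, five control points
        for (double t = 0; t <= 1.0; t += 0.25) {
            double sum = 0;
            for (int i = 0; i <= n; i++) sum += basis(n, i, t);
            System.out.println(sum); // ≈ 1.0 at every t
        }
    }
}
```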

package org.firstinspires.ftc.teamcode.pedroPathing.pathGeneration;
import java.util.ArrayList;
/**
* This is the BezierLine class. This class handles the creation of BezierLines, which is what I
* call Bezier curves with only two control points. The parent BezierCurve class cannot handle Bezier
* curves with less than three control points, so this class handles lines. Additionally, it makes
* the calculations done on the fly a little less computationally expensive and more streamlined.
*
* @author Anyi Lin - 10158 Scott's Bots
* @author Aaron Yang - 10158 Scott's Bots
* @author Harrison Womack - 10158 Scott's Bots
* @version 1.0, 3/9/2024
*/
public class BezierLine extends BezierCurve {
private Point startPoint;
private Point endPoint;
private Vector endTangent;
private double UNIT_TO_TIME;
private double length;
/**
* This creates a new BezierLine with specified start and end Points.
* This is just a line but it extends the BezierCurve class so things work.
*
* @param startPoint start point of the line.
* @param endPoint end point of the line.
*/
public BezierLine(Point startPoint, Point endPoint) {
super();
this.startPoint = startPoint;
this.endPoint = endPoint;
length = approximateLength();
UNIT_TO_TIME = 1 / length;
endTangent = MathFunctions.normalizeVector(getDerivative(1));
}
/**
* This returns the unit tangent Vector at the end of the BezierLine.
*
* @return returns the tangent Vector.
*/
@Override
public Vector getEndTangent() {
return MathFunctions.copyVector(endTangent);
}
/**
* This gets the length of the BezierLine.
*
* @return returns the length of the BezierLine.
*/
@Override
public double approximateLength() {
return Math.sqrt(Math.pow(startPoint.getX() - endPoint.getX(), 2) + Math.pow(startPoint.getY() - endPoint.getY(), 2));
}
/**
* This returns the Point on the Bezier line that is specified by the parametric t value.
*
* @param t this is the t value of the parametric line. t is clamped to be between 0 and 1 inclusive.
* @return this returns the Point requested.
*/
@Override
public Point getPoint(double t) {
t = MathFunctions.clamp(t, 0, 1);
return new Point((endPoint.getX() - startPoint.getX()) * t + startPoint.getX(), (endPoint.getY() - startPoint.getY()) * t + startPoint.getY(), Point.CARTESIAN);
}
/**
* This returns the curvature of the BezierLine, which is zero.
*
* @param t the parametric t value.
* @return returns the curvature.
*/
@Override
public double getCurvature(double t) {
return 0.0;
}
/**
* This returns the derivative on the BezierLine as a Vector, which is a constant slope.
* The t value doesn't really do anything, but it's there so I can override methods.
*
* @param t this is the t value of the parametric curve. t is clamped to be between 0 and 1 inclusive.
* @return this returns the derivative requested.
*/
@Override
public Vector getDerivative(double t) {
Vector returnVector = new Vector();
returnVector.setOrthogonalComponents(endPoint.getX() - startPoint.getX(), endPoint.getY() - startPoint.getY());
return returnVector;
}
/**
* This returns the second derivative on the Bezier line, which is a zero Vector.
* Once again, the t is only there for the override.
*
* @param t this is the t value of the parametric curve. t is clamped to be between 0 and 1 inclusive.
* @return this returns the second derivative requested.
*/
@Override
public Vector getSecondDerivative(double t) {
return new Vector();
}
/**
* This returns the zero Vector, but it's here so I can override the method in the BezierCurve
* class.
*
* @param t this is the t value of the parametric curve. t is clamped to be between 0 and 1 inclusive.
* @return this returns the approximated second derivative, which is the zero Vector.
*/
@Override
public Vector getApproxSecondDerivative(double t) {
return new Vector();
}
/**
* Returns the ArrayList of control points for this BezierLine.
*
* @return This returns the control points.
*/
@Override
public ArrayList<Point> getControlPoints() {
ArrayList<Point> returnList = new ArrayList<>();
returnList.add(startPoint);
returnList.add(endPoint);
return returnList;
}
/**
* Returns the first control point for this BezierLine.
*
* @return This returns the Point.
*/
@Override
public Point getFirstControlPoint() {
return startPoint;
}
/**
* Returns the second control point, or the one after the start, for this BezierLine.
*
* @return This returns the Point.
*/
@Override
public Point getSecondControlPoint() {
return endPoint;
}
/**
* Returns the second to last control point for this BezierLine.
*
* @return This returns the Point.
*/
@Override
public Point getSecondToLastControlPoint() {
return startPoint;
}
/**
* Returns the last control point for this BezierLine.
*
* @return This returns the Point.
*/
@Override
public Point getLastControlPoint() {
return endPoint;
}
/**
* Returns the length of this BezierLine.
*
* @return This returns the length.
*/
@Override
public double length() {
return length;
}
/**
* Returns the conversion factor of one unit of distance into t value.
*
* @return returns the conversion factor.
*/
@Override
public double UNIT_TO_TIME() {
return UNIT_TO_TIME;
}
/**
* Returns the type of path. This is used in case we need to identify the type of BezierCurve
* this is.
*
* @return returns the type of path.
*/
@Override
public String pathType() {
return "line";
}
}
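The BezierLine class above relies on the fact that a line's derivative is the constant vector from its start point to its end point, so the derivative's magnitude equals the line's length and one unit of distance corresponds to `1/length` in t. A minimal standalone sketch of that relationship (the class and method names here are hypothetical, not part of this SDK):

```java
// Standalone sketch illustrating the math behind BezierLine:
// for P(t) = (1 - t) * P0 + t * P1, the derivative is the constant
// vector P1 - P0, its magnitude is the line's length, and the
// distance-to-t conversion factor is 1/length.
public class BezierLineSketch {
    // derivative components of the line from (x0, y0) to (x1, y1)
    public static double[] derivative(double x0, double y0, double x1, double y1) {
        return new double[] {x1 - x0, y1 - y0};
    }

    // length of the line, which is also the magnitude of the derivative
    public static double length(double x0, double y0, double x1, double y1) {
        return Math.hypot(x1 - x0, y1 - y0);
    }

    // conversion factor from one unit of distance to t, mirroring UNIT_TO_TIME()
    public static double unitToTime(double x0, double y0, double x1, double y1) {
        return 1.0 / length(x0, y0, x1, y1);
    }

    public static void main(String[] args) {
        // a 3-4-5 line: derivative (3, 4), length 5, unit-to-time 0.2
        double[] d = derivative(0, 0, 3, 4);
        System.out.println(d[0] + ", " + d[1] + ", " + length(0, 0, 3, 4));
    }
}
```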

package org.firstinspires.ftc.teamcode.pedroPathing.pathGeneration;
import java.util.ArrayList;
/**
* This is the BezierPoint class. This class handles the creation of BezierPoints, which is what I
* call Bezier curves with only one control point. The parent BezierCurve class cannot handle Bezier
curves with fewer than three control points, so this class handles points. Additionally, it makes
* the calculations done on the fly a little less computationally expensive and more streamlined.
*
* @author Anyi Lin - 10158 Scott's Bots
* @author Aaron Yang - 10158 Scott's Bots
* @author Harrison Womack - 10158 Scott's Bots
* @version 1.0, 3/9/2024
*/
public class BezierPoint extends BezierCurve {
private Point point;
private Vector endTangent = new Vector();
private double UNIT_TO_TIME;
private double length;
/**
* This creates a new BezierPoint with a specified Point.
* This is just a point but it extends the BezierCurve class so things work.
*
* @param point the specified point.
*/
public BezierPoint(Point point) {
super();
this.point = point;
length = approximateLength();
}
/**
* This would return the unit tangent Vector at the end of the path, but since a point has no
* end tangent, this returns the zero Vector instead. The Follower does not use the drive Vector
* when holding a BezierPoint, so the end tangent Vector is never actually needed.
*
* @return returns the zero Vector.
*/
@Override
public Vector getEndTangent() {
return MathFunctions.copyVector(endTangent);
}
/**
* This gets the length of the BezierPoint. Since points don't have length, this returns zero.
*
* @return returns the length of the BezierPoint.
*/
@Override
public double approximateLength() {
return 0.0;
}
/**
* This returns the point on the BezierPoint that is specified by the parametric t value. Since
* this is a Point, this just returns the one control point's position.
*
* @param t this is the t value of the parametric line. t is clamped to be between 0 and 1 inclusive.
* @return this returns the Point requested.
*/
@Override
public Point getPoint(double t) {
return new Point(point.getX(), point.getY(), Point.CARTESIAN);
}
/**
* This returns the curvature of the BezierPoint, which is zero since this is a Point.
*
* @param t the parametric t value.
* @return returns the curvature, which is zero.
*/
@Override
public double getCurvature(double t) {
return 0.0;
}
/**
* This returns the derivative on the BezierPoint, which is the zero Vector since this is a Point.
* The t value doesn't really do anything, but it's there so I can override methods.
*
* @param t this is the t value of the parametric curve. t is clamped to be between 0 and 1 inclusive.
* @return this returns the derivative requested, which is the zero Vector.
*/
@Override
public Vector getDerivative(double t) {
return MathFunctions.copyVector(endTangent);
}
/**
* This returns the second derivative on the Bezier line, which is the zero Vector since this
* is a Point.
* Once again, the t is only there for the override.
*
* @param t this is the t value of the parametric curve. t is clamped to be between 0 and 1 inclusive.
* @return this returns the second derivative requested, which is the zero Vector.
*/
@Override
public Vector getSecondDerivative(double t) {
return new Vector();
}
/**
* This returns the zero Vector, but it's here so I can override the method in the BezierCurve
* class.
*
* @param t this is the t value of the parametric curve. t is clamped to be between 0 and 1 inclusive.
* @return this returns the approximated second derivative, which is the zero Vector.
*/
@Override
public Vector getApproxSecondDerivative(double t) {
return new Vector();
}
/**
* Returns the ArrayList of control points for this BezierPoint.
*
* @return This returns the control point.
*/
@Override
public ArrayList<Point> getControlPoints() {
ArrayList<Point> returnList = new ArrayList<>();
returnList.add(point);
return returnList;
}
/**
* Returns the first, and only, control point for this BezierPoint.
*
* @return This returns the Point.
*/
@Override
public Point getFirstControlPoint() {
return point;
}
/**
* Returns the second control point, or the one after the start, for this BezierPoint. This
* returns the one control point of the BezierPoint, and there are several redundant methods
* that return the same control point, but it's here so I can override methods.
*
* @return This returns the Point.
*/
@Override
public Point getSecondControlPoint() {
return point;
}
/**
* Returns the second to last control point for this BezierPoint. This
* returns the one control point of the BezierPoint, and there are several redundant methods
* that return the same control point, but it's here so I can override methods.
*
* @return This returns the Point.
*/
@Override
public Point getSecondToLastControlPoint() {
return point;
}
/**
* Returns the last control point for this BezierPoint. This
* returns the one control point of the BezierPoint, and there are several redundant methods
* that return the same control point, but it's here so I can override methods.
*
* @return This returns the Point.
*/
@Override
public Point getLastControlPoint() {
return point;
}
/**
* Returns the length of this BezierPoint, which is zero since Points don't have length.
*
* @return This returns the length.
*/
@Override
public double length() {
return length;
}
/**
* Returns the conversion factor of one unit of distance into t value. There is no length or
* conversion factor to speak of for Points.
*
* @return returns the conversion factor.
*/
@Override
public double UNIT_TO_TIME() {
return 0;
}
/**
* Returns the type of path. This is used in case we need to identify the type of BezierCurve
* this is.
*
* @return returns the type of path.
*/
@Override
public String pathType() {
return "point";
}
}

package org.firstinspires.ftc.teamcode.pedroPathing.pathGeneration;
import com.acmerobotics.roadrunner.geometry.Pose2d;
/**
* This is the MathFunctions class. This contains many useful math related methods that I use in
* other classes to simplify code elsewhere.
*
* @author Anyi Lin - 10158 Scott's Bots
* @author Aaron Yang - 10158 Scott's Bots
* @author Harrison Womack - 10158 Scott's Bots
* @version 1.0, 3/9/2024
*/
public class MathFunctions {
/**
* This is a simple implementation of the choose function in math. It's equivalent to the number
* of ways you can choose r items from n total items, if only which items got picked and not the
* order of picking the items mattered.
*
* @param n this is how many you want to choose from.
* @param r this is how many you want to choose.
* @return returns the result of the "n choose r" function.
*/
public static double nCr(int n, int r) {
double num = 1;
double denom = 1;
// this multiplies up the numerator of the nCr function
for (int i = n; i > n-r; i--) {
num *= i;
}
// this multiplies up the denominator of the nCr function
for (int i = 1; i <=r; i++) {
denom *= i;
}
return num/denom;
}
/**
* This returns the sign (positive/negative) of a number.
*
* @param get the number.
* @return returns the sign of the number.
*/
public static double getSign(double get) {
if (get == 0) return 0;
if (get > 0) return 1;
return -1;
}
/**
* This clamps down a value to between the lower and upper bounds inclusive.
*
* @param num the number to be clamped.
* @param lower the lower bound.
* @param upper the upper bound.
* @return returns the clamped number.
*/
public static double clamp(double num, double lower, double upper) {
if (num < lower) return lower;
if (num > upper) return upper;
return num;
}
/**
* This normalizes an angle to be between 0 and 2 pi radians, inclusive.
*
* IMPORTANT NOTE: This method operates in radians.
*
* @param angleRadians the angle to be normalized.
* @return returns the normalized angle.
*/
public static double normalizeAngle(double angleRadians) {
double angle = angleRadians;
while (angle<0) angle += 2*Math.PI;
while (angle>2*Math.PI) angle -= 2*Math.PI;
return angle;
}
/**
* This returns the smallest angle between two angles. This operates in radians.
*
* @param one one of the angles.
* @param two the other one.
* @return returns the smallest angle.
*/
public static double getSmallestAngleDifference(double one, double two) {
return Math.min(MathFunctions.normalizeAngle(one-two), MathFunctions.normalizeAngle(two-one));
}
/**
* This gets the direction to turn between a start heading and an end heading. Positive is left
* and negative is right. This operates in radians.
*
* @return returns the turn direction.
*/
public static double getTurnDirection(double startHeading, double endHeading) {
if (MathFunctions.normalizeAngle(endHeading-startHeading) >= 0 && MathFunctions.normalizeAngle(endHeading-startHeading) <= Math.PI) {
return 1; // counterclockwise
}
return -1; // clockwise
}
/**
* This returns the distance between a Pose2d and a Point.
*
* @param pose this is the Pose2d.
* @param point this is the Point.
* @return returns the distance between the two.
*/
public static double distance(Pose2d pose, Point point) {
return Math.sqrt(Math.pow(pose.getX()-point.getX(), 2) + Math.pow(pose.getY()-point.getY(), 2));
}
/**
* This returns the distance between a Pose2d and another Pose2d.
*
* @param one this is the first Pose2d.
* @param two this is the second Pose2d.
* @return returns the distance between the two.
*/
public static double distance(Pose2d one, Pose2d two) {
return Math.sqrt(Math.pow(one.getX()-two.getX(), 2) + Math.pow(one.getY()-two.getY(), 2));
}
/**
* This returns a Point that is the sum of the two input Point.
*
* @param one the first Point
* @param two the second Point
* @return returns the sum of the two Points.
*/
public static Point addPoints(Point one, Point two) {
return new Point(one.getX() + two.getX(), one.getY() + two.getY(), Point.CARTESIAN);
}
/**
* This subtracts the second Point from the first Point and returns the result as a Point.
* Do note that order matters here.
*
* @param one the first Point.
* @param two the second Point.
* @return returns the difference of the two Points.
*/
public static Point subtractPoints(Point one, Point two) {
return new Point(one.getX() - two.getX(), one.getY() - two.getY(), Point.CARTESIAN);
}
/**
* This multiplies a Point by a scalar and returns the result as a Point
*
* @param point the Point being multiplied.
* @param scalar the scalar multiplying into the Point.
* @return returns the scaled Point.
*/
public static Point scalarMultiplyPoint(Point point, double scalar) {
return new Point(point.getX()*scalar, point.getY()*scalar, Point.CARTESIAN);
}
/**
* Copies a Point, but with a different reference location in the memory. So basically a deep
* copy.
*
* @param point the Point to be deep copied.
* @return returns the copied Point.
*/
public static Point copyPoint(Point point) {
return new Point(point.getX(), point.getY(), Point.CARTESIAN);
}
/**
* Copies a Vector, but with a different reference location in the memory. So basically a deep
* copy.
*
* @param vector Vector to be deep copied.
* @return returns the copied Vector.
*/
public static Vector copyVector(Vector vector) {
return new Vector(vector.getMagnitude(), vector.getTheta());
}
/**
* This multiplies a Vector by a scalar and returns the result as a Vector.
*
* @param vector the Vector being multiplied.
* @param scalar the scalar multiplying into the Vector.
* @return returns the scaled Vector.
*/
public static Vector scalarMultiplyVector(Vector vector, double scalar) {
return new Vector(vector.getMagnitude()*scalar, vector.getTheta());
}
/**
* This normalizes a Vector to be of magnitude 1, unless the Vector is the zero Vector.
* In that case, it just returns back the zero Vector but with a different memory location.
*
* @param vector the Vector being normalized.
* @return returns the normalized (or zero) Vector.
*/
public static Vector normalizeVector(Vector vector) {
if (vector.getMagnitude() == 0) {
return new Vector(0.0, vector.getTheta());
} else {
return new Vector(vector.getMagnitude()/Math.abs(vector.getMagnitude()), vector.getTheta());
}
}
/**
* This returns a Vector that is the sum of the two input Vectors.
*
* @param one the first Vector.
* @param two the second Vector.
* @return returns the sum of the Vectors.
*/
public static Vector addVectors(Vector one, Vector two) {
Vector returnVector = new Vector();
returnVector.setOrthogonalComponents(one.getXComponent()+two.getXComponent(), one.getYComponent()+two.getYComponent());
return returnVector;
}
/**
* This subtracts the second Vector from the first Vector and returns the result as a Vector.
* Do note that order matters here.
*
* @param one the first Vector.
* @param two the second Vector.
* @return returns the second Vector subtracted from the first Vector.
*/
public static Vector subtractVectors(Vector one, Vector two) {
Vector returnVector = new Vector();
returnVector.setOrthogonalComponents(one.getXComponent()-two.getXComponent(), one.getYComponent()-two.getYComponent());
return returnVector;
}
/**
* This computes the dot product of the two Vectors.
*
* @param one the first Vector.
* @param two the second Vector.
* @return returns the dot product of the two Vectors.
*/
public static double dotProduct(Vector one, Vector two) {
return one.getXComponent()*two.getXComponent() + one.getYComponent()*two.getYComponent();
}
/**
* This computes the first Vector crossed with the second Vector, so a cross product.
* Do note that order matters here.
*
* @param one the first Vector.
* @param two the second Vector.
* @return returns the cross product of the two Vectors.
*/
public static double crossProduct(Vector one, Vector two) {
return one.getXComponent()*two.getYComponent() - one.getYComponent()*two.getXComponent();
}
/**
* This returns whether a specified value is within a second specified value by plus/minus a
* specified accuracy amount.
*
* @param one first number specified.
* @param two second number specified.
* @param accuracy the level of accuracy specified.
* @return returns if the two numbers are within the specified accuracy of each other.
*/
public static boolean roughlyEquals(double one, double two, double accuracy) {
return (one < two + accuracy && one > two - accuracy);
}
/**
* This returns whether a specified number is within a second specified number by plus/minus 0.0001.
*
* @param one first number specified.
* @param two second number specified.
* @return returns if a specified number is within plus/minus 0.0001 of the second specified number.
*/
public static boolean roughlyEquals(double one, double two) {
return roughlyEquals(one, two, 0.0001);
}
}
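The pure-double helpers in MathFunctions above (the choose function, angle normalization, and turn direction) can be exercised in isolation. A standalone sketch (the class name `MathSketch` is hypothetical, not part of this SDK) re-stating those helpers so their behavior can be checked:

```java
// Standalone sketch re-stating MathFunctions' pure-double helpers.
public class MathSketch {
    // "n choose r" computed as a product ratio, as in MathFunctions.nCr
    public static double nCr(int n, int r) {
        double num = 1, denom = 1;
        for (int i = n; i > n - r; i--) num *= i; // n * (n-1) * ... * (n-r+1)
        for (int i = 1; i <= r; i++) denom *= i;  // r!
        return num / denom;
    }

    // wrap an angle into [0, 2*pi], as in MathFunctions.normalizeAngle
    public static double normalizeAngle(double angleRadians) {
        double angle = angleRadians;
        while (angle < 0) angle += 2 * Math.PI;
        while (angle > 2 * Math.PI) angle -= 2 * Math.PI;
        return angle;
    }

    // smallest separation between two headings, as in getSmallestAngleDifference
    public static double smallestAngleDifference(double one, double two) {
        return Math.min(normalizeAngle(one - two), normalizeAngle(two - one));
    }

    // +1 for a counterclockwise turn, -1 for clockwise, as in getTurnDirection
    public static double turnDirection(double startHeading, double endHeading) {
        double diff = normalizeAngle(endHeading - startHeading);
        return (diff >= 0 && diff <= Math.PI) ? 1 : -1;
    }

    public static void main(String[] args) {
        System.out.println(nCr(5, 2));                         // 10.0
        System.out.println(normalizeAngle(-Math.PI / 2));      // 3*pi/2
        System.out.println(turnDirection(0, 3 * Math.PI / 2)); // -1.0 (clockwise)
    }
}
```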

package org.firstinspires.ftc.teamcode.pedroPathing.pathGeneration;
import com.acmerobotics.roadrunner.geometry.Pose2d;
import org.firstinspires.ftc.teamcode.pedroPathing.tuning.FollowerConstants;
import java.util.ArrayList;
/**
* This is the Path class. This class handles containing information on the actual path the Follower
* will follow, not just the shape of the path that the BezierCurve class handles. This contains
* information on the stop criteria for a Path, the heading interpolation, and deceleration.
*
* @author Anyi Lin - 10158 Scott's Bots
* @author Aaron Yang - 10158 Scott's Bots
* @author Harrison Womack - 10158 Scott's Bots
* @version 1.0, 3/10/2024
*/
public class Path {
private BezierCurve curve;
private double startHeading;
private double endHeading;
private double closestPointCurvature;
private double closestPointTValue;
private double linearInterpolationEndTime;
private Vector closestPointTangentVector;
private Vector closestPointNormalVector;
private boolean isTangentHeadingInterpolation = true;
private boolean followTangentReversed;
// A multiplier for the zero power acceleration to change the speed the robot decelerates at
// the end of paths.
// Increasing this will cause the robot to try to decelerate faster, at the risk of overshoots
// or localization slippage.
// Decreasing this will cause the deceleration at the end of the Path to be slower, making the
// robot slower but reducing risk of end-of-path overshoots or localization slippage.
// This can be set individually for each Path, but this is the default.
private double zeroPowerAccelerationMultiplier = FollowerConstants.zeroPowerAccelerationMultiplier;
// When the robot is at the end of its current Path or PathChain and the velocity goes
below this value, then end the Path. This is in inches/second.
// This can be custom set for each Path.
private double pathEndVelocityConstraint = FollowerConstants.pathEndVelocityConstraint;
// When the robot is at the end of its current Path or PathChain and the translational error
// goes below this value, then end the Path. This is in inches.
// This can be custom set for each Path.
private double pathEndTranslationalConstraint = FollowerConstants.pathEndTranslationalConstraint;
// When the robot is at the end of its current Path or PathChain and the heading error goes
// below this value, then end the Path. This is in radians.
// This can be custom set for each Path.
private double pathEndHeadingConstraint = FollowerConstants.pathEndHeadingConstraint;
// When the t-value of the closest point to the robot on the Path is greater than this value,
// then the Path is considered at its end.
// This can be custom set for each Path.
private double pathEndTValueConstraint = FollowerConstants.pathEndTValueConstraint;
// When the Path is considered at its end parametrically, then the Follower has this many
// seconds to further correct by default.
// This can be custom set for each Path.
private double pathEndTimeoutConstraint = FollowerConstants.pathEndTimeoutConstraint;
/**
* Creates a new Path from a BezierCurve. The default heading interpolation is tangential.
*
* @param curve the BezierCurve.
*/
public Path(BezierCurve curve) {
this.curve = curve;
}
/**
* This sets the heading interpolation to linear with a specified start heading and end heading
* for the Path. This will interpolate across the entire length of the Path, so there may be
* some issues with end heading accuracy and precision if this is used. If a precise end heading
* is necessary, then use the setLinearHeadingInterpolation(double startHeading,
* double endHeading, double endTime) method.
*
* @param startHeading The start of the linear heading interpolation.
* @param endHeading The end of the linear heading interpolation.
* This will be reached at the end of the Path if no end time is specified.
*/
public void setLinearHeadingInterpolation(double startHeading, double endHeading) {
linearInterpolationEndTime = 1;
isTangentHeadingInterpolation = false;
this.startHeading = startHeading;
this.endHeading = endHeading;
}
/**
* This sets the heading interpolation to linear with a specified start heading and end heading
* for the Path. This will interpolate from the start of the Path to the specified end time.
* This ensures higher accuracy and precision than interpolating across the entire Path. However,
* interpolating too quickly can cause undesired oscillations and inaccuracies of its own, so
* generally interpolating to something like 0.8 of your Path should work best.
*
* @param startHeading The start of the linear heading interpolation.
* @param endHeading The end of the linear heading interpolation.
* This will be reached at the end of the Path if no end time is specified.
* @param endTime The end time on the Path that the linear heading interpolation will finish.
* This value ranges from [0, 1] since Bezier curves are parametric functions.
*/
public void setLinearHeadingInterpolation(double startHeading, double endHeading, double endTime) {
linearInterpolationEndTime = MathFunctions.clamp(endTime, 0.000000001, 1);
isTangentHeadingInterpolation = false;
this.startHeading = startHeading;
this.endHeading = endHeading;
}
/**
* This sets the heading interpolation to maintain a constant heading.
*
* @param setHeading the constant heading for the Path.
*/
public void setConstantHeadingInterpolation(double setHeading) {
linearInterpolationEndTime = 1;
isTangentHeadingInterpolation = false;
startHeading = setHeading;
endHeading = setHeading;
}
/**
* This gets the closest Point from a specified pose to the BezierCurve with a binary search
* that is limited to some specified step limit.
*
* @param pose the pose.
* @param searchStepLimit the binary search step limit.
* @return returns the closest Point.
*/
public Pose2d getClosestPoint(Pose2d pose, int searchStepLimit) {
double lower = 0;
double upper = 1;
Point returnPoint;
// we don't need to calculate the midpoint, so we start off at the 1/4 and 3/4 point
for (int i = 0; i < searchStepLimit; i++) {
if (MathFunctions.distance(pose, getPoint(lower + 0.25 * (upper-lower))) > MathFunctions.distance(pose, getPoint(lower + 0.75 * (upper-lower)))) {
lower += (upper-lower)/2.0;
} else {
upper -= (upper-lower)/2.0;
}
}
closestPointTValue = lower + 0.5 * (upper-lower);
returnPoint = getPoint(closestPointTValue);
closestPointTangentVector = curve.getDerivative(closestPointTValue);
closestPointNormalVector = curve.getApproxSecondDerivative(closestPointTValue);
closestPointCurvature = curve.getCurvature(closestPointTValue);
return new Pose2d(returnPoint.getX(), returnPoint.getY(), getClosestPointHeadingGoal());
}
/**
* This sets whether to follow the tangent heading facing away from (reverse) or towards the
* tangent. This will also set your heading interpolation to tangential.
*
* @param set sets tangential heading reversed or not.
*/
public void setReversed(boolean set) {
isTangentHeadingInterpolation = true;
followTangentReversed = set;
}
/**
* This sets the heading interpolation to tangential.
*/
public void setTangentHeadingInterpolation() {
isTangentHeadingInterpolation = true;
followTangentReversed = false;
}
/**
* This returns the unit tangent Vector at the end of the BezierCurve.
*
* @return returns the end tangent Vector.
*/
public Vector getEndTangent() {
return curve.getEndTangent();
}
/**
* This returns the point on the Bezier curve that is specified by the parametric t value. A
* Bezier curve is a parametric function that returns points along it with t ranging from [0, 1],
* with 0 being the beginning of the curve and 1 being at the end. The Follower will follow
* BezierCurves from 0 to 1, in terms of t.
*
* @param t this is the t value of the parametric curve. t is clamped to be between 0 and 1 inclusive.
* @return this returns the point requested.
*/
public Point getPoint(double t) {
return curve.getPoint(t);
}
/**
* This returns the t-value of the closest Point on the BezierCurve.
*
* @return returns the closest Point t-value.
*/
public double getClosestPointTValue() {
return closestPointTValue;
}
/**
* This returns the approximated length of the BezierCurve.
*
* @return returns the length of the BezierCurve.
*/
public double length() {
return curve.length();
}
/**
* This returns the curvature of the BezierCurve at a specified t-value.
*
* @param t the specified t-value.
* @return returns the curvature of the BezierCurve at the specified t-value.
*/
public double getCurvature(double t) {
return curve.getCurvature(t);
}
/**
* This returns the curvature of the BezierCurve at the closest Point.
*
* @return returns the curvature of the BezierCurve at the closest Point.
*/
public double getClosestPointCurvature() {
return closestPointCurvature;
}
/**
* This returns the normal Vector at the closest Point.
*
* @return returns the normal Vector at the closest Point.
*/
public Vector getClosestPointNormalVector() {
return MathFunctions.copyVector(closestPointNormalVector);
}
/**
* This returns the tangent Vector at the closest Point.
*
* @return returns the tangent Vector at the closest Point.
*/
public Vector getClosestPointTangentVector() {
return MathFunctions.copyVector(closestPointTangentVector);
}
/**
* This returns the heading goal at the closest Point.
*
* @return returns the heading goal at the closest Point.
*/
public double getClosestPointHeadingGoal() {
if (isTangentHeadingInterpolation) {
if (followTangentReversed) return MathFunctions.normalizeAngle(closestPointTangentVector.getTheta() + Math.PI);
return closestPointTangentVector.getTheta();
} else {
return getHeadingGoal(closestPointTValue);
}
}
/**
* This gets the heading goal at a specified t-value.
*
* @param t the specified t-value.
* @return returns the heading goal at the specified t-value.
*/
public double getHeadingGoal(double t) {
if (isTangentHeadingInterpolation) {
if (followTangentReversed) return MathFunctions.normalizeAngle(curve.getDerivative(t).getTheta() + Math.PI);
return curve.getDerivative(t).getTheta();
} else {
if (t > linearInterpolationEndTime) {
return MathFunctions.normalizeAngle(endHeading);
}
return MathFunctions.normalizeAngle(startHeading + MathFunctions.getTurnDirection(startHeading, endHeading) * MathFunctions.getSmallestAngleDifference(endHeading, startHeading) * (t / linearInterpolationEndTime));
}
}
/**
* This returns if the robot is at the end of the Path, according to the parametric t-value.
*
* @return returns if at end.
*/
public boolean isAtParametricEnd() {
return closestPointTValue >= pathEndTValueConstraint;
}
/**
* This returns if the robot is at the beginning of the Path, according to the parametric t-value.
*
* @return returns if at start.
*/
public boolean isAtParametricStart() {
return closestPointTValue <= 1 - pathEndTValueConstraint;
}
/**
* Returns the ArrayList of control points for this BezierCurve.
*
* @return This returns the control points.
*/
public ArrayList<Point> getControlPoints() {
return curve.getControlPoints();
}
/**
* Returns the first control point for this BezierCurve.
*
* @return This returns the Point.
*/
public Point getFirstControlPoint() {
return curve.getFirstControlPoint();
}
/**
* Returns the second control point, or the one after the start, for this BezierCurve.
*
* @return This returns the Point.
*/
public Point getSecondControlPoint() {
return curve.getSecondControlPoint();
}
/**
* Returns the second to last control point for this BezierCurve.
*
* @return This returns the Point.
*/
public Point getSecondToLastControlPoint() {
return curve.getSecondToLastControlPoint();
}
/**
* Returns the last control point for this BezierCurve.
*
* @return This returns the Point.
*/
public Point getLastControlPoint() {
return curve.getLastControlPoint();
}
/**
* This sets the path's deceleration factor in terms of the natural deceleration of the robot
* when power is cut from the drivetrain.
*
* @param set This sets the multiplier.
*/
public void setZeroPowerAccelerationMultiplier(double set) {
zeroPowerAccelerationMultiplier = set;
}
/**
* This sets the velocity stop criteria. When velocity is below this amount, then this is met.
*
* @param set This sets the velocity end constraint.
*/
public void setPathEndVelocityConstraint(double set) {
pathEndVelocityConstraint = set;
}
/**
* This sets the translational stop criteria. When the translational error, or how far off the
* end point the robot is, goes below this, then the translational end criteria is met.
*
* @param set This sets the translational end constraint.
*/
public void setPathEndTranslationalConstraint(double set) {
pathEndTranslationalConstraint = set;
}
/**
* This sets the heading stop criteria. When the heading error is less than this amount, then
* the heading end criteria is met.
*
* @param set This sets the heading end constraint.
*/
public void setPathEndHeadingConstraint(double set) {
pathEndHeadingConstraint = set;
}
/**
* This sets the parametric end criteria. When the t-value of the closest Point on the Path is
* greater than this amount, then the parametric end criteria is met.
*
* @param set This sets the t-value end constraint.
*/
public void setPathEndTValueConstraint(double set) {
pathEndTValueConstraint = set;
}
/**
* This sets the Path end timeout. If the Path is at its end parametrically, then the Follower
* has this many seconds to correct before the Path gets ended anyways.
*
* @param set This sets the Path end timeout.
*/
public void setPathEndTimeoutConstraint(double set) {
pathEndTimeoutConstraint = set;
}
/**
* This gets the deceleration multiplier.
*
* @return This returns the deceleration multiplier.
*/
public double getZeroPowerAccelerationMultiplier() {
return zeroPowerAccelerationMultiplier;
}
/**
* This gets the velocity stop criteria.
*
* @return This returns the velocity stop criteria.
*/
public double getPathEndVelocityConstraint() {
return pathEndVelocityConstraint;
}
/**
* This gets the translational stop criteria.
*
* @return This returns the translational stop criteria.
*/
public double getPathEndTranslationalConstraint() {
return pathEndTranslationalConstraint;
}
/**
* This gets the heading stop criteria.
*
* @return This returns the heading stop criteria.
*/
public double getPathEndHeadingConstraint() {
return pathEndHeadingConstraint;
}
/**
* This gets the parametric end criteria.
*
* @return This returns the parametric end criteria.
*/
public double getPathEndTValueConstraint() {
return pathEndTValueConstraint;
}
/**
* This gets the Path end correction time.
*
* @return This returns the Path end correction time.
*/
public double getPathEndTimeoutConstraint() {
return pathEndTimeoutConstraint;
}
/**
* Returns the type of path. This is used in case we need to identify the type of BezierCurve
* this is.
*
* @return returns the type of path.
*/
public String pathType() {
return curve.pathType();
}
}
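Path.getClosestPoint above narrows in on the closest t-value with a quarter-point binary search: each step compares the distance from the pose to the points at 1/4 and 3/4 of the remaining [lower, upper] t-interval and discards the half that is farther away. A standalone sketch of that search on a straight-line path (the class name `ClosestPointSketch` and the fixed (0,0)-(10,0) segment are illustrative assumptions, not part of this SDK):

```java
// Standalone sketch of the quarter-point binary search used in
// Path.getClosestPoint, run against the line segment from (0,0) to (10,0).
public class ClosestPointSketch {
    // point on the line from (0,0) to (10,0) at parameter t
    static double pointX(double t) { return 10 * t; }
    static double pointY(double t) { return 0; }

    // distance from the query point (px, py) to the path point at t
    static double distance(double px, double py, double t) {
        return Math.hypot(px - pointX(t), py - pointY(t));
    }

    // binary search for the t-value of the closest point,
    // limited to searchStepLimit halving steps
    public static double closestT(double px, double py, int searchStepLimit) {
        double lower = 0, upper = 1;
        for (int i = 0; i < searchStepLimit; i++) {
            if (distance(px, py, lower + 0.25 * (upper - lower))
                    > distance(px, py, lower + 0.75 * (upper - lower))) {
                lower += (upper - lower) / 2.0; // closest point is in the upper half
            } else {
                upper -= (upper - lower) / 2.0; // closest point is in the lower half
            }
        }
        return lower + 0.5 * (upper - lower);
    }

    public static void main(String[] args) {
        // closest point on the segment (0,0)-(10,0) to (3,5) is at t = 0.3
        System.out.println(closestT(3, 5, 30));
    }
}
```

Each iteration halves the search interval, so the step limit directly bounds the t-resolution: 30 steps narrows the interval to about 1e-9.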

package org.firstinspires.ftc.teamcode.pedroPathing.pathGeneration;
import java.util.ArrayList;
/**
* This is the PathBuilder class. This class makes it easier to create PathChains, so you don't have
* to individually create Path instances to create a PathChain. A PathBuilder can be accessed
* through running the pathBuilder() method on an instance of the Follower class, or just creating
* an instance of PathBuilder regularly.
*
* @author Anyi Lin - 10158 Scott's Bots
* @author Aaron Yang - 10158 Scott's Bots
* @author Harrison Womack - 10158 Scott's Bots
* @version 1.0, 3/11/2024
*/
public class PathBuilder {
private ArrayList<Path> paths = new ArrayList<>();
private ArrayList<PathCallback> callbacks = new ArrayList<>();
/**
* This is an empty constructor for the PathBuilder class so it can get started.
* The PathBuilder allows for easier construction of PathChains.
* The proper usage is using an instance of the Follower class:
* Follower follower = new Follower(hardwareMap);
* Then calling "follower.pathBuilder().[INSERT PATH BUILDING METHODS].build();"
* Of course, you can split up the method calls onto separate lines for readability.
*/
public PathBuilder() {
}
/**
* This adds a Path to the PathBuilder.
*
* @param path The Path being added.
* @return This returns itself with the updated data.
*/
public PathBuilder addPath(Path path) {
this.paths.add(path);
return this;
}
/**
* This adds a default Path defined by a specified BezierCurve to the PathBuilder.
*
* @param curve The curve is turned into a Path and added.
* @return This returns itself with the updated data.
*/
public PathBuilder addPath(BezierCurve curve) {
this.paths.add(new Path(curve));
return this;
}
/**
* This sets a linear heading interpolation on the last Path added to the PathBuilder.
*
* @param startHeading The start of the linear heading interpolation.
* @param endHeading The end of the linear heading interpolation.
* This will be reached at the end of the Path if no end time is specified.
* @return This returns itself with the updated data.
*/
public PathBuilder setLinearHeadingInterpolation(double startHeading, double endHeading) {
this.paths.get(paths.size()-1).setLinearHeadingInterpolation(startHeading, endHeading);
return this;
}
/**
* This sets a linear heading interpolation on the last Path added to the PathBuilder.
*
* @param startHeading The start of the linear heading interpolation.
* @param endHeading The end of the linear heading interpolation.
* This will be reached at the end of the Path if no end time is specified.
* @param endTime The end time on the Path that the linear heading interpolation will end.
* This value goes from [0, 1] since Bezier curves are parametric functions.
* @return This returns itself with the updated data.
*/
public PathBuilder setLinearHeadingInterpolation(double startHeading, double endHeading, double endTime) {
this.paths.get(paths.size()-1).setLinearHeadingInterpolation(startHeading, endHeading, endTime);
return this;
}
/**
* This sets a constant heading interpolation on the last Path added to the PathBuilder.
*
* @param setHeading The constant heading specified.
* @return This returns itself with the updated data.
*/
public PathBuilder setConstantHeadingInterpolation(double setHeading) {
this.paths.get(paths.size()-1).setConstantHeadingInterpolation(setHeading);
return this;
}
/**
* This sets a reversed or tangent heading interpolation on the last Path added to the PathBuilder.
*
* @param set This sets the heading to reversed tangent following if this parameter is true.
* Conversely, this sets a tangent following if this parameter is false.
* @return This returns itself with the updated data.
*/
public PathBuilder setReversed(boolean set) {
this.paths.get(paths.size()-1).setReversed(set);
return this;
}
/**
* This sets the heading interpolation to tangential on the last Path added to the PathBuilder.
* There really shouldn't be a reason to use this, since the default heading interpolation is
* tangential, but it's here for completeness.
*
* @return This returns itself with the updated data.
*/
public PathBuilder setTangentHeadingInterpolation() {
this.paths.get(paths.size()-1).setTangentHeadingInterpolation();
return this;
}
/**
* This sets the deceleration multiplier on the last Path added to the PathBuilder.
*
* @param set This sets the multiplier for the goal for the deceleration of the robot.
* @return This returns itself with the updated data.
*/
public PathBuilder setZeroPowerAccelerationMultiplier(double set) {
this.paths.get(paths.size()-1).setZeroPowerAccelerationMultiplier(set);
return this;
}
/**
* This sets the path end velocity constraint on the last Path added to the PathBuilder.
*
* @param set This sets the path end velocity constraint.
* @return This returns itself with the updated data.
*/
public PathBuilder setPathEndVelocityConstraint(double set) {
this.paths.get(paths.size()-1).setPathEndVelocityConstraint(set);
return this;
}
/**
* This sets the path end translational constraint on the last Path added to the PathBuilder.
*
* @param set This sets the path end translational constraint.
* @return This returns itself with the updated data.
*/
public PathBuilder setPathEndTranslationalConstraint(double set) {
this.paths.get(paths.size()-1).setPathEndTranslationalConstraint(set);
return this;
}
/**
* This sets the path end heading constraint on the last Path added to the PathBuilder.
*
* @param set This sets the path end heading constraint.
* @return This returns itself with the updated data.
*/
public PathBuilder setPathEndHeadingConstraint(double set) {
this.paths.get(paths.size()-1).setPathEndHeadingConstraint(set);
return this;
}
/**
* This sets the path end t-value (parametric time) constraint on the last Path added to the PathBuilder.
*
* @param set This sets the path end t-value (parametric time) constraint.
* @return This returns itself with the updated data.
*/
public PathBuilder setPathEndTValueConstraint(double set) {
this.paths.get(paths.size()-1).setPathEndTValueConstraint(set);
return this;
}
/**
* This sets the path end timeout constraint on the last Path added to the PathBuilder.
*
* @param set This sets the path end timeout constraint.
* @return This returns itself with the updated data.
*/
public PathBuilder setPathEndTimeoutConstraint(double set) {
this.paths.get(paths.size()-1).setPathEndTimeoutConstraint(set);
return this;
}
/**
* This adds a temporal callback on the last Path added to the PathBuilder.
* This callback is set to run at a specified number of milliseconds after the start of the path.
*
* @param time This sets the number of milliseconds of wait between the start of the Path and
* the calling of the callback.
* @param runnable This sets the code for the callback to run. Use lambda statements for this.
* @return This returns itself with the updated data.
*/
public PathBuilder addTemporalCallback(double time, Runnable runnable) {
this.callbacks.add(new PathCallback(time, runnable, PathCallback.TIME, paths.size()-1));
return this;
}
/**
* This adds a parametric callback on the last Path added to the PathBuilder.
* This callback is set to run at a certain point on the Path.
*
* @param t This sets the t-value (parametric time) on the Path for when to run the callback.
* @param runnable This sets the code for the callback to run. Use lambda statements for this.
* @return This returns itself with the updated data.
*/
public PathBuilder addParametricCallback(double t, Runnable runnable) {
this.callbacks.add(new PathCallback(t, runnable, PathCallback.PARAMETRIC, paths.size()-1));
return this;
}
/**
* This builds all the Path and callback information together into a PathChain.
*
* @return This returns a PathChain made of all the specified paths and callbacks.
*/
public PathChain build() {
PathChain returnChain = new PathChain(paths);
returnChain.setCallbacks(callbacks);
return returnChain;
}
}
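The fluent pattern that build() completes — each mutator edits the last element added and returns `this` so calls can be chained — can be sketched standalone. The names here (ChainBuilder, addSegment, tagLast) are illustrative only, not the Pedro Pathing API:

```java
import java.util.ArrayList;
import java.util.List;

// Minimal sketch of the fluent-builder pattern PathBuilder uses: mutators
// act on the most recently added element and return `this` for chaining.
class ChainBuilder {
    private final List<String> segments = new ArrayList<>();

    public ChainBuilder addSegment(String segment) {
        segments.add(segment);
        return this; // returning this enables chaining
    }

    public ChainBuilder tagLast(String tag) {
        // like setReversed() etc., this modifies the last element added
        int last = segments.size() - 1;
        segments.set(last, segments.get(last) + ":" + tag);
        return this;
    }

    public List<String> build() {
        return new ArrayList<>(segments); // snapshot, like PathChain
    }
}
```

Usage mirrors the Javadoc above: `new ChainBuilder().addSegment("a").tagLast("rev").addSegment("b").build()`.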

package org.firstinspires.ftc.teamcode.pedroPathing.pathGeneration;
import org.firstinspires.ftc.teamcode.pedroPathing.util.SingleRunAction;
/**
* This is the PathCallback class. This class handles callbacks of Runnables in PathChains.
* Basically, this allows you to run non-blocking code in the middle of PathChains.
*
* @author Anyi Lin - 10158 Scott's Bots
* @author Aaron Yang - 10158 Scott's Bots
* @author Harrison Womack - 10158 Scott's Bots
* @version 1.0, 3/11/2024
*/
public class PathCallback extends SingleRunAction {
private double startCondition;
private int type;
private int index;
public static final int TIME = 0;
public static final int PARAMETRIC = 1;
/**
* This creates a new PathCallback with a specified start condition (either time or parametric),
* a Runnable of code to run (preferably a lambda statement), a type (using the class constants),
* and an index for which Path within a PathChain the callback is to run on.
*
* @param startCondition This defines when the callback is to be run, either as a wait time in
* milliseconds or a t-value (parametric time) point.
* @param runnable This contains the code to run when the callback is called.
* @param type This defines the type of callback using the class constants.
* @param index This defines which Path within the PathChain the callback is to run on.
*/
public PathCallback(double startCondition, Runnable runnable, int type, int index) {
super(runnable);
this.startCondition = startCondition;
this.type = type;
if (this.type != TIME && this.type != PARAMETRIC) {
this.type = PARAMETRIC;
}
if (this.type == TIME && this.startCondition < 0) {
this.startCondition = 0.0;
}
if (this.type == PARAMETRIC) {
this.startCondition = MathFunctions.clamp(this.startCondition, 0, 1);
}
this.index = index;
}
/**
* This returns the type of callback this is (time or parametric).
*
* @return This returns the type of callback.
*/
public int getType() {
return type;
}
/**
* This returns the start condition for this callback. This will be the wait time in milliseconds
* if this is a time callback or a t-value if this is a parametric callback.
*
* @return This returns the start condition.
*/
public double getStartCondition() {
return startCondition;
}
/**
* This returns the index of which Path the callback is to run on within the PathChain.
*
* @return This returns the Path index of this callback.
*/
public int getIndex() {
return index;
}
}
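The constructor's validation can be sketched standalone, assuming the intended behavior is: an unrecognized type falls back to PARAMETRIC, time waits are clamped to be non-negative, and t-values are clamped to [0, 1]. CallbackValidation is a hypothetical name for illustration:

```java
// Sketch of PathCallback's argument validation.
class CallbackValidation {
    static final int TIME = 0;
    static final int PARAMETRIC = 1;

    // Returns {validatedType, validatedStartCondition}.
    static double[] validate(int type, double startCondition) {
        // unknown types fall back to parametric callbacks
        if (type != TIME && type != PARAMETRIC) type = PARAMETRIC;
        // a time callback can't wait a negative number of milliseconds
        if (type == TIME && startCondition < 0) startCondition = 0.0;
        // a parametric callback's t-value must lie in [0, 1]
        if (type == PARAMETRIC) startCondition = Math.max(0, Math.min(1, startCondition));
        return new double[] {type, startCondition};
    }
}
```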

package org.firstinspires.ftc.teamcode.pedroPathing.pathGeneration;
import java.util.ArrayList;
/**
* This is the PathChain class. This class handles chaining together multiple Paths into a larger
* collection of Paths that can be run continuously. Additionally, this allows for the addition of
* PathCallbacks to specific Paths in the PathChain, allowing for non-blocking code to be run in
* the middle of a PathChain.
*
* @author Anyi Lin - 10158 Scott's Bots
* @author Aaron Yang - 10158 Scott's Bots
* @author Harrison Womack - 10158 Scott's Bots
* @version 1.0, 3/11/2024
*/
public class PathChain {
private ArrayList<Path> pathChain = new ArrayList<>();
private ArrayList<PathCallback> callbacks = new ArrayList<>();
/**
* This creates a new PathChain from some specified Paths.
*
* IMPORTANT NOTE: Order matters here. The order in which the Paths are input is the order in
* which they will be run.
*
* @param paths the specified Paths.
*/
public PathChain(Path... paths) {
for (Path path : paths) {
pathChain.add(path);
}
}
/**
* This creates a new PathChain from an ArrayList of Paths.
*
* IMPORTANT NOTE: Order matters here. The order in which the Paths are input is the order in
* which they will be run.
*
* @param paths the ArrayList of Paths.
*/
public PathChain(ArrayList<Path> paths) {
pathChain = paths;
}
/**
* This returns the Path on the PathChain at a specified index.
*
* @param index the index.
* @return returns the Path at the index.
*/
public Path getPath(int index) {
return pathChain.get(index);
}
/**
* This returns the size of the PathChain.
*
* @return returns the size of the PathChain.
*/
public int size() {
return pathChain.size();
}
/**
* This sets the PathCallbacks of the PathChain with some specified PathCallbacks.
*
* @param callbacks the specified PathCallbacks.
*/
public void setCallbacks(PathCallback... callbacks) {
for (PathCallback callback : callbacks) {
this.callbacks.add(callback);
}
}
/**
* This sets the PathCallbacks of the PathChain with an ArrayList of PathCallbacks.
*
* @param callbacks the ArrayList of PathCallbacks.
*/
public void setCallbacks(ArrayList<PathCallback> callbacks) {
this.callbacks = callbacks;
}
/**
* This returns the PathCallbacks of this PathChain in an ArrayList.
*
* @return returns the PathCallbacks.
*/
public ArrayList<PathCallback> getCallbacks() {
return callbacks;
}
}

package org.firstinspires.ftc.teamcode.pedroPathing.pathGeneration;
import com.acmerobotics.roadrunner.geometry.Pose2d;
/**
* This is the Point class. This class handles storing information about the location of points in
* 2D space in both Cartesian, or rectangular, and polar coordinates. Additionally, this contains
* the method to find the distance between two Points.
*
* @author Anyi Lin - 10158 Scott's Bots
* @author Aaron Yang - 10158 Scott's Bots
* @author Harrison Womack - 10158 Scott's Bots
* @version 1.0, 3/11/2024
*/
public class Point {
// IMPORTANT NOTE: theta is defined in radians.
// These are the values of the coordinate defined by this Point, in both polar and
// Cartesian systems.
private double r;
private double theta;
private double x;
private double y;
// these are used for ease of changing/setting identification
public static final int POLAR = 0;
public static final int CARTESIAN = 1;
/**
* This creates a new Point with coordinate inputs and a specified coordinate system.
*
* @param rOrX Depending on the coordinate system specified, this is either the r or x value.
* In polar coordinates, the r value is the distance from the origin.
* In Cartesian coordinates, the x value is the distance left/right from the origin.
* @param thetaOrY Depending on the coordinate system specified, this is either the theta or
* y value.
* In polar coordinates, the theta value is the angle from the positive x-axis.
* Increasing theta moves in the counter-clockwise direction.
* In Cartesian coordinates, the y value is the distance up/down from the origin.
* @param identifier this specifies what coordinate system the coordinate inputs are in.
*/
public Point(double rOrX, double thetaOrY, int identifier) {
setCoordinates(rOrX, thetaOrY, identifier);
}
/**
* This creates a new Point from a Pose2d.
*
* @param pose the Pose2d.
*/
public Point(Pose2d pose) {
setCoordinates(pose.getX(), pose.getY(), CARTESIAN);
}
/**
* This sets the coordinates of the Point using the specified coordinate system.
*
* @param rOrX Depending on the coordinate system specified, this is either the r or x value.
* In polar coordinates, the r value is the distance from the origin.
* In Cartesian coordinates, the x value is the distance left/right from the origin.
* @param thetaOrY Depending on the coordinate system specified, this is either the theta or
* y value.
* In polar coordinates, the theta value is the angle from the positive x-axis.
* Increasing theta moves in the counter-clockwise direction.
* In Cartesian coordinates, the y value is the distance up/down from the origin.
* @param identifier this specifies what coordinate system to use when setting values.
*/
public void setCoordinates(double rOrX, double thetaOrY, int identifier) {
double[] setOtherCoordinates;
switch (identifier) { // this detects which coordinate system to use
// there is no POLAR case since that's covered by the default
case CARTESIAN:
x = rOrX;
y = thetaOrY;
setOtherCoordinates = cartesianToPolar(x, y);
r = setOtherCoordinates[0];
theta = setOtherCoordinates[1];
break;
default:
if (rOrX<0) {
r = -rOrX;
theta = MathFunctions.normalizeAngle(thetaOrY+Math.PI);
} else {
r = rOrX;
theta = MathFunctions.normalizeAngle(thetaOrY);
}
setOtherCoordinates = polarToCartesian(r, theta);
x = setOtherCoordinates[0];
y = setOtherCoordinates[1];
break;
}
}
/**
* Calculates the distance between this Point and some other specified Point.
*
* @param otherPoint the other specified Point.
* @return returns the distance between the two Points.
*/
public double distanceFrom(Point otherPoint) {
return Math.sqrt(Math.pow(otherPoint.getX()-x, 2) + Math.pow(otherPoint.getY()-y, 2));
}
/**
* This takes in an r and theta value and converts them to Cartesian coordinates.
*
* @param r this is the r value of the Point being converted.
* @param theta this is the theta value of the Point being converted.
* @return this returns the x and y values, in that order, in an Array of doubles.
*/
public static double[] polarToCartesian(double r, double theta) {
return new double[] {r * Math.cos(theta), r * Math.sin(theta)};
}
/**
* This takes in an x and y value and converts them to polar coordinates.
*
* @param x this is the x value of the Point being converted.
* @param y this is the y value of the Point being converted.
* @return this returns the r and theta values, in that order, in an Array of doubles.
*/
public static double[] cartesianToPolar(double x, double y) {
if (x == 0) {
if (y > 0) {
return new double[] {Math.abs(y), Math.PI/2};
} else {
return new double[] {Math.abs(y), (3 * Math.PI) / 2};
}
}
double r = Math.sqrt(x*x+y*y);
if (x < 0) return new double[] {r, Math.PI+Math.atan(y/x)};
if (y > 0) {
return new double[]{r, Math.atan(y / x)};
} else {
return new double[]{r, (2*Math.PI) + Math.atan(y / x)};
}
}
/**
* Returns the r value of this Point. This is a polar coordinate value.
*
* @return returns the r value.
*/
public double getR() {
return r;
}
/**
* Returns the theta value of this Point. This is a polar coordinate value.
*
* @return returns the theta value.
*/
public double getTheta() {
return theta;
}
/**
* Returns the x value of this Point. This is a Cartesian coordinate value.
*
* @return returns the x value.
*/
public double getX() {
return x;
}
/**
* Returns the y value of this Point. This is a Cartesian coordinate value.
*
* @return returns the y value.
*/
public double getY() {
return y;
}
}
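The two conversions above can be sketched more compactly with Math.atan2, which folds the quadrant handling in cartesianToPolar into one call; mapping the result into [0, 2*pi) matches the range Point produces. PolarMath is an illustrative name, not part of the library:

```java
// Standalone sketch of the polar <-> Cartesian conversions Point performs.
class PolarMath {
    static double[] polarToCartesian(double r, double theta) {
        return new double[] {r * Math.cos(theta), r * Math.sin(theta)};
    }

    static double[] cartesianToPolar(double x, double y) {
        double r = Math.hypot(x, y);           // sqrt(x^2 + y^2)
        double theta = Math.atan2(y, x);       // handles all quadrants and x == 0
        if (theta < 0) theta += 2 * Math.PI;   // map (-pi, pi] into [0, 2*pi)
        return new double[] {r, theta};
    }
}
```

A round trip through both conversions recovers the original point up to floating-point error.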

package org.firstinspires.ftc.teamcode.pedroPathing.pathGeneration;
/**
* This is the Vector class. This class handles storing information about vectors, which are
* basically Points but using polar coordinates as the default. The main reason this class exists
* is because some vector math needs to be done in the Follower, and dot products and cross
* products of Points just don't seem right. Also, there are a few more methods in here that make
* using Vectors a little easier than using a Point in polar coordinates.
*
* @author Anyi Lin - 10158 Scott's Bots
* @author Aaron Yang - 10158 Scott's Bots
* @author Harrison Womack - 10158 Scott's Bots
* @version 1.0, 3/11/2024
*/
public class Vector {
// IMPORTANT NOTE: theta is defined in radians.
// These are the values of this Vector, in both polar (magnitude and theta) and
// Cartesian (x and y component) forms.
private double magnitude;
private double theta;
private double xComponent;
private double yComponent;
/**
* This creates a new Vector with zero magnitude and direction.
*/
public Vector() {
setComponents(0, 0);
}
/**
* This creates a new Vector with a specified magnitude and direction.
*
* @param magnitude magnitude of the Vector.
* @param theta the direction of the Vector in radians.
*/
public Vector(double magnitude, double theta) {
setComponents(magnitude, theta);
}
/**
* This sets the components of the Vector in regular vector coordinates.
*
* @param magnitude sets the magnitude of this Vector.
* @param theta sets the theta value of this Vector.
*/
public void setComponents(double magnitude, double theta) {
double[] orthogonalComponents;
if (magnitude<0) {
this.magnitude = -magnitude;
this.theta = MathFunctions.normalizeAngle(theta+Math.PI);
} else {
this.magnitude = magnitude;
this.theta = MathFunctions.normalizeAngle(theta);
}
orthogonalComponents = Point.polarToCartesian(this.magnitude, this.theta);
xComponent = orthogonalComponents[0];
yComponent = orthogonalComponents[1];
}
/**
* This sets only the magnitude of the Vector.
*
* @param magnitude sets the magnitude of this Vector.
*/
public void setMagnitude(double magnitude) {
setComponents(magnitude, theta);
}
/**
* This sets only the angle, theta, of the Vector.
*
* @param theta sets the angle, or theta value, of this Vector.
*/
public void setTheta(double theta) {
setComponents(magnitude, theta);
}
/**
* This rotates the Vector by an angle, theta.
*
* @param theta2 the angle to be added.
*/
public void rotateVector(double theta2) {
setTheta(theta+theta2);
}
/**
* This sets the orthogonal components of the Vector. These orthogonal components are assumed
* to be in the direction of the x-axis and y-axis. In other words, this is setting the
* coordinates of the Vector using x and y coordinates instead of a direction and magnitude.
*
* @param xComponent sets the x component of this Vector.
* @param yComponent sets the y component of this Vector.
*/
public void setOrthogonalComponents(double xComponent, double yComponent) {
double[] polarComponents;
this.xComponent = xComponent;
this.yComponent = yComponent;
polarComponents = Point.cartesianToPolar(xComponent, yComponent);
magnitude = polarComponents[0];
theta = polarComponents[1];
}
/**
* Returns the magnitude of this Vector.
*
* @return returns the magnitude.
*/
public double getMagnitude() {
return magnitude;
}
/**
* Returns the theta value, or angle, of this Vector.
*
* @return returns the theta value.
*/
public double getTheta() {
return theta;
}
/**
* Returns the x component of this Vector.
*
* @return returns the x component.
*/
public double getXComponent() {
return xComponent;
}
/**
* Returns the y component of this Vector.
*
* @return returns the y component.
*/
public double getYComponent() {
return yComponent;
}
}
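The negative-magnitude handling in setComponents relies on the identity that a vector of magnitude -m at angle theta is the same vector as one of magnitude m at angle theta + pi. That normalization can be sketched on its own (VectorNorm is a hypothetical name, and the wrap below stands in for MathFunctions.normalizeAngle):

```java
// Sketch of the negative-magnitude normalization in Vector.setComponents.
class VectorNorm {
    // Returns {normalizedMagnitude, normalizedTheta}.
    static double[] normalize(double magnitude, double theta) {
        if (magnitude < 0) {
            magnitude = -magnitude;   // flip the sign of the magnitude...
            theta += Math.PI;         // ...and rotate the direction by pi
        }
        // wrap theta into [0, 2*pi), as normalizeAngle does
        theta = ((theta % (2 * Math.PI)) + 2 * Math.PI) % (2 * Math.PI);
        return new double[] {magnitude, theta};
    }
}
```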

package org.firstinspires.ftc.teamcode.pedroPathing.tuning;
import com.acmerobotics.dashboard.FtcDashboard;
import com.acmerobotics.dashboard.config.Config;
import com.acmerobotics.dashboard.telemetry.MultipleTelemetry;
import com.qualcomm.robotcore.eventloop.opmode.Autonomous;
import com.qualcomm.robotcore.eventloop.opmode.OpMode;
import org.firstinspires.ftc.robotcore.external.Telemetry;
import org.firstinspires.ftc.teamcode.pedroPathing.follower.Follower;
import org.firstinspires.ftc.teamcode.pedroPathing.pathGeneration.BezierCurve;
import org.firstinspires.ftc.teamcode.pedroPathing.pathGeneration.PathChain;
import org.firstinspires.ftc.teamcode.pedroPathing.pathGeneration.Point;
/**
* This is the Circle autonomous OpMode. It runs the robot in a PathChain that's actually not quite
* a circle, but some Bezier curves that have control points set essentially in a square. However,
* it turns enough to tune your centripetal force correction and some of your heading. Some lag in
* heading is to be expected.
*
* @author Anyi Lin - 10158 Scott's Bots
* @author Aaron Yang - 10158 Scott's Bots
* @author Harrison Womack - 10158 Scott's Bots
* @version 1.0, 3/12/2024
*/
@Config
@Autonomous (name = "Circle", group = "Autonomous Pathing Tuning")
public class Circle extends OpMode {
private Telemetry telemetryA;
public static double RADIUS = 10;
private Follower follower;
private PathChain circle;
/**
* This initializes the Follower and creates the PathChain for the "circle". Additionally, this
* initializes the FTC Dashboard telemetry.
*/
@Override
public void init() {
follower = new Follower(hardwareMap);
circle = follower.pathBuilder()
.addPath(new BezierCurve(new Point(0,0, Point.CARTESIAN), new Point(RADIUS,0, Point.CARTESIAN), new Point(RADIUS, RADIUS, Point.CARTESIAN)))
.addPath(new BezierCurve(new Point(RADIUS, RADIUS, Point.CARTESIAN), new Point(RADIUS,2*RADIUS, Point.CARTESIAN), new Point(0,2*RADIUS, Point.CARTESIAN)))
.addPath(new BezierCurve(new Point(0,2*RADIUS, Point.CARTESIAN), new Point(-RADIUS,2*RADIUS, Point.CARTESIAN), new Point(-RADIUS, RADIUS, Point.CARTESIAN)))
.addPath(new BezierCurve(new Point(-RADIUS, RADIUS, Point.CARTESIAN), new Point(-RADIUS,0, Point.CARTESIAN), new Point(0,0, Point.CARTESIAN)))
.build();
follower.followPath(circle);
telemetryA = new MultipleTelemetry(this.telemetry, FtcDashboard.getInstance().getTelemetry());
telemetryA.addLine("This will run in a roughly circular shape of radius " + RADIUS
+ ", starting on the right-most edge. So, make sure you have enough "
+ "space to the left, front, and back to run the OpMode.");
telemetryA.update();
}
/**
* This runs the OpMode, updating the Follower as well as printing out the debug statements to
* the Telemetry, as well as the FTC Dashboard.
*/
@Override
public void loop() {
follower.update();
if (follower.atParametricEnd()) {
follower.followPath(circle);
}
follower.telemetryDebug(telemetryA);
}
}
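A small quadratic Bezier evaluator shows why this path is only roughly circular: at t = 0.5 the first segment passes through (0.75 * RADIUS, 0.25 * RADIUS), which sits about 1.06 * RADIUS from the circle's center at (0, RADIUS) rather than exactly RADIUS away. QuadBezier is an illustrative sketch, not part of the library:

```java
// Evaluates a quadratic Bezier curve B(t) = (1-t)^2*P0 + 2(1-t)t*P1 + t^2*P2.
class QuadBezier {
    static double[] eval(double[] p0, double[] p1, double[] p2, double t) {
        double a = (1 - t) * (1 - t);  // weight on P0
        double b = 2 * (1 - t) * t;    // weight on P1 (the control point)
        double c = t * t;              // weight on P2
        return new double[] {
            a * p0[0] + b * p1[0] + c * p2[0],
            a * p0[1] + b * p1[1] + c * p2[1]
        };
    }
}
```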

package org.firstinspires.ftc.teamcode.pedroPathing.tuning;
import com.acmerobotics.dashboard.FtcDashboard;
import com.acmerobotics.dashboard.config.Config;
import com.acmerobotics.dashboard.telemetry.MultipleTelemetry;
import com.qualcomm.robotcore.eventloop.opmode.Autonomous;
import com.qualcomm.robotcore.eventloop.opmode.OpMode;
import org.firstinspires.ftc.robotcore.external.Telemetry;
import org.firstinspires.ftc.teamcode.pedroPathing.follower.Follower;
import org.firstinspires.ftc.teamcode.pedroPathing.pathGeneration.BezierCurve;
import org.firstinspires.ftc.teamcode.pedroPathing.pathGeneration.Path;
import org.firstinspires.ftc.teamcode.pedroPathing.pathGeneration.Point;
/**
* This is the CurvedBackAndForth autonomous OpMode. It runs the robot a specified distance
* forward and to the left. On reaching the end of the forward Path, the robot runs the backward
* Path the same distance back to the start. Rinse and repeat! This is good for testing a variety
* of Vectors, like the drive Vector, the translational Vector, the heading Vector, and the
* centripetal Vector. Remember to test your tunings on StraightBackAndForth as well, since tunings
* that work well for curves might have issues going in straight lines.
*
* @author Anyi Lin - 10158 Scott's Bots
* @author Aaron Yang - 10158 Scott's Bots
* @author Harrison Womack - 10158 Scott's Bots
* @version 1.0, 3/13/2024
*/
@Config
@Autonomous (name = "Curved Back And Forth", group = "Autonomous Pathing Tuning")
public class CurvedBackAndForth extends OpMode {
private Telemetry telemetryA;
public static double DISTANCE = 20;
private boolean forward = true;
private Follower follower;
private Path forwards;
private Path backwards;
/**
* This initializes the Follower and creates the forward and backward Paths. Additionally, this
* initializes the FTC Dashboard telemetry.
*/
@Override
public void init() {
follower = new Follower(hardwareMap);
forwards = new Path(new BezierCurve(new Point(0,0, Point.CARTESIAN), new Point(DISTANCE,0, Point.CARTESIAN), new Point(DISTANCE,DISTANCE, Point.CARTESIAN)));
backwards = new Path(new BezierCurve(new Point(DISTANCE,DISTANCE, Point.CARTESIAN), new Point(DISTANCE,0, Point.CARTESIAN), new Point(0,0, Point.CARTESIAN)));
backwards.setReversed(true);
follower.followPath(forwards);
telemetryA = new MultipleTelemetry(this.telemetry, FtcDashboard.getInstance().getTelemetry());
telemetryA.addLine("This will run the robot in a curve going " + DISTANCE + " inches"
+ " to the left and the same number of inches forward. The robot will go"
+ " forward and backward continuously along the path. Make sure you have"
+ " enough room.");
telemetryA.update();
}
/**
* This runs the OpMode, updating the Follower as well as printing out the debug statements to
* the Telemetry, as well as the FTC Dashboard.
*/
@Override
public void loop() {
follower.update();
if (!follower.isBusy()) {
if (forward) {
forward = false;
follower.followPath(backwards);
} else {
forward = true;
follower.followPath(forwards);
}
}
telemetryA.addData("going forward", forward);
follower.telemetryDebug(telemetryA);
}
}

package org.firstinspires.ftc.teamcode.pedroPathing.tuning;
import com.acmerobotics.dashboard.config.Config;
import org.firstinspires.ftc.teamcode.pedroPathing.util.CustomPIDFCoefficients;
import org.firstinspires.ftc.teamcode.pedroPathing.pathGeneration.MathFunctions;
import org.firstinspires.ftc.teamcode.pedroPathing.pathGeneration.Point;
import org.firstinspires.ftc.teamcode.pedroPathing.pathGeneration.Vector;
/**
* This is the FollowerConstants class. It holds many constants and parameters for various parts of
* the Follower. This is here to allow for easier tuning of Pedro Pathing, as well as concentrate
* everything tunable for the Paths themselves in one place.
*
* @author Anyi Lin - 10158 Scott's Bots
* @author Aaron Yang - 10158 Scott's Bots
* @author Harrison Womack - 10158 Scott's Bots
* @version 1.0, 3/4/2024
*/
@Config
public class FollowerConstants {
// This section is for setting the actual drive vector for the front left wheel, if the robot
// is facing a heading of 0 radians with the wheel centered at (0,0)
private static double xMovement = 81.34056;
private static double yMovement = 65.43028;
private static double[] convertToPolar = Point.cartesianToPolar(xMovement, -yMovement);
public static Vector frontLeftVector = MathFunctions.normalizeVector(new Vector(convertToPolar[0],convertToPolar[1]));
// Large heading error PIDF coefficients
public static CustomPIDFCoefficients largeHeadingPIDFCoefficients = new CustomPIDFCoefficients(
1,
0,
0,
0);
// Feed forward constant added on to the large heading PIDF
public static double largeHeadingPIDFFeedForward = 0.01;
// the limit at which the heading PIDF switches between the large and small heading PIDFs
public static double headingPIDFSwitch = Math.PI/20;
// Small heading error PIDF coefficients
public static CustomPIDFCoefficients smallHeadingPIDFCoefficients = new CustomPIDFCoefficients(
5,
0,
0.08,
0);
// Feed forward constant added on to the small heading PIDF
public static double smallHeadingPIDFFeedForward = 0.01;
// Large translational PIDF coefficients
public static CustomPIDFCoefficients largeTranslationalPIDFCoefficients = new CustomPIDFCoefficients(
0.1,
0,
0,
0);
// Feed forward constant added on to the large translational PIDF
public static double largeTranslationalPIDFFeedForward = 0.015;
// Large translational Integral
public static CustomPIDFCoefficients largeTranslationalIntegral = new CustomPIDFCoefficients(
0,
0,
0,
0);
// the limit at which the translational PIDF switches between the large and small translational PIDFs
public static double translationalPIDFSwitch = 3;
// Small translational PIDF coefficients
public static CustomPIDFCoefficients smallTranslationalPIDFCoefficients = new CustomPIDFCoefficients(
0.3,
0,
0.01,
0);
// Small translational Integral value
public static CustomPIDFCoefficients smallTranslationalIntegral = new CustomPIDFCoefficients(
0,
0,
0,
0);
// Feed forward constant added on to the small translational PIDF
public static double smallTranslationalPIDFFeedForward = 0.015;
// Large drive PIDF coefficients
public static CustomPIDFCoefficients largeDrivePIDFCoefficients = new CustomPIDFCoefficients(
0.025,
0,
0.00001,
0);
// Feed forward constant added on to the large drive PIDF
public static double largeDrivePIDFFeedForward = 0.01;
// the limit at which the drive PIDF switches between the large and small drive PIDFs
public static double drivePIDFSwitch = 20;
// Small drive PIDF coefficients
public static CustomPIDFCoefficients smallDrivePIDFCoefficients = new CustomPIDFCoefficients(
0.02,
0,
0.000005,
0);
// Feed forward constant added on to the small drive PIDF
public static double smallDrivePIDFFeedForward = 0.01;
// Mass of robot in kilograms
public static double mass = 10.65942;
// Centripetal force to power scaling
public static double centripetalScaling = 0.001;
// Acceleration of the drivetrain when power is cut in inches/second^2 (should be negative)
// if not negative, then the robot thinks that it's going to go faster under 0 power
// this is for curves
public static double forwardZeroPowerAcceleration = -34.62719;
// Acceleration of the drivetrain when power is cut in inches/second^2 (should be negative)
// if not negative, then the robot thinks that it's going to go faster under 0 power
// this is for curves
public static double lateralZeroPowerAcceleration = -78.15554;
// A multiplier for the zero power acceleration to change the speed the robot decelerates at
// the end of paths.
// Increasing this will cause the robot to try to decelerate faster, at the risk of overshoots
// or localization slippage.
// Decreasing this will cause the deceleration at the end of the Path to be slower, making the
// robot slower but reducing risk of end-of-path overshoots or localization slippage.
// This can be set individually for each Path, but this is the default.
public static double zeroPowerAccelerationMultiplier = 4;
// When the robot is at the end of its current Path or PathChain and the velocity goes below
// this value, then end the Path. This is in inches/second.
// This can be custom set for each Path.
public static double pathEndVelocityConstraint = 0.1;
// When the robot is at the end of its current Path or PathChain and the translational error
// goes below this value, then end the Path. This is in inches.
// This can be custom set for each Path.
public static double pathEndTranslationalConstraint = 0.1;
// When the robot is at the end of its current Path or PathChain and the heading error goes
// below this value, then end the Path. This is in radians.
// This can be custom set for each Path.
public static double pathEndHeadingConstraint = 0.007;
// When the t-value of the closest point to the robot on the Path is greater than this value,
// then the Path is considered at its end.
// This can be custom set for each Path.
public static double pathEndTValueConstraint = 0.995;
// When the Path is considered at its end parametrically, then the Follower has this many
// seconds to further correct by default.
// This can be custom set for each Path.
public static double pathEndTimeoutConstraint = 1.5;
// This is how many steps the BezierCurve class uses to approximate the length of a BezierCurve.
public static int APPROXIMATION_STEPS = 1000;
// This scales the translational error correction power when the Follower is holding a Point.
public static double holdPointTranslationalScaling = 0.45;
// This scales the heading error correction power when the Follower is holding a Point.
public static double holdPointHeadingScaling = 0.35;
// This is the number of times the velocity is recorded for averaging when approximating a first
// and second derivative for on-the-fly centripetal correction. The velocity is calculated using
// half of this number of samples, and the acceleration uses all of this number of samples.
public static int AVERAGED_VELOCITY_SAMPLE_NUMBER = 8;
// This is the number of steps the binary search for the closest point uses. More steps means
// more accuracy, and that accuracy increases at an exponential rate. However, more steps also
// take more time.
public static int BEZIER_CURVE_BINARY_STEP_LIMIT = 10;
}
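The zero power acceleration constants above feed the Follower's braking estimate. As a rough illustration of the underlying kinematics (this is not Pedro Pathing's actual drive-Vector code; the class and method names here are hypothetical), constant deceleration gives the stopping distance to expect once power is cut:

```java
public class StoppingDistanceSketch {
    // Values copied from FollowerConstants above.
    static final double FORWARD_ZERO_POWER_ACCELERATION = -34.62719; // inches/second^2
    static final double ZERO_POWER_ACCELERATION_MULTIPLIER = 4;

    // Constant-deceleration kinematics: v^2 = 2 * |a| * d, so d = v^2 / (2 * |a|).
    // The multiplier scales the assumed deceleration, so a larger multiplier predicts
    // a shorter stopping distance, i.e. the follower brakes later and harder.
    static double stoppingDistanceInches(double velocityInchesPerSec) {
        double effectiveDecel = Math.abs(FORWARD_ZERO_POWER_ACCELERATION)
                * ZERO_POWER_ACCELERATION_MULTIPLIER;
        return (velocityInchesPerSec * velocityInchesPerSec) / (2.0 * effectiveDecel);
    }

    public static void main(String[] args) {
        // At 30 in/s the predicted roll-out is a few inches.
        System.out.println(stoppingDistanceInches(30.0));
    }
}
```

This also shows why raising the multiplier risks overshoot: the follower plans for a shorter stop than the drivetrain can physically deliver.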

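The end-of-path constraints above work together: the t-value decides when a Path is parametrically done, and the timeout bounds how long the Follower keeps correcting afterwards. One plausible way these defaults combine, as a hedged sketch (the real Follower's ending logic may differ; this class is purely illustrative):

```java
public class PathEndCheckSketch {
    // Default end-of-path constraints from FollowerConstants above.
    static final double VELOCITY_CONSTRAINT = 0.1;      // inches/second
    static final double TRANSLATIONAL_CONSTRAINT = 0.1; // inches
    static final double HEADING_CONSTRAINT = 0.007;     // radians
    static final double T_VALUE_CONSTRAINT = 0.995;
    static final double TIMEOUT_SECONDS = 1.5;

    // A Path is parametrically "done" once the closest point's t-value passes the
    // constraint; the Follower then gets TIMEOUT_SECONDS to settle the remaining
    // velocity, translational, and heading error before the Path ends regardless.
    static boolean pathFinished(double tValue, double secondsSinceParametricEnd,
                                double speed, double translationalError, double headingError) {
        if (tValue < T_VALUE_CONSTRAINT) return false;
        boolean settled = speed < VELOCITY_CONSTRAINT
                && translationalError < TRANSLATIONAL_CONSTRAINT
                && headingError < HEADING_CONSTRAINT;
        return settled || secondsSinceParametricEnd > TIMEOUT_SECONDS;
    }
}
```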

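The binary search described above halves the searched t-interval each step, which is why accuracy grows exponentially with BEZIER_CURVE_BINARY_STEP_LIMIT: after n steps the remaining interval is 1/2^n of the curve. A minimal sketch of such a search (the Curve interface and all names here are hypothetical, not Pedro Pathing's BezierCurve API):

```java
public class ClosestPointSearchSketch {
    // Stand-in for a parametric curve's distance query (hypothetical interface).
    interface Curve { double distanceTo(double t, double x, double y); }

    static final int BINARY_STEP_LIMIT = 10;

    // Each iteration compares the distance just either side of the midpoint and
    // keeps the half of [lower, upper] that is closer to the query point, so the
    // interval shrinks by a factor of 2 per step.
    static double closestT(Curve curve, double x, double y) {
        double lower = 0.0, upper = 1.0;
        for (int i = 0; i < BINARY_STEP_LIMIT; i++) {
            double mid = (lower + upper) / 2.0;
            if (curve.distanceTo(mid - 1e-6, x, y) < curve.distanceTo(mid + 1e-6, x, y)) {
                upper = mid;
            } else {
                lower = mid;
            }
        }
        return (lower + upper) / 2.0;
    }
}
```

With 10 steps the answer is pinned to within 1/1024 of the parameter range, which is why the comment calls the accuracy gain exponential in the step count.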
@@ -0,0 +1,147 @@
package org.firstinspires.ftc.teamcode.pedroPathing.tuning;
import com.acmerobotics.dashboard.FtcDashboard;
import com.acmerobotics.dashboard.config.Config;
import com.acmerobotics.dashboard.telemetry.MultipleTelemetry;
import com.qualcomm.robotcore.eventloop.opmode.Autonomous;
import com.qualcomm.robotcore.eventloop.opmode.OpMode;
import com.qualcomm.robotcore.hardware.DcMotor;
import com.qualcomm.robotcore.hardware.DcMotorEx;
import com.qualcomm.robotcore.hardware.DcMotorSimple;
import com.qualcomm.robotcore.hardware.configuration.typecontainers.MotorConfigurationType;
import org.firstinspires.ftc.robotcore.external.Telemetry;
import org.firstinspires.ftc.teamcode.pedroPathing.localization.PoseUpdater;
import org.firstinspires.ftc.teamcode.pedroPathing.pathGeneration.MathFunctions;
import org.firstinspires.ftc.teamcode.pedroPathing.pathGeneration.Vector;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;
/**
* This is the ForwardVelocityTuner autonomous tuning OpMode. This runs the robot forwards at max
* power until it reaches some specified distance. It records the most recent velocities, and on
* reaching the end of the distance, it averages them and prints out the velocity obtained. It is
* recommended to run this multiple times on a full battery to get the best results. When paired
* with StrafeVelocityTuner, this allows FollowerConstants to create a Vector that empirically
* represents the direction your mecanum wheels actually prefer to go in, allowing for more
* accurate following.
* You can adjust the distance the robot will travel on FTC Dashboard: 192.168.43.1:8080/dash
*
* @author Anyi Lin - 10158 Scott's Bots
* @author Aaron Yang - 10158 Scott's Bots
* @author Harrison Womack - 10158 Scott's Bots
* @version 1.0, 3/13/2024
*/
@Config
@Autonomous (name = "Forward Velocity Tuner", group = "Autonomous Pathing Tuning")
public class ForwardVelocityTuner extends OpMode {
private ArrayList<Double> velocities = new ArrayList<>();
private DcMotorEx leftFront;
private DcMotorEx leftRear;
private DcMotorEx rightFront;
private DcMotorEx rightRear;
private List<DcMotorEx> motors;
private PoseUpdater poseUpdater;
public static double DISTANCE = 40;
public static double RECORD_NUMBER = 10;
private Telemetry telemetryA;
private boolean end;
/**
* This initializes the drive motors as well as the cache of velocities and the FTC Dashboard
* telemetry.
*/
@Override
public void init() {
poseUpdater = new PoseUpdater(hardwareMap);
leftFront = hardwareMap.get(DcMotorEx.class, "leftFront");
leftRear = hardwareMap.get(DcMotorEx.class, "leftRear");
rightRear = hardwareMap.get(DcMotorEx.class, "rightRear");
rightFront = hardwareMap.get(DcMotorEx.class, "rightFront");
// TODO: Make sure that this is the direction your motors need to be reversed in.
leftFront.setDirection(DcMotorSimple.Direction.REVERSE);
leftRear.setDirection(DcMotorSimple.Direction.REVERSE);
motors = Arrays.asList(leftFront, leftRear, rightFront, rightRear);
for (DcMotorEx motor : motors) {
MotorConfigurationType motorConfigurationType = motor.getMotorType().clone();
motorConfigurationType.setAchieveableMaxRPMFraction(1.0);
motor.setMotorType(motorConfigurationType);
}
for (DcMotorEx motor : motors) {
motor.setZeroPowerBehavior(DcMotor.ZeroPowerBehavior.FLOAT);
}
for (int i = 0; i < RECORD_NUMBER; i++) {
velocities.add(0.0);
}
telemetryA = new MultipleTelemetry(this.telemetry, FtcDashboard.getInstance().getTelemetry());
telemetryA.addLine("The robot will run at 1 power until it reaches " + DISTANCE + " inches forward.");
telemetryA.addLine("Make sure you have enough room, since the robot has inertia after cutting power.");
telemetryA.addLine("After running the distance, the robot will cut power from the drivetrain and display the forward velocity.");
telemetryA.addLine("Press CROSS or A on gamepad 1 to stop.");
telemetryA.update();
}
/**
* This starts the OpMode by setting the drive motors to run forward at full power.
*/
@Override
public void start() {
leftFront.setPower(1);
leftRear.setPower(1);
rightFront.setPower(1);
rightRear.setPower(1);
end = false;
}
/**
* This runs the OpMode. At any point during the running of the OpMode, pressing CROSS or A on
* gamepad 1 will stop the OpMode. This continuously records the RECORD_NUMBER most recent
* velocities, and when the robot has run forward enough, these last velocities recorded are
* averaged and printed.
*/
@Override
public void loop() {
if (gamepad1.cross || gamepad1.a) {
requestOpModeStop();
}
poseUpdater.update();
if (!end) {
if (Math.abs(poseUpdater.getPose().getX()) > DISTANCE) {
end = true;
for (DcMotorEx motor : motors) {
motor.setZeroPowerBehavior(DcMotor.ZeroPowerBehavior.BRAKE);
motor.setPower(0);
}
} else {
double currentVelocity = Math.abs(MathFunctions.dotProduct(poseUpdater.getVelocity(), new Vector(1, 0)));
velocities.add(currentVelocity);
velocities.remove(0);
}
} else {
double average = 0;
for (Double velocity : velocities) {
average += velocity;
}
average /= (double) velocities.size();
telemetryA.addData("forward velocity:", average);
telemetryA.update();
}
}
}
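The velocity cache in ForwardVelocityTuner is a fixed-size rolling window: the list is pre-filled with RECORD_NUMBER zeros, and each new sample pushes out the oldest before the survivors are averaged. The same mechanism extracted as a standalone sketch (class name hypothetical):

```java
import java.util.ArrayList;

public class RollingVelocityAverageSketch {
    static final int RECORD_NUMBER = 10;
    private final ArrayList<Double> velocities = new ArrayList<>();

    RollingVelocityAverageSketch() {
        // Pre-fill with zeros so the window always holds RECORD_NUMBER samples.
        for (int i = 0; i < RECORD_NUMBER; i++) velocities.add(0.0);
    }

    void record(double velocity) {
        velocities.add(velocity); // newest sample goes on the end
        velocities.remove(0);     // oldest sample falls off the front
    }

    double average() {
        double sum = 0;
        for (double v : velocities) sum += v;
        return sum / velocities.size();
    }
}
```

Note that the zero pre-fill means the reported average is biased low until at least RECORD_NUMBER real samples have been recorded, which is one reason the tuner only averages after the full distance has been driven.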


@@ -0,0 +1,153 @@
package org.firstinspires.ftc.teamcode.pedroPathing.tuning;
import com.acmerobotics.dashboard.FtcDashboard;
import com.acmerobotics.dashboard.config.Config;
import com.acmerobotics.dashboard.telemetry.MultipleTelemetry;
import com.qualcomm.robotcore.eventloop.opmode.Autonomous;
import com.qualcomm.robotcore.eventloop.opmode.OpMode;
import com.qualcomm.robotcore.hardware.DcMotor;
import com.qualcomm.robotcore.hardware.DcMotorEx;
import com.qualcomm.robotcore.hardware.DcMotorSimple;
import com.qualcomm.robotcore.hardware.configuration.typecontainers.MotorConfigurationType;
import org.firstinspires.ftc.robotcore.external.Telemetry;
import org.firstinspires.ftc.teamcode.pedroPathing.localization.PoseUpdater;
import org.firstinspires.ftc.teamcode.pedroPathing.pathGeneration.MathFunctions;
import org.firstinspires.ftc.teamcode.pedroPathing.pathGeneration.Vector;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;
/**
* This is the ForwardZeroPowerAccelerationTuner autonomous tuning OpMode. This runs the robot
* forward until a specified velocity is achieved. Then, the robot cuts power to the motors, setting
* them to zero power. The deceleration, or negative acceleration, is then measured until the robot
* stops. The accelerations across the entire time the robot is slowing down are then averaged,
* and that average is then printed. This is used to determine how the robot will decelerate in the
* forward direction when power is cut, making the estimations used in the calculations for the
* drive Vector more accurate and giving better braking at the end of Paths.
* You can adjust the max velocity the robot will hit on FTC Dashboard: 192.168.43.1:8080/dash
*
* @author Anyi Lin - 10158 Scott's Bots
* @author Aaron Yang - 10158 Scott's Bots
* @author Harrison Womack - 10158 Scott's Bots
* @version 1.0, 3/13/2024
*/
@Config
@Autonomous (name = "Forward Zero Power Acceleration Tuner", group = "Autonomous Pathing Tuning")
public class ForwardZeroPowerAccelerationTuner extends OpMode {
private ArrayList<Double> accelerations = new ArrayList<>();
private DcMotorEx leftFront;
private DcMotorEx leftRear;
private DcMotorEx rightFront;
private DcMotorEx rightRear;
private List<DcMotorEx> motors;
private PoseUpdater poseUpdater;
public static double VELOCITY = 30;
private double previousVelocity;
private long previousTimeNano;
private Telemetry telemetryA;
private boolean stopping;
private boolean end;
/**
* This initializes the drive motors as well as the FTC Dashboard telemetry.
*/
@Override
public void init() {
poseUpdater = new PoseUpdater(hardwareMap);
leftFront = hardwareMap.get(DcMotorEx.class, "leftFront");
leftRear = hardwareMap.get(DcMotorEx.class, "leftRear");
rightRear = hardwareMap.get(DcMotorEx.class, "rightRear");
rightFront = hardwareMap.get(DcMotorEx.class, "rightFront");
// TODO: Make sure that this is the direction your motors need to be reversed in.
leftFront.setDirection(DcMotorSimple.Direction.REVERSE);
leftRear.setDirection(DcMotorSimple.Direction.REVERSE);
motors = Arrays.asList(leftFront, leftRear, rightFront, rightRear);
for (DcMotorEx motor : motors) {
MotorConfigurationType motorConfigurationType = motor.getMotorType().clone();
motorConfigurationType.setAchieveableMaxRPMFraction(1.0);
motor.setMotorType(motorConfigurationType);
}
for (DcMotorEx motor : motors) {
motor.setZeroPowerBehavior(DcMotor.ZeroPowerBehavior.FLOAT);
}
telemetryA = new MultipleTelemetry(this.telemetry, FtcDashboard.getInstance().getTelemetry());
telemetryA.addLine("The robot will run forward until it reaches " + VELOCITY + " inches per second.");
telemetryA.addLine("Then, it will cut power from the drivetrain and roll to a stop.");
telemetryA.addLine("Make sure you have enough room.");
telemetryA.addLine("After stopping, the forward zero power acceleration (natural deceleration) will be displayed.");
telemetryA.addLine("Press CROSS or A on gamepad 1 to stop.");
telemetryA.update();
}
/**
* This starts the OpMode by setting the drive motors to run forward at full power.
*/
@Override
public void start() {
leftFront.setPower(1);
leftRear.setPower(1);
rightFront.setPower(1);
rightRear.setPower(1);
}
/**
* This runs the OpMode. At any point during the running of the OpMode, pressing CROSS or A on
* gamepad 1 will stop the OpMode. When the robot hits the specified velocity, the robot will
* record its deceleration / negative acceleration until it stops. Then, it will average all the
* recorded deceleration / negative acceleration and print that value.
*/
@Override
public void loop() {
if (gamepad1.cross || gamepad1.a) {
requestOpModeStop();
}
poseUpdater.update();
Vector heading = new Vector(1.0, poseUpdater.getPose().getHeading());
if (!end) {
if (!stopping) {
if (MathFunctions.dotProduct(poseUpdater.getVelocity(), heading) > VELOCITY) {
previousVelocity = MathFunctions.dotProduct(poseUpdater.getVelocity(), heading);
previousTimeNano = System.nanoTime();
stopping = true;
for (DcMotorEx motor : motors) {
motor.setPower(0);
}
}
} else {
double currentVelocity = MathFunctions.dotProduct(poseUpdater.getVelocity(), heading);
accelerations.add((currentVelocity - previousVelocity) / ((System.nanoTime() - previousTimeNano) / Math.pow(10.0, 9)));
previousVelocity = currentVelocity;
previousTimeNano = System.nanoTime();
if (currentVelocity < FollowerConstants.pathEndVelocityConstraint) {
end = true;
}
}
} else {
double average = 0;
for (Double acceleration : accelerations) {
average += acceleration;
}
average /= (double)accelerations.size();
telemetryA.addData("forward zero power acceleration (deceleration):", average);
telemetryA.update();
}
}
}
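ForwardZeroPowerAccelerationTuner's core measurement is a finite difference: each loop it divides the change in velocity by the elapsed time in seconds (nanoseconds over 10^9), then averages every sample collected while coasting. The same arithmetic as a standalone sketch (class name hypothetical):

```java
public class DecelerationSampleSketch {
    // Mirrors the tuner's per-loop step:
    // acceleration = (currentVelocity - previousVelocity) / (elapsed nanoseconds / 1e9).
    static double accelerationInchesPerSec2(double previousVelocity, double currentVelocity,
                                            long previousTimeNano, long currentTimeNano) {
        double elapsedSeconds = (currentTimeNano - previousTimeNano) / Math.pow(10.0, 9);
        return (currentVelocity - previousVelocity) / elapsedSeconds;
    }

    // Mirrors the averaging done once the robot has rolled to a stop.
    static double average(double[] samples) {
        double sum = 0;
        for (double s : samples) sum += s;
        return sum / samples.length;
    }
}
```

Because the robot is slowing down, currentVelocity is below previousVelocity on each sample, so the average comes out negative, matching the expected sign of forwardZeroPowerAcceleration in FollowerConstants.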

Some files were not shown because too many files have changed in this diff.