Removed the fork relationship from the FTC SDK so that people who already have an FTC SDK fork can fork this repo without problems.

Titan Robotics Club
2024-05-28 17:44:09 -07:00
parent 34a9ce290b
commit 653736be1c
145 changed files with 602787 additions and 20 deletions

68
.github/CONTRIBUTING.md vendored Normal file

@@ -0,0 +1,68 @@
# Contributing to the FTC SDK
The following is a set of guidelines for contributing to the FIRST FTC SDK. The FTC Technology Team welcomes suggestions for improvements to core software, ideas for new features, requests for built-in support of new sensors, and well-written bug reports.
## How can I contribute?
### Pull requests
__STOP!__ If you are new to git, do not understand the mechanics of forks, branches, and pulls, or if what you just read is confusing, __do not__ push this button. Most likely it won't do what you think it will.
![Pull Button](../doc/media/PullRequest.PNG)
If you are looking at this button, then you've pushed some changes to your team's fork of ftctechnh/ftc_app. Congratulations! You are almost certainly finished.
The vast majority of pull requests seen on the ftctechnh/ftc_app repository are not intended to be merged into the official SDK. Team software is just that: your team's. It's specific to the tasks you are trying to accomplish, the testing you are doing, and the goals your team has. You don't want that pushed into the official SDK.
If what you've read so far makes little sense, there are some very good git learning resources online.
[Git Book](https://git-scm.com/book/en/v2)
[Interactive Git Tutorial](https://try.github.io)
##### Guidelines for experienced Git users
If you are absolutely certain that you want to push the big green button above, read on. Otherwise, back _slowly away from the keyboard_.
The real intent for advanced users is often to issue a pull request from a [branch](https://www.atlassian.com/git/tutorials/using-branches/git-branch) on a local fork back to master on either the same local fork or a child of the team fork, not on the parent ftctechnh/ftc_app. See [Creating a Pull Request](https://help.github.com/articles/creating-a-pull-request-from-a-fork/).
If that is indeed the intent, you can merge your [topic branch](https://git-scm.com/book/en/v2/Git-Branching-Branching-Workflows#Topic-Branches) into master locally by hand before pushing it up to GitHub. Or, if you want a pull request between branches on the same repository because, say, you want team members to review your software before merging into master, you can select the base fork from the dropdown on the "Open a pull request" page and choose your team repo instead of ftctechnh's.
Alternatively, if you have a team repository forked from ftctechnh/ftc_app, and team members individually fork from your team repository, then pull requests from the individual team members' forks will have the main team repository automatically selected as the base fork for the pull. And you won't inadvertently request to pull your team software into ftctechnh's repository.
The latter would be the "best" way to manage software among a large team. But as with all things git, there are many options.
Pull requests that do not fall into the category above are evaluated by the FTC Technology Team on a case-by-case basis. Please note however that the deployment model of the SDK does not support direct pulls into ftctechnh/ftc_app.
### Report bugs
This section guides you through filing a bug report. The better the report, the more likely it is to be root-caused and fixed. Please refrain from filing feature requests or software enhancements as issues; see Suggesting Enhancements below.
#### Before submitting a bug report
- Check the [forums](http://ftcforum.usfirst.org/forum.php) to see if someone else has run into the problem and whether there is an official solution that doesn't require a new SDK.
- Perform a search of current [issues](https://github.com/ftctechnh/ftc_app/issues) to see if the problem has already been reported. If so, add a comment to the existing issue instead of creating a new one.
#### How Do I Submit A (Good) Bug Report?
Bugs are tracked as GitHub issues. Create an issue on ftctechnh/ftc_app and provide the following information.
Explain the problem and include additional details to help maintainers reproduce the problem:
- Use a clear and descriptive title for the issue to identify the problem.
- Describe the exact steps which reproduce the problem in as many details as possible.
- Provide specific examples to demonstrate the steps.
- Describe the behavior you observed after following the steps and point out what exactly is the problem with that behavior. Explain which behavior you expected to see instead and why. If applicable, include screenshots which show you following the described steps and clearly demonstrate the problem.
- If you're reporting that the RobotController crashed, include the logfile with a stack trace of the crash. [Example of good bug report with stack trace](https://github.com/ftctechnh/ftc_app/issues/224)
- If the problem wasn't triggered by a specific action, describe what you were doing before the problem happened and share more information using the guidelines below.
### Suggesting Enhancements
FIRST volunteers are awesome. You all have great ideas and we want to hear them.
Enhancements should be broadly applicable to a large majority of teams, should not force teams to change their workflow, and should provide real value to the mission of FIRST as it relates to engaging youth in engineering activities.
The best way to get momentum behind new features is to post a description of your idea in the forums. Build community support for it. The FTC Technology Team monitors the forums. We'll hear you, and if there's a large enough call for the feature, it's very likely to be put on the list for a future release.

1
.github/PULL_REQUEST_TEMPLATE.md vendored Normal file

@@ -0,0 +1 @@
Before issuing a pull request, please see the contributing page.

103
.gitignore vendored

@@ -1,24 +1,91 @@
# Compiled class file
# Built application files
*.apk
*.aar
*.ap_
*.aab
!libs/*.aar
# Files for the ART/Dalvik VM
*.dex
# Java/JDK files
*.class
*.hprof
# Log file
*.log
# Generated files
bin/
gen/
out/
# Uncomment the following line if you need to ignore the release build type files in your app
# release/
# BlueJ files
*.ctxt
# Gradle files
.gradle/
build/
# Mobile Tools for Java (J2ME)
.mtj.tmp/
# Local configuration file (sdk path, etc)
local.properties
# Package Files #
*.jar
*.war
*.nar
*.ear
*.zip
*.tar.gz
*.rar
# Proguard folder generated by Eclipse
proguard/
# virtual machine crash logs, see http://www.java.com/en/download/help/error_hotspot.xml
hs_err_pid*
replay_pid*
# Log Files
# *.log
# Android Studio Navigation editor temp files
.navigation/
# Android Studio captures folder
captures/
# IntelliJ
*.iml
.idea/
.idea/workspace.xml
.idea/tasks.xml
.idea/gradle.xml
.idea/assetWizardSettings.xml
.idea/dictionaries
.idea/libraries
# Android Studio 3 in .gitignore file.
.idea/caches
.idea/modules.xml
# Comment next line if keeping position of elements in Navigation Editor is relevant for you
.idea/navEditor.xml
# For Mac users
.DS_Store
# Keystore files
# Uncomment the following lines if you do not want to check your keystore files in.
#*.jks
#*.keystore
# External native build folder generated in Android Studio 2.2 and later
.externalNativeBuild
.cxx/
# Google Services (e.g. APIs or Firebase)
# google-services.json
# Freeline
freeline.py
freeline/
freeline_project_description.json
# fastlane
fastlane/report.xml
fastlane/Preview.html
fastlane/screenshots
fastlane/test_output
fastlane/readme.md
# Version control
vcs.xml
# lint
lint/intermediates/
lint/generated/
lint/outputs/
lint/tmp/
# lint/reports/

6
.gitmodules vendored Normal file

@@ -0,0 +1,6 @@
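# TRC framework libraries from trc492 (Titan Robotics Club), pulled in as git submodules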
[submodule "TeamCode/src/main/java/TrcCommonLib"]
path = TeamCode/src/main/java/TrcCommonLib
url = https://github.com/trc492/TrcCommonLib
[submodule "TeamCode/src/main/java/TrcFtcLib"]
path = TeamCode/src/main/java/TrcFtcLib
url = https://github.com/trc492/TrcFtcLib

FtcRobotController/build.gradle Normal file

@@ -0,0 +1,26 @@
import java.text.SimpleDateFormat
//
// build.gradle in FtcRobotController
//
apply plugin: 'com.android.library'
android {
defaultConfig {
minSdkVersion 24
//noinspection ExpiredTargetSdkVersion
targetSdkVersion 28
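// Embed the build timestamp in the generated BuildConfig class as APP_BUILD_TIME.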
buildConfigField "String", "APP_BUILD_TIME", '"' + (new SimpleDateFormat("yyyy-MM-dd'T'HH:mm:ss.SSSZ", Locale.ROOT).format(new Date())) + '"'
}
compileSdkVersion 29
compileOptions {
sourceCompatibility JavaVersion.VERSION_1_8
targetCompatibility JavaVersion.VERSION_1_8
}
namespace = 'com.qualcomm.ftcrobotcontroller'
}
apply from: '../build.dependencies.gradle'

FtcRobotController/src/main/AndroidManifest.xml Normal file

@@ -0,0 +1,79 @@
<?xml version="1.0" encoding="utf-8"?>
<manifest xmlns:android="http://schemas.android.com/apk/res/android"
xmlns:tools="http://schemas.android.com/tools"
android:versionCode="53"
android:versionName="9.1">
<uses-permission android:name="android.permission.RECEIVE_BOOT_COMPLETED" />
<application
android:allowBackup="true"
android:largeHeap="true"
android:icon="@drawable/ic_launcher"
android:label="@string/app_name"
android:theme="@style/AppThemeRedRC"
android:usesCleartextTraffic="true">
<!-- Indicates to the ControlHubUpdater what the latest version of the Control Hub is that this app supports -->
<meta-data
android:name="org.firstinspires.latestSupportedControlHubVersion"
android:value="1" />
<!-- The main robot controller activity -->
<activity android:name="org.firstinspires.ftc.robotcontroller.internal.PermissionValidatorWrapper"
android:screenOrientation="fullUser"
android:configChanges="orientation|screenSize"
android:label="@string/app_name"
android:launchMode="singleTask" >
<intent-filter>
<category android:name="android.intent.category.LAUNCHER" />
<action android:name="android.intent.action.MAIN" />
</intent-filter>
</activity>
<activity
android:name="org.firstinspires.ftc.robotcontroller.internal.FtcRobotControllerActivity"
android:screenOrientation="fullUser"
android:configChanges="orientation|screenSize"
android:label="@string/app_name"
android:launchMode="singleTask" >
<intent-filter>
<action android:name="android.hardware.usb.action.USB_DEVICE_ATTACHED" />
</intent-filter>
<meta-data
android:name="android.hardware.usb.action.USB_DEVICE_ATTACHED"
android:resource="@xml/device_filter" />
<!--org.firstinspires.main.entry indicates that this app is compatible with the Dragonboard Control Hub-->
<meta-data
android:name="org.firstinspires.main.entry"
android:value="true" />
</activity>
<!-- The robot controller service in which most of the robot functionality is managed -->
<service
android:name="com.qualcomm.ftccommon.FtcRobotControllerService"
android:enabled="true" />
<!-- Assistant that autostarts the robot controller on android boot (if it's supposed to) -->
<receiver
android:enabled="true"
android:exported="true"
android:name="org.firstinspires.ftc.ftccommon.internal.RunOnBoot"
android:permission="android.permission.RECEIVE_BOOT_COMPLETED">
<intent-filter>
<category android:name="android.intent.category.DEFAULT" />
<action android:name="android.intent.action.BOOT_COMPLETED" />
<action android:name="android.intent.action.QUICKBOOT_POWERON" />
</intent-filter>
</receiver>
</application>
</manifest>

FtcRobotController/src/main/java/org/firstinspires/ftc/robotcontroller/external/samples/BasicOmniOpMode_Linear.java Normal file

@@ -0,0 +1,167 @@
/* Copyright (c) 2021 FIRST. All rights reserved.
*
* Redistribution and use in source and binary forms, with or without modification,
* are permitted (subject to the limitations in the disclaimer below) provided that
* the following conditions are met:
*
* Redistributions of source code must retain the above copyright notice, this list
* of conditions and the following disclaimer.
*
* Redistributions in binary form must reproduce the above copyright notice, this
* list of conditions and the following disclaimer in the documentation and/or
* other materials provided with the distribution.
*
* Neither the name of FIRST nor the names of its contributors may be used to endorse or
* promote products derived from this software without specific prior written permission.
*
* NO EXPRESS OR IMPLIED LICENSES TO ANY PARTY'S PATENT RIGHTS ARE GRANTED BY THIS
* LICENSE. THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
* "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO,
* THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE
* ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR CONTRIBUTORS BE LIABLE
* FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL
* DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR
* SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER
* CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY,
* OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
* OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
*/
package org.firstinspires.ftc.robotcontroller.external.samples;
import com.qualcomm.robotcore.eventloop.opmode.Disabled;
import com.qualcomm.robotcore.eventloop.opmode.LinearOpMode;
import com.qualcomm.robotcore.eventloop.opmode.TeleOp;
import com.qualcomm.robotcore.hardware.DcMotor;
import com.qualcomm.robotcore.util.ElapsedTime;
/*
* This file contains an example of a Linear "OpMode".
* An OpMode is a 'program' that runs in either the autonomous or the teleop period of an FTC match.
* The names of OpModes appear on the menu of the FTC Driver Station.
* When a selection is made from the menu, the corresponding OpMode is executed.
*
* This particular OpMode illustrates driving a 4-motor Omni-Directional (or Holonomic) robot.
* This code will work with either a Mecanum-Drive or an X-Drive train.
* Both of these drives are illustrated at https://gm0.org/en/latest/docs/robot-design/drivetrains/holonomic.html
* Note that a Mecanum drive must display an X roller-pattern when viewed from above.
*
* Also note that it is critical to set the correct rotation direction for each motor. See details below.
*
* Holonomic drives provide the ability for the robot to move in three axes (directions) simultaneously.
* Each motion axis is controlled by one Joystick axis.
*
* 1) Axial: Driving forward and backward Left-joystick Forward/Backward
* 2) Lateral: Strafing right and left Left-joystick Right and Left
* 3) Yaw: Rotating Clockwise and counter clockwise Right-joystick Right and Left
*
* This code is written assuming that the right-side motors need to be reversed for the robot to drive forward.
* When you first test your robot, if it moves backward when you push the left stick forward, then you must flip
* the direction of all 4 motors (see code below).
*
* Use Android Studio to Copy this Class, and Paste it into your team's code folder with a new name.
* Remove or comment out the @Disabled line to add this OpMode to the Driver Station OpMode list
*/
@TeleOp(name="Basic: Omni Linear OpMode", group="Linear OpMode")
@Disabled
public class BasicOmniOpMode_Linear extends LinearOpMode {
// Declare OpMode members for each of the 4 motors.
private ElapsedTime runtime = new ElapsedTime();
private DcMotor leftFrontDrive = null;
private DcMotor leftBackDrive = null;
private DcMotor rightFrontDrive = null;
private DcMotor rightBackDrive = null;
@Override
public void runOpMode() {
// Initialize the hardware variables. Note that the strings used here must correspond
// to the names assigned during the robot configuration step on the DS or RC devices.
leftFrontDrive = hardwareMap.get(DcMotor.class, "left_front_drive");
leftBackDrive = hardwareMap.get(DcMotor.class, "left_back_drive");
rightFrontDrive = hardwareMap.get(DcMotor.class, "right_front_drive");
rightBackDrive = hardwareMap.get(DcMotor.class, "right_back_drive");
// ########################################################################################
// !!! IMPORTANT Drive Information. Test your motor directions. !!!!!
// ########################################################################################
// Most robots need the motors on one side to be reversed to drive forward.
// The motor reversals shown here are for a "direct drive" robot (the wheels turn the same direction as the motor shaft)
// If your robot has additional gear reductions or uses a right-angled drive, it's important to ensure
// that your motors are turning in the correct direction. So, start out with the reversals here, BUT
// when you first test your robot, push the left joystick forward and observe the direction the wheels turn.
// Reverse the direction (flip FORWARD <-> REVERSE ) of any wheel that runs backward
// Keep testing until ALL the wheels move the robot forward when you push the left joystick forward.
leftFrontDrive.setDirection(DcMotor.Direction.REVERSE);
leftBackDrive.setDirection(DcMotor.Direction.REVERSE);
rightFrontDrive.setDirection(DcMotor.Direction.FORWARD);
rightBackDrive.setDirection(DcMotor.Direction.FORWARD);
// Wait for the game to start (driver presses PLAY)
telemetry.addData("Status", "Initialized");
telemetry.update();
waitForStart();
runtime.reset();
// run until the end of the match (driver presses STOP)
while (opModeIsActive()) {
double max;
// POV Mode uses left joystick to go forward & strafe, and right joystick to rotate.
double axial = -gamepad1.left_stick_y; // Note: pushing stick forward gives negative value
double lateral = gamepad1.left_stick_x;
double yaw = gamepad1.right_stick_x;
// Combine the joystick requests for each axis-motion to determine each wheel's power.
// Set up a variable for each drive wheel to save the power level for telemetry.
double leftFrontPower = axial + lateral + yaw;
double rightFrontPower = axial - lateral - yaw;
double leftBackPower = axial - lateral + yaw;
double rightBackPower = axial + lateral - yaw;
// Normalize the values so no wheel power exceeds 100%
// This ensures that the robot maintains the desired motion.
max = Math.max(Math.abs(leftFrontPower), Math.abs(rightFrontPower));
max = Math.max(max, Math.abs(leftBackPower));
max = Math.max(max, Math.abs(rightBackPower));
if (max > 1.0) {
leftFrontPower /= max;
rightFrontPower /= max;
leftBackPower /= max;
rightBackPower /= max;
}
// This is test code:
//
// Uncomment the following code to test your motor directions.
// Each button should make the corresponding motor run FORWARD.
// 1) First make sure each motor is mapped to the correct position on the robot
// by adjusting your Robot Configuration if necessary.
// 2) Then make sure they run in the correct direction by modifying the
// setDirection() calls above.
// Once the correct motors move in the correct direction, re-comment this code.
/*
leftFrontPower = gamepad1.x ? 1.0 : 0.0; // X gamepad
leftBackPower = gamepad1.a ? 1.0 : 0.0; // A gamepad
rightFrontPower = gamepad1.y ? 1.0 : 0.0; // Y gamepad
rightBackPower = gamepad1.b ? 1.0 : 0.0; // B gamepad
*/
// Send calculated power to wheels
leftFrontDrive.setPower(leftFrontPower);
rightFrontDrive.setPower(rightFrontPower);
leftBackDrive.setPower(leftBackPower);
rightBackDrive.setPower(rightBackPower);
// Show the elapsed game time and wheel power.
telemetry.addData("Status", "Run Time: " + runtime.toString());
telemetry.addData("Front left/Right", "%4.2f, %4.2f", leftFrontPower, rightFrontPower);
telemetry.addData("Back left/Right", "%4.2f, %4.2f", leftBackPower, rightBackPower);
telemetry.update();
}
}
}

FtcRobotController/src/main/java/org/firstinspires/ftc/robotcontroller/external/samples/BasicOpMode_Iterative.java Normal file

@@ -0,0 +1,140 @@
/* Copyright (c) 2017 FIRST. All rights reserved.
*
* Redistribution and use in source and binary forms, with or without modification,
* are permitted (subject to the limitations in the disclaimer below) provided that
* the following conditions are met:
*
* Redistributions of source code must retain the above copyright notice, this list
* of conditions and the following disclaimer.
*
* Redistributions in binary form must reproduce the above copyright notice, this
* list of conditions and the following disclaimer in the documentation and/or
* other materials provided with the distribution.
*
* Neither the name of FIRST nor the names of its contributors may be used to endorse or
* promote products derived from this software without specific prior written permission.
*
* NO EXPRESS OR IMPLIED LICENSES TO ANY PARTY'S PATENT RIGHTS ARE GRANTED BY THIS
* LICENSE. THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
* "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO,
* THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE
* ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR CONTRIBUTORS BE LIABLE
* FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL
* DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR
* SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER
* CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY,
* OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
* OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
*/
package org.firstinspires.ftc.robotcontroller.external.samples;
import com.qualcomm.robotcore.eventloop.opmode.Disabled;
import com.qualcomm.robotcore.eventloop.opmode.OpMode;
import com.qualcomm.robotcore.eventloop.opmode.TeleOp;
import com.qualcomm.robotcore.hardware.DcMotor;
import com.qualcomm.robotcore.util.ElapsedTime;
import com.qualcomm.robotcore.util.Range;
/*
* This file contains an example of an iterative (Non-Linear) "OpMode".
* An OpMode is a 'program' that runs in either the autonomous or the teleop period of an FTC match.
* The names of OpModes appear on the menu of the FTC Driver Station.
* When a selection is made from the menu, the corresponding OpMode
* class is instantiated on the Robot Controller and executed.
*
* This particular OpMode just executes a basic Tank Drive Teleop for a two-wheeled robot.
* It includes all the skeletal structure that all iterative OpModes contain.
*
* Use Android Studio to Copy this Class, and Paste it into your team's code folder with a new name.
* Remove or comment out the @Disabled line to add this OpMode to the Driver Station OpMode list
*/
@TeleOp(name="Basic: Iterative OpMode", group="Iterative OpMode")
@Disabled
public class BasicOpMode_Iterative extends OpMode
{
// Declare OpMode members.
private ElapsedTime runtime = new ElapsedTime();
private DcMotor leftDrive = null;
private DcMotor rightDrive = null;
/*
* Code to run ONCE when the driver hits INIT
*/
@Override
public void init() {
telemetry.addData("Status", "Initialized");
// Initialize the hardware variables. Note that the strings used here as parameters
// to 'get' must correspond to the names assigned during the robot configuration
// step (using the FTC Robot Controller app on the phone).
leftDrive = hardwareMap.get(DcMotor.class, "left_drive");
rightDrive = hardwareMap.get(DcMotor.class, "right_drive");
// To drive forward, most robots need the motor on one side to be reversed, because the axles point in opposite directions.
// Pushing the left stick forward MUST make robot go forward. So adjust these two lines based on your first test drive.
// Note: The settings here assume direct drive on left and right wheels. Gear Reduction or 90 Deg drives may require direction flips
leftDrive.setDirection(DcMotor.Direction.REVERSE);
rightDrive.setDirection(DcMotor.Direction.FORWARD);
// Tell the driver that initialization is complete.
telemetry.addData("Status", "Initialized");
}
/*
* Code to run REPEATEDLY after the driver hits INIT, but before they hit PLAY
*/
@Override
public void init_loop() {
}
/*
* Code to run ONCE when the driver hits PLAY
*/
@Override
public void start() {
runtime.reset();
}
/*
* Code to run REPEATEDLY after the driver hits PLAY but before they hit STOP
*/
@Override
public void loop() {
// Setup a variable for each drive wheel to save power level for telemetry
double leftPower;
double rightPower;
// Choose to drive using either Tank Mode, or POV Mode
// Comment out the method that's not used. The default below is POV.
// POV Mode uses left stick to go forward, and right stick to turn.
// - This uses basic math to combine motions and is easier to drive straight.
double drive = -gamepad1.left_stick_y;
double turn = gamepad1.right_stick_x;
leftPower = Range.clip(drive + turn, -1.0, 1.0) ;
rightPower = Range.clip(drive - turn, -1.0, 1.0) ;
// Tank Mode uses one stick to control each wheel.
// - This requires no math, but it is hard to drive forward slowly and keep straight.
// leftPower = -gamepad1.left_stick_y ;
// rightPower = -gamepad1.right_stick_y ;
// Send calculated power to wheels
leftDrive.setPower(leftPower);
rightDrive.setPower(rightPower);
// Show the elapsed game time and wheel power.
telemetry.addData("Status", "Run Time: " + runtime.toString());
telemetry.addData("Motors", "left (%.2f), right (%.2f)", leftPower, rightPower);
}
/*
* Code to run ONCE after the driver hits STOP
*/
@Override
public void stop() {
}
}

FtcRobotController/src/main/java/org/firstinspires/ftc/robotcontroller/external/samples/BasicOpMode_Linear.java Normal file

@@ -0,0 +1,115 @@
/* Copyright (c) 2017 FIRST. All rights reserved.
*
* Redistribution and use in source and binary forms, with or without modification,
* are permitted (subject to the limitations in the disclaimer below) provided that
* the following conditions are met:
*
* Redistributions of source code must retain the above copyright notice, this list
* of conditions and the following disclaimer.
*
* Redistributions in binary form must reproduce the above copyright notice, this
* list of conditions and the following disclaimer in the documentation and/or
* other materials provided with the distribution.
*
* Neither the name of FIRST nor the names of its contributors may be used to endorse or
* promote products derived from this software without specific prior written permission.
*
* NO EXPRESS OR IMPLIED LICENSES TO ANY PARTY'S PATENT RIGHTS ARE GRANTED BY THIS
* LICENSE. THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
* "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO,
* THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE
* ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR CONTRIBUTORS BE LIABLE
* FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL
* DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR
* SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER
* CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY,
* OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
* OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
*/
package org.firstinspires.ftc.robotcontroller.external.samples;
import com.qualcomm.robotcore.eventloop.opmode.Disabled;
import com.qualcomm.robotcore.eventloop.opmode.LinearOpMode;
import com.qualcomm.robotcore.eventloop.opmode.TeleOp;
import com.qualcomm.robotcore.hardware.DcMotor;
import com.qualcomm.robotcore.util.ElapsedTime;
import com.qualcomm.robotcore.util.Range;
/*
* This file contains a minimal example of a Linear "OpMode". An OpMode is a 'program' that runs in either
* the autonomous or the teleop period of an FTC match. The names of OpModes appear on the menu
* of the FTC Driver Station. When a selection is made from the menu, the corresponding OpMode
* class is instantiated on the Robot Controller and executed.
*
* This particular OpMode just executes a basic Tank Drive Teleop for a two-wheeled robot.
* It includes all the skeletal structure that all linear OpModes contain.
*
* Use Android Studio to Copy this Class, and Paste it into your team's code folder with a new name.
* Remove or comment out the @Disabled line to add this OpMode to the Driver Station OpMode list
*/
@TeleOp(name="Basic: Linear OpMode", group="Linear OpMode")
@Disabled
public class BasicOpMode_Linear extends LinearOpMode {
// Declare OpMode members.
private ElapsedTime runtime = new ElapsedTime();
private DcMotor leftDrive = null;
private DcMotor rightDrive = null;
@Override
public void runOpMode() {
telemetry.addData("Status", "Initialized");
telemetry.update();
// Initialize the hardware variables. Note that the strings used here as parameters
// to 'get' must correspond to the names assigned during the robot configuration
// step (using the FTC Robot Controller app on the phone).
leftDrive = hardwareMap.get(DcMotor.class, "left_drive");
rightDrive = hardwareMap.get(DcMotor.class, "right_drive");
// To drive forward, most robots need the motor on one side to be reversed, because the axles point in opposite directions.
// Pushing the left stick forward MUST make robot go forward. So adjust these two lines based on your first test drive.
// Note: The settings here assume direct drive on left and right wheels. Gear Reduction or 90 Deg drives may require direction flips
leftDrive.setDirection(DcMotor.Direction.REVERSE);
rightDrive.setDirection(DcMotor.Direction.FORWARD);
// Wait for the game to start (driver presses PLAY)
waitForStart();
runtime.reset();
// run until the end of the match (driver presses STOP)
while (opModeIsActive()) {
// Setup a variable for each drive wheel to save power level for telemetry
double leftPower;
double rightPower;
// Choose to drive using either Tank Mode, or POV Mode
// Comment out the method that's not used. The default below is POV.
// POV Mode uses left stick to go forward, and right stick to turn.
// - This uses basic math to combine motions and is easier to drive straight.
double drive = -gamepad1.left_stick_y;
double turn = gamepad1.right_stick_x;
leftPower = Range.clip(drive + turn, -1.0, 1.0) ;
rightPower = Range.clip(drive - turn, -1.0, 1.0) ;
// Tank Mode uses one stick to control each wheel.
// - This requires no math, but it is hard to drive forward slowly and keep straight.
// leftPower = -gamepad1.left_stick_y ;
// rightPower = -gamepad1.right_stick_y ;
// Send calculated power to wheels
leftDrive.setPower(leftPower);
rightDrive.setPower(rightPower);
// Show the elapsed game time and wheel power.
telemetry.addData("Status", "Run Time: " + runtime.toString());
telemetry.addData("Motors", "left (%.2f), right (%.2f)", leftPower, rightPower);
telemetry.update();
}
}
}

FtcRobotController/src/main/java/org/firstinspires/ftc/robotcontroller/external/samples/ConceptAprilTag.java Normal file

@@ -0,0 +1,217 @@
/* Copyright (c) 2023 FIRST. All rights reserved.
*
* Redistribution and use in source and binary forms, with or without modification,
* are permitted (subject to the limitations in the disclaimer below) provided that
* the following conditions are met:
*
* Redistributions of source code must retain the above copyright notice, this list
* of conditions and the following disclaimer.
*
* Redistributions in binary form must reproduce the above copyright notice, this
* list of conditions and the following disclaimer in the documentation and/or
* other materials provided with the distribution.
*
* Neither the name of FIRST nor the names of its contributors may be used to endorse or
* promote products derived from this software without specific prior written permission.
*
* NO EXPRESS OR IMPLIED LICENSES TO ANY PARTY'S PATENT RIGHTS ARE GRANTED BY THIS
* LICENSE. THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
* "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO,
* THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE
* ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR CONTRIBUTORS BE LIABLE
* FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL
* DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR
* SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER
* CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY,
* OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
* OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
*/
package org.firstinspires.ftc.robotcontroller.external.samples;
import android.util.Size;
import com.qualcomm.robotcore.eventloop.opmode.Disabled;
import com.qualcomm.robotcore.eventloop.opmode.LinearOpMode;
import com.qualcomm.robotcore.eventloop.opmode.TeleOp;
import org.firstinspires.ftc.robotcore.external.hardware.camera.BuiltinCameraDirection;
import org.firstinspires.ftc.robotcore.external.hardware.camera.WebcamName;
import org.firstinspires.ftc.vision.VisionPortal;
import org.firstinspires.ftc.vision.apriltag.AprilTagDetection;
import org.firstinspires.ftc.vision.apriltag.AprilTagProcessor;
import java.util.List;
/*
* This OpMode illustrates the basics of AprilTag recognition and pose estimation,
* including Java Builder structures for specifying Vision parameters.
*
* For an introduction to AprilTags, see the FTC-DOCS link below:
* https://ftc-docs.firstinspires.org/en/latest/apriltag/vision_portal/apriltag_intro/apriltag-intro.html
*
* In this sample, any visible tag ID will be detected and displayed, but only tags that are included in the default
* "TagLibrary" will have their position and orientation information displayed. This default TagLibrary contains
* the current Season's AprilTags and a small set of "test Tags" in the high number range.
*
* When an AprilTag in the TagLibrary is detected, the SDK provides location and orientation of the tag, relative to the camera.
* This information is provided in the "ftcPose" member of the returned "detection", and is explained in the ftc-docs page linked below.
* https://ftc-docs.firstinspires.org/apriltag-detection-values
*
* To experiment with using AprilTags to navigate, try out these two driving samples:
* RobotAutoDriveToAprilTagOmni and RobotAutoDriveToAprilTagTank
*
* There are many "default" VisionPortal and AprilTag configuration parameters that may be overridden if desired.
* These default parameters are shown as comments in the code below.
*
* Use Android Studio to Copy this Class, and Paste it into your team's code folder with a new name.
* Remove or comment out the @Disabled line to add this OpMode to the Driver Station OpMode list.
*/
@TeleOp(name = "Concept: AprilTag", group = "Concept")
@Disabled
public class ConceptAprilTag extends LinearOpMode {
private static final boolean USE_WEBCAM = true; // true for webcam, false for phone camera
/**
* The variable to store our instance of the AprilTag processor.
*/
private AprilTagProcessor aprilTag;
/**
* The variable to store our instance of the vision portal.
*/
private VisionPortal visionPortal;
@Override
public void runOpMode() {
initAprilTag();
// Wait for the DS start button to be touched.
telemetry.addData("DS preview on/off", "3 dots, Camera Stream");
telemetry.addData(">", "Touch Play to start OpMode");
telemetry.update();
waitForStart();
if (opModeIsActive()) {
while (opModeIsActive()) {
telemetryAprilTag();
// Push telemetry to the Driver Station.
telemetry.update();
// Save CPU resources; can resume streaming when needed.
if (gamepad1.dpad_down) {
visionPortal.stopStreaming();
} else if (gamepad1.dpad_up) {
visionPortal.resumeStreaming();
}
// Share the CPU.
sleep(20);
}
}
// Save more CPU resources when camera is no longer needed.
visionPortal.close();
} // end method runOpMode()
/**
* Initialize the AprilTag processor.
*/
private void initAprilTag() {
// Create the AprilTag processor.
aprilTag = new AprilTagProcessor.Builder()
// The following default settings are available to un-comment and edit as needed.
//.setDrawAxes(false)
//.setDrawCubeProjection(false)
//.setDrawTagOutline(true)
//.setTagFamily(AprilTagProcessor.TagFamily.TAG_36h11)
//.setTagLibrary(AprilTagGameDatabase.getCenterStageTagLibrary())
//.setOutputUnits(DistanceUnit.INCH, AngleUnit.DEGREES)
// == CAMERA CALIBRATION ==
// If you do not manually specify calibration parameters, the SDK will attempt
// to load a predefined calibration for your camera.
//.setLensIntrinsics(578.272, 578.272, 402.145, 221.506)
// ... these parameters are fx, fy, cx, cy.
.build();
// Adjust Image Decimation to trade-off detection-range for detection-rate.
// eg: Some typical detection data using a Logitech C920 WebCam
// Decimation = 1 .. Detect 2" Tag from 10 feet away at 10 Frames per second
// Decimation = 2 .. Detect 2" Tag from 6 feet away at 22 Frames per second
// Decimation = 3 .. Detect 2" Tag from 4 feet away at 30 Frames Per Second (default)
// Decimation = 3 .. Detect 5" Tag from 10 feet away at 30 Frames Per Second (default)
// Note: Decimation can be changed on-the-fly to adapt during a match.
//aprilTag.setDecimation(3);
// Create the vision portal by using a builder.
VisionPortal.Builder builder = new VisionPortal.Builder();
// Set the camera (webcam vs. built-in RC phone camera).
if (USE_WEBCAM) {
builder.setCamera(hardwareMap.get(WebcamName.class, "Webcam 1"));
} else {
builder.setCamera(BuiltinCameraDirection.BACK);
}
// Choose a camera resolution. Not all cameras support all resolutions.
//builder.setCameraResolution(new Size(640, 480));
// Enable the RC preview (LiveView). Set "false" to omit camera monitoring.
//builder.enableLiveView(true);
// Set the stream format; MJPEG uses less bandwidth than default YUY2.
//builder.setStreamFormat(VisionPortal.StreamFormat.YUY2);
// Choose whether or not LiveView stops if no processors are enabled.
// If set "true", monitor shows solid orange screen if no processors enabled.
// If set "false", monitor shows camera view without annotations.
//builder.setAutoStopLiveView(false);
// Set and enable the processor.
builder.addProcessor(aprilTag);
// Build the Vision Portal, using the above settings.
visionPortal = builder.build();
// Disable or re-enable the aprilTag processor at any time.
//visionPortal.setProcessorEnabled(aprilTag, true);
} // end method initAprilTag()
/**
* Add telemetry about AprilTag detections.
*/
private void telemetryAprilTag() {
List<AprilTagDetection> currentDetections = aprilTag.getDetections();
telemetry.addData("# AprilTags Detected", currentDetections.size());
// Step through the list of detections and display info for each one.
for (AprilTagDetection detection : currentDetections) {
if (detection.metadata != null) {
telemetry.addLine(String.format("\n==== (ID %d) %s", detection.id, detection.metadata.name));
telemetry.addLine(String.format("XYZ %6.1f %6.1f %6.1f (inch)", detection.ftcPose.x, detection.ftcPose.y, detection.ftcPose.z));
telemetry.addLine(String.format("PRY %6.1f %6.1f %6.1f (deg)", detection.ftcPose.pitch, detection.ftcPose.roll, detection.ftcPose.yaw));
telemetry.addLine(String.format("RBE %6.1f %6.1f %6.1f (inch, deg, deg)", detection.ftcPose.range, detection.ftcPose.bearing, detection.ftcPose.elevation));
} else {
telemetry.addLine(String.format("\n==== (ID %d) Unknown", detection.id));
telemetry.addLine(String.format("Center %6.0f %6.0f (pixels)", detection.center.x, detection.center.y));
}
} // end for() loop
// Add "key" information to telemetry
telemetry.addLine("\nkey:\nXYZ = X (Right), Y (Forward), Z (Up) dist.");
telemetry.addLine("PRY = Pitch, Roll & Yaw (XYZ Rotation)");
telemetry.addLine("RBE = Range, Bearing & Elevation");
} // end method telemetryAprilTag()
} // end class

FtcRobotController/src/main/java/org/firstinspires/ftc/robotcontroller/external/samples/ConceptAprilTagEasy.java Normal file

@@ -0,0 +1,163 @@
/* Copyright (c) 2023 FIRST. All rights reserved.
*
* Redistribution and use in source and binary forms, with or without modification,
* are permitted (subject to the limitations in the disclaimer below) provided that
* the following conditions are met:
*
* Redistributions of source code must retain the above copyright notice, this list
* of conditions and the following disclaimer.
*
* Redistributions in binary form must reproduce the above copyright notice, this
* list of conditions and the following disclaimer in the documentation and/or
* other materials provided with the distribution.
*
* Neither the name of FIRST nor the names of its contributors may be used to endorse or
* promote products derived from this software without specific prior written permission.
*
* NO EXPRESS OR IMPLIED LICENSES TO ANY PARTY'S PATENT RIGHTS ARE GRANTED BY THIS
* LICENSE. THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
* "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO,
* THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE
* ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR CONTRIBUTORS BE LIABLE
* FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL
* DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR
* SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER
* CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY,
* OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
* OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
*/
package org.firstinspires.ftc.robotcontroller.external.samples;
import com.qualcomm.robotcore.eventloop.opmode.Disabled;
import com.qualcomm.robotcore.eventloop.opmode.LinearOpMode;
import com.qualcomm.robotcore.eventloop.opmode.TeleOp;
import org.firstinspires.ftc.robotcore.external.hardware.camera.BuiltinCameraDirection;
import org.firstinspires.ftc.robotcore.external.hardware.camera.WebcamName;
import org.firstinspires.ftc.vision.VisionPortal;
import org.firstinspires.ftc.vision.apriltag.AprilTagDetection;
import org.firstinspires.ftc.vision.apriltag.AprilTagProcessor;
import java.util.List;
/*
* This OpMode illustrates the basics of AprilTag recognition and pose estimation, using
* the easy way.
*
* For an introduction to AprilTags, see the FTC-DOCS link below:
* https://ftc-docs.firstinspires.org/en/latest/apriltag/vision_portal/apriltag_intro/apriltag-intro.html
*
* In this sample, any visible tag ID will be detected and displayed, but only tags that are included in the default
* "TagLibrary" will have their position and orientation information displayed. This default TagLibrary contains
* the current Season's AprilTags and a small set of "test Tags" in the high number range.
*
* When an AprilTag in the TagLibrary is detected, the SDK provides location and orientation of the tag, relative to the camera.
* This information is provided in the "ftcPose" member of the returned "detection", and is explained in the ftc-docs page linked below.
* https://ftc-docs.firstinspires.org/apriltag-detection-values
*
* To experiment with using AprilTags to navigate, try out these two driving samples:
* RobotAutoDriveToAprilTagOmni and RobotAutoDriveToAprilTagTank
*
* Use Android Studio to Copy this Class, and Paste it into your team's code folder with a new name.
* Remove or comment out the @Disabled line to add this OpMode to the Driver Station OpMode list.
*/
@TeleOp(name = "Concept: AprilTag Easy", group = "Concept")
@Disabled
public class ConceptAprilTagEasy extends LinearOpMode {
private static final boolean USE_WEBCAM = true; // true for webcam, false for phone camera
/**
* The variable to store our instance of the AprilTag processor.
*/
private AprilTagProcessor aprilTag;
/**
* The variable to store our instance of the vision portal.
*/
private VisionPortal visionPortal;
@Override
public void runOpMode() {
initAprilTag();
// Wait for the DS start button to be touched.
telemetry.addData("DS preview on/off", "3 dots, Camera Stream");
telemetry.addData(">", "Touch Play to start OpMode");
telemetry.update();
waitForStart();
if (opModeIsActive()) {
while (opModeIsActive()) {
telemetryAprilTag();
// Push telemetry to the Driver Station.
telemetry.update();
// Save CPU resources; can resume streaming when needed.
if (gamepad1.dpad_down) {
visionPortal.stopStreaming();
} else if (gamepad1.dpad_up) {
visionPortal.resumeStreaming();
}
// Share the CPU.
sleep(20);
}
}
// Save more CPU resources when camera is no longer needed.
visionPortal.close();
} // end method runOpMode()
/**
* Initialize the AprilTag processor.
*/
private void initAprilTag() {
// Create the AprilTag processor the easy way.
aprilTag = AprilTagProcessor.easyCreateWithDefaults();
// Create the vision portal the easy way.
if (USE_WEBCAM) {
visionPortal = VisionPortal.easyCreateWithDefaults(
hardwareMap.get(WebcamName.class, "Webcam 1"), aprilTag);
} else {
visionPortal = VisionPortal.easyCreateWithDefaults(
BuiltinCameraDirection.BACK, aprilTag);
}
} // end method initAprilTag()
/**
* Add telemetry about AprilTag detections.
*/
private void telemetryAprilTag() {
List<AprilTagDetection> currentDetections = aprilTag.getDetections();
telemetry.addData("# AprilTags Detected", currentDetections.size());
// Step through the list of detections and display info for each one.
for (AprilTagDetection detection : currentDetections) {
if (detection.metadata != null) {
telemetry.addLine(String.format("\n==== (ID %d) %s", detection.id, detection.metadata.name));
telemetry.addLine(String.format("XYZ %6.1f %6.1f %6.1f (inch)", detection.ftcPose.x, detection.ftcPose.y, detection.ftcPose.z));
telemetry.addLine(String.format("PRY %6.1f %6.1f %6.1f (deg)", detection.ftcPose.pitch, detection.ftcPose.roll, detection.ftcPose.yaw));
telemetry.addLine(String.format("RBE %6.1f %6.1f %6.1f (inch, deg, deg)", detection.ftcPose.range, detection.ftcPose.bearing, detection.ftcPose.elevation));
} else {
telemetry.addLine(String.format("\n==== (ID %d) Unknown", detection.id));
telemetry.addLine(String.format("Center %6.0f %6.0f (pixels)", detection.center.x, detection.center.y));
}
} // end for() loop
// Add "key" information to telemetry
telemetry.addLine("\nkey:\nXYZ = X (Right), Y (Forward), Z (Up) dist.");
telemetry.addLine("PRY = Pitch, Roll & Yaw (XYZ Rotation)");
telemetry.addLine("RBE = Range, Bearing & Elevation");
} // end method telemetryAprilTag()
} // end class

FtcRobotController/src/main/java/org/firstinspires/ftc/robotcontroller/external/samples/ConceptAprilTagOptimizeExposure.java Normal file

@@ -0,0 +1,246 @@
/* Copyright (c) 2023 FIRST. All rights reserved.
*
* Redistribution and use in source and binary forms, with or without modification,
* are permitted (subject to the limitations in the disclaimer below) provided that
* the following conditions are met:
*
* Redistributions of source code must retain the above copyright notice, this list
* of conditions and the following disclaimer.
*
* Redistributions in binary form must reproduce the above copyright notice, this
* list of conditions and the following disclaimer in the documentation and/or
* other materials provided with the distribution.
*
* Neither the name of FIRST nor the names of its contributors may be used to endorse or
* promote products derived from this software without specific prior written permission.
*
* NO EXPRESS OR IMPLIED LICENSES TO ANY PARTY'S PATENT RIGHTS ARE GRANTED BY THIS
* LICENSE. THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
* "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO,
* THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE
* ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR CONTRIBUTORS BE LIABLE
* FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL
* DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR
* SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER
* CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY,
* OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
* OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
*/
package org.firstinspires.ftc.robotcontroller.external.samples;
import com.qualcomm.robotcore.eventloop.opmode.Disabled;
import com.qualcomm.robotcore.eventloop.opmode.LinearOpMode;
import com.qualcomm.robotcore.eventloop.opmode.TeleOp;
import com.qualcomm.robotcore.util.Range;
import org.firstinspires.ftc.robotcore.external.hardware.camera.WebcamName;
import org.firstinspires.ftc.robotcore.external.hardware.camera.controls.ExposureControl;
import org.firstinspires.ftc.robotcore.external.hardware.camera.controls.GainControl;
import org.firstinspires.ftc.vision.VisionPortal;
import org.firstinspires.ftc.vision.apriltag.AprilTagDetection;
import org.firstinspires.ftc.vision.apriltag.AprilTagProcessor;
import java.util.List;
import java.util.concurrent.TimeUnit;
/*
* This OpMode determines the best Exposure for minimizing image motion-blur on a Webcam
* Note that it is not possible to control the exposure for a Phone Camera, so if you are using a Phone for the Robot Controller
* this OpMode/Feature only applies to an externally connected Webcam
*
* The goal is to determine the smallest (shortest) Exposure value that still provides reliable Tag Detection.
* Starting with the minimum Exposure and maximum Gain, the exposure is slowly increased until the Tag is
* detected reliably from the likely operational distance.
*
*
* The best way to run this optimization is to view the camera preview screen while changing the exposure and gains.
*
* To do this, you need to view the RobotController screen directly (not from Driver Station)
* This can be done directly from a RC phone screen (if you are using an external Webcam), but for a Control Hub you must either plug an
* HDMI monitor into the Control Hub HDMI port, or use an external viewer program like ScrCpy (https://scrcpy.org/)
*
* Use Android Studio to Copy this Class, and Paste it into the TeamCode/src/main/java/org/firstinspires/ftc/teamcode folder.
* Remove or comment out the @Disabled line to add this OpMode to the Driver Station OpMode list.
*/
@TeleOp(name="Optimize AprilTag Exposure", group = "Concept")
@Disabled
public class ConceptAprilTagOptimizeExposure extends LinearOpMode
{
private VisionPortal visionPortal = null; // Used to manage the video source.
private AprilTagProcessor aprilTag; // Used for managing the AprilTag detection process.
private int myExposure ;
private int minExposure ;
private int maxExposure ;
private int myGain ;
private int minGain ;
private int maxGain ;
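// Current and previous button states, used to detect single presses (rising edges) of the exposure/gain adjustment buttons.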
boolean thisExpUp = false;
boolean thisExpDn = false;
boolean thisGainUp = false;
boolean thisGainDn = false;
boolean lastExpUp = false;
boolean lastExpDn = false;
boolean lastGainUp = false;
boolean lastGainDn = false;
@Override public void runOpMode()
{
// Initialize the Apriltag Detection process
initAprilTag();
// Establish Min and Max Gains and Exposure. Then set a low exposure with high gain
getCameraSetting();
myExposure = Math.min(5, minExposure);
myGain = maxGain;
setManualExposure(myExposure, myGain);
// Wait for the match to begin.
telemetry.addData("Camera preview on/off", "3 dots, Camera Stream");
telemetry.addData(">", "Touch Play to start OpMode");
telemetry.update();
waitForStart();
while (opModeIsActive())
{
telemetry.addLine("Find lowest Exposure that gives reliable detection.");
telemetry.addLine("Use Left bump/trig to adjust Exposure.");
telemetry.addLine("Use Right bump/trig to adjust Gain.\n");
// Display how many Tags Detected
List<AprilTagDetection> currentDetections = aprilTag.getDetections();
int numTags = currentDetections.size();
if (numTags > 0 )
telemetry.addData("Tag", "####### %d Detected ######", currentDetections.size());
else
telemetry.addData("Tag", "----------- none - ----------");
telemetry.addData("Exposure","%d (%d - %d)", myExposure, minExposure, maxExposure);
telemetry.addData("Gain","%d (%d - %d)", myGain, minGain, maxGain);
telemetry.update();
// check to see if we need to change exposure or gain.
thisExpUp = gamepad1.left_bumper;
thisExpDn = gamepad1.left_trigger > 0.25;
thisGainUp = gamepad1.right_bumper;
thisGainDn = gamepad1.right_trigger > 0.25;
// look for clicks to change exposure
if (thisExpUp && !lastExpUp) {
myExposure = Range.clip(myExposure + 1, minExposure, maxExposure);
setManualExposure(myExposure, myGain);
} else if (thisExpDn && !lastExpDn) {
myExposure = Range.clip(myExposure - 1, minExposure, maxExposure);
setManualExposure(myExposure, myGain);
}
// look for clicks to change the gain
if (thisGainUp && !lastGainUp) {
myGain = Range.clip(myGain + 1, minGain, maxGain );
setManualExposure(myExposure, myGain);
} else if (thisGainDn && !lastGainDn) {
myGain = Range.clip(myGain - 1, minGain, maxGain );
setManualExposure(myExposure, myGain);
}
lastExpUp = thisExpUp;
lastExpDn = thisExpDn;
lastGainUp = thisGainUp;
lastGainDn = thisGainDn;
sleep(20);
}
}
/**
* Initialize the AprilTag processor.
*/
private void initAprilTag() {
// Create the AprilTag processor by using a builder.
aprilTag = new AprilTagProcessor.Builder().build();
// Create the WEBCAM vision portal by using a builder.
visionPortal = new VisionPortal.Builder()
.setCamera(hardwareMap.get(WebcamName.class, "Webcam 1"))
.addProcessor(aprilTag)
.build();
}
/*
Manually set the camera gain and exposure.
Can only be called AFTER calling initAprilTag();
Returns true if controls are set.
*/
private boolean setManualExposure(int exposureMS, int gain) {
// Ensure Vision Portal has been setup.
if (visionPortal == null) {
return false;
}
// Wait for the camera to be open
if (visionPortal.getCameraState() != VisionPortal.CameraState.STREAMING) {
telemetry.addData("Camera", "Waiting");
telemetry.update();
while (!isStopRequested() && (visionPortal.getCameraState() != VisionPortal.CameraState.STREAMING)) {
sleep(20);
}
telemetry.addData("Camera", "Ready");
telemetry.update();
}
// Set camera controls unless we are stopping.
if (!isStopRequested())
{
// Set exposure. Make sure we are in Manual Mode for these values to take effect.
ExposureControl exposureControl = visionPortal.getCameraControl(ExposureControl.class);
if (exposureControl.getMode() != ExposureControl.Mode.Manual) {
exposureControl.setMode(ExposureControl.Mode.Manual);
sleep(50);
}
exposureControl.setExposure((long)exposureMS, TimeUnit.MILLISECONDS);
sleep(20);
// Set Gain.
GainControl gainControl = visionPortal.getCameraControl(GainControl.class);
gainControl.setGain(gain);
sleep(20);
return (true);
} else {
return (false);
}
}
/*
Read this camera's minimum and maximum Exposure and Gain settings.
Can only be called AFTER calling initAprilTag();
*/
private void getCameraSetting() {
// Ensure Vision Portal has been setup.
if (visionPortal == null) {
return;
}
// Wait for the camera to be open
if (visionPortal.getCameraState() != VisionPortal.CameraState.STREAMING) {
telemetry.addData("Camera", "Waiting");
telemetry.update();
while (!isStopRequested() && (visionPortal.getCameraState() != VisionPortal.CameraState.STREAMING)) {
sleep(20);
}
telemetry.addData("Camera", "Ready");
telemetry.update();
}
// Get camera control values unless we are stopping.
if (!isStopRequested()) {
ExposureControl exposureControl = visionPortal.getCameraControl(ExposureControl.class);
minExposure = (int)exposureControl.getMinExposure(TimeUnit.MILLISECONDS) + 1;
maxExposure = (int)exposureControl.getMaxExposure(TimeUnit.MILLISECONDS);
GainControl gainControl = visionPortal.getCameraControl(GainControl.class);
minGain = gainControl.getMinGain();
maxGain = gainControl.getMaxGain();
}
}
}

FtcRobotController/src/main/java/org/firstinspires/ftc/robotcontroller/external/samples/ConceptAprilTagSwitchableCameras.java Normal file

@@ -0,0 +1,196 @@
/* Copyright (c) 2023 FIRST. All rights reserved.
*
* Redistribution and use in source and binary forms, with or without modification,
* are permitted (subject to the limitations in the disclaimer below) provided that
* the following conditions are met:
*
* Redistributions of source code must retain the above copyright notice, this list
* of conditions and the following disclaimer.
*
* Redistributions in binary form must reproduce the above copyright notice, this
* list of conditions and the following disclaimer in the documentation and/or
* other materials provided with the distribution.
*
* Neither the name of FIRST nor the names of its contributors may be used to endorse or
* promote products derived from this software without specific prior written permission.
*
* NO EXPRESS OR IMPLIED LICENSES TO ANY PARTY'S PATENT RIGHTS ARE GRANTED BY THIS
* LICENSE. THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
* "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO,
* THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE
* ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR CONTRIBUTORS BE LIABLE
* FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL
* DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR
* SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER
* CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY,
* OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
* OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
*/
package org.firstinspires.ftc.robotcontroller.external.samples;
import com.qualcomm.robotcore.eventloop.opmode.Disabled;
import com.qualcomm.robotcore.eventloop.opmode.LinearOpMode;
import com.qualcomm.robotcore.eventloop.opmode.TeleOp;
import org.firstinspires.ftc.robotcore.external.ClassFactory;
import org.firstinspires.ftc.robotcore.external.hardware.camera.CameraName;
import org.firstinspires.ftc.robotcore.external.hardware.camera.WebcamName;
import org.firstinspires.ftc.vision.VisionPortal;
import org.firstinspires.ftc.vision.VisionPortal.CameraState;
import org.firstinspires.ftc.vision.apriltag.AprilTagDetection;
import org.firstinspires.ftc.vision.apriltag.AprilTagProcessor;
import java.util.List;
/*
* This OpMode illustrates the basics of AprilTag recognition and pose estimation, using
* two webcams.
*
* Use Android Studio to Copy this Class, and Paste it into your team's code folder with a new name.
* Remove or comment out the @Disabled line to add this OpMode to the Driver Station OpMode list.
*/
@TeleOp(name = "Concept: AprilTag Switchable Cameras", group = "Concept")
@Disabled
public class ConceptAprilTagSwitchableCameras extends LinearOpMode {
/*
* Variables used for switching cameras.
*/
private WebcamName webcam1, webcam2;
private boolean oldLeftBumper;
private boolean oldRightBumper;
/**
* The variable to store our instance of the AprilTag processor.
*/
private AprilTagProcessor aprilTag;
/**
* The variable to store our instance of the vision portal.
*/
private VisionPortal visionPortal;
@Override
public void runOpMode() {
initAprilTag();
// Wait for the DS start button to be touched.
telemetry.addData("DS preview on/off", "3 dots, Camera Stream");
telemetry.addData(">", "Touch Play to start OpMode");
telemetry.update();
waitForStart();
if (opModeIsActive()) {
while (opModeIsActive()) {
telemetryCameraSwitching();
telemetryAprilTag();
// Push telemetry to the Driver Station.
telemetry.update();
// Save CPU resources; can resume streaming when needed.
if (gamepad1.dpad_down) {
visionPortal.stopStreaming();
} else if (gamepad1.dpad_up) {
visionPortal.resumeStreaming();
}
doCameraSwitching();
// Share the CPU.
sleep(20);
}
}
// Save more CPU resources when camera is no longer needed.
visionPortal.close();
} // end runOpMode()
/**
* Initialize the AprilTag processor.
*/
private void initAprilTag() {
// Create the AprilTag processor by using a builder.
aprilTag = new AprilTagProcessor.Builder().build();
webcam1 = hardwareMap.get(WebcamName.class, "Webcam 1");
webcam2 = hardwareMap.get(WebcamName.class, "Webcam 2");
CameraName switchableCamera = ClassFactory.getInstance()
.getCameraManager().nameForSwitchableCamera(webcam1, webcam2);
// Create the vision portal by using a builder.
visionPortal = new VisionPortal.Builder()
.setCamera(switchableCamera)
.addProcessor(aprilTag)
.build();
} // end method initAprilTag()
/**
* Add telemetry about camera switching.
*/
private void telemetryCameraSwitching() {
if (visionPortal.getActiveCamera().equals(webcam1)) {
telemetry.addData("activeCamera", "Webcam 1");
telemetry.addData("Press RightBumper", "to switch to Webcam 2");
} else {
telemetry.addData("activeCamera", "Webcam 2");
telemetry.addData("Press LeftBumper", "to switch to Webcam 1");
}
} // end method telemetryCameraSwitching()
/**
* Add telemetry about AprilTag detections.
*/
private void telemetryAprilTag() {
List<AprilTagDetection> currentDetections = aprilTag.getDetections();
telemetry.addData("# AprilTags Detected", currentDetections.size());
// Step through the list of detections and display info for each one.
for (AprilTagDetection detection : currentDetections) {
if (detection.metadata != null) {
telemetry.addLine(String.format("\n==== (ID %d) %s", detection.id, detection.metadata.name));
telemetry.addLine(String.format("XYZ %6.1f %6.1f %6.1f (inch)", detection.ftcPose.x, detection.ftcPose.y, detection.ftcPose.z));
telemetry.addLine(String.format("PRY %6.1f %6.1f %6.1f (deg)", detection.ftcPose.pitch, detection.ftcPose.roll, detection.ftcPose.yaw));
telemetry.addLine(String.format("RBE %6.1f %6.1f %6.1f (inch, deg, deg)", detection.ftcPose.range, detection.ftcPose.bearing, detection.ftcPose.elevation));
} else {
telemetry.addLine(String.format("\n==== (ID %d) Unknown", detection.id));
telemetry.addLine(String.format("Center %6.0f %6.0f (pixels)", detection.center.x, detection.center.y));
}
} // end for() loop
// Add "key" information to telemetry
telemetry.addLine("\nkey:\nXYZ = X (Right), Y (Forward), Z (Up) dist.");
telemetry.addLine("PRY = Pitch, Roll & Yaw (XYZ Rotation)");
telemetry.addLine("RBE = Range, Bearing & Elevation");
} // end method telemetryAprilTag()
/**
* Set the active camera according to input from the gamepad.
*/
private void doCameraSwitching() {
if (visionPortal.getCameraState() == CameraState.STREAMING) {
// If the left bumper is pressed, use Webcam 1.
// If the right bumper is pressed, use Webcam 2.
boolean newLeftBumper = gamepad1.left_bumper;
boolean newRightBumper = gamepad1.right_bumper;
if (newLeftBumper && !oldLeftBumper) {
visionPortal.setActiveCamera(webcam1);
} else if (newRightBumper && !oldRightBumper) {
visionPortal.setActiveCamera(webcam2);
}
oldLeftBumper = newLeftBumper;
oldRightBumper = newRightBumper;
}
} // end method doCameraSwitching()
} // end class
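A possible variation on doCameraSwitching(), offered only as a sketch: switch to the other webcam automatically when the active one reports no AprilTags, instead of waiting for a bumper press. It reuses the webcam1/webcam2/visionPortal/aprilTag fields of the class above, assumes an extra com.qualcomm.robotcore.util.ElapsedTime import, and the 0.5 s hold-off is an arbitrary choice.

    private final ElapsedTime switchTimer = new ElapsedTime();

    private void autoCameraSwitching() {
        // Give the newly selected camera at least half a second to deliver frames.
        if (switchTimer.seconds() < 0.5) {
            return;
        }
        if (visionPortal.getCameraState() == CameraState.STREAMING
                && aprilTag.getDetections().isEmpty()) {
            visionPortal.setActiveCamera(
                    visionPortal.getActiveCamera().equals(webcam1) ? webcam2 : webcam1);
            switchTimer.reset();
        }
    }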

View File

@ -0,0 +1,202 @@
/* Copyright (c) 2023 FIRST. All rights reserved.
*
* Redistribution and use in source and binary forms, with or without modification,
* are permitted (subject to the limitations in the disclaimer below) provided that
* the following conditions are met:
*
* Redistributions of source code must retain the above copyright notice, this list
* of conditions and the following disclaimer.
*
* Redistributions in binary form must reproduce the above copyright notice, this
* list of conditions and the following disclaimer in the documentation and/or
* other materials provided with the distribution.
*
* Neither the name of FIRST nor the names of its contributors may be used to endorse or
* promote products derived from this software without specific prior written permission.
*
* NO EXPRESS OR IMPLIED LICENSES TO ANY PARTY'S PATENT RIGHTS ARE GRANTED BY THIS
* LICENSE. THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
* "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO,
* THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE
* ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR CONTRIBUTORS BE LIABLE
* FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL
* DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR
* SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER
* CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY,
* OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
* OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
*/
package org.firstinspires.ftc.robotcontroller.external.samples;
import com.qualcomm.robotcore.eventloop.opmode.Disabled;
import com.qualcomm.robotcore.eventloop.opmode.LinearOpMode;
import com.qualcomm.robotcore.eventloop.opmode.TeleOp;
import org.firstinspires.ftc.robotcore.external.hardware.camera.BuiltinCameraDirection;
import org.firstinspires.ftc.robotcore.external.hardware.camera.WebcamName;
import org.firstinspires.ftc.robotcore.external.tfod.Recognition;
import org.firstinspires.ftc.vision.VisionPortal;
import org.firstinspires.ftc.vision.apriltag.AprilTagDetection;
import org.firstinspires.ftc.vision.apriltag.AprilTagProcessor;
import org.firstinspires.ftc.vision.tfod.TfodProcessor;
import java.util.List;
/*
* This OpMode illustrates the basics of using both AprilTag recognition and TensorFlow
* Object Detection.
*
* Use Android Studio to Copy this Class, and Paste it into your team's code folder with a new name.
* Remove or comment out the @Disabled line to add this OpMode to the Driver Station OpMode list.
*/
@TeleOp(name = "Concept: Double Vision", group = "Concept")
@Disabled
public class ConceptDoubleVision extends LinearOpMode {
private static final boolean USE_WEBCAM = true; // true for webcam, false for phone camera
/**
* The variable to store our instance of the AprilTag processor.
*/
private AprilTagProcessor aprilTag;
/**
* The variable to store our instance of the TensorFlow Object Detection processor.
*/
private TfodProcessor tfod;
/**
* The variable to store our instance of the vision portal.
*/
private VisionPortal myVisionPortal;
@Override
public void runOpMode() {
initDoubleVision();
// This OpMode loops continuously, allowing the user to switch between
// AprilTag and TensorFlow Object Detection (TFOD) image processors.
while (!isStopRequested()) {
if (opModeInInit()) {
telemetry.addData("DS preview on/off","3 dots, Camera Stream");
telemetry.addLine();
telemetry.addLine("----------------------------------------");
}
if (myVisionPortal.getProcessorEnabled(aprilTag)) {
// User instructions: Dpad left or Dpad right.
telemetry.addLine("Dpad Left to disable AprilTag");
telemetry.addLine();
telemetryAprilTag();
} else {
telemetry.addLine("Dpad Right to enable AprilTag");
}
telemetry.addLine();
telemetry.addLine("----------------------------------------");
if (myVisionPortal.getProcessorEnabled(tfod)) {
telemetry.addLine("Dpad Down to disable TFOD");
telemetry.addLine();
telemetryTfod();
} else {
telemetry.addLine("Dpad Up to enable TFOD");
}
// Push telemetry to the Driver Station.
telemetry.update();
if (gamepad1.dpad_left) {
myVisionPortal.setProcessorEnabled(aprilTag, false);
} else if (gamepad1.dpad_right) {
myVisionPortal.setProcessorEnabled(aprilTag, true);
}
if (gamepad1.dpad_down) {
myVisionPortal.setProcessorEnabled(tfod, false);
} else if (gamepad1.dpad_up) {
myVisionPortal.setProcessorEnabled(tfod, true);
}
sleep(20);
} // end while loop
} // end method runOpMode()
/**
* Initialize AprilTag and TFOD.
*/
private void initDoubleVision() {
// -----------------------------------------------------------------------------------------
// AprilTag Configuration
// -----------------------------------------------------------------------------------------
aprilTag = new AprilTagProcessor.Builder()
.build();
// -----------------------------------------------------------------------------------------
// TFOD Configuration
// -----------------------------------------------------------------------------------------
tfod = new TfodProcessor.Builder()
.build();
// -----------------------------------------------------------------------------------------
// Camera Configuration
// -----------------------------------------------------------------------------------------
if (USE_WEBCAM) {
myVisionPortal = new VisionPortal.Builder()
.setCamera(hardwareMap.get(WebcamName.class, "Webcam 1"))
.addProcessors(tfod, aprilTag)
.build();
} else {
myVisionPortal = new VisionPortal.Builder()
.setCamera(BuiltinCameraDirection.BACK)
.addProcessors(tfod, aprilTag)
.build();
}
} // end initDoubleVision()
/**
* Add telemetry about AprilTag detections.
*/
private void telemetryAprilTag() {
List<AprilTagDetection> currentDetections = aprilTag.getDetections();
telemetry.addData("# AprilTags Detected", currentDetections.size());
// Step through the list of detections and display info for each one.
for (AprilTagDetection detection : currentDetections) {
if (detection.metadata != null) {
telemetry.addLine(String.format("\n==== (ID %d) %s", detection.id, detection.metadata.name));
telemetry.addLine(String.format("XYZ %6.1f %6.1f %6.1f (inch)", detection.ftcPose.x, detection.ftcPose.y, detection.ftcPose.z));
telemetry.addLine(String.format("PRY %6.1f %6.1f %6.1f (deg)", detection.ftcPose.pitch, detection.ftcPose.roll, detection.ftcPose.yaw));
telemetry.addLine(String.format("RBE %6.1f %6.1f %6.1f (inch, deg, deg)", detection.ftcPose.range, detection.ftcPose.bearing, detection.ftcPose.elevation));
} else {
telemetry.addLine(String.format("\n==== (ID %d) Unknown", detection.id));
telemetry.addLine(String.format("Center %6.0f %6.0f (pixels)", detection.center.x, detection.center.y));
}
} // end for() loop
} // end method telemetryAprilTag()
/**
* Add telemetry about TensorFlow Object Detection (TFOD) recognitions.
*/
private void telemetryTfod() {
List<Recognition> currentRecognitions = tfod.getRecognitions();
telemetry.addData("# Objects Detected", currentRecognitions.size());
// Step through the list of recognitions and display info for each one.
for (Recognition recognition : currentRecognitions) {
double x = (recognition.getLeft() + recognition.getRight()) / 2 ;
double y = (recognition.getTop() + recognition.getBottom()) / 2 ;
telemetry.addData(""," ");
telemetry.addData("Image", "%s (%.0f %% Conf.)", recognition.getLabel(), recognition.getConfidence() * 100);
telemetry.addData("- Position", "%.0f / %.0f", x, y);
telemetry.addData("- Size", "%.0f x %.0f", recognition.getWidth(), recognition.getHeight());
} // end for() loop
} // end method telemetryTfod()
} // end class
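A small alternative to the dpad handling above, shown only as a sketch: toggling a processor on the rising edge of a single button means one press flips its state exactly once, rather than the held-button scheme used in the sample. It assumes the same myVisionPortal and aprilTag fields, plus a lastX boolean field declared alongside them, and would sit inside the main while loop.

    boolean nowX = gamepad1.x;
    if (nowX && !lastX) {
        // Flip the AprilTag processor between enabled and disabled on each press of X (Square).
        boolean enabled = myVisionPortal.getProcessorEnabled(aprilTag);
        myVisionPortal.setProcessorEnabled(aprilTag, !enabled);
    }
    lastX = nowX;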

View File

@ -0,0 +1,184 @@
/*
Copyright (c) 2022 REV Robotics, FIRST
All rights reserved.
Redistribution and use in source and binary forms, with or without modification,
are permitted (subject to the limitations in the disclaimer below) provided that
the following conditions are met:
Redistributions of source code must retain the above copyright notice, this list
of conditions and the following disclaimer.
Redistributions in binary form must reproduce the above copyright notice, this
list of conditions and the following disclaimer in the documentation and/or
other materials provided with the distribution.
Neither the name of REV Robotics nor the names of its contributors may be used to
endorse or promote products derived from this software without specific prior
written permission.
NO EXPRESS OR IMPLIED LICENSES TO ANY PARTY'S PATENT RIGHTS ARE GRANTED BY THIS
LICENSE. THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
"AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO,
THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE
ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR CONTRIBUTORS BE LIABLE
FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL
DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR
SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER
CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR
TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF
THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
*/
package org.firstinspires.ftc.robotcontroller.external.samples;
import com.qualcomm.hardware.rev.RevHubOrientationOnRobot;
import com.qualcomm.robotcore.eventloop.opmode.Disabled;
import com.qualcomm.robotcore.eventloop.opmode.LinearOpMode;
import com.qualcomm.robotcore.eventloop.opmode.TeleOp;
import com.qualcomm.robotcore.hardware.IMU;
import org.firstinspires.ftc.robotcore.external.navigation.AngleUnit;
import org.firstinspires.ftc.robotcore.external.navigation.AngularVelocity;
import org.firstinspires.ftc.robotcore.external.navigation.YawPitchRollAngles;
/*
* This OpMode demonstrates the impact of setting the IMU orientation correctly or incorrectly. This
* code assumes there is an IMU configured with the name "imu".
*
* Note: This OpMode is more of a tool than a code sample. The User Interface portion of this code
* goes beyond simply showing how to interface to the IMU.<br>
* For a minimal example of interfacing to an IMU, please see the SensorIMUOrthogonal or SensorIMUNonOrthogonal sample OpModes.
*
* This OpMode enables you to re-specify the Hub Mounting orientation dynamically by using gamepad controls.
* While doing so, the sample will display how Pitch, Roll and Yaw angles change as the hub is moved.
*
* The gamepad controls let you change the two parameters that specify how the Control/Expansion Hub is mounted. <br>
* The first parameter specifies which direction the printed logo on the Hub is pointing. <br>
* The second parameter specifies which direction the USB connector on the Hub is pointing. <br>
* All directions are relative to the robot, and left/right is as viewed from behind the robot.
*
* How will you know if you have chosen the correct Orientation? With the correct orientation
* parameters selected, pitch/roll/yaw should act as follows:
*
* Pitch value should INCREASE as the robot is tipped UP at the front. (Rotation about X) <br>
* Roll value should INCREASE as the robot is tipped UP at the left side. (Rotation about Y) <br>
* Yaw value should INCREASE as the robot is rotated Counter Clockwise. (Rotation about Z) <br>
*
* The Yaw can be reset (to zero) by pressing the Y button on the gamepad (Triangle on a PS4 controller)
*
* The rotational velocities should follow the change in corresponding axes.
*/
@TeleOp(name="Concept: IMU Orientation", group="Concept")
@Disabled
public class ConceptExploringIMUOrientation extends LinearOpMode {
static RevHubOrientationOnRobot.LogoFacingDirection[] logoFacingDirections
= RevHubOrientationOnRobot.LogoFacingDirection.values();
static RevHubOrientationOnRobot.UsbFacingDirection[] usbFacingDirections
= RevHubOrientationOnRobot.UsbFacingDirection.values();
static int LAST_DIRECTION = logoFacingDirections.length - 1;
static float TRIGGER_THRESHOLD = 0.2f;
IMU imu;
int logoFacingDirectionPosition;
int usbFacingDirectionPosition;
boolean orientationIsValid = true;
@Override public void runOpMode() throws InterruptedException {
imu = hardwareMap.get(IMU.class, "imu");
logoFacingDirectionPosition = 0; // Up
usbFacingDirectionPosition = 2; // Forward
updateOrientation();
boolean justChangedLogoDirection = false;
boolean justChangedUsbDirection = false;
// Loop until stop requested
while (!isStopRequested()) {
// Check to see if Yaw reset is requested (Y button)
if (gamepad1.y) {
telemetry.addData("Yaw", "Resetting\n");
imu.resetYaw();
} else {
telemetry.addData("Yaw", "Press Y (triangle) on Gamepad to reset.\n");
}
// Check to see if new Logo Direction is requested
if (gamepad1.left_bumper || gamepad1.right_bumper) {
if (!justChangedLogoDirection) {
justChangedLogoDirection = true;
if (gamepad1.left_bumper) {
logoFacingDirectionPosition--;
if (logoFacingDirectionPosition < 0) {
logoFacingDirectionPosition = LAST_DIRECTION;
}
} else {
logoFacingDirectionPosition++;
if (logoFacingDirectionPosition > LAST_DIRECTION) {
logoFacingDirectionPosition = 0;
}
}
updateOrientation();
}
} else {
justChangedLogoDirection = false;
}
// Check to see if new USB Direction is requested
if (gamepad1.left_trigger > TRIGGER_THRESHOLD || gamepad1.right_trigger > TRIGGER_THRESHOLD) {
if (!justChangedUsbDirection) {
justChangedUsbDirection = true;
if (gamepad1.left_trigger > TRIGGER_THRESHOLD) {
usbFacingDirectionPosition--;
if (usbFacingDirectionPosition < 0) {
usbFacingDirectionPosition = LAST_DIRECTION;
}
} else {
usbFacingDirectionPosition++;
if (usbFacingDirectionPosition > LAST_DIRECTION) {
usbFacingDirectionPosition = 0;
}
}
updateOrientation();
}
} else {
justChangedUsbDirection = false;
}
// Display User instructions and IMU data
telemetry.addData("logo Direction (set with bumpers)", logoFacingDirections[logoFacingDirectionPosition]);
telemetry.addData("usb Direction (set with triggers)", usbFacingDirections[usbFacingDirectionPosition] + "\n");
if (orientationIsValid) {
YawPitchRollAngles orientation = imu.getRobotYawPitchRollAngles();
AngularVelocity angularVelocity = imu.getRobotAngularVelocity(AngleUnit.DEGREES);
telemetry.addData("Yaw (Z)", "%.2f Deg. (Heading)", orientation.getYaw(AngleUnit.DEGREES));
telemetry.addData("Pitch (X)", "%.2f Deg.", orientation.getPitch(AngleUnit.DEGREES));
telemetry.addData("Roll (Y)", "%.2f Deg.\n", orientation.getRoll(AngleUnit.DEGREES));
telemetry.addData("Yaw (Z) velocity", "%.2f Deg/Sec", angularVelocity.zRotationRate);
telemetry.addData("Pitch (X) velocity", "%.2f Deg/Sec", angularVelocity.xRotationRate);
telemetry.addData("Roll (Y) velocity", "%.2f Deg/Sec", angularVelocity.yRotationRate);
} else {
telemetry.addData("Error", "Selected orientation on robot is invalid");
}
telemetry.update();
}
}
// apply any requested orientation changes.
void updateOrientation() {
RevHubOrientationOnRobot.LogoFacingDirection logo = logoFacingDirections[logoFacingDirectionPosition];
RevHubOrientationOnRobot.UsbFacingDirection usb = usbFacingDirections[usbFacingDirectionPosition];
try {
RevHubOrientationOnRobot orientationOnRobot = new RevHubOrientationOnRobot(logo, usb);
imu.initialize(new IMU.Parameters(orientationOnRobot));
orientationIsValid = true;
} catch (IllegalArgumentException e) {
orientationIsValid = false;
}
}
}
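For comparison, a minimal fixed-orientation sketch of the same IMU setup, as would appear inside runOpMode() of a simpler OpMode (assumptions: a Hub mounted with the logo UP and the USB ports FORWARD, and the same "imu" configuration name; adjust both directions to match your robot).

    IMU imu = hardwareMap.get(IMU.class, "imu");
    imu.initialize(new IMU.Parameters(new RevHubOrientationOnRobot(
            RevHubOrientationOnRobot.LogoFacingDirection.UP,
            RevHubOrientationOnRobot.UsbFacingDirection.FORWARD)));
    // Heading (yaw) increases counter-clockwise, exactly as described in the comments above.
    double heading = imu.getRobotYawPitchRollAngles().getYaw(AngleUnit.DEGREES);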

View File

@ -0,0 +1,142 @@
/* Copyright (c) 2022 FIRST. All rights reserved.
*
* Redistribution and use in source and binary forms, with or without modification,
* are permitted (subject to the limitations in the disclaimer below) provided that
* the following conditions are met:
*
* Redistributions of source code must retain the above copyright notice, this list
* of conditions and the following disclaimer.
*
* Redistributions in binary form must reproduce the above copyright notice, this
* list of conditions and the following disclaimer in the documentation and/or
* other materials provided with the distribution.
*
* Neither the name of FIRST nor the names of its contributors may be used to endorse or
* promote products derived from this software without specific prior written permission.
*
* NO EXPRESS OR IMPLIED LICENSES TO ANY PARTY'S PATENT RIGHTS ARE GRANTED BY THIS
* LICENSE. THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
* "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO,
* THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE
* ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR CONTRIBUTORS BE LIABLE
* FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL
* DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR
* SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER
* CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY,
* OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
* OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
*/
package org.firstinspires.ftc.robotcontroller.external.samples;
import com.qualcomm.robotcore.eventloop.opmode.Disabled;
import com.qualcomm.robotcore.eventloop.opmode.LinearOpMode;
import com.qualcomm.robotcore.eventloop.opmode.TeleOp;
import com.qualcomm.robotcore.util.Range;
/*
* This OpMode illustrates how to use an external "hardware" class to modularize all the robot's sensors and actuators.
* This approach is very efficient because the same hardware class can be used by all of your teleop and autonomous OpModes
* without requiring many copy & paste operations. Once you have defined and tested the hardware class with one OpMode,
* it is instantly available to other OpModes.
*
* The real benefit of this approach is that as you tweak your robot hardware, you only need to make changes in ONE place (the Hardware Class).
 * So, to be effective you should put as much of your hardware setup and access code as possible in the hardware class.
* Essentially anything you do with hardware in BOTH Teleop and Auto should likely go in the hardware class.
*
* The Hardware Class is created in a separate file, and then an "instance" of this class is created in each OpMode.
* In order for the class to do typical OpMode things (like send telemetry data) it must be passed a reference to the
* OpMode object when it's created, so it can access all core OpMode functions. This is illustrated below.
*
* In this concept sample, the hardware class file is called RobotHardware.java and it must accompany this sample OpMode.
* So, if you copy ConceptExternalHardwareClass.java into TeamCode (using Android Studio or OnBotJava) then RobotHardware.java
* must also be copied to the same location (maintaining its name).
*
 * For comparison purposes, this sample and its accompanying hardware class duplicate the functionality of the
* RobotTelopPOV_Linear OpMode. It assumes three motors (left_drive, right_drive and arm) and two servos (left_hand and right_hand)
*
* View the RobotHardware.java class file for more details
*
* Use Android Studio to Copy this Class, and Paste it into your team's code folder with a new name.
* Remove or comment out the @Disabled line to add this OpMode to the Driver Station OpMode list
*
* In OnBot Java, add a new OpMode, select this sample, and select TeleOp.
* Also add another new file named RobotHardware.java, select the sample with that name, and select Not an OpMode.
*/
@TeleOp(name="Concept: Robot Hardware Class", group="Robot")
@Disabled
public class ConceptExternalHardwareClass extends LinearOpMode {
// Create a RobotHardware object to be used to access robot hardware.
// Prefix any hardware functions with "robot." to access this class.
RobotHardware robot = new RobotHardware(this);
@Override
public void runOpMode() {
double drive = 0;
double turn = 0;
double arm = 0;
double handOffset = 0;
// initialize all the hardware, using the hardware class. See how clean and simple this is?
robot.init();
// Send telemetry message to signify robot waiting;
// Wait for the game to start (driver presses PLAY)
waitForStart();
// run until the end of the match (driver presses STOP)
while (opModeIsActive()) {
// Run wheels in POV mode (note: The joystick goes negative when pushed forward, so negate it)
// In this mode the Left stick moves the robot fwd and back, the Right stick turns left and right.
// This way it's also easy to just drive straight, or just turn.
drive = -gamepad1.left_stick_y;
turn = gamepad1.right_stick_x;
// Combine drive and turn for blended motion. Use RobotHardware class
robot.driveRobot(drive, turn);
// Use gamepad left & right Bumpers to open and close the claw
// Use the SERVO constants defined in RobotHardware class.
// Each time around the loop, the servos will move by a small amount.
// Limit the total offset to half of the full travel range
if (gamepad1.right_bumper)
handOffset += robot.HAND_SPEED;
else if (gamepad1.left_bumper)
handOffset -= robot.HAND_SPEED;
handOffset = Range.clip(handOffset, -0.5, 0.5);
// Move both servos to new position. Use RobotHardware class
robot.setHandPositions(handOffset);
// Use gamepad buttons to move arm up (Y) and down (A)
// Use the MOTOR constants defined in RobotHardware class.
if (gamepad1.y)
arm = robot.ARM_UP_POWER;
else if (gamepad1.a)
arm = robot.ARM_DOWN_POWER;
else
arm = 0;
robot.setArmPower(arm);
// Send telemetry messages to explain controls and show robot status
telemetry.addData("Drive", "Left Stick");
telemetry.addData("Turn", "Right Stick");
telemetry.addData("Arm Up/Down", "Y & A Buttons");
telemetry.addData("Hand Open/Closed", "Left and Right Bumpers");
telemetry.addData("-", "-------");
telemetry.addData("Drive Power", "%.2f", drive);
telemetry.addData("Turn Power", "%.2f", turn);
telemetry.addData("Arm Power", "%.2f", arm);
telemetry.addData("Hand Position", "Offset = %.2f", handOffset);
telemetry.update();
// Pace this loop so hands move at a reasonable speed.
sleep(50);
}
}
}
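The RobotHardware class referenced above ships as its own sample file and is not reproduced in this commit view; the skeleton below is only a sketch of the general shape such a class takes (illustrative names, and it would need imports for LinearOpMode, DcMotor and Range).

public class RobotHardware {
    private final LinearOpMode myOpMode;        // reference to the OpMode, for hardwareMap access
    private DcMotor leftDrive, rightDrive;

    public final double HAND_SPEED = 0.02;      // constant used by the OpMode above as robot.HAND_SPEED

    public RobotHardware(LinearOpMode opMode) {
        myOpMode = opMode;
    }

    public void init() {
        leftDrive  = myOpMode.hardwareMap.get(DcMotor.class, "left_drive");
        rightDrive = myOpMode.hardwareMap.get(DcMotor.class, "right_drive");
        leftDrive.setDirection(DcMotor.Direction.REVERSE);
    }

    public void driveRobot(double drive, double turn) {
        // Blend drive and turn, clipping so neither wheel is asked for more than full power.
        leftDrive.setPower(Range.clip(drive + turn, -1.0, 1.0));
        rightDrive.setPower(Range.clip(drive - turn, -1.0, 1.0));
    }
}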

View File

@ -0,0 +1,200 @@
package org.firstinspires.ftc.robotcontroller.external.samples;
import com.qualcomm.robotcore.eventloop.opmode.Disabled;
import com.qualcomm.robotcore.eventloop.opmode.LinearOpMode;
import com.qualcomm.robotcore.eventloop.opmode.TeleOp;
import com.qualcomm.robotcore.hardware.Gamepad;
import com.qualcomm.robotcore.util.ElapsedTime;
/*
* This OpMode illustrates using the rumble feature of many gamepads.
*
* Note: Some gamepads "rumble" better than others.
* The Xbox & PS4 have a left (rumble1) and right (rumble2) rumble motor.
* These two gamepads have two distinct rumble modes: Large on the left, and small on the right
* The Etpark gamepad may only respond to rumble1, and may only run at full power.
* The Logitech F310 gamepad does not have *any* rumble ability.
*
* Moral: You should use this sample to experiment with your specific gamepads to explore their rumble features.
*
* The rumble motors are accessed through the standard gamepad1 and gamepad2 objects.
* Several new methods were added to the Gamepad class in FTC SDK Rev 7
* The key methods are as follows:
*
* .rumble(double rumble1, double rumble2, int durationMs)
* This method sets the rumble power of both motors for a specific time duration.
* Both rumble arguments are motor-power levels in the 0.0 to 1.0 range.
* durationMs is the desired length of the rumble action in milliseconds.
* This method returns immediately.
* Note:
* Use a durationMs of Gamepad.RUMBLE_DURATION_CONTINUOUS to provide a continuous rumble
* Use a power of 0, or duration of 0 to stop a rumble.
*
* .rumbleBlips(int count) allows an easy way to signal the driver with a series of rumble blips.
* Just specify how many blips you want.
* This method returns immediately.
*
* .runRumbleEffect(customRumbleEffect) allows you to run a custom rumble sequence that you have
* built using the Gamepad.RumbleEffect.Builder()
* A "custom effect" is a sequence of steps, where each step can rumble any of the
* rumble motors for a specific period at a specific power level.
* The Custom Effect will play as the (un-blocked) OpMode continues to run
*
* .isRumbling() returns true if there is a rumble effect in progress.
* Use this call to prevent stepping on a current rumble.
*
* .stopRumble() Stop any ongoing rumble or custom rumble effect.
*
* .rumble(int durationMs) Full power rumble for fixed duration.
*
* Note: Whenever a new Rumble command is issued, any currently executing rumble action will
* be truncated, and the new action started immediately. Take these precautions:
* 1) Do Not SPAM the rumble motors by issuing rapid fire commands
* 2) Multiple sources for rumble commands must coordinate to avoid tromping on each other.
*
* This can be achieved several possible ways:
* 1) Only having one source for rumble actions
* 2) Issuing rumble commands on transitions, rather than states.
* e.g. The moment a touch sensor is pressed, rather than the entire time it is being pressed.
* 3) Scheduling rumble commands based on timed events. e.g. 10 seconds prior to endgame
* 4) Rumble on non-overlapping mechanical actions. e.g. arm fully-extended or fully-retracted.
* 5) Use isRumbling() to hold off on a new rumble if one is already in progress.
*
 * The examples shown here are representative of how to invoke a gamepad rumble.
* It is assumed that these will be modified to suit the specific robot and team strategy needs.
*
* ######## Read the telemetry display on the Driver Station Screen for instructions. ######
*
* Ex 1) This example shows a) how to create a custom rumble effect, and then b) how to trigger it based
* on game time. One use for this might be to alert the driver that half-time or End-game is approaching.
*
* Ex 2) This example shows tying the rumble power to a changing sensor value.
* In this case it is the Gamepad trigger, but it could be any sensor output scaled to the 0-1 range.
* Since it takes over the rumble motors, it is only performed when the Left Bumper is pressed.
* Note that this approach MUST include a way to turn OFF the rumble when the button is released.
*
* Ex 3) This example shows a simple way to trigger a 3-blip sequence. In this case it is
* triggered by the gamepad A (Cross) button, but it could be any sensor, like a touch or light sensor.
* Note that this code ensures that it only rumbles once when the input goes true.
*
* Ex 4) This example shows how to trigger a single rumble when an input value gets over a certain value.
* In this case it is reading the Right Trigger, but it could be any variable sensor, like a
* range sensor, or position sensor. The code needs to ensure that it is only triggered once, so
* it waits till the sensor drops back below the threshold before it can trigger again.
*
* Use Android Studio to Copy this Class, and Paste it into your team's code folder with a new name.
* Remove or comment out the @Disabled line to add this OpMode to the Driver Station OpMode list.
*/
@Disabled
@TeleOp(name="Concept: Gamepad Rumble", group ="Concept")
public class ConceptGamepadRumble extends LinearOpMode
{
boolean lastA = false; // Use to track the prior button state.
boolean lastLB = false; // Use to track the prior button state.
boolean highLevel = false; // used to prevent multiple level-based rumbles.
boolean secondHalf = false; // Use to prevent multiple half-time warning rumbles.
Gamepad.RumbleEffect customRumbleEffect; // Use to build a custom rumble sequence.
ElapsedTime runtime = new ElapsedTime(); // Use to determine when end game is starting.
final double HALF_TIME = 60.0; // Wait this many seconds before rumble-alert for half-time.
final double TRIGGER_THRESHOLD = 0.75; // Squeeze more than 3/4 to get rumble.
@Override
public void runOpMode()
{
// Example 1. a) start by creating a three-pulse rumble sequence: right, LEFT, LEFT
customRumbleEffect = new Gamepad.RumbleEffect.Builder()
.addStep(0.0, 1.0, 500) // Rumble right motor 100% for 500 mSec
.addStep(0.0, 0.0, 300) // Pause for 300 mSec
.addStep(1.0, 0.0, 250) // Rumble left motor 100% for 250 mSec
.addStep(0.0, 0.0, 250) // Pause for 250 mSec
.addStep(1.0, 0.0, 250) // Rumble left motor 100% for 250 mSec
.build();
telemetry.addData(">", "Press Start");
telemetry.update();
waitForStart();
runtime.reset(); // Start game timer.
// Loop while monitoring buttons for rumble triggers
while (opModeIsActive())
{
// Read and save the current gamepad button states.
boolean currentA = gamepad1.a ;
boolean currentLB = gamepad1.left_bumper ;
// Display the current Rumble status. Just for interest.
telemetry.addData(">", "Are we RUMBLING? %s\n", gamepad1.isRumbling() ? "YES" : "no" );
// ----------------------------------------------------------------------------------------
// Example 1. b) Watch the runtime timer, and run the custom rumble when we hit half-time.
// Make sure we only signal once by setting "secondHalf" flag to prevent further rumbles.
// ----------------------------------------------------------------------------------------
if ((runtime.seconds() > HALF_TIME) && !secondHalf) {
gamepad1.runRumbleEffect(customRumbleEffect);
                secondHalf = true;
}
// Display the time remaining while we are still counting down.
if (!secondHalf) {
telemetry.addData(">", "Halftime Alert Countdown: %3.0f Sec \n", (HALF_TIME - runtime.seconds()) );
}
// ----------------------------------------------------------------------------------------
// Example 2. If Left Bumper is being pressed, power the rumble motors based on the two trigger depressions.
// This is useful to see how the rumble feels at various power levels.
// ----------------------------------------------------------------------------------------
if (currentLB) {
// Left Bumper is being pressed, so send left and right "trigger" values to left and right rumble motors.
gamepad1.rumble(gamepad1.left_trigger, gamepad1.right_trigger, Gamepad.RUMBLE_DURATION_CONTINUOUS);
// Show what is being sent to rumbles
telemetry.addData(">", "Squeeze triggers to control rumbles");
telemetry.addData("> : Rumble", "Left: %.0f%% Right: %.0f%%", gamepad1.left_trigger * 100, gamepad1.right_trigger * 100);
} else {
// Make sure rumble is turned off when Left Bumper is released (only one time each press)
if (lastLB) {
gamepad1.stopRumble();
}
// Prompt for manual rumble action
telemetry.addData(">", "Hold Left-Bumper to test Manual Rumble");
telemetry.addData(">", "Press A (Cross) for three blips");
telemetry.addData(">", "Squeeze right trigger slowly for 1 blip");
}
lastLB = currentLB; // remember the current button state for next time around the loop
// ----------------------------------------------------------------------------------------
// Example 3. Blip 3 times at the moment that A (Cross) is pressed. (look for pressed transition)
// BUT !!! Skip it altogether if the Gamepad is already rumbling.
// ----------------------------------------------------------------------------------------
if (currentA && !lastA) {
if (!gamepad1.isRumbling()) // Check for possible overlap of rumbles.
gamepad1.rumbleBlips(3);
}
lastA = currentA; // remember the current button state for next time around the loop
// ----------------------------------------------------------------------------------------
// Example 4. Rumble once when gamepad right trigger goes above the THRESHOLD.
// ----------------------------------------------------------------------------------------
if (gamepad1.right_trigger > TRIGGER_THRESHOLD) {
if (!highLevel) {
gamepad1.rumble(0.9, 0, 200); // 200 mSec burst on left motor.
highLevel = true; // Hold off any more triggers
}
} else {
highLevel = false; // We can trigger again now.
}
// Send the telemetry data to the Driver Station, and then pause to pace the program.
telemetry.update();
sleep(10);
}
}
}
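Point 3 in the list of coordination strategies above (rumble on timed events) can be as small as the sketch below, which would sit inside the main loop of a competition TeleOp. The 110-second threshold and the endgameWarned latch are illustrative assumptions; runtime is an ElapsedTime reset at PLAY, as in this sample.

    if ((runtime.seconds() > 110.0) && !endgameWarned) {
        gamepad1.rumbleBlips(2);    // two quick blips roughly 10 seconds before a 2-minute endgame
        endgameWarned = true;       // latch so the cue fires only once per match
    }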

View File

@ -0,0 +1,77 @@
package org.firstinspires.ftc.robotcontroller.external.samples;
import com.qualcomm.robotcore.eventloop.opmode.Disabled;
import com.qualcomm.robotcore.eventloop.opmode.LinearOpMode;
import com.qualcomm.robotcore.eventloop.opmode.TeleOp;
import org.firstinspires.ftc.robotcore.external.Telemetry;
/*
* This OpMode illustrates using the touchpad feature found on some gamepads.
*
* The Sony PS4 gamepad can detect two distinct touches on the central touchpad.
* Other gamepads with different touchpads may provide mixed results.
*
* The touchpads are accessed through the standard gamepad1 and gamepad2 objects.
* Several new members were added to the Gamepad class in FTC SDK Rev 7
*
* .touchpad_finger_1 returns true if at least one finger is detected.
* .touchpad_finger_1_x finger 1 X coordinate. Valid if touchpad_finger_1 is true
* .touchpad_finger_1_y finger 1 Y coordinate. Valid if touchpad_finger_1 is true
*
* .touchpad_finger_2 returns true if a second finger is detected
* .touchpad_finger_2_x finger 2 X coordinate. Valid if touchpad_finger_2 is true
* .touchpad_finger_2_y finger 2 Y coordinate. Valid if touchpad_finger_2 is true
*
 * Finger touches are reported with an X and Y coordinate in the following coordinate system.
*
* 1) X is the Horizontal axis, and Y is the vertical axis
* 2) The 0,0 origin is at the center of the touchpad.
* 3) 1.0, 1.0 is at the top right corner of the touchpad.
* 4) -1.0,-1.0 is at the bottom left corner of the touchpad.
*
* Use Android Studio to Copy this Class, and Paste it into your team's code folder with a new name.
* Remove or comment out the @Disabled line to add this OpMode to the Driver Station OpMode list.
*/
@Disabled
@TeleOp(name="Concept: Gamepad Touchpad", group ="Concept")
public class ConceptGamepadTouchpad extends LinearOpMode
{
@Override
public void runOpMode()
{
telemetry.setDisplayFormat(Telemetry.DisplayFormat.MONOSPACE);
telemetry.addData(">", "Press Start");
telemetry.update();
waitForStart();
while (opModeIsActive())
{
boolean finger = false;
// Display finger 1 x & y position if finger detected
if(gamepad1.touchpad_finger_1)
{
finger = true;
telemetry.addLine(String.format("Finger 1: x=%5.2f y=%5.2f\n", gamepad1.touchpad_finger_1_x, gamepad1.touchpad_finger_1_y));
}
// Display finger 2 x & y position if finger detected
if(gamepad1.touchpad_finger_2)
{
finger = true;
telemetry.addLine(String.format("Finger 2: x=%5.2f y=%5.2f\n", gamepad1.touchpad_finger_2_x, gamepad1.touchpad_finger_2_y));
}
if(!finger)
{
telemetry.addLine("No fingers");
}
telemetry.update();
sleep(10);
}
}
}
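One illustrative use of those coordinates, not part of the sample: map finger-1 X from the touchpad's -1..+1 range onto a servo's 0..1 range. It assumes a Servo obtained from the hardware map under the hypothetical name "pan_servo" and an import of com.qualcomm.robotcore.util.Range, and would sit inside the main while loop.

    if (gamepad1.touchpad_finger_1) {
        double position = (gamepad1.touchpad_finger_1_x + 1.0) / 2.0;   // -1..+1  ->  0..1
        panServo.setPosition(Range.clip(position, 0.0, 1.0));
    }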

View File

@ -0,0 +1,227 @@
/* Copyright (c) 2019 Phil Malone. All rights reserved.
*
* Redistribution and use in source and binary forms, with or without modification,
* are permitted (subject to the limitations in the disclaimer below) provided that
* the following conditions are met:
*
* Redistributions of source code must retain the above copyright notice, this list
* of conditions and the following disclaimer.
*
* Redistributions in binary form must reproduce the above copyright notice, this
* list of conditions and the following disclaimer in the documentation and/or
* other materials provided with the distribution.
*
* Neither the name of FIRST nor the names of its contributors may be used to endorse or
* promote products derived from this software without specific prior written permission.
*
* NO EXPRESS OR IMPLIED LICENSES TO ANY PARTY'S PATENT RIGHTS ARE GRANTED BY THIS
* LICENSE. THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
* "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO,
* THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE
* ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR CONTRIBUTORS BE LIABLE
* FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL
* DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR
* SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER
* CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY,
* OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
* OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
*/
package org.firstinspires.ftc.robotcontroller.external.samples;
import com.qualcomm.hardware.lynx.LynxModule;
import com.qualcomm.robotcore.eventloop.opmode.Disabled;
import com.qualcomm.robotcore.eventloop.opmode.LinearOpMode;
import com.qualcomm.robotcore.eventloop.opmode.TeleOp;
import com.qualcomm.robotcore.hardware.DcMotorEx;
import com.qualcomm.robotcore.util.ElapsedTime;
import java.util.List;
/*
* This OpMode illustrates how to use the Expansion Hub's Bulk-Read feature to speed up control cycle times.
* In this example there are 4 motors that need their encoder positions, and velocities read.
* The sample is written to work with one or two expansion hubs, with no assumption as to where the motors are located.
*
* Three scenarios are tested:
* Cache Mode = OFF This is the normal default, where no cache is used, and every read produces a discrete transaction with
 * an expansion hub, which is the slowest approach, but guarantees that the value is as fresh (recent) as possible.
*
* Cache Mode = AUTO This mode will attempt to minimize the number of discrete read commands, by performing bulk-reads
* and then returning values that have been cached. The cache is updated automatically whenever any specific encoder is re-read.
* This mode will always return new data, but it may perform more bulk-reads than absolutely required.
* Extra reads will be performed if multiple encoder/velocity reads are performed on the same encoder in one control cycle.
* This mode is a good compromise between the OFF and MANUAL modes.
* Note: If there are significant user-program delays between encoder reads, the cached value may not be fresh (recent).
 * You can issue a clearBulkCache() call at any time to force a fresh bulk-read on the next encoder read.
*
* Cache Mode = MANUAL This mode requires the user's code to determine the best time to clear the cached bulk-read data.
* Well organized code will reset the cache once at the beginning of the control cycle, and then immediately read and store all the encoder values.
* This approach will produce the shortest cycle times, but it does require the user to manually clear the cache.
* Since NO automatic Bulk-Reads are performed, neglecting to clear the bulk cache will result in the same values being returned
* each time an encoder read is performed.
*
* -------------------------------------
*
* General tip to speed up your control cycles:
*
* No matter what method you use to read encoders and other inputs, you should try to
* avoid reading the same encoder input multiple times around a control loop.
* Under normal conditions, this will slow down the control loop.
* The preferred method is to read all the required inputs ONCE at the beginning of the loop,
 * and save the values in variables that can be used by other parts of the control code.
*
* eg: if you are sending encoder positions to your telemetry display, putting a getCurrentPosition()
* call in the telemetry statement will force the code to go and get another copy which will take time.
 * It's much better to read the position into a variable once, and use that variable for control AND display.
* Reading saved variables takes no time at all.
*
* Once you put all your sensor reads at the beginning of the control cycle, it's very easy to use
* the bulk-read AUTO mode to streamline your cycle timing.
*/
@TeleOp (name = "Motor Bulk Reads", group = "Tests")
@Disabled
public class ConceptMotorBulkRead extends LinearOpMode {
final int TEST_CYCLES = 500; // Number of control cycles to run to determine cycle times.
private DcMotorEx m1, m2, m3, m4; // Motor Objects
private long e1, e2, e3, e4; // Encoder Values
private double v1, v2, v3, v4; // Velocities
// Cycle Times
double t1 = 0;
double t2 = 0;
double t3 = 0;
@Override
public void runOpMode() {
int cycles;
// Important Step 1: Make sure you use DcMotorEx when you instantiate your motors.
m1 = hardwareMap.get(DcMotorEx.class, "m1"); // Configure the robot to use these 4 motor names,
m2 = hardwareMap.get(DcMotorEx.class, "m2"); // or change these strings to match your existing Robot Configuration.
m3 = hardwareMap.get(DcMotorEx.class, "m3");
m4 = hardwareMap.get(DcMotorEx.class, "m4");
// Important Step 2: Get access to a list of Expansion Hub Modules to enable changing caching methods.
List<LynxModule> allHubs = hardwareMap.getAll(LynxModule.class);
ElapsedTime timer = new ElapsedTime();
telemetry.addData(">", "Press play to start tests");
telemetry.addData(">", "Test results will update for each access method.");
telemetry.update();
waitForStart();
// --------------------------------------------------------------------------------------
// Run control loop using legacy encoder reads
// In this mode, a single read is done for each encoder position, and a bulk read is done for each velocity read.
// This is the worst case scenario.
// This is the same as using LynxModule.BulkCachingMode.OFF
// --------------------------------------------------------------------------------------
displayCycleTimes("Test 1 of 3 (Wait for completion)");
timer.reset();
cycles = 0;
while (opModeIsActive() && (cycles++ < TEST_CYCLES)) {
e1 = m1.getCurrentPosition();
e2 = m2.getCurrentPosition();
e3 = m3.getCurrentPosition();
e4 = m4.getCurrentPosition();
v1 = m1.getVelocity();
v2 = m2.getVelocity();
v3 = m3.getVelocity();
v4 = m4.getVelocity();
// Put Control loop action code here.
}
// calculate the average cycle time.
t1 = timer.milliseconds() / cycles;
displayCycleTimes("Test 2 of 3 (Wait for completion)");
// --------------------------------------------------------------------------------------
// Run test cycles using AUTO cache mode
// In this mode, only one bulk read is done per cycle, UNLESS you read a specific encoder/velocity item AGAIN in that cycle.
// --------------------------------------------------------------------------------------
// Important Step 3: Option A. Set all Expansion hubs to use the AUTO Bulk Caching mode
for (LynxModule module : allHubs) {
module.setBulkCachingMode(LynxModule.BulkCachingMode.AUTO);
}
timer.reset();
cycles = 0;
while (opModeIsActive() && (cycles++ < TEST_CYCLES)) {
e1 = m1.getCurrentPosition(); // Uses 1 bulk-read for all 4 encoder/velocity reads,
e2 = m2.getCurrentPosition(); // but don't do any `get` operations more than once per cycle.
e3 = m3.getCurrentPosition();
e4 = m4.getCurrentPosition();
v1 = m1.getVelocity();
v2 = m2.getVelocity();
v3 = m3.getVelocity();
v4 = m4.getVelocity();
// Put Control loop action code here.
}
// calculate the average cycle time.
t2 = timer.milliseconds() / cycles;
displayCycleTimes("Test 3 of 3 (Wait for completion)");
// --------------------------------------------------------------------------------------
// Run test cycles using MANUAL cache mode
// In this mode, only one block read is done each control cycle.
// This is the MOST efficient method, but it does require that the cache is cleared manually each control cycle.
// --------------------------------------------------------------------------------------
// Important Step 3: Option B. Set all Expansion hubs to use the MANUAL Bulk Caching mode
for (LynxModule module : allHubs) {
module.setBulkCachingMode(LynxModule.BulkCachingMode.MANUAL);
}
timer.reset();
cycles = 0;
while (opModeIsActive() && (cycles++ < TEST_CYCLES)) {
// Important Step 4: If you are using MANUAL mode, you must clear the BulkCache once per control cycle
for (LynxModule module : allHubs) {
module.clearBulkCache();
}
e1 = m1.getCurrentPosition(); // Uses 1 bulk-read to obtain ALL the motor data
e2 = m2.getCurrentPosition(); // There is no penalty for doing more `get` operations in this cycle,
e3 = m3.getCurrentPosition(); // but they will return the same data.
e4 = m4.getCurrentPosition();
v1 = m1.getVelocity();
v2 = m2.getVelocity();
v3 = m3.getVelocity();
v4 = m4.getVelocity();
// Put Control loop action code here.
}
// calculate the average cycle time.
t3 = timer.milliseconds() / cycles;
displayCycleTimes("Complete");
// wait until op-mode is stopped by user, before clearing display.
while (opModeIsActive()) ;
}
// Display three comparison times.
void displayCycleTimes(String status) {
telemetry.addData("Testing", status);
telemetry.addData("Cache = OFF", "%5.1f mS/cycle", t1);
telemetry.addData("Cache = AUTO", "%5.1f mS/cycle", t2);
telemetry.addData("Cache = MANUAL", "%5.1f mS/cycle", t3);
telemetry.update();
}
}
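Condensed into the pattern most competition OpModes actually use, the MANUAL-mode advice above looks roughly like the sketch below. leftDrive and rightDrive stand in for whatever DcMotorEx devices your robot has; allHubs is obtained exactly as in the sample.

    for (LynxModule hub : allHubs) {
        hub.setBulkCachingMode(LynxModule.BulkCachingMode.MANUAL);
    }
    while (opModeIsActive()) {
        for (LynxModule hub : allHubs) {
            hub.clearBulkCache();                        // exactly one bulk read per control cycle
        }
        int leftPos  = leftDrive.getCurrentPosition();   // both reads are served from the fresh cache
        int rightPos = rightDrive.getCurrentPosition();
        // ... use leftPos/rightPos for control and telemetry; avoid re-reading them mid-cycle.
    }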

View File

@ -0,0 +1,89 @@
/* Copyright (c) 2017 FIRST. All rights reserved.
*
* Redistribution and use in source and binary forms, with or without modification,
* are permitted (subject to the limitations in the disclaimer below) provided that
* the following conditions are met:
*
* Redistributions of source code must retain the above copyright notice, this list
* of conditions and the following disclaimer.
*
* Redistributions in binary form must reproduce the above copyright notice, this
* list of conditions and the following disclaimer in the documentation and/or
* other materials provided with the distribution.
*
* Neither the name of FIRST nor the names of its contributors may be used to endorse or
* promote products derived from this software without specific prior written permission.
*
* NO EXPRESS OR IMPLIED LICENSES TO ANY PARTY'S PATENT RIGHTS ARE GRANTED BY THIS
* LICENSE. THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
* "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO,
* THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE
* ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR CONTRIBUTORS BE LIABLE
* FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL
* DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR
* SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER
* CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY,
* OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
* OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
*/
package org.firstinspires.ftc.robotcontroller.external.samples;
import com.qualcomm.robotcore.eventloop.opmode.Disabled;
import com.qualcomm.robotcore.eventloop.opmode.OpMode;
import com.qualcomm.robotcore.eventloop.opmode.TeleOp;
import com.qualcomm.robotcore.util.ElapsedTime;
/*
* Demonstrates an empty iterative OpMode
*/
@TeleOp(name = "Concept: NullOp", group = "Concept")
@Disabled
public class ConceptNullOp extends OpMode {
private ElapsedTime runtime = new ElapsedTime();
/**
* This method will be called once, when the INIT button is pressed.
*/
@Override
public void init() {
telemetry.addData("Status", "Initialized");
}
/**
* This method will be called repeatedly during the period between when
* the init button is pressed and when the play button is pressed (or the
* OpMode is stopped).
*/
@Override
public void init_loop() {
}
/**
* This method will be called once, when the play button is pressed.
*/
@Override
public void start() {
runtime.reset();
}
/**
* This method will be called repeatedly during the period between when
* the play button is pressed and when the OpMode is stopped.
*/
@Override
public void loop() {
telemetry.addData("Status", "Run Time: " + runtime.toString());
}
/**
* This method will be called once, when this OpMode is stopped.
* <p>
* Your ability to control hardware from this method will be limited.
*/
@Override
public void stop() {
}
}

View File

@ -0,0 +1,114 @@
/* Copyright (c) 2017 FIRST. All rights reserved.
*
* Redistribution and use in source and binary forms, with or without modification,
* are permitted (subject to the limitations in the disclaimer below) provided that
* the following conditions are met:
*
* Redistributions of source code must retain the above copyright notice, this list
* of conditions and the following disclaimer.
*
* Redistributions in binary form must reproduce the above copyright notice, this
* list of conditions and the following disclaimer in the documentation and/or
* other materials provided with the distribution.
*
* Neither the name of FIRST nor the names of its contributors may be used to endorse or
* promote products derived from this software without specific prior written permission.
*
* NO EXPRESS OR IMPLIED LICENSES TO ANY PARTY'S PATENT RIGHTS ARE GRANTED BY THIS
* LICENSE. THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
* "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO,
* THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE
* ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR CONTRIBUTORS BE LIABLE
* FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL
* DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR
* SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER
* CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY,
* OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
* OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
*/
package org.firstinspires.ftc.robotcontroller.external.samples;
import com.qualcomm.robotcore.eventloop.opmode.Disabled;
import com.qualcomm.robotcore.eventloop.opmode.LinearOpMode;
import com.qualcomm.robotcore.eventloop.opmode.TeleOp;
import com.qualcomm.robotcore.hardware.DcMotor;
/*
* This OpMode ramps a single motor speed up and down repeatedly until Stop is pressed.
* The code is structured as a LinearOpMode
*
* This code assumes a DC motor configured with the name "left_drive" as is found on a Robot.
*
* INCREMENT sets how much to increase/decrease the power each cycle
* CYCLE_MS sets the update period.
*
* Use Android Studio to Copy this Class, and Paste it into your team's code folder with a new name.
* Remove or comment out the @Disabled line to add this OpMode to the Driver Station OpMode list
*/
@TeleOp(name = "Concept: Ramp Motor Speed", group = "Concept")
@Disabled
public class ConceptRampMotorSpeed extends LinearOpMode {
static final double INCREMENT = 0.01; // amount to ramp motor each CYCLE_MS cycle
static final int CYCLE_MS = 50; // period of each cycle
static final double MAX_FWD = 1.0; // Maximum FWD power applied to motor
static final double MAX_REV = -1.0; // Maximum REV power applied to motor
// Define class members
DcMotor motor;
double power = 0;
boolean rampUp = true;
@Override
public void runOpMode() {
// Connect to motor (Assume standard left wheel)
// Change the text in quotes to match any motor name on your robot.
motor = hardwareMap.get(DcMotor.class, "left_drive");
// Wait for the start button
telemetry.addData(">", "Press Start to run Motors." );
telemetry.update();
waitForStart();
// Ramp motor speeds till stop pressed.
while(opModeIsActive()) {
// Ramp the motors, according to the rampUp variable.
if (rampUp) {
// Keep stepping up until we hit the max value.
power += INCREMENT ;
if (power >= MAX_FWD ) {
power = MAX_FWD;
rampUp = !rampUp; // Switch ramp direction
}
}
else {
// Keep stepping down until we hit the min value.
power -= INCREMENT ;
if (power <= MAX_REV ) {
power = MAX_REV;
rampUp = !rampUp; // Switch ramp direction
}
}
// Display the current value
telemetry.addData("Motor Power", "%5.2f", power);
telemetry.addData(">", "Press Stop to end test." );
telemetry.update();
// Set the motor to the new power and pause;
motor.setPower(power);
sleep(CYCLE_MS);
idle();
}
// Turn off motor and signal done;
motor.setPower(0);
telemetry.addData(">", "Done");
telemetry.update();
}
}
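A quick arithmetic check on the two constants above (illustrative only): at INCREMENT = 0.01 per CYCLE_MS = 50 ms cycle, the power moves from 0 to MAX_FWD in about 5 seconds, so one full forward-reverse-forward sweep takes roughly 20 seconds.

    double cyclesToFull  = MAX_FWD / INCREMENT;                  // 1.0 / 0.01 = 100 cycles
    double secondsToFull = cyclesToFull * CYCLE_MS / 1000.0;     // 100 * 50 ms = 5.0 s
    double fullSweep     = 4.0 * secondsToFull;                  // MAX_FWD -> MAX_REV -> MAX_FWD = ~20 s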

View File

@ -0,0 +1,111 @@
/* Copyright (c) 2018 FIRST. All rights reserved.
*
* Redistribution and use in source and binary forms, with or without modification,
* are permitted (subject to the limitations in the disclaimer below) provided that
* the following conditions are met:
*
* Redistributions of source code must retain the above copyright notice, this list
* of conditions and the following disclaimer.
*
* Redistributions in binary form must reproduce the above copyright notice, this
* list of conditions and the following disclaimer in the documentation and/or
* other materials provided with the distribution.
*
* Neither the name of FIRST nor the names of its contributors may be used to endorse or
* promote products derived from this software without specific prior written permission.
*
* NO EXPRESS OR IMPLIED LICENSES TO ANY PARTY'S PATENT RIGHTS ARE GRANTED BY THIS
* LICENSE. THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
* "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO,
* THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE
* ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR CONTRIBUTORS BE LIABLE
* FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL
* DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR
* SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER
* CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY,
* OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
* OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
*/
package org.firstinspires.ftc.robotcontroller.external.samples;
import com.qualcomm.robotcore.eventloop.opmode.Disabled;
import com.qualcomm.robotcore.eventloop.opmode.LinearOpMode;
import com.qualcomm.robotcore.eventloop.opmode.TeleOp;
import com.qualcomm.robotcore.hardware.DcMotorSimple;
import com.qualcomm.robotcore.util.ElapsedTime;
import com.qualcomm.robotcore.util.Range;
/*
* This OpMode executes a basic Tank Drive Teleop for a two wheeled robot using two REV SPARKminis.
* To use this example, connect two REV SPARKminis into servo ports on the Expansion Hub. On the
* robot configuration, use the drop down list under 'Servos' to select 'REV SPARKmini Controller'
* and name them 'left_drive' and 'right_drive'.
*
* Use Android Studio to Copy this Class, and Paste it into your team's code folder with a new name.
* Remove or comment out the @Disabled line to add this OpMode to the Driver Station OpMode list
*/
@TeleOp(name="REV SPARKmini Simple Drive Example", group="Concept")
@Disabled
public class ConceptRevSPARKMini extends LinearOpMode {
// Declare OpMode members.
private ElapsedTime runtime = new ElapsedTime();
private DcMotorSimple leftDrive = null;
private DcMotorSimple rightDrive = null;
@Override
public void runOpMode() {
telemetry.addData("Status", "Initialized");
telemetry.update();
// Initialize the hardware variables. Note that the strings used here as parameters
// to 'get' must correspond to the names assigned during the robot configuration
// step (using the FTC Robot Controller app on the phone).
leftDrive = hardwareMap.get(DcMotorSimple.class, "left_drive");
rightDrive = hardwareMap.get(DcMotorSimple.class, "right_drive");
// Most robots need the motor on one side to be reversed to drive forward
// Reverse the motor that runs backward when connected directly to the battery
leftDrive.setDirection(DcMotorSimple.Direction.FORWARD);
rightDrive.setDirection(DcMotorSimple.Direction.REVERSE);
// Wait for the game to start (driver presses PLAY)
waitForStart();
runtime.reset();
// run until the end of the match (driver presses STOP)
while (opModeIsActive()) {
// Setup a variable for each drive wheel to save power level for telemetry
double leftPower;
double rightPower;
// Choose to drive using either Tank Mode, or POV Mode
// Comment out the method that's not used. The default below is POV.
// POV Mode uses left stick to go forward, and right stick to turn.
// - This uses basic math to combine motions and is easier to drive straight.
double drive = -gamepad1.left_stick_y;
double turn = gamepad1.right_stick_x;
leftPower = Range.clip(drive + turn, -1.0, 1.0) ;
rightPower = Range.clip(drive - turn, -1.0, 1.0) ;
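// For example, drive = 0.5 with turn = 0.25 gives leftPower = 0.75 and rightPower = 0.25,
// so the robot arcs to the right while still moving forward.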
// Tank Mode uses one stick to control each wheel.
// - This requires no math, but it is hard to drive forward slowly and keep straight.
// leftPower = -gamepad1.left_stick_y ;
// rightPower = -gamepad1.right_stick_y ;
// Send calculated power to wheels
leftDrive.setPower(leftPower);
rightDrive.setPower(rightPower);
// Show the elapsed game time and wheel power.
telemetry.addData("Status", "Run Time: " + runtime.toString());
telemetry.addData("Motors", "left (%.2f), right (%.2f)", leftPower, rightPower);
telemetry.update();
}
}
}

View File

@ -0,0 +1,115 @@
/* Copyright (c) 2017 FIRST. All rights reserved.
*
* Redistribution and use in source and binary forms, with or without modification,
* are permitted (subject to the limitations in the disclaimer below) provided that
* the following conditions are met:
*
* Redistributions of source code must retain the above copyright notice, this list
* of conditions and the following disclaimer.
*
* Redistributions in binary form must reproduce the above copyright notice, this
* list of conditions and the following disclaimer in the documentation and/or
* other materials provided with the distribution.
*
* Neither the name of FIRST nor the names of its contributors may be used to endorse or
* promote products derived from this software without specific prior written permission.
*
* NO EXPRESS OR IMPLIED LICENSES TO ANY PARTY'S PATENT RIGHTS ARE GRANTED BY THIS
* LICENSE. THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
* "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO,
* THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE
* ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR CONTRIBUTORS BE LIABLE
* FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL
* DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR
* SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER
* CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY,
* OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
* OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
*/
package org.firstinspires.ftc.robotcontroller.external.samples;
import com.qualcomm.robotcore.eventloop.opmode.Disabled;
import com.qualcomm.robotcore.eventloop.opmode.LinearOpMode;
import com.qualcomm.robotcore.eventloop.opmode.TeleOp;
import com.qualcomm.robotcore.hardware.Servo;
/*
* This OpMode scans a single servo back and forward until Stop is pressed.
* The code is structured as a LinearOpMode
* INCREMENT sets how much to increase/decrease the servo position each cycle
* CYCLE_MS sets the update period.
*
* This code assumes a Servo configured with the name "left_hand" as is found on a Robot.
*
* NOTE: When any servo position is set, ALL attached servos are activated, so ensure that any other
* connected servos are able to move freely before running this test.
*
* Use Android Studio to Copy this Class, and Paste it into your team's code folder with a new name.
* Remove or comment out the @Disabled line to add this OpMode to the Driver Station OpMode list
*/
@TeleOp(name = "Concept: Scan Servo", group = "Concept")
@Disabled
public class ConceptScanServo extends LinearOpMode {
static final double INCREMENT = 0.01; // amount to slew servo each CYCLE_MS cycle
static final int CYCLE_MS = 50; // period of each cycle
static final double MAX_POS = 1.0; // Maximum rotational position
static final double MIN_POS = 0.0; // Minimum rotational position
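// With these values, a full sweep from MIN_POS to MAX_POS takes about
// ((MAX_POS - MIN_POS) / INCREMENT) * CYCLE_MS = 100 cycles * 50 ms = 5 seconds.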
// Define class members
Servo servo;
double position = (MAX_POS + MIN_POS) / 2; // Start at halfway position
boolean rampUp = true;
@Override
public void runOpMode() {
// Connect to servo (Assume Robot Left Hand)
// Change the text in quotes to match any servo name on your robot.
servo = hardwareMap.get(Servo.class, "left_hand");
// Wait for the start button
telemetry.addData(">", "Press Start to scan Servo." );
telemetry.update();
waitForStart();
// Scan servo till stop pressed.
while(opModeIsActive()){
// slew the servo, according to the rampUp (direction) variable.
if (rampUp) {
// Keep stepping up until we hit the max value.
position += INCREMENT ;
if (position >= MAX_POS ) {
position = MAX_POS;
rampUp = !rampUp; // Switch ramp direction
}
}
else {
// Keep stepping down until we hit the min value.
position -= INCREMENT ;
if (position <= MIN_POS ) {
position = MIN_POS;
rampUp = !rampUp; // Switch ramp direction
}
}
// Display the current value
telemetry.addData("Servo Position", "%5.2f", position);
telemetry.addData(">", "Press Stop to end test." );
telemetry.update();
// Set the servo to the new position and pause;
servo.setPosition(position);
sleep(CYCLE_MS);
idle();
}
// Signal done;
telemetry.addData(">", "Done");
telemetry.update();
}
}

View File

@ -0,0 +1,133 @@
/* Copyright (c) 2018 FIRST. All rights reserved.
*
* Redistribution and use in source and binary forms, with or without modification,
* are permitted (subject to the limitations in the disclaimer below) provided that
* the following conditions are met:
*
* Redistributions of source code must retain the above copyright notice, this list
* of conditions and the following disclaimer.
*
* Redistributions in binary form must reproduce the above copyright notice, this
* list of conditions and the following disclaimer in the documentation and/or
* other materials provided with the distribution.
*
* Neither the name of FIRST nor the names of its contributors may be used to endorse or
* promote products derived from this software without specific prior written permission.
*
* NO EXPRESS OR IMPLIED LICENSES TO ANY PARTY'S PATENT RIGHTS ARE GRANTED BY THIS
* LICENSE. THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
* "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO,
* THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE
* ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR CONTRIBUTORS BE LIABLE
* FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL
* DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR
* SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER
* CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY,
* OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
* OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
*/
package org.firstinspires.ftc.robotcontroller.external.samples;
import com.qualcomm.ftccommon.SoundPlayer;
import com.qualcomm.robotcore.eventloop.opmode.Disabled;
import com.qualcomm.robotcore.eventloop.opmode.LinearOpMode;
import com.qualcomm.robotcore.eventloop.opmode.TeleOp;
/*
* This OpMode demonstrates how to play simple sounds on both the RC and DS phones.
* It illustrates how to build sounds into your application as a resource.
* This technique is best suited for use with Android Studio since it assumes you will be creating a new application
*
* If you are using OnBotJava, please see the ConceptSoundsOnBotJava sample
*
* Use Android Studio to Copy this Class, and Paste it into your team's code folder with a new name.
* Remove or comment out the @Disabled line to add this OpMode to the Driver Station OpMode list
*
* Operation:
*
* Gamepad X & B buttons are used to trigger sounds in this example, but any event can be used.
* Note: Time should be allowed for sounds to complete before playing other sounds.
*
* For sound files to be used as a compiled-in resource, they need to be located in a folder called "raw" under your "res" (resources) folder.
* You can create your own "raw" folder from scratch, or you can copy the one from the FtcRobotController module.
*
* Android Studio coders will ultimately need a folder in your path as follows:
* <project root>/TeamCode/src/main/res/raw
*
* Copy any .wav files you want to play into this folder.
* Make sure that your files ONLY use lower-case characters, and have no spaces or special characters other than underscore.
*
* The name you give your .wav files will become the resource ID for these sounds.
* eg: gold.wav becomes R.raw.gold
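* For example, assuming gold.wav has been copied into TeamCode/src/main/res/raw (so that the TeamCode
* R class defines R.raw.gold), it could be played with: SoundPlayer.getInstance().startPlaying(hardwareMap.appContext, R.raw.gold);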
*
* If you wish to use the sounds provided for this sample, they are located in:
* <project root>/FtcRobotController/src/main/res/raw
* You can copy and paste the entire 'raw' folder using Android Studio.
*
*/
@TeleOp(name="Concept: Sound Resources", group="Concept")
@Disabled
public class ConceptSoundsASJava extends LinearOpMode {
// Declare OpMode members.
private boolean goldFound; // Sound file present flags
private boolean silverFound;
private boolean isX = false; // Gamepad button state variables
private boolean isB = false;
private boolean wasX = false; // Gamepad button history variables
private boolean wasB = false;
@Override
public void runOpMode() {
// Determine Resource IDs for sounds built into the RC application.
int silverSoundID = hardwareMap.appContext.getResources().getIdentifier("silver", "raw", hardwareMap.appContext.getPackageName());
int goldSoundID = hardwareMap.appContext.getResources().getIdentifier("gold", "raw", hardwareMap.appContext.getPackageName());
// Determine if sound resources are found.
// Note: Preloading is NOT required, but it's a good way to verify all your sounds are available before you run.
if (goldSoundID != 0)
goldFound = SoundPlayer.getInstance().preload(hardwareMap.appContext, goldSoundID);
if (silverSoundID != 0)
silverFound = SoundPlayer.getInstance().preload(hardwareMap.appContext, silverSoundID);
// Display sound status
telemetry.addData("gold resource", goldFound ? "Found" : "NOT found\n Add gold.wav to /src/main/res/raw" );
telemetry.addData("silver resource", silverFound ? "Found" : "Not found\n Add silver.wav to /src/main/res/raw" );
// Wait for the game to start (driver presses PLAY)
telemetry.addData(">", "Press Start to continue");
telemetry.update();
waitForStart();
telemetry.addData(">", "Press X, B to play sounds.");
telemetry.update();
// run until the end of the match (driver presses STOP)
while (opModeIsActive()) {
// say Silver each time gamepad X is pressed (This sound is a resource)
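// The test (isX = gamepad1.x) && !wasX saves the current button state and is only true on the
// cycle where X changes from released to pressed, so each press triggers the sound exactly once.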
if (silverFound && (isX = gamepad1.x) && !wasX) {
SoundPlayer.getInstance().startPlaying(hardwareMap.appContext, silverSoundID);
telemetry.addData("Playing", "Resource Silver");
telemetry.update();
}
// say Gold each time gamepad B is pressed (This sound is a resource)
if (goldFound && (isB = gamepad1.b) && !wasB) {
SoundPlayer.getInstance().startPlaying(hardwareMap.appContext, goldSoundID);
telemetry.addData("Playing", "Resource Gold");
telemetry.update();
}
// Save last button states
wasX = isX;
wasB = isB;
}
}
}

View File

@ -0,0 +1,120 @@
/* Copyright (c) 2018 FIRST. All rights reserved.
*
* Redistribution and use in source and binary forms, with or without modification,
* are permitted (subject to the limitations in the disclaimer below) provided that
* the following conditions are met:
*
* Redistributions of source code must retain the above copyright notice, this list
* of conditions and the following disclaimer.
*
* Redistributions in binary form must reproduce the above copyright notice, this
* list of conditions and the following disclaimer in the documentation and/or
* other materials provided with the distribution.
*
* Neither the name of FIRST nor the names of its contributors may be used to endorse or
* promote products derived from this software without specific prior written permission.
*
* NO EXPRESS OR IMPLIED LICENSES TO ANY PARTY'S PATENT RIGHTS ARE GRANTED BY THIS
* LICENSE. THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
* "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO,
* THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE
* ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR CONTRIBUTORS BE LIABLE
* FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL
* DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR
* SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER
* CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY,
* OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
* OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
*/
package org.firstinspires.ftc.robotcontroller.external.samples;
import com.qualcomm.ftccommon.SoundPlayer;
import com.qualcomm.robotcore.eventloop.opmode.Disabled;
import com.qualcomm.robotcore.eventloop.opmode.LinearOpMode;
import com.qualcomm.robotcore.eventloop.opmode.TeleOp;
import java.io.File;
/*
* This OpMode demonstrates how to play simple sounds on both the RC and DS phones.
* It illustrates how to play sound files that have been copied to the RC Phone
* This technique is best suited for use with OnBotJava since it does not require the app to be modified.
*
* Operation:
*
* Gamepad X & B buttons are used to trigger sounds in this example, but any event can be used.
* Note: Time should be allowed for sounds to complete before playing other sounds.
*
* To play a new sound, you will need to copy the .wav files to the phone, and then provide the full path to them as part of your OpMode.
* This is done in this sample for the two sound files: silver.wav and gold.wav.
*
* You can put the files in a variety of locations, but we recommend you put them in the /FIRST/blocks/sounds folder.
* Your OpModes will have guaranteed access to this folder, and you can transfer files into this folder using the BLOCKS web page.
* -- There is a link called "sounds" on the right hand side of the color bar on the BLOCKS page that sends sound files to this folder by default.
* Or you can use Windows File Manager, or ADB to transfer the sound files
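* (for example, from a desktop command prompt in the folder holding the files: adb push gold.wav /sdcard/FIRST/blocks/sounds/gold.wav )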
*
* To get full use of THIS sample, you will need to copy two sound files, called silver.wav and gold.wav, to /FIRST/blocks/sounds on the RC phone.
* They can be located here:
* https://github.com/ftctechnh/ftc_app/tree/master/FtcRobotController/src/main/res/raw/gold.wav
* https://github.com/ftctechnh/ftc_app/tree/master/FtcRobotController/src/main/res/raw/silver.wav
*/
@TeleOp(name="Concept: Sound Files", group="Concept")
@Disabled
public class ConceptSoundsOnBotJava extends LinearOpMode {
// Point to sound files on the phone's drive
private String soundPath = "/FIRST/blocks/sounds";
private File goldFile = new File("/sdcard" + soundPath + "/gold.wav");
private File silverFile = new File("/sdcard" + soundPath + "/silver.wav");
// Declare OpMode members.
private boolean isX = false; // Gamepad button state variables
private boolean isB = false;
private boolean wasX = false; // Gamepad button history variables
private boolean wasB = false;
@Override
public void runOpMode() {
// Make sure that the sound files exist on the phone
boolean goldFound = goldFile.exists();
boolean silverFound = silverFile.exists();
// Display sound status
telemetry.addData("gold sound", goldFound ? "Found" : "NOT Found \nCopy gold.wav to " + soundPath );
telemetry.addData("silver sound", silverFound ? "Found" : "NOT Found \nCopy silver.wav to " + soundPath );
// Wait for the game to start (driver presses PLAY)
telemetry.addData(">", "Press Start to continue");
telemetry.update();
waitForStart();
telemetry.addData(">", "Press X or B to play sounds.");
telemetry.update();
// run until the end of the match (driver presses STOP)
while (opModeIsActive()) {
// say Silver each time gamepad X is pressed (This sound is a resource)
if (silverFound && (isX = gamepad1.x) && !wasX) {
SoundPlayer.getInstance().startPlaying(hardwareMap.appContext, silverFile);
telemetry.addData("Playing", "Silver File");
telemetry.update();
}
// say Gold each time gamepad B is pressed (This sound is a resource)
if (goldFound && (isB = gamepad1.b) && !wasB) {
SoundPlayer.getInstance().startPlaying(hardwareMap.appContext, goldFile);
telemetry.addData("Playing", "Gold File");
telemetry.update();
}
// Save last button states
wasX = isX;
wasB = isB;
}
}
}

View File

@ -0,0 +1,122 @@
/* Copyright (c) 2018 FIRST. All rights reserved.
*
* Redistribution and use in source and binary forms, with or without modification,
* are permitted (subject to the limitations in the disclaimer below) provided that
* the following conditions are met:
*
* Redistributions of source code must retain the above copyright notice, this list
* of conditions and the following disclaimer.
*
* Redistributions in binary form must reproduce the above copyright notice, this
* list of conditions and the following disclaimer in the documentation and/or
* other materials provided with the distribution.
*
* Neither the name of FIRST nor the names of its contributors may be used to endorse or
* promote products derived from this software without specific prior written permission.
*
* NO EXPRESS OR IMPLIED LICENSES TO ANY PARTY'S PATENT RIGHTS ARE GRANTED BY THIS
* LICENSE. THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
* "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO,
* THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE
* ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR CONTRIBUTORS BE LIABLE
* FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL
* DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR
* SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER
* CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY,
* OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
* OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
*/
package org.firstinspires.ftc.robotcontroller.external.samples;
import android.content.Context;
import com.qualcomm.ftccommon.SoundPlayer;
import com.qualcomm.robotcore.eventloop.opmode.Disabled;
import com.qualcomm.robotcore.eventloop.opmode.LinearOpMode;
import com.qualcomm.robotcore.eventloop.opmode.TeleOp;
/*
* This OpMode demonstrates how to play one of the several SKYSTONE/Star Wars sounds loaded into the SDK.
* It does this by creating a simple "chooser" controlled by the gamepad DPAD Up/Down buttons.
* This code also prevents sounds from stacking up by setting a "playing" flag, which is cleared when the sound finishes playing.
*
* Use Android Studio to Copy this Class, and Paste it into your team's code folder with a new name.
* Remove or comment out the @Disabled line to add this OpMode to the Driver Station OpMode list
*
* Operation:
* Use the DPAD to change the selected sound, and the Right Bumper to play it.
*/
@TeleOp(name="SKYSTONE Sounds", group="Concept")
@Disabled
public class ConceptSoundsSKYSTONE extends LinearOpMode {
// List of available sound resources
String[] sounds = {"ss_alarm", "ss_bb8_down", "ss_bb8_up", "ss_darth_vader", "ss_fly_by",
"ss_mf_fail", "ss_laser", "ss_laser_burst", "ss_light_saber", "ss_light_saber_long", "ss_light_saber_short",
"ss_light_speed", "ss_mine", "ss_power_up", "ss_r2d2_up", "ss_roger_roger", "ss_siren", "ss_wookie" };
boolean soundPlaying = false;
@Override
public void runOpMode() {
// Variables for choosing from the available sounds
int soundIndex = 0;
int soundID = -1;
boolean was_dpad_up = false;
boolean was_dpad_down = false;
Context myApp = hardwareMap.appContext;
// create a sound parameter that holds the desired player parameters.
SoundPlayer.PlaySoundParams params = new SoundPlayer.PlaySoundParams();
params.loopControl = 0;
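// (a loopControl of 0 plays the sound once rather than looping it)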
params.waitForNonLoopingSoundsToFinish = true;
// In this sample, we will skip waiting for the user to press play, and start displaying sound choices right away
while (!isStopRequested()) {
// Look for DPAD presses to change the selection
if (gamepad1.dpad_down && !was_dpad_down) {
// Go to next sound (with list wrap) and display it
soundIndex = (soundIndex + 1) % sounds.length;
}
if (gamepad1.dpad_up && !was_dpad_up) {
// Go to previous sound (with list wrap) and display it
soundIndex = (soundIndex + sounds.length - 1) % sounds.length;
}
// Look for trigger to see if we should play sound
// Only start a new sound if we are currently not playing one.
if (gamepad1.right_bumper && !soundPlaying) {
// Determine Resource IDs for the sounds you want to play, and make sure it's valid.
if ((soundID = myApp.getResources().getIdentifier(sounds[soundIndex], "raw", myApp.getPackageName())) != 0){
// Signal that the sound is now playing.
soundPlaying = true;
// Start playing, and also Create a callback that will clear the playing flag when the sound is complete.
SoundPlayer.getInstance().startPlaying(myApp, soundID, params, null,
new Runnable() {
public void run() {
soundPlaying = false;
}} );
}
}
// Remember the last state of the dpad to detect changes.
was_dpad_up = gamepad1.dpad_up;
was_dpad_down = gamepad1.dpad_down;
// Display the current sound choice, and the playing status.
telemetry.addData("", "Use DPAD up/down to choose sound.");
telemetry.addData("", "Press Right Bumper to play sound.");
telemetry.addData("", "");
telemetry.addData("Sound >", sounds[soundIndex]);
telemetry.addData("Status >", soundPlaying ? "Playing" : "Stopped");
telemetry.update();
}
}
}

View File

@ -0,0 +1,177 @@
/* Copyright (c) 2017 FIRST. All rights reserved.
*
* Redistribution and use in source and binary forms, with or without modification,
* are permitted (subject to the limitations in the disclaimer below) provided that
* the following conditions are met:
*
* Redistributions of source code must retain the above copyright notice, this list
* of conditions and the following disclaimer.
*
* Redistributions in binary form must reproduce the above copyright notice, this
* list of conditions and the following disclaimer in the documentation and/or
* other materials provided with the distribution.
*
* Neither the name of FIRST nor the names of its contributors may be used to endorse or
* promote products derived from this software without specific prior written permission.
*
* NO EXPRESS OR IMPLIED LICENSES TO ANY PARTY'S PATENT RIGHTS ARE GRANTED BY THIS
* LICENSE. THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
* "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO,
* THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE
* ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR CONTRIBUTORS BE LIABLE
* FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL
* DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR
* SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER
* CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY,
* OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
* OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
*/
package org.firstinspires.ftc.robotcontroller.external.samples;
import com.qualcomm.robotcore.eventloop.opmode.Disabled;
import com.qualcomm.robotcore.eventloop.opmode.LinearOpMode;
import com.qualcomm.robotcore.eventloop.opmode.TeleOp;
import com.qualcomm.robotcore.hardware.VoltageSensor;
import com.qualcomm.robotcore.util.ElapsedTime;
import org.firstinspires.ftc.robotcore.external.Func;
import org.firstinspires.ftc.robotcore.external.Telemetry;
/*
* This OpMode illustrates various ways in which telemetry can be
* transmitted from the robot controller to the driver station. The sample illustrates
* numeric and text data, formatted output, and optimized evaluation of expensive-to-acquire
* information. The telemetry log is illustrated by scrolling a poem
* to the driver station.
*
* Also see the Telemetry javadocs.
*/
@TeleOp(name = "Concept: Telemetry", group = "Concept")
@Disabled
public class ConceptTelemetry extends LinearOpMode {
/** Keeps track of the line of the poem which is to be emitted next */
int poemLine = 0;
/** Keeps track of how long it's been since we last emitted a line of poetry */
ElapsedTime poemElapsed = new ElapsedTime();
static final String[] poem = new String[] {
"Mary had a little lamb,",
"His fleece was white as snow,",
"And everywhere that Mary went,",
"The lamb was sure to go.",
"",
"He followed her to school one day,",
"Which was against the rule,",
"It made the children laugh and play",
"To see a lamb at school.",
"",
"And so the teacher turned it out,",
"But still it lingered near,",
"And waited patiently about,",
"Till Mary did appear.",
"",
"\"Why does the lamb love Mary so?\"",
"The eager children cry.",
"\"Why, Mary loves the lamb, you know,\"",
"The teacher did reply.",
"",
""
};
@Override public void runOpMode() {
/* we keep track of how long it's been since the OpMode was started, just
* to have some interesting data to show */
ElapsedTime opmodeRunTime = new ElapsedTime();
// We show the log in oldest-to-newest order, as that's better for poetry
telemetry.log().setDisplayOrder(Telemetry.Log.DisplayOrder.OLDEST_FIRST);
// We can control the number of lines shown in the log
telemetry.log().setCapacity(6);
// The interval between lines of poetry, in seconds
double sPoemInterval = 0.6;
/*
* Wait until we've been given the ok to go. For something to do, we emit the
* elapsed time as we sit here and wait. If we didn't want to do anything while
* we waited, we would just call waitForStart().
*/
while (!isStarted()) {
telemetry.addData("time", "%.1f seconds", opmodeRunTime.seconds());
telemetry.update();
idle();
}
// Ok, we've been given the ok to go
/*
* As an illustration, the first line on our telemetry display will display the battery voltage.
* The idea here is that it's expensive to compute the voltage (at least for purposes of illustration)
* so you don't want to do it unless the data is _actually_ going to make it to the
* driver station (recall that telemetry transmission is throttled to reduce bandwidth use).
* Note that getBatteryVoltage() below returns 'Infinity' if there's no voltage sensor attached.
*
* @see Telemetry#getMsTransmissionInterval()
*/
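// The throttle interval itself can be read with telemetry.getMsTransmissionInterval() and, if
// needed, changed with telemetry.setMsTransmissionInterval(milliseconds); this sample keeps the default.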
telemetry.addData("voltage", "%.1f volts", new Func<Double>() {
@Override public Double value() {
return getBatteryVoltage();
}
});
// Reset to keep some timing stats for the post-'start' part of the OpMode
opmodeRunTime.reset();
int loopCount = 1;
// Go go gadget robot!
while (opModeIsActive()) {
// Emit poetry if it's been a while
if (poemElapsed.seconds() > sPoemInterval) {
emitPoemLine();
}
// As an illustration, show some loop timing information
telemetry.addData("loop count", loopCount);
telemetry.addData("ms/loop", "%.3f ms", opmodeRunTime.milliseconds() / loopCount);
// Show joystick information as some other illustrative data
telemetry.addLine("left joystick | ")
.addData("x", gamepad1.left_stick_x)
.addData("y", gamepad1.left_stick_y);
telemetry.addLine("right joystick | ")
.addData("x", gamepad1.right_stick_x)
.addData("y", gamepad1.right_stick_y);
/*
* Transmit the telemetry to the driver station, subject to throttling.
* See the documentation for Telemetry.getMsTransmissionInterval() for more information.
*/
telemetry.update();
// Update loop info
loopCount++;
}
}
// emits a line of poetry to the telemetry log
void emitPoemLine() {
telemetry.log().add(poem[poemLine]);
poemLine = (poemLine+1) % poem.length;
poemElapsed.reset();
}
// Computes the current battery voltage
double getBatteryVoltage() {
double result = Double.POSITIVE_INFINITY;
for (VoltageSensor sensor : hardwareMap.voltageSensor) {
double voltage = sensor.getVoltage();
if (voltage > 0) {
result = Math.min(result, voltage);
}
}
return result;
}
}

View File

@ -0,0 +1,199 @@
/* Copyright (c) 2019 FIRST. All rights reserved.
*
* Redistribution and use in source and binary forms, with or without modification,
* are permitted (subject to the limitations in the disclaimer below) provided that
* the following conditions are met:
*
* Redistributions of source code must retain the above copyright notice, this list
* of conditions and the following disclaimer.
*
* Redistributions in binary form must reproduce the above copyright notice, this
* list of conditions and the following disclaimer in the documentation and/or
* other materials provided with the distribution.
*
* Neither the name of FIRST nor the names of its contributors may be used to endorse or
* promote products derived from this software without specific prior written permission.
*
* NO EXPRESS OR IMPLIED LICENSES TO ANY PARTY'S PATENT RIGHTS ARE GRANTED BY THIS
* LICENSE. THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
* "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO,
* THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE
* ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR CONTRIBUTORS BE LIABLE
* FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL
* DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR
* SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER
* CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY,
* OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
* OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
*/
package org.firstinspires.ftc.robotcontroller.external.samples;
import android.util.Size;
import com.qualcomm.robotcore.eventloop.opmode.Disabled;
import com.qualcomm.robotcore.eventloop.opmode.LinearOpMode;
import com.qualcomm.robotcore.eventloop.opmode.TeleOp;
import org.firstinspires.ftc.robotcore.external.hardware.camera.BuiltinCameraDirection;
import org.firstinspires.ftc.robotcore.external.hardware.camera.WebcamName;
import org.firstinspires.ftc.robotcore.external.tfod.Recognition;
import org.firstinspires.ftc.vision.VisionPortal;
import org.firstinspires.ftc.vision.tfod.TfodProcessor;
import java.util.List;
/*
* This OpMode illustrates the basics of TensorFlow Object Detection,
* including Java Builder structures for specifying Vision parameters.
*
* Use Android Studio to Copy this Class, and Paste it into your team's code folder with a new name.
* Remove or comment out the @Disabled line to add this OpMode to the Driver Station OpMode list.
*/
@TeleOp(name = "Concept: TensorFlow Object Detection", group = "Concept")
@Disabled
public class ConceptTensorFlowObjectDetection extends LinearOpMode {
private static final boolean USE_WEBCAM = true; // true for webcam, false for phone camera
// TFOD_MODEL_ASSET points to a model file stored in the project Asset location,
// this is only used for Android Studio when using models in Assets.
private static final String TFOD_MODEL_ASSET = "MyModelStoredAsAsset.tflite";
// TFOD_MODEL_FILE points to a model file stored onboard the Robot Controller's storage,
// this is used when uploading models directly to the RC using the model upload interface.
private static final String TFOD_MODEL_FILE = "/sdcard/FIRST/tflitemodels/myCustomModel.tflite";
// Define the labels recognized in the model for TFOD (must be in training order!)
private static final String[] LABELS = {
"Pixel",
};
/**
* The variable to store our instance of the TensorFlow Object Detection processor.
*/
private TfodProcessor tfod;
/**
* The variable to store our instance of the vision portal.
*/
private VisionPortal visionPortal;
@Override
public void runOpMode() {
initTfod();
// Wait for the DS start button to be touched.
telemetry.addData("DS preview on/off", "3 dots, Camera Stream");
telemetry.addData(">", "Touch Play to start OpMode");
telemetry.update();
waitForStart();
if (opModeIsActive()) {
while (opModeIsActive()) {
telemetryTfod();
// Push telemetry to the Driver Station.
telemetry.update();
// Save CPU resources; can resume streaming when needed.
if (gamepad1.dpad_down) {
visionPortal.stopStreaming();
} else if (gamepad1.dpad_up) {
visionPortal.resumeStreaming();
}
// Share the CPU.
sleep(20);
}
}
// Save more CPU resources when camera is no longer needed.
visionPortal.close();
} // end runOpMode()
/**
* Initialize the TensorFlow Object Detection processor.
*/
private void initTfod() {
// Create the TensorFlow processor by using a builder.
tfod = new TfodProcessor.Builder()
// With the following lines commented out, the default TfodProcessor Builder
// will load the default model for the season. To define a custom model to load,
// choose one of the following:
// Use setModelAssetName() if the custom TF Model is built in as an asset (AS only).
// Use setModelFileName() if you have downloaded a custom team model to the Robot Controller.
//.setModelAssetName(TFOD_MODEL_ASSET)
//.setModelFileName(TFOD_MODEL_FILE)
// The following default settings are available to un-comment and edit as needed to
// set parameters for custom models.
//.setModelLabels(LABELS)
//.setIsModelTensorFlow2(true)
//.setIsModelQuantized(true)
//.setModelInputSize(300)
//.setModelAspectRatio(16.0 / 9.0)
.build();
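// As a sketch only: for a custom model, the commented-out calls above could be combined using the
// placeholder constants declared at the top of this class (the values must match your actual model):
// tfod = new TfodProcessor.Builder()
//     .setModelFileName(TFOD_MODEL_FILE)
//     .setModelLabels(LABELS)
//     .setIsModelTensorFlow2(true)
//     .setIsModelQuantized(true)
//     .setModelInputSize(300)
//     .setModelAspectRatio(16.0 / 9.0)
//     .build();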
// Create the vision portal by using a builder.
VisionPortal.Builder builder = new VisionPortal.Builder();
// Set the camera (webcam vs. built-in RC phone camera).
if (USE_WEBCAM) {
builder.setCamera(hardwareMap.get(WebcamName.class, "Webcam 1"));
} else {
builder.setCamera(BuiltinCameraDirection.BACK);
}
// Choose a camera resolution. Not all cameras support all resolutions.
//builder.setCameraResolution(new Size(640, 480));
// Enable the RC preview (LiveView). Set "false" to omit camera monitoring.
//builder.enableLiveView(true);
// Set the stream format; MJPEG uses less bandwidth than default YUY2.
//builder.setStreamFormat(VisionPortal.StreamFormat.YUY2);
// Choose whether or not LiveView stops if no processors are enabled.
// If set "true", monitor shows solid orange screen if no processors enabled.
// If set "false", monitor shows camera view without annotations.
//builder.setAutoStopLiveView(false);
// Set and enable the processor.
builder.addProcessor(tfod);
// Build the Vision Portal, using the above settings.
visionPortal = builder.build();
// Set confidence threshold for TFOD recognitions, at any time.
//tfod.setMinResultConfidence(0.75f);
// Disable or re-enable the TFOD processor at any time.
//visionPortal.setProcessorEnabled(tfod, true);
} // end method initTfod()
/**
* Add telemetry about TensorFlow Object Detection (TFOD) recognitions.
*/
private void telemetryTfod() {
List<Recognition> currentRecognitions = tfod.getRecognitions();
telemetry.addData("# Objects Detected", currentRecognitions.size());
// Step through the list of recognitions and display info for each one.
for (Recognition recognition : currentRecognitions) {
double x = (recognition.getLeft() + recognition.getRight()) / 2 ;
double y = (recognition.getTop() + recognition.getBottom()) / 2 ;
telemetry.addData(""," ");
telemetry.addData("Image", "%s (%.0f %% Conf.)", recognition.getLabel(), recognition.getConfidence() * 100);
telemetry.addData("- Position", "%.0f / %.0f", x, y);
telemetry.addData("- Size", "%.0f x %.0f", recognition.getWidth(), recognition.getHeight());
} // end for() loop
} // end method telemetryTfod()
} // end class

View File

@ -0,0 +1,142 @@
/* Copyright (c) 2019 FIRST. All rights reserved.
*
* Redistribution and use in source and binary forms, with or without modification,
* are permitted (subject to the limitations in the disclaimer below) provided that
* the following conditions are met:
*
* Redistributions of source code must retain the above copyright notice, this list
* of conditions and the following disclaimer.
*
* Redistributions in binary form must reproduce the above copyright notice, this
* list of conditions and the following disclaimer in the documentation and/or
* other materials provided with the distribution.
*
* Neither the name of FIRST nor the names of its contributors may be used to endorse or
* promote products derived from this software without specific prior written permission.
*
* NO EXPRESS OR IMPLIED LICENSES TO ANY PARTY'S PATENT RIGHTS ARE GRANTED BY THIS
* LICENSE. THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
* "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO,
* THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE
* ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR CONTRIBUTORS BE LIABLE
* FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL
* DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR
* SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER
* CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY,
* OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
* OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
*/
package org.firstinspires.ftc.robotcontroller.external.samples;
import com.qualcomm.robotcore.eventloop.opmode.Disabled;
import com.qualcomm.robotcore.eventloop.opmode.LinearOpMode;
import com.qualcomm.robotcore.eventloop.opmode.TeleOp;
import org.firstinspires.ftc.robotcore.external.hardware.camera.BuiltinCameraDirection;
import org.firstinspires.ftc.robotcore.external.hardware.camera.WebcamName;
import org.firstinspires.ftc.robotcore.external.tfod.Recognition;
import org.firstinspires.ftc.vision.VisionPortal;
import org.firstinspires.ftc.vision.tfod.TfodProcessor;
import java.util.List;
/*
* This OpMode illustrates the basics of TensorFlow Object Detection, using
* the easiest (default) processor and vision portal setup.
*
* Use Android Studio to Copy this Class, and Paste it into your team's code folder with a new name.
* Remove or comment out the @Disabled line to add this OpMode to the Driver Station OpMode list.
*/
@TeleOp(name = "Concept: TensorFlow Object Detection Easy", group = "Concept")
@Disabled
public class ConceptTensorFlowObjectDetectionEasy extends LinearOpMode {
private static final boolean USE_WEBCAM = true; // true for webcam, false for phone camera
/**
* The variable to store our instance of the TensorFlow Object Detection processor.
*/
private TfodProcessor tfod;
/**
* The variable to store our instance of the vision portal.
*/
private VisionPortal visionPortal;
@Override
public void runOpMode() {
initTfod();
// Wait for the DS start button to be touched.
telemetry.addData("DS preview on/off", "3 dots, Camera Stream");
telemetry.addData(">", "Touch Play to start OpMode");
telemetry.update();
waitForStart();
if (opModeIsActive()) {
while (opModeIsActive()) {
telemetryTfod();
// Push telemetry to the Driver Station.
telemetry.update();
// Save CPU resources; can resume streaming when needed.
if (gamepad1.dpad_down) {
visionPortal.stopStreaming();
} else if (gamepad1.dpad_up) {
visionPortal.resumeStreaming();
}
// Share the CPU.
sleep(20);
}
}
// Save more CPU resources when camera is no longer needed.
visionPortal.close();
} // end runOpMode()
/**
* Initialize the TensorFlow Object Detection processor.
*/
private void initTfod() {
// Create the TensorFlow processor the easy way.
tfod = TfodProcessor.easyCreateWithDefaults();
// Create the vision portal the easy way.
if (USE_WEBCAM) {
visionPortal = VisionPortal.easyCreateWithDefaults(
hardwareMap.get(WebcamName.class, "Webcam 1"), tfod);
} else {
visionPortal = VisionPortal.easyCreateWithDefaults(
BuiltinCameraDirection.BACK, tfod);
}
} // end method initTfod()
/**
* Add telemetry about TensorFlow Object Detection (TFOD) recognitions.
*/
private void telemetryTfod() {
List<Recognition> currentRecognitions = tfod.getRecognitions();
telemetry.addData("# Objects Detected", currentRecognitions.size());
// Step through the list of recognitions and display info for each one.
for (Recognition recognition : currentRecognitions) {
double x = (recognition.getLeft() + recognition.getRight()) / 2 ;
double y = (recognition.getTop() + recognition.getBottom()) / 2 ;
telemetry.addData(""," ");
telemetry.addData("Image", "%s (%.0f %% Conf.)", recognition.getLabel(), recognition.getConfidence() * 100);
telemetry.addData("- Position", "%.0f / %.0f", x, y);
telemetry.addData("- Size", "%.0f x %.0f", recognition.getWidth(), recognition.getHeight());
} // end for() loop
} // end method telemetryTfod()
} // end class

View File

@ -0,0 +1,186 @@
/* Copyright (c) 2020 FIRST. All rights reserved.
*
* Redistribution and use in source and binary forms, with or without modification,
* are permitted (subject to the limitations in the disclaimer below) provided that
* the following conditions are met:
*
* Redistributions of source code must retain the above copyright notice, this list
* of conditions and the following disclaimer.
*
* Redistributions in binary form must reproduce the above copyright notice, this
* list of conditions and the following disclaimer in the documentation and/or
* other materials provided with the distribution.
*
* Neither the name of FIRST nor the names of its contributors may be used to endorse or
* promote products derived from this software without specific prior written permission.
*
* NO EXPRESS OR IMPLIED LICENSES TO ANY PARTY'S PATENT RIGHTS ARE GRANTED BY THIS
* LICENSE. THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
* "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO,
* THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE
* ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR CONTRIBUTORS BE LIABLE
* FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL
* DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR
* SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER
* CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY,
* OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
* OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
*/
package org.firstinspires.ftc.robotcontroller.external.samples;
import com.qualcomm.robotcore.eventloop.opmode.Disabled;
import com.qualcomm.robotcore.eventloop.opmode.LinearOpMode;
import com.qualcomm.robotcore.eventloop.opmode.TeleOp;
import org.firstinspires.ftc.robotcore.external.ClassFactory;
import org.firstinspires.ftc.robotcore.external.hardware.camera.CameraName;
import org.firstinspires.ftc.robotcore.external.hardware.camera.WebcamName;
import org.firstinspires.ftc.robotcore.external.tfod.Recognition;
import org.firstinspires.ftc.vision.VisionPortal;
import org.firstinspires.ftc.vision.VisionPortal.CameraState;
import org.firstinspires.ftc.vision.tfod.TfodProcessor;
import java.util.List;
/*
* This OpMode illustrates the basics of TensorFlow Object Detection, using
* two webcams.
*
* Use Android Studio to Copy this Class, and Paste it into your team's code folder with a new name.
* Remove or comment out the @Disabled line to add this OpMode to the Driver Station OpMode list.
*/
@TeleOp(name = "Concept: TensorFlow Object Detection Switchable Cameras", group = "Concept")
@Disabled
public class ConceptTensorFlowObjectDetectionSwitchableCameras extends LinearOpMode {
/**
* Variables used for switching cameras.
*/
private WebcamName webcam1, webcam2;
private boolean oldLeftBumper;
private boolean oldRightBumper;
/**
* The variable to store our instance of the TensorFlow Object Detection processor.
*/
private TfodProcessor tfod;
/**
* The variable to store our instance of the vision portal.
*/
private VisionPortal visionPortal;
@Override
public void runOpMode() {
initTfod();
// Wait for the DS start button to be touched.
telemetry.addData("DS preview on/off", "3 dots, Camera Stream");
telemetry.addData(">", "Touch Play to start OpMode");
telemetry.update();
waitForStart();
if (opModeIsActive()) {
while (opModeIsActive()) {
telemetryCameraSwitching();
telemetryTfod();
// Push telemetry to the Driver Station.
telemetry.update();
// Save CPU resources; can resume streaming when needed.
if (gamepad1.dpad_down) {
visionPortal.stopStreaming();
} else if (gamepad1.dpad_up) {
visionPortal.resumeStreaming();
}
doCameraSwitching();
// Share the CPU.
sleep(20);
}
}
// Save more CPU resources when camera is no longer needed.
visionPortal.close();
} // end runOpMode()
/**
* Initialize the TensorFlow Object Detection processor.
*/
private void initTfod() {
// Create the TensorFlow processor by using a builder.
tfod = new TfodProcessor.Builder().build();
webcam1 = hardwareMap.get(WebcamName.class, "Webcam 1");
webcam2 = hardwareMap.get(WebcamName.class, "Webcam 2");
CameraName switchableCamera = ClassFactory.getInstance()
.getCameraManager().nameForSwitchableCamera(webcam1, webcam2);
// Create the vision portal by using a builder.
visionPortal = new VisionPortal.Builder()
.setCamera(switchableCamera)
.addProcessor(tfod)
.build();
} // end method initTfod()
/**
* Add telemetry about camera switching.
*/
private void telemetryCameraSwitching() {
if (visionPortal.getActiveCamera().equals(webcam1)) {
telemetry.addData("activeCamera", "Webcam 1");
telemetry.addData("Press RightBumper", "to switch to Webcam 2");
} else {
telemetry.addData("activeCamera", "Webcam 2");
telemetry.addData("Press LeftBumper", "to switch to Webcam 1");
}
} // end method telemetryCameraSwitching()
/**
* Add telemetry about TensorFlow Object Detection (TFOD) recognitions.
*/
private void telemetryTfod() {
List<Recognition> currentRecognitions = tfod.getRecognitions();
telemetry.addData("# Objects Detected", currentRecognitions.size());
// Step through the list of recognitions and display info for each one.
for (Recognition recognition : currentRecognitions) {
double x = (recognition.getLeft() + recognition.getRight()) / 2 ;
double y = (recognition.getTop() + recognition.getBottom()) / 2 ;
telemetry.addData(""," ");
telemetry.addData("Image", "%s (%.0f %% Conf.)", recognition.getLabel(), recognition.getConfidence() * 100);
telemetry.addData("- Position", "%.0f / %.0f", x, y);
telemetry.addData("- Size", "%.0f x %.0f", recognition.getWidth(), recognition.getHeight());
} // end for() loop
} // end method telemetryTfod()
/**
* Set the active camera according to input from the gamepad.
*/
private void doCameraSwitching() {
if (visionPortal.getCameraState() == CameraState.STREAMING) {
// If the left bumper is pressed, use Webcam 1.
// If the right bumper is pressed, use Webcam 2.
boolean newLeftBumper = gamepad1.left_bumper;
boolean newRightBumper = gamepad1.right_bumper;
if (newLeftBumper && !oldLeftBumper) {
visionPortal.setActiveCamera(webcam1);
} else if (newRightBumper && !oldRightBumper) {
visionPortal.setActiveCamera(webcam2);
}
oldLeftBumper = newLeftBumper;
oldRightBumper = newRightBumper;
}
} // end method doCameraSwitching()
} // end class

View File

@ -0,0 +1,187 @@
/* Copyright (c) 2017 FIRST. All rights reserved.
*
* Redistribution and use in source and binary forms, with or without modification,
* are permitted (subject to the limitations in the disclaimer below) provided that
* the following conditions are met:
*
* Redistributions of source code must retain the above copyright notice, this list
* of conditions and the following disclaimer.
*
* Redistributions in binary form must reproduce the above copyright notice, this
* list of conditions and the following disclaimer in the documentation and/or
* other materials provided with the distribution.
*
* Neither the name of FIRST nor the names of its contributors may be used to endorse or
* promote products derived from this software without specific prior written permission.
*
* NO EXPRESS OR IMPLIED LICENSES TO ANY PARTY'S PATENT RIGHTS ARE GRANTED BY THIS
* LICENSE. THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
* "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO,
* THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE
* ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR CONTRIBUTORS BE LIABLE
* FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL
* DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR
* SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER
* CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY,
* OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
* OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
*/
package org.firstinspires.ftc.robotcontroller.external.samples;
import com.qualcomm.robotcore.eventloop.opmode.Autonomous;
import com.qualcomm.robotcore.eventloop.opmode.Disabled;
import com.qualcomm.robotcore.eventloop.opmode.LinearOpMode;
import com.qualcomm.robotcore.hardware.DcMotor;
import com.qualcomm.robotcore.util.ElapsedTime;
/*
* This OpMode illustrates the concept of driving a path based on encoder counts.
* The code is structured as a LinearOpMode
*
* The code REQUIRES that you DO have encoders on the wheels,
* otherwise you would use RobotAutoDriveByTime.
*
* This code ALSO requires that the drive Motors have been configured such that a positive
* power command moves them forward, and causes the encoders to count UP.
*
* The desired path in this example is:
* - Drive forward for 48 inches
* - Spin right for 12 Inches
* - Drive Backward for 24 inches
* - Stop and close the claw.
*
* The code is written using a method called: encoderDrive(speed, leftInches, rightInches, timeoutS)
* that performs the actual movement.
* This method assumes that each movement is relative to the last stopping place.
* There are other ways to perform encoder based moves, but this method is probably the simplest.
* This code uses the RUN_TO_POSITION mode to enable the Motor controllers to generate the run profile
*
* Use Android Studio to Copy this Class, and Paste it into your team's code folder with a new name.
* Remove or comment out the @Disabled line to add this OpMode to the Driver Station OpMode list
*/
@Autonomous(name="Robot: Auto Drive By Encoder", group="Robot")
@Disabled
public class RobotAutoDriveByEncoder_Linear extends LinearOpMode {
/* Declare OpMode members. */
private DcMotor leftDrive = null;
private DcMotor rightDrive = null;
private ElapsedTime runtime = new ElapsedTime();
// Calculate the COUNTS_PER_INCH for your specific drive train.
// Go to your motor vendor website to determine your motor's COUNTS_PER_MOTOR_REV
// For external drive gearing, set DRIVE_GEAR_REDUCTION as needed.
// For example, use a value of 2.0 for a 12-tooth spur gear driving a 24-tooth spur gear.
// This is gearing DOWN for less speed and more torque.
// For gearing UP, use a gear ratio less than 1.0. Note this will affect the direction of wheel rotation.
static final double COUNTS_PER_MOTOR_REV = 1440 ; // eg: TETRIX Motor Encoder
static final double DRIVE_GEAR_REDUCTION = 1.0 ; // No External Gearing.
static final double WHEEL_DIAMETER_INCHES = 4.0 ; // For figuring circumference
static final double COUNTS_PER_INCH = (COUNTS_PER_MOTOR_REV * DRIVE_GEAR_REDUCTION) /
(WHEEL_DIAMETER_INCHES * 3.1415);
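// For example, with the values above: (1440 * 1.0) / (4.0 * 3.1415) = roughly 114.6 counts per inch of travel.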
static final double DRIVE_SPEED = 0.6;
static final double TURN_SPEED = 0.5;
@Override
public void runOpMode() {
// Initialize the drive system variables.
leftDrive = hardwareMap.get(DcMotor.class, "left_drive");
rightDrive = hardwareMap.get(DcMotor.class, "right_drive");
// To drive forward, most robots need the motor on one side to be reversed, because the axles point in opposite directions.
// When run, this OpMode should start both motors driving forward. So adjust these two lines based on your first test drive.
// Note: The settings here assume direct drive on left and right wheels. Gear Reduction or 90 Deg drives may require direction flips
leftDrive.setDirection(DcMotor.Direction.REVERSE);
rightDrive.setDirection(DcMotor.Direction.FORWARD);
leftDrive.setMode(DcMotor.RunMode.STOP_AND_RESET_ENCODER);
rightDrive.setMode(DcMotor.RunMode.STOP_AND_RESET_ENCODER);
leftDrive.setMode(DcMotor.RunMode.RUN_USING_ENCODER);
rightDrive.setMode(DcMotor.RunMode.RUN_USING_ENCODER);
// Send telemetry message to indicate successful Encoder reset
telemetry.addData("Starting at", "%7d :%7d",
leftDrive.getCurrentPosition(),
rightDrive.getCurrentPosition());
telemetry.update();
// Wait for the game to start (driver presses PLAY)
waitForStart();
// Step through each leg of the path,
// Note: Reverse movement is obtained by setting a negative distance (not speed)
encoderDrive(DRIVE_SPEED, 48, 48, 5.0); // S1: Forward 48 Inches with 5 Sec timeout
encoderDrive(TURN_SPEED, 12, -12, 4.0); // S2: Turn Right 12 Inches with 4 Sec timeout
encoderDrive(DRIVE_SPEED, -24, -24, 4.0); // S3: Reverse 24 Inches with 4 Sec timeout
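// (With the constants above, the 48 inch leg is roughly 48 * 114.6 = about 5,500 encoder counts on each wheel.)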
telemetry.addData("Path", "Complete");
telemetry.update();
sleep(1000); // pause to display final telemetry message.
}
/*
* Method to perform a relative move, based on encoder counts.
* Encoders are not reset as the move is based on the current position.
* Move will stop if any of three conditions occur:
* 1) Move gets to the desired position
* 2) Move runs out of time
* 3) Driver stops the OpMode running.
*/
public void encoderDrive(double speed,
double leftInches, double rightInches,
double timeoutS) {
int newLeftTarget;
int newRightTarget;
// Ensure that the OpMode is still active
if (opModeIsActive()) {
// Determine new target position, and pass to motor controller
newLeftTarget = leftDrive.getCurrentPosition() + (int)(leftInches * COUNTS_PER_INCH);
newRightTarget = rightDrive.getCurrentPosition() + (int)(rightInches * COUNTS_PER_INCH);
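// For example, with roughly 114.6 counts per inch (see the constants above), the 48 inch first leg
// adds about 5500 counts to each motor's current position.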
leftDrive.setTargetPosition(newLeftTarget);
rightDrive.setTargetPosition(newRightTarget);
// Turn On RUN_TO_POSITION
leftDrive.setMode(DcMotor.RunMode.RUN_TO_POSITION);
rightDrive.setMode(DcMotor.RunMode.RUN_TO_POSITION);
// reset the timeout time and start motion.
runtime.reset();
leftDrive.setPower(Math.abs(speed));
rightDrive.setPower(Math.abs(speed));
// keep looping while we are still active, and there is time left, and both motors are running.
// Note: We use (leftDrive.isBusy() && rightDrive.isBusy()) in the loop test, which means that when EITHER motor hits
// its target position, the motion will stop. This is "safer" because the robot will
// always end the motion as soon as possible.
// However, if you require that BOTH motors have finished their moves before the robot continues
// onto the next step, use (leftDrive.isBusy() || rightDrive.isBusy()) in the loop test.
while (opModeIsActive() &&
(runtime.seconds() < timeoutS) &&
(leftDrive.isBusy() && rightDrive.isBusy())) {
// Display it for the driver.
telemetry.addData("Running to", " %7d :%7d", newLeftTarget, newRightTarget);
telemetry.addData("Currently at", " at %7d :%7d",
leftDrive.getCurrentPosition(), rightDrive.getCurrentPosition());
telemetry.update();
}
// Stop all motion;
leftDrive.setPower(0);
rightDrive.setPower(0);
// Turn off RUN_TO_POSITION
leftDrive.setMode(DcMotor.RunMode.RUN_USING_ENCODER);
rightDrive.setMode(DcMotor.RunMode.RUN_USING_ENCODER);
sleep(250); // optional pause after each move.
}
}
}

View File

@ -0,0 +1,429 @@
/* Copyright (c) 2022 FIRST. All rights reserved.
*
* Redistribution and use in source and binary forms, with or without modification,
* are permitted (subject to the limitations in the disclaimer below) provided that
* the following conditions are met:
*
* Redistributions of source code must retain the above copyright notice, this list
* of conditions and the following disclaimer.
*
* Redistributions in binary form must reproduce the above copyright notice, this
* list of conditions and the following disclaimer in the documentation and/or
* other materials provided with the distribution.
*
* Neither the name of FIRST nor the names of its contributors may be used to endorse or
* promote products derived from this software without specific prior written permission.
*
* NO EXPRESS OR IMPLIED LICENSES TO ANY PARTY'S PATENT RIGHTS ARE GRANTED BY THIS
* LICENSE. THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
* "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO,
* THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE
* ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR CONTRIBUTORS BE LIABLE
* FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL
* DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR
* SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER
* CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY,
* OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
* OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
*/
package org.firstinspires.ftc.robotcontroller.external.samples;
import com.qualcomm.hardware.rev.RevHubOrientationOnRobot;
import com.qualcomm.robotcore.eventloop.opmode.Autonomous;
import com.qualcomm.robotcore.eventloop.opmode.Disabled;
import com.qualcomm.robotcore.eventloop.opmode.LinearOpMode;
import com.qualcomm.robotcore.hardware.DcMotor;
import com.qualcomm.robotcore.hardware.IMU;
import com.qualcomm.robotcore.util.ElapsedTime;
import com.qualcomm.robotcore.util.Range;
import org.firstinspires.ftc.robotcore.external.navigation.AngleUnit;
import org.firstinspires.ftc.robotcore.external.navigation.YawPitchRollAngles;
/*
* This OpMode illustrates the concept of driving an autonomous path based on Gyro (IMU) heading and encoder counts.
* The code is structured as a LinearOpMode
*
* The path to be followed by the robot is built from a series of drive, turn or pause steps.
* Each step on the path is defined by a single function call, and these can be strung together in any order.
*
* The code REQUIRES that you have encoders on the drive motors, otherwise you should use: RobotAutoDriveByTime;
*
* This code uses the Universal IMU interface, so it will work with either the BNO055 or the BHI260 IMU.
* To run as written, the Control/Expansion hub should be mounted horizontally on a flat part of the robot chassis.
* The REV Logo should be facing UP, and the USB port should be facing forward.
* If this is not the configuration of your REV Control Hub, then the code should be modified to reflect the correct orientation.
*
* This sample requires that the drive Motors have been configured with names : left_drive and right_drive.
* It also requires that a positive power command moves both motors forward, and causes the encoders to count UP.
* So please verify that both of your motors move the robot forward on the first move. If not, make the required correction.
* See the beginning of runOpMode() to set the FORWARD/REVERSE option for each motor.
*
* This code uses RUN_TO_POSITION mode for driving straight, and RUN_USING_ENCODER mode for turning and holding.
* Note: This code implements the requirement of calling setTargetPosition() at least once before switching to RUN_TO_POSITION mode.
*
* Notes:
*
* All angles are referenced to the coordinate-frame that is set whenever resetHeading() is called.
* In this sample, the heading is reset when the Start button is touched on the Driver station.
* Note: It would be possible to reset the heading after each move, but this would accumulate steering errors.
*
* The angle of movement/rotation is assumed to be a standardized rotation around the robot Z axis,
* which means that a Positive rotation is Counter Clockwise, looking down on the field.
* This is consistent with the FTC field coordinate conventions set out in the document:
* https://ftc-docs.firstinspires.org/field-coordinate-system
*
* Control Approach.
*
* To reach, or maintain a required heading, this code implements a basic Proportional Controller where:
*
* Steering power = Heading Error * Proportional Gain.
*
* "Heading Error" is calculated by taking the difference between the desired heading and the actual heading,
* and then "normalizing" it by converting it to a value in the +/- 180 degree range.
*
* "Proportional Gain" is a constant that YOU choose to set the "strength" of the steering response.
*
* Use Android Studio to Copy this Class, and Paste it into your "TeamCode" folder with a new name.
* Remove or comment out the @Disabled line to add this OpMode to the Driver Station OpMode list
*/
@Autonomous(name="Robot: Auto Drive By Gyro", group="Robot")
@Disabled
public class RobotAutoDriveByGyro_Linear extends LinearOpMode {
/* Declare OpMode members. */
private DcMotor leftDrive = null;
private DcMotor rightDrive = null;
private IMU imu = null; // Control/Expansion Hub IMU
private double headingError = 0;
// These variables are declared here (as class members) so they can be updated in various methods,
// but still be displayed by sendTelemetry()
private double targetHeading = 0;
private double driveSpeed = 0;
private double turnSpeed = 0;
private double leftSpeed = 0;
private double rightSpeed = 0;
private int leftTarget = 0;
private int rightTarget = 0;
// Calculate the COUNTS_PER_INCH for your specific drive train.
// Go to your motor vendor website to determine your motor's COUNTS_PER_MOTOR_REV
// For external drive gearing, set DRIVE_GEAR_REDUCTION as needed.
// For example, use a value of 2.0 for a 12-tooth spur gear driving a 24-tooth spur gear.
// This is gearing DOWN for less speed and more torque.
// For gearing UP, use a gear ratio less than 1.0. Note this will affect the direction of wheel rotation.
static final double COUNTS_PER_MOTOR_REV = 537.7 ; // eg: GoBILDA 312 RPM Yellow Jacket
static final double DRIVE_GEAR_REDUCTION = 1.0 ; // No External Gearing.
static final double WHEEL_DIAMETER_INCHES = 4.0 ; // For figuring circumference
static final double COUNTS_PER_INCH = (COUNTS_PER_MOTOR_REV * DRIVE_GEAR_REDUCTION) /
(WHEEL_DIAMETER_INCHES * 3.1415);
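// Worked example using the constants above: COUNTS_PER_INCH = (537.7 * 1.0) / (4.0 * 3.1415),
// which is approximately 42.8 encoder counts per inch of robot travel.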
// These constants define the desired driving/control characteristics
// They can/should be tweaked to suit the specific robot drive train.
static final double DRIVE_SPEED = 0.4; // Max driving speed for better distance accuracy.
static final double TURN_SPEED = 0.2; // Max Turn speed to limit turn rate
static final double HEADING_THRESHOLD = 1.0 ; // How close must the heading get to the target before moving to next step.
// Requiring more accuracy (a smaller number) will often make the turn take longer to get into the final position.
// Define the Proportional control coefficient (or GAIN) for "heading control".
// We define one value when Turning (larger errors), and the other is used when Driving straight (smaller errors).
// Increase these numbers if the heading does not correct strongly enough (eg: a heavy robot or using tracks)
// Decrease these numbers if the heading does not settle on the correct value (eg: very agile robot with omni wheels)
static final double P_TURN_GAIN = 0.02; // Larger is more responsive, but also less stable
static final double P_DRIVE_GAIN = 0.03; // Larger is more responsive, but also less stable
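// Worked example of the proportional control described above: a 30 degree heading error while turning gives
// 30 * P_TURN_GAIN = 0.6 of steering power (later clipped to the maxTurnSpeed passed in), while a 5 degree
// error while driving straight gives 5 * P_DRIVE_GAIN = 0.15 of steering correction.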
@Override
public void runOpMode() {
// Initialize the drive system variables.
leftDrive = hardwareMap.get(DcMotor.class, "left_drive");
rightDrive = hardwareMap.get(DcMotor.class, "right_drive");
// To drive forward, most robots need the motor on one side to be reversed, because the axles point in opposite directions.
// When run, this OpMode should start both motors driving forward. So adjust these two lines based on your first test drive.
// Note: The settings here assume direct drive on left and right wheels. Gear Reduction or 90 Deg drives may require direction flips
leftDrive.setDirection(DcMotor.Direction.REVERSE);
rightDrive.setDirection(DcMotor.Direction.FORWARD);
/* The next two lines define Hub orientation.
* The Default Orientation (shown) is when a hub is mounted horizontally with the printed logo pointing UP and the USB port pointing FORWARD.
*
* To Do: EDIT these two lines to match YOUR mounting configuration.
*/
RevHubOrientationOnRobot.LogoFacingDirection logoDirection = RevHubOrientationOnRobot.LogoFacingDirection.UP;
RevHubOrientationOnRobot.UsbFacingDirection usbDirection = RevHubOrientationOnRobot.UsbFacingDirection.FORWARD;
RevHubOrientationOnRobot orientationOnRobot = new RevHubOrientationOnRobot(logoDirection, usbDirection);
// Now initialize the IMU with this mounting orientation
// This sample expects the IMU to be in a REV Hub and named "imu".
imu = hardwareMap.get(IMU.class, "imu");
imu.initialize(new IMU.Parameters(orientationOnRobot));
// Ensure the robot is stationary. Reset the encoders and set the motors to BRAKE mode
leftDrive.setMode(DcMotor.RunMode.STOP_AND_RESET_ENCODER);
rightDrive.setMode(DcMotor.RunMode.STOP_AND_RESET_ENCODER);
leftDrive.setZeroPowerBehavior(DcMotor.ZeroPowerBehavior.BRAKE);
rightDrive.setZeroPowerBehavior(DcMotor.ZeroPowerBehavior.BRAKE);
// Wait for the game to start (Display Gyro value while waiting)
while (opModeInInit()) {
telemetry.addData(">", "Robot Heading = %4.0f", getHeading());
telemetry.update();
}
// Set the encoders for closed loop speed control, and reset the heading.
leftDrive.setMode(DcMotor.RunMode.RUN_USING_ENCODER);
rightDrive.setMode(DcMotor.RunMode.RUN_USING_ENCODER);
imu.resetYaw();
// Step through each leg of the path,
// Notes: Reverse movement is obtained by setting a negative distance (not speed)
// holdHeading() is used after turns to let the heading stabilize
// Add a sleep(2000) after any step to keep the telemetry data visible for review
driveStraight(DRIVE_SPEED, 24.0, 0.0); // Drive Forward 24"
turnToHeading( TURN_SPEED, -45.0); // Turn CW to -45 Degrees
holdHeading( TURN_SPEED, -45.0, 0.5); // Hold -45 Deg heading for a 1/2 second
driveStraight(DRIVE_SPEED, 17.0, -45.0); // Drive Forward 17" at -45 degrees (12"x and 12"y)
turnToHeading( TURN_SPEED, 45.0); // Turn CCW to 45 Degrees
holdHeading( TURN_SPEED, 45.0, 0.5); // Hold 45 Deg heading for a 1/2 second
driveStraight(DRIVE_SPEED, 17.0, 45.0); // Drive Forward 17" at 45 degrees (-12"x and 12"y)
turnToHeading( TURN_SPEED, 0.0); // Turn CW to 0 Degrees
holdHeading( TURN_SPEED, 0.0, 1.0); // Hold 0 Deg heading for 1 second
driveStraight(DRIVE_SPEED,-48.0, 0.0);    // Drive in Reverse 48" (should return to approx. starting position)
telemetry.addData("Path", "Complete");
telemetry.update();
sleep(1000); // Pause to display last telemetry message.
}
/*
* ====================================================================================================
* Driving "Helper" functions are below this line.
* These provide the high and low level methods that handle driving straight and turning.
* ====================================================================================================
*/
// ********** HIGH Level driving functions. ********************
/**
* Drive in a straight line, on a fixed compass heading (angle), based on encoder counts.
* Move will stop if either of these conditions occur:
* 1) Move gets to the desired position
* 2) Driver stops the OpMode running.
*
* @param maxDriveSpeed MAX Speed for forward/rev motion (range 0 to +1.0) .
* @param distance Distance (in inches) to move from current position. Negative distance means move backward.
* @param heading Absolute Heading Angle (in Degrees) relative to last gyro reset.
* 0 = fwd. +ve is CCW from fwd. -ve is CW from forward.
* If a relative angle is required, add/subtract from the current robotHeading.
*/
public void driveStraight(double maxDriveSpeed,
double distance,
double heading) {
// Ensure that the OpMode is still active
if (opModeIsActive()) {
// Determine new target position, and pass to motor controller
int moveCounts = (int)(distance * COUNTS_PER_INCH);
leftTarget = leftDrive.getCurrentPosition() + moveCounts;
rightTarget = rightDrive.getCurrentPosition() + moveCounts;
// Set Target FIRST, then turn on RUN_TO_POSITION
leftDrive.setTargetPosition(leftTarget);
rightDrive.setTargetPosition(rightTarget);
leftDrive.setMode(DcMotor.RunMode.RUN_TO_POSITION);
rightDrive.setMode(DcMotor.RunMode.RUN_TO_POSITION);
// Set the required driving speed (must be positive for RUN_TO_POSITION)
// Start driving straight, and then enter the control loop
maxDriveSpeed = Math.abs(maxDriveSpeed);
moveRobot(maxDriveSpeed, 0);
// keep looping while we are still active, and BOTH motors are running.
while (opModeIsActive() &&
(leftDrive.isBusy() && rightDrive.isBusy())) {
// Determine required steering to keep on heading
turnSpeed = getSteeringCorrection(heading, P_DRIVE_GAIN);
// if driving in reverse, the motor correction also needs to be reversed
if (distance < 0)
turnSpeed *= -1.0;
// Apply the turning correction to the current driving speed.
moveRobot(driveSpeed, turnSpeed);
// Display drive status for the driver.
sendTelemetry(true);
}
// Stop all motion & Turn off RUN_TO_POSITION
moveRobot(0, 0);
leftDrive.setMode(DcMotor.RunMode.RUN_USING_ENCODER);
rightDrive.setMode(DcMotor.RunMode.RUN_USING_ENCODER);
}
}
/**
* Spin on the central axis to point in a new direction.
* <p>
* Move will stop if either of these conditions occur:
* <p>
* 1) Move gets to the heading (angle)
* <p>
* 2) Driver stops the OpMode running.
*
* @param maxTurnSpeed Desired MAX speed of turn. (range 0 to +1.0)
* @param heading Absolute Heading Angle (in Degrees) relative to last gyro reset.
* 0 = fwd. +ve is CCW from fwd. -ve is CW from forward.
* If a relative angle is required, add/subtract from current heading.
*/
public void turnToHeading(double maxTurnSpeed, double heading) {
// Run getSteeringCorrection() once to pre-calculate the current error
getSteeringCorrection(heading, P_DRIVE_GAIN);
// keep looping while we are still active, and not on heading.
while (opModeIsActive() && (Math.abs(headingError) > HEADING_THRESHOLD)) {
// Determine required steering to keep on heading
turnSpeed = getSteeringCorrection(heading, P_TURN_GAIN);
// Clip the speed to the maximum permitted value.
turnSpeed = Range.clip(turnSpeed, -maxTurnSpeed, maxTurnSpeed);
// Pivot in place by applying the turning correction
moveRobot(0, turnSpeed);
// Display drive status for the driver.
sendTelemetry(false);
}
// Stop all motion;
moveRobot(0, 0);
}
/**
* Obtain & hold a heading for a finite amount of time
* <p>
* Move will stop once the requested time has elapsed
* <p>
* This function is useful for giving the robot a moment to stabilize its heading between movements.
*
* @param maxTurnSpeed Maximum differential turn speed (range 0 to +1.0)
* @param heading Absolute Heading Angle (in Degrees) relative to last gyro reset.
* 0 = fwd. +ve is CCW from fwd. -ve is CW from forward.
* If a relative angle is required, add/subtract from current heading.
* @param holdTime Length of time (in seconds) to hold the specified heading.
*/
public void holdHeading(double maxTurnSpeed, double heading, double holdTime) {
ElapsedTime holdTimer = new ElapsedTime();
holdTimer.reset();
// keep looping while we have time remaining.
while (opModeIsActive() && (holdTimer.time() < holdTime)) {
// Determine required steering to keep on heading
turnSpeed = getSteeringCorrection(heading, P_TURN_GAIN);
// Clip the speed to the maximum permitted value.
turnSpeed = Range.clip(turnSpeed, -maxTurnSpeed, maxTurnSpeed);
// Pivot in place by applying the turning correction
moveRobot(0, turnSpeed);
// Display drive status for the driver.
sendTelemetry(false);
}
// Stop all motion;
moveRobot(0, 0);
}
// ********** LOW Level driving functions. ********************
/**
* Use a Proportional Controller to determine how much steering correction is required.
*
* @param desiredHeading The desired absolute heading (relative to last heading reset)
* @param proportionalGain Gain factor applied to heading error to obtain turning power.
* @return Turning power needed to get to required heading.
*/
public double getSteeringCorrection(double desiredHeading, double proportionalGain) {
targetHeading = desiredHeading; // Save for telemetry
// Determine the current heading error
headingError = targetHeading - getHeading();
// Normalize the error to be within +/- 180 degrees
while (headingError > 180) headingError -= 360;
while (headingError <= -180) headingError += 360;
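// Example: a target of 170 degrees with a current heading of -170 degrees gives a raw error of 340 degrees,
// which the loops above normalize to -20 degrees (a short clockwise correction rather than a long CCW one).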
// Multiply the error by the gain to determine the required steering correction.  Limit the result to +/- 1.0
return Range.clip(headingError * proportionalGain, -1, 1);
}
/**
* Take separate drive (fwd/rev) and turn (right/left) requests,
* combines them, and applies the appropriate speed commands to the left and right wheel motors.
* @param drive forward motor speed
* @param turn clockwise turning motor speed.
*/
public void moveRobot(double drive, double turn) {
driveSpeed = drive; // save this value as a class member so it can be used by telemetry.
turnSpeed = turn; // save this value as a class member so it can be used by telemetry.
leftSpeed = drive - turn;
rightSpeed = drive + turn;
// Scale speeds down if either one exceeds +/- 1.0;
double max = Math.max(Math.abs(leftSpeed), Math.abs(rightSpeed));
if (max > 1.0)
{
leftSpeed /= max;
rightSpeed /= max;
}
leftDrive.setPower(leftSpeed);
rightDrive.setPower(rightSpeed);
}
/**
* Display the various control parameters while driving
*
* @param straight Set to true if we are driving straight, and the encoder positions should be included in the telemetry.
*/
private void sendTelemetry(boolean straight) {
if (straight) {
telemetry.addData("Motion", "Drive Straight");
telemetry.addData("Target Pos L:R", "%7d:%7d", leftTarget, rightTarget);
telemetry.addData("Actual Pos L:R", "%7d:%7d", leftDrive.getCurrentPosition(),
rightDrive.getCurrentPosition());
} else {
telemetry.addData("Motion", "Turning");
}
telemetry.addData("Heading- Target : Current", "%5.2f : %5.0f", targetHeading, getHeading());
telemetry.addData("Error : Steer Pwr", "%5.1f : %5.1f", headingError, turnSpeed);
telemetry.addData("Wheel Speeds L : R", "%5.2f : %5.2f", leftSpeed, rightSpeed);
telemetry.update();
}
/**
* read the Robot heading directly from the IMU (in degrees)
*/
public double getHeading() {
YawPitchRollAngles orientation = imu.getRobotYawPitchRollAngles();
return orientation.getYaw(AngleUnit.DEGREES);
}
}

View File

@ -0,0 +1,128 @@
/* Copyright (c) 2017 FIRST. All rights reserved.
*
* Redistribution and use in source and binary forms, with or without modification,
* are permitted (subject to the limitations in the disclaimer below) provided that
* the following conditions are met:
*
* Redistributions of source code must retain the above copyright notice, this list
* of conditions and the following disclaimer.
*
* Redistributions in binary form must reproduce the above copyright notice, this
* list of conditions and the following disclaimer in the documentation and/or
* other materials provided with the distribution.
*
* Neither the name of FIRST nor the names of its contributors may be used to endorse or
* promote products derived from this software without specific prior written permission.
*
* NO EXPRESS OR IMPLIED LICENSES TO ANY PARTY'S PATENT RIGHTS ARE GRANTED BY THIS
* LICENSE. THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
* "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO,
* THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE
* ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR CONTRIBUTORS BE LIABLE
* FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL
* DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR
* SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER
* CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY,
* OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
* OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
*/
package org.firstinspires.ftc.robotcontroller.external.samples;
import com.qualcomm.robotcore.eventloop.opmode.Autonomous;
import com.qualcomm.robotcore.eventloop.opmode.Disabled;
import com.qualcomm.robotcore.eventloop.opmode.LinearOpMode;
import com.qualcomm.robotcore.hardware.DcMotor;
import com.qualcomm.robotcore.util.ElapsedTime;
/*
* This OpMode illustrates the concept of driving a path based on time.
* The code is structured as a LinearOpMode
*
* The code assumes that you do NOT have encoders on the wheels,
* otherwise you would use: RobotAutoDriveByEncoder;
*
* The desired path in this example is:
* - Drive forward for 3 seconds
* - Spin right for 1.3 seconds
* - Drive Backward for 1 Second
*
* The code is written in a simple form with no optimizations.
* However, there are several ways that this type of sequence could be streamlined.
*
* Use Android Studio to Copy this Class, and Paste it into your team's code folder with a new name.
* Remove or comment out the @Disabled line to add this OpMode to the Driver Station OpMode list
*/
@Autonomous(name="Robot: Auto Drive By Time", group="Robot")
@Disabled
public class RobotAutoDriveByTime_Linear extends LinearOpMode {
/* Declare OpMode members. */
private DcMotor leftDrive = null;
private DcMotor rightDrive = null;
private ElapsedTime runtime = new ElapsedTime();
static final double FORWARD_SPEED = 0.6;
static final double TURN_SPEED = 0.5;
@Override
public void runOpMode() {
// Initialize the drive system variables.
leftDrive = hardwareMap.get(DcMotor.class, "left_drive");
rightDrive = hardwareMap.get(DcMotor.class, "right_drive");
// To drive forward, most robots need the motor on one side to be reversed, because the axles point in opposite directions.
// When run, this OpMode should start both motors driving forward. So adjust these two lines based on your first test drive.
// Note: The settings here assume direct drive on left and right wheels. Gear Reduction or 90 Deg drives may require direction flips
leftDrive.setDirection(DcMotor.Direction.REVERSE);
rightDrive.setDirection(DcMotor.Direction.FORWARD);
// Send telemetry message to signify robot waiting;
telemetry.addData("Status", "Ready to run"); //
telemetry.update();
// Wait for the game to start (driver presses PLAY)
waitForStart();
// Step through each leg of the path, ensuring that the Auto mode has not been stopped along the way
// Step 1: Drive forward for 3 seconds
leftDrive.setPower(FORWARD_SPEED);
rightDrive.setPower(FORWARD_SPEED);
runtime.reset();
while (opModeIsActive() && (runtime.seconds() < 3.0)) {
telemetry.addData("Path", "Leg 1: %4.1f S Elapsed", runtime.seconds());
telemetry.update();
}
// Step 2: Spin right for 1.3 seconds
leftDrive.setPower(TURN_SPEED);
rightDrive.setPower(-TURN_SPEED);
runtime.reset();
while (opModeIsActive() && (runtime.seconds() < 1.3)) {
telemetry.addData("Path", "Leg 2: %4.1f S Elapsed", runtime.seconds());
telemetry.update();
}
// Step 3: Drive Backward for 1 Second
leftDrive.setPower(-FORWARD_SPEED);
rightDrive.setPower(-FORWARD_SPEED);
runtime.reset();
while (opModeIsActive() && (runtime.seconds() < 1.0)) {
telemetry.addData("Path", "Leg 3: %4.1f S Elapsed", runtime.seconds());
telemetry.update();
}
// Step 4: Stop
leftDrive.setPower(0);
rightDrive.setPower(0);
telemetry.addData("Path", "Complete");
telemetry.update();
sleep(1000);
}
}

View File

@ -0,0 +1,321 @@
/* Copyright (c) 2023 FIRST. All rights reserved.
*
* Redistribution and use in source and binary forms, with or without modification,
* are permitted (subject to the limitations in the disclaimer below) provided that
* the following conditions are met:
*
* Redistributions of source code must retain the above copyright notice, this list
* of conditions and the following disclaimer.
*
* Redistributions in binary form must reproduce the above copyright notice, this
* list of conditions and the following disclaimer in the documentation and/or
* other materials provided with the distribution.
*
* Neither the name of FIRST nor the names of its contributors may be used to endorse or
* promote products derived from this software without specific prior written permission.
*
* NO EXPRESS OR IMPLIED LICENSES TO ANY PARTY'S PATENT RIGHTS ARE GRANTED BY THIS
* LICENSE. THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
* "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO,
* THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE
* ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR CONTRIBUTORS BE LIABLE
* FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL
* DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR
* SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER
* CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY,
* OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
* OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
*/
package org.firstinspires.ftc.robotcontroller.external.samples;
import com.qualcomm.robotcore.eventloop.opmode.Disabled;
import com.qualcomm.robotcore.eventloop.opmode.LinearOpMode;
import com.qualcomm.robotcore.eventloop.opmode.TeleOp;
import com.qualcomm.robotcore.hardware.DcMotor;
import com.qualcomm.robotcore.util.Range;
import org.firstinspires.ftc.robotcore.external.hardware.camera.BuiltinCameraDirection;
import org.firstinspires.ftc.robotcore.external.hardware.camera.WebcamName;
import org.firstinspires.ftc.robotcore.external.hardware.camera.controls.ExposureControl;
import org.firstinspires.ftc.robotcore.external.hardware.camera.controls.GainControl;
import org.firstinspires.ftc.vision.VisionPortal;
import org.firstinspires.ftc.vision.apriltag.AprilTagDetection;
import org.firstinspires.ftc.vision.apriltag.AprilTagProcessor;
import java.util.List;
import java.util.concurrent.TimeUnit;
/*
* This OpMode illustrates using a camera to locate and drive towards a specific AprilTag.
* The code assumes a Holonomic (Mecanum or X Drive) Robot.
*
* For an introduction to AprilTags, see the ftc-docs link below:
* https://ftc-docs.firstinspires.org/en/latest/apriltag/vision_portal/apriltag_intro/apriltag-intro.html
*
* When an AprilTag in the TagLibrary is detected, the SDK provides location and orientation of the tag, relative to the camera.
* This information is provided in the "ftcPose" member of the returned "detection", and is explained in the ftc-docs page linked below.
* https://ftc-docs.firstinspires.org/apriltag-detection-values
*
* The drive goal is to rotate to keep the Tag centered in the camera, while strafing to be directly in front of the tag, and
* driving towards the tag to achieve the desired distance.
* To reduce any motion blur (which will interrupt the detection process), the camera exposure is reduced to a very low value (6 ms)
* You can determine the best Exposure and Gain values by using the ConceptAprilTagOptimizeExposure OpMode in this Samples folder.
*
* The code assumes a Robot Configuration with motors named: leftfront_drive and rightfront_drive, leftback_drive and rightback_drive.
* The motor directions must be set so a positive power goes forward on all wheels.
* This sample assumes that the current game AprilTag Library (usually for the current season) is being loaded by default,
* so you should choose to approach a valid tag ID (usually starting at 0)
*
* Under manual control, the left stick will move forward/back & left/right. The right stick will rotate the robot.
* Manually drive the robot until it displays Target data on the Driver Station.
*
* Press and hold the *Left Bumper* to enable the automatic "Drive to target" mode.
* Release the Left Bumper to return to manual driving mode.
*
* Under "Drive To Target" mode, the robot has three goals:
* 1) Turn the robot to always keep the Tag centered on the camera frame. (Use the Target Bearing to turn the robot.)
* 2) Strafe the robot towards the centerline of the Tag, so it approaches directly in front of the tag. (Use the Target Yaw to strafe the robot)
* 3) Drive towards the Tag to get to the desired distance. (Use Tag Range to drive the robot forward/backward)
*
* Use DESIRED_DISTANCE to set how close you want the robot to get to the target.
* Speed and Turn sensitivity can be adjusted using the SPEED_GAIN, STRAFE_GAIN and TURN_GAIN constants.
*
* Use Android Studio to Copy this Class, and Paste it into the TeamCode/src/main/java/org/firstinspires/ftc/teamcode folder.
* Remove or comment out the @Disabled line to add this OpMode to the Driver Station OpMode list.
*
*/
@TeleOp(name="Omni Drive To AprilTag", group = "Concept")
@Disabled
public class RobotAutoDriveToAprilTagOmni extends LinearOpMode
{
// Adjust these numbers to suit your robot.
final double DESIRED_DISTANCE = 12.0; // this is how close the camera should get to the target (inches)
// Set the GAIN constants to control the relationship between the measured position error, and how much power is
// applied to the drive motors to correct the error.
// Drive = Error * Gain.  Make these values smaller for smoother control, or larger for a more aggressive response.
final double SPEED_GAIN = 0.02 ; // Forward Speed Control "Gain". eg: Ramp up to 50% power at a 25 inch error. (0.50 / 25.0)
final double STRAFE_GAIN = 0.015 ; // Strafe Speed Control "Gain". eg: Ramp up to 25% power at a 25 degree Yaw error. (0.25 / 25.0)
final double TURN_GAIN = 0.01 ; // Turn Control "Gain". eg: Ramp up to 25% power at a 25 degree error. (0.25 / 25.0)
final double MAX_AUTO_SPEED = 0.5; // Clip the approach speed to this max value (adjust for your robot)
final double MAX_AUTO_STRAFE= 0.5; // Clip the approach speed to this max value (adjust for your robot)
final double MAX_AUTO_TURN = 0.3; // Clip the turn speed to this max value (adjust for your robot)
private DcMotor leftFrontDrive = null; // Used to control the left front drive wheel
private DcMotor rightFrontDrive = null; // Used to control the right front drive wheel
private DcMotor leftBackDrive = null; // Used to control the left back drive wheel
private DcMotor rightBackDrive = null; // Used to control the right back drive wheel
private static final boolean USE_WEBCAM = true; // Set true to use a webcam, or false for a phone camera
private static final int DESIRED_TAG_ID = -1; // Choose the tag you want to approach or set to -1 for ANY tag.
private VisionPortal visionPortal; // Used to manage the video source.
private AprilTagProcessor aprilTag; // Used for managing the AprilTag detection process.
private AprilTagDetection desiredTag = null; // Used to hold the data for a detected AprilTag
@Override public void runOpMode()
{
boolean targetFound = false; // Set to true when an AprilTag target is detected
double drive = 0; // Desired forward power/speed (-1 to +1)
double strafe = 0; // Desired strafe power/speed (-1 to +1)
double turn = 0; // Desired turning power/speed (-1 to +1)
// Initialize the Apriltag Detection process
initAprilTag();
// Initialize the hardware variables. Note that the strings used here as parameters
// to 'get' must match the names assigned during the robot configuration
// step (using the FTC Robot Controller app on the phone).
leftFrontDrive = hardwareMap.get(DcMotor.class, "leftfront_drive");
rightFrontDrive = hardwareMap.get(DcMotor.class, "rightfront_drive");
leftBackDrive = hardwareMap.get(DcMotor.class, "leftback_drive");
rightBackDrive = hardwareMap.get(DcMotor.class, "rightback_drive");
// To drive forward, most robots need the motor on one side to be reversed, because the axles point in opposite directions.
// When run, this OpMode should start both motors driving forward. So adjust these two lines based on your first test drive.
// Note: The settings here assume direct drive on left and right wheels. Gear Reduction or 90 Deg drives may require direction flips
leftFrontDrive.setDirection(DcMotor.Direction.REVERSE);
leftBackDrive.setDirection(DcMotor.Direction.REVERSE);
rightFrontDrive.setDirection(DcMotor.Direction.FORWARD);
rightBackDrive.setDirection(DcMotor.Direction.FORWARD);
if (USE_WEBCAM)
setManualExposure(6, 250); // Use low exposure time to reduce motion blur
// Wait for driver to press start
telemetry.addData("Camera preview on/off", "3 dots, Camera Stream");
telemetry.addData(">", "Touch Play to start OpMode");
telemetry.update();
waitForStart();
while (opModeIsActive())
{
targetFound = false;
desiredTag = null;
// Step through the list of detected tags and look for a matching tag
List<AprilTagDetection> currentDetections = aprilTag.getDetections();
for (AprilTagDetection detection : currentDetections) {
// Look to see if we have size info on this tag.
if (detection.metadata != null) {
// Check to see if we want to track towards this tag.
if ((DESIRED_TAG_ID < 0) || (detection.id == DESIRED_TAG_ID)) {
// Yes, we want to use this tag.
targetFound = true;
desiredTag = detection;
break; // don't look any further.
} else {
// This tag is in the library, but we do not want to track it right now.
telemetry.addData("Skipping", "Tag ID %d is not desired", detection.id);
}
} else {
// This tag is NOT in the library, so we don't have enough information to track to it.
telemetry.addData("Unknown", "Tag ID %d is not in TagLibrary", detection.id);
}
}
// Tell the driver what we see, and what to do.
if (targetFound) {
telemetry.addData("\n>","HOLD Left-Bumper to Drive to Target\n");
telemetry.addData("Found", "ID %d (%s)", desiredTag.id, desiredTag.metadata.name);
telemetry.addData("Range", "%5.1f inches", desiredTag.ftcPose.range);
telemetry.addData("Bearing","%3.0f degrees", desiredTag.ftcPose.bearing);
telemetry.addData("Yaw","%3.0f degrees", desiredTag.ftcPose.yaw);
} else {
telemetry.addData("\n>","Drive using joysticks to find valid target\n");
}
// If Left Bumper is being pressed, AND we have found the desired target, Drive to target Automatically .
if (gamepad1.left_bumper && targetFound) {
// Determine heading, range and Yaw (tag image rotation) error so we can use them to control the robot automatically.
double rangeError = (desiredTag.ftcPose.range - DESIRED_DISTANCE);
double headingError = desiredTag.ftcPose.bearing;
double yawError = desiredTag.ftcPose.yaw;
// Use the speed and turn "gains" to calculate how we want the robot to move.
drive = Range.clip(rangeError * SPEED_GAIN, -MAX_AUTO_SPEED, MAX_AUTO_SPEED);
turn = Range.clip(headingError * TURN_GAIN, -MAX_AUTO_TURN, MAX_AUTO_TURN) ;
strafe = Range.clip(-yawError * STRAFE_GAIN, -MAX_AUTO_STRAFE, MAX_AUTO_STRAFE);
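// Worked example with the gains above: a range error of 20 inches gives drive = 20 * 0.02 = 0.40,
// a bearing error of 10 degrees gives turn = 10 * 0.01 = 0.10, and a yaw error of 8 degrees gives
// strafe = -8 * 0.015 = -0.12 (all within the MAX_AUTO_* clip limits).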
telemetry.addData("Auto","Drive %5.2f, Strafe %5.2f, Turn %5.2f ", drive, strafe, turn);
} else {
// drive using manual POV Joystick mode. Slow things down to make the robot more controllable.
drive = -gamepad1.left_stick_y / 2.0; // Reduce drive rate to 50%.
strafe = -gamepad1.left_stick_x / 2.0; // Reduce strafe rate to 50%.
turn = -gamepad1.right_stick_x / 3.0; // Reduce turn rate to 33%.
telemetry.addData("Manual","Drive %5.2f, Strafe %5.2f, Turn %5.2f ", drive, strafe, turn);
}
telemetry.update();
// Apply desired axes motions to the drivetrain.
moveRobot(drive, strafe, turn);
sleep(10);
}
}
/**
* Move robot according to desired axes motions
* <p>
* Positive X is forward
* <p>
* Positive Y is strafe left
* <p>
* Positive Yaw is counter-clockwise
*/
public void moveRobot(double x, double y, double yaw) {
// Calculate wheel powers.
double leftFrontPower = x -y -yaw;
double rightFrontPower = x +y +yaw;
double leftBackPower = x +y -yaw;
double rightBackPower = x -y +yaw;
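// Example of the mixing above: a pure strafe-left request (x = 0, y = +0.5, yaw = 0) yields
// leftFront = -0.5, rightFront = +0.5, leftBack = +0.5, rightBack = -0.5, which strafes a mecanum drivetrain to the left.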
// Normalize wheel powers to be less than 1.0
double max = Math.max(Math.abs(leftFrontPower), Math.abs(rightFrontPower));
max = Math.max(max, Math.abs(leftBackPower));
max = Math.max(max, Math.abs(rightBackPower));
if (max > 1.0) {
leftFrontPower /= max;
rightFrontPower /= max;
leftBackPower /= max;
rightBackPower /= max;
}
// Send powers to the wheels.
leftFrontDrive.setPower(leftFrontPower);
rightFrontDrive.setPower(rightFrontPower);
leftBackDrive.setPower(leftBackPower);
rightBackDrive.setPower(rightBackPower);
}
/**
* Initialize the AprilTag processor.
*/
private void initAprilTag() {
// Create the AprilTag processor by using a builder.
aprilTag = new AprilTagProcessor.Builder().build();
// Adjust Image Decimation to trade-off detection-range for detection-rate.
// eg: Some typical detection data using a Logitech C920 WebCam
// Decimation = 1 .. Detect 2" Tag from 10 feet away at 10 Frames per second
// Decimation = 2 .. Detect 2" Tag from 6 feet away at 22 Frames per second
// Decimation = 3 .. Detect 2" Tag from 4 feet away at 30 Frames Per Second
// Decimation = 3 .. Detect 5" Tag from 10 feet away at 30 Frames Per Second
// Note: Decimation can be changed on-the-fly to adapt during a match.
aprilTag.setDecimation(2);
// Create the vision portal by using a builder.
if (USE_WEBCAM) {
visionPortal = new VisionPortal.Builder()
.setCamera(hardwareMap.get(WebcamName.class, "Webcam 1"))
.addProcessor(aprilTag)
.build();
} else {
visionPortal = new VisionPortal.Builder()
.setCamera(BuiltinCameraDirection.BACK)
.addProcessor(aprilTag)
.build();
}
}
/*
Manually set the camera gain and exposure.
This can only be called AFTER calling initAprilTag(), and only works for Webcams;
*/
private void setManualExposure(int exposureMS, int gain) {
// Wait for the camera to be open, then use the controls
if (visionPortal == null) {
return;
}
// Make sure camera is streaming before we try to set the exposure controls
if (visionPortal.getCameraState() != VisionPortal.CameraState.STREAMING) {
telemetry.addData("Camera", "Waiting");
telemetry.update();
while (!isStopRequested() && (visionPortal.getCameraState() != VisionPortal.CameraState.STREAMING)) {
sleep(20);
}
telemetry.addData("Camera", "Ready");
telemetry.update();
}
// Set camera controls unless we are stopping.
if (!isStopRequested())
{
ExposureControl exposureControl = visionPortal.getCameraControl(ExposureControl.class);
if (exposureControl.getMode() != ExposureControl.Mode.Manual) {
exposureControl.setMode(ExposureControl.Mode.Manual);
sleep(50);
}
exposureControl.setExposure((long)exposureMS, TimeUnit.MILLISECONDS);
sleep(20);
GainControl gainControl = visionPortal.getCameraControl(GainControl.class);
gainControl.setGain(gain);
sleep(20);
}
}
}

View File

@ -0,0 +1,298 @@
/* Copyright (c) 2023 FIRST. All rights reserved.
*
* Redistribution and use in source and binary forms, with or without modification,
* are permitted (subject to the limitations in the disclaimer below) provided that
* the following conditions are met:
*
* Redistributions of source code must retain the above copyright notice, this list
* of conditions and the following disclaimer.
*
* Redistributions in binary form must reproduce the above copyright notice, this
* list of conditions and the following disclaimer in the documentation and/or
* other materials provided with the distribution.
*
* Neither the name of FIRST nor the names of its contributors may be used to endorse or
* promote products derived from this software without specific prior written permission.
*
* NO EXPRESS OR IMPLIED LICENSES TO ANY PARTY'S PATENT RIGHTS ARE GRANTED BY THIS
* LICENSE. THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
* "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO,
* THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE
* ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR CONTRIBUTORS BE LIABLE
* FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL
* DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR
* SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER
* CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY,
* OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
* OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
*/
package org.firstinspires.ftc.robotcontroller.external.samples;
import com.qualcomm.robotcore.eventloop.opmode.Disabled;
import com.qualcomm.robotcore.eventloop.opmode.LinearOpMode;
import com.qualcomm.robotcore.eventloop.opmode.TeleOp;
import com.qualcomm.robotcore.hardware.DcMotor;
import com.qualcomm.robotcore.util.Range;
import org.firstinspires.ftc.robotcore.external.hardware.camera.BuiltinCameraDirection;
import org.firstinspires.ftc.robotcore.external.hardware.camera.WebcamName;
import org.firstinspires.ftc.robotcore.external.hardware.camera.controls.ExposureControl;
import org.firstinspires.ftc.robotcore.external.hardware.camera.controls.GainControl;
import org.firstinspires.ftc.vision.VisionPortal;
import org.firstinspires.ftc.vision.apriltag.AprilTagDetection;
import org.firstinspires.ftc.vision.apriltag.AprilTagProcessor;
import java.util.List;
import java.util.concurrent.TimeUnit;
/*
* This OpMode illustrates using a camera to locate and drive towards a specific AprilTag.
* The code assumes a basic two-wheel (Tank) Robot Drivetrain
*
* For an introduction to AprilTags, see the ftc-docs link below:
* https://ftc-docs.firstinspires.org/en/latest/apriltag/vision_portal/apriltag_intro/apriltag-intro.html
*
* When an AprilTag in the TagLibrary is detected, the SDK provides location and orientation of the tag, relative to the camera.
* This information is provided in the "ftcPose" member of the returned "detection", and is explained in the ftc-docs page linked below.
* https://ftc-docs.firstinspires.org/apriltag-detection-values
*
* The driving goal is to rotate to keep the tag centered in the camera, while driving towards the tag to achieve the desired distance.
* To reduce any motion blur (which will interrupt the detection process), the camera exposure is reduced to a very low value (6 ms)
* You can determine the best exposure and gain values by using the ConceptAprilTagOptimizeExposure OpMode in this Samples folder.
*
* The code assumes a Robot Configuration with motors named left_drive and right_drive.
* The motor directions must be set so a positive power goes forward on both wheels;
* This sample assumes that the default AprilTag Library (usually for the current season) is being loaded by default
* so you should choose to approach a valid tag ID (usually starting at 0)
*
* Under manual control, the left stick will move forward/back, and the right stick will rotate the robot.
* This is called POV Joystick mode, different than Tank Drive (where each joystick controls a wheel).
*
* Manually drive the robot until it displays Target data on the Driver Station.
* Press and hold the *Left Bumper* to enable the automatic "Drive to target" mode.
* Release the Left Bumper to return to manual driving mode.
*
* Under "Drive To Target" mode, the robot has two goals:
* 1) Turn the robot to always keep the Tag centered on the camera frame. (Use the Target Bearing to turn the robot.)
* 2) Drive towards the Tag to get to the desired distance. (Use Tag Range to drive the robot forward/backward)
*
* Use DESIRED_DISTANCE to set how close you want the robot to get to the target.
* Speed and Turn sensitivity can be adjusted using the SPEED_GAIN and TURN_GAIN constants.
*
* Use Android Studio to Copy this Class, and Paste it into the TeamCode/src/main/java/org/firstinspires/ftc/teamcode folder.
* Remove or comment out the @Disabled line to add this OpMode to the Driver Station OpMode list.
*
*/
@TeleOp(name="Tank Drive To AprilTag", group = "Concept")
@Disabled
public class RobotAutoDriveToAprilTagTank extends LinearOpMode
{
// Adjust these numbers to suit your robot.
final double DESIRED_DISTANCE = 12.0; // this is how close the camera should get to the target (inches)
// Set the GAIN constants to control the relationship between the measured position error, and how much power is
// applied to the drive motors to correct the error.
// Drive = Error * Gain.  Make these values smaller for smoother control, or larger for a more aggressive response.
final double SPEED_GAIN = 0.02 ; // Speed Control "Gain". eg: Ramp up to 50% power at a 25 inch error. (0.50 / 25.0)
final double TURN_GAIN = 0.01 ; // Turn Control "Gain". eg: Ramp up to 25% power at a 25 degree error. (0.25 / 25.0)
final double MAX_AUTO_SPEED = 0.5; // Clip the approach speed to this max value (adjust for your robot)
final double MAX_AUTO_TURN = 0.25; // Clip the turn speed to this max value (adjust for your robot)
private DcMotor leftDrive = null; // Used to control the left drive wheel
private DcMotor rightDrive = null; // Used to control the right drive wheel
private static final boolean USE_WEBCAM = true; // Set true to use a webcam, or false for a phone camera
private static final int DESIRED_TAG_ID = -1; // Choose the tag you want to approach or set to -1 for ANY tag.
private VisionPortal visionPortal; // Used to manage the video source.
private AprilTagProcessor aprilTag; // Used for managing the AprilTag detection process.
private AprilTagDetection desiredTag = null; // Used to hold the data for a detected AprilTag
@Override public void runOpMode()
{
boolean targetFound = false; // Set to true when an AprilTag target is detected
double drive = 0; // Desired forward power/speed (-1 to +1) +ve is forward
double turn = 0; // Desired turning power/speed (-1 to +1) +ve is CounterClockwise
// Initialize the Apriltag Detection process
initAprilTag();
// Initialize the hardware variables. Note that the strings used here as parameters
// to 'get' must match the names assigned during the robot configuration
// step (using the FTC Robot Controller app on the phone).
leftDrive = hardwareMap.get(DcMotor.class, "left_drive");
rightDrive = hardwareMap.get(DcMotor.class, "right_drive");
// To drive forward, most robots need the motor on one side to be reversed because the axles point in opposite directions.
// When run, this OpMode should start both motors driving forward. So adjust these two lines based on your first test drive.
// Note: The settings here assume direct drive on left and right wheels. Single Gear Reduction or 90 Deg drives may require direction flips
leftDrive.setDirection(DcMotor.Direction.REVERSE);
rightDrive.setDirection(DcMotor.Direction.FORWARD);
if (USE_WEBCAM)
setManualExposure(6, 250); // Use low exposure time to reduce motion blur
// Wait for the driver to press Start
telemetry.addData("Camera preview on/off", "3 dots, Camera Stream");
telemetry.addData(">", "Touch Play to start OpMode");
telemetry.update();
waitForStart();
while (opModeIsActive())
{
targetFound = false;
desiredTag = null;
// Step through the list of detected tags and look for a matching tag
List<AprilTagDetection> currentDetections = aprilTag.getDetections();
for (AprilTagDetection detection : currentDetections) {
// Look to see if we have size info on this tag.
if (detection.metadata != null) {
// Check to see if we want to track towards this tag.
if ((DESIRED_TAG_ID < 0) || (detection.id == DESIRED_TAG_ID)) {
// Yes, we want to use this tag.
targetFound = true;
desiredTag = detection;
break; // don't look any further.
} else {
// This tag is in the library, but we do not want to track it right now.
telemetry.addData("Skipping", "Tag ID %d is not desired", detection.id);
}
} else {
// This tag is NOT in the library, so we don't have enough information to track to it.
telemetry.addData("Unknown", "Tag ID %d is not in TagLibrary", detection.id);
}
}
// Tell the driver what we see, and what to do.
if (targetFound) {
telemetry.addData("\n>","HOLD Left-Bumper to Drive to Target\n");
telemetry.addData("Found", "ID %d (%s)", desiredTag.id, desiredTag.metadata.name);
telemetry.addData("Range", "%5.1f inches", desiredTag.ftcPose.range);
telemetry.addData("Bearing","%3.0f degrees", desiredTag.ftcPose.bearing);
} else {
telemetry.addData("\n>","Drive using joysticks to find valid target\n");
}
// If Left Bumper is being pressed, AND we have found the desired target, Drive to target Automatically .
if (gamepad1.left_bumper && targetFound) {
// Determine heading and range error so we can use them to control the robot automatically.
double rangeError = (desiredTag.ftcPose.range - DESIRED_DISTANCE);
double headingError = desiredTag.ftcPose.bearing;
// Use the speed and turn "gains" to calculate how we want the robot to move. Clip it to the maximum
drive = Range.clip(rangeError * SPEED_GAIN, -MAX_AUTO_SPEED, MAX_AUTO_SPEED);
turn = Range.clip(headingError * TURN_GAIN, -MAX_AUTO_TURN, MAX_AUTO_TURN) ;
telemetry.addData("Auto","Drive %5.2f, Turn %5.2f", drive, turn);
} else {
// drive using manual POV Joystick mode.
drive = -gamepad1.left_stick_y / 2.0; // Reduce drive rate to 50%.
turn = -gamepad1.right_stick_x / 4.0; // Reduce turn rate to 25%.
telemetry.addData("Manual","Drive %5.2f, Turn %5.2f", drive, turn);
}
telemetry.update();
// Apply desired axes motions to the drivetrain.
moveRobot(drive, turn);
sleep(10);
}
}
/**
* Move robot according to desired axes motions
* <p>
* Positive X is forward
* <p>
* Positive Yaw is counter-clockwise
*/
public void moveRobot(double x, double yaw) {
// Calculate left and right wheel powers.
double leftPower = x - yaw;
double rightPower = x + yaw;
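// Example of the mixing above: drive (x) = 0.4 with turn (yaw) = 0.2 gives leftPower = 0.2 and rightPower = 0.6,
// so the robot arcs forward while rotating counter-clockwise (positive yaw).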
// Normalize wheel powers to be less than 1.0
double max = Math.max(Math.abs(leftPower), Math.abs(rightPower));
if (max > 1.0) {
leftPower /= max;
rightPower /= max;
}
// Send powers to the wheels.
leftDrive.setPower(leftPower);
rightDrive.setPower(rightPower);
}
/**
* Initialize the AprilTag processor.
*/
private void initAprilTag() {
// Create the AprilTag processor by using a builder.
aprilTag = new AprilTagProcessor.Builder().build();
// Adjust Image Decimation to trade-off detection-range for detection-rate.
// eg: Some typical detection data using a Logitech C920 WebCam
// Decimation = 1 .. Detect 2" Tag from 10 feet away at 10 Frames per second
// Decimation = 2 .. Detect 2" Tag from 6 feet away at 22 Frames per second
// Decimation = 3 .. Detect 2" Tag from 4 feet away at 30 Frames Per Second
// Decimation = 3 .. Detect 5" Tag from 10 feet away at 30 Frames Per Second
// Note: Decimation can be changed on-the-fly to adapt during a match.
aprilTag.setDecimation(2);
// Create the vision portal by using a builder.
if (USE_WEBCAM) {
visionPortal = new VisionPortal.Builder()
.setCamera(hardwareMap.get(WebcamName.class, "Webcam 1"))
.addProcessor(aprilTag)
.build();
} else {
visionPortal = new VisionPortal.Builder()
.setCamera(BuiltinCameraDirection.BACK)
.addProcessor(aprilTag)
.build();
}
}
/*
Manually set the camera gain and exposure.
This can only be called AFTER calling initAprilTag(), and only works for Webcams;
*/
private void setManualExposure(int exposureMS, int gain) {
// Wait for the camera to be open, then use the controls
if (visionPortal == null) {
return;
}
// Make sure camera is streaming before we try to set the exposure controls
if (visionPortal.getCameraState() != VisionPortal.CameraState.STREAMING) {
telemetry.addData("Camera", "Waiting");
telemetry.update();
while (!isStopRequested() && (visionPortal.getCameraState() != VisionPortal.CameraState.STREAMING)) {
sleep(20);
}
telemetry.addData("Camera", "Ready");
telemetry.update();
}
// Set camera controls unless we are stopping.
if (!isStopRequested())
{
ExposureControl exposureControl = visionPortal.getCameraControl(ExposureControl.class);
if (exposureControl.getMode() != ExposureControl.Mode.Manual) {
exposureControl.setMode(ExposureControl.Mode.Manual);
sleep(50);
}
exposureControl.setExposure((long)exposureMS, TimeUnit.MILLISECONDS);
sleep(20);
GainControl gainControl = visionPortal.getCameraControl(GainControl.class);
gainControl.setGain(gain);
sleep(20);
telemetry.addData("Camera", "Ready");
telemetry.update();
}
}
}

View File

@ -0,0 +1,142 @@
/* Copyright (c) 2017 FIRST. All rights reserved.
*
* Redistribution and use in source and binary forms, with or without modification,
* are permitted (subject to the limitations in the disclaimer below) provided that
* the following conditions are met:
*
* Redistributions of source code must retain the above copyright notice, this list
* of conditions and the following disclaimer.
*
* Redistributions in binary form must reproduce the above copyright notice, this
* list of conditions and the following disclaimer in the documentation and/or
* other materials provided with the distribution.
*
* Neither the name of FIRST nor the names of its contributors may be used to endorse or
* promote products derived from this software without specific prior written permission.
*
* NO EXPRESS OR IMPLIED LICENSES TO ANY PARTY'S PATENT RIGHTS ARE GRANTED BY THIS
* LICENSE. THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
* "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO,
* THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE
* ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR CONTRIBUTORS BE LIABLE
* FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL
* DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR
* SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER
* CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY,
* OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
* OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
*/
package org.firstinspires.ftc.robotcontroller.external.samples;
import com.qualcomm.robotcore.eventloop.opmode.Autonomous;
import com.qualcomm.robotcore.eventloop.opmode.Disabled;
import com.qualcomm.robotcore.eventloop.opmode.LinearOpMode;
import com.qualcomm.robotcore.hardware.DcMotor;
import com.qualcomm.robotcore.hardware.NormalizedColorSensor;
import com.qualcomm.robotcore.hardware.NormalizedRGBA;
import com.qualcomm.robotcore.hardware.SwitchableLight;
/*
* This OpMode illustrates the concept of driving up to a line and then stopping.
* The code is structured as a LinearOpMode
*
* The Sensor used here can be a REV Color Sensor V2 or V3. Make sure the white LED is turned on.
* The sensor can be plugged into any I2C port, and must be named "sensor_color" in the active configuration.
*
* Depending on the height of your color sensor, you may want to set the sensor "gain".
* The higher the gain, the greater the reflected light reading will be.
* Use the SensorColor sample in this folder to determine the minimum gain value that provides an
* "Alpha" reading of 1.0 when you are on top of the white line. In this sample, we use a gain of 15
* which works well with a Rev V2 color sensor
*
* Setting the correct WHITE_THRESHOLD value is key to stopping correctly.
* This should be set halfway between the bare-tile, and white-line "Alpha" values.
* The reflected light value can be read on the screen once the OpMode has been INIT, but before it is STARTED.
* Move the sensor on and off the white line and note the min and max readings.
* Edit this code to make WHITE_THRESHOLD halfway between the min and max.
*
* Use Android Studio to Copy this Class, and Paste it into your team's code folder with a new name.
* Remove or comment out the @Disabled line to add this OpMode to the Driver Station OpMode list
*/
@Autonomous(name="Robot: Auto Drive To Line", group="Robot")
@Disabled
public class RobotAutoDriveToLine_Linear extends LinearOpMode {
/* Declare OpMode members. */
private DcMotor leftDrive = null;
private DcMotor rightDrive = null;
/** The variable to store a reference to our color sensor hardware object */
NormalizedColorSensor colorSensor;
static final double WHITE_THRESHOLD = 0.5; // spans between 0.0 - 1.0 from dark to light
static final double APPROACH_SPEED = 0.25;
@Override
public void runOpMode() {
// Initialize the drive system variables.
leftDrive = hardwareMap.get(DcMotor.class, "left_drive");
rightDrive = hardwareMap.get(DcMotor.class, "right_drive");
// To drive forward, most robots need the motor on one side to be reversed, because the axles point in opposite directions.
// When run, this OpMode should start both motors driving forward. So adjust these two lines based on your first test drive.
// Note: The settings here assume direct drive on left and right wheels. Gear Reduction or 90 Deg drives may require direction flips
leftDrive.setDirection(DcMotor.Direction.REVERSE);
rightDrive.setDirection(DcMotor.Direction.FORWARD);
// If there are encoders connected, switch to RUN_USING_ENCODER mode for greater accuracy
// leftDrive.setMode(DcMotor.RunMode.RUN_USING_ENCODER);
// rightDrive.setMode(DcMotor.RunMode.RUN_USING_ENCODER);
// Get a reference to our sensor object. It's recommended to use NormalizedColorSensor over
// ColorSensor, because NormalizedColorSensor consistently gives values between 0 and 1, while
// the values you get from ColorSensor are dependent on the specific sensor you're using.
colorSensor = hardwareMap.get(NormalizedColorSensor.class, "sensor_color");
// If necessary, turn ON the white LED (if there is no LED switch on the sensor)
if (colorSensor instanceof SwitchableLight) {
((SwitchableLight)colorSensor).enableLight(true);
}
// Some sensors allow you to set your light sensor gain for optimal sensitivity...
// See the SensorColor sample in this folder for how to determine the optimal gain.
// A gain of 15 causes a Rev Color Sensor V2 to produce an Alpha value of 1.0 at about 1.5" above the floor.
colorSensor.setGain(15);
// Wait for driver to press PLAY)
// Abort this loop is started or stopped.
while (opModeInInit()) {
// Send telemetry message to signify robot waiting;
telemetry.addData("Status", "Ready to drive to white line."); //
// Display the light level while we are waiting to start
getBrightness();
}
// Start the robot moving forward, and then begin looking for a white line.
leftDrive.setPower(APPROACH_SPEED);
rightDrive.setPower(APPROACH_SPEED);
// run until the white line is seen OR the driver presses STOP;
while (opModeIsActive() && (getBrightness() < WHITE_THRESHOLD)) {
sleep(5);
}
// Stop all motors
leftDrive.setPower(0);
rightDrive.setPower(0);
}
// to obtain reflected light, read the normalized values from the color sensor. Return the Alpha channel.
double getBrightness() {
NormalizedRGBA colors = colorSensor.getNormalizedColors();
telemetry.addData("Light Level (0 to 1)", "%4.2f", colors.alpha);
telemetry.update();
return colors.alpha;
}
}

View File

@ -0,0 +1,167 @@
/* Copyright (c) 2022 FIRST. All rights reserved.
*
* Redistribution and use in source and binary forms, with or without modification,
* are permitted (subject to the limitations in the disclaimer below) provided that
* the following conditions are met:
*
* Redistributions of source code must retain the above copyright notice, this list
* of conditions and the following disclaimer.
*
* Redistributions in binary form must reproduce the above copyright notice, this
* list of conditions and the following disclaimer in the documentation and/or
* other materials provided with the distribution.
*
* Neither the name of FIRST nor the names of its contributors may be used to endorse or
* promote products derived from this software without specific prior written permission.
*
* NO EXPRESS OR IMPLIED LICENSES TO ANY PARTY'S PATENT RIGHTS ARE GRANTED BY THIS
* LICENSE. THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
* "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO,
* THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE
* ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR CONTRIBUTORS BE LIABLE
* FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL
* DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR
* SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER
* CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY,
* OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
* OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
*/
package org.firstinspires.ftc.robotcontroller.external.samples;
import com.qualcomm.robotcore.eventloop.opmode.LinearOpMode;
import com.qualcomm.robotcore.hardware.DcMotor;
import com.qualcomm.robotcore.hardware.Servo;
import com.qualcomm.robotcore.util.Range;
/*
* This file works in conjunction with the External Hardware Class sample called: ConceptExternalHardwareClass.java
* Please read the explanations in that Sample about how to use this class definition.
*
* This file defines a Java Class that performs all the setup and configuration for a sample robot's hardware (motors and sensors).
* It assumes three motors (left_drive, right_drive and arm) and two servos (left_hand and right_hand)
*
* This one file/class can be used by ALL of your OpModes without having to cut & paste the code each time.
*
* Where possible, the actual hardware objects are "abstracted" (or hidden) so the OpMode code just makes calls into the class,
* rather than accessing the internal hardware directly. This is why the objects are declared "private".
*
* Use Android Studio to Copy this Class, and Paste it into your team's code folder with *exactly the same name*.
*
* Or... In OnBot Java, add a new file named RobotHardware.java, select this sample, and select Not an OpMode.
* Also add a new OpMode, select the sample ConceptExternalHardwareClass.java, and select TeleOp.
*
*/
public class RobotHardware {
/* Declare OpMode members. */
private LinearOpMode myOpMode = null; // gain access to methods in the calling OpMode.
// Define Motor and Servo objects (Make them private so they can't be accessed externally)
private DcMotor leftDrive = null;
private DcMotor rightDrive = null;
private DcMotor armMotor = null;
private Servo leftHand = null;
private Servo rightHand = null;
// Define Drive constants. Make them public so they CAN be used by the calling OpMode
public static final double MID_SERVO = 0.5 ;
public static final double HAND_SPEED = 0.02 ; // sets rate to move servo
public static final double ARM_UP_POWER = 0.45 ;
public static final double ARM_DOWN_POWER = -0.45 ;
// Define a constructor that allows the OpMode to pass a reference to itself.
public RobotHardware (LinearOpMode opmode) {
myOpMode = opmode;
}
/**
* Initialize all the robot's hardware.
* This method must be called ONCE when the OpMode is initialized.
* <p>
* All of the hardware devices are accessed via the hardware map, and initialized.
*/
public void init() {
// Define and Initialize Motors (note: need to use reference to actual OpMode).
leftDrive = myOpMode.hardwareMap.get(DcMotor.class, "left_drive");
rightDrive = myOpMode.hardwareMap.get(DcMotor.class, "right_drive");
armMotor = myOpMode.hardwareMap.get(DcMotor.class, "arm");
// To drive forward, most robots need the motor on one side to be reversed, because the axles point in opposite directions.
// Pushing the left stick forward MUST make robot go forward. So adjust these two lines based on your first test drive.
// Note: The settings here assume direct drive on left and right wheels. Gear Reduction or 90 Deg drives may require direction flips
leftDrive.setDirection(DcMotor.Direction.REVERSE);
rightDrive.setDirection(DcMotor.Direction.FORWARD);
// If there are encoders connected, switch to RUN_USING_ENCODER mode for greater accuracy
// leftDrive.setMode(DcMotor.RunMode.RUN_USING_ENCODER);
// rightDrive.setMode(DcMotor.RunMode.RUN_USING_ENCODER);
// Define and initialize ALL installed servos.
leftHand = myOpMode.hardwareMap.get(Servo.class, "left_hand");
rightHand = myOpMode.hardwareMap.get(Servo.class, "right_hand");
leftHand.setPosition(MID_SERVO);
rightHand.setPosition(MID_SERVO);
myOpMode.telemetry.addData(">", "Hardware Initialized");
myOpMode.telemetry.update();
}
/**
* Calculates the left/right motor powers required to achieve the requested
* robot motions: Drive (Axial motion) and Turn (Yaw motion).
* Then sends these power levels to the motors.
*
* @param Drive Fwd/Rev driving power (-1.0 to 1.0) +ve is forward
* @param Turn Right/Left turning power (-1.0 to 1.0) +ve is CW
*/
public void driveRobot(double Drive, double Turn) {
// Combine drive and turn for blended motion.
double left = Drive + Turn;
double right = Drive - Turn;
// Scale the values so neither exceed +/- 1.0
double max = Math.max(Math.abs(left), Math.abs(right));
if (max > 1.0)
{
left /= max;
right /= max;
}
// Use existing function to drive both wheels.
setDrivePower(left, right);
}
/**
* Pass the requested wheel motor powers to the appropriate hardware drive motors.
*
* @param leftWheel Fwd/Rev driving power (-1.0 to 1.0) +ve is forward
* @param rightWheel Fwd/Rev driving power (-1.0 to 1.0) +ve is forward
*/
public void setDrivePower(double leftWheel, double rightWheel) {
// Output the values to the motor drives.
leftDrive.setPower(leftWheel);
rightDrive.setPower(rightWheel);
}
/**
* Pass the requested arm power to the appropriate hardware drive motor
*
* @param power driving power (-1.0 to 1.0)
*/
public void setArmPower(double power) {
armMotor.setPower(power);
}
/**
* Send the two hand-servos to opposing (mirrored) positions, based on the passed offset.
*
* @param offset
*/
public void setHandPositions(double offset) {
offset = Range.clip(offset, -0.5, 0.5);
leftHand.setPosition(MID_SERVO + offset);
rightHand.setPosition(MID_SERVO - offset);
}
}

View File

@ -0,0 +1,159 @@
/* Copyright (c) 2017 FIRST. All rights reserved.
*
* Redistribution and use in source and binary forms, with or without modification,
* are permitted (subject to the limitations in the disclaimer below) provided that
* the following conditions are met:
*
* Redistributions of source code must retain the above copyright notice, this list
* of conditions and the following disclaimer.
*
* Redistributions in binary form must reproduce the above copyright notice, this
* list of conditions and the following disclaimer in the documentation and/or
* other materials provided with the distribution.
*
* Neither the name of FIRST nor the names of its contributors may be used to endorse or
* promote products derived from this software without specific prior written permission.
*
* NO EXPRESS OR IMPLIED LICENSES TO ANY PARTY'S PATENT RIGHTS ARE GRANTED BY THIS
* LICENSE. THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
* "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO,
* THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE
* ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR CONTRIBUTORS BE LIABLE
* FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL
* DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR
* SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER
* CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY,
* OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
* OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
*/
package org.firstinspires.ftc.robotcontroller.external.samples;
import com.qualcomm.robotcore.eventloop.opmode.Disabled;
import com.qualcomm.robotcore.eventloop.opmode.LinearOpMode;
import com.qualcomm.robotcore.eventloop.opmode.TeleOp;
import com.qualcomm.robotcore.hardware.DcMotor;
import com.qualcomm.robotcore.hardware.Servo;
import com.qualcomm.robotcore.util.Range;
/*
* This OpMode executes a POV Game style Teleop for a direct drive robot
* The code is structured as a LinearOpMode
*
* In this mode the left stick moves the robot FWD and back, the Right stick turns left and right.
* It raises and lowers the arm using the Gamepad Y and A buttons respectively.
* It also opens and closes the claws slowly using the left and right Bumper buttons.
*
* Use Android Studio to Copy this Class, and Paste it into your team's code folder with a new name.
* Remove or comment out the @Disabled line to add this OpMode to the Driver Station OpMode list
*/
@TeleOp(name="Robot: Teleop POV", group="Robot")
@Disabled
public class RobotTeleopPOV_Linear extends LinearOpMode {
/* Declare OpMode members. */
public DcMotor leftDrive = null;
public DcMotor rightDrive = null;
public DcMotor leftArm = null;
public Servo leftClaw = null;
public Servo rightClaw = null;
double clawOffset = 0;
public static final double MID_SERVO = 0.5 ;
public static final double CLAW_SPEED = 0.02 ; // sets rate to move servo
public static final double ARM_UP_POWER = 0.45 ;
public static final double ARM_DOWN_POWER = -0.45 ;
@Override
public void runOpMode() {
double left;
double right;
double drive;
double turn;
double max;
// Define and Initialize Motors
leftDrive = hardwareMap.get(DcMotor.class, "left_drive");
rightDrive = hardwareMap.get(DcMotor.class, "right_drive");
leftArm = hardwareMap.get(DcMotor.class, "left_arm");
// To drive forward, most robots need the motor on one side to be reversed, because the axles point in opposite directions.
// Pushing the left stick forward MUST make robot go forward. So adjust these two lines based on your first test drive.
// Note: The settings here assume direct drive on left and right wheels. Gear Reduction or 90 Deg drives may require direction flips
leftDrive.setDirection(DcMotor.Direction.REVERSE);
rightDrive.setDirection(DcMotor.Direction.FORWARD);
// If there are encoders connected, switch to RUN_USING_ENCODER mode for greater accuracy
// leftDrive.setMode(DcMotor.RunMode.RUN_USING_ENCODER);
// rightDrive.setMode(DcMotor.RunMode.RUN_USING_ENCODER);
// Define and initialize ALL installed servos.
leftClaw = hardwareMap.get(Servo.class, "left_hand");
rightClaw = hardwareMap.get(Servo.class, "right_hand");
leftClaw.setPosition(MID_SERVO);
rightClaw.setPosition(MID_SERVO);
// Send telemetry message to signify robot waiting;
telemetry.addData(">", "Robot Ready. Press Play."); //
telemetry.update();
// Wait for the game to start (driver presses PLAY)
waitForStart();
// run until the end of the match (driver presses STOP)
while (opModeIsActive()) {
// Run wheels in POV mode (note: The joystick goes negative when pushed forward, so negate it)
// In this mode the Left stick moves the robot fwd and back, the Right stick turns left and right.
// This way it's also easy to just drive straight, or just turn.
drive = -gamepad1.left_stick_y;
turn = gamepad1.right_stick_x;
// Combine drive and turn for blended motion.
left = drive + turn;
right = drive - turn;
// Normalize the values so neither exceed +/- 1.0
max = Math.max(Math.abs(left), Math.abs(right));
if (max > 1.0)
{
left /= max;
right /= max;
}
// Output the safe vales to the motor drives.
leftDrive.setPower(left);
rightDrive.setPower(right);
// Use gamepad left & right Bumpers to open and close the claw
if (gamepad1.right_bumper)
clawOffset += CLAW_SPEED;
else if (gamepad1.left_bumper)
clawOffset -= CLAW_SPEED;
// Move both servos to new position. Assume servos are mirror image of each other.
clawOffset = Range.clip(clawOffset, -0.5, 0.5);
leftClaw.setPosition(MID_SERVO + clawOffset);
rightClaw.setPosition(MID_SERVO - clawOffset);
// Use gamepad buttons to move arm up (Y) and down (A)
if (gamepad1.y)
leftArm.setPower(ARM_UP_POWER);
else if (gamepad1.a)
leftArm.setPower(ARM_DOWN_POWER);
else
leftArm.setPower(0.0);
// Send telemetry message to signify robot running;
telemetry.addData("claw", "Offset = %.2f", clawOffset);
telemetry.addData("left", "%.2f", left);
telemetry.addData("right", "%.2f", right);
telemetry.update();
// Pace this loop so jaw action is reasonable speed.
sleep(50);
}
}
}

View File

@ -0,0 +1,160 @@
/* Copyright (c) 2017 FIRST. All rights reserved.
*
* Redistribution and use in source and binary forms, with or without modification,
* are permitted (subject to the limitations in the disclaimer below) provided that
* the following conditions are met:
*
* Redistributions of source code must retain the above copyright notice, this list
* of conditions and the following disclaimer.
*
* Redistributions in binary form must reproduce the above copyright notice, this
* list of conditions and the following disclaimer in the documentation and/or
* other materials provided with the distribution.
*
* Neither the name of FIRST nor the names of its contributors may be used to endorse or
* promote products derived from this software without specific prior written permission.
*
* NO EXPRESS OR IMPLIED LICENSES TO ANY PARTY'S PATENT RIGHTS ARE GRANTED BY THIS
* LICENSE. THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
* "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO,
* THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE
* ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR CONTRIBUTORS BE LIABLE
* FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL
* DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR
* SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER
* CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY,
* OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
* OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
*/
package org.firstinspires.ftc.robotcontroller.external.samples;
import com.qualcomm.robotcore.eventloop.opmode.Disabled;
import com.qualcomm.robotcore.eventloop.opmode.OpMode;
import com.qualcomm.robotcore.eventloop.opmode.TeleOp;
import com.qualcomm.robotcore.hardware.DcMotor;
import com.qualcomm.robotcore.hardware.Servo;
import com.qualcomm.robotcore.util.Range;
/*
* This OpMode executes a Tank Drive control TeleOp a direct drive robot
* The code is structured as an Iterative OpMode
*
* In this mode, the left and right joysticks control the left and right motors respectively.
* Pushing a joystick forward will make the attached motor drive forward.
* It raises and lowers the claw using the Gamepad Y and A buttons respectively.
* It also opens and closes the claws slowly using the left and right Bumper buttons.
*
* Use Android Studio to Copy this Class, and Paste it into your team's code folder with a new name.
* Remove or comment out the @Disabled line to add this OpMode to the Driver Station OpMode list
*/
@TeleOp(name="Robot: Teleop Tank", group="Robot")
@Disabled
public class RobotTeleopTank_Iterative extends OpMode{
/* Declare OpMode members. */
public DcMotor leftDrive = null;
public DcMotor rightDrive = null;
public DcMotor leftArm = null;
public Servo leftClaw = null;
public Servo rightClaw = null;
double clawOffset = 0;
public static final double MID_SERVO = 0.5 ;
public static final double CLAW_SPEED = 0.02 ; // sets rate to move servo
public static final double ARM_UP_POWER = 0.50 ; // Run arm motor up at 50% power
public static final double ARM_DOWN_POWER = -0.25 ; // Run arm motor down at -25% power
/*
* Code to run ONCE when the driver hits INIT
*/
@Override
public void init() {
// Define and Initialize Motors
leftDrive = hardwareMap.get(DcMotor.class, "left_drive");
rightDrive = hardwareMap.get(DcMotor.class, "right_drive");
leftArm = hardwareMap.get(DcMotor.class, "left_arm");
// To drive forward, most robots need the motor on one side to be reversed, because the axles point in opposite directions.
// Pushing the left and right sticks forward MUST make robot go forward. So adjust these two lines based on your first test drive.
// Note: The settings here assume direct drive on left and right wheels. Gear Reduction or 90 Deg drives may require direction flips
leftDrive.setDirection(DcMotor.Direction.REVERSE);
rightDrive.setDirection(DcMotor.Direction.FORWARD);
// If there are encoders connected, switch to RUN_USING_ENCODER mode for greater accuracy
// leftDrive.setMode(DcMotor.RunMode.RUN_USING_ENCODER);
// rightDrive.setMode(DcMotor.RunMode.RUN_USING_ENCODER);
// Define and initialize ALL installed servos.
leftClaw = hardwareMap.get(Servo.class, "left_hand");
rightClaw = hardwareMap.get(Servo.class, "right_hand");
leftClaw.setPosition(MID_SERVO);
rightClaw.setPosition(MID_SERVO);
// Send telemetry message to signify robot waiting;
telemetry.addData(">", "Robot Ready. Press Play."); //
}
/*
* Code to run REPEATEDLY after the driver hits INIT, but before they hit PLAY
*/
@Override
public void init_loop() {
}
/*
* Code to run ONCE when the driver hits PLAY
*/
@Override
public void start() {
}
/*
* Code to run REPEATEDLY after the driver hits PLAY but before they hit STOP
*/
@Override
public void loop() {
double left;
double right;
// Run wheels in tank mode (note: The joystick goes negative when pushed forward, so negate it)
left = -gamepad1.left_stick_y;
right = -gamepad1.right_stick_y;
leftDrive.setPower(left);
rightDrive.setPower(right);
// Use gamepad left & right Bumpers to open and close the claw
if (gamepad1.right_bumper)
clawOffset += CLAW_SPEED;
else if (gamepad1.left_bumper)
clawOffset -= CLAW_SPEED;
// Move both servos to new position. Assume servos are mirror image of each other.
clawOffset = Range.clip(clawOffset, -0.5, 0.5);
leftClaw.setPosition(MID_SERVO + clawOffset);
rightClaw.setPosition(MID_SERVO - clawOffset);
// Use gamepad buttons to move the arm up (Y) and down (A)
if (gamepad1.y)
leftArm.setPower(ARM_UP_POWER);
else if (gamepad1.a)
leftArm.setPower(ARM_DOWN_POWER);
else
leftArm.setPower(0.0);
// Send telemetry message to signify robot running;
telemetry.addData("claw", "Offset = %.2f", clawOffset);
telemetry.addData("left", "%.2f", left);
telemetry.addData("right", "%.2f", right);
}
/*
* Code to run ONCE after the driver hits STOP
*/
@Override
public void stop() {
}
}

View File

@ -0,0 +1,163 @@
/*
* Copyright (c) 2018 Craig MacFarlane
*
* All rights reserved.
*
* Redistribution and use in source and binary forms, with or without modification, are permitted
* (subject to the limitations in the disclaimer below) provided that the following conditions are
* met:
*
* Redistributions of source code must retain the above copyright notice, this list of conditions
* and the following disclaimer.
*
* Redistributions in binary form must reproduce the above copyright notice, this list of conditions
* and the following disclaimer in the documentation and/or other materials provided with the
* distribution.
*
* Neither the name of Craig MacFarlane nor the names of its contributors may be used to
* endorse or promote products derived from this software without specific prior written permission.
*
* NO EXPRESS OR IMPLIED LICENSES TO ANY PARTY'S PATENT RIGHTS ARE GRANTED BY THIS LICENSE. THIS
* SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS" AND ANY EXPRESS OR IMPLIED
* WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS
* FOR A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR CONTRIBUTORS BE
* LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES
* (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA,
* OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN
* CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF
* THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
*/
package org.firstinspires.ftc.robotcontroller.external.samples;
import com.qualcomm.hardware.rev.RevBlinkinLedDriver;
import com.qualcomm.robotcore.eventloop.opmode.Disabled;
import com.qualcomm.robotcore.eventloop.opmode.OpMode;
import com.qualcomm.robotcore.eventloop.opmode.TeleOp;
import org.firstinspires.ftc.robotcore.external.Telemetry;
import org.firstinspires.ftc.robotcore.internal.system.Deadline;
import java.util.concurrent.TimeUnit;
/*
* This OpMode demonstrates use of the REV Robotics Blinkin LED Driver.
* AUTO mode cycles through all of the patterns.
* MANUAL mode allows the user to manually change patterns using the
* left and right bumpers of a gamepad.
*
* Configure the driver on a servo port, and name it "blinkin".
*
* Displays the first pattern upon init.
*/
@TeleOp(name="BlinkinExample")
@Disabled
public class SampleRevBlinkinLedDriver extends OpMode {
/*
* Change the pattern every 10 seconds in AUTO mode.
*/
private final static int LED_PERIOD = 10;
/*
* Rate limit gamepad button presses to every 500ms.
*/
private final static int GAMEPAD_LOCKOUT = 500;
RevBlinkinLedDriver blinkinLedDriver;
RevBlinkinLedDriver.BlinkinPattern pattern;
Telemetry.Item patternName;
Telemetry.Item display;
DisplayKind displayKind;
Deadline ledCycleDeadline;
Deadline gamepadRateLimit;
protected enum DisplayKind {
MANUAL,
AUTO
}
@Override
public void init()
{
displayKind = DisplayKind.AUTO;
blinkinLedDriver = hardwareMap.get(RevBlinkinLedDriver.class, "blinkin");
pattern = RevBlinkinLedDriver.BlinkinPattern.RAINBOW_RAINBOW_PALETTE;
blinkinLedDriver.setPattern(pattern);
display = telemetry.addData("Display Kind: ", displayKind.toString());
patternName = telemetry.addData("Pattern: ", pattern.toString());
ledCycleDeadline = new Deadline(LED_PERIOD, TimeUnit.SECONDS);
gamepadRateLimit = new Deadline(GAMEPAD_LOCKOUT, TimeUnit.MILLISECONDS);
}
@Override
public void loop()
{
handleGamepad();
if (displayKind == DisplayKind.AUTO) {
doAutoDisplay();
} else {
/*
* MANUAL mode: Nothing to do, setting the pattern as a result of a gamepad event.
*/
}
}
/*
* handleGamepad
*
* Responds to a gamepad button press. Demonstrates rate limiting for
* button presses. If loop() is called every 10ms and and you don't rate
* limit, then any given button press may register as multiple button presses,
* which in this application is problematic.
*
* A: Manual mode, Right bumper displays the next pattern, left bumper displays the previous pattern.
* B: Auto mode, pattern cycles, changing every LED_PERIOD seconds.
*/
protected void handleGamepad()
{
if (!gamepadRateLimit.hasExpired()) {
return;
}
if (gamepad1.a) {
setDisplayKind(DisplayKind.MANUAL);
gamepadRateLimit.reset();
} else if (gamepad1.b) {
setDisplayKind(DisplayKind.AUTO);
gamepadRateLimit.reset();
} else if ((displayKind == DisplayKind.MANUAL) && (gamepad1.left_bumper)) {
pattern = pattern.previous();
displayPattern();
gamepadRateLimit.reset();
} else if ((displayKind == DisplayKind.MANUAL) && (gamepad1.right_bumper)) {
pattern = pattern.next();
displayPattern();
gamepadRateLimit.reset();
}
}
protected void setDisplayKind(DisplayKind displayKind)
{
this.displayKind = displayKind;
display.setValue(displayKind.toString());
}
protected void doAutoDisplay()
{
if (ledCycleDeadline.hasExpired()) {
pattern = pattern.next();
displayPattern();
ledCycleDeadline.reset();
}
}
protected void displayPattern()
{
blinkinLedDriver.setPattern(pattern);
patternName.setValue(pattern.toString());
}
}

View File

@ -0,0 +1,186 @@
/* Copyright (c) 2017 FIRST. All rights reserved.
*
* Redistribution and use in source and binary forms, with or without modification,
* are permitted (subject to the limitations in the disclaimer below) provided that
* the following conditions are met:
*
* Redistributions of source code must retain the above copyright notice, this list
* of conditions and the following disclaimer.
*
* Redistributions in binary form must reproduce the above copyright notice, this
* list of conditions and the following disclaimer in the documentation and/or
* other materials provided with the distribution.
*
* Neither the name of FIRST nor the names of its contributors may be used to endorse or
* promote products derived from this software without specific prior written permission.
*
* NO EXPRESS OR IMPLIED LICENSES TO ANY PARTY'S PATENT RIGHTS ARE GRANTED BY THIS
* LICENSE. THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
* "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO,
* THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE
* ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR CONTRIBUTORS BE LIABLE
* FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL
* DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR
* SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER
* CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY,
* OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
* OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
*/
package org.firstinspires.ftc.robotcontroller.external.samples;
import com.qualcomm.hardware.bosch.BNO055IMU;
import com.qualcomm.hardware.bosch.JustLoggingAccelerationIntegrator;
import com.qualcomm.robotcore.eventloop.opmode.Disabled;
import com.qualcomm.robotcore.eventloop.opmode.LinearOpMode;
import com.qualcomm.robotcore.eventloop.opmode.TeleOp;
import org.firstinspires.ftc.robotcore.external.Func;
import org.firstinspires.ftc.robotcore.external.navigation.Acceleration;
import org.firstinspires.ftc.robotcore.external.navigation.AngleUnit;
import org.firstinspires.ftc.robotcore.external.navigation.AxesOrder;
import org.firstinspires.ftc.robotcore.external.navigation.AxesReference;
import org.firstinspires.ftc.robotcore.external.navigation.Orientation;
import org.firstinspires.ftc.robotcore.external.navigation.Position;
import org.firstinspires.ftc.robotcore.external.navigation.Velocity;
import java.util.Locale;
/*
* This OpMode gives a short demo on how to use the BNO055 Inertial Motion Unit (IMU) from AdaFruit.
*
* Note: this is a Legacy example that will not work with newer Control/Expansion Hubs that use a different IMU
* Please use the new SensorIMUOrthogonal or SensorIMUNonOrthogonal samples for a more universal IMU interface.
*
* Use Android Studio to Copy this Class, and Paste it into your team's code folder with a new name.
* Remove or comment out the @Disabled line to add this OpMode to the Driver Station OpMode list
*
* @see <a href="http://www.adafruit.com/products/2472">Adafruit IMU</a>
*/
@TeleOp(name = "Sensor: BNO055 IMU", group = "Sensor")
@Disabled // Comment this out to add to the OpMode list
public class SensorBNO055IMU extends LinearOpMode
{
//----------------------------------------------------------------------------------------------
// State
//----------------------------------------------------------------------------------------------
// The IMU sensor object
BNO055IMU imu;
// State used for updating telemetry
Orientation angles;
Acceleration gravity;
//----------------------------------------------------------------------------------------------
// Main logic
//----------------------------------------------------------------------------------------------
@Override public void runOpMode() {
// Set up the parameters with which we will use our IMU. Note that integration
// algorithm here just reports accelerations to the logcat log; it doesn't actually
// provide positional information.
BNO055IMU.Parameters parameters = new BNO055IMU.Parameters();
parameters.angleUnit = BNO055IMU.AngleUnit.DEGREES;
parameters.accelUnit = BNO055IMU.AccelUnit.METERS_PERSEC_PERSEC;
parameters.calibrationDataFile = "BNO055IMUCalibration.json"; // see the calibration sample OpMode
parameters.loggingEnabled = true;
parameters.loggingTag = "IMU";
parameters.accelerationIntegrationAlgorithm = new JustLoggingAccelerationIntegrator();
// Retrieve and initialize the IMU. We expect the IMU to be attached to an I2C port
// on a Core Device Interface Module, configured to be a sensor of type "AdaFruit IMU",
// and named "imu".
imu = hardwareMap.get(BNO055IMU.class, "imu");
imu.initialize(parameters);
// Set up our telemetry dashboard
composeTelemetry();
// Wait until we're told to go
waitForStart();
// Start the logging of measured acceleration
imu.startAccelerationIntegration(new Position(), new Velocity(), 1000);
// Loop and update the dashboard
while (opModeIsActive()) {
telemetry.update();
}
}
//----------------------------------------------------------------------------------------------
// Telemetry Configuration
//----------------------------------------------------------------------------------------------
void composeTelemetry() {
// At the beginning of each telemetry update, grab a bunch of data
// from the IMU that we will then display in separate lines.
telemetry.addAction(new Runnable() { @Override public void run()
{
// Acquiring the angles is relatively expensive; we don't want
// to do that in each of the three items that need that info, as that's
// three times the necessary expense.
angles = imu.getAngularOrientation(AxesReference.INTRINSIC, AxesOrder.ZYX, AngleUnit.DEGREES);
gravity = imu.getGravity();
}
});
telemetry.addLine()
.addData("status", new Func<String>() {
@Override public String value() {
return imu.getSystemStatus().toShortString();
}
})
.addData("calib", new Func<String>() {
@Override public String value() {
return imu.getCalibrationStatus().toString();
}
});
telemetry.addLine()
.addData("heading", new Func<String>() {
@Override public String value() {
return formatAngle(angles.angleUnit, angles.firstAngle);
}
})
.addData("roll", new Func<String>() {
@Override public String value() {
return formatAngle(angles.angleUnit, angles.secondAngle);
}
})
.addData("pitch", new Func<String>() {
@Override public String value() {
return formatAngle(angles.angleUnit, angles.thirdAngle);
}
});
telemetry.addLine()
.addData("grvty", new Func<String>() {
@Override public String value() {
return gravity.toString();
}
})
.addData("mag", new Func<String>() {
@Override public String value() {
return String.format(Locale.getDefault(), "%.3f",
Math.sqrt(gravity.xAccel*gravity.xAccel
+ gravity.yAccel*gravity.yAccel
+ gravity.zAccel*gravity.zAccel));
}
});
}
//----------------------------------------------------------------------------------------------
// Formatting
//----------------------------------------------------------------------------------------------
String formatAngle(AngleUnit angleUnit, double angle) {
return formatDegrees(AngleUnit.DEGREES.fromUnit(angleUnit, angle));
}
String formatDegrees(double degrees){
return String.format(Locale.getDefault(), "%.1f", AngleUnit.DEGREES.normalize(degrees));
}
}

View File

@ -0,0 +1,230 @@
/* Copyright (c) 2017 FIRST. All rights reserved.
*
* Redistribution and use in source and binary forms, with or without modification,
* are permitted (subject to the limitations in the disclaimer below) provided that
* the following conditions are met:
*
* Redistributions of source code must retain the above copyright notice, this list
* of conditions and the following disclaimer.
*
* Redistributions in binary form must reproduce the above copyright notice, this
* list of conditions and the following disclaimer in the documentation and/or
* other materials provided with the distribution.
*
* Neither the name of FIRST nor the names of its contributors may be used to endorse or
* promote products derived from this software without specific prior written permission.
*
* NO EXPRESS OR IMPLIED LICENSES TO ANY PARTY'S PATENT RIGHTS ARE GRANTED BY THIS
* LICENSE. THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
* "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO,
* THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE
* ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR CONTRIBUTORS BE LIABLE
* FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL
* DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR
* SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER
* CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY,
* OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
* OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
*/
package org.firstinspires.ftc.robotcontroller.external.samples;
import com.qualcomm.hardware.bosch.BNO055IMU;
import com.qualcomm.robotcore.eventloop.opmode.Disabled;
import com.qualcomm.robotcore.eventloop.opmode.LinearOpMode;
import com.qualcomm.robotcore.eventloop.opmode.TeleOp;
import com.qualcomm.robotcore.util.ReadWriteFile;
import org.firstinspires.ftc.robotcore.external.Func;
import org.firstinspires.ftc.robotcore.external.navigation.AngleUnit;
import org.firstinspires.ftc.robotcore.external.navigation.AxesOrder;
import org.firstinspires.ftc.robotcore.external.navigation.AxesReference;
import org.firstinspires.ftc.robotcore.external.navigation.Orientation;
import org.firstinspires.ftc.robotcore.internal.system.AppUtil;
import java.io.File;
import java.util.Locale;
/*
* This OpMode calibrates a BNO055 IMU per
* "Section 3.11 Calibration" of the BNO055 specification.
*
* Note: this is a Legacy example that will not work with newer Control/Expansion Hubs that use a different IMU
* Please use the new SensorIMUOrthogonal or SensorIMUNonOrthogonal samples for a more universal IMU interface.
*
* Manual calibration of the IMU is definitely NOT necessary: except for the magnetometer (which is not used by the
* default "IMU" SensorMode), the BNO055 is internally self-calibrating and thus can be very successfully used without
* manual intervention. That said, performing a one-time calibration, saving the results persistently, then loading them
* again at each run can help reduce the time that automatic calibration requires.
*
* This summary of the calibration process from Intel is informative:
* http://iotdk.intel.com/docs/master/upm/classupm_1_1_b_n_o055.html
*
* "This device requires calibration in order to operate accurately. [...] Calibration data is
* lost on a power cycle. See one of the examples for a description of how to calibrate the device,
* but in essence:
*
* There is a calibration status register available [...] that returns the calibration status
* of the accelerometer (ACC), magnetometer (MAG), gyroscope (GYR), and overall system (SYS).
* Each of these values range from 0 (uncalibrated) to 3 (fully calibrated). Calibration [ideally]
* involves certain motions to get all 4 values at 3. The motions are as follows (though see the
* datasheet for more information):
*
* 1. GYR: Simply let the sensor sit flat for a few seconds.</ol>
* 2. ACC: Move the sensor in various positions. Start flat, then rotate slowly by 45
* degrees, hold for a few seconds, then continue rotating another 45 degrees and
* hold, etc. 6 or more movements of this type may be required. You can move through
* any axis you desire, but make sure that the device is lying at least once
* perpendicular to the x, y, and z axis.</ol>
* 3. MAG: Move slowly in a figure 8 pattern in the air, until the calibration values reaches 3.</ol>
* 4. SYS: This will usually reach 3 when the other items have also reached 3. If not, continue
* slowly moving the device though various axes until it does."</ol>
*
* To calibrate the IMU, run this sample OpMode with a gamepad attached to the driver station.
* Once the IMU has reached sufficient calibration as reported on telemetry, press the 'A'
* button on the gamepad to write the calibration to a file. That file can then be indicated
* later when running an OpMode which uses the IMU.
*
* Note: if your intended uses of the IMU do not include use of all its sensors (for example,
* you might not use the magnetometer), then it makes little sense for you to wait for full
* calibration of the sensors you are not using before saving the calibration data. Indeed,
* it appears that in a SensorMode that doesn't use the magnetometer (for example), the
* magnetometer cannot actually be calibrated.
*
* References:
* The AdafruitBNO055IMU Javadoc
* The BNO055IMU.Parameters.calibrationDataFile Javadoc
* The BNO055 product page: https://www.bosch-sensortec.com/bst/products/all_products/bno055
* The BNO055 datasheet: https://www.bosch-sensortec.com/media/boschsensortec/downloads/datasheets/bst-bno055-ds000.pdf
*/
@TeleOp(name = "Sensor: BNO055 IMU Calibration", group = "Sensor")
@Disabled // Uncomment this to add to the OpMode list
public class SensorBNO055IMUCalibration extends LinearOpMode
{
//----------------------------------------------------------------------------------------------
// State
//----------------------------------------------------------------------------------------------
// Our sensors, motors, and other devices go here, along with other long term state
BNO055IMU imu;
// State used for updating telemetry
Orientation angles;
//----------------------------------------------------------------------------------------------
// Main logic
//----------------------------------------------------------------------------------------------
@Override public void runOpMode() {
telemetry.log().setCapacity(12);
telemetry.log().add("");
telemetry.log().add("Please refer to the calibration instructions");
telemetry.log().add("contained in the Adafruit IMU calibration");
telemetry.log().add("sample OpMode.");
telemetry.log().add("");
telemetry.log().add("When sufficient calibration has been reached,");
telemetry.log().add("press the 'A' button to write the current");
telemetry.log().add("calibration data to a file.");
telemetry.log().add("");
// We are expecting the IMU to be attached to an I2C port on a Core Device Interface Module and named "imu".
BNO055IMU.Parameters parameters = new BNO055IMU.Parameters();
parameters.loggingEnabled = true;
parameters.loggingTag = "IMU";
imu = hardwareMap.get(BNO055IMU.class, "imu");
imu.initialize(parameters);
composeTelemetry();
telemetry.log().add("Waiting for start...");
// Wait until we're told to go
while (!isStarted()) {
telemetry.update();
idle();
}
telemetry.log().add("...started...");
while (opModeIsActive()) {
if (gamepad1.a) {
// Get the calibration data
BNO055IMU.CalibrationData calibrationData = imu.readCalibrationData();
// Save the calibration data to a file. You can choose whatever file
// name you wish here, but you'll want to indicate the same file name
// when you initialize the IMU in an OpMode in which it is used. If you
// have more than one IMU on your robot, you'll of course want to use
// different configuration file names for each.
String filename = "AdafruitIMUCalibration.json";
File file = AppUtil.getInstance().getSettingsFile(filename);
ReadWriteFile.writeFile(file, calibrationData.serialize());
telemetry.log().add("saved to '%s'", filename);
// Wait for the button to be released
while (gamepad1.a) {
telemetry.update();
idle();
}
}
telemetry.update();
}
}
void composeTelemetry() {
// At the beginning of each telemetry update, grab a bunch of data
// from the IMU that we will then display in separate lines.
telemetry.addAction(new Runnable() { @Override public void run()
{
// Acquiring the angles is relatively expensive; we don't want
// to do that in each of the three items that need that info, as that's
// three times the necessary expense.
angles = imu.getAngularOrientation(AxesReference.INTRINSIC, AxesOrder.ZYX, AngleUnit.DEGREES);
}
});
telemetry.addLine()
.addData("status", new Func<String>() {
@Override public String value() {
return imu.getSystemStatus().toShortString();
}
})
.addData("calib", new Func<String>() {
@Override public String value() {
return imu.getCalibrationStatus().toString();
}
});
telemetry.addLine()
.addData("heading", new Func<String>() {
@Override public String value() {
return formatAngle(angles.angleUnit, angles.firstAngle);
}
})
.addData("roll", new Func<String>() {
@Override public String value() {
return formatAngle(angles.angleUnit, angles.secondAngle);
}
})
.addData("pitch", new Func<String>() {
@Override public String value() {
return formatAngle(angles.angleUnit, angles.thirdAngle);
}
});
}
//----------------------------------------------------------------------------------------------
// Formatting
//----------------------------------------------------------------------------------------------
String formatAngle(AngleUnit angleUnit, double angle) {
return formatDegrees(AngleUnit.DEGREES.fromUnit(angleUnit, angle));
}
String formatDegrees(double degrees){
return String.format(Locale.getDefault(), "%.1f", AngleUnit.DEGREES.normalize(degrees));
}
}

View File

@ -0,0 +1,221 @@
/* Copyright (c) 2017-2020 FIRST. All rights reserved.
*
* Redistribution and use in source and binary forms, with or without modification,
* are permitted (subject to the limitations in the disclaimer below) provided that
* the following conditions are met:
*
* Redistributions of source code must retain the above copyright notice, this list
* of conditions and the following disclaimer.
*
* Redistributions in binary form must reproduce the above copyright notice, this
* list of conditions and the following disclaimer in the documentation and/or
* other materials provided with the distribution.
*
* Neither the name of FIRST nor the names of its contributors may be used to endorse or
* promote products derived from this software without specific prior written permission.
*
* NO EXPRESS OR IMPLIED LICENSES TO ANY PARTY'S PATENT RIGHTS ARE GRANTED BY THIS
* LICENSE. THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
* "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO,
* THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE
* ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR CONTRIBUTORS BE LIABLE
* FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL
* DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR
* SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER
* CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY,
* OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
* OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
*/
package org.firstinspires.ftc.robotcontroller.external.samples;
import android.app.Activity;
import android.graphics.Color;
import android.view.View;
import com.qualcomm.robotcore.eventloop.opmode.Disabled;
import com.qualcomm.robotcore.eventloop.opmode.LinearOpMode;
import com.qualcomm.robotcore.eventloop.opmode.TeleOp;
import com.qualcomm.robotcore.hardware.DistanceSensor;
import com.qualcomm.robotcore.hardware.NormalizedColorSensor;
import com.qualcomm.robotcore.hardware.NormalizedRGBA;
import com.qualcomm.robotcore.hardware.SwitchableLight;
import org.firstinspires.ftc.robotcore.external.navigation.DistanceUnit;
/*
* This OpMode shows how to use a color sensor in a generic
* way, regardless of which particular make or model of color sensor is used. The OpMode
* assumes that the color sensor is configured with a name of "sensor_color".
*
* There will be some variation in the values measured depending on the specific sensor you are using.
*
* You can increase the gain (a multiplier to make the sensor report higher values) by holding down
* the A button on the gamepad, and decrease the gain by holding down the B button on the gamepad.
*
* If the color sensor has a light which is controllable from software, you can use the X button on
* the gamepad to toggle the light on and off. The REV sensors don't support this, but instead have
* a physical switch on them to turn the light on and off, beginning with REV Color Sensor V2.
*
* If the color sensor also supports short-range distance measurements (usually via an infrared
* proximity sensor), the reported distance will be written to telemetry. As of September 2020,
* the only color sensors that support this are the ones from REV Robotics. These infrared proximity
* sensor measurements are only useful at very small distances, and are sensitive to ambient light
* and surface reflectivity. You should use a different sensor if you need precise distance measurements.
*
* Use Android Studio to Copy this Class, and Paste it into your team's code folder with a new name.
* Remove or comment out the @Disabled line to add this OpMode to the Driver Station OpMode list
*/
@TeleOp(name = "Sensor: Color", group = "Sensor")
@Disabled
public class SensorColor extends LinearOpMode {
/** The colorSensor field will contain a reference to our color sensor hardware object */
NormalizedColorSensor colorSensor;
/** The relativeLayout field is used to aid in providing interesting visual feedback
* in this sample application; you probably *don't* need this when you use a color sensor on your
* robot. Note that you won't see anything change on the Driver Station, only on the Robot Controller. */
View relativeLayout;
/*
* The runOpMode() method is the root of this OpMode, as it is in all LinearOpModes.
* Our implementation here, though is a bit unusual: we've decided to put all the actual work
* in the runSample() method rather than directly in runOpMode() itself. The reason we do that is
* that in this sample we're changing the background color of the robot controller screen as the
* OpMode runs, and we want to be able to *guarantee* that we restore it to something reasonable
* and palatable when the OpMode ends. The simplest way to do that is to use a try...finally
* block around the main, core logic, and an easy way to make that all clear was to separate
* the former from the latter in separate methods.
*/
@Override public void runOpMode() {
// Get a reference to the RelativeLayout so we can later change the background
// color of the Robot Controller app to match the hue detected by the RGB sensor.
int relativeLayoutId = hardwareMap.appContext.getResources().getIdentifier("RelativeLayout", "id", hardwareMap.appContext.getPackageName());
relativeLayout = ((Activity) hardwareMap.appContext).findViewById(relativeLayoutId);
try {
runSample(); // actually execute the sample
} finally {
// On the way out, *guarantee* that the background is reasonable. It doesn't actually start off
// as pure white, but it's too much work to dig out what actually was used, and this is good
// enough to at least make the screen reasonable again.
// Set the panel back to the default color
relativeLayout.post(new Runnable() {
public void run() {
relativeLayout.setBackgroundColor(Color.WHITE);
}
});
}
}
protected void runSample() {
// You can give the sensor a gain value, will be multiplied by the sensor's raw value before the
// normalized color values are calculated. Color sensors (especially the REV Color Sensor V3)
// can give very low values (depending on the lighting conditions), which only use a small part
// of the 0-1 range that is available for the red, green, and blue values. In brighter conditions,
// you should use a smaller gain than in dark conditions. If your gain is too high, all of the
// colors will report at or near 1, and you won't be able to determine what color you are
// actually looking at. For this reason, it's better to err on the side of a lower gain
// (but always greater than or equal to 1).
float gain = 2;
// Once per loop, we will update this hsvValues array. The first element (0) will contain the
// hue, the second element (1) will contain the saturation, and the third element (2) will
// contain the value. See http://web.archive.org/web/20190311170843/https://infohost.nmt.edu/tcc/help/pubs/colortheory/web/hsv.html
// for an explanation of HSV color.
final float[] hsvValues = new float[3];
// xButtonPreviouslyPressed and xButtonCurrentlyPressed keep track of the previous and current
// state of the X button on the gamepad
boolean xButtonPreviouslyPressed = false;
boolean xButtonCurrentlyPressed = false;
// Get a reference to our sensor object. It's recommended to use NormalizedColorSensor over
// ColorSensor, because NormalizedColorSensor consistently gives values between 0 and 1, while
// the values you get from ColorSensor are dependent on the specific sensor you're using.
colorSensor = hardwareMap.get(NormalizedColorSensor.class, "sensor_color");
// If possible, turn the light on in the beginning (it might already be on anyway,
// we just make sure it is if we can).
if (colorSensor instanceof SwitchableLight) {
((SwitchableLight)colorSensor).enableLight(true);
}
// Wait for the start button to be pressed.
waitForStart();
// Loop until we are asked to stop
while (opModeIsActive()) {
// Explain basic gain information via telemetry
telemetry.addLine("Hold the A button on gamepad 1 to increase gain, or B to decrease it.\n");
telemetry.addLine("Higher gain values mean that the sensor will report larger numbers for Red, Green, and Blue, and Value\n");
// Update the gain value if either of the A or B gamepad buttons is being held
if (gamepad1.a) {
// Only increase the gain by a small amount, since this loop will occur multiple times per second.
gain += 0.005;
} else if (gamepad1.b && gain > 1) { // A gain of less than 1 will make the values smaller, which is not helpful.
gain -= 0.005;
}
// Show the gain value via telemetry
telemetry.addData("Gain", gain);
// Tell the sensor our desired gain value (normally you would do this during initialization,
// not during the loop)
colorSensor.setGain(gain);
// Check the status of the X button on the gamepad
xButtonCurrentlyPressed = gamepad1.x;
// If the button state is different than what it was, then act
if (xButtonCurrentlyPressed != xButtonPreviouslyPressed) {
// If the button is (now) down, then toggle the light
if (xButtonCurrentlyPressed) {
if (colorSensor instanceof SwitchableLight) {
SwitchableLight light = (SwitchableLight)colorSensor;
light.enableLight(!light.isLightOn());
}
}
}
xButtonPreviouslyPressed = xButtonCurrentlyPressed;
// Get the normalized colors from the sensor
NormalizedRGBA colors = colorSensor.getNormalizedColors();
/* Use telemetry to display feedback on the driver station. We show the red, green, and blue
* normalized values from the sensor (in the range of 0 to 1), as well as the equivalent
* HSV (hue, saturation and value) values. See http://web.archive.org/web/20190311170843/https://infohost.nmt.edu/tcc/help/pubs/colortheory/web/hsv.html
* for an explanation of HSV color. */
// Update the hsvValues array by passing it to Color.colorToHSV()
Color.colorToHSV(colors.toColor(), hsvValues);
telemetry.addLine()
.addData("Red", "%.3f", colors.red)
.addData("Green", "%.3f", colors.green)
.addData("Blue", "%.3f", colors.blue);
telemetry.addLine()
.addData("Hue", "%.3f", hsvValues[0])
.addData("Saturation", "%.3f", hsvValues[1])
.addData("Value", "%.3f", hsvValues[2]);
telemetry.addData("Alpha", "%.3f", colors.alpha);
/* If this color sensor also has a distance sensor, display the measured distance.
* Note that the reported distance is only useful at very close range, and is impacted by
* ambient light and surface reflectivity. */
if (colorSensor instanceof DistanceSensor) {
telemetry.addData("Distance (cm)", "%.3f", ((DistanceSensor) colorSensor).getDistance(DistanceUnit.CM));
}
telemetry.update();
// Change the Robot Controller's background color to match the color detected by the color sensor.
relativeLayout.post(new Runnable() {
public void run() {
relativeLayout.setBackgroundColor(Color.HSVToColor(hsvValues));
}
});
}
}
}

View File

@ -0,0 +1,78 @@
/* Copyright (c) 2024 FIRST. All rights reserved.
*
* Redistribution and use in source and binary forms, with or without modification,
* are permitted (subject to the limitations in the disclaimer below) provided that
* the following conditions are met:
*
* Redistributions of source code must retain the above copyright notice, this list
* of conditions and the following disclaimer.
*
* Redistributions in binary form must reproduce the above copyright notice, this
* list of conditions and the following disclaimer in the documentation and/or
* other materials provided with the distribution.
*
* Neither the name of FIRST nor the names of its contributors may be used to endorse or
* promote products derived from this software without specific prior written permission.
*
* NO EXPRESS OR IMPLIED LICENSES TO ANY PARTY'S PATENT RIGHTS ARE GRANTED BY THIS
* LICENSE. THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
* "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO,
* THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE
* ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR CONTRIBUTORS BE LIABLE
* FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL
* DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR
* SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER
* CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY,
* OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
* OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
*/
package org.firstinspires.ftc.robotcontroller.external.samples;
import com.qualcomm.robotcore.eventloop.opmode.Disabled;
import com.qualcomm.robotcore.eventloop.opmode.LinearOpMode;
import com.qualcomm.robotcore.eventloop.opmode.TeleOp;
import com.qualcomm.robotcore.hardware.DigitalChannel;
/*
* This OpMode demonstrates how to use a digital channel.
*
* The OpMode assumes that the digital channel is configured with a name of "digitalTouch".
*
* Use Android Studio to Copy this Class, and Paste it into your team's code folder with a new name.
* Remove or comment out the @Disabled line to add this OpMode to the Driver Station OpMode list.
*/
@TeleOp(name = "Sensor: digital channel", group = "Sensor")
@Disabled
public class SensorDigitalTouch extends LinearOpMode {
DigitalChannel digitalTouch; // Digital channel Object
@Override
public void runOpMode() {
// get a reference to our touchSensor object.
digitalTouch = hardwareMap.get(DigitalChannel.class, "digitalTouch");
digitalTouch.setMode(DigitalChannel.Mode.INPUT);
telemetry.addData("DigitalTouchSensorExample", "Press start to continue...");
telemetry.update();
// wait for the start button to be pressed.
waitForStart();
// while the OpMode is active, loop and read the digital channel.
// Note we use opModeIsActive() as our loop condition because it is an interruptible method.
while (opModeIsActive()) {
// button is pressed if value returned is LOW or false.
// send the info back to driver station using telemetry function.
if (digitalTouch.getState() == false) {
telemetry.addData("Button", "PRESSED");
} else {
telemetry.addData("Button", "NOT PRESSED");
}
telemetry.update();
}
}
}
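The sample above only reads a digital input. The same DigitalChannel interface can also drive a digital output; the following is a minimal sketch, assuming a hypothetical device configured as "digital_led", placed inside runOpMode() of a LinearOpMode:

// Sketch only: drive a digital output (e.g. a discrete LED) from a gamepad button.
// "digital_led" is a hypothetical configuration name.
DigitalChannel digitalLed = hardwareMap.get(DigitalChannel.class, "digital_led");
digitalLed.setMode(DigitalChannel.Mode.OUTPUT);
waitForStart();
while (opModeIsActive()) {
    boolean on = gamepad1.a;      // LED follows the A button
    digitalLed.setState(on);
    telemetry.addData("LED", on ? "ON" : "OFF");
    telemetry.update();
}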

View File

@ -0,0 +1,149 @@
/*
Copyright (c) 2023 FIRST
All rights reserved.
Redistribution and use in source and binary forms, with or without modification,
are permitted (subject to the limitations in the disclaimer below) provided that
the following conditions are met:
Redistributions of source code must retain the above copyright notice, this list
of conditions and the following disclaimer.
Redistributions in binary form must reproduce the above copyright notice, this
list of conditions and the following disclaimer in the documentation and/or
other materials provided with the distribution.
Neither the name of FIRST nor the names of its contributors may be used to
endorse or promote products derived from this software without specific prior
written permission.
NO EXPRESS OR IMPLIED LICENSES TO ANY PARTY'S PATENT RIGHTS ARE GRANTED BY THIS
LICENSE. THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
"AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO,
THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE
ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR CONTRIBUTORS BE LIABLE
FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL
DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR
SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER
CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR
TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF
THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
*/
package org.firstinspires.ftc.robotcontroller.external.samples;
import com.qualcomm.hardware.dfrobot.HuskyLens;
import com.qualcomm.robotcore.eventloop.opmode.Disabled;
import com.qualcomm.robotcore.eventloop.opmode.LinearOpMode;
import com.qualcomm.robotcore.eventloop.opmode.TeleOp;
import org.firstinspires.ftc.robotcore.internal.system.Deadline;
import java.util.concurrent.TimeUnit;
/*
* This OpMode illustrates how to use the DFRobot HuskyLens.
*
* The HuskyLens is a Vision Sensor with a built-in object detection model. It can
* detect a number of predefined objects and AprilTags in the 36h11 family, can
* recognize colors, and can be trained to detect custom objects. See this website for
* documentation: https://wiki.dfrobot.com/HUSKYLENS_V1.0_SKU_SEN0305_SEN0336
*
* This sample illustrates how to detect AprilTags, but can be used to detect other types
* of objects by changing the algorithm. It assumes that the HuskyLens is configured with
* a name of "huskylens".
*
* Use Android Studio to Copy this Class, and Paste it into your team's code folder with a new name.
* Remove or comment out the @Disabled line to add this OpMode to the Driver Station OpMode list
*/
@TeleOp(name = "Sensor: HuskyLens", group = "Sensor")
@Disabled
public class SensorHuskyLens extends LinearOpMode {
private final int READ_PERIOD = 1;
private HuskyLens huskyLens;
@Override
public void runOpMode()
{
huskyLens = hardwareMap.get(HuskyLens.class, "huskylens");
/*
* This sample rate-limits the reads solely to allow a user time to observe
* what is happening on the Driver Station telemetry. Typical applications
* would not likely rate limit.
*/
Deadline rateLimit = new Deadline(READ_PERIOD, TimeUnit.SECONDS);
/*
* Immediately expire so that the first time through we'll do the read.
*/
rateLimit.expire();
/*
* Basic check to see if the device is alive and communicating. This is not
* technically necessary here as the HuskyLens class does this in its
* doInitialization() method which is called when the device is pulled out of
* the hardware map. However, sometimes it's unclear why a device reports as
* failing on initialization. In the case of this device, it's because the
* call to knock() failed.
*/
if (!huskyLens.knock()) {
telemetry.addData(">>", "Problem communicating with " + huskyLens.getDeviceName());
} else {
telemetry.addData(">>", "Press start to continue");
}
/*
* The device uses the concept of an algorithm to determine what types of
* objects it will look for and/or what mode it is in. The algorithm may be
* selected using the scroll wheel on the device, or via software as shown in
* the call to selectAlgorithm().
*
* The SDK itself does not assume that the user wants a particular algorithm on
* startup, and hence does not set an algorithm.
*
* Users should, in general, explicitly choose the algorithm they want to use
* within the OpMode by calling selectAlgorithm() and passing it one of the values
* found in the enumeration HuskyLens.Algorithm.
*/
huskyLens.selectAlgorithm(HuskyLens.Algorithm.TAG_RECOGNITION);
telemetry.update();
waitForStart();
/*
* Looking for AprilTags per the call to selectAlgorithm() above. A handy grid
* for testing may be found at https://wiki.dfrobot.com/HUSKYLENS_V1.0_SKU_SEN0305_SEN0336#target_20.
*
* Note again that the device only recognizes the 36h11 family of tags out of the box.
*/
while(opModeIsActive()) {
if (!rateLimit.hasExpired()) {
continue;
}
rateLimit.reset();
/*
* All algorithms, except for LINE_TRACKING, return a list of Blocks where a
* Block represents the outline of a recognized object along with its ID number.
* ID numbers allow you to identify what the device saw. See the HuskyLens documentation
* referenced in the header comment above for more information on IDs and how to
* assign them to objects.
*
* Returns an empty array if no objects are seen.
*/
HuskyLens.Block[] blocks = huskyLens.blocks();
telemetry.addData("Block count", blocks.length);
for (int i = 0; i < blocks.length; i++) {
telemetry.addData("Block", blocks[i].toString());
}
telemetry.update();
}
}
}
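The header comment above explains that blocks() returns recognized objects along with their ID numbers. As a minimal sketch of acting on one specific ID, assuming a hypothetical TARGET_ID and the public x/y/id fields on HuskyLens.Block:

// Sketch only: inside the read loop above, react when a particular AprilTag ID is seen.
final int TARGET_ID = 1;                      // illustrative ID; assign IDs on the HuskyLens itself
HuskyLens.Block[] blocks = huskyLens.blocks();
for (HuskyLens.Block block : blocks) {
    if (block.id == TARGET_ID) {
        telemetry.addData("Target tag", "seen at x=%d, y=%d", block.x, block.y);
    }
}
telemetry.update();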

View File

@ -0,0 +1,181 @@
/* Copyright (c) 2022 FIRST. All rights reserved.
*
* Redistribution and use in source and binary forms, with or without modification,
* are permitted (subject to the limitations in the disclaimer below) provided that
* the following conditions are met:
*
* Redistributions of source code must retain the above copyright notice, this list
* of conditions and the following disclaimer.
*
* Redistributions in binary form must reproduce the above copyright notice, this
* list of conditions and the following disclaimer in the documentation and/or
* other materials provided with the distribution.
*
* Neither the name of FIRST nor the names of its contributors may be used to endorse or
* promote products derived from this software without specific prior written permission.
*
* NO EXPRESS OR IMPLIED LICENSES TO ANY PARTY'S PATENT RIGHTS ARE GRANTED BY THIS
* LICENSE. THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
* "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO,
* THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE
* ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR CONTRIBUTORS BE LIABLE
* FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL
* DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR
* SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER
* CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY,
* OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
* OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
*/
package org.firstinspires.ftc.robotcontroller.external.samples;
import com.qualcomm.hardware.rev.RevHubOrientationOnRobot;
import com.qualcomm.robotcore.eventloop.opmode.Disabled;
import com.qualcomm.robotcore.eventloop.opmode.LinearOpMode;
import com.qualcomm.robotcore.eventloop.opmode.TeleOp;
import com.qualcomm.robotcore.hardware.IMU;
import org.firstinspires.ftc.robotcore.external.navigation.AngleUnit;
import org.firstinspires.ftc.robotcore.external.navigation.AngularVelocity;
import org.firstinspires.ftc.robotcore.external.navigation.Orientation;
import org.firstinspires.ftc.robotcore.external.navigation.YawPitchRollAngles;
import static com.qualcomm.hardware.rev.RevHubOrientationOnRobot.xyzOrientation;
/*
* This OpMode shows how to use the new universal IMU interface. This
* interface may be used with the BNO055 IMU or the BHI260 IMU. It assumes that an IMU is configured
* on the robot with the name "imu".
*
* The sample will display the current Yaw, Pitch and Roll of the robot.<br>
* With the correct orientation parameters selected, pitch/roll/yaw should act as follows:
* Pitch value should INCREASE as the robot is tipped UP at the front. (Rotation about X)
* Roll value should INCREASE as the robot is tipped UP at the left side. (Rotation about Y)
* Yaw value should INCREASE as the robot is rotated Counter Clockwise. (Rotation about Z)
*
* The yaw can be reset (to zero) by pressing the Y button on the gamepad (Triangle on a PS4 controller)
*
* This specific sample DOES NOT assume that the Hub is mounted on one of the three orthogonal
* planes (X/Y, X/Z or Y/Z) OR that the Hub has only been rotated in a range of 90 degree increments.
*
* Note: if your Hub is mounted Orthogonally (on an orthogonal surface, angled at some multiple of
* 90 Degrees) then you should use the simpler SensorImuOrthogonal sample in this folder.
*
* But... If your Hub is mounted Non-Orthogonally, you must specify one or more rotational angles
* that transform a "Default" Hub orientation into your desired orientation. That is what is
* illustrated here.
*
* Use Android Studio to Copy this Class, and Paste it into your team's code folder with a new name.
* Remove or comment out the @Disabled line to add this OpMode to the Driver Station OpMode list.
*
* Finally, edit this OpMode to use at least one angle around an axis to orient your Hub.
*/
@TeleOp(name = "Sensor: IMU Non-Orthogonal", group = "Sensor")
@Disabled // Comment this out to add to the OpMode list
public class SensorIMUNonOrthogonal extends LinearOpMode
{
// The IMU sensor object
IMU imu;
//----------------------------------------------------------------------------------------------
// Main logic
//----------------------------------------------------------------------------------------------
@Override public void runOpMode() throws InterruptedException {
// Retrieve and initialize the IMU.
// This sample expects the IMU to be in a REV Hub and named "imu".
imu = hardwareMap.get(IMU.class, "imu");
/* Define how the hub is mounted to the robot to get the correct Yaw, Pitch and Roll values.
*
* You can apply up to three axis rotations to orient your Hub according to how it's mounted on the robot.
*
* The starting point for these rotations is the "Default" Hub orientation, which is:
* 1) Hub laying flat on a horizontal surface, with the Printed Logo facing UP
* 2) Rotated such that the USB ports are facing forward on the robot.
*
* The order that the rotations are performed matters, so this sample shows doing them in the order X, Y, then Z.
* For specifying non-orthogonal hub mounting orientations, we must temporarily use axes
* defined relative to the Hub itself, instead of the usual Robot Coordinate System axes
* used for the results the IMU gives us. In the starting orientation, the Hub axes are
* aligned with the Robot Coordinate System:
*
* X Axis: Starting at Center of Hub, pointing out towards I2C connectors
* Y Axis: Starting at Center of Hub, pointing out towards USB connectors
* Z Axis: Starting at Center of Hub, pointing Up through LOGO
*
* Positive rotation is defined by right-hand rule with thumb pointing in +ve direction on axis.
*
* Some examples.
*
* ----------------------------------------------------------------------------------------------------------------------------------
* Example A) Assume that the hub is mounted on a sloped plate at the back of the robot, with the USB ports coming out the top of the hub.
* The plate is tilted UP 60 degrees from horizontal.
*
* To get the "Default" hub into this configuration you would just need a single rotation.
* 1) Rotate the Hub +60 degrees around the X axis to tilt up the front edge.
* 2) No rotation around the Y or Z axes.
*
* So the X,Y,Z rotations would be 60,0,0
*
* ----------------------------------------------------------------------------------------------------------------------------------
* Example B) Assume that the hub is laying flat on the chassis, but it has been twisted 30 degrees towards the right front wheel to make
* the USB cable accessible.
*
* To get the "Default" hub into this configuration you would just need a single rotation, but around a different axis.
* 1) No rotation around the X or Y axes.
* 2) Rotate the Hub -30 degrees (Clockwise) around the Z axis, since a positive angle would be Counter Clockwise.
*
* So the X,Y,Z rotations would be 0,0,-30
*
* ----------------------------------------------------------------------------------------------------------------------------------
* Example C) Assume that the hub is mounted on a vertical plate on the right side of the robot, with the Logo facing out, and the
* Hub rotated so that the USB ports are facing down 30 degrees towards the back wheels of the robot.
*
* To get the "Default" hub into this configuration will require several rotations.
* 1) Rotate the hub +90 degrees around the X axis to get it standing upright with the logo pointing backwards on the robot
* 2) Next, rotate the hub +90 around the Y axis to get it facing to the right.
* 3) Finally rotate the hub +120 degrees around the Z axis to take the USB ports from vertical to sloping down 30 degrees and
* facing towards the back of the robot.
*
* So the X,Y,Z rotations would be 90,90,120
*/
// The next three lines define the desired axis rotations.
// To Do: EDIT these values to match YOUR mounting configuration.
double xRotation = 0; // enter the desired X rotation angle here.
double yRotation = 0; // enter the desired Y rotation angle here.
double zRotation = 0; // enter the desired Z rotation angle here.
Orientation hubRotation = xyzOrientation(xRotation, yRotation, zRotation);
// Now initialize the IMU with this mounting orientation
RevHubOrientationOnRobot orientationOnRobot = new RevHubOrientationOnRobot(hubRotation);
imu.initialize(new IMU.Parameters(orientationOnRobot));
// Loop and update the dashboard
while (!isStopRequested()) {
telemetry.addData("Hub orientation", "X=%.1f, Y=%.1f, Z=%.1f \n", xRotation, yRotation, zRotation);
// Check to see if heading reset is requested
if (gamepad1.y) {
telemetry.addData("Yaw", "Resetting\n");
imu.resetYaw();
} else {
telemetry.addData("Yaw", "Press Y (triangle) on Gamepad to reset\n");
}
// Retrieve Rotational Angles and Velocities
YawPitchRollAngles orientation = imu.getRobotYawPitchRollAngles();
AngularVelocity angularVelocity = imu.getRobotAngularVelocity(AngleUnit.DEGREES);
telemetry.addData("Yaw (Z)", "%.2f Deg. (Heading)", orientation.getYaw(AngleUnit.DEGREES));
telemetry.addData("Pitch (X)", "%.2f Deg.", orientation.getPitch(AngleUnit.DEGREES));
telemetry.addData("Roll (Y)", "%.2f Deg.\n", orientation.getRoll(AngleUnit.DEGREES));
telemetry.addData("Yaw (Z) velocity", "%.2f Deg/Sec", angularVelocity.zRotationRate);
telemetry.addData("Pitch (X) velocity", "%.2f Deg/Sec", angularVelocity.xRotationRate);
telemetry.addData("Roll (Y) velocity", "%.2f Deg/Sec", angularVelocity.yRotationRate);
telemetry.update();
}
}
}
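For concreteness, a minimal sketch of Example C from the comment above, replacing the three placeholder rotation values in this sample:

// Sketch only: Hub on a vertical plate on the right side of the robot, logo facing out,
// USB ports angled 30 degrees down toward the back (Example C above).
double xRotation = 90;   // stand the Hub upright, logo pointing backwards
double yRotation = 90;   // face the logo out to the right side of the robot
double zRotation = 120;  // bring the USB ports from vertical to 30 degrees down, facing rearward
Orientation hubRotation = xyzOrientation(xRotation, yRotation, zRotation);
imu.initialize(new IMU.Parameters(new RevHubOrientationOnRobot(hubRotation)));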

View File

@ -0,0 +1,143 @@
/* Copyright (c) 2022 FIRST. All rights reserved.
*
* Redistribution and use in source and binary forms, with or without modification,
* are permitted (subject to the limitations in the disclaimer below) provided that
* the following conditions are met:
*
* Redistributions of source code must retain the above copyright notice, this list
* of conditions and the following disclaimer.
*
* Redistributions in binary form must reproduce the above copyright notice, this
* list of conditions and the following disclaimer in the documentation and/or
* other materials provided with the distribution.
*
* Neither the name of FIRST nor the names of its contributors may be used to endorse or
* promote products derived from this software without specific prior written permission.
*
* NO EXPRESS OR IMPLIED LICENSES TO ANY PARTY'S PATENT RIGHTS ARE GRANTED BY THIS
* LICENSE. THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
* "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO,
* THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE
* ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR CONTRIBUTORS BE LIABLE
* FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL
* DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR
* SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER
* CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY,
* OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
* OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
*/
package org.firstinspires.ftc.robotcontroller.external.samples;
import com.qualcomm.hardware.rev.RevHubOrientationOnRobot;
import com.qualcomm.robotcore.eventloop.opmode.Disabled;
import com.qualcomm.robotcore.eventloop.opmode.LinearOpMode;
import com.qualcomm.robotcore.eventloop.opmode.TeleOp;
import com.qualcomm.robotcore.hardware.IMU;
import org.firstinspires.ftc.robotcore.external.navigation.AngleUnit;
import org.firstinspires.ftc.robotcore.external.navigation.AngularVelocity;
import org.firstinspires.ftc.robotcore.external.navigation.YawPitchRollAngles;
/*
* This OpMode shows how to use the new universal IMU interface. This
* interface may be used with the BNO055 IMU or the BHI260 IMU. It assumes that an IMU is configured
* on the robot with the name "imu".
*
* The sample will display the current Yaw, Pitch and Roll of the robot.<br>
* With the correct orientation parameters selected, pitch/roll/yaw should act as follows:
* Pitch value should INCREASE as the robot is tipped UP at the front. (Rotation about X) <br>
* Roll value should INCREASE as the robot is tipped UP at the left side. (Rotation about Y) <br>
* Yaw value should INCREASE as the robot is rotated Counter Clockwise. (Rotation about Z) <br>
*
* The yaw can be reset (to zero) by pressing the Y button on the gamepad (Triangle on a PS4 controller)
*
* This specific sample assumes that the Hub is mounted on one of the three orthogonal planes
* (X/Y, X/Z or Y/Z) and that the Hub has only been rotated in a range of 90 degree increments.
*
* Note: if your Hub is mounted on a surface angled at some non-90 Degree multiple (like 30) look at
* the alternative SensorImuNonOrthogonal sample in this folder.
*
* This "Orthogonal" requirement means that:
*
* 1) The Logo printed on the top of the Hub can ONLY be pointing in one of six directions:
* FORWARD, BACKWARD, UP, DOWN, LEFT and RIGHT.
*
* 2) The USB ports can only be pointing in one of the same six directions:<br>
* FORWARD, BACKWARD, UP, DOWN, LEFT and RIGHT.
*
* So, to fully define how your Hub is mounted to the robot, you must simply specify:<br>
* logoFacingDirection<br>
* usbFacingDirection
*
* Use Android Studio to Copy this Class, and Paste it into your team's code folder with a new name.
* Remove or comment out the @Disabled line to add this OpMode to the Driver Station OpMode list.
*
* Finally, choose the two correct parameters to define how your Hub is mounted and edit this OpMode
* to use those parameters.
*/
@TeleOp(name = "Sensor: IMU Orthogonal", group = "Sensor")
@Disabled // Comment this out to add to the OpMode list
public class SensorIMUOrthogonal extends LinearOpMode
{
// The IMU sensor object
IMU imu;
//----------------------------------------------------------------------------------------------
// Main logic
//----------------------------------------------------------------------------------------------
@Override public void runOpMode() throws InterruptedException {
// Retrieve and initialize the IMU.
// This sample expects the IMU to be in a REV Hub and named "imu".
imu = hardwareMap.get(IMU.class, "imu");
/* Define how the hub is mounted on the robot to get the correct Yaw, Pitch and Roll values.
*
* Two input parameters are required to fully specify the Orientation.
* The first parameter specifies the direction the printed logo on the Hub is pointing.
* The second parameter specifies the direction the USB connector on the Hub is pointing.
* All directions are relative to the robot, and left/right is as-viewed from behind the robot.
*/
/* The next two lines define Hub orientation.
* The Default Orientation (shown) is when a hub is mounted horizontally with the printed logo pointing UP and the USB port pointing FORWARD.
*
* To Do: EDIT these two lines to match YOUR mounting configuration.
*/
RevHubOrientationOnRobot.LogoFacingDirection logoDirection = RevHubOrientationOnRobot.LogoFacingDirection.UP;
RevHubOrientationOnRobot.UsbFacingDirection usbDirection = RevHubOrientationOnRobot.UsbFacingDirection.FORWARD;
RevHubOrientationOnRobot orientationOnRobot = new RevHubOrientationOnRobot(logoDirection, usbDirection);
// Now initialize the IMU with this mounting orientation
// Note: if you choose two conflicting directions, this initialization will cause a code exception.
imu.initialize(new IMU.Parameters(orientationOnRobot));
// Loop and update the dashboard
while (!isStopRequested()) {
telemetry.addData("Hub orientation", "Logo=%s USB=%s\n ", logoDirection, usbDirection);
// Check to see if heading reset is requested
if (gamepad1.y) {
telemetry.addData("Yaw", "Resetting\n");
imu.resetYaw();
} else {
telemetry.addData("Yaw", "Press Y (triangle) on Gamepad to reset\n");
}
// Retrieve Rotational Angles and Velocities
YawPitchRollAngles orientation = imu.getRobotYawPitchRollAngles();
AngularVelocity angularVelocity = imu.getRobotAngularVelocity(AngleUnit.DEGREES);
telemetry.addData("Yaw (Z)", "%.2f Deg. (Heading)", orientation.getYaw(AngleUnit.DEGREES));
telemetry.addData("Pitch (X)", "%.2f Deg.", orientation.getPitch(AngleUnit.DEGREES));
telemetry.addData("Roll (Y)", "%.2f Deg.\n", orientation.getRoll(AngleUnit.DEGREES));
telemetry.addData("Yaw (Z) velocity", "%.2f Deg/Sec", angularVelocity.zRotationRate);
telemetry.addData("Pitch (X) velocity", "%.2f Deg/Sec", angularVelocity.xRotationRate);
telemetry.addData("Roll (Y) velocity", "%.2f Deg/Sec", angularVelocity.yRotationRate);
telemetry.update();
}
}
}
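A minimal sketch of an alternative orthogonal mounting, using the same initialization pattern as above (the assumed mounting here is logo LEFT, USB UP; any non-conflicting pair of directions works the same way):

// Sketch only: Hub fixed to a vertical side plate, logo facing LEFT, USB ports pointing UP.
RevHubOrientationOnRobot.LogoFacingDirection logoDirection = RevHubOrientationOnRobot.LogoFacingDirection.LEFT;
RevHubOrientationOnRobot.UsbFacingDirection  usbDirection  = RevHubOrientationOnRobot.UsbFacingDirection.UP;
imu.initialize(new IMU.Parameters(new RevHubOrientationOnRobot(logoDirection, usbDirection)));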

View File

@ -0,0 +1,128 @@
/* Copyright (c) 2017 FIRST. All rights reserved.
*
* Redistribution and use in source and binary forms, with or without modification,
* are permitted (subject to the limitations in the disclaimer below) provided that
* the following conditions are met:
*
* Redistributions of source code must retain the above copyright notice, this list
* of conditions and the following disclaimer.
*
* Redistributions in binary form must reproduce the above copyright notice, this
* list of conditions and the following disclaimer in the documentation and/or
* other materials provided with the distribution.
*
* Neither the name of FIRST nor the names of its contributors may be used to endorse or
* promote products derived from this software without specific prior written permission.
*
* NO EXPRESS OR IMPLIED LICENSES TO ANY PARTY'S PATENT RIGHTS ARE GRANTED BY THIS
* LICENSE. THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
* "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO,
* THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE
* ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR CONTRIBUTORS BE LIABLE
* FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL
* DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR
* SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER
* CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY,
* OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
* OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
*/
package org.firstinspires.ftc.robotcontroller.external.samples;
import com.qualcomm.hardware.kauailabs.NavxMicroNavigationSensor;
import com.qualcomm.robotcore.eventloop.opmode.Disabled;
import com.qualcomm.robotcore.eventloop.opmode.LinearOpMode;
import com.qualcomm.robotcore.eventloop.opmode.TeleOp;
import com.qualcomm.robotcore.hardware.Gyroscope;
import com.qualcomm.robotcore.hardware.IntegratingGyroscope;
import com.qualcomm.robotcore.util.ElapsedTime;
import org.firstinspires.ftc.robotcore.external.navigation.AngleUnit;
import org.firstinspires.ftc.robotcore.external.navigation.AngularVelocity;
import org.firstinspires.ftc.robotcore.external.navigation.AxesOrder;
import org.firstinspires.ftc.robotcore.external.navigation.AxesReference;
import org.firstinspires.ftc.robotcore.external.navigation.Orientation;
/*
* This OpMode shows how to use Kauai Labs navX Micro Robotics Navigation
* Sensor. It assumes that the sensor is configured with a name of "navx".
*
* Use Android Studio to Copy this Class, and Paste it into your team's code folder with a new name.
* Remove or comment out the @Disabled line to add this OpMode to the Driver Station OpMode list
*/
@TeleOp(name = "Sensor: KL navX Micro", group = "Sensor")
@Disabled
public class SensorKLNavxMicro extends LinearOpMode {
/** In this sample, for illustration purposes we use two interfaces on the one gyro object.
* That's likely atypical: you'll probably use one or the other in any given situation,
* depending on what you're trying to do. {@link IntegratingGyroscope} (and its base interface,
* {@link Gyroscope}) are common interfaces supported by possibly several different gyro
* implementations. {@link NavxMicroNavigationSensor}, by contrast, provides functionality that
* is unique to the navX Micro sensor.
*/
IntegratingGyroscope gyro;
NavxMicroNavigationSensor navxMicro;
// A timer helps provide feedback while calibration is taking place
ElapsedTime timer = new ElapsedTime();
@Override public void runOpMode() throws InterruptedException {
// Get a reference to a Modern Robotics GyroSensor object. We use several interfaces
// on this object to illustrate which interfaces support which functionality.
navxMicro = hardwareMap.get(NavxMicroNavigationSensor.class, "navx");
gyro = (IntegratingGyroscope)navxMicro;
// If you're only interested in the IntegratingGyroscope interface, the following will suffice.
// gyro = hardwareMap.get(IntegratingGyroscope.class, "navx");
// The gyro automatically starts calibrating. This takes a few seconds.
telemetry.log().add("Gyro Calibrating. Do Not Move!");
// Wait until the gyro calibration is complete
timer.reset();
while (navxMicro.isCalibrating()) {
telemetry.addData("calibrating", "%s", Math.round(timer.seconds())%2==0 ? "|.." : "..|");
telemetry.update();
Thread.sleep(50);
}
telemetry.log().clear(); telemetry.log().add("Gyro Calibrated. Press Start.");
telemetry.clear(); telemetry.update();
// Wait for the start button to be pressed
waitForStart();
telemetry.log().clear();
while (opModeIsActive()) {
// Read dimensionalized data from the gyro. This gyro can report angular velocities
// about all three axes. Additionally, it internally integrates the Z axis to
// be able to report an absolute angular Z orientation.
AngularVelocity rates = gyro.getAngularVelocity(AngleUnit.DEGREES);
Orientation angles = gyro.getAngularOrientation(AxesReference.INTRINSIC, AxesOrder.ZYX, AngleUnit.DEGREES);
telemetry.addLine()
.addData("dx", formatRate(rates.xRotationRate))
.addData("dy", formatRate(rates.yRotationRate))
.addData("dz", "%s deg/s", formatRate(rates.zRotationRate));
telemetry.addLine()
.addData("heading", formatAngle(angles.angleUnit, angles.firstAngle))
.addData("roll", formatAngle(angles.angleUnit, angles.secondAngle))
.addData("pitch", "%s deg", formatAngle(angles.angleUnit, angles.thirdAngle));
telemetry.update();
idle(); // Always call idle() at the bottom of your while(opModeIsActive()) loop
}
}
String formatRate(float rate) {
return String.format("%.3f", rate);
}
String formatAngle(AngleUnit angleUnit, double angle) {
return formatDegrees(AngleUnit.DEGREES.fromUnit(angleUnit, angle));
}
String formatDegrees(double degrees){
return String.format("%.1f", AngleUnit.DEGREES.normalize(degrees));
}
}

View File

@ -0,0 +1,138 @@
/* Copyright (c) 2017 FIRST. All rights reserved.
*
* Redistribution and use in source and binary forms, with or without modification,
* are permitted (subject to the limitations in the disclaimer below) provided that
* the following conditions are met:
*
* Redistributions of source code must retain the above copyright notice, this list
* of conditions and the following disclaimer.
*
* Redistributions in binary form must reproduce the above copyright notice, this
* list of conditions and the following disclaimer in the documentation and/or
* other materials provided with the distribution.
*
* Neither the name of FIRST nor the names of its contributors may be used to endorse or
* promote products derived from this software without specific prior written permission.
*
* NO EXPRESS OR IMPLIED LICENSES TO ANY PARTY'S PATENT RIGHTS ARE GRANTED BY THIS
* LICENSE. THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
* "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO,
* THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE
* ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR CONTRIBUTORS BE LIABLE
* FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL
* DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR
* SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER
* CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY,
* OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
* OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
*/
package org.firstinspires.ftc.robotcontroller.external.samples;
import android.app.Activity;
import android.graphics.Color;
import android.view.View;
import com.qualcomm.robotcore.eventloop.opmode.Disabled;
import com.qualcomm.robotcore.eventloop.opmode.LinearOpMode;
import com.qualcomm.robotcore.eventloop.opmode.TeleOp;
import com.qualcomm.robotcore.hardware.ColorSensor;
/*
*
* This OpMode shows how to use a Modern Robotics Color Sensor.
*
* The OpMode assumes that the color sensor
* is configured with a name of "sensor_color".
*
* You can use the X button on gamepad1 to toggle the LED on and off.
*
* Use Android Studio to Copy this Class, and Paste it into your team's code folder with a new name.
* Remove or comment out the @Disabled line to add this OpMode to the Driver Station OpMode list
*/
@TeleOp(name = "Sensor: MR Color", group = "Sensor")
@Disabled
public class SensorMRColor extends LinearOpMode {
ColorSensor colorSensor; // Hardware Device Object
@Override
public void runOpMode() {
// hsvValues is an array that will hold the hue, saturation, and value information.
float[] hsvValues = {0F, 0F, 0F};
// values is a reference to the hsvValues array.
final float[] values = hsvValues;
// get a reference to the RelativeLayout so we can change the background
// color of the Robot Controller app to match the hue detected by the RGB sensor.
int relativeLayoutId = hardwareMap.appContext.getResources().getIdentifier("RelativeLayout", "id", hardwareMap.appContext.getPackageName());
final View relativeLayout = ((Activity) hardwareMap.appContext).findViewById(relativeLayoutId);
// bPrevState and bCurrState represent the previous and current state of the button.
boolean bPrevState = false;
boolean bCurrState = false;
// bLedOn represents the state of the LED.
boolean bLedOn = true;
// get a reference to our ColorSensor object.
colorSensor = hardwareMap.get(ColorSensor.class, "sensor_color");
// Set the LED in the beginning
colorSensor.enableLed(bLedOn);
// wait for the start button to be pressed.
waitForStart();
// while the OpMode is active, loop and read the RGB data.
// Note we use opModeIsActive() as our loop condition because it is an interruptible method.
while (opModeIsActive()) {
// check the status of the X button on gamepad 1.
bCurrState = gamepad1.x;
// check for button state transitions.
if (bCurrState && (bCurrState != bPrevState)) {
// button is transitioning to a pressed state. So Toggle LED
bLedOn = !bLedOn;
colorSensor.enableLed(bLedOn);
}
// update previous state variable.
bPrevState = bCurrState;
// convert the RGB values to HSV values.
Color.RGBToHSV(colorSensor.red() * 8, colorSensor.green() * 8, colorSensor.blue() * 8, hsvValues);
// send the info back to driver station using telemetry function.
telemetry.addData("LED", bLedOn ? "On" : "Off");
telemetry.addData("Clear", colorSensor.alpha());
telemetry.addData("Red ", colorSensor.red());
telemetry.addData("Green", colorSensor.green());
telemetry.addData("Blue ", colorSensor.blue());
telemetry.addData("Hue", hsvValues[0]);
// change the background color to match the color detected by the RGB sensor.
// pass a reference to the hue, saturation, and value array as an argument
// to the HSVToColor method.
relativeLayout.post(new Runnable() {
public void run() {
relativeLayout.setBackgroundColor(Color.HSVToColor(0xff, values));
}
});
telemetry.update();
}
// Set the panel back to the default color
relativeLayout.post(new Runnable() {
public void run() {
relativeLayout.setBackgroundColor(Color.WHITE);
}
});
}
}

View File

@ -0,0 +1,160 @@
/* Copyright (c) 2017 FIRST. All rights reserved.
*
* Redistribution and use in source and binary forms, with or without modification,
* are permitted (subject to the limitations in the disclaimer below) provided that
* the following conditions are met:
*
* Redistributions of source code must retain the above copyright notice, this list
* of conditions and the following disclaimer.
*
* Redistributions in binary form must reproduce the above copyright notice, this
* list of conditions and the following disclaimer in the documentation and/or
* other materials provided with the distribution.
*
* Neither the name of FIRST nor the names of its contributors may be used to endorse or
* promote products derived from this software without specific prior written permission.
*
* NO EXPRESS OR IMPLIED LICENSES TO ANY PARTY'S PATENT RIGHTS ARE GRANTED BY THIS
* LICENSE. THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
* "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO,
* THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE
* ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR CONTRIBUTORS BE LIABLE
* FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL
* DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR
* SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER
* CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY,
* OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
* OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
*/
package org.firstinspires.ftc.robotcontroller.external.samples;
import com.qualcomm.hardware.modernrobotics.ModernRoboticsI2cGyro;
import com.qualcomm.robotcore.eventloop.opmode.Disabled;
import com.qualcomm.robotcore.eventloop.opmode.LinearOpMode;
import com.qualcomm.robotcore.eventloop.opmode.TeleOp;
import com.qualcomm.robotcore.hardware.IntegratingGyroscope;
import com.qualcomm.robotcore.util.ElapsedTime;
import org.firstinspires.ftc.robotcore.external.navigation.AngleUnit;
import org.firstinspires.ftc.robotcore.external.navigation.AngularVelocity;
import org.firstinspires.ftc.robotcore.external.navigation.AxesOrder;
import org.firstinspires.ftc.robotcore.external.navigation.AxesReference;
/*
* This OpMode shows how to use the Modern Robotics Gyro.
*
* The OpMode assumes that the gyro sensor is attached to a Device Interface Module
* I2C channel and is configured with a name of "gyro".
*
* Use Android Studio to Copy this Class, and Paste it into your team's code folder with a new name.
* Remove or comment out the @Disabled line to add this OpMode to the Driver Station OpMode list
*/
@TeleOp(name = "Sensor: MR Gyro", group = "Sensor")
@Disabled
public class SensorMRGyro extends LinearOpMode {
/* In this sample, for illustration purposes we use two interfaces on the one gyro object.
* That's likely atypical: you'll probably use one or the other in any given situation,
* depending on what you're trying to do. {@link IntegratingGyroscope} (and its base interface,
* {@link Gyroscope}) are common interfaces supported by possibly several different gyro
* implementations. {@link ModernRoboticsI2cGyro}, by contrast, provides functionality that
* is unique to the Modern Robotics gyro sensor.
*/
IntegratingGyroscope gyro;
ModernRoboticsI2cGyro modernRoboticsI2cGyro;
// A timer helps provide feedback while calibration is taking place
ElapsedTime timer = new ElapsedTime();
@Override
public void runOpMode() {
boolean lastResetState = false;
boolean curResetState = false;
// Get a reference to a Modern Robotics gyro object. We use several interfaces
// on this object to illustrate which interfaces support which functionality.
modernRoboticsI2cGyro = hardwareMap.get(ModernRoboticsI2cGyro.class, "gyro");
gyro = (IntegratingGyroscope)modernRoboticsI2cGyro;
// If you're only interested in the IntegratingGyroscope interface, the following will suffice.
// gyro = hardwareMap.get(IntegratingGyroscope.class, "gyro");
// A similar approach will work for the Gyroscope interface, if that's all you need.
// Start calibrating the gyro. This takes a few seconds and is worth performing
// during the initialization phase at the start of each OpMode.
telemetry.log().add("Gyro Calibrating. Do Not Move!");
modernRoboticsI2cGyro.calibrate();
// Wait until the gyro calibration is complete
timer.reset();
while (!isStopRequested() && modernRoboticsI2cGyro.isCalibrating()) {
telemetry.addData("calibrating", "%s", Math.round(timer.seconds())%2==0 ? "|.." : "..|");
telemetry.update();
sleep(50);
}
telemetry.log().clear(); telemetry.log().add("Gyro Calibrated. Press Start.");
telemetry.clear(); telemetry.update();
// Wait for the start button to be pressed
waitForStart();
telemetry.log().clear();
telemetry.log().add("Press A & B to reset heading");
// Loop until we're asked to stop
while (opModeIsActive()) {
// If the A and B buttons are pressed just now, reset Z heading.
curResetState = (gamepad1.a && gamepad1.b);
if (curResetState && !lastResetState) {
modernRoboticsI2cGyro.resetZAxisIntegrator();
}
lastResetState = curResetState;
// The raw() methods report the angular rate of change about each of the
// three axes directly as reported by the underlying sensor IC.
int rawX = modernRoboticsI2cGyro.rawX();
int rawY = modernRoboticsI2cGyro.rawY();
int rawZ = modernRoboticsI2cGyro.rawZ();
int heading = modernRoboticsI2cGyro.getHeading();
int integratedZ = modernRoboticsI2cGyro.getIntegratedZValue();
// Read dimensionalized data from the gyro. This gyro can report angular velocities
// about all three axes. Additionally, it internally integrates the Z axis to
// be able to report an absolute angular Z orientation.
AngularVelocity rates = gyro.getAngularVelocity(AngleUnit.DEGREES);
float zAngle = gyro.getAngularOrientation(AxesReference.INTRINSIC, AxesOrder.ZYX, AngleUnit.DEGREES).firstAngle;
// Read administrative information from the gyro
int zAxisOffset = modernRoboticsI2cGyro.getZAxisOffset();
int zAxisScalingCoefficient = modernRoboticsI2cGyro.getZAxisScalingCoefficient();
telemetry.addLine()
.addData("dx", formatRate(rates.xRotationRate))
.addData("dy", formatRate(rates.yRotationRate))
.addData("dz", "%s deg/s", formatRate(rates.zRotationRate));
telemetry.addData("angle", "%s deg", formatFloat(zAngle));
telemetry.addData("heading", "%3d deg", heading);
telemetry.addData("integrated Z", "%3d", integratedZ);
telemetry.addLine()
.addData("rawX", formatRaw(rawX))
.addData("rawY", formatRaw(rawY))
.addData("rawZ", formatRaw(rawZ));
telemetry.addLine().addData("z offset", zAxisOffset).addData("z coeff", zAxisScalingCoefficient);
telemetry.update();
}
}
String formatRaw(int rawValue) {
return String.format("%d", rawValue);
}
String formatRate(float rate) {
return String.format("%.3f", rate);
}
String formatFloat(float rate) {
return String.format("%.3f", rate);
}
}

View File

@ -0,0 +1,70 @@
/* Copyright (c) 2017 FIRST. All rights reserved.
*
* Redistribution and use in source and binary forms, with or without modification,
* are permitted (subject to the limitations in the disclaimer below) provided that
* the following conditions are met:
*
* Redistributions of source code must retain the above copyright notice, this list
* of conditions and the following disclaimer.
*
* Redistributions in binary form must reproduce the above copyright notice, this
* list of conditions and the following disclaimer in the documentation and/or
* other materials provided with the distribution.
*
* Neither the name of FIRST nor the names of its contributors may be used to endorse or
* promote products derived from this software without specific prior written permission.
*
* NO EXPRESS OR IMPLIED LICENSES TO ANY PARTY'S PATENT RIGHTS ARE GRANTED BY THIS
* LICENSE. THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
* "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO,
* THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE
* ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR CONTRIBUTORS BE LIABLE
* FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL
* DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR
* SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER
* CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY,
* OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
* OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
*/
package org.firstinspires.ftc.robotcontroller.external.samples;
import com.qualcomm.robotcore.eventloop.opmode.Disabled;
import com.qualcomm.robotcore.eventloop.opmode.LinearOpMode;
import com.qualcomm.robotcore.eventloop.opmode.TeleOp;
import com.qualcomm.robotcore.hardware.OpticalDistanceSensor;
/*
* This OpMode shows how to use a Modern Robotics Optical Distance Sensor
* It assumes that the ODS sensor is configured with a name of "sensor_ods".
*
* Use Android Studio to Copy this Class, and Paste it into your team's code folder with a new name.
* Remove or comment out the @Disabled line to add this OpMode to the Driver Station OpMode list
*/
@TeleOp(name = "Sensor: MR ODS", group = "Sensor")
@Disabled
public class SensorMROpticalDistance extends LinearOpMode {
OpticalDistanceSensor odsSensor; // Hardware Device Object
@Override
public void runOpMode() {
// get a reference to our Light Sensor object.
odsSensor = hardwareMap.get(OpticalDistanceSensor.class, "sensor_ods");
// wait for the start button to be pressed.
waitForStart();
// while the OpMode is active, loop and read the light levels.
// Note we use opModeIsActive() as our loop condition because it is an interruptible method.
while (opModeIsActive()) {
// send the info back to driver station using telemetry function.
telemetry.addData("Raw", odsSensor.getRawLightDetected());
telemetry.addData("Normal", odsSensor.getLightDetected());
telemetry.update();
}
}
}

View File

@ -0,0 +1,70 @@
/* Copyright (c) 2017 FIRST. All rights reserved.
*
* Redistribution and use in source and binary forms, with or without modification,
* are permitted (subject to the limitations in the disclaimer below) provided that
* the following conditions are met:
*
* Redistributions of source code must retain the above copyright notice, this list
* of conditions and the following disclaimer.
*
* Redistributions in binary form must reproduce the above copyright notice, this
* list of conditions and the following disclaimer in the documentation and/or
* other materials provided with the distribution.
*
* Neither the name of FIRST nor the names of its contributors may be used to endorse or
* promote products derived from this software without specific prior written permission.
*
* NO EXPRESS OR IMPLIED LICENSES TO ANY PARTY'S PATENT RIGHTS ARE GRANTED BY THIS
* LICENSE. THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
* "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO,
* THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE
* ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR CONTRIBUTORS BE LIABLE
* FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL
* DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR
* SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER
* CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY,
* OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
* OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
*/
package org.firstinspires.ftc.robotcontroller.external.samples;
import com.qualcomm.hardware.modernrobotics.ModernRoboticsI2cRangeSensor;
import com.qualcomm.robotcore.eventloop.opmode.Disabled;
import com.qualcomm.robotcore.eventloop.opmode.LinearOpMode;
import com.qualcomm.robotcore.eventloop.opmode.TeleOp;
import org.firstinspires.ftc.robotcore.external.navigation.DistanceUnit;
/*
* This OpMode illustrates how to use the Modern Robotics Range Sensor.
*
* The OpMode assumes that the range sensor is configured with a name of "sensor_range".
*
* Use Android Studio to Copy this Class, and Paste it into your team's code folder with a new name.
* Remove or comment out the @Disabled line to add this OpMode to the Driver Station OpMode list
*
* @see <a href="http://modernroboticsinc.com/range-sensor">MR Range Sensor</a>
*/
@TeleOp(name = "Sensor: MR range sensor", group = "Sensor")
@Disabled // comment out or remove this line to enable this OpMode
public class SensorMRRangeSensor extends LinearOpMode {
ModernRoboticsI2cRangeSensor rangeSensor;
@Override public void runOpMode() {
// get a reference to our compass
rangeSensor = hardwareMap.get(ModernRoboticsI2cRangeSensor.class, "sensor_range");
// wait for the start button to be pressed
waitForStart();
while (opModeIsActive()) {
telemetry.addData("raw ultrasonic", rangeSensor.rawUltrasonic());
telemetry.addData("raw optical", rangeSensor.rawOptical());
telemetry.addData("cm optical", "%.2f cm", rangeSensor.cmOptical());
telemetry.addData("cm", "%.2f cm", rangeSensor.getDistance(DistanceUnit.CM));
telemetry.update();
}
}
}

View File

@ -0,0 +1,87 @@
/*
Copyright (c) 2018 FIRST
All rights reserved.
Redistribution and use in source and binary forms, with or without modification,
are permitted (subject to the limitations in the disclaimer below) provided that
the following conditions are met:
Redistributions of source code must retain the above copyright notice, this list
of conditions and the following disclaimer.
Redistributions in binary form must reproduce the above copyright notice, this
list of conditions and the following disclaimer in the documentation and/or
other materials provided with the distribution.
Neither the name of FIRST nor the names of its contributors may be used to
endorse or promote products derived from this software without specific prior
written permission.
NO EXPRESS OR IMPLIED LICENSES TO ANY PARTY'S PATENT RIGHTS ARE GRANTED BY THIS
LICENSE. THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
"AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO,
THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE
ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR CONTRIBUTORS BE LIABLE
FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL
DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR
SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER
CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR
TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF
THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
*/
package org.firstinspires.ftc.robotcontroller.external.samples;
import com.qualcomm.hardware.rev.Rev2mDistanceSensor;
import com.qualcomm.robotcore.eventloop.opmode.Disabled;
import com.qualcomm.robotcore.eventloop.opmode.LinearOpMode;
import com.qualcomm.robotcore.eventloop.opmode.TeleOp;
import com.qualcomm.robotcore.hardware.DistanceSensor;
import org.firstinspires.ftc.robotcore.external.navigation.DistanceUnit;
/*
* This OpMode illustrates how to use the REV Robotics 2M Distance Sensor.
*
* The OpMode assumes that the sensor is configured with a name of "sensor_distance".
*
* Use Android Studio to Copy this Class, and Paste it into your team's code folder with a new name.
* Remove or comment out the @Disabled line to add this OpMode to the Driver Station OpMode list
*
* See the sensor's product page: https://www.revrobotics.com/rev-31-1505/
*/
@TeleOp(name = "Sensor: REV2mDistance", group = "Sensor")
@Disabled
public class SensorREV2mDistance extends LinearOpMode {
private DistanceSensor sensorDistance;
@Override
public void runOpMode() {
// you can use this as a regular DistanceSensor.
sensorDistance = hardwareMap.get(DistanceSensor.class, "sensor_distance");
// you can also cast this to a Rev2mDistanceSensor if you want to use added
// methods associated with the Rev2mDistanceSensor class.
Rev2mDistanceSensor sensorTimeOfFlight = (Rev2mDistanceSensor) sensorDistance;
telemetry.addData(">>", "Press start to continue");
telemetry.update();
waitForStart();
while(opModeIsActive()) {
// generic DistanceSensor methods.
telemetry.addData("deviceName", sensorDistance.getDeviceName() );
telemetry.addData("range", String.format("%.01f mm", sensorDistance.getDistance(DistanceUnit.MM)));
telemetry.addData("range", String.format("%.01f cm", sensorDistance.getDistance(DistanceUnit.CM)));
telemetry.addData("range", String.format("%.01f m", sensorDistance.getDistance(DistanceUnit.METER)));
telemetry.addData("range", String.format("%.01f in", sensorDistance.getDistance(DistanceUnit.INCH)));
// Rev2mDistanceSensor specific methods.
telemetry.addData("ID", String.format("%x", sensorTimeOfFlight.getModelID()));
telemetry.addData("did time out", Boolean.toString(sensorTimeOfFlight.didTimeoutOccur()));
telemetry.update();
}
}
}
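When nothing is in range, the sensor reports a very large sentinel distance rather than a useful value. A minimal sketch of guarding against that inside the loop above, assuming the DistanceSensor.distanceOutOfRange constant is available:

// Sketch only: skip readings that indicate no target was detected.
double range = sensorDistance.getDistance(DistanceUnit.CM);
if (range == DistanceSensor.distanceOutOfRange) {
    telemetry.addData("range", "out of range");
} else {
    telemetry.addData("range", "%.1f cm", range);
}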

View File

@ -0,0 +1,78 @@
/* Copyright (c) 2017 FIRST. All rights reserved.
*
* Redistribution and use in source and binary forms, with or without modification,
* are permitted (subject to the limitations in the disclaimer below) provided that
* the following conditions are met:
*
* Redistributions of source code must retain the above copyright notice, this list
* of conditions and the following disclaimer.
*
* Redistributions in binary form must reproduce the above copyright notice, this
* list of conditions and the following disclaimer in the documentation and/or
* other materials provided with the distribution.
*
* Neither the name of FIRST nor the names of its contributors may be used to endorse or
* promote products derived from this software without specific prior written permission.
*
* NO EXPRESS OR IMPLIED LICENSES TO ANY PARTY'S PATENT RIGHTS ARE GRANTED BY THIS
* LICENSE. THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
* "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO,
* THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE
* ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR CONTRIBUTORS BE LIABLE
* FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL
* DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR
* SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER
* CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY,
* OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
* OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
*/
package org.firstinspires.ftc.robotcontroller.external.samples;
import com.qualcomm.robotcore.eventloop.opmode.Disabled;
import com.qualcomm.robotcore.eventloop.opmode.LinearOpMode;
import com.qualcomm.robotcore.eventloop.opmode.TeleOp;
import com.qualcomm.robotcore.hardware.TouchSensor;
/*
* This OpMode demonstrates how to use a REV Robotics Touch Sensor, REV Robotics Magnetic Limit Switch, or other device
* that implements the TouchSensor interface. Any touch sensor that connects its output to ground when pressed
* (known as "active low") can be configured as a "REV Touch Sensor". This includes REV's Magnetic Limit Switch.
*
* The OpMode assumes that the touch sensor is configured with a name of "sensor_touch".
*
* A REV Robotics Touch Sensor must be configured on digital port number 1, 3, 5, or 7.
* A Magnetic Limit Switch can be configured on any digital port.
*
* Use Android Studio to Copy this Class, and Paste it into your team's code folder with a new name.
* Remove or comment out the @Disabled line to add this OpMode to the Driver Station OpMode list.
*/
@TeleOp(name = "Sensor: REV touch sensor", group = "Sensor")
@Disabled
public class SensorTouch extends LinearOpMode {
TouchSensor touchSensor; // Touch sensor Object
@Override
public void runOpMode() {
// get a reference to our touchSensor object.
touchSensor = hardwareMap.get(TouchSensor.class, "sensor_touch");
// wait for the start button to be pressed.
waitForStart();
// while the OpMode is active, loop and read whether the sensor is being pressed.
// Note we use opModeIsActive() as our loop condition because it is an interruptible method.
while (opModeIsActive()) {
// send the info back to driver station using telemetry function.
if (touchSensor.isPressed()) {
telemetry.addData("Touch Sensor", "Is Pressed");
} else {
telemetry.addData("Touch Sensor", "Is Not Pressed");
}
telemetry.update();
}
}
}

View File

@ -0,0 +1,127 @@
/*
* Copyright (c) 2023 FIRST
*
* All rights reserved.
*
* Redistribution and use in source and binary forms, with or without modification,
* are permitted (subject to the limitations in the disclaimer below) provided that
* the following conditions are met:
*
* Redistributions of source code must retain the above copyright notice, this list
* of conditions and the following disclaimer.
*
* Redistributions in binary form must reproduce the above copyright notice, this
* list of conditions and the following disclaimer in the documentation and/or
* other materials provided with the distribution.
*
* Neither the name of FIRST nor the names of its contributors may be used to
* endorse or promote products derived from this software without specific prior
* written permission.
*
* NO EXPRESS OR IMPLIED LICENSES TO ANY PARTY'S PATENT RIGHTS ARE GRANTED BY THIS
* LICENSE. THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
* "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO,
* THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE
* ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR CONTRIBUTORS BE LIABLE
* FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL
* DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR
* SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER
* CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR
* TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF
* THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
*/
package org.firstinspires.ftc.robotcontroller.external.samples;
import android.util.Size;
import com.qualcomm.robotcore.eventloop.opmode.Disabled;
import com.qualcomm.robotcore.eventloop.opmode.LinearOpMode;
import com.qualcomm.robotcore.eventloop.opmode.TeleOp;
import org.firstinspires.ftc.robotcore.external.hardware.camera.BuiltinCameraDirection;
import org.firstinspires.ftc.robotcore.external.hardware.camera.WebcamName;
import org.firstinspires.ftc.vision.VisionPortal;
import java.util.Locale;
/*
* This OpMode helps calibrate a webcam or RC phone camera, useful for AprilTag pose estimation
* with the FTC VisionPortal. It captures a camera frame (image) and stores it on the Robot Controller
* (Control Hub or RC phone), with each press of the gamepad button X (or Square).
* Full calibration instructions are here:
*
* https://ftc-docs.firstinspires.org/camera-calibration
*
* In Android Studio, copy this class into your "teamcode" folder with a new name.
* Remove or comment out the @Disabled line to add this OpMode to the Driver Station OpMode list.
*
* In OnBot Java, use "Add File" to add this OpMode from the list of Samples.
*/
@TeleOp(name = "Utility: Camera Frame Capture", group = "Utility")
@Disabled
public class UtilityCameraFrameCapture extends LinearOpMode
{
/*
* EDIT THESE PARAMETERS AS NEEDED
*/
final boolean USING_WEBCAM = false;
final BuiltinCameraDirection INTERNAL_CAM_DIR = BuiltinCameraDirection.BACK;
final int RESOLUTION_WIDTH = 640;
final int RESOLUTION_HEIGHT = 480;
// Internal state
boolean lastX;
int frameCount;
long capReqTime;
@Override
public void runOpMode()
{
VisionPortal portal;
if (USING_WEBCAM)
{
portal = new VisionPortal.Builder()
.setCamera(hardwareMap.get(WebcamName.class, "Webcam 1"))
.setCameraResolution(new Size(RESOLUTION_WIDTH, RESOLUTION_HEIGHT))
.build();
}
else
{
portal = new VisionPortal.Builder()
.setCamera(INTERNAL_CAM_DIR)
.setCameraResolution(new Size(RESOLUTION_WIDTH, RESOLUTION_HEIGHT))
.build();
}
while (!isStopRequested())
{
boolean x = gamepad1.x;
if (x && !lastX)
{
portal.saveNextFrameRaw(String.format(Locale.US, "CameraFrameCapture-%06d", frameCount++));
capReqTime = System.currentTimeMillis();
}
lastX = x;
telemetry.addLine("######## Camera Capture Utility ########");
telemetry.addLine(String.format(Locale.US, " > Resolution: %dx%d", RESOLUTION_WIDTH, RESOLUTION_HEIGHT));
telemetry.addLine(" > Press X (or Square) to capture a frame");
telemetry.addData(" > Camera Status", portal.getCameraState());
if (capReqTime != 0)
{
telemetry.addLine("\nCaptured Frame!");
}
if (capReqTime != 0 && System.currentTimeMillis() - capReqTime > 1000)
{
capReqTime = 0;
}
telemetry.update();
}
}
}

View File

@ -0,0 +1,45 @@
## Caution
No Team-specific code should be placed or modified in this ``.../samples`` folder.
Samples should be Copied from here, and then Pasted into the team's
[/TeamCode/src/main/java/org/firstinspires/ftc/teamcode](../../../../../../../../../../TeamCode/src/main/java/org/firstinspires/ftc/teamcode)
folder, using the Android Studio copy and paste commands. This automatically changes all file and
class names to be consistent. From there, the sample can be modified to suit the team's needs.
For more detailed instructions, see the /teamcode readme.
### Naming of Samples
To gain a better understanding of how the samples are organized, and how to interpret the
naming system, it will help to understand the conventions that were used during their creation.
These conventions are described (in detail) in the sample_conventions.md file in this folder.
To summarize: a range of different sample classes resides in java/external/samples.
The class names follow a naming convention that indicates the purpose of each class.
The prefix of the name will be one of the following:

* Basic: This is a minimally functional OpMode used to illustrate the skeleton/structure
  of a particular style of OpMode. These are bare-bones examples.
* Sensor: This is a Sample OpMode that shows how to use a specific sensor.
  It is not intended to drive a functioning robot; it simply shows the minimal code
  required to read and display the sensor values.
* Robot: This is a Sample OpMode that assumes a simple two-motor (differential) drive base.
  It may be used to provide a common baseline driving OpMode, or
  to demonstrate how a particular sensor or concept can be used to navigate.
* Concept: This is a sample OpMode that illustrates performing a specific function or concept.
  These may be complex, but their operation should be explained clearly in the comments,
  or the comments should reference an external doc, guide, or tutorial.
  Each OpMode should try to demonstrate only a single concept so it is easy to
  locate based on its name. These OpModes may not produce a drivable robot.
After the prefix, other conventions will apply:
* Sensor class names are constructed as: Sensor - Company - Type
* Robot class names are constructed as: Robot - Mode - Action - OpModetype
* Concept class names are constructed as: Concept - Topic - OpModetype

View File

@ -0,0 +1,108 @@
## Sample Class/OpMode conventions
#### V 1.1.0 8/9/2017
This document defines the FTC Sample OpMode and Class conventions.
### OpMode Name
To gain a better understanding of how the samples are organized, and how to interpret the
naming system, it will help to understand the conventions that were used during their creation.
To summarize: a range of different sample classes resides in java/external/samples.
The class names follow a naming convention that indicates the purpose of each class.
The prefix of the name will be one of the following:

* Basic: This is a minimally functional OpMode used to illustrate the skeleton/structure
  of a particular style of OpMode. These are bare-bones examples.
* Sensor: This is a Sample OpMode that shows how to use a specific sensor.
  It is not intended to drive a functioning robot; it simply shows the minimal code
  required to read and display the sensor values.
* Robot: This is a Sample OpMode that assumes a simple two-motor (differential) drive base.
  It may be used to provide a common baseline driving OpMode, or
  to demonstrate how a particular sensor or concept can be used to navigate.
* Concept: This is a sample OpMode that illustrates performing a specific function or concept.
  These may be complex, but their operation should be explained clearly in the comments,
  or the comments should reference an external doc, guide, or tutorial.
  Each OpMode should try to demonstrate only a single concept so it is easy to
  locate based on its name. These OpModes may not produce a drivable robot.
After the prefix, other conventions will apply:
* Sensor class names should be constructed as: Sensor - Company - Type
* Robot class names should be constructed as: Robot - Mode - Action - OpModetype
* Concept class names should be constructed as: Concept - Topic - OpModetype
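
As a quick illustration of these construction patterns, here is a minimal sketch of a Sensor sample
(the company and class names are hypothetical, not actual samples in this folder):

```java
package org.firstinspires.ftc.teamcode;

import com.qualcomm.robotcore.eventloop.opmode.Disabled;
import com.qualcomm.robotcore.eventloop.opmode.LinearOpMode;
import com.qualcomm.robotcore.eventloop.opmode.TeleOp;

// Hypothetical names built from the patterns above:
//   Sensor  - Company - Type           ->  SensorAcmeColorDistance
//   Robot   - Mode - Action - OpModetype -> RobotAutoDriveToLine_Linear
//   Concept - Topic - OpModetype       ->  ConceptStateMachine_Iterative
@TeleOp(name = "Sensor: ACME color distance", group = "Sensor")
@Disabled
public class SensorAcmeColorDistance extends LinearOpMode {
    @Override
    public void runOpMode() {
        waitForStart();
        // A real Sensor sample would read the device here and display its values via telemetry.
    }
}
```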
### Sample OpMode Content/Style
Code is formatted as per the Google Style Guide:
https://google.github.io/styleguide/javaguide.html
With “Sensor” and “Hardware” samples, the code should demonstrate the essential function,
and not be embellished with too much additional “clever” code. If a sensor has special
addressing needs, or has a variety of modes or outputs, these should be demonstrated as
simply as possible.
Special programming methods or robot control techniques should be reserved for “Concept” Samples,
and where possible, Samples should strive to demonstrate only a single concept,
e.g. state machine coding or a user menu system, and not combine them into a single “all inclusive”
sample. This will prevent an “all inclusive” Sample from being deleted just because one part of it
becomes obsolete.
### Device Configuration Names
The following device names are used in the external samples:

**Motors:**
* left_drive
* right_drive
* left_arm

**Servos:**
* left_hand
* right_hand
* arm
* claw

**Sensors:**
* sensor_color
* sensor_ir
* sensor_light
* sensor_ods
* sensor_range
* sensor_touch
* sensor_color_distance
* sensor_digital
* digin
* digout

**Localization:**
* compass
* gyro
* imu
* navx
### Device Object Names
Device Object names should use the same words as the device's configuration name, but they
should be re-structured to be a suitable Java variable name. This should keep the same word order,
but adopt the style of beginning with a lower case letter, and then each subsequent word
starting with an upper case letter.
E.g., from the configuration names above: leftDrive, leftHand, sensorTouch, sensorColorDistance.
Note: Sometimes it's helpful to put the device type first, followed by the variant,
e.g. motorLeft and motorRight, but this should only be done if the same word order
is used in the device configuration name.
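
As a minimal sketch of that mapping (the OpMode class shown here is illustrative, not an actual
sample), the configuration names above would be obtained from the hardware map like this:

```java
package org.firstinspires.ftc.teamcode;

import com.qualcomm.robotcore.eventloop.opmode.Disabled;
import com.qualcomm.robotcore.eventloop.opmode.LinearOpMode;
import com.qualcomm.robotcore.eventloop.opmode.TeleOp;
import com.qualcomm.robotcore.hardware.DcMotor;
import com.qualcomm.robotcore.hardware.Servo;
import com.qualcomm.robotcore.hardware.TouchSensor;

@TeleOp(name = "Concept: Device Names", group = "Concept")
@Disabled
public class ConceptDeviceNames extends LinearOpMode {
    @Override
    public void runOpMode() {
        // Configuration name (as listed above)  ->  camelCase Java object name.
        DcMotor     leftDrive   = hardwareMap.get(DcMotor.class,     "left_drive");
        DcMotor     rightDrive  = hardwareMap.get(DcMotor.class,     "right_drive");
        Servo       leftHand    = hardwareMap.get(Servo.class,       "left_hand");
        TouchSensor sensorTouch = hardwareMap.get(TouchSensor.class, "sensor_touch");

        waitForStart();
        // A real sample would use these devices inside its main loop.
    }
}
```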
### OpMode code Comments
Sample comments should read like normal code comments, that is, as an explanation of what the
sample code is doing. They should NOT be directives to the user,
like “insert your joystick code here”, as these comments typically aren't
detailed enough to be useful. They also often get left in the code and become garbage.
Instead, an example of the joystick code should be shown with a comment describing what it is doing.
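
For instance, a minimal sketch of that preferred style, assuming a leftDrive/rightDrive pair of
DcMotor objects obtained from the hardware map as in the sketch above, running inside an OpMode's
main loop:

```java
while (opModeIsActive()) {
    // Tank drive: each gamepad stick's Y axis powers one side of the drivetrain.
    // The values are negated because pushing a stick forward returns a negative number.
    double leftPower  = -gamepad1.left_stick_y;
    double rightPower = -gamepad1.right_stick_y;
    leftDrive.setPower(leftPower);
    rightDrive.setPower(rightPower);

    telemetry.addData("Drive", "left (%.2f), right (%.2f)", leftPower, rightPower);
    telemetry.update();
}
```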

View File

@ -0,0 +1,70 @@
/* Copyright (c) 2014, 2015 Qualcomm Technologies Inc
All rights reserved.
Redistribution and use in source and binary forms, with or without modification,
are permitted (subject to the limitations in the disclaimer below) provided that
the following conditions are met:
Redistributions of source code must retain the above copyright notice, this list
of conditions and the following disclaimer.
Redistributions in binary form must reproduce the above copyright notice, this
list of conditions and the following disclaimer in the documentation and/or
other materials provided with the distribution.
Neither the name of Qualcomm Technologies Inc nor the names of its contributors
may be used to endorse or promote products derived from this software without
specific prior written permission.
NO EXPRESS OR IMPLIED LICENSES TO ANY PARTY'S PATENT RIGHTS ARE GRANTED BY THIS
LICENSE. THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
"AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO,
THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE
ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR CONTRIBUTORS BE LIABLE
FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL
DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR
SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER
CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY,
OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE. */
package org.firstinspires.ftc.robotcontroller.internal;
import com.qualcomm.robotcore.eventloop.opmode.OpModeManager;
import com.qualcomm.robotcore.eventloop.opmode.OpModeRegister;
import org.firstinspires.ftc.robotcontroller.external.samples.ConceptNullOp;
/**
* {@link FtcOpModeRegister} is responsible for registering OpModes for use in an FTC game.
* @see #register(OpModeManager)
*/
public class FtcOpModeRegister implements OpModeRegister {
/**
* {@link #register(OpModeManager)} is called by the SDK game in order to register
* OpMode classes or instances that will participate in an FTC game.
*
* There are two mechanisms by which an OpMode may be registered.
*
* 1) The preferred method is by means of class annotations in the OpMode itself.
* See, for example, the class annotations in {@link ConceptNullOp}.
*
* 2) The other, retired, method is to modify this {@link #register(OpModeManager)}
* method to include explicit calls to OpModeManager.register().
* This method of modifying this file directly is discouraged, as it
* makes updates to the SDK harder to integrate into your code.
*
* @param manager the object which contains methods for carrying out OpMode registrations
*
* @see com.qualcomm.robotcore.eventloop.opmode.TeleOp
* @see com.qualcomm.robotcore.eventloop.opmode.Autonomous
*/
public void register(OpModeManager manager) {
/**
* Any manual OpMode class registrations should go here.
*/
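// For illustration only (the OpMode class name below is hypothetical, and annotation-based
// registration remains the preferred mechanism), a manual registration would look like:
//
//   manager.register("My TeleOp", MyTeleOpMode.class);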
}
}

View File

@ -0,0 +1,845 @@
/* Copyright (c) 2014, 2015 Qualcomm Technologies Inc
All rights reserved.
Redistribution and use in source and binary forms, with or without modification,
are permitted (subject to the limitations in the disclaimer below) provided that
the following conditions are met:
Redistributions of source code must retain the above copyright notice, this list
of conditions and the following disclaimer.
Redistributions in binary form must reproduce the above copyright notice, this
list of conditions and the following disclaimer in the documentation and/or
other materials provided with the distribution.
Neither the name of Qualcomm Technologies Inc nor the names of its contributors
may be used to endorse or promote products derived from this software without
specific prior written permission.
NO EXPRESS OR IMPLIED LICENSES TO ANY PARTY'S PATENT RIGHTS ARE GRANTED BY THIS
LICENSE. THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
"AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO,
THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE
ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR CONTRIBUTORS BE LIABLE
FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL
DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR
SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER
CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY,
OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE. */
package org.firstinspires.ftc.robotcontroller.internal;
import android.app.ActionBar;
import android.app.Activity;
import android.app.ActivityManager;
import android.content.ComponentName;
import android.content.Context;
import android.content.Intent;
import android.content.ServiceConnection;
import android.content.SharedPreferences;
import android.content.res.Configuration;
import android.hardware.usb.UsbDevice;
import android.hardware.usb.UsbManager;
import android.net.wifi.WifiManager;
import android.os.Bundle;
import android.os.IBinder;
import android.preference.PreferenceManager;
import androidx.annotation.NonNull;
import androidx.annotation.Nullable;
import androidx.annotation.StringRes;
import android.view.Menu;
import android.view.MenuItem;
import android.view.MotionEvent;
import android.view.View;
import android.view.WindowManager;
import android.webkit.WebView;
import android.widget.ImageButton;
import android.widget.LinearLayout;
import android.widget.LinearLayout.LayoutParams;
import android.widget.PopupMenu;
import android.widget.TextView;
import com.google.blocks.ftcrobotcontroller.ProgrammingWebHandlers;
import com.google.blocks.ftcrobotcontroller.runtime.BlocksOpMode;
import com.qualcomm.ftccommon.ClassManagerFactory;
import com.qualcomm.ftccommon.FtcAboutActivity;
import com.qualcomm.ftccommon.FtcEventLoop;
import com.qualcomm.ftccommon.FtcEventLoopIdle;
import com.qualcomm.ftccommon.FtcRobotControllerService;
import com.qualcomm.ftccommon.FtcRobotControllerService.FtcRobotControllerBinder;
import com.qualcomm.ftccommon.FtcRobotControllerSettingsActivity;
import com.qualcomm.ftccommon.LaunchActivityConstantsList;
import com.qualcomm.ftccommon.LaunchActivityConstantsList.RequestCode;
import com.qualcomm.ftccommon.Restarter;
import com.qualcomm.ftccommon.UpdateUI;
import com.qualcomm.ftccommon.configuration.EditParameters;
import com.qualcomm.ftccommon.configuration.FtcLoadFileActivity;
import com.qualcomm.ftccommon.configuration.RobotConfigFile;
import com.qualcomm.ftccommon.configuration.RobotConfigFileManager;
import com.qualcomm.ftcrobotcontroller.BuildConfig;
import com.qualcomm.ftcrobotcontroller.R;
import com.qualcomm.hardware.HardwareFactory;
import com.qualcomm.robotcore.eventloop.EventLoopManager;
import com.qualcomm.robotcore.eventloop.opmode.FtcRobotControllerServiceState;
import com.qualcomm.robotcore.eventloop.opmode.OpModeRegister;
import com.qualcomm.robotcore.hardware.configuration.LynxConstants;
import com.qualcomm.robotcore.hardware.configuration.Utility;
import com.qualcomm.robotcore.robot.Robot;
import com.qualcomm.robotcore.robot.RobotState;
import com.qualcomm.robotcore.util.ClockWarningSource;
import com.qualcomm.robotcore.util.Device;
import com.qualcomm.robotcore.util.Dimmer;
import com.qualcomm.robotcore.util.ImmersiveMode;
import com.qualcomm.robotcore.util.RobotLog;
import com.qualcomm.robotcore.util.WebServer;
import com.qualcomm.robotcore.wifi.NetworkConnection;
import com.qualcomm.robotcore.wifi.NetworkConnectionFactory;
import com.qualcomm.robotcore.wifi.NetworkType;
import org.firstinspires.ftc.ftccommon.external.SoundPlayingRobotMonitor;
import org.firstinspires.ftc.ftccommon.internal.AnnotatedHooksClassFilter;
import org.firstinspires.ftc.ftccommon.internal.FtcRobotControllerWatchdogService;
import org.firstinspires.ftc.ftccommon.internal.ProgramAndManageActivity;
import org.firstinspires.ftc.onbotjava.ExternalLibraries;
import org.firstinspires.ftc.onbotjava.OnBotJavaHelperImpl;
import org.firstinspires.ftc.onbotjava.OnBotJavaProgrammingMode;
import org.firstinspires.ftc.robotcore.external.navigation.MotionDetection;
import org.firstinspires.ftc.robotcore.internal.hardware.android.AndroidBoard;
import org.firstinspires.ftc.robotcore.internal.network.DeviceNameManagerFactory;
import org.firstinspires.ftc.robotcore.internal.network.PreferenceRemoterRC;
import org.firstinspires.ftc.robotcore.internal.network.StartResult;
import org.firstinspires.ftc.robotcore.internal.network.WifiDirectChannelChanger;
import org.firstinspires.ftc.robotcore.internal.network.WifiMuteEvent;
import org.firstinspires.ftc.robotcore.internal.network.WifiMuteStateMachine;
import org.firstinspires.ftc.robotcore.internal.opmode.ClassManager;
import org.firstinspires.ftc.robotcore.internal.opmode.OnBotJavaHelper;
import org.firstinspires.ftc.robotcore.internal.system.AppAliveNotifier;
import org.firstinspires.ftc.robotcore.internal.system.AppUtil;
import org.firstinspires.ftc.robotcore.internal.system.Assert;
import org.firstinspires.ftc.robotcore.internal.system.PreferencesHelper;
import org.firstinspires.ftc.robotcore.internal.system.ServiceController;
import org.firstinspires.ftc.robotcore.internal.ui.ThemedActivity;
import org.firstinspires.ftc.robotcore.internal.ui.UILocation;
import org.firstinspires.ftc.robotcore.internal.webserver.RobotControllerWebInfo;
import org.firstinspires.ftc.robotserver.internal.programmingmode.ProgrammingModeManager;
import org.firstinspires.inspection.RcInspectionActivity;
import org.threeten.bp.YearMonth;
import org.xmlpull.v1.XmlPullParserException;
import java.io.FileNotFoundException;
import java.util.List;
import java.util.Queue;
import java.util.concurrent.ConcurrentLinkedQueue;
@SuppressWarnings("WeakerAccess")
public class FtcRobotControllerActivity extends Activity
{
public static final String TAG = "RCActivity";
public String getTag() { return TAG; }
private static final int REQUEST_CONFIG_WIFI_CHANNEL = 1;
private static final int NUM_GAMEPADS = 2;
protected WifiManager.WifiLock wifiLock;
protected RobotConfigFileManager cfgFileMgr;
private OnBotJavaHelper onBotJavaHelper;
protected ProgrammingModeManager programmingModeManager;
protected UpdateUI.Callback callback;
protected Context context;
protected Utility utility;
protected StartResult prefRemoterStartResult = new StartResult();
protected StartResult deviceNameStartResult = new StartResult();
protected PreferencesHelper preferencesHelper;
protected final SharedPreferencesListener sharedPreferencesListener = new SharedPreferencesListener();
protected ImageButton buttonMenu;
protected TextView textDeviceName;
protected TextView textNetworkConnectionStatus;
protected TextView textRobotStatus;
protected TextView[] textGamepad = new TextView[NUM_GAMEPADS];
protected TextView textOpMode;
protected TextView textErrorMessage;
protected ImmersiveMode immersion;
protected UpdateUI updateUI;
protected Dimmer dimmer;
protected LinearLayout entireScreenLayout;
protected FtcRobotControllerService controllerService;
protected NetworkType networkType;
protected FtcEventLoop eventLoop;
protected Queue<UsbDevice> receivedUsbAttachmentNotifications;
protected WifiMuteStateMachine wifiMuteStateMachine;
protected MotionDetection motionDetection;
private static boolean permissionsValidated = false;
private WifiDirectChannelChanger wifiDirectChannelChanger;
protected class RobotRestarter implements Restarter {
public void requestRestart() {
requestRobotRestart();
}
}
protected boolean serviceShouldUnbind = false;
protected ServiceConnection connection = new ServiceConnection() {
@Override
public void onServiceConnected(ComponentName name, IBinder service) {
FtcRobotControllerBinder binder = (FtcRobotControllerBinder) service;
onServiceBind(binder.getService());
}
@Override
public void onServiceDisconnected(ComponentName name) {
RobotLog.vv(FtcRobotControllerService.TAG, "%s.controllerService=null", TAG);
controllerService = null;
}
};
@Override
protected void onNewIntent(Intent intent) {
super.onNewIntent(intent);
if (UsbManager.ACTION_USB_DEVICE_ATTACHED.equals(intent.getAction())) {
UsbDevice usbDevice = intent.getParcelableExtra(UsbManager.EXTRA_DEVICE);
RobotLog.vv(TAG, "ACTION_USB_DEVICE_ATTACHED: %s", usbDevice.getDeviceName());
if (usbDevice != null) { // paranoia
// We might get attachment notifications before the event loop is set up, so
// we hold on to them and pass them along only when we're good and ready.
if (receivedUsbAttachmentNotifications != null) { // *total* paranoia
receivedUsbAttachmentNotifications.add(usbDevice);
passReceivedUsbAttachmentsToEventLoop();
}
}
}
}
protected void passReceivedUsbAttachmentsToEventLoop() {
if (this.eventLoop != null) {
for (;;) {
UsbDevice usbDevice = receivedUsbAttachmentNotifications.poll();
if (usbDevice == null)
break;
this.eventLoop.onUsbDeviceAttached(usbDevice);
}
}
else {
// Paranoia: we don't want the pending list to grow without bound when we don't
// (yet) have an event loop
while (receivedUsbAttachmentNotifications.size() > 100) {
receivedUsbAttachmentNotifications.poll();
}
}
}
/**
* There are cases where a permission may be revoked and the system restart will restart the
* FtcRobotControllerActivity, instead of the launch activity. Detect when that happens, and throw
* the device back to the permission validator activity.
*/
protected boolean enforcePermissionValidator() {
if (!permissionsValidated) {
RobotLog.vv(TAG, "Redirecting to permission validator");
Intent permissionValidatorIntent = new Intent(AppUtil.getDefContext(), PermissionValidatorWrapper.class);
startActivity(permissionValidatorIntent);
finish();
return true;
} else {
RobotLog.vv(TAG, "Permissions validated already");
return false;
}
}
public static void setPermissionsValidated() {
permissionsValidated = true;
}
@Override
protected void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
if (enforcePermissionValidator()) {
return;
}
RobotLog.onApplicationStart(); // robustify against onCreate() following onDestroy() but using the same app instance, which apparently does happen
RobotLog.vv(TAG, "onCreate()");
ThemedActivity.appAppThemeToActivity(getTag(), this); // do this way instead of inherit to help AppInventor
// Oddly, sometimes after a crash & restart the root activity will be something unexpected, like one from before the crash; we don't yet understand why
RobotLog.vv(TAG, "rootActivity is of class %s", AppUtil.getInstance().getRootActivity().getClass().getSimpleName());
RobotLog.vv(TAG, "launchActivity is of class %s", FtcRobotControllerWatchdogService.launchActivity());
Assert.assertTrue(FtcRobotControllerWatchdogService.isLaunchActivity(AppUtil.getInstance().getRootActivity()));
Assert.assertTrue(AppUtil.getInstance().isRobotController());
// Quick check: should we pretend we're not here, and so allow the Lynx to operate as
// a stand-alone USB-connected module?
if (LynxConstants.isRevControlHub()) {
// Double-sure check that we can talk to the DB over the serial TTY
AndroidBoard.getInstance().getAndroidBoardIsPresentPin().setState(true);
}
context = this;
utility = new Utility(this);
DeviceNameManagerFactory.getInstance().start(deviceNameStartResult);
PreferenceRemoterRC.getInstance().start(prefRemoterStartResult);
receivedUsbAttachmentNotifications = new ConcurrentLinkedQueue<UsbDevice>();
eventLoop = null;
setContentView(R.layout.activity_ftc_controller);
preferencesHelper = new PreferencesHelper(TAG, context);
preferencesHelper.writeBooleanPrefIfDifferent(context.getString(R.string.pref_rc_connected), true);
preferencesHelper.getSharedPreferences().registerOnSharedPreferenceChangeListener(sharedPreferencesListener);
// Check if this RC app is from a later FTC season than what was installed previously
int ftcSeasonYearOfPreviouslyInstalledRc = preferencesHelper.readInt(getString(R.string.pref_ftc_season_year_of_current_rc), 0);
int ftcSeasonYearOfCurrentlyInstalledRc = AppUtil.getInstance().getFtcSeasonYear(AppUtil.getInstance().getLocalSdkBuildMonth()).getValue();
if (ftcSeasonYearOfCurrentlyInstalledRc > ftcSeasonYearOfPreviouslyInstalledRc) {
preferencesHelper.writeIntPrefIfDifferent(getString(R.string.pref_ftc_season_year_of_current_rc), ftcSeasonYearOfCurrentlyInstalledRc);
// Since it's a new FTC season, we should reset certain settings back to their default values.
preferencesHelper.writeBooleanPrefIfDifferent(getString(R.string.pref_warn_about_2_4_ghz_band), true);
preferencesHelper.writeBooleanPrefIfDifferent(getString(R.string.pref_warn_about_obsolete_software), true);
preferencesHelper.writeBooleanPrefIfDifferent(getString(R.string.pref_warn_about_mismatched_app_versions), true);
preferencesHelper.writeBooleanPrefIfDifferent(getString(R.string.pref_warn_about_incorrect_clocks), true);
}
entireScreenLayout = (LinearLayout) findViewById(R.id.entire_screen);
buttonMenu = (ImageButton) findViewById(R.id.menu_buttons);
buttonMenu.setOnClickListener(new View.OnClickListener() {
@Override
public void onClick(View v) {
PopupMenu popupMenu = new PopupMenu(FtcRobotControllerActivity.this, v);
popupMenu.setOnMenuItemClickListener(new PopupMenu.OnMenuItemClickListener() {
@Override
public boolean onMenuItemClick(MenuItem item) {
return onOptionsItemSelected(item); // Delegate to the handler for the hardware menu button
}
});
popupMenu.inflate(R.menu.ftc_robot_controller);
AnnotatedHooksClassFilter.getInstance().callOnCreateMenuMethods(
FtcRobotControllerActivity.this, popupMenu.getMenu());
popupMenu.show();
}
});
updateMonitorLayout(getResources().getConfiguration());
BlocksOpMode.setActivityAndWebView(this, (WebView) findViewById(R.id.webViewBlocksRuntime));
ExternalLibraries.getInstance().onCreate();
onBotJavaHelper = new OnBotJavaHelperImpl();
/*
* Paranoia as the ClassManagerFactory requires EXTERNAL_STORAGE permissions
* and we've seen on the DS where the finish() call above does not short-circuit
* the onCreate() call for the activity and then we crash here because we don't
* have permissions. So...
*/
if (permissionsValidated) {
ClassManager.getInstance().setOnBotJavaClassHelper(onBotJavaHelper);
ClassManagerFactory.registerFilters();
ClassManagerFactory.processAllClasses();
}
cfgFileMgr = new RobotConfigFileManager(this);
// Clean up 'dirty' status after a possible crash
RobotConfigFile configFile = cfgFileMgr.getActiveConfig();
if (configFile.isDirty()) {
configFile.markClean();
cfgFileMgr.setActiveConfig(false, configFile);
}
textDeviceName = (TextView) findViewById(R.id.textDeviceName);
textNetworkConnectionStatus = (TextView) findViewById(R.id.textNetworkConnectionStatus);
textRobotStatus = (TextView) findViewById(R.id.textRobotStatus);
textOpMode = (TextView) findViewById(R.id.textOpMode);
textErrorMessage = (TextView) findViewById(R.id.textErrorMessage);
textGamepad[0] = (TextView) findViewById(R.id.textGamepad1);
textGamepad[1] = (TextView) findViewById(R.id.textGamepad2);
immersion = new ImmersiveMode(getWindow().getDecorView());
dimmer = new Dimmer(this);
dimmer.longBright();
programmingModeManager = new ProgrammingModeManager();
programmingModeManager.register(new ProgrammingWebHandlers());
programmingModeManager.register(new OnBotJavaProgrammingMode());
updateUI = createUpdateUI();
callback = createUICallback(updateUI);
PreferenceManager.setDefaultValues(this, R.xml.app_settings, false);
WifiManager wifiManager = (WifiManager) getApplicationContext().getSystemService(Context.WIFI_SERVICE);
wifiLock = wifiManager.createWifiLock(WifiManager.WIFI_MODE_FULL_HIGH_PERF, "");
hittingMenuButtonBrightensScreen();
wifiLock.acquire();
callback.networkConnectionUpdate(NetworkConnection.NetworkEvent.DISCONNECTED);
readNetworkType();
ServiceController.startService(FtcRobotControllerWatchdogService.class);
bindToService();
RobotLog.logAppInfo();
RobotLog.logDeviceInfo();
AndroidBoard.getInstance().logAndroidBoardInfo();
if (preferencesHelper.readBoolean(getString(R.string.pref_wifi_automute), false)) {
initWifiMute(true);
}
FtcAboutActivity.setBuildTimeFromBuildConfig(BuildConfig.APP_BUILD_TIME);
// check to see if there is a preferred Wi-Fi to use.
checkPreferredChannel();
AnnotatedHooksClassFilter.getInstance().callOnCreateMethods(this);
}
protected UpdateUI createUpdateUI() {
Restarter restarter = new RobotRestarter();
UpdateUI result = new UpdateUI(this, dimmer);
result.setRestarter(restarter);
result.setTextViews(textNetworkConnectionStatus, textRobotStatus, textGamepad, textOpMode, textErrorMessage, textDeviceName);
return result;
}
protected UpdateUI.Callback createUICallback(UpdateUI updateUI) {
UpdateUI.Callback result = updateUI.new Callback();
result.setStateMonitor(new SoundPlayingRobotMonitor());
return result;
}
@Override
protected void onStart() {
super.onStart();
RobotLog.vv(TAG, "onStart()");
entireScreenLayout.setOnTouchListener(new View.OnTouchListener() {
@Override
public boolean onTouch(View v, MotionEvent event) {
dimmer.handleDimTimer();
return false;
}
});
}
@Override
protected void onResume() {
super.onResume();
RobotLog.vv(TAG, "onResume()");
// In case the user just got back from fixing their clock, refresh ClockWarningSource
ClockWarningSource.getInstance().onPossibleRcClockUpdate();
}
@Override
protected void onPause() {
super.onPause();
RobotLog.vv(TAG, "onPause()");
}
@Override
protected void onStop() {
// Note: this gets called even when the configuration editor is launched. That is, it gets
// called surprisingly often. So, we don't actually do much here.
super.onStop();
RobotLog.vv(TAG, "onStop()");
}
@Override
protected void onDestroy() {
super.onDestroy();
RobotLog.vv(TAG, "onDestroy()");
shutdownRobot(); // Ensure the robot is put away to bed
if (callback != null) callback.close();
PreferenceRemoterRC.getInstance().stop(prefRemoterStartResult);
DeviceNameManagerFactory.getInstance().stop(deviceNameStartResult);
unbindFromService();
// If the app is stopped manually, then we presumably don't need the auto-starting function
ServiceController.stopService(FtcRobotControllerWatchdogService.class);
if (wifiLock != null) wifiLock.release();
if (preferencesHelper != null) preferencesHelper.getSharedPreferences().unregisterOnSharedPreferenceChangeListener(sharedPreferencesListener);
RobotLog.cancelWriteLogcatToDisk();
AnnotatedHooksClassFilter.getInstance().callOnDestroyMethods(this);
}
protected void bindToService() {
readNetworkType();
Intent intent = new Intent(this, FtcRobotControllerService.class);
intent.putExtra(NetworkConnectionFactory.NETWORK_CONNECTION_TYPE, networkType);
serviceShouldUnbind = bindService(intent, connection, Context.BIND_AUTO_CREATE);
}
protected void unbindFromService() {
if (serviceShouldUnbind) {
unbindService(connection);
serviceShouldUnbind = false;
}
}
protected void readNetworkType() {
// Control hubs are always running the access point model. Everything else, for the time
// being always runs the Wi-Fi Direct model.
if (Device.isRevControlHub()) {
networkType = NetworkType.RCWIRELESSAP;
} else {
networkType = NetworkType.fromString(preferencesHelper.readString(context.getString(R.string.pref_pairing_kind), NetworkType.globalDefaultAsString()));
}
// update the app_settings
preferencesHelper.writeStringPrefIfDifferent(context.getString(R.string.pref_pairing_kind), networkType.toString());
}
@Override
public void onWindowFocusChanged(boolean hasFocus) {
super.onWindowFocusChanged(hasFocus);
if (hasFocus) {
immersion.hideSystemUI();
getWindow().setFlags(WindowManager.LayoutParams.FLAG_TRANSLUCENT_NAVIGATION, WindowManager.LayoutParams.FLAG_TRANSLUCENT_NAVIGATION);
}
}
@Override
public boolean onCreateOptionsMenu(Menu menu) {
getMenuInflater().inflate(R.menu.ftc_robot_controller, menu);
AnnotatedHooksClassFilter.getInstance().callOnCreateMenuMethods(this, menu);
return true;
}
private boolean isRobotRunning() {
if (controllerService == null) {
return false;
}
Robot robot = controllerService.getRobot();
if ((robot == null) || (robot.eventLoopManager == null)) {
return false;
}
RobotState robotState = robot.eventLoopManager.state;
if (robotState != RobotState.RUNNING) {
return false;
} else {
return true;
}
}
@Override
public boolean onOptionsItemSelected(MenuItem item) {
int id = item.getItemId();
if (id == R.id.action_program_and_manage) {
if (isRobotRunning()) {
Intent programmingModeIntent = new Intent(AppUtil.getDefContext(), ProgramAndManageActivity.class);
RobotControllerWebInfo webInfo = programmingModeManager.getWebServer().getConnectionInformation();
programmingModeIntent.putExtra(LaunchActivityConstantsList.RC_WEB_INFO, webInfo.toJson());
startActivity(programmingModeIntent);
} else {
AppUtil.getInstance().showToast(UILocation.ONLY_LOCAL, context.getString(R.string.toastWifiUpBeforeProgrammingMode));
}
} else if (id == R.id.action_inspection_mode) {
Intent inspectionModeIntent = new Intent(AppUtil.getDefContext(), RcInspectionActivity.class);
startActivity(inspectionModeIntent);
return true;
} else if (id == R.id.action_restart_robot) {
dimmer.handleDimTimer();
AppUtil.getInstance().showToast(UILocation.BOTH, context.getString(R.string.toastRestartingRobot));
requestRobotRestart();
return true;
}
else if (id == R.id.action_configure_robot) {
EditParameters parameters = new EditParameters();
Intent intentConfigure = new Intent(AppUtil.getDefContext(), FtcLoadFileActivity.class);
parameters.putIntent(intentConfigure);
startActivityForResult(intentConfigure, RequestCode.CONFIGURE_ROBOT_CONTROLLER.ordinal());
}
else if (id == R.id.action_settings) {
// historical: this once erroneously used FTC_CONFIGURE_REQUEST_CODE_ROBOT_CONTROLLER
Intent settingsIntent = new Intent(AppUtil.getDefContext(), FtcRobotControllerSettingsActivity.class);
startActivityForResult(settingsIntent, RequestCode.SETTINGS_ROBOT_CONTROLLER.ordinal());
return true;
}
else if (id == R.id.action_about) {
Intent intent = new Intent(AppUtil.getDefContext(), FtcAboutActivity.class);
startActivity(intent);
return true;
}
else if (id == R.id.action_exit_app) {
//Clear backstack and everything to prevent edge case where VM might be
//restarted (after it was exited) if more than one activity was on the
//backstack for some reason.
finishAffinity();
//For lollipop and up, we can clear ourselves from the recents list too
if (android.os.Build.VERSION.SDK_INT >= android.os.Build.VERSION_CODES.LOLLIPOP) {
ActivityManager manager = (ActivityManager) getSystemService(ACTIVITY_SERVICE);
List<ActivityManager.AppTask> tasks = manager.getAppTasks();
for (ActivityManager.AppTask task : tasks) {
task.finishAndRemoveTask();
}
}
// Allow the user to use the Control Hub operating system's UI, instead of relaunching the app
AppAliveNotifier.getInstance().disableAppWatchdogUntilNextAppStart();
//Finally, nuke the VM from orbit
AppUtil.getInstance().exitApplication();
return true;
}
return super.onOptionsItemSelected(item);
}
@Override
public void onConfigurationChanged(Configuration newConfig) {
super.onConfigurationChanged(newConfig);
// don't destroy assets on screen rotation
updateMonitorLayout(newConfig);
}
/**
* Updates the orientation of monitorContainer (which contains cameraMonitorView)
* based on the given configuration. Makes the children split the space.
*/
private void updateMonitorLayout(Configuration configuration) {
LinearLayout monitorContainer = (LinearLayout) findViewById(R.id.monitorContainer);
if (configuration.orientation == Configuration.ORIENTATION_LANDSCAPE) {
// When the phone is landscape, lay out the monitor views horizontally.
monitorContainer.setOrientation(LinearLayout.HORIZONTAL);
for (int i = 0; i < monitorContainer.getChildCount(); i++) {
View view = monitorContainer.getChildAt(i);
view.setLayoutParams(new LayoutParams(0, LayoutParams.MATCH_PARENT, 1 /* weight */));
}
} else {
// When the phone is portrait, lay out the monitor views vertically.
monitorContainer.setOrientation(LinearLayout.VERTICAL);
for (int i = 0; i < monitorContainer.getChildCount(); i++) {
View view = monitorContainer.getChildAt(i);
view.setLayoutParams(new LayoutParams(LayoutParams.MATCH_PARENT, 0, 1 /* weight */));
}
}
monitorContainer.requestLayout();
}
@Override
protected void onActivityResult(int request, int result, Intent intent) {
if (request == REQUEST_CONFIG_WIFI_CHANNEL) {
if (result == RESULT_OK) {
AppUtil.getInstance().showToast(UILocation.BOTH, context.getString(R.string.toastWifiConfigurationComplete));
}
}
// There was some historical confusion about launch codes here, so we err on the safe side
if (request == RequestCode.CONFIGURE_ROBOT_CONTROLLER.ordinal() || request == RequestCode.SETTINGS_ROBOT_CONTROLLER.ordinal()) {
// We always do a refresh, whether it was a cancel or an OK, for robustness
shutdownRobot();
cfgFileMgr.getActiveConfigAndUpdateUI();
updateUIAndRequestRobotSetup();
}
}
public void onServiceBind(final FtcRobotControllerService service) {
RobotLog.vv(FtcRobotControllerService.TAG, "%s.controllerService=bound", TAG);
controllerService = service;
updateUI.setControllerService(controllerService);
controllerService.setOnBotJavaHelper(onBotJavaHelper);
updateUIAndRequestRobotSetup();
programmingModeManager.setState(new FtcRobotControllerServiceState() {
@NonNull
@Override
public WebServer getWebServer() {
return service.getWebServer();
}
@Nullable
@Override
public OnBotJavaHelper getOnBotJavaHelper() {
return service.getOnBotJavaHelper();
}
@Override
public EventLoopManager getEventLoopManager() {
return service.getRobot().eventLoopManager;
}
});
AnnotatedHooksClassFilter.getInstance().callWebHandlerRegistrarMethods(this,
service.getWebServer().getWebHandlerManager());
}
private void updateUIAndRequestRobotSetup() {
if (controllerService != null) {
callback.networkConnectionUpdate(controllerService.getNetworkConnectionStatus());
callback.updateRobotStatus(controllerService.getRobotStatus());
// Only show this first-time toast on headless systems: what we have now on non-headless suffices
requestRobotSetup(LynxConstants.isRevControlHub()
? new Runnable() {
@Override public void run() {
showRestartRobotCompleteToast(R.string.toastRobotSetupComplete);
}
}
: null);
}
}
private void requestRobotSetup(@Nullable Runnable runOnComplete) {
if (controllerService == null) return;
RobotConfigFile file = cfgFileMgr.getActiveConfigAndUpdateUI();
HardwareFactory hardwareFactory = new HardwareFactory(context);
try {
hardwareFactory.setXmlPullParser(file.getXml());
} catch (FileNotFoundException | XmlPullParserException e) {
RobotLog.ww(TAG, e, "Unable to set configuration file %s. Falling back on noConfig.", file.getName());
file = RobotConfigFile.noConfig(cfgFileMgr);
try {
hardwareFactory.setXmlPullParser(file.getXml());
cfgFileMgr.setActiveConfigAndUpdateUI(false, file);
} catch (FileNotFoundException | XmlPullParserException e1) {
RobotLog.ee(TAG, e1, "Failed to fall back on noConfig");
}
}
OpModeRegister userOpModeRegister = createOpModeRegister();
eventLoop = new FtcEventLoop(hardwareFactory, userOpModeRegister, callback, this);
FtcEventLoopIdle idleLoop = new FtcEventLoopIdle(hardwareFactory, userOpModeRegister, callback, this);
controllerService.setCallback(callback);
controllerService.setupRobot(eventLoop, idleLoop, runOnComplete);
passReceivedUsbAttachmentsToEventLoop();
AndroidBoard.showErrorIfUnknownControlHub();
AnnotatedHooksClassFilter.getInstance().callOnCreateEventLoopMethods(this, eventLoop);
}
protected OpModeRegister createOpModeRegister() {
return new FtcOpModeRegister();
}
private void shutdownRobot() {
if (controllerService != null) controllerService.shutdownRobot();
}
private void requestRobotRestart() {
AppUtil.getInstance().showToast(UILocation.BOTH, AppUtil.getDefContext().getString(R.string.toastRestartingRobot));
//
RobotLog.clearGlobalErrorMsg();
RobotLog.clearGlobalWarningMsg();
shutdownRobot();
requestRobotSetup(new Runnable() {
@Override public void run() {
showRestartRobotCompleteToast(R.string.toastRestartRobotComplete);
}
});
}
private void showRestartRobotCompleteToast(@StringRes int resid) {
AppUtil.getInstance().showToast(UILocation.BOTH, AppUtil.getDefContext().getString(resid));
}
private void checkPreferredChannel() {
// For P2P network, check to see what preferred channel is.
if (networkType == NetworkType.WIFIDIRECT) {
int prefChannel = preferencesHelper.readInt(getString(com.qualcomm.ftccommon.R.string.pref_wifip2p_channel), -1);
if (prefChannel == -1) {
prefChannel = 0;
RobotLog.vv(TAG, "pref_wifip2p_channel: No preferred channel defined. Will use a default value of %d", prefChannel);
} else {
RobotLog.vv(TAG, "pref_wifip2p_channel: Found existing preferred channel (%d).", prefChannel);
}
// attempt to set the preferred channel.
RobotLog.vv(TAG, "pref_wifip2p_channel: attempting to set preferred channel...");
wifiDirectChannelChanger = new WifiDirectChannelChanger();
wifiDirectChannelChanger.changeToChannel(prefChannel);
}
}
protected void hittingMenuButtonBrightensScreen() {
ActionBar actionBar = getActionBar();
if (actionBar != null) {
actionBar.addOnMenuVisibilityListener(new ActionBar.OnMenuVisibilityListener() {
@Override
public void onMenuVisibilityChanged(boolean isVisible) {
if (isVisible) {
dimmer.handleDimTimer();
}
}
});
}
}
protected class SharedPreferencesListener implements SharedPreferences.OnSharedPreferenceChangeListener {
@Override public void onSharedPreferenceChanged(SharedPreferences sharedPreferences, String key) {
if (key.equals(context.getString(R.string.pref_app_theme))) {
ThemedActivity.restartForAppThemeChange(getTag(), getString(R.string.appThemeChangeRestartNotifyRC));
} else if (key.equals(context.getString(R.string.pref_wifi_automute))) {
if (preferencesHelper.readBoolean(context.getString(R.string.pref_wifi_automute), false)) {
initWifiMute(true);
} else {
initWifiMute(false);
}
}
}
}
protected void initWifiMute(boolean enable) {
if (enable) {
wifiMuteStateMachine = new WifiMuteStateMachine();
wifiMuteStateMachine.initialize();
wifiMuteStateMachine.start();
motionDetection = new MotionDetection(2.0, 10);
motionDetection.startListening();
motionDetection.registerListener(new MotionDetection.MotionDetectionListener() {
@Override
public void onMotionDetected(double vector)
{
wifiMuteStateMachine.consumeEvent(WifiMuteEvent.USER_ACTIVITY);
}
});
} else {
wifiMuteStateMachine.stop();
wifiMuteStateMachine = null;
motionDetection.stopListening();
motionDetection.purgeListeners();
motionDetection = null;
}
}
@Override
public void onUserInteraction() {
if (wifiMuteStateMachine != null) {
wifiMuteStateMachine.consumeEvent(WifiMuteEvent.USER_ACTIVITY);
}
}
}

View File

@ -0,0 +1,91 @@
/*
* Copyright (c) 2018 Craig MacFarlane
*
* All rights reserved.
*
* Redistribution and use in source and binary forms, with or without modification, are permitted
* (subject to the limitations in the disclaimer below) provided that the following conditions are
* met:
*
* Redistributions of source code must retain the above copyright notice, this list of conditions
* and the following disclaimer.
*
* Redistributions in binary form must reproduce the above copyright notice, this list of conditions
* and the following disclaimer in the documentation and/or other materials provided with the
* distribution.
*
* Neither the name of Craig MacFarlane nor the names of its contributors may be used to
* endorse or promote products derived from this software without specific prior written permission.
*
* NO EXPRESS OR IMPLIED LICENSES TO ANY PARTY'S PATENT RIGHTS ARE GRANTED BY THIS LICENSE. THIS
* SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS" AND ANY EXPRESS OR IMPLIED
* WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS
* FOR A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR CONTRIBUTORS BE
* LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES
* (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA,
* OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN
* CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF
* THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
*/
package org.firstinspires.ftc.robotcontroller.internal;
import android.Manifest;
import android.os.Bundle;
import com.qualcomm.ftcrobotcontroller.R;
import org.firstinspires.ftc.robotcore.internal.system.Misc;
import org.firstinspires.ftc.robotcore.internal.system.PermissionValidatorActivity;
import java.util.ArrayList;
import java.util.List;
public class PermissionValidatorWrapper extends PermissionValidatorActivity {
private final String TAG = "PermissionValidatorWrapper";
/*
* The list of dangerous permissions the robot controller needs.
*/
protected List<String> robotControllerPermissions = new ArrayList<String>() {{
add(Manifest.permission.WRITE_EXTERNAL_STORAGE);
add(Manifest.permission.READ_EXTERNAL_STORAGE);
add(Manifest.permission.CAMERA);
add(Manifest.permission.ACCESS_COARSE_LOCATION);
add(Manifest.permission.ACCESS_FINE_LOCATION);
add(Manifest.permission.READ_PHONE_STATE);
}};
private final static Class startApplication = FtcRobotControllerActivity.class;
public String mapPermissionToExplanation(final String permission) {
if (permission.equals(Manifest.permission.WRITE_EXTERNAL_STORAGE)) {
return Misc.formatForUser(R.string.permRcWriteExternalStorageExplain);
} else if (permission.equals(Manifest.permission.READ_EXTERNAL_STORAGE)) {
return Misc.formatForUser(R.string.permRcReadExternalStorageExplain);
} else if (permission.equals(Manifest.permission.CAMERA)) {
return Misc.formatForUser(R.string.permRcCameraExplain);
} else if (permission.equals(Manifest.permission.ACCESS_COARSE_LOCATION)) {
return Misc.formatForUser(R.string.permAccessLocationExplain);
} else if (permission.equals(Manifest.permission.ACCESS_FINE_LOCATION)) {
return Misc.formatForUser(R.string.permAccessLocationExplain);
} else if (permission.equals(Manifest.permission.READ_PHONE_STATE)) {
return Misc.formatForUser(R.string.permReadPhoneState);
}
return Misc.formatForUser(R.string.permGenericExplain);
}
@Override
protected void onCreate(Bundle savedInstanceState)
{
super.onCreate(savedInstanceState);
permissions = robotControllerPermissions;
}
protected Class onStartApplication()
{
FtcRobotControllerActivity.setPermissionsValidated();
return startApplication;
}
}

Binary file not shown.

Size: 975 B

Binary file not shown.

Size: 4.7 KiB

View File

@ -0,0 +1,184 @@
<!--
Copyright (c) 2014, 2015 Qualcomm Technologies Inc
All rights reserved.
Redistribution and use in source and binary forms, with or without modification,
are permitted (subject to the limitations in the disclaimer below) provided that
the following conditions are met:
Redistributions of source code must retain the above copyright notice, this list
of conditions and the following disclaimer.
Redistributions in binary form must reproduce the above copyright notice, this
list of conditions and the following disclaimer in the documentation and/or
other materials provided with the distribution.
Neither the name of Qualcomm Technologies Inc nor the names of its contributors
may be used to endorse or promote products derived from this software without
specific prior written permission.
NO EXPRESS OR IMPLIED LICENSES TO ANY PARTY'S PATENT RIGHTS ARE GRANTED BY THIS
LICENSE. THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
"AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO,
THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE
ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR CONTRIBUTORS BE LIABLE
FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL
DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR
SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER
CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY,
OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
-->
<LinearLayout xmlns:android="http://schemas.android.com/apk/res/android"
xmlns:tools="http://schemas.android.com/tools"
android:layout_width="match_parent"
android:layout_height="match_parent"
xmlns:style="http://schemas.android.com/apk/res-auto"
tools:context="org.firstinspires.ftc.robotcontroller.internal.FtcRobotControllerActivity"
android:focusable="true"
android:id="@+id/entire_screen"
android:orientation="vertical">
<!-- black bar on top -->
<RelativeLayout
android:id="@+id/top_bar"
android:layout_width="fill_parent"
android:layout_height="80dp"
android:background="@color/background_black">
<ImageView
android:id="@+id/robotIcon"
android:src="@drawable/icon_robotcontroller"
android:layout_width="wrap_content"
android:layout_height="fill_parent"
android:adjustViewBounds="true"
android:layout_margin="1dp"/>
<TextView
android:id="@+id/textDeviceName"
android:layout_toEndOf="@id/robotIcon"
android:layout_width="wrap_content"
android:layout_height="wrap_content"
android:textColor="?attr/textWhite"
android:textAppearance="?android:attr/textAppearanceMedium"
android:padding="8dp"
android:textStyle="bold"/>
<ImageButton
android:id="@+id/menu_buttons"
android:scaleType="fitXY"
android:layout_width="wrap_content"
android:layout_height="wrap_content"
android:src="@drawable/icon_menu"
android:background="@android:color/transparent"
android:padding="15dp"
android:adjustViewBounds="true"
android:layout_alignParentEnd="true"
android:layout_centerInParent="true"
android:layout_centerHorizontal="true"
android:layout_margin="10dp"/>
</RelativeLayout>
<!-- end of black bar -->
<include layout="@layout/header"
android:id="@+id/included_header"/>
<RelativeLayout
android:background="@color/background_very_light_grey"
android:id="@+id/RelativeLayout"
android:layout_width="match_parent"
android:layout_height="match_parent"
android:keepScreenOn="true"
android:paddingBottom="@dimen/activity_vertical_margin"
android:paddingLeft="@dimen/activity_horizontal_margin"
android:paddingRight="@dimen/activity_horizontal_margin"
android:paddingTop="@dimen/activity_vertical_margin" >
<TextView
android:id="@+id/textNetworkConnectionStatus"
android:layout_width="match_parent"
android:layout_height="wrap_content"
android:visibility="invisible"
android:text="" />
<TextView
android:id="@+id/textRobotStatus"
android:layout_width="match_parent"
android:layout_height="wrap_content"
android:layout_below="@+id/textNetworkConnectionStatus"
android:visibility="invisible"
android:text="" />
<TextView
android:id="@+id/textOpMode"
android:layout_width="match_parent"
android:layout_height="wrap_content"
android:layout_below="@+id/textRobotStatus"
android:visibility="invisible"
android:text="" />
<FrameLayout
android:layout_width="match_parent"
android:layout_height="wrap_content"
android:layout_below="@+id/textOpMode"
android:layout_above="@+id/textGamepad1">
<LinearLayout
android:id="@+id/monitorContainer"
android:layout_width="match_parent"
android:layout_height="match_parent"
android:orientation="vertical">
<LinearLayout
android:id="@+id/cameraMonitorViewId"
android:visibility="gone"
android:layout_width="match_parent"
android:layout_height="0dp"
android:layout_weight="1"
android:orientation="vertical"
/>
</LinearLayout>
<!-- When visible, the error message will overlay and thus partially
obscure part of the camera monitor. We make this trade off so as to
allow for a bigger camera monitor view in the common case when the
error is not in fact present
-->
<TextView style="@style/FtcTextViewStyleBold"
android:id="@+id/textErrorMessage"
android:layout_width="match_parent"
android:layout_height="wrap_content"
android:text=""
android:visibility="invisible"
android:textColor="?attr/textMediumDark" />
</FrameLayout>
<TextView
android:id="@+id/textGamepad1"
android:layout_width="match_parent"
android:layout_height="wrap_content"
android:layout_above="@+id/textGamepad2"
android:visibility="invisible"
android:text="" />
<TextView
android:id="@+id/textGamepad2"
android:layout_width="match_parent"
android:layout_height="wrap_content"
android:layout_alignParentBottom="true"
android:visibility="invisible"
android:text="" />
</RelativeLayout>
<WebView
android:id="@+id/webViewBlocksRuntime"
android:layout_width="1dp"
android:layout_height="1dp" />
</LinearLayout>

View File

@ -0,0 +1,78 @@
<!--
Copyright (c) 2014, 2015 Qualcomm Technologies Inc
All rights reserved.
Redistribution and use in source and binary forms, with or without modification,
are permitted (subject to the limitations in the disclaimer below) provided that
the following conditions are met:
Redistributions of source code must retain the above copyright notice, this list
of conditions and the following disclaimer.
Redistributions in binary form must reproduce the above copyright notice, this
list of conditions and the following disclaimer in the documentation and/or
other materials provided with the distribution.
Neither the name of Qualcomm Technologies Inc nor the names of its contributors
may be used to endorse or promote products derived from this software without
specific prior written permission.
NO EXPRESS OR IMPLIED LICENSES TO ANY PARTY'S PATENT RIGHTS ARE GRANTED BY THIS
LICENSE. THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
"AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO,
THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE
ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR CONTRIBUTORS BE LIABLE
FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL
DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR
SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER
CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY,
OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
-->
<menu xmlns:android="http://schemas.android.com/apk/res/android"
xmlns:app="http://schemas.android.com/apk/res-auto">
<item
android:id="@+id/action_settings"
android:orderInCategory="100"
app:showAsAction="never"
android:title="@string/settings_menu_item"/>
<item
android:id="@+id/action_restart_robot"
android:orderInCategory="200"
app:showAsAction="never"
android:title="@string/restart_robot_menu_item"/>
<item
android:id="@+id/action_configure_robot"
android:orderInCategory="300"
app:showAsAction="never"
android:title="@string/configure_robot_menu_item"/>
<item
android:id="@+id/action_program_and_manage"
android:orderInCategory="550"
app:showAsAction="never"
android:title="@string/program_and_manage_menu_item"/>
<item
android:id="@+id/action_inspection_mode"
android:orderInCategory="600"
app:showAsAction="never"
android:title="@string/inspection_mode_menu_item"/>
<item
android:id="@+id/action_about"
android:orderInCategory="999"
android:icon="@android:drawable/ic_menu_info_details"
android:title="@string/about_menu_item"/>
<item
android:id="@+id/action_exit_app"
android:orderInCategory="1000"
android:icon="@android:drawable/ic_menu_info_details"
android:title="@string/exit_menu_item"/>
</menu>
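This menu resource only declares the overflow actions; the activity that inflates it decides what each ID does. That handling lives in FtcRobotControllerActivity, which is not part of this diff, so the sketch below is a hypothetical illustration of the standard Android dispatch pattern (the class name, the R.menu resource name, and the module's R class are assumptions):

    import android.app.Activity;
    import android.view.Menu;
    import android.view.MenuItem;

    public class MenuDispatchExample extends Activity
    {
        @Override
        public boolean onCreateOptionsMenu(Menu menu)
        {
            getMenuInflater().inflate(R.menu.ftc_robot_controller, menu);  // resource name assumed
            return true;
        }

        @Override
        public boolean onOptionsItemSelected(MenuItem item)
        {
            int id = item.getItemId();
            if (id == R.id.action_settings)
            {
                // Launch the settings screen here.
                return true;
            }
            else if (id == R.id.action_exit_app)
            {
                finish();
                return true;
            }
            return super.onOptionsItemSelected(item);
        }
    }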

Binary file not shown.

Binary file not shown.

View File

@ -0,0 +1,40 @@
<!--
Copyright (c) 2014, 2015 Qualcomm Technologies Inc
All rights reserved.
Redistribution and use in source and binary forms, with or without modification,
are permitted (subject to the limitations in the disclaimer below) provided that
the following conditions are met:
Redistributions of source code must retain the above copyright notice, this list
of conditions and the following disclaimer.
Redistributions in binary form must reproduce the above copyright notice, this
list of conditions and the following disclaimer in the documentation and/or
other materials provided with the distribution.
Neither the name of Qualcomm Technologies Inc nor the names of its contributors
may be used to endorse or promote products derived from this software without
specific prior written permission.
NO EXPRESS OR IMPLIED LICENSES TO ANY PARTY'S PATENT RIGHTS ARE GRANTED BY THIS
LICENSE. THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
"AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO,
THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE
ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR CONTRIBUTORS BE LIABLE
FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL
DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR
SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER
CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY,
OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
-->
<resources>
<!-- Default screen margins, per the Android Design guidelines. -->
<dimen name="activity_horizontal_margin">16dp</dimen>
<dimen name="activity_vertical_margin">5dp</dimen>
</resources>

View File

@ -0,0 +1,72 @@
<?xml version="1.0" encoding="utf-8"?>
<!--
strings.xml in FtcRobotController
Copyright (c) 2014, 2015 Qualcomm Technologies Inc
All rights reserved.
Redistribution and use in source and binary forms, with or without modification,
are permitted (subject to the limitations in the disclaimer below) provided that
the following conditions are met:
Redistributions of source code must retain the above copyright notice, this list
of conditions and the following disclaimer.
Redistributions in binary form must reproduce the above copyright notice, this
list of conditions and the following disclaimer in the documentation and/or
other materials provided with the distribution.
Neither the name of Qualcomm Technologies Inc nor the names of its contributors
may be used to endorse or promote products derived from this software without
specific prior written permission.
NO EXPRESS OR IMPLIED LICENSES TO ANY PARTY'S PATENT RIGHTS ARE GRANTED BY THIS
LICENSE. THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
"AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO,
THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE
ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR CONTRIBUTORS BE LIABLE
FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL
DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR
SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER
CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY,
OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
-->
<resources>
<string name="app_name">FTC Robot Controller</string>
<!-- Menu -->
<string name="inspection_mode_menu_item">Self Inspect</string>
<string name="program_and_manage_menu_item">Program &amp; Manage</string>
<string name="blocks_menu_item">Blocks</string>
<string name="settings_menu_item">Settings</string>
<string name="restart_robot_menu_item">Restart Robot</string>
<string name="configure_robot_menu_item">Configure Robot</string>
<string name="about_menu_item">About</string>
<string name="exit_menu_item">Exit</string>
<!-- Toast messages -->
<string name="toastWifiConfigurationComplete">Configuration Complete</string>
<string name="toastRestartingRobot">Restarting Robot</string>
<string name="toastWifiUpBeforeProgrammingMode">The Robot Controller must be fully up and running before entering Program and Manage Mode.</string>
<!-- for interpreting pref_app_theme contents. may be overridden in merged resources -->
<integer-array name="app_theme_ids">
<item>@style/AppThemeRedRC</item>
<item>@style/AppThemeGreenRC</item>
<item>@style/AppThemeBlueRC</item>
<item>@style/AppThemePurpleRC</item>
<item>@style/AppThemeOrangeRC</item>
<item>@style/AppThemeTealRC</item>
</integer-array>
<string translatable="false" name="pref_ftc_season_year_of_current_rc">pref_ftc_season_year_of_current_rc_new</string>
<string name="packageName">@string/packageNameRobotController</string>
</resources>
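The app_theme_ids array above maps each theme choice stored under pref_app_theme to a concrete style resource. The SDK performs that lookup itself; the sketch below is only a hypothetical illustration of resolving such an integer-array of style references with the standard TypedArray API (the class name and the index parameter are assumptions):

    import android.content.Context;
    import android.content.res.TypedArray;

    public final class ThemeLookupExample
    {
        // Returns the style resource at the given position of app_theme_ids,
        // e.g. index 0 resolves to @style/AppThemeRedRC as declared below.
        public static int themeStyleAt(Context context, int index)
        {
            TypedArray themes = context.getResources().obtainTypedArray(R.array.app_theme_ids);
            try
            {
                return themes.getResourceId(index, 0);  // 0 means "no resource" fallback
            }
            finally
            {
                themes.recycle();  // typed arrays must be recycled after use
            }
        }
    }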

View File

@ -0,0 +1,23 @@
<?xml version="1.0" encoding="utf-8"?>
<!-- This is a placeholder for adding future style-specific content in the robot controller only -->
<resources>
<style name="AppThemeRedRC" parent="AppTheme.Red">
</style>
<style name="AppThemeGreenRC" parent="AppTheme.Green">
</style>
<style name="AppThemeBlueRC" parent="AppTheme.Blue">
</style>
<style name="AppThemePurpleRC" parent="AppTheme.Purple">
</style>
<style name="AppThemeOrangeRC" parent="AppTheme.Orange">
</style>
<style name="AppThemeTealRC" parent="AppTheme.Teal">
</style>
</resources>

View File

@ -0,0 +1,93 @@
<?xml version="1.0" encoding="utf-8"?>
<!--
app_settings.xml in FtcRobotController
Copyright (c) 2014, 2015 Qualcomm Technologies Inc
All rights reserved.
Redistribution and use in source and binary forms, with or without modification,
are permitted (subject to the limitations in the disclaimer below) provided that
the following conditions are met:
Redistributions of source code must retain the above copyright notice, this list
of conditions and the following disclaimer.
Redistributions in binary form must reproduce the above copyright notice, this
list of conditions and the following disclaimer in the documentation and/or
other materials provided with the distribution.
Neither the name of Qualcomm Technologies Inc nor the names of its contributors
may be used to endorse or promote products derived from this software without
specific prior written permission.
NO EXPRESS OR IMPLIED LICENSES TO ANY PARTY'S PATENT RIGHTS ARE GRANTED BY THIS
LICENSE. THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
"AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO,
THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE
ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR CONTRIBUTORS BE LIABLE
FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL
DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR
SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER
CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY,
OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
See https://developer.android.com/guide/topics/ui/settings.html
-->
<PreferenceScreen xmlns:tools="http://schemas.android.com/tools"
xmlns:android="http://schemas.android.com/apk/res/android"
xmlns:app="http://schemas.android.com/apk/res-auto"
tools:context=".FtcRobotControllerActivity">
<PreferenceCategory
android:title="@string/prefcat_configure_robot">
<EditTextPreference
android:title="@string/prefedit_device_name_rc"
android:summary="@string/prefedit_device_name_summary_rc"
android:key="@string/pref_device_name"
android:defaultValue=""
/>
<org.firstinspires.ftc.robotcore.internal.ui.ColorListPreference
android:title="@string/prefedit_app_theme_rc"
android:summary="@string/prefedit_app_theme_summary_rc"
android:key="@string/pref_app_theme"
android:entries="@array/app_theme_names"
android:entryValues="@array/app_theme_tokens"
app:colors="@array/app_theme_colors"
/>
<SwitchPreference
android:title="@string/prefedit_sound_on_off"
android:summary="@string/prefedit_sound_on_off_summary_rc"
android:key="@string/pref_sound_on_off"
android:defaultValue="true"
/>
<PreferenceScreen
android:title="@string/titleAdvancedRCSettings"
android:summary="@string/summaryAdvancedRCSettings"
android:key="@string/pref_launch_advanced_rc_settings">
<intent
android:targetPackage="@string/packageName"
android:targetClass="com.qualcomm.ftccommon.FtcAdvancedRCSettingsActivity"
/>
</PreferenceScreen>
<PreferenceScreen
android:title="@string/prefedit_view_logs"
android:summary="@string/prefedit_view_logs_summary"
android:key="@string/pref_launch_viewlogs">
<intent
android:targetPackage="@string/packageName"
android:targetClass="com.qualcomm.ftccommon.ViewLogsActivity" />
</PreferenceScreen>
</PreferenceCategory>
</PreferenceScreen>
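Every value on this screen is stored under a string-resource key (for example pref_sound_on_off, default true), and the two nested screens launch SDK activities by explicit intent. Robot-side code can read the stored values back through the default SharedPreferences; a minimal sketch with a hypothetical helper class, noting that the SDK also ships its own preference wrappers, which are not shown in this diff:

    import android.content.Context;
    import android.content.SharedPreferences;
    import android.preference.PreferenceManager;

    public final class SoundPrefExample
    {
        // Reads the "Sound On/Off" switch defined above, defaulting to true as in the XML.
        public static boolean isSoundEnabled(Context context)
        {
            SharedPreferences prefs = PreferenceManager.getDefaultSharedPreferences(context);
            String key = context.getString(R.string.pref_sound_on_off);
            return prefs.getBoolean(key, true);
        }
    }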

View File

@ -0,0 +1,49 @@
<!--
Copyright (c) 2014, 2015 Qualcomm Technologies Inc
All rights reserved.
Redistribution and use in source and binary forms, with or without modification,
are permitted (subject to the limitations in the disclaimer below) provided that
the following conditions are met:
Redistributions of source code must retain the above copyright notice, this list
of conditions and the following disclaimer.
Redistributions in binary form must reproduce the above copyright notice, this
list of conditions and the following disclaimer in the documentation and/or
other materials provided with the distribution.
Neither the name of Qualcomm Technologies Inc nor the names of its contributors
may be used to endorse or promote products derived from this software without
specific prior written permission.
NO EXPRESS OR IMPLIED LICENSES TO ANY PARTY'S PATENT RIGHTS ARE GRANTED BY THIS
LICENSE. THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
"AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO,
THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE
ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR CONTRIBUTORS BE LIABLE
FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL
DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR
SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER
CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY,
OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
-->
<!--
https://developer.android.com/guide/topics/connectivity/usb/host
-->
<!-- see also RobotUsbDevice.getUsbIdentifiers() -->
<resources>
<usb-device vendor-id="1027" product-id="24597" /> <!-- FT232 Lynx: 0x0403/0x6015 -->
<!-- cameras -->
<!-- We don't currently auto-connect to UVC cameras, instead relying
on the app itself to poll. But we could change that if we wished -->
<!-- Update: turns out we need that if we are to get onNewIntent() notifications
in our activity when cameras attach. See FtcRobotControllerActivity. -->
<usb-device class="14" subclass="2"/>
</resources>
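The filter above covers the FT232 bridge in the Lynx hardware (vendor 1027, product 24597, i.e. 0x0403/0x6015) and, per the comment, UVC cameras (class 14, subclass 2) so that the activity receives onNewIntent() when a camera is plugged in. The real handling is in FtcRobotControllerActivity, which is not shown here; the sketch below is a hypothetical illustration of how such an attach intent arrives through the standard UsbManager extras:

    import android.app.Activity;
    import android.content.Intent;
    import android.hardware.usb.UsbDevice;
    import android.hardware.usb.UsbManager;

    public class UsbAttachExample extends Activity
    {
        @Override
        protected void onNewIntent(Intent intent)
        {
            super.onNewIntent(intent);
            if (UsbManager.ACTION_USB_DEVICE_ATTACHED.equals(intent.getAction()))
            {
                UsbDevice device = intent.getParcelableExtra(UsbManager.EXTRA_DEVICE);
                if (device != null)
                {
                    // Hand the newly attached camera or Lynx module off to the robot code here.
                }
            }
        }
    }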

View File

@ -1,6 +1,6 @@
MIT License
Copyright (c) 2024 Titan Robotics Club
Copyright (c) 2021 Titan Robotics Club
Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal

1553
README.md

File diff suppressed because it is too large

30
TeamCode/build.gradle Normal file
View File

@ -0,0 +1,30 @@
//
// build.gradle in TeamCode
//
// Most of the definitions for building your module reside in a common, shared
// file 'build.common.gradle'. Being factored in this way makes it easier to
// integrate updates to the FTC SDK into your code. If you really need to customize
// the build definitions, you can place those customizations in this file, but
// please think carefully as to whether such customizations are really necessary
// before doing so.
// Custom definitions may go here
// Include common definitions from above.
apply from: '../build.common.gradle'
apply from: '../build.dependencies.gradle'
android {
namespace = 'teamcode'
packagingOptions {
jniLibs.useLegacyPackaging true
}
}
dependencies {
implementation project(':FtcRobotController')
implementation 'org.apache.commons:commons-math3:3.6.1'
annotationProcessor files('lib/OpModeAnnotationProcessor.jar')
}
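The annotationProcessor line above is what lets classes in this module show up on the Driver Station simply by carrying the @TeleOp or @Autonomous annotation. A minimal sketch of such a class using the plain SDK LinearOpMode base; the class name is hypothetical, and the opmodes added later in this commit use the TRC framework classes instead:

    package teamcode;

    import com.qualcomm.robotcore.eventloop.opmode.LinearOpMode;
    import com.qualcomm.robotcore.eventloop.opmode.TeleOp;

    // Registered on the Driver Station's TeleOp list by the annotation processor above.
    @TeleOp(name = "MinimalTeleOp", group = "Examples")
    public class MinimalTeleOp extends LinearOpMode
    {
        @Override
        public void runOpMode()
        {
            telemetry.addData("Status", "Initialized");
            telemetry.update();
            waitForStart();                 // block until Play is pressed
            while (opModeIsActive())
            {
                telemetry.addData("Runtime", "%.1f sec", getRuntime());
                telemetry.update();
            }
        }
    }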

Binary file not shown.

View File

@ -0,0 +1,11 @@
<?xml version="1.0" encoding="utf-8"?>
<!-- Note: the actual manifest file used in your APK merges this file with contributions
from the modules on which your app depends (such as FtcRobotController, etc).
So it won't ultimately be as empty as it might here appear to be :-) -->
<!-- The package name here determines the package for your R class and your BuildConfig class -->
<manifest
xmlns:android="http://schemas.android.com/apk/res/android">
<application/>
</manifest>

View File

@ -0,0 +1,376 @@
/*
* Copyright (c) 2022 Titan Robotics Club (http://www.titanrobotics.com)
*
* Permission is hereby granted, free of charge, to any person obtaining a copy
* of this software and associated documentation files (the "Software"), to deal
* in the Software without restriction, including without limitation the rights
* to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
* copies of the Software, and to permit persons to whom the Software is
* furnished to do so, subject to the following conditions:
*
* The above copyright notice and this permission notice shall be included in all
* copies or substantial portions of the Software.
*
* THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
* IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
* FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
* AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
* LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
* OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
* SOFTWARE.
*/
package teamcode;
import com.qualcomm.robotcore.eventloop.opmode.Autonomous;
import java.util.Locale;
import TrcCommonLib.command.CmdPidDrive;
import TrcCommonLib.command.CmdTimedDrive;
import TrcCommonLib.trclib.TrcDbgTrace;
import TrcCommonLib.trclib.TrcPose2D;
import TrcCommonLib.trclib.TrcRobot;
import TrcCommonLib.trclib.TrcTimer;
import TrcFtcLib.ftclib.FtcChoiceMenu;
import TrcFtcLib.ftclib.FtcMatchInfo;
import TrcFtcLib.ftclib.FtcMenu;
import TrcFtcLib.ftclib.FtcOpMode;
import TrcFtcLib.ftclib.FtcValueMenu;
/**
* This class contains the Autonomous Mode program.
*/
@Autonomous(name="FtcAutonomous", group="Ftcxxxx")
public class FtcAuto extends FtcOpMode
{
private static final String moduleName = FtcAuto.class.getSimpleName();
public enum Alliance
{
RED_ALLIANCE,
BLUE_ALLIANCE
} //enum Alliance
public enum StartPos
{
LEFT,
RIGHT
} //enum StartPos
public enum AutoStrategy
{
PID_DRIVE,
TIMED_DRIVE,
DO_NOTHING
} //enum AutoStrategy
/**
* This class stores the autonomous menu choices.
*/
public static class AutoChoices
{
public double delay = 0.0;
public Alliance alliance = Alliance.RED_ALLIANCE;
public StartPos startPos = StartPos.LEFT;
public AutoStrategy strategy = AutoStrategy.DO_NOTHING;
public double xTarget = 0.0;
public double yTarget = 0.0;
public double turnTarget = 0.0;
public double driveTime = 0.0;
public double drivePower = 0.0;
@Override
public String toString()
{
return String.format(
Locale.US,
"delay=%.0f " +
"alliance=\"%s\" " +
"startPos=\"%s\" " +
"strategy=\"%s\" " +
"xTarget=%.1f " +
"yTarget=%.1f " +
"turnTarget=%.0f " +
"driveTime=%.0f " +
"drivePower=%.1f",
delay, alliance, startPos, strategy, xTarget, yTarget, turnTarget, driveTime, drivePower);
} //toString
} //class AutoChoices
public static final AutoChoices autoChoices = new AutoChoices();
private Robot robot;
private TrcRobot.RobotCommand autoCommand;
//
// Implements FtcOpMode abstract method.
//
/**
* This method is called to initialize the robot. In FTC, this is called when the "Init" button on the Driver
* Station is pressed.
*/
@Override
public void robotInit()
{
//
// Create and initialize robot object.
//
robot = new Robot(TrcRobot.getRunMode());
//
// Open trace log.
//
if (RobotParams.Preferences.useTraceLog)
{
Robot.matchInfo = FtcMatchInfo.getMatchInfo();
String filePrefix = String.format(
Locale.US, "%s%02d_Auto", Robot.matchInfo.matchType, Robot.matchInfo.matchNumber);
TrcDbgTrace.openTraceLog(RobotParams.LOG_FOLDER_PATH, filePrefix);
}
//
// Create and run choice menus.
//
doAutoChoicesMenus();
//
// Create autonomous command according to chosen strategy.
//
switch (autoChoices.strategy)
{
case PID_DRIVE:
if (RobotParams.Preferences.robotType != RobotParams.RobotType.NoRobot)
{
autoCommand = new CmdPidDrive(
robot.robotDrive.driveBase, robot.robotDrive.pidDrive, autoChoices.delay,
autoChoices.drivePower, null,
new TrcPose2D(autoChoices.xTarget*12.0, autoChoices.yTarget*12.0, autoChoices.turnTarget));
}
break;
case TIMED_DRIVE:
if (RobotParams.Preferences.robotType != RobotParams.RobotType.NoRobot)
{
autoCommand = new CmdTimedDrive(
robot.robotDrive.driveBase, autoChoices.delay, autoChoices.driveTime,
0.0, autoChoices.drivePower, 0.0);
}
break;
case DO_NOTHING:
default:
autoCommand = null;
break;
}
if (robot.vision != null)
{
// Enabling vision early so we can detect target before match starts if necessary.
// Only enable the necessary vision for that purpose.
// if (robot.vision.aprilTagVision != null)
// {
// robot.globalTracer.traceInfo(moduleName, "Enabling AprilTagVision.");
// robot.vision.setAprilTagVisionEnabled(true);
// }
//
// if (robot.vision.redBlobVision != null)
// {
// robot.globalTracer.traceInfo(moduleName, "Enabling RedBlobVision.");
// robot.vision.setRedBlobVisionEnabled(true);
// }
//
// if (robot.vision.blueBlobVision != null)
// {
// robot.globalTracer.traceInfo(moduleName, "Enabling BlueBlobVision.");
// robot.vision.setBlueBlobVisionEnabled(true);
// }
//
// if (robot.vision.tensorFlowVision != null)
// {
// robot.globalTracer.traceInfo(moduleName, "Enabling TensorFlowVision.");
// robot.vision.setTensorFlowVisionEnabled(true);
// }
}
robot.zeroCalibrate();
} //robotInit
//
// Overrides TrcRobot.RobotMode methods.
//
/**
* This method is called periodically after robotInit() is called but before competition starts. For example,
* we can put vision code here to detect target before autonomous starts.
*/
@Override
public void initPeriodic()
{
} //initPeriodic
/**
* This method is called when the competition mode is about to start. In FTC, this is called when the "Play"
* button on the Driver Station is pressed. Typically, you put code that will prepare the robot for start of
* competition here such as resetting the encoders/sensors and enabling some sensors to start sampling.
*
* @param prevMode specifies the previous RunMode it is coming from (always null for FTC).
* @param nextMode specifies the next RunMode it is going into.
*/
@Override
public void startMode(TrcRobot.RunMode prevMode, TrcRobot.RunMode nextMode)
{
if (TrcDbgTrace.isTraceLogOpened())
{
TrcDbgTrace.setTraceLogEnabled(true);
}
robot.globalTracer.traceInfo(
moduleName, "***** Starting autonomous: " + TrcTimer.getCurrentTimeString() + " *****");
if (Robot.matchInfo != null)
{
robot.globalTracer.logInfo(moduleName, "MatchInfo", Robot.matchInfo.toString());
}
robot.globalTracer.logInfo(moduleName, "AutoChoices", autoChoices.toString());
robot.dashboard.clearDisplay();
//
// Tell robot object opmode is about to start so it can do the necessary start initialization for the mode.
//
robot.startMode(nextMode);
if (robot.vision != null)
{
// We are done with detecting object with TensorFlow, shut it down.
if (robot.vision.tensorFlowVision != null)
{
robot.globalTracer.traceInfo(moduleName, "Disabling TensorFlowVision.");
robot.vision.setTensorFlowVisionEnabled(false);
}
}
if (robot.battery != null)
{
robot.battery.setEnabled(true);
}
} //startMode
/**
* This method is called when competition mode is about to end. Typically, you put code that will do clean
* up here such as disabling the sampling of some sensors.
*
* @param prevMode specifies the previous RunMode it is coming from.
* @param nextMode specifies the next RunMode it is going into (always null for FTC).
*/
@Override
public void stopMode(TrcRobot.RunMode prevMode, TrcRobot.RunMode nextMode)
{
//
// Opmode is about to stop, cancel autonomous command in progress if any.
//
if (autoCommand != null)
{
autoCommand.cancel();
}
//
// Tell robot object opmode is about to stop so it can do the necessary cleanup for the mode.
//
robot.stopMode(prevMode);
if (robot.battery != null)
{
robot.battery.setEnabled(false);
}
printPerformanceMetrics();
robot.globalTracer.traceInfo(
moduleName, "***** Stopping autonomous: " + TrcTimer.getCurrentTimeString() + " *****");
if (TrcDbgTrace.isTraceLogOpened())
{
TrcDbgTrace.closeTraceLog();
}
} //stopMode
/**
* This method is called periodically on the main robot thread. Typically, you put TeleOp control code here that
* doesn't require frequent updates. For example, TeleOp joystick code or status display code can be put here since
* human responses are considered slow.
*
* @param elapsedTime specifies the elapsed time since the mode started.
* @param slowPeriodicLoop specifies true if it is running the slow periodic loop on the main robot thread,
* false otherwise.
*/
@Override
public void periodic(double elapsedTime, boolean slowPeriodicLoop)
{
if (autoCommand != null)
{
//
// Run the autonomous command.
//
autoCommand.cmdPeriodic(elapsedTime);
}
} //periodic
/**
* This method creates the autonomous menus, displays them and stores the choices.
*/
private void doAutoChoicesMenus()
{
//
// Construct menus.
//
FtcValueMenu delayMenu = new FtcValueMenu("Delay time:", null, 0.0, 30.0, 1.0, 0.0, " %.0f sec");
FtcChoiceMenu<Alliance> allianceMenu = new FtcChoiceMenu<>("Alliance:", delayMenu);
FtcChoiceMenu<StartPos> startPosMenu = new FtcChoiceMenu<>("Start Position:", allianceMenu);
FtcChoiceMenu<AutoStrategy> strategyMenu = new FtcChoiceMenu<>("Auto Strategies:", startPosMenu);
FtcValueMenu xTargetMenu = new FtcValueMenu(
"xTarget:", strategyMenu, -12.0, 12.0, 0.5, 4.0, " %.1f ft");
FtcValueMenu yTargetMenu = new FtcValueMenu(
"yTarget:", xTargetMenu, -12.0, 12.0, 0.5, 4.0, " %.1f ft");
FtcValueMenu turnTargetMenu = new FtcValueMenu(
"turnTarget:", yTargetMenu, -180.0, 180.0, 5.0, 90.0, " %.0f deg");
FtcValueMenu driveTimeMenu = new FtcValueMenu(
"Drive time:", strategyMenu, 0.0, 30.0, 1.0, 5.0, " %.0f sec");
FtcValueMenu drivePowerMenu = new FtcValueMenu(
"Drive power:", strategyMenu, -1.0, 1.0, 0.1, 0.5, " %.1f");
// Link Value Menus to their children.
delayMenu.setChildMenu(allianceMenu);
xTargetMenu.setChildMenu(yTargetMenu);
yTargetMenu.setChildMenu(turnTargetMenu);
turnTargetMenu.setChildMenu(drivePowerMenu);
driveTimeMenu.setChildMenu(drivePowerMenu);
//
// Populate choice menus.
//
allianceMenu.addChoice("Red", Alliance.RED_ALLIANCE, true, startPosMenu);
allianceMenu.addChoice("Blue", Alliance.BLUE_ALLIANCE, false, startPosMenu);
startPosMenu.addChoice("Start Position Left", StartPos.LEFT, true, strategyMenu);
startPosMenu.addChoice("Start Position Right", StartPos.RIGHT, false, strategyMenu);
strategyMenu.addChoice("PID Drive", AutoStrategy.PID_DRIVE, false, xTargetMenu);
strategyMenu.addChoice("Timed Drive", AutoStrategy.TIMED_DRIVE, false, driveTimeMenu);
strategyMenu.addChoice("Do nothing", AutoStrategy.DO_NOTHING, true);
//
// Traverse menus.
//
FtcMenu.walkMenuTree(delayMenu);
//
// Fetch choices.
//
autoChoices.delay = delayMenu.getCurrentValue();
autoChoices.alliance = allianceMenu.getCurrentChoiceObject();
autoChoices.startPos = startPosMenu.getCurrentChoiceObject();
autoChoices.strategy = strategyMenu.getCurrentChoiceObject();
autoChoices.xTarget = xTargetMenu.getCurrentValue();
autoChoices.yTarget = yTargetMenu.getCurrentValue();
autoChoices.turnTarget = turnTargetMenu.getCurrentValue();
autoChoices.driveTime = driveTimeMenu.getCurrentValue();
autoChoices.drivePower = drivePowerMenu.getCurrentValue();
//
// Show choices.
//
robot.dashboard.displayPrintf(1, "Auto Choices: %s", autoChoices);
} //doAutoChoicesMenus
} //class FtcAuto

View File

@ -0,0 +1,413 @@
/*
* Copyright (c) 2022 Titan Robotics Club (http://www.titanrobotics.com)
*
* Permission is hereby granted, free of charge, to any person obtaining a copy
* of this software and associated documentation files (the "Software"), to deal
* in the Software without restriction, including without limitation the rights
* to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
* copies of the Software, and to permit persons to whom the Software is
* furnished to do so, subject to the following conditions:
*
* The above copyright notice and this permission notice shall be included in all
* copies or substantial portions of the Software.
*
* THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
* IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
* FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
* AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
* LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
* OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
* SOFTWARE.
*/
package teamcode;
import com.qualcomm.robotcore.eventloop.opmode.TeleOp;
import java.util.Locale;
import TrcCommonLib.trclib.TrcDbgTrace;
import TrcCommonLib.trclib.TrcDriveBase;
import TrcCommonLib.trclib.TrcGameController;
import TrcCommonLib.trclib.TrcPose2D;
import TrcCommonLib.trclib.TrcRobot;
import TrcCommonLib.trclib.TrcTimer;
import TrcFtcLib.ftclib.FtcGamepad;
import TrcFtcLib.ftclib.FtcOpMode;
import teamcode.drivebases.SwerveDrive;
/**
* This class contains the TeleOp Mode program.
*/
@TeleOp(name="FtcTeleOp", group="Ftcxxxx")
public class FtcTeleOp extends FtcOpMode
{
private static final String moduleName = FtcTeleOp.class.getSimpleName();
protected Robot robot;
protected FtcGamepad driverGamepad;
protected FtcGamepad operatorGamepad;
private double drivePowerScale = RobotParams.DRIVE_POWER_SCALE_NORMAL;
private double turnPowerScale = RobotParams.TURN_POWER_SCALE_NORMAL;
private boolean manualOverride = false;
private boolean relocalizing = false;
private TrcPose2D robotFieldPose = null;
//
// Implements FtcOpMode abstract method.
//
/**
* This method is called to initialize the robot. In FTC, this is called when the "Init" button on the Driver
* Station is pressed.
*/
@Override
public void robotInit()
{
//
// Create and initialize robot object.
//
robot = new Robot(TrcRobot.getRunMode());
//
// Open trace log.
//
if (RobotParams.Preferences.useTraceLog)
{
String filePrefix = Robot.matchInfo != null?
String.format(Locale.US, "%s%02d_TeleOp", Robot.matchInfo.matchType, Robot.matchInfo.matchNumber):
"Unknown_TeleOp";
TrcDbgTrace.openTraceLog(RobotParams.LOG_FOLDER_PATH, filePrefix);
}
//
// Create and initialize Gamepads.
//
driverGamepad = new FtcGamepad("DriverGamepad", gamepad1, this::driverButtonEvent);
operatorGamepad = new FtcGamepad("OperatorGamepad", gamepad2, this::operatorButtonEvent);
driverGamepad.setYInverted(true);
operatorGamepad.setYInverted(true);
setDriveOrientation(TrcDriveBase.DriveOrientation.ROBOT);
} //robotInit
//
// Overrides TrcRobot.RobotMode methods.
//
/**
* This method is called when the competition mode is about to start. In FTC, this is called when the "Play"
* button on the Driver Station is pressed. Typically, you put code that will prepare the robot for start of
* competition here such as resetting the encoders/sensors and enabling some sensors to start sampling.
*
* @param prevMode specifies the previous RunMode it is coming from (always null for FTC).
* @param nextMode specifies the next RunMode it is going into.
*/
@Override
public void startMode(TrcRobot.RunMode prevMode, TrcRobot.RunMode nextMode)
{
if (TrcDbgTrace.isTraceLogOpened())
{
TrcDbgTrace.setTraceLogEnabled(true);
}
robot.globalTracer.traceInfo(
moduleName, "***** Starting TeleOp: " + TrcTimer.getCurrentTimeString() + " *****");
robot.dashboard.clearDisplay();
//
// Tell robot object opmode is about to start so it can do the necessary start initialization for the mode.
//
robot.startMode(nextMode);
//
// Enable AprilTag vision for re-localization.
//
if (robot.vision != null && robot.vision.aprilTagVision != null)
{
robot.globalTracer.traceInfo(moduleName, "Enabling AprilTagVision.");
robot.vision.setAprilTagVisionEnabled(true);
}
} //startMode
/**
* This method is called when competition mode is about to end. Typically, you put code that will do clean
* up here such as disabling the sampling of some sensors.
*
* @param prevMode specifies the previous RunMode it is coming from.
* @param nextMode specifies the next RunMode it is going into (always null for FTC).
*/
@Override
public void stopMode(TrcRobot.RunMode prevMode, TrcRobot.RunMode nextMode)
{
//
// Tell robot object opmode is about to stop so it can do the necessary cleanup for the mode.
//
robot.stopMode(prevMode);
printPerformanceMetrics();
robot.globalTracer.traceInfo(
moduleName, "***** Stopping TeleOp: " + TrcTimer.getCurrentTimeString() + " *****");
if (TrcDbgTrace.isTraceLogOpened())
{
TrcDbgTrace.closeTraceLog();
}
} //stopMode
/**
* This method is called periodically on the main robot thread. Typically, you put TeleOp control code here that
* doesn't require frequent updates. For example, TeleOp joystick code or status display code can be put here since
* human responses are considered slow.
*
* @param elapsedTime specifies the elapsed time since the mode started.
* @param slowPeriodicLoop specifies true if it is running the slow periodic loop on the main robot thread,
* false otherwise.
*/
@Override
public void periodic(double elapsedTime, boolean slowPeriodicLoop)
{
if (slowPeriodicLoop)
{
//
// DriveBase subsystem.
//
if (robot.robotDrive != null)
{
double[] inputs = driverGamepad.getDriveInputs(
RobotParams.ROBOT_DRIVE_MODE, true, drivePowerScale, turnPowerScale);
if (robot.robotDrive.driveBase.supportsHolonomicDrive())
{
robot.robotDrive.driveBase.holonomicDrive(
null, inputs[0], inputs[1], inputs[2], robot.robotDrive.driveBase.getDriveGyroAngle());
}
else
{
robot.robotDrive.driveBase.arcadeDrive(inputs[1], inputs[2]);
}
robot.dashboard.displayPrintf(
1, "RobotDrive: Power=(x=%.2f,y=%.2f,rot=%.2f),Mode:%s",
inputs[0], inputs[1], inputs[2], robot.robotDrive.driveBase.getDriveOrientation());
// We are trying to re-localize the robot and vision hasn't seen AprilTag yet.
if (relocalizing && robotFieldPose == null)
{
robotFieldPose = robot.vision.getRobotFieldPose();
}
}
//
// Other subsystems.
//
if (RobotParams.Preferences.useSubsystems)
{
}
// Display subsystem status.
if (RobotParams.Preferences.doStatusUpdate)
{
robot.updateStatus();
}
}
} //periodic
/**
* This method sets the drive orientation mode and updates the LED to indicate so.
*
* @param orientation specifies the drive orientation (FIELD, ROBOT, INVERTED).
*/
public void setDriveOrientation(TrcDriveBase.DriveOrientation orientation)
{
if (robot.robotDrive != null)
{
robot.globalTracer.traceInfo(moduleName, "driveOrientation=" + orientation);
robot.robotDrive.driveBase.setDriveOrientation(
orientation, orientation == TrcDriveBase.DriveOrientation.FIELD);
if (robot.blinkin != null)
{
robot.blinkin.setDriveOrientation(orientation);
}
}
} //setDriveOrientation
//
// Implements TrcGameController.ButtonHandler interface.
//
/**
* This method is called when driver gamepad button event is detected.
*
* @param gamepad specifies the game controller object that generated the event.
* @param button specifies the button ID that generates the event.
* @param pressed specifies true if the button is pressed, false otherwise.
*/
public void driverButtonEvent(TrcGameController gamepad, int button, boolean pressed)
{
robot.dashboard.displayPrintf(8, "%s: %04x->%s", gamepad, button, pressed? "Pressed": "Released");
switch (button)
{
case FtcGamepad.GAMEPAD_A:
if (pressed)
{
robot.globalTracer.traceInfo(moduleName, ">>>>> CancelAll is pressed.");
if (robot.robotDrive != null)
{
// Cancel all auto-assist driving.
robot.robotDrive.cancel();
}
}
break;
case FtcGamepad.GAMEPAD_B:
break;
case FtcGamepad.GAMEPAD_X:
break;
case FtcGamepad.GAMEPAD_Y:
if (pressed && robot.robotDrive != null)
{
if (robot.robotDrive.driveBase.isGyroAssistEnabled())
{
// Disable GyroAssist drive.
robot.globalTracer.traceInfo(moduleName, ">>>>> Disabling GyroAssist.");
robot.robotDrive.driveBase.setGyroAssistEnabled(null);
}
else
{
// Enable GyroAssist drive.
robot.globalTracer.traceInfo(moduleName, ">>>>> Enabling GyroAssist.");
robot.robotDrive.driveBase.setGyroAssistEnabled(robot.robotDrive.pidDrive.getTurnPidCtrl());
}
}
break;
case FtcGamepad.GAMEPAD_LBUMPER:
// Toggle between field or robot oriented driving, only applicable for holonomic drive base.
if (pressed && robot.robotDrive != null && robot.robotDrive.driveBase.supportsHolonomicDrive())
{
if (robot.robotDrive.driveBase.getDriveOrientation() != TrcDriveBase.DriveOrientation.FIELD)
{
robot.globalTracer.traceInfo(moduleName, ">>>>> Enabling FIELD mode.");
setDriveOrientation(TrcDriveBase.DriveOrientation.FIELD);
}
else
{
robot.globalTracer.traceInfo(moduleName, ">>>>> Enabling ROBOT mode.");
setDriveOrientation(TrcDriveBase.DriveOrientation.ROBOT);
}
}
break;
case FtcGamepad.GAMEPAD_RBUMPER:
// Press and hold for slow drive.
if (pressed)
{
robot.globalTracer.traceInfo(moduleName, ">>>>> DrivePower slow.");
drivePowerScale = RobotParams.DRIVE_POWER_SCALE_SLOW;
turnPowerScale = RobotParams.TURN_POWER_SCALE_SLOW;
}
else
{
robot.globalTracer.traceInfo(moduleName, ">>>>> DrivePower normal.");
drivePowerScale = RobotParams.DRIVE_POWER_SCALE_NORMAL;
turnPowerScale = RobotParams.TURN_POWER_SCALE_NORMAL;
}
break;
case FtcGamepad.GAMEPAD_DPAD_UP:
break;
case FtcGamepad.GAMEPAD_DPAD_DOWN:
break;
case FtcGamepad.GAMEPAD_DPAD_LEFT:
break;
case FtcGamepad.GAMEPAD_DPAD_RIGHT:
break;
case FtcGamepad.GAMEPAD_START:
if (robot.vision != null && robot.vision.aprilTagVision != null && robot.robotDrive != null)
{
// On press of the button, we will start looking for AprilTag for re-localization.
// On release of the button, we will set the robot's field location if we found the AprilTag.
relocalizing = pressed;
if (!pressed)
{
if (robotFieldPose != null)
{
// Vision found an AprilTag, set the new robot field location.
robot.globalTracer.traceInfo(
moduleName, ">>>>> Finish re-localizing: pose=" + robotFieldPose);
robot.robotDrive.driveBase.setFieldPosition(robotFieldPose, false);
robotFieldPose = null;
}
}
else
{
robot.globalTracer.traceInfo(moduleName, ">>>>> Start re-localizing ...");
}
}
break;
case FtcGamepad.GAMEPAD_BACK:
if (pressed && robot.robotDrive != null && robot.robotDrive instanceof SwerveDrive)
{
// Drive base is a Swerve Drive, align all steering wheels forward.
robot.globalTracer.traceInfo(moduleName, ">>>>> Set SteerAngle to zero.");
((SwerveDrive) robot.robotDrive).setSteerAngle(0.0, false, false);
}
break;
}
} //driverButtonEvent
/**
* This method is called when operator gamepad button event is detected.
*
* @param gamepad specifies the game controller object that generated the event.
* @param button specifies the button ID that generates the event.
* @param pressed specifies true if the button is pressed, false otherwise.
*/
public void operatorButtonEvent(TrcGameController gamepad, int button, boolean pressed)
{
robot.dashboard.displayPrintf(8, "%s: %04x->%s", gamepad, button, pressed? "Pressed": "Released");
switch (button)
{
case FtcGamepad.GAMEPAD_A:
break;
case FtcGamepad.GAMEPAD_B:
break;
case FtcGamepad.GAMEPAD_X:
break;
case FtcGamepad.GAMEPAD_Y:
break;
case FtcGamepad.GAMEPAD_LBUMPER:
break;
case FtcGamepad.GAMEPAD_RBUMPER:
robot.globalTracer.traceInfo(moduleName, ">>>>> ManualOverride=" + pressed);
manualOverride = pressed;
break;
case FtcGamepad.GAMEPAD_DPAD_UP:
break;
case FtcGamepad.GAMEPAD_DPAD_DOWN:
break;
case FtcGamepad.GAMEPAD_DPAD_LEFT:
break;
case FtcGamepad.GAMEPAD_DPAD_RIGHT:
break;
case FtcGamepad.GAMEPAD_BACK:
if (pressed)
{
// Zero calibrate all subsystems (arm, elevator and turret).
robot.globalTracer.traceInfo(moduleName, ">>>>> ZeroCalibrate pressed.");
robot.zeroCalibrate(moduleName);
}
break;
}
} //operatorButtonEvent
} //class FtcTeleOp

File diff suppressed because it is too large

View File

@ -0,0 +1,21 @@
MIT License
Copyright (c) 2022 Titan Robotics Club
Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:
The above copyright notice and this permission notice shall be included in all
copies or substantial portions of the Software.
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
SOFTWARE.

View File

@ -0,0 +1,323 @@
/*
* Copyright (c) 2022 Titan Robotics Club (http://www.titanrobotics.com)
*
* Permission is hereby granted, free of charge, to any person obtaining a copy
* of this software and associated documentation files (the "Software"), to deal
* in the Software without restriction, including without limitation the rights
* to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
* copies of the Software, and to permit persons to whom the Software is
* furnished to do so, subject to the following conditions:
*
* The above copyright notice and this permission notice shall be included in all
* copies or substantial portions of the Software.
*
* THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
* IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
* FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
* AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
* LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
* OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
* SOFTWARE.
*/
package teamcode;
import TrcCommonLib.trclib.TrcDbgTrace;
import TrcCommonLib.trclib.TrcDigitalInput;
import TrcCommonLib.trclib.TrcMotor;
import TrcCommonLib.trclib.TrcPose2D;
import TrcCommonLib.trclib.TrcRobot;
import TrcCommonLib.trclib.TrcServo;
import TrcCommonLib.trclib.TrcTimer;
import TrcFtcLib.ftclib.FtcDashboard;
import TrcFtcLib.ftclib.FtcMatchInfo;
import TrcFtcLib.ftclib.FtcOpMode;
import TrcFtcLib.ftclib.FtcRobotBattery;
import teamcode.drivebases.MecanumDrive;
import teamcode.drivebases.RobotDrive;
import teamcode.drivebases.SwerveDrive;
import teamcode.subsystems.BlinkinLEDs;
import teamcode.vision.Vision;
/**
* This class creates the robot object that consists of sensors, indicators, drive base and all the subsystems.
*/
public class Robot
{
private static final String moduleName = Robot.class.getSimpleName();
private static final double STATUS_UPDATE_INTERVAL = 0.1; // 100 msec
//
// Global objects.
//
public final FtcOpMode opMode;
public final TrcDbgTrace globalTracer;
public final FtcDashboard dashboard;
public static FtcMatchInfo matchInfo = null;
private static TrcPose2D endOfAutoRobotPose = null;
private static double nextStatusUpdateTime = 0.0;
//
// Vision subsystems.
//
public Vision vision;
//
// Sensors and indicators.
//
public BlinkinLEDs blinkin;
public FtcRobotBattery battery;
//
// Subsystems.
//
public RobotDrive robotDrive;
/**
* Constructor: Create an instance of the object.
*
* @param runMode specifies robot running mode (Auto, TeleOp, Test), can be used to create and initialize mode
* specific sensors and subsystems if necessary.
*/
public Robot(TrcRobot.RunMode runMode)
{
//
// Initialize global objects.
//
opMode = FtcOpMode.getInstance();
globalTracer = TrcDbgTrace.getGlobalTracer();
dashboard = FtcDashboard.getInstance();
nextStatusUpdateTime = TrcTimer.getCurrentTime();
speak("Init starting");
//
// Initialize vision subsystems.
//
if (RobotParams.Preferences.tuneColorBlobVision ||
RobotParams.Preferences.useAprilTagVision ||
RobotParams.Preferences.useColorBlobVision ||
RobotParams.Preferences.useTensorFlowVision)
{
vision = new Vision(this);
}
//
// If robotType is NoRobot, the robot controller is disconnected from the robot for testing vision.
// In this case, we should not instantiate any robot hardware.
//
if (RobotParams.Preferences.robotType != RobotParams.RobotType.NoRobot)
{
//
// Create and initialize sensors and indicators.
//
if (RobotParams.Preferences.useBlinkin)
{
blinkin = new BlinkinLEDs(RobotParams.HWNAME_BLINKIN);
}
if (RobotParams.Preferences.useBatteryMonitor)
{
battery = new FtcRobotBattery();
}
//
// Create and initialize RobotDrive.
//
robotDrive =
RobotParams.Preferences.robotType == RobotParams.RobotType.SwerveRobot?
new SwerveDrive(): new MecanumDrive();
//
// Create and initialize other subsystems.
//
if (RobotParams.Preferences.useSubsystems)
{
}
}
speak("Init complete");
} //Robot
/**
* This method returns the instance name.
*
* @return instance name.
*/
@Override
public String toString()
{
return RobotParams.ROBOT_NAME;
} //toString
/**
* This method is called when the robot mode is about to start. It contains code to initialize robot hardware
* necessary for running the robot mode.
*
* @param runMode specifies the robot mode it is about to start, can be used to initialize mode specific hardware.
*/
public void startMode(TrcRobot.RunMode runMode)
{
if (robotDrive != null)
{
//
// Since the IMU gyro is giving us cardinal heading, we need to enable its cardinal to cartesian converter.
//
if (robotDrive.gyro != null)
{
robotDrive.gyro.setEnabled(true);
// The following are performance counters, could be disabled for competition if you want.
// But it might give you some insight if somehow autonomous wasn't performing as expected.
robotDrive.gyro.setElapsedTimerEnabled(true);
}
//
// Enable odometry for all opmodes. We may need odometry in TeleOp for auto-assist drive.
//
robotDrive.driveBase.setOdometryEnabled(true);
if (runMode == TrcRobot.RunMode.TELEOP_MODE)
{
if (endOfAutoRobotPose != null)
{
// We had a previous autonomous run that saved the robot position at the end, use it.
robotDrive.driveBase.setFieldPosition(endOfAutoRobotPose);
globalTracer.traceInfo(moduleName, "Restore saved RobotPose=" + endOfAutoRobotPose);
}
}
// Consume it so it's no longer valid for next run.
endOfAutoRobotPose = null;
}
TrcDigitalInput.setElapsedTimerEnabled(true);
TrcMotor.setElapsedTimerEnabled(true);
TrcServo.setElapsedTimerEnabled(true);
} //startMode
/**
* This method is called when the robot mode is about to end. It contains code to clean up robot hardware before
* exiting the robot mode.
*
* @param runMode specifies the robot mode it is about to stop, can be used to cleanup mode specific hardware.
*/
public void stopMode(TrcRobot.RunMode runMode)
{
//
// Print all performance counters if there are any.
//
if (robotDrive != null && robotDrive.gyro != null)
{
robotDrive.gyro.printElapsedTime(globalTracer);
robotDrive.gyro.setElapsedTimerEnabled(false);
}
TrcDigitalInput.printElapsedTime(globalTracer);
TrcDigitalInput.setElapsedTimerEnabled(false);
TrcMotor.printElapsedTime(globalTracer);
TrcMotor.setElapsedTimerEnabled(false);
TrcServo.printElapsedTime(globalTracer);
TrcServo.setElapsedTimerEnabled(false);
//
// Disable vision.
//
if (vision != null)
{
if (vision.rawColorBlobVision != null)
{
globalTracer.traceInfo(moduleName, "Disabling RawColorBlobVision.");
vision.setRawColorBlobVisionEnabled(false);
}
if (vision.aprilTagVision != null)
{
globalTracer.traceInfo(moduleName, "Disabling AprilTagVision.");
vision.setAprilTagVisionEnabled(false);
}
if (vision.redBlobVision != null)
{
globalTracer.traceInfo(moduleName, "Disabling RedBlobVision.");
vision.setRedBlobVisionEnabled(false);
}
if (vision.blueBlobVision != null)
{
globalTracer.traceInfo(moduleName, "Disabling BlueBlobVision.");
vision.setBlueBlobVisionEnabled(false);
}
if (vision.tensorFlowVision != null)
{
globalTracer.traceInfo(moduleName, "Disabling TensorFlowVision.");
vision.setTensorFlowVisionEnabled(false);
}
vision.close();
}
if (robotDrive != null)
{
if (runMode == TrcRobot.RunMode.AUTO_MODE)
{
// Save current robot location at the end of autonomous so subsequent teleop run can restore it.
endOfAutoRobotPose = robotDrive.driveBase.getFieldPosition();
globalTracer.traceInfo(moduleName, "Saved robot pose=" + endOfAutoRobotPose);
}
//
// Disable odometry.
//
robotDrive.driveBase.setOdometryEnabled(false);
//
// Disable gyro task.
//
if (robotDrive.gyro != null)
{
robotDrive.gyro.setEnabled(false);
}
}
} //stopMode
/**
* This method updates all subsystem status on the dashboard.
*/
public void updateStatus()
{
if (TrcTimer.getCurrentTime() > nextStatusUpdateTime)
{
int lineNum = 2;
nextStatusUpdateTime += STATUS_UPDATE_INTERVAL;
if (robotDrive != null)
{
dashboard.displayPrintf(lineNum++, "DriveBase: Pose=%s", robotDrive.driveBase.getFieldPosition());
}
//
// Display other subsystem status here.
//
}
} //updateStatus
/**
* This method zero calibrates all subsystems.
*
* @param owner specifies the owner ID to check if the caller has ownership of the motor.
*/
public void zeroCalibrate(String owner)
{
} //zeroCalibrate
/**
* This method zero calibrates all subsystems.
*/
public void zeroCalibrate()
{
zeroCalibrate(null);
} //zeroCalibrate
/**
* This method sets the robot's starting position according to the autonomous choices.
*
* @param autoChoices specifies all the auto choices.
*/
public void setRobotStartPosition(FtcAuto.AutoChoices autoChoices)
{
} //setRobotStartPosition
/**
* This method sends the text string to the Driver Station to be spoken using text to speech.
*
* @param sentence specifies the sentence to be spoken by the Driver Station.
*/
public void speak(String sentence)
{
opMode.telemetry.speak(sentence);
} //speak
} //class Robot

View File

@ -0,0 +1,295 @@
/*
* Copyright (c) 2023 Titan Robotics Club (http://www.titanrobotics.com)
*
* Permission is hereby granted, free of charge, to any person obtaining a copy
* of this software and associated documentation files (the "Software"), to deal
* in the Software without restriction, including without limitation the rights
* to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
* copies of the Software, and to permit persons to whom the Software is
* furnished to do so, subject to the following conditions:
*
* The above copyright notice and this permission notice shall be included in all
* copies or substantial portions of the Software.
*
* THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
* IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
* FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
* AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
* LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
* OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
* SOFTWARE.
*/
package teamcode;
import android.os.Environment;
import com.qualcomm.hardware.rev.RevHubOrientationOnRobot;
import org.openftc.easyopencv.OpenCvCameraRotation;
import TrcCommonLib.trclib.TrcDriveBase.DriveOrientation;
import TrcCommonLib.trclib.TrcHomographyMapper;
import TrcCommonLib.trclib.TrcPidController;
import TrcCommonLib.trclib.TrcPose2D;
import TrcCommonLib.trclib.TrcUtil;
import TrcFtcLib.ftclib.FtcGamepad;
/**
* This class contains robot and subsystem constants and parameters.
*/
public class RobotParams
{
public static enum RobotType
{
MecanumRobot,
SwerveRobot,
NoRobot
} //RobotType
/**
* This class contains robot preferences. It controls enabling/disabling of various robot features.
*/
public static class Preferences
{
// Global config
public static RobotType robotType = RobotType.MecanumRobot;
public static boolean inCompetition = false;
// Miscellaneous
public static boolean useTraceLog = true;
public static boolean useLoopPerformanceMonitor = true;
public static boolean useBlinkin = false;
public static boolean useBatteryMonitor = false;
public static boolean doStatusUpdate = !inCompetition;
// Vision
public static boolean useWebCam = true;
public static boolean hasWebCam2 = false;
public static boolean useBuiltinCamBack = false;
public static boolean tuneColorBlobVision = false;
public static boolean useAprilTagVision = false;
public static boolean useColorBlobVision = false;
public static boolean useTensorFlowVision = false;
public static boolean showVisionView = !inCompetition;
public static boolean showVisionStat = false;
// Drive Base
public static boolean useExternalOdometry = false;
public static boolean doSwervePhysicalAlignment = false;
public static boolean swerveDualServoSteering = true;
// Subsystems
public static boolean useSubsystems = false;
} //class Preferences
public static final String ROBOT_NAME = "Robotxxxx";
public static final RevHubOrientationOnRobot.LogoFacingDirection hubLogoDirection =
RevHubOrientationOnRobot.LogoFacingDirection.UP;
public static final RevHubOrientationOnRobot.UsbFacingDirection hubUsbDirection =
RevHubOrientationOnRobot.UsbFacingDirection.FORWARD;
public static final String TEAM_FOLDER_PATH =
Environment.getExternalStorageDirectory().getPath() + "/FIRST/ftcxxxx";
public static final String LOG_FOLDER_PATH = TEAM_FOLDER_PATH + "/tracelogs";
public static final String STEERING_CALIBRATION_DATA_FILE = "SteerCalibration.txt";
//
// Hardware names.
//
// Miscellaneous.
public static final String HWNAME_IMU = "imu";
public static final String HWNAME_WEBCAM1 = "Webcam 1";
public static final String HWNAME_WEBCAM2 = "Webcam 2";
public static final String HWNAME_BLINKIN = "blinkin";
// Drive Base.
public static final String HWNAME_LFDRIVE_MOTOR = "lfDriveMotor";
public static final String HWNAME_RFDRIVE_MOTOR = "rfDriveMotor";
public static final String HWNAME_LBDRIVE_MOTOR = "lbDriveMotor";
public static final String HWNAME_RBDRIVE_MOTOR = "rbDriveMotor";
public static final String HWNAME_LFSTEER_SERVO = "lfSteerServo";
public static final String HWNAME_RFSTEER_SERVO = "rfSteerServo";
public static final String HWNAME_LBSTEER_SERVO = "lbSteerServo";
public static final String HWNAME_RBSTEER_SERVO = "rbSteerServo";
public static final String HWNAME_LFSTEER_ENCODER = "lfSteerEncoder";
public static final String HWNAME_RFSTEER_ENCODER = "rfSteerEncoder";
public static final String HWNAME_LBSTEER_ENCODER = "lbSteerEncoder";
public static final String HWNAME_RBSTEER_ENCODER = "rbSteerEncoder";
// Subsystems.
//
// Field dimensions.
//
public static final double FULL_FIELD_INCHES = 141.24;
public static final double HALF_FIELD_INCHES = FULL_FIELD_INCHES/2.0;
public static final double FULL_TILE_INCHES = FULL_FIELD_INCHES/6.0;
//
// Robot dimensions.
//
public static final double ROBOT_LENGTH = 17.0;
public static final double ROBOT_WIDTH = 17.0;
public static final double DRIVE_BASE_LENGTH = (24.0 * 14)*TrcUtil.INCHES_PER_MM;
public static final double DRIVE_BASE_WIDTH = 16.0;
//
// Game related locations.
//
public static final TrcPose2D[] APRILTAG_POSES = new TrcPose2D[] {
new TrcPose2D(0.0, 0.0, 0.0), // TagId 1
new TrcPose2D(0.0, 0.0, 0.0), // TagId 2
new TrcPose2D(0.0, 0.0, 0.0), // TagId 3
new TrcPose2D(0.0, 0.0, 0.0) // TagId 4
};
//
// Vision subsystem.
//
public static final int CAM_IMAGE_WIDTH = 640;
public static final int CAM_IMAGE_HEIGHT = 480;
public static final OpenCvCameraRotation CAM_ORIENTATION = OpenCvCameraRotation.UPRIGHT;
// Camera location on robot.
public static final double CAM_X_OFFSET = 0.0; //Camera right offset from robot centroid
public static final double CAM_Y_OFFSET = 2.0; //Camera forward offset from robot centroid
public static final double CAM_Z_OFFSET = 9.75; //Camera height offset from floor
public static final double CAM_TILT_DOWN = 15.00;//Camera tilt down angle from horizontal in deg
public static final TrcPose2D CAM_POSE = new TrcPose2D(CAM_X_OFFSET, CAM_Y_OFFSET, 0.0);
// // Camera: Microsoft Lifecam HD 3000 v1/v2
// public static final double WEBCAM_FX = 678.154; // in pixels
// public static final double WEBCAM_FY = 678.170; // in pixels
// public static final double WEBCAM_CX = 318.135; // in pixels
// public static final double WEBCAM_CY = 228.374; // in pixels
// // Camera: Logitech C270
// public static final double WEBCAM_FX = 822.317; // in pixels
// public static final double WEBCAM_FY = 822.317; // in pixels
// public static final double WEBCAM_CX = 319.495; // in pixels
// public static final double WEBCAM_CY = 242.502; // in pixels
// // Camera: Logitech C310
// public static final double WEBCAM_FX = 821.993; // in pixels
// public static final double WEBCAM_FY = 821.993; // in pixels
// public static final double WEBCAM_CX = 330.489; // in pixels
// public static final double WEBCAM_CY = 248.997; // in pixels
// // Camera: Logitech C920
// public static final double WEBCAM_FX = 622.001; // in pixels
// public static final double WEBCAM_FY = 622.001; // in pixels
// public static final double WEBCAM_CX = 319.803; // in pixels
// public static final double WEBCAM_CY = 241.251; // in pixels
// Measurement unit: pixels
public static final double HOMOGRAPHY_CAMERA_TOPLEFT_X = 0.0;
public static final double HOMOGRAPHY_CAMERA_TOPLEFT_Y = 120.0;
public static final double HOMOGRAPHY_CAMERA_TOPRIGHT_X = CAM_IMAGE_WIDTH - 1;
public static final double HOMOGRAPHY_CAMERA_TOPRIGHT_Y = 120.0;
public static final double HOMOGRAPHY_CAMERA_BOTTOMLEFT_X = 0.0;
public static final double HOMOGRAPHY_CAMERA_BOTTOMLEFT_Y = CAM_IMAGE_HEIGHT - 1;
public static final double HOMOGRAPHY_CAMERA_BOTTOMRIGHT_X = CAM_IMAGE_WIDTH - 1;
public static final double HOMOGRAPHY_CAMERA_BOTTOMRIGHT_Y = CAM_IMAGE_HEIGHT - 1;
// Measurement unit: inches
public static final double HOMOGRAPHY_WORLD_TOPLEFT_X = -12.5625;
public static final double HOMOGRAPHY_WORLD_TOPLEFT_Y = 48.0 - ROBOT_LENGTH/2.0 - CAM_Y_OFFSET;
public static final double HOMOGRAPHY_WORLD_TOPRIGHT_X = 11.4375;
public static final double HOMOGRAPHY_WORLD_TOPRIGHT_Y = 44.75 - ROBOT_LENGTH/2.0 - CAM_Y_OFFSET;
public static final double HOMOGRAPHY_WORLD_BOTTOMLEFT_X = -2.5625;
public static final double HOMOGRAPHY_WORLD_BOTTOMLEFT_Y = 21.0 - ROBOT_LENGTH/2.0 - CAM_Y_OFFSET;
public static final double HOMOGRAPHY_WORLD_BOTTOMRIGHT_X = 2.5626;
public static final double HOMOGRAPHY_WORLD_BOTTOMRIGHT_Y = 21.0 - ROBOT_LENGTH/2.0 + CAM_Y_OFFSET;
public static final TrcHomographyMapper.Rectangle cameraRect = new TrcHomographyMapper.Rectangle(
RobotParams.HOMOGRAPHY_CAMERA_TOPLEFT_X, RobotParams.HOMOGRAPHY_CAMERA_TOPLEFT_Y,
RobotParams.HOMOGRAPHY_CAMERA_TOPRIGHT_X, RobotParams.HOMOGRAPHY_CAMERA_TOPRIGHT_Y,
RobotParams.HOMOGRAPHY_CAMERA_BOTTOMLEFT_X, RobotParams.HOMOGRAPHY_CAMERA_BOTTOMLEFT_Y,
RobotParams.HOMOGRAPHY_CAMERA_BOTTOMRIGHT_X, RobotParams.HOMOGRAPHY_CAMERA_BOTTOMRIGHT_Y);
public static final TrcHomographyMapper.Rectangle worldRect = new TrcHomographyMapper.Rectangle(
RobotParams.HOMOGRAPHY_WORLD_TOPLEFT_X, RobotParams.HOMOGRAPHY_WORLD_TOPLEFT_Y,
RobotParams.HOMOGRAPHY_WORLD_TOPRIGHT_X, RobotParams.HOMOGRAPHY_WORLD_TOPRIGHT_Y,
RobotParams.HOMOGRAPHY_WORLD_BOTTOMLEFT_X, RobotParams.HOMOGRAPHY_WORLD_BOTTOMLEFT_Y,
RobotParams.HOMOGRAPHY_WORLD_BOTTOMRIGHT_X, RobotParams.HOMOGRAPHY_WORLD_BOTTOMRIGHT_Y);
//
// Motor Odometries.
//
// https://www.gobilda.com/5203-series-yellow-jacket-planetary-gear-motor-19-2-1-ratio-24mm-length-8mm-rex-shaft-312-rpm-3-3-5v-encoder/
public static final double GOBILDA_5203_312_ENCODER_PPR = (((1.0 + 46.0/17.0)*(1.0 + 46.0/11.0))*28.0);
public static final double GOBILDA_5203_312_RPM = 312.0;
public static final double GOBILDA_5203_312_MAX_VELOCITY_PPS=
GOBILDA_5203_312_ENCODER_PPR*GOBILDA_5203_312_RPM/60.0; // 2795.9872 pps
//https://www.gobilda.com/5203-series-yellow-jacket-planetary-gear-motor-13-7-1-ratio-24mm-length-8mm-rex-shaft-435-rpm-3-3-5v-encoder/
public static final double GOBILDA_5203_435_ENCODER_PPR = (((1.0 + 46.0/17.0)*(1.0 + 46.0/17.0))*28.0);
public static final double GOBILDA_5203_435_RPM = 435.0;
public static final double GOBILDA_5203_435_MAX_VELOCITY_PPS=
GOBILDA_5203_435_ENCODER_PPR*GOBILDA_5203_435_RPM/60.0; // 2787.9135 pps
//
// DriveBase subsystem.
//
public static DriveOrientation DEF_DRIVE_ORIENTATION = DriveOrientation.FIELD;
public static final double LFSTEER_ZERO_POS = 0.474812;
public static final double RFSTEER_ZERO_POS = 0.467663;
public static final double LBSTEER_ZERO_POS = 0.541338;
public static final double RBSTEER_ZERO_POS = 0.545340;
public static final boolean LFDRIVE_INVERTED = true;
public static final boolean RFDRIVE_INVERTED = false;
public static final boolean LBDRIVE_INVERTED = true;
public static final boolean RBDRIVE_INVERTED = false;
public static final boolean LFSTEER_INVERTED = true;
public static final boolean RFSTEER_INVERTED = true;
public static final boolean LBSTEER_INVERTED = true;
public static final boolean RBSTEER_INVERTED = true;
public static final double STEER_SERVO_KP = 0.01;
public static final double STEER_SERVO_KI = 0.0;
public static final double STEER_SERVO_KD = 0.0;
public static final double STEER_SERVO_KF = 0.0;
public static final double STEER_SERVO_IZONE = 0.0;
public static final double STEER_SERVO_TOLERANCE = 0.5;
public static final boolean DRIVE_WHEEL_BRAKE_MODE_ON = true;
public static final double TURN_POWER_LIMIT = 0.5;
public static final double DRIVE_POWER_SCALE_SLOW = 0.5;
public static final double DRIVE_POWER_SCALE_NORMAL = 1.0;
public static final double TURN_POWER_SCALE_SLOW = 0.5;
public static final double TURN_POWER_SCALE_NORMAL = 1.0;
// Optii Odometry Wheel:
public static final double ODWHEEL_DIAMETER = 35.0 * TrcUtil.INCHES_PER_MM;
public static final double ODWHEEL_CPR = 4096.0;
public static final double ODWHEEL_INCHES_PER_COUNT = Math.PI*ODWHEEL_DIAMETER/ODWHEEL_CPR;
// Scale = 0.00105687652708656383937269814237 inches/count
public static final double YLEFT_ODWHEEL_X_OFFSET = -144.0 * TrcUtil.INCHES_PER_MM;
public static final double YLEFT_ODWHEEL_Y_OFFSET = -12.0 * TrcUtil.INCHES_PER_MM;
public static final double YRIGHT_ODWHEEL_X_OFFSET = 144.0 * TrcUtil.INCHES_PER_MM;
public static final double YRIGHT_ODWHEEL_Y_OFFSET = -12.0 * TrcUtil.INCHES_PER_MM;
public static final double X_ODWHEEL_X_OFFSET = 0.0;
public static final double X_ODWHEEL_Y_OFFSET = -168.0 * TrcUtil.INCHES_PER_MM;
public static final FtcGamepad.DriveMode ROBOT_DRIVE_MODE = FtcGamepad.DriveMode.ARCADE_MODE;
//
// Velocity controlled constants.
//
public static final double DRIVE_MOTOR_MAX_VELOCITY_PPS = GOBILDA_5203_312_MAX_VELOCITY_PPS;
public static final TrcPidController.PidCoefficients xPosPidCoeff =
new TrcPidController.PidCoefficients(0.095, 0.0, 0.001, 0.0);
public static final double XPOS_TOLERANCE = 1.0;
public static final double XPOS_INCHES_PER_COUNT = 0.01924724265461924299065420560748;
public static final Double X_RAMP_RATE = null;//10.0;
public static final TrcPidController.PidCoefficients yPosPidCoeff =
new TrcPidController.PidCoefficients(0.06, 0.0, 0.002, 0.0);
public static final double YPOS_TOLERANCE = 1.0;
public static final double YPOS_INCHES_PER_COUNT = 0.02166184604662450653409090909091;
public static final Double Y_RAMP_RATE = null;//10.0;
public static final TrcPidController.PidCoefficients turnPidCoeff =
new TrcPidController.PidCoefficients(0.02, 0.08, 0.003, 0.0, 30.0);
public static final double TURN_TOLERANCE = 1.0;
public static final Double TURN_RAMP_RATE = null;//10.0;
//
// Pure Pursuit parameters.
//
// No-Load max velocity (i.e. theoretical maximum)
// goBILDA 5203-312 motor, max shaft speed = 312 RPM
// motor-to-wheel gear ratio = 1:1
// max wheel speed = pi * wheel diameter * wheel gear ratio * motor RPM / 60.0
// = 3.1415926535897932384626433832795 * 4 in. * 1.0 * 312.0 / 60.0
// = 65.345127194667699360022982372214 in./sec.
public static final double ROBOT_MAX_VELOCITY = 23.0; // measured maximum from drive speed test.
public static final double ROBOT_MAX_ACCELERATION = 500.0; // measured maximum from drive speed test.
// KF should be set to the reciprocal of max tangential velocity (time to travel unit distance), units: sec./in.
public static final TrcPidController.PidCoefficients velPidCoeff =
new TrcPidController.PidCoefficients(0.0, 0.0, 0.0, 1.0/ROBOT_MAX_VELOCITY);
public static final double PPD_FOLLOWING_DISTANCE = 6.0;
public static final double PPD_POS_TOLERANCE = 1.0;
public static final double PPD_POS_ERR_RATE_THRESHOLD = 1.0;
public static final double PPD_TURN_TOLERANCE = 1.0;
public static final double PPD_TURN_ERR_RATE_THRESHOLD = 1.0;
} //class RobotParams
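
The goBILDA gearbox math above, and the Pure Pursuit no-load wheel-speed note, can be sanity-checked with plain arithmetic. The following is an illustrative, standalone sketch only; the class name is made up and it is not part of the SDK or the team code.

// Sketch only: verifies the goBILDA 5203-312 numbers used in RobotParams.
public final class GobildaMathCheck
{
    public static void main(String[] args)
    {
        // Two planetary stages (46:17 and 46:11) times the 28-count motor encoder.
        double ppr312 = (1.0 + 46.0/17.0)*(1.0 + 46.0/11.0)*28.0;      // ~537.7 counts/rev
        double maxPps312 = ppr312*312.0/60.0;                          // ~2796 counts/sec
        // Theoretical no-load wheel speed for a 4-inch wheel at 1:1 gearing (see Pure Pursuit comment).
        double maxWheelInPerSec = Math.PI*4.0*1.0*312.0/60.0;          // ~65.3 in/sec
        System.out.printf("PPR=%.2f, maxVel=%.2f pps, wheel=%.2f in/sec%n",
            ppr312, maxPps312, maxWheelInPerSec);
    }
} //class GobildaMathCheck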

View File

@ -0,0 +1,141 @@
/*
* Copyright (c) 2024 Titan Robotics Club (http://www.titanrobotics.com)
*
* Permission is hereby granted, free of charge, to any person obtaining a copy
* of this software and associated documentation files (the "Software"), to deal
* in the Software without restriction, including without limitation the rights
* to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
* copies of the Software, and to permit persons to whom the Software is
* furnished to do so, subject to the following conditions:
*
* The above copyright notice and this permission notice shall be included in all
* copies or substantial portions of the Software.
*
* THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
* IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
* FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
* AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
* LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
* OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
* SOFTWARE.
*/
package teamcode.autocommands;
import TrcCommonLib.trclib.TrcEvent;
import TrcCommonLib.trclib.TrcRobot;
import TrcCommonLib.trclib.TrcStateMachine;
import TrcCommonLib.trclib.TrcTimer;
import teamcode.FtcAuto;
import teamcode.Robot;
/**
* This class implements an autonomous strategy.
*/
public class CmdAuto implements TrcRobot.RobotCommand
{
private static final String moduleName = CmdAuto.class.getSimpleName();
private enum State
{
START,
DONE
} //enum State
private final Robot robot;
private final FtcAuto.AutoChoices autoChoices;
private final TrcTimer timer;
private final TrcEvent event;
private final TrcStateMachine<State> sm;
/**
* Constructor: Create an instance of the object.
*
* @param robot specifies the robot object for providing access to various global objects.
* @param autoChoices specifies the autoChoices object.
*/
public CmdAuto(Robot robot, FtcAuto.AutoChoices autoChoices)
{
this.robot = robot;
this.autoChoices = autoChoices;
timer = new TrcTimer(moduleName);
event = new TrcEvent(moduleName);
sm = new TrcStateMachine<>(moduleName);
sm.start(State.START);
} //CmdAuto
//
// Implements the TrcRobot.RobotCommand interface.
//
/**
* This method checks if the current RobotCommand is running.
*
* @return true if the command is running, false otherwise.
*/
@Override
public boolean isActive()
{
return sm.isEnabled();
} //isActive
/**
* This method cancels the command if it is active.
*/
@Override
public void cancel()
{
timer.cancel();
sm.stop();
} //cancel
/**
* This method must be called periodically by the caller to drive the command sequence forward.
*
* @param elapsedTime specifies the elapsed time in seconds since the start of the robot mode.
* @return true if the command sequence is completed, false otherwise.
*/
@Override
public boolean cmdPeriodic(double elapsedTime)
{
State state = sm.checkReadyAndGetState();
if (state == null)
{
robot.dashboard.displayPrintf(8, "State: disabled or waiting (nextState=%s)...", sm.getNextState());
}
else
{
robot.dashboard.displayPrintf(8, "State: %s", state);
switch (state)
{
case START:
if (autoChoices.delay > 0.0)
{
timer.set(autoChoices.delay, event);
sm.waitForSingleEvent(event, State.DONE);
}
else
{
sm.setState(State.DONE);
}
break;
default:
case DONE:
// We are done.
cancel();
break;
}
robot.globalTracer.traceStateInfo(
sm.toString(), state, robot.robotDrive.driveBase, robot.robotDrive.pidDrive,
robot.robotDrive.purePursuitDrive, null);
}
return !sm.isEnabled();
} //cmdPeriodic
} //class CmdAuto
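
As a hedged usage sketch (not the actual FtcAuto code), a command like this is typically constructed once at the start of autonomous and then pumped from the opmode's periodic loop until it reports completion. The names robot, autoChoices, startMode and periodic below are assumptions about the surrounding opmode.

    // Sketch only: driving CmdAuto from an autonomous opmode; surrounding plumbing is assumed.
    private TrcRobot.RobotCommand autoCommand;

    public void startMode()
    {
        autoCommand = new CmdAuto(robot, autoChoices);  // state machine starts in State.START
    }

    public void periodic(double elapsedTime)
    {
        if (autoCommand != null && autoCommand.isActive())
        {
            autoCommand.cmdPeriodic(elapsedTime);       // returns true once DONE has run cancel()
        }
    }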

View File

@ -0,0 +1,181 @@
/*
* Copyright (c) 2024 Titan Robotics Club (http://www.titanrobotics.com)
*
* Permission is hereby granted, free of charge, to any person obtaining a copy
* of this software and associated documentation files (the "Software"), to deal
* in the Software without restriction, including without limitation the rights
* to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
* copies of the Software, and to permit persons to whom the Software is
* furnished to do so, subject to the following conditions:
*
* The above copyright notice and this permission notice shall be included in all
* copies or substantial portions of the Software.
*
* THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
* IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
* FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
* AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
* LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
* OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
* SOFTWARE.
*/
package teamcode.autotasks;
import TrcCommonLib.trclib.TrcAutoTask;
import TrcCommonLib.trclib.TrcEvent;
import TrcCommonLib.trclib.TrcOwnershipMgr;
import TrcCommonLib.trclib.TrcRobot;
import TrcCommonLib.trclib.TrcTaskMgr;
import teamcode.Robot;
/**
* This class implements auto-assist task.
*/
public class TaskAuto extends TrcAutoTask<TaskAuto.State>
{
private static final String moduleName = TaskAuto.class.getSimpleName();
public enum State
{
START,
DONE
} //enum State
private static class TaskParams
{
TaskParams()
{
} //TaskParams
} //class TaskParams
private final String ownerName;
private final Robot robot;
private String currOwner = null;
/**
* Constructor: Create an instance of the object.
*
* @param ownerName specifies the owner name to take subsystem ownership, can be null if no ownership required.
* @param robot specifies the robot object that contains all the necessary subsystems.
*/
public TaskAuto(String ownerName, Robot robot)
{
super(moduleName, ownerName, TrcTaskMgr.TaskType.POST_PERIODIC_TASK);
this.ownerName = ownerName;
this.robot = robot;
} //TaskAuto
/**
* This method starts the auto-assist operation.
*
* @param completionEvent specifies the event to signal when done, can be null if none provided.
*/
public void autoAssist(TrcEvent completionEvent)
{
tracer.traceInfo(moduleName, "event=%s", completionEvent);
startAutoTask(State.START, new TaskParams(), completionEvent);
} //autoAssist
/**
* This method cancels an in progress auto-assist operation if any.
*/
public void autoAssistCancel()
{
tracer.traceInfo(moduleName, "Canceling auto-assist.");
stopAutoTask(false);
} //autoAssistCancel
//
// Implement TrcAutoTask abstract methods.
//
/**
* This method is called by the super class to acquire ownership of all subsystems involved in the auto-assist
* operation. This is typically done before starting an auto-assist operation.
*
* @return true if acquired all subsystems ownership, false otherwise. It releases all ownership if any acquire
* failed.
*/
@Override
protected boolean acquireSubsystemsOwnership()
{
boolean success = ownerName == null ||
(robot.robotDrive.driveBase.acquireExclusiveAccess(ownerName));
if (success)
{
currOwner = ownerName;
tracer.traceInfo(moduleName, "Successfully acquired subsystem ownerships.");
}
else
{
TrcOwnershipMgr ownershipMgr = TrcOwnershipMgr.getInstance();
tracer.traceWarn(
moduleName, "Failed to acquire subsystem ownership (currOwner=%s, robotDrive=%s).",
currOwner, ownershipMgr.getOwner(robot.robotDrive.driveBase));
releaseSubsystemsOwnership();
}
return success;
} //acquireSubsystemsOwnership
/**
* This method is called by the super class to release ownership of all subsystems involved in the auto-assist
* operation. This is typically done if the auto-assist operation is completed or canceled.
*/
@Override
protected void releaseSubsystemsOwnership()
{
if (ownerName != null)
{
TrcOwnershipMgr ownershipMgr = TrcOwnershipMgr.getInstance();
tracer.traceInfo(
moduleName, "Releasing subsystem ownership (currOwner=%s, robotDrive=%s).",
currOwner, ownershipMgr.getOwner(robot.robotDrive.driveBase));
robot.robotDrive.driveBase.releaseExclusiveAccess(currOwner);
currOwner = null;
}
} //releaseSubsystemsOwnership
/**
* This method is called by the super class to stop all the subsystems.
*/
@Override
protected void stopSubsystems()
{
tracer.traceInfo(moduleName, "Stopping subsystems.");
robot.robotDrive.cancel(currOwner);
} //stopSubsystems
/**
* This method is called periodically to run the auto-assist task.
*
* @param params specifies the task parameters.
* @param state specifies the current state of the task.
* @param taskType specifies the type of task being run.
* @param runMode specifies the competition mode (e.g. Autonomous, TeleOp, Test).
* @param slowPeriodicLoop specifies true if it is running the slow periodic loop on the main robot thread,
* false if running the fast loop on the main robot thread.
*/
@Override
protected void runTaskState(
Object params, State state, TrcTaskMgr.TaskType taskType, TrcRobot.RunMode runMode, boolean slowPeriodicLoop)
{
TaskParams taskParams = (TaskParams) params;
switch (state)
{
case START:
break;
default:
case DONE:
// Stop task.
stopAutoTask(true);
break;
}
} //runTaskState
} //class TaskAuto
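
A hedged sketch of how a caller might start and cancel this auto-assist task. Only the TaskAuto and TrcEvent calls come from the code above; the instance names and the idea of binding these calls to gamepad buttons are assumptions.

    // Sketch only: typical caller-side use of TaskAuto (e.g. wired to gamepad buttons elsewhere).
    TaskAuto autoTask = new TaskAuto("autoAssist", robot);      // owner name enables exclusive drive base access
    TrcEvent completionEvent = new TrcEvent("autoAssistDone");  // optional; pass null if no notification is needed
    autoTask.autoAssist(completionEvent);                       // runs START -> DONE on the POST_PERIODIC task
    // Later, e.g. when the driver presses a cancel button:
    autoTask.autoAssistCancel();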

View File

@ -0,0 +1,103 @@
/*
* Copyright (c) 2023 Titan Robotics Club (http://www.titanrobotics.com)
*
* Permission is hereby granted, free of charge, to any person obtaining a copy
* of this software and associated documentation files (the "Software"), to deal
* in the Software without restriction, including without limitation the rights
* to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
* copies of the Software, and to permit persons to whom the Software is
* furnished to do so, subject to the following conditions:
*
* The above copyright notice and this permission notice shall be included in all
* copies or substantial portions of the Software.
*
* THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
* IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
* FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
* AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
* LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
* OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
* SOFTWARE.
*/
package teamcode.drivebases;
import TrcCommonLib.trclib.TrcDbgTrace;
import TrcCommonLib.trclib.TrcMecanumDriveBase;
import TrcCommonLib.trclib.TrcOdometryWheels;
import TrcCommonLib.trclib.TrcPidDrive;
import TrcCommonLib.trclib.TrcPurePursuitDrive;
import teamcode.RobotParams;
/**
* This class creates the RobotDrive subsystem that consists of wheel motors and related objects for driving the
* robot.
*/
public class MecanumDrive extends RobotDrive
{
/**
* Constructor: Create an instance of the object.
*/
public MecanumDrive()
{
super();
driveMotors = createDriveMotors(driveMotorNames, driveMotorInverted);
driveBase = new TrcMecanumDriveBase(
driveMotors[INDEX_LEFT_FRONT], driveMotors[INDEX_LEFT_BACK],
driveMotors[INDEX_RIGHT_FRONT], driveMotors[INDEX_RIGHT_BACK], gyro);
if (RobotParams.Preferences.useExternalOdometry)
{
// Create the external odometry device that uses the right back encoder port as the X odometry and
// the left and right front encoder ports as the Y1 and Y2 odometry. Gyro will serve as the angle
// odometry.
TrcOdometryWheels driveBaseOdometry = new TrcOdometryWheels(
new TrcOdometryWheels.AxisSensor(
driveMotors[INDEX_RIGHT_BACK], RobotParams.X_ODWHEEL_Y_OFFSET, RobotParams.X_ODWHEEL_X_OFFSET),
new TrcOdometryWheels.AxisSensor[] {
new TrcOdometryWheels.AxisSensor(
driveMotors[INDEX_LEFT_FRONT], RobotParams.YLEFT_ODWHEEL_X_OFFSET,
RobotParams.YLEFT_ODWHEEL_Y_OFFSET),
new TrcOdometryWheels.AxisSensor(
driveMotors[INDEX_RIGHT_FRONT], RobotParams.YRIGHT_ODWHEEL_X_OFFSET,
RobotParams.YRIGHT_ODWHEEL_Y_OFFSET)},
gyro);
// Set the drive base to use the external odometry device overriding the built-in one.
driveBase.setDriveBaseOdometry(driveBaseOdometry);
driveBase.setOdometryScales(RobotParams.ODWHEEL_INCHES_PER_COUNT, RobotParams.ODWHEEL_INCHES_PER_COUNT);
}
else
{
driveBase.setOdometryScales(RobotParams.XPOS_INCHES_PER_COUNT, RobotParams.YPOS_INCHES_PER_COUNT);
}
//
// Create and initialize PID controllers.
//
pidDrive = new TrcPidDrive(
"pidDrive", driveBase,
RobotParams.xPosPidCoeff, RobotParams.XPOS_TOLERANCE, driveBase::getXPosition,
RobotParams.yPosPidCoeff, RobotParams.YPOS_TOLERANCE, driveBase::getYPosition,
RobotParams.turnPidCoeff, RobotParams.TURN_TOLERANCE, driveBase::getHeading);
pidDrive.setStallDetectionEnabled(true);
pidDrive.getXPidCtrl().setRampRate(RobotParams.X_RAMP_RATE);
pidDrive.getYPidCtrl().setRampRate(RobotParams.Y_RAMP_RATE);
pidDrive.getTurnPidCtrl().setRampRate(RobotParams.TURN_RAMP_RATE);
pidDrive.getTurnPidCtrl().setAbsoluteSetPoint(true);
// FTC robots generally have USB performance issues where the sampling rate of the gyro is not high enough.
// If the robot turns too fast, PID will cause oscillation. By limiting turn power, the robot turns slower.
pidDrive.getTurnPidCtrl().setOutputLimit(RobotParams.TURN_POWER_LIMIT);
// AbsoluteTargetMode eliminates cumulative errors on multi-segment runs because drive base is keeping track
// of the absolute target position.
pidDrive.setAbsoluteTargetModeEnabled(true);
pidDrive.setTraceLevel(TrcDbgTrace.MsgLevel.INFO, false, false, false);
purePursuitDrive = new TrcPurePursuitDrive(
"purePursuitDrive", driveBase,
RobotParams.PPD_FOLLOWING_DISTANCE, RobotParams.PPD_POS_TOLERANCE, RobotParams.PPD_TURN_TOLERANCE,
RobotParams.xPosPidCoeff, RobotParams.yPosPidCoeff, RobotParams.turnPidCoeff, RobotParams.velPidCoeff);
purePursuitDrive.setStallDetectionEnabled(true);
purePursuitDrive.setFastModeEnabled(true);
purePursuitDrive.setTraceLevel(TrcDbgTrace.MsgLevel.INFO, false, false, false);
} //MecanumDrive
} //class MecanumDrive

View File

@ -0,0 +1,122 @@
/*
* Copyright (c) 2023 Titan Robotics Club (http://www.titanrobotics.com)
*
* Permission is hereby granted, free of charge, to any person obtaining a copy
* of this software and associated documentation files (the "Software"), to deal
* in the Software without restriction, including without limitation the rights
* to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
* copies of the Software, and to permit persons to whom the Software is
* furnished to do so, subject to the following conditions:
*
* The above copyright notice and this permission notice shall be included in all
* copies or substantial portions of the Software.
*
* THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
* IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
* FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
* AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
* LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
* OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
* SOFTWARE.
*/
package teamcode.drivebases;
import TrcCommonLib.trclib.TrcDriveBase;
import TrcCommonLib.trclib.TrcPidDrive;
import TrcCommonLib.trclib.TrcPurePursuitDrive;
import TrcCommonLib.trclib.TrcUtil;
import TrcFtcLib.ftclib.FtcDcMotor;
import TrcFtcLib.ftclib.FtcImu;
import teamcode.RobotParams;
/**
* This class is intended to be extended by subclasses implementing different robot drive bases.
*/
public class RobotDrive
{
public static final int INDEX_LEFT_FRONT = 0;
public static final int INDEX_RIGHT_FRONT = 1;
public static final int INDEX_LEFT_BACK = 2;
public static final int INDEX_RIGHT_BACK = 3;
protected final String[] driveMotorNames = {
RobotParams.HWNAME_LFDRIVE_MOTOR, RobotParams.HWNAME_RFDRIVE_MOTOR,
RobotParams.HWNAME_LBDRIVE_MOTOR, RobotParams.HWNAME_RBDRIVE_MOTOR};
protected final boolean[] driveMotorInverted = {
RobotParams.LFDRIVE_INVERTED, RobotParams.RFDRIVE_INVERTED,
RobotParams.LBDRIVE_INVERTED, RobotParams.RBDRIVE_INVERTED};
//
// Sensors.
//
public final FtcImu gyro;
//
// Subclass needs to initialize the following variables.
//
// Drive motors.
public FtcDcMotor[] driveMotors;
// Drive Base.
public TrcDriveBase driveBase;
// Drive Controllers.
public TrcPidDrive pidDrive;
public TrcPurePursuitDrive purePursuitDrive;
/**
* Constructor: Create an instance of the object.
*/
public RobotDrive()
{
gyro = new FtcImu(RobotParams.HWNAME_IMU, RobotParams.hubLogoDirection, RobotParams.hubUsbDirection);
} //RobotDrive
/**
* This method cancels any PIDDrive operation still in progress.
*
* @param owner specifies the owner that requested the cancel.
*/
public void cancel(String owner)
{
if (pidDrive != null && pidDrive.isActive())
{
pidDrive.cancel(owner);
}
if (purePursuitDrive != null && purePursuitDrive.isActive())
{
purePursuitDrive.cancel(owner);
}
driveBase.stop(owner);
} //cancel
/**
* This method cancels any PIDDrive operation still in progress.
*/
public void cancel()
{
cancel(null);
} //cancel
/**
* This method creates and configures all drive motors.
*
* @param motorNames specifies an array of names for each drive motor.
* @param inverted specifies an array of boolean indicating if the drive motor needs to be inverted.
* @return an array of created drive motors.
*/
protected FtcDcMotor[] createDriveMotors(String[] motorNames, boolean[] inverted)
{
FtcDcMotor[] motors = new FtcDcMotor[motorNames.length];
for (int i = 0; i < motorNames.length; i++)
{
motors[i] = new FtcDcMotor(motorNames[i]);
motors[i].setBrakeModeEnabled(RobotParams.DRIVE_WHEEL_BRAKE_MODE_ON);
motors[i].setMotorInverted(inverted[i]);
motors[i].setVoltageCompensationEnabled(TrcUtil.BATTERY_NOMINAL_VOLTAGE);
}
return motors;
} //createDriveMotors
} //class RobotDrive

View File

@ -0,0 +1,393 @@
/*
* Copyright (c) 2023 Titan Robotics Club (http://www.titanrobotics.com)
*
* Permission is hereby granted, free of charge, to any person obtaining a copy
* of this software and associated documentation files (the "Software"), to deal
* in the Software without restriction, including without limitation the rights
* to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
* copies of the Software, and to permit persons to whom the Software is
* furnished to do so, subject to the following conditions:
*
* The above copyright notice and this permission notice shall be included in all
* copies or substantial portions of the Software.
*
* THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
* IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
* FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
* AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
* LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
* OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
* SOFTWARE.
*/
package teamcode.drivebases;
import java.io.FileNotFoundException;
import java.io.FileOutputStream;
import java.io.FileReader;
import java.io.PrintStream;
import java.util.Arrays;
import java.util.Scanner;
import TrcCommonLib.trclib.TrcDbgTrace;
import TrcCommonLib.trclib.TrcOdometryWheels;
import TrcCommonLib.trclib.TrcPidDrive;
import TrcCommonLib.trclib.TrcPurePursuitDrive;
import TrcCommonLib.trclib.TrcSwerveDriveBase;
import TrcCommonLib.trclib.TrcSwerveModule;
import TrcFtcLib.ftclib.FtcAnalogEncoder;
import TrcFtcLib.ftclib.FtcCRServo;
import TrcFtcLib.ftclib.FtcDcMotor;
import teamcode.RobotParams;
/**
* This class creates the RobotDrive subsystem that consists of wheel motors and related objects for driving the
* robot.
*/
public class SwerveDrive extends RobotDrive
{
private static final String moduleName = SwerveDrive.class.getSimpleName();
private static final TrcDbgTrace globalTracer = TrcDbgTrace.getGlobalTracer();
private final String[] steerEncoderNames = {
RobotParams.HWNAME_LFSTEER_ENCODER, RobotParams.HWNAME_RFSTEER_ENCODER,
RobotParams.HWNAME_LBSTEER_ENCODER, RobotParams.HWNAME_RBSTEER_ENCODER};
public double[] zeroPositions = {
RobotParams.LFSTEER_ZERO_POS, RobotParams.RFSTEER_ZERO_POS,
RobotParams.LBSTEER_ZERO_POS, RobotParams.RBSTEER_ZERO_POS};
private final String[] steerServoNames = {
RobotParams.HWNAME_LFSTEER_SERVO, RobotParams.HWNAME_RFSTEER_SERVO,
RobotParams.HWNAME_LBSTEER_SERVO, RobotParams.HWNAME_RBSTEER_SERVO};
private final boolean[] steerServoInverted = {
RobotParams.LFSTEER_INVERTED, RobotParams.RFSTEER_INVERTED,
RobotParams.LBSTEER_INVERTED, RobotParams.RBSTEER_INVERTED};
private final String[] swerveModuleNames = {"lfSwerveModule", "rfSwerveModule", "lbSwerveModule", "rbSwerveModule"};
//
// Swerve steering motors and modules.
//
public final FtcAnalogEncoder[] steerEncoders;
public final FtcCRServo[] steerServos;
public final TrcSwerveModule[] swerveModules;
public int calibrationCount = 0;
/**
* Constructor: Create an instance of the object.
*/
public SwerveDrive()
{
super();
readSteeringCalibrationData();
driveMotors = createDriveMotors(driveMotorNames, driveMotorInverted);
steerEncoders = createSteerEncoders(
steerEncoderNames,
RobotParams.Preferences.doSwervePhysicalAlignment ? new double[] {0.0, 0.0, 0.0, 0.0} : zeroPositions);
steerServos = createSteerServos(steerServoNames, steerServoInverted, steerEncoders);
swerveModules = createSwerveModules(swerveModuleNames, driveMotors, steerServos);
driveBase = new TrcSwerveDriveBase(
swerveModules[INDEX_LEFT_FRONT], swerveModules[INDEX_LEFT_BACK],
swerveModules[INDEX_RIGHT_FRONT], swerveModules[INDEX_RIGHT_BACK],
gyro, RobotParams.DRIVE_BASE_WIDTH, RobotParams.DRIVE_BASE_LENGTH);
driveBase.setDriveOrientation(RobotParams.DEF_DRIVE_ORIENTATION, true);
if (RobotParams.Preferences.useExternalOdometry)
{
//
// Create the external odometry device that uses the right back encoder port as the X odometry and
// the left and right front encoder ports as the Y1 and Y2 odometry. Gyro will serve as the angle
// odometry.
//
TrcOdometryWheels driveBaseOdometry = new TrcOdometryWheels(
new TrcOdometryWheels.AxisSensor(
driveMotors[INDEX_RIGHT_BACK], RobotParams.X_ODWHEEL_Y_OFFSET, RobotParams.X_ODWHEEL_X_OFFSET),
new TrcOdometryWheels.AxisSensor[] {
new TrcOdometryWheels.AxisSensor(
driveMotors[INDEX_LEFT_FRONT], RobotParams.YLEFT_ODWHEEL_X_OFFSET,
RobotParams.YLEFT_ODWHEEL_Y_OFFSET),
new TrcOdometryWheels.AxisSensor(
driveMotors[INDEX_RIGHT_FRONT], RobotParams.YRIGHT_ODWHEEL_X_OFFSET,
RobotParams.YRIGHT_ODWHEEL_Y_OFFSET)},
gyro);
//
// Set the drive base to use the external odometry device overriding the built-in one.
//
driveBase.setDriveBaseOdometry(driveBaseOdometry);
driveBase.setOdometryScales(RobotParams.ODWHEEL_INCHES_PER_COUNT, RobotParams.ODWHEEL_INCHES_PER_COUNT);
}
else
{
driveBase.setOdometryScales(RobotParams.YPOS_INCHES_PER_COUNT, RobotParams.YPOS_INCHES_PER_COUNT);
}
//
// Create and initialize PID controllers.
//
pidDrive = new TrcPidDrive(
"pidDrive", driveBase,
RobotParams.xPosPidCoeff, RobotParams.XPOS_TOLERANCE, driveBase::getXPosition,
RobotParams.yPosPidCoeff, RobotParams.YPOS_TOLERANCE, driveBase::getYPosition,
RobotParams.turnPidCoeff, RobotParams.TURN_TOLERANCE, driveBase::getHeading);
pidDrive.getXPidCtrl().setRampRate(RobotParams.X_RAMP_RATE);
pidDrive.getYPidCtrl().setRampRate(RobotParams.Y_RAMP_RATE);
pidDrive.getTurnPidCtrl().setRampRate(RobotParams.TURN_RAMP_RATE);
pidDrive.getTurnPidCtrl().setAbsoluteSetPoint(true);
// FTC robots generally have USB performance issues where the sampling rate of the gyro is not high enough.
// If the robot turns too fast, PID will cause oscillation. By limiting turn power, the robot turns slower.
pidDrive.getTurnPidCtrl().setOutputLimit(RobotParams.TURN_POWER_LIMIT);
// AbsoluteTargetMode eliminates cumulative errors on multi-segment runs because drive base is keeping track
// of the absolute target position.
pidDrive.setAbsoluteTargetModeEnabled(true);
pidDrive.setTraceLevel(TrcDbgTrace.MsgLevel.INFO, false, false, false);
purePursuitDrive = new TrcPurePursuitDrive(
"purePursuitDrive", driveBase,
RobotParams.PPD_FOLLOWING_DISTANCE, RobotParams.PPD_POS_TOLERANCE, RobotParams.PPD_TURN_TOLERANCE,
RobotParams.xPosPidCoeff, RobotParams.yPosPidCoeff, RobotParams.turnPidCoeff, RobotParams.velPidCoeff);
purePursuitDrive.setFastModeEnabled(true);
purePursuitDrive.setTraceLevel(TrcDbgTrace.MsgLevel.INFO, false, false, false);
} //SwerveDrive
/**
* This method creates and configures all steer encoders.
*
* @param encoderNames specifies an array of names for each steer encoder.
* @param zeroOffsets specifies an array of zero offsets for each steer encoder.
* @return an array of created steer encoders.
*/
private FtcAnalogEncoder[] createSteerEncoders(String[] encoderNames, double[] zeroOffsets)
{
FtcAnalogEncoder[] encoders = new FtcAnalogEncoder[encoderNames.length];
for (int i = 0; i < encoderNames.length; i++)
{
encoders[i] = new FtcAnalogEncoder(encoderNames[i]);
encoders[i].setScaleAndOffset(180.0, 0.0, zeroOffsets[i]);
// Enable Cartesian converter.
encoders[i].setEnabled(true);
}
return encoders;
} //createSteerEncoders
/**
* This method creates and configures all steer servos.
*
* @param servoNames specifies an array of names for each steer servo.
* @param inverted specifies an array of boolean indicating if the servo needs to be inverted.
* @param encoders specifies an array of encoders for each steer servo.
* @return an array of created steer servos.
*/
private FtcCRServo[] createSteerServos(String[] servoNames, boolean[] inverted, FtcAnalogEncoder[] encoders)
{
FtcCRServo[] servos = new FtcCRServo[servoNames.length];
for (int i = 0; i < servoNames.length; i++)
{
servos[i] = new FtcCRServo(servoNames[i], encoders[i]);
if (RobotParams.Preferences.swerveDualServoSteering)
{
FtcCRServo slaveServo = new FtcCRServo(servoNames[i] + ".slave", null);
slaveServo.follow(servos[i], false);
}
servos[i].setMotorInverted(inverted[i]);
servos[i].setSoftwarePidEnabled(true);
servos[i].setPositionPidCoefficients(
RobotParams.STEER_SERVO_KP, RobotParams.STEER_SERVO_KI,
RobotParams.STEER_SERVO_KD, RobotParams.STEER_SERVO_KF, RobotParams.STEER_SERVO_IZONE);
servos[i].setPositionPidTolerance(RobotParams.STEER_SERVO_TOLERANCE);
}
return servos;
} //createSteerServos
/**
* This method creates and configures all swerve modules.
*
* @param moduleNames specifies an array of names for each swerve module.
* @param driveMotors specifies an array of drive motors.
* @param steerServos specifies an array of steer servos.
* @return an array of created swerve modules.
*/
private TrcSwerveModule[] createSwerveModules(
String[] moduleNames, FtcDcMotor[] driveMotors, FtcCRServo[] steerServos)
{
TrcSwerveModule[] modules = new TrcSwerveModule[moduleNames.length];
for (int i = 0; i < moduleNames.length; i++)
{
modules[i] = new TrcSwerveModule(moduleNames[i], driveMotors[i], steerServos[i]);
}
return modules;
} //createSwerveModules
/**
* This method enables/disables performance monitoring of all steering servo motors.
*
* @param enabled specifies true to enable, false to disable.
*/
public void setSteerPerformanceMonitorEnabled(boolean enabled)
{
for (FtcCRServo servo: steerServos)
{
servo.setPerformanceMonitorEnabled(enabled);
}
} //setSteerPerformanceMonitorEnabled
/**
* This method prints the performance info to the trace log.
*/
public void printSteerPerformanceInfo()
{
for (FtcCRServo servo : steerServos)
{
servo.printPidControlTaskPerformance();
}
} //printSteerPerformanceInfo
/**
* This method sets the steering angle of all swerve modules.
*
* @param angle specifies the steer angle.
* @param optimize specifies true to optimize (only turns within +/- 90 degrees), false otherwise.
* @param hold specifies true to hold the angle, false otherwise.
*/
public void setSteerAngle(double angle, boolean optimize, boolean hold)
{
for (TrcSwerveModule module: swerveModules)
{
module.setSteerAngle(angle, optimize, hold);
}
} //setSteerAngle
/**
* This method checks if anti-defense mode is enabled.
*
* @return true if anti-defense mode is enabled, false if disabled.
*/
public boolean isAntiDefenseEnabled()
{
return ((TrcSwerveDriveBase) driveBase).isAntiDefenseEnabled();
} //isAntiDefenseEnabled
/**
* This method enables/disables the anti-defense mode where it puts all swerve wheels into an X-formation.
* By doing so, it is very difficult for others to push us around.
*
* @param owner specifies the ID string of the caller for checking ownership, can be null if caller is not
* ownership aware.
* @param enabled specifies true to enable anti-defense mode, false to disable.
*/
public void setAntiDefenseEnabled(String owner, boolean enabled)
{
if (owner == null || !enabled || driveBase.acquireExclusiveAccess(owner))
{
((TrcSwerveDriveBase) driveBase).setAntiDefenseEnabled(owner, enabled);
if (!enabled)
{
driveBase.releaseExclusiveAccess(owner);
}
}
} //setAntiDefenseEnabled
/**
* This method starts the steering calibration.
*/
public void startSteeringCalibration()
{
calibrationCount = 0;
Arrays.fill(zeroPositions, 0.0);
} //startSteeringCalibration
/**
* This method stops the steering calibration and saves the calibration data to a file.
*/
public void stopSteeringCalibration()
{
for (int i = 0; i < zeroPositions.length; i++)
{
zeroPositions[i] /= calibrationCount;
}
calibrationCount = 0;
saveSteeringCalibrationData();
} //stopSteeringCalibration
/**
* This method is called periodically to sample the steer encoders for averaging the zero position data.
*/
public void runSteeringCalibration()
{
for (int i = 0; i < zeroPositions.length; i++)
{
zeroPositions[i] += steerEncoders[i].getRawPosition();
}
calibrationCount++;
} //runSteeringCalibration
/**
* This method saves the calibration data to a file on the Robot Controller.
*/
public void saveSteeringCalibrationData()
{
try (PrintStream out = new PrintStream(new FileOutputStream(
RobotParams.TEAM_FOLDER_PATH + "/" + RobotParams.STEERING_CALIBRATION_DATA_FILE)))
{
for (int i = 0; i < steerServoNames.length; i++)
{
out.printf("%s: %f\n", steerServoNames[i], zeroPositions[i]);
}
globalTracer.traceInfo(
moduleName, "SteeringCalibrationData%s=%s",
Arrays.toString(steerServoNames), Arrays.toString(zeroPositions));
}
catch (FileNotFoundException e)
{
e.printStackTrace();
}
} //saveSteeringCalibrationData
/**
* This method reads the steering calibration data from a file on the Robot Controller.
*
* @throws RuntimeException if file contains invalid data.
*/
public void readSteeringCalibrationData()
{
String line = null;
try (Scanner in = new Scanner(new FileReader(
RobotParams.TEAM_FOLDER_PATH + "/" + RobotParams.STEERING_CALIBRATION_DATA_FILE)))
{
for (int i = 0; i < steerServoNames.length; i++)
{
line = in.nextLine();
int colonPos = line.indexOf(':');
String name = colonPos == -1? null: line.substring(0, colonPos);
if (name == null || !name.equals(steerServoNames[i]))
{
throw new RuntimeException("Invalid servo name in line " + line);
}
zeroPositions[i] = Double.parseDouble(line.substring(colonPos + 1));
}
}
catch (FileNotFoundException e)
{
globalTracer.traceWarn(moduleName, "Steering calibration data file not found, using built-in defaults.");
}
catch (NumberFormatException e)
{
globalTracer.traceErr(moduleName, "Invalid zero position value in line %s", line);
}
catch (RuntimeException e)
{
globalTracer.traceErr(moduleName, "Invalid servo name in line %s", line);
}
globalTracer.traceInfo(
moduleName, "SteeringCalibrationData%s=%s", Arrays.toString(steerServoNames),
Arrays.toString(zeroPositions));
} //readSteeringCalibrationData
} //class SwerveDrive
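
A hedged sketch of the steering calibration flow implied by the methods above: start, sample repeatedly, then stop and save. The loop framing and iteration count are assumptions; on the robot, runSteeringCalibration would be called from a periodic loop rather than a tight for loop, and constructing SwerveDrive requires the robot hardware map.

    // Sketch only: averaging steer-encoder zero positions during a calibration routine.
    SwerveDrive swerveDrive = new SwerveDrive();
    swerveDrive.startSteeringCalibration();     // clears zeroPositions and the sample count
    for (int i = 0; i < 100; i++)
    {
        swerveDrive.runSteeringCalibration();   // accumulates raw encoder readings on each periodic call
    }
    swerveDrive.stopSteeringCalibration();      // averages the samples and writes the calibration file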

View File

@ -0,0 +1,108 @@
/*
* Copyright (c) 2023 Titan Robotics Club (http://www.titanrobotics.com)
*
* Permission is hereby granted, free of charge, to any person obtaining a copy
* of this software and associated documentation files (the "Software"), to deal
* in the Software without restriction, including without limitation the rights
* to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
* copies of the Software, and to permit persons to whom the Software is
* furnished to do so, subject to the following conditions:
*
* The above copyright notice and this permission notice shall be included in all
* copies or substantial portions of the Software.
*
* THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
* IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
* FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
* AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
* LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
* OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
* SOFTWARE.
*/
package teamcode.subsystems;
import TrcCommonLib.trclib.TrcDriveBase;
import TrcCommonLib.trclib.TrcRevBlinkin;
import TrcFtcLib.ftclib.FtcRevBlinkin;
/**
* This class encapsulates the REV Blinkin LED controller to provide a priority indicator showing the status of the
* robot.
*/
public class BlinkinLEDs extends FtcRevBlinkin
{
// LED pattern names.
public static final String APRIL_TAG = "AprilTag";
public static final String RED_BLOB = "RedBlob";
public static final String BLUE_BLOB = "BlueBlob";
public static final String TENSOR_FLOW = "TensorFlow";
public static final String DRIVE_ORIENTATION_FIELD = "FieldMode";
public static final String DRIVE_ORIENTATION_ROBOT = "RobotMode";
public static final String DRIVE_ORIENTATION_INVERTED = "InvertedMode";
public static final String OFF_PATTERN = "Off";
/**
* Constructor: Create an instance of the object.
*
* @param instanceName specifies the instance name. This is also the REV Blinkin's hardware name.
*/
public BlinkinLEDs(String instanceName)
{
super(instanceName);
// LED Patterns are sorted in decreasing priority order.
final TrcRevBlinkin.Pattern[] ledPatternPriorities = {
// Highest priority.
new TrcRevBlinkin.Pattern(APRIL_TAG, RevLedPattern.SolidAqua),
new TrcRevBlinkin.Pattern(RED_BLOB, RevLedPattern.SolidRed),
new TrcRevBlinkin.Pattern(BLUE_BLOB, RevLedPattern.SolidBlue),
new TrcRevBlinkin.Pattern(TENSOR_FLOW, RevLedPattern.SolidYellow),
new TrcRevBlinkin.Pattern(DRIVE_ORIENTATION_FIELD, RevLedPattern.SolidViolet),
new TrcRevBlinkin.Pattern(DRIVE_ORIENTATION_ROBOT, RevLedPattern.SolidWhite),
new TrcRevBlinkin.Pattern(DRIVE_ORIENTATION_INVERTED, RevLedPattern.SolidGray),
new TrcRevBlinkin.Pattern(OFF_PATTERN, RevLedPattern.SolidBlack)
// Lowest priority.
};
setPatternPriorities(ledPatternPriorities);
} //BlinkinLEDs
/**
* This method sets the pattern ON for a period of time and turns off automatically afterwards.
*
* @param patternName specifies the name of the LED pattern to turn on.
*/
public void setDetectedPattern(String patternName)
{
setPatternState(patternName, true, 0.5);
} //setDetectedPattern
/**
* This method sets the LED to indicate the drive orientation mode of the robot.
*
* @param orientation specifies the drive orientation mode.
*/
public void setDriveOrientation(TrcDriveBase.DriveOrientation orientation)
{
switch (orientation)
{
case INVERTED:
setPatternState(DRIVE_ORIENTATION_INVERTED, true);
setPatternState(DRIVE_ORIENTATION_ROBOT, false);
setPatternState(DRIVE_ORIENTATION_FIELD, false);
break;
case ROBOT:
setPatternState(DRIVE_ORIENTATION_INVERTED, false);
setPatternState(DRIVE_ORIENTATION_ROBOT, true);
setPatternState(DRIVE_ORIENTATION_FIELD, false);
break;
case FIELD:
setPatternState(DRIVE_ORIENTATION_INVERTED, false);
setPatternState(DRIVE_ORIENTATION_ROBOT, false);
setPatternState(DRIVE_ORIENTATION_FIELD, true);
break;
}
} //setDriveOrientation
} //class BlinkinLEDs
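
A hedged usage sketch for the LED subsystem above. The hardware name "blinkin" is an assumption; the pattern names and methods come from the class itself.

    // Sketch only: flashing a detection pattern and reflecting the drive orientation on the LEDs.
    BlinkinLEDs blinkin = new BlinkinLEDs("blinkin");                   // hardware name is an assumption
    blinkin.setDetectedPattern(BlinkinLEDs.APRIL_TAG);                  // shows the AprilTag pattern for 0.5 sec
    blinkin.setDriveOrientation(TrcDriveBase.DriveOrientation.FIELD);   // holds the field-mode color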

View File

@ -0,0 +1,642 @@
/*
* Copyright (c) 2023 Titan Robotics Club (http://www.titanrobotics.com)
*
* Permission is hereby granted, free of charge, to any person obtaining a copy
* of this software and associated documentation files (the "Software"), to deal
* in the Software without restriction, including without limitation the rights
* to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
* copies of the Software, and to permit persons to whom the Software is
* furnished to do so, subject to the following conditions:
*
* The above copyright notice and this permission notice shall be included in all
* copies or substantial portions of the Software.
*
* THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
* IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
* FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
* AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
* LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
* OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
* SOFTWARE.
*/
package teamcode.vision;
import org.firstinspires.ftc.robotcore.external.hardware.camera.BuiltinCameraDirection;
import org.firstinspires.ftc.robotcore.external.hardware.camera.WebcamName;
import org.firstinspires.ftc.robotcore.external.navigation.AngleUnit;
import org.firstinspires.ftc.robotcore.external.navigation.DistanceUnit;
import org.firstinspires.ftc.vision.VisionProcessor;
import org.firstinspires.ftc.vision.apriltag.AprilTagProcessor;
import org.firstinspires.ftc.vision.tfod.TfodProcessor;
import org.opencv.imgproc.Imgproc;
import org.openftc.easyopencv.OpenCvCamera;
import org.openftc.easyopencv.OpenCvCameraFactory;
import java.util.ArrayList;
import TrcCommonLib.trclib.TrcDbgTrace;
import TrcCommonLib.trclib.TrcOpenCvColorBlobPipeline;
import TrcCommonLib.trclib.TrcOpenCvDetector;
import TrcCommonLib.trclib.TrcPose2D;
import TrcCommonLib.trclib.TrcVisionTargetInfo;
import TrcFtcLib.ftclib.FtcEocvColorBlobProcessor;
import TrcFtcLib.ftclib.FtcOpMode;
import TrcFtcLib.ftclib.FtcRawEocvColorBlobPipeline;
import TrcFtcLib.ftclib.FtcRawEocvVision;
import TrcFtcLib.ftclib.FtcVision;
import TrcFtcLib.ftclib.FtcVisionAprilTag;
import TrcFtcLib.ftclib.FtcVisionEocvColorBlob;
import TrcFtcLib.ftclib.FtcVisionTensorFlow;
import teamcode.Robot;
import teamcode.RobotParams;
import teamcode.subsystems.BlinkinLEDs;
/**
* This class implements AprilTag/TensorFlow/Eocv Vision for the game season. It creates and initializes all the
* vision target info as well as providing info for the robot, camera and the field. It also provides methods to get
* the location of the robot and detected targets.
*/
public class Vision
{
private static final String moduleName = Vision.class.getSimpleName();
// Warning: EOCV converts camera stream to RGBA whereas Desktop OpenCV converts it to BGRA. Therefore, the correct
// color conversion must be RGBA (or RGB) to whatever color space you want to convert.
//
// YCrCb Color Space.
private static final int colorConversion = Imgproc.COLOR_RGB2YCrCb;
private static final double[] redBlobColorThresholds = {20.0, 120.0, 180.0, 240.0, 90.0, 120.0};
private static final double[] blueBlobColorThresholds = {20.0, 250.0, 40.0, 250.0, 160.0, 240.0};
private static final TrcOpenCvColorBlobPipeline.FilterContourParams colorBlobFilterContourParams =
new TrcOpenCvColorBlobPipeline.FilterContourParams()
.setMinArea(5000.0)
.setMinPerimeter(200.0)
.setWidthRange(50.0, 1000.0)
.setHeightRange(80.0, 1000.0)
.setSolidityRange(0.0, 100.0)
.setVerticesRange(0.0, 1000.0)
.setAspectRatioRange(0.3, 1.0);
private static final String TFOD_MODEL_ASSET = "MyObject.tflite";
private static final float TFOD_MIN_CONFIDENCE = 0.75f;
public static final String TFOD_OBJECT_LABEL = "MyObject";
public static final String[] TFOD_TARGET_LABELS = {TFOD_OBJECT_LABEL};
private final TrcDbgTrace tracer;
private final Robot robot;
private final WebcamName webcam1, webcam2;
private FtcRawEocvColorBlobPipeline rawColorBlobPipeline;
public FtcRawEocvVision rawColorBlobVision;
public FtcVisionAprilTag aprilTagVision;
private AprilTagProcessor aprilTagProcessor;
public FtcVisionEocvColorBlob redBlobVision;
private FtcEocvColorBlobProcessor redBlobProcessor;
public FtcVisionEocvColorBlob blueBlobVision;
private FtcEocvColorBlobProcessor blueBlobProcessor;
public FtcVisionTensorFlow tensorFlowVision;
private TfodProcessor tensorFlowProcessor;
public FtcVision vision;
/**
* Constructor: Create an instance of the object.
*
* @param robot specifies the robot object.
*/
public Vision(Robot robot)
{
FtcOpMode opMode = FtcOpMode.getInstance();
this.tracer = new TrcDbgTrace();
this.robot = robot;
this.webcam1 = opMode.hardwareMap.get(WebcamName.class, RobotParams.HWNAME_WEBCAM1);
this.webcam2 = RobotParams.Preferences.hasWebCam2?
opMode.hardwareMap.get(WebcamName.class, RobotParams.HWNAME_WEBCAM2): null;
if (RobotParams.Preferences.tuneColorBlobVision)
{
OpenCvCamera openCvCamera;
if (RobotParams.Preferences.showVisionView)
{
int cameraViewId = opMode.hardwareMap.appContext.getResources().getIdentifier(
"cameraMonitorViewId", "id", opMode.hardwareMap.appContext.getPackageName());
openCvCamera = OpenCvCameraFactory.getInstance().createWebcam(webcam1, cameraViewId);
}
else
{
openCvCamera = OpenCvCameraFactory.getInstance().createWebcam(webcam1);
}
tracer.traceInfo(moduleName, "Starting RawEocvColorBlobVision...");
rawColorBlobPipeline = new FtcRawEocvColorBlobPipeline(
"rawColorBlobPipeline", colorConversion, redBlobColorThresholds, colorBlobFilterContourParams, true);
// By default, display original Mat.
rawColorBlobPipeline.setVideoOutput(0);
rawColorBlobPipeline.setAnnotateEnabled(true);
rawColorBlobVision = new FtcRawEocvVision(
"rawColorBlobVision", RobotParams.CAM_IMAGE_WIDTH, RobotParams.CAM_IMAGE_HEIGHT, null, null,
openCvCamera, RobotParams.CAM_ORIENTATION);
rawColorBlobVision.setFpsMeterEnabled(RobotParams.Preferences.showVisionStat);
setRawColorBlobVisionEnabled(false);
}
else
{
// Creating Vision Processors for VisionPortal.
ArrayList<VisionProcessor> visionProcessorsList = new ArrayList<>();
if (RobotParams.Preferences.useAprilTagVision)
{
tracer.traceInfo(moduleName, "Starting AprilTagVision...");
FtcVisionAprilTag.Parameters aprilTagParams = new FtcVisionAprilTag.Parameters()
.setDrawTagIdEnabled(true)
.setDrawTagOutlineEnabled(true)
.setDrawAxesEnabled(false)
.setDrawCubeProjectionEnabled(false)
// .setLensIntrinsics(
// RobotParams.WEBCAM_FX, RobotParams.WEBCAM_FY, RobotParams.WEBCAM_CX, RobotParams.WEBCAM_CY)
.setOutputUnits(DistanceUnit.INCH, AngleUnit.DEGREES);
aprilTagVision = new FtcVisionAprilTag(aprilTagParams, AprilTagProcessor.TagFamily.TAG_36h11);
aprilTagProcessor = aprilTagVision.getVisionProcessor();
visionProcessorsList.add(aprilTagProcessor);
}
if (RobotParams.Preferences.useColorBlobVision)
{
tracer.traceInfo(moduleName, "Starting ColorBlobVision...");
redBlobVision = new FtcVisionEocvColorBlob(
"RedBlob", colorConversion, redBlobColorThresholds, colorBlobFilterContourParams, true,
RobotParams.cameraRect, RobotParams.worldRect, true);
redBlobProcessor = redBlobVision.getVisionProcessor();
visionProcessorsList.add(redBlobProcessor);
blueBlobVision = new FtcVisionEocvColorBlob(
"BlueBlob", colorConversion, blueBlobColorThresholds, colorBlobFilterContourParams, true,
RobotParams.cameraRect, RobotParams.worldRect, true);
blueBlobProcessor = blueBlobVision.getVisionProcessor();
visionProcessorsList.add(blueBlobProcessor);
}
if (RobotParams.Preferences.useTensorFlowVision)
{
tracer.traceInfo(moduleName, "Starting TensorFlowVision...");
tensorFlowVision = new FtcVisionTensorFlow(
null, true, TFOD_MODEL_ASSET, TFOD_TARGET_LABELS, RobotParams.cameraRect, RobotParams.worldRect);
tensorFlowProcessor = tensorFlowVision.getVisionProcessor();
tensorFlowProcessor.setMinResultConfidence(TFOD_MIN_CONFIDENCE);
visionProcessorsList.add(tensorFlowProcessor);
}
VisionProcessor[] visionProcessors = new VisionProcessor[visionProcessorsList.size()];
visionProcessorsList.toArray(visionProcessors);
if (RobotParams.Preferences.useWebCam)
{
// Use USB webcams.
vision = new FtcVision(
webcam1, webcam2, RobotParams.CAM_IMAGE_WIDTH, RobotParams.CAM_IMAGE_HEIGHT,
RobotParams.Preferences.showVisionView, RobotParams.Preferences.showVisionStat, visionProcessors);
}
else
{
// Use phone camera.
vision = new FtcVision(
RobotParams.Preferences.useBuiltinCamBack?
BuiltinCameraDirection.BACK: BuiltinCameraDirection.FRONT,
RobotParams.CAM_IMAGE_WIDTH, RobotParams.CAM_IMAGE_HEIGHT,
RobotParams.Preferences.showVisionView, RobotParams.Preferences.showVisionStat, visionProcessors);
}
// Disable all vision until they are needed.
for (VisionProcessor processor: visionProcessors)
{
vision.setProcessorEnabled(processor, false);
}
}
} //Vision
/**
* This method closes the vision portal and is normally called at the end of an opmode.
*/
public void close()
{
if (vision != null)
{
vision.getVisionPortal().close();
}
} //close
/**
* This method enables/disables FPS meter on the viewport.
*
* @param enabled specifies true to enable FPS meter, false to disable.
*/
public void setFpsMeterEnabled(boolean enabled)
{
if (rawColorBlobVision != null)
{
rawColorBlobVision.setFpsMeterEnabled(enabled);
}
else if (vision != null)
{
vision.setFpsMeterEnabled(enabled);
}
} //setFpsMeterEnabled
/**
* This method returns the front webcam.
*
* @return front webcam.
*/
public WebcamName getFrontWebcam()
{
return webcam1;
} //getFrontWebcam
/**
* This method returns the rear webcam.
*
* @return rear webcam.
*/
public WebcamName getRearWebcam()
{
return webcam2;
} //getRearWebcam
/**
* This method returns the active camera if we have two webcams.
*
* @return active camera.
*/
public WebcamName getActiveWebcam()
{
return vision.getActiveWebcam();
} //getActiveWebcam
/**
* This method sets the active webcam.
*
* @param webcam specifies the webcam to be set as active.
*/
public void setActiveWebcam(WebcamName webcam)
{
vision.setActiveWebcam(webcam);
} //setActiveWebcam
/**
* This method displays the exposure settings on the dashboard. This helps tuning camera exposure.
*
* @param lineNum specifies the dashboard line number to display the info.
*/
public void displayExposureSettings(int lineNum)
{
long[] exposureSetting = vision.getExposureSetting();
long currExposure = vision.getCurrentExposure();
int[] gainSetting = vision.getGainSetting();
int currGain = vision.getCurrentGain();
robot.dashboard.displayPrintf(
lineNum, "Exp: %d (%d:%d), Gain: %d (%d:%d)",
currExposure, exposureSetting[0], exposureSetting[1],
currGain, gainSetting != null? gainSetting[0]: 0, gainSetting != null? gainSetting[1]: 0);
} //displayExposureSettings
/**
* This method returns the color threshold values of rawColorBlobVision.
*
* @return array of color threshold values.
*/
public double[] getRawColorBlobThresholds()
{
return rawColorBlobPipeline != null? rawColorBlobPipeline.getColorThresholds(): null;
} //getRawColorBlobThresholds
/**
* This method sets the color threshold values of rawColorBlobVision.
*
* @param colorThresholds specifies an array of color threshold values.
*/
public void setRawColorBlobThresholds(double... colorThresholds)
{
if (rawColorBlobPipeline != null)
{
rawColorBlobPipeline.setColorThresholds(colorThresholds);
}
} //setRawColorBlobThresholds
/**
* This method enables/disables raw ColorBlob vision.
*
* @param enabled specifies true to enable, false to disable.
*/
public void setRawColorBlobVisionEnabled(boolean enabled)
{
if (rawColorBlobVision != null)
{
rawColorBlobVision.setPipeline(enabled? rawColorBlobPipeline: null);
}
} //setRawColorBlobVisionEnabled
/**
* This method checks if raw ColorBlob vision is enabled.
*
* @return true if enabled, false if disabled.
*/
public boolean isRawColorBlobVisionEnabled()
{
return rawColorBlobVision != null && rawColorBlobVision.getPipeline() != null;
} //isRawColorBlobVisionEnabled
/**
* This method calls RawColorBlob vision to detect the color blob for color threshold tuning.
*
* @param lineNum specifies the dashboard line number to display the detected object info, -1 to disable printing.
* @return detected raw color blob object info.
*/
public TrcVisionTargetInfo<TrcOpenCvDetector.DetectedObject<?>> getDetectedRawColorBlob(int lineNum)
{
TrcVisionTargetInfo<TrcOpenCvDetector.DetectedObject<?>> colorBlobInfo =
rawColorBlobVision != null? rawColorBlobVision.getBestDetectedTargetInfo(null, null, 0.0, 0.0): null;
if (lineNum != -1)
{
robot.dashboard.displayPrintf(
lineNum, "RawColorBlob: %s", colorBlobInfo != null? colorBlobInfo: "Not found.");
}
return colorBlobInfo;
} //getDetectedRawColorBlob
/**
* This method enables/disables AprilTag vision.
*
* @param enabled specifies true to enable, false to disable.
*/
public void setAprilTagVisionEnabled(boolean enabled)
{
if (aprilTagProcessor != null)
{
vision.setProcessorEnabled(aprilTagProcessor, enabled);
}
} //setAprilTagVisionEnabled
/**
* This method checks if AprilTag vision is enabled.
*
* @return true if enabled, false if disabled.
*/
public boolean isAprilTagVisionEnabled()
{
return aprilTagProcessor != null && vision.isVisionProcessorEnabled(aprilTagProcessor);
} //isAprilTagVisionEnabled
/**
* This method calls AprilTag vision to detect the AprilTag object.
*
* @param id specifies the AprilTag ID to look for, null if match to any ID.
* @param lineNum specifies the dashboard line number to display the detected object info, -1 to disable printing.
* @return detected AprilTag object info.
*/
public TrcVisionTargetInfo<FtcVisionAprilTag.DetectedObject> getDetectedAprilTag(Integer id, int lineNum)
{
TrcVisionTargetInfo<FtcVisionAprilTag.DetectedObject> aprilTagInfo =
aprilTagVision.getBestDetectedTargetInfo(id, null);
if (aprilTagInfo != null && robot.blinkin != null)
{
robot.blinkin.setDetectedPattern(BlinkinLEDs.APRIL_TAG);
}
if (lineNum != -1)
{
robot.dashboard.displayPrintf(
lineNum, "%s: %s", BlinkinLEDs.APRIL_TAG, aprilTagInfo != null? aprilTagInfo : "Not found.");
}
return aprilTagInfo;
} //getDetectedAprilTag
/**
* This method calculates the robot's absolute field location with the detected AprilTagInfo.
*
* @param aprilTagInfo specifies the detected AprilTag info.
* @return robot field location.
*/
public TrcPose2D getRobotFieldPose(TrcVisionTargetInfo<FtcVisionAprilTag.DetectedObject> aprilTagInfo)
{
TrcPose2D robotPose = null;
if (aprilTagInfo != null)
{
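            // Pose chain: the tag's known field pose minus the tag's pose relative to the camera gives the
            // camera's field pose; subtracting the camera's mounting pose on the robot then gives the robot's
            // field pose.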
TrcPose2D aprilTagPose = RobotParams.APRILTAG_POSES[aprilTagInfo.detectedObj.aprilTagDetection.id - 1];
TrcPose2D cameraPose = aprilTagPose.subtractRelativePose(aprilTagInfo.objPose);
robotPose = cameraPose.subtractRelativePose(RobotParams.CAM_POSE);
tracer.traceInfo(
moduleName,
"AprilTagId=" + aprilTagInfo.detectedObj.aprilTagDetection.id +
", aprilTagFieldPose=" + aprilTagPose +
", aprilTagPoseFromCamera=" + aprilTagInfo.objPose +
", cameraPose=" + cameraPose +
                ", robotPose=" + robotPose);
}
return robotPose;
} //getRobotFieldPose
/**
* This method uses vision to find an AprilTag and uses the AprilTag's absolute field location and its relative
* position from the camera to calculate the robot's absolute field location.
*
* @return robot field location.
*/
public TrcPose2D getRobotFieldPose()
{
TrcPose2D robotPose = null;
if (aprilTagVision != null)
{
// Find any AprilTag in view.
TrcVisionTargetInfo<FtcVisionAprilTag.DetectedObject> aprilTagInfo = getDetectedAprilTag(null, -1);
if (aprilTagInfo != null)
{
robotPose = getRobotFieldPose(aprilTagInfo);
}
}
return robotPose;
} //getRobotFieldPose
/**
* This method enables/disables RedBlob vision.
*
* @param enabled specifies true to enable, false to disable.
*/
public void setRedBlobVisionEnabled(boolean enabled)
{
if (redBlobProcessor != null)
{
vision.setProcessorEnabled(redBlobProcessor, enabled);
}
} //setRedBlobVisionEnabled
/**
* This method checks if RedBlob vision is enabled.
*
* @return true if enabled, false if disabled.
*/
public boolean isRedBlobVisionEnabled()
{
return redBlobProcessor != null && vision.isVisionProcessorEnabled(redBlobProcessor);
} //isRedBlobVisionEnabled
/**
* This method calls ColorBlob vision to detect the Red Blob object.
*
* @param lineNum specifies the dashboard line number to display the detected object info, -1 to disable printing.
* @return detected Red Blob object info.
*/
public TrcVisionTargetInfo<TrcOpenCvColorBlobPipeline.DetectedObject> getDetectedRedBlob(int lineNum)
{
TrcVisionTargetInfo<TrcOpenCvColorBlobPipeline.DetectedObject> colorBlobInfo =
redBlobVision.getBestDetectedTargetInfo(null, null, 0.0, 0.0);
if (colorBlobInfo != null && robot.blinkin != null)
{
robot.blinkin.setDetectedPattern(BlinkinLEDs.RED_BLOB);
}
if (lineNum != -1)
{
robot.dashboard.displayPrintf(
lineNum, "%s: %s", BlinkinLEDs.RED_BLOB, colorBlobInfo != null? colorBlobInfo: "Not found.");
}
return colorBlobInfo;
} //getDetectedRedBlob
/**
* This method enables/disables BlueBlob vision.
*
* @param enabled specifies true to enable, false to disable.
*/
public void setBlueBlobVisionEnabled(boolean enabled)
{
if (blueBlobProcessor != null)
{
vision.setProcessorEnabled(blueBlobProcessor, enabled);
}
} //setBlueBlobVisionEnabled
/**
* This method checks if BlueBlob vision is enabled.
*
* @return true if enabled, false if disabled.
*/
public boolean isBlueBlobVisionEnabled()
{
return blueBlobProcessor != null && vision.isVisionProcessorEnabled(blueBlobProcessor);
} //isBlueBlobVisionEnabled
/**
* This method calls ColorBlob vision to detect the Blue Blob object.
*
* @param lineNum specifies the dashboard line number to display the detected object info, -1 to disable printing.
* @return detected Blue Blob object info.
*/
public TrcVisionTargetInfo<TrcOpenCvColorBlobPipeline.DetectedObject> getDetectedBlueBlob(int lineNum)
{
TrcVisionTargetInfo<TrcOpenCvColorBlobPipeline.DetectedObject> colorBlobInfo =
blueBlobVision.getBestDetectedTargetInfo(null, null, 0.0, 0.0);
if (colorBlobInfo != null && robot.blinkin != null)
{
robot.blinkin.setDetectedPattern(BlinkinLEDs.BLUE_BLOB);
}
if (lineNum != -1)
{
robot.dashboard.displayPrintf(
lineNum, "%s: %s", BlinkinLEDs.BLUE_BLOB, colorBlobInfo != null? colorBlobInfo: "Not found.");
}
return colorBlobInfo;
} //getDetectedBlueBlob
/**
* This method enables/disables TensorFlow vision.
*
* @param enabled specifies true to enable, false to disable.
*/
public void setTensorFlowVisionEnabled(boolean enabled)
{
if (tensorFlowProcessor != null)
{
vision.setProcessorEnabled(tensorFlowProcessor, enabled);
}
} //setTensorFlowVisionEnabled
/**
* This method checks if TensorFlow vision is enabled.
*
* @return true if enabled, false if disabled.
*/
public boolean isTensorFlowVisionEnabled()
{
return tensorFlowProcessor != null && vision.isVisionProcessorEnabled(tensorFlowProcessor);
} //isTensorFlowVisionEnabled
/**
* This method calls TensorFlow vision to detect the Pixel objects.
*
* @param lineNum specifies the dashboard line number to display the detected object info, -1 to disable printing.
* @return detected Pixel object info.
*/
public TrcVisionTargetInfo<FtcVisionTensorFlow.DetectedObject> getDetectedTensorFlowPixel(int lineNum)
{
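        // Ask TensorFlow vision for the best detection carrying the Pixel label, ranked by decreasing
        // confidence via compareConfidence.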
TrcVisionTargetInfo<FtcVisionTensorFlow.DetectedObject> tensorFlowInfo =
tensorFlowVision.getBestDetectedTargetInfo(TFOD_OBJECT_LABEL, null, this::compareConfidence, 0.0, 0.0);
if (tensorFlowInfo != null && robot.blinkin != null)
{
robot.blinkin.setDetectedPattern(BlinkinLEDs.TENSOR_FLOW);
}
if (lineNum != -1)
{
robot.dashboard.displayPrintf(
lineNum, "%s: %s", BlinkinLEDs.TENSOR_FLOW, tensorFlowInfo != null? tensorFlowInfo: "Not found.");
}
return tensorFlowInfo;
} //getDetectedTensorFlowPixel
/**
     * This method is called by Arrays.sort to sort the target objects by increasing distance.
     *
     * @param a specifies the first target.
* @param b specifies the second target.
* @return negative value if a has closer distance than b, 0 if a and b have equal distances, positive value
* if a has higher distance than b.
*/
private int compareDistance(
TrcVisionTargetInfo<TrcOpenCvColorBlobPipeline.DetectedObject> a,
TrcVisionTargetInfo<TrcOpenCvColorBlobPipeline.DetectedObject> b)
{
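        // Scale by 100 before truncating to int so distance differences smaller than one unit (down to 0.01)
        // still affect the ordering.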
        return (int)((a.objPose.y - b.objPose.y)*100);
} //compareDistance
/**
     * This method is called by Arrays.sort to sort the target objects by decreasing confidence.
     *
     * @param a specifies the first target.
* @param b specifies the second target.
* @return negative value if a has higher confidence than b, 0 if a and b have equal confidence, positive value
* if a has lower confidence than b.
*/
private int compareConfidence(
TrcVisionTargetInfo<FtcVisionTensorFlow.DetectedObject> a,
TrcVisionTargetInfo<FtcVisionTensorFlow.DetectedObject> b)
{
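        // Scale by 100 before truncating to int so confidence differences as small as 0.01 still affect the
        // ordering.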
return (int)((b.detectedObj.confidence - a.detectedObj.confidence)*100);
} //compareConfidence
} //class Vision
