sciurus17_examples package from the sciurus17 repo (sciurus17, sciurus17_control, sciurus17_examples, sciurus17_gazebo, sciurus17_moveit_config, sciurus17_tools, sciurus17_vision)
Package Summary
Tags | No category tags. |
Version | 3.0.0 |
License | Apache License 2.0 |
Build type | AMENT_CMAKE |
Use | RECOMMENDED |
Repository Summary
Checkout URI | https://github.com/rt-net/sciurus17_ros.git |
VCS Type | git |
VCS Version | humble-devel |
Last Updated | 2023-11-10 |
Dev Status | MAINTAINED |
CI status | No Continuous Integration |
Released | UNRELEASED |
Tags | No category tags. |
Maintainers
- RT Corporation
Authors
- Daisuke Sato
- Hiroyuki Nomura
- Atsushi Kuwagata
sciurus17_examples
This package is a collection of sample programs for the Sciurus17 ROS 2 packages.
Setup (when using the real robot)
1. Connect the Sciurus17 to a PC
Connect the Sciurus17 to a PC. Refer to the product manual for the connection procedure.
Note: leave enough space around the Sciurus17 so that it does not hit anything while moving.
2. Check the USB communication port settings
Refer to the README of sciurus17_control for how to configure the USB communication ports.
If the ports are not configured correctly, the Sciurus17 will not move.
3. Launch move_group and the controller
Launch move_group (sciurus17_moveit_config) and the controller (sciurus17_control) with the following command:
ros2 launch sciurus17_examples demo.launch.py
Setup (when using Gazebo)
1. Launch move_group and Gazebo
Launch move_group (sciurus17_moveit_config) and Gazebo with the following command:
ros2 launch sciurus17_gazebo sciurus17_with_table.launch.py
Run the sample programs
Once setup is complete, run a sample program. For example, the gripper open/close sample can be run with the following command:
ros2 launch sciurus17_examples example.launch.py example:='gripper_control'
Press Ctrl+c to exit.
Running the samples in Gazebo
When running a sample program in Gazebo, add the use_sim_time option:
ros2 launch sciurus17_examples example.launch.py example:='gripper_control' use_sim_time:='true'
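For reference, the sketch below (not part of this package) shows what the use_sim_time option does to a node: when the parameter is true, the node's clock follows the /clock topic published by Gazebo instead of wall-clock time.

```python
# Minimal sketch, not from sciurus17_examples: a throwaway rclpy node that
# reports whether it is running on simulated time. use_sim_time is a built-in
# parameter that every ROS 2 node declares automatically.
import rclpy
from rclpy.node import Node


class ClockCheck(Node):
    def __init__(self):
        super().__init__('clock_check')
        use_sim_time = self.get_parameter('use_sim_time').value
        self.get_logger().info('use_sim_time: {}'.format(use_sim_time))
        # With use_sim_time:=true and Gazebo running, this time comes from /clock.
        self.get_logger().info('now: {}'.format(self.get_clock().now().to_msg()))


def main():
    rclpy.init()
    node = ClockCheck()
    rclpy.spin_once(node, timeout_sec=0.5)
    node.destroy_node()
    rclpy.shutdown()


if __name__ == '__main__':
    main()
```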
Examples
Each sample can be run while demo.launch.py is running.
The list of available samples can be displayed by running example.launch.py with the -s option:
ros2 launch sciurus17_examples example.launch.py -s
gripper_control
A code example that opens and closes the grippers.
Run the following command:
ros2 launch sciurus17_examples example.launch.py example:='gripper_control'
neck_control
A code example that moves the neck up, down, left, and right.
Run the following command:
ros2 launch sciurus17_examples example.launch.py example:='neck_control'
waist_control
A code example that twists the waist left and right.
Run the following command:
ros2 launch sciurus17_examples example.launch.py example:='waist_control'
pick_and_place_right_arm_waist
A code example that picks up and moves a target with the right hand, also using waist rotation.
Run the following command:
ros2 launch sciurus17_examples example.launch.py example:='pick_and_place_right_arm_waist'
pick_and_place_left_arm
A code example that picks up and moves a target with the left hand.
Run the following command:
ros2 launch sciurus17_examples example.launch.py example:='pick_and_place_left_arm'
Package Dependencies
Deps | Name | |
---|---|---|
1 | ament_cmake | |
1 | ament_lint_auto | |
1 | ament_lint_common | |
1 | angles | |
1 | sciurus17_control | |
0 | sciurus17_description | |
1 | sciurus17_moveit_config | |
2 | geometry_msgs | |
3 | moveit_ros_planning_interface | |
1 | rclcpp | |
3 | tf2_geometry_msgs |
Dependant Packages
Name | Repo | Deps |
---|---|---|
sciurus17 | github-rt-net-sciurus17_ros |
sciurus17_examples package from the sciurus17 repo (sciurus17, sciurus17_bringup, sciurus17_control, sciurus17_examples, sciurus17_gazebo, sciurus17_moveit_config, sciurus17_msgs, sciurus17_tools, sciurus17_vision)
Package Summary
Tags | No category tags. |
Version | 2.0.0 |
License | Apache License 2.0 |
Build type | CATKIN |
Use | RECOMMENDED |
Repository Summary
Checkout URI | https://github.com/rt-net/sciurus17_ros.git |
VCS Type | git |
VCS Version | master |
Last Updated | 2023-11-10 |
Dev Status | MAINTAINED |
CI status | No Continuous Integration |
Released | UNRELEASED |
Tags | No category tags. |
Maintainers
- RT Corporation
Authors
- Daisuke Sato
- Hiroyuki Nomura
sciurus17_examples
This package includes examples to control Sciurus17 using sciurus17_ros.
How to launch Sciurus17 base packages
1. Connect the cables of the head camera, the chest camera, and the control board to a PC.
2. Power on the Sciurus17; the camera device names will appear in the /dev directory.
3. Open a terminal and launch sciurus17_bringup.launch of the sciurus17_bringup package.
This launch file has arguments:
- use_rviz (default: true)
- use_head_camera (default: true)
- use_chest_camera (default: true)
Using virtual Sciurus17
To launch Sciurus17 base packages without Sciurus17 hardware, unplug the control board's cable from the PC, then launch nodes with the following command:
roslaunch sciurus17_bringup sciurus17_bringup.launch
Using real Sciurus17
Launch the base packages with the following command:
roslaunch sciurus17_bringup sciurus17_bringup.launch
Using without cameras
Launch the base packages with arguments:
roslaunch sciurus17_bringup sciurus17_bringup.launch use_head_camera:=false use_chest_camera:=false
Using without RViz
To reduce the CPU load of the PC, launch the base packages with arguments:
roslaunch sciurus17_bringup sciurus17_bringup.launch use_rviz:=false
Using Gazebo simulator
Launch the packages with the following command:
roslaunch sciurus17_gazebo sciurus17_with_table.launch
# without RViz
roslaunch sciurus17_gazebo sciurus17_with_table.launch use_rviz:=false
Run Examples
The following examples can be run after launching the Sciurus17 base packages.
- gripper_action_example
- neck_joint_trajectory_example
- waist_joint_trajectory_example
- pick_and_place_demo
- hand_position_publisher
- head_camera_tracking
- chest_camera_tracking
- depth_camera_tracking
- preset_pid_gain_example
- box_stacking_example
gripper_action_example
This is an example to open/close the grippers of the two arms.
Run a node with the following command:
rosrun sciurus17_examples gripper_action_example.py
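The grippers are driven through control_msgs/GripperCommand actions. The following sketch shows that pattern for one gripper; the action name is a placeholder, not verified against sciurus17_control, so check your configuration for the real names.

```python
#!/usr/bin/env python
# Hedged sketch of a GripperCommand action client. The action name below is a
# placeholder; the real names are defined by sciurus17_control.
import math
import rospy
import actionlib
from control_msgs.msg import GripperCommandAction, GripperCommandGoal

rospy.init_node("gripper_sketch")
client = actionlib.SimpleActionClient(
    "/sciurus17/controller3/right_hand_controller/gripper_cmd",  # placeholder
    GripperCommandAction)
client.wait_for_server()

goal = GripperCommandGoal()
goal.command.position = math.radians(26.0)  # target opening angle [rad]
goal.command.max_effort = 1.0
client.send_goal(goal)
client.wait_for_result(rospy.Duration(5.0))
print(client.get_result())
```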
Videos
neck_joint_trajectory_example
This is an example to change angles of the neck.
Run a node with the following command:
rosrun sciurus17_examples neck_joint_trajectory_example.py
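The neck is a yaw/pitch chain driven by a joint trajectory controller. As a rough sketch of how a trajectory point could be sent (the topic and joint names are assumptions, not taken from sciurus17_control):

```python
#!/usr/bin/env python
# Hedged sketch of commanding the neck with a JointTrajectory message.
# The topic and joint names are placeholders; check sciurus17_control for the
# names used on a real Sciurus17.
import math
import rospy
from trajectory_msgs.msg import JointTrajectory, JointTrajectoryPoint

rospy.init_node("neck_sketch")
pub = rospy.Publisher("/sciurus17/controller2/neck_controller/command",
                      JointTrajectory, queue_size=1)
rospy.sleep(1.0)  # give the publisher time to connect

traj = JointTrajectory()
traj.joint_names = ["neck_yaw_joint", "neck_pitch_joint"]  # placeholders
point = JointTrajectoryPoint()
point.positions = [math.radians(45.0), 0.0]  # look to the left
point.time_from_start = rospy.Duration(2.0)
traj.points.append(point)
pub.publish(traj)
```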
Videos
waist_joint_trajectory_example
This is an example to change angles of the waist.
Run a node with the following command:
rosrun sciurus17_examples waist_joint_trajectory_example.py
Videos
pick_and_place_demo
This is an example to pick and place a small object with right hand while turning the waist.
Run a node with the following command:
rosrun sciurus17_examples pick_and_place_right_arm_demo.py
This is an example to pick and place a small object with left hand.
Run a node with the following command:
rosrun sciurus17_examples pick_and_place_left_arm_demo.py
This is an example to pick and place a small object with both hands.
Run a node with the following command:
rosrun sciurus17_examples pick_and_place_two_arm_demo.py
Videos
hand_position_publisher
This is an example to receive link positions from the tf server.
This example receives the positions of l_link7 and r_link7 relative to base_link from the tf server, then publishes them as the topics /sciurus17/hand_pos/left and /sciurus17/hand_pos/right.
Run a node with the following command:
rosrun sciurus17_examples hand_position_publisher_example.py
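The core of this example is a tf lookup followed by a topic publish. A condensed sketch of that pattern (frame and topic names come from the description above; the node structure and the Point message type are illustrative assumptions):

```python
#!/usr/bin/env python
# Condensed sketch of the tf-lookup-and-publish pattern described above.
# Frame and topic names are from the description; everything else is
# illustrative and may differ from the actual example script.
import rospy
import tf2_ros
from geometry_msgs.msg import Point

rospy.init_node("hand_position_sketch")
tf_buffer = tf2_ros.Buffer()
tf2_ros.TransformListener(tf_buffer)

pub_left = rospy.Publisher("/sciurus17/hand_pos/left", Point, queue_size=1)
pub_right = rospy.Publisher("/sciurus17/hand_pos/right", Point, queue_size=1)

rate = rospy.Rate(10)
while not rospy.is_shutdown():
    try:
        left = tf_buffer.lookup_transform("base_link", "l_link7", rospy.Time(0))
        right = tf_buffer.lookup_transform("base_link", "r_link7", rospy.Time(0))
    except (tf2_ros.LookupException, tf2_ros.ConnectivityException,
            tf2_ros.ExtrapolationException):
        rate.sleep()
        continue
    pub_left.publish(Point(left.transform.translation.x,
                           left.transform.translation.y,
                           left.transform.translation.z))
    pub_right.publish(Point(right.transform.translation.x,
                            right.transform.translation.y,
                            right.transform.translation.z))
    rate.sleep()
```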
head_camera_tracking
This is an example to use the head camera images and OpenCV library for ball tracking and face tracking.
Run a node with the following command:
rosrun sciurus17_examples head_camera_tracking.py
For ball tracking
Edit ./scripts/head_camera_tracking.py as follows:
```python
def _image_callback(self, ros_image):
    # ...

    # Detect an object (specific color or face)
    output_image = self._detect_orange_object(input_image)
    # output_image = self._detect_blue_object(input_image)
    # output_image = self._detect_face(input_image)
```
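The color-based detectors are standard OpenCV HSV thresholding. A minimal standalone sketch of how an orange-object detector can work (the threshold values are illustrative, not the ones used in the script):

```python
# Minimal sketch of HSV color thresholding, the technique behind
# _detect_orange_object. The threshold values are illustrative only.
import cv2
import numpy as np

def detect_orange_object(bgr_image):
    hsv = cv2.cvtColor(bgr_image, cv2.COLOR_BGR2HSV)
    # Hue/saturation/value limits for "orange"; tune for your lighting.
    lower = np.array([5, 100, 100])
    upper = np.array([20, 255, 255])
    mask = cv2.inRange(hsv, lower, upper)

    # OpenCV 4 return signature: (contours, hierarchy)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    output = bgr_image.copy()
    if contours:
        # Track the largest orange region.
        largest = max(contours, key=cv2.contourArea)
        x, y, w, h = cv2.boundingRect(largest)
        cv2.rectangle(output, (x, y), (x + w, y + h), (0, 165, 255), 2)
    return output
```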
For face tracking
Edit ./scripts/head_camera_tracking.py as follows:
This example uses a Haar Cascade Classifier for face tracking.
Edit the paths of the cascade files in the script. USER_NAME depends on the user environment.
```python
class ObjectTracker:
    def __init__(self):
        # ...

        # Load cascade files
        # Example:
        # self._face_cascade = cv2.CascadeClassifier("/home/USER_NAME/.local/lib/python2.7/site-packages/cv2/data/haarcascade_frontalface_alt2.xml")
        # self._eyes_cascade = cv2.CascadeClassifier("/home/USER_NAME/.local/lib/python2.7/site-packages/cv2/data/haarcascade_eye.xml")
        self._face_cascade = cv2.CascadeClassifier("/home/USER_NAME/.local/lib/python2.7/site-packages/cv2/data/haarcascade_frontalface_alt2.xml")
        self._eyes_cascade = cv2.CascadeClassifier("/home/USER_NAME/.local/lib/python2.7/site-packages/cv2/data/haarcascade_eye.xml")

    def _image_callback(self, ros_image):
        # ...

        # Detect an object (specific color or face)
        # output_image = self._detect_orange_object(input_image)
        # output_image = self._detect_blue_object(input_image)
        output_image = self._detect_face(input_image)
```
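For reference, cascade-based face detection itself is only a few OpenCV calls. A standalone sketch using the cascades bundled with the cv2 package (the example script instead hard-codes the per-user paths shown above):

```python
# Standalone sketch of Haar-cascade face detection, the technique used by the
# example. cv2.data.haarcascades points at the cascades shipped with cv2.
import cv2

face_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_alt2.xml")

def detect_face(bgr_image):
    gray = cv2.cvtColor(bgr_image, cv2.COLOR_BGR2GRAY)
    gray = cv2.equalizeHist(gray)  # improve contrast before detection
    faces = face_cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    output = bgr_image.copy()
    for (x, y, w, h) in faces:
        cv2.rectangle(output, (x, y), (x + w, y + h), (0, 255, 0), 2)
    return output
```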
Videos
This orange ball can be purchased from this page in RT ROBOT SHOP.
chest_camera_tracking
This is an example to use the chest camera images and OpenCV library for ball tracking.
Run a node with the following command:
rosrun sciurus17_examples chest_camera_tracking.py
Execute face tracking and ball tracking simultaneously
Launch nodes with the following commands for face tracking with the head camera and for ball tracking with the chest camera.
rosrun sciurus17_examples head_camera_tracking.py
# Open another terminal
rosrun sciurus17_examples chest_camera_tracking.py
Videos
depth_camera_tracking
This is an example to use the depth camera on the head for object tracking.
Run a node with the following command:
rosrun sciurus17_examples depth_camera_tracking.py
The default detection range is separated into four stages.
To change the detection range, edit ./scripts/depth_camera_tracking.py as follows:
```python
def _detect_object(self, input_depth_image):
    # Limitation of object size
    MIN_OBJECT_SIZE = 10000  # px * px
    MAX_OBJECT_SIZE = 80000  # px * px

    # The detection range is separated into four stages.
    # Unit: mm
    DETECTION_DEPTH = [
        (500, 700),
        (600, 800),
        (700, 900),
        (800, 1000)]
```
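Each (near, far) pair in DETECTION_DEPTH defines one depth window; a candidate object is accepted if the region found inside that window satisfies the size limits. A sketch of masking one stage of a depth image given in millimeters (illustrative, not the example's actual implementation):

```python
# Sketch of thresholding a 16-bit depth image (values in mm) against one
# (near, far) stage from DETECTION_DEPTH, then checking the object size.
import cv2
import numpy as np

MIN_OBJECT_SIZE = 10000  # px * px
MAX_OBJECT_SIZE = 80000  # px * px

def mask_depth_stage(depth_image_mm, near_mm, far_mm):
    # Keep only pixels whose depth lies inside the [near, far] window.
    mask = ((depth_image_mm >= near_mm) &
            (depth_image_mm <= far_mm)).astype(np.uint8) * 255
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    for contour in contours:
        area = cv2.contourArea(contour)
        if MIN_OBJECT_SIZE <= area <= MAX_OBJECT_SIZE:
            return cv2.boundingRect(contour)  # (x, y, w, h) of a candidate object
    return None

# Example: check the first stage (500 mm to 700 mm) of a np.uint16 depth frame.
# print(mask_depth_stage(depth_frame, 500, 700))
```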
preset_pid_gain_example
This is an example to change the PID gains of the servo motors in bulk using preset_reconfigure of sciurus17_control.
Lists of PID gain preset values can be edited in sciurus17_control/scripts/preset_reconfigure.py.
Launch the nodes preset_reconfigure.py and preset_pid_gain_example.py with the following command:
roslaunch sciurus17_examples preset_pid_gain_example.launch
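preset_reconfigure applies a whole preset of gains through dynamic_reconfigure. A rough sketch of the underlying mechanism for a single joint (the server name and parameter names are assumptions, not taken from sciurus17_control):

```python
#!/usr/bin/env python
# Rough sketch of updating servo PID gains through dynamic_reconfigure, the
# mechanism preset_reconfigure builds on. The server and parameter names are
# placeholders; see sciurus17_control for the real interface.
import rospy
import dynamic_reconfigure.client

rospy.init_node("pid_preset_sketch")
client = dynamic_reconfigure.client.Client(
    "/sciurus17/controller1/joints/right_arm_joint1",  # placeholder server name
    timeout=5.0)
# Push a new set of gains in one call; preset_reconfigure does this for every
# joint listed in a preset.
client.update_configuration({"p_gain": 800, "i_gain": 0, "d_gain": 0})
```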
box_stacking_example
This is an example to detect boxes with the Point Cloud Library (PCL) and stack them.
Launch nodes with the following command:
roslaunch sciurus17_examples box_stacking_example.launch
To visualize the result of box detection, add /sciurus17/example/markers (visualization_msgs/MarkerArray) in RViz.
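Besides RViz, the detection result can be inspected programmatically. A small sketch that subscribes to the marker topic named above and prints each detected box pose (illustrative only, not part of the example):

```python
#!/usr/bin/env python
# Small sketch: print the pose of each marker published on the topic named
# above. Illustrative only; the example consumes these markers internally.
import rospy
from visualization_msgs.msg import MarkerArray

def callback(msg):
    for marker in msg.markers:
        p = marker.pose.position
        rospy.loginfo("box %d at (%.3f, %.3f, %.3f)", marker.id, p.x, p.y, p.z)

rospy.init_node("box_marker_listener")
rospy.Subscriber("/sciurus17/example/markers", MarkerArray, callback)
rospy.spin()
```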
Videos
Package Dependencies
Deps | Name | |
---|---|---|
1 | sciurus17_moveit_config | |
1 | pcl_ros | |
2 | roscpp | |
2 | sensor_msgs | |
3 | pcl_conversions | |
2 | visualization_msgs | |
2 | geometry_msgs | |
1 | catkin | |
3 | moveit_commander | |
3 | cv_bridge |
Dependant Packages
Name | Repo | Deps |
---|---|---|
sciurus17 | github-rt-net-sciurus17_ros |