
extrinsic_calibrator_core package from extrinsic_calibrator repo

extrinsic_calibrator extrinsic_calibrator_core extrinsic_calibrator_examples

ROS Distro: humble

Package Summary

Tags: No category tags.
Version: 0.1.0
License: AGPL-3.0-only
Build type: AMENT_CMAKE
Use: RECOMMENDED

Repository Summary

Checkout URI: https://github.com/Ikerlan-KER/extrinsic_calibrator.git
VCS Type: git
VCS Version: humble
Last Updated: 2025-03-17
Dev Status: MAINTAINED
CI status: No Continuous Integration
Released: UNRELEASED
Tags: No category tags.
Contributing: Help Wanted (0), Good First Issues (0), Pull Requests to Review (0)

Package Description

ROS2 package designed to calibrate a set of cameras distributed throughout a room. The calibration is performed using ArUco markers scattered randomly in the environment. Each camera detects one or several ArUco markers within its field of view, and the algorithm reconstructs the positions of the markers to create a global map.

Additional Links

No additional links.

Maintainers

  • Josep Rueda Collell

Authors

No additional authors.

extrinsic_calibrator_core

Overview

extrinsic_calibrator_core is a ROS2 package designed to calibrate a set of cameras distributed throughout a room. The calibration is performed using ArUco markers scattered randomly in the environment. Each camera detects one or several ArUco markers within its field of view, and the algorithm reconstructs the positions of the markers to create a global map. The positions of the cameras are then computed and incorporated into this map.

The algorithm utilizes the OpenCV library to detect the markers and performs matrix transformations to compute the positions of both markers and cameras relative to each other.
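
As a rough illustration of what that detection step involves, the sketch below shows how a single camera frame could be turned into camera-to-marker transforms with OpenCV's classic cv2.aruco API and the 0.26 m marker side from the default configuration. It is an assumption for illustration only; the function name marker_poses and the overall structure are not taken from the package's source.

import cv2
import numpy as np

def marker_poses(image, camera_matrix, dist_coeffs, marker_length=0.26):
    """Return a dict {marker_id: 4x4 pose of the marker in the camera frame}."""
    dictionary = cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_6X6_250)
    corners, ids, _rejected = cv2.aruco.detectMarkers(image, dictionary)
    poses = {}
    if ids is None:
        return poses
    # 3D corners of a square marker of side marker_length, in the marker frame,
    # ordered like OpenCV's detected corners: TL, TR, BR, BL.
    h = marker_length / 2.0
    obj_pts = np.array([[-h,  h, 0], [ h,  h, 0],
                        [ h, -h, 0], [-h, -h, 0]], dtype=np.float32)
    for marker_id, img_pts in zip(ids.flatten(), corners):
        ok, rvec, tvec = cv2.solvePnP(obj_pts,
                                      img_pts.reshape(-1, 2).astype(np.float32),
                                      camera_matrix, dist_coeffs)
        if not ok:
            continue
        T = np.eye(4)
        T[:3, :3], _ = cv2.Rodrigues(rvec)
        T[:3, 3] = tvec.ravel()
        poses[int(marker_id)] = T  # pose of the marker expressed in the camera frame
    return poses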

Features

  • Extrinsically calibrate any number of cameras simultaneously.
  • Automatically build a global ArUco map using marker detection from multiple viewpoints.
  • Configurable ArUco marker properties and camera topics.
  • Includes a utility for generating printable ArUco markers (see the sketch below).
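
The bundled utility itself is not shown here; the following is only a hedged sketch of how a printable marker image can be generated with OpenCV, assuming the classic cv2.aruco API (drawMarker was renamed generateImageMarker in OpenCV 4.7+).

import cv2

dictionary = cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_6X6_250)
marker_id = 0        # marker 0 later acts as the map origin (see Usage below)
side_pixels = 600    # resolution of the square marker image
marker_img = cv2.aruco.drawMarker(dictionary, marker_id, side_pixels)
cv2.imwrite("aruco_%d.png" % marker_id, marker_img)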

Configuration

The package provides configuration options through YAML files.

ArUco Marker Parameters

You can customize the ArUco markers used in the calibration process by modifying the aruco_parameters.yaml file.

aruco_params:
  aruco_dict: # OpenCV marker dictionary
    type: string
    default_value: "DICT_6X6_250"
  marker_length: # Length of the marker side in meters
    type: double
    default_value: 0.26

Camera Topics Parameters

You can specify the topics for each camera in the camera_topics_parameters.yaml file. This setup is scalable to handle as many cameras as needed.

cameras_params:
  cam1:
    image_topic:
      type: string
      default_value: "/camera_1/image_raw"
    camera_info_topic:
      type: string
      default_value: "/camera_1/camera_info"
  cam2:
    image_topic:
      type: string
      default_value: "/camera_2/image_raw"
    camera_info_topic:
      type: string
      default_value: "/camera_2/camera_info"
  # cam3:
  #   image_topic:
  #     type: string
  #     default_value: "/camera_3/image_raw"
  #   camera_info_topic:
  #     type: string
  #     default_value: "/camera_3/camera_info"

Usage

  1. Place the ArUco marker with ID 0 where any camera can see it. This marker will serve as the reference point. The system will consider this marker’s position as the origin (0,0,0) of the global coordinate system, called map.

[Image: setup_paint]

  2. Distribute the remaining ArUco markers around the room, ensuring they're visible to different cameras.

For best results:

  • Try to have each camera see multiple markers.
  • Aim for overlap, where multiple cameras can see the same markers.
  • The more markers a camera can detect, and the more cameras that can see the same markers, the more accurate your calibration will be.

Remember, the system first builds a map of marker positions, then determines camera positions based on this map. So, having markers visible to multiple cameras helps create a more accurate and interconnected calibration.
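
In other words, once a marker's pose in the map frame is known, any camera observing that marker can be located in the map frame by chaining transforms. A minimal numpy sketch of that chaining (illustrative only, not the package's internal code):

import numpy as np

def camera_in_map(T_map_marker: np.ndarray, T_cam_marker: np.ndarray) -> np.ndarray:
    """Pose of the camera in the map frame: T_map_cam = T_map_marker * inv(T_cam_marker)."""
    return T_map_marker @ np.linalg.inv(T_cam_marker)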

[Image: cameras_paint]

  3. Initiate the calibration process by running the extrinsic_calibrator_node (a minimal launch sketch follows below).

[Image: calibration_cameras_paint]
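
How the node is started depends on your setup (see "Launching the Calibrator" below). As one hedged possibility, assuming the executable is installed under the name extrinsic_calibrator_node mentioned in step 3, a minimal Python launch description could look like this:

from launch import LaunchDescription
from launch_ros.actions import Node

def generate_launch_description():
    # Assumes the executable name matches the node mentioned in step 3;
    # parameter files would normally be passed here as well.
    return LaunchDescription([
        Node(
            package='extrinsic_calibrator_core',
            executable='extrinsic_calibrator_node',
            output='screen',
        ),
    ])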

  4. Wait for the algorithm to gather the transform of each marker from each camera. The algorithm iteratively reports which marker transforms are already considered reliable and which are still being verified.

[Image: calibration_debug]

  5. While the calibration is taking place, the calibrator provides tables with useful information.

[Image: markers_per_cam]

[Image: marker_per_marker]

  6. Finally, once the calibration is done, the frames of each marker and each camera are published to tf2.

[Image: tf_map]

Launching the Calibrator

File truncated at 100 lines; see the full file.

CHANGELOG
No CHANGELOG found.

Wiki Tutorials

This package does not provide any links to tutorials in its rosindex metadata. You can check the ROS Wiki Tutorials page for the package.

Launch files

No launch files found.

Messages

No message files found.

Services

No service files found.

Plugins

No plugins found.

Recent questions tagged extrinsic_calibrator_core at Robotics Stack Exchange

No version for distro jazzy showing humble. Known supported distros are highlighted in the buttons above.
Package symbol

extrinsic_calibrator_core package from extrinsic_calibrator repo

extrinsic_calibrator extrinsic_calibrator_core extrinsic_calibrator_examples

ROS Distro
humble

Package Summary

Tags No category tags.
Version 0.1.0
License AGPL-3.0-only
Build type AMENT_CMAKE
Use RECOMMENDED

Repository Summary

Checkout URI https://github.com/Ikerlan-KER/extrinsic_calibrator.git
VCS Type git
VCS Version humble
Last Updated 2025-03-17
Dev Status MAINTAINED
CI status No Continuous Integration
Released UNRELEASED
Tags No category tags.
Contributing Help Wanted (0)
Good First Issues (0)
Pull Requests to Review (0)

Package Description

ROS2 package designed to calibrate a set of cameras distributed throughout a room. The calibration is performed using ArUco markers scattered randomly in the environment. Each camera detects one or several ArUco markers within its field of view, and the algorithm reconstructs the positions of the markers to create a global map.

Additional Links

No additional links.

Maintainers

  • Josep Rueda Collell

Authors

No additional authors.

extrinsic_calibrator_core

Overview

extrinsic_calibrator_core is a ROS2 package designed to calibrate a set of cameras distributed throughout a room. The calibration is performed using ArUco markers scattered randomly in the environment. Each camera detects one or several ArUco markers within its field of view, and the algorithm reconstructs the positions of the markers to create a global map. The positions of the cameras are then computed and incorporated into this aforementioned map map.

The algorithm utilizes the OpenCV library to detect the markers and performs matrix transformations to compute the positions of both markers and cameras relative to each other.

Features

  • Extrinsically calibrate any number of cameras simultaneously.
  • Automatically build a global ArUco map using marker detection from multiple viewpoints.
  • Configurable ArUco marker properties and camera topics.
  • Includes an utility for generating printable ArUco markers.

Configuration

The package provides configuration options through YAML files.

ArUco Marker Parameters

You can customize the ArUco markers used in the calibration process by modifying the aruco_parameters.yaml file.

aruco_params:
  aruco_dict: # OpenCV marker dictionary
    type: string
    default_value: "DICT_6X6_250"
  marker_length: # Length of the marker side in meters
    type: double
    default_value: 0.26

Camera Topics Parameters

You can specify the topics for each camera in the camera_topics_parameters.yaml file. This setup is scalable to handle as many cameras as needed.

cameras_params:
  cam1:
    image_topic:
      type: string
      default_value: "/camera_1/image_raw"
    camera_info_topic:
      type: string
      default_value: "/camera_1/camera_info"
  cam2:
    image_topic:
      type: string
      default_value: "/camera_2/image_raw"
    camera_info_topic:
      type: string
      default_value: "/camera_2/camera_info"
  # cam3:
  #   image_topic:
  #     type: string
  #     default_value: "/camera_3/image_raw"
  #   camera_info_topic:
  #     type: string
  #     default_value: "/camera_3/camera_info"

Usage

  1. Place the ArUco marker with ID 0 where any camera can see it. This marker will serve as the reference point. The system will consider this marker’s position as the origin (0,0,0) of the global coordinate system, called map.

setup_paint

  1. Distribute the remaining ArUco markers around the room, ensuring they’re visible to different cameras.

For best results:

  • Try to have each camera see multiple markers.
  • Aim for overlap, where multiple cameras can see the same markers.
  • The more markers a camera can detect, and the more cameras that can see the same markers, the more accurate your calibration will be.

Remember, the system first builds a map of marker positions, then determines camera positions based on this map. So, having markers visible to multiple cameras helps create a more accurate and interconnected calibration.

cameras_paint

  1. Initiate the calibration process by running the extrinsic_calibrator_node.

calibration_cameras_paint

  1. Wait for the algorithm to gather the transform of each marker from each camera. The algorithm will iteratively tell the user which marker transforms are finally reliable and which ones are still being verified.

calibration_debug

  1. The calibrator will provide tables with useful information, while the calibration is taking place.

markers_per_cam

marker_per_marker

  1. Finally, once the calibration is done, the frame of each marker and camera will be published in tf2.

tf_map

Launching the Calibrator

File truncated at 100 lines see the full file

CHANGELOG
No CHANGELOG found.

Wiki Tutorials

This package does not provide any links to tutorials in it's rosindex metadata. You can check on the ROS Wiki Tutorials page for the package.

Launch files

No launch files found

Messages

No message files found.

Services

No service files found

Plugins

No plugins found.

Recent questions tagged extrinsic_calibrator_core at Robotics Stack Exchange

No version for distro kilted showing humble. Known supported distros are highlighted in the buttons above.
Package symbol

extrinsic_calibrator_core package from extrinsic_calibrator repo

extrinsic_calibrator extrinsic_calibrator_core extrinsic_calibrator_examples

ROS Distro
humble

Package Summary

Tags No category tags.
Version 0.1.0
License AGPL-3.0-only
Build type AMENT_CMAKE
Use RECOMMENDED

Repository Summary

Checkout URI https://github.com/Ikerlan-KER/extrinsic_calibrator.git
VCS Type git
VCS Version humble
Last Updated 2025-03-17
Dev Status MAINTAINED
CI status No Continuous Integration
Released UNRELEASED
Tags No category tags.
Contributing Help Wanted (0)
Good First Issues (0)
Pull Requests to Review (0)

Package Description

ROS2 package designed to calibrate a set of cameras distributed throughout a room. The calibration is performed using ArUco markers scattered randomly in the environment. Each camera detects one or several ArUco markers within its field of view, and the algorithm reconstructs the positions of the markers to create a global map.

Additional Links

No additional links.

Maintainers

  • Josep Rueda Collell

Authors

No additional authors.

extrinsic_calibrator_core

Overview

extrinsic_calibrator_core is a ROS2 package designed to calibrate a set of cameras distributed throughout a room. The calibration is performed using ArUco markers scattered randomly in the environment. Each camera detects one or several ArUco markers within its field of view, and the algorithm reconstructs the positions of the markers to create a global map. The positions of the cameras are then computed and incorporated into this aforementioned map map.

The algorithm utilizes the OpenCV library to detect the markers and performs matrix transformations to compute the positions of both markers and cameras relative to each other.

Features

  • Extrinsically calibrate any number of cameras simultaneously.
  • Automatically build a global ArUco map using marker detection from multiple viewpoints.
  • Configurable ArUco marker properties and camera topics.
  • Includes an utility for generating printable ArUco markers.

Configuration

The package provides configuration options through YAML files.

ArUco Marker Parameters

You can customize the ArUco markers used in the calibration process by modifying the aruco_parameters.yaml file.

aruco_params:
  aruco_dict: # OpenCV marker dictionary
    type: string
    default_value: "DICT_6X6_250"
  marker_length: # Length of the marker side in meters
    type: double
    default_value: 0.26

Camera Topics Parameters

You can specify the topics for each camera in the camera_topics_parameters.yaml file. This setup is scalable to handle as many cameras as needed.

cameras_params:
  cam1:
    image_topic:
      type: string
      default_value: "/camera_1/image_raw"
    camera_info_topic:
      type: string
      default_value: "/camera_1/camera_info"
  cam2:
    image_topic:
      type: string
      default_value: "/camera_2/image_raw"
    camera_info_topic:
      type: string
      default_value: "/camera_2/camera_info"
  # cam3:
  #   image_topic:
  #     type: string
  #     default_value: "/camera_3/image_raw"
  #   camera_info_topic:
  #     type: string
  #     default_value: "/camera_3/camera_info"

Usage

  1. Place the ArUco marker with ID 0 where any camera can see it. This marker will serve as the reference point. The system will consider this marker’s position as the origin (0,0,0) of the global coordinate system, called map.

setup_paint

  1. Distribute the remaining ArUco markers around the room, ensuring they’re visible to different cameras.

For best results:

  • Try to have each camera see multiple markers.
  • Aim for overlap, where multiple cameras can see the same markers.
  • The more markers a camera can detect, and the more cameras that can see the same markers, the more accurate your calibration will be.

Remember, the system first builds a map of marker positions, then determines camera positions based on this map. So, having markers visible to multiple cameras helps create a more accurate and interconnected calibration.

cameras_paint

  1. Initiate the calibration process by running the extrinsic_calibrator_node.

calibration_cameras_paint

  1. Wait for the algorithm to gather the transform of each marker from each camera. The algorithm will iteratively tell the user which marker transforms are finally reliable and which ones are still being verified.

calibration_debug

  1. The calibrator will provide tables with useful information, while the calibration is taking place.

markers_per_cam

marker_per_marker

  1. Finally, once the calibration is done, the frame of each marker and camera will be published in tf2.

tf_map

Launching the Calibrator

File truncated at 100 lines see the full file

CHANGELOG
No CHANGELOG found.

Wiki Tutorials

This package does not provide any links to tutorials in it's rosindex metadata. You can check on the ROS Wiki Tutorials page for the package.

Launch files

No launch files found

Messages

No message files found.

Services

No service files found

Plugins

No plugins found.

Recent questions tagged extrinsic_calibrator_core at Robotics Stack Exchange

No version for distro rolling showing humble. Known supported distros are highlighted in the buttons above.
Package symbol

extrinsic_calibrator_core package from extrinsic_calibrator repo

extrinsic_calibrator extrinsic_calibrator_core extrinsic_calibrator_examples

ROS Distro
humble

Package Summary

Tags No category tags.
Version 0.1.0
License AGPL-3.0-only
Build type AMENT_CMAKE
Use RECOMMENDED

Repository Summary

Checkout URI https://github.com/Ikerlan-KER/extrinsic_calibrator.git
VCS Type git
VCS Version humble
Last Updated 2025-03-17
Dev Status MAINTAINED
CI status No Continuous Integration
Released UNRELEASED
Tags No category tags.
Contributing Help Wanted (0)
Good First Issues (0)
Pull Requests to Review (0)

Package Description

ROS2 package designed to calibrate a set of cameras distributed throughout a room. The calibration is performed using ArUco markers scattered randomly in the environment. Each camera detects one or several ArUco markers within its field of view, and the algorithm reconstructs the positions of the markers to create a global map.

Additional Links

No additional links.

Maintainers

  • Josep Rueda Collell

Authors

No additional authors.

extrinsic_calibrator_core

Overview

extrinsic_calibrator_core is a ROS2 package designed to calibrate a set of cameras distributed throughout a room. The calibration is performed using ArUco markers scattered randomly in the environment. Each camera detects one or several ArUco markers within its field of view, and the algorithm reconstructs the positions of the markers to create a global map. The positions of the cameras are then computed and incorporated into this aforementioned map map.

The algorithm utilizes the OpenCV library to detect the markers and performs matrix transformations to compute the positions of both markers and cameras relative to each other.

Features

  • Extrinsically calibrate any number of cameras simultaneously.
  • Automatically build a global ArUco map using marker detection from multiple viewpoints.
  • Configurable ArUco marker properties and camera topics.
  • Includes an utility for generating printable ArUco markers.

Configuration

The package provides configuration options through YAML files.

ArUco Marker Parameters

You can customize the ArUco markers used in the calibration process by modifying the aruco_parameters.yaml file.

aruco_params:
  aruco_dict: # OpenCV marker dictionary
    type: string
    default_value: "DICT_6X6_250"
  marker_length: # Length of the marker side in meters
    type: double
    default_value: 0.26

Camera Topics Parameters

You can specify the topics for each camera in the camera_topics_parameters.yaml file. This setup is scalable to handle as many cameras as needed.

cameras_params:
  cam1:
    image_topic:
      type: string
      default_value: "/camera_1/image_raw"
    camera_info_topic:
      type: string
      default_value: "/camera_1/camera_info"
  cam2:
    image_topic:
      type: string
      default_value: "/camera_2/image_raw"
    camera_info_topic:
      type: string
      default_value: "/camera_2/camera_info"
  # cam3:
  #   image_topic:
  #     type: string
  #     default_value: "/camera_3/image_raw"
  #   camera_info_topic:
  #     type: string
  #     default_value: "/camera_3/camera_info"

Usage

  1. Place the ArUco marker with ID 0 where any camera can see it. This marker will serve as the reference point. The system will consider this marker’s position as the origin (0,0,0) of the global coordinate system, called map.

setup_paint

  1. Distribute the remaining ArUco markers around the room, ensuring they’re visible to different cameras.

For best results:

  • Try to have each camera see multiple markers.
  • Aim for overlap, where multiple cameras can see the same markers.
  • The more markers a camera can detect, and the more cameras that can see the same markers, the more accurate your calibration will be.

Remember, the system first builds a map of marker positions, then determines camera positions based on this map. So, having markers visible to multiple cameras helps create a more accurate and interconnected calibration.

cameras_paint

  1. Initiate the calibration process by running the extrinsic_calibrator_node.

calibration_cameras_paint

  1. Wait for the algorithm to gather the transform of each marker from each camera. The algorithm will iteratively tell the user which marker transforms are finally reliable and which ones are still being verified.

calibration_debug

  1. The calibrator will provide tables with useful information, while the calibration is taking place.

markers_per_cam

marker_per_marker

  1. Finally, once the calibration is done, the frame of each marker and camera will be published in tf2.

tf_map

Launching the Calibrator

File truncated at 100 lines see the full file

CHANGELOG
No CHANGELOG found.

Wiki Tutorials

This package does not provide any links to tutorials in it's rosindex metadata. You can check on the ROS Wiki Tutorials page for the package.

Launch files

No launch files found

Messages

No message files found.

Services

No service files found

Plugins

No plugins found.

Recent questions tagged extrinsic_calibrator_core at Robotics Stack Exchange

No version for distro ardent showing humble. Known supported distros are highlighted in the buttons above.
Package symbol

extrinsic_calibrator_core package from extrinsic_calibrator repo

extrinsic_calibrator extrinsic_calibrator_core extrinsic_calibrator_examples

ROS Distro
humble

Package Summary

Tags No category tags.
Version 0.1.0
License AGPL-3.0-only
Build type AMENT_CMAKE
Use RECOMMENDED

Repository Summary

Checkout URI https://github.com/Ikerlan-KER/extrinsic_calibrator.git
VCS Type git
VCS Version humble
Last Updated 2025-03-17
Dev Status MAINTAINED
CI status No Continuous Integration
Released UNRELEASED
Tags No category tags.
Contributing Help Wanted (0)
Good First Issues (0)
Pull Requests to Review (0)

Package Description

ROS2 package designed to calibrate a set of cameras distributed throughout a room. The calibration is performed using ArUco markers scattered randomly in the environment. Each camera detects one or several ArUco markers within its field of view, and the algorithm reconstructs the positions of the markers to create a global map.

Additional Links

No additional links.

Maintainers

  • Josep Rueda Collell

Authors

No additional authors.

extrinsic_calibrator_core

Overview

extrinsic_calibrator_core is a ROS2 package designed to calibrate a set of cameras distributed throughout a room. The calibration is performed using ArUco markers scattered randomly in the environment. Each camera detects one or several ArUco markers within its field of view, and the algorithm reconstructs the positions of the markers to create a global map. The positions of the cameras are then computed and incorporated into this aforementioned map map.

The algorithm utilizes the OpenCV library to detect the markers and performs matrix transformations to compute the positions of both markers and cameras relative to each other.

Features

  • Extrinsically calibrate any number of cameras simultaneously.
  • Automatically build a global ArUco map using marker detection from multiple viewpoints.
  • Configurable ArUco marker properties and camera topics.
  • Includes an utility for generating printable ArUco markers.

Configuration

The package provides configuration options through YAML files.

ArUco Marker Parameters

You can customize the ArUco markers used in the calibration process by modifying the aruco_parameters.yaml file.

aruco_params:
  aruco_dict: # OpenCV marker dictionary
    type: string
    default_value: "DICT_6X6_250"
  marker_length: # Length of the marker side in meters
    type: double
    default_value: 0.26

Camera Topics Parameters

You can specify the topics for each camera in the camera_topics_parameters.yaml file. This setup is scalable to handle as many cameras as needed.

cameras_params:
  cam1:
    image_topic:
      type: string
      default_value: "/camera_1/image_raw"
    camera_info_topic:
      type: string
      default_value: "/camera_1/camera_info"
  cam2:
    image_topic:
      type: string
      default_value: "/camera_2/image_raw"
    camera_info_topic:
      type: string
      default_value: "/camera_2/camera_info"
  # cam3:
  #   image_topic:
  #     type: string
  #     default_value: "/camera_3/image_raw"
  #   camera_info_topic:
  #     type: string
  #     default_value: "/camera_3/camera_info"

Usage

  1. Place the ArUco marker with ID 0 where any camera can see it. This marker will serve as the reference point. The system will consider this marker’s position as the origin (0,0,0) of the global coordinate system, called map.

setup_paint

  1. Distribute the remaining ArUco markers around the room, ensuring they’re visible to different cameras.

For best results:

  • Try to have each camera see multiple markers.
  • Aim for overlap, where multiple cameras can see the same markers.
  • The more markers a camera can detect, and the more cameras that can see the same markers, the more accurate your calibration will be.

Remember, the system first builds a map of marker positions, then determines camera positions based on this map. So, having markers visible to multiple cameras helps create a more accurate and interconnected calibration.

cameras_paint

  1. Initiate the calibration process by running the extrinsic_calibrator_node.

calibration_cameras_paint

  1. Wait for the algorithm to gather the transform of each marker from each camera. The algorithm will iteratively tell the user which marker transforms are finally reliable and which ones are still being verified.

calibration_debug

  1. The calibrator will provide tables with useful information, while the calibration is taking place.

markers_per_cam

marker_per_marker

  1. Finally, once the calibration is done, the frame of each marker and camera will be published in tf2.

tf_map

Launching the Calibrator

File truncated at 100 lines see the full file

CHANGELOG
No CHANGELOG found.

Wiki Tutorials

This package does not provide any links to tutorials in it's rosindex metadata. You can check on the ROS Wiki Tutorials page for the package.

Launch files

No launch files found

Messages

No message files found.

Services

No service files found

Plugins

No plugins found.

Recent questions tagged extrinsic_calibrator_core at Robotics Stack Exchange

No version for distro bouncy showing humble. Known supported distros are highlighted in the buttons above.
Package symbol

extrinsic_calibrator_core package from extrinsic_calibrator repo

extrinsic_calibrator extrinsic_calibrator_core extrinsic_calibrator_examples

ROS Distro
humble

Package Summary

Tags No category tags.
Version 0.1.0
License AGPL-3.0-only
Build type AMENT_CMAKE
Use RECOMMENDED

Repository Summary

Checkout URI https://github.com/Ikerlan-KER/extrinsic_calibrator.git
VCS Type git
VCS Version humble
Last Updated 2025-03-17
Dev Status MAINTAINED
CI status No Continuous Integration
Released UNRELEASED
Tags No category tags.
Contributing Help Wanted (0)
Good First Issues (0)
Pull Requests to Review (0)

Package Description

ROS2 package designed to calibrate a set of cameras distributed throughout a room. The calibration is performed using ArUco markers scattered randomly in the environment. Each camera detects one or several ArUco markers within its field of view, and the algorithm reconstructs the positions of the markers to create a global map.

Additional Links

No additional links.

Maintainers

  • Josep Rueda Collell

Authors

No additional authors.

extrinsic_calibrator_core

Overview

extrinsic_calibrator_core is a ROS2 package designed to calibrate a set of cameras distributed throughout a room. The calibration is performed using ArUco markers scattered randomly in the environment. Each camera detects one or several ArUco markers within its field of view, and the algorithm reconstructs the positions of the markers to create a global map. The positions of the cameras are then computed and incorporated into this aforementioned map map.

The algorithm utilizes the OpenCV library to detect the markers and performs matrix transformations to compute the positions of both markers and cameras relative to each other.

Features

  • Extrinsically calibrate any number of cameras simultaneously.
  • Automatically build a global ArUco map using marker detection from multiple viewpoints.
  • Configurable ArUco marker properties and camera topics.
  • Includes an utility for generating printable ArUco markers.

Configuration

The package provides configuration options through YAML files.

ArUco Marker Parameters

You can customize the ArUco markers used in the calibration process by modifying the aruco_parameters.yaml file.

aruco_params:
  aruco_dict: # OpenCV marker dictionary
    type: string
    default_value: "DICT_6X6_250"
  marker_length: # Length of the marker side in meters
    type: double
    default_value: 0.26

Camera Topics Parameters

You can specify the topics for each camera in the camera_topics_parameters.yaml file. This setup is scalable to handle as many cameras as needed.

cameras_params:
  cam1:
    image_topic:
      type: string
      default_value: "/camera_1/image_raw"
    camera_info_topic:
      type: string
      default_value: "/camera_1/camera_info"
  cam2:
    image_topic:
      type: string
      default_value: "/camera_2/image_raw"
    camera_info_topic:
      type: string
      default_value: "/camera_2/camera_info"
  # cam3:
  #   image_topic:
  #     type: string
  #     default_value: "/camera_3/image_raw"
  #   camera_info_topic:
  #     type: string
  #     default_value: "/camera_3/camera_info"

Usage

  1. Place the ArUco marker with ID 0 where any camera can see it. This marker will serve as the reference point. The system will consider this marker’s position as the origin (0,0,0) of the global coordinate system, called map.

setup_paint

  1. Distribute the remaining ArUco markers around the room, ensuring they’re visible to different cameras.

For best results:

  • Try to have each camera see multiple markers.
  • Aim for overlap, where multiple cameras can see the same markers.
  • The more markers a camera can detect, and the more cameras that can see the same markers, the more accurate your calibration will be.

Remember, the system first builds a map of marker positions, then determines camera positions based on this map. So, having markers visible to multiple cameras helps create a more accurate and interconnected calibration.

cameras_paint

  1. Initiate the calibration process by running the extrinsic_calibrator_node.

calibration_cameras_paint

  1. Wait for the algorithm to gather the transform of each marker from each camera. The algorithm will iteratively tell the user which marker transforms are finally reliable and which ones are still being verified.

calibration_debug

  1. The calibrator will provide tables with useful information, while the calibration is taking place.

markers_per_cam

marker_per_marker

  1. Finally, once the calibration is done, the frame of each marker and camera will be published in tf2.

tf_map

Launching the Calibrator

File truncated at 100 lines see the full file

CHANGELOG
No CHANGELOG found.

Wiki Tutorials

This package does not provide any links to tutorials in it's rosindex metadata. You can check on the ROS Wiki Tutorials page for the package.

Launch files

No launch files found

Messages

No message files found.

Services

No service files found

Plugins

No plugins found.

Recent questions tagged extrinsic_calibrator_core at Robotics Stack Exchange

No version for distro crystal showing humble. Known supported distros are highlighted in the buttons above.
Package symbol

extrinsic_calibrator_core package from extrinsic_calibrator repo

extrinsic_calibrator extrinsic_calibrator_core extrinsic_calibrator_examples

ROS Distro
humble

Package Summary

Tags No category tags.
Version 0.1.0
License AGPL-3.0-only
Build type AMENT_CMAKE
Use RECOMMENDED

Repository Summary

Checkout URI https://github.com/Ikerlan-KER/extrinsic_calibrator.git
VCS Type git
VCS Version humble
Last Updated 2025-03-17
Dev Status MAINTAINED
CI status No Continuous Integration
Released UNRELEASED
Tags No category tags.
Contributing Help Wanted (0)
Good First Issues (0)
Pull Requests to Review (0)

Package Description

ROS2 package designed to calibrate a set of cameras distributed throughout a room. The calibration is performed using ArUco markers scattered randomly in the environment. Each camera detects one or several ArUco markers within its field of view, and the algorithm reconstructs the positions of the markers to create a global map.

Additional Links

No additional links.

Maintainers

  • Josep Rueda Collell

Authors

No additional authors.

extrinsic_calibrator_core

Overview

extrinsic_calibrator_core is a ROS2 package designed to calibrate a set of cameras distributed throughout a room. The calibration is performed using ArUco markers scattered randomly in the environment. Each camera detects one or several ArUco markers within its field of view, and the algorithm reconstructs the positions of the markers to create a global map. The positions of the cameras are then computed and incorporated into this aforementioned map map.

The algorithm utilizes the OpenCV library to detect the markers and performs matrix transformations to compute the positions of both markers and cameras relative to each other.

Features

  • Extrinsically calibrate any number of cameras simultaneously.
  • Automatically build a global ArUco map using marker detection from multiple viewpoints.
  • Configurable ArUco marker properties and camera topics.
  • Includes an utility for generating printable ArUco markers.

Configuration

The package provides configuration options through YAML files.

ArUco Marker Parameters

You can customize the ArUco markers used in the calibration process by modifying the aruco_parameters.yaml file.

aruco_params:
  aruco_dict: # OpenCV marker dictionary
    type: string
    default_value: "DICT_6X6_250"
  marker_length: # Length of the marker side in meters
    type: double
    default_value: 0.26

Camera Topics Parameters

You can specify the topics for each camera in the camera_topics_parameters.yaml file. This setup is scalable to handle as many cameras as needed.

cameras_params:
  cam1:
    image_topic:
      type: string
      default_value: "/camera_1/image_raw"
    camera_info_topic:
      type: string
      default_value: "/camera_1/camera_info"
  cam2:
    image_topic:
      type: string
      default_value: "/camera_2/image_raw"
    camera_info_topic:
      type: string
      default_value: "/camera_2/camera_info"
  # cam3:
  #   image_topic:
  #     type: string
  #     default_value: "/camera_3/image_raw"
  #   camera_info_topic:
  #     type: string
  #     default_value: "/camera_3/camera_info"

Usage

  1. Place the ArUco marker with ID 0 where any camera can see it. This marker will serve as the reference point. The system will consider this marker’s position as the origin (0,0,0) of the global coordinate system, called map.

setup_paint

  1. Distribute the remaining ArUco markers around the room, ensuring they’re visible to different cameras.

For best results:

  • Try to have each camera see multiple markers.
  • Aim for overlap, where multiple cameras can see the same markers.
  • The more markers a camera can detect, and the more cameras that can see the same markers, the more accurate your calibration will be.

Remember, the system first builds a map of marker positions, then determines camera positions based on this map. So, having markers visible to multiple cameras helps create a more accurate and interconnected calibration.

cameras_paint

  1. Initiate the calibration process by running the extrinsic_calibrator_node.

calibration_cameras_paint

  1. Wait for the algorithm to gather the transform of each marker from each camera. The algorithm will iteratively tell the user which marker transforms are finally reliable and which ones are still being verified.

calibration_debug

  1. The calibrator will provide tables with useful information, while the calibration is taking place.

markers_per_cam

marker_per_marker

  1. Finally, once the calibration is done, the frame of each marker and camera will be published in tf2.

tf_map

Launching the Calibrator

File truncated at 100 lines see the full file

CHANGELOG
No CHANGELOG found.

Wiki Tutorials

This package does not provide any links to tutorials in it's rosindex metadata. You can check on the ROS Wiki Tutorials page for the package.

Launch files

No launch files found

Messages

No message files found.

Services

No service files found

Plugins

No plugins found.

Recent questions tagged extrinsic_calibrator_core at Robotics Stack Exchange

No version for distro eloquent showing humble. Known supported distros are highlighted in the buttons above.
Package symbol

extrinsic_calibrator_core package from extrinsic_calibrator repo

extrinsic_calibrator extrinsic_calibrator_core extrinsic_calibrator_examples

ROS Distro
humble

Package Summary

Tags No category tags.
Version 0.1.0
License AGPL-3.0-only
Build type AMENT_CMAKE
Use RECOMMENDED

Repository Summary

Checkout URI https://github.com/Ikerlan-KER/extrinsic_calibrator.git
VCS Type git
VCS Version humble
Last Updated 2025-03-17
Dev Status MAINTAINED
CI status No Continuous Integration
Released UNRELEASED
Tags No category tags.
Contributing Help Wanted (0)
Good First Issues (0)
Pull Requests to Review (0)

Package Description

ROS2 package designed to calibrate a set of cameras distributed throughout a room. The calibration is performed using ArUco markers scattered randomly in the environment. Each camera detects one or several ArUco markers within its field of view, and the algorithm reconstructs the positions of the markers to create a global map.

Additional Links

No additional links.

Maintainers

  • Josep Rueda Collell

Authors

No additional authors.

extrinsic_calibrator_core

Overview

extrinsic_calibrator_core is a ROS2 package designed to calibrate a set of cameras distributed throughout a room. The calibration is performed using ArUco markers scattered randomly in the environment. Each camera detects one or several ArUco markers within its field of view, and the algorithm reconstructs the positions of the markers to create a global map. The positions of the cameras are then computed and incorporated into this aforementioned map map.

The algorithm utilizes the OpenCV library to detect the markers and performs matrix transformations to compute the positions of both markers and cameras relative to each other.

Features

  • Extrinsically calibrate any number of cameras simultaneously.
  • Automatically build a global ArUco map using marker detection from multiple viewpoints.
  • Configurable ArUco marker properties and camera topics.
  • Includes an utility for generating printable ArUco markers.

Configuration

The package provides configuration options through YAML files.

ArUco Marker Parameters

You can customize the ArUco markers used in the calibration process by modifying the aruco_parameters.yaml file.

aruco_params:
  aruco_dict: # OpenCV marker dictionary
    type: string
    default_value: "DICT_6X6_250"
  marker_length: # Length of the marker side in meters
    type: double
    default_value: 0.26

Camera Topics Parameters

You can specify the topics for each camera in the camera_topics_parameters.yaml file. This setup is scalable to handle as many cameras as needed.

cameras_params:
  cam1:
    image_topic:
      type: string
      default_value: "/camera_1/image_raw"
    camera_info_topic:
      type: string
      default_value: "/camera_1/camera_info"
  cam2:
    image_topic:
      type: string
      default_value: "/camera_2/image_raw"
    camera_info_topic:
      type: string
      default_value: "/camera_2/camera_info"
  # cam3:
  #   image_topic:
  #     type: string
  #     default_value: "/camera_3/image_raw"
  #   camera_info_topic:
  #     type: string
  #     default_value: "/camera_3/camera_info"

Usage

  1. Place the ArUco marker with ID 0 where any camera can see it. This marker will serve as the reference point. The system will consider this marker’s position as the origin (0,0,0) of the global coordinate system, called map.

setup_paint

  1. Distribute the remaining ArUco markers around the room, ensuring they’re visible to different cameras.

For best results:

  • Try to have each camera see multiple markers.
  • Aim for overlap, where multiple cameras can see the same markers.
  • The more markers a camera can detect, and the more cameras that can see the same markers, the more accurate your calibration will be.

Remember, the system first builds a map of marker positions, then determines camera positions based on this map. So, having markers visible to multiple cameras helps create a more accurate and interconnected calibration.

cameras_paint

  1. Initiate the calibration process by running the extrinsic_calibrator_node.

calibration_cameras_paint

  1. Wait for the algorithm to gather the transform of each marker from each camera. The algorithm will iteratively tell the user which marker transforms are finally reliable and which ones are still being verified.

calibration_debug

  1. The calibrator will provide tables with useful information, while the calibration is taking place.

markers_per_cam

marker_per_marker

  1. Finally, once the calibration is done, the frame of each marker and camera will be published in tf2.

tf_map

Launching the Calibrator

File truncated at 100 lines see the full file

CHANGELOG
No CHANGELOG found.

Wiki Tutorials

This package does not provide any links to tutorials in it's rosindex metadata. You can check on the ROS Wiki Tutorials page for the package.

Launch files

No launch files found

Messages

No message files found.

Services

No service files found

Plugins

No plugins found.

Recent questions tagged extrinsic_calibrator_core at Robotics Stack Exchange

No version for distro dashing showing humble. Known supported distros are highlighted in the buttons above.
Package symbol

extrinsic_calibrator_core package from extrinsic_calibrator repo

extrinsic_calibrator extrinsic_calibrator_core extrinsic_calibrator_examples

ROS Distro
humble

Package Summary

Tags No category tags.
Version 0.1.0
License AGPL-3.0-only
Build type AMENT_CMAKE
Use RECOMMENDED

Repository Summary

Checkout URI https://github.com/Ikerlan-KER/extrinsic_calibrator.git
VCS Type git
VCS Version humble
Last Updated 2025-03-17
Dev Status MAINTAINED
CI status No Continuous Integration
Released UNRELEASED
Tags No category tags.
Contributing Help Wanted (0)
Good First Issues (0)
Pull Requests to Review (0)

Package Description

ROS2 package designed to calibrate a set of cameras distributed throughout a room. The calibration is performed using ArUco markers scattered randomly in the environment. Each camera detects one or several ArUco markers within its field of view, and the algorithm reconstructs the positions of the markers to create a global map.

Additional Links

No additional links.

Maintainers

  • Josep Rueda Collell

Authors

No additional authors.

extrinsic_calibrator_core

Overview

extrinsic_calibrator_core is a ROS2 package designed to calibrate a set of cameras distributed throughout a room. The calibration is performed using ArUco markers scattered randomly in the environment. Each camera detects one or several ArUco markers within its field of view, and the algorithm reconstructs the positions of the markers to create a global map. The positions of the cameras are then computed and incorporated into this aforementioned map map.

The algorithm utilizes the OpenCV library to detect the markers and performs matrix transformations to compute the positions of both markers and cameras relative to each other.

Features

  • Extrinsically calibrate any number of cameras simultaneously.
  • Automatically build a global ArUco map using marker detection from multiple viewpoints.
  • Configurable ArUco marker properties and camera topics.
  • Includes an utility for generating printable ArUco markers.

Configuration

The package provides configuration options through YAML files.

ArUco Marker Parameters

You can customize the ArUco markers used in the calibration process by modifying the aruco_parameters.yaml file.

aruco_params:
  aruco_dict: # OpenCV marker dictionary
    type: string
    default_value: "DICT_6X6_250"
  marker_length: # Length of the marker side in meters
    type: double
    default_value: 0.26

Camera Topics Parameters

You can specify the topics for each camera in the camera_topics_parameters.yaml file. This setup is scalable to handle as many cameras as needed.

cameras_params:
  cam1:
    image_topic:
      type: string
      default_value: "/camera_1/image_raw"
    camera_info_topic:
      type: string
      default_value: "/camera_1/camera_info"
  cam2:
    image_topic:
      type: string
      default_value: "/camera_2/image_raw"
    camera_info_topic:
      type: string
      default_value: "/camera_2/camera_info"
  # cam3:
  #   image_topic:
  #     type: string
  #     default_value: "/camera_3/image_raw"
  #   camera_info_topic:
  #     type: string
  #     default_value: "/camera_3/camera_info"

Usage

  1. Place the ArUco marker with ID 0 where any camera can see it. This marker will serve as the reference point. The system will consider this marker’s position as the origin (0,0,0) of the global coordinate system, called map.

setup_paint

  1. Distribute the remaining ArUco markers around the room, ensuring they’re visible to different cameras.

For best results:

  • Try to have each camera see multiple markers.
  • Aim for overlap, where multiple cameras can see the same markers.
  • The more markers a camera can detect, and the more cameras that can see the same markers, the more accurate your calibration will be.

Remember, the system first builds a map of marker positions, then determines camera positions based on this map. So, having markers visible to multiple cameras helps create a more accurate and interconnected calibration.

cameras_paint

  1. Initiate the calibration process by running the extrinsic_calibrator_node.

calibration_cameras_paint

  1. Wait for the algorithm to gather the transform of each marker from each camera. The algorithm will iteratively tell the user which marker transforms are finally reliable and which ones are still being verified.

calibration_debug

  1. The calibrator will provide tables with useful information, while the calibration is taking place.

markers_per_cam

marker_per_marker

  1. Finally, once the calibration is done, the frame of each marker and camera will be published in tf2.

tf_map

Launching the Calibrator

File truncated at 100 lines see the full file

CHANGELOG
No CHANGELOG found.

Wiki Tutorials

This package does not provide any links to tutorials in it's rosindex metadata. You can check on the ROS Wiki Tutorials page for the package.

Launch files

No launch files found

Messages

No message files found.

Services

No service files found

Plugins

No plugins found.

Recent questions tagged extrinsic_calibrator_core at Robotics Stack Exchange

No version for distro galactic showing humble. Known supported distros are highlighted in the buttons above.
Package symbol

extrinsic_calibrator_core package from extrinsic_calibrator repo

extrinsic_calibrator extrinsic_calibrator_core extrinsic_calibrator_examples

ROS Distro
humble

Package Summary

Tags No category tags.
Version 0.1.0
License AGPL-3.0-only
Build type AMENT_CMAKE
Use RECOMMENDED

Repository Summary

Checkout URI https://github.com/Ikerlan-KER/extrinsic_calibrator.git
VCS Type git
VCS Version humble
Last Updated 2025-03-17
Dev Status MAINTAINED
CI status No Continuous Integration
Released UNRELEASED
Tags No category tags.
Contributing Help Wanted (0)
Good First Issues (0)
Pull Requests to Review (0)

Package Description

ROS2 package designed to calibrate a set of cameras distributed throughout a room. The calibration is performed using ArUco markers scattered randomly in the environment. Each camera detects one or several ArUco markers within its field of view, and the algorithm reconstructs the positions of the markers to create a global map.

Additional Links

No additional links.

Maintainers

  • Josep Rueda Collell

Authors

No additional authors.

extrinsic_calibrator_core

Overview

extrinsic_calibrator_core is a ROS2 package designed to calibrate a set of cameras distributed throughout a room. The calibration is performed using ArUco markers scattered randomly in the environment. Each camera detects one or several ArUco markers within its field of view, and the algorithm reconstructs the positions of the markers to create a global map. The positions of the cameras are then computed and incorporated into this aforementioned map map.

The algorithm utilizes the OpenCV library to detect the markers and performs matrix transformations to compute the positions of both markers and cameras relative to each other.

Features

  • Extrinsically calibrate any number of cameras simultaneously.
  • Automatically build a global ArUco map using marker detection from multiple viewpoints.
  • Configurable ArUco marker properties and camera topics.
  • Includes an utility for generating printable ArUco markers.

Configuration

The package provides configuration options through YAML files.

ArUco Marker Parameters

You can customize the ArUco markers used in the calibration process by modifying the aruco_parameters.yaml file.

aruco_params:
  aruco_dict: # OpenCV marker dictionary
    type: string
    default_value: "DICT_6X6_250"
  marker_length: # Length of the marker side in meters
    type: double
    default_value: 0.26

Camera Topics Parameters

You can specify the topics for each camera in the camera_topics_parameters.yaml file. This setup is scalable to handle as many cameras as needed.

cameras_params:
  cam1:
    image_topic:
      type: string
      default_value: "/camera_1/image_raw"
    camera_info_topic:
      type: string
      default_value: "/camera_1/camera_info"
  cam2:
    image_topic:
      type: string
      default_value: "/camera_2/image_raw"
    camera_info_topic:
      type: string
      default_value: "/camera_2/camera_info"
  # cam3:
  #   image_topic:
  #     type: string
  #     default_value: "/camera_3/image_raw"
  #   camera_info_topic:
  #     type: string
  #     default_value: "/camera_3/camera_info"

Usage

  1. Place the ArUco marker with ID 0 where any camera can see it. This marker will serve as the reference point. The system will consider this marker’s position as the origin (0,0,0) of the global coordinate system, called map.

setup_paint

  1. Distribute the remaining ArUco markers around the room, ensuring they’re visible to different cameras.

For best results:

  • Try to have each camera see multiple markers.
  • Aim for overlap, where multiple cameras can see the same markers.
  • The more markers a camera can detect, and the more cameras that can see the same markers, the more accurate your calibration will be.

Remember, the system first builds a map of marker positions, then determines camera positions based on this map. So, having markers visible to multiple cameras helps create a more accurate and interconnected calibration.

cameras_paint

  1. Initiate the calibration process by running the extrinsic_calibrator_node.

calibration_cameras_paint

  1. Wait for the algorithm to gather the transform of each marker from each camera. The algorithm will iteratively tell the user which marker transforms are finally reliable and which ones are still being verified.

calibration_debug

  1. The calibrator will provide tables with useful information, while the calibration is taking place.

markers_per_cam

marker_per_marker

  1. Finally, once the calibration is done, the frame of each marker and camera will be published in tf2.

tf_map

Launching the Calibrator

File truncated at 100 lines see the full file

CHANGELOG
No CHANGELOG found.

Wiki Tutorials

This package does not provide any links to tutorials in it's rosindex metadata. You can check on the ROS Wiki Tutorials page for the package.

Launch files

No launch files found

Messages

No message files found.

Services

No service files found

Plugins

No plugins found.

Recent questions tagged extrinsic_calibrator_core at Robotics Stack Exchange

No version for distro foxy showing humble. Known supported distros are highlighted in the buttons above.
Package symbol

extrinsic_calibrator_core package from extrinsic_calibrator repo

extrinsic_calibrator extrinsic_calibrator_core extrinsic_calibrator_examples

ROS Distro
humble

Package Summary

Tags No category tags.
Version 0.1.0
License AGPL-3.0-only
Build type AMENT_CMAKE
Use RECOMMENDED

Repository Summary

Checkout URI https://github.com/Ikerlan-KER/extrinsic_calibrator.git
VCS Type git
VCS Version humble
Last Updated 2025-03-17
Dev Status MAINTAINED
CI status No Continuous Integration
Released UNRELEASED
Tags No category tags.
Contributing Help Wanted (0)
Good First Issues (0)
Pull Requests to Review (0)

Package Description

ROS2 package designed to calibrate a set of cameras distributed throughout a room. The calibration is performed using ArUco markers scattered randomly in the environment. Each camera detects one or several ArUco markers within its field of view, and the algorithm reconstructs the positions of the markers to create a global map.

Additional Links

No additional links.

Maintainers

  • Josep Rueda Collell

Authors

No additional authors.

extrinsic_calibrator_core

Overview

extrinsic_calibrator_core is a ROS2 package designed to calibrate a set of cameras distributed throughout a room. The calibration is performed using ArUco markers scattered randomly in the environment. Each camera detects one or several ArUco markers within its field of view, and the algorithm reconstructs the positions of the markers to create a global map. The positions of the cameras are then computed and incorporated into this aforementioned map map.

The algorithm utilizes the OpenCV library to detect the markers and performs matrix transformations to compute the positions of both markers and cameras relative to each other.

Features

  • Extrinsically calibrate any number of cameras simultaneously.
  • Automatically build a global ArUco map using marker detection from multiple viewpoints.
  • Configurable ArUco marker properties and camera topics.
  • Includes an utility for generating printable ArUco markers.

Configuration

The package provides configuration options through YAML files.

ArUco Marker Parameters

You can customize the ArUco markers used in the calibration process by modifying the aruco_parameters.yaml file.

aruco_params:
  aruco_dict: # OpenCV marker dictionary
    type: string
    default_value: "DICT_6X6_250"
  marker_length: # Length of the marker side in meters
    type: double
    default_value: 0.26

Camera Topics Parameters

You can specify the topics for each camera in the camera_topics_parameters.yaml file. This setup is scalable to handle as many cameras as needed.

cameras_params:
  cam1:
    image_topic:
      type: string
      default_value: "/camera_1/image_raw"
    camera_info_topic:
      type: string
      default_value: "/camera_1/camera_info"
  cam2:
    image_topic:
      type: string
      default_value: "/camera_2/image_raw"
    camera_info_topic:
      type: string
      default_value: "/camera_2/camera_info"
  # cam3:
  #   image_topic:
  #     type: string
  #     default_value: "/camera_3/image_raw"
  #   camera_info_topic:
  #     type: string
  #     default_value: "/camera_3/camera_info"

Usage

  1. Place the ArUco marker with ID 0 where any camera can see it. This marker will serve as the reference point. The system will consider this marker’s position as the origin (0,0,0) of the global coordinate system, called map.

setup_paint

  1. Distribute the remaining ArUco markers around the room, ensuring they’re visible to different cameras.

For best results:

  • Try to have each camera see multiple markers.
  • Aim for overlap, where multiple cameras can see the same markers.
  • The more markers a camera can detect, and the more cameras that can see the same markers, the more accurate your calibration will be.

Remember, the system first builds a map of marker positions, then determines camera positions based on this map. So, having markers visible to multiple cameras helps create a more accurate and interconnected calibration.

cameras_paint

  1. Initiate the calibration process by running the extrinsic_calibrator_node.

calibration_cameras_paint

  1. Wait for the algorithm to gather the transform of each marker from each camera. The algorithm will iteratively tell the user which marker transforms are finally reliable and which ones are still being verified.

[Figure: calibration_debug]

  5. While the calibration is taking place, the calibrator prints tables with useful information.

[Figure: markers_per_cam]

[Figure: marker_per_marker]

  6. Finally, once the calibration is done, the frame of each marker and camera is published to tf2 (see the lookup sketch below).

[Figure: tf_map]
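
As a minimal rclpy sketch of how the published result can be consumed, the node below reads a calibrated camera pose back out of tf2. The frame names 'map' and 'cam1' are assumptions based on this README, not verified against the package; check the real frame names in the tf tree (e.g. with rviz2).

import rclpy
from rclpy.node import Node
from tf2_ros import Buffer, TransformListener


class CameraPoseReader(Node):
    def __init__(self):
        super().__init__('camera_pose_reader')
        self.tf_buffer = Buffer()
        self.tf_listener = TransformListener(self.tf_buffer, self)
        self.timer = self.create_timer(1.0, self.lookup)

    def lookup(self):
        try:
            # Target and source frame names are assumed; inspect the tf tree for the real ones.
            t = self.tf_buffer.lookup_transform('map', 'cam1', rclpy.time.Time())
            p = t.transform.translation
            self.get_logger().info(f'cam1 in map: ({p.x:.3f}, {p.y:.3f}, {p.z:.3f})')
        except Exception as exc:
            self.get_logger().warn(f'transform not yet available: {exc}')


def main():
    rclpy.init()
    rclpy.spin(CameraPoseReader())


if __name__ == '__main__':
    main()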

Launching the Calibrator

File truncated at 100 lines; see the full file.
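
Since the launch instructions themselves fall in the truncated part of the page, the snippet below is only a hypothetical sketch of a minimal ROS2 Python launch file for the node named in the Usage section; the package's real launch file may differ in name, arguments and parameter handling.

from launch import LaunchDescription
from launch_ros.actions import Node


def generate_launch_description():
    # Hypothetical sketch: the package ships its own launch file whose exact name,
    # arguments and parameter wiring are not shown on this truncated page.
    return LaunchDescription([
        Node(
            package='extrinsic_calibrator_core',
            executable='extrinsic_calibrator_node',  # node named in the Usage section
            output='screen',
        ),
    ])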

CHANGELOG
No CHANGELOG found.

Wiki Tutorials

This package does not provide any links to tutorials in its rosindex metadata. You can check the ROS Wiki Tutorials page for the package.

Launch files

No launch files found.

Messages

No message files found.

Services

No service files found.

Plugins

No plugins found.
