No version exists for the humble, jazzy, kilted or rolling distros; showing melodic instead. Known supported distros are highlighted in the buttons above.

robot_body_filter package from robot_body_filter repo

robot_body_filter

ROS Distro
melodic

Package Summary

Tags No category tags.
Version 1.3.2
License BSD
Build type CATKIN
Use RECOMMENDED

Repository Summary

Checkout URI https://github.com/peci1/robot_body_filter.git
VCS Type git
VCS Version master
Last Updated 2024-02-03
Dev Status DEVELOPED
CI status
Released RELEASED
Tags No category tags.
Contributing Help Wanted (0)
Good First Issues (0)
Pull Requests to Review (0)

Package Description

Filters the robot's body out of laser scans or point clouds.

Additional Links

Maintainers

  • Martin Pecka

Authors

  • Eitan Marder-Eppstein
  • Tomas Petricek
  • Martin Pecka

robot_body_filter

Filters the robot’s body out of point clouds and laser scans.

Tutorial

Check out the webinar recording, where many of this filter's options are explained and demonstrated: https://www.youtube.com/watch?v=j0ljV0uZy3Q

Changes vs PR2/robot_self_filter:

  • Now the package is a normal filters::FilterBase filter and not a standalone node.
  • Using both containment and ray-tracing tests.
  • Using all collision elements for each link instead of only the first one.
  • Enabling generic point type, removing PCL dependency and unnecessary params.
  • Using bodies.h and shapes.h from geometric_shapes.
  • As a by-product, the filter can compute robot’s bounding box or sphere.

Build Status

Development versions: GitHub Actions: Dev melodic, Dev noetic

Release jobs (Melodic): Bin melodic-amd64, Bin melodic-arm64, Bin melodic-armhf

Release jobs (Noetic): Bin noetic focal-amd64, Bin noetic focal-arm64, Bin noetic focal-armhf

Basic Operation

filters::FilterBase API

The core of this filter is implemented via the filters::FilterBase API for the sensor_msgs::LaserScan and sensor_msgs::PointCloud2 types. This means you can load this filter into a FilterChain alongside other filters as usual. Unlike standard filters, this one can also publish several useful topics and subscribes to TF.
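For instance, the filter can be loaded into a laser scan filter chain via the standard filters YAML configuration. This is a minimal sketch: the plugin type name and parameter keys follow the package's conventions, but the chain name, frame names and values here are placeholders you must adapt to your robot (and verify against your installed version):

```yaml
scan_filter_chain:
  - name: body_filter
    # PointCloud2 variant: robot_body_filter/RobotBodyFilterPointCloud2
    type: robot_body_filter/RobotBodyFilterLaserScan
    params:
      frames/sensor: "laser"         # placeholder frame names
      frames/filtering: "base_link"
      filter/do_clipping: true
      filter/do_contains_test: true
      filter/do_shadow_test: true
```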

General overview

This filter reads the robot model and the filter config, subscribes to TF, waits for data (laser scans or point clouds) and then cleans them of various artifacts (this is called data filtering).

It can perform three kinds of data filtering: clipping the data based on the provided sensor limits (parameter filter/do_clipping), removing points that lie inside or on the surface of the robot body (parameter filter/do_contains_test), and removing points that are seen through a part of the robot body (parameter filter/do_shadow_test). These are referred to below as "clipping", "contains test" and "shadow test".

When working with point clouds, the filter automatically recognizes whether the clouds are organized or non-organized. In organized clouds, it marks the filtered-out points as NaN; in non-organized clouds, it removes them. In laser scans, removal is not an option, so the filtered-out points are marked with NaN (some guides suggest using max_range + 1 to mark invalid points, but this filter uses NaN as the safer option).
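The output policy described above can be sketched in plain Python (an illustration of the behavior, not the filter's actual implementation):

```python
import math

def apply_mask(points, keep, organized):
    """Apply a keep-mask the way the filter treats the two cloud kinds:
    organized clouds keep their shape and get NaN points, while
    non-organized clouds have the filtered-out points removed."""
    if organized:
        nan_point = [float("nan")] * 3
        return [p if k else nan_point for p, k in zip(points, keep)]
    return [p for p, k in zip(points, keep) if k]

cloud = [[0.1, 0.0, 0.0],  # point on the robot body -> filtered out
         [4.0, 1.0, 0.5]]  # free-space point -> kept
keep = [False, True]

organized_out = apply_mask(cloud, keep, organized=True)   # same size, NaNs inserted
compact_out = apply_mask(cloud, keep, organized=False)    # cloud shrinks
```

Keeping organized clouds at full size matters because consumers rely on their row/column structure; compacting is only safe for unordered clouds.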

Performance tips

In general, the filter is computationally expensive: clipping is fast, the contains test is moderately CPU-intensive, and the shadow test is the most expensive part, because it essentially performs ray tracing. You can limit the required CPU power by restricting the filter to the parts that matter. E.g. if the robot has a link that can never be seen by the sensor, put it in the list of ignored links. The fewer links are processed, the better the performance. If you are only interested in removing a few links, consider using the only_links parameter.

To speed up shadow filtering, you can set filter/max_shadow_distance, which limits the shadow test to points close to the sensor. Setting this to e.g. three times the diameter of the robot should remove all of the shadow points caused by refraction by a part of the robot body, but you have to verify this with real data.

Performance also depends strongly on the representation of the robot model. The filter reads the <collision> tags from the robot's URDF. You can use boxes, spheres and cylinders (which are fast to process), or convex meshes (which perform much worse). If you pass a non-convex mesh, its convex hull is used for the tests. Don't forget that each link can have multiple <collision> tags. If you do not have time to convert your meshes to basic shapes, at least try to reduce the number of triangles in your meshes. You can still use your high-quality meshes in <visual> tags. To simplify your model to primitive shapes, you can either edit the URDF manually or utilize ColliderGen.
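For example, a link can keep its detailed mesh for visualization while giving the filter a cheap primitive to test against (a hypothetical link; the names, mesh path and dimensions are placeholders):

```xml
<link name="chassis">
  <visual>
    <geometry>
      <mesh filename="package://my_robot/meshes/chassis.dae"/>
    </geometry>
  </visual>
  <!-- Cheap-to-test box used by the filter instead of the detailed mesh -->
  <collision>
    <geometry>
      <box size="0.6 0.4 0.2"/>
    </geometry>
  </collision>
</link>
```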

Model inflation

You can utilize the built-in model inflation mechanism to slightly alter the size of the model. You will probably want to add a bit of "margin" to the contains and shadow tests so that points a few millimeters outside the robot body still get removed. You can set a default scale and padding, which are used for all collisions. Different inflation can be used for contains tests and for shadow tests, and inflation can also be specified per link. See the body_model/inflation/* parameters for details.
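The two inflation modes can be illustrated as follows (a conceptual sketch, assuming scale is applied multiplicatively and padding as an absolute additive margin; this is not the package's geometry code):

```python
def inflate(dimension, scale=1.0, padding=0.0):
    """Inflate a single shape dimension (e.g. a sphere radius):
    scale is a multiplicative factor, padding an absolute margin."""
    return dimension * scale + padding

r = 0.5                               # original sphere radius in meters
r_scaled = inflate(r, scale=1.1)      # 10 % larger
r_padded = inflate(r, padding=0.05)   # 5 cm margin added
```

Note that a fixed padding inflates small and large links by the same absolute amount, whereas a scale factor grows large links more than small ones.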

Scaling means multiplying the shape dimensions by the given

File truncated at 100 lines; see the full file.

CHANGELOG

Changelog for package robot_body_filter

1.3.2 (2024-01-22)

  • Do not segfault when a mesh resource is not found.
  • Added full_example.
  • Contributors: Martin Pecka

1.3.1 (2023-07-14)

  • Explicitly specify minimum versions of required libraries.
  • Contributors: Martin Pecka

1.3.0 (2023-04-12)

  • Removed local implementation of oriented bounding boxes in favor of geometric_shapes/obb. This change breaks API and ABI, but I hope nobody explicitly used the OBB part of this library. This change requires geometric_shapes version 0.6.6+ (Melodic) or 0.7.5+ (Noetic) (released April 2023).
  • Add example launch & config for ease of use. Thanks Doan Nguyen for the contribution!
  • Changed xmlrpc_traits variables to constexpr static instead of inline static to decrease the required C++ language standard for this part. Changed stringType from std::string to const char*.
  • Improved xmlrpc_traits to recognize more types of valid structures.
  • Make filter_utils FilterBase::getParamVerbose() methods const. Allowed by https://github.com/ros/filters/pull/35 (released in Melodic filters 1.8.2 (October 2021) and Noetic filters 1.9.1 (September 2021)).
  • Contributors: Doan Nguyen, Martin Pecka

1.2.2 (2021-08-25)

  • Change ROS_WARN to ROS_INFO when loading a value of an undefined parameter
  • Add link to the youtube tutorial
  • Contributors: Martin Pecka

1.2.1 (2021-08-06)

  • Merge pull request #15 from universal-field-robots/master TFFramesWatchdog and RayCastingShapeMask need to be installed in the CMakeLists.txt
  • Added RayCastingShapeMask and TFFramesWatchdog to install targets in cmake
  • Contributors: Josh Owen, Martin Pecka

1.2.0 (2021-07-30)

  • Merge pull request #11 from peci1/correct-pointcloud-transforms Add possibility to specify pointcloud channels that should be transformed together with positional data.
  • Merge pull request #14 from peci1/per_link_scaling Add support for scaling/padding each link differently
  • Short-circuit classification of NaN points.
  • Warn about missing collision elements only for non-ignored links and only if they have at least one visual.
  • Fixed bounding shapes computation
  • Added filter/max_shadow_distance for great performance increase
  • Added possibility to scale/pad collisions separately for computation of bounding box and sphere.
  • Unique-ify test target name to allow building with geometric_shapes.
  • Use correct Eigen allocator.
  • Reflected the newly added support for non-uniformly scaled meshes.
  • Contributors: Martin Pecka

1.1.9 (2021-04-17)

  • Compatibility with Noetic.
  • Contributors: Martin Pecka

1.1.8 (2020-04-06)

  • Fixed typo.
  • Contributors: Martin Pecka

1.1.7 (2020-04-05)

  • When sensor frame is empty (autodetected), do not add it as initial monitored frame to watchdog.
  • Added configuration examples.
  • Make sure use_sim_time is set to false for tests.
  • Fixed computation of pointcloud without local bounding box.
  • Added tests for RobotBodyFilter.
  • No longer require the index pointcloud field in computeMask(), as it is not used anywhere.
  • Surprise! CropBox::setTransform() doesn't set transform of the cropbox, but instead a transform of the points. Fixed that.
  • Fixed copy-paste bug in bounding sphere publisher.
  • Fix incorrect usage of fixed frame at many places (substituted with filtering frame).
  • Make sure orientation.w = 1 is set in all published markers.
  • Correctly exclude the ignored bounding shapes from the published markers and shapes.

File truncated at 100 lines; see the full file.

Wiki Tutorials

This package does not provide any links to tutorials in its rosindex metadata. You can check the ROS Wiki Tutorials page for the package.

Launch files

Services

No service files found

Plugins

Recent questions tagged robot_body_filter at Robotics Stack Exchange

No version for distro jazzy showing melodic. Known supported distros are highlighted in the buttons above.
Package symbol

robot_body_filter package from robot_body_filter repo

robot_body_filter

ROS Distro
melodic

Package Summary

Tags No category tags.
Version 1.3.2
License BSD
Build type CATKIN
Use RECOMMENDED

Repository Summary

Checkout URI https://github.com/peci1/robot_body_filter.git
VCS Type git
VCS Version master
Last Updated 2024-02-03
Dev Status DEVELOPED
CI status
Released RELEASED
Tags No category tags.
Contributing Help Wanted (0)
Good First Issues (0)
Pull Requests to Review (0)

Package Description

Filters the robot's body out of laser scans or point clouds.

Additional Links

Maintainers

  • Martin Pecka

Authors

  • Eitan Marder-Eppstein
  • Tomas Petricek
  • Martin Pecka

robot_body_filter

Filters the robot’s body out of point clouds and laser scans.

Tutorial

Check out the webinar recording where a lot of options for this filter are explained and demonstrated! https://www.youtube.com/watch?v=j0ljV0uZy3Q

Changes vs PR2/robot_self_filter:

  • Now the package is a normal filters::FilterBase filter and not a standalone node.
  • Using both containment and ray-tracing tests.
  • Using all collision elements for each link instead of only the first one.
  • Enabling generic point type, removing PCL dependency and unnecessary params.
  • Using bodies.h and shapes.h from geometric_shapes.
  • As a by-product, the filter can compute robot’s bounding box or sphere.

Build Status

Development versions: Github Actions Dev melodic Dev noetic

Release jobs Melodic Melodic version: Bin melodic-amd64 Bin melodic-arm64 Bin melodic-armhf

Release jobs Noetic Noetic version: Bin noetic focal-amd64 Bin noetic focal-arm64 Bin noetic focal-armhf

Basic Operation

filters::FilterBase API

The basic workings of this filter are done via the filters::FilterBase API implemented for sensor_msgs::LaserScan and sensor_msgs::PointCloud2 types. This means you can load this filter into a FilterChain along other filters as usual. Different from the standard filters, this one can also publish several interesting topics and subscribes to TF.

General overview

This filter reads robot model and the filter config, subscribes to TF, waits for data (laserscans or point clouds) and then cleans them from various artifacts (this is called data filtering).

It can perform 3 kinds of data filters: clip the data based on the provided sensor limits (parameter filter/do_clipping), remove points that are inside or on the surface of the robot body (parameter filter/do_contains_test) and remove points that are seen through a part of the robot body (parameter filter/do_shadow_test). These kinds of tests are further referenced as “clipping”, “contains test” and “shadow test”.

If working with point clouds, the filter automatically recognizes whether it works with organized or non-organized clouds. In organized clouds, it marks the filtered-out points as NaN. In non-organized clouds, it removes the filtered-out points. In laserscans, removal is not an option, so the filtered-out points are marked with NaN (some guides suggest that max_range + 1 should be used for marking invalid points, but this filter uses NaN as a safer option).

Performance tips

In general, the filter will be computationally expensive (clipping is fast, contains test is medium CPU intensive and shadow test is the most expensive part, because it basically performs raytracing). You can limit the required CPU power by limiting the filter only to parts that matter. E.g. if the robot has a link that can never be seen by the sensor, put it in the list of ignored links. The less links are processed, the better performance. If you’re only interested in removing a few links, consider using the only_links parameter.

To speed up shadow filtering, you can set filter/max_shadow_distance, which limits the number of points considered for shadow tests just to points close to the sensor. Setting this to e.g. three times the diameter of the robot should remove all of the shadow points caused by refraction by a part of the robot body. But you have to test this with real data.

Performance also strongly depends on representation of the robot model. The filter reads <collision> tags from the robot URDF. You can use boxes, spheres and cylinders (which are fast to process), or you can use convex meshes (these are much worse performance-wise). If you pass a non-convex mesh, its convex hull will be used for the tests. Don’t forget that each link can have multiple <collision> tags. If you do not have time to convert your meshes to the basic shapes, try to at least reduce the number of triangles in your meshes. You can use your high-quality meshes in <visual> tags. To simplify your model to primitive shapes, you can either manually edit the URDF, or you can utilize ColliderGen.

Model inflation

You can utilize the builtin model inflation mechanism to slightly alter the size of the model. You will probably want to add a bit “margin” to the contains and shadow tests so that points that are millimeters outside the robot body will anyways get removed. You can set a default scale and padding which are used for all collisions. Different inflation can be used for contains tests and for shadow tests. Inflation can also be specified differently for each link. Look at the body_model/inflation/* parameters for details.

Scaling means multiplying the shape dimensions by the given

File truncated at 100 lines see the full file

CHANGELOG

Changelog for package robot_body_filter

1.3.2 (2024-01-22)

  • Do not segfault when a mesh resource is not found.
  • Added full_example.
  • Contributors: Martin Pecka

1.3.1 (2023-07-14)

  • Explicitly specify minimum versions of required libraries.
  • Contributors: Martin Pecka

1.3.0 (2023-04-12)

  • Removed local implementation of oriented bounding boxes in favor of geometric_shapes/obb. This change breaks API and ABI, but I hope nobody explicitly used the OBB part of this library. This change requires geometric_shapes version 0.6.6+ (Melodic) or 0.7.5+ (Noetic) (released April 2023).
  • Add example launch & config for ease of use. Thanks Doan Nguyen for the contribution!
  • Changed xmlrpc_traits variables to constexpr static instead of inline static to decrease the required C++ language standard for this part. Changed stringType from std::string to const char*.
  • Improved xmlrpc_traits to recognize more types of valid structures.
  • Make filter_utils FilterBase::getParamVerbose() methods const. Allowed by https://github.com/ros/filters/pull/35 (released in Melodic filters 1.8.2 (October 2021) and Noetic filters 1.9.1 (September 2021)).
  • Contributors: Doan Nguyen, Martin Pecka

1.2.2 (2021-08-25)

  • Change ROS_WARN to ROS_INFO when loading a value of an undefined parameter
  • Add link to the youtube tutorial
  • Contributors: Martin Pecka

1.2.1 (2021-08-06)

  • Merge pull request #15 from universal-field-robots/master TFFramesWatchdog and RayCastingShapeMask need to be installed in the CMakeLists.txt
  • Added RayCastingShapeMask and TFFramesWatchdog to install targets in cmake
  • Contributors: Josh Owen, Martin Pecka

1.2.0 (2021-07-30)

  • Merge pull request #11 from peci1/correct-pointcloud-transforms Add possibility to specify pointcloud channels that should be transformed together with positional data.
  • Merge pull request #14 from peci1/per_link_scaling Add support for scaling/padding each link differently
  • Short-circuit classification of NaN points.
  • Warn about missing collision elements only for non-ignored links and only if they have at least one visual.
  • Fixed bounding shapes computation
  • Added filter/max_shadow_distance for great performance increase
  • Added possibility to scale/pad collisions separately for computation of bounding box and sphere.
  • Unique-ify test target name to allow building with geometric_shapes.
  • Use correct Eigen allocator.
  • Reflected the newly added support for non-uniformly scaled meshes.
  • Contributors: Martin Pecka

1.1.9 (2021-04-17)

  • Compatibility with Noetic.
  • Contributors: Martin Pecka

1.1.8 (2020-04-06)

  • Fixed typo.
  • Contributors: Martin Pecka

1.1.7 (2020-04-05)

  • When sensor frame is empty (autodetected), do not add it as initial monitored frame to watchdog.
  • Added configuration examples.
  • Make sure use_sim_time is set to false for tests.
  • Fixed computation of pointcloud without local bounding box.
  • Added tests for RobotBodyFilter.
  • No longer require the index pointcloud field in computeMask(), as it is not used anywhere.
  • Surprise! CropBox::setTransform() doesn't set transform of the cropbox, but instead a transform of the points. Fixed that.
  • Fixed copy-paste bug in bounding sphere publisher.
  • Fix incorrect usage of fixed frame at many places (substituted with filtering frame).
  • Make sure orientation.w = 1 is set in all published markers.
  • Correctly exclude the ignored bounding shapes from the published markers and shapes.

File truncated at 100 lines see the full file

Wiki Tutorials

This package does not provide any links to tutorials in it's rosindex metadata. You can check on the ROS Wiki Tutorials page for the package.

Launch files

Services

No service files found

Plugins

Recent questions tagged robot_body_filter at Robotics Stack Exchange

No version for distro kilted showing melodic. Known supported distros are highlighted in the buttons above.
Package symbol

robot_body_filter package from robot_body_filter repo

robot_body_filter

ROS Distro
melodic

Package Summary

Tags No category tags.
Version 1.3.2
License BSD
Build type CATKIN
Use RECOMMENDED

Repository Summary

Checkout URI https://github.com/peci1/robot_body_filter.git
VCS Type git
VCS Version master
Last Updated 2024-02-03
Dev Status DEVELOPED
CI status
Released RELEASED
Tags No category tags.
Contributing Help Wanted (0)
Good First Issues (0)
Pull Requests to Review (0)

Package Description

Filters the robot's body out of laser scans or point clouds.

Additional Links

Maintainers

  • Martin Pecka

Authors

  • Eitan Marder-Eppstein
  • Tomas Petricek
  • Martin Pecka

robot_body_filter

Filters the robot’s body out of point clouds and laser scans.

Tutorial

Check out the webinar recording where a lot of options for this filter are explained and demonstrated! https://www.youtube.com/watch?v=j0ljV0uZy3Q

Changes vs PR2/robot_self_filter:

  • Now the package is a normal filters::FilterBase filter and not a standalone node.
  • Using both containment and ray-tracing tests.
  • Using all collision elements for each link instead of only the first one.
  • Enabling generic point type, removing PCL dependency and unnecessary params.
  • Using bodies.h and shapes.h from geometric_shapes.
  • As a by-product, the filter can compute robot’s bounding box or sphere.

Build Status

Development versions: Github Actions Dev melodic Dev noetic

Release jobs Melodic Melodic version: Bin melodic-amd64 Bin melodic-arm64 Bin melodic-armhf

Release jobs Noetic Noetic version: Bin noetic focal-amd64 Bin noetic focal-arm64 Bin noetic focal-armhf

Basic Operation

filters::FilterBase API

The basic workings of this filter are done via the filters::FilterBase API implemented for sensor_msgs::LaserScan and sensor_msgs::PointCloud2 types. This means you can load this filter into a FilterChain along other filters as usual. Different from the standard filters, this one can also publish several interesting topics and subscribes to TF.

General overview

This filter reads robot model and the filter config, subscribes to TF, waits for data (laserscans or point clouds) and then cleans them from various artifacts (this is called data filtering).

It can perform 3 kinds of data filters: clip the data based on the provided sensor limits (parameter filter/do_clipping), remove points that are inside or on the surface of the robot body (parameter filter/do_contains_test) and remove points that are seen through a part of the robot body (parameter filter/do_shadow_test). These kinds of tests are further referenced as “clipping”, “contains test” and “shadow test”.

If working with point clouds, the filter automatically recognizes whether it works with organized or non-organized clouds. In organized clouds, it marks the filtered-out points as NaN. In non-organized clouds, it removes the filtered-out points. In laserscans, removal is not an option, so the filtered-out points are marked with NaN (some guides suggest that max_range + 1 should be used for marking invalid points, but this filter uses NaN as a safer option).

Performance tips

In general, the filter will be computationally expensive (clipping is fast, contains test is medium CPU intensive and shadow test is the most expensive part, because it basically performs raytracing). You can limit the required CPU power by limiting the filter only to parts that matter. E.g. if the robot has a link that can never be seen by the sensor, put it in the list of ignored links. The less links are processed, the better performance. If you’re only interested in removing a few links, consider using the only_links parameter.

To speed up shadow filtering, you can set filter/max_shadow_distance, which limits the number of points considered for shadow tests just to points close to the sensor. Setting this to e.g. three times the diameter of the robot should remove all of the shadow points caused by refraction by a part of the robot body. But you have to test this with real data.

Performance also strongly depends on representation of the robot model. The filter reads <collision> tags from the robot URDF. You can use boxes, spheres and cylinders (which are fast to process), or you can use convex meshes (these are much worse performance-wise). If you pass a non-convex mesh, its convex hull will be used for the tests. Don’t forget that each link can have multiple <collision> tags. If you do not have time to convert your meshes to the basic shapes, try to at least reduce the number of triangles in your meshes. You can use your high-quality meshes in <visual> tags. To simplify your model to primitive shapes, you can either manually edit the URDF, or you can utilize ColliderGen.

Model inflation

You can utilize the builtin model inflation mechanism to slightly alter the size of the model. You will probably want to add a bit “margin” to the contains and shadow tests so that points that are millimeters outside the robot body will anyways get removed. You can set a default scale and padding which are used for all collisions. Different inflation can be used for contains tests and for shadow tests. Inflation can also be specified differently for each link. Look at the body_model/inflation/* parameters for details.

Scaling means multiplying the shape dimensions by the given

File truncated at 100 lines see the full file

CHANGELOG

Changelog for package robot_body_filter

1.3.2 (2024-01-22)

  • Do not segfault when a mesh resource is not found.
  • Added full_example.
  • Contributors: Martin Pecka

1.3.1 (2023-07-14)

  • Explicitly specify minimum versions of required libraries.
  • Contributors: Martin Pecka

1.3.0 (2023-04-12)

  • Removed local implementation of oriented bounding boxes in favor of geometric_shapes/obb. This change breaks API and ABI, but I hope nobody explicitly used the OBB part of this library. This change requires geometric_shapes version 0.6.6+ (Melodic) or 0.7.5+ (Noetic) (released April 2023).
  • Add example launch & config for ease of use. Thanks Doan Nguyen for the contribution!
  • Changed xmlrpc_traits variables to constexpr static instead of inline static to decrease the required C++ language standard for this part. Changed stringType from std::string to const char*.
  • Improved xmlrpc_traits to recognize more types of valid structures.
  • Make filter_utils FilterBase::getParamVerbose() methods const. Allowed by https://github.com/ros/filters/pull/35 (released in Melodic filters 1.8.2 (October 2021) and Noetic filters 1.9.1 (September 2021)).
  • Contributors: Doan Nguyen, Martin Pecka

1.2.2 (2021-08-25)

  • Change ROS_WARN to ROS_INFO when loading a value of an undefined parameter
  • Add link to the youtube tutorial
  • Contributors: Martin Pecka

1.2.1 (2021-08-06)

  • Merge pull request #15 from universal-field-robots/master TFFramesWatchdog and RayCastingShapeMask need to be installed in the CMakeLists.txt
  • Added RayCastingShapeMask and TFFramesWatchdog to install targets in cmake
  • Contributors: Josh Owen, Martin Pecka

1.2.0 (2021-07-30)

  • Merge pull request #11 from peci1/correct-pointcloud-transforms Add possibility to specify pointcloud channels that should be transformed together with positional data.
  • Merge pull request #14 from peci1/per_link_scaling Add support for scaling/padding each link differently
  • Short-circuit classification of NaN points.
  • Warn about missing collision elements only for non-ignored links and only if they have at least one visual.
  • Fixed bounding shapes computation
  • Added filter/max_shadow_distance for great performance increase
  • Added possibility to scale/pad collisions separately for computation of bounding box and sphere.
  • Unique-ify test target name to allow building with geometric_shapes.
  • Use correct Eigen allocator.
  • Reflected the newly added support for non-uniformly scaled meshes.
  • Contributors: Martin Pecka

1.1.9 (2021-04-17)

  • Compatibility with Noetic.
  • Contributors: Martin Pecka

1.1.8 (2020-04-06)

  • Fixed typo.
  • Contributors: Martin Pecka

1.1.7 (2020-04-05)

  • When sensor frame is empty (autodetected), do not add it as initial monitored frame to watchdog.
  • Added configuration examples.
  • Make sure use_sim_time is set to false for tests.
  • Fixed computation of pointcloud without local bounding box.
  • Added tests for RobotBodyFilter.
  • No longer require the index pointcloud field in computeMask(), as it is not used anywhere.
  • Surprise! CropBox::setTransform() doesn't set transform of the cropbox, but instead a transform of the points. Fixed that.
  • Fixed copy-paste bug in bounding sphere publisher.
  • Fix incorrect usage of fixed frame at many places (substituted with filtering frame).
  • Make sure orientation.w = 1 is set in all published markers.
  • Correctly exclude the ignored bounding shapes from the published markers and shapes.

File truncated at 100 lines see the full file

Wiki Tutorials

This package does not provide any links to tutorials in it's rosindex metadata. You can check on the ROS Wiki Tutorials page for the package.

Launch files

Services

No service files found

Plugins

Recent questions tagged robot_body_filter at Robotics Stack Exchange

No version for distro rolling showing melodic. Known supported distros are highlighted in the buttons above.
Package symbol

robot_body_filter package from robot_body_filter repo

robot_body_filter

ROS Distro
melodic

Package Summary

Tags No category tags.
Version 1.3.2
License BSD
Build type CATKIN
Use RECOMMENDED

Repository Summary

Checkout URI https://github.com/peci1/robot_body_filter.git
VCS Type git
VCS Version master
Last Updated 2024-02-03
Dev Status DEVELOPED
CI status
Released RELEASED
Tags No category tags.
Contributing Help Wanted (0)
Good First Issues (0)
Pull Requests to Review (0)

Package Description

Filters the robot's body out of laser scans or point clouds.

Additional Links

Maintainers

  • Martin Pecka

Authors

  • Eitan Marder-Eppstein
  • Tomas Petricek
  • Martin Pecka

robot_body_filter

Filters the robot’s body out of point clouds and laser scans.

Tutorial

Check out the webinar recording where a lot of options for this filter are explained and demonstrated! https://www.youtube.com/watch?v=j0ljV0uZy3Q

Changes vs PR2/robot_self_filter:

  • Now the package is a normal filters::FilterBase filter and not a standalone node.
  • Using both containment and ray-tracing tests.
  • Using all collision elements for each link instead of only the first one.
  • Enabling generic point type, removing PCL dependency and unnecessary params.
  • Using bodies.h and shapes.h from geometric_shapes.
  • As a by-product, the filter can compute robot’s bounding box or sphere.

Build Status

Development versions: Github Actions Dev melodic Dev noetic

Release jobs Melodic Melodic version: Bin melodic-amd64 Bin melodic-arm64 Bin melodic-armhf

Release jobs Noetic Noetic version: Bin noetic focal-amd64 Bin noetic focal-arm64 Bin noetic focal-armhf

Basic Operation

filters::FilterBase API

The basic workings of this filter are done via the filters::FilterBase API implemented for sensor_msgs::LaserScan and sensor_msgs::PointCloud2 types. This means you can load this filter into a FilterChain along other filters as usual. Different from the standard filters, this one can also publish several interesting topics and subscribes to TF.

General overview

This filter reads robot model and the filter config, subscribes to TF, waits for data (laserscans or point clouds) and then cleans them from various artifacts (this is called data filtering).

It can perform 3 kinds of data filters: clip the data based on the provided sensor limits (parameter filter/do_clipping), remove points that are inside or on the surface of the robot body (parameter filter/do_contains_test) and remove points that are seen through a part of the robot body (parameter filter/do_shadow_test). These kinds of tests are further referenced as “clipping”, “contains test” and “shadow test”.

If working with point clouds, the filter automatically recognizes whether it works with organized or non-organized clouds. In organized clouds, it marks the filtered-out points as NaN. In non-organized clouds, it removes the filtered-out points. In laserscans, removal is not an option, so the filtered-out points are marked with NaN (some guides suggest that max_range + 1 should be used for marking invalid points, but this filter uses NaN as a safer option).

Performance tips

In general, the filter will be computationally expensive (clipping is fast, contains test is medium CPU intensive and shadow test is the most expensive part, because it basically performs raytracing). You can limit the required CPU power by limiting the filter only to parts that matter. E.g. if the robot has a link that can never be seen by the sensor, put it in the list of ignored links. The less links are processed, the better performance. If you’re only interested in removing a few links, consider using the only_links parameter.

To speed up shadow filtering, you can set filter/max_shadow_distance, which limits the shadow test to points close to the sensor. Setting it to e.g. three times the diameter of the robot should remove all of the shadow points caused by refraction by a part of the robot body, but you have to verify this with real data.
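As a sketch, for a hypothetical robot roughly 0.7 m across, the rule of thumb above gives about 2.1 m:

```yaml
params:
  # 3 x robot diameter (0.7 m) = 2.1 m; tune against real data.
  filter/max_shadow_distance: 2.1
```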

Performance also strongly depends on the representation of the robot model. The filter reads <collision> tags from the robot URDF. You can use boxes, spheres and cylinders (which are fast to process), or convex meshes (which perform much worse). If you pass a non-convex mesh, its convex hull is used for the tests. Don't forget that each link can have multiple <collision> tags. If you do not have time to convert your meshes to basic shapes, at least try to reduce the number of triangles in your meshes. You can keep your high-quality meshes in <visual> tags. To simplify your model to primitive shapes, you can either edit the URDF manually or use ColliderGen.
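The visual/collision split described above looks like this in URDF (link name, mesh path and box dimensions are illustrative):

```xml
<link name="base_link">
  <!-- High-quality mesh for display only -->
  <visual>
    <geometry>
      <mesh filename="package://my_robot/meshes/base.dae"/>
    </geometry>
  </visual>
  <!-- Cheap primitive shape used by the filter's tests -->
  <collision>
    <geometry>
      <box size="0.6 0.4 0.2"/>
    </geometry>
  </collision>
</link>
```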

Model inflation

You can use the built-in model inflation mechanism to slightly alter the size of the model. You will probably want to add a bit of a margin to the contains and shadow tests so that points lying millimeters outside the robot body still get removed. You can set a default scale and padding that are applied to all collisions, use different inflation for the contains and shadow tests, and specify inflation per link. See the body_model/inflation/* parameters for details.
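A sketch of a default inflation setup (the text above only names the body_model/inflation/* namespace; the scale and padding keys shown here are assumptions to be checked against the parameter reference):

```yaml
params:
  body_model/inflation/scale: 1.05    # multiply shape dimensions by 5 %
  body_model/inflation/padding: 0.01  # additionally grow each shape by 1 cm
```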

Scaling means multiplying the shape dimensions by the given

File truncated at 100 lines; see the full file.

CHANGELOG

Changelog for package robot_body_filter

1.3.2 (2024-01-22)

  • Do not segfault when a mesh resource is not found.
  • Added full_example.
  • Contributors: Martin Pecka

1.3.1 (2023-07-14)

  • Explicitly specify minimum versions of required libraries.
  • Contributors: Martin Pecka

1.3.0 (2023-04-12)

  • Removed local implementation of oriented bounding boxes in favor of geometric_shapes/obb. This change breaks API and ABI, but I hope nobody explicitly used the OBB part of this library. This change requires geometric_shapes version 0.6.6+ (Melodic) or 0.7.5+ (Noetic) (released April 2023).
  • Add example launch & config for ease of use. Thanks Doan Nguyen for the contribution!
  • Changed xmlrpc_traits variables to constexpr static instead of inline static to decrease the required C++ language standard for this part. Changed stringType from std::string to const char*.
  • Improved xmlrpc_traits to recognize more types of valid structures.
  • Make filter_utils FilterBase::getParamVerbose() methods const. Allowed by https://github.com/ros/filters/pull/35 (released in Melodic filters 1.8.2 (October 2021) and Noetic filters 1.9.1 (September 2021)).
  • Contributors: Doan Nguyen, Martin Pecka

1.2.2 (2021-08-25)

  • Change ROS_WARN to ROS_INFO when loading a value of an undefined parameter
  • Add link to the youtube tutorial
  • Contributors: Martin Pecka

1.2.1 (2021-08-06)

  • Merge pull request #15 from universal-field-robots/master TFFramesWatchdog and RayCastingShapeMask need to be installed in the CMakeLists.txt
  • Added RayCastingShapeMask and TFFramesWatchdog to install targets in cmake
  • Contributors: Josh Owen, Martin Pecka

1.2.0 (2021-07-30)

  • Merge pull request #11 from peci1/correct-pointcloud-transforms Add possibility to specify pointcloud channels that should be transformed together with positional data.
  • Merge pull request #14 from peci1/per_link_scaling Add support for scaling/padding each link differently
  • Short-circuit classification of NaN points.
  • Warn about missing collision elements only for non-ignored links and only if they have at least one visual.
  • Fixed bounding shapes computation
  • Added filter/max_shadow_distance for great performance increase
  • Added possibility to scale/pad collisions separately for computation of bounding box and sphere.
  • Unique-ify test target name to allow building with geometric_shapes.
  • Use correct Eigen allocator.
  • Reflected the newly added support for non-uniformly scaled meshes.
  • Contributors: Martin Pecka

1.1.9 (2021-04-17)

  • Compatibility with Noetic.
  • Contributors: Martin Pecka

1.1.8 (2020-04-06)

  • Fixed typo.
  • Contributors: Martin Pecka

1.1.7 (2020-04-05)

  • When sensor frame is empty (autodetected), do not add it as initial monitored frame to watchdog.
  • Added configuration examples.
  • Make sure use_sim_time is set to false for tests.
  • Fixed computation of pointcloud without local bounding box.
  • Added tests for RobotBodyFilter.
  • No longer require the index pointcloud field in computeMask(), as it is not used anywhere.
  • Surprise! CropBox::setTransform() doesn't set transform of the cropbox, but instead a transform of the points. Fixed that.
  • Fixed copy-paste bug in bounding sphere publisher.
  • Fix incorrect usage of fixed frame at many places (substituted with filtering frame).
  • Make sure orientation.w = 1 is set in all published markers.
  • Correctly exclude the ignored bounding shapes from the published markers and shapes.

File truncated at 100 lines; see the full file.

Wiki Tutorials

This package does not provide any links to tutorials in its rosindex metadata. You can check the ROS Wiki Tutorials page for the package.

Launch files

Services

No service files found

Plugins

Recent questions tagged robot_body_filter at Robotics Stack Exchange

No version for distro ardent showing melodic. Known supported distros are highlighted in the buttons above.
Package symbol

robot_body_filter package from robot_body_filter repo

robot_body_filter

ROS Distro
melodic

Package Summary

Tags No category tags.
Version 1.3.2
License BSD
Build type CATKIN
Use RECOMMENDED

Repository Summary

Checkout URI https://github.com/peci1/robot_body_filter.git
VCS Type git
VCS Version master
Last Updated 2024-02-03
Dev Status DEVELOPED
CI status
Released RELEASED
Tags No category tags.
Contributing Help Wanted (0)
Good First Issues (0)
Pull Requests to Review (0)

Package Description

Filters the robot's body out of laser scans or point clouds.

Additional Links

Maintainers

  • Martin Pecka

Authors

  • Eitan Marder-Eppstein
  • Tomas Petricek
  • Martin Pecka

robot_body_filter

Filters the robot’s body out of point clouds and laser scans.

Tutorial

Check out the webinar recording where a lot of options for this filter are explained and demonstrated! https://www.youtube.com/watch?v=j0ljV0uZy3Q

Changes vs PR2/robot_self_filter:

  • Now the package is a normal filters::FilterBase filter and not a standalone node.
  • Using both containment and ray-tracing tests.
  • Using all collision elements for each link instead of only the first one.
  • Enabling generic point type, removing PCL dependency and unnecessary params.
  • Using bodies.h and shapes.h from geometric_shapes.
  • As a by-product, the filter can compute robot’s bounding box or sphere.

Build Status

Development versions: Github Actions Dev melodic Dev noetic

Release jobs Melodic Melodic version: Bin melodic-amd64 Bin melodic-arm64 Bin melodic-armhf

Release jobs Noetic Noetic version: Bin noetic focal-amd64 Bin noetic focal-arm64 Bin noetic focal-armhf

Basic Operation

filters::FilterBase API

The basic workings of this filter are done via the filters::FilterBase API implemented for sensor_msgs::LaserScan and sensor_msgs::PointCloud2 types. This means you can load this filter into a FilterChain along other filters as usual. Different from the standard filters, this one can also publish several interesting topics and subscribes to TF.

General overview

This filter reads robot model and the filter config, subscribes to TF, waits for data (laserscans or point clouds) and then cleans them from various artifacts (this is called data filtering).

It can perform 3 kinds of data filters: clip the data based on the provided sensor limits (parameter filter/do_clipping), remove points that are inside or on the surface of the robot body (parameter filter/do_contains_test) and remove points that are seen through a part of the robot body (parameter filter/do_shadow_test). These kinds of tests are further referenced as “clipping”, “contains test” and “shadow test”.

If working with point clouds, the filter automatically recognizes whether it works with organized or non-organized clouds. In organized clouds, it marks the filtered-out points as NaN. In non-organized clouds, it removes the filtered-out points. In laserscans, removal is not an option, so the filtered-out points are marked with NaN (some guides suggest that max_range + 1 should be used for marking invalid points, but this filter uses NaN as a safer option).

Performance tips

In general, the filter will be computationally expensive (clipping is fast, contains test is medium CPU intensive and shadow test is the most expensive part, because it basically performs raytracing). You can limit the required CPU power by limiting the filter only to parts that matter. E.g. if the robot has a link that can never be seen by the sensor, put it in the list of ignored links. The less links are processed, the better performance. If you’re only interested in removing a few links, consider using the only_links parameter.

To speed up shadow filtering, you can set filter/max_shadow_distance, which limits the number of points considered for shadow tests just to points close to the sensor. Setting this to e.g. three times the diameter of the robot should remove all of the shadow points caused by refraction by a part of the robot body. But you have to test this with real data.

Performance also strongly depends on representation of the robot model. The filter reads <collision> tags from the robot URDF. You can use boxes, spheres and cylinders (which are fast to process), or you can use convex meshes (these are much worse performance-wise). If you pass a non-convex mesh, its convex hull will be used for the tests. Don’t forget that each link can have multiple <collision> tags. If you do not have time to convert your meshes to the basic shapes, try to at least reduce the number of triangles in your meshes. You can use your high-quality meshes in <visual> tags. To simplify your model to primitive shapes, you can either manually edit the URDF, or you can utilize ColliderGen.

Model inflation

You can utilize the builtin model inflation mechanism to slightly alter the size of the model. You will probably want to add a bit “margin” to the contains and shadow tests so that points that are millimeters outside the robot body will anyways get removed. You can set a default scale and padding which are used for all collisions. Different inflation can be used for contains tests and for shadow tests. Inflation can also be specified differently for each link. Look at the body_model/inflation/* parameters for details.

Scaling means multiplying the shape dimensions by the given

File truncated at 100 lines see the full file

CHANGELOG

Changelog for package robot_body_filter

1.3.2 (2024-01-22)

  • Do not segfault when a mesh resource is not found.
  • Added full_example.
  • Contributors: Martin Pecka

1.3.1 (2023-07-14)

  • Explicitly specify minimum versions of required libraries.
  • Contributors: Martin Pecka

1.3.0 (2023-04-12)

  • Removed local implementation of oriented bounding boxes in favor of geometric_shapes/obb. This change breaks API and ABI, but I hope nobody explicitly used the OBB part of this library. This change requires geometric_shapes version 0.6.6+ (Melodic) or 0.7.5+ (Noetic) (released April 2023).
  • Add example launch & config for ease of use. Thanks Doan Nguyen for the contribution!
  • Changed xmlrpc_traits variables to constexpr static instead of inline static to decrease the required C++ language standard for this part. Changed stringType from std::string to const char*.
  • Improved xmlrpc_traits to recognize more types of valid structures.
  • Make filter_utils FilterBase::getParamVerbose() methods const. Allowed by https://github.com/ros/filters/pull/35 (released in Melodic filters 1.8.2 (October 2021) and Noetic filters 1.9.1 (September 2021)).
  • Contributors: Doan Nguyen, Martin Pecka

1.2.2 (2021-08-25)

  • Change ROS_WARN to ROS_INFO when loading a value of an undefined parameter
  • Add link to the youtube tutorial
  • Contributors: Martin Pecka

1.2.1 (2021-08-06)

  • Merge pull request #15 from universal-field-robots/master TFFramesWatchdog and RayCastingShapeMask need to be installed in the CMakeLists.txt
  • Added RayCastingShapeMask and TFFramesWatchdog to install targets in cmake
  • Contributors: Josh Owen, Martin Pecka

1.2.0 (2021-07-30)

  • Merge pull request #11 from peci1/correct-pointcloud-transforms Add possibility to specify pointcloud channels that should be transformed together with positional data.
  • Merge pull request #14 from peci1/per_link_scaling Add support for scaling/padding each link differently
  • Short-circuit classification of NaN points.
  • Warn about missing collision elements only for non-ignored links and only if they have at least one visual.
  • Fixed bounding shapes computation
  • Added filter/max_shadow_distance for great performance increase
  • Added possibility to scale/pad collisions separately for computation of bounding box and sphere.
  • Unique-ify test target name to allow building with geometric_shapes.
  • Use correct Eigen allocator.
  • Reflected the newly added support for non-uniformly scaled meshes.
  • Contributors: Martin Pecka

1.1.9 (2021-04-17)

  • Compatibility with Noetic.
  • Contributors: Martin Pecka

1.1.8 (2020-04-06)

  • Fixed typo.
  • Contributors: Martin Pecka

1.1.7 (2020-04-05)

  • When sensor frame is empty (autodetected), do not add it as initial monitored frame to watchdog.
  • Added configuration examples.
  • Make sure use_sim_time is set to false for tests.
  • Fixed computation of pointcloud without local bounding box.
  • Added tests for RobotBodyFilter.
  • No longer require the index pointcloud field in computeMask(), as it is not used anywhere.
  • Surprise! CropBox::setTransform() doesn't set transform of the cropbox, but instead a transform of the points. Fixed that.
  • Fixed copy-paste bug in bounding sphere publisher.
  • Fix incorrect usage of fixed frame at many places (substituted with filtering frame).
  • Make sure orientation.w = 1 is set in all published markers.
  • Correctly exclude the ignored bounding shapes from the published markers and shapes.

File truncated at 100 lines see the full file

Wiki Tutorials

This package does not provide any links to tutorials in it's rosindex metadata. You can check on the ROS Wiki Tutorials page for the package.

Launch files

Services

No service files found

Plugins

Recent questions tagged robot_body_filter at Robotics Stack Exchange

No version for distro bouncy showing melodic. Known supported distros are highlighted in the buttons above.
Package symbol

robot_body_filter package from robot_body_filter repo

robot_body_filter

ROS Distro
melodic

Package Summary

Tags No category tags.
Version 1.3.2
License BSD
Build type CATKIN
Use RECOMMENDED

Repository Summary

Checkout URI https://github.com/peci1/robot_body_filter.git
VCS Type git
VCS Version master
Last Updated 2024-02-03
Dev Status DEVELOPED
CI status
Released RELEASED
Tags No category tags.
Contributing Help Wanted (0)
Good First Issues (0)
Pull Requests to Review (0)

Package Description

Filters the robot's body out of laser scans or point clouds.

Additional Links

Maintainers

  • Martin Pecka

Authors

  • Eitan Marder-Eppstein
  • Tomas Petricek
  • Martin Pecka

robot_body_filter

Filters the robot’s body out of point clouds and laser scans.

Tutorial

Check out the webinar recording where a lot of options for this filter are explained and demonstrated! https://www.youtube.com/watch?v=j0ljV0uZy3Q

Changes vs PR2/robot_self_filter:

  • Now the package is a normal filters::FilterBase filter and not a standalone node.
  • Using both containment and ray-tracing tests.
  • Using all collision elements for each link instead of only the first one.
  • Enabling generic point type, removing PCL dependency and unnecessary params.
  • Using bodies.h and shapes.h from geometric_shapes.
  • As a by-product, the filter can compute robot’s bounding box or sphere.

Build Status

Development versions: Github Actions Dev melodic Dev noetic

Release jobs Melodic Melodic version: Bin melodic-amd64 Bin melodic-arm64 Bin melodic-armhf

Release jobs Noetic Noetic version: Bin noetic focal-amd64 Bin noetic focal-arm64 Bin noetic focal-armhf

Basic Operation

filters::FilterBase API

The basic workings of this filter are done via the filters::FilterBase API implemented for sensor_msgs::LaserScan and sensor_msgs::PointCloud2 types. This means you can load this filter into a FilterChain along other filters as usual. Different from the standard filters, this one can also publish several interesting topics and subscribes to TF.

General overview

This filter reads robot model and the filter config, subscribes to TF, waits for data (laserscans or point clouds) and then cleans them from various artifacts (this is called data filtering).

It can perform 3 kinds of data filters: clip the data based on the provided sensor limits (parameter filter/do_clipping), remove points that are inside or on the surface of the robot body (parameter filter/do_contains_test) and remove points that are seen through a part of the robot body (parameter filter/do_shadow_test). These kinds of tests are further referenced as “clipping”, “contains test” and “shadow test”.

If working with point clouds, the filter automatically recognizes whether it works with organized or non-organized clouds. In organized clouds, it marks the filtered-out points as NaN. In non-organized clouds, it removes the filtered-out points. In laserscans, removal is not an option, so the filtered-out points are marked with NaN (some guides suggest that max_range + 1 should be used for marking invalid points, but this filter uses NaN as a safer option).

Performance tips

In general, the filter will be computationally expensive (clipping is fast, contains test is medium CPU intensive and shadow test is the most expensive part, because it basically performs raytracing). You can limit the required CPU power by limiting the filter only to parts that matter. E.g. if the robot has a link that can never be seen by the sensor, put it in the list of ignored links. The less links are processed, the better performance. If you’re only interested in removing a few links, consider using the only_links parameter.

To speed up shadow filtering, you can set filter/max_shadow_distance, which limits the number of points considered for shadow tests just to points close to the sensor. Setting this to e.g. three times the diameter of the robot should remove all of the shadow points caused by refraction by a part of the robot body. But you have to test this with real data.

Performance also strongly depends on representation of the robot model. The filter reads <collision> tags from the robot URDF. You can use boxes, spheres and cylinders (which are fast to process), or you can use convex meshes (these are much worse performance-wise). If you pass a non-convex mesh, its convex hull will be used for the tests. Don’t forget that each link can have multiple <collision> tags. If you do not have time to convert your meshes to the basic shapes, try to at least reduce the number of triangles in your meshes. You can use your high-quality meshes in <visual> tags. To simplify your model to primitive shapes, you can either manually edit the URDF, or you can utilize ColliderGen.

Model inflation

You can utilize the builtin model inflation mechanism to slightly alter the size of the model. You will probably want to add a bit “margin” to the contains and shadow tests so that points that are millimeters outside the robot body will anyways get removed. You can set a default scale and padding which are used for all collisions. Different inflation can be used for contains tests and for shadow tests. Inflation can also be specified differently for each link. Look at the body_model/inflation/* parameters for details.

Scaling means multiplying the shape dimensions by the given

File truncated at 100 lines see the full file

CHANGELOG

Changelog for package robot_body_filter

1.3.2 (2024-01-22)

  • Do not segfault when a mesh resource is not found.
  • Added full_example.
  • Contributors: Martin Pecka

1.3.1 (2023-07-14)

  • Explicitly specify minimum versions of required libraries.
  • Contributors: Martin Pecka

1.3.0 (2023-04-12)

  • Removed local implementation of oriented bounding boxes in favor of geometric_shapes/obb. This change breaks API and ABI, but I hope nobody explicitly used the OBB part of this library. This change requires geometric_shapes version 0.6.6+ (Melodic) or 0.7.5+ (Noetic) (released April 2023).
  • Add example launch & config for ease of use. Thanks Doan Nguyen for the contribution!
  • Changed xmlrpc_traits variables to constexpr static instead of inline static to decrease the required C++ language standard for this part. Changed stringType from std::string to const char*.
  • Improved xmlrpc_traits to recognize more types of valid structures.
  • Make filter_utils FilterBase::getParamVerbose() methods const. Allowed by https://github.com/ros/filters/pull/35 (released in Melodic filters 1.8.2 (October 2021) and Noetic filters 1.9.1 (September 2021)).
  • Contributors: Doan Nguyen, Martin Pecka

1.2.2 (2021-08-25)

  • Change ROS_WARN to ROS_INFO when loading a value of an undefined parameter
  • Add link to the youtube tutorial
  • Contributors: Martin Pecka

1.2.1 (2021-08-06)

  • Merge pull request #15 from universal-field-robots/master TFFramesWatchdog and RayCastingShapeMask need to be installed in the CMakeLists.txt
  • Added RayCastingShapeMask and TFFramesWatchdog to install targets in cmake
  • Contributors: Josh Owen, Martin Pecka

1.2.0 (2021-07-30)

  • Merge pull request #11 from peci1/correct-pointcloud-transforms Add possibility to specify pointcloud channels that should be transformed together with positional data.
  • Merge pull request #14 from peci1/per_link_scaling Add support for scaling/padding each link differently
  • Short-circuit classification of NaN points.
  • Warn about missing collision elements only for non-ignored links and only if they have at least one visual.
  • Fixed bounding shapes computation
  • Added filter/max_shadow_distance for great performance increase
  • Added possibility to scale/pad collisions separately for computation of bounding box and sphere.
  • Unique-ify test target name to allow building with geometric_shapes.
  • Use correct Eigen allocator.
  • Reflected the newly added support for non-uniformly scaled meshes.
  • Contributors: Martin Pecka

1.1.9 (2021-04-17)

  • Compatibility with Noetic.
  • Contributors: Martin Pecka

1.1.8 (2020-04-06)

  • Fixed typo.
  • Contributors: Martin Pecka

1.1.7 (2020-04-05)

  • When sensor frame is empty (autodetected), do not add it as initial monitored frame to watchdog.
  • Added configuration examples.
  • Make sure use_sim_time is set to false for tests.
  • Fixed computation of pointcloud without local bounding box.
  • Added tests for RobotBodyFilter.
  • No longer require the index pointcloud field in computeMask(), as it is not used anywhere.
  • Surprise! CropBox::setTransform() doesn't set transform of the cropbox, but instead a transform of the points. Fixed that.
  • Fixed copy-paste bug in bounding sphere publisher.
  • Fix incorrect usage of fixed frame at many places (substituted with filtering frame).
  • Make sure orientation.w = 1 is set in all published markers.
  • Correctly exclude the ignored bounding shapes from the published markers and shapes.

File truncated at 100 lines see the full file

Wiki Tutorials

This package does not provide any links to tutorials in it's rosindex metadata. You can check on the ROS Wiki Tutorials page for the package.

Launch files

Services

No service files found

Plugins

Recent questions tagged robot_body_filter at Robotics Stack Exchange

No version for distro crystal showing melodic. Known supported distros are highlighted in the buttons above.
Package symbol

robot_body_filter package from robot_body_filter repo

robot_body_filter

ROS Distro
melodic

Package Summary

Tags No category tags.
Version 1.3.2
License BSD
Build type CATKIN
Use RECOMMENDED

Repository Summary

Checkout URI https://github.com/peci1/robot_body_filter.git
VCS Type git
VCS Version master
Last Updated 2024-02-03
Dev Status DEVELOPED
CI status
Released RELEASED
Tags No category tags.
Contributing Help Wanted (0)
Good First Issues (0)
Pull Requests to Review (0)

Package Description

Filters the robot's body out of laser scans or point clouds.

Additional Links

Maintainers

  • Martin Pecka

Authors

  • Eitan Marder-Eppstein
  • Tomas Petricek
  • Martin Pecka

robot_body_filter

Filters the robot’s body out of point clouds and laser scans.

Tutorial

Check out the webinar recording where a lot of options for this filter are explained and demonstrated! https://www.youtube.com/watch?v=j0ljV0uZy3Q

Changes vs PR2/robot_self_filter:

  • Now the package is a normal filters::FilterBase filter and not a standalone node.
  • Using both containment and ray-tracing tests.
  • Using all collision elements for each link instead of only the first one.
  • Enabling generic point type, removing PCL dependency and unnecessary params.
  • Using bodies.h and shapes.h from geometric_shapes.
  • As a by-product, the filter can compute robot’s bounding box or sphere.

Build Status

Development versions: Github Actions Dev melodic Dev noetic

Release jobs Melodic Melodic version: Bin melodic-amd64 Bin melodic-arm64 Bin melodic-armhf

Release jobs Noetic Noetic version: Bin noetic focal-amd64 Bin noetic focal-arm64 Bin noetic focal-armhf

Basic Operation

filters::FilterBase API

The basic workings of this filter are done via the filters::FilterBase API implemented for sensor_msgs::LaserScan and sensor_msgs::PointCloud2 types. This means you can load this filter into a FilterChain along other filters as usual. Different from the standard filters, this one can also publish several interesting topics and subscribes to TF.

General overview

This filter reads robot model and the filter config, subscribes to TF, waits for data (laserscans or point clouds) and then cleans them from various artifacts (this is called data filtering).

It can perform 3 kinds of data filters: clip the data based on the provided sensor limits (parameter filter/do_clipping), remove points that are inside or on the surface of the robot body (parameter filter/do_contains_test) and remove points that are seen through a part of the robot body (parameter filter/do_shadow_test). These kinds of tests are further referenced as “clipping”, “contains test” and “shadow test”.

If working with point clouds, the filter automatically recognizes whether it works with organized or non-organized clouds. In organized clouds, it marks the filtered-out points as NaN. In non-organized clouds, it removes the filtered-out points. In laserscans, removal is not an option, so the filtered-out points are marked with NaN (some guides suggest that max_range + 1 should be used for marking invalid points, but this filter uses NaN as a safer option).

Performance tips

In general, the filter will be computationally expensive (clipping is fast, contains test is medium CPU intensive and shadow test is the most expensive part, because it basically performs raytracing). You can limit the required CPU power by limiting the filter only to parts that matter. E.g. if the robot has a link that can never be seen by the sensor, put it in the list of ignored links. The less links are processed, the better performance. If you’re only interested in removing a few links, consider using the only_links parameter.

To speed up shadow filtering, you can set filter/max_shadow_distance, which limits the number of points considered for shadow tests just to points close to the sensor. Setting this to e.g. three times the diameter of the robot should remove all of the shadow points caused by refraction by a part of the robot body. But you have to test this with real data.

Performance also strongly depends on representation of the robot model. The filter reads <collision> tags from the robot URDF. You can use boxes, spheres and cylinders (which are fast to process), or you can use convex meshes (these are much worse performance-wise). If you pass a non-convex mesh, its convex hull will be used for the tests. Don’t forget that each link can have multiple <collision> tags. If you do not have time to convert your meshes to the basic shapes, try to at least reduce the number of triangles in your meshes. You can use your high-quality meshes in <visual> tags. To simplify your model to primitive shapes, you can either manually edit the URDF, or you can utilize ColliderGen.

Model inflation

You can utilize the builtin model inflation mechanism to slightly alter the size of the model. You will probably want to add a bit “margin” to the contains and shadow tests so that points that are millimeters outside the robot body will anyways get removed. You can set a default scale and padding which are used for all collisions. Different inflation can be used for contains tests and for shadow tests. Inflation can also be specified differently for each link. Look at the body_model/inflation/* parameters for details.

Scaling means multiplying the shape dimensions by the given

File truncated at 100 lines see the full file

CHANGELOG

Changelog for package robot_body_filter

1.3.2 (2024-01-22)

  • Do not segfault when a mesh resource is not found.
  • Added full_example.
  • Contributors: Martin Pecka

1.3.1 (2023-07-14)

  • Explicitly specify minimum versions of required libraries.
  • Contributors: Martin Pecka

1.3.0 (2023-04-12)

  • Removed local implementation of oriented bounding boxes in favor of geometric_shapes/obb. This change breaks API and ABI, but I hope nobody explicitly used the OBB part of this library. This change requires geometric_shapes version 0.6.6+ (Melodic) or 0.7.5+ (Noetic) (released April 2023).
  • Add example launch & config for ease of use. Thanks Doan Nguyen for the contribution!
  • Changed xmlrpc_traits variables to constexpr static instead of inline static to decrease the required C++ language standard for this part. Changed stringType from std::string to const char*.
  • Improved xmlrpc_traits to recognize more types of valid structures.
  • Make filter_utils FilterBase::getParamVerbose() methods const. Allowed by https://github.com/ros/filters/pull/35 (released in Melodic filters 1.8.2 (October 2021) and Noetic filters 1.9.1 (September 2021)).
  • Contributors: Doan Nguyen, Martin Pecka

1.2.2 (2021-08-25)

  • Change ROS_WARN to ROS_INFO when loading a value of an undefined parameter
  • Add link to the youtube tutorial
  • Contributors: Martin Pecka

1.2.1 (2021-08-06)

  • Merge pull request #15 from universal-field-robots/master TFFramesWatchdog and RayCastingShapeMask need to be installed in the CMakeLists.txt
  • Added RayCastingShapeMask and TFFramesWatchdog to install targets in cmake
  • Contributors: Josh Owen, Martin Pecka

1.2.0 (2021-07-30)

  • Merge pull request #11 from peci1/correct-pointcloud-transforms Add possibility to specify pointcloud channels that should be transformed together with positional data.
  • Merge pull request #14 from peci1/per_link_scaling Add support for scaling/padding each link differently
  • Short-circuit classification of NaN points.
  • Warn about missing collision elements only for non-ignored links and only if they have at least one visual.
  • Fixed bounding shapes computation
  • Added filter/max_shadow_distance for great performance increase
  • Added possibility to scale/pad collisions separately for computation of bounding box and sphere.
  • Unique-ify test target name to allow building with geometric_shapes.
  • Use correct Eigen allocator.
  • Reflected the newly added support for non-uniformly scaled meshes.
  • Contributors: Martin Pecka

1.1.9 (2021-04-17)

  • Compatibility with Noetic.
  • Contributors: Martin Pecka

1.1.8 (2020-04-06)

  • Fixed typo.
  • Contributors: Martin Pecka

1.1.7 (2020-04-05)

  • When sensor frame is empty (autodetected), do not add it as initial monitored frame to watchdog.
  • Added configuration examples.
  • Make sure use_sim_time is set to false for tests.
  • Fixed computation of pointcloud without local bounding box.
  • Added tests for RobotBodyFilter.
  • No longer require the index pointcloud field in computeMask(), as it is not used anywhere.
  • Surprise! CropBox::setTransform() doesn't set transform of the cropbox, but instead a transform of the points. Fixed that.
  • Fixed copy-paste bug in bounding sphere publisher.
  • Fix incorrect usage of fixed frame at many places (substituted with filtering frame).
  • Make sure orientation.w = 1 is set in all published markers.
  • Correctly exclude the ignored bounding shapes from the published markers and shapes.

File truncated at 100 lines; see the full file.

Wiki Tutorials

This package does not provide any links to tutorials in its rosindex metadata. You can check the ROS Wiki Tutorials page for the package.

Launch files

Services

No service files found

Plugins

Recent questions tagged robot_body_filter at Robotics Stack Exchange

General overview

This filter reads the robot model and the filter configuration, subscribes to TF, waits for data (laser scans or point clouds) and then cleans various artifacts from them (this process is called data filtering).

It can perform three kinds of data filtering: clipping the data based on the provided sensor limits (parameter filter/do_clipping), removing points that lie inside or on the surface of the robot body (parameter filter/do_contains_test), and removing points that are seen through a part of the robot body (parameter filter/do_shadow_test). These tests are referred to below as “clipping”, the “contains test” and the “shadow test”.
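For illustration, a minimal filter chain configuration enabling these three tests might look as follows. The chain name and the plugin type name are assumptions for this sketch; consult the package's plugin description for the exact values.

```yaml
# Sketch of a laser scan filter chain config (chain name and plugin type
# are assumptions; check the package's plugin description file).
scan_filter_chain:
  - name: body_filter
    type: robot_body_filter/RobotBodyFilterLaserScan
    params:
      filter/do_clipping: true       # clip points outside the sensor limits
      filter/do_contains_test: true  # drop points inside/on the robot body
      filter/do_shadow_test: true    # drop points seen through the robot body
```

The same parameters apply when the filter is loaded for sensor_msgs::PointCloud2 data instead of laser scans.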

When working with point clouds, the filter automatically recognizes whether the cloud is organized or non-organized. In organized clouds, it marks the filtered-out points as NaN; in non-organized clouds, it removes them. In laser scans, removal is not an option, so the filtered-out points are marked with NaN (some guides suggest using max_range + 1 to mark invalid points, but this filter uses NaN as the safer option).

Performance tips

In general, the filter is computationally expensive: clipping is fast, the contains test is moderately CPU-intensive, and the shadow test is the most expensive part, because it essentially performs ray tracing. You can limit the required CPU power by restricting the filter to the parts that matter. E.g., if the robot has a link that can never be seen by the sensor, put it in the list of ignored links. The fewer links are processed, the better the performance. If you are only interested in removing a few links, consider using the only_links parameter.

To speed up shadow filtering, you can set filter/max_shadow_distance, which limits the shadow test to points close to the sensor. Setting it to e.g. three times the diameter of the robot should remove all of the shadow points caused by refraction by a part of the robot body, but you should verify this with real data.
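These performance settings could be combined as in the following fragment. The exact key for the list of ignored links is an assumption (the text above only mentions "the list of ignored links"), and the link names are hypothetical placeholders:

```yaml
params:
  # Limit the shadow test to points near the sensor; a value around
  # 3x the robot diameter (in meters) is a reasonable starting point.
  filter/max_shadow_distance: 3.0
  # Process only the links that can actually appear in the data
  # (hypothetical link names):
  only_links: ["left_arm", "right_arm"]
```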

Performance also strongly depends on the representation of the robot model. The filter reads the <collision> tags from the robot URDF. You can use boxes, spheres and cylinders (which are fast to process), or convex meshes (which are much more expensive). If you pass a non-convex mesh, its convex hull is used for the tests. Don’t forget that each link can have multiple <collision> tags. If you do not have time to convert your meshes to basic shapes, at least try to reduce the number of triangles in your meshes. You can keep your high-quality meshes in the <visual> tags. To simplify your model to primitive shapes, you can either edit the URDF manually, or use ColliderGen.

Model inflation

You can use the built-in model inflation mechanism to slightly alter the size of the model. You will probably want to add a bit of a “margin” to the contains and shadow tests so that points a few millimeters outside the robot body still get removed. You can set a default scale and padding, which are applied to all collisions. Different inflation can be used for the contains tests and for the shadow tests, and inflation can also be specified per link. See the body_model/inflation/* parameters for details.
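A sketch of such inflation settings is below. Only the body_model/inflation/* prefix is given in the text above; the per-test key layout shown here is an assumption, so check the parameter reference for the exact names:

```yaml
params:
  # Defaults applied to all collisions:
  body_model/inflation/scale: 1.1     # multiply shape dimensions by 1.1
  body_model/inflation/padding: 0.01  # add 1 cm of padding around each shape
  # Different inflation per test (key layout is an assumption):
  body_model/inflation/contains_test/padding: 0.02
  body_model/inflation/shadow_test/scale: 1.2
```

Per-link overrides follow the same parameter family; see the package's parameter documentation for the per-link syntax.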

Scaling means multiplying the shape dimensions by the given

File truncated at 100 lines; see the full file.

CHANGELOG

Changelog for package robot_body_filter

1.3.2 (2024-01-22)

  • Do not segfault when a mesh resource is not found.
  • Added full_example.
  • Contributors: Martin Pecka

1.3.1 (2023-07-14)

  • Explicitly specify minimum versions of required libraries.
  • Contributors: Martin Pecka

1.3.0 (2023-04-12)

  • Removed local implementation of oriented bounding boxes in favor of geometric_shapes/obb. This change breaks API and ABI, but I hope nobody explicitly used the OBB part of this library. This change requires geometric_shapes version 0.6.6+ (Melodic) or 0.7.5+ (Noetic) (released April 2023).
  • Add example launch & config for ease of use. Thanks Doan Nguyen for the contribution!
  • Changed xmlrpc_traits variables to constexpr static instead of inline static to decrease the required C++ language standard for this part. Changed stringType from std::string to const char*.
  • Improved xmlrpc_traits to recognize more types of valid structures.
  • Make filter_utils FilterBase::getParamVerbose() methods const. Allowed by https://github.com/ros/filters/pull/35 (released in Melodic filters 1.8.2 (October 2021) and Noetic filters 1.9.1 (September 2021)).
  • Contributors: Doan Nguyen, Martin Pecka

1.2.2 (2021-08-25)

  • Change ROS_WARN to ROS_INFO when loading a value of an undefined parameter
  • Add link to the youtube tutorial
  • Contributors: Martin Pecka

1.2.1 (2021-08-06)

  • Merge pull request #15 from universal-field-robots/master TFFramesWatchdog and RayCastingShapeMask need to be installed in the CMakeLists.txt
  • Added RayCastingShapeMask and TFFramesWatchdog to install targets in cmake
  • Contributors: Josh Owen, Martin Pecka

1.2.0 (2021-07-30)

  • Merge pull request #11 from peci1/correct-pointcloud-transforms Add possibility to specify pointcloud channels that should be transformed together with positional data.
  • Merge pull request #14 from peci1/per_link_scaling Add support for scaling/padding each link differently
  • Short-circuit classification of NaN points.
  • Warn about missing collision elements only for non-ignored links and only if they have at least one visual.
  • Fixed bounding shapes computation
  • Added filter/max_shadow_distance for great performance increase
  • Added possibility to scale/pad collisions separately for computation of bounding box and sphere.
  • Unique-ify test target name to allow building with geometric_shapes.
  • Use correct Eigen allocator.
  • Reflected the newly added support for non-uniformly scaled meshes.
  • Contributors: Martin Pecka

1.1.9 (2021-04-17)

  • Compatibility with Noetic.
  • Contributors: Martin Pecka

1.1.8 (2020-04-06)

  • Fixed typo.
  • Contributors: Martin Pecka

1.1.7 (2020-04-05)

  • When sensor frame is empty (autodetected), do not add it as initial monitored frame to watchdog.
  • Added configuration examples.
  • Make sure use_sim_time is set to false for tests.
  • Fixed computation of pointcloud without local bounding box.
  • Added tests for RobotBodyFilter.
  • No longer require the index pointcloud field in computeMask(), as it is not used anywhere.
  • Surprise! CropBox::setTransform() doesn't set transform of the cropbox, but instead a transform of the points. Fixed that.
  • Fixed copy-paste bug in bounding sphere publisher.
  • Fix incorrect usage of fixed frame at many places (substituted with filtering frame).
  • Make sure orientation.w = 1 is set in all published markers.
  • Correctly exclude the ignored bounding shapes from the published markers and shapes.

File truncated at 100 lines see the full file

Wiki Tutorials

This package does not provide any links to tutorials in it's rosindex metadata. You can check on the ROS Wiki Tutorials page for the package.

Launch files

Services

No service files found

Plugins

Recent questions tagged robot_body_filter at Robotics Stack Exchange

No version for distro dashing showing melodic. Known supported distros are highlighted in the buttons above.
Package symbol

robot_body_filter package from robot_body_filter repo

robot_body_filter

ROS Distro
melodic

Package Summary

Tags No category tags.
Version 1.3.2
License BSD
Build type CATKIN
Use RECOMMENDED

Repository Summary

Checkout URI https://github.com/peci1/robot_body_filter.git
VCS Type git
VCS Version master
Last Updated 2024-02-03
Dev Status DEVELOPED
CI status
Released RELEASED
Tags No category tags.
Contributing Help Wanted (0)
Good First Issues (0)
Pull Requests to Review (0)

Package Description

Filters the robot's body out of laser scans or point clouds.

Additional Links

Maintainers

  • Martin Pecka

Authors

  • Eitan Marder-Eppstein
  • Tomas Petricek
  • Martin Pecka

robot_body_filter

Filters the robot’s body out of point clouds and laser scans.

Tutorial

Check out the webinar recording where a lot of options for this filter are explained and demonstrated! https://www.youtube.com/watch?v=j0ljV0uZy3Q

Changes vs PR2/robot_self_filter:

  • Now the package is a normal filters::FilterBase filter and not a standalone node.
  • Using both containment and ray-tracing tests.
  • Using all collision elements for each link instead of only the first one.
  • Enabling generic point type, removing PCL dependency and unnecessary params.
  • Using bodies.h and shapes.h from geometric_shapes.
  • As a by-product, the filter can compute robot’s bounding box or sphere.

Build Status

Development versions: Github Actions Dev melodic Dev noetic

Release jobs Melodic Melodic version: Bin melodic-amd64 Bin melodic-arm64 Bin melodic-armhf

Release jobs Noetic Noetic version: Bin noetic focal-amd64 Bin noetic focal-arm64 Bin noetic focal-armhf

Basic Operation

filters::FilterBase API

The basic workings of this filter are done via the filters::FilterBase API implemented for sensor_msgs::LaserScan and sensor_msgs::PointCloud2 types. This means you can load this filter into a FilterChain along other filters as usual. Different from the standard filters, this one can also publish several interesting topics and subscribes to TF.

General overview

This filter reads robot model and the filter config, subscribes to TF, waits for data (laserscans or point clouds) and then cleans them from various artifacts (this is called data filtering).

It can perform 3 kinds of data filters: clip the data based on the provided sensor limits (parameter filter/do_clipping), remove points that are inside or on the surface of the robot body (parameter filter/do_contains_test) and remove points that are seen through a part of the robot body (parameter filter/do_shadow_test). These kinds of tests are further referenced as “clipping”, “contains test” and “shadow test”.

If working with point clouds, the filter automatically recognizes whether it works with organized or non-organized clouds. In organized clouds, it marks the filtered-out points as NaN. In non-organized clouds, it removes the filtered-out points. In laserscans, removal is not an option, so the filtered-out points are marked with NaN (some guides suggest that max_range + 1 should be used for marking invalid points, but this filter uses NaN as a safer option).

Performance tips

In general, the filter will be computationally expensive (clipping is fast, contains test is medium CPU intensive and shadow test is the most expensive part, because it basically performs raytracing). You can limit the required CPU power by limiting the filter only to parts that matter. E.g. if the robot has a link that can never be seen by the sensor, put it in the list of ignored links. The less links are processed, the better performance. If you’re only interested in removing a few links, consider using the only_links parameter.

To speed up shadow filtering, you can set filter/max_shadow_distance, which limits the number of points considered for shadow tests just to points close to the sensor. Setting this to e.g. three times the diameter of the robot should remove all of the shadow points caused by refraction by a part of the robot body. But you have to test this with real data.

Performance also strongly depends on representation of the robot model. The filter reads <collision> tags from the robot URDF. You can use boxes, spheres and cylinders (which are fast to process), or you can use convex meshes (these are much worse performance-wise). If you pass a non-convex mesh, its convex hull will be used for the tests. Don’t forget that each link can have multiple <collision> tags. If you do not have time to convert your meshes to the basic shapes, try to at least reduce the number of triangles in your meshes. You can use your high-quality meshes in <visual> tags. To simplify your model to primitive shapes, you can either manually edit the URDF, or you can utilize ColliderGen.

Model inflation

You can utilize the builtin model inflation mechanism to slightly alter the size of the model. You will probably want to add a bit “margin” to the contains and shadow tests so that points that are millimeters outside the robot body will anyways get removed. You can set a default scale and padding which are used for all collisions. Different inflation can be used for contains tests and for shadow tests. Inflation can also be specified differently for each link. Look at the body_model/inflation/* parameters for details.

Scaling means multiplying the shape dimensions by the given

File truncated at 100 lines see the full file

CHANGELOG

Changelog for package robot_body_filter

1.3.2 (2024-01-22)

  • Do not segfault when a mesh resource is not found.
  • Added full_example.
  • Contributors: Martin Pecka

1.3.1 (2023-07-14)

  • Explicitly specify minimum versions of required libraries.
  • Contributors: Martin Pecka

1.3.0 (2023-04-12)

  • Removed local implementation of oriented bounding boxes in favor of geometric_shapes/obb. This change breaks API and ABI, but I hope nobody explicitly used the OBB part of this library. This change requires geometric_shapes version 0.6.6+ (Melodic) or 0.7.5+ (Noetic) (released April 2023).
  • Add example launch & config for ease of use. Thanks Doan Nguyen for the contribution!
  • Changed xmlrpc_traits variables to constexpr static instead of inline static to decrease the required C++ language standard for this part. Changed stringType from std::string to const char*.
  • Improved xmlrpc_traits to recognize more types of valid structures.
  • Make filter_utils FilterBase::getParamVerbose() methods const. Allowed by https://github.com/ros/filters/pull/35 (released in Melodic filters 1.8.2 (October 2021) and Noetic filters 1.9.1 (September 2021)).
  • Contributors: Doan Nguyen, Martin Pecka

1.2.2 (2021-08-25)

  • Change ROS_WARN to ROS_INFO when loading a value of an undefined parameter
  • Add link to the youtube tutorial
  • Contributors: Martin Pecka

1.2.1 (2021-08-06)

  • Merge pull request #15 from universal-field-robots/master TFFramesWatchdog and RayCastingShapeMask need to be installed in the CMakeLists.txt
  • Added RayCastingShapeMask and TFFramesWatchdog to install targets in cmake
  • Contributors: Josh Owen, Martin Pecka

1.2.0 (2021-07-30)

  • Merge pull request #11 from peci1/correct-pointcloud-transforms Add possibility to specify pointcloud channels that should be transformed together with positional data.
  • Merge pull request #14 from peci1/per_link_scaling Add support for scaling/padding each link differently
  • Short-circuit classification of NaN points.
  • Warn about missing collision elements only for non-ignored links and only if they have at least one visual.
  • Fixed bounding shapes computation
  • Added filter/max_shadow_distance for great performance increase
  • Added possibility to scale/pad collisions separately for computation of bounding box and sphere.
  • Unique-ify test target name to allow building with geometric_shapes.
  • Use correct Eigen allocator.
  • Reflected the newly added support for non-uniformly scaled meshes.
  • Contributors: Martin Pecka

1.1.9 (2021-04-17)

  • Compatibility with Noetic.
  • Contributors: Martin Pecka

1.1.8 (2020-04-06)

  • Fixed typo.
  • Contributors: Martin Pecka

1.1.7 (2020-04-05)

  • When sensor frame is empty (autodetected), do not add it as initial monitored frame to watchdog.
  • Added configuration examples.
  • Make sure use_sim_time is set to false for tests.
  • Fixed computation of pointcloud without local bounding box.
  • Added tests for RobotBodyFilter.
  • No longer require the index pointcloud field in computeMask(), as it is not used anywhere.
  • Surprise! CropBox::setTransform() doesn't set transform of the cropbox, but instead a transform of the points. Fixed that.
  • Fixed copy-paste bug in bounding sphere publisher.
  • Fix incorrect usage of fixed frame at many places (substituted with filtering frame).
  • Make sure orientation.w = 1 is set in all published markers.
  • Correctly exclude the ignored bounding shapes from the published markers and shapes.

File truncated at 100 lines see the full file

Wiki Tutorials

This package does not provide any links to tutorials in it's rosindex metadata. You can check on the ROS Wiki Tutorials page for the package.

Launch files

Services

No service files found

Plugins

Recent questions tagged robot_body_filter at Robotics Stack Exchange

No version for distro galactic showing melodic. Known supported distros are highlighted in the buttons above.
Package symbol

robot_body_filter package from robot_body_filter repo

robot_body_filter

ROS Distro
melodic

Package Summary

Tags No category tags.
Version 1.3.2
License BSD
Build type CATKIN
Use RECOMMENDED

Repository Summary

Checkout URI https://github.com/peci1/robot_body_filter.git
VCS Type git
VCS Version master
Last Updated 2024-02-03
Dev Status DEVELOPED
CI status
Released RELEASED
Tags No category tags.
Contributing Help Wanted (0)
Good First Issues (0)
Pull Requests to Review (0)

Package Description

Filters the robot's body out of laser scans or point clouds.

Additional Links

Maintainers

  • Martin Pecka

Authors

  • Eitan Marder-Eppstein
  • Tomas Petricek
  • Martin Pecka

robot_body_filter

Filters the robot’s body out of point clouds and laser scans.

Tutorial

Check out the webinar recording where a lot of options for this filter are explained and demonstrated! https://www.youtube.com/watch?v=j0ljV0uZy3Q

Changes vs PR2/robot_self_filter:

  • Now the package is a normal filters::FilterBase filter and not a standalone node.
  • Using both containment and ray-tracing tests.
  • Using all collision elements for each link instead of only the first one.
  • Enabling generic point type, removing PCL dependency and unnecessary params.
  • Using bodies.h and shapes.h from geometric_shapes.
  • As a by-product, the filter can compute robot’s bounding box or sphere.

Build Status

Development versions: Github Actions Dev melodic Dev noetic

Release jobs Melodic Melodic version: Bin melodic-amd64 Bin melodic-arm64 Bin melodic-armhf

Release jobs Noetic Noetic version: Bin noetic focal-amd64 Bin noetic focal-arm64 Bin noetic focal-armhf

Basic Operation

filters::FilterBase API

The basic workings of this filter are done via the filters::FilterBase API implemented for sensor_msgs::LaserScan and sensor_msgs::PointCloud2 types. This means you can load this filter into a FilterChain along other filters as usual. Different from the standard filters, this one can also publish several interesting topics and subscribes to TF.

General overview

This filter reads robot model and the filter config, subscribes to TF, waits for data (laserscans or point clouds) and then cleans them from various artifacts (this is called data filtering).

It can perform 3 kinds of data filters: clip the data based on the provided sensor limits (parameter filter/do_clipping), remove points that are inside or on the surface of the robot body (parameter filter/do_contains_test) and remove points that are seen through a part of the robot body (parameter filter/do_shadow_test). These kinds of tests are further referenced as “clipping”, “contains test” and “shadow test”.

If working with point clouds, the filter automatically recognizes whether it works with organized or non-organized clouds. In organized clouds, it marks the filtered-out points as NaN. In non-organized clouds, it removes the filtered-out points. In laserscans, removal is not an option, so the filtered-out points are marked with NaN (some guides suggest that max_range + 1 should be used for marking invalid points, but this filter uses NaN as a safer option).

Performance tips

In general, the filter will be computationally expensive (clipping is fast, contains test is medium CPU intensive and shadow test is the most expensive part, because it basically performs raytracing). You can limit the required CPU power by limiting the filter only to parts that matter. E.g. if the robot has a link that can never be seen by the sensor, put it in the list of ignored links. The less links are processed, the better performance. If you’re only interested in removing a few links, consider using the only_links parameter.

To speed up shadow filtering, you can set filter/max_shadow_distance, which limits the number of points considered for shadow tests just to points close to the sensor. Setting this to e.g. three times the diameter of the robot should remove all of the shadow points caused by refraction by a part of the robot body. But you have to test this with real data.

Performance also strongly depends on representation of the robot model. The filter reads <collision> tags from the robot URDF. You can use boxes, spheres and cylinders (which are fast to process), or you can use convex meshes (these are much worse performance-wise). If you pass a non-convex mesh, its convex hull will be used for the tests. Don’t forget that each link can have multiple <collision> tags. If you do not have time to convert your meshes to the basic shapes, try to at least reduce the number of triangles in your meshes. You can use your high-quality meshes in <visual> tags. To simplify your model to primitive shapes, you can either manually edit the URDF, or you can utilize ColliderGen.

Model inflation

You can utilize the builtin model inflation mechanism to slightly alter the size of the model. You will probably want to add a bit “margin” to the contains and shadow tests so that points that are millimeters outside the robot body will anyways get removed. You can set a default scale and padding which are used for all collisions. Different inflation can be used for contains tests and for shadow tests. Inflation can also be specified differently for each link. Look at the body_model/inflation/* parameters for details.

Scaling means multiplying the shape dimensions by the given

File truncated at 100 lines see the full file

CHANGELOG

Changelog for package robot_body_filter

1.3.2 (2024-01-22)

  • Do not segfault when a mesh resource is not found.
  • Added full_example.
  • Contributors: Martin Pecka

1.3.1 (2023-07-14)

  • Explicitly specify minimum versions of required libraries.
  • Contributors: Martin Pecka

1.3.0 (2023-04-12)

  • Removed local implementation of oriented bounding boxes in favor of geometric_shapes/obb. This change breaks API and ABI, but I hope nobody explicitly used the OBB part of this library. This change requires geometric_shapes version 0.6.6+ (Melodic) or 0.7.5+ (Noetic) (released April 2023).
  • Add example launch & config for ease of use. Thanks Doan Nguyen for the contribution!
  • Changed xmlrpc_traits variables to constexpr static instead of inline static to decrease the required C++ language standard for this part. Changed stringType from std::string to const char*.
  • Improved xmlrpc_traits to recognize more types of valid structures.
  • Make filter_utils FilterBase::getParamVerbose() methods const. Allowed by https://github.com/ros/filters/pull/35 (released in Melodic filters 1.8.2 (October 2021) and Noetic filters 1.9.1 (September 2021)).
  • Contributors: Doan Nguyen, Martin Pecka

1.2.2 (2021-08-25)

  • Change ROS_WARN to ROS_INFO when loading a value of an undefined parameter
  • Add link to the youtube tutorial
  • Contributors: Martin Pecka

1.2.1 (2021-08-06)

  • Merge pull request #15 from universal-field-robots/master TFFramesWatchdog and RayCastingShapeMask need to be installed in the CMakeLists.txt
  • Added RayCastingShapeMask and TFFramesWatchdog to install targets in cmake
  • Contributors: Josh Owen, Martin Pecka

1.2.0 (2021-07-30)

  • Merge pull request #11 from peci1/correct-pointcloud-transforms Add possibility to specify pointcloud channels that should be transformed together with positional data.
  • Merge pull request #14 from peci1/per_link_scaling Add support for scaling/padding each link differently
  • Short-circuit classification of NaN points.
  • Warn about missing collision elements only for non-ignored links and only if they have at least one visual.
  • Fixed bounding shapes computation
  • Added filter/max_shadow_distance for great performance increase
  • Added possibility to scale/pad collisions separately for computation of bounding box and sphere.
  • Unique-ify test target name to allow building with geometric_shapes.
  • Use correct Eigen allocator.
  • Reflected the newly added support for non-uniformly scaled meshes.
  • Contributors: Martin Pecka

1.1.9 (2021-04-17)

  • Compatibility with Noetic.
  • Contributors: Martin Pecka

1.1.8 (2020-04-06)

  • Fixed typo.
  • Contributors: Martin Pecka

1.1.7 (2020-04-05)

  • When sensor frame is empty (autodetected), do not add it as initial monitored frame to watchdog.
  • Added configuration examples.
  • Make sure use_sim_time is set to false for tests.
  • Fixed computation of pointcloud without local bounding box.
  • Added tests for RobotBodyFilter.
  • No longer require the index pointcloud field in computeMask(), as it is not used anywhere.
  • Surprise! CropBox::setTransform() doesn't set transform of the cropbox, but instead a transform of the points. Fixed that.
  • Fixed copy-paste bug in bounding sphere publisher.
  • Fix incorrect usage of fixed frame at many places (substituted with filtering frame).
  • Make sure orientation.w = 1 is set in all published markers.
  • Correctly exclude the ignored bounding shapes from the published markers and shapes.

File truncated at 100 lines see the full file

Wiki Tutorials

This package does not provide any links to tutorials in it's rosindex metadata. You can check on the ROS Wiki Tutorials page for the package.

Launch files

Services

No service files found

Plugins

Recent questions tagged robot_body_filter at Robotics Stack Exchange

No version for distro foxy showing melodic. Known supported distros are highlighted in the buttons above.
Package symbol

General overview

This filter reads the robot model and the filter config, subscribes to TF, waits for data (laser scans or point clouds) and then removes various artifacts from them (this process is called data filtering).

It can perform three kinds of data filtering: clipping the data based on the provided sensor limits (parameter filter/do_clipping), removing points that lie inside or on the surface of the robot body (parameter filter/do_contains_test), and removing points that are seen through a part of the robot body (parameter filter/do_shadow_test). These tests are referred to below as “clipping”, “contains test” and “shadow test”.
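As a sketch of how these switches fit into a standard ros filters chain configuration (the chain name and topic layout here are illustrative; the filter/do_* parameter names are the ones described above, and the plugin type name may differ in your version):

```yaml
# Illustrative laser scan filter chain config (YAML loaded onto the parameter server).
scan_filter_chain:
  - name: body_filter
    type: robot_body_filter/RobotBodyFilterLaserScan  # a PointCloud2 variant also exists
    params:
      filter/do_clipping: true       # clip points outside the sensor limits
      filter/do_contains_test: true  # drop points inside or on the robot body
      filter/do_shadow_test: true    # drop points seen through the robot body
```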

When working with point clouds, the filter automatically detects whether the cloud is organized or non-organized. In organized clouds, it marks the filtered-out points as NaN; in non-organized clouds, it removes them. In laser scans, removal is not an option, so the filtered-out points are marked with NaN (some guides suggest using max_range + 1 to mark invalid points, but this filter uses NaN as the safer option).

Performance tips

In general, the filter is computationally expensive: clipping is fast, the contains test is moderately CPU-intensive, and the shadow test is the most expensive part because it essentially performs ray tracing. You can limit the required CPU power by restricting the filter to the parts that matter. E.g. if the robot has a link that can never be seen by the sensor, put it in the list of ignored links. The fewer links are processed, the better the performance. If you are only interested in removing a few links, consider using the only_links parameter.

To speed up shadow filtering, you can set filter/max_shadow_distance, which limits the shadow test to points close to the sensor. Setting it to e.g. three times the diameter of the robot should remove all shadow points caused by rays refracted by a part of the robot body, but you should verify this with real data.
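A minimal sketch combining both performance tips (the link names are hypothetical; only_links and filter/max_shadow_distance are the parameters mentioned above, and the distance value is illustrative):

```yaml
params:
  only_links: ["left_arm", "right_arm"]  # hypothetical links; filter only these
  filter/max_shadow_distance: 3.0        # meters; roughly 3x the robot diameter
```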

Performance also depends strongly on the representation of the robot model. The filter reads the <collision> tags from the robot URDF. You can use boxes, spheres and cylinders (which are fast to process), or convex meshes (which are much more expensive). If you pass a non-convex mesh, its convex hull is used for the tests. Don’t forget that each link can have multiple <collision> tags. If you do not have time to convert your meshes to basic shapes, at least try to reduce the number of triangles in your meshes. You can keep your high-quality meshes in the <visual> tags. To simplify your model to primitive shapes, you can either edit the URDF manually, or you can utilize ColliderGen.
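For illustration, a URDF link can pair a detailed mesh in its <visual> tag with a cheap primitive in its <collision> tag (the link and file names here are hypothetical):

```xml
<link name="forearm">
  <visual>
    <!-- high-quality mesh, used for display only -->
    <geometry><mesh filename="package://my_robot/meshes/forearm.dae"/></geometry>
  </visual>
  <collision>
    <!-- cheap primitive, used by the filter's containment and shadow tests -->
    <geometry><cylinder radius="0.05" length="0.30"/></geometry>
  </collision>
</link>
```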

Model inflation

You can utilize the built-in model inflation mechanism to slightly alter the size of the model. You will probably want to add a bit of a margin to the contains and shadow tests so that points lying a few millimeters outside the robot body still get removed. You can set a default scale and padding that are used for all collisions. Different inflation can be used for contains tests and for shadow tests, and inflation can also be specified per link. Look at the body_model/inflation/* parameters for details.
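For example, a global inflation could be sketched like this (values are illustrative; see the full README for the exact per-test and per-link parameter variants):

```yaml
params:
  body_model/inflation/scale: 1.1     # multiply shape dimensions by 1.1
  body_model/inflation/padding: 0.02  # additionally grow each shape by 2 cm
```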

Scaling means multiplying the shape dimensions by the given factor.

(File truncated at 100 lines; see the full file.)

CHANGELOG

Changelog for package robot_body_filter

1.3.2 (2024-01-22)

  • Do not segfault when a mesh resource is not found.
  • Added full_example.
  • Contributors: Martin Pecka

1.3.1 (2023-07-14)

  • Explicitly specify minimum versions of required libraries.
  • Contributors: Martin Pecka

1.3.0 (2023-04-12)

  • Removed local implementation of oriented bounding boxes in favor of geometric_shapes/obb. This change breaks API and ABI, but I hope nobody explicitly used the OBB part of this library. This change requires geometric_shapes version 0.6.6+ (Melodic) or 0.7.5+ (Noetic) (released April 2023).
  • Add example launch & config for ease of use. Thanks Doan Nguyen for the contribution!
  • Changed xmlrpc_traits variables to constexpr static instead of inline static to decrease the required C++ language standard for this part. Changed stringType from std::string to const char*.
  • Improved xmlrpc_traits to recognize more types of valid structures.
  • Make filter_utils FilterBase::getParamVerbose() methods const. Allowed by https://github.com/ros/filters/pull/35 (released in Melodic filters 1.8.2 (October 2021) and Noetic filters 1.9.1 (September 2021)).
  • Contributors: Doan Nguyen, Martin Pecka

1.2.2 (2021-08-25)

  • Change ROS_WARN to ROS_INFO when loading a value of an undefined parameter
  • Add link to the youtube tutorial
  • Contributors: Martin Pecka

1.2.1 (2021-08-06)

  • Merge pull request #15 from universal-field-robots/master TFFramesWatchdog and RayCastingShapeMask need to be installed in the CMakeLists.txt
  • Added RayCastingShapeMask and TFFramesWatchdog to install targets in cmake
  • Contributors: Josh Owen, Martin Pecka

1.2.0 (2021-07-30)

  • Merge pull request #11 from peci1/correct-pointcloud-transforms Add possibility to specify pointcloud channels that should be transformed together with positional data.
  • Merge pull request #14 from peci1/per_link_scaling Add support for scaling/padding each link differently
  • Short-circuit classification of NaN points.
  • Warn about missing collision elements only for non-ignored links and only if they have at least one visual.
  • Fixed bounding shapes computation
  • Added filter/max_shadow_distance for great performance increase
  • Added possibility to scale/pad collisions separately for computation of bounding box and sphere.
  • Unique-ify test target name to allow building with geometric_shapes.
  • Use correct Eigen allocator.
  • Reflected the newly added support for non-uniformly scaled meshes.
  • Contributors: Martin Pecka

1.1.9 (2021-04-17)

  • Compatibility with Noetic.
  • Contributors: Martin Pecka

1.1.8 (2020-04-06)

  • Fixed typo.
  • Contributors: Martin Pecka

1.1.7 (2020-04-05)

  • When sensor frame is empty (autodetected), do not add it as initial monitored frame to watchdog.
  • Added configuration examples.
  • Make sure use_sim_time is set to false for tests.
  • Fixed computation of pointcloud without local bounding box.
  • Added tests for RobotBodyFilter.
  • No longer require the index pointcloud field in computeMask(), as it is not used anywhere.
  • Surprise! CropBox::setTransform() doesn't set transform of the cropbox, but instead a transform of the points. Fixed that.
  • Fixed copy-paste bug in bounding sphere publisher.
  • Fix incorrect usage of fixed frame at many places (substituted with filtering frame).
  • Make sure orientation.w = 1 is set in all published markers.
  • Correctly exclude the ignored bounding shapes from the published markers and shapes.

(File truncated at 100 lines; see the full file.)

Wiki Tutorials

This package does not provide any links to tutorials in its rosindex metadata. You can check the ROS Wiki Tutorials page for the package.

Launch files

Services

No service files found

Plugins

Recent questions tagged robot_body_filter at Robotics Stack Exchange


1.1.8 (2020-04-06)

  • Fixed typo.
  • Contributors: Martin Pecka

1.1.7 (2020-04-05)

  • When sensor frame is empty (autodetected), do not add it as initial monitored frame to watchdog.
  • Added configuration examples.
  • Make sure use_sim_time is set to false for tests.
  • Fixed computation of pointcloud without local bounding box.
  • Added tests for RobotBodyFilter.
  • No longer require the index pointcloud field in computeMask(), as it is not used anywhere.
  • Surprise! CropBox::setTransform() doesn't set transform of the cropbox, but instead a transform of the points. Fixed that.
  • Fixed copy-paste bug in bounding sphere publisher.
  • Fix incorrect usage of fixed frame at many places (substituted with filtering frame).
  • Make sure orientation.w = 1 is set in all published markers.
  • Correctly exclude the ignored bounding shapes from the published markers and shapes.

File truncated at 100 lines; see the full file.

Wiki Tutorials

This package does not provide any links to tutorials in its rosindex metadata. You can check the ROS Wiki Tutorials page for the package.

Launch files

Services

No service files found

Plugins

Recent questions tagged robot_body_filter at Robotics Stack Exchange

No version for distro indigo showing melodic. Known supported distros are highlighted in the buttons above.
Package symbol

robot_body_filter package from robot_body_filter repo

robot_body_filter

ROS Distro
melodic

Package Summary

Tags No category tags.
Version 1.3.2
License BSD
Build type CATKIN
Use RECOMMENDED

Repository Summary

Checkout URI https://github.com/peci1/robot_body_filter.git
VCS Type git
VCS Version master
Last Updated 2024-02-03
Dev Status DEVELOPED
CI status
Released RELEASED
Tags No category tags.
Contributing Help Wanted (0)
Good First Issues (0)
Pull Requests to Review (0)

Package Description

Filters the robot's body out of laser scans or point clouds.

Additional Links

Maintainers

  • Martin Pecka

Authors

  • Eitan Marder-Eppstein
  • Tomas Petricek
  • Martin Pecka

robot_body_filter

Filters the robot’s body out of point clouds and laser scans.

Tutorial

Check out the webinar recording where a lot of options for this filter are explained and demonstrated! https://www.youtube.com/watch?v=j0ljV0uZy3Q

Changes vs PR2/robot_self_filter:

  • Now the package is a normal filters::FilterBase filter and not a standalone node.
  • Using both containment and ray-tracing tests.
  • Using all collision elements for each link instead of only the first one.
  • Enabling generic point type, removing PCL dependency and unnecessary params.
  • Using bodies.h and shapes.h from geometric_shapes.
  • As a by-product, the filter can compute robot’s bounding box or sphere.

Build Status

Development versions: Github Actions Dev melodic Dev noetic

Release jobs Melodic Melodic version: Bin melodic-amd64 Bin melodic-arm64 Bin melodic-armhf

Release jobs Noetic Noetic version: Bin noetic focal-amd64 Bin noetic focal-arm64 Bin noetic focal-armhf

Basic Operation

filters::FilterBase API

The basic workings of this filter are done via the filters::FilterBase API implemented for sensor_msgs::LaserScan and sensor_msgs::PointCloud2 types. This means you can load this filter into a FilterChain along other filters as usual. Different from the standard filters, this one can also publish several interesting topics and subscribes to TF.

General overview

This filter reads robot model and the filter config, subscribes to TF, waits for data (laserscans or point clouds) and then cleans them from various artifacts (this is called data filtering).

It can perform 3 kinds of data filters: clip the data based on the provided sensor limits (parameter filter/do_clipping), remove points that are inside or on the surface of the robot body (parameter filter/do_contains_test) and remove points that are seen through a part of the robot body (parameter filter/do_shadow_test). These kinds of tests are further referenced as “clipping”, “contains test” and “shadow test”.

If working with point clouds, the filter automatically recognizes whether it works with organized or non-organized clouds. In organized clouds, it marks the filtered-out points as NaN. In non-organized clouds, it removes the filtered-out points. In laserscans, removal is not an option, so the filtered-out points are marked with NaN (some guides suggest that max_range + 1 should be used for marking invalid points, but this filter uses NaN as a safer option).

Performance tips

In general, the filter will be computationally expensive (clipping is fast, contains test is medium CPU intensive and shadow test is the most expensive part, because it basically performs raytracing). You can limit the required CPU power by limiting the filter only to parts that matter. E.g. if the robot has a link that can never be seen by the sensor, put it in the list of ignored links. The less links are processed, the better performance. If you’re only interested in removing a few links, consider using the only_links parameter.

To speed up shadow filtering, you can set filter/max_shadow_distance, which limits the number of points considered for shadow tests just to points close to the sensor. Setting this to e.g. three times the diameter of the robot should remove all of the shadow points caused by refraction by a part of the robot body. But you have to test this with real data.

Performance also strongly depends on representation of the robot model. The filter reads <collision> tags from the robot URDF. You can use boxes, spheres and cylinders (which are fast to process), or you can use convex meshes (these are much worse performance-wise). If you pass a non-convex mesh, its convex hull will be used for the tests. Don’t forget that each link can have multiple <collision> tags. If you do not have time to convert your meshes to the basic shapes, try to at least reduce the number of triangles in your meshes. You can use your high-quality meshes in <visual> tags. To simplify your model to primitive shapes, you can either manually edit the URDF, or you can utilize ColliderGen.

Model inflation

You can utilize the builtin model inflation mechanism to slightly alter the size of the model. You will probably want to add a bit “margin” to the contains and shadow tests so that points that are millimeters outside the robot body will anyways get removed. You can set a default scale and padding which are used for all collisions. Different inflation can be used for contains tests and for shadow tests. Inflation can also be specified differently for each link. Look at the body_model/inflation/* parameters for details.

Scaling means multiplying the shape dimensions by the given

File truncated at 100 lines see the full file

CHANGELOG

Changelog for package robot_body_filter

1.3.2 (2024-01-22)

  • Do not segfault when a mesh resource is not found.
  • Added full_example.
  • Contributors: Martin Pecka

1.3.1 (2023-07-14)

  • Explicitly specify minimum versions of required libraries.
  • Contributors: Martin Pecka

1.3.0 (2023-04-12)

  • Removed local implementation of oriented bounding boxes in favor of geometric_shapes/obb. This change breaks API and ABI, but I hope nobody explicitly used the OBB part of this library. This change requires geometric_shapes version 0.6.6+ (Melodic) or 0.7.5+ (Noetic) (released April 2023).
  • Add example launch & config for ease of use. Thanks Doan Nguyen for the contribution!
  • Changed xmlrpc_traits variables to constexpr static instead of inline static to decrease the required C++ language standard for this part. Changed stringType from std::string to const char*.
  • Improved xmlrpc_traits to recognize more types of valid structures.
  • Make filter_utils FilterBase::getParamVerbose() methods const. Allowed by https://github.com/ros/filters/pull/35 (released in Melodic filters 1.8.2 (October 2021) and Noetic filters 1.9.1 (September 2021)).
  • Contributors: Doan Nguyen, Martin Pecka

1.2.2 (2021-08-25)

  • Change ROS_WARN to ROS_INFO when loading a value of an undefined parameter
  • Add link to the youtube tutorial
  • Contributors: Martin Pecka

1.2.1 (2021-08-06)

  • Merge pull request #15 from universal-field-robots/master TFFramesWatchdog and RayCastingShapeMask need to be installed in the CMakeLists.txt
  • Added RayCastingShapeMask and TFFramesWatchdog to install targets in cmake
  • Contributors: Josh Owen, Martin Pecka

1.2.0 (2021-07-30)

  • Merge pull request #11 from peci1/correct-pointcloud-transforms Add possibility to specify pointcloud channels that should be transformed together with positional data.
  • Merge pull request #14 from peci1/per_link_scaling Add support for scaling/padding each link differently
  • Short-circuit classification of NaN points.
  • Warn about missing collision elements only for non-ignored links and only if they have at least one visual.
  • Fixed bounding shapes computation
  • Added filter/max_shadow_distance for great performance increase
  • Added possibility to scale/pad collisions separately for computation of bounding box and sphere.
  • Unique-ify test target name to allow building with geometric_shapes.
  • Use correct Eigen allocator.
  • Reflected the newly added support for non-uniformly scaled meshes.
  • Contributors: Martin Pecka

1.1.9 (2021-04-17)

  • Compatibility with Noetic.
  • Contributors: Martin Pecka

1.1.8 (2020-04-06)

  • Fixed typo.
  • Contributors: Martin Pecka

1.1.7 (2020-04-05)

  • When sensor frame is empty (autodetected), do not add it as initial monitored frame to watchdog.
  • Added configuration examples.
  • Make sure use_sim_time is set to false for tests.
  • Fixed computation of pointcloud without local bounding box.
  • Added tests for RobotBodyFilter.
  • No longer require the index pointcloud field in computeMask(), as it is not used anywhere.
  • Surprise! CropBox::setTransform() doesn't set transform of the cropbox, but instead a transform of the points. Fixed that.
  • Fixed copy-paste bug in bounding sphere publisher.
  • Fix incorrect usage of fixed frame at many places (substituted with filtering frame).
  • Make sure orientation.w = 1 is set in all published markers.
  • Correctly exclude the ignored bounding shapes from the published markers and shapes.

File truncated at 100 lines see the full file

Wiki Tutorials

This package does not provide any links to tutorials in it's rosindex metadata. You can check on the ROS Wiki Tutorials page for the package.

Launch files

Services

No service files found

Plugins

Recent questions tagged robot_body_filter at Robotics Stack Exchange

No version for distro hydro showing melodic. Known supported distros are highlighted in the buttons above.
Package symbol

robot_body_filter package from robot_body_filter repo

robot_body_filter

ROS Distro
melodic

Package Summary

Tags No category tags.
Version 1.3.2
License BSD
Build type CATKIN
Use RECOMMENDED

Repository Summary

Checkout URI https://github.com/peci1/robot_body_filter.git
VCS Type git
VCS Version master
Last Updated 2024-02-03
Dev Status DEVELOPED
CI status
Released RELEASED
Tags No category tags.
Contributing Help Wanted (0)
Good First Issues (0)
Pull Requests to Review (0)

Package Description

Filters the robot's body out of laser scans or point clouds.

Additional Links

Maintainers

  • Martin Pecka

Authors

  • Eitan Marder-Eppstein
  • Tomas Petricek
  • Martin Pecka

robot_body_filter

Filters the robot’s body out of point clouds and laser scans.

Tutorial

Check out the webinar recording where a lot of options for this filter are explained and demonstrated! https://www.youtube.com/watch?v=j0ljV0uZy3Q

Changes vs PR2/robot_self_filter:

  • Now the package is a normal filters::FilterBase filter and not a standalone node.
  • Using both containment and ray-tracing tests.
  • Using all collision elements for each link instead of only the first one.
  • Enabling generic point type, removing PCL dependency and unnecessary params.
  • Using bodies.h and shapes.h from geometric_shapes.
  • As a by-product, the filter can compute robot’s bounding box or sphere.

Build Status

Development versions: Github Actions Dev melodic Dev noetic

Release jobs Melodic Melodic version: Bin melodic-amd64 Bin melodic-arm64 Bin melodic-armhf

Release jobs Noetic Noetic version: Bin noetic focal-amd64 Bin noetic focal-arm64 Bin noetic focal-armhf

Basic Operation

filters::FilterBase API

The basic workings of this filter are done via the filters::FilterBase API implemented for sensor_msgs::LaserScan and sensor_msgs::PointCloud2 types. This means you can load this filter into a FilterChain along other filters as usual. Different from the standard filters, this one can also publish several interesting topics and subscribes to TF.

General overview

This filter reads robot model and the filter config, subscribes to TF, waits for data (laserscans or point clouds) and then cleans them from various artifacts (this is called data filtering).

It can perform 3 kinds of data filters: clip the data based on the provided sensor limits (parameter filter/do_clipping), remove points that are inside or on the surface of the robot body (parameter filter/do_contains_test) and remove points that are seen through a part of the robot body (parameter filter/do_shadow_test). These kinds of tests are further referenced as “clipping”, “contains test” and “shadow test”.

If working with point clouds, the filter automatically recognizes whether it works with organized or non-organized clouds. In organized clouds, it marks the filtered-out points as NaN. In non-organized clouds, it removes the filtered-out points. In laserscans, removal is not an option, so the filtered-out points are marked with NaN (some guides suggest that max_range + 1 should be used for marking invalid points, but this filter uses NaN as a safer option).

Performance tips

In general, the filter will be computationally expensive (clipping is fast, contains test is medium CPU intensive and shadow test is the most expensive part, because it basically performs raytracing). You can limit the required CPU power by limiting the filter only to parts that matter. E.g. if the robot has a link that can never be seen by the sensor, put it in the list of ignored links. The less links are processed, the better performance. If you’re only interested in removing a few links, consider using the only_links parameter.

To speed up shadow filtering, you can set filter/max_shadow_distance, which limits the number of points considered for shadow tests just to points close to the sensor. Setting this to e.g. three times the diameter of the robot should remove all of the shadow points caused by refraction by a part of the robot body. But you have to test this with real data.

Performance also strongly depends on representation of the robot model. The filter reads <collision> tags from the robot URDF. You can use boxes, spheres and cylinders (which are fast to process), or you can use convex meshes (these are much worse performance-wise). If you pass a non-convex mesh, its convex hull will be used for the tests. Don’t forget that each link can have multiple <collision> tags. If you do not have time to convert your meshes to the basic shapes, try to at least reduce the number of triangles in your meshes. You can use your high-quality meshes in <visual> tags. To simplify your model to primitive shapes, you can either manually edit the URDF, or you can utilize ColliderGen.

Model inflation

You can utilize the builtin model inflation mechanism to slightly alter the size of the model. You will probably want to add a bit “margin” to the contains and shadow tests so that points that are millimeters outside the robot body will anyways get removed. You can set a default scale and padding which are used for all collisions. Different inflation can be used for contains tests and for shadow tests. Inflation can also be specified differently for each link. Look at the body_model/inflation/* parameters for details.

Scaling means multiplying the shape dimensions by the given

File truncated at 100 lines see the full file

CHANGELOG

Changelog for package robot_body_filter

1.3.2 (2024-01-22)

  • Do not segfault when a mesh resource is not found.
  • Added full_example.
  • Contributors: Martin Pecka

1.3.1 (2023-07-14)

  • Explicitly specify minimum versions of required libraries.
  • Contributors: Martin Pecka

1.3.0 (2023-04-12)

  • Removed local implementation of oriented bounding boxes in favor of geometric_shapes/obb. This change breaks API and ABI, but I hope nobody explicitly used the OBB part of this library. This change requires geometric_shapes version 0.6.6+ (Melodic) or 0.7.5+ (Noetic) (released April 2023).
  • Add example launch & config for ease of use. Thanks Doan Nguyen for the contribution!
  • Changed xmlrpc_traits variables to constexpr static instead of inline static to decrease the required C++ language standard for this part. Changed stringType from std::string to const char*.
  • Improved xmlrpc_traits to recognize more types of valid structures.
  • Make filter_utils FilterBase::getParamVerbose() methods const. Allowed by https://github.com/ros/filters/pull/35 (released in Melodic filters 1.8.2 (October 2021) and Noetic filters 1.9.1 (September 2021)).
  • Contributors: Doan Nguyen, Martin Pecka

1.2.2 (2021-08-25)

  • Change ROS_WARN to ROS_INFO when loading a value of an undefined parameter
  • Add link to the youtube tutorial
  • Contributors: Martin Pecka

1.2.1 (2021-08-06)

  • Merge pull request #15 from universal-field-robots/master TFFramesWatchdog and RayCastingShapeMask need to be installed in the CMakeLists.txt
  • Added RayCastingShapeMask and TFFramesWatchdog to install targets in cmake
  • Contributors: Josh Owen, Martin Pecka

1.2.0 (2021-07-30)

  • Merge pull request #11 from peci1/correct-pointcloud-transforms Add possibility to specify pointcloud channels that should be transformed together with positional data.
  • Merge pull request #14 from peci1/per_link_scaling Add support for scaling/padding each link differently
  • Short-circuit classification of NaN points.
  • Warn about missing collision elements only for non-ignored links and only if they have at least one visual.
  • Fixed bounding shapes computation
  • Added filter/max_shadow_distance for great performance increase
  • Added possibility to scale/pad collisions separately for computation of bounding box and sphere.
  • Unique-ify test target name to allow building with geometric_shapes.
  • Use correct Eigen allocator.
  • Reflected the newly added support for non-uniformly scaled meshes.
  • Contributors: Martin Pecka

1.1.9 (2021-04-17)

  • Compatibility with Noetic.
  • Contributors: Martin Pecka

1.1.8 (2020-04-06)

  • Fixed typo.
  • Contributors: Martin Pecka

1.1.7 (2020-04-05)

  • When sensor frame is empty (autodetected), do not add it as initial monitored frame to watchdog.
  • Added configuration examples.
  • Make sure use_sim_time is set to false for tests.
  • Fixed computation of pointcloud without local bounding box.
  • Added tests for RobotBodyFilter.
  • No longer require the index pointcloud field in computeMask(), as it is not used anywhere.
  • Surprise! CropBox::setTransform() doesn't set transform of the cropbox, but instead a transform of the points. Fixed that.
  • Fixed copy-paste bug in bounding sphere publisher.
  • Fix incorrect usage of fixed frame at many places (substituted with filtering frame).
  • Make sure orientation.w = 1 is set in all published markers.
  • Correctly exclude the ignored bounding shapes from the published markers and shapes.

File truncated at 100 lines see the full file

Wiki Tutorials

This package does not provide any links to tutorials in it's rosindex metadata. You can check on the ROS Wiki Tutorials page for the package.

Launch files

Services

No service files found

Plugins

Recent questions tagged robot_body_filter at Robotics Stack Exchange

No version for distro kinetic showing melodic. Known supported distros are highlighted in the buttons above.
Package symbol

robot_body_filter package from robot_body_filter repo

robot_body_filter

ROS Distro
melodic

Package Summary

Tags No category tags.
Version 1.3.2
License BSD
Build type CATKIN
Use RECOMMENDED

Repository Summary

Checkout URI https://github.com/peci1/robot_body_filter.git
VCS Type git
VCS Version master
Last Updated 2024-02-03
Dev Status DEVELOPED
CI status
Released RELEASED
Tags No category tags.
Contributing Help Wanted (0)
Good First Issues (0)
Pull Requests to Review (0)

Package Description

Filters the robot's body out of laser scans or point clouds.

Additional Links

Maintainers

  • Martin Pecka

Authors

  • Eitan Marder-Eppstein
  • Tomas Petricek
  • Martin Pecka

robot_body_filter

Filters the robot’s body out of point clouds and laser scans.

Tutorial

Check out the webinar recording where a lot of options for this filter are explained and demonstrated! https://www.youtube.com/watch?v=j0ljV0uZy3Q

Changes vs PR2/robot_self_filter:

  • Now the package is a normal filters::FilterBase filter and not a standalone node.
  • Using both containment and ray-tracing tests.
  • Using all collision elements for each link instead of only the first one.
  • Enabling generic point type, removing PCL dependency and unnecessary params.
  • Using bodies.h and shapes.h from geometric_shapes.
  • As a by-product, the filter can compute robot’s bounding box or sphere.

Build Status

Development versions: Github Actions Dev melodic Dev noetic

Release jobs Melodic Melodic version: Bin melodic-amd64 Bin melodic-arm64 Bin melodic-armhf

Release jobs Noetic Noetic version: Bin noetic focal-amd64 Bin noetic focal-arm64 Bin noetic focal-armhf

Basic Operation

filters::FilterBase API

The basic workings of this filter are done via the filters::FilterBase API implemented for sensor_msgs::LaserScan and sensor_msgs::PointCloud2 types. This means you can load this filter into a FilterChain along other filters as usual. Different from the standard filters, this one can also publish several interesting topics and subscribes to TF.

General overview

This filter reads robot model and the filter config, subscribes to TF, waits for data (laserscans or point clouds) and then cleans them from various artifacts (this is called data filtering).

It can perform 3 kinds of data filters: clip the data based on the provided sensor limits (parameter filter/do_clipping), remove points that are inside or on the surface of the robot body (parameter filter/do_contains_test) and remove points that are seen through a part of the robot body (parameter filter/do_shadow_test). These kinds of tests are further referenced as “clipping”, “contains test” and “shadow test”.

If working with point clouds, the filter automatically recognizes whether it works with organized or non-organized clouds. In organized clouds, it marks the filtered-out points as NaN. In non-organized clouds, it removes the filtered-out points. In laserscans, removal is not an option, so the filtered-out points are marked with NaN (some guides suggest that max_range + 1 should be used for marking invalid points, but this filter uses NaN as a safer option).

Performance tips

In general, the filter will be computationally expensive (clipping is fast, contains test is medium CPU intensive and shadow test is the most expensive part, because it basically performs raytracing). You can limit the required CPU power by limiting the filter only to parts that matter. E.g. if the robot has a link that can never be seen by the sensor, put it in the list of ignored links. The less links are processed, the better performance. If you’re only interested in removing a few links, consider using the only_links parameter.

To speed up shadow filtering, you can set filter/max_shadow_distance, which limits the number of points considered for shadow tests just to points close to the sensor. Setting this to e.g. three times the diameter of the robot should remove all of the shadow points caused by refraction by a part of the robot body. But you have to test this with real data.

Performance also strongly depends on representation of the robot model. The filter reads <collision> tags from the robot URDF. You can use boxes, spheres and cylinders (which are fast to process), or you can use convex meshes (these are much worse performance-wise). If you pass a non-convex mesh, its convex hull will be used for the tests. Don’t forget that each link can have multiple <collision> tags. If you do not have time to convert your meshes to the basic shapes, try to at least reduce the number of triangles in your meshes. You can use your high-quality meshes in <visual> tags. To simplify your model to primitive shapes, you can either manually edit the URDF, or you can utilize ColliderGen.

Model inflation

You can utilize the builtin model inflation mechanism to slightly alter the size of the model. You will probably want to add a bit “margin” to the contains and shadow tests so that points that are millimeters outside the robot body will anyways get removed. You can set a default scale and padding which are used for all collisions. Different inflation can be used for contains tests and for shadow tests. Inflation can also be specified differently for each link. Look at the body_model/inflation/* parameters for details.

Scaling means multiplying the shape dimensions by the given

File truncated at 100 lines see the full file

CHANGELOG

Changelog for package robot_body_filter

1.3.2 (2024-01-22)

  • Do not segfault when a mesh resource is not found.
  • Added full_example.
  • Contributors: Martin Pecka

1.3.1 (2023-07-14)

  • Explicitly specify minimum versions of required libraries.
  • Contributors: Martin Pecka

1.3.0 (2023-04-12)

  • Removed local implementation of oriented bounding boxes in favor of geometric_shapes/obb. This change breaks API and ABI, but I hope nobody explicitly used the OBB part of this library. This change requires geometric_shapes version 0.6.6+ (Melodic) or 0.7.5+ (Noetic) (released April 2023).
  • Add example launch & config for ease of use. Thanks Doan Nguyen for the contribution!
  • Changed xmlrpc_traits variables to constexpr static instead of inline static to decrease the required C++ language standard for this part. Changed stringType from std::string to const char*.
  • Improved xmlrpc_traits to recognize more types of valid structures.
  • Make filter_utils FilterBase::getParamVerbose() methods const. Allowed by https://github.com/ros/filters/pull/35 (released in Melodic filters 1.8.2 (October 2021) and Noetic filters 1.9.1 (September 2021)).
  • Contributors: Doan Nguyen, Martin Pecka

1.2.2 (2021-08-25)

  • Change ROS_WARN to ROS_INFO when loading a value of an undefined parameter
  • Add link to the youtube tutorial
  • Contributors: Martin Pecka

1.2.1 (2021-08-06)

  • Merge pull request #15 from universal-field-robots/master TFFramesWatchdog and RayCastingShapeMask need to be installed in the CMakeLists.txt
  • Added RayCastingShapeMask and TFFramesWatchdog to install targets in cmake
  • Contributors: Josh Owen, Martin Pecka

1.2.0 (2021-07-30)

  • Merge pull request #11 from peci1/correct-pointcloud-transforms Add possibility to specify pointcloud channels that should be transformed together with positional data.
  • Merge pull request #14 from peci1/per_link_scaling Add support for scaling/padding each link differently
  • Short-circuit classification of NaN points.
  • Warn about missing collision elements only for non-ignored links and only if they have at least one visual.
  • Fixed bounding shapes computation
  • Added filter/max_shadow_distance for great performance increase
  • Added possibility to scale/pad collisions separately for computation of bounding box and sphere.
  • Unique-ify test target name to allow building with geometric_shapes.
  • Use correct Eigen allocator.
  • Reflected the newly added support for non-uniformly scaled meshes.
  • Contributors: Martin Pecka

1.1.9 (2021-04-17)

  • Compatibility with Noetic.
  • Contributors: Martin Pecka

1.1.8 (2020-04-06)

  • Fixed typo.
  • Contributors: Martin Pecka

1.1.7 (2020-04-05)

  • When sensor frame is empty (autodetected), do not add it as initial monitored frame to watchdog.
  • Added configuration examples.
  • Make sure use_sim_time is set to false for tests.
  • Fixed computation of pointcloud without local bounding box.
  • Added tests for RobotBodyFilter.
  • No longer require the index pointcloud field in computeMask(), as it is not used anywhere.
  • Surprise! CropBox::setTransform() doesn't set transform of the cropbox, but instead a transform of the points. Fixed that.
  • Fixed copy-paste bug in bounding sphere publisher.
  • Fix incorrect usage of fixed frame at many places (substituted with filtering frame).
  • Make sure orientation.w = 1 is set in all published markers.
  • Correctly exclude the ignored bounding shapes from the published markers and shapes.

File truncated at 100 lines see the full file

Wiki Tutorials

This package does not provide any links to tutorials in it's rosindex metadata. You can check on the ROS Wiki Tutorials page for the package.

Launch files

Services

No service files found

Plugins

Recent questions tagged robot_body_filter at Robotics Stack Exchange


General overview

This filter reads the robot model and the filter config, subscribes to TF, waits for data (laser scans or point clouds) and then cleans them of various artifacts (this is called data filtering).

It can perform three kinds of data filtering: clipping the data based on the provided sensor limits (parameter filter/do_clipping), removing points that are inside or on the surface of the robot body (parameter filter/do_contains_test), and removing points that are seen through a part of the robot body (parameter filter/do_shadow_test). These tests are further referred to as “clipping”, “contains test” and “shadow test”.
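
The three tests are toggled via filter parameters. As a sketch of how this might look in a filter chain configuration (the plugin type name and exact nesting here are assumptions following the usual ros filters convention, not copied from this package's docs):

```yaml
# Hypothetical filter chain config enabling all three tests.
cloud_filter_chain:
  - name: body_filter
    type: robot_body_filter/RobotBodyFilterPointCloud2  # assumed plugin name
    params:
      filter/do_clipping: true       # clip data by sensor limits
      filter/do_contains_test: true  # remove points inside/on the robot body
      filter/do_shadow_test: true    # remove points seen through the robot body
```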

When working with point clouds, the filter automatically recognizes whether it is processing organized or non-organized clouds. In organized clouds, it marks the filtered-out points as NaN; in non-organized clouds, it removes them. In laser scans, removal is not an option, so the filtered-out points are marked with NaN (some guides suggest using max_range + 1 to mark invalid points, but this filter uses NaN as the safer option).
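
The two output conventions can be sketched in a few lines (an illustrative helper, not the filter's actual API): an organized cloud must keep its point grid intact, so filtered points are overwritten with NaN, while a non-organized cloud can simply drop them.

```python
import numpy as np

def apply_mask(points, keep_mask, organized):
    """Mimic the filter's output convention (illustrative, not the real API)."""
    if organized:
        out = points.astype(float).copy()
        out[~keep_mask] = np.nan   # keep the grid shape, invalidate the points
        return out
    return points[keep_mask]       # non-organized: remove the points outright

points = np.array([[0.0, 0.0, 1.0], [0.5, 0.0, 2.0], [1.0, 0.0, 3.0]])
mask = np.array([True, False, True])

print(apply_mask(points, mask, organized=False).shape)              # (2, 3)
print(np.isnan(apply_mask(points, mask, organized=True)[1]).all())  # True
```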

Performance tips

In general, the filter is computationally expensive (clipping is fast, the contains test is moderately CPU-intensive, and the shadow test is the most expensive part because it essentially performs ray tracing). You can limit the required CPU power by restricting the filter to the parts that matter. E.g. if the robot has a link that can never be seen by the sensor, put it in the list of ignored links. The fewer links are processed, the better the performance. If you are only interested in removing a few links, consider using the only_links parameter.
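
Both restrictions could be expressed in the filter's params block, e.g. (a sketch; the link names are hypothetical and the exact parameter names should be checked against the package docs):

```yaml
params:
  # Links the filter should never process (hypothetical link names):
  ignored_links: ["rear_antenna_link", "base_cover_link"]
  # Alternatively, process only these links and nothing else:
  # only_links: ["left_arm_link", "right_arm_link"]
```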

To speed up shadow filtering, you can set filter/max_shadow_distance, which limits the shadow test to points close to the sensor. Setting it to e.g. three times the diameter of the robot should remove all of the shadow points caused by refraction by a part of the robot body, but you have to verify this with real data.
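
The idea behind filter/max_shadow_distance can be sketched as a cheap distance pre-filter (illustrative code, not the package's implementation): only points within the threshold are handed to the expensive ray-tracing test, and the threshold itself follows the rule of thumb above.

```python
import numpy as np

def shadow_test_candidates(points, sensor_origin, max_shadow_distance):
    """Indices of points close enough to warrant the costly shadow test."""
    d = np.linalg.norm(points - sensor_origin, axis=1)
    return np.nonzero(d <= max_shadow_distance)[0]

robot_diameter = 1.2                      # assumed robot size in meters
max_shadow_distance = 3 * robot_diameter  # heuristic: three robot diameters

points = np.array([[0.5, 0.0, 0.0], [3.0, 0.0, 0.0], [10.0, 0.0, 0.0]])
idx = shadow_test_candidates(points, np.zeros(3), max_shadow_distance)
print(idx)  # [0 1] -- the 10 m point skips the expensive test entirely
```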

Performance also strongly depends on the representation of the robot model. The filter reads <collision> tags from the robot URDF. You can use boxes, spheres and cylinders (which are fast to process), or convex meshes (which are much more expensive). If you pass a non-convex mesh, its convex hull will be used for the tests. Don’t forget that each link can have multiple <collision> tags. If you do not have time to convert your meshes to the basic shapes, at least try to reduce the number of triangles in your meshes. You can keep your high-quality meshes in <visual> tags. To simplify your model to primitive shapes, you can either edit the URDF manually, or you can utilize ColliderGen.

Model inflation

You can utilize the built-in model inflation mechanism to slightly alter the size of the model. You will probably want to add a bit of “margin” to the contains and shadow tests so that points lying a few millimeters outside the robot body still get removed. You can set a default scale and padding, which are used for all collisions. Different inflation can be used for contains tests and for shadow tests, and inflation can also be specified differently for each link. See the body_model/inflation/* parameters for details.
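
The difference between scale and padding can be illustrated on a sphere (a sketch of the semantics described above, not the filter's code; whether the filter combines them exactly as radius * scale + padding is an assumption): scale multiplies the shape dimensions, while padding adds an absolute margin in meters.

```python
import numpy as np

def contains(point, center, radius, scale=1.0, padding=0.0):
    """Containment test against an inflated sphere: radius * scale + padding."""
    inflated = radius * scale + padding
    return np.linalg.norm(np.asarray(point) - np.asarray(center)) <= inflated

center, radius = (0.0, 0.0, 0.0), 1.0
p = (1.05, 0.0, 0.0)                             # 5 cm outside the raw body

print(contains(p, center, radius))               # False: raw model misses it
print(contains(p, center, radius, padding=0.1))  # True: 10 cm padding catches it
print(contains(p, center, radius, scale=1.1))    # True: 10 % scale catches it
```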

Scaling means multiplying the shape dimensions by the given

File truncated at 100 lines; see the full file.

CHANGELOG

Changelog for package robot_body_filter

1.3.2 (2024-01-22)

  • Do not segfault when a mesh resource is not found.
  • Added full_example.
  • Contributors: Martin Pecka

1.3.1 (2023-07-14)

  • Explicitly specify minimum versions of required libraries.
  • Contributors: Martin Pecka

1.3.0 (2023-04-12)

  • Removed local implementation of oriented bounding boxes in favor of geometric_shapes/obb. This change breaks API and ABI, but I hope nobody explicitly used the OBB part of this library. This change requires geometric_shapes version 0.6.6+ (Melodic) or 0.7.5+ (Noetic) (released April 2023).
  • Add example launch & config for ease of use. Thanks Doan Nguyen for the contribution!
  • Changed xmlrpc_traits variables to constexpr static instead of inline static to decrease the required C++ language standard for this part. Changed stringType from std::string to const char*.
  • Improved xmlrpc_traits to recognize more types of valid structures.
  • Make filter_utils FilterBase::getParamVerbose() methods const. Allowed by https://github.com/ros/filters/pull/35 (released in Melodic filters 1.8.2 (October 2021) and Noetic filters 1.9.1 (September 2021)).
  • Contributors: Doan Nguyen, Martin Pecka

1.2.2 (2021-08-25)

  • Change ROS_WARN to ROS_INFO when loading a value of an undefined parameter
  • Add link to the youtube tutorial
  • Contributors: Martin Pecka

1.2.1 (2021-08-06)

  • Merge pull request #15 from universal-field-robots/master: TFFramesWatchdog and RayCastingShapeMask need to be installed in the CMakeLists.txt
  • Added RayCastingShapeMask and TFFramesWatchdog to install targets in cmake
  • Contributors: Josh Owen, Martin Pecka

1.2.0 (2021-07-30)

  • Merge pull request #11 from peci1/correct-pointcloud-transforms: Add possibility to specify pointcloud channels that should be transformed together with positional data.
  • Merge pull request #14 from peci1/per_link_scaling: Add support for scaling/padding each link differently.
  • Short-circuit classification of NaN points.
  • Warn about missing collision elements only for non-ignored links and only if they have at least one visual.
  • Fixed bounding shapes computation
  • Added filter/max_shadow_distance for great performance increase
  • Added possibility to scale/pad collisions separately for computation of bounding box and sphere.
