Repository Summary

| | |
|---|---|
| Checkout URI | https://github.com/peci1/robot_body_filter.git |
| VCS Type | git |
| VCS Version | master |
| Last Updated | 2024-02-03 |
| Dev Status | DEVELOPED |
| CI Status | No Continuous Integration |
| Released | RELEASED |
| Tags | No category tags. |
| Contributing | Help Wanted (0), Good First Issues (0), Pull Requests to Review (0) |
Packages

| Name | Version |
|---|---|
| robot_body_filter | 1.3.2 |
README
robot_body_filter
Filters the robot’s body out of point clouds and laser scans.
Tutorial
Check out the webinar recording, where many of this filter's options are explained and demonstrated: https://www.youtube.com/watch?v=j0ljV0uZy3Q
Changes vs PR2/robot_self_filter:

- The package is now a standard filters::FilterBase filter, not a standalone node.
- Both containment and ray-tracing tests are used.
- All collision elements of each link are used, instead of only the first one.
- A generic point type is supported, removing the PCL dependency and unnecessary parameters.
- bodies.h and shapes.h from geometric_shapes are used.
- As a by-product, the filter can compute the robot's bounding box or sphere.
Basic Operation

filters::FilterBase API

The basic workings of this filter are done via the filters::FilterBase API implemented for the sensor_msgs::LaserScan and sensor_msgs::PointCloud2 types. This means you can load this filter into a FilterChain along with other filters as usual. Unlike standard filters, this one can also publish several interesting topics, and it subscribes to TF.
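As a minimal sketch of loading the filter into a laser scan chain (the plugin name robot_body_filter/RobotBodyFilterLaserScan and the parameter names below are assumptions based on this package's conventions; verify them against your installed version):

```yaml
# scan_filter_chain.yaml (hypothetical file name)
scan_filter_chain:
  - name: body_filter
    type: robot_body_filter/RobotBodyFilterLaserScan   # assumed plugin name
    params:
      frames/sensor: "laser"                           # assumed sensor frame name
      body_model/robot_description_param: "robot_description"  # assumed parameter name
```

An analogous point cloud filter type (assumed name: robot_body_filter/RobotBodyFilterPointCloud2) would be used in a sensor_msgs::PointCloud2 chain.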
General overview
This filter reads the robot model and the filter configuration, subscribes to TF, waits for data (laser scans or point clouds), and then cleans the data of various artifacts (this is called data filtering).
It can perform 3 kinds of data filters: clip the data based on the provided sensor limits (parameter filter/do_clipping), remove points that are inside or on the surface of the robot body (parameter filter/do_contains_test), and remove points that are seen through a part of the robot body (parameter filter/do_shadow_test). These kinds of tests are further referenced as “clipping”, “contains test” and “shadow test”.
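The three tests can be toggled independently. A sketch (the filter/do_* names are documented above; the sensor limit parameter names and values are assumptions):

```yaml
params:
  filter/do_clipping: true        # drop points outside the sensor limits
  filter/do_contains_test: true   # drop points inside or on the robot body
  filter/do_shadow_test: false    # the most expensive test; enable when needed
  sensor/min_distance: 0.1        # assumed parameter name; meters
  sensor/max_distance: 10.0       # assumed parameter name; meters
```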
If working with point clouds, the filter automatically recognizes whether it works with organized or non-organized clouds. In organized clouds, it marks the filtered-out points as NaN. In non-organized clouds, it removes the filtered-out points. In laser scans, removal is not an option, so the filtered-out points are marked with NaN (some guides suggest that max_range + 1 should be used for marking invalid points, but this filter uses NaN as the safer option).
Performance tips
In general, the filter will be computationally expensive (clipping is fast,
contains test is medium CPU intensive and shadow test is the most expensive
part, because it basically performs raytracing). You can limit the required CPU
power by limiting the filter only to parts that matter. E.g. if the robot has a
link that can never be seen by the sensor, put it in the list of ignored links.
The less links are processed, the better performance. If you’re only interested
in removing a few links, consider using the only_links
parameter.
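A sketch of both approaches (the link names are made up, and the ignored_links parameter name is an assumption based on the "list of ignored links" mentioned above):

```yaml
params:
  # skip links the sensor can never see (parameter name assumed):
  ignored_links: ["base_link", "rear_bumper"]
  # or, conversely, restrict filtering to a few links of interest:
  only_links: ["left_arm", "right_arm"]
```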
To speed up shadow filtering, you can set filter/max_shadow_distance, which limits the points considered for shadow tests to those close to the sensor. Setting this to e.g. three times the diameter of the robot should remove all of the shadow points caused by refraction by a part of the robot body, but you have to test this with real data.
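For example, for a robot roughly 1 m across (a sketch; the unit is assumed to be meters, and the value must be tuned on real data):

```yaml
params:
  # only points within ~3 robot diameters undergo the shadow test
  filter/max_shadow_distance: 3.0
```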
Performance also strongly depends on the representation of the robot model. The filter reads the <collision> tags from the robot URDF. You can use boxes, spheres and cylinders (which are fast to process), or you can use convex meshes (which are much worse performance-wise). If you pass a non-convex mesh, its convex hull will be used for the tests. Don't forget that each link can have multiple <collision> tags. If you do not have time to convert your meshes to the basic shapes, try to at least reduce the number of triangles in your meshes. You can still use your high-quality meshes in <visual> tags. To simplify your model to primitive shapes, you can either manually edit the URDF, or you can utilize ColliderGen.
Model inflation
You can utilize the builtin model inflation mechanism to slightly alter the
size of the model. You will probably want to add a bit “margin” to the contains
and shadow tests so that points that are millimeters outside the robot body will
anyways get removed. You can set a default scale and padding which are used for
all collisions. Different inflation can be used for contains tests and for
shadow tests. Inflation can also be specified differently for each link. Look
at the body_model/inflation/*
parameters for details.
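A sketch of inflation settings (the exact key layout under body_model/inflation/* is an assumption; consult the full parameter reference, and the link name is made up):

```yaml
params:
  body_model/inflation/scale: 1.1     # default scale applied to all collisions
  body_model/inflation/padding: 0.02  # default padding in meters
  # different inflation per test kind (keys assumed):
  body_model/inflation/contains_test/padding: 0.05
  body_model/inflation/shadow_test/padding: 0.01
  # per-link override (key and format assumed):
  body_model/inflation/per_link/padding: {"laser_mount": 0.0}
```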
Scaling means multiplying the shape dimensions by the given factor.
File truncated at 100 lines; see the full file.