Repository Summary

| Field | Value |
|---|---|
| Checkout URI | https://github.com/PRBonn/rko_lio.git |
| VCS Type | git |
| VCS Version | master |
| Last Updated | 2025-09-12 |
| Dev Status | DEVELOPED |
| Released | UNRELEASED |
| Tags | No category tags. |
| Contributing | Help Wanted (-), Good First Issues (-), Pull Requests to Review (-) |

Packages

| Name | Version |
|---|---|
| rko_lio | 0.1.0 |
README

RKO_LIO - LiDAR-Inertial Odometry Without Sensor-Specific Modelling

Four different platforms, four different environments, one odometry system.
Quick Start
Python
If you already have a rosbag (ROS1 or ROS2) that contains a TF tree, you can inspect the results of our odometry system in two steps. First, install the packages:

```bash
pip install rko_lio rosbags rerun-sdk
```
`rko_lio` is our odometry package, `rosbags` is required for our rosbag dataloader, and `rerun-sdk` is what we use for our optional visualizer.
Next, run
```bash
rko_lio -v /path/to/rosbag_folder # <- has to be a directory! with either *.bag files or a metadata.yaml from ROS2
```
and you should be good to go!
Here are some more details on how the above works and how to use RKO_LIO.
The `-v` flag enables visualization. You can specify a dataloader to use with `-d`, but if you don't, we try to guess the format based on the layout of the data.

Our rosbag dataloader works with either ROS1 or ROS2 bags. Place split ROS1 bags in a single folder and pass the folder as the data path. Note that we don't support running `rko_lio` on partial or incomplete bags, though you can try (and maybe raise an issue if you think we should support this). ROS2 bags especially will need a `metadata.yaml` file.

By default, we assume there is just one IMU topic and one LiDAR topic in the bag, in which case we automatically pick up the topic names and proceed. If there are multiple topics per sensor, you will be prompted to select one via the `--imu` or `--lidar` flags, which you can pass to `rko_lio`.

Next, we assume there is a (static) TF tree in the bag. If so, we take the frame ids from the message topics we just picked up, build a static TF tree, and then query it for the extrinsic from IMU to LiDAR. Our odometry estimates the robot pose with respect to a base frame, and by default we assume the LiDAR frame to be the base frame. If you would like to use a different frame, you can pass the frame id with `--base_frame` (note the other options available with `--help`). The TF tree will be queried for the appropriate transformations (if they exist in the bag!).

If there is no TF tree in the bag, you will have to manually specify the extrinsics for IMU to base frame and LiDAR to base frame, as these two are **required** parameters. Set one of the extrinsics to identity if you want that frame to be the base frame (you will still have to specify both parameters). You can specify the extrinsics via a config YAML file with the keys `extrinsic_imu2base_quat_xyzw_xyz` and `extrinsic_lidar2base_quat_xyzw_xyz`. You can dump a config with all the options set to default values by running `rko_lio --dump_config`.
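For the no-TF-tree case, a config carrying both extrinsics might look like the sketch below. The key names are from `--dump_config` as described above; the value layout (quaternion in xyzw order followed by the xyz translation, as the key names suggest) and all numbers are assumptions — replace them with your own calibration:

```yaml
# Hypothetical extrinsics -- replace with your actual calibration.
# Assumed layout: [qx, qy, qz, qw, x, y, z]
extrinsic_imu2base_quat_xyzw_xyz: [0.0, 0.0, 0.0, 1.0, 0.0, 0.0, 0.0]   # identity: IMU is the base frame
extrinsic_lidar2base_quat_xyzw_xyz: [0.0, 0.0, 0.0, 1.0, 0.08, 0.0, 0.12] # placeholder mounting offset
```

Setting one extrinsic to identity, as in the first key here, makes that sensor's frame the base frame while still satisfying the requirement that both parameters be specified.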
Modify it as you require, and pass the file to `rko_lio` using the `-c` flag. Please check `python/config` in the GitHub repository for example configurations. An example invocation would then be

```bash
# the config should have the sensor extrinsics if the rosbag doesn't
rko_lio -v -c config.yaml --imu imu_topic --lidar lidar_topic /path/to/rosbag_folder
```

For all possible CLI flags, please check `rko_lio --help`.
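When a TF tree is present, the IMU-to-LiDAR extrinsic queried from it is equivalent to composing the two base-frame extrinsics described above. A minimal numpy sketch of that composition, assuming the `[qx, qy, qz, qw, x, y, z]` layout (the key names come from the config; the numeric values here are hypothetical, and this is illustrative, not rko_lio's actual internals):

```python
import numpy as np

def to_matrix(quat_xyzw_xyz):
    """Build a 4x4 homogeneous transform from [qx, qy, qz, qw, x, y, z]."""
    qx, qy, qz, qw, x, y, z = quat_xyzw_xyz
    T = np.eye(4)
    # standard unit-quaternion to rotation-matrix conversion
    T[:3, :3] = np.array([
        [1 - 2*(qy*qy + qz*qz), 2*(qx*qy - qz*qw),     2*(qx*qz + qy*qw)],
        [2*(qx*qy + qz*qw),     1 - 2*(qx*qx + qz*qz), 2*(qy*qz - qx*qw)],
        [2*(qx*qz - qy*qw),     2*(qy*qz + qx*qw),     1 - 2*(qx*qx + qy*qy)],
    ])
    T[:3, 3] = [x, y, z]
    return T

# identity extrinsic: the IMU frame is the base frame
T_imu2base = to_matrix([0.0, 0.0, 0.0, 1.0, 0.0, 0.0, 0.0])
# hypothetical LiDAR mounting: 90 degree yaw, small translation offset
T_lidar2base = to_matrix([0.0, 0.0, np.sin(np.pi / 4), np.cos(np.pi / 4), 0.1, 0.0, 0.2])

# IMU -> LiDAR = (base -> LiDAR) composed with (IMU -> base)
T_imu2lidar = np.linalg.inv(T_lidar2base) @ T_imu2base
print(T_imu2lidar)
```

Since the IMU extrinsic is identity here, the result is simply the inverse of the LiDAR mounting transform; with a real calibration both factors contribute.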
For more installation and usage instructions for our Python interface, please refer to the Python readme and the config doc. The Python interface can be convenient for investigating recorded data offline, as you don't need to set up a ROS environment first.
Repository Summary
Checkout URI | https://github.com/PRBonn/rko_lio.git |
VCS Type | git |
VCS Version | master |
Last Updated | 2025-09-12 |
Dev Status | DEVELOPED |
Released | UNRELEASED |
Tags | No category tags. |
Contributing |
Help Wanted (-)
Good First Issues (-) Pull Requests to Review (-) |
Packages
Name | Version |
---|---|
rko_lio | 0.1.0 |
README
RKO_LIO - LiDAR-Inertial Odometry
Without Sensor-Specific Modelling
Four different platforms, four different environments, one odometry system
Quick Start
Python
In case you already have a rosbag (ROS1 or ROS2) which contains a TF tree, you can inspect the results of our odometry system with the following two steps
pip install rko_lio rosbags rerun-sdk
rko_lio
is our odometry package, rosbags
is required for using our rosbag dataloader, and rerun-sdk
is what we use for our optional visualizer.
Next, run
rko_lio -v /path/to/rosbag_folder # <- has to be a directory! with either *.bag files or metadata.yaml from ROS2
and you should be good to go!
Click here for some more details on how the above works and how to use RKO_LIO!
The `-v` flag enables visualization. You can specify a dataloader to use with `-d`, but if you don't, we try to guess the format based on the layout of the data. Our rosbag dataloader works with either ROS1 or ROS2 bags. Place split ROS1 bags in a single folder and pass the folder as the data path. Note that we don't support running `rko_lio` on partial or incomplete bags, though you can try (and maybe raise an issue if you think we should support this). ROS2 especially will need a `metadata.yaml` file. By default, we assume there is just one IMU topic and one LiDAR topic in the bag, in which case we automatically pick up the topic names and proceed further. If there are multiple topics per sensor, you will be prompted to select one via the `--imu` or `--lidar` flags, which you can pass to `rko_lio`. Next, we assume there is a (static) TF tree in the bag. If so, we take the frame ids from the message topics we just picked up, build a static TF tree, and then query it for the extrinsic from IMU to LiDAR. Our odometry estimates the robot pose with respect to a base frame, and by default, we assume the LiDAR frame to be the base frame. If you would like to use a different frame, you can pass the frame id with `--base_frame` (note the other options available with `--help`). The TF tree will be queried for the appropriate transformations (if they exist in the bag!). In case there is no TF tree in the bag, then you will have to manually specify the extrinsics for IMU to base frame and LiDAR to base frame, as these two are **required** parameters. Set one of the extrinsics to identity if you want that one to be the base frame (you will still have to specify both parameters). You can specify the extrinsics via a config YAML file with the keys `extrinsic_imu2base_quat_xyzw_xyz` and `extrinsic_lidar2base_quat_xyzw_xyz`. You can dump a config with all the options set to default values by running `rko_lio --dump_config`. 
Modify as you require, and pass this file to `rko_lio` using the `-c` flag. Please check `python/config` in the GitHub repository for example configurations. An example invocation would then be ```bash # the config should have the sensor extrinsics if the rosbag doesn't rko_lio -v -c config.yaml --imu imu_topic --lidar lidar_topic /path/to/rosbag_folder ``` For all possible CLI flags, please check `rko_lio --help`.
For more install and usage instructions of our python interface, please refer to the python readme and the config doc.
The python interface to our system can be convenient to investigate recorded data offline as you don’t need to setup a ROS environment first.
File truncated at 100 lines see the full file
CONTRIBUTING
Repository Summary
Checkout URI | https://github.com/PRBonn/rko_lio.git |
VCS Type | git |
VCS Version | master |
Last Updated | 2025-09-12 |
Dev Status | DEVELOPED |
Released | UNRELEASED |
Tags | No category tags. |
Contributing |
Help Wanted (-)
Good First Issues (-) Pull Requests to Review (-) |
Packages
Name | Version |
---|---|
rko_lio | 0.1.0 |
README
RKO_LIO - LiDAR-Inertial Odometry
Without Sensor-Specific Modelling
Four different platforms, four different environments, one odometry system
Quick Start
Python
In case you already have a rosbag (ROS1 or ROS2) which contains a TF tree, you can inspect the results of our odometry system with the following two steps
pip install rko_lio rosbags rerun-sdk
rko_lio
is our odometry package, rosbags
is required for using our rosbag dataloader, and rerun-sdk
is what we use for our optional visualizer.
Next, run
rko_lio -v /path/to/rosbag_folder # <- has to be a directory! with either *.bag files or metadata.yaml from ROS2
and you should be good to go!
Click here for some more details on how the above works and how to use RKO_LIO!
The `-v` flag enables visualization. You can specify a dataloader to use with `-d`, but if you don't, we try to guess the format based on the layout of the data. Our rosbag dataloader works with either ROS1 or ROS2 bags. Place split ROS1 bags in a single folder and pass the folder as the data path. Note that we don't support running `rko_lio` on partial or incomplete bags, though you can try (and maybe raise an issue if you think we should support this). ROS2 especially will need a `metadata.yaml` file. By default, we assume there is just one IMU topic and one LiDAR topic in the bag, in which case we automatically pick up the topic names and proceed further. If there are multiple topics per sensor, you will be prompted to select one via the `--imu` or `--lidar` flags, which you can pass to `rko_lio`. Next, we assume there is a (static) TF tree in the bag. If so, we take the frame ids from the message topics we just picked up, build a static TF tree, and then query it for the extrinsic from IMU to LiDAR. Our odometry estimates the robot pose with respect to a base frame, and by default, we assume the LiDAR frame to be the base frame. If you would like to use a different frame, you can pass the frame id with `--base_frame` (note the other options available with `--help`). The TF tree will be queried for the appropriate transformations (if they exist in the bag!). In case there is no TF tree in the bag, then you will have to manually specify the extrinsics for IMU to base frame and LiDAR to base frame, as these two are **required** parameters. Set one of the extrinsics to identity if you want that one to be the base frame (you will still have to specify both parameters). You can specify the extrinsics via a config YAML file with the keys `extrinsic_imu2base_quat_xyzw_xyz` and `extrinsic_lidar2base_quat_xyzw_xyz`. You can dump a config with all the options set to default values by running `rko_lio --dump_config`. 
Modify as you require, and pass this file to `rko_lio` using the `-c` flag. Please check `python/config` in the GitHub repository for example configurations. An example invocation would then be ```bash # the config should have the sensor extrinsics if the rosbag doesn't rko_lio -v -c config.yaml --imu imu_topic --lidar lidar_topic /path/to/rosbag_folder ``` For all possible CLI flags, please check `rko_lio --help`.
For more install and usage instructions of our python interface, please refer to the python readme and the config doc.
The python interface to our system can be convenient to investigate recorded data offline as you don’t need to setup a ROS environment first.
File truncated at 100 lines see the full file
CONTRIBUTING
Repository Summary
Checkout URI | https://github.com/PRBonn/rko_lio.git |
VCS Type | git |
VCS Version | master |
Last Updated | 2025-09-12 |
Dev Status | DEVELOPED |
Released | UNRELEASED |
Tags | No category tags. |
Contributing |
Help Wanted (-)
Good First Issues (-) Pull Requests to Review (-) |
Packages
Name | Version |
---|---|
rko_lio | 0.1.0 |
README
RKO_LIO - LiDAR-Inertial Odometry
Without Sensor-Specific Modelling
Four different platforms, four different environments, one odometry system
Quick Start
Python
In case you already have a rosbag (ROS1 or ROS2) which contains a TF tree, you can inspect the results of our odometry system with the following two steps
pip install rko_lio rosbags rerun-sdk
rko_lio
is our odometry package, rosbags
is required for using our rosbag dataloader, and rerun-sdk
is what we use for our optional visualizer.
Next, run
rko_lio -v /path/to/rosbag_folder # <- has to be a directory! with either *.bag files or metadata.yaml from ROS2
and you should be good to go!
Click here for some more details on how the above works and how to use RKO_LIO!
The `-v` flag enables visualization. You can specify a dataloader to use with `-d`, but if you don't, we try to guess the format based on the layout of the data. Our rosbag dataloader works with either ROS1 or ROS2 bags. Place split ROS1 bags in a single folder and pass the folder as the data path. Note that we don't support running `rko_lio` on partial or incomplete bags, though you can try (and maybe raise an issue if you think we should support this). ROS2 especially will need a `metadata.yaml` file. By default, we assume there is just one IMU topic and one LiDAR topic in the bag, in which case we automatically pick up the topic names and proceed further. If there are multiple topics per sensor, you will be prompted to select one via the `--imu` or `--lidar` flags, which you can pass to `rko_lio`. Next, we assume there is a (static) TF tree in the bag. If so, we take the frame ids from the message topics we just picked up, build a static TF tree, and then query it for the extrinsic from IMU to LiDAR. Our odometry estimates the robot pose with respect to a base frame, and by default, we assume the LiDAR frame to be the base frame. If you would like to use a different frame, you can pass the frame id with `--base_frame` (note the other options available with `--help`). The TF tree will be queried for the appropriate transformations (if they exist in the bag!). In case there is no TF tree in the bag, then you will have to manually specify the extrinsics for IMU to base frame and LiDAR to base frame, as these two are **required** parameters. Set one of the extrinsics to identity if you want that one to be the base frame (you will still have to specify both parameters). You can specify the extrinsics via a config YAML file with the keys `extrinsic_imu2base_quat_xyzw_xyz` and `extrinsic_lidar2base_quat_xyzw_xyz`. You can dump a config with all the options set to default values by running `rko_lio --dump_config`. 
Modify as you require, and pass this file to `rko_lio` using the `-c` flag. Please check `python/config` in the GitHub repository for example configurations. An example invocation would then be ```bash # the config should have the sensor extrinsics if the rosbag doesn't rko_lio -v -c config.yaml --imu imu_topic --lidar lidar_topic /path/to/rosbag_folder ``` For all possible CLI flags, please check `rko_lio --help`.
For more install and usage instructions of our python interface, please refer to the python readme and the config doc.
The python interface to our system can be convenient to investigate recorded data offline as you don’t need to setup a ROS environment first.
File truncated at 100 lines see the full file
CONTRIBUTING
Repository Summary
Checkout URI | https://github.com/PRBonn/rko_lio.git |
VCS Type | git |
VCS Version | master |
Last Updated | 2025-09-12 |
Dev Status | DEVELOPED |
Released | UNRELEASED |
Tags | No category tags. |
Contributing |
Help Wanted (-)
Good First Issues (-) Pull Requests to Review (-) |
Packages
Name | Version |
---|---|
rko_lio | 0.1.0 |
README
RKO_LIO - LiDAR-Inertial Odometry
Without Sensor-Specific Modelling
Four different platforms, four different environments, one odometry system
Quick Start
Python
In case you already have a rosbag (ROS1 or ROS2) which contains a TF tree, you can inspect the results of our odometry system with the following two steps
pip install rko_lio rosbags rerun-sdk
rko_lio
is our odometry package, rosbags
is required for using our rosbag dataloader, and rerun-sdk
is what we use for our optional visualizer.
Next, run
rko_lio -v /path/to/rosbag_folder # <- has to be a directory! with either *.bag files or metadata.yaml from ROS2
and you should be good to go!
Click here for some more details on how the above works and how to use RKO_LIO!
The `-v` flag enables visualization. You can specify a dataloader to use with `-d`, but if you don't, we try to guess the format based on the layout of the data. Our rosbag dataloader works with either ROS1 or ROS2 bags. Place split ROS1 bags in a single folder and pass the folder as the data path. Note that we don't support running `rko_lio` on partial or incomplete bags, though you can try (and maybe raise an issue if you think we should support this). ROS2 especially will need a `metadata.yaml` file. By default, we assume there is just one IMU topic and one LiDAR topic in the bag, in which case we automatically pick up the topic names and proceed further. If there are multiple topics per sensor, you will be prompted to select one via the `--imu` or `--lidar` flags, which you can pass to `rko_lio`. Next, we assume there is a (static) TF tree in the bag. If so, we take the frame ids from the message topics we just picked up, build a static TF tree, and then query it for the extrinsic from IMU to LiDAR. Our odometry estimates the robot pose with respect to a base frame, and by default, we assume the LiDAR frame to be the base frame. If you would like to use a different frame, you can pass the frame id with `--base_frame` (note the other options available with `--help`). The TF tree will be queried for the appropriate transformations (if they exist in the bag!). In case there is no TF tree in the bag, then you will have to manually specify the extrinsics for IMU to base frame and LiDAR to base frame, as these two are **required** parameters. Set one of the extrinsics to identity if you want that one to be the base frame (you will still have to specify both parameters). You can specify the extrinsics via a config YAML file with the keys `extrinsic_imu2base_quat_xyzw_xyz` and `extrinsic_lidar2base_quat_xyzw_xyz`. You can dump a config with all the options set to default values by running `rko_lio --dump_config`. 
Modify as you require, and pass this file to `rko_lio` using the `-c` flag. Please check `python/config` in the GitHub repository for example configurations. An example invocation would then be ```bash # the config should have the sensor extrinsics if the rosbag doesn't rko_lio -v -c config.yaml --imu imu_topic --lidar lidar_topic /path/to/rosbag_folder ``` For all possible CLI flags, please check `rko_lio --help`.
For more install and usage instructions of our python interface, please refer to the python readme and the config doc.
The python interface to our system can be convenient to investigate recorded data offline as you don’t need to setup a ROS environment first.
File truncated at 100 lines see the full file
CONTRIBUTING
Repository Summary
Checkout URI | https://github.com/PRBonn/rko_lio.git |
VCS Type | git |
VCS Version | master |
Last Updated | 2025-09-12 |
Dev Status | DEVELOPED |
Released | UNRELEASED |
Tags | No category tags. |
Contributing |
Help Wanted (-)
Good First Issues (-) Pull Requests to Review (-) |
Packages
Name | Version |
---|---|
rko_lio | 0.1.0 |
README
RKO_LIO - LiDAR-Inertial Odometry
Without Sensor-Specific Modelling
Four different platforms, four different environments, one odometry system
Quick Start
Python
In case you already have a rosbag (ROS1 or ROS2) which contains a TF tree, you can inspect the results of our odometry system with the following two steps
pip install rko_lio rosbags rerun-sdk
rko_lio
is our odometry package, rosbags
is required for using our rosbag dataloader, and rerun-sdk
is what we use for our optional visualizer.
Next, run
rko_lio -v /path/to/rosbag_folder # <- has to be a directory! with either *.bag files or metadata.yaml from ROS2
and you should be good to go!
Click here for some more details on how the above works and how to use RKO_LIO!
The `-v` flag enables visualization. You can specify a dataloader to use with `-d`, but if you don't, we try to guess the format based on the layout of the data. Our rosbag dataloader works with either ROS1 or ROS2 bags. Place split ROS1 bags in a single folder and pass the folder as the data path. Note that we don't support running `rko_lio` on partial or incomplete bags, though you can try (and maybe raise an issue if you think we should support this). ROS2 especially will need a `metadata.yaml` file. By default, we assume there is just one IMU topic and one LiDAR topic in the bag, in which case we automatically pick up the topic names and proceed further. If there are multiple topics per sensor, you will be prompted to select one via the `--imu` or `--lidar` flags, which you can pass to `rko_lio`. Next, we assume there is a (static) TF tree in the bag. If so, we take the frame ids from the message topics we just picked up, build a static TF tree, and then query it for the extrinsic from IMU to LiDAR. Our odometry estimates the robot pose with respect to a base frame, and by default, we assume the LiDAR frame to be the base frame. If you would like to use a different frame, you can pass the frame id with `--base_frame` (note the other options available with `--help`). The TF tree will be queried for the appropriate transformations (if they exist in the bag!). In case there is no TF tree in the bag, then you will have to manually specify the extrinsics for IMU to base frame and LiDAR to base frame, as these two are **required** parameters. Set one of the extrinsics to identity if you want that one to be the base frame (you will still have to specify both parameters). You can specify the extrinsics via a config YAML file with the keys `extrinsic_imu2base_quat_xyzw_xyz` and `extrinsic_lidar2base_quat_xyzw_xyz`. You can dump a config with all the options set to default values by running `rko_lio --dump_config`. 
Modify as you require, and pass this file to `rko_lio` using the `-c` flag. Please check `python/config` in the GitHub repository for example configurations. An example invocation would then be ```bash # the config should have the sensor extrinsics if the rosbag doesn't rko_lio -v -c config.yaml --imu imu_topic --lidar lidar_topic /path/to/rosbag_folder ``` For all possible CLI flags, please check `rko_lio --help`.
For more install and usage instructions of our python interface, please refer to the python readme and the config doc.
The python interface to our system can be convenient to investigate recorded data offline as you don’t need to setup a ROS environment first.
File truncated at 100 lines see the full file
CONTRIBUTING
Repository Summary
Checkout URI | https://github.com/PRBonn/rko_lio.git |
VCS Type | git |
VCS Version | master |
Last Updated | 2025-09-12 |
Dev Status | DEVELOPED |
Released | UNRELEASED |
Tags | No category tags. |
Contributing |
Help Wanted (-)
Good First Issues (-) Pull Requests to Review (-) |
Packages
Name | Version |
---|---|
rko_lio | 0.1.0 |
README
RKO_LIO - LiDAR-Inertial Odometry
Without Sensor-Specific Modelling
Four different platforms, four different environments, one odometry system
Quick Start
Python
In case you already have a rosbag (ROS1 or ROS2) which contains a TF tree, you can inspect the results of our odometry system with the following two steps
pip install rko_lio rosbags rerun-sdk
rko_lio
is our odometry package, rosbags
is required for using our rosbag dataloader, and rerun-sdk
is what we use for our optional visualizer.
Next, run
rko_lio -v /path/to/rosbag_folder # <- has to be a directory! with either *.bag files or metadata.yaml from ROS2
and you should be good to go!
Click here for some more details on how the above works and how to use RKO_LIO!
The `-v` flag enables visualization. You can specify a dataloader to use with `-d`, but if you don't, we try to guess the format based on the layout of the data. Our rosbag dataloader works with either ROS1 or ROS2 bags. Place split ROS1 bags in a single folder and pass the folder as the data path. Note that we don't support running `rko_lio` on partial or incomplete bags, though you can try (and maybe raise an issue if you think we should support this). ROS2 especially will need a `metadata.yaml` file. By default, we assume there is just one IMU topic and one LiDAR topic in the bag, in which case we automatically pick up the topic names and proceed further. If there are multiple topics per sensor, you will be prompted to select one via the `--imu` or `--lidar` flags, which you can pass to `rko_lio`. Next, we assume there is a (static) TF tree in the bag. If so, we take the frame ids from the message topics we just picked up, build a static TF tree, and then query it for the extrinsic from IMU to LiDAR. Our odometry estimates the robot pose with respect to a base frame, and by default, we assume the LiDAR frame to be the base frame. If you would like to use a different frame, you can pass the frame id with `--base_frame` (note the other options available with `--help`). The TF tree will be queried for the appropriate transformations (if they exist in the bag!). In case there is no TF tree in the bag, then you will have to manually specify the extrinsics for IMU to base frame and LiDAR to base frame, as these two are **required** parameters. Set one of the extrinsics to identity if you want that one to be the base frame (you will still have to specify both parameters). You can specify the extrinsics via a config YAML file with the keys `extrinsic_imu2base_quat_xyzw_xyz` and `extrinsic_lidar2base_quat_xyzw_xyz`. You can dump a config with all the options set to default values by running `rko_lio --dump_config`. 
Modify as you require, and pass this file to `rko_lio` using the `-c` flag. Please check `python/config` in the GitHub repository for example configurations. An example invocation would then be ```bash # the config should have the sensor extrinsics if the rosbag doesn't rko_lio -v -c config.yaml --imu imu_topic --lidar lidar_topic /path/to/rosbag_folder ``` For all possible CLI flags, please check `rko_lio --help`.
For more install and usage instructions of our python interface, please refer to the python readme and the config doc.
The python interface to our system can be convenient to investigate recorded data offline as you don’t need to setup a ROS environment first.
File truncated at 100 lines see the full file
CONTRIBUTING
Repository Summary
Checkout URI | https://github.com/PRBonn/rko_lio.git |
VCS Type | git |
VCS Version | master |
Last Updated | 2025-09-12 |
Dev Status | DEVELOPED |
Released | UNRELEASED |
Tags | No category tags. |
Contributing |
Help Wanted (-)
Good First Issues (-) Pull Requests to Review (-) |
Packages
Name | Version |
---|---|
rko_lio | 0.1.0 |
README
RKO_LIO - LiDAR-Inertial Odometry
Without Sensor-Specific Modelling
Four different platforms, four different environments, one odometry system
Quick Start
Python
In case you already have a rosbag (ROS1 or ROS2) which contains a TF tree, you can inspect the results of our odometry system with the following two steps
pip install rko_lio rosbags rerun-sdk
rko_lio
is our odometry package, rosbags
is required for using our rosbag dataloader, and rerun-sdk
is what we use for our optional visualizer.
Next, run
rko_lio -v /path/to/rosbag_folder # <- has to be a directory! with either *.bag files or metadata.yaml from ROS2
and you should be good to go!
Click here for some more details on how the above works and how to use RKO_LIO!
The `-v` flag enables visualization. You can specify a dataloader to use with `-d`, but if you don't, we try to guess the format based on the layout of the data. Our rosbag dataloader works with either ROS1 or ROS2 bags. Place split ROS1 bags in a single folder and pass the folder as the data path. Note that we don't support running `rko_lio` on partial or incomplete bags, though you can try (and maybe raise an issue if you think we should support this). ROS2 especially will need a `metadata.yaml` file. By default, we assume there is just one IMU topic and one LiDAR topic in the bag, in which case we automatically pick up the topic names and proceed further. If there are multiple topics per sensor, you will be prompted to select one via the `--imu` or `--lidar` flags, which you can pass to `rko_lio`. Next, we assume there is a (static) TF tree in the bag. If so, we take the frame ids from the message topics we just picked up, build a static TF tree, and then query it for the extrinsic from IMU to LiDAR. Our odometry estimates the robot pose with respect to a base frame, and by default, we assume the LiDAR frame to be the base frame. If you would like to use a different frame, you can pass the frame id with `--base_frame` (note the other options available with `--help`). The TF tree will be queried for the appropriate transformations (if they exist in the bag!). In case there is no TF tree in the bag, then you will have to manually specify the extrinsics for IMU to base frame and LiDAR to base frame, as these two are **required** parameters. Set one of the extrinsics to identity if you want that one to be the base frame (you will still have to specify both parameters). You can specify the extrinsics via a config YAML file with the keys `extrinsic_imu2base_quat_xyzw_xyz` and `extrinsic_lidar2base_quat_xyzw_xyz`. You can dump a config with all the options set to default values by running `rko_lio --dump_config`. 
Modify as you require, and pass this file to `rko_lio` using the `-c` flag. Please check `python/config` in the GitHub repository for example configurations. An example invocation would then be ```bash # the config should have the sensor extrinsics if the rosbag doesn't rko_lio -v -c config.yaml --imu imu_topic --lidar lidar_topic /path/to/rosbag_folder ``` For all possible CLI flags, please check `rko_lio --help`.
For more install and usage instructions of our python interface, please refer to the python readme and the config doc.
The python interface to our system can be convenient to investigate recorded data offline as you don’t need to setup a ROS environment first.
File truncated at 100 lines see the full file
CONTRIBUTING
Repository Summary
Checkout URI | https://github.com/PRBonn/rko_lio.git |
VCS Type | git |
VCS Version | master |
Last Updated | 2025-09-12 |
Dev Status | DEVELOPED |
Released | UNRELEASED |
Tags | No category tags. |
Contributing |
Help Wanted (-)
Good First Issues (-) Pull Requests to Review (-) |
Packages
Name | Version |
---|---|
rko_lio | 0.1.0 |
README
RKO_LIO - LiDAR-Inertial Odometry
Without Sensor-Specific Modelling
Four different platforms, four different environments, one odometry system
Quick Start
Python
In case you already have a rosbag (ROS1 or ROS2) which contains a TF tree, you can inspect the results of our odometry system with the following two steps
pip install rko_lio rosbags rerun-sdk
rko_lio
is our odometry package, rosbags
is required for using our rosbag dataloader, and rerun-sdk
is what we use for our optional visualizer.
Next, run
rko_lio -v /path/to/rosbag_folder # <- has to be a directory! with either *.bag files or metadata.yaml from ROS2
and you should be good to go!
Click here for some more details on how the above works and how to use RKO_LIO!
The `-v` flag enables visualization. You can specify a dataloader to use with `-d`, but if you don't, we try to guess the format based on the layout of the data. Our rosbag dataloader works with either ROS1 or ROS2 bags. Place split ROS1 bags in a single folder and pass the folder as the data path. Note that we don't support running `rko_lio` on partial or incomplete bags, though you can try (and maybe raise an issue if you think we should support this). ROS2 especially will need a `metadata.yaml` file. By default, we assume there is just one IMU topic and one LiDAR topic in the bag, in which case we automatically pick up the topic names and proceed further. If there are multiple topics per sensor, you will be prompted to select one via the `--imu` or `--lidar` flags, which you can pass to `rko_lio`. Next, we assume there is a (static) TF tree in the bag. If so, we take the frame ids from the message topics we just picked up, build a static TF tree, and then query it for the extrinsic from IMU to LiDAR. Our odometry estimates the robot pose with respect to a base frame, and by default, we assume the LiDAR frame to be the base frame. If you would like to use a different frame, you can pass the frame id with `--base_frame` (note the other options available with `--help`). The TF tree will be queried for the appropriate transformations (if they exist in the bag!). In case there is no TF tree in the bag, then you will have to manually specify the extrinsics for IMU to base frame and LiDAR to base frame, as these two are **required** parameters. Set one of the extrinsics to identity if you want that one to be the base frame (you will still have to specify both parameters). You can specify the extrinsics via a config YAML file with the keys `extrinsic_imu2base_quat_xyzw_xyz` and `extrinsic_lidar2base_quat_xyzw_xyz`. You can dump a config with all the options set to default values by running `rko_lio --dump_config`. 
Modify it as you require and pass the file to `rko_lio` with the `-c` flag. Please check `python/config` in the GitHub repository for example configurations. An example invocation would then be

```bash
# the config should have the sensor extrinsics if the rosbag doesn't
rko_lio -v -c config.yaml --imu imu_topic --lidar lidar_topic /path/to/rosbag_folder
```

For all possible CLI flags, please check `rko_lio --help`.
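As the key names suggest, each extrinsic is a quaternion in (x, y, z, w) order followed by an (x, y, z) translation. A minimal sketch (not part of `rko_lio`; it only assumes that convention) of turning such a 7-element list into a 4x4 homogeneous transform, e.g. to sanity-check a calibration before writing it into the config:

```python
import math

def extrinsic_to_matrix(quat_xyzw_xyz):
    """Convert [qx, qy, qz, qw, tx, ty, tz] to a 4x4 homogeneous transform."""
    qx, qy, qz, qw, tx, ty, tz = quat_xyzw_xyz
    # normalize the quaternion to guard against rounding in hand-typed values
    n = math.sqrt(qx * qx + qy * qy + qz * qz + qw * qw)
    qx, qy, qz, qw = qx / n, qy / n, qz / n, qw / n
    # standard quaternion-to-rotation-matrix expansion, translation in the last column
    return [
        [1 - 2 * (qy * qy + qz * qz), 2 * (qx * qy - qz * qw),     2 * (qx * qz + qy * qw),     tx],
        [2 * (qx * qy + qz * qw),     1 - 2 * (qx * qx + qz * qz), 2 * (qy * qz - qx * qw),     ty],
        [2 * (qx * qz - qy * qw),     2 * (qy * qz + qx * qw),     1 - 2 * (qx * qx + qy * qy), tz],
        [0.0, 0.0, 0.0, 1.0],
    ]

# identity rotation, 10 cm translation along x
T = extrinsic_to_matrix([0.0, 0.0, 0.0, 1.0, 0.1, 0.0, 0.0])
```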
For more installation and usage instructions for our Python interface, please refer to the Python README and the config documentation.
The Python interface to our system is convenient for investigating recorded data offline, as you don't need to set up a ROS environment first.
For more install and usage instructions of our python interface, please refer to the python readme and the config doc.
The python interface to our system can be convenient to investigate recorded data offline as you don’t need to setup a ROS environment first.
File truncated at 100 lines see the full file
CONTRIBUTING
Repository Summary
Checkout URI | https://github.com/PRBonn/rko_lio.git |
VCS Type | git |
VCS Version | master |
Last Updated | 2025-09-12 |
Dev Status | DEVELOPED |
Released | UNRELEASED |
Tags | No category tags. |
Contributing |
Help Wanted (-)
Good First Issues (-) Pull Requests to Review (-) |
Packages
Name | Version |
---|---|
rko_lio | 0.1.0 |
README
RKO_LIO - LiDAR-Inertial Odometry
Without Sensor-Specific Modelling
Four different platforms, four different environments, one odometry system
Quick Start
Python
In case you already have a rosbag (ROS1 or ROS2) which contains a TF tree, you can inspect the results of our odometry system with the following two steps
pip install rko_lio rosbags rerun-sdk
rko_lio
is our odometry package, rosbags
is required for using our rosbag dataloader, and rerun-sdk
is what we use for our optional visualizer.
Next, run
rko_lio -v /path/to/rosbag_folder # <- has to be a directory! with either *.bag files or metadata.yaml from ROS2
and you should be good to go!
Click here for some more details on how the above works and how to use RKO_LIO!
The `-v` flag enables visualization. You can specify a dataloader to use with `-d`, but if you don't, we try to guess the format based on the layout of the data. Our rosbag dataloader works with either ROS1 or ROS2 bags. Place split ROS1 bags in a single folder and pass the folder as the data path. Note that we don't support running `rko_lio` on partial or incomplete bags, though you can try (and maybe raise an issue if you think we should support this). ROS2 especially will need a `metadata.yaml` file. By default, we assume there is just one IMU topic and one LiDAR topic in the bag, in which case we automatically pick up the topic names and proceed further. If there are multiple topics per sensor, you will be prompted to select one via the `--imu` or `--lidar` flags, which you can pass to `rko_lio`. Next, we assume there is a (static) TF tree in the bag. If so, we take the frame ids from the message topics we just picked up, build a static TF tree, and then query it for the extrinsic from IMU to LiDAR. Our odometry estimates the robot pose with respect to a base frame, and by default, we assume the LiDAR frame to be the base frame. If you would like to use a different frame, you can pass the frame id with `--base_frame` (note the other options available with `--help`). The TF tree will be queried for the appropriate transformations (if they exist in the bag!). In case there is no TF tree in the bag, then you will have to manually specify the extrinsics for IMU to base frame and LiDAR to base frame, as these two are **required** parameters. Set one of the extrinsics to identity if you want that one to be the base frame (you will still have to specify both parameters). You can specify the extrinsics via a config YAML file with the keys `extrinsic_imu2base_quat_xyzw_xyz` and `extrinsic_lidar2base_quat_xyzw_xyz`. You can dump a config with all the options set to default values by running `rko_lio --dump_config`. 
Modify as you require, and pass this file to `rko_lio` using the `-c` flag. Please check `python/config` in the GitHub repository for example configurations. An example invocation would then be ```bash # the config should have the sensor extrinsics if the rosbag doesn't rko_lio -v -c config.yaml --imu imu_topic --lidar lidar_topic /path/to/rosbag_folder ``` For all possible CLI flags, please check `rko_lio --help`.
For more install and usage instructions of our python interface, please refer to the python readme and the config doc.
The python interface to our system can be convenient to investigate recorded data offline as you don’t need to setup a ROS environment first.
File truncated at 100 lines see the full file