Repository Summary

Checkout URI https://github.com/ros-event-camera/event_image_reconstruction_fibar.git
VCS Type git
VCS Version release
Last Updated 2025-12-06
Dev Status DEVELOPED
Released UNRELEASED
Contributing Help Wanted (-)
Good First Issues (-)
Pull Requests to Review (-)

Packages

README

event_image_reconstruction_fibar

This repository contains a ROS package for event image reconstruction by means of a temporal and spatial filtering algorithm described here. It depends on the fibar library.

(Example images: stereo, apriltags)

Supported platforms

Continuous integration testing is performed for ROS 2 Humble and later distributions.

How to build

Set the following shell variables:

repo=event_image_reconstruction_fibar
url=https://github.com/ros-event-camera/${repo}.git

and follow the instructions here

About time synchronization and time stamps

The FIBAR algorithm reconstructs a brightness image event by event, and produces image frames for given frame times. This section explains how these frame times are computed, and how they are synchronized with external sources.

First off, all frames are ultimately produced based on sensor time, that is, the time stamps generated by the camera’s internal clock and affixed to each event individually. However, when synchronizing against an external time source such as a frame-based camera, the time for which to reconstruct the frame is specified by the host time given in the ROS image message header stamp. See the event camera codecs repository for more details on sensor vs. host time. Since the sensor clock is not synchronized with the host clock, sensor time and host time have different starting points and drift relative to each other. For this reason, the event image reconstruction node constantly estimates the offset between sensor and host time, which allows it to convert host time to sensor time for frame generation.

Offset and drift estimation

When ROS event camera packet messages arrive at the reconstruction node, the sensor time of the first event in the packet corresponds to the host time provided by the ROS message header stamp. Thus, for every arriving packet, the reconstruction node updates a running average offset between host time and sensor time, allowing for a two-way conversion between host and sensor time. This is the conversion referred to below when writing “sensor time = estimated(host time)”, meaning the sensor time is computed from the host time by using the estimated offsets, and conversely, with some abuse of notation “host time = estimated(sensor time)” for deriving the host time from the sensor time.
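The running-average update and the two-way conversion described above can be sketched as follows. This is a minimal illustrative Python sketch, not the node's actual implementation; the class name, smoothing factor, and method names are all hypothetical.

```python
class TimeOffsetEstimator:
    """Running-average offset between host time and sensor time.

    Illustrative sketch only; the real node's estimator and its
    smoothing behavior may differ.
    """

    def __init__(self, alpha: float = 0.01):
        self.alpha = alpha   # smoothing factor for the running average
        self.offset = None   # host_time - sensor_time, in seconds

    def update(self, host_stamp: float, first_event_sensor_time: float) -> None:
        """Update the offset from one arriving event packet.

        host_stamp is the ROS message header stamp; the sensor time of
        the packet's first event is taken to correspond to it.
        """
        sample = host_stamp - first_event_sensor_time
        if self.offset is None:
            self.offset = sample  # initialize from the first packet
        else:
            # exponential running average absorbs jitter and clock drift
            self.offset = (1 - self.alpha) * self.offset + self.alpha * sample

    def to_sensor_time(self, host_time: float) -> float:
        """sensor time = estimated(host time)"""
        return host_time - self.offset

    def to_host_time(self, sensor_time: float) -> float:
        """host time = estimated(sensor time)"""
        return sensor_time + self.offset
```

With the offset maintained this way, a frame time given in host time (e.g. a camera image header stamp) can be mapped into sensor time for reconstruction, and vice versa for stamping emitted frames.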

Synchronization modes

Supported synchronization modes:

1) Free Running. The node generates its own frame times, equidistant in sensor time, and not synchronized to any external time source.

2) Trigger Events. Many event cameras (notably those with Prophesee sensors) have an input pin that generates so-called “external trigger events” when a pulse signal arrives. These trigger events are time-stamped with the arrival time of the pulse and inserted into the event stream. When a trigger event is decoded by the reconstruction node, the node emits a frame based on the sensor time of the trigger event. The header stamp of the frame is estimated from the trigger event’s sensor time.

3) Camera Image. This mode synchronizes the event camera to a frame-based camera. If the sync pulse triggering the frame-based camera’s image is not connected to the event camera, the header stamp of the camera image is converted to sensor time, which is then used to reconstruct the frame. If a sync pulse is available, the reconstruction node can be configured to use the external trigger events as well, meaning the reconstruction is based on the sensor time embedded in the external trigger event. The difference with respect to “Trigger Events” mode is that the header time stamp of the emitted image frame is taken from the camera image message header, so that downstream calibration packages can directly recognize which camera image frames belong to which reconstructed event image frames.

4) Time Reference. This mode allows injection of arbitrary frame times via standard ROS TimeReference messages. The header stamp of the message is used for the header stamp of the reconstructed frame; the time_ref field is expected to contain the sensor time for reconstruction. This mode is useful when two event cameras are connected with a sync cable, i.e. their sensor time is synchronized, and one (or both) are connected to an external trigger pulse. One of the reconstruction nodes is then configured to publish a TimeReference message (and also a reconstructed image frame), to which the reconstruction node for the other camera subscribes. This way the reconstructed frames of the two nodes are based on the same sensor time and carry identical ROS header stamps. If both cameras are connected to the same sync pulse, the node receiving the time reference message can be configured to ignore the sensor time of the message and instead use the sensor time from external trigger events.

Node Parameters

  • sync_mode: How to find the sensor time for reconstructing frames. See Synchronization Modes and the sync table below for possible values. Default: free_running.
  • use_trigger_events: Set this to true to use external trigger events in the event data stream. See Synchronization Modes and the sync table below. Default: false.
  • fps: Frequency (in Hz) at which images are reconstructed in free-running mode. Default: 25.
  • cutoff_num_events: The cutoff period (in number of events) for the reconstruction algorithm. See the FIBAR paper. Default: 40.
  • use_spatial_filter: Whether to use spatial filtering (FIBAR). Default: true.
  • statistics_period: Time period in seconds between statistics printouts. Default: 5.
  • event_queue_memory_limit: How many bytes of event data to keep in the incoming queue before dropping data. Default: 10 MB.
  • ros_event_queue_size: Number of event packet messages to keep in the ROS receive queue. Default: 1000.
  • edge: Whether to use the rising (up) or falling (down) edge of the hardware trigger signal. Default: up.
  • frame_path: Output directory for reconstructed frames and frame-based camera images. Set to an empty string to suppress frame writing. Default: "".
  • publish_time_reference: Whether to publish a TimeReference message. Default: false.

  sync_mode       use_trigger_events  frame time source              ROS header time stamp   note
  free_running    false               sensor clock                   estimated(sensor time)
  free_running    true                INVALID CONFIG                 INVALID CONFIG
  trigger_events  false               INVALID CONFIG                 INVALID CONFIG
  trigger_events  true                external trigger               estimated(sensor time)
  camera_image    false               estimated(image.header.stamp)  image.header.stamp
  camera_image    true                external trigger               image.header.stamp
  time_reference  false               estimated(ref.header.stamp)    ref.header.stamp
  time_reference  true                external trigger               ref.header.stamp
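The valid parameter combinations in the table above can be captured as a small lookup. This is a hedged Python sketch mirroring the table, not code from the package; the names SYNC_TABLE and check_sync_config are illustrative.

```python
# Valid (sync_mode, use_trigger_events) combinations, mapped to the
# resulting (frame time source, ROS header stamp source), per the table.
SYNC_TABLE = {
    ("free_running", False): ("sensor clock", "estimated(sensor time)"),
    ("trigger_events", True): ("external trigger", "estimated(sensor time)"),
    ("camera_image", False): ("estimated(image.header.stamp)", "image.header.stamp"),
    ("camera_image", True): ("external trigger", "image.header.stamp"),
    ("time_reference", False): ("estimated(ref.header.stamp)", "ref.header.stamp"),
    ("time_reference", True): ("external trigger", "ref.header.stamp"),
}


def check_sync_config(sync_mode: str, use_trigger_events: bool):
    """Return (frame_time_source, header_stamp_source), or raise
    ValueError for an invalid combination such as
    free_running with use_trigger_events=True."""
    try:
        return SYNC_TABLE[(sync_mode, use_trigger_events)]
    except KeyError:
        raise ValueError(
            f"invalid config: sync_mode={sync_mode!r}, "
            f"use_trigger_events={use_trigger_events}")
```

For example, check_sync_config("camera_image", True) yields the external trigger as frame time source with the camera image header stamp, while free_running combined with trigger events raises an error, matching the INVALID CONFIG rows.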

Node Topics


CONTRIBUTING

Any contribution that you make to this repository will be under the Apache 2 License, as dictated by that license:

5. Submission of Contributions. Unless You explicitly state otherwise,
   any Contribution intentionally submitted for inclusion in the Work
   by You to the Licensor shall be under the terms and conditions of
   this License, without any additional terms or conditions.
   Notwithstanding the above, nothing herein shall supersede or modify
   the terms of any separate license agreement you may have executed
   with Licensor regarding such Contributions.

Contributors must sign-off each commit by adding a Signed-off-by: ... line to commit messages to certify that they have the right to submit the code they are contributing to the project according to the Developer Certificate of Origin (DCO).


Repository Summary

Checkout URI https://github.com/ros-event-camera/event_image_reconstruction_fibar.git
VCS Type git
VCS Version release
Last Updated 2025-12-06
Dev Status DEVELOPED
Released UNRELEASED
Contributing Help Wanted (-)
Good First Issues (-)
Pull Requests to Review (-)

Packages

README

event_image_reconstruction_fibar

This repository contains a ROS package for event image reconstruction by means of a temporal and spatial filtering algorithm described here. It depends on the fibar library.

stereo apriltags

Supported platforms

Continuous integration testing for ROS Humble and later distros.

How to build

Set the following shell variables:

repo=event_image_reconstruction_fibar
url=https://github.com/ros-event-camera/${repo}.git

and follow the instructions here

About time synchronization and time stamps

The FIBAR algorithm reconstructs a brightness image event by event, and produces image frames for given frame times. This section explains how these frame times are computed, and how they are synchronized with external sources.

First off, all frames are ultimately produced based on sensor time, that is, the time stamps generated by the camera’s internal clock, and affixed to each event individually. However, when synchronizing against an external time source such as e.g. a camera, the time for which to reconstruct the frame will be specified by the host time given in the ROS image message header stamp. See the event camera codecs repository for more details on sensor vs host time. Since the sensor clock is not synchronized with the host clock, sensor time and host time have different starting points, and drift from each other. For this reason, the event image reconstruction node constantly estimates the offset between sensor and host time, which allows it to then convert host time to sensor time for frame generation.

Offset and drift estimation

When ROS event camera packet messages arrive at the reconstruction node, the sensor time of the first event in the packet corresponds to the host time provided by the ROS message header stamp. Thus, for every arriving packet, the reconstruction node updates a running average offset between host time and sensor time, allowing for a two-way conversion between host and sensor time. This is the conversion referred to below when writing “sensor time = estimated(host time)”, meaning the sensor time is computed from the host time by using the estimated offsets, and conversely, with some abuse of notation “host time = estimated(sensor time)” for deriving the host time from the sensor time.

Synchronization modes

Supported synchronization modes:

1) Free Running. The node generates its own frame times, equidistant in sensor time, and not synchronized to any external time sources. 2) Trigger Events. Many event cameras (notably the ones with Prophesee sensors) have an input pin that generates so-called “external trigger events” when a pulse signal arrives. These trigger events are time stamped to the arrival time of they pulse, and inserted into the event stream. When a trigger event is decoded by the reconstruction node, it will emit a frame based on the sensor time of the trigger event. The header stamp of the frame will be estimated from the trigger event’s sensor time. 3) Camera Image. This mode supports synchronizing the event camera to a frame-based camera. If the sync pulse triggering the frame-based camera’s image is not connected to the event camera, the header stamp of the camera image is converted to sensor time which is then used to reconstruct the frame. If a sync pulse is available, the reconstruction node can be configured to use the external trigger events as well, meaning the reconstruction is done based on the sensor time embedded in the external trigger event. The difference with respect to “Trigger Events” mode is that the header time stamp of the emitted image frame will be taken from the camera image header message, such that down-stream calibration packages can directly recognize which camera image frames belong to which reconstructed event image frames. 4) Time Reference. This mode allows for injection of arbitrary frame times via standard ROS TimeReference messages. The header stamp of the message will be used for the header stamp of the reconstructed frame, the time_ref field is expected to contain the sensor time for reconstruction. This mode is useful when two event cameras are connected with a sync cable, i.e. their sensor time is synchronized, and one (or both) are connected to an external trigger pulse. 
One of the reconstruction nodes is then configured to publish a TimeReference message (and also a reconstructed image frame), to which the reconstruction node for the other camera subscribes. This way the reconstructed frames of the two nodes will be based on the same sensor time, and will also carry identical ROS header stamps. If both cameras are connected to the same sync pulse, the node receiving the time reference message can be configured to ignore the sensor time of the message, and instead use the sensor time from external trigger events.

Node Parameters

  • sync_mode: How to find the sensor time for reconstructing frames. See Synchronization Modes and the sync table below for possible values. Default: free_running.
  • use_trigger_events: Set this to true to use external trigger events in the event data stream. See Synchronization Modes and the sync table below. Default: False.
  • fps: Frequency (in hz) at which images are reconstructed in free running mode. Default: 25.
  • cutoff_num_events: The cutoff period (in number of events) for the reconstruction algorithm. See the FIBAR paper. Default: 40
  • use_spatial_filter: whether to use spatial filtering (FIBAR). Default: true.
  • statistics_period: Time period in seconds between statistics printouts. Default: 5.
  • event_queue_memory_limit: How many bytes of event data to keep in the incoming queue before dropping data. Default: 10MB.
  • ros_event_queue_size: Number of event packet messages to keep in the ROS receive queue. Default: 1000.
  • edge: Whether to use the up or down edge of the hardware trigger signal. Default: up.
  • frame_path: output directory for reconstructed frames and frame-based camera images. Set to empty string to suppress frame writing. Default: "".
  • publish_time_reference: whether to publish time reference message. Default: false.
sync_mode use_trigger_events frame time source ROS header time stamp note
free_running false sensor clock estimated(sensor time)  
free_running true INVALID CONFIG INVALID CONFIG  
trigger_events false INVALID CONFIG INVALID CONFIG  
trigger_events true external trigger estimated(sensor time)  
camera_image false estimated(image.header.stamp) image.header.stamp  
camera_image true external trigger image.header.stamp  
time_reference false estimated(ref.header.stamp) ref.header.stamp  
time_reference true external trigger ref.header.stamp  

Node Topics

File truncated at 100 lines see the full file

CONTRIBUTING

Any contribution that you make to this repository will be under the Apache 2 License, as dictated by that license:

5. Submission of Contributions. Unless You explicitly state otherwise,
   any Contribution intentionally submitted for inclusion in the Work
   by You to the Licensor shall be under the terms and conditions of
   this License, without any additional terms or conditions.
   Notwithstanding the above, nothing herein shall supersede or modify
   the terms of any separate license agreement you may have executed
   with Licensor regarding such Contributions.

Contributors must sign-off each commit by adding a Signed-off-by: ... line to commit messages to certify that they have the right to submit the code they are contributing to the project according to the Developer Certificate of Origin (DCO).

Any contribution that you make to this repository will be under the Apache 2 License, as dictated by that [license](http://www.apache.org/licenses/LICENSE-2.0.html): ~~~ 5. Submission of Contributions. Unless You explicitly state otherwise, any Contribution intentionally submitted for inclusion in the Work by You to the Licensor shall be under the terms and conditions of this License, without any additional terms or conditions. Notwithstanding the above, nothing herein shall supersede or modify the terms of any separate license agreement you may have executed with Licensor regarding such Contributions. ~~~ Contributors must sign-off each commit by adding a `Signed-off-by: ...` line to commit messages to certify that they have the right to submit the code they are contributing to the project according to the [Developer Certificate of Origin (DCO)](https://developercertificate.org/).

Repository Summary

Checkout URI https://github.com/ros-event-camera/event_image_reconstruction_fibar.git
VCS Type git
VCS Version release
Last Updated 2025-12-06
Dev Status DEVELOPED
Released UNRELEASED
Contributing Help Wanted (-)
Good First Issues (-)
Pull Requests to Review (-)

Packages

README

event_image_reconstruction_fibar

This repository contains a ROS package for event image reconstruction by means of a temporal and spatial filtering algorithm described here. It depends on the fibar library.

stereo apriltags

Supported platforms

Continuous integration testing for ROS Humble and later distros.

How to build

Set the following shell variables:

repo=event_image_reconstruction_fibar
url=https://github.com/ros-event-camera/${repo}.git

and follow the instructions here

About time synchronization and time stamps

The FIBAR algorithm reconstructs a brightness image event by event, and produces image frames for given frame times. This section explains how these frame times are computed, and how they are synchronized with external sources.

First off, all frames are ultimately produced based on sensor time, that is, the time stamps generated by the camera’s internal clock, and affixed to each event individually. However, when synchronizing against an external time source such as e.g. a camera, the time for which to reconstruct the frame will be specified by the host time given in the ROS image message header stamp. See the event camera codecs repository for more details on sensor vs host time. Since the sensor clock is not synchronized with the host clock, sensor time and host time have different starting points, and drift from each other. For this reason, the event image reconstruction node constantly estimates the offset between sensor and host time, which allows it to then convert host time to sensor time for frame generation.

Offset and drift estimation

When ROS event camera packet messages arrive at the reconstruction node, the sensor time of the first event in the packet corresponds to the host time provided by the ROS message header stamp. Thus, for every arriving packet, the reconstruction node updates a running average offset between host time and sensor time, allowing for a two-way conversion between host and sensor time. This is the conversion referred to below when writing “sensor time = estimated(host time)”, meaning the sensor time is computed from the host time by using the estimated offsets, and conversely, with some abuse of notation “host time = estimated(sensor time)” for deriving the host time from the sensor time.

Synchronization modes

Supported synchronization modes:

1) Free Running. The node generates its own frame times, equidistant in sensor time, and not synchronized to any external time sources. 2) Trigger Events. Many event cameras (notably the ones with Prophesee sensors) have an input pin that generates so-called “external trigger events” when a pulse signal arrives. These trigger events are time stamped to the arrival time of they pulse, and inserted into the event stream. When a trigger event is decoded by the reconstruction node, it will emit a frame based on the sensor time of the trigger event. The header stamp of the frame will be estimated from the trigger event’s sensor time. 3) Camera Image. This mode supports synchronizing the event camera to a frame-based camera. If the sync pulse triggering the frame-based camera’s image is not connected to the event camera, the header stamp of the camera image is converted to sensor time which is then used to reconstruct the frame. If a sync pulse is available, the reconstruction node can be configured to use the external trigger events as well, meaning the reconstruction is done based on the sensor time embedded in the external trigger event. The difference with respect to “Trigger Events” mode is that the header time stamp of the emitted image frame will be taken from the camera image header message, such that down-stream calibration packages can directly recognize which camera image frames belong to which reconstructed event image frames. 4) Time Reference. This mode allows for injection of arbitrary frame times via standard ROS TimeReference messages. The header stamp of the message will be used for the header stamp of the reconstructed frame, the time_ref field is expected to contain the sensor time for reconstruction. This mode is useful when two event cameras are connected with a sync cable, i.e. their sensor time is synchronized, and one (or both) are connected to an external trigger pulse. 
One of the reconstruction nodes is then configured to publish a TimeReference message (and also a reconstructed image frame), to which the reconstruction node for the other camera subscribes. This way the reconstructed frames of the two nodes will be based on the same sensor time, and will also carry identical ROS header stamps. If both cameras are connected to the same sync pulse, the node receiving the time reference message can be configured to ignore the sensor time of the message, and instead use the sensor time from external trigger events.

Node Parameters

  • sync_mode: How to find the sensor time for reconstructing frames. See Synchronization Modes and the sync table below for possible values. Default: free_running.
  • use_trigger_events: Set this to true to use external trigger events in the event data stream. See Synchronization Modes and the sync table below. Default: False.
  • fps: Frequency (in hz) at which images are reconstructed in free running mode. Default: 25.
  • cutoff_num_events: The cutoff period (in number of events) for the reconstruction algorithm. See the FIBAR paper. Default: 40
  • use_spatial_filter: whether to use spatial filtering (FIBAR). Default: true.
  • statistics_period: Time period in seconds between statistics printouts. Default: 5.
  • event_queue_memory_limit: How many bytes of event data to keep in the incoming queue before dropping data. Default: 10MB.
  • ros_event_queue_size: Number of event packet messages to keep in the ROS receive queue. Default: 1000.
  • edge: Whether to use the up or down edge of the hardware trigger signal. Default: up.
  • frame_path: output directory for reconstructed frames and frame-based camera images. Set to empty string to suppress frame writing. Default: "".
  • publish_time_reference: whether to publish time reference message. Default: false.
sync_mode use_trigger_events frame time source ROS header time stamp note
free_running false sensor clock estimated(sensor time)  
free_running true INVALID CONFIG INVALID CONFIG  
trigger_events false INVALID CONFIG INVALID CONFIG  
trigger_events true external trigger estimated(sensor time)  
camera_image false estimated(image.header.stamp) image.header.stamp  
camera_image true external trigger image.header.stamp  
time_reference false estimated(ref.header.stamp) ref.header.stamp  
time_reference true external trigger ref.header.stamp  

Node Topics

File truncated at 100 lines see the full file

CONTRIBUTING

Any contribution that you make to this repository will be under the Apache 2 License, as dictated by that license:

5. Submission of Contributions. Unless You explicitly state otherwise,
   any Contribution intentionally submitted for inclusion in the Work
   by You to the Licensor shall be under the terms and conditions of
   this License, without any additional terms or conditions.
   Notwithstanding the above, nothing herein shall supersede or modify
   the terms of any separate license agreement you may have executed
   with Licensor regarding such Contributions.

Contributors must sign-off each commit by adding a Signed-off-by: ... line to commit messages to certify that they have the right to submit the code they are contributing to the project according to the Developer Certificate of Origin (DCO).

Any contribution that you make to this repository will be under the Apache 2 License, as dictated by that [license](http://www.apache.org/licenses/LICENSE-2.0.html): ~~~ 5. Submission of Contributions. Unless You explicitly state otherwise, any Contribution intentionally submitted for inclusion in the Work by You to the Licensor shall be under the terms and conditions of this License, without any additional terms or conditions. Notwithstanding the above, nothing herein shall supersede or modify the terms of any separate license agreement you may have executed with Licensor regarding such Contributions. ~~~ Contributors must sign-off each commit by adding a `Signed-off-by: ...` line to commit messages to certify that they have the right to submit the code they are contributing to the project according to the [Developer Certificate of Origin (DCO)](https://developercertificate.org/).

Repository Summary

Checkout URI https://github.com/ros-event-camera/event_image_reconstruction_fibar.git
VCS Type git
VCS Version release
Last Updated 2025-12-06
Dev Status DEVELOPED
Released UNRELEASED
Contributing Help Wanted (-)
Good First Issues (-)
Pull Requests to Review (-)

Packages

README

event_image_reconstruction_fibar

This repository contains a ROS package for event image reconstruction by means of a temporal and spatial filtering algorithm described here. It depends on the fibar library.

stereo apriltags

Supported platforms

Continuous integration testing for ROS Humble and later distros.

How to build

Set the following shell variables:

repo=event_image_reconstruction_fibar
url=https://github.com/ros-event-camera/${repo}.git

and follow the instructions here

About time synchronization and time stamps

The FIBAR algorithm reconstructs a brightness image event by event, and produces image frames for given frame times. This section explains how these frame times are computed, and how they are synchronized with external sources.

First off, all frames are ultimately produced based on sensor time, that is, the time stamps generated by the camera’s internal clock, and affixed to each event individually. However, when synchronizing against an external time source such as e.g. a camera, the time for which to reconstruct the frame will be specified by the host time given in the ROS image message header stamp. See the event camera codecs repository for more details on sensor vs host time. Since the sensor clock is not synchronized with the host clock, sensor time and host time have different starting points, and drift from each other. For this reason, the event image reconstruction node constantly estimates the offset between sensor and host time, which allows it to then convert host time to sensor time for frame generation.

Offset and drift estimation

When ROS event camera packet messages arrive at the reconstruction node, the sensor time of the first event in the packet corresponds to the host time provided by the ROS message header stamp. Thus, for every arriving packet, the reconstruction node updates a running average offset between host time and sensor time, allowing for a two-way conversion between host and sensor time. This is the conversion referred to below when writing “sensor time = estimated(host time)”, meaning the sensor time is computed from the host time by using the estimated offsets, and conversely, with some abuse of notation “host time = estimated(sensor time)” for deriving the host time from the sensor time.

Synchronization modes

Supported synchronization modes:

1) Free Running. The node generates its own frame times, equidistant in sensor time, and not synchronized to any external time sources. 2) Trigger Events. Many event cameras (notably the ones with Prophesee sensors) have an input pin that generates so-called “external trigger events” when a pulse signal arrives. These trigger events are time stamped to the arrival time of they pulse, and inserted into the event stream. When a trigger event is decoded by the reconstruction node, it will emit a frame based on the sensor time of the trigger event. The header stamp of the frame will be estimated from the trigger event’s sensor time. 3) Camera Image. This mode supports synchronizing the event camera to a frame-based camera. If the sync pulse triggering the frame-based camera’s image is not connected to the event camera, the header stamp of the camera image is converted to sensor time which is then used to reconstruct the frame. If a sync pulse is available, the reconstruction node can be configured to use the external trigger events as well, meaning the reconstruction is done based on the sensor time embedded in the external trigger event. The difference with respect to “Trigger Events” mode is that the header time stamp of the emitted image frame will be taken from the camera image header message, such that down-stream calibration packages can directly recognize which camera image frames belong to which reconstructed event image frames. 4) Time Reference. This mode allows for injection of arbitrary frame times via standard ROS TimeReference messages. The header stamp of the message will be used for the header stamp of the reconstructed frame, the time_ref field is expected to contain the sensor time for reconstruction. This mode is useful when two event cameras are connected with a sync cable, i.e. their sensor time is synchronized, and one (or both) are connected to an external trigger pulse. 
One of the reconstruction nodes is then configured to publish a TimeReference message (and also a reconstructed image frame), to which the reconstruction node for the other camera subscribes. This way the reconstructed frames of the two nodes will be based on the same sensor time, and will also carry identical ROS header stamps. If both cameras are connected to the same sync pulse, the node receiving the time reference message can be configured to ignore the sensor time of the message, and instead use the sensor time from external trigger events.

Node Parameters

  • sync_mode: How to find the sensor time for reconstructing frames. See Synchronization Modes and the sync table below for possible values. Default: free_running.
  • use_trigger_events: Set this to true to use external trigger events in the event data stream. See Synchronization Modes and the sync table below. Default: false.
  • fps: Frequency (in Hz) at which images are reconstructed in free-running mode. Default: 25.
  • cutoff_num_events: The cutoff period (in number of events) for the reconstruction algorithm. See the FIBAR paper. Default: 40.
  • use_spatial_filter: Whether to use spatial filtering (FIBAR). Default: true.
  • statistics_period: Time period in seconds between statistics printouts. Default: 5.
  • event_queue_memory_limit: How many bytes of event data to keep in the incoming queue before dropping data. Default: 10MB.
  • ros_event_queue_size: Number of event packet messages to keep in the ROS receive queue. Default: 1000.
  • edge: Whether to use the rising (up) or falling (down) edge of the hardware trigger signal. Default: up.
  • frame_path: Output directory for reconstructed frames and frame-based camera images. Set to the empty string to suppress frame writing. Default: "".
  • publish_time_reference: Whether to publish a TimeReference message. Default: false.
| sync_mode      | use_trigger_events | frame time source             | ROS header time stamp  | note |
|----------------|--------------------|-------------------------------|------------------------|------|
| free_running   | false              | sensor clock                  | estimated(sensor time) |      |
| free_running   | true               | INVALID CONFIG                | INVALID CONFIG         |      |
| trigger_events | false              | INVALID CONFIG                | INVALID CONFIG         |      |
| trigger_events | true               | external trigger              | estimated(sensor time) |      |
| camera_image   | false              | estimated(image.header.stamp) | image.header.stamp     |      |
| camera_image   | true               | external trigger              | image.header.stamp     |      |
| time_reference | false              | estimated(ref.header.stamp)   | ref.header.stamp       |      |
| time_reference | true               | external trigger              | ref.header.stamp       |      |

Node Topics

(README truncated at 100 lines; see the repository for the full file.)

CONTRIBUTING

Any contribution that you make to this repository will be under the Apache 2 License, as dictated by that license:

5. Submission of Contributions. Unless You explicitly state otherwise,
   any Contribution intentionally submitted for inclusion in the Work
   by You to the Licensor shall be under the terms and conditions of
   this License, without any additional terms or conditions.
   Notwithstanding the above, nothing herein shall supersede or modify
   the terms of any separate license agreement you may have executed
   with Licensor regarding such Contributions.

Contributors must sign off each commit by adding a Signed-off-by: ... line to the commit message, certifying that they have the right to submit the code they are contributing to the project according to the Developer Certificate of Origin (DCO).

No version for distro ardent showing humble. Known supported distros are highlighted in the buttons above.

Repository Summary

Checkout URI https://github.com/ros-event-camera/event_image_reconstruction_fibar.git
VCS Type git
VCS Version release
Last Updated 2025-12-06
Dev Status DEVELOPED
Released UNRELEASED
Contributing Help Wanted (-)
Good First Issues (-)
Pull Requests to Review (-)

Packages

README

event_image_reconstruction_fibar

This repository contains a ROS package for event image reconstruction by means of a temporal and spatial filtering algorithm described here. It depends on the fibar library.

stereo apriltags

Supported platforms

Continuous integration testing for ROS Humble and later distros.

How to build

Set the following shell variables:

repo=event_image_reconstruction_fibar
url=https://github.com/ros-event-camera/${repo}.git

and follow the instructions here

About time synchronization and time stamps

The FIBAR algorithm reconstructs a brightness image event by event, and produces image frames for given frame times. This section explains how these frame times are computed, and how they are synchronized with external sources.

First off, all frames are ultimately produced based on sensor time, that is, the time stamps generated by the camera’s internal clock, and affixed to each event individually. However, when synchronizing against an external time source such as e.g. a camera, the time for which to reconstruct the frame will be specified by the host time given in the ROS image message header stamp. See the event camera codecs repository for more details on sensor vs host time. Since the sensor clock is not synchronized with the host clock, sensor time and host time have different starting points, and drift from each other. For this reason, the event image reconstruction node constantly estimates the offset between sensor and host time, which allows it to then convert host time to sensor time for frame generation.

Offset and drift estimation

When ROS event camera packet messages arrive at the reconstruction node, the sensor time of the first event in the packet corresponds to the host time provided by the ROS message header stamp. Thus, for every arriving packet, the reconstruction node updates a running average offset between host time and sensor time, allowing for a two-way conversion between host and sensor time. This is the conversion referred to below when writing “sensor time = estimated(host time)”, meaning the sensor time is computed from the host time by using the estimated offsets, and conversely, with some abuse of notation “host time = estimated(sensor time)” for deriving the host time from the sensor time.

Synchronization modes

Supported synchronization modes:

1) Free Running. The node generates its own frame times, equidistant in sensor time, and not synchronized to any external time sources. 2) Trigger Events. Many event cameras (notably the ones with Prophesee sensors) have an input pin that generates so-called “external trigger events” when a pulse signal arrives. These trigger events are time stamped to the arrival time of they pulse, and inserted into the event stream. When a trigger event is decoded by the reconstruction node, it will emit a frame based on the sensor time of the trigger event. The header stamp of the frame will be estimated from the trigger event’s sensor time. 3) Camera Image. This mode supports synchronizing the event camera to a frame-based camera. If the sync pulse triggering the frame-based camera’s image is not connected to the event camera, the header stamp of the camera image is converted to sensor time which is then used to reconstruct the frame. If a sync pulse is available, the reconstruction node can be configured to use the external trigger events as well, meaning the reconstruction is done based on the sensor time embedded in the external trigger event. The difference with respect to “Trigger Events” mode is that the header time stamp of the emitted image frame will be taken from the camera image header message, such that down-stream calibration packages can directly recognize which camera image frames belong to which reconstructed event image frames. 4) Time Reference. This mode allows for injection of arbitrary frame times via standard ROS TimeReference messages. The header stamp of the message will be used for the header stamp of the reconstructed frame, the time_ref field is expected to contain the sensor time for reconstruction. This mode is useful when two event cameras are connected with a sync cable, i.e. their sensor time is synchronized, and one (or both) are connected to an external trigger pulse. 
One of the reconstruction nodes is then configured to publish a TimeReference message (and also a reconstructed image frame), to which the reconstruction node for the other camera subscribes. This way the reconstructed frames of the two nodes will be based on the same sensor time, and will also carry identical ROS header stamps. If both cameras are connected to the same sync pulse, the node receiving the time reference message can be configured to ignore the sensor time of the message, and instead use the sensor time from external trigger events.

Node Parameters

  • sync_mode: How to find the sensor time for reconstructing frames. See Synchronization Modes and the sync table below for possible values. Default: free_running.
  • use_trigger_events: Set this to true to use external trigger events in the event data stream. See Synchronization Modes and the sync table below. Default: False.
  • fps: Frequency (in hz) at which images are reconstructed in free running mode. Default: 25.
  • cutoff_num_events: The cutoff period (in number of events) for the reconstruction algorithm. See the FIBAR paper. Default: 40
  • use_spatial_filter: whether to use spatial filtering (FIBAR). Default: true.
  • statistics_period: Time period in seconds between statistics printouts. Default: 5.
  • event_queue_memory_limit: How many bytes of event data to keep in the incoming queue before dropping data. Default: 10MB.
  • ros_event_queue_size: Number of event packet messages to keep in the ROS receive queue. Default: 1000.
  • edge: Whether to use the up or down edge of the hardware trigger signal. Default: up.
  • frame_path: output directory for reconstructed frames and frame-based camera images. Set to empty string to suppress frame writing. Default: "".
  • publish_time_reference: whether to publish time reference message. Default: false.
sync_mode use_trigger_events frame time source ROS header time stamp note
free_running false sensor clock estimated(sensor time)  
free_running true INVALID CONFIG INVALID CONFIG  
trigger_events false INVALID CONFIG INVALID CONFIG  
trigger_events true external trigger estimated(sensor time)  
camera_image false estimated(image.header.stamp) image.header.stamp  
camera_image true external trigger image.header.stamp  
time_reference false estimated(ref.header.stamp) ref.header.stamp  
time_reference true external trigger ref.header.stamp  

Node Topics

File truncated at 100 lines see the full file

CONTRIBUTING

Any contribution that you make to this repository will be under the Apache 2 License, as dictated by that license:

5. Submission of Contributions. Unless You explicitly state otherwise,
   any Contribution intentionally submitted for inclusion in the Work
   by You to the Licensor shall be under the terms and conditions of
   this License, without any additional terms or conditions.
   Notwithstanding the above, nothing herein shall supersede or modify
   the terms of any separate license agreement you may have executed
   with Licensor regarding such Contributions.

Contributors must sign-off each commit by adding a Signed-off-by: ... line to commit messages to certify that they have the right to submit the code they are contributing to the project according to the Developer Certificate of Origin (DCO).

Any contribution that you make to this repository will be under the Apache 2 License, as dictated by that [license](http://www.apache.org/licenses/LICENSE-2.0.html): ~~~ 5. Submission of Contributions. Unless You explicitly state otherwise, any Contribution intentionally submitted for inclusion in the Work by You to the Licensor shall be under the terms and conditions of this License, without any additional terms or conditions. Notwithstanding the above, nothing herein shall supersede or modify the terms of any separate license agreement you may have executed with Licensor regarding such Contributions. ~~~ Contributors must sign-off each commit by adding a `Signed-off-by: ...` line to commit messages to certify that they have the right to submit the code they are contributing to the project according to the [Developer Certificate of Origin (DCO)](https://developercertificate.org/).
No version for distro bouncy showing humble. Known supported distros are highlighted in the buttons above.

Repository Summary

Checkout URI https://github.com/ros-event-camera/event_image_reconstruction_fibar.git
VCS Type git
VCS Version release
Last Updated 2025-12-06
Dev Status DEVELOPED
Released UNRELEASED
Contributing Help Wanted (-)
Good First Issues (-)
Pull Requests to Review (-)

Packages

README

event_image_reconstruction_fibar

This repository contains a ROS package for event image reconstruction by means of a temporal and spatial filtering algorithm described here. It depends on the fibar library.

stereo apriltags

Supported platforms

Continuous integration testing for ROS Humble and later distros.

How to build

Set the following shell variables:

repo=event_image_reconstruction_fibar
url=https://github.com/ros-event-camera/${repo}.git

and follow the instructions here

About time synchronization and time stamps

The FIBAR algorithm reconstructs a brightness image event by event, and produces image frames for given frame times. This section explains how these frame times are computed, and how they are synchronized with external sources.

First off, all frames are ultimately produced based on sensor time, that is, the time stamps generated by the camera’s internal clock, and affixed to each event individually. However, when synchronizing against an external time source such as e.g. a camera, the time for which to reconstruct the frame will be specified by the host time given in the ROS image message header stamp. See the event camera codecs repository for more details on sensor vs host time. Since the sensor clock is not synchronized with the host clock, sensor time and host time have different starting points, and drift from each other. For this reason, the event image reconstruction node constantly estimates the offset between sensor and host time, which allows it to then convert host time to sensor time for frame generation.

Offset and drift estimation

When ROS event camera packet messages arrive at the reconstruction node, the sensor time of the first event in the packet corresponds to the host time provided by the ROS message header stamp. Thus, for every arriving packet, the reconstruction node updates a running average offset between host time and sensor time, allowing for a two-way conversion between host and sensor time. This is the conversion referred to below when writing “sensor time = estimated(host time)”, meaning the sensor time is computed from the host time by using the estimated offsets, and conversely, with some abuse of notation “host time = estimated(sensor time)” for deriving the host time from the sensor time.

Synchronization modes

Supported synchronization modes:

1) Free Running. The node generates its own frame times, equidistant in sensor time, and not synchronized to any external time sources. 2) Trigger Events. Many event cameras (notably the ones with Prophesee sensors) have an input pin that generates so-called “external trigger events” when a pulse signal arrives. These trigger events are time stamped to the arrival time of they pulse, and inserted into the event stream. When a trigger event is decoded by the reconstruction node, it will emit a frame based on the sensor time of the trigger event. The header stamp of the frame will be estimated from the trigger event’s sensor time. 3) Camera Image. This mode supports synchronizing the event camera to a frame-based camera. If the sync pulse triggering the frame-based camera’s image is not connected to the event camera, the header stamp of the camera image is converted to sensor time which is then used to reconstruct the frame. If a sync pulse is available, the reconstruction node can be configured to use the external trigger events as well, meaning the reconstruction is done based on the sensor time embedded in the external trigger event. The difference with respect to “Trigger Events” mode is that the header time stamp of the emitted image frame will be taken from the camera image header message, such that down-stream calibration packages can directly recognize which camera image frames belong to which reconstructed event image frames. 4) Time Reference. This mode allows for injection of arbitrary frame times via standard ROS TimeReference messages. The header stamp of the message will be used for the header stamp of the reconstructed frame, the time_ref field is expected to contain the sensor time for reconstruction. This mode is useful when two event cameras are connected with a sync cable, i.e. their sensor time is synchronized, and one (or both) are connected to an external trigger pulse. 
One of the reconstruction nodes is then configured to publish a TimeReference message (and also a reconstructed image frame), to which the reconstruction node for the other camera subscribes. This way the reconstructed frames of the two nodes will be based on the same sensor time, and will also carry identical ROS header stamps. If both cameras are connected to the same sync pulse, the node receiving the time reference message can be configured to ignore the sensor time of the message, and instead use the sensor time from external trigger events.

Node Parameters

  • sync_mode: How to find the sensor time for reconstructing frames. See Synchronization Modes and the sync table below for possible values. Default: free_running.
  • use_trigger_events: Set this to true to use external trigger events in the event data stream. See Synchronization Modes and the sync table below. Default: False.
  • fps: Frequency (in hz) at which images are reconstructed in free running mode. Default: 25.
  • cutoff_num_events: The cutoff period (in number of events) for the reconstruction algorithm. See the FIBAR paper. Default: 40
  • use_spatial_filter: whether to use spatial filtering (FIBAR). Default: true.
  • statistics_period: Time period in seconds between statistics printouts. Default: 5.
  • event_queue_memory_limit: How many bytes of event data to keep in the incoming queue before dropping data. Default: 10MB.
  • ros_event_queue_size: Number of event packet messages to keep in the ROS receive queue. Default: 1000.
  • edge: Whether to use the up or down edge of the hardware trigger signal. Default: up.
  • frame_path: output directory for reconstructed frames and frame-based camera images. Set to empty string to suppress frame writing. Default: "".
  • publish_time_reference: whether to publish time reference message. Default: false.
sync_mode use_trigger_events frame time source ROS header time stamp note
free_running false sensor clock estimated(sensor time)  
free_running true INVALID CONFIG INVALID CONFIG  
trigger_events false INVALID CONFIG INVALID CONFIG  
trigger_events true external trigger estimated(sensor time)  
camera_image false estimated(image.header.stamp) image.header.stamp  
camera_image true external trigger image.header.stamp  
time_reference false estimated(ref.header.stamp) ref.header.stamp  
time_reference true external trigger ref.header.stamp  

Node Topics

File truncated at 100 lines see the full file

CONTRIBUTING

Any contribution that you make to this repository will be under the Apache 2 License, as dictated by that license:

5. Submission of Contributions. Unless You explicitly state otherwise,
   any Contribution intentionally submitted for inclusion in the Work
   by You to the Licensor shall be under the terms and conditions of
   this License, without any additional terms or conditions.
   Notwithstanding the above, nothing herein shall supersede or modify
   the terms of any separate license agreement you may have executed
   with Licensor regarding such Contributions.

Contributors must sign-off each commit by adding a Signed-off-by: ... line to commit messages to certify that they have the right to submit the code they are contributing to the project according to the Developer Certificate of Origin (DCO).

Any contribution that you make to this repository will be under the Apache 2 License, as dictated by that [license](http://www.apache.org/licenses/LICENSE-2.0.html): ~~~ 5. Submission of Contributions. Unless You explicitly state otherwise, any Contribution intentionally submitted for inclusion in the Work by You to the Licensor shall be under the terms and conditions of this License, without any additional terms or conditions. Notwithstanding the above, nothing herein shall supersede or modify the terms of any separate license agreement you may have executed with Licensor regarding such Contributions. ~~~ Contributors must sign-off each commit by adding a `Signed-off-by: ...` line to commit messages to certify that they have the right to submit the code they are contributing to the project according to the [Developer Certificate of Origin (DCO)](https://developercertificate.org/).
No version for distro crystal showing humble. Known supported distros are highlighted in the buttons above.

Repository Summary

Checkout URI https://github.com/ros-event-camera/event_image_reconstruction_fibar.git
VCS Type git
VCS Version release
Last Updated 2025-12-06
Dev Status DEVELOPED
Released UNRELEASED
Contributing Help Wanted (-)
Good First Issues (-)
Pull Requests to Review (-)

Packages

README

event_image_reconstruction_fibar

This repository contains a ROS package for event image reconstruction by means of a temporal and spatial filtering algorithm described here. It depends on the fibar library.

stereo apriltags

Supported platforms

Continuous integration testing for ROS Humble and later distros.

How to build

Set the following shell variables:

repo=event_image_reconstruction_fibar
url=https://github.com/ros-event-camera/${repo}.git

and follow the instructions here

About time synchronization and time stamps

The FIBAR algorithm reconstructs a brightness image event by event, and produces image frames for given frame times. This section explains how these frame times are computed, and how they are synchronized with external sources.

First off, all frames are ultimately produced based on sensor time, that is, the time stamps generated by the camera’s internal clock, and affixed to each event individually. However, when synchronizing against an external time source such as e.g. a camera, the time for which to reconstruct the frame will be specified by the host time given in the ROS image message header stamp. See the event camera codecs repository for more details on sensor vs host time. Since the sensor clock is not synchronized with the host clock, sensor time and host time have different starting points, and drift from each other. For this reason, the event image reconstruction node constantly estimates the offset between sensor and host time, which allows it to then convert host time to sensor time for frame generation.

Offset and drift estimation

When ROS event camera packet messages arrive at the reconstruction node, the sensor time of the first event in the packet corresponds to the host time provided by the ROS message header stamp. Thus, for every arriving packet, the reconstruction node updates a running average offset between host time and sensor time, allowing for a two-way conversion between host and sensor time. This is the conversion referred to below when writing “sensor time = estimated(host time)”, meaning the sensor time is computed from the host time by using the estimated offsets, and conversely, with some abuse of notation “host time = estimated(sensor time)” for deriving the host time from the sensor time.

Synchronization modes

Supported synchronization modes:

1) Free Running. The node generates its own frame times, equidistant in sensor time, and not synchronized to any external time sources. 2) Trigger Events. Many event cameras (notably the ones with Prophesee sensors) have an input pin that generates so-called “external trigger events” when a pulse signal arrives. These trigger events are time stamped to the arrival time of they pulse, and inserted into the event stream. When a trigger event is decoded by the reconstruction node, it will emit a frame based on the sensor time of the trigger event. The header stamp of the frame will be estimated from the trigger event’s sensor time. 3) Camera Image. This mode supports synchronizing the event camera to a frame-based camera. If the sync pulse triggering the frame-based camera’s image is not connected to the event camera, the header stamp of the camera image is converted to sensor time which is then used to reconstruct the frame. If a sync pulse is available, the reconstruction node can be configured to use the external trigger events as well, meaning the reconstruction is done based on the sensor time embedded in the external trigger event. The difference with respect to “Trigger Events” mode is that the header time stamp of the emitted image frame will be taken from the camera image header message, such that down-stream calibration packages can directly recognize which camera image frames belong to which reconstructed event image frames. 4) Time Reference. This mode allows for injection of arbitrary frame times via standard ROS TimeReference messages. The header stamp of the message will be used for the header stamp of the reconstructed frame, the time_ref field is expected to contain the sensor time for reconstruction. This mode is useful when two event cameras are connected with a sync cable, i.e. their sensor time is synchronized, and one (or both) are connected to an external trigger pulse. 
One of the reconstruction nodes is then configured to publish a TimeReference message (and also a reconstructed image frame), to which the reconstruction node for the other camera subscribes. This way the reconstructed frames of the two nodes will be based on the same sensor time, and will also carry identical ROS header stamps. If both cameras are connected to the same sync pulse, the node receiving the time reference message can be configured to ignore the sensor time of the message, and instead use the sensor time from external trigger events.

Node Parameters

  • sync_mode: How to find the sensor time for reconstructing frames. See Synchronization Modes and the sync table below for possible values. Default: free_running.
  • use_trigger_events: Set this to true to use external trigger events in the event data stream. See Synchronization Modes and the sync table below. Default: False.
  • fps: Frequency (in hz) at which images are reconstructed in free running mode. Default: 25.
  • cutoff_num_events: The cutoff period (in number of events) for the reconstruction algorithm. See the FIBAR paper. Default: 40
  • use_spatial_filter: whether to use spatial filtering (FIBAR). Default: true.
  • statistics_period: Time period in seconds between statistics printouts. Default: 5.
  • event_queue_memory_limit: How many bytes of event data to keep in the incoming queue before dropping data. Default: 10MB.
  • ros_event_queue_size: Number of event packet messages to keep in the ROS receive queue. Default: 1000.
  • edge: Whether to use the up or down edge of the hardware trigger signal. Default: up.
  • frame_path: output directory for reconstructed frames and frame-based camera images. Set to empty string to suppress frame writing. Default: "".
  • publish_time_reference: whether to publish time reference message. Default: false.
sync_mode use_trigger_events frame time source ROS header time stamp note
free_running false sensor clock estimated(sensor time)  
free_running true INVALID CONFIG INVALID CONFIG  
trigger_events false INVALID CONFIG INVALID CONFIG  
trigger_events true external trigger estimated(sensor time)  
camera_image false estimated(image.header.stamp) image.header.stamp  
camera_image true external trigger image.header.stamp  
time_reference false estimated(ref.header.stamp) ref.header.stamp  
time_reference true external trigger ref.header.stamp  

Node Topics

File truncated at 100 lines see the full file

CONTRIBUTING

Any contribution that you make to this repository will be under the Apache 2 License, as dictated by that license:

5. Submission of Contributions. Unless You explicitly state otherwise,
   any Contribution intentionally submitted for inclusion in the Work
   by You to the Licensor shall be under the terms and conditions of
   this License, without any additional terms or conditions.
   Notwithstanding the above, nothing herein shall supersede or modify
   the terms of any separate license agreement you may have executed
   with Licensor regarding such Contributions.

Contributors must sign-off each commit by adding a Signed-off-by: ... line to commit messages to certify that they have the right to submit the code they are contributing to the project according to the Developer Certificate of Origin (DCO).

Any contribution that you make to this repository will be under the Apache 2 License, as dictated by that [license](http://www.apache.org/licenses/LICENSE-2.0.html): ~~~ 5. Submission of Contributions. Unless You explicitly state otherwise, any Contribution intentionally submitted for inclusion in the Work by You to the Licensor shall be under the terms and conditions of this License, without any additional terms or conditions. Notwithstanding the above, nothing herein shall supersede or modify the terms of any separate license agreement you may have executed with Licensor regarding such Contributions. ~~~ Contributors must sign-off each commit by adding a `Signed-off-by: ...` line to commit messages to certify that they have the right to submit the code they are contributing to the project according to the [Developer Certificate of Origin (DCO)](https://developercertificate.org/).
No version for distro eloquent showing humble. Known supported distros are highlighted in the buttons above.

Repository Summary

Checkout URI https://github.com/ros-event-camera/event_image_reconstruction_fibar.git
VCS Type git
VCS Version release
Last Updated 2025-12-06
Dev Status DEVELOPED
Released UNRELEASED
Contributing Help Wanted (-)
Good First Issues (-)
Pull Requests to Review (-)

Packages

README

event_image_reconstruction_fibar

This repository contains a ROS package for event image reconstruction by means of a temporal and spatial filtering algorithm described here. It depends on the fibar library.

stereo apriltags

Supported platforms

Continuous integration testing for ROS Humble and later distros.

How to build

Set the following shell variables:

repo=event_image_reconstruction_fibar
url=https://github.com/ros-event-camera/${repo}.git

and follow the instructions here

About time synchronization and time stamps

The FIBAR algorithm reconstructs a brightness image event by event, and produces image frames for given frame times. This section explains how these frame times are computed, and how they are synchronized with external sources.

First off, all frames are ultimately produced based on sensor time, that is, the time stamps generated by the camera’s internal clock, and affixed to each event individually. However, when synchronizing against an external time source such as e.g. a camera, the time for which to reconstruct the frame will be specified by the host time given in the ROS image message header stamp. See the event camera codecs repository for more details on sensor vs host time. Since the sensor clock is not synchronized with the host clock, sensor time and host time have different starting points, and drift from each other. For this reason, the event image reconstruction node constantly estimates the offset between sensor and host time, which allows it to then convert host time to sensor time for frame generation.

Offset and drift estimation

When ROS event camera packet messages arrive at the reconstruction node, the sensor time of the first event in the packet corresponds to the host time provided by the ROS message header stamp. Thus, for every arriving packet, the reconstruction node updates a running average offset between host time and sensor time, allowing for a two-way conversion between host and sensor time. This is the conversion referred to below when writing “sensor time = estimated(host time)”, meaning the sensor time is computed from the host time by using the estimated offsets, and conversely, with some abuse of notation “host time = estimated(sensor time)” for deriving the host time from the sensor time.

Synchronization modes

Supported synchronization modes:

1) Free Running. The node generates its own frame times, equidistant in sensor time and not synchronized to any external time source.

2) Trigger Events. Many event cameras (notably those with Prophesee sensors) have an input pin that generates so-called “external trigger events” when a pulse signal arrives. These trigger events are time stamped with the arrival time of the pulse and inserted into the event stream. When a trigger event is decoded by the reconstruction node, the node emits a frame based on the sensor time of the trigger event. The header stamp of the frame is estimated from the trigger event’s sensor time.

3) Camera Image. This mode synchronizes the event camera to a frame-based camera. If the sync pulse triggering the frame-based camera’s image is not connected to the event camera, the header stamp of the camera image is converted to sensor time, which is then used to reconstruct the frame. If a sync pulse is available, the reconstruction node can be configured to use the external trigger events as well, meaning the reconstruction is based on the sensor time embedded in the external trigger event. The difference with respect to “Trigger Events” mode is that the header time stamp of the emitted image frame is taken from the camera image message header, so that downstream calibration packages can directly recognize which camera image frames belong to which reconstructed event image frames.

4) Time Reference. This mode allows injection of arbitrary frame times via standard ROS TimeReference messages. The header stamp of the message is used for the header stamp of the reconstructed frame; the time_ref field is expected to contain the sensor time for reconstruction. This mode is useful when two event cameras are connected with a sync cable (i.e. their sensor times are synchronized) and one (or both) are connected to an external trigger pulse. One of the reconstruction nodes is then configured to publish a TimeReference message (and also a reconstructed image frame), to which the reconstruction node for the other camera subscribes. This way the reconstructed frames of the two nodes are based on the same sensor time and also carry identical ROS header stamps. If both cameras are connected to the same sync pulse, the node receiving the time reference message can be configured to ignore the sensor time of the message and instead use the sensor time from external trigger events.
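As a minimal illustration of the first mode: in free-running operation only the frame rate is needed to produce frame times that are equidistant in sensor time. The function below is a hypothetical sketch, not code from the node.

```python
# Hypothetical sketch of free-running frame time generation: frames are
# equidistant in sensor time (seconds), spaced by the period 1/fps.
def free_running_frame_times(start_sensor_time, fps, num_frames):
    period = 1.0 / fps
    return [start_sensor_time + i * period for i in range(num_frames)]
```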

Node Parameters

  • sync_mode: How to find the sensor time for reconstructing frames. See Synchronization Modes and the sync table below for possible values. Default: free_running.
  • use_trigger_events: Set this to true to use external trigger events in the event data stream. See Synchronization Modes and the sync table below. Default: false.
  • fps: Frequency (in Hz) at which images are reconstructed in free-running mode. Default: 25.
  • cutoff_num_events: The cutoff period (in number of events) for the reconstruction algorithm. See the FIBAR paper. Default: 40.
  • use_spatial_filter: Whether to use spatial filtering (FIBAR). Default: true.
  • statistics_period: Time period in seconds between statistics printouts. Default: 5.
  • event_queue_memory_limit: How many bytes of event data to keep in the incoming queue before dropping data. Default: 10 MB.
  • ros_event_queue_size: Number of event packet messages to keep in the ROS receive queue. Default: 1000.
  • edge: Whether to use the up or down edge of the hardware trigger signal. Default: up.
  • frame_path: Output directory for reconstructed frames and frame-based camera images. Set to an empty string to suppress frame writing. Default: "".
  • publish_time_reference: Whether to publish a TimeReference message. Default: false.
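The parameters above could be collected in a standard ROS 2 parameter file. The following fragment is hypothetical (the node name is a placeholder, and the exact parameter types are assumptions); the parameter names and defaults are taken from the list above.

```yaml
# Hypothetical parameter file; node name is a placeholder.
/event_image_reconstruction:
  ros__parameters:
    sync_mode: free_running
    use_trigger_events: false
    fps: 25.0
    cutoff_num_events: 40
    use_spatial_filter: true
    statistics_period: 5.0
    event_queue_memory_limit: 10000000   # bytes (assumed encoding of 10 MB)
    ros_event_queue_size: 1000
    edge: up
    frame_path: ""                       # empty string suppresses frame writing
    publish_time_reference: false
```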
| sync_mode      | use_trigger_events | frame time source             | ROS header time stamp  |
|----------------|--------------------|-------------------------------|------------------------|
| free_running   | false              | sensor clock                  | estimated(sensor time) |
| free_running   | true               | INVALID CONFIG                | INVALID CONFIG         |
| trigger_events | false              | INVALID CONFIG                | INVALID CONFIG         |
| trigger_events | true               | external trigger              | estimated(sensor time) |
| camera_image   | false              | estimated(image.header.stamp) | image.header.stamp     |
| camera_image   | true               | external trigger              | image.header.stamp     |
| time_reference | false              | estimated(ref.header.stamp)   | ref.header.stamp       |
| time_reference | true               | external trigger              | ref.header.stamp       |
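The sync table can also be read as a simple lookup from the two parameters to the time sources, with the invalid combinations rejected. A sketch (the function name and error handling are assumptions, but the mapping follows the table above):

```python
# The sync table as a lookup: (sync_mode, use_trigger_events) ->
# (frame time source, ROS header time stamp), or None for invalid configs.
SYNC_TABLE = {
    ("free_running", False):   ("sensor clock", "estimated(sensor time)"),
    ("free_running", True):    None,  # INVALID CONFIG
    ("trigger_events", False): None,  # INVALID CONFIG
    ("trigger_events", True):  ("external trigger", "estimated(sensor time)"),
    ("camera_image", False):   ("estimated(image.header.stamp)", "image.header.stamp"),
    ("camera_image", True):    ("external trigger", "image.header.stamp"),
    ("time_reference", False): ("estimated(ref.header.stamp)", "ref.header.stamp"),
    ("time_reference", True):  ("external trigger", "ref.header.stamp"),
}

def check_sync_config(sync_mode, use_trigger_events):
    """Return (frame time source, header stamp source), or raise on an
    invalid combination such as free_running with trigger events."""
    entry = SYNC_TABLE.get((sync_mode, use_trigger_events))
    if entry is None:
        raise ValueError(
            f"invalid config: sync_mode={sync_mode}, "
            f"use_trigger_events={use_trigger_events}")
    return entry
```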

Node Topics

File truncated at 100 lines; see the full file.

CONTRIBUTING

Any contribution that you make to this repository will be under the Apache 2 License, as dictated by that license:

5. Submission of Contributions. Unless You explicitly state otherwise,
   any Contribution intentionally submitted for inclusion in the Work
   by You to the Licensor shall be under the terms and conditions of
   this License, without any additional terms or conditions.
   Notwithstanding the above, nothing herein shall supersede or modify
   the terms of any separate license agreement you may have executed
   with Licensor regarding such Contributions.

Contributors must sign-off each commit by adding a Signed-off-by: ... line to commit messages to certify that they have the right to submit the code they are contributing to the project according to the Developer Certificate of Origin (DCO).


When ROS event camera packet messages arrive at the reconstruction node, the sensor time of the first event in the packet corresponds to the host time provided by the ROS message header stamp. Thus, for every arriving packet, the reconstruction node updates a running average offset between host time and sensor time, allowing for a two-way conversion between host and sensor time. This is the conversion referred to below when writing “sensor time = estimated(host time)”, meaning the sensor time is computed from the host time by using the estimated offsets, and conversely, with some abuse of notation “host time = estimated(sensor time)” for deriving the host time from the sensor time.

Synchronization modes

Supported synchronization modes:

1) Free Running. The node generates its own frame times, equidistant in sensor time, and not synchronized to any external time sources. 2) Trigger Events. Many event cameras (notably the ones with Prophesee sensors) have an input pin that generates so-called “external trigger events” when a pulse signal arrives. These trigger events are time stamped to the arrival time of they pulse, and inserted into the event stream. When a trigger event is decoded by the reconstruction node, it will emit a frame based on the sensor time of the trigger event. The header stamp of the frame will be estimated from the trigger event’s sensor time. 3) Camera Image. This mode supports synchronizing the event camera to a frame-based camera. If the sync pulse triggering the frame-based camera’s image is not connected to the event camera, the header stamp of the camera image is converted to sensor time which is then used to reconstruct the frame. If a sync pulse is available, the reconstruction node can be configured to use the external trigger events as well, meaning the reconstruction is done based on the sensor time embedded in the external trigger event. The difference with respect to “Trigger Events” mode is that the header time stamp of the emitted image frame will be taken from the camera image header message, such that down-stream calibration packages can directly recognize which camera image frames belong to which reconstructed event image frames. 4) Time Reference. This mode allows for injection of arbitrary frame times via standard ROS TimeReference messages. The header stamp of the message will be used for the header stamp of the reconstructed frame, the time_ref field is expected to contain the sensor time for reconstruction. This mode is useful when two event cameras are connected with a sync cable, i.e. their sensor time is synchronized, and one (or both) are connected to an external trigger pulse. 
One of the reconstruction nodes is then configured to publish a TimeReference message (and also a reconstructed image frame), to which the reconstruction node for the other camera subscribes. This way the reconstructed frames of the two nodes will be based on the same sensor time, and will also carry identical ROS header stamps. If both cameras are connected to the same sync pulse, the node receiving the time reference message can be configured to ignore the sensor time of the message, and instead use the sensor time from external trigger events.

Node Parameters

  • sync_mode: How to find the sensor time for reconstructing frames. See Synchronization Modes and the sync table below for possible values. Default: free_running.
  • use_trigger_events: Set this to true to use external trigger events in the event data stream. See Synchronization Modes and the sync table below. Default: False.
  • fps: Frequency (in hz) at which images are reconstructed in free running mode. Default: 25.
  • cutoff_num_events: The cutoff period (in number of events) for the reconstruction algorithm. See the FIBAR paper. Default: 40
  • use_spatial_filter: whether to use spatial filtering (FIBAR). Default: true.
  • statistics_period: Time period in seconds between statistics printouts. Default: 5.
  • event_queue_memory_limit: How many bytes of event data to keep in the incoming queue before dropping data. Default: 10MB.
  • ros_event_queue_size: Number of event packet messages to keep in the ROS receive queue. Default: 1000.
  • edge: Whether to use the up or down edge of the hardware trigger signal. Default: up.
  • frame_path: output directory for reconstructed frames and frame-based camera images. Set to empty string to suppress frame writing. Default: "".
  • publish_time_reference: whether to publish time reference message. Default: false.
sync_mode use_trigger_events frame time source ROS header time stamp note
free_running false sensor clock estimated(sensor time)  
free_running true INVALID CONFIG INVALID CONFIG  
trigger_events false INVALID CONFIG INVALID CONFIG  
trigger_events true external trigger estimated(sensor time)  
camera_image false estimated(image.header.stamp) image.header.stamp  
camera_image true external trigger image.header.stamp  
time_reference false estimated(ref.header.stamp) ref.header.stamp  
time_reference true external trigger ref.header.stamp  

Node Topics

File truncated at 100 lines see the full file

CONTRIBUTING

Any contribution that you make to this repository will be under the Apache 2 License, as dictated by that license:

5. Submission of Contributions. Unless You explicitly state otherwise,
   any Contribution intentionally submitted for inclusion in the Work
   by You to the Licensor shall be under the terms and conditions of
   this License, without any additional terms or conditions.
   Notwithstanding the above, nothing herein shall supersede or modify
   the terms of any separate license agreement you may have executed
   with Licensor regarding such Contributions.

Contributors must sign-off each commit by adding a Signed-off-by: ... line to commit messages to certify that they have the right to submit the code they are contributing to the project according to the Developer Certificate of Origin (DCO).

Any contribution that you make to this repository will be under the Apache 2 License, as dictated by that [license](http://www.apache.org/licenses/LICENSE-2.0.html): ~~~ 5. Submission of Contributions. Unless You explicitly state otherwise, any Contribution intentionally submitted for inclusion in the Work by You to the Licensor shall be under the terms and conditions of this License, without any additional terms or conditions. Notwithstanding the above, nothing herein shall supersede or modify the terms of any separate license agreement you may have executed with Licensor regarding such Contributions. ~~~ Contributors must sign-off each commit by adding a `Signed-off-by: ...` line to commit messages to certify that they have the right to submit the code they are contributing to the project according to the [Developer Certificate of Origin (DCO)](https://developercertificate.org/).
No version for distro lunar showing humble. Known supported distros are highlighted in the buttons above.

Repository Summary

Checkout URI https://github.com/ros-event-camera/event_image_reconstruction_fibar.git
VCS Type git
VCS Version release
Last Updated 2025-12-06
Dev Status DEVELOPED
Released UNRELEASED
Contributing Help Wanted (-)
Good First Issues (-)
Pull Requests to Review (-)

Packages

README

event_image_reconstruction_fibar

This repository contains a ROS package for event image reconstruction by means of a temporal and spatial filtering algorithm described here. It depends on the fibar library.

stereo apriltags

Supported platforms

Continuous integration testing for ROS Humble and later distros.

How to build

Set the following shell variables:

repo=event_image_reconstruction_fibar
url=https://github.com/ros-event-camera/${repo}.git

and follow the instructions here

About time synchronization and time stamps

The FIBAR algorithm reconstructs a brightness image event by event, and produces image frames for given frame times. This section explains how these frame times are computed, and how they are synchronized with external sources.

First off, all frames are ultimately produced based on sensor time, that is, the time stamps generated by the camera’s internal clock, and affixed to each event individually. However, when synchronizing against an external time source such as e.g. a camera, the time for which to reconstruct the frame will be specified by the host time given in the ROS image message header stamp. See the event camera codecs repository for more details on sensor vs host time. Since the sensor clock is not synchronized with the host clock, sensor time and host time have different starting points, and drift from each other. For this reason, the event image reconstruction node constantly estimates the offset between sensor and host time, which allows it to then convert host time to sensor time for frame generation.

Offset and drift estimation

When ROS event camera packet messages arrive at the reconstruction node, the sensor time of the first event in the packet corresponds to the host time provided by the ROS message header stamp. Thus, for every arriving packet, the reconstruction node updates a running average offset between host time and sensor time, allowing for a two-way conversion between host and sensor time. This is the conversion referred to below when writing “sensor time = estimated(host time)”, meaning the sensor time is computed from the host time by using the estimated offsets, and conversely, with some abuse of notation “host time = estimated(sensor time)” for deriving the host time from the sensor time.

Synchronization modes

Supported synchronization modes:

1) Free Running. The node generates its own frame times, equidistant in sensor time, and not synchronized to any external time sources. 2) Trigger Events. Many event cameras (notably the ones with Prophesee sensors) have an input pin that generates so-called “external trigger events” when a pulse signal arrives. These trigger events are time stamped to the arrival time of they pulse, and inserted into the event stream. When a trigger event is decoded by the reconstruction node, it will emit a frame based on the sensor time of the trigger event. The header stamp of the frame will be estimated from the trigger event’s sensor time. 3) Camera Image. This mode supports synchronizing the event camera to a frame-based camera. If the sync pulse triggering the frame-based camera’s image is not connected to the event camera, the header stamp of the camera image is converted to sensor time which is then used to reconstruct the frame. If a sync pulse is available, the reconstruction node can be configured to use the external trigger events as well, meaning the reconstruction is done based on the sensor time embedded in the external trigger event. The difference with respect to “Trigger Events” mode is that the header time stamp of the emitted image frame will be taken from the camera image header message, such that down-stream calibration packages can directly recognize which camera image frames belong to which reconstructed event image frames. 4) Time Reference. This mode allows for injection of arbitrary frame times via standard ROS TimeReference messages. The header stamp of the message will be used for the header stamp of the reconstructed frame, the time_ref field is expected to contain the sensor time for reconstruction. This mode is useful when two event cameras are connected with a sync cable, i.e. their sensor time is synchronized, and one (or both) are connected to an external trigger pulse. 
One of the reconstruction nodes is then configured to publish a TimeReference message (and also a reconstructed image frame), to which the reconstruction node for the other camera subscribes. This way the reconstructed frames of the two nodes will be based on the same sensor time, and will also carry identical ROS header stamps. If both cameras are connected to the same sync pulse, the node receiving the time reference message can be configured to ignore the sensor time of the message, and instead use the sensor time from external trigger events.

Node Parameters

  • sync_mode: How to find the sensor time for reconstructing frames. See Synchronization Modes and the sync table below for possible values. Default: free_running.
  • use_trigger_events: Set this to true to use external trigger events in the event data stream. See Synchronization Modes and the sync table below. Default: False.
  • fps: Frequency (in hz) at which images are reconstructed in free running mode. Default: 25.
  • cutoff_num_events: The cutoff period (in number of events) for the reconstruction algorithm. See the FIBAR paper. Default: 40
  • use_spatial_filter: whether to use spatial filtering (FIBAR). Default: true.
  • statistics_period: Time period in seconds between statistics printouts. Default: 5.
  • event_queue_memory_limit: How many bytes of event data to keep in the incoming queue before dropping data. Default: 10MB.
  • ros_event_queue_size: Number of event packet messages to keep in the ROS receive queue. Default: 1000.
  • edge: Whether to use the up or down edge of the hardware trigger signal. Default: up.
  • frame_path: output directory for reconstructed frames and frame-based camera images. Set to empty string to suppress frame writing. Default: "".
  • publish_time_reference: whether to publish time reference message. Default: false.
sync_mode use_trigger_events frame time source ROS header time stamp note
free_running false sensor clock estimated(sensor time)  
free_running true INVALID CONFIG INVALID CONFIG  
trigger_events false INVALID CONFIG INVALID CONFIG  
trigger_events true external trigger estimated(sensor time)  
camera_image false estimated(image.header.stamp) image.header.stamp  
camera_image true external trigger image.header.stamp  
time_reference false estimated(ref.header.stamp) ref.header.stamp  
time_reference true external trigger ref.header.stamp  

Node Topics

File truncated at 100 lines see the full file

CONTRIBUTING

Any contribution that you make to this repository will be under the Apache 2 License, as dictated by that license:

5. Submission of Contributions. Unless You explicitly state otherwise,
   any Contribution intentionally submitted for inclusion in the Work
   by You to the Licensor shall be under the terms and conditions of
   this License, without any additional terms or conditions.
   Notwithstanding the above, nothing herein shall supersede or modify
   the terms of any separate license agreement you may have executed
   with Licensor regarding such Contributions.

Contributors must sign-off each commit by adding a Signed-off-by: ... line to commit messages to certify that they have the right to submit the code they are contributing to the project according to the Developer Certificate of Origin (DCO).

Any contribution that you make to this repository will be under the Apache 2 License, as dictated by that [license](http://www.apache.org/licenses/LICENSE-2.0.html): ~~~ 5. Submission of Contributions. Unless You explicitly state otherwise, any Contribution intentionally submitted for inclusion in the Work by You to the Licensor shall be under the terms and conditions of this License, without any additional terms or conditions. Notwithstanding the above, nothing herein shall supersede or modify the terms of any separate license agreement you may have executed with Licensor regarding such Contributions. ~~~ Contributors must sign-off each commit by adding a `Signed-off-by: ...` line to commit messages to certify that they have the right to submit the code they are contributing to the project according to the [Developer Certificate of Origin (DCO)](https://developercertificate.org/).
No version for distro jade showing humble. Known supported distros are highlighted in the buttons above.

Repository Summary

Checkout URI https://github.com/ros-event-camera/event_image_reconstruction_fibar.git
VCS Type git
VCS Version release
Last Updated 2025-12-06
Dev Status DEVELOPED
Released UNRELEASED
Contributing Help Wanted (-)
Good First Issues (-)
Pull Requests to Review (-)

Packages

README

event_image_reconstruction_fibar

This repository contains a ROS package for event image reconstruction by means of a temporal and spatial filtering algorithm described here. It depends on the fibar library.

stereo apriltags

Supported platforms

Continuous integration testing for ROS Humble and later distros.

How to build

Set the following shell variables:

repo=event_image_reconstruction_fibar
url=https://github.com/ros-event-camera/${repo}.git

and follow the instructions here

About time synchronization and time stamps

The FIBAR algorithm reconstructs a brightness image event by event, and produces image frames for given frame times. This section explains how these frame times are computed, and how they are synchronized with external sources.

First off, all frames are ultimately produced based on sensor time, that is, the time stamps generated by the camera’s internal clock, and affixed to each event individually. However, when synchronizing against an external time source such as e.g. a camera, the time for which to reconstruct the frame will be specified by the host time given in the ROS image message header stamp. See the event camera codecs repository for more details on sensor vs host time. Since the sensor clock is not synchronized with the host clock, sensor time and host time have different starting points, and drift from each other. For this reason, the event image reconstruction node constantly estimates the offset between sensor and host time, which allows it to then convert host time to sensor time for frame generation.

Offset and drift estimation

When ROS event camera packet messages arrive at the reconstruction node, the sensor time of the first event in the packet corresponds to the host time provided by the ROS message header stamp. Thus, for every arriving packet, the reconstruction node updates a running average offset between host time and sensor time, allowing for a two-way conversion between host and sensor time. This is the conversion referred to below when writing “sensor time = estimated(host time)”, meaning the sensor time is computed from the host time by using the estimated offsets, and conversely, with some abuse of notation “host time = estimated(sensor time)” for deriving the host time from the sensor time.

Synchronization modes

Supported synchronization modes:

1) Free Running. The node generates its own frame times, equidistant in sensor time, and not synchronized to any external time sources. 2) Trigger Events. Many event cameras (notably the ones with Prophesee sensors) have an input pin that generates so-called “external trigger events” when a pulse signal arrives. These trigger events are time stamped to the arrival time of they pulse, and inserted into the event stream. When a trigger event is decoded by the reconstruction node, it will emit a frame based on the sensor time of the trigger event. The header stamp of the frame will be estimated from the trigger event’s sensor time. 3) Camera Image. This mode supports synchronizing the event camera to a frame-based camera. If the sync pulse triggering the frame-based camera’s image is not connected to the event camera, the header stamp of the camera image is converted to sensor time which is then used to reconstruct the frame. If a sync pulse is available, the reconstruction node can be configured to use the external trigger events as well, meaning the reconstruction is done based on the sensor time embedded in the external trigger event. The difference with respect to “Trigger Events” mode is that the header time stamp of the emitted image frame will be taken from the camera image header message, such that down-stream calibration packages can directly recognize which camera image frames belong to which reconstructed event image frames. 4) Time Reference. This mode allows for injection of arbitrary frame times via standard ROS TimeReference messages. The header stamp of the message will be used for the header stamp of the reconstructed frame, the time_ref field is expected to contain the sensor time for reconstruction. This mode is useful when two event cameras are connected with a sync cable, i.e. their sensor time is synchronized, and one (or both) are connected to an external trigger pulse. 
One of the reconstruction nodes is then configured to publish a TimeReference message (and also a reconstructed image frame), to which the reconstruction node for the other camera subscribes. This way the reconstructed frames of the two nodes will be based on the same sensor time, and will also carry identical ROS header stamps. If both cameras are connected to the same sync pulse, the node receiving the time reference message can be configured to ignore the sensor time of the message, and instead use the sensor time from external trigger events.

Node Parameters

  • sync_mode: How to find the sensor time for reconstructing frames. See Synchronization Modes and the sync table below for possible values. Default: free_running.
  • use_trigger_events: Set this to true to use external trigger events in the event data stream. See Synchronization Modes and the sync table below. Default: False.
  • fps: Frequency (in hz) at which images are reconstructed in free running mode. Default: 25.
  • cutoff_num_events: The cutoff period (in number of events) for the reconstruction algorithm. See the FIBAR paper. Default: 40
  • use_spatial_filter: whether to use spatial filtering (FIBAR). Default: true.
  • statistics_period: Time period in seconds between statistics printouts. Default: 5.
  • event_queue_memory_limit: How many bytes of event data to keep in the incoming queue before dropping data. Default: 10MB.
  • ros_event_queue_size: Number of event packet messages to keep in the ROS receive queue. Default: 1000.
  • edge: Whether to use the up or down edge of the hardware trigger signal. Default: up.
  • frame_path: output directory for reconstructed frames and frame-based camera images. Set to empty string to suppress frame writing. Default: "".
  • publish_time_reference: whether to publish time reference message. Default: false.
sync_mode use_trigger_events frame time source ROS header time stamp note
free_running false sensor clock estimated(sensor time)  
free_running true INVALID CONFIG INVALID CONFIG  
trigger_events false INVALID CONFIG INVALID CONFIG  
trigger_events true external trigger estimated(sensor time)  
camera_image false estimated(image.header.stamp) image.header.stamp  
camera_image true external trigger image.header.stamp  
time_reference false estimated(ref.header.stamp) ref.header.stamp  
time_reference true external trigger ref.header.stamp  

Node Topics

File truncated at 100 lines see the full file

CONTRIBUTING

Any contribution that you make to this repository will be under the Apache 2 License, as dictated by that license:

5. Submission of Contributions. Unless You explicitly state otherwise,
   any Contribution intentionally submitted for inclusion in the Work
   by You to the Licensor shall be under the terms and conditions of
   this License, without any additional terms or conditions.
   Notwithstanding the above, nothing herein shall supersede or modify
   the terms of any separate license agreement you may have executed
   with Licensor regarding such Contributions.

Contributors must sign-off each commit by adding a Signed-off-by: ... line to commit messages to certify that they have the right to submit the code they are contributing to the project according to the Developer Certificate of Origin (DCO).

Any contribution that you make to this repository will be under the Apache 2 License, as dictated by that [license](http://www.apache.org/licenses/LICENSE-2.0.html): ~~~ 5. Submission of Contributions. Unless You explicitly state otherwise, any Contribution intentionally submitted for inclusion in the Work by You to the Licensor shall be under the terms and conditions of this License, without any additional terms or conditions. Notwithstanding the above, nothing herein shall supersede or modify the terms of any separate license agreement you may have executed with Licensor regarding such Contributions. ~~~ Contributors must sign-off each commit by adding a `Signed-off-by: ...` line to commit messages to certify that they have the right to submit the code they are contributing to the project according to the [Developer Certificate of Origin (DCO)](https://developercertificate.org/).
No version for distro indigo showing humble. Known supported distros are highlighted in the buttons above.

Repository Summary

Checkout URI https://github.com/ros-event-camera/event_image_reconstruction_fibar.git
VCS Type git
VCS Version release
Last Updated 2025-12-06
Dev Status DEVELOPED
Released UNRELEASED
Contributing Help Wanted (-)
Good First Issues (-)
Pull Requests to Review (-)

Packages

README

event_image_reconstruction_fibar

This repository contains a ROS package for event image reconstruction by means of a temporal and spatial filtering algorithm described here. It depends on the fibar library.

stereo apriltags

Supported platforms

Continuous integration testing for ROS Humble and later distros.

How to build

Set the following shell variables:

repo=event_image_reconstruction_fibar
url=https://github.com/ros-event-camera/${repo}.git

and follow the instructions here

About time synchronization and time stamps

The FIBAR algorithm reconstructs a brightness image event by event, and produces image frames for given frame times. This section explains how these frame times are computed, and how they are synchronized with external sources.

First off, all frames are ultimately produced based on sensor time, that is, the time stamps generated by the camera’s internal clock, and affixed to each event individually. However, when synchronizing against an external time source such as e.g. a camera, the time for which to reconstruct the frame will be specified by the host time given in the ROS image message header stamp. See the event camera codecs repository for more details on sensor vs host time. Since the sensor clock is not synchronized with the host clock, sensor time and host time have different starting points, and drift from each other. For this reason, the event image reconstruction node constantly estimates the offset between sensor and host time, which allows it to then convert host time to sensor time for frame generation.

Offset and drift estimation

When ROS event camera packet messages arrive at the reconstruction node, the sensor time of the first event in the packet corresponds to the host time provided by the ROS message header stamp. Thus, for every arriving packet, the reconstruction node updates a running average offset between host time and sensor time, allowing for a two-way conversion between host and sensor time. This is the conversion referred to below when writing “sensor time = estimated(host time)”, meaning the sensor time is computed from the host time by using the estimated offsets, and conversely, with some abuse of notation “host time = estimated(sensor time)” for deriving the host time from the sensor time.

Synchronization modes

Supported synchronization modes:

1) Free Running. The node generates its own frame times, equidistant in sensor time, and not synchronized to any external time sources. 2) Trigger Events. Many event cameras (notably the ones with Prophesee sensors) have an input pin that generates so-called “external trigger events” when a pulse signal arrives. These trigger events are time stamped to the arrival time of they pulse, and inserted into the event stream. When a trigger event is decoded by the reconstruction node, it will emit a frame based on the sensor time of the trigger event. The header stamp of the frame will be estimated from the trigger event’s sensor time. 3) Camera Image. This mode supports synchronizing the event camera to a frame-based camera. If the sync pulse triggering the frame-based camera’s image is not connected to the event camera, the header stamp of the camera image is converted to sensor time which is then used to reconstruct the frame. If a sync pulse is available, the reconstruction node can be configured to use the external trigger events as well, meaning the reconstruction is done based on the sensor time embedded in the external trigger event. The difference with respect to “Trigger Events” mode is that the header time stamp of the emitted image frame will be taken from the camera image header message, such that down-stream calibration packages can directly recognize which camera image frames belong to which reconstructed event image frames. 4) Time Reference. This mode allows for injection of arbitrary frame times via standard ROS TimeReference messages. The header stamp of the message will be used for the header stamp of the reconstructed frame, the time_ref field is expected to contain the sensor time for reconstruction. This mode is useful when two event cameras are connected with a sync cable, i.e. their sensor time is synchronized, and one (or both) are connected to an external trigger pulse. 
One of the reconstruction nodes is then configured to publish a TimeReference message (and also a reconstructed image frame), to which the reconstruction node for the other camera subscribes. This way the reconstructed frames of the two nodes will be based on the same sensor time, and will also carry identical ROS header stamps. If both cameras are connected to the same sync pulse, the node receiving the time reference message can be configured to ignore the sensor time of the message, and instead use the sensor time from external trigger events.

Node Parameters

  • sync_mode: How to find the sensor time for reconstructing frames. See Synchronization Modes and the sync table below for possible values. Default: free_running.
  • use_trigger_events: Set this to true to use external trigger events in the event data stream. See Synchronization Modes and the sync table below. Default: false.
  • fps: Frequency (in Hz) at which images are reconstructed in free-running mode. Default: 25.
  • cutoff_num_events: The cutoff period (in number of events) for the reconstruction algorithm. See the FIBAR paper. Default: 40.
  • use_spatial_filter: Whether to use spatial filtering (FIBAR). Default: true.
  • statistics_period: Time period in seconds between statistics printouts. Default: 5.
  • event_queue_memory_limit: How many bytes of event data to keep in the incoming queue before dropping data. Default: 10MB.
  • ros_event_queue_size: Number of event packet messages to keep in the ROS receive queue. Default: 1000.
  • edge: Whether to use the up or down edge of the hardware trigger signal. Default: up.
  • frame_path: Output directory for reconstructed frames and frame-based camera images. Set to the empty string to suppress frame writing. Default: "".
  • publish_time_reference: Whether to publish a TimeReference message. Default: false.
| sync_mode      | use_trigger_events | frame time source             | ROS header time stamp  | note |
|----------------|--------------------|-------------------------------|------------------------|------|
| free_running   | false              | sensor clock                  | estimated(sensor time) |      |
| free_running   | true               | INVALID CONFIG                | INVALID CONFIG         |      |
| trigger_events | false              | INVALID CONFIG                | INVALID CONFIG         |      |
| trigger_events | true               | external trigger              | estimated(sensor time) |      |
| camera_image   | false              | estimated(image.header.stamp) | image.header.stamp     |      |
| camera_image   | true               | external trigger              | image.header.stamp     |      |
| time_reference | false              | estimated(ref.header.stamp)   | ref.header.stamp       |      |
| time_reference | true               | external trigger              | ref.header.stamp       |      |
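The sync table above can be expressed as a simple lookup. The sketch below is illustrative only (the function name and error handling are assumptions, not the node's API); it shows which time sources apply for each parameter combination and which combinations are invalid.

```python
# Illustrative lookup for the sync table (NOT part of the node's API).
# Maps (sync_mode, use_trigger_events) to
# (frame time source, ROS header time stamp source).

SYNC_TABLE = {
    ("free_running", False):   ("sensor clock", "estimated(sensor time)"),
    ("trigger_events", True):  ("external trigger", "estimated(sensor time)"),
    ("camera_image", False):   ("estimated(image.header.stamp)", "image.header.stamp"),
    ("camera_image", True):    ("external trigger", "image.header.stamp"),
    ("time_reference", False): ("estimated(ref.header.stamp)", "ref.header.stamp"),
    ("time_reference", True):  ("external trigger", "ref.header.stamp"),
}


def resolve_sync(sync_mode: str, use_trigger_events: bool):
    """Return the time sources for a parameter combination, or raise."""
    try:
        return SYNC_TABLE[(sync_mode, use_trigger_events)]
    except KeyError:
        # free_running + trigger events (and trigger_events without them)
        # are invalid configurations per the table above.
        raise ValueError(
            f"invalid config: sync_mode={sync_mode}, "
            f"use_trigger_events={use_trigger_events}")
```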

Node Topics

File truncated at 100 lines; see the full file in the repository.

CONTRIBUTING

Any contribution that you make to this repository will be under the Apache 2 License, as dictated by that license:

5. Submission of Contributions. Unless You explicitly state otherwise,
   any Contribution intentionally submitted for inclusion in the Work
   by You to the Licensor shall be under the terms and conditions of
   this License, without any additional terms or conditions.
   Notwithstanding the above, nothing herein shall supersede or modify
   the terms of any separate license agreement you may have executed
   with Licensor regarding such Contributions.

Contributors must sign off each commit by adding a `Signed-off-by: ...` line to the commit message, certifying that they have the right to submit the code they are contributing to the project according to the Developer Certificate of Origin (DCO).

No version for distro hydro showing humble. Known supported distros are highlighted in the buttons above.

Repository Summary

Checkout URI https://github.com/ros-event-camera/event_image_reconstruction_fibar.git
VCS Type git
VCS Version release
Last Updated 2025-12-06
Dev Status DEVELOPED
Released UNRELEASED
Contributing Help Wanted (-)
Good First Issues (-)
Pull Requests to Review (-)

Packages

README

event_image_reconstruction_fibar

This repository contains a ROS package for event image reconstruction by means of a temporal and spatial filtering algorithm described here. It depends on the fibar library.

stereo apriltags

Supported platforms

Continuous integration testing for ROS Humble and later distros.

How to build

Set the following shell variables:

repo=event_image_reconstruction_fibar
url=https://github.com/ros-event-camera/${repo}.git

and follow the instructions here

About time synchronization and time stamps

The FIBAR algorithm reconstructs a brightness image event by event, and produces image frames for given frame times. This section explains how these frame times are computed, and how they are synchronized with external sources.

First off, all frames are ultimately produced based on sensor time, that is, the time stamps generated by the camera’s internal clock, and affixed to each event individually. However, when synchronizing against an external time source such as e.g. a camera, the time for which to reconstruct the frame will be specified by the host time given in the ROS image message header stamp. See the event camera codecs repository for more details on sensor vs host time. Since the sensor clock is not synchronized with the host clock, sensor time and host time have different starting points, and drift from each other. For this reason, the event image reconstruction node constantly estimates the offset between sensor and host time, which allows it to then convert host time to sensor time for frame generation.

Offset and drift estimation

When ROS event camera packet messages arrive at the reconstruction node, the sensor time of the first event in the packet corresponds to the host time provided by the ROS message header stamp. Thus, for every arriving packet, the reconstruction node updates a running average offset between host time and sensor time, allowing for a two-way conversion between host and sensor time. This is the conversion referred to below when writing “sensor time = estimated(host time)”, meaning the sensor time is computed from the host time by using the estimated offsets, and conversely, with some abuse of notation “host time = estimated(sensor time)” for deriving the host time from the sensor time.

Synchronization modes

Supported synchronization modes:

1) Free Running. The node generates its own frame times, equidistant in sensor time, and not synchronized to any external time sources. 2) Trigger Events. Many event cameras (notably the ones with Prophesee sensors) have an input pin that generates so-called “external trigger events” when a pulse signal arrives. These trigger events are time stamped to the arrival time of they pulse, and inserted into the event stream. When a trigger event is decoded by the reconstruction node, it will emit a frame based on the sensor time of the trigger event. The header stamp of the frame will be estimated from the trigger event’s sensor time. 3) Camera Image. This mode supports synchronizing the event camera to a frame-based camera. If the sync pulse triggering the frame-based camera’s image is not connected to the event camera, the header stamp of the camera image is converted to sensor time which is then used to reconstruct the frame. If a sync pulse is available, the reconstruction node can be configured to use the external trigger events as well, meaning the reconstruction is done based on the sensor time embedded in the external trigger event. The difference with respect to “Trigger Events” mode is that the header time stamp of the emitted image frame will be taken from the camera image header message, such that down-stream calibration packages can directly recognize which camera image frames belong to which reconstructed event image frames. 4) Time Reference. This mode allows for injection of arbitrary frame times via standard ROS TimeReference messages. The header stamp of the message will be used for the header stamp of the reconstructed frame, the time_ref field is expected to contain the sensor time for reconstruction. This mode is useful when two event cameras are connected with a sync cable, i.e. their sensor time is synchronized, and one (or both) are connected to an external trigger pulse. 
One of the reconstruction nodes is then configured to publish a TimeReference message (and also a reconstructed image frame), to which the reconstruction node for the other camera subscribes. This way the reconstructed frames of the two nodes will be based on the same sensor time, and will also carry identical ROS header stamps. If both cameras are connected to the same sync pulse, the node receiving the time reference message can be configured to ignore the sensor time of the message, and instead use the sensor time from external trigger events.

Node Parameters

  • sync_mode: How to find the sensor time for reconstructing frames. See Synchronization Modes and the sync table below for possible values. Default: free_running.
  • use_trigger_events: Set this to true to use external trigger events in the event data stream. See Synchronization Modes and the sync table below. Default: False.
  • fps: Frequency (in hz) at which images are reconstructed in free running mode. Default: 25.
  • cutoff_num_events: The cutoff period (in number of events) for the reconstruction algorithm. See the FIBAR paper. Default: 40
  • use_spatial_filter: whether to use spatial filtering (FIBAR). Default: true.
  • statistics_period: Time period in seconds between statistics printouts. Default: 5.
  • event_queue_memory_limit: How many bytes of event data to keep in the incoming queue before dropping data. Default: 10MB.
  • ros_event_queue_size: Number of event packet messages to keep in the ROS receive queue. Default: 1000.
  • edge: Whether to use the up or down edge of the hardware trigger signal. Default: up.
  • frame_path: output directory for reconstructed frames and frame-based camera images. Set to empty string to suppress frame writing. Default: "".
  • publish_time_reference: whether to publish time reference message. Default: false.
sync_mode use_trigger_events frame time source ROS header time stamp note
free_running false sensor clock estimated(sensor time)  
free_running true INVALID CONFIG INVALID CONFIG  
trigger_events false INVALID CONFIG INVALID CONFIG  
trigger_events true external trigger estimated(sensor time)  
camera_image false estimated(image.header.stamp) image.header.stamp  
camera_image true external trigger image.header.stamp  
time_reference false estimated(ref.header.stamp) ref.header.stamp  
time_reference true external trigger ref.header.stamp  

Node Topics

File truncated at 100 lines see the full file

CONTRIBUTING

Any contribution that you make to this repository will be under the Apache 2 License, as dictated by that license:

5. Submission of Contributions. Unless You explicitly state otherwise,
   any Contribution intentionally submitted for inclusion in the Work
   by You to the Licensor shall be under the terms and conditions of
   this License, without any additional terms or conditions.
   Notwithstanding the above, nothing herein shall supersede or modify
   the terms of any separate license agreement you may have executed
   with Licensor regarding such Contributions.

Contributors must sign-off each commit by adding a Signed-off-by: ... line to commit messages to certify that they have the right to submit the code they are contributing to the project according to the Developer Certificate of Origin (DCO).

Any contribution that you make to this repository will be under the Apache 2 License, as dictated by that [license](http://www.apache.org/licenses/LICENSE-2.0.html): ~~~ 5. Submission of Contributions. Unless You explicitly state otherwise, any Contribution intentionally submitted for inclusion in the Work by You to the Licensor shall be under the terms and conditions of this License, without any additional terms or conditions. Notwithstanding the above, nothing herein shall supersede or modify the terms of any separate license agreement you may have executed with Licensor regarding such Contributions. ~~~ Contributors must sign-off each commit by adding a `Signed-off-by: ...` line to commit messages to certify that they have the right to submit the code they are contributing to the project according to the [Developer Certificate of Origin (DCO)](https://developercertificate.org/).
No version for distro kinetic showing humble. Known supported distros are highlighted in the buttons above.

Repository Summary

Checkout URI https://github.com/ros-event-camera/event_image_reconstruction_fibar.git
VCS Type git
VCS Version release
Last Updated 2025-12-06
Dev Status DEVELOPED
Released UNRELEASED
Contributing Help Wanted (-)
Good First Issues (-)
Pull Requests to Review (-)

Packages

README

event_image_reconstruction_fibar

This repository contains a ROS package for event image reconstruction by means of a temporal and spatial filtering algorithm described here. It depends on the fibar library.

stereo apriltags

Supported platforms

Continuous integration testing for ROS Humble and later distros.

How to build

Set the following shell variables:

repo=event_image_reconstruction_fibar
url=https://github.com/ros-event-camera/${repo}.git

and follow the instructions here

About time synchronization and time stamps

The FIBAR algorithm reconstructs a brightness image event by event, and produces image frames for given frame times. This section explains how these frame times are computed, and how they are synchronized with external sources.

First off, all frames are ultimately produced based on sensor time, that is, the time stamps generated by the camera’s internal clock, and affixed to each event individually. However, when synchronizing against an external time source such as e.g. a camera, the time for which to reconstruct the frame will be specified by the host time given in the ROS image message header stamp. See the event camera codecs repository for more details on sensor vs host time. Since the sensor clock is not synchronized with the host clock, sensor time and host time have different starting points, and drift from each other. For this reason, the event image reconstruction node constantly estimates the offset between sensor and host time, which allows it to then convert host time to sensor time for frame generation.

Offset and drift estimation

When ROS event camera packet messages arrive at the reconstruction node, the sensor time of the first event in the packet corresponds to the host time provided by the ROS message header stamp. Thus, for every arriving packet, the reconstruction node updates a running average offset between host time and sensor time, allowing for a two-way conversion between host and sensor time. This is the conversion referred to below when writing “sensor time = estimated(host time)”, meaning the sensor time is computed from the host time by using the estimated offsets, and conversely, with some abuse of notation “host time = estimated(sensor time)” for deriving the host time from the sensor time.

Synchronization modes

Supported synchronization modes:

1) Free Running. The node generates its own frame times, equidistant in sensor time, and not synchronized to any external time sources. 2) Trigger Events. Many event cameras (notably the ones with Prophesee sensors) have an input pin that generates so-called “external trigger events” when a pulse signal arrives. These trigger events are time stamped to the arrival time of they pulse, and inserted into the event stream. When a trigger event is decoded by the reconstruction node, it will emit a frame based on the sensor time of the trigger event. The header stamp of the frame will be estimated from the trigger event’s sensor time. 3) Camera Image. This mode supports synchronizing the event camera to a frame-based camera. If the sync pulse triggering the frame-based camera’s image is not connected to the event camera, the header stamp of the camera image is converted to sensor time which is then used to reconstruct the frame. If a sync pulse is available, the reconstruction node can be configured to use the external trigger events as well, meaning the reconstruction is done based on the sensor time embedded in the external trigger event. The difference with respect to “Trigger Events” mode is that the header time stamp of the emitted image frame will be taken from the camera image header message, such that down-stream calibration packages can directly recognize which camera image frames belong to which reconstructed event image frames. 4) Time Reference. This mode allows for injection of arbitrary frame times via standard ROS TimeReference messages. The header stamp of the message will be used for the header stamp of the reconstructed frame, the time_ref field is expected to contain the sensor time for reconstruction. This mode is useful when two event cameras are connected with a sync cable, i.e. their sensor time is synchronized, and one (or both) are connected to an external trigger pulse. 
One of the reconstruction nodes is then configured to publish a TimeReference message (and also a reconstructed image frame), to which the reconstruction node for the other camera subscribes. This way the reconstructed frames of the two nodes will be based on the same sensor time, and will also carry identical ROS header stamps. If both cameras are connected to the same sync pulse, the node receiving the time reference message can be configured to ignore the sensor time of the message, and instead use the sensor time from external trigger events.

Node Parameters

  • sync_mode: How to find the sensor time for reconstructing frames. See Synchronization Modes and the sync table below for possible values. Default: free_running.
  • use_trigger_events: Set this to true to use external trigger events in the event data stream. See Synchronization Modes and the sync table below. Default: False.
  • fps: Frequency (in hz) at which images are reconstructed in free running mode. Default: 25.
  • cutoff_num_events: The cutoff period (in number of events) for the reconstruction algorithm. See the FIBAR paper. Default: 40
  • use_spatial_filter: whether to use spatial filtering (FIBAR). Default: true.
  • statistics_period: Time period in seconds between statistics printouts. Default: 5.
  • event_queue_memory_limit: How many bytes of event data to keep in the incoming queue before dropping data. Default: 10MB.
  • ros_event_queue_size: Number of event packet messages to keep in the ROS receive queue. Default: 1000.
  • edge: Whether to use the up or down edge of the hardware trigger signal. Default: up.
  • frame_path: output directory for reconstructed frames and frame-based camera images. Set to empty string to suppress frame writing. Default: "".
  • publish_time_reference: whether to publish time reference message. Default: false.
sync_mode use_trigger_events frame time source ROS header time stamp note
free_running false sensor clock estimated(sensor time)  
free_running true INVALID CONFIG INVALID CONFIG  
trigger_events false INVALID CONFIG INVALID CONFIG  
trigger_events true external trigger estimated(sensor time)  
camera_image false estimated(image.header.stamp) image.header.stamp  
camera_image true external trigger image.header.stamp  
time_reference false estimated(ref.header.stamp) ref.header.stamp  
time_reference true external trigger ref.header.stamp  

Node Topics

File truncated at 100 lines see the full file

CONTRIBUTING

Any contribution that you make to this repository will be under the Apache 2 License, as dictated by that license:

5. Submission of Contributions. Unless You explicitly state otherwise,
   any Contribution intentionally submitted for inclusion in the Work
   by You to the Licensor shall be under the terms and conditions of
   this License, without any additional terms or conditions.
   Notwithstanding the above, nothing herein shall supersede or modify
   the terms of any separate license agreement you may have executed
   with Licensor regarding such Contributions.

Contributors must sign-off each commit by adding a Signed-off-by: ... line to commit messages to certify that they have the right to submit the code they are contributing to the project according to the Developer Certificate of Origin (DCO).

Any contribution that you make to this repository will be under the Apache 2 License, as dictated by that [license](http://www.apache.org/licenses/LICENSE-2.0.html): ~~~ 5. Submission of Contributions. Unless You explicitly state otherwise, any Contribution intentionally submitted for inclusion in the Work by You to the Licensor shall be under the terms and conditions of this License, without any additional terms or conditions. Notwithstanding the above, nothing herein shall supersede or modify the terms of any separate license agreement you may have executed with Licensor regarding such Contributions. ~~~ Contributors must sign-off each commit by adding a `Signed-off-by: ...` line to commit messages to certify that they have the right to submit the code they are contributing to the project according to the [Developer Certificate of Origin (DCO)](https://developercertificate.org/).
No version for distro melodic showing humble. Known supported distros are highlighted in the buttons above.

Repository Summary

Checkout URI https://github.com/ros-event-camera/event_image_reconstruction_fibar.git
VCS Type git
VCS Version release
Last Updated 2025-12-06
Dev Status DEVELOPED
Released UNRELEASED
Contributing Help Wanted (-)
Good First Issues (-)
Pull Requests to Review (-)

Packages

README

event_image_reconstruction_fibar

This repository contains a ROS package for event image reconstruction by means of a temporal and spatial filtering algorithm described here. It depends on the fibar library.

stereo apriltags

Supported platforms

Continuous integration testing for ROS Humble and later distros.

How to build

Set the following shell variables:

repo=event_image_reconstruction_fibar
url=https://github.com/ros-event-camera/${repo}.git

and follow the instructions here

About time synchronization and time stamps

The FIBAR algorithm reconstructs a brightness image event by event, and produces image frames for given frame times. This section explains how these frame times are computed, and how they are synchronized with external sources.

First off, all frames are ultimately produced based on sensor time, that is, the time stamps generated by the camera’s internal clock, and affixed to each event individually. However, when synchronizing against an external time source such as e.g. a camera, the time for which to reconstruct the frame will be specified by the host time given in the ROS image message header stamp. See the event camera codecs repository for more details on sensor vs host time. Since the sensor clock is not synchronized with the host clock, sensor time and host time have different starting points, and drift from each other. For this reason, the event image reconstruction node constantly estimates the offset between sensor and host time, which allows it to then convert host time to sensor time for frame generation.

Offset and drift estimation

When ROS event camera packet messages arrive at the reconstruction node, the sensor time of the first event in the packet corresponds to the host time provided by the ROS message header stamp. Thus, for every arriving packet, the reconstruction node updates a running average offset between host time and sensor time, allowing for a two-way conversion between host and sensor time. This is the conversion referred to below when writing “sensor time = estimated(host time)”, meaning the sensor time is computed from the host time by using the estimated offsets, and conversely, with some abuse of notation “host time = estimated(sensor time)” for deriving the host time from the sensor time.

Synchronization modes

Supported synchronization modes:

1) Free Running. The node generates its own frame times, equidistant in sensor time, and not synchronized to any external time sources. 2) Trigger Events. Many event cameras (notably the ones with Prophesee sensors) have an input pin that generates so-called “external trigger events” when a pulse signal arrives. These trigger events are time stamped to the arrival time of they pulse, and inserted into the event stream. When a trigger event is decoded by the reconstruction node, it will emit a frame based on the sensor time of the trigger event. The header stamp of the frame will be estimated from the trigger event’s sensor time. 3) Camera Image. This mode supports synchronizing the event camera to a frame-based camera. If the sync pulse triggering the frame-based camera’s image is not connected to the event camera, the header stamp of the camera image is converted to sensor time which is then used to reconstruct the frame. If a sync pulse is available, the reconstruction node can be configured to use the external trigger events as well, meaning the reconstruction is done based on the sensor time embedded in the external trigger event. The difference with respect to “Trigger Events” mode is that the header time stamp of the emitted image frame will be taken from the camera image header message, such that down-stream calibration packages can directly recognize which camera image frames belong to which reconstructed event image frames. 4) Time Reference. This mode allows for injection of arbitrary frame times via standard ROS TimeReference messages. The header stamp of the message will be used for the header stamp of the reconstructed frame, the time_ref field is expected to contain the sensor time for reconstruction. This mode is useful when two event cameras are connected with a sync cable, i.e. their sensor time is synchronized, and one (or both) are connected to an external trigger pulse. 
One of the reconstruction nodes is then configured to publish a TimeReference message (and also a reconstructed image frame), to which the reconstruction node for the other camera subscribes. This way the reconstructed frames of the two nodes will be based on the same sensor time, and will also carry identical ROS header stamps. If both cameras are connected to the same sync pulse, the node receiving the time reference message can be configured to ignore the sensor time of the message, and instead use the sensor time from external trigger events.

Node Parameters

  • sync_mode: How to find the sensor time for reconstructing frames. See Synchronization Modes and the sync table below for possible values. Default: free_running.
  • use_trigger_events: Set this to true to use external trigger events in the event data stream. See Synchronization Modes and the sync table below. Default: False.
  • fps: Frequency (in hz) at which images are reconstructed in free running mode. Default: 25.
  • cutoff_num_events: The cutoff period (in number of events) for the reconstruction algorithm. See the FIBAR paper. Default: 40
  • use_spatial_filter: whether to use spatial filtering (FIBAR). Default: true.
  • statistics_period: Time period in seconds between statistics printouts. Default: 5.
  • event_queue_memory_limit: How many bytes of event data to keep in the incoming queue before dropping data. Default: 10MB.
  • ros_event_queue_size: Number of event packet messages to keep in the ROS receive queue. Default: 1000.
  • edge: Whether to use the up or down edge of the hardware trigger signal. Default: up.
  • frame_path: output directory for reconstructed frames and frame-based camera images. Set to empty string to suppress frame writing. Default: "".
  • publish_time_reference: whether to publish time reference message. Default: false.
sync_mode use_trigger_events frame time source ROS header time stamp note
free_running false sensor clock estimated(sensor time)  
free_running true INVALID CONFIG INVALID CONFIG  
trigger_events false INVALID CONFIG INVALID CONFIG  
trigger_events true external trigger estimated(sensor time)  
camera_image false estimated(image.header.stamp) image.header.stamp  
camera_image true external trigger image.header.stamp  
time_reference false estimated(ref.header.stamp) ref.header.stamp  
time_reference true external trigger ref.header.stamp  

Node Topics

File truncated at 100 lines see the full file

CONTRIBUTING

Any contribution that you make to this repository will be under the Apache 2 License, as dictated by that license:

5. Submission of Contributions. Unless You explicitly state otherwise,
   any Contribution intentionally submitted for inclusion in the Work
   by You to the Licensor shall be under the terms and conditions of
   this License, without any additional terms or conditions.
   Notwithstanding the above, nothing herein shall supersede or modify
   the terms of any separate license agreement you may have executed
   with Licensor regarding such Contributions.

Contributors must sign-off each commit by adding a Signed-off-by: ... line to commit messages to certify that they have the right to submit the code they are contributing to the project according to the Developer Certificate of Origin (DCO).

Any contribution that you make to this repository will be under the Apache 2 License, as dictated by that [license](http://www.apache.org/licenses/LICENSE-2.0.html): ~~~ 5. Submission of Contributions. Unless You explicitly state otherwise, any Contribution intentionally submitted for inclusion in the Work by You to the Licensor shall be under the terms and conditions of this License, without any additional terms or conditions. Notwithstanding the above, nothing herein shall supersede or modify the terms of any separate license agreement you may have executed with Licensor regarding such Contributions. ~~~ Contributors must sign-off each commit by adding a `Signed-off-by: ...` line to commit messages to certify that they have the right to submit the code they are contributing to the project according to the [Developer Certificate of Origin (DCO)](https://developercertificate.org/).
No version for distro noetic showing humble. Known supported distros are highlighted in the buttons above.

Repository Summary

Checkout URI https://github.com/ros-event-camera/event_image_reconstruction_fibar.git
VCS Type git
VCS Version release
Last Updated 2025-12-06
Dev Status DEVELOPED
Released UNRELEASED
Contributing Help Wanted (-)
Good First Issues (-)
Pull Requests to Review (-)

Packages

README

event_image_reconstruction_fibar

This repository contains a ROS package for event image reconstruction by means of a temporal and spatial filtering algorithm described here. It depends on the fibar library.

stereo apriltags

Supported platforms

Continuous integration testing for ROS Humble and later distros.

How to build

Set the following shell variables:

repo=event_image_reconstruction_fibar
url=https://github.com/ros-event-camera/${repo}.git

and follow the instructions here

About time synchronization and time stamps

The FIBAR algorithm reconstructs a brightness image event by event, and produces image frames for given frame times. This section explains how these frame times are computed, and how they are synchronized with external sources.

First off, all frames are ultimately produced based on sensor time, that is, the time stamps generated by the camera’s internal clock, and affixed to each event individually. However, when synchronizing against an external time source such as e.g. a camera, the time for which to reconstruct the frame will be specified by the host time given in the ROS image message header stamp. See the event camera codecs repository for more details on sensor vs host time. Since the sensor clock is not synchronized with the host clock, sensor time and host time have different starting points, and drift from each other. For this reason, the event image reconstruction node constantly estimates the offset between sensor and host time, which allows it to then convert host time to sensor time for frame generation.

Offset and drift estimation

When ROS event camera packet messages arrive at the reconstruction node, the sensor time of the first event in the packet corresponds to the host time provided by the ROS message header stamp. Thus, for every arriving packet, the reconstruction node updates a running average offset between host time and sensor time, allowing for a two-way conversion between host and sensor time. This is the conversion referred to below when writing “sensor time = estimated(host time)”, meaning the sensor time is computed from the host time by using the estimated offsets, and conversely, with some abuse of notation “host time = estimated(sensor time)” for deriving the host time from the sensor time.
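The estimation scheme above can be sketched as follows. This is an illustrative Python sketch, not the node’s actual code: the class name, the exponential smoothing factor, and the time units (seconds) are all assumptions.

```python
# Illustrative sketch of the running-average offset estimation described above.
# The name OffsetEstimator and the smoothing factor alpha are hypothetical.

class OffsetEstimator:
    def __init__(self, alpha=0.01):
        self.alpha = alpha   # smoothing factor for the running average
        self.offset = None   # estimated host_time - sensor_time, in seconds

    def update(self, header_stamp, first_event_sensor_time):
        """Update the offset once per arriving event packet."""
        sample = header_stamp - first_event_sensor_time
        if self.offset is None:
            self.offset = sample
        else:
            # exponentially weighted running average absorbs slow clock drift
            self.offset += self.alpha * (sample - self.offset)

    def to_sensor_time(self, host_time):
        """sensor time = estimated(host time)"""
        return host_time - self.offset

    def to_host_time(self, sensor_time):
        """host time = estimated(sensor time)"""
        return sensor_time + self.offset


est = OffsetEstimator()
est.update(header_stamp=100.0, first_event_sensor_time=2.0)  # offset estimate: 98.0 s
t_sensor = est.to_sensor_time(100.5)                         # back to sensor time: 2.5 s
```

A real implementation would also have to guard against packet reordering and host-side scheduling jitter, which is why a running average is used rather than the latest sample.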

Synchronization modes

Supported synchronization modes:

1) Free Running. The node generates its own frame times, equidistant in sensor time and not synchronized to any external time source.

2) Trigger Events. Many event cameras (notably those with Prophesee sensors) have an input pin that generates so-called “external trigger events” when a pulse signal arrives. These trigger events are time stamped with the arrival time of the pulse and inserted into the event stream. When the reconstruction node decodes a trigger event, it emits a frame based on the sensor time of that trigger event. The header stamp of the frame is estimated from the trigger event’s sensor time.

3) Camera Image. This mode synchronizes the event camera to a frame-based camera. If the sync pulse triggering the frame-based camera’s image is not connected to the event camera, the header stamp of the camera image is converted to sensor time, which is then used to reconstruct the frame. If a sync pulse is available, the reconstruction node can be configured to also use the external trigger events, meaning the reconstruction is based on the sensor time embedded in the external trigger event. The difference from “Trigger Events” mode is that the header stamp of the emitted image frame is taken from the camera image message header, so that downstream calibration packages can directly match camera image frames to reconstructed event image frames.

4) Time Reference. This mode allows injection of arbitrary frame times via standard ROS TimeReference messages. The header stamp of the message is used for the header stamp of the reconstructed frame; the time_ref field is expected to contain the sensor time for reconstruction. This mode is useful when two event cameras are connected with a sync cable, i.e. their sensor time is synchronized, and one (or both) are connected to an external trigger pulse. One of the reconstruction nodes is then configured to publish a TimeReference message (and also a reconstructed image frame), to which the reconstruction node for the other camera subscribes. This way the reconstructed frames of the two nodes are based on the same sensor time and also carry identical ROS header stamps. If both cameras are connected to the same sync pulse, the node receiving the time reference message can be configured to ignore the sensor time of the message and instead use the sensor time from external trigger events.

Node Parameters

  • sync_mode: How to find the sensor time for reconstructing frames. See Synchronization modes and the sync table below for possible values. Default: free_running.
  • use_trigger_events: Set this to true to use external trigger events in the event data stream. See Synchronization modes and the sync table below. Default: false.
  • fps: Frequency (in Hz) at which images are reconstructed in free running mode. Default: 25.
  • cutoff_num_events: The cutoff period (in number of events) for the reconstruction algorithm. See the FIBAR paper. Default: 40.
  • use_spatial_filter: Whether to use spatial filtering (FIBAR). Default: true.
  • statistics_period: Time period in seconds between statistics printouts. Default: 5.
  • event_queue_memory_limit: How many bytes of event data to keep in the incoming queue before dropping data. Default: 10MB.
  • ros_event_queue_size: Number of event packet messages to keep in the ROS receive queue. Default: 1000.
  • edge: Whether to use the rising (up) or falling (down) edge of the hardware trigger signal. Default: up.
  • frame_path: Output directory for reconstructed frames and frame-based camera images. Set to an empty string to suppress frame writing. Default: "".
  • publish_time_reference: Whether to publish a TimeReference message. Default: false.
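As an illustration, the parameters above could be set from a ROS 2 Python launch file. This is a hedged sketch only: the executable name reconstruction_node is an assumption (check the package’s launch files for the actual name), and the parameter values simply show one plausible camera-synchronized configuration.

```python
# Hypothetical ROS 2 launch sketch; the executable name is assumed.
from launch import LaunchDescription
from launch_ros.actions import Node


def generate_launch_description():
    return LaunchDescription([
        Node(
            package="event_image_reconstruction_fibar",
            executable="reconstruction_node",  # assumed executable name
            parameters=[{
                "sync_mode": "camera_image",   # sync to a frame-based camera
                "use_trigger_events": True,    # sync pulse wired to the event camera
                "cutoff_num_events": 40,
                "use_spatial_filter": True,
                "statistics_period": 5.0,
            }],
        ),
    ])
```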
| sync_mode      | use_trigger_events | frame time source             | ROS header time stamp  |
|----------------|--------------------|-------------------------------|------------------------|
| free_running   | false              | sensor clock                  | estimated(sensor time) |
| free_running   | true               | INVALID CONFIG                | INVALID CONFIG         |
| trigger_events | false              | INVALID CONFIG                | INVALID CONFIG         |
| trigger_events | true               | external trigger              | estimated(sensor time) |
| camera_image   | false              | estimated(image.header.stamp) | image.header.stamp     |
| camera_image   | true               | external trigger              | image.header.stamp     |
| time_reference | false              | estimated(ref.header.stamp)   | ref.header.stamp       |
| time_reference | true               | external trigger              | ref.header.stamp       |
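The valid and invalid combinations in the sync table can be expressed compactly as a lookup. This is an illustrative Python sketch mirroring the table, not the node’s actual validation code:

```python
# Illustrative mapping of (sync_mode, use_trigger_events) to the frame time
# source, mirroring the sync table; names are hypothetical.

_FRAME_TIME_SOURCES = {
    ("free_running", False): "sensor clock",
    ("trigger_events", True): "external trigger",
    ("camera_image", False): "estimated(image.header.stamp)",
    ("camera_image", True): "external trigger",
    ("time_reference", False): "estimated(ref.header.stamp)",
    ("time_reference", True): "external trigger",
}


def frame_time_source(sync_mode: str, use_trigger_events: bool) -> str:
    """Return the frame time source, or raise on an invalid configuration."""
    key = (sync_mode, use_trigger_events)
    if key not in _FRAME_TIME_SOURCES:
        raise ValueError(f"invalid sync configuration: {key}")
    return _FRAME_TIME_SOURCES[key]
```

Note that free_running with trigger events and trigger_events without them are the two invalid configurations.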

Node Topics

(README truncated at 100 lines; see the repository for the full file.)

CONTRIBUTING

Any contribution that you make to this repository will be under the Apache 2 License, as dictated by that license:

5. Submission of Contributions. Unless You explicitly state otherwise,
   any Contribution intentionally submitted for inclusion in the Work
   by You to the Licensor shall be under the terms and conditions of
   this License, without any additional terms or conditions.
   Notwithstanding the above, nothing herein shall supersede or modify
   the terms of any separate license agreement you may have executed
   with Licensor regarding such Contributions.

Contributors must sign-off each commit by adding a Signed-off-by: ... line to commit messages to certify that they have the right to submit the code they are contributing to the project according to the Developer Certificate of Origin (DCO).
