calibrate_camera_to_camera
Skill class for ai.intrinsic.calibrate_camera_to_camera skill.
The calibrate_camera_to_camera skill computes the extrinsic relationship between multiple cameras in the workcell.
It asks the CalibrationService to process the data collected while the cameras observed a shared calibration
pattern; the service estimates the relative transformations between the cameras, and the skill returns the
results in a MultiCameraCalibrationResult. A consistent extrinsic calibration is essential for multi-camera
setups to operate in a common coordinate frame.
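The underlying idea can be illustrated with a small sketch (this is not the service's actual implementation): when two cameras each estimate the pose of the same calibration pattern, their relative extrinsic follows by composing one pattern pose with the inverse of the other. The helper names and sample poses below are purely illustrative.

```python
import math

def mat_mul(a, b):
    # 4x4 matrix product.
    return [[sum(a[i][k] * b[k][j] for k in range(4)) for j in range(4)]
            for i in range(4)]

def invert_rigid(t):
    # Invert a 4x4 rigid transform [R p; 0 1] -> [R^T -R^T p; 0 1].
    r = [[t[j][i] for j in range(3)] for i in range(3)]
    p = [-sum(r[i][j] * t[j][3] for j in range(3)) for i in range(3)]
    out = [r[i] + [p[i]] for i in range(3)]
    out.append([0.0, 0.0, 0.0, 1.0])
    return out

def rot_z(theta, tx=0.0, ty=0.0, tz=0.0):
    # Rigid transform: rotation about z plus translation (illustrative poses).
    c, s = math.cos(theta), math.sin(theta)
    return [[c, -s, 0.0, tx],
            [s,  c, 0.0, ty],
            [0.0, 0.0, 1.0, tz],
            [0.0, 0.0, 0.0, 1.0]]

# Pose of the shared calibration pattern as seen by each camera.
t_cam1_pattern = rot_z(0.3, 0.1, 0.0, 1.0)
t_cam2_pattern = rot_z(-0.2, -0.4, 0.2, 1.1)

# Extrinsic relationship between the two cameras:
# T_cam1_cam2 = T_cam1_pattern * inv(T_cam2_pattern)
t_cam1_cam2 = mat_mul(t_cam1_pattern, invert_rigid(t_cam2_pattern))
```

In practice the service fuses many such observations to reduce noise; the single-observation composition above only shows where the relative transform comes from.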
This skill updates the camera poses in the execute world to reflect the new calibration.
It does not, however, automatically apply the updated camera poses to the initial world;
use the save_calibration_result skill for that.
Prerequisites
For calibrate_camera_to_camera to work, a calibration session must have been initialized
in the CalibrationService, and data (camera poses and pattern detections) must have been
collected for the cameras being calibrated. This is typically done by calling the
initialize_calibration and collect_calibration_data skills beforehand.
Usage Example
This skill does not have any usage example yet.
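In the absence of an official example, the following hypothetical sketch shows the workflow end to end. The skill names (initialize_calibration, collect_calibration_data, calibrate_camera_to_camera) come from this document, but the Python invocation layer and the FakeCalibrationService class are stand-ins, not the real Intrinsic solution-building API.

```python
from dataclasses import dataclass, field

@dataclass
class FakeCalibrationService:
    """Stand-in for the CalibrationService's session state (illustrative only)."""
    cameras: list = field(default_factory=list)
    samples: list = field(default_factory=list)

def initialize_calibration(service, cameras):
    # Starts a calibration session for the given cameras.
    service.cameras = list(cameras)

def collect_calibration_data(service, pattern_detections):
    # Records one set of camera poses and pattern detections.
    service.samples.append(pattern_detections)

def calibrate_camera_to_camera(service):
    # Fails unless a session was initialized and data was collected,
    # mirroring the prerequisites above. Returns one entry per camera,
    # indexed like the initialize_calibration input.
    if not service.cameras or not service.samples:
        raise RuntimeError("calibration session not initialized or no data collected")
    return [{"camera": cam, "transform": "identity"} for cam in service.cameras]

service = FakeCalibrationService()
initialize_calibration(service, ["wrist_cam", "overview_cam"])
collect_calibration_data(service, {"wrist_cam": "...", "overview_cam": "..."})
results = calibrate_camera_to_camera(service)
```

In a real solution these steps would run as skills through the Intrinsic executive rather than as local function calls.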
Parameters
calibration_service
The CalibrationService that holds the calibration session and processes the collected data.
Returns
calibration_results
The resulting calibration parameters for the camera-to-camera
transformation. The index of each entry corresponds to the index of the
camera in the initialize_calibration skill input.
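Since the results are positionally aligned with the camera list passed to initialize_calibration, pairing them is a simple zip. The camera names and placeholder transforms below are illustrative.

```python
# Camera list as passed to initialize_calibration (illustrative names).
cameras = ["wrist_cam", "overview_cam"]

# calibration_results as returned by the skill, same order (placeholder values).
calibration_results = ["T_0", "T_1"]

# Look up a camera's calibration by name instead of by index.
result_by_camera = dict(zip(cameras, calibration_results))
```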
Error Codes
The skill does not have error codes yet.