5 research outputs found

    Cameras and Inertial/Magnetic Sensor Units Alignment Calibration

    Because inertial/magnetic measurements are corrupted by external acceleration interference and magnetic disturbances, they are usually fused with visual data for drift-free orientation estimation, which plays an important role in a wide variety of applications, ranging from virtual reality, robotics, and computer vision to biomotion analysis and navigation. However, before such data fusion can be performed, an alignment calibration must determine the difference between the sensor coordinate system and the camera coordinate system. Since the orientation estimation performance of the inertial/magnetic sensor unit is insensitive to the choice of the inertial/magnetic sensor frame origin, we ignore the translational difference by assuming the sensor and camera coordinate systems share the same origin, and focus only on the rotational alignment difference in this paper. By exploiting the intrinsic restrictions among the coordinate transformations, the rotational alignment calibration problem is formulated as a simplified hand–eye equation AX = XB (where A, X, and B are all rotation matrices). A two-step iterative algorithm is then proposed to solve this simplified hand–eye calibration task. Detailed laboratory validation has been performed, and the good experimental results illustrate the effectiveness of the proposed alignment calibration method.
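The rotational-only hand–eye equation AX = XB (all rotation matrices) admits a simple closed-form least-squares solution that is useful for context: the rotation axes of each (A, B) pair satisfy α = Xβ, so X can be recovered by an orthogonal Procrustes (Kabsch) step over many pairs. The sketch below is an illustrative solver under that formulation, not the paper's two-step iterative algorithm; all names are my own.

```python
import numpy as np
from scipy.spatial.transform import Rotation

def solve_axxb_rotation(As, Bs):
    """Least-squares rotation X satisfying A_i X = X B_i for all pairs.

    A X = X B implies B = X^T A X, so the rotation axes (log-map
    vectors) obey alpha_i = X beta_i; X is found via Kabsch/SVD.
    """
    alphas = np.array([Rotation.from_matrix(A).as_rotvec() for A in As])
    betas = np.array([Rotation.from_matrix(B).as_rotvec() for B in Bs])
    H = betas.T @ alphas                      # 3x3 correlation matrix
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.linalg.det(Vt.T @ U.T)])  # enforce det = +1
    return Vt.T @ D @ U.T

# Synthetic check: pick a ground-truth X, generate consistent (A_i, B_i) pairs.
rng = np.random.default_rng(0)
X_true = Rotation.from_rotvec(rng.normal(size=3)).as_matrix()
As = [Rotation.from_rotvec(rng.normal(size=3)).as_matrix() for _ in range(10)]
Bs = [X_true.T @ A @ X_true for A in As]      # B = X^T A X  <=>  A X = X B
X_est = solve_axxb_rotation(As, Bs)
print(np.allclose(X_est, X_true, atol=1e-6))
```

With noise-free synthetic pairs the estimate matches the ground-truth rotation to numerical precision; with real camera/IMU data, more pairs and outlier handling would be needed.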

    Two-step calibration methods for miniature inertial and magnetic sensor units

    Low-cost inertial/magnetic sensor units have been extensively used to determine sensor attitude information for a wide variety of applications, ranging from virtual reality, underwater vehicles, and handheld navigation systems to biomotion analysis and biomedical applications. To achieve precise attitude reconstruction, appropriate sensor calibration procedures must be performed in advance so that sensor readings can be processed properly. In this paper, we aim to calibrate different error parameters, such as sensor sensitivity/scale factor error, offset/bias error, nonorthogonality error, mounting error, and, for magnetometers, soft-iron and hard-iron errors. Instead of estimating all of these parameters individually, the errors are combined into a combined bias and a transformation matrix. Two-step approaches are proposed to determine the combined bias and the transformation matrix separately. For the accelerometer and magnetometer, the combined bias is determined by finding the optimal ellipsoid that best fits the sensor readings, and the transformation matrix is then derived through a two-step iterative algorithm by exploiting the intrinsic relationship among sensor readings. For the gyroscope, the combined bias can be easily determined by keeping the sensor node stationary. For the transformation matrix estimation, the intrinsic relationship among gyroscope readings is exploited again, and an unscented Kalman filter is employed to determine the matrix. The calibration methods are then applied to our sensor nodes, and the good performance of the orientation estimation illustrates the effectiveness of the proposed sensor calibration methods.
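The combined-bias step for the accelerometer and magnetometer rests on a standard observation: readings of a constant-magnitude field, distorted by a transformation matrix and offset by a bias, lie on an ellipsoid whose center is that combined bias. A minimal sketch of an algebraic ellipsoid fit follows (a generic least-squares quadric fit, not necessarily the paper's exact procedure; all names are my own):

```python
import numpy as np

def fit_ellipsoid_center(samples):
    """Algebraic ellipsoid fit; returns the center (combined-bias estimate).

    Fits the quadric s^T Q s + b^T s + c = 0 by homogeneous least
    squares; the center is -0.5 * Q^{-1} b (scale-invariant).
    """
    x, y, z = samples.T
    D = np.column_stack([x*x, y*y, z*z, x*y, x*z, y*z, x, y, z,
                         np.ones_like(x)])
    _, _, Vt = np.linalg.svd(D, full_matrices=False)
    p = Vt[-1]                                # null-space direction
    Q = np.array([[p[0],   p[3]/2, p[4]/2],
                  [p[3]/2, p[1],   p[5]/2],
                  [p[4]/2, p[5]/2, p[2]  ]])
    b = p[6:9]
    return -0.5 * np.linalg.solve(Q, b)

# Synthetic magnetometer-style data: unit field distorted by T, offset by bias.
rng = np.random.default_rng(1)
u = rng.normal(size=(500, 3))
u /= np.linalg.norm(u, axis=1, keepdims=True)  # points on the unit sphere
T = np.array([[1.2, 0.1, 0.0],
              [0.0, 0.9, 0.05],
              [0.0, 0.0, 1.1]])                # soft-iron / scale / cross-axis
bias = np.array([0.3, -0.2, 0.5])              # hard-iron / offset (combined bias)
readings = u @ T.T + bias
center = fit_ellipsoid_center(readings)
print(np.allclose(center, bias, atol=1e-6))
```

Once the center (bias) is removed, the remaining transformation matrix can be estimated from the de-biased readings, which is where the paper's two-step iterative algorithm takes over.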