OpenCV calibration parameters and a 3D point transformation from stereo cameras
I have four PS3 Eye cameras. I calibrated camera1 and camera2 with OpenCV's cvStereoCalibrate() function, finding the chessboard corners and passing their 3D coordinates to the function.
I also calibrated camera2 and camera3 using a set of chessboard images viewed by camera2 and camera3.
Using the same method I calibrated camera3 and camera4.
So I have the extrinsic and intrinsic parameters of the camera1/camera2 pair, of the camera2/camera3 pair, and of the camera3/camera4 pair.
Here the extrinsic parameters are the rotation and translation matrices, and the intrinsic matrices contain the focal length and principal point.
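For reference, here is a minimal sketch of one such pairwise calibration, assuming the chessboard corner lists have already been collected with cv::findChessboardCorners and that the per-camera intrinsics come from a prior cv::calibrateCamera run (all variable names are illustrative):

```cpp
#include <opencv2/calib3d.hpp>
#include <opencv2/core.hpp>
#include <vector>

int main() {
    // One entry per chessboard view: corners in pattern coordinates and as
    // detected by each camera (to be filled from cv::findChessboardCorners).
    std::vector<std::vector<cv::Point3f>> objectPoints;
    std::vector<std::vector<cv::Point2f>> imagePoints1, imagePoints2;

    // Intrinsics (focal length, principal point) and distortion coefficients
    // from a prior per-camera calibration; the default flag keeps them fixed.
    cv::Mat K1, D1, K2, D2;

    cv::Mat R, T, E, F;            // extrinsics of camera2 relative to camera1
    cv::Size imageSize(640, 480);  // PS3 Eye image size

    double rms = cv::stereoCalibrate(objectPoints, imagePoints1, imagePoints2,
                                     K1, D1, K2, D2, imageSize, R, T, E, F);
    (void)rms;                     // reprojection error, useful as a sanity check
    return 0;
}
```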
Now suppose there is a 3D point in world coordinates (and I know how to find 3D coordinates with stereo cameras) that is viewed by camera3 and camera4 but not by camera1 and camera2.
My question is: how do I take a 3D world-coordinate point viewed by camera3 and camera4 and transform it into camera1 and camera2's world coordinate scheme using the rotation, translation, focal length and principal point parameters?
OpenCV's stereo calibration gives the relative extrinsic matrix between the two cameras.
According to the documentation, you don't get transformations in world coordinates (i.e. in relation to the calibration pattern). I suggest, though, that you run a regular camera calibration on one of the cameras so that you at least know that transformation. See cv::stereoCalibrate.
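A minimal sketch of that suggestion, assuming the chessboard corners from a single view and camera1's intrinsics are available (variable names are illustrative): cv::solvePnP recovers the pattern-to-camera transform, i.e. camera1's pose relative to the chessboard "world".

```cpp
#include <opencv2/calib3d.hpp>
#include <opencv2/core.hpp>
#include <vector>

int main() {
    // Chessboard corners of one view: in pattern ("world") coordinates and
    // as detected in camera1's image (to be filled in).
    std::vector<cv::Point3f> patternPoints;
    std::vector<cv::Point2f> imagePoints;

    cv::Mat K1, D1;        // camera1 intrinsics and distortion from calibration

    cv::Mat rvec, tvec;    // pose of the pattern in camera1's frame
    cv::solvePnP(patternPoints, imagePoints, K1, D1, rvec, tvec);

    cv::Mat R_w1;
    cv::Rodrigues(rvec, R_w1);
    // A pattern/world point X_w maps into camera1's frame as X_c1 = R_w1 * X_w + tvec.
    return 0;
}
```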
If the calibrations were perfect, you could use the daisy-chain setup to derive the world transformation of all of the cameras.
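A minimal sketch of that chaining, assuming OpenCV's documented convention that the R, T returned by stereoCalibrate map points from the first camera's frame into the second camera's frame (X_cam2 = R12 * X_cam1 + T12), and that the point reconstructed from the camera3/camera4 pair is expressed in camera3's frame (all names are illustrative):

```cpp
#include <opencv2/core.hpp>

int main() {
    // Pairwise extrinsics: replace the identities/zeros with the real
    // stereoCalibrate results for camera1/camera2 and camera2/camera3.
    cv::Mat R12 = cv::Mat::eye(3, 3, CV_64F), T12 = cv::Mat::zeros(3, 1, CV_64F);
    cv::Mat R23 = cv::Mat::eye(3, 3, CV_64F), T23 = cv::Mat::zeros(3, 1, CV_64F);

    // A 3x1 point triangulated from the camera3/camera4 pair, expressed in
    // camera3's coordinate frame.
    cv::Mat X3 = cv::Mat::zeros(3, 1, CV_64F);

    // Walk the chain back: camera3 frame -> camera2 frame -> camera1 frame.
    cv::Mat X2 = R23.t() * (X3 - T23);
    cv::Mat X1 = R12.t() * (X2 - T12);

    // Equivalently, the composed forward transform camera1 -> camera3 is
    //   R13 = R23 * R12,  T13 = R23 * T12 + T23,
    // so X1 = R13.t() * (X3 - T13).
    return 0;
}
```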
As far as I know this is not very stable, because the fact that you have multiple cameras should be taken into account when running the calibration.
Multi-camera calibration is not the most trivial of problems. Have a look at:
Multi-Camera Self-Calibration
GML C++ Camera Calibration Toolbox
I'm looking for a solution to this myself, so if you find out more regarding this and OpenCV, let me know.
opencv camera transformation coordinate-systems extrinsic-parameters