proj_match_points_ransac_guided (Operator)
Name
proj_match_points_ransac_guided
— Compute a projective transformation matrix between two images by
finding correspondences between points based on a known
approximation of the projective transformation matrix.
Signature
proj_match_points_ransac_guided(Image1, Image2 : : Rows1, Cols1, Rows2, Cols2, GrayMatchMethod, MaskSize, HomMat2DGuide, DistanceTolerance, MatchThreshold, EstimationMethod, DistanceThreshold, RandSeed : HomMat2D, Points1, Points2)
Herror T_proj_match_points_ransac_guided(const Hobject Image1, const Hobject Image2, const Htuple Rows1, const Htuple Cols1, const Htuple Rows2, const Htuple Cols2, const Htuple GrayMatchMethod, const Htuple MaskSize, const Htuple HomMat2DGuide, const Htuple DistanceTolerance, const Htuple MatchThreshold, const Htuple EstimationMethod, const Htuple DistanceThreshold, const Htuple RandSeed, Htuple* HomMat2D, Htuple* Points1, Htuple* Points2)
void ProjMatchPointsRansacGuided(const HObject& Image1, const HObject& Image2, const HTuple& Rows1, const HTuple& Cols1, const HTuple& Rows2, const HTuple& Cols2, const HTuple& GrayMatchMethod, const HTuple& MaskSize, const HTuple& HomMat2DGuide, const HTuple& DistanceTolerance, const HTuple& MatchThreshold, const HTuple& EstimationMethod, const HTuple& DistanceThreshold, const HTuple& RandSeed, HTuple* HomMat2D, HTuple* Points1, HTuple* Points2)
HHomMat2D HImage::ProjMatchPointsRansacGuided(const HImage& Image2, const HTuple& Rows1, const HTuple& Cols1, const HTuple& Rows2, const HTuple& Cols2, const HString& GrayMatchMethod, Hlong MaskSize, const HHomMat2D& HomMat2DGuide, double DistanceTolerance, const HTuple& MatchThreshold, const HString& EstimationMethod, double DistanceThreshold, Hlong RandSeed, HTuple* Points1, HTuple* Points2) const
HHomMat2D HImage::ProjMatchPointsRansacGuided(const HImage& Image2, const HTuple& Rows1, const HTuple& Cols1, const HTuple& Rows2, const HTuple& Cols2, const HString& GrayMatchMethod, Hlong MaskSize, const HHomMat2D& HomMat2DGuide, double DistanceTolerance, Hlong MatchThreshold, const HString& EstimationMethod, double DistanceThreshold, Hlong RandSeed, HTuple* Points1, HTuple* Points2) const
HHomMat2D HImage::ProjMatchPointsRansacGuided(const HImage& Image2, const HTuple& Rows1, const HTuple& Cols1, const HTuple& Rows2, const HTuple& Cols2, const char* GrayMatchMethod, Hlong MaskSize, const HHomMat2D& HomMat2DGuide, double DistanceTolerance, Hlong MatchThreshold, const char* EstimationMethod, double DistanceThreshold, Hlong RandSeed, HTuple* Points1, HTuple* Points2) const
HHomMat2D HImage::ProjMatchPointsRansacGuided(const HImage& Image2, const HTuple& Rows1, const HTuple& Cols1, const HTuple& Rows2, const HTuple& Cols2, const wchar_t* GrayMatchMethod, Hlong MaskSize, const HHomMat2D& HomMat2DGuide, double DistanceTolerance, Hlong MatchThreshold, const wchar_t* EstimationMethod, double DistanceThreshold, Hlong RandSeed, HTuple* Points1, HTuple* Points2) const
(Windows only)
HHomMat2D HHomMat2D::ProjMatchPointsRansacGuided(const HImage& Image1, const HImage& Image2, const HTuple& Rows1, const HTuple& Cols1, const HTuple& Rows2, const HTuple& Cols2, const HString& GrayMatchMethod, Hlong MaskSize, double DistanceTolerance, const HTuple& MatchThreshold, const HString& EstimationMethod, double DistanceThreshold, Hlong RandSeed, HTuple* Points1, HTuple* Points2) const
HHomMat2D HHomMat2D::ProjMatchPointsRansacGuided(const HImage& Image1, const HImage& Image2, const HTuple& Rows1, const HTuple& Cols1, const HTuple& Rows2, const HTuple& Cols2, const HString& GrayMatchMethod, Hlong MaskSize, double DistanceTolerance, Hlong MatchThreshold, const HString& EstimationMethod, double DistanceThreshold, Hlong RandSeed, HTuple* Points1, HTuple* Points2) const
HHomMat2D HHomMat2D::ProjMatchPointsRansacGuided(const HImage& Image1, const HImage& Image2, const HTuple& Rows1, const HTuple& Cols1, const HTuple& Rows2, const HTuple& Cols2, const char* GrayMatchMethod, Hlong MaskSize, double DistanceTolerance, Hlong MatchThreshold, const char* EstimationMethod, double DistanceThreshold, Hlong RandSeed, HTuple* Points1, HTuple* Points2) const
HHomMat2D HHomMat2D::ProjMatchPointsRansacGuided(const HImage& Image1, const HImage& Image2, const HTuple& Rows1, const HTuple& Cols1, const HTuple& Rows2, const HTuple& Cols2, const wchar_t* GrayMatchMethod, Hlong MaskSize, double DistanceTolerance, Hlong MatchThreshold, const wchar_t* EstimationMethod, double DistanceThreshold, Hlong RandSeed, HTuple* Points1, HTuple* Points2) const
(Windows only)
static void HOperatorSet.ProjMatchPointsRansacGuided(HObject image1, HObject image2, HTuple rows1, HTuple cols1, HTuple rows2, HTuple cols2, HTuple grayMatchMethod, HTuple maskSize, HTuple homMat2DGuide, HTuple distanceTolerance, HTuple matchThreshold, HTuple estimationMethod, HTuple distanceThreshold, HTuple randSeed, out HTuple homMat2D, out HTuple points1, out HTuple points2)
HHomMat2D HImage.ProjMatchPointsRansacGuided(HImage image2, HTuple rows1, HTuple cols1, HTuple rows2, HTuple cols2, string grayMatchMethod, int maskSize, HHomMat2D homMat2DGuide, double distanceTolerance, HTuple matchThreshold, string estimationMethod, double distanceThreshold, int randSeed, out HTuple points1, out HTuple points2)
HHomMat2D HImage.ProjMatchPointsRansacGuided(HImage image2, HTuple rows1, HTuple cols1, HTuple rows2, HTuple cols2, string grayMatchMethod, int maskSize, HHomMat2D homMat2DGuide, double distanceTolerance, int matchThreshold, string estimationMethod, double distanceThreshold, int randSeed, out HTuple points1, out HTuple points2)
HHomMat2D HHomMat2D.ProjMatchPointsRansacGuided(HImage image1, HImage image2, HTuple rows1, HTuple cols1, HTuple rows2, HTuple cols2, string grayMatchMethod, int maskSize, double distanceTolerance, HTuple matchThreshold, string estimationMethod, double distanceThreshold, int randSeed, out HTuple points1, out HTuple points2)
HHomMat2D HHomMat2D.ProjMatchPointsRansacGuided(HImage image1, HImage image2, HTuple rows1, HTuple cols1, HTuple rows2, HTuple cols2, string grayMatchMethod, int maskSize, double distanceTolerance, int matchThreshold, string estimationMethod, double distanceThreshold, int randSeed, out HTuple points1, out HTuple points2)
def proj_match_points_ransac_guided(image_1: HObject, image_2: HObject, rows_1: Sequence[Union[float, int]], cols_1: Sequence[Union[float, int]], rows_2: Sequence[Union[float, int]], cols_2: Sequence[Union[float, int]], gray_match_method: str, mask_size: int, hom_mat_2dguide: Sequence[float], distance_tolerance: float, match_threshold: Union[int, float], estimation_method: str, distance_threshold: float, rand_seed: int) -> Tuple[Sequence[float], Sequence[int], Sequence[int]]
Description
Given a set of coordinates of characteristic points (Cols1, Rows1) and
(Cols2, Rows2) in both input images Image1 and Image2, and given a known
approximation HomMat2DGuide for the transformation matrix between Image1
and Image2, proj_match_points_ransac_guided automatically determines
corresponding points and the homogeneous projective transformation matrix
HomMat2D that best transforms the corresponding points from the different
images into each other. The characteristic points can, for example, be
extracted with points_foerstner or points_harris. The approximation
HomMat2DGuide can, for example, be calculated with proj_match_points_ransac
on lower resolution versions of Image1 and Image2.
The transformation is determined in two steps: First, gray value
correlations of mask windows around the input points in the first and the
second image are determined, and an initial matching between them is
generated using the similarity of the windows in both images. The size of
the mask windows is MaskSize x MaskSize. Three metrics for the correlation
can be selected. If GrayMatchMethod has the value 'ssd', the sum of the
squared gray value differences is used, 'sad' means the sum of absolute
differences, and 'ncc' is the normalized cross correlation. For details,
please refer to binocular_disparity. The metric is minimized ('ssd', 'sad')
or maximized ('ncc') over all possible point pairs. A matching found in
this way is only accepted if the value of the metric is below the value of
MatchThreshold ('ssd', 'sad') or above that value ('ncc').
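For reference, the three metrics compare the gray values g1 and g2 of the
two MaskSize x MaskSize windows W roughly as follows (standard definitions;
the exact normalization used internally is documented with
binocular_disparity):
ssd = sum over W of (g1 - g2)^2
sad = sum over W of |g1 - g2|
ncc = sum over W of (g1 - mean(g1)) * (g2 - mean(g2)) /
      sqrt( sum over W of (g1 - mean(g1))^2 * sum over W of (g2 - mean(g2))^2 )
Here mean(g1) and mean(g2) denote the mean gray values of the two windows.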
To increase the algorithm's performance, the search area for the matching
operations is limited based on the approximate transformation
HomMat2DGuide: only those points in Image2 that lie within a distance of
DistanceTolerance around the position of a point of Image1 transformed into
Image2 via HomMat2DGuide are considered for the matching.
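Conceptually, this restriction corresponds to the following check for a
candidate pair of points (an illustrative HDevelop sketch, not the internal
implementation; the loop indices I and J are hypothetical):
* Predict where the I-th point of Image1 should appear in Image2
* according to the approximate transformation.
projective_trans_pixel (HomMat2DGuide, Rows1[I], Cols1[I], RowGuess, ColGuess)
* The J-th point of Image2 is considered as a match candidate only if it
* lies close enough to the predicted position.
distance_pp (RowGuess, ColGuess, Rows2[J], Cols2[J], Distance)
if (Distance <= DistanceTolerance)
    * ... compare the MaskSize x MaskSize windows using GrayMatchMethod ...
endif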
Once the initial matching is complete, a randomized search algorithm
(RANSAC) is used to determine the transformation matrix HomMat2D. It tries
to find the matrix that is consistent with a maximum number of
correspondences. For a point to be accepted, its distance from the
coordinates predicted by the transformation must not exceed the threshold
DistanceThreshold.
Once a choice has been made, the matrix is further optimized using all
consistent points. For this optimization, EstimationMethod can be set
either to the slow but mathematically optimal 'gold_standard' method or to
the faster 'normalized_dlt'. Here, the algorithms of
vector_to_proj_hom_mat2d are used.
Point pairs that still violate the consistency condition for the final
transformation are dropped; the matched points are returned as control
values. Points1 contains the indices of the matched input points from the
first image, and Points2 contains the indices of the corresponding points
in the second image.
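Since Points1 and Points2 are indices into the input tuples, the
coordinates of the matched point pairs can be retrieved, for example, as
follows (a small HDevelop sketch; the output variable names are chosen for
illustration only):
* Select the coordinates of the matched points via the returned indices.
tuple_select (Rows1, Points1, RowsMatched1)
tuple_select (Cols1, Points1, ColsMatched1)
tuple_select (Rows2, Points2, RowsMatched2)
tuple_select (Cols2, Points2, ColsMatched2)
* The matched points can then be inspected or visualized, e.g., as crosses.
gen_cross_contour_xld (Crosses1, RowsMatched1, ColsMatched1, 6, 0.785398)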
The parameter RandSeed can be used to control the randomized nature of the
RANSAC algorithm, and hence to obtain reproducible results. If RandSeed is
set to a positive number, the operator yields the same result on every call
with the same parameters because the internally used random number
generator is initialized with the seed value. If RandSeed = 0, the random
number generator is initialized with the current time. Hence, the results
may not be reproducible in this case.
Execution Information
- Multithreading type: reentrant (runs in parallel with non-exclusive operators).
- Multithreading scope: global (may be called from any thread).
- Processed without parallelization.
Parameters
Image1 (input_object) singlechannelimage → object (byte / uint2)
Input image 1.
Image2 (input_object) singlechannelimage → object (byte / uint2)
Input image 2.
Rows1 (input_control) point.y-array → (real / integer)
Row coordinates of characteristic points in image 1.
Cols1 (input_control) point.x-array → (real / integer)
Column coordinates of characteristic points in image 1.
Rows2 (input_control) point.y-array → (real / integer)
Row coordinates of characteristic points in image 2.
Cols2 (input_control) point.x-array → (real / integer)
Column coordinates of characteristic points in image 2.
GrayMatchMethod (input_control) string → (string)
Gray value comparison metric.
Default: 'ssd'
List of values: 'ncc', 'sad', 'ssd'
MaskSize (input_control) integer → (integer)
Size of gray value masks.
Default: 10
Value range: MaskSize ≤ 90
HomMat2DGuide (input_control) hom_mat2d → (real)
Approximation of the homogeneous projective transformation matrix between the two images.
DistanceTolerance (input_control) real → (real)
Tolerance for the matching search window.
Default: 20.0
Suggested values: 0.2, 0.5, 1.0, 2.0, 3.0, 5.0, 10.0, 20.0, 50.0
MatchThreshold (input_control) number → (integer / real)
Threshold for gray value matching.
Default: 10
Suggested values: 10, 20, 50, 100, 0.9, 0.7
EstimationMethod (input_control) string → (string)
Transformation matrix estimation algorithm.
Default: 'normalized_dlt'
List of values: 'gold_standard', 'normalized_dlt'
DistanceThreshold (input_control) real → (real)
Threshold for transformation consistency check.
Default: 0.2
RandSeed (input_control) integer → (integer)
Seed for the random number generator.
Default: 0
HomMat2D (output_control) hom_mat2d → (real)
Homogeneous projective transformation matrix.
Points1 (output_control) integer-array → (integer)
Indices of matched input points in image 1.
Points2 (output_control) integer-array → (integer)
Indices of matched input points in image 2.
Example (HDevelop)
* Compute an approximate projection between downsampled versions
* of the two input images.
zoom_image_factor (Image1, Image1Zoomed, 0.5, 0.5, 'constant')
zoom_image_factor (Image2, Image2Zoomed, 0.5, 0.5, 'constant')
points_foerstner (Image1Zoomed, 1, 2, 3, 200, 0.3, 'gauss', 'false', \
                  Rows1, Cols1, _, _, _, _, _, _, _, _)
points_foerstner (Image2Zoomed, 1, 2, 3, 200, 0.3, 'gauss', 'false', \
                  Rows2, Cols2, _, _, _, _, _, _, _, _)
get_image_pointer1 (Image1Zoomed, Pointer, Type, Width, Height)
proj_match_points_ransac (Image1Zoomed, Image2Zoomed, Rows1, Cols1, \
                          Rows2, Cols2, 'ncc', 10, 0, 0, \
                          Height, Width, 0, 0.5, 'gold_standard', \
                          5, 0, HomMat2D, Points1, Points2)
* Rescale the projection to the resolution of the original images
* and use it as the guide for the full-resolution matching.
hom_mat2d_scale_local (HomMat2D, 0.5, 0.5, HomMat2DGuide)
hom_mat2d_scale (HomMat2DGuide, 2, 2, 0, 0, HomMat2DGuide)
points_foerstner (Image1, 1, 2, 3, 200, 0.3, 'gauss', 'false', \
                  Rows1, Cols1, _, _, _, _, _, _, _, _)
points_foerstner (Image2, 1, 2, 3, 200, 0.3, 'gauss', 'false', \
                  Rows2, Cols2, _, _, _, _, _, _, _, _)
proj_match_points_ransac_guided (Image1, Image2, Rows1, Cols1, \
                                 Rows2, Cols2, 'ncc', 10, \
                                 HomMat2DGuide, 40, 0.5, \
                                 'gold_standard', 10, 0, HomMat2D, \
                                 Points1, Points2)
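The resulting HomMat2D is typically passed on to one of the operators
listed under 'Possible Successors'. A minimal follow-up sketch (assuming,
for illustration only, that Image2 is the image to be transformed;
depending on the mapping direction required by the application, the matrix
may have to be inverted with hom_mat2d_invert first):
* Transform one of the images with the estimated projective transformation.
* If the opposite mapping direction is needed:
*   hom_mat2d_invert (HomMat2D, HomMat2DInvert)
projective_trans_image (Image2, Image2Trans, HomMat2D, 'bilinear', \
                        'false', 'false')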
Possible Predecessors
points_foerstner,
points_harris
Possible Successors
projective_trans_image,
projective_trans_image_size,
projective_trans_region,
projective_trans_contour_xld,
projective_trans_point_2d,
projective_trans_pixel
Alternatives
hom_vector_to_proj_hom_mat2d,
vector_to_proj_hom_mat2d
See also
proj_match_points_ransac
References
Richard Hartley, Andrew Zisserman: “Multiple View Geometry in
Computer Vision”; Cambridge University Press, Cambridge; 2000.
Olivier Faugeras, Quang-Tuan Luong: “The Geometry of Multiple
Images: The Laws That Govern the Formation of Multiple Images of a
Scene and Some of Their Applications”; MIT Press, Cambridge, MA;
2001.
Module
Matching