To register one image to another, we first extract a SIFT feature point set from each image and then pair up the feature points found in the two images one by one.
The assumption here is that, for correctly paired points, mapping the positions from one set onto the other follows a single rigid transformation.
That rigid transformation can then be used while establishing the one-to-one correspondences, for example to reject pairs that do not agree with it.
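As a minimal sketch of this match-then-fit idea (separate from the ASIFT sample below), the following assumes OpenCV 4.4+ with cv.SIFT_create available and two placeholder files img1.png and img2.png; the fitted model is a rotation plus translation plus uniform scale, which is the closest readily available approximation of the rigid assumption above:

import numpy as np
import cv2 as cv

img1 = cv.imread('img1.png', cv.IMREAD_GRAYSCALE)
img2 = cv.imread('img2.png', cv.IMREAD_GRAYSCALE)

# detect SIFT keypoints and descriptors in both images
sift = cv.SIFT_create()
kp1, desc1 = sift.detectAndCompute(img1, None)
kp2, desc2 = sift.detectAndCompute(img2, None)

# brute-force matching with Lowe's ratio test to keep distinctive pairs
matcher = cv.BFMatcher(cv.NORM_L2)
good = []
for pair in matcher.knnMatch(desc1, desc2, k=2):
    if len(pair) == 2 and pair[0].distance < 0.75 * pair[1].distance:
        good.append(pair[0])

p1 = np.float32([kp1[m.queryIdx].pt for m in good])
p2 = np.float32([kp2[m.trainIdx].pt for m in good])

# fit the transform with RANSAC; inliers marks the pairs consistent with it
M, inliers = cv.estimateAffinePartial2D(p1, p2, method=cv.RANSAC,
                                        ransacReprojThreshold=3.0)
if M is not None:
    print('%d / %d matches agree with the fitted transform' % (int(inliers.sum()), len(good)))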
asift.py
'''
Affine invariant feature-based image matching sample.
Example code for image registration based on affine-invariant features.
This code is similar to find_obj.py, but uses the affine transformation space sampling technique known as ASIFT. Feature extraction itself still uses SIFT, so SURF or ORB could be substituted here.
Homography RANSAC is used to reject outliers.
Multithreading is used to speed up the affine sampling.
[1] http://www.ipol.im/pub/algo/my_affine_sift/
USAGE
asift.py [--feature=<sift|surf|orb|brisk>[-flann]] [ <image1> <image2> ]
--feature - selects the feature extraction method. Can be sift, surf, orb or brisk. Append '-flann'
to use the Flann-based matcher instead of brute force.
Click the left mouse button on a feature point to see its matching point.
image1 and image2 are the two images to be registered.
'''
Image registration with the ASIFT method
The reference description is available at http://www.ipol.im/pub/algo/my_affine_sift
Source code:
#!/usr/bin/env python

'''
Affine invariant feature-based image matching sample.

This sample is similar to find_obj.py, but uses the affine transformation
space sampling technique, called ASIFT [1]. While the original implementation
is based on SIFT, you can try to use SURF or ORB detectors instead. Homography RANSAC
is used to reject outliers. Threading is used for faster affine sampling.

[1] http://www.ipol.im/pub/algo/my_affine_sift/

USAGE
  asift.py [--feature=<sift|surf|orb|brisk>[-flann]] [ <image1> <image2> ]

  --feature  - Feature to use. Can be sift, surf, orb or brisk. Append '-flann'
               to feature name to use the Flann-based matcher instead of bruteforce.

  Press left mouse button on a feature point to see its matching point.
'''

# Python 2/3 compatibility
from __future__ import print_function

import numpy as np
import cv2 as cv

# built-in modules
from multiprocessing.pool import ThreadPool

# local modules
from common import Timer
from find_obj import init_feature, filter_matches, explore_match

def affine_skew(tilt, phi, img, mask=None):
    '''
    affine_skew(tilt, phi, img, mask=None) -> skew_img, skew_mask, Ai

    Ai - is an affine transform matrix from skew_img to img
    '''
    h, w = img.shape[:2]
    if mask is None:
        mask = np.zeros((h, w), np.uint8)
        mask[:] = 255
    A = np.float32([[1, 0, 0], [0, 1, 0]])
    if phi != 0.0:
        # rotate the image by phi degrees and enlarge the canvas so nothing is cropped
        phi = np.deg2rad(phi)
        s, c = np.sin(phi), np.cos(phi)
        A = np.float32([[c, -s], [s, c]])
        corners = [[0, 0], [w, 0], [w, h], [0, h]]
        tcorners = np.int32(np.dot(corners, A.T))
        x, y, w, h = cv.boundingRect(tcorners.reshape(1, -1, 2))
        A = np.hstack([A, [[-x], [-y]]])
        img = cv.warpAffine(img, A, (w, h), flags=cv.INTER_LINEAR, borderMode=cv.BORDER_REPLICATE)
    if tilt != 1.0:
        # simulate a camera tilt: anti-alias blur along x, then shrink the x axis by 1/tilt
        s = 0.8*np.sqrt(tilt*tilt-1)
        img = cv.GaussianBlur(img, (0, 0), sigmaX=s, sigmaY=0.01)
        img = cv.resize(img, (0, 0), fx=1.0/tilt, fy=1.0, interpolation=cv.INTER_NEAREST)
        A[0] /= tilt
    if phi != 0.0 or tilt != 1.0:
        h, w = img.shape[:2]
        mask = cv.warpAffine(mask, A, (w, h), flags=cv.INTER_NEAREST)
    Ai = cv.invertAffineTransform(A)
    return img, mask, Ai

def affine_detect(detector, img, mask=None, pool=None):
    '''
    affine_detect(detector, img, mask=None, pool=None) -> keypoints, descrs

    Apply a set of affine transformations to the image, detect keypoints and
    reproject them into initial image coordinates.
    See http://www.ipol.im/pub/algo/my_affine_sift/ for the details.

    ThreadPool object may be passed to speedup the computation.
    '''
    # sampling grid: tilts t = 2^(k/2) for k = 1..5, rotations in steps of 72/t degrees,
    # plus the untransformed image itself as (1.0, 0.0)
    params = [(1.0, 0.0)]
    for t in 2**(0.5*np.arange(1, 6)):
        for phi in np.arange(0, 180, 72.0 / t):
            params.append((t, phi))

    def f(p):
        t, phi = p
        timg, tmask, Ai = affine_skew(t, phi, img)
        keypoints, descrs = detector.detectAndCompute(timg, tmask)
        # map keypoints detected in the warped image back into the original image frame
        for kp in keypoints:
            x, y = kp.pt
            kp.pt = tuple(np.dot(Ai, (x, y, 1)))
        if descrs is None:
            descrs = []
        return keypoints, descrs

    keypoints, descrs = [], []
    if pool is None:
        ires = map(f, params)
    else:
        ires = pool.imap(f, params)

    for i, (k, d) in enumerate(ires):
        print('affine sampling: %d / %d\r' % (i+1, len(params)), end='')
        keypoints.extend(k)
        descrs.extend(d)

    print()
    return keypoints, np.array(descrs)

def main():
    import sys, getopt
    opts, args = getopt.getopt(sys.argv[1:], '', ['feature='])
    opts = dict(opts)
    feature_name = opts.get('--feature', 'brisk-flann')
    try:
        fn1, fn2 = args
    except ValueError:
        fn1 = 'aero1.jpg'
        fn2 = 'aero3.jpg'

    img1 = cv.imread(cv.samples.findFile(fn1), cv.IMREAD_GRAYSCALE)
    img2 = cv.imread(cv.samples.findFile(fn2), cv.IMREAD_GRAYSCALE)
    detector, matcher = init_feature(feature_name)

    if img1 is None:
        print('Failed to load fn1:', fn1)
        sys.exit(1)

    if img2 is None:
        print('Failed to load fn2:', fn2)
        sys.exit(1)

    if detector is None:
        print('unknown feature:', feature_name)
        sys.exit(1)

    print('using', feature_name)

    pool = ThreadPool(processes=cv.getNumberOfCPUs())
    kp1, desc1 = affine_detect(detector, img1, pool=pool)
    kp2, desc2 = affine_detect(detector, img2, pool=pool)
    print('img1 - %d features, img2 - %d features' % (len(kp1), len(kp2)))

    def match_and_draw(win):
        with Timer('matching'):
            raw_matches = matcher.knnMatch(desc1, trainDescriptors=desc2, k=2)  # 2 nearest neighbours for the ratio test
        p1, p2, kp_pairs = filter_matches(kp1, kp2, raw_matches)
        if len(p1) >= 4:
            H, status = cv.findHomography(p1, p2, cv.RANSAC, 5.0)
            print('%d / %d inliers/matched' % (np.sum(status), len(status)))
            # do not draw outliers (there will be a lot of them)
            kp_pairs = [kpp for kpp, flag in zip(kp_pairs, status) if flag]
        else:
            H, status = None, None
            print('%d matches found, not enough for homography estimation' % len(p1))

        explore_match(win, img1, img2, kp_pairs, None, H)

    match_and_draw('affine find_obj')
    cv.waitKey()
    print('Done')


if __name__ == '__main__':
    print(__doc__)
    main()
    cv.destroyAllWindows()
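To try the sample, keep it next to the common.py and find_obj.py helpers from OpenCV's samples/python directory (it imports Timer, init_feature, filter_matches and explore_match from them) and run it, for example, as:

python asift.py --feature=sift-flann <image1> <image2>

If no image pair is given, it falls back to aero1.jpg and aero3.jpg, which cv.samples.findFile resolves from the OpenCV sample data (the OPENCV_SAMPLES_DATA_PATH environment variable may need to point there). The script prints the affine-sampling progress, the feature counts for both images, and the inliers/matched ratio from the homography RANSAC step, then opens the match visualization window produced by explore_match.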