Image Similarity Matching: A Compendium of Distances

https://blog.csdn.net/lly1122334/article/details/89431244

Notes:

PIL.Image reads the images and resizes them to the same dimensions
scipy.spatial.distance computes the distances (sklearn.metrics.pairwise_distances also works)
The smaller the distance, the better the match
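Every section below repeats the same preprocessing: open both images with PIL, resize the second to the first's size, and flatten to 1-D pixel arrays. A minimal sketch of that shared step (the helper name load_pair is ours, not from the original post):

```python
import numpy as np
from PIL import Image


def load_pair(path1, path2):
    """Open two images, resize the second to the first's dimensions,
    and return both as flattened 1-D pixel arrays."""
    image1 = Image.open(path1)
    image2 = Image.open(path2).resize(image1.size)
    return np.asarray(image1).flatten(), np.asarray(image2).flatten()
```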
Contents
1. Test Images
2. Euclidean Distance
3. Manhattan Distance
4. Chebyshev Distance
5. Cosine Distance
6. Pearson Correlation Coefficient
7. Hamming Distance
8. Jaccard Distance
9. Bray-Curtis Distance
10. Mahalanobis Distance
11. JS Divergence
12. The image-match Library
13. Matching Without Installing the Library
14. Feature Matching with a Pretrained Keras Model
Scripts
Summary
References
1. Test Images
See the links below for the image sources.

1.jpg, resolution 604×900

2.jpg, resolution 423×640

3.jpg, resolution 900×750

4.jpg, resolution 404×600

2. Euclidean Distance

d = \sqrt{\sum_{i=1}^N (x_{i1} - x_{i2})^2}
The straight-line point-to-point distance; the larger it is, the worse the match.

Weighted variant: standardized Euclidean distance, seuclidean
Squared variant: squared Euclidean distance, sqeuclidean

import numpy as np
from PIL import Image
from scipy.spatial.distance import pdist


def euclidean(image1, image2):
    X = np.vstack([image1, image2])
    return pdist(X, 'euclidean')[0]


image1 = Image.open('image/1.jpg')
image2 = Image.open('image/2.jpg')
image2 = image2.resize(image1.size)
image1 = np.asarray(image1).flatten()
image2 = np.asarray(image2).flatten()

print(euclidean(image1, image2))
Image    1    2        3        4
1        0    40819    99266    42672
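The two variants mentioned above use the same pdist call with a different metric name. A toy sketch (the vectors are illustrative, not image data):

```python
import numpy as np
from scipy.spatial.distance import pdist

x = np.array([0.0, 3.0, 4.0])
y = np.array([0.0, 0.0, 0.0])
X = np.vstack([x, y])

print(pdist(X, 'euclidean')[0])    # 5.0
print(pdist(X, 'sqeuclidean')[0])  # 25.0, the squared Euclidean distance
# 'seuclidean' divides each squared difference by a per-component variance V
print(pdist(X, 'seuclidean', V=np.array([1.0, 9.0, 16.0]))[0])  # sqrt(9/9 + 16/16)
```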


3. Manhattan Distance

d = \sum_{i=1}^N |x_{i1} - x_{i2}|

Also known as the city block distance: the sum of the distances along each coordinate axis.

Weighted variant: Canberra distance, canberra. Used for comparing ranked lists and in computer-security intrusion detection.

import numpy as np
from PIL import Image
from scipy.spatial.distance import pdist


def manhattan(image1, image2):
    X = np.vstack([image1, image2])
    return pdist(X, 'cityblock')[0]


image1 = Image.open('image/1.jpg')
image2 = Image.open('image/2.jpg')
image2 = image2.resize(image1.size)
image1 = np.asarray(image1).flatten()
image2 = np.asarray(image2).flatten()

print(manhattan(image1, image2))
Image    1    2           3           4
1        0    41122193    97631252    39064477

Canberra distance:

Image    1    2         3         4
1        0    497302    848611    354084
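The Canberra distance uses the same pdist interface; a toy sketch with illustrative vectors:

```python
import numpy as np
from scipy.spatial.distance import pdist

x = np.array([1.0, 2.0, 3.0])
y = np.array([2.0, 4.0, 6.0])
X = np.vstack([x, y])

# canberra: sum of |x_i - y_i| / (|x_i| + |y_i|), a weighted city-block distance
print(pdist(X, 'canberra')[0])  # 1/3 + 1/3 + 1/3 = 1.0
```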


4. Chebyshev Distance

d = \max_i |x_{i1} - x_{i2}|

The maximum absolute difference across all coordinates; for 8-bit images it falls in the range 0-255.

import numpy as np
from PIL import Image
from scipy.spatial.distance import pdist


def chebyshev(image1, image2):
    X = np.vstack([image1, image2])
    return pdist(X, 'chebyshev')[0]


image1 = Image.open('image/1.jpg')
image2 = Image.open('image/2.jpg')
image2 = image2.resize(image1.size)
image1 = np.asarray(image1).flatten()
image2 = np.asarray(image2).flatten()

print(chebyshev(image1, image2))
Image    1    2      3      4
1        0    218    255    204


5. Cosine Distance

d = 1 - \frac{\sum_{i=1}^N x_{i1} x_{i2}}{\sqrt{\sum_{i=1}^N x_{i1}^2} \sqrt{\sum_{i=1}^N x_{i2}^2}}

Based on cosine similarity, which judges how alike two vectors are by their direction; scipy's 'cosine' metric returns one minus the similarity.

Computation is very slow.

import numpy as np
from PIL import Image
from scipy.spatial.distance import pdist


def cosine(image1, image2):
    X = np.vstack([image1, image2])
    return pdist(X, 'cosine')[0]


image1 = Image.open('image/1.jpg')
image2 = Image.open('image/2.jpg')
image2 = image2.resize(image1.size)
image1 = np.asarray(image1).flatten()
image2 = np.asarray(image2).flatten()

print(cosine(image1, image2))
Image    1    2         3         4
1        0    0.0715    0.4332    0.0782


6. Pearson Correlation Coefficient

d = \frac{\sum_{i=1}^N (x_{i1} - \bar{x}_1)(x_{i2} - \bar{x}_2)}{\sqrt{\sum_{i=1}^N (x_{i1} - \bar{x}_1)^2} \sqrt{\sum_{i=1}^N (x_{i2} - \bar{x}_2)^2}}

Similar to cosine similarity, with the added advantage of translation invariance; the larger the value, the stronger the correlation.

import numpy as np
from PIL import Image


def pearson(image1, image2):
    X = np.vstack([image1, image2])
    return np.corrcoef(X)[0][1]


image1 = Image.open('image/1.jpg')
image2 = Image.open('image/2.jpg')
image2 = image2.resize(image1.size)
image1 = np.asarray(image1).flatten()
image2 = np.asarray(image2).flatten()

print(pearson(image1, image2))
Image    1    2         3         4
1        1    0.8777    0.0850    0.7413
Pearson distance = 1 - Pearson correlation coefficient

import numpy as np
from PIL import Image
from scipy.spatial.distance import pdist


def pearson_distance(image1, image2):
    X = np.vstack([image1, image2])
    return pdist(X, 'correlation')[0]


image1 = Image.open('image/1.jpg')
image2 = Image.open('image/2.jpg')
image2 = image2.resize(image1.size)
image1 = np.asarray(image1).flatten()
image2 = np.asarray(image2).flatten()

print(pearson_distance(image1, image2))


7. Hamming Distance

d = \sum_{i=1}^N \begin{cases} 1, & x_{i1} \neq x_{i2} \\ 0, & x_{i1} = x_{i2} \end{cases}

Compares the two vectors position by position; each mismatching position adds 1 to the Hamming distance.

Typically used in information coding.

import numpy as np
from PIL import Image


def hamming(image1, image2):
    return np.shape(np.nonzero(image1 - image2)[0])[0]


image1 = Image.open('image/1.jpg')
image2 = Image.open('image/2.jpg')
image2 = image2.resize(image1.size)
image1 = np.asarray(image1)
image2 = np.asarray(image2)

print(hamming(image1, image2))
Image    1    2         3         4
1        0    0.9865    0.9933    0.9853
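The function above returns a raw mismatch count, while the table reports a fraction; dividing by the element count (or using pdist's 'hamming', which normalizes by default) converts one into the other. A toy sketch:

```python
import numpy as np
from scipy.spatial.distance import pdist

x = np.array([1, 0, 1, 1])
y = np.array([1, 1, 0, 1])

count = np.count_nonzero(x != y)  # raw Hamming distance: 2 mismatches
fraction = count / x.size         # normalized to [0, 1]: 0.5
print(count, fraction, pdist(np.vstack([x, y]), 'hamming')[0])  # 2 0.5 0.5
```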


8. Jaccard Distance

d = \frac{|A \triangle B|}{|A \cup B|}

Measures the proportion of elements that differ between two sets out of all their elements; the corresponding similarity is 1 - d.

import numpy as np
from PIL import Image
from scipy.spatial.distance import pdist


def jaccard(image1, image2):
    X = np.vstack([image1, image2])
    return pdist(X, 'jaccard')[0]


image1 = Image.open('image/1.jpg')
image2 = Image.open('image/2.jpg')
image2 = image2.resize(image1.size)
image1 = np.asarray(image1).flatten()
image2 = np.asarray(image2).flatten()

print(jaccard(image1, image2))
Image    1    2         3         4
1        0    0.9865    0.9936    0.9853


9. Bray-Curtis Distance

A measure used in ecology to quantify differences in species composition between sample sites.
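The post states no formula here; for reference, the definition scipy's 'braycurtis' metric implements is

d = \frac{\sum_{i=1}^N |x_{i1} - x_{i2}|}{\sum_{i=1}^N |x_{i1} + x_{i2}|}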

import numpy as np
from PIL import Image
from scipy.spatial.distance import pdist


def braycurtis(image1, image2):
    X = np.vstack([image1, image2])
    return pdist(X, 'braycurtis')[0]


image1 = Image.open('image/1.jpg')
image2 = Image.open('image/2.jpg')
image2 = image2.resize(image1.size)
image1 = np.asarray(image1).flatten()
image2 = np.asarray(image2).flatten()

print(braycurtis(image1, image2))
Image    1    2         3         4
1        0    0.2008    0.4877    0.1746


10. Mahalanobis Distance

A covariance-based distance that takes the correlations between features into account.

It is computed over all pairs, so the cost for full image vectors is prohibitive.
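For reference, the Mahalanobis distance between vectors x and y with covariance matrix S is

d = \sqrt{(x - y)^T S^{-1} (x - y)}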

import numpy as np
from PIL import Image
from scipy.spatial.distance import pdist


def mahalanobis(image1, image2):
    X = np.vstack([image1, image2])
    XT = X.T
    return pdist(XT, 'mahalanobis')


image1 = Image.open('image/1.jpg')
image2 = Image.open('image/2.jpg')
image2 = image2.resize(image1.size)
image1 = np.asarray(image1).flatten()
image2 = np.asarray(image2).flatten()

# Full images are too expensive, so demonstrate with small random vectors
x = np.random.random(10)
y = np.random.random(10)
print(mahalanobis(x, y))

#print(mahalanobis(image1, image2))


11. JS Divergence

Measures the similarity between two probability distributions; commonly used in bioinformatics and genome comparison, quantitative history, and machine learning.
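For reference, with M = (P + Q)/2 the Jensen-Shannon divergence is defined via the KL divergence as

JS(P \parallel Q) = \frac{1}{2} D_{KL}(P \parallel M) + \frac{1}{2} D_{KL}(Q \parallel M)

and scipy's 'jensenshannon' metric returns its square root.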

import numpy as np
from PIL import Image
from scipy.spatial.distance import pdist


def jensenshannon(image1, image2):
    X = np.vstack([image1, image2])
    return pdist(X, 'jensenshannon')[0]


image1 = Image.open('image/1.jpg')
image2 = Image.open('image/2.jpg')
image2 = image2.resize(image1.size)
image1 = np.asarray(image1).flatten()
image2 = np.asarray(image2).flatten()

print(jensenshannon(image1, image2))
Image    1    2         3         4
1        0    0.2008    0.4877    0.1746


12. The image-match Library

image-match

image-match documentation

Similar to a pHash library, it ships with a database backend, scales easily to billions of images, and supports sustained high-speed image insertion.

Matching is based on a pHash-style discrete cosine transform; a normalized distance below 0.40 very likely indicates a match.

Installation:

pip install image_match
# Excerpt: how image_match computes the normalized distance internally
norm_diff = np.linalg.norm(b - a)
norm1 = np.linalg.norm(b)
norm2 = np.linalg.norm(a)
return norm_diff / (norm1 + norm2)
from image_match.goldberg import ImageSignature


def signature(path):
    # generate_signature accepts a file path and returns the image signature
    return ImageSignature().generate_signature(path)


def distance(image1, image2):
    return ImageSignature.normalized_distance(image1, image2)


image1 = signature('image/1.jpg')
image2 = signature('image/2.jpg')

print(distance(image1, image2))
Image    1    2         3         4
1        0    0.2360    0.6831    0.4296
With a filter applied:

The computed distance is 0.2027, still a match.

13. Matching Without Installing the Library

The matching code below is adapted from the original library.

import numpy as np
from skimage.io import imread


def read(image):
    # Step 1: Load image as array of grey-levels
    im_array = imread(image, as_gray=True)  # 'as_grey' in older scikit-image releases

    # Step 2a: Determine cropping boundaries
    rw = np.cumsum(np.sum(np.abs(np.diff(im_array, axis=1)), axis=1))
    cw = np.cumsum(np.sum(np.abs(np.diff(im_array, axis=0)), axis=0))
    upper_column_limit = np.searchsorted(cw, np.percentile(cw, 95), side='left')
    lower_column_limit = np.searchsorted(cw, np.percentile(cw, 5), side='right')
    upper_row_limit = np.searchsorted(rw, np.percentile(rw, 95), side='left')
    lower_row_limit = np.searchsorted(rw, np.percentile(rw, 5), side='right')
    if lower_row_limit > upper_row_limit:
        lower_row_limit = int(5 / 100. * im_array.shape[0])
        upper_row_limit = int(95 / 100. * im_array.shape[0])
    if lower_column_limit > upper_column_limit:
        lower_column_limit = int(5 / 100. * im_array.shape[1])
        upper_column_limit = int(95 / 100. * im_array.shape[1])
    image_limits = [(lower_row_limit, upper_row_limit), (lower_column_limit, upper_column_limit)]

    # Step 2b: Generate grid centers
    x_coords = np.linspace(image_limits[0][0], image_limits[0][1], 11, dtype=int)[1:-1]
    y_coords = np.linspace(image_limits[1][0], image_limits[1][1], 11, dtype=int)[1:-1]

    # Step 3: Compute grey level mean of each P x P square centered at each grid point
    P = max([2.0, int(0.5 + min(im_array.shape) / 20.)])
    avg_grey = np.zeros((x_coords.shape[0], y_coords.shape[0]))
    for i, x in enumerate(x_coords):
        lower_x_lim = int(max([x - P / 2, 0]))
        upper_x_lim = int(min([lower_x_lim + P, im_array.shape[0]]))
        for j, y in enumerate(y_coords):
            lower_y_lim = int(max([y - P / 2, 0]))
            upper_y_lim = int(min([lower_y_lim + P, im_array.shape[1]]))
            avg_grey[i, j] = np.mean(im_array[lower_x_lim:upper_x_lim, lower_y_lim:upper_y_lim])

    # Step 4a: Compute array of differences for each grid point vis-a-vis each neighbor
    right_neighbors = -np.concatenate((np.diff(avg_grey),
                                       np.zeros(avg_grey.shape[0]).reshape((avg_grey.shape[0], 1))), axis=1)
    left_neighbors = -np.concatenate((right_neighbors[:, -1:], right_neighbors[:, :-1]), axis=1)
    down_neighbors = -np.concatenate((np.diff(avg_grey, axis=0),
                                      np.zeros(avg_grey.shape[1]).reshape((1, avg_grey.shape[1]))))
    up_neighbors = -np.concatenate((down_neighbors[-1:], down_neighbors[:-1]))
    diagonals = np.arange(-avg_grey.shape[0] + 1, avg_grey.shape[0])
    upper_left_neighbors = sum([np.diagflat(np.insert(np.diff(np.diag(avg_grey, i)), 0, 0), i) for i in diagonals])
    lower_right_neighbors = -np.pad(upper_left_neighbors[1:, 1:], (0, 1), mode='constant')
    flipped = np.fliplr(avg_grey)
    upper_right_neighbors = sum([np.diagflat(np.insert(np.diff(np.diag(flipped, i)), 0, 0), i) for i in diagonals])
    lower_left_neighbors = -np.pad(upper_right_neighbors[1:, 1:], (0, 1), mode='constant')
    diff_mat = np.dstack(np.array([upper_left_neighbors, up_neighbors, np.fliplr(upper_right_neighbors),
                                   left_neighbors, right_neighbors, np.fliplr(lower_left_neighbors),
                                   down_neighbors, lower_right_neighbors]))

    # Step 4b: Bin differences to only 2n+1 values
    mask = np.abs(diff_mat) < 2 / 255.
    diff_mat[mask] = 0.
    positive_cutoffs = np.percentile(diff_mat[diff_mat > 0.], np.linspace(0, 100, 3))
    negative_cutoffs = np.percentile(diff_mat[diff_mat < 0.], np.linspace(100, 0, 3))
    for level, interval in enumerate([positive_cutoffs[i:i + 2] for i in range(positive_cutoffs.shape[0] - 1)]):
        diff_mat[(diff_mat >= interval[0]) & (diff_mat <= interval[1])] = level + 1
    for level, interval in enumerate([negative_cutoffs[i:i + 2] for i in range(negative_cutoffs.shape[0] - 1)]):
        diff_mat[(diff_mat <= interval[0]) & (diff_mat >= interval[1])] = -(level + 1)

    # Step 5: Flatten array and return signature
    return np.ravel(diff_mat).astype('int8')


def distance(image1, image2):
    norm_diff = np.linalg.norm(image1 - image2)
    norm1 = np.linalg.norm(image1)
    norm2 = np.linalg.norm(image2)
    return norm_diff / (norm1 + norm2)


if __name__ == '__main__':
    image1 = read('image/1.jpg')
    image2 = read('image/2.jpg')
    print(distance(image1, image2))
The result is the same as in section 12.

14. Feature Matching with a Pretrained Keras Model

VGG16 is used as the pretrained model here; the larger the score, the better the match.

import numpy as np
from numpy import linalg as LA
from keras.preprocessing import image
from keras.applications.vgg16 import VGG16
from keras.applications.vgg16 import preprocess_input


class VGGNet:
    def __init__(self):
        self.input_shape = (224, 224, 3)
        self.model = VGG16(weights='imagenet', pooling='max', include_top=False,
                           input_shape=(self.input_shape[0], self.input_shape[1], self.input_shape[2]))

    def extract_feat(self, img_path):
        '''Extract image features

        :param img_path: path to the image
        :return: L2-normalized feature vector
        '''
        img = image.load_img(img_path, target_size=(self.input_shape[0], self.input_shape[1]))
        img = image.img_to_array(img)
        img = np.expand_dims(img, axis=0)
        img = preprocess_input(img)
        feat = self.model.predict(img)
        norm_feat = feat[0] / LA.norm(feat[0])
        return norm_feat


if __name__ == '__main__':
    model = VGGNet()
    image1 = model.extract_feat('image/1.jpg')
    image2 = model.extract_feat('image/2.jpg')
    print(np.dot(image1, image2.T))
Image    1    2            3             4
1        1    0.8714762    0.60663277    0.67468536
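The final np.dot of two L2-normalized feature vectors is exactly their cosine similarity, which is why the scores fall in [-1, 1] and identical images score 1. A toy sketch:

```python
import numpy as np

a = np.array([3.0, 4.0])
b = np.array([4.0, 3.0])
a /= np.linalg.norm(a)  # L2-normalize, as extract_feat does
b /= np.linalg.norm(b)

print(np.dot(a, b))  # (3*4 + 4*3) / (5*5) = 0.96
```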


Scripts
1. Combined comparison

import numpy as np
from PIL import Image
from scipy.spatial.distance import pdist


def euclidean(image1, image2):
    '''Euclidean distance'''
    X = np.vstack([image1, image2])
    return pdist(X, 'euclidean')[0]


def manhattan(image1, image2):
    '''Manhattan distance'''
    X = np.vstack([image1, image2])
    return pdist(X, 'cityblock')[0]


def chebyshev(image1, image2):
    '''Chebyshev distance'''
    X = np.vstack([image1, image2])
    return pdist(X, 'chebyshev')[0]


def cosine(image1, image2):
    '''Cosine distance'''
    X = np.vstack([image1, image2])
    return pdist(X, 'cosine')[0]


def pearson(image1, image2):
    '''Pearson correlation coefficient'''
    X = np.vstack([image1, image2])
    return np.corrcoef(X)[0][1]


def hamming(image1, image2):
    '''Hamming distance'''
    return np.shape(np.nonzero(image1 - image2)[0])[0]


def jaccard(image1, image2):
    '''Jaccard distance'''
    X = np.vstack([image1, image2])
    return pdist(X, 'jaccard')[0]


def braycurtis(image1, image2):
    '''Bray-Curtis distance'''
    X = np.vstack([image1, image2])
    return pdist(X, 'braycurtis')[0]


def mahalanobis(image1, image2):
    '''Mahalanobis distance'''
    X = np.vstack([image1, image2])
    XT = X.T
    return pdist(XT, 'mahalanobis')


def jensenshannon(image1, image2):
    '''JS divergence'''
    X = np.vstack([image1, image2])
    return pdist(X, 'jensenshannon')[0]


def image_match(image1, image2):
    '''image-match library'''
    try:
        from image_match.goldberg import ImageSignature
    except ImportError:
        return -1
    image1 = ImageSignature().generate_signature(image1)
    image2 = ImageSignature().generate_signature(image2)
    return ImageSignature.normalized_distance(image1, image2)


def vgg_match(image1, image2):
    '''VGG16 feature matching'''
    try:
        from numpy import linalg as LA
        from keras.preprocessing import image
        from keras.applications.vgg16 import VGG16
        from keras.applications.vgg16 import preprocess_input
    except ImportError:
        return -1

    input_shape = (224, 224, 3)
    model = VGG16(weights='imagenet', pooling='max', include_top=False, input_shape=input_shape)

    def extract_feat(img_path):
        '''Extract image features'''
        img = image.load_img(img_path, target_size=(input_shape[0], input_shape[1]))
        img = image.img_to_array(img)
        img = np.expand_dims(img, axis=0)
        img = preprocess_input(img)
        feat = model.predict(img)
        norm_feat = feat[0] / LA.norm(feat[0])
        return norm_feat

    image1 = extract_feat(image1)
    image2 = extract_feat(image2)
    return np.dot(image1, image2.T)


if __name__ == '__main__':
    # Initialization
    image1_name = 'image/1.jpg'
    image2_name = 'image/2.jpg'
    image3_name = 'image/3.jpg'

    # Image preprocessing
    image1 = Image.open(image1_name).convert('L')  # convert to greyscale; drop this to keep color
    image2 = Image.open(image2_name).convert('L')
    image3 = Image.open(image3_name).convert('L')
    image2 = image2.resize(image1.size)
    image3 = image3.resize(image1.size)
    image1 = np.asarray(image1).flatten()
    image2 = np.asarray(image2).flatten()
    image3 = np.asarray(image3).flatten()

    # Similarity matching
    print('Euclidean distance', euclidean(image1, image2), euclidean(image1, image3))
    print('Manhattan distance', manhattan(image1, image2), manhattan(image1, image3))
    print('Chebyshev distance', chebyshev(image1, image2), chebyshev(image1, image3))
    print('Cosine distance', cosine(image1, image2), cosine(image1, image3))
    print('Pearson correlation coefficient', pearson(image1, image2), pearson(image1, image3))
    print('Hamming distance', hamming(image1, image2), hamming(image1, image3))
    print('Jaccard distance', jaccard(image1, image2), jaccard(image1, image3))
    print('Bray-Curtis distance', braycurtis(image1, image2), braycurtis(image1, image3))
    # print('Mahalanobis distance', mahalanobis(image1, image2), mahalanobis(image1, image3))
    print('JS divergence', jensenshannon(image1, image2), jensenshannon(image1, image3))

    print('image-match library', image_match(image1_name, image2_name), image_match(image1_name, image3_name))
    print('VGG16 feature matching', vgg_match(image1_name, image2_name), vgg_match(image1_name, image3_name))
2. Euclidean similarity

import numpy as np
from PIL import Image
from scipy.spatial.distance import pdist


def euclidean(image1, image2, size):
    '''Euclidean similarity'''
    black = Image.new('RGB', size, color=(0, 0, 0))
    white = Image.new('RGB', size, color=(255, 255, 255))
    white = np.asarray(white).flatten()
    black = np.asarray(black).flatten()
    X = np.vstack([white, black])
    _max = pdist(X, 'euclidean')[0]  # maximum possible Euclidean distance between two images
    X = np.vstack([image1, image2])
    return (_max - pdist(X, 'euclidean')[0]) / _max
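A quick sanity check of that similarity (restated here as euclidean_similarity so the sketch is self-contained): identical images score 1.0, and black vs. white scores 0.0.

```python
import numpy as np
from PIL import Image
from scipy.spatial.distance import pdist


def euclidean_similarity(image1, image2, size):
    '''Euclidean similarity in [0, 1]: 1 minus the distance normalized
    by the all-black vs. all-white maximum for this image size.'''
    black = np.asarray(Image.new('RGB', size, color=(0, 0, 0))).flatten()
    white = np.asarray(Image.new('RGB', size, color=(255, 255, 255))).flatten()
    _max = pdist(np.vstack([white, black]), 'euclidean')[0]
    d = pdist(np.vstack([image1, image2]), 'euclidean')[0]
    return (_max - d) / _max


size = (8, 8)
black = np.asarray(Image.new('RGB', size, color=(0, 0, 0))).flatten()
white = np.asarray(Image.new('RGB', size, color=(255, 255, 255))).flatten()
print(euclidean_similarity(black, black, size))  # 1.0
print(euclidean_similarity(black, white, size))  # 0.0
```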


Summary

Task               Recommended distance
Text similarity    Cosine distance
User similarity    Pearson correlation coefficient


References
Image mode conversion with the Python PIL library
Common distance formulas implemented with NumPy
EdjoLabs/image-match: Quickly search over billions of images
Computing image similarity in Python
Similarity measures: Euclidean distance, Hamming distance, cosine similarity
Distance computations (scipy.spatial.distance)
Distance metrics and their Python implementations (part 1)
Distance metrics and their Python implementations (part 2)
A VGG-16-based large-scale image retrieval system (reverse image search, upgraded)
Image fingerprints from grey-level comparison
sklearn.metrics.pairwise.paired_distances
9 distance measures in data science
————————————————
Copyright notice: this is an original article by CSDN blogger "XerCis", licensed under CC 4.0 BY-SA. Please include the original source link and this notice when reposting.
Original link: https://blog.csdn.net/lly1122334/article/details/89431244

Source: https://www.cnblogs.com/auschwitzer/p/15747956.html