Finally I am getting to the meat of using image processing to detect labels automatically. For background you should first read part 1 and part 2 before reading this.
Just computing a histogram won't get me very far; for it to be useful I need to compare it to something. As mentioned, I will compare one part of the image (where I think the label is) with the rest of the image, minus the center hole, as the hole isn't part of the record. The idea is that the overlap between the two will be minimal, except possibly when the vinyl and the label have the same, or a similar, colour.
OpenCV has built-in functions to compare histograms; basically, I followed this blog post at PyImageSearch about comparing histograms and applied it to my test images.
After computing the histograms (32 buckets) I ran the four types of histogram comparisons from the PyImageSearch post on the red, green and blue channels, as well as on the full image. They are explained much better there than I could, but the four types are:
- correlation
- chi squared distance
- intersection
- Bhattacharyya distance
Sleeve histogram comparison
For the sleeve I am expecting a high correlation, as the colours of the two parts that are compared are very similar.
Red channel
- correlation: 0.8926749724715063
- chi squared: 120660.19493757954
- intersection: 74872.0
- Bhattacharyya distance: 0.29158852379442657
Green channel
- correlation: 0.6275938864980654
- chi squared: 181519.2086741701
- intersection: 63567.0
- Bhattacharyya distance: 0.35726567943462395
Blue channel
- correlation: 0.7437035346944266
- chi squared: 190204.78252395047
- intersection: 73194.0
- Bhattacharyya distance: 0.32126113326366795
Full image
- correlation: 0.5584962418370338
- chi squared: 32.475552958421424
- intersection: 2.4023326691039983
- Bhattacharyya distance: 0.5647389844577926
Label histogram comparison
For the label I am expecting a much lower correlation, although the label itself is quite dark as well, which might make things more difficult. Again, the results are for the red, green and blue channels, as well as the full image.
Red channel
- correlation: 0.04524780052542104
- chi squared: 435578.4556929312
- intersection: 17567.0
- Bhattacharyya distance: 0.825618002673244
Green channel
- correlation: 0.028391995259706947
- chi squared: 437614.5409754197
- intersection: 17194.0
- Bhattacharyya distance: 0.8394928259148619
Blue channel
- correlation: 0.04220587964200443
- chi squared: 490151.69460585836
- intersection: 15318.0
- Bhattacharyya distance: 0.8624890522162065
Full image
- correlation: 0.12186543341765493
- chi squared: 11.862898455687839
- intersection: 0.15000400247754442
- Bhattacharyya distance: 0.8706885323226409
The test image I used is of course the ideal situation, so I wondered how well my method works for other images that I downloaded from Discogs. I will dig into that on my next post.