Using image processing to automatically detect labels (part 3)

Finally I am getting to the meat of using image processing to detect labels automatically. For background information you should read part 1 and part 2 first.

Just computing a histogram won't get me very far: for it to be useful I need to compare it to something. As explained in the earlier parts, what I will do is compare one part of the image (where I think the label is) with the rest of the image, minus the center hole, as that isn't part of the record. The idea is that the overlap between the two will be minimal, except possibly in cases where the vinyl and the label have the same or a similar colour.
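To make the masking concrete: below is a minimal sketch of how the two masks could be built with OpenCV. This is illustrative only (the function and parameter names are made up for this example); the actual center and radii would come from the detection steps described in part 1 and part 2.

```python
import cv2
import numpy as np

def region_masks(shape, center, hole_radius, label_radius, record_radius):
    """Build two masks: one covering the presumed label area and one
    covering the rest of the record, excluding the center hole."""
    label_mask = np.zeros(shape[:2], dtype=np.uint8)
    cv2.circle(label_mask, center, label_radius, 255, -1)  # fill the label disc
    cv2.circle(label_mask, center, hole_radius, 0, -1)     # cut out the center hole

    rest_mask = np.zeros(shape[:2], dtype=np.uint8)
    cv2.circle(rest_mask, center, record_radius, 255, -1)  # fill the record disc
    cv2.circle(rest_mask, center, label_radius, 0, -1)     # cut out the label area

    return label_mask, rest_mask
```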

OpenCV has built-in functions to compare histograms; basically I followed this blog post at PyImageSearch about comparing histograms and applied it to my test images.

After computing the histograms (32 buckets) I computed the four types of histogram comparisons from the PyImageSearch post for the red, green and blue channels, as well as for the full image. They are explained much better there than I can do here, but the four types are:

  1. correlation
  2. chi squared distance
  3. intersection
  4. Bhattacharyya distance
Each method's numbers are interpreted in a different way (and I don't know the fine details), but the important thing to keep in mind for correlation and intersection is: the lower the score, the more different the histograms are. For chi squared and Bhattacharyya distance it is the opposite: the higher the score, the more different the histograms are.
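As an illustration, the per-channel part could look something like the sketch below (the variable names are illustrative and I am glossing over normalization; the PyImageSearch post has the full treatment). One thing to keep in mind is that OpenCV stores images in BGR order, so channel index 2 is the red channel.

```python
import cv2

# The four comparison methods discussed above.
METHODS = [
    ("correlation", cv2.HISTCMP_CORREL),
    ("chi squared", cv2.HISTCMP_CHISQR),
    ("intersection", cv2.HISTCMP_INTERSECT),
    ("Bhattacharyya distance", cv2.HISTCMP_BHATTACHARYYA),
]

def compare_region_histograms(image, label_mask, rest_mask, channel):
    """Compare 32-bucket histograms of one colour channel between the
    label area and the rest of the record."""
    label_hist = cv2.calcHist([image], [channel], label_mask, [32], [0, 256])
    rest_hist = cv2.calcHist([image], [channel], rest_mask, [32], [0, 256])
    return {name: cv2.compareHist(label_hist, rest_hist, method)
            for name, method in METHODS}
```

For the full-image comparison a single multi-dimensional histogram over all three channels would presumably be compared instead (normalized, as in the PyImageSearch post), which would explain why those scores below are on a very different scale than the per-channel ones.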

Sleeve histogram comparison

For the sleeve I am expecting a high correlation, as the colours of the two parts that are compared are very similar.

Red channel

  1. correlation: 0.8926749724715063
  2. chi squared: 120660.19493757954
  3. intersection: 74872.0
  4. Bhattacharyya distance: 0.29158852379442657

Green channel

  1. correlation: 0.6275938864980654
  2. chi squared: 181519.2086741701
  3. intersection: 63567.0
  4. Bhattacharyya distance: 0.35726567943462395

Blue channel

  1. correlation: 0.7437035346944266
  2. chi squared: 190204.78252395047
  3. intersection: 73194.0
  4. Bhattacharyya distance: 0.32126113326366795

Full image

  1. correlation: 0.5584962418370338
  2. chi squared: 32.475552958421424
  3. intersection: 2.4023326691039983
  4. Bhattacharyya distance: 0.5647389844577926
Those numbers only make sense when compared with the results of the label histogram comparison below.

Label histogram comparison

For the label I am expecting a much lower correlation, although the label itself is quite dark as well, which might make things more difficult. Again: the results are for the red, green and blue channels, as well as the full image.

Red channel

  1. correlation: 0.04524780052542104
  2. chi squared: 435578.4556929312
  3. intersection: 17567.0
  4. Bhattacharyya distance: 0.825618002673244

Green channel

  1. correlation: 0.028391995259706947
  2. chi squared: 437614.5409754197
  3. intersection: 17194.0
  4. Bhattacharyya distance: 0.8394928259148619

Blue channel

  1. correlation: 0.04220587964200443
  2. chi squared: 490151.69460585836
  3. intersection: 15318.0
  4. Bhattacharyya distance: 0.8624890522162065

Full image

  1. correlation: 0.12186543341765493
  2. chi squared: 11.862898455687839
  3. intersection: 0.15000400247754442
  4. Bhattacharyya distance: 0.8706885323226409
As you can see, all comparisons say that the label area I cut from the image shows very little overlap with the rest of the image. This method seems to work: from these numbers alone I can tell that the latter image (the one with the label) is most likely the image showing the label (as expected).
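The decision step itself can then be very simple. A hypothetical sketch, using the full-image correlation numbers from above: pick the image where the label area and the rest of the record differ the most, i.e. the lowest correlation.

```python
def most_likely_label(correlations):
    """Given the full-image correlation score per image, return the
    image where label area and rest differ most (lowest correlation)."""
    return min(correlations, key=correlations.get)

# The full-image correlations measured above:
scores = {"sleeve": 0.5584962418370338, "label": 0.12186543341765493}
print(most_likely_label(scores))  # prints: label
```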

The test image I used is of course the ideal situation, so I wondered how well my method works for other images that I downloaded from Discogs. I will dig into that in my next post.
