Label detection: testing with a large collection of images

In the past few weeks I have been experimenting a bit with OpenCV and other techniques to find out if pictures contain labels, and wrote about that (part 1, part 2, part 3, part 4, part 5 and part 6). As you know the proof of the pudding is in the eating, and I do like pudding, so I decided to run a test to see if my script could say whether an image has a label on it, or could not say (so basically the result is either "has label" or "don't know if there is a label"), as up until now I had only tested with about 30 images.

I wanted to keep my test small because of time constraints (I still need to add multiprocessing to my labeling script), but big enough to draw interesting conclusions, so I chose one particular Spanish label called Belter. This is not because I like the music they released, but because it is a label I encountered many times and it has a rather simple, clean label design, so it seemed like a good candidate, although it could also mean that my results need to be taken with a grain of salt.

The first step is to get as many pictures as needed. I took the data dump covering data up until (and including) April 30, 2019 and extracted all the 7" releases (not 12" or LPs) on this particular label that have images.
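The dump is one big compressed XML file, so it pays to stream it rather than load it into memory in one go. Below is a minimal sketch of how such an extraction could look; the element and attribute names are my reading of the dump layout, and the real script of course does more bookkeeping.

    import gzip
    import xml.etree.ElementTree as ET

    # Sketch: stream the releases dump and collect the IDs of 7"
    # releases on the Belter label that have at least one image.
    # Element and attribute names are assumptions about the dump layout.
    release_ids = []

    with gzip.open('discogs_20190501_releases.xml.gz', 'rb') as dump:
        for event, elem in ET.iterparse(dump, events=('end',)):
            if elem.tag != 'release':
                continue

            labels = {label.get('name') for label in elem.iter('label')}
            descriptions = {d.text for d in elem.iter('description')}
            has_images = elem.find('images') is not None

            if 'Belter' in labels and '7"' in descriptions and has_images:
                release_ids.append(elem.get('id'))

            elem.clear()  # free processed elements to keep memory use low

    print('%d releases found' % len(release_ids))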

There are 3315 7" releases that have at least 1 image. The breakdown of the number of images per release is as follows:
  • 1 image: 1299 releases
  • 2 images: 619 releases
  • 3 images: 147 releases
  • 4 images: 1188 releases
  • 5 images: 36 releases
  • 6 images: 13 releases
  • 7 images: 2 releases
  • 8 images: 5 releases
  • 9 images: 1 release
  • 10 images: 2 releases
  • 11 images: 1 release
  • 14 images: 1 release
  • 15 images: 1 release
In total there are 8,111 images. My rough guess is that I should be able to extract around 1,000 useful images from this set.

Next step: downloading all the images via the Discogs API. To my surprise there were 5 more images than in the dump, which likely were added in the last few days. More is always better!
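As an illustration, the download step could look something like the sketch below. It uses the documented /releases/{id} endpoint; the User-Agent string and the personal access token (here read from an environment variable) are assumptions, and real code would need error handling and a way to resume.

    import os
    import time
    import requests

    # Sketch: download all images for the releases found in the dump.
    # DISCOGS_TOKEN is assumed to hold a personal access token.
    HEADERS = {
        'User-Agent': 'LabelDetectionTest/0.1',
        'Authorization': 'Discogs token=%s' % os.environ['DISCOGS_TOKEN'],
    }

    for release_id in release_ids:
        response = requests.get(
            'https://api.discogs.com/releases/%s' % release_id,
            headers=HEADERS)
        response.raise_for_status()

        for n, image in enumerate(response.json().get('images', [])):
            img = requests.get(image['uri'], headers=HEADERS)
            with open('%s-%d.jpg' % (release_id, n), 'wb') as outfile:
                outfile.write(img.content)

        time.sleep(1)  # stay well under the API rate limit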

I then ran my label script on every image to see if it thought a label was present or not. The results:
  • 2517 images thought to be labels
  • 5599 images thought to be not labels
I then manually verified all the images, which was rather, ehm, 'interesting', especially the sleeves of releases with traditional Spanish music. Torture.
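The script builds on the techniques from the earlier parts of this series. As a very rough sketch of the kind of circle-based check involved, it is something along these lines; the blur choice and all parameter values here are placeholders, not the tuned ones from the real script.

    import cv2

    def has_label(image_path):
        '''Sketch: report whether a picture looks like it contains a
        record label, i.e. a dominant circle. Parameter values are
        placeholders, not the tuned ones from the real script.'''
        image = cv2.imread(image_path)
        if image is None:
            return False

        gray = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)
        blurred = cv2.medianBlur(gray, 5)

        circles = cv2.HoughCircles(
            blurred, cv2.HOUGH_GRADIENT, dp=1,
            minDist=gray.shape[0] // 2,
            param1=100, param2=60,
            minRadius=gray.shape[0] // 4, maxRadius=0)

        return circles is not None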

Images tagged as labels

Of the 2517 images that were tagged as labels, 47 turned out to be false positives: 1.87%. This is not bad at all. Looking at the false positives (and processing them again) I found a few patterns:
  • pictures were not properly cropped: in some pictures, for example, an underlying wooden table can be seen. During processing this is then treated as some sort of outer ring, which interferes with the method
  • during edge detection one extremely tiny circle is found somewhere in the image (with a 0 or 1 pixel inner radius). Eliminating these (see the snippet after this list) reduced the number of false positives by 0.44% (11 false positives), to 1.43%
  • there are circles on the releases, or something that resembles circles, such as this flexi disc, this single with an ellipse in the center, one of the images in this rather elaborate packaging, a release with a circle, and more. The most interesting one was actually this cardboard release, where my script thought the picture of the back of the release was actually a label.
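Filtering out those degenerate detections is a one-liner. A sketch, assuming the detections come back as (x, y, radius) triplets the way cv2.HoughCircles returns them:

    # Drop degenerate detections: a circle with a 0 or 1 pixel radius
    # cannot be a record label. Assumes the array layout returned by
    # cv2.HoughCircles: shape (1, N, 3) with (x, y, radius) per circle.
    if circles is not None:
        circles = [c for c in circles[0] if c[2] > 1]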
There are definitely some improvements to be made, but 36 false positives on 2517 images (1.43%) is not bad for such a simple method.

I am sure that better quality images, especially better cropping, would eliminate at least another 50% of the false positives.

Images tagged as "not a label"

On to the false negatives: of the 5599 images tagged as "not a label", 341 actually turned out to be a label: 6.09%. This is quite a bit more than the false positive rate, but a false negative is not as disastrous as a false positive, as the choice was between "contains a label" and "don't know".

I did not count pictures showing only a partial label, or the sleeve and label together, as false negatives. Counting those, the number of false negatives would be higher.

The biggest issues that I found:
  • camera flash and other sources of light: these really mess with the intensity of some of the pixels, which confuses the edge detection algorithm
  • white stickers covering part of the label, including the edge, which also confuses the edge detection algorithm
  • angle of the picture: some pictures had strange angles and for my method to work a top down view is best
Playing a bit with other blurring methods (Gaussian blur) lets me recognize a further 127 labels (bringing the false negative percentage below 4%), but it also adds another 44 false positives. That is a net gain of 83 labels, but the increase in false positives doesn't sit well with me. It seems that smartly chosen values for the Gaussian blur can get the extra false positives down to just a handful (below 10) though.
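Swapping the blur is a one line change in OpenCV; the kernel size and sigma in this sketch are exactly the values that need smart choosing:

    # Replace the median blur with a Gaussian blur before circle
    # detection; the kernel size and sigma (here 5x5 and 0, meaning
    # sigma is derived from the kernel size) are the values to tune.
    blurred = cv2.GaussianBlur(gray, (5, 5), 0)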

Conclusion

For something that I literally threw together in a few evenings the method seems to perform very well. It definitely isn't perfect and I would not recommend using it to automatically flag images in a large database without manual verification, but it is a very good first stepping stone.
