Forum Moderators: Robert Charlton & goodroi
Then I turned the SafeSearch option off. And guess what, a lot of the images from my site appeared using the same keyword.
My site is in a European language, but I don't think any words used on the site could be classified as adult content. Does anybody know what to do about this problem?
We had a thread about false Safe Search filtering [webmasterworld.com] back in January. Here's a reply I made in that discussion:
First, if the site publishes user-provided images, there may be accidental occurrences of incorrect (and adult) metadata embedded in some or many of the files. Depending on how the user works with their software, incorrect metadata can happen. Enough accidents like this and the entire domain's images might be filtered.

Second, Google recently filed a patent application that offers a lot of clues on how images can be automatically processed for search. Here's the patent application [appft1.uspto.gov]. Notice these possibilities, especially when there is little or no data/metadata directly associated with an image:
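If you accept user uploads, it's worth auditing embedded metadata yourself before a filter does it for you. Here's a minimal sketch of that idea; the field names and the keyword list are my own illustrative assumptions, not anything Google has published:

```python
# Hypothetical sketch: scan an image file's embedded metadata (already
# extracted into a dict) for terms that might trip an adult-content
# filter. ADULT_TERMS and the field names are illustrative assumptions.

ADULT_TERMS = {"xxx", "nsfw", "adult"}

def flag_suspect_metadata(metadata):
    """Return the metadata fields whose text contains a flagged term."""
    flagged = {}
    for field, value in metadata.items():
        words = set(str(value).lower().split())
        hits = words & ADULT_TERMS
        if hits:
            flagged[field] = sorted(hits)
    return flagged

# Example: one innocuous file and one with stray embedded keywords.
clean = {"Title": "Beach sunset", "Keywords": "travel summer"}
risky = {"Title": "Holiday pics", "Keywords": "party nsfw fun"}

print(flag_suspect_metadata(clean))   # {}
print(flag_suspect_metadata(risky))   # {'Keywords': ['nsfw']}
```

In practice you'd pull the metadata out of the files with a library or tool (Pillow's EXIF support, exiftool, etc.) and run a check like this over every upload.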
- Images can be auto-tagged according to shapes, colors, and textures. This may involve breaking down images into smaller tiles and tagging those tiles.
- Images can be compared to other indexed images from around the web that have similar extracted features. Then keywords that are semantically related to those other images may be imported and used to tag the image that is being classified.
I can see lots of ways that an image might be incorrectly filtered. When it comes to image search, any search engine might prefer to err on the conservative side.