If you’ve searched for objects or landmarks in Google Photos, you have already used Google’s Cloud Vision API without knowing it. Today, the Mountain View company announced that the API is entering beta and opening up to more developers. The service launched in limited preview last December, and Google has now also announced its pricing.

Developers will be able to submit photos and have them analyzed for the objects they contain (as in Google Photos search), explicit content, faces, and text:

  • Insights from your images: Powered by the same technologies behind Google Photos, Cloud Vision API detects broad sets of objects in your images, from flowers to popular landmarks.
  • Inappropriate content detection: Powered by Google SafeSearch, Cloud Vision API moderates content from your crowd-sourced images by detecting different types of inappropriate content.
  • Image sentiment analysis: Cloud Vision API can analyze emotional attributes of people in your images, like joy, sorrow and anger, along with detecting popular product logos.
  • Text extraction: Optical Character Recognition (OCR) enables you to detect text within your images, along with automatic language identification across a broad set of languages.
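The features above map to request types in the API. As a rough illustration, a client submits a base64-encoded image along with a list of the feature types it wants back; the sketch below builds such a request body for the v1 REST endpoint (`images:annotate`). Field names follow Google's documented v1 schema, but verify against the current reference before relying on them:

```python
import base64
import json

def build_annotate_request(image_bytes, feature_types, max_results=5):
    """Build the JSON body for POST https://vision.googleapis.com/v1/images:annotate.

    Each entry in feature_types is an API feature name such as
    "LABEL_DETECTION", "SAFE_SEARCH_DETECTION", "FACE_DETECTION",
    or "TEXT_DETECTION" (the OCR feature).
    """
    return {
        "requests": [{
            # Images are sent inline as base64 text.
            "image": {"content": base64.b64encode(image_bytes).decode("ascii")},
            # One feature object per analysis type requested.
            "features": [
                {"type": t, "maxResults": max_results} for t in feature_types
            ],
        }]
    }

# Placeholder bytes stand in for real image data.
body = build_annotate_request(
    b"\x89PNG-placeholder",
    ["LABEL_DETECTION", "SAFE_SEARCH_DETECTION", "TEXT_DETECTION"],
)
print(json.dumps(body, indent=2))
```

The same request can carry several feature types at once, so one round trip can return labels, a SafeSearch verdict, and extracted text for a single image.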

Google will charge developers per 1,000 photos submitted, with rates decreasing as volume grows. Above five million images, Optical Character Recognition and facial detection cost $0.60 per 1,000 images, while something like Label Detection costs $2 per 1,000. During the beta period, users can submit up to 1,000 images for free.
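The billing math is simple at a given tier: divide the image count by 1,000 and multiply by the rate. The helper below uses only the over-five-million-image rates quoted above; real billing is tiered, so treat this as an illustration rather than a billing tool:

```python
# Per-1,000-image rates quoted in the article for the >5M-image tier (USD).
RATES_PER_1000_OVER_5M = {
    "TEXT_DETECTION": 0.60,   # OCR
    "FACE_DETECTION": 0.60,
    "LABEL_DETECTION": 2.00,
}

def estimated_cost(num_images, feature):
    """Estimate USD cost for num_images at the quoted over-5M rate."""
    return num_images / 1000 * RATES_PER_1000_OVER_5M[feature]

print(estimated_cost(6_000_000, "LABEL_DETECTION"))  # 12000.0
print(estimated_cost(6_000_000, "TEXT_DETECTION"))   # 3600.0
```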

During the limited preview, thousands of companies used the API to create millions of image annotations, and Google drew on their feedback to refine the service. Apps like Yik Yak used the Cloud Vision API for OCR in multiple languages, while photo social network PhotoFy used it to scan for inappropriate imagery.

As the service is still in beta, users will have a quota of 20 million images per month, and Google warns that the API is not meant for “real-time mission critical applications.”
