Google Play app review process gets a human touch

Google has added humans to its app review process to weed out malware, sexually explicit content and other violations, and has tightened its content rating system.

Compared with Apple’s App Store, Google Play has traditionally offered developers a relatively frictionless path to publishing, thanks to Google’s automated systems for assessing apps. On Tuesday, however, the Android maker revealed that humans are now assessing apps before they’re published, marking a turn towards Apple’s approach to vetting apps.

“Several months ago, we began reviewing apps before they are published on Google Play to better protect the community and improve the app catalog. This new process involves a team of experts who are responsible for identifying violations of our developer policies earlier in the app lifecycle,” the company said on its Android developer blog.

The good news for developers, according to Google, is that human review hasn’t pushed approval times out from the hours they’ve become accustomed to into days or weeks.

“In fact, there has been no noticeable change for developers during the rollout,” said Eunice Kim, Product Manager for Google Play.

Apple’s App Review Status dashboard indicates that 97 percent of iOS apps have been reviewed in the last five business days, along with 98 percent of updates to existing apps. The third-party site appreviewtimes.com puts the average review time at seven days.

Humans have, however, been involved in Google’s app review process for some time, though only as a last line of defence.

Google introduced its Bouncer system in 2012 to keep malicious apps out of its store. Shortly after its launch, security researchers picked Bouncer apart and found several ways to bypass it. If Bouncer detected something amiss, a human operator would conduct a manual analysis; but because Bouncer only analysed each app for a brief period in an emulator, it was possible to submit malicious apps designed to appear harmless in that environment.
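
To see how such a bypass could work in principle: because the analysis ran briefly inside an emulator, an app could fingerprint its environment and stay on its best behaviour whenever it detected one. Below is a minimal Java sketch of that idea, using standard android.os.Build properties; the class name and heuristics are illustrative examples, not the researchers’ actual bypass or Bouncer’s internals.

    import android.os.Build;

    // Illustrative only: a simplified emulator check of the kind used to
    // evade short-lived sandbox analysis. These heuristics are examples,
    // not a description of Bouncer's actual analysis environment.
    public class SandboxCheck {

        // Returns true if the device looks like a stock Android emulator.
        // A malicious app could stay dormant while this returns true and
        // only misbehave on real hardware.
        public static boolean looksLikeEmulator() {
            return Build.FINGERPRINT.startsWith("generic")
                    || Build.MODEL.contains("google_sdk")
                    || Build.MODEL.contains("Emulator")
                    || Build.HARDWARE.contains("goldfish");
        }
    }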

Bouncer may have improved Google’s detection rate, but the problems it was attempting to solve with machines were complex, and Google has often been left to remove malicious apps only after third-party security vendors raised the alarm.

Google’s reviewers, however, will assess a wide range of non-compliant behaviour and content beyond malware, including sexually explicit material, deceptive behaviour, intellectual property infringement, and apps that make changes to a device without the user’s consent.

Purnima Kochikar, director of business development for Google Play, told the Wall Street Journal that its automated systems are good at spotting non-compliant images in apps but less effective at assessing intellectual property infringement.

Google is also tightening up the age-based content rating system for apps and games on its store, which now includes official ratings from the Australian Classification Board and equivalent organisations in other markets.

Developers’ apps may be blocked in certain territories if they fail to complete a content rating questionnaire, a rule that from May will apply to all new and existing apps before they can be published on Google Play.
