Apple Pulls Two 500px Apps Over Nudity: Will It Pull Flipboard, Too?

As TechCrunch reported Tuesday, Apple has pulled photography network 500px's apps from its App Store because, after 16 months of use, their clearly marked nude photo galleries suddenly became intolerable.

In addition to 500px's own app, the third-party 500px app ISO500, whose parent company 500px acquired because of its excellent integration, has also received notice that it will be removed from the App Store shortly - for the same reason.

But here's the thing: Flipboard integrates completely with 500px as well. Everything you can do on 500px's app, you can do on Flipboard. Is Apple going to pull Flipboard as well? What about Tumblr, Instagram and all browsers - including Apple's own Safari? You can get to nude images with them pretty easily, too.

How 500px Got The News

500px Chief Operating Officer Evgeny Tchebotarev told ReadWrite that Apple called about an upcoming minor bugfix update to the app around 9pm Monday night. Apple told Tchebotarev the update would be rejected "because it is too easy to look for nude photos in search." The person on the phone initially said the app would be reverted to the earlier version.

"We said, 'it's fine, we can make the changes within a day,'" Tchebotarev says, "but in an hour, we got an email [from Apple] saying it would be pulled anyway, not just reverted." Though the representative he spoke to initially said the existing version could stay, Apple had second thoughts and decided to pull the app altogether.

The update changed nothing about search or the availability of nude photos in the app. It was just a minor release to improve performance and fix some bugs. As it has all along, the app defaults to a Safe Search mode that excludes nudity, and you have to log in on the desktop version to change that.

Tchebotarev says 500px is issuing a hotfix on the server side to satisfy Apple and get the app back into the store as soon as possible. "It's a little rough," he says. "[It] just filters out some search terms. It's not the elegant solution we are usually looking for." Once Apple lets the app back in, 500px will be able to work out a more permanent fix.

What's The Deal, Apple?

So will Apple pull every app that lets its users find nudity? This rhetorical question is getting tiresome. Apple has always been weird when it comes to defining what kinds of culture it deems appropriate for its users. It has also always been cagey and inconsistent with developers when applying these rules to the App Store.

Apple is rarely clear and upfront with its developer community about why and how these policies are applied. Usually, as with 500px, it's vague and confusing. Sometimes, as in another case on which ReadWrite has reported, it's completely inscrutable. Is the 500px takedown just the overzealous action of a new app reviewer? Or is there a whole new crackdown going on?

"What I've been thinking in the last hour is that our app['s name] starts with a number, so maybe they are getting stricter with not-safe-for-work apps, and ours was at the top of the list," Tchebotarev says. If that's the case, and this issue spreads beyond the apps that plug into 500px, we'll keep you posted.

We've reached out to Apple for comment, and though we probably won't get a response, we'll update the story if we do.