Earlier this week, we reported on the removal of iChatr, the iPhone version of the popular randomized video chat service ChatRoulette, and noted that we thought Apple had gone too far. Apple pulled the app as part of its anti-porn stance because users were exposing themselves.
Now we hear from DLP Mobile, creator of a mirror app for the iPhone 4, that Apple rejected the app from the App Store because its submitted screenshots were considered "obscene, pornographic, or defamatory." All we can say to this is: Take a look and see for yourself.
The images in question, shown below, are two shoulders-up pictures of America's Next Top Model contestant Lyudmila Bouzinova.
In the case of iChatr, there was little question that users were indeed exposing themselves; we wouldn't hesitate to call the practice a tradition on the app's Web counterpart, ChatRoulette. Our contention there was that the browser, or any number of other sanctioned apps, could be used just as easily to arrive at naked body parts.
According to DLP Mobile’s blog post on the rejection, Apple objected to its mirror app based solely on these two pictures, saying they were “in violation of Section 3.3.18 from the iPhone Developer Program License Agreement.” The agreement states the following:
“Applications may be rejected if they contain content or materials of any kind (text, graphics, images, photographs, sounds, etc.) that in Apple’s reasonable judgment may be found objectionable, for example, materials that may be considered obscene, pornographic, or defamatory.”
Apple's email to DLP Mobile continued: "The application screenshots must meet the requirements for a 4+ rating (no objectionable material) since these images are visible on the App Store by all users even when purchasing is restricted by the application's rating."
Although the company graciously (what other choice did it have?) changed the screenshots and the app was subsequently accepted, the review process seems inherently flawed. While Apple may reserve the right to allow or reject whatever apps it chooses, we have to wonder by what standard these pictures could be judged "obscene, pornographic, or defamatory." And at what point will Apple's discretionary banning of even the barest shoulder from its App Store come back to bite it with consumers? Will developers continue to put up with cases like this, or will Android eventually become the leading mobile platform for precisely this kind of reason?
What do you think – were these images “obscene, pornographic, or defamatory” in any way?