Major regulation is pending that could change the future of the mobile ecosystem and the way mobile apps are made, played and paid for. And it’s not all good.
The Problem With App Rights
Two weeks ago, Rep. Hank Johnson (D-GA) released the APPS Rights Act, a bill pushing developers to implement self-regulatory practices that would improve the security and transparency of user data in mobile apps. “This bill would require that app developers maintain privacy policies, obtain consent from consumers before collecting data, and securely maintain the data that they collect,” Johnson’s office writes online.
There’s no question that changes are needed. Mobile users must be able to make sure their information isn’t transmitted and sold to third-party vendors. But like similar regulatory efforts, including the do-not-track mobile privacy guidelines laid out by the Federal Trade Commission last Friday, and last month’s recommendations to the mobile industry from California Attorney General Kamala Harris, there are both good and bad aspects to the specific approach taken by the APPS Rights Act. And unfortunately, there’s plenty of bad.
One problem with these guidelines is that they are penned by people outside the industry, who are often in the dark about the best ways to reach their laudable goals. Harris’ recommendations and the FTC’s guidelines amounted to a slew of unenforceable suggestions. The APPS bill, meanwhile, would become a mandate if adopted, one likely to have unintended consequences for the mobile marketplace.
Developers Are Worried
Security expert Dan Kaminsky says the slow, muddled legislative process can create frameworks bearing “no resemblance to the problems that need to be solved.” Kaminsky thinks this could lead to applications having to show users exactly what they’re doing via a hardware add-on, akin to webcams having a light that turns on to ensure people are aware the camera is active.
“What I fear is you won’t be able to write code without having to consult a lawyer,” he says. And if that happens, Kaminsky adds, developers are likely to move away from making mobile apps and return to building websites.
Beyond subjecting users to long, complex terms-of-use agreements, the bill doesn’t do a good job of specifying what happens to collected data once it moves beyond third parties, says Joe Santilli, the chief executive of the mobile app certification service SafeApp. This gray area is known as data retention.
“It really doesn’t make any provisions whatsoever for how third parties are going to share the data with so-called fourth or fifth parties,” Santilli explains. “For example, a marketing partner of an ad network. These people are going to share the data that they cull from these apps… to fourth and fifth parties.”
No one knows how long personal data will be stored, what rights users have, or how they can exercise those rights when dealing with third and fourth parties. The APPS bill’s withdrawal-of-consent provision is a weak attempt at stemming the data flow. Its Opt Out of App Use function requires developers to delete all data if a user opts out. But that doesn’t address the issue of fourth and fifth parties that may already have the data in question:
“By the time the app developer has seen this request from the user, this data has already been shared by the third party (to) the marketing partners, the ad networks, the ad analytics partners,” Santilli says. “At this point you can’t really put the genie back in the bottle, can you?”
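Santilli’s objection can be made concrete with a short sketch. This is purely illustrative, not code from the bill or from any real app: the class names and fields are hypothetical, and the point is only that a developer-side deletion cannot recall copies that were already forwarded downstream.

```python
# Hypothetical sketch of the bill's "Opt Out of App Use" requirement.
# All names here are illustrative; the bill specifies behavior, not code.

class AppDataStore:
    def __init__(self):
        self.user_data = {}           # data the app developer itself holds
        self.partner_copies = {}      # copies already sent to third parties

    def collect(self, user_id, record):
        """Collect a record and immediately share it, as many apps do."""
        self.user_data.setdefault(user_id, []).append(record)
        # Sharing typically happens long before any opt-out arrives.
        self.partner_copies.setdefault(user_id, []).append(record)

    def opt_out(self, user_id):
        """Honor an opt-out: delete what the developer holds."""
        self.user_data.pop(user_id, None)
        # The developer has no handle on downstream (fourth/fifth-party)
        # copies; returns True if copies are still out there.
        return user_id in self.partner_copies


store = AppDataStore()
store.collect("alice", {"location": "37.77,-122.41"})
copies_remain = store.opt_out("alice")
print("alice" in store.user_data)   # the developer's copy is gone
print(copies_remain)                # but partner copies persist
```

The gap the sketch highlights is exactly Santilli’s: the mandate reaches only the party the user interacts with, while the data has already fanned out.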
At the same time, having to meet these requirements could kill the drive of young entrepreneurs, says developer Jad Meouchy. “This act will end up creating a barrier for new startups… by doubling development time and creating data management headaches,” he predicts. “When you’re an indie developer, there are simply not enough resources to address this kind of compliance.”
Real-World Example
Benjamin Goering, the technical product manager at Livefyre Labs, manages more than 10 million comment threads and personal user accounts for customers. When those customers upgrade from freemium accounts to enterprise versions, they want their user data and accounts migrated. But if those people have not authorized that data to be shared, Livefyre can’t make the transition for them.
Rather than worrying about stifled innovation, Goering worries that users and developers won’t take rules seriously if they don’t work. “It may be completely ignored if it’s out of touch,” Goering said. “If it’s well legislated, it may be useful to have a framework for safe harbor” where developers can be confident they won’t get sued.
His team faced that issue when working on a Super Bowl product that aggregates tweets and Instagram photos. This raises the question of whether or not users know shared content is ripe for the plucking. Livefyre bet that users know their shared content may be re-used, and decided not to worry about legal red tape.
Goering warns that if developers have to wait for lawmakers to resolve everything, “it would be impossible to make week-long projects.”
“The nature of the Web is you’re requesting a document and receiving it – at some level data is being taken,” he says. “Where do you draw that line?”