An online 'erasure service' for California minors – but can it work?

[Image: Pushing the button is more complicated than it looks. ©istockphoto.com/maxkabakov, Author provided]

Privacy is often thought of as the right to be left alone. Yet, our lives are embedded in relationships – with people, with corporations, with government, and with technological devices – that can’t be pursued without some amount of privacy loss.

Sometimes that loss is unexpected, large in scale, and unpredictable in impact. Donald Sterling thought he was having a private conversation. Edward Snowden revealed the government’s trove of data gathered from corporations. And everyday tech devices are ubiquitous collectors of our personal information – with analysts predicting 2.5 billion global smartphone users by 2015.

On the whole, consumers say they want their data to be left alone – but only sometimes. The New Yorker recently published a story about an artist who offered chocolate chip cookies during an arts festival in Brooklyn in exchange for personal information, such as a mother’s maiden name, home address, the last four digits of a Social Security number, or even fingerprints. Surprisingly, nearly 400 people gave up sensitive personal information in exchange for a cookie.

If our own behavior is inconsistent with preserving privacy, how can we expect laws to effectively protect it? This contradiction is particularly problematic for privacy laws that seek to balance the government’s interests in surveillance and protecting the country against terrorism with a citizen’s right to be left alone. And judges are noticing.

During a recent conference at Georgetown University Law Center, Judge Margaret McKeown of the US Court of Appeals for the Ninth Circuit reportedly offered the following view: “With much of US [Fourth Amendment] privacy law based on a reasonable expectation of privacy, it’s difficult … to define what that means when people are voluntarily sharing all kinds of personal information online.”

This contradiction is also problematic for privacy laws that seek to balance society’s interest in preserving and analyzing posted content with an individual’s right to information privacy.

The rules, regulations, and best practices for companies facing obligations to “erase” data are no more seamless than the imperfectly reconcilable desires of consumers. That is because different privacy requirements apply to different age groups under both federal and state law. And even within state law, there are so many qualifiers around what content does not need to be erased that it’s unclear whether the law will protect much privacy at all.

An “erasure service” for minors

Take, for example, California’s new Privacy Rights for Minors in the Digital World Act, which takes effect on January 1, 2015.

[Image: Should we erase that? www.shutterstock.com]

This new law gives minors, defined as users under the age of 18, the right to remove, or to request the removal of, content and other information, including images, from online postings and mobile applications. Although the new law protects only California minors, it applies broadly – including to companies outside of California.

More specifically, the law will apply to any company that owns a website or mobile application that is either: (i) directed to minors – meaning it was created to reach an audience predominantly composed of minors; or (ii) directed to a general audience, if the company has actual knowledge that a minor is using the site or application.

This new law, which essentially mandates an “erasure service for minors,” also requires that companies provide notice to minors that the removal service is available as well as clear instructions on how to use it.

Sounds straightforward. Yet, it’s not.

Protecting 13- to 18-year-olds

Longstanding privacy rules and regulations, such as the 1998 federal Children’s Online Privacy Protection Act (COPPA), have been designed to protect the privacy of minors within a certain age group – those under age 13.

COPPA gives parents control over what information is collected from their children online – for example, by requiring that companies obtain verifiable parental consent before collecting personal information and by essentially prohibiting companies from disclosing that information to third parties. COPPA also gives parents access to their children’s personal information so they may review or delete it.

The new California erasure law, however, is designed to protect the privacy of everyone under age 18. Former California Senator Darrell Steinberg, the sponsor of the law, proposed extending COPPA-like protections to teens under age 18 because he believed that these teens are prone to revealing personal information online before they comprehend the consequences.

Information privacy laws have long formed an irregular patchwork of federal and state rules and regulations. But the new California law will further complicate compliance for companies that want to balance a corporate or social interest in preserving and analyzing big data with an individual’s right to information privacy.

Come January 1, 2015, companies such as General Mills and McDonald’s will have to continue to comply with COPPA for users under the age of 13 while simultaneously working with a different privacy regime under the new California state law for California users under the age of 18.

Under COPPA, for example, companies must provide parents a mechanism to access and delete personal information about their children under the age of 13. Under the new California law, companies must provide users under 18 a mechanism to remove, or to request the removal of, content and information – but only content that those users have posted themselves.

Corporate compliance under the new California law is further complicated by several qualifiers around what content must be removed.

For example, the law does not require companies to remove content copied or posted by a third party. So if an Instagram user posts an embarrassing image of a fellow Instagram user who is fifteen years old, Instagram would not have to honor the fifteen-year-old’s request to remove the embarrassing image, because she did not post the image herself. And even if the embarrassing image was originally posted by the fifteen-year-old and then reposted by another Instagram user, Instagram would still not have to remove the reposted copy.

Confusing? That’s not all. The new law does not require companies to remove content posted by a minor if the minor was paid or otherwise compensated for it. Nor does it require companies to remove content they can de-identify so that the minor could no longer be identified with it. And it does not require companies to delete the “erased” data from their servers, so long as they remove it from their websites and mobile applications.
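To see how these qualifiers interact in practice, here is a rough, purely illustrative sketch of the triage logic a compliance team might encode when a removal request arrives. Every name in it is hypothetical, the rules are simplified to the summary above, and it is a sketch of the statute as described here – not an actual implementation or legal advice.

```python
# Hypothetical sketch of the removal-request triage described above.
# Field and function names are illustrative only; this simplifies the
# statute as summarized in this article and is not legal advice.
from dataclasses import dataclass


@dataclass
class RemovalRequest:
    requester_is_california_minor: bool   # California resident under age 18
    requester_posted_the_content: bool    # the minor posted this item herself
    content_is_third_party_repost: bool   # this copy was reposted by someone else
    minor_was_compensated: bool           # the minor was paid for the posting


def removal_required(req: RemovalRequest) -> bool:
    """Return True if, under the qualifiers summarized above, the site
    appears obliged to honor the removal request."""
    if not req.requester_is_california_minor:
        return False  # the law covers only California users under 18
    if not req.requester_posted_the_content:
        return False  # content posted by a third party is out of scope
    if req.content_is_third_party_repost:
        return False  # reposted copies need not be taken down
    if req.minor_was_compensated:
        return False  # paid or compensated postings are exempt
    # Even when removal is required, the company may instead de-identify the
    # content, and it need not purge the data from its servers so long as the
    # content no longer appears on its website or mobile application.
    return True


# Example: the fifteen-year-old asking Instagram to take down a photo that a
# classmate posted of her fails the second check, so no removal is owed.
print(removal_required(RemovalRequest(True, False, False, False)))  # -> False
```

Even this toy version leaves out the notice-and-instructions requirement and the “actual knowledge” test for general-audience sites, which is part of why compliance is harder than it sounds.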

California’s Privacy Rights for Minors in the Digital World Act is going to be, in other words, one difficult law to implement. And, as applied, it won’t necessarily result in any meaningful erasure of content, let alone provide enhanced privacy rights for the minors it was designed to protect.

As The New York Times reported in 2013, there are certainly good reasons to provide enhanced privacy rights to users between the ages of 13 and 18. Take this case as an example:

The rash revelations by a Texas teenager, Justin Carter, on Facebook last February — a threatened school shooting his family insists was sarcastic, made in the heat of playing a video game — landed him in a Texas jail on a felony terrorism charge for nearly six months.

However, when privacy rules and regulations lead to inconsistent outcomes, privacy rights can, as seen in the Instagram example, be compromised — even for those who only sometimes want to be left alone.


Lydia A. Jones is the founder and president of InSage, a consulting firm that provides strategic advice to businesses seeking to balance a commercial interest in leveraging data as a monetized asset with a consumer’s right to privacy.

Read more: http://theconversation.com/an-online-erasure-service-for-california-minors-but-can-it-work-35077
