Tag: privacy

A positive look at Me2B

Somehow Martin Geddes and I were both at PIE2017 in London a few days ago and missed each other. That bums me out, because nobody in tech is more thoughtful and deep than Martin, and it would have been great to see him there. Still, we have his excellent report on the conference, which I highly recommend.

The theme of the conference was #Me2B, a perfect synonym (or synotag) for both #VRM and #CustomerTech, and hugely gratifying for us at ProjectVRM. As Martin says in his report,

This conference is an important one, as it has not sold its soul to the identity harvesters, nor rejected commercialism for utopian social visions by excluding them. It brings together the different parts and players, accepts the imperfection of our present reality, and celebrates the genuine progress being made.

Another pull-quote:

…if Facebook (and other identity harvesting companies) performed the same surveillance and stalking actions in the physical world as they do online, there would be riots. How dare you do that to my children, family and friends!

On the other hand, there are many people working to empower the “buy side”, helping people to make better decisions. Rather than identity harvesting, they perform “identity projection”, augmenting the power of the individual over the system of choice around them.

The main demand-side commercial opportunities at the moment are applications like price-comparison shopping. In the not-too-distant future it may transform how we eat, and drive a “food as medicine” model, paid for by life insurers to reduce claims.

The core issue is “who is my data empowering, and to what ends?”. If it is personal data, then there needs to be only one ultimate answer: it must empower you, and to your own benefit (where that is a legitimate intent, i.e. not fraud). Anything else is a tyranny to be avoided.

The good news is that these apparently irreconcilable views and systems can find a middle ground. There are technologies being built that allow for every party to win: the user, the merchant, and the identity broker. That these appear to be gaining ground, and removing the friction from the “identity supply chain”, is cause for optimism.

Encouraging technologies that enable the individual to win is what ProjectVRM is all about. Same goes for Customer Commons, our nonprofit spin-off. Nice to know others (especially ones as smart and observant as Martin) see them gaining ground.

Martin also writes,

It is not merely for suppliers in the digital identity and personal information supply chain. Any enterprise can aspire to deliver a smart customer journey using smart contracts powered by personal information. All enterprises can deliver a better experience by helping customers to make better choices.

True.

The only problem with companies delivering better experiences by themselves is that every one of them is doing it differently, often using the same back-end SaaS systems (e.g. from Salesforce, Oracle, IBM, et al.).

Customers need their own standard ways to change personal data settings (e.g. name, address, credit card info), call for support, and supply useful intelligence to any of the companies they deal with, and to do any of those in one move.
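To make “one move” concrete, here is a minimal Python sketch. Everything in it is hypothetical: no such standard endpoint exists today. The point is only the shape of the idea, that a single update from the customer fans out to every company she deals with.

```python
from dataclasses import dataclass, field

@dataclass
class Company:
    """Stand-in for one company's (hypothetical) standard profile endpoint."""
    name: str
    profile: dict = field(default_factory=dict)

    def receive_update(self, change: dict) -> None:
        self.profile.update(change)

def broadcast_change(change: dict, companies: list) -> None:
    """The customer's one move: the same change, sent to every company."""
    for company in companies:
        company.receive_update(change)

# One change of address reaches every company the customer deals with.
companies = [Company("bank"), Company("airline"), Company("utility")]
broadcast_change({"address": "21 New Street"}, companies)
print([c.profile["address"] for c in companies])
```

The design point is scale on the customer's side: the customer writes the change once; the per-company plumbing is the companies' problem, not hers.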

See, just as companies need scale across all the customers they deal with, customers need scale across all the companies they deal with. I visit the possibilities for that here, here, here, and here.

On the topic of privacy, here’s a bonus link.

And, since Martin takes a very useful identity angle in his report, I invite him to come to the next Internet Identity Workshop, which Phil Windley, Kaliya @IdentityWoman and I put on twice a year at the Computer History Museum. The next, our 26th, is 3-5 April 2018.


The Castle Doctrine


The Castle doctrine has been around a long time. Cicero (106–43 BCE) wrote, “What more sacred, what more strongly guarded by every holy feeling, than a man’s own home?” In Book 4, Chapter 16 of his Commentaries on the Laws of England, William Blackstone (1723–1780 CE) added, “And the law of England has so particular and tender a regard to the immunity of a man’s house, that it stiles it his castle, and will never suffer it to be violated with impunity: agreeing herein with the sentiments of ancient Rome…”

Since you’re reading this online, let me ask, what’s your house here? What sacred space do you strongly guard, and never suffer to be violated with impunity?

At the very least, it should be your browser.

But, unless you’re running tracking protection in the browser you’re using right now, companies you’ve never heard of (and some you have) are watching you read this, and eager to use or sell personal data about you, so you can be delivered the human behavior hack called “interest based advertising.”

Shoshana Zuboff, of Harvard Business School, has a term for this: surveillance capitalism, defined as “a wholly new subspecies of capitalism in which profits derive from the unilateral surveillance and modification of human behavior.”

Almost across the board, advertising-supported publishers have handed their business over to adtech, the surveillance-based (they call it “interactive”) wing of advertising. Adtech doesn’t see your browser as a sacred personal space, but instead as a shopping cart with ad space that you push around from site to site.

So here is a helpful fact: we don’t go anywhere when we use our browsers. Our browser homes are in our computers, laptops and mobile devices. When we “visit” a web page or site with our browsers, we actually just request its contents, using the Hypertext Transfer Protocol (http, or https in its secure form).

In no case do we consciously ask to be spied on, or abused by content we didn’t ask for or expect. That’s why we have every right to field-strip out anything we don’t want when it arrives at our browsers’ doors.
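That field-stripping is mechanical: a page arrives as text, and anything addressed to a third party you don’t trust can simply be dropped before rendering. Here is a toy Python sketch of the idea; the blocklist hosts are invented, and real blockers (uBlock Origin, Apple content blockers) use far richer rule formats than this.

```python
import re
from urllib.parse import urlparse

# Hypothetical third-party hosts the user has chosen not to admit.
BLOCKLIST = {"tracker.example", "ads.example"}

def field_strip(html: str, blocklist=BLOCKLIST) -> str:
    """Drop <script src=...> tags whose host is on the personal blocklist."""
    def keep(match):
        host = urlparse(match.group(1)).hostname or ""
        return "" if host in blocklist else match.group(0)
    return re.sub(r'<script[^>]+src="([^"]+)"[^>]*></script>', keep, html)

page = ('<p>article text</p>'
        '<script src="https://tracker.example/spy.js"></script>'
        '<script src="https://cdn.example/lib.js"></script>')
print(field_strip(page))
```

The content you asked for survives; the payload you didn’t ask for never runs.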

The castle doctrine is what hundreds of millions of us practice when we use tracking protection and ad blockers. It is what called the new Brave browser into the marketplace. It’s why Mozilla has been cranking up privacy protections with every new version of Firefox. It’s why Apple’s new content blocking feature treats adtech the way chemo treats cancer. It’s why respectful publishers will comply with CHEDDAR. It’s why Customer Commons is becoming the place to choose No Trespassing signs potential intruders will obey. And it’s why #NoStalking is a good deal for publishers.

The job of every entity I named in the last paragraph — and every other one in a position to improve personal privacy online — is to bring as much respect to the castle doctrine in the virtual world as we’ve had in the physical one for more than two thousand years.

It should help to remember that it’s still early. We’ve only had commercial activity on the Internet since April 1995. But we’ve also waited long enough. Let’s finish making our homes online the safe places they should have been in the first place.


VRM Day: Let’s talk UMA and terms

VRM Day and IIW are coming up in October: VRM Day on the 26th, and IIW on the 27th-29th. As always, both are at the Computer History Museum in the heart of Silicon Valley. Also as always, we would like to focus VRM Day on issues that will be discussed and pushed forward (by word and code) on the following days at IIW.

I see two.

The first is UMA, for User-Managed Access. UMA is the brainchild of Eve Maler, one of the most creative minds in the digital identity field. (And possibly its best singer as well.) The site explains, “User-Managed Access (UMA) is an award-winning OAuth-based protocol designed to give a web user a unified control point for authorizing who and what can get access to their online personal data, content, and services, no matter where all those things live on the web. Read the spec, join the group, check out the implementations, follow us on Twitter, like us on Facebook, get involved!”

Which a number of us in the #VRM community already are — enough, in fact, to lead discussion on VRM Day.

In Regaining Control of Our Data with User-Managed Access, Phil Windley calls VRM “a perfect example of the kind of place where UMA could have a big impact. VRM is giving customers tools for managing their interactions with vendors. That sounds, in large part, like a permissioning task. And UMA could be a key piece of technology for unifying various VRM efforts.”
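To make that “permissioning task” concrete, here is a deliberately simplified Python model of UMA’s division of labor: the individual sets policy in one place (the authorization server), and every resource server defers to it. The class and method names are illustrative only; they are not the actual UMA wire protocol, which involves permission tickets, claims gathering, and requesting-party tokens.

```python
class AuthorizationServer:
    """The individual's single control point for access policy."""
    def __init__(self):
        self.policies = {}  # (resource, requester) -> set of allowed scopes

    def set_policy(self, resource, requester, scopes):
        self.policies[(resource, requester)] = set(scopes)

    def issue_token(self, resource, requester, scopes):
        """Grant a token only if every requested scope is allowed."""
        allowed = self.policies.get((resource, requester), set())
        if set(scopes) <= allowed:
            return {"resource": resource, "scopes": set(scopes)}
        return None  # real UMA would start a claims-gathering flow here

class ResourceServer:
    """Holds the data, but defers authorization to the individual's AS."""
    def serve(self, token, resource, scope):
        return (token is not None and token["resource"] == resource
                and scope in token["scopes"])

# Alice lets a vendor read her shipping address, and nothing more.
alice_as = AuthorizationServer()
alice_as.set_policy("alice/address", "vendor-x", {"read"})

store = ResourceServer()
ok = store.serve(alice_as.issue_token("alice/address", "vendor-x", {"read"}),
                 "alice/address", "read")
denied = store.serve(alice_as.issue_token("alice/history", "vendor-x", {"read"}),
                     "alice/history", "read")
print(ok, denied)
```

The unifying move is that policy lives with the person, not scattered across every vendor’s silo, which is exactly the consolidation Windley describes.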

For example, “Most of us hate seeing ads getting in the way of what we’re trying to do online. The problem is that even with the best “targeting” technology, most of the ads you see are wasted. You don’t want to see them. UMA could be used to send much stronger signals to vendors by granting permission for them to access information [that] would let them help me and, in the process, make more money.”

We call those signals “intentcasting.”

Yet, even though our wiki lists almost two dozen intentcasting developers, all of them roll their own code. As a result, all of them have limited success. This argues for looking at UMA as one way they can substantiate the category together.

A large amount of activity is going into UMA and health care, which is perhaps the biggest VRM “vertical.” (Since it involves all of us, and what matters most to our being active on the planet.)

The second topic is terms. These can take two forms: ones individuals can assert (which on the wiki we call EmanciTerm); and truly user- and customer-friendly ones sites and services can assert. (Along with truly agreeable privacy policies on both sides.)

At last Fall’s VRM Day, we came up with one possible approach, which looked like this on the whiteboard:

This was posted on Customer Commons, which is designed to serve the same purpose for individual terms as Creative Commons does for individual artists’ copyright terms. We can do the same this time.
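A term like the ones on that whiteboard only scales if it is machine-readable, so a site can check agreement automatically, the way software already matches Creative Commons licenses. Here is a hypothetical Python sketch in the spirit of the #NoStalking idea; the field names and practice labels are invented for illustration, not drawn from any published Customer Commons format.

```python
# A personal term the individual asserts, expressed as data.
MY_TERMS = {
    "term": "#NoStalking",
    "allowed": {"contextual-ads"},        # ads keyed to page content only
    "forbidden": {"third-party-tracking", "behavioral-ads"},
}

def site_agrees(site_practices, terms=MY_TERMS):
    """A site can accept the term iff it uses no forbidden practice."""
    return not (set(site_practices) & terms["forbidden"])

print(site_agrees({"contextual-ads"}))                    # respectful publisher
print(site_agrees({"contextual-ads", "behavioral-ads"}))  # adtech publisher
```

Because the term is data rather than prose, agreement can be checked in one line at request time, on the individual’s side as well as the site’s.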

Lately Meeco has come out with terms individuals can set. And there are others in the works as well. (One in particular will be of special interest, but it’s not public yet. I expect it will be, by VRM Day.)

So be sure to register soon. Space is limited.

Bonus links/tweets: here and here.


Do we have to “trade off” privacy?

Look up privacy trade-offs and you’ll get more than 150,000,000 results. The assumption in many of those is that privacy is something one can (and often should) trade away. Also that privacy trading is mostly done with marketers and advertisers, the most energetic of which take advantage of social media.

I don’t think this has to be so.

One example of a trade-off story is this one on public radio’s Marketplace program, which I heard this evening. It begins with the case of Shea Sylvia, a FourSquare user who got creeped out by an unwelcome call from a follower who knew her location. Marketplace’s Sally Herships says,

There are millions of Sylvias out there, giving away their private information for social reasons. More and more, they’re also trading it in for financial benefits, like coupons and discounts. Social shopping websites like Blippy and Swipely let shoppers post about what they buy. But first they turn over the logins to their e-mail accounts or their credit card numbers, so their purchases can be tracked online.

Later, there’s this (the voice is Herships again):

Alessandro Acquisti researches the economics of privacy at Carnegie Mellon, and he says the value we put on privacy can easily shift. In other words, if giving away your credit card information or even your location in return for a discount or a deal seems normal, it must be OK.

ALESSANDRO ACQUISTI: Five years ago, if someone told you that there’d be lots of people going online to show, to share with strangers their credit card purchases, you probably would have been surprised, you probably would have thought, “No, I can’t believe this. I wouldn’t have believed this.”

But Acquisti says, when new technologies are presented as the norm, people accept them that way. Like social shopping websites.

HERSHIPS: So the more we use sites like Blippy, the more we’ll use sites like Blippy?

ACQUISTI: Or Blippy 2.0.

Which Acquisti says will probably be even more invasive, because as time passes, we’re going to care less and less about privacy.

Back in Kansas City Shea Sylvia is feeling both better and worse. She thinks the phone call she got that night at the restaurant was probably a prank. But it was a wake up call.

What we’re dealing with here is an evanescent norm. A fashion. A craze. I’ve indulged in it myself with FourSquare, and at one point was the “mayor” of ten different places, including the #77 bus on Mass Ave in Cambridge. (In fact, I created that location.) Gradually I came to believe that it wasn’t worth the hassle of “checking in” all over the place, and that it was worth nothing to know that Sally was at the airport, or Bill was teaching a class, or Mary was bored waiting in some check-out line, much as I might like all those people. The only time FourSquare came in handy was when a friend intercepted me on my way out of a stop in downtown Boston, and even then it felt strange.

The idea, I am sure, is that FourSquare comes to serve as a huge central clearing house for contacts between companies selling stuff and potential buyers (that’s you and me) wandering about the world. But is knowing that a near-infinite number of sellers can zero in on you at any time a Good Thing? And is the assumption that we’re out there buying stuff all the time not so wrong as to be insane?

Remember that we’re the product being sold to advertisers. The fact that our friends may be helping us out might be cool, but is that the ideal way to route our demand to supply? Or is it just one that’s fun at the moment but in the long term will produce a few hits but a lot of misses—some of which might be very personal, as was the case with Shea Sylvia? (Of course I might be wrong about both assumptions. What I’m right about is that FourSquare’s business model will be based on what they get from sellers, not from you or me.)

The issue here isn’t how much our privacy is worth to the advertising mills of the world, or to intermediaries like FourSquare. It’s how we maintain and control our privacy, which is essentially priceless—even if millions of us give it away for trinkets or less. Privacy is deeply tied with who we are as human beings in the world. To be fully human is to be in control of one’s self, including the spaces we occupy.

An excellent summary of our current privacy challenge is this report by Joy L. Pritts (developed as part of a health sciences policy development process at the Institute of Medicine, the health arm of the National Academy of Sciences). It sets context with these two quotes:

“The makers of the Constitution conferred the most comprehensive of rights and the right most valued by all civilized men—the right to be let alone.”

—Justice Louis Brandeis (1928)

“You already have zero privacy anyway. Get over it.”

—Scott McNealy, Chairman and CEO of Sun Microsystems (1999)

And, in the midst of a long, thoughtful and well-developed case, it says this (I’ve dropped the footnotes, which are many):

Privacy has deep historical roots. References to a private domain, the private or domestic sphere of family, as distinct from the public sphere, have existed since the days of ancient Greece.  Indeed, the English words “private” and “privacy” are derived from the Latin privatus, meaning “restricted to the use of a particular person; peculiar to oneself, one who holds no public office.” Systematic evaluations of the concept of privacy, however, are often said to have begun with the 1890 Samuel Warren and Louis Brandeis article, “The Right to Privacy,” in which the authors examined the law’s effectiveness in protecting privacy against the invasiveness of new technology and business practices (photography, other mechanical devices and newspaper enterprises). The authors, perhaps presciently, expressed concern that modern innovations had “invaded the sacred precincts of private and domestic life; and . . . threatened to make good the prediction that ‘what is whispered in the closet shall be proclaimed from the house-tops.’” They equated the right of privacy with “the right to be let alone” from these outside intrusions.

Since then, the scholarly literature prescribing ideal definitions of privacy has been “extensive and inconclusive.” While many different models of privacy have been developed, they generally incorporate concepts of:

  • Solitude (being alone)
  • Seclusion (having limited contact with others)
  • Anonymity (being in a group or in public, but not having one’s name or identity known to others; not being the subject of others’ attention)
  • Secrecy or reserve (information being withheld or inaccessible to others)

In essence, privacy has to do with having or being in one’s own space.

Some describe privacy as a state or sphere where others do not have access to a person, their information, or their identity. Others focus on the ability of an individual to control who may have access to or intrude on that sphere. Alan Westin, for example, considered by some to be the “father” of contemporary privacy thought, defines privacy as “the claim of individuals, groups or institutions to determine for themselves when, how and to what extent information about them is communicated to others.” Privacy can also be seen as encompassing an individual’s right to control the quality of information they share with others.

In the context of personal information, concepts of privacy are closely intertwined with those of confidentiality and security. Privacy addresses “the question of what personal information should be collected or stored at all for a given function.” In contrast, confidentiality addresses the issue of how personal data that has been collected for one approved purpose may be held and used by the organization that collected it, what other secondary or further uses may be made of it, and when the permission of the individual is required for such uses. Unauthorized or inadvertent disclosures of data are breaches of confidentiality. Informational security is the administrative and technological infrastructure that limits unauthorized access to information. When someone hacks into a computer system, there is a breach of security (and also potentially, a breach of confidentiality). In common parlance, the term privacy is often used to encompass all three of these concepts.

Take any one of these meanings, or understandings, and be assured that it is ignored or violated in practice by large parts of today’s online advertising business—for one simple reason I learned long ago: individuals have no independent status on the Web. Instead we have dependent status. Our relationships (and we have many) are all defined by the entities with which we choose to relate via the Web. All those dependencies are silo’d in the systems of sellers, schools, churches, government agencies, social media, associations, whatever. You name it. You have to deal with all of them separately, on their terms, and in their spaces. Those spaces are not your spaces. (Even if they’re in a place whose name begins with the first person possessive pronoun. Isn’t it weird to have somebody else using that pronoun for you? It will be interesting to see how retro that will seem after it goes out of fashion.)

What I’m saying here is that, on the Web, we do all our privacy-trading in contexts that are not out in the open marketplace, much less in our own private spaces (by any of the above definitions). They’re all in closed private spaces owned by the other party—where none of the rules, none of the terms of engagement, are yours. In other words, these places can’t be private, in the sense that you control them. You don’t. And in nearly all cases (at least here in the U.S.), your “agreements” with these silos are contracts of adhesion that you can’t break or change, but the other party can—and often does.

These contexts have been so normative, for so long, that we can hardly imagine anything else, even though we have that “else” out here in the physical world. We live and sleep and travel and get along in the physical world with a well-developed understanding of what’s mine, what’s yours, what’s ours, and what’s none of those. That’s because we have an equally well-developed understanding of bounded spaces. These differ by culture. In one of her books, Polly Platt writes about how the French sense of comfortable distance from others is smaller than the American one. The French feel more comfortable getting close, and bump into each other more in streets, while Americans tend to want more personal space, and spread out far more when they sit. Whether she’s right about that or not, we actually have personal spaces on Earth. We don’t on the Web, or in Web’d spaces provided by others. (The Net includes more than the Web, but let’s not get into that here. The Web is big enough.)

So one reason that privacy trading is so normative is that dependency requires it. We have to trade it, if that’s what the sites we use want, regardless of how they use whatever we trade away.

The only way we can get past this problem (and it is a very real one) is to create personal spaces on the Web. Ones that we own and control. Ones where we set the terms of engagement. Ones where we decide what’s private and what’s not.

In the VRM development community we have a number of different projects and companies working on exactly this challenge. Some are pure open source. Others are open in many ways as well, and are working together to create (or put to use) common code, standards, protocols, terminologies and other conventions on which all of us can build privacy-supporting solutions. You’ll find links to some of the people involved in those efforts (among others) in Personal Data Stores, Exchanges, and Applications, a new post from Switchbook. (For more context, check out Iain Henderson’s unpacking of the topic.) There’s also our own work at ProjectVRM, which has lately centered on developing legal tools for both individuals and companies. What matters most here is that a bunch of good developers are working on creating spaces online that are as natural, human, personal—and under personal control—as the ones we enjoy offline.

Once we have those, the need for privacy trade-offs won’t end. But they will begin to make the same kind of down-to-Earth sense they do in the physical world. And that will be a huge leap forward.

Geocasting

The following excerpts a recent ProjectVRM conversation on geocasting — the ability to share your location data with the world, how you could share it selectively, and how it could be abused.

A thread on privacy developed, as often happens in these discussions about the ongoing digitization of our thoughts, movements, and actions.

Privacy and VRM

In Privacy is Relative, my column in May’s Linux Journal, I wrote,

there are essentially two forms of privacy. One is the kind where you hide out. You minimize exposure by confining it to yourself. The other is where you trust somebody with your information.

In order to trust somebody, you need a relationship with them. You’re their spouse, friend, client or patient.

This isn’t so easy if you’re just a customer, or worse, a “consumer”. There the obligation is minimized, usually through call centers and other customer-avoidance mechanisms that get only worse as technology improves. Today, the call center wants to scrape you off onto a Web site or a chat system.

Minimizing human contact isolates your private information inside machines that have little interest in relating to you as a human being or in putting you in contact with a human being inside the company. Hence, your data is indeed safe—from you. It’s also safe from the assumption that this data might in any way also belong to you—meaning, under your control. It’s still private, but only on the company’s terms. Not on yours.

This mess can’t be fixed just by humanizing call centers. It can be fixed only by humanizing companies. This has to be done from both inside and out.

There isn’t enough room in a column like that to unpack that last statement. But there is in a thread like this one, if you’re game.

© 2024 ProjectVRM
