I just got back from a few days down at The Internet Identity Workshop (IIW) in San Jose, thanks to some coaxing from my friend, Kaliya Hamlin. In this post, I thought I’d share some observations from the trip on the future of the web – and the future of society.
A Growing Interest in Trust
The issue of online trust is very hot right now, thanks to a number of recent, high-profile cases of identity and data theft. Software firms, banks and phone companies know there’s a lot more money to be made as the virtual world merges with the real world, but they’re also painfully aware of just how dependent this huge opportunity is on people having confidence that they can safely conduct transactions over the Internet.
That’s why the U.S. government recently announced a new initiative called The National Strategy for Trusted Identities in Cyberspace (NSTIC). The new program brings together government, industry and advocacy groups in order to build out what they’re calling an “Identity Ecosystem” in the private sector.
Trust Connects Us – and Our Software
Since getting back from IIW last night, I’ve been thinking more broadly about trust. I wrote recently in another piece called Trust and Networks that “trust is the lubricant that supports relationships and makes a network work” and that “networks are the connections that allow peers to work together.” At the time, the “networks of peers” I was thinking about were really networks of people. Since coming back from IIW though, I’m realizing that many of my conclusions about networks of peers are just as true for software as they are for people.
We are moving into a world of software networks; vast meshes of autonomous websites and applications that are increasingly dependent upon one another in order to do their jobs. Software developers sometimes call this “small pieces, loosely joined” – and it’s one of the things we get from an always-on, reasonably reliable communications line that makes it easy for one piece of software to talk to another. When you know that connection’s always there, you can turn to others for pieces of functionality you can’t provide yourself, or that you simply choose not to provide so that you can concentrate on the work you’re really good at.
Say, for example, that you’re on the Zazzle website and decide to buy a t-shirt with a 12″X12″ picture of your face on it. Zazzle trusts PayPal to handle financial transactions and momentarily hands PayPal control of its user experience so that you can authorize payment. We’re so used to these kinds of hand-offs we barely notice that there’s an exchange of trust here. But there is, and it’s one piece of software talking to another. One peer talking to another peer.
Now, payment is a big part of buying a shirt on Zazzle, and so Zazzle and PayPal are willing to invest resources into integrating their services for a nice, seamless experience for users. Getting two pieces of software talking to each other like this is relatively straightforward.
But in a world with lots of small pieces connecting with lots of other small pieces, the total possible connections explodes quickly and you need a different approach than custom-built connections. You need a standard way for the small pieces to connect and that’s precisely what we got with the OAuth protocol.
If you’ve spent much time using social media services, you’ve probably already had some direct experience with OAuth, even if you didn’t know that’s what you were using. You’re using OAuth whenever an app or website prompts you with a little pop-up to sign in via your Twitter or Facebook account.
The first time you ran into it, it probably made you a little confused or nervous, but once you eventually tried it and the world didn’t end, you began to trust it. The problem is that sometimes we trust these agreements just a little too much and simply click past the explanations of exactly what data we are agreeing to share with a third-party service. And even when I do look carefully at what Facebook tells me it’s about to share with some website I’m visiting, it doesn’t really sink in somehow. That’s why I found the I Shared What?!? website by Joe Andrieu so interesting. It’s a very healthy, and very tangible, reminder of just how much trust we’re putting in Facebook to steward our data in responsible ways.
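For the technically curious, that consent pop-up is the first step of an OAuth authorization flow: the app redirects you to the provider with a list of “scopes” naming exactly what data it wants, and the provider asks you to approve them. Here’s a minimal sketch of how an app might build that authorization URL, in the style of OAuth 2.0’s authorization-code grant. The endpoint, client ID and redirect URI are hypothetical placeholders, not real Twitter or Facebook values.

```python
from urllib.parse import urlencode
import secrets

# Hypothetical provider endpoint and client credentials -- placeholders only.
AUTHORIZE_URL = "https://provider.example/oauth/authorize"
CLIENT_ID = "my-app-id"
REDIRECT_URI = "https://myapp.example/callback"

def build_authorization_url(scopes):
    """Step 1 of the flow: send the user to the provider's consent screen.

    The `scope` parameter is where the "what am I agreeing to share?"
    question lives: it names exactly which pieces of your data the
    third-party app is asking to access.
    """
    state = secrets.token_urlsafe(16)  # anti-forgery token, checked on return
    params = {
        "response_type": "code",   # ask for an authorization code
        "client_id": CLIENT_ID,
        "redirect_uri": REDIRECT_URI,
        "scope": " ".join(scopes),
        "state": state,
    }
    return f"{AUTHORIZE_URL}?{urlencode(params)}", state

url, state = build_authorization_url(["profile", "email"])
```

If the user approves, the provider redirects back to the app with a short-lived code, which the app exchanges (server-to-server) for an access token limited to those scopes. The point of the design is that the app never sees your password; the trust exchange is explicit and bounded by the scopes you approved.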
Trust in a Box
The kind of trust we have with people is inherently fuzzy; not black or white, but shades of gray. Trust depends on context, and as social animals we’ve developed all kinds of tricks for knowing whether and when to trust someone. When we do, most of that work happens at an unconscious, intuitive level rather than through conscious, logical thought. Though it’s often prone to error, we usually rely on our gut to sense when someone’s telling us the truth.
Fuzzy logic and neural networks aside, software doesn’t tend to be very good at interpreting this kind of subtle nuance. So, when it comes to embedding trust into our software, we need to be explicit, specific and deliberate.
The social consequences of small pieces, loosely joined and the growing need to embed trust in our software are now transforming our abstract notions of ‘trust’ into something more explicit, specific and deliberate. Technology has reached a point where it can no longer advance on its current trajectory without our getting concrete about what we actually mean when we use a term like trust.
It turns out, embedding something as nuanced as trust into software is really complicated. The government has wisely decided this isn’t something it can operate itself, and that it must rely on the private sector to develop solutions nimble enough to handle this complexity.
The NSTIC initiative was heavily influenced by the IIW crowd, and uses something called a Trust Framework to ensure flexibility while still adhering to the government’s policy objectives. Here’s a quick animated overview of how NSTIC would work and here’s a separate high-level overview of the Trust Framework approach. Keep an eye on this space over the next several months, as I think there will be some interesting and exciting developments here, based on some of the noise I heard at IIW this week.
I’m still learning about identity and will be doing a lot more thinking and writing about it over this next year. My sense though is that NSTIC is a big deal that could set in motion some important changes in the way that software is developed.
It’s also important to note that these changes will be the direct result of a lot of hard work by IIW community members, a smart group of people working a couple of steps out front of most of us on these critically important societal issues. We’re lucky to have these folks around. They’re the people working behind the scenes to bring about a new vision for the way software connects with other software in ways that help keep our world a civil, decent place to live. I didn’t fully get that until spending three days hanging with this crowd.
Thanks for all you do, IIW. We owe you…
Image modified from original by Aidan Jones.