About a year ago I wrote an article arguing for creating the next generation of the OAuth protocol. That article, and some of the other writing around it, has been picked up recently, and so people have been asking me what’s the deal with XYZ, TxAuth, OAuth 3.0, and anything else mentioned there. As you can imagine, a lot has happened in the last year and we’re in a very different place.
The short version is that there is now a new working group in the IETF: Grant Negotiation and Authorization Protocol (GNAP). …
This article is part of a series about XYZ and how it works, also including articles on Why?, Handles, Interaction, and Compatibility.
OAuth 2 loves its bearer tokens. They’re a really useful construct because they are simple: if you have the token, you can do whatever the token is good for. It’s all the proof that you need: you can present it in exactly the format that you received it in, no cryptography involved.
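For instance, here's a minimal sketch (in Python with the requests library, against a made-up https://api.example/resource endpoint) of what presenting a bearer token looks like in practice: the client just plays back the value it was given in the Authorization header.

```python
import requests

# The magic value handed to the client by the authorization server.
# No signing, no hashing: possession of the string is the whole proof.
access_token = "8xLOxBtZp8"  # illustrative value only

# Present the token exactly as received, using the standard Bearer scheme.
response = requests.get(
    "https://api.example/resource",  # hypothetical protected resource
    headers={"Authorization": f"Bearer {access_token}"},
)
print(response.status_code)
```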
This simplicity makes it easy for client developers to get it right, most of the time. You give them a magic value, they put the magic…
This article is part of a series about XYZ and how it works, also including articles on Why?, Handles, Compatibility, and Cryptographic Agility.
When OAuth 1 was first invented, it primarily sought to solve the problem of one website talking to another website’s API. It could also be used with native applications, but it was awkward at best due to many of the assumptions built into the OAuth 1 model. To be fair, smartphones didn’t really exist yet, and the handful of clunky desktop apps could more or less be dealt with via a few hacks. In OAuth 2, we…
This article is part of a series about XYZ and how it works, also including articles on Why?, Handles, Interaction, and Cryptographic Agility.
XYZ is a novel protocol, and one of its goals is to move beyond what OAuth 2 can easily enable. One of the first and most important decisions I made was to not be backwards compatible with OAuth 2.
This is a bold choice: OAuth 2 is absolutely everywhere, and deliberately breaking with it will put any new protocol at a distinct deployment disadvantage. However, I think that the advantages in the flexibility, especially in…
This article is part of a series about XYZ and how it works, also including articles on Why?, Interaction, Compatibility, and Cryptographic Agility.
One comment I’ve gotten from several people reading the XYZ spec text and its surrounding documentation is about one of its core innovations. Namely, what’s with all the handles everywhere?
The XYZ protocol works by passing JSON objects around that represent different parts of the request and response. These parts include the keys the client is using to protect the request, the resources the client is asking for, the user information the client wants or knows about…
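To give a rough, illustrative sketch of how that plays out (the field names, endpoint, and values below are simplified stand-ins, not the exact spec syntax), a client might send the full objects on its first request and then substitute a server-issued handle string for one of them on a later request:

```python
import requests

TX_ENDPOINT = "https://as.example/transaction"  # hypothetical AS endpoint

# First request: each part of the request is sent by value as a JSON object.
first_request = {
    "resources": [{"actions": ["read"], "locations": ["https://api.example/photos"]}],
    "key": {"proof": "jwsd", "jwk": {"kty": "RSA", "e": "AQAB", "n": "..."}},
    "user": {"id_token": "..."},
}
first_response = requests.post(TX_ENDPOINT, json=first_request).json()

# The server can hand back a handle (an opaque string) for a part it has seen.
key_handle = first_response.get("key_handle")  # illustrative field name

# Later request: the client passes the handle in place of the full object.
second_request = {
    "resources": [{"actions": ["write"], "locations": ["https://api.example/photos"]}],
    "key": key_handle,
    "user": {"id_token": "..."},
}
second_response = requests.post(TX_ENDPOINT, json=second_request)
```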
This article is part of a series about XYZ and how it works, also including articles on Handles, Interaction, Compatibility, and Cryptographic Agility.
It’s been about a year and a half since I started in earnest on the XYZ project. I’ve talked with a variety of different people and companies about how it fits and doesn’t fit their use cases, I’ve presented it at a number of different conferences, and I’ve submitted that work to the IETF in the hopes of starting a new standards working group based on what we’ve built and learned.
With that in mind, I wanted…
A contract in the world of computer science defines what the inputs and outputs of a given function are allowed to be: what you can expect to get back when you hand over a certain set of inputs. This concept is fundamental to APIs, libraries, languages, protocols, and basically all portions of the technology world.
But the most important thing about contracts is that they allow you to fulfill the contract in whatever way you want to. As long as you’re abiding by the contract’s terms, you can do whatever makes the most sense within…
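As a quick, hypothetical illustration (the names below aren't from any particular spec), a contract in code might be an abstract interface with several interchangeable implementations behind it; callers depend only on the contract, not on which implementation they were handed:

```python
from abc import ABC, abstractmethod


class TokenStore(ABC):
    """The contract: what callers may ask for and what they get back."""

    @abstractmethod
    def save(self, token_id: str, value: str) -> None: ...

    @abstractmethod
    def load(self, token_id: str) -> str | None: ...


class InMemoryTokenStore(TokenStore):
    """One way to fulfill the contract: a plain dictionary."""

    def __init__(self) -> None:
        self._data: dict[str, str] = {}

    def save(self, token_id: str, value: str) -> None:
        self._data[token_id] = value

    def load(self, token_id: str) -> str | None:
        return self._data.get(token_id)


class FileTokenStore(TokenStore):
    """Another way: one file per token. Callers can't tell the difference."""

    def __init__(self, directory: str) -> None:
        self._directory = directory

    def save(self, token_id: str, value: str) -> None:
        with open(f"{self._directory}/{token_id}.txt", "w") as f:
            f.write(value)

    def load(self, token_id: str) -> str | None:
        try:
            with open(f"{self._directory}/{token_id}.txt") as f:
                return f.read()
        except FileNotFoundError:
            return None
```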
I’ve been working in the internet standards space for well over a decade at this point, and have authored and/or edited several of them myself along with contributing to many others. Many years ago, I had a realization about how standards function in the world that has helped me immensely in the long run. It really comes down to two parts:
At first, a technical standard seems like it’s a form of documentation…
If we’re going to build OAuth 3.0, we need to do the work somewhere. We could do this in the existing OAuth working group in the IETF, or we could start up a new working group. I have been thinking about these options, and I believe this work should be in its own separate standards space.
OAuth 1.0 started its life as an independent specification cooked up by a handful of really bright engineers (mostly in the back rooms of OSCON if I have my history right). Only later was it brought into the IETF to be made into a…
When I proposed that we move toward OAuth 3.0, one question kept coming back to me: Why now? OAuth 2.0 is incredibly successful, why would we do anything to imply that we need something else?
From my view, the answer comes down to a combination of a rising need for something different and, at the same time, the opportunity for a comprehensive solution to fill that need.
As we’ve already covered, OAuth 2.0 is coming up short in some of the ways people want to use it. We’re seeing a number of different extensions for…
Justin Richer is a security architect and freelance consultant living in the Boston area. To get in touch, contact his company: https://bspk.io/