Add bearcaps section

Christopher Lemmer Webber 2019-07-24 15:12:49 -04:00
parent 812c7dde2d
commit c5b4ad2ec2


@@ -964,7 +964,7 @@ The easiest and simplest way to implement ocaps would be to use
simple but statistically unguessable "Capability URLs".
For example:
: https://social.example/obj/sXJ9WWj6LRLCggZrjzfaeDutb8352OqSR0m2yg8XBkA
Having the address both brings you to the corresponding object
and gives you access to it.
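To make "statistically unguessable" concrete, here is a minimal sketch
(in Python, with hypothetical names) of how a server might mint such a
capability URL from a cryptographically secure random source:
#+BEGIN_SRC python
import secrets

def mint_capability_url(base="https://social.example/obj"):
    # 32 bytes from the OS CSPRNG, URL-safe base64 encoded: far too
    # much entropy for an attacker to guess or brute-force.
    return f"{base}/{secrets.token_urlsafe(32)}"
#+END_SRC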
@@ -993,11 +993,91 @@ logs.
*** Ocaps as bearcaps
One way we might improve this situation is to use [[https://github.com/cwebber/rwot9-prague/blob/bearcaps/topics-and-advance-readings/bearcaps.md][bearcaps]].
Here's one that's roughly equivalent to the previous one:
: bear:?u=https://social.example/obj&t=sXJ9WWj6LRLCggZrjzfaeDutb8352OqSR0m2yg8XBkA
Bearcaps are very similar to capability URLs in a sense; they also
don't separate designation from authority, but they glue the two
together out of two pieces:
- *the =u= query parameter*: The URL to make requests against
(in this case, =https://social.example/obj=)
- *the =t= query parameter*: The bearer authorization token to be
used when making this request (in this case,
=sXJ9WWj6LRLCggZrjzfaeDutb8352OqSR0m2yg8XBkA=)
This is then used to make a request:
#+BEGIN_SRC text
GET /obj HTTP/1.1
Host: social.example
Authorization: Bearer sXJ9WWj6LRLCggZrjzfaeDutb8352OqSR0m2yg8XBkA
#+END_SRC
Note that in this case, the URL doesn't actually tell you what object
you're referencing: in this particular usage, the bearer token is
responsible for both designation and authority.
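As an illustration, here is a minimal sketch (Python, standard library
only; the function name is ours, not part of any specification) of how
a client might split a bearcap into its =u= and =t= pieces and issue
the request shown above:
#+BEGIN_SRC python
from urllib.parse import urlsplit, parse_qs
from urllib.request import Request, urlopen

def fetch_bearcap(bearcap):
    """Resolve a bear: URI by requesting u with t as a bearer token."""
    split = urlsplit(bearcap)
    assert split.scheme == "bear"
    params = parse_qs(split.query)
    url = params["u"][0]    # where to send the request
    token = params["t"][0]  # bearer token granting the authority
    req = Request(url, headers={"Authorization": f"Bearer {token}"})
    return urlopen(req)     # issues a GET like the one above

# response = fetch_bearcap(
#     "bear:?u=https://social.example/obj"
#     "&t=sXJ9WWj6LRLCggZrjzfaeDutb8352OqSR0m2yg8XBkA")
#+END_SRC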
**** What this doesn't prevent (conflicts with browser assumptions)
We've successfully moved our secret designator someplace where web
browsers and web servers are less likely to leak it.
However, we should be careful to express limitations around what
we have just described.
The primary limitation with bearcaps, as with capability URLs,
actually comes from contemporary tooling.
While it would be possible to design a browser built with ocap
assumptions, contemporary browsers have not been built this way.
This leads to a couple of risky mismatches:
# - transparent or opaque? shoulder-surfing
# - slurp-it-up javascript
- The first is that we are open to "shoulder surfing" attacks.
Imagine you are viewing a page containing a capability URL (or
hovering over such a link) and someone takes a photo of your screen;
they can now type that URL in by hand and gain its authority.
- Second, any javascript that is loaded can scrape the page and
gain access to all your capability URLs or bearcaps.
[[https://arstechnica.com/information-technology/2019/07/dataspii-inside-the-debacle-that-dished-private-data-from-apple-tesla-blue-origin-and-4m-people/][This has happened]], and arguably happens every day for most people;
services like Google Analytics operate by "watching over the user's
shoulder".
In a sense, we can see that this is the same attack as above, but
for code supplied by a webpage or extension.
The solutions to these are similar.
It is unlikely that we can change browser behavior for URIs using
the =http:= or =https:= schemes.
However, browsers do not even accept or know how to use the =bear:=
URI scheme.
In its standardization, we could specify a requirement that clients
treat =bear:= URLs as opaque.
For example, in response to the first of the two problems we
identified above, we could demand that the "full" bearcap not be
exposed (exposing the URL component might be fine) without an explicit
action (such as right-clicking on the link and saying "expose link").
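As a sketch of what treating bearcaps as opaque could look like in a
client (all names here are illustrative, not specified behavior), the
token is only revealed after an explicit user action:
#+BEGIN_SRC python
from urllib.parse import urlsplit, parse_qs

def display_bearcap(bearcap, expose=False):
    # By default only the request URL is shown; the UI would pass
    # expose=True only after an explicit "expose link" action.
    if expose:
        return bearcap
    params = parse_qs(urlsplit(bearcap).query)
    return f"bear:?u={params['u'][0]}&t=<hidden>"
#+END_SRC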
The solution to the second problem is very similar once we realize
that browsers made the perimeter-security-is-eggshell-security mistake.
(And now we understand why dealing with CORS headers is such a
headache!)
Except... "solving" this problem would mean brining explicit ocap type
security to the web in general, meaning that extensions could not
automatically reach in and scrape an entire page by default, for
instance.
We might be able to create a wrapper around solitaire, but fixing the
current generation of webpage and associated javascript deployment
assumptions is a migraine of the scarcely-possible.
However, there's good news: there are plenty of uses of the web
that don't involve contemporary web browsers at all, such as
programmatic API usage.
We can still use either capability URLs or bearcaps for API endpoints,
as opposed to links that are intended for human viewing.
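For such API-only endpoints, the server-side check can stay small.
Here is a minimal, framework-free sketch (the token store and names
are hypothetical) of an endpoint resolver that derives both
designation and authority from the bearer token alone:
#+BEGIN_SRC python
import hmac

# Hypothetical store mapping bearer tokens to the objects they grant.
OBJECTS_BY_TOKEN = {
    "sXJ9WWj6LRLCggZrjzfaeDutb8352OqSR0m2yg8XBkA":
        {"type": "Note", "content": "hello"},
}

def resolve_api_request(headers):
    """Return the object a valid bearer token designates, else None."""
    auth = headers.get("Authorization", "")
    if not auth.startswith("Bearer "):
        return None
    presented = auth[len("Bearer "):]
    for token, obj in OBJECTS_BY_TOKEN.items():
        # Constant-time comparison avoids timing side channels.
        if hmac.compare_digest(presented, token):
            return obj
    return None
#+END_SRC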
*** Ocaps meet ActivityPub objects/actors
@@ -1012,8 +1092,6 @@ logs.
* Limitations
* Future work
** Petnames