
CCGWGDigest

Credentials Community Group - Digest July 2021 - July 2022

Decentralization

The course is available on the Open University's OpenLearn Create platform and is licensed under CC BY-NC-SA 4.0. Upon completion of the course, learners earn a free statement of participation.

Funding

NSF is introducing a new program called "Pathways to Enable Open-Source Ecosystems" (POSE).  The purpose of the program is to harness the power of open-source development for the creation of new technology solutions to problems of national and societal importance. Many NSF-funded research projects result in publicly accessible, modifiable, and distributable open-sourced software, hardware or data platforms that catalyze further innovation.

Human Rights

Good topic for CCG discussion and reading on the implications of a lot of the tech we are working on:

The Ford Foundation paper attached provides the references. However, this thread should not be about governance philosophy but rather a focus on human rights as a design principle as we all work on protocols that will drive adoption of W3C VCs and DIDs at Internet scale.

NFT

A data NFT represents the copyright (or exclusive license against copyright) for a data asset on the blockchain — we call this the “base IP”. When a user publishes a dataset in OceanOnda V4, they create a new NFT as part of the process. This data NFT is proof of your claim of base IP. Assuming a valid claim, you are entitled to the revenue from that asset, just like a title deed gives you the right to receive rent.

China is using #blockchain technology to manage #prisoners as if each #prisoner was an #NFT/token on the blockchain...

What we've found as a good framework is the concept of "Principal Authority" which comes from the Laws of Agency, which allows us to leverage fiduciary style Laws of Custom to define requirements for practices when digital identity is delegated to others (whether for authorization or for use of data).

I've written up a layman's article (as I am not a lawyer) introducing this topic at:

Spruce, MATTR, and Digital Bazaar have collaborated on creating an interoperability test suite for something we're calling the "Verifiable Driver's License" (temporary name):

  • The test suite demonstrates that a few things are possible in addition to what mDL provides:
  1. The mDL data model can be expressed cleanly using W3C Verifiable Credentials
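As a rough illustration of point 1, here is a minimal sketch of what an mDL-style credential could look like in the W3C VC data model (the claim names under `credentialSubject` are hypothetical, not the test suite's actual mDL mapping):

```python
# Hypothetical sketch: an mDL-style driving licence carried in a W3C
# Verifiable Credential envelope. The credentialSubject field names are
# illustrative only, not the actual mDL-to-VC mapping from the test suite.
mdl_vc = {
    "@context": ["https://www.w3.org/2018/credentials/v1"],
    "type": ["VerifiableCredential", "DriverLicenseCredential"],
    "issuer": "did:example:dmv",
    "issuanceDate": "2022-01-01T00:00:00Z",
    "credentialSubject": {
        "id": "did:example:holder",
        "family_name": "Doe",
        "given_name": "Jane",
        "document_number": "D12345678",
    },
}

def is_vc_shaped(doc: dict) -> bool:
    """Minimal structural check for the VC data model's core members."""
    return ("@context" in doc
            and "VerifiableCredential" in doc.get("type", [])
            and "issuer" in doc
            and "credentialSubject" in doc)

assert is_vc_shaped(mdl_vc)
```

The point of the exercise is only that the mDL's flat claim set fits cleanly into the standard VC envelope; a real mapping would follow the test suite's published vocabulary.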

It's sad and frustrating that this isn't based on verifiable credentials… it appears vendor lock-in is going to be hard to prevent.

For anyone who missed the November coverage about this, here's a pretty outrageous CNBC article: "Apple is sticking taxpayers with part of the bill for rollout of tech giant's digital ID card"

Yikes!

For those that didn't read the article, the TL;DR is:

Tough-to-forge digital driver's license is… easy to forge... 4 million mobile driver's licenses in NSW Australia compromised in an unrecoverable way.

Code

I know there are others out there, too, but these I've worked with.

interoperable specifications for QR-based air-gap cryptographic use cases that we call Universal Resources (aka "UR").

Our UR specifications are designed for the interoperable transmission and storage of a variety of kinds of information, but in particular cryptographic data, and we have an advanced QR and CBOR-based architecture. (For more information on this see URs: An Overview

To make it easier to implement our specs, we also make available open-source reference libraries and demo apps in our repos on GitHub.

...with new capabilities for coercing the Agent serviceEndpoint selector and Agent interface method selector (13 minutes).

I wanted to share another DID Web + JOSE + GitHub demo:

TL;DR - JWS linked to DIDs from a GitHub Action [...] this will also work for VCs.

I wanted to share some very recent (experimental and unstable) work we've done to enable Decentralized Identifiers and Verifiable Credentials to assist with the software supply chain.

The key idea is to enable github actions to sign and verify credentials that conform to the W3C Verifiable Credentials standard (which in turn supports various envelope formats including JOSE, COSE and PGP).
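To make the sign/verify step concrete, here is a self-contained sketch of the JWS compact serialization a CI job might emit. It uses HMAC-SHA256 purely so the example runs with the standard library alone; the workflow described above would use an asymmetric algorithm (e.g. EdDSA) whose public key is published in the did:web DID Document referenced by `kid`:

```python
import base64, hashlib, hmac, json

def b64url(data: bytes) -> str:
    """Base64url without padding, per RFC 7515."""
    return base64.urlsafe_b64encode(data).rstrip(b"=").decode()

def sign_vc_jws(vc: dict, secret: bytes, kid: str) -> str:
    # Sketch only: real deployments use an asymmetric key resolved via
    # the did:web document; HS256 keeps this example self-contained.
    header = {"alg": "HS256", "kid": kid}
    signing_input = f"{b64url(json.dumps(header).encode())}.{b64url(json.dumps(vc).encode())}"
    sig = hmac.new(secret, signing_input.encode(), hashlib.sha256).digest()
    return f"{signing_input}.{b64url(sig)}"

def verify_vc_jws(jws: str, secret: bytes) -> bool:
    signing_input, _, sig = jws.rpartition(".")
    expected = hmac.new(secret, signing_input.encode(), hashlib.sha256).digest()
    return hmac.compare_digest(b64url(expected), sig)

token = sign_vc_jws({"vc": "…"}, b"ci-secret", "did:web:example.com#key-1")
assert verify_vc_jws(token, b"ci-secret")
```

The verifier only needs the signing input and the key material named by `kid`, which is what makes the GitHub Action pattern work: the action signs, and any downstream consumer resolves the DID to verify.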

I wanted to share some updates I made to the github action we created for working with DIDs and VCs in GitHub Workflows.

  • Creating Container Revision VCs with DID Web in a GitHub Action
  • Uploading the VC-JWT for the signed revision as a label to GitHub Container Registry
  • Pulling the latest container registry tag and checking the vc for the revision.

Standardization

  • FYI: What makes a standard world class? Michael Herman (Trusted Digital Web) (Saturday, 14 August)

  •   A world class standard should have well-defined objectives that respond to real needs in a timely manner.

  •   Its technical content should be complete and accurate.

  •   It should be easy to understand (or as easy as the subject matter allows!) and easy to implement.

  •   Its requirements should be expressed clearly and unambiguously.

  •   It should be validated.

  •   It should be well-maintained.

Reference: A Guide To Writing World Class Standards

In the 17 years I worked at W3C, the formal objections were:

(1) "we [the objector] wanted to be on record as saying this but go ahead and publish" (the most common);

(2) we [the objector] have a product, or are about to ship a product, and the feature(s) in this spec would cause problems in the short term for our product, and that's more important to us than the Web (no one will ever admit to this but it's not uncommon);

(3) we object to this spec, we prefer another approach, so here's a bunch of fake objections to slow things down because we can't share our actual business strategy;

(4) we believe there's a technical problem with this spec, but we didn't notice it over the past four years despite a last call review (this one is actually rare but does happen).

Just a reminder that these "politics" and "other-ing" aren't some weird by-product of the "identity community", or DIF, or CCG, or OpenID... they're endemic in any long-lived community composed of human beings.

It's not something you're ever rid of... it's something you manage over time;

Procedure \ CCG

Notifications of messages to this mailing list (public-credentials) are now sent to our IRC channel (#ccg).

This process is open to anyone -- no W3C Membership dues, fees, etc. required to participate.

This is a friendly reminder that anyone in the community who is doing something interesting that you think the community should know about, whether that work is done here in the CCG or elsewhere, can email the chairs with what you want to share and we can get you on the calendar. It's best if you email all 3 chairs.

there are statements like: "Buy our products! We're the best!" (with nothing else that we can learn from) that is frowned upon... but, in general, even if it is a feature in one of your products, chances are that we want to hear about it if it has relevance to how we might interoperate on that feature (or use it to meet a goal of the community).


W3C

This major organizational overhaul to the W3C is also happening at a time of unprecedented activity and change for the internet. Will the web support crypto and Web3 industry proposals? How will the web support advertising? What should be the baseline web browser security standards?

"We designed the W3C legal entity in a way that keeps our core unchanged," said Dr. Jeff Jaffe, W3C CEO. "Our values-driven work remains anchored in the royalty-free W3C Patent Policy, and the W3C Process Document where we enshrined dedication to security, privacy, internationalization and web accessibility. W3C and its Members will continue to play a fundamental role in making the web work for billions of people."

Decentralized Identifiers (DID)

#didlang 0.3 includes support for round-robin, load-balanced DID Agent serviceEndpoint clusters. Here's a demo

This is the final step of the W3C global standardization process.

If you are a W3C Member, you can now vote to approve it as a global standard here:

  1.  Governments “lobbying” for single DID method and Non-Interoperability
  •   “tantek: concerned to hear that there are governments looking to adopt, with only single implementation methods and non interop, sounds like lobbying may have occurred, … advocating for single-implementation solutions that are centralized wolves in decentralized clothing”

  •   “ +1 to tantek's concern that governments are responding to lobbying attempts on non-interoperable methods”

  • Mozilla Formally Objects to DID Core  Drummond Reed (Thursday, 1 September)

Now, here's the REAL irony. Mozilla and others are pointing to the URI spec and existing URI schemes as the precedent without recognizing that in section 9.11 of the DID spec, we specifically compare the DID spec to the URN spec, RFC 8141. In fact we deliberately patterned the ABNF for DIDs after the ABNF for URNs—and patterned DID method names after URN namespaces. And we set up a registry exactly the same way RFC 8141 establishes a registry of URN namespaces.

Now: guess how many URN namespaces have been registered with IANA?

I don't see anyone complaining about interoperability of URN namespaces. And RFC 8141 was published over four years ago.

The motivation for verification relationships in the DID spec stems from the general security recommendation of "use separate keys for separate purposes".

You can see this at work in other specifications, such as JWKS (JSON Web Key Set), specifically in the 'use' (Public Key Use) parameter, from https://datatracker.ietf.org/doc/html/rfc7517#section-4.2
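A small sketch of that "separate keys for separate purposes" pattern as it appears in a JWK Set: each key advertises its intended purpose via `use`, and a consumer selects only the keys published for that purpose (key material is elided here for brevity):

```python
# Sketch of RFC 7517's "use" (Public Key Use) parameter enforcing
# key separation. Key material (crv/x/y etc.) is elided.
jwks = {
    "keys": [
        {"kty": "EC", "kid": "signing-1", "use": "sig"},
        {"kty": "EC", "kid": "encryption-1", "use": "enc"},
    ]
}

def keys_for(jwk_set: dict, purpose: str) -> list:
    """Select only the keys published for the given purpose."""
    return [k for k in jwk_set["keys"] if k.get("use") == purpose]

assert [k["kid"] for k in keys_for(jwks, "sig")] == ["signing-1"]
```

This mirrors the DID spec's verification relationships: a key listed for `authentication` is not thereby authorized for `keyAgreement`, just as a `sig` JWK is not an `enc` JWK.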

great to see that press release at https://www.w3.org/2022/07/pressrelease-did-rec.html.en

There's a testimonial from UNECE near the bottom.  I thought the community might be interested in the white paper from UNECE on VCs and DIDs for cross border trade - https://unece.org/trade/uncefact/guidance-material

This message is to inform the DID WG and CCG that the W3C intends to write a press release.

To that end, we are seeking testimonials about Decentralized Identifiers.

For an example of the sort of thing we're looking for, please see: https://www.w3.org/2019/03/pressrelease-webauthn-rec.html

The testimonials may be submitted as a reply to this email.

DID Methods

The publication of this DID Method specification realizes, in large part, a 4-year quest (or should I say personal mission) to create a platform to Tokenize Every Little Thing (ELT).

On 8/26/21 12:37 PM, Heather Vescent wrote:

  1. What are the pros of including did methods as work items in the CCG?

Community vetting and approval of particular DID Methods.

Basically, broader and deeper review of DID Methods that we expect to be of great use to the world. I expect there will be DID Methods that the community wants to eventually propose as DID Methods for standardization (did:key and did:web feel like two where we could get consensus on doing so).

Can't we pick just a small number of uncontroversial methods to standardise? Even if it's just did:key and did:web to start with.

The broader generalisation of this question is : "for trust anchors like governments that issue VCs to their constituents, what rules should govern which did:methods they should accept as the subject identifier for the VCs they issue?"  Are those rules context specific?

I'm not sure of the answer - but it's why did:ion was on my list - as an allowed subject of a government-issued VC - and as the issuer of trade documents. Should I take it off my list pending a bit more maturity (e.g. that Azure service goes out of beta into full production)? Or is it safe enough for this use case? If so, what others would also be "safe enough"?


DID:TAG

re: Using Email as an Identifier  Bob Wyman (Friday, 12 November)

My did:tag proposal is, I believe, the only proposed DID Method that addresses the use of email addresses and email as a resolution method

There are quite a number of issues with using email addresses as identifiers, or parts of identifiers, and I'm hoping that discussion and development of the did:tag method will illuminate those issues and potentially find solutions for them.

DID:WEB

We have had the same issue... per the did core spec, there are really 2 main key types; in our crypto libraries, for the key pair classes themselves, we do our best to support both and handle translation for you:

We then generate a DID Web DID Document from the public keys for the 3 children, and encode the ca chain from them back to the root using x5c.

We then issue a JWT from the private key for 1 of them.

We then verify the JWT signature using the public key.

We then check the x5c using OpenSSL to confirm the certificate chain.

My questions are:

  1. Is it possible to use JOSE to automate this further?

  2. Is there a better way of accomplishing this?

  3. Should the CA chain be pushed into the JWT?
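On question 3, one option the JOSE specs already provide is the `x5c` protected-header parameter (RFC 7515 §4.1.6), which carries the certificate chain inside the JWS itself so the verifier can walk it without an out-of-band fetch. A structural sketch (the certificate bytes here are placeholders, not real DER):

```python
import base64

# Sketch: carrying the CA chain in the JWS protected header via "x5c".
# Entries are standard (not URL-safe) base64 of the DER certificates,
# ordered end-entity first, root (or highest intermediate) last.
leaf_der, intermediate_der, root_der = b"leaf", b"intermediate", b"root"  # placeholders

protected_header = {
    "alg": "ES256",
    "kid": "did:web:example.com#key-1",
    "x5c": [base64.b64encode(der).decode()
            for der in (leaf_der, intermediate_der, root_der)],
}
assert len(protected_header["x5c"]) == 3
```

Whether to push the chain into the JWT is then a size/trust trade-off: `x5c` makes tokens self-contained but larger, while leaving the chain in the DID Document keeps tokens small at the cost of a resolution step.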

DID:JWK

DID:KEY

I published a did:key creator at

This has been tested to create did:keys from the P-256, P-384, and P-521 curves specified in https://github.com/w3c-ccg/did-method-key and https://w3c-ccg.github.io/did-method-key/.
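For readers unfamiliar with the mechanics, a did:key is just a multicodec-prefixed public key, base58btc-encoded with a leading "z". Here is a hedged stdlib sketch for the Ed25519 case (the NIST curves above follow the same pattern with different multicodec prefixes; the varint prefix 0xed 0x01 for ed25519-pub is from the multicodec table):

```python
# Hedged sketch of did:key construction for an Ed25519 public key.
# The P-256/P-384/P-521 variants differ only in the multicodec prefix.
B58_ALPHABET = "123456789ABCDEFGHJKLMNPQRSTUVWXYZabcdefghijkmnopqrstuvwxyz"

def base58btc(data: bytes) -> str:
    """Bitcoin-style base58 encoding, preserving leading zero bytes as '1'."""
    n = int.from_bytes(data, "big")
    out = ""
    while n:
        n, r = divmod(n, 58)
        out = B58_ALPHABET[r] + out
    return "1" * (len(data) - len(data.lstrip(b"\x00"))) + out

def did_key_ed25519(pubkey: bytes) -> str:
    assert len(pubkey) == 32
    # 0xed 0x01 is the varint multicodec prefix for ed25519-pub;
    # "z" marks base58btc in multibase.
    return "did:key:z" + base58btc(bytes([0xED, 0x01]) + pubkey)

# Ed25519 did:keys characteristically begin with "did:key:z6Mk".
assert did_key_ed25519(bytes(32)).startswith("did:key:z6Mk")
```

A real implementation would of course derive `pubkey` from an actual Ed25519 key pair rather than take raw bytes.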

The DID Document generation algorithm for did:key is being refined to the point that we can finish off a first pass of a did:key test suite.

Verifiable Credentials

These VCs (etc.) will be embedded into the assets (e.g., video, images, documents, etc.) in a tamper-evident manner, so that in addition to the individual VCs “proof”, any attempt to change the CreativeWork relationships, etc. can also be detected. [..] we have no protection against a malicious actor simply copying the VC from one asset and dropping it into another (and then signing the new setup), because there is nothing that binds the credential to the asset in our case.

This seems more of a feature of the architecture than a threat, as long as you understand that the signing of the anti-tamper mechanism is, by its nature, an attestation about the affinity of that VC to the rest of the PDF, made by that signing authority (and by neither the VC issuer nor the Holder, unless the tamper signature can be independently demonstrated to be either the issuer or holder).

For Github users, submit your use cases as issues here: https://github.com/w3c-ccg/vc-ed-use-cases/issues

This template can help guide you: https://github.com/w3c-ccg/vc-ed-use-cases/blob/main/.github/ISSUE_TEMPLATE/use-case-template.md

Is a VC still considered to be valid if it contains fields that are not described in its context file(s)? Does it depend on the signature type?

The short answers are "maybe" and "yes".

The chip in your e-passport is the analogy I've been most successful with:

An issuer gives it to you.

You carry it around and show it to whom you choose.

The verifier can check its integrity without contacting the issuer.

"A VC is like the chip in your passport - but for any document type."

So far the best analogy I've found. Policy makers say "ah, I see"…

Video Using Paper-based Structured Credentials to Humanize Verifiable Credentials [Rough Cut] Michael Herman (Trusted Digital Web) (Friday, 19 November)

User Scenario: ABC Grocery wants to use the Trusted Digital Web to issue a Purchase Order for 10 cabbages from David's Cabbages.

A common example of this is when someone uses a "Power of Attorney," to sign a contract. When they do, they typically sign documents with their own names and an annotation "on behalf of," "for," or "by power of attorney," they don't forge the signature of the one who granted the power of attorney.

One should delegate rights, not credentials.

Note that this is different than binding multiple credentials together in a Verifiable Presentation (and having the presenter sign the VP). In the VP case, the binding just means "this presenter is authenticating the handing over of these unrelated credentials". Whereas in the linked VC case, the credentials are aware of each other, and the peer or hierarchical relationship is built into the VC itself.
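A structural sketch of the contrast being drawn (the `relatedCredential` property below is hypothetical, chosen only to illustrate an in-credential link):

```python
# A Verifiable Presentation merely bundles otherwise-unrelated VCs;
# the holder's proof covers the bundle, not any relationship between them.
vp = {
    "@context": ["https://www.w3.org/2018/credentials/v1"],
    "type": ["VerifiablePresentation"],
    "verifiableCredential": [{"id": "urn:vc:1"}, {"id": "urn:vc:2"}],
    # holder's proof over the whole bundle would be attached here
}

# In the linked-VC case the relationship lives inside the credential itself.
# "relatedCredential" is a hypothetical property name for illustration.
linked_vc = {
    "@context": ["https://www.w3.org/2018/credentials/v1"],
    "type": ["VerifiableCredential"],
    "credentialSubject": {"id": "did:example:subject"},
    "relatedCredential": "urn:vc:1",  # issuer-asserted link to a peer VC
}

assert "relatedCredential" not in vp
```

Because the link in `linked_vc` sits under the issuer's signature, it survives re-presentation; the VP's binding exists only for the lifetime of that presentation.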

Apparently so… Evaluating the Current State of Application Programming Interfaces for Verifiable Credentials

I am excited to share with you today the release of Blockcerts V3. As you may already know, the earlier versions of Blockcerts were architected by Kim H. Duffy through Learning Machine and leveraged the Open Badge standard.

We have followed through with the initial ideas established at RWOT 9 in Prague in December 2019, to align Blockcerts with the Verifiable Credential specification.

  • Proposal Work Item | Credential Chaining  Robin Klemens (Thursday, 27 January)

  • to provide an overview of all existing flavors of credential chaining (What current and new techniques exist or are being researched?)

  • to gather the reasons and requirements for credential chaining

  • to come up with best practices and create a sort of decision tree that helps map the requirements of the use case with the implementation of credential chaining

  • to provide working code with concrete implementations on different chaining variants

  • to integrate credential chaining into future versions of the Verifiable Credentials Data Model

  • DIF VC-JWTs look like Linked Data Proof Verifiable Credentials  Orie Steele (Thursday, 24 February)

As far as I know, no other VC-JWT implementation supports this format, aka "JwtProof2020".

If you have a few minutes, I would love some review of what the DIF implementation is doing, and how we can either push it all the way into the LD Proof camp, or all the way into the VC-JWT camp.

as you know we spent quite some time on the text in the VC Data Model v1.1 to differentiate between a credential and a verifiable credential, and to highlight that regardless of the proof format (JWT, LD-Proof etc) the credential is always the same once the proof has been removed.

Therefore the obvious way to me to store any type of VC in a wallet is to store the credential as JSON, along with the proofed VC,  then the same wallet will be able to receive any type of proofed VC and store the embedded credential in the same way. I have also been highlighting this model in the DIF PE group, so that the same Presentation Definition can be used by any wallet to select any type of credential, regardless of the proof type.

If the VCs in the cloud are a commitment to a DID instead of a hardware bound key... then their presentation from hardware bound keys achieves the same effect, but if the device is lost, the holder just registers new device bound keys, and no need to re-issue the VCs (but a DID Update operation is required).

Indeed the use case is for so called bearer credentials. The example of a concert ticket mentioned in there is a good one, although the actual bachelor degree example nr 33 is questionable since a degree is not subject independent.  That seems to come more from the fact that the degree is used throughout the spec as an example.

This document proposes Verifiable Web Forms -- a new way to provide Verifiable Credentials [VC-DATA-MODEL] to the Web Browser via the Clipboard. By using Verifiable Web Forms, users can provide third-party verified data with standard user interfaces without typing. The data is also verifiable on the server side.

I've created a Miro board as a place to start gathering questions and assumptions:

I've made a pass at updating the registry to be more helpful to people and organizations that are not involved in the week-to-week with VCWG or CCG. The update, which adds proof methods, links to specs, implementations, and test suites can be found here:

The pull request[4] involves a few things that are worth noting

We design, implement, and evaluate a solution for achieving continuous authorization of HTTP requests exploiting Verifiable Credentials (VCs) and OAuth 2.0. Specifically, we develop a VC issuer that acts as an OAuth 2.0 authorization server, a VC verifier that transparently protects HTTP-based resources, and a VC wallet implemented as a browser extension capable of injecting the necessary authentication data in HTTP requests without needing user intervention.

Verifiable Credentials Data Model v1.1 https://www.w3.org/TR/2022/REC-vc-data-model-20220303/

This was largely a maintenance release of the specification. The list of (minor) revisions since the v1.0 release can be found here:

This evidence could be a test score, a link to an image, video, and/or web page, etc. that demonstrates competency or participation. These specs are working towards aligning with VCs and it was originally thought that this type of evidence would be included as part of the credentialSubject if it existed.

This would look something like this:

But since VCs already have an evidence property that allows for an array of evidence, it seems to make sense to use that property instead of using a separate property like the one demonstrated above.
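The inline example referenced above did not survive in this digest; as a hedged sketch, the `evidence`-property approach being favoured might look like this (types and field names illustrative, not a published vocabulary):

```python
# Hedged sketch: competency evidence carried in the VC's top-level
# "evidence" array rather than embedded in credentialSubject.
# The "Evidence" type and field names are illustrative.
achievement_vc = {
    "@context": ["https://www.w3.org/2018/credentials/v1"],
    "type": ["VerifiableCredential", "OpenBadgeCredential"],
    "credentialSubject": {"id": "did:example:learner"},
    "evidence": [
        {"type": ["Evidence"], "name": "Assessment score", "score": "95"},
        {"type": ["Evidence"], "name": "Portfolio video",
         "id": "https://example.org/video.mp4"},
    ],
}
assert isinstance(achievement_vc["evidence"], list)
```

Using the array keeps multiple heterogeneous pieces of evidence (scores, links, media) in the slot the data model already defines for exactly this purpose.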

This draft Rebooting the Web of Trust 11 paper explores ways in which the Verifiable Credentials data model could be extended to support visual, audio, and physical renderings for Verifiable Credentials.

VC-API

This demo API and Spec has a number of improvements over the current VC-HTTP-API, including tested support for VC-JWT, JsonWebSignature2020 and BBS+ Selective Disclosure Presentation Exchange.


Typical solutions to this problem require that you put the binary data outside of the VC, if at all possible. This works well for common static images such as logos. It is also possible to split the VC into two VCs... one with the machine-readable data from the issuer (with a digital signature) and one with the image data from any source (without a digital signature, since, if hashlinked, the signature will verify the validity of the image data). That latter approach can be more privacy preserving AND more complex than many might feel is necessary.
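The hashlink idea mentioned above can be sketched in a few lines: the signed VC carries only a digest of the image, so the image itself can travel separately (and unsigned) while remaining tamper-evident. This is a simplification of the Hashlink approach, not its exact URL encoding:

```python
import hashlib

# Sketch: the signed VC references the image by URL plus digest, so the
# issuer's signature transitively covers the image bytes without
# embedding them. (Simplified; the Hashlink spec defines the real encoding.)
image_bytes = b"\x89PNG...stand-in image data"

vc_image_ref = {
    "id": "https://example.org/logo.png",
    "digestSHA256": hashlib.sha256(image_bytes).hexdigest(),
}

def image_matches(ref: dict, data: bytes) -> bool:
    """Verify fetched image bytes against the digest signed into the VC."""
    return hashlib.sha256(data).hexdigest() == ref["digestSHA256"]

assert image_matches(vc_image_ref, image_bytes)
assert not image_matches(vc_image_ref, b"tampered bytes")
```

If the digest is inside the signed VC, replacing the image breaks verification even though the image file itself carries no signature.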

We are happy to announce today that we have our first demonstration of cross-vendor interoperability between Danube Tech and Digital Bazaar for the VC Issuer API and VC Verifier API. The test suites test the OAS definition files (which are used to generate the specification):

  1. There are sequence and communications diagrams for both issuance and verification, plus a class diagram.


I think I'm starting to understand how RAR fits into this picture. This decision can be made for us by punting the question to the authorization process entirely. With RAR we can force the user to authorize for the actual subject they are issuing the credential about. Is Alice authorized to issue VCs with claims about did:example:12345? To answer that question Alice asks for a token with the following RAR request
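The actual request did not survive in this digest; as a hypothetical illustration only, an RFC 9396 (Rich Authorization Requests) `authorization_details` payload for that question might be shaped like this (the `type` and field names are invented for the sketch, not from the thread):

```python
# Hypothetical sketch of a RAR "authorization_details" object asking
# whether Alice may issue VCs about a specific subject. All names here
# ("vc_issuance", "credential_subject") are illustrative, not standard.
rar_request = {
    "authorization_details": [{
        "type": "vc_issuance",                     # hypothetical detail type
        "actions": ["issue"],
        "credential_subject": "did:example:12345"  # subject Alice claims
    }]
}
assert rar_request["authorization_details"][0]["actions"] == ["issue"]
```

The structural point is that RAR moves the subject-level authorization question into the token request itself, instead of encoding it into ever-finer parameterized scopes.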

It seemed like a good idea when I first invented it a decade ago: https://blue-button.github.io/blue-button-plus-pull/#scopes or when it got pulled into other efforts like https://openid.net/specs/openid-heart-fhir-oauth2-1_0-2017-05-31.html… and Orie even suggested the following set of parameterized scopes for this API:

'create:credentials': Grants permission to create credentials

'derive:credentials': Grants permission to derive credentials

'create:presentations': Grants permission to create presentations

'verify:presentations': Grants permission to verify presentations

'exchange:presentations': Grants permission to exchange presentations

So what's the problem? I can say with full confidence after years of experience building and deploying systems to support parameterized scopes like this that they are fragile, awkward, and lead to insecure corner cases.

See: https://github.com/w3c-ccg/vc-http-api/issues/218

Proposal 1: The APIs that use OAS3.0 MUST define securitySchemes per the OAS 3.0 spec. (@OR13 proposal addresses 4)

Proposal 2: The APIs that use OAS3.0 MUST define the use of the Link Header for suite and issuer id discovery (@TallTed 's proposal addressing 1/2/3)

Proposal 3: The APIs that use OAS3.0 MUST define the use of a .well-known JSON resource for conveying supported issuer ids and suites. (@OR13 's. proposal addressing 1/2/3)

the fundamental issue is that stringing a bunch of consonants together ("HTTP") rarely leads to something easy to say in conversation.

CHAPI


TL;DR: chapi.io is a site that helps developers integrate Verifiable Credential issuance, holding, and presentation into their applications. It includes a playground that can issue arbitrary VCs to digital wallets (web and native). It also includes tutorials on how Web Developers can add CHAPI integration to their websites. All you need to try it out is a web browser.

The credential selector is an icon-based selector for all the credentials that the chapi.io playground currently supports issuing. You can now click on an image of the credential you'd like to issue.

  • [...]

We have added a permanent resident card from the fictitious Government of Utopia to the list of credentials that can be issued. This credential uses the Citizenship Vocabulary[...]

TL;DR: In an attempt to support the current Jobs for the Future Plugfest, an Open Badge v3.0 example for an Academic Achievement has been added to the chapi.io playground. You can now see what a JFF badge issuance and transfer to a Holder wallet looks like in CHAPI (on mobile and web, on any device that can run a web browser). Images of the flow are attached.

Crypto

This type of independent review is critically important for U.S. Government entities who are deploying capabilities based on these standards to ensure that the technologies conform to relevant U.S. Federal government standards and requirements, including the Federal Information Security Management Act (FISMA) and National Institute of Standards and Technology (NIST) standards for use of cryptography.

Please find attached (and online at the link below) the results of this independent review and the associated cryptography implementation recommendations.

I've posted a new SSI blog entitled: "Protecting Sensitive Parts of Credentials with Cryptographically Enforceable Policies".

It has a proposal that enables credential issuers to encrypt sensitive parts of credentials in such a way that they can only be decrypted by parties that satisfy the issuer's policy (that was used to encrypt these parts). The blog motivates the need, introduces a high-level architecture, explains how it would work, and discusses some issues that need to be looked into.

Cryptography Review of W3C Verifiable Credentials Data Model (VCDM) and Decentralized Identifiers (DIDs) Standards and Cryptography Implementation Recommendations by David Balenson & Nick Genise

It's largely a view from the US NIST cybersecurity standards, which are used through most of the world, but not everywhere. In any case, it's a valuable perspective that I hope the VC2WG and DIDWG takes into the next stage of the work.

We (Danube Tech) have a "Universal Verifier" here: https://univerifier.io/

But I don't claim that it actually supports all the credential formats and signature suites in existence...

Especially considering that at the last Internet Identity Workshop a lot of different formats were identified:

It suggests updates to the SafeCurves website

We are happy to announce today that we have our first demonstration of cross-vendor interoperability between Danube Tech and Digital Bazaar for verification regarding the Data Integrity and Ed25519Signature2020 work items:


This is a publication request for four Data Integrity Community Group Final Reports. Namely:

DIDComm

Now that the DIDComm v2 spec is nearing completion, and there are robust libraries in multiple programming languages, we are starting a user group to share learnings as we put DIDComm into production. We will organize community resources, produce a handbook, foster application-level protocol creation, maintain the didcomm.org website and repo, and recommend best practices.

application/pdf attachment: DIDComm_v2_Primer.pdf

Wallets

This document describes a mechanism to transfer digital credentials securely between two devices. Secure credentials may represent a digital key to a hotel room, a digital key to a door lock in a house or a digital key to a car. Devices that share credentials may belong to the same or two different platforms (e.g. iOS and Android). Secure transfer may include one or more write and read operations. Credential transfer needs to be performed securely due to the sensitive nature of the information.

I am successfully able to integrate Okta cloud identity with SSI agent. Looking for your feedback on how we can improve this more.

DIF Wallet Security WG - Wallet Implementers Survey  Bastian, Paul (Friday, 7 January)

I summarized our goals and visions in this presentation, for more information check out the Github page

Also we ended up initiating 2 new work items at the end of last year:

As most of us know, that eventually led to the realization of the many dimensions of decentralization and creation of the excellent "DID Method Rubric" by JoeA, RyanG, and DanielH (with support from a very large cast of characters in this community).

It feels like we're in the early throes of a "Wallet Rubric".

RDF

I think what happens is that a first blank node is created for the proof, and since that node has @container @graph, instead of being able to trace the relationships directly from credential to proof to verification method...

Each proof is being treated as a disjoint subgraph, and the relationship is not being preserved during import… [...]

I suspect this is solvable with a more complicated graph config: https://neo4j.com/labs/neosemantics/4.0/config/

But I wonder if we might correct this behavior in VC Data Model 2.0, such that RDF representations don't have this odd behavior when imported as labeled property graphs. [...]

answer on the GitHub issue for the standard; I raised it here: https://github.com/w3c/vc-data-model/issues/881

The goal of this group is to standardize the way many of us digitally sign Verifiable Credentials. This working group has been about a decade in the making (some would say two decades) and is important for achieving things like BBS+ selective disclosure as well as standardizing the way we format Verifiable Credentials before they are digitally signed.

The announcement is here

The proposed charter is here

I've instrumented the rdf-canonicalize library so I can inspect the order of execution, and it appears that what differs between my implementation and the Javascript one is the order of the permutations. The spec doesn't say how the permutations should be ordered, and my intuition is that the order does indeed matter - though I'm happy to be corrected if I'm wrong.

So, here are my questions:

  • Does the order of the permutations matter?
  • If so, what order should they be in?
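A tiny stdlib demonstration of why enumeration order can matter here: two implementations holding the same *set* of blank nodes in different input orders will try candidate orderings in a different sequence, so whichever ordering is "accepted first" by the canonicalization loop can differ. This illustrates the concern raised above, not the canonicalization algorithm itself:

```python
from itertools import permutations

# The same set of blank nodes, held in different internal orders by
# two hypothetical implementations.
nodes_a = ["_:b0", "_:b1", "_:b2"]
nodes_b = ["_:b2", "_:b1", "_:b0"]

# itertools.permutations yields orderings lexicographically with respect
# to the *input* order, so the enumeration sequences differ even though
# the underlying set of permutations is identical.
assert set(permutations(nodes_a)) == set(permutations(nodes_b))
assert next(iter(permutations(nodes_a))) != next(iter(permutations(nodes_b)))
```

If the spec's loop short-circuits on the first acceptable ordering, that first candidate must be pinned down for implementations to agree, which is exactly the interoperability question being asked.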

Quantum

What this means is that it is now possible to not have to depend on one signature format, and instead use multiple to meet different needs. The VC above supports NIST-approved cryptography today, while enabling the advanced use of BBS+ (if an organization would like to use it /before/ it is standardized at IETF), and also enabling protection if a quantum computer were to break both Ed25519 and BBS+... all on the same VC in a fairly compact format.

I look forward to continuing to work on JSON encoding for post quantum signature schemes.

In particular, support for JWS and JWK as building blocks for higher order cryptographic systems, such as DIDs and VCs.

If you are interested in contributing, please feel free to open issues here: https://github.com/mesur-io/post-quantum-signatures

The TL;DR is to assume that we need hard answers as a community, and at the standards level, on crypto agility by 2024, as well as support for the key algorithms as listed above.

Assorted

Here's an illustration of the relationships between the initial DOMAIN and POOL txns used to bootstrap an example Aries VDR...

Just wanted to update folks here that the C2PA has released version 1.0 of their specification at https://c2pa.org/specifications/specifications/1.0/index.html.  As previously mentioned, it includes native support for VCs for use in identification of actors (be they human, organizations, etc.).  Thanks to everyone here for their input on our work and helping us to deliver.

I asked them whether they considered GNAP via Slack.

They are chartered here: https://fedidcg.github.io/

To look at AuthN that breaks when browser primitives are removed.

They are currently focused on OIDC, SAML, WS-Fed.

The reason I asked them was in relation to the questions we have discussed regarding "What can GNAP replace".

Clearly GNAP can replace OAuth, but I think you both have now confirmed that GNAP does not replace OIDC, or federated identity...

We've been working on generating test vectors for https://datatracker.ietf.org/doc/html/rfc8391 that we could use to register the kty and alg for XMSS such that it could be used by JOSE and COSE.

I've reached the limits of my ability to move this ball forward, and am here to ask for help.