From b463f9a790a9c39f28cdafc725365c2cb60b5961 Mon Sep 17 00:00:00 2001 From: =?UTF-8?q?=E2=A7=89=20infominer?= Date: Mon, 5 Jun 2023 07:25:11 +0530 Subject: [PATCH] new links from sort --- _data/content.csv | 8 +++++++- _data/standards.csv | 3 ++- 2 files changed, 9 insertions(+), 2 deletions(-) diff --git a/_data/content.csv b/_data/content.csv index 0ffaa63b..043f0084 100644 --- a/_data/content.csv +++ b/_data/content.csv @@ -43,7 +43,12 @@ Evernym,Evernym,,,,Verityflow,,,,,Verity Flow: Evernym's no-code solution for is Evernym,PRNewswire,,,,,LONDON,,,,Sovrin Foundation Launches First Dedicated Self-Sovereign Identity Network,"Evernym, Inc. announced today at the Ctrl-Shift Personal Information Economy conference that it has donated the intellectual property for the Sovrin Identity Network—the world's first and only dedicated self-sovereign identity platform—to a newly-formed nonprofit organization. The Sovrin Foundation, which is run by a group of internationally recognized identity experts, has a mission to empower everyone with a digital identity which they fully own and control.","Sovrin Foundation Launches First Dedicated Self-Sovereign Identity Network Sep 29, 2016, 02:00 ET LONDON, Sept. 29, 2016 /PRNewswire-USNewswire/ -- Evernym, Inc. announced today at the Ctrl-Shift Personal Information Economy conference that it has donated the intellectual property for the Sovrin Identity Network—the world's first and only dedicated self-sovereign identity platform—to a newly-formed nonprofit organization. The Sovrin Foundation, which is run by a group of internationally recognized identity experts, has a mission to empower everyone with a digital identity which they fully own and control. ""Imagine a world where fraud is reduced, logins are simpler and more secure, governments can slash red tape, and healthcare practitioners can provide care with patients' immediate consent,"" said Dr. Phillip Windley, Sovrin Foundation's inaugural Chair. 
""Beyond these applications, the potential is limitless when global impact is considered. Developing nations will finally have an identity solution to underpin birth registration, land ownership, vaccination and refugee tracking."" The underlying problem Sovrin solves is that the Internet was designed to identify machines, but has no standard way to identify people. This new platform utilizes distributed ledger technology, a close cousin to Bitcoin's underlying blockchain, but specifically tailored to identity. Sovrin imparts not only full control to the user over their identity, but absolute sovereignty: no one can read it, use it, change it, or turn it off without the user's explicit consent. When identity is ""self-sovereign"", it becomes a hub for many types of interactions like secure messaging, data sharing, and the management of consent. These capabilities enable businesses to transition from being identity providers—typically a cost center—to being identity consumers, and putting users in control leads to higher customer satisfaction. ""Governments and private industry waste hundreds of billions a year on inefficient and inaccurate identity proofing measures, which rarely if ever put the consumer first,"" Timothy Ruff, Evernym's CEO, said. ""We recognized that a completely new platform was needed to enable universal digital identity, and for it to be trusted it needs to belong to the world and not to us."" To learn more visit http://www.sovrin.org. About The Sovrin Foundation Founded in September 2016, the Sovrin Foundation is a private-sector, international non-profit body for coordinating the global, stable operation of the Sovrin Identity Network. Supported by a Board of Trustees, Technical Governance Board, Executive Director and Staff, the Sovrin Foundation is the first of its kind. 
Sovrin's partners include global, national and local businesses, nonprofits, government, and civic organizations, along with developers, volunteers, health providers, donors, and more. For more information about Sovrin, visit http://www.sovrin.org or follow us on Twitter: @SovrinID and #Sovrin. SOURCE The Sovrin Foundation",https://www.prnewswire.com/news-releases/sovrin-foundation-launches-first-dedicated-self-sovereign-identity-network-300336702.html,,Press,,Meta,,,,,,,,2016-09-29,,,,,,,,,,,,, Evernym,Evernym,,,Samuel M. Smith; Dmitry Khovratovich,,,,,,Identity System Essentials,"The purpose of this white paper is to describe the essential characteristics of an identity system that provides sovereignty, security and privacy. Here the meaning of identity is derived from the characteristics of the identity system, that is, what the identity system provides. Instead of defining identity a priori, this white paper describes an identity system and then defines identity within the context of that identity system. Many of the features of the identity system have been influenced and inspired by other proposed systems such as Open Reputation. This paper argues that an identity system that simultaneously provides a high degree of sovereignty, security and privacy is best obtained via an open platform that employs distributed consensus protocols and modern cryptographic techniques.",,https://www.evernym.com/wp-content/uploads/2017/02/identity-system-essentials.pdf,,Whitepaper,,Meta,,,,,,,,2017-02,,,,,,,,,,,,, Evernym,Evernym,,,,Aries; Trinsic; IBM; IDramp; Esatus,,,,,Evernym’s Connect.Me,"Connect.Me
Our consumer digital wallet app
Enable customers and end users to manage all of their digital credentials from the safety of their own phone
Engage in structured two-way messaging over secure and private channels
Eliminate excess data collection with zero-knowledge proof technology, and other cutting-edge privacy features",,https://www.evernym.com/connectme/,,Product,,Product,,,,,,,,2021-09-27,,,,,,,,,,,,, +Evernym,DHS,,,,,,,,,News Release: DHS S&T Awards $749K to Evernym for Decentralized Key Management,"Managing public and private cryptographic keys in existing public key infrastructure as well as permissioned and permission-less blockchains continues to be a difficult challenge,” said S&T Identity Management Program Manager Anil John. “Through this project, Evernym will push the limits of the emerging decentralized key management system technology to deliver a high level of comfort to the public and American businesses as they integrate blockchain technologies into their technology portfolio.",,https://www.dhs.gov/science-and-technology/news/2017/07/20/news-release-dhs-st-awards-749k-evernym-decentralized-key,,Press,,Press,,,,,,,,2017-07-20,,,,,,,,,,,,, Evernym,Evernym,,,,Trinsic; IBM; Lissi; esatus,,,,,Evernym’s Verity,"Our flagship product for verifiable credential exchange
Issue and verify digital credentials
Easily integrate with back-end systems, using our REST API and SDKs in Java, Node.js, Python, and .NET
Build for scale, with enterprise-grade architecture designed to support millions of users.
Enable open ecosystems and true data portability, with a solution based on open standards and interoperability",,https://www.evernym.com/verity/,https://evernym.wpenginepowered.com/wp-content/uploads/2021/10/verity-product.png,Product,,Product,,,,,,,,2021-10-10,,,,,,,,,,,,, +Evernym,Globalnewswire,,,,,,,,,IOTA and Evernym Launch Collaboration Aimed at Making the Internet of Things More Secure,,"“Evernym and IOTA are both intensively working toward achieving the same goal,” said IOTA founder David Sønstebø. “That is, a world where distributed ledgers facilitate the secure and efficient exchange of resources and data between all connected entities. This is a natural pairing and the world should pay attention to the exciting products that result from it.”",https://globenewswire.com/news-release/2017/08/31/1106292/0/en/IOTA-and-Evernym-Launch-Collaboration-Aimed-at-Making-the-Internet-of-Things-More-Secure.html,,Press,,Press,,,,,,,,2017-08-31,,,,,,,,,,,,, +Evernym,Globalnewswire,,,,,,,,,Evernym rolls with auto industry association MOBI to promote SSI in automotive and IoT,,"Cars, like people, have a digital identity problem that Evernym, a technology company focused on digital identity, wants to help solve. Cars that connect online will soon need to assert their own identities and be able to verify people’s identities in ways unthinkable just a few years ago. Is this replacement component a safe one? Can I let it access the car’s network? Is this person authorized to change my settings or drive me?",https://globenewswire.com/news-release/2018/10/05/1617425/0/en/Evernym-rolls-with-auto-industry-association-MOBI-to-promote-SSI-in-automotive-and-IoT.html,,Press,,Press,,,,,,,,2018-10-05,,,,,,,,,,,,, +Evernym,Evernym,,,,Hyperledger Foundation; Sovrin,,,,,Evernym's contributions to Hyperledger and Sovrin,,Evernym's contributions to Hyperledger and Sovrin. 
Video contents are listed here: https://wiki.hyperledger.org/display/indy/Evernym+Sprint+Demos,https://www.youtube.com/playlist?list=PLRp0viTDxBWGLdZk0aamtahB9cpJGV7ZF,,Meta,,Playlist,,,,Development,,,,2020-05-22,,,,,,,,,,,,, +Evernym,Globalnewswire,,,,,,,,,15 Industry Leaders Join Evernym’s Global Accelerator to Build the Future of Digital Identity.,,"Founding members of the Accelerator include industry leading organizations ATB Financial, IAG, Irish Life, the International Federation of Red Cross, Spark New Zealand, Truu and three provincial and state governments. Collectively, these organizations represent the interests of 100's of millions of individuals worldwide.",https://globenewswire.com/news-release/2018/11/07/1647044/0/en/15-Industry-Leaders-Join-Evernym-s-Global-Accelerator-to-Build-the-Future-of-Digital-Identity.html,,Press,,Press,,,,,,,,2018-11-07,,,,,,,,,,,,, Factom,,Accumulate,,,,,,,,Accumulate Network,"Accumulate’s story starts with the founding of Factom in 2014, a data publishing layer atop major blockchains. In 2021, Factom was acquired by Inveniam Capital Partners, bringing along lead engineers Paul Snow and Jay Smith. Inveniam Capital Partners created the Defi Devs subsidiary to be lead developers in the Accumulate community.

The Accumulate protocol is based on many of the best concepts that came out of the Factom protocol, including its focus on data and identity, while combining the components in a new and unique configuration.

The Accumulate protocol is designed by Paul Snow. Paul Snow is the Chief Blockchain Scientist at Inveniam and Defi Devs. Previously, he was the CEO and chief architect of the Factom protocol and co-author of the Factom White Paper, developing and implementing a “multi-leader” consensus algorithm for the blockchain network. Of note, he was founder and chief architect for DTRules, an open-source project providing decision table-based rules engines. He is listed as inventor on many of Factom’s 40+ patents, both issued and in progress, which serve as a foundation for Accumulate.",,https://accumulatenetwork.io/,,Company,,Company,Web3,,,Data,,Blockchain,"DID,Verifiable Credentials",2021-08,,https://twitter.com/accumulatehq,,https://accumulatenetwork.io/blog/,https://accumulatenetwork.io/feed/,https://discord.gg/X74hPx8VZT,https://www.crunchbase.com/organization/accumulate-358f,https://www.linkedin.com/company/accumulatenetwork/,https://accumulatenetwork.io/whitepaper/,https://docs.accumulatenetwork.io/,,, Gataca,,Gataca,,Irene Hernandez; Samuel Gómez,ESSIFLab,"USA, Massachusetts, Boston",Europe,,,Gataca,"Gataca is a cybersecurity company founded in Boston, MA, at the heart of MIT’s entrepreneurship and innovation ecosystem. It started as an academic research study seeking to reduce the risk of doing business online. As victims of the Equifax data breach later that year, the topic became very Personal.

We built Gataca because we knew there had to be a better way to protect our data.",,https://gataca.io/,,Company,,Company,Enterprise,ID,,Personal Data,,,DID,2018,,https://twitter.com/gataca_id,https://www.youtube.com/channel/UCaoK-LYmCPiXThYpLOShgvg/,https://gataca.io/blog/,,,https://www.crunchbase.com/organization/gataca-4a8f,https://www.linkedin.com/company/gataca/,https://developer.global.id/documentation/index.html,https://developer.global.id/,,, Gataca,Gataca,,,,,,,,,"Decentralized Finance & Self-sovereign Identity: A tale of decentralization, a new paradigm of trust",We are aware that DeFi’s growth is explosive and inevitable yet its growth needs to be sustainable and responsible. This can be done with SSI.,,https://gataca.io/blog/decentralized-finance-self-sovereign-identity-a-tale-of-decentralization-a-new-paradigm-of-trust/,,Post,,Explainer,,DWeb,DeFi,,,,,2021-05-07,,,,,,,,,,,,, @@ -243,6 +248,7 @@ Mattr,Mattr,,Medium,Nader Helmy,,,,,,JWT vs Linked Data Proofs: comparing Verifi Mattr,Mattr,,Medium,,,,,,,OpenID Connect Credential Provider,"Introducing OpenID Connect Credential Provider, an extension to OpenID Connect which enables the end-user to request credentials from an OpenID Provider and manage their own credentials in a digital wallet. This specification defines how an OpenID Provider can be extended beyond being the provider of simple identity assertions into being the provider of credentials, effectively turning these Identity Providers into Credential Providers.","Introducing OIDC Credential Provider OpenID Connect (OIDC) is a hugely popular user authentication and identity protocol on the web today. It enables relying parties to verify the identity of their users and obtain basic profile information about them in order to create an authenticated user experience. 
In typical deployments of OpenID Connect today, in order for a user to be able to exercise the identity they have with a relying party, the relying party must be in direct contact with what’s known as the OpenID Provider (OP). OpenID Providers are responsible for performing end-user authentication and issuing end-user identities to relying parties. This effectively means that an OpenID Provider is the Identity Provider (IdP) of the user. It’s the reason we often see buttons that say “Login with Google” or “Login with Facebook” during the login journey in an application or service. The website or application you want to use must first authenticate who you are with a provider like Google or Facebook which controls and manages that identity on your behalf. In this context we can think of the IdP as the “man in the middle.” This relationship prevents users from having a portable digital identity which they can use across different contexts and denies users any practical control over their identity. It also makes it incredibly easy for IdPs like Google or Facebook to track what users are doing, because the “man in the middle” can gather metadata about user behavior with little agency over how this identity data is shared and used. In order to allow users to have practical control over their identity, we need a new approach. Introducing OpenID Connect Credential Provider, an extension to OpenID Connect which enables the end-user to request credentials from an OpenID Provider and manage their own credentials in a digital wallet. This specification defines how an OpenID Provider can be extended beyond being the provider of simple identity assertions into being the provider of credentials, effectively turning these Identity Providers into Credential Providers. 
To maximize the reuse of existing infrastructure that’s deployed today, OIDC Credential Provider extends the core OpenID Connect protocol, maintaining the original design and intent of OIDC while enhancing it without breaking any of its assumptions or requirements. Instead of using OIDC to provide simple identity assertions directly to the relying party, we can leverage OIDC to offer a Verifiable Credential (VC) which is cryptographically bound to a digital wallet of the end-user’s choice. The digital wallet plays the role of the OpenID Client application which is responsible for interacting with the OpenID Provider and manages the cryptographic key material (both public and private keys) used to prove ownership of the credential. The credentials issued to the wallet are re-provable and reusable for the purposes of authentication. This helps to decouple the issuance of identity-related information by providers and the presentation of that information by a user, introducing the user-controlled “wallet” layer between issuers and relying parties. Essentially, a wallet makes a request to an OpenID provider in order to obtain a credential, and then receives the credential back into their wallet so they can later use it to prove their identity to relying parties. The interaction consists of three main steps: - The Client sends a signed credential request to the OpenID Provider with their public key - The OpenID Provider authenticates and authorizes the End-User to access the credential - The OpenID Provider responds to the Client with the issued VC In this new flow, the credential request extends the typical OpenID Connect request in that it expresses the intent to ask for something beyond the identity token of a typical OIDC flow. Practically, what this means is that the client uses a newly defined scope to indicate the intent of the request. 
The Client also extends the standard OIDC Request object to add cryptographic key material and proof of possession of that key material so that the credential can be bound to the wallet requesting it. Though the credential can be bound to a public key by default, it can also support different binding mechanisms, e.g. the credential can optionally be bound to a Decentralized Identifier (DID). In binding to a DID, the subject of the credential is able to maintain ownership of the credential on a longer life cycle due to their ability to manage and rotate keys while maintaining a consistent identifier. This eases the burden on data authorities to re-issue credentials when keys change and allows relying parties to verify that the credential is always being validated against the current public key of the end-user. The request can also indicate the format of the requested credential and even ask for specific claims present within the credential. This is designed to allow multiple credential formats to be used within the OIDC flow. On the provider side, OpenID Connect Providers are able to advertise which capabilities they support within the OIDC ecosystem using OpenID Connect Provider Metadata. This approach extends the metadata to support additional fields that express support for binding to DIDs, for issuing VCs, and advertising which DID methods, credential formats, credentials, and claims they are offering. This information can be utilized by the end-user’s digital wallet to help the user understand whether or not they wish to proceed with a credential request. In order to create a way for the wallet or client to connect to the OpenID Provider, the spec also defines a URL which functions as a Credential Offer that the client can invoke in order to retrieve and understand the types of credential being offered by the provider. 
The client registers the ‘openid’ URI scheme in order to be able to understand and render the offer to the user so they can make an informed decision. The sum of these changes means that OpenID Connect can allow users to have a portable digital identity credential that’s actually under their control, creating an opportunity for greater agency in digital interactions as well as preventing identity providers from being able to easily track user behavior. The OpenID Connect Credential Provider specification is in the process of being contributed to the OpenID Foundation (OIDF) as a work item at the A/B Working Group, where it will continue to be developed by the community behind OpenID Connect. Mattr is pleased to announce that our OIDC Bridge Platform Extension now uses OIDC Credential Provider under the hood to facilitate issuing credentials with OpenID Connect. OIDC Bridge hides the complexity associated with setting up infrastructure for credential issuance and simply requires configuration of a standard OpenID Provider. We also simplify the process of verifying credentials issued over OIDC Credential Provider by allowing the wallet to respond to requests, present credentials, and prove ownership and integrity of their credentials via OIDC. This new set of capabilities allows OpenID Providers greater flexibility around which claims end up in a credential, and allows for the support of many different credential types with a straight-forward authentication journey for end-users. Our Mobile Wallet supports the ability to invoke credential offers using OIDC Credential Provider as well as creating credential requests and receiving credentials from an OpenID Provider. 
To find out more, check out our tutorials on Mattr Learn, read the spec, or watch a recording of our presentation on this spec from the recent Internet Identity Workshop.",https://medium.com/Mattr-global/introducing-oidc-credential-provider-7845391a9881,,Post,,Standards,,,,,,,OIDC,2020-12-15,,,,,,,,,,,,, Mattr,Mattr,,GitHub,T. Looker ; J. Thompson ; A. Lemmon ; K. Cameron
,,,,,,OIDC Credential Provider,is “an extension to OpenID Connect which enables the end-user to request credentials from an OpenID Provider and manage their own credentials in a digital wallet.”,"|OpenID Connect Credential Provider||April 2021| |Looker, et al.||Informational||[Page]| OpenID Connect 1.0 is a simple identity layer on top of the OAuth 2.0 protocol. It enables relying parties to verify the identity of the End-User based on the authentication performed by an Authorization Server, as well as to obtain basic profile information about the End-User.¶ OpenID Providers today within OpenID Connect assume many roles, one of these is providing End-User claims to the relying party at the consent of the End-User such as their name or date of birth, providers performing this function are often referred to as being claims providers. However, the need for End-Users to be able to provide a variety of claims to a relying party from different providers is only increasing as many business processes that span multiple logical domains such as KYC and education move towards digital channels.¶ However, assuming a direct integration between the relying party and the claims providers leads to a difficult experience for End-Users to manage. Instead End-Users need a way to consolidate the different identities and claims they have available with various claims providers into one place where they can manage their release from. 
In doing this, a layer of in-direction is created between the relying party and the claims provider through the introduction of a new party that we refer to in this specification as being the ""holder"".¶ In OpenID Connect today the existing ways to communicate End-User claims to relying parties are the id_token and the userinfo endpoint, however these mechanisms alone are unsuitable for the style of indirect presentation of claims to relying parties via a holder, as the relying party must be able to authenticate the authority of the holder to be presenting the claims on behalf of the End-User. Instead in order to support this style of flow, this specification defines a new vehicle for communicating End-User claims called a ""credential"". In addition to this definition this specification defines how an existing OpenID Provider can be extended to issue ""credentials"" to holders.¶ To reiterate, this specification defines a protocol where a Credential Issuer (OP) may provide a Credential (set of claims) to a Credential Holder (acting as an RP) of which that Credential Holder (acting as an OP) controls and may present onward to Verifiers (RPs).¶ Note that the protocol for a Credential Holder to present a Credential to a Credential Verifier is outside the scope of this specification.¶ The key words ""MUST"", ""MUST NOT"", ""REQUIRED"", ""SHALL"", ""SHALL NOT"", ""SHOULD"", ""SHOULD NOT"", ""RECOMMENDED"", ""NOT RECOMMENDED"", ""MAY"", and ""OPTIONAL"" in this document are to be interpreted as described in RFC 2119 @!RFC2119.¶ In the .txt version of this document, values are quoted to indicate that they are to be taken literally. When using these values in protocol messages, the quotes MUST NOT be used as part of the value. 
In the HTML version of this document, values to be taken literally are indicated by the use of this fixed-width font.¶ All uses of JSON Web Signature (JWS) JWS and JSON Web Encryption (JWE) JWE data structures in this specification utilize the JWS Compact Serialization or the JWE Compact Serialization; the JWS JSON Serialization and the JWE JSON Serialization are not used.¶ This specification uses the terms defined in OpenID Connect Core 1.0; in addition, the following terms are also defined:¶ A set of claims about the End-User (subject) which is cryptographically bound to the credential holder in an authenticatable manner based on public/private key cryptography.¶ An OpenID Connect Authentication Request that results in the End-User being authenticated by the Authorization Server and the Client (Credential Holder) receiving a credential about the authenticated End-User.¶ A role an entity performs by holding credentials and presenting them to RPs on behalf of the consenting End-User (subject of the credentials). A Credential Holder serves the role of an OP (when presenting credentials to RPs) and an RP (when receiving credentials from Credential Issuers).¶ A role an entity performs by asserting claims about one or more subjects, creating a credential from these claims (cryptographically binding them to the holder), and transmitting the credential to the Credential Holder. A Credential Issuer is an OP that has been extended in order to also issue credentials.¶ An entity about which claims are made. Example subjects include human beings, animals, and things. In many cases the Credential Holder of a credential is the subject, but in certain cases it is not. For example, a parent (the Credential Holder) might hold the credentials of a child (the subject), or a pet owner (the Credential Holder) might hold the credentials of their pet (the subject). 
Most commonly the subject will be the End-User.¶ A role an entity performs by receiving one or more credentials for processing. A verifier is an RP that has been extended to receive and process credentials.¶ This specification extends the OpenID Connect protocol for the purposes of credential issuance, whereby credential issuance refers to a protocol where a Credential Holder (acting as an RP) makes a request to a Credential Issuer (acting as an OP) to have a credential issued to it so that it may at a later stage present this credential, at the consent of the end user, to Credential Verifiers (RPs). The steps in the credential issuance protocol are as follows:¶ The Credential Holder (acting as an RP) sends a Credential Request to the Credential Issuer (acting as an OP).¶ The Credential Issuer authenticates the End-User and obtains authorization.¶ The Credential Issuer responds with a Credential.¶ These steps are illustrated in the following diagram:¶ +----------+ +----------+ | | | | | |---(1)OpenID Credential Request--->| | | | | | | | +--------+ | | | | | | | | |Credential| | End- |<--(2) AuthN & AuthZ-->|Credential| | Holder | | User | | Issuer | | (RP) | | | | (OP) | | | +--------+ | | | | | | | |<--(3)OpenID Credential Response---| | | | | | +----------+ +----------+¶ Note - How the Credential Holder then exercises presentation of this credential with a Credential Verifier is outside the scope of this specification; however, the flow looks like the following.¶ The Credential Verifier (acting as a relying party) sends an OpenID Request to the Credential Holder (acting as an OP).¶ The Credential Holder authenticates the End-User and obtains authorization.¶ The Credential Holder responds with a Credential Presentation.¶ +----------+ +----------+ | | | | | |---(1)OpenID Connect Credential Presentation Request------>| | | | | | | | +--------+ | | | | | | | | |Credential| | End- |<--(2) AuthN & AuthZ-->|Credential| | Verifier | | User | | Holder | | (RP) | | 
| | (OP) | | | +--------+ | | | | | | | |<--(3)OpenID Connect Credential Presentation Response------| | | | | | +----------+ +----------+¶ A Credential Request is an OpenID Connect authentication request made by a Credential Holder that requests the End-User to be authenticated by the Credential Issuer and consent be granted for a credential containing the requested claims about the End-User be issued to it.¶ The following section outlines how an OpenID Connect Authentication Request is extended in order to become a valid Credential Request.¶ The simplest OpenID Connect Credential Request is an ordinary OpenID Connect request that makes use of one additional scope, openid_credential.¶ A non-normative example of the Credential Request.¶ HTTP/1.1 302 Found Location: https://server.example.com/authorize? response_type=code &scope=openid%20openid_credential &client_id=s6BhdRkqt3 &state=af0ifjsldkj &redirect_uri=https%3A%2F%2Fclient.example.org%2Fcb &credential_format=w3cvc-jsonld¶ When a request of this nature is made, the access_token issued to the Credential Holder authorizes it to access the credential endpoint to obtain a credential from the Credential Issuer.¶ A Credential Request uses the OpenID and OAuth2.0 request parameters as outlined in section 3.1.2.1 of OpenID Connect core, except for the following additional constraints.¶ REQUIRED. A Credential Request MUST contain the openid_credential scope value in the second position directly after the openid scope.¶ REQUIRED. Determines the format of the credential returned at the end of the flow, values supported by the OpenID Provider are advertised in their openid-configuration metadata, under the credential_formats_supported attribute.¶ OPTIONAL. Used when making a Signed Credential Request, defines the key material the Credential Holder is requesting the credential to be bound to and the key responsible for signing the request object. The value is a JSON Object that is a valid JWK.¶ OPTIONAL. 
Defines the relationship between the key material the Credential Holder is requesting the credential to be bound to and a decentralized identifier. Processing of this value requires the CI to support the resolution of decentralized identifiers, which is advertised in their openid-configuration metadata, under the dids_supported attribute. The value of this field MUST be a valid decentralized identifier.¶ Public/private key pairs are used by a requesting Credential Holder to establish a means of binding to the resulting credential. A Credential Holder making a Credential Request to a Credential Issuer must prove control over this binding mechanism during the request; this is accomplished through the extended usage of a signed request defined in OpenID Connect Core.¶ It is RECOMMENDED that a Credential Request flow use the authorization code flow as defined in OpenID Connect Core.¶ Successful and Error Authentication Responses are handled in the same manner as in OpenID Connect Core 1.0, with the code parameter always being returned with the Authorization Code Flow.¶ On request to the Token Endpoint, the grant_type value MUST be authorization_code, in line with the Authorization Code Flow, and the code value included as a parameter.¶ The following is a non-normative example of a response from the token endpoint, whereby the access_token authorizes the Credential Holder to request a credential from the credential endpoint.¶ { ""access_token"": ""eyJhbGciOiJSUzI1NiIsInR5cCI6Ikp..sHQ"", ""token_type"": ""bearer"", ""expires_in"": 86400, ""id_token"": ""eyJodHRwOi8vbWF0dHIvdGVuYW50L..3Mz"" }¶ The Credential Endpoint is an OAuth 2.0 Protected Resource that, when called, returns Claims about the authenticated End-User in the form of a credential. 
To obtain a credential on behalf of the End-User, the Credential Holder makes a request to the Credential Endpoint using an Access Token obtained through OpenID Connect Authentication whereby the openid_credential scope was granted.¶ Communication with the Credential Endpoint MUST utilize TLS. See Section 16.17 for more information on using TLS.¶ The Credential Endpoint MUST support the use of HTTP POST methods defined in RFC 2616 [RFC2616].¶ It is RECOMMENDED that the Credential Endpoint enforce sender-constrained presentation of the OAuth 2.0 Access Token via DPoP. However, the Credential Endpoint MAY also accept Access Tokens as per OAuth 2.0 Bearer Token Usage [RFC6750].¶ The Credential Endpoint SHOULD support the use of Cross Origin Resource Sharing (CORS) [CORS] and/or other methods as appropriate to enable JavaScript clients to access the endpoint.¶ The Credential Holder may provide a signed request object containing the sub to be used as the subject for the resulting credential. When a sub claim is present within the request object, an associated sub_jwk claim MUST also be present, with which the request object MUST be signed, thereby proving control over the sub.¶ The Credential Holder may also specify the credential_format they wish the returned credential to be formatted as. If the Credential Issuer receiving the request does not support the requested credential format, it MUST return an error response, as per [TODO]. If the credential_format is not specified in the request, the Credential Issuer SHOULD respond with their preferred or default format. (Note if we are going to have a default we need to specify it or is it at the discretion of the Credential Issuer to determine this?)¶ When a signed request is not provided, the Credential Issuer will use the sub associated with the initial Credential request, where possible. If a sub value is not available, the Credential Issuer MUST return an error response, as per [TODO].¶ OPTIONAL. 
A valid OIDC signed JWT request object. The request object is used to provide a sub the Credential Holder wishes to be used as the subject of the resulting credential as well as provide proof of control of that sub.¶ A non-normative example of a Signed Credential request.¶ POST /credential HTTP/1.1 Host: https://issuer.example.com Authorization: Bearer Content-Type: application/json { ""request"": }¶ Where the decoded payload of the request parameter is as follows:¶ { ""aud"": ""https://issuer.example.com"", ""iss"": ""https://wallet.example.com"", ""sub"": ""urn:uuid:dc000c79-6aa3-45f2-9527-43747d5962a5"", ""sub_jwk"" : { ""crv"":""secp256k1"", ""kid"":""YkDpvGNsch2lFBf6p8u3"", ""kty"":""EC"", ""x"":""7KEKZa5xJPh7WVqHJyUpb2MgEe3nA8Rk7eUlXsmBl-M"", ""y"":""3zIgl_ml4RhapyEm5J7lvU-4f5jiBvZr4KgxUjEhl9o"" }, ""credential_format"": ""w3cvc-jwt"", ""nonce"": ""43747d5962a5"", ""iat"": 1591069056, ""exp"": 1591069556 }¶ format : REQUIRED. The proof format the credential was returned in. For example w3cvc-jsonld or w3cvc-jwt. credential : REQUIRED. A cryptographically verifiable proof in the defined proof format. 
Most commonly a Linked Data Proof or a JWS.¶ { ""format"": ""w3cvc-jsonld"", ""credential"": }¶ Formats of the credential can vary; examples include JSON-LD or JWT based Credentials. The Credential Issuer SHOULD make its supported credential formats available at its openid-configuration metadata endpoint.¶ The following is a non-normative example of a Credential issued in a W3C Verifiable Credentials 1.0 compliant format in JSON-LD.¶ { ""@context"": [ ""https://www.w3.org/2018/credentials/v1"", ""https://www.w3.org/2018/credentials/examples/v1"" ], ""id"": ""http://example.gov/credentials/3732"", ""type"": [""VerifiableCredential"", ""UniversityDegreeCredential""], ""issuer"": ""did:key:z6MkjRagNiMu91DduvCvgEsqLZDVzrJzFrwahc4tXLt9DoHd"", ""issuanceDate"": ""2020-03-10T04:24:12.164Z"", ""credentialSubject"": { ""id"": ""urn:uuid:dc000c79-6aa3-45f2-9527-43747d5962a5"", ""jwk"": { ""crv"":""secp256k1"", ""kid"":""YkDpvGNsch2lFBf6p8u3"", ""kty"":""EC"", ""x"":""7KEKZa5xJPh7WVqHJyUpb2MgEe3nA8Rk7eUlXsmBl-M"", ""y"":""3zIgl_ml4RhapyEm5J7lvU-4f5jiBvZr4KgxUjEhl9o"" }, ""givenName"": ""John"", ""familyName"": ""Doe"", ""degree"": { ""type"": ""BachelorDegree"", ""name"": ""Bachelor of Science and Arts"" } }, ""proof"": { ""type"": ""Ed25519Signature2018"", ""created"": ""2020-04-10T21:35:35Z"", ""verificationMethod"": ""did:key:z6MkjRagNiMu91DduvCvgEsqLZDVzrJzFrwahc4tXLt9DoHd#z6MkjRagNiMu91DduvCvgEsqLZDVzrJzFrwahc4tXLt9DoHd"", ""proofPurpose"": ""assertionMethod"", ""jws"": ""eyJhbGciOiJFZERTQSIsImI2NCI6ZmFsc2UsImNyaXQiOlsiYjY0Il19..l9d0YHjcFAH2H4dB9xlWFZQLUpixVCWJk0eOt4CXQe1NXKWZwmhmn9OQp6YxX0a2LffegtYESTCJEoGVXLqWAA"" } }¶ The following is a non-normative example of a Credential issued as a JWT¶ 
ewogICJhbGciOiAiRVMyNTYiLAogICJ0eXAiOiAiSldUIgp9.ewogICJpc3MiOiAiaXNzdWVyIjogImh0dHBzOi8vaXNzdWVyLmVkdSIsCiAgInN1YiI6ICJkaWQ6ZXhhbXBsZToxMjM0NTYiLAogICJpYXQiOiAxNTkxMDY5MDU2LAogICJleHAiOiAxNTkxMDY5NTU2LAogICJodHRwczovL3d3dy53My5vcmcvMjAxOC9jcmVkZW50aWFscy9leGFtcGxlcy92MS9kZWdyZWUiOiB7CiAgICAgImh0dHBzOi8vd3d3LnczLm9yZy8yMDE4L2NyZWRlbnRpYWxzL2V4YW1wbGVzL3YxL3R5cGUiOiAiQmFjaGVsb3JEZWdyZWUiLAogICAgICJodHRwczovL3d3dy53My5vcmcvMjAxOC9jcmVkZW50aWFscy9leGFtcGxlcy92MS9uYW1lIjogIkJhY2hlbG9yIG9mIFNjaWVuY2UgYW5kIEFydHMiCiAgfQp9.SflKxwRJSMeKKF2QT4fwpMeJf36POk6yJV_adQssw5c¶ And the decoded Claim Set of the JWT¶ { ""iss"": ""https://issuer.example.com"", ""sub"": ""urn:uuid:dc000c79-6aa3-45f2-9527-43747d5962a5"", ""sub_jwk"" : { ""crv"":""secp256k1"", ""kid"":""YkDpvGNsch2lFBf6p8u3"", ""kty"":""EC"", ""x"":""7KEKZa5xJPh7WVqHJyUpb2MgEe3nA8Rk7eUlXsmBl-M"", ""y"":""3zIgl_ml4RhapyEm5J7lvU-4f5jiBvZr4KgxUjEhl9o"" }, ""iat"": 1591069056, ""exp"": 1591069556, ""https://www.w3.org/2018/credentials/examples/v1/degree"": { ""https://www.w3.org/2018/credentials/examples/v1/type"": ""BachelorDegree"", ""https://www.w3.org/2018/credentials/examples/v1/name"": ""Bachelor of Science and Arts"" } }¶ TODO improve this section¶ A decentralized identifier is a resolvable identifier to a set of statements about the DID subject, including a set of cryptographic material (e.g. public keys). 
Using this cryptographic material, a decentralized identifier can be used as an authenticatable identifier in a credential, rather than using a public key directly.¶ A Credential Holder submitting a signed Credential Request can request that the resulting credential be bound to the Credential Holder through the usage of decentralized identifiers by using the did field.¶ Prior to submitting a credential request, a Credential Holder SHOULD validate that the CI supports the resolution of decentralized identifiers by retrieving their openid-configuration metadata to check if an attribute of dids_supported has a value of true.¶ The Credential Holder SHOULD also validate that the CI supports the did method to be used in the request by retrieving their openid-configuration metadata to check if an attribute of did_methods_supported contains the required did method.¶ A CI processing a credential request featuring a decentralized identifier MUST perform the following additional steps to validate the request.¶ Validate that the value in the did field is a valid decentralized identifier.¶ Resolve the did value to a did document.¶ Validate that the key in the sub_jwk field of the request is referenced in the authentication section of the DID Document.¶ If any of the steps fail, then the CI MUST respond to the request with the Error Response parameter, section 3.1.2.6. 
with Error code: invalid_did.¶ The following is a non-normative example of requesting the issuance of a credential that uses a decentralized identifier.¶ { ""response_type"": ""code"", ""client_id"": ""IAicV0pt9co5nn9D1tUKDCoPQq8BFlGH"", ""sub_jwk"" : { ""crv"":""secp256k1"", ""kid"":""YkDpvGNsch2lFBf6p8u3"", ""kty"":""EC"", ""x"":""7KEKZa5xJPh7WVqHJyUpb2MgEe3nA8Rk7eUlXsmBl-M"", ""y"":""3zIgl_ml4RhapyEm5J7lvU-4f5jiBvZr4KgxUjEhl9o"" }, ""did"": ""did:example:1234"", ""redirect_uri"": ""https://Client.example.com/callback"", ""credential_format"": ""w3cvc-jsonld"" }¶ The following is a non-normative example of a credential endpoint response for the request shown above.¶ { ""format"": ""w3cvc-jsonld"", ""credential"": { ""@context"": [ ""https://www.w3.org/2018/credentials/v1"", ""https://www.w3.org/2018/credentials/examples/v1"" ], ""id"": ""http://example.gov/credentials/3732"", ""type"": [""VerifiableCredential"", ""UniversityDegreeCredential""], ""issuer"": ""https://issuer.edu"", ""issuanceDate"": ""2020-03-10T04:24:12.164Z"", ""credentialSubject"": { ""id"": ""did:example:1234"", ""degree"": { ""type"": ""BachelorDegree"", ""name"": ""Bachelor of Science and Arts"" } }, ""proof"": { ""type"": ""Ed25519Signature2018"", ""created"": ""2020-04-10T21:35:35Z"", ""verificationMethod"": ""https://issuer.edu/keys/1"", ""proofPurpose"": ""assertionMethod"", ""jws"": ""eyJhbGciOiJFZERTQSIsImI2NCI6ZmFsc2UsImNyaXQiOlsiYjY0Il19..l9d0YHjcFAH2H4dB9xlWFZQLUpixVCWJk0eOt4CXQe1NXKWZwmhmn9OQp6YxX0a2LffegtYESTCJEoGVXLqWAA"" } } }¶ An OpenID provider can use the following metadata elements to advertise its support for credential issuance in its openid-configuration defined by OpenID-Discovery.¶ credential_supported Boolean value indicating that the OpenID provider supports the credential issuance flow.¶ credential_endpoint A JSON string value indicating the location of the OpenID provider's credential endpoint.¶ credential_formats_supported A JSON array of strings identifying the 
resulting format of the credential issued at the end of the flow.¶ credential_claims_supported A JSON array of strings identifying the claim names supported within an issued credential.¶ credential_name A human-readable string to identify the name of the credential offered by the provider.¶ dids_supported Boolean value indicating that the OpenID provider supports the resolution of decentralized identifiers.¶ did_methods_supported A JSON array of strings representing Decentralized Identifier Methods that the OpenID provider supports resolution of.¶ The following is a non-normative example of the relevant entries in the openid-configuration metadata for an OpenID Provider supporting the credential issuance flow¶ { ""dids_supported"": true, ""did_methods_supported"": [ ""did:ion:"", ""did:elem:"", ""did:sov:"" ], ""credential_supported"": true, ""credential_endpoint"": ""https://server.example.com/credential"", ""credential_formats_supported"": [ ""w3cvc-jsonld"", ""jwt"" ], ""credential_claims_supported"": [ ""given_name"", ""last_name"", ""https://www.w3.org/2018/credentials/examples/v1/degree"" ], ""credential_name"": ""University Credential"" }¶ In certain instances it is advantageous to be able to construct a URL that points at an OpenID Connect provider and is invocable by a supporting OpenID Connect client.¶ The URL SHOULD use the scheme openid to allow supporting clients to register intent to handle the URL.¶ The URL SHOULD feature the term discovery in the host portion of the URL, identifying that the intent of the URL is to communicate discovery related information.¶ The URL SHOULD feature a query parameter with key issuer whose value corresponds to a valid issuer identifier as defined in OpenID Connect Discovery. 
This identifier MUST be a URL using the https:// scheme which, when concatenated with the string /.well-known/openid-configuration and dereferenced via an HTTP GET request, results in the retrieval of the provider's OpenID Connect Metadata.¶ The following is a non-normative example of an invocable URL pointing to the OpenID Provider with the issuer identifier of https://issuer.example.com¶ openid://discovery?issuer=https://issuer.example.com¶",https://Mattrglobal.github.io/oidc-client-bound-assertions-spec/,,Spec,,Standards,,,,,,,OIDC,2021-04-20,,,,,,,,,,,,,
Mattr,CCG,,GitHub,Dave Longley ; Manu Sporny,Mattr,,,,,Revocation List 2020,"This specification describes a privacy-preserving, space-efficient, and high-performance mechanism for publishing the revocation status of Verifiable Credentials.","This specification describes a privacy-preserving, space-efficient, and high-performance mechanism for publishing the revocation status of Verifiable Credentials. This document is experimental and is undergoing heavy development. It is inadvisable to implement the specification in its current form. An experimental implementation is available. It is often useful for an issuer of verifiable credentials [[VC-DATA-MODEL]] to link to a location where a verifier can check to see if a credential has been revoked. There are a variety of privacy and performance considerations that are made when designing, publishing, and processing revocation lists. One such privacy consideration happens when there is a one-to-one mapping between a verifiable credential and a URL where the revocation status is published. This type of mapping enables the website that publishes the URL to correlate the holder, time, and verifier when the status is checked. This could enable the issuer to discover the type of interaction the holder is having with the verifier, such as providing an age verification credential when entering a bar. 
Being tracked by the issuer of a driver's license when entering an establishment violates a privacy expectation that many people have today. Similarly, there are performance considerations that are explored when designing revocation lists. One such consideration is where the list is published and the burden it places, from a bandwidth and processing perspective, both on the server and the client fetching the information. In order to meet privacy expectations, it is useful to bundle the status of large sets of credentials into a single list to help with herd privacy. However, doing so can place an impossible burden on both the server and client if the status information is as much as a few hundred bytes in size per credential across a population of hundreds of millions of holders. The rest of this document proposes a highly compressible, bitstring-based revocation list mechanism with strong privacy-preserving characteristics that is compatible with the architecture of the Web, is highly space-efficient, and lends itself well to content distribution networks. As an example of using this specification to achieve a number of beneficial privacy and performance goals, it is possible to construct a revocation list for 100,000 verifiable credentials that is roughly 12,500 bytes in size in the worst case. In a case where a few hundred credentials have been revoked, the size of the list is less than a few hundred bytes while providing privacy in a herd of 100,000 individuals. This section outlines the core concept utilized by the revocation list mechanism described in this document. At the most basic level, revocation information for all verifiable credentials issued by an issuer is expressed as simple binary values. The issuer keeps a bitstring list of all verifiable credentials it has issued. Each verifiable credential is associated with a position in the list. 
If the binary value of the position in the list is 1 (one), the verifiable credential is revoked; if it is 0 (zero), it is not revoked. One of the benefits of using a bitstring is that it is a highly compressible data format since, in the average case, large numbers of credentials will remain unrevoked. This ensures long sections of bits that are the same value and thus highly compressible using run-length compression techniques such as ZLIB [[RFC1950]]. The default bitstring size is 16KB (131,072 entries), and when only a handful of verifiable credentials are revoked, the compressed bitstring size is reduced down to a few hundred bytes. Another benefit of using a bitstring is that it enables large numbers of verifiable credential revocation statuses to be placed in the same list. This specification utilizes a minimum bitstring length of 131,072 (16KB). This population size ensures an adequate amount of herd privacy in the average case. If better herd privacy is required, the bitstring can be made to be larger. The following sections outline the data model for this document. When an issuer desires to enable revocation for a verifiable credential, they MAY add a status property that uses the data model described in this specification. 
|Property||Description| |id|| The constraints on the | |type|| The | |revocationListIndex|| The | |revocationListCredential|| The | { ""@context"": [ ""https://www.w3.org/2018/credentials/v1"", ""https://w3id.org/VC-revocation-list-2020/v1"" ], ""id"": ""https://example.com/credentials/23894672394"", ""type"": [""VerifiableCredential""], ""issuer"": ""did:example:12345"", ""issued"": ""2020-04-05T14:27:42Z"", ""credentialStatus"": { ""id"": ""https://dmv.example.gov/credentials/status/3#94567"", ""type"": ""RevocationList2020Status"", ""revocationListIndex"": ""94567"", ""revocationListCredential"": ""https://example.com/credentials/status/3"" }, ""credentialSubject"": { ""id"": ""did:example:6789"", ""type"": ""Person"" }, ""proof"": { ... } } When a revocation list is published, the result is a verifiable credential that encapsulates the revocation list. The following section describes the format of the verifiable credential that encapsulates the revocation list: |Property||Description| |id|| The verifiable credential that contains the revocation list MUST express an | |type|| The verifiable credential that contains the revocation list MUST express a | |credentialSubject.type|| The | |credentialSubject.encodedList|| The | { ""@context"": [ ""https://www.w3.org/2018/credentials/v1"", ""https://w3id.org/VC-revocation-list-2020/v1"" ], ""id"": ""https://example.com/credentials/status/3"", ""type"": [""VerifiableCredential"", ""RevocationList2020Credential""], ""issuer"": ""did:example:12345"", ""issued"": ""2020-04-05T14:27:40Z"", ""credentialSubject"": { ""id"": ""https://example.com/status/3#list"", ""type"": ""RevocationList2020"", ""encodedList"": ""H4sIAAAAAAAAA-3BMQEAAADCoPVPbQsvoAAAAAAAAAAAAAAAAP4GcwM92tQwAAA"" }, ""proof"": { ... } } The following section outlines the algorithms that are used to generate and validate revocation lists as described by this document. 
The following process, or one generating the exact output, MUST be followed when producing a RevocationList2020Credential: generate a compressed bitstring by passing the list of issued credentials to the revocation list bitstring generation algorithm, then create a RevocationList2020Credential with its encodedList property set to the compressed bitstring. The following process, or one generating the exact output, MUST be followed when validating a verifiable credential that is contained in a RevocationList2020Credential: let credential be a verifiable credential containing a status entry that is a RevocationList2020Status; let compressed bitstring be the value of the encodedList property of the RevocationList2020Credential; let credential index be the value of the revocationListIndex property of the RevocationList2020Status; generate a revocation bitstring by expanding compressed bitstring; let revoked be the value of the bit at position credential index in the revocation bitstring; return true if revoked is 1, false otherwise. The following process, or one generating the exact output, MUST be followed when generating a revocation list bitstring. The algorithm takes an issuedCredentials list as input and returns a compressed bitstring as output: let bitstring be a list of bits of at least the minimum size, each initialized to 0 (zero); for each bit in bitstring, if there is a corresponding revocationListIndex value in a revoked credential in issuedCredentials, set the bit to 1 (one), otherwise set the bit to 0 (zero); compress bitstring using ZLIB and return the result. The following process, or one generating the exact output, MUST be followed when expanding a compressed revocation list bitstring. The algorithm takes a compressed bitstring as input and returns an uncompressed bitstring as output. This section details the general privacy considerations and specific privacy implications of deploying this specification into production environments. This document specifies a minimum revocation bitstring length of 131,072, or 16KB uncompressed. This is enough to give holders an adequate amount of herd privacy if the number of verifiable credentials issued is large enough. However, if the number of issued verifiable credentials is a small population, the ability to correlate an individual increases because the number of allocated slots in the bitstring is small. Correlating this information with, for example, the geographic region a request came from can also help to correlate individuals that have received a credential from the same geographic region. 
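The bitstring generation, expansion, and index-check steps described above can be sketched in a few lines of Python. This is a non-normative illustration under stated assumptions: the function names are invented here, and the bit order (most-significant bit first) and plain base64 encoding of the ZLIB-compressed bitstring are assumptions, since the excerpt does not pin down the exact byte-level encoding of encodedList.

```python
import base64
import zlib

# The spec's minimum bitstring length: 131,072 entries = 16KB uncompressed.
MIN_BITSTRING_BYTES = 16 * 1024

def generate_encoded_list(revoked_indices):
    # Build the bitstring: one bit per issued credential, all 0 (not revoked).
    bits = bytearray(MIN_BITSTRING_BYTES)
    for index in revoked_indices:
        # Set the bit at this revocationListIndex (MSB-first, an assumption).
        bits[index // 8] |= 0x80 >> (index % 8)
    # ZLIB-compress, then base64-encode for the encodedList property.
    return base64.b64encode(zlib.compress(bytes(bits))).decode('ascii')

def is_revoked(encoded_list, credential_index):
    # Expand the compressed bitstring, then test the bit at credential_index.
    bits = zlib.decompress(base64.b64decode(encoded_list))
    return bool(bits[credential_index // 8] & (0x80 >> (credential_index % 8)))
```

With only a handful of revoked indices the mostly-zero 16KB bitstring compresses to well under a few hundred bytes, consistent with the size characteristics described earlier.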
It is possible for verifiers to increase the privacy of the holder whose verifiable credential is being checked by caching revocation lists that have been fetched from remote servers. By caching the content locally, less correlatable information can be inferred from verifier-based access patterns on the revocation list. The use of content distribution networks by issuers can increase the privacy of holders by reducing or eliminating requests for the revocation lists from the issuer. Often, a request for a revocation list will be served by an edge device and thus be faster and reduce the load on the server as well as cloaking verifiers and holders from issuers. There are a number of security considerations that implementers should be aware of when processing data described by this specification. Ignoring or not understanding the implications of this section can result in security vulnerabilities. While this section attempts to highlight a broad set of security considerations, it is not a complete list. Implementers are urged to seek the advice of security and cryptography professionals when implementing mission critical systems using the technology outlined in this specification. Write security considerations. There are a number of accessibility considerations implementers should be aware of when processing data described in this specification. As with any web standards or protocols implementation, ignoring accessibility issues makes this information unusable to a large subset of the population. It is important to follow accessibility guidelines and standards, such as [[WCAG21]], to ensure all people, regardless of ability, can make use of this data. This is especially important when establishing systems utilizing cryptography, which have historically created problems for assistive technologies. This section details the general accessibility considerations to take into account when utilizing this data model. Write accessibility considerations. 
There are a number of internationalization considerations implementers should be aware of when publishing data described in this specification. As with any web standards or protocols implementation, ignoring internationalization makes it difficult for data to be produced and consumed across a disparate set of languages and societies, which would limit the applicability of the specification and significantly diminish its value as a standard. This section outlines general internationalization considerations to take into account when utilizing this data model. Write i18n considerations.",https://w3c-ccg.github.io/VC-status-rl-2020/,,Spec,,Standards,,,Revocation,,,VC,,2020-04-05,,,,,,,,,,,,, +Mattr,Mattr,,,,,,,,,Verifiable Credential based Authentication via OpenID Connect,"At MATTR, we’ve been working hard on an exciting opportunity with the Government of British Columbia (BC Gov) in Canada. In June 2019, the BC Gov Verifiable Organisations Network team put out a “Code With Us” development bounty to integrate KeyCloak, their chosen enterprise Identity and Access Management (IAM) solution, with a new W3C standard called Verifiable Credentials. This work led to a solution that enables the use of Verifiable Credentials (VC) as a means of authentication that is interoperable with OpenID Connect (OIDC). We call this work VC-AuthN-OIDC. The output is an adapter that bridges these standards and enables a whole new set of capabilities through a simple extension of most modern IAM solutions.",,https://mattr.global/verifiable-credential-based-authentication-via-openid-connect/,https://mattr.global/wp-content/uploads/2019/10/0_Kcm1VBTjAxZP9Dkk-1024x465.png,post,,Standards,,,,,,,,2019-12-10,,,,,,,,,,,,, Meeco,,Meeco,,Katryna Dow,MyData; Respect Network; DIF,"Australia, Melbourne, Victoria",Europe,GDPR,,Meeco,"Meeco gives people and organisations the tools to access, control and create mutual value from Personal data

Privately, securely and with explicit consent","Put your customers in control of their Personal data, identity and digital assets Unlock the power of permissioned Personal data and digital assets with enterprise infrastructure that has privacy, security and convenience built in. Reduce cost and meet data compliance requirements on a range of use cases, from decentralised identity to digital asset management. Deploy new business models built on digital trust and evolve existing applications from Web2 to Web3 with our platform for Personal identity and data ecosystems. Trust is a key enabler of connected digital communities. It is central to delivering sustainable outcomes across financial services, mobility, health, education, environment, public administration, employment and eCommerce. Seamless experiences are underpinned by tools that deliver interoperability. Citizens, employees, students, patients and customers can securely transact across networks and ecosystems. Hybrid infrastructure will support the transition from Web2 to Web3, delivering security, convenience and decentralised services. Enterprise customers can complete their Web3 transition with Secure Value Exchange by Meeco. 
Offering secure data storage through to self-sovereign identity and verifiable credentials, SVX is a complete toolkit for enterprise customers to deploy trusted Personal data ecosystems.",https://meeco.me,,Company,,Company,Enterprise,ID,,Personal Data,,,,2012-08-23,https://github.com/Meeco,https://twitter.com/meeco_me,https://www.youtube.com/user/MeecoMe,https://blog.meeco.me/,,,https://www.crunchbase.com/organization/meeco,https://www.linkedin.com/company/meeco-me/,,https://dev.meeco.me/,https://app.meeco.me/,, Meeco,Meeco,,,,,,,EU Data Strategy,,European Strategy for Data,"A Meeco Review of the European Strategy for Data Communication from the European Commission on February 19th, 2020",,https://media.meeco.me/public-assets/reports/Meeco_Review_of_European_Strategy_for_Data.pdf,,Report,,Ecosystem,Public,,,,,,,2020-02-19,,,,,,,,,,,,, Meeco,Meeco,,,,,,,EU Data Governance Act,,EU Data Governance Act,"We welcome the regulation as a needed common ground for clarifying the role of data intermediaries, building trust in these intermediaries and setting the direction for data governance, including the emergence of digital human rights. In this context we offer the following suggestions:
1. Explicitly include individuals as active participants in the definitions [...]
2. Clarify the scope of the data sharing services (Art. 9 (2)) and extend it to include services that empower the data subject beyond compliance.
3. Foster the growth of intermediaries, which offer new technologies and have the greatest likelihood of success in Europe if supported by the Data Governance Act.
4. Open silos and implement soft infrastructure such as standards & open APIs to accelerate uptake and interoperability between data sharing services.
5. Foster eco-systems and demonstrate the value through practical use-cases.
6. Create a level playing field for sustainable data sharing by providing funding to pioneers at the forefront of developing data eco-systems

","The proposed European Data Governance Act is another progressive indication that the EU is seeking to develop a more equitable digital economy. However, where we go from here depends on how the European Union is able to use the Data Governance Act to strike a balance between the existing tech giants and data platforms alongside an entirely new range of services designed to enable the collection, protection and exchange of data. Currently, a handful of global players enjoy a virtual monopoly on the exploitation of data. Unlocking these data silos and regulating for data mobility and interoperability will provide the vital infrastructure required for meeting the challenges of the next century, including timely and informed decision making. At Meeco we believe that enabling citizens, students, patients, passengers and consumers to more equitably join the value chains fuelled by data will ultimately lead to greater trust and Personalisation, resulting in a more prosperous society. However, this will require new commercial models, enforceable regulation such as the Data Governance Act and the digital tools to transform our connected society. We believe this will lead to significant benefits to including Personalised health and education, increased financial literacy and better financial decisions, more informed consumer choices which also contribute to protecting our environment. Meeco is endorsing the Data Governance Act as a founding member of Data Sovereignty Now; a coalition of leading Europe-based technology companies, research institutions and not-for-profit organisations. We are working together to ensure that the control of data remains in the hands of the people and organisations that generate it in order to play a key role in not only securing the rights of individuals over their data, but also providing significant stimulus for the digital economy. 
Meeco is also a member of MyData Global and was amongst the first 16 organisations to be awarded the MyData Operator designation in 2020. We join in the goal towards developing interconnected and human-centric data intermediaries to meet the Personalisation and equity challenges of open digital society. We welcome the regulation as a needed common ground for clarifying the role of data intermediaries, building trust in these intermediaries and setting the direction for data governance, including the emergence of digital human rights. In this context we offer the following suggestions: - Explicitly include individuals as active participants in the definitions: define the key roles in data sharing (Art. 2 Definitions) so that data rights holders (data subject) and technical data holders (controller or processor) can be separated and acknowledge the type of data sharing where individuals are active participants in the transactions. - Clarify the scope of the data sharing services (Art. 9 (2)) and extend it to include services that empower the data subject beyond compliance. - Foster the growth of intermediaries, which offer new technologies and have the greatest likelihood of success in Europe if supported by the Data Governance Act. - Open silos and implement soft infrastructure such as standards & open APIs to accelerate uptake and interoperability between data sharing services. - Foster eco-systems and demonstrate the value through practical use-cases. The EU data sharing ecosystem is formative; therefore, it is imperative to demonstrate utility and benchmark best practices that contribute to a more sustainable, healthy, resilient and safe digital society. - Create a level playing field for sustainable data sharing by providing funding to pioneers at the forefront of developing data eco-systems, this includes start-ups, scale-ups alongside established enterprises. 
Included is a Meeco white paper detailing practical use-cases aligned to our response, including the barriers the Data Governance Act can address to make data work for all. Meeco is a global leader in the collection, protection & permission management of Personal data and decentralised identity. Our award-winning patented API platform & tools enable developers to create mutual value through the ethical exchange of Personal data. Privately, securely and always with explicit consent. Data Sovereignty Now is a coalition of partners who believe that Data Sovereignty should become the guiding principle in the development of national and European data sharing legislation. Data Sovereignty is the key driver for super-charging the data economy by putting the control of Personal and business data back in the hands of the people and organisations which generate it. The foundation members include aNewGovernance, freedom lab, INNOPAY, International Data Spaces Association, iSHARE, Meeco, University of Groningen, MyData Global, SITRA, The Chain Never Stops and TNO. MyData Global is an award-winning international non-profit. The purpose of MyData Global is to empower individuals by improving their right to self-determination regarding their Personal data, based on the MyData Declaration. MyData Global has over 100 organisation members and more than 400 individual members from over 40 countries, on six continents.",https://blog.meeco.me/eu-data-governance-act/,,Review,,Ecosystem,Public,,,,,,,2021-02-16,,,,,,,,,,,,,
@@ -343,7 +349,7 @@ MyDex,MyDex,,Medium,,,,,,,Our New White Paper: Achieving Transformation At Scale
MyDex,MyDex,,Medium,,,,,,,Helping Data Trusts Manage Personal Data,"Mydex CIC has just published a blog for Cambridge University’s Data Trust Initiative on ‘Helping Data Trusts Manage Personal Data’.
In it, we address the challenges that arise as the Data Trust movement begins to scale.","Helping Data Trusts Manage Personal Data Mydex CIC has just published a blog for Cambridge University’s Data Trust Initiative on ‘Helping Data Trusts Manage Personal Data’. In it, we address the challenges that arise as the Data Trust movement begins to scale. In a world where many different Data Trusts want to access individuals’ data for a range of different purposes and services, two questions arise: - How can many different Data Trusts collect/access the data they need from the same individuals without creating far-reaching duplication of cost and effort? - How can individuals keep track of, and assert control over, the data they are sharing with many different Data Trusts? One answer, we suggest, is to use individuals’ Personal data stores as infrastructure for Data Trusts. Individuals can use their PDSs to feed their data to the Trusts they want to support and to exercise appropriate controls over this data. The blog goes into more detail as to how this can work.",https://medium.com/mydex/helping-data-trusts-manage-personal-data-4215faaee5f2,,Post,,Product,,,,,,,,2022-05-03,,,,,,,,,,,,, SecureKey,Avast,SecureKey,,Greg Wolfond,"DHS, DIF","Canada, Ontario, Toronto",Canada,,,SecureKey Technologies,"SecureKey is a leading identity and authentication provider that simplifies consumer access to online services and applications. SecureKey’s next generation privacy-enhancing services enable consumers to conveniently and privately assert identity information using trusted providers, such as banks, telcos and governments, helping them connect to critical online services with a digital credential they already have and trust, while ensuring that information is only ever shared with explicit user consent. 
SecureKey is a champion of the ecosystem approach to identity, revolutionizing the way consumers and organizations approach identity and attribute sharing in the digital age.",,https://securekey.com/,,Company,,Company,Enterprise,ID,"SSI, Supply Chain",,,,DID,2008,https://twitter.com/SecureKey,https://www.youtube.com/user/SecureKeyTech,,https://www.crunchbase.com/organization/securekey-technologies,https://www.linkedin.com/company/securekey/,,,,,,,, Spherity,,Spherity,,Carsten Stoecker,Sovrin Steward,"European Union, Germany, Berlin, Berlin",Europe,,,Spherity,"Spherity is building decentralized identity management solutions to power the 4th industrial revolution, bringing secure identities (“Digital Twins”) to machines, algorithms, and other non-human entities.

Spherity’s Digital Twins enable innovative customer journeys across mobility, supply chain transparency, risk assessment, audit trails for data analytics, and many more use cases.

Our developers and systems designers combine years of deep research in the emerging decentralized identity space with a wide range of cross-industry experience. They have built and refined complex, bespoke information systems for supply chain management, data-rich manufacturing, and next-generation data control.

We participate in key standards processes and community conferences to ensure full compliance and interoperability in the complex technological landscapes of decentralization and self-sovereign identity","Credentialing the world for a new internet age with digital trust Enable digital trust in your ecosystems by implementing decentral identities and verifiable credentials. Leverage the trust to streamline your business processes. Start now and use our solutions to easily integrate with your existing IT landscape. OUR ECOSYSTEM AND PARTNERS Products The Spherity Product Suite Two products. Same mission. CARO Credentialing Service for US DSCSA compliance. Spherity’s compact app to authenticate direct and indirect pharmaceutical Authorized Trading Partners in real-time.Learn more Digital Product Passport Boost your compliance with regulatory requirements introduced by the New EU Battery Regulation with Spherity’s Digital Product Passport.Learn more Services Supporting you in Strengthening Trust through Digital Identity. Set-up your trust-ecosystem in your specific industry.Learn more Stay sphered, join our newsletter! Receive product updates and the latest tech trends across industries. We care about the protection of your data. Read our Privacy Policy. Resources Read and watch in-depth articles on case studies, solutions, technical implementations, and more! How issuers can manage credential revocation? Spherity has developed an Ethereum-based credential revocation mechanism for use in the US pharmaceutical supply chain. In brief, a credential issuer examines real-world evidence, such as a trading license,... COP27: Digital Trust Technology Supports International Climate Action The Government of British Columbia (B.C.) and Spherity, both members of the Global Battery Alliance (GBA), are cooperating to facilitate the secure exchange of sustainability information using digital trust technology. 
Product Passport Pioneers - #6 with Mario Malzacher, Circular Fashion In this episode, we speak to Mario Malzacher, CO-Founder of CircularFashion. Mario is driving the circular economy in the textile industry. He heads and participates in research projects of the BMWK...",https://spherity.com,,Company,,Company,Enterprise,,"ID,AI,IOT",,,,"ISO 27001,DID,Verifiable Credentials",2017,,https://twitter.com/spherityproject,https://www.youtube.com/channel/UCJd30vQ46EYCq0KFysJtRMg,https://medium.com/@spherityy,https://medium.com/@spherityy,,https://www.crunchbase.com/organization/spherity,https://de.linkedin.com/company/spherity; ,,,,, -Spherity,Spherity,,Medium,,EBSI; EIDAS; W3C,,,European Data Infrastructure,,"Spherity connects the dots between SSI, AI, and European Data Infrastructure","Juan Caballero attended the stakeholder meeting for the European Blockchain Services Infrastructure project in Brussels, where architects and legal counsel presented their requirements and reports for the next round of development in partnership with industry leaders and contractors. [...] The most interesting development [...] the report from Nacho Alamilla, a key legal advisor for EBSI, on the functional limits of the current eIDAS (cross-border electronic signature mutual recognition) system in Europe and possible revisions or refinements of it being discussed in the EU.[...]
[Carsten Stöcker](https://Medium.com/u/2d7ca4c61292) and [Ricky Thiermann](https://Medium.com/u/16518b469d1e) were in Bonn attended the High-Tech Partnering Conference [#HTPC20](https://www.htpc-htgf.de/en-gb/home) organized by our lead investor [High-Tech Gründerfonds](https://high-tech-gruenderfonds.de/en/the-decentralized-identity-and-digital-twin-pioneer-spherity-receives-seed-financing-from-htgf/) (HTGF). Carsten had a keynote about “How to unlock the untapped business potential of IOT devices with digital identity”. Further we were able to exchange with the other start-ups of High-Tech Gründerfonds’ portfolio and to establish relations to HTGF’s industry and corporate partners.\
[...]
At the end of January, [Juan Caballero](https://Medium.com/u/7da78f634e80) and [Carsten Stöcker](https://Medium.com/u/2d7ca4c61292) were in Amsterdam, attending the specification-writing face-to-face meeting of the Worldwide Web Consortium’s Decentralized Identifier Working Group (W3C DID-WG). [...] The main event at this meeting was the renegotiation of the limits and interoperability of [DID Documents](https://Medium.com/spherity/ssi101-what-exactly-gets-written-to-a-blockchain-69ef1a88fa3c), which has become a sticking point in recent months due to the complexity of ongoing development based on different encodings (traditional JSON, JSON-LinkedData, CBOR, and even ASN.1 and PDF).\
[...]
On 31st January [Marius Goebel](https://Medium.com/u/3a23dedbeb33) attended the steering committee of the “Standardization Roadmap Artificial Intelligence” for the German Federal Ministry of Economics and Energy ([BMWi](https://www.bmwi.de/Navigation/EN/Home/home.html)) hosted by [DIN](https://www.din.de/en) [German Institute for Standardization] and [DKE](https://www.dke.de/en) [German Commission for Electrical, Electronic & Information Technologies].\
[...]
[Sphertiy](http://www.spherity.com/) is contributing to the working groups around the fields of “IT security in artificial intelligence (AI) systems” and “Certification and quality of AI systems” delivering its expertise in the fields of digital identities, in particular auditability, authenticity, traceability and identifiability of data and artificial intelligences (AIs).","Spherity connects the dots between SSI, AI, and European Data Infrastructure Recap of the first month of the new year Spherity started the year off with a busy travel itinerary, participating in standards work and startup communities. We met with the stakeholders of the European Blockchain Services Infrastructure, shared the business potential of the Internet of Things, made headway on the industry-wide groundwork for more robustly interoperable Decentralized Identifiers, and pushed forward the Identity capabilities of Germany’s Artificial Intelligence standards body. European Blockchain Services Infrastructure, Brussels Juan Caballero attended the stakeholder meeting for the European Blockchain Services Infrastructure project in Brussels, where architects and legal counsel presented their requirements and reports for the next round of development in partnership with industry leaders and contractors. We have built relationships with the key architects of the new system, and will be following closely the tenders and calls for industry input of this epochal project for European integration. The most interesting development, perhaps, was not the EBSI project itself (or its self-sovereign research and development framework, eSSIF), but the report from Nacho Alamilla, a key legal advisor for EBSI, on the functional limits of the current eIDAS (cross-border electronic signature mutual recognition) system in Europe and possible revisions or refinements of it being discussed in the EU. 
High-Tech Partnering Conference, Bonn Carsten Stöcker and Ricky Thiermann were in Bonn attended the High-Tech Partnering Conference #HTPC20 organized by our lead investor High-Tech Gründerfonds (HTGF). Carsten had a keynote about “How to unlock the untapped business potential of IOT devices with digital identity”. Further we were able to exchange with the other start-ups of High-Tech Gründerfonds’ portfolio and to establish relations to HTGF’s industry and corporate partners. Worldwide Web Consortium’s (W3C), Amsterdam At the end of January, Juan Caballero and Carsten Stöcker were in Amsterdam, attending the specification-writing face-to-face meeting of the Worldwide Web Consortium’s Decentralized Identifier Working Group (W3C DID-WG). As observers consulting on specific use-cases for the supplemental use-case document, Spherity met with stakeholders and designers hammering out a rescoping and process change for the specifications underlying interoperable “DIDs”. We will be representing our clients’ IoT requirements and use cases in the ongoing industry inputs to the standards process. The main event at this meeting was the renegotiation of the limits and interoperability of DID Documents, which has become a sticking point in recent months due to the complexity of ongoing development based on different encodings (traditional JSON, JSON-LinkedData, CBOR, and even ASN.1 and PDF). To make the security and the translation of DID-Documents more manageable across these divergent encodings, the working group decided to define a finite list of valid DID Document properties and contents, establishing a threshold for method interoperability and a standard registry for maintenance of the standard. More complex extensibility mechanisms, which might be difficult for all other standards-compliant methods to support fully, has been relegated to a separate layer linked via the @Context to allow simpler systems to remain fully-compliant. 
Other extension mechanisms around “metadata”, matrix parameters (which work like query strings or URIs), and the incorporation of a broader base of use-cases were also discussed. For a more detailed guide to the online documentation, see Juan Caballero’s report here. Standardization Roadmap Artificial Intelligence, Berlin On 31st January Marius Goebel attended the steering committee of the “Standardization Roadmap Artificial Intelligence” for the German Federal Ministry of Economics and Energy (BMWi) hosted by DIN [German Institute for Standardization] and DKE [German Commission for Electrical, Electronic & Information Technologies]. Experts from business and society are working jointly to develop a roadmap for norms and standards in the field of artificial intelligence. The aim is to develop a framework for standardization at an early stage. The standardization roadmap will include an overview of existing norms and standards on AI aspects and, in particular, make recommendations with regard to future activities that are still necessary. It will be developed by the respective stakeholders from industry, science, public authorities and society. The roadmap will make a significant contribution to introducing and enforcing the national position at European and international level. Sphertiy is contributing to the working groups around the fields of “IT security in artificial intelligence (AI) systems” and “Certification and quality of AI systems” delivering its expertise in the fields of digital identities, in particular auditability, authenticity, traceability and identifiability of data and artificial intelligences (AIs). “For us as a society as well as for us as Spherity, it is immensely important, in my view, that we necessarily deal with the topic of standardization of artificial intelligence — not least because this technology has the potential to completely transform our concept of what it means to be human.”, Marius Goebel says. 
With its activity within the steering committee of the “Standardization Roadmap Artificial Intelligence” Spherity is contributing to the AI strategy of the German Federal Government. Aim of the steering committee is to overhand a first draft by November 2020.",https://medium.com/spherity/spherity-connects-the-dots-between-ssi-ai-and-european-data-infrastructure-1f626e77ba7,,Post,,Ecosystem,Public,,,,,,,2020-02-06,,,,,,,,,,,,, +Spherity,Spherity,,Medium,,EBSI; EIDAS; W3C,,,European Data Infrastructure,,"Spherity connects the dots between SSI, AI, and European Data Infrastructure","Juan Caballero attended the stakeholder meeting for the European Blockchain Services Infrastructure project in Brussels, where architects and legal counsel presented their requirements and reports for the next round of development in partnership with industry leaders and contractors. [...] The most interesting development [...] the report from Nacho Alamilla, a key legal advisor for EBSI, on the functional limits of the current eIDAS (cross-border electronic signature mutual recognition) system in Europe and possible revisions or refinements of it being discussed in the EU.[...]
[Carsten Stöcker](https://Medium.com/u/2d7ca4c61292) and [Ricky Thiermann](https://Medium.com/u/16518b469d1e) were in Bonn attending the High-Tech Partnering Conference [#HTPC20](https://www.htpc-htgf.de/en-gb/home) organized by our lead investor [High-Tech Gründerfonds](https://high-tech-gruenderfonds.de/en/the-decentralized-identity-and-digital-twin-pioneer-spherity-receives-seed-financing-from-htgf/) (HTGF). Carsten gave a keynote on “How to unlock the untapped business potential of IOT devices with digital identity”. We were also able to exchange ideas with the other start-ups of High-Tech Gründerfonds’ portfolio and to establish relations with HTGF’s industry and corporate partners.
[...]
At the end of January, [Juan Caballero](https://Medium.com/u/7da78f634e80) and [Carsten Stöcker](https://Medium.com/u/2d7ca4c61292) were in Amsterdam, attending the specification-writing face-to-face meeting of the Worldwide Web Consortium’s Decentralized Identifier Working Group (W3C DID-WG). [...] The main event at this meeting was the renegotiation of the limits and interoperability of [DID Documents](https://Medium.com/spherity/ssi101-what-exactly-gets-written-to-a-blockchain-69ef1a88fa3c), which has become a sticking point in recent months due to the complexity of ongoing development based on different encodings (traditional JSON, JSON-LinkedData, CBOR, and even ASN.1 and PDF).
[...]
On 31st January [Marius Goebel](https://Medium.com/u/3a23dedbeb33) attended the steering committee of the “Standardization Roadmap Artificial Intelligence” for the German Federal Ministry for Economic Affairs and Energy ([BMWi](https://www.bmwi.de/Navigation/EN/Home/home.html)) hosted by [DIN](https://www.din.de/en) [German Institute for Standardization] and [DKE](https://www.dke.de/en) [German Commission for Electrical, Electronic & Information Technologies].
[...]
[Spherity](http://www.spherity.com/) is contributing to the working groups around the fields of “IT security in artificial intelligence (AI) systems” and “Certification and quality of AI systems”, delivering its expertise in the fields of digital identities, in particular auditability, authenticity, traceability and identifiability of data and artificial intelligences (AIs).","Spherity connects the dots between SSI, AI, and European Data Infrastructure Recap of the first month of the new year Spherity started the year off with a busy travel itinerary, participating in standards work and startup communities. We met with the stakeholders of the European Blockchain Services Infrastructure, shared the business potential of the Internet of Things, made headway on the industry-wide groundwork for more robustly interoperable Decentralized Identifiers, and pushed forward the Identity capabilities of Germany’s Artificial Intelligence standards body. European Blockchain Services Infrastructure, Brussels Juan Caballero attended the stakeholder meeting for the European Blockchain Services Infrastructure project in Brussels, where architects and legal counsel presented their requirements and reports for the next round of development in partnership with industry leaders and contractors. We have built relationships with the key architects of the new system, and will be following closely the tenders and calls for industry input of this epochal project for European integration. The most interesting development, perhaps, was not the EBSI project itself (or its self-sovereign research and development framework, eSSIF), but the report from Nacho Alamilla, a key legal advisor for EBSI, on the functional limits of the current eIDAS (cross-border electronic signature mutual recognition) system in Europe and possible revisions or refinements of it being discussed in the EU.
High-Tech Partnering Conference, Bonn Carsten Stöcker and Ricky Thiermann were in Bonn attending the High-Tech Partnering Conference #HTPC20 organized by our lead investor High-Tech Gründerfonds (HTGF). Carsten gave a keynote on “How to unlock the untapped business potential of IOT devices with digital identity”. We were also able to exchange ideas with the other start-ups of High-Tech Gründerfonds’ portfolio and to establish relations with HTGF’s industry and corporate partners. World Wide Web Consortium (W3C), Amsterdam At the end of January, Juan Caballero and Carsten Stöcker were in Amsterdam, attending the specification-writing face-to-face meeting of the World Wide Web Consortium’s Decentralized Identifier Working Group (W3C DID-WG). As observers consulting on specific use-cases for the supplemental use-case document, Spherity met with stakeholders and designers hammering out a rescoping and process change for the specifications underlying interoperable “DIDs”. We will be representing our clients’ IoT requirements and use cases in the ongoing industry inputs to the standards process. The main event at this meeting was the renegotiation of the limits and interoperability of DID Documents, which has become a sticking point in recent months due to the complexity of ongoing development based on different encodings (traditional JSON, JSON-LinkedData, CBOR, and even ASN.1 and PDF). To make the security and the translation of DID-Documents more manageable across these divergent encodings, the working group decided to define a finite list of valid DID Document properties and contents, establishing a threshold for method interoperability and a standard registry for maintenance of the standard. More complex extensibility mechanisms, which might be difficult for all other standards-compliant methods to support fully, have been relegated to a separate layer linked via the @Context to allow simpler systems to remain fully-compliant.
Other extension mechanisms around “metadata”, matrix parameters (which work like query strings or URIs), and the incorporation of a broader base of use-cases were also discussed. For a more detailed guide to the online documentation, see Juan Caballero’s report here. Standardization Roadmap Artificial Intelligence, Berlin On 31st January Marius Goebel attended the steering committee of the “Standardization Roadmap Artificial Intelligence” for the German Federal Ministry for Economic Affairs and Energy (BMWi) hosted by DIN [German Institute for Standardization] and DKE [German Commission for Electrical, Electronic & Information Technologies]. Experts from business and society are working jointly to develop a roadmap for norms and standards in the field of artificial intelligence. The aim is to develop a framework for standardization at an early stage. The standardization roadmap will include an overview of existing norms and standards on AI aspects and, in particular, make recommendations with regard to future activities that are still necessary. It will be developed by the respective stakeholders from industry, science, public authorities and society. The roadmap will make a significant contribution to introducing and enforcing the national position at the European and international level. Spherity is contributing to the working groups around the fields of “IT security in artificial intelligence (AI) systems” and “Certification and quality of AI systems”, delivering its expertise in the fields of digital identities, in particular auditability, authenticity, traceability and identifiability of data and artificial intelligences (AIs). “For us as a society as well as for us as Spherity, it is immensely important, in my view, that we necessarily deal with the topic of standardization of artificial intelligence — not least because this technology has the potential to completely transform our concept of what it means to be human,” says Marius Goebel.
With its activity within the steering committee of the “Standardization Roadmap Artificial Intelligence” Spherity is contributing to the AI strategy of the German Federal Government. The aim of the steering committee is to hand over a first draft by November 2020.",https://medium.com/spherity/spherity-connects-the-dots-between-ssi-ai-and-european-data-infrastructure-1f626e77ba7,,Post,,Ecosystem,Public,,,,,,,2020-02-06,,,,,,,,,,,,,
Spherity,Spherity,,Medium,,Legisym,,,,,Spherity is Partnering with Legisym Offering Joint Compliance Product for the U.S. Life Sciences Market,"“Legisym is thrilled to be working alongside Spherity to bring the first production-level ATP Credentialing solution to the industry,” said Legisym President & Co-Owner David Kessler. “With the successful completion of the ATP Credentialing Pilot in 2020 and the joint founding of the Open Credentialing Initiative in early 2021, the Spherity-Legisym partnership is already proving successful in collaboration and forward thinking.”","Spherity is Partnering with Legisym Offering Joint Compliance Product for the U.S. Life Sciences Market. Legisym, LLC is a trusted expert in the U.S. Life Sciences Market, providing services to pharmaceutical companies around the world since 2009. Legisym and Spherity have worked closely together to bring to maturity a joint offering that meets the security requirements of the U.S. Life Sciences Market. As part of the joint development, both companies have collaborated with SAP and Novartis, which have already subjected the product to extensive quality testing and functional validation. Spherity and Legisym are pleased to officially announce their partnership as of today. In November 2013, the U.S. Congress enacted the Drug Supply Chain Security Act (DSCSA) in order to protect patients’ health. To ensure that only legitimate actors are part of the supply chain, the regulation requires U.S.
pharmaceutical trading partners to ensure that they only interact with other trading partners that are authorized. Every trading partner holding a valid state-issued license or a current registration with the Food and Drug Administration (FDA) is authorized. Today in 2022, U.S. pharmaceutical supply chain actors have no interoperable, electronic mechanism to validate each other’s authorized status. With more than 60,000 interacting trading partners involved in the U.S. Life Sciences Industry and an FDA recommendation to respond to data requests in under one minute, a solution that provides compliance with the regulations by 2023 is in high demand. Legisym and Spherity have decided to cooperate and offer an interoperable, highly secure service to enable pharmaceutical supply chain actors to become an Authorized Trading Partner (ATP) according to U.S. DSCSA. Legisym, as a trusted identity and license verification service provider, perfectly complements Spherity’s digital wallet technology for managing verifiable credentials. The verifiable credential technology is used to represent the authorized status of interacting trading partners in a highly efficient, secure and DSCSA-compliant way. To use credentialing for Authorized Trading Partner (ATP) requirements under DSCSA, trading partners need to go through a one-time due diligence onboarding process with Legisym. Once the verifiable credentials are issued, they are stored in a secure digital wallet which comes embedded with the Credentialing Service provided by Spherity. Using this technology enables U.S. pharmaceutical supply chain actors to interact with digital trust, as they can now digitally verify their ATP status in every interaction. Georg Jürgens, Manager Industry Solutions at Spherity, says, “Together with our partner Legisym we focused on making the adoption of credentialing for trading partners as simple as possible.
Manufacturers, wholesalers and dispensers can all acquire a digital wallet and ATP credentials within minutes without integration effort and use this innovative solution for DSCSA-regulated interactions.” “Legisym is thrilled to be working alongside Spherity to bring the first production-level ATP Credentialing solution to the industry,” said Legisym President & Co-Owner David Kessler. “With the successful completion of the ATP Credentialing Pilot in 2020 and the joint founding of the Open Credentialing Initiative in early 2021, the Spherity-Legisym partnership is already proving successful in collaboration and forward thinking.” Legisym and Spherity, along with other adopters, founded the Open Credentialing Initiative (OCI). This newly formed organization incubates and standardizes the architecture using Digital Wallets and Verifiable Credentials for DSCSA compliance for Authorized Trading Partner requirements. Besides U.S. pharmaceutical manufacturers, wholesalers, and dispensers, the OCI is open to solution providers integrating the ATP solution. For press relations, contact communication@spherity.com. Stay sphered by joining Spherity’s Newsletter list and following us on LinkedIn. About Legisym, LLC For over a decade, Legisym, LLC has successfully provided the pharmaceutical industry with affordable and effective regulatory compliance technologies. In early 2020, driven by the 2023 authorized trading partner (ATP) requirements, Legisym began leveraging their existing Controlled Substance Ordering System (CSOS) and license verification technologies and experience to engage as a credential issuer. By performing thorough credential issuer due diligence processes, first to establish a root of trust, Legisym promotes confidence in the trading partner’s digital identity prior to the issuance of all ATP credentials.
About Spherity Spherity is a German software provider bringing secure and decentralized identity management solutions to enterprises, machines, products, data and even algorithms. Spherity provides the enabling technology to digitalize and automate compliance processes in highly regulated technical sectors. Spherity’s products empower cyber security, efficiency and data interoperability among digital value chains. Spherity is certified according to the information security standard ISO 27001.",https://medium.com/spherity/spherity-is-partnering-with-legisym-offering-joint-compliance-product-for-the-u-s-cbf9fd5a217,,Post,,Ecosystem,Public,,,,,,,2022-01-20,,,,,,,,,,,,, Spherity,Spherity,,Medium,,,,,,,#SSI101: An Introductory Course on Self-Sovereign Identity,"Outside of a few philosophers, social scientists, and a tiny minority of specialized technologists, however, most people feel uncomfortable making any definitive or authoritative statements about identity.","#SSI101: An Introductory Course on Self-Sovereign Identity The Spherity Way Most of the time when someone first hears about “self-sovereign identity,” “decentralized identity,” or “blockchain identity,” they naturally assume the terms refer to some esoteric topic far enough away from their domain of experience and expertise that they can safely leave it to the experts. “Identity,” after all, is an important, hotly debated, and nearly undefinable core concept of human life. Outside of a few philosophers, social scientists, and a tiny minority of specialized technologists, however, most people feel uncomfortable making any definitive or authoritative statements about identity. Who would volunteer to express opinions about something that can so easily offend, and which we rarely think about when it is working well for us? As for the adjectives “self-sovereign,” “decentralized,” and “blockchain,” these are no less controversial, no less stable, and no less likely to intimidate, to offend, or to confuse. 
I do not believe, however, that most people can safely leave it to the experts, even though I am one of those experts. On the contrary, I believe “SSI” is worth learning about, worth getting familiar with, and worth getting excited about. For this reason, I have tried to outline a quick tour through the basic “building blocks” needed to understand what SSI is, how SSI is different from other “regimes” or systems of organizing identity, and what Spherity does with SSI. Half as a fun way to structure these essays, and half out of habit, I will refer to this series of writing as a “curriculum,” and I will use North-American-style course numbers of the kind that were standard in my former life as a college professor. Here, then, is an overview of the topics that will be covered in the coming weeks in our “SSI 101” series: - Identities & Identifiers - An Overview of Non-Human Identities - Self-Sovereignty and Autonomy - Attest, Identify, Authenticate, and Verify - What Exactly Gets Written on a Blockchain? - Verifiable Credentials & Data Portability - Encryption & Correlation - How Open Standards Get Made To facilitate your and our sharing of these links and cross-linking back to them from other writings, I will structure each “glossary entry” listed above as a distinct Medium post with a permanent URL. Full disclosure, they might get more detailed (or illustrated) at some point in the future. They can be read in any order, although they are easiest understood by the true beginner in the linear sequence conveyed in the “previous”/“next” links at the top of each entry. For the reader who is already comfortable with the 101 topics, members from across the Spherity team will be collaboratively writing articles for the rest of 2019 that walk you through the specific data needs of various industries we have studied closely with partners and clients, and even in our past lives pre-Spherity. 
These longer articles comprise the “special topics” in the 200-level sequence of our SSI curriculum. Having read two or three of those, hopefully at least one of the 300-level will be of interest and accessible: there, we will cover speculative economics, machine futures, the data needs of an increasingly circular economy, data marketplaces, and other “advanced topics” in SSI. At this level, things get a little academic and we stand on the shoulders of many giants, mostly deep thinkers within the SSI community or the software sector generally. So let’s start at the beginning, then, with identities and identifiers, the smallest indivisible unit of SSI.",https://medium.com/spherity/ssi101-an-introductory-course-on-self-sovereign-identity-the-spherity-way-19e0d1de3603,,Post,,Explainer,,,,,,,,2020-09-07,,,,,,,,,,,,, Spherity,KuppingerCole,,,,,,,,,Dr. Carsten Stöcker - Decentralizing Provenance in an Industry 4.0 World,"In this episode, Raj Hegde sits down with Dr. Carsten Stöcker, Founder & CEO of Spherity to understand how #decentralized identity is transforming the end-to-end supply chain lifecycle.","Decentralizing Provenance in an Industry 4.0 World | Frontier Talk #3 - Dr. Carsten Stöcker In this episode, Raj Hegde sits down with Dr. Carsten Stöcker, Founder & CEO of Spherity to understand how #decentralized identity is transforming the end-to-end supply chain lifecycle. Tune in to this episode to explore the increasingly important role of provenance in helping build a better world and learn about the intersection of exciting concepts such as non-fungible tokens (NFTs) and decentralized identifiers (DIDs). He pursued a PhD in physics from RWTH Aachen to understand how the world works, and is leveraging the power of technology for the greater good of society. 
He's a highly respected figure in the blockchain space and acts as an advisor to the World Economic Forum as part of its Global Future Council. He is here to share his take on provenance as a fundamental technology and a force multiplier to bring about positive change in society: Dr. Carsten Stöcker, the founder and CEO of Spherity. Yeah, hi, thanks for having me on your Frontier Talk, and I'm glad to be here today. Welcome to the podcast. Speaking to a physicist has, to be honest, always been up there on my bucket list, so I'm glad that I could finally scratch it off. So let's get started. You've had an interesting career to date, spanning research, consulting, a stint at the WEF and now entrepreneurship. I'm curious to know: how were you introduced to blockchain technology? How I got introduced to blockchain technology was basically a big coincidence. I worked for the utility RWE at the time — later that business was spun off into innogy — at the innovation hub. There was a Dutch board member who basically wanted to invent Uber for energy. At that time everyone wanted to invent Uber for something, to come up with a new digital proposition. The Dutch board member asked one of the freelancers in his network, and the freelancer said: yes, I can start working on this and invent Uber for energy. What the freelancer did was write a LinkedIn message to his network and ask: hey, can anyone in my network help me invent Uber for energy? 
And then the very early Ethereum developer ecosystem answered, because at that time there was the Go Ethereum team in Amsterdam and, I think, the C++ Ethereum team in Berlin. Because it was the Netherlands, the people from the Netherlands ecosystem said: hey, we have this fancy new technology, Ethereum, with smart contracts — why shouldn't we try to invent not Uber for energy, but Uber for energy without Uber in between, a disintermediation of Uber? That's how I got in touch with the Ethereum ecosystem, and as early as 2015 we developed a peer-to-peer energy trading prototype based on smart contracts on Ethereum. Households that produce renewable energy could do a direct peer-to-peer energy transaction with other households that would like to consume energy — without the utility in between. That is how I got in touch with blockchain technology. Okay, brilliant, that's so cool. There are so many interesting applications of blockchain today, be it decentralized finance (DeFi) or computing. I'm curious to know what got you hooked onto decentralized identity and, more specifically, why did you choose to tackle the challenge of automating identity verification in end-to-end value chains? Basically, as I mentioned, we did some studies in 2015 on disruptive digitization business-model designs, and in all of them the problem was identity. I think every digitalization effort should start with a proper identity solution. And there's also a saying from a Nobel Prize winner, who basically said: if we'd like to solve the identity problem, then we need to solve it end to end. 
Which means: let's say I'm a company and I've got a company identifier — there must be someone who says, okay, this company exists, so I can start trusting the company; in Germany, for example, that could be an official registry. That's what we call end to end, across the entire supply chain, and that's still not solved on the internet. We truly believe today's internet is an internet with data but without an identity layer. So what are the industries that you primarily target, and why is there a core need for your technology specifically? Yeah. I would like to mention two things as a continuum. When you think about identity and decentralization on the internet, on the one hand we have the cypherpunk manifesto: everything is encrypted, with fully autonomous and anonymous transaction systems for humans. That's one side of the continuum — a lot of privacy to protect my data, to make sure it's self-sovereign data and no data leaks about myself. On the other side of the continuum there is, I would say, surveillance of an object, where I would like to have the full back-to-birth traceability, the back-to-birth lifecycle history of an object. These are two completely different poles of the continuum. We at Spherity would of course like to address both poles, but the human pole is much more difficult: GDPR, and the inertia in convincing people to use specific technologies, raise very tough questions. So we focus on the other side of the continuum, where I would like to have full traceability of an object for compliance reasons, for audit reasons, because I would like to protect patient health. 
I would like to provide, let's say, auditability for the circular economy, or for food, to protect food safety, because I have the provenance — I know where it's coming from. That is the pole where we apply our technology, because from an adoption perspective it's more realistic to push it into production in the short term, and that's what we actually do in the pharma supply chain area. That's the reason why I put more focus on enterprise and, specifically, object identity. When we do API integration with legacy systems — manufacturing execution systems, global article databases, ERP systems — the integration is reasonable to do, and then you can significantly scale the technology, because the scale is the number of objects, the number of finished-good products being produced by the company. Right, some pertinent points that you raised. I think it's a great time now to segue into the role of enterprise identity in supply chains today. Supply chains, as you might know, are often seen as a complex value chain comprising a wide range of parties. There's no clear understanding of who actually is part of this entire value chain — vendors, wholesalers, regulators, a whole bunch of parties. So could you perhaps start off this discussion by highlighting what the typical identities are in a traditional end-to-end lifecycle of a supply chain? Maybe it's easiest to go to GS1, the global standardization body for identifiers. They have an identifier called the GTIN, the Global Trade Item Number, and then they have another identifier called the sGTIN. A GTIN is basically an identifier for a specific product, and the sGTIN breaks it down to batch level and even to serial-number level. That's one part of the identifiers. 
Another part, for example, is the so-called GLN, the Global Location Number, or the party GLN (PGLN), which represents a legal entity or company. These are the identifiers in today's supply chains, but we cannot verify anything — we have no instruments, no tools, to find out whether something is really coming from a company. Right, so you're pretty much forced to believe what's in front of you, essentially. And is that the fundamental issue with supply chains today — is there a missing trust layer? Yeah. Today there are a couple of tools in place; one is called EPCIS, where specific messages are created and exchanged between, say, a tier-two supplier, a tier-one supplier and the customer. But there are still no tools to verify authenticity. It starts with the authenticity of the product: how can I prove the authenticity of the product? From our perspective there are two key technologies in place. One is to put an identifier on the product, where the identifier is a randomized serial number. Which means: if I'm a malicious actor, for example in the pharma industry, producing fake pharmaceutical products, then it's almost impossible for me to guess a valid serial number, because the manufacturer uses these randomized serial numbers for the packages. As a verifier — let's say a patient, or a pharmacy — if I scan the identifier, I can send a request to the manufacturer, and the manufacturer can basically look it up. All these players are also trying to go into supply-chain-integrity and product-authenticity use cases, and that's still a big problem. But it doesn't end with product authenticity. It's also about very simple things such as an e-leaflet, an electronic leaflet. 
Because if I, as a hacker, can create a fake electronic or digital leaflet and attach it to a finished-good product, then the patient might get the wrong instructions on how to take the pharmaceutical product, and this could have a significant impact on patient health. The same holds for machines: if I can sneak in a fake leaflet for a machine, then the people maintaining or starting that machine can face very bad health-and-safety consequences — if there's a high-voltage power line and the leaflet doesn't explain how to disable it, and someone touches the high-voltage power line, that can be very impactful. So it's about authenticity, it's about leaflets, it's about product recalls — because how do I know, as a manufacturer, who is owning and using the product? If I have a product recall, I need to contact the end consumer; that's another use case. But it's also about the back-to-birth lifecycle for circular goods: for example, what plastic ingredients are in the casing? Is this bio-sourced plastic, is this biodegradable plastic, how do I recycle it? That's very important information, and if the identity system is not secure, then I'm screwing up my circular-economy or plastic-recycling systems. All of these are very important use cases, up to customs: we also have a project with U.S. Customs and Border Protection. We even think this serialization and digital-twin technology is a last line of defense for the circular economy. Let's assume we are here in Germany, getting products from Asia, and in the EU we only want to let in products with a specifically circular, renewable, sustainable history — let's say renewable energy was used to produce them, bio-sourced plastics were used to produce them. 
If I would like to do this, then the customs organization must have the tools at hand to check the serial number, to go to the digital twin, to go back to the audit trail and find out: is this a proper circular object or a fake circular object? And the fake circular objects get locked out. From our perspective it's the enabling technology for the circular economy, and all of these features are very fascinating — and we're still only talking about object identity. There are plenty of excellent and comprehensive use cases. Right, you raise an interesting point there about the circular economy. I'm curious to know: what is the role of identity in a circular future, according to you? First, it's all about provenance, and provenance starts with the companies that provide components — who is the company that manufactured the components that are then assembled into a product? I need to know the identity of the companies producing the components, and this is where it all starts: can I trust it? Is the company a fake company? Where are the components coming from — a country with export-compliance issues, or a proper country? Do they work in accordance with environment, health and safety standards? Do they work in accordance with labor rights, anti-bribery, child-labor rules? If I can check this, I can start trusting the company, and then I can start trusting the origin and the data of a product. And then — maybe for the audience — that component is being assembled into a product, then transformed into a circular object, and as an end customer I would like to have the full back-to-birth traceability, because if one of the players is cheating, then I cannot trust the end product. 
And that's still a very big problem: when I have a digital twin of the end product, how can I trust the data describing the end product, and how can I trust the supply-chain actors that are part of the circular economy? It sounds like a very big, complex problem, and I think the critical success factor is to start very simple — with simple use cases that are doable — and not to try to solve the entire circular-economy problem from the start. Okay. And to add to that: in our first episode on this podcast with Dr. Harry Behrens, he mentioned that in a B2B setting the trust authority almost always goes to a root, particularly in regulated industries. So how do you ensure that the credentialing authority is who they actually claim to be? Yeah, this is a very important concept. It's basically similar to what we know from public key infrastructure: we have trust hierarchies in the X.509 world, and on the internet there are root certificate authorities. What all these decentralized identity solutions and verifiable credentials are about is exactly the same concept as X.509 — it's just more standardized, more extensible, more flexible, and it allows more or different kinds of attestations. But in a given trust domain I need to know the authorities that are issuing certificates. Today on the internet we have a couple of tools; one is the so-called well-known mechanism, because I always need to be able to verify the identity of the root authority — and a root authority could be a government, or the Global Legal Entity Identifier Foundation, which is establishing a global governance and infrastructure for verifying enterprise identity. I need this well-known tool, and well-known basically says: there is an identifier for a company. 
And then there are public keys representing the company, or being used by the authority to sign certificates — for example, enterprise certification certificates. And then I need to check the trust: does this identifier really belong to the authority — for example, does the identifier belong to the German government — and do the public keys really belong to the root identifier? If I can check this, then I can establish a trust hierarchy. When I get a verifiable credential or a signed data payload, I can basically cryptographically verify it. I can also check whether the keys belong to the entity, whether the entity was verified, for example, by the German government, and whether the keys of the verifiable credential really chain back to the government root — I have to walk the entire trust chain. That is still challenging, but that's how trust hierarchies must be solved, and that's where a lot of momentum is being established right now to really get somewhere. Brilliant. The term decentralized identity is almost always associated with blockchain technology. Why is this so, and could you perhaps double-click on this relationship between blockchain and decentralized identity? Yeah. In decentralized identity it all starts with me as the identity subject or identity controller: I create the identifier, and that's the big difference. I don't go to a centralized platform that creates the identifier and the public-private key pair on my behalf. What all the existing systems today have in common is that an administrator can try to cheat. There are a lot of attack vectors: someone can manipulate the keys, see the private key, manipulate the identifiers. This is the problem with all centralized platforms. Decentralized identity is different. 
I, as the identity subject with my identity controller, create a seed out of randomness, out of arbitrary noise, and then I create a public-private key pair. I fully control it; there's no one else who controls it. From the public key I create an identifier. Now what do I need to do? I want to make sure my counterpart knows this decentralized identifier, plus has the tools to look up my public signing keys for this identifier — I need to broadcast them, to inform people. And blockchain is a very handy tool here, because what I can basically do is establish a smart contract, and in the smart contract I anchor my identifier plus the signing keys associated with that identifier. The smart contract makes sure that only I, as the identity controller, can change them. The blockchain provides the instrument so that other people can look up, for my identifier, the corresponding keys that are being used for signing data on my behalf. And as the blockchain is immutable and publicly accessible, it's a perfect tool to communicate the keys for my identifier. Then there's a second use case: if I have an identifier, I would also like to enable people to look up a service endpoint, the URL of a service that belongs to my identifier. There's an analogy with the DNS: the DNS today is basically a lookup, a mapping between an IP address and a domain name. That's exactly what we're doing here. So another perspective on decentralized identity is a decentralization of the DNS, because you have an identifier, which is basically, let's say, like a domain name. 
And then I can look up the identifier's service endpoint, go to the service endpoint, and interact in a neutral way on the internet with me as the identity subject. Those are the two very important features of decentralized identities: communicating signing keys and communicating service endpoints. And if I put this on an immutable ledger where only I have control over the data, then I can be sure that everyone can read it, find all the keys, and interact with my service endpoint. I have a tool to communicate this to everyone else, a tool fully controlled by myself; there's no third party involved that can try to lock me out or manipulate my identity. That's what is avoided by the use of blockchain. Right. So now that we've discussed the current state of play in supply chains, I think it's a good time to deep-dive into the future of Industry 4.0. According to you, what are some of the biggest inefficiencies when it comes to supply-chain network design, and, more importantly, is there any room for ecosystem innovation? It's basically self-controlled: I can request some credentials, and I can walk up a root of trust, a trust chain, to find out whether the credentials are correct — that establishes completely new means. When I can verify where the data is coming from — from which company, which process, which object — I can also put what we call trust scoring in place. Then, in the future, I can give data a trust score. Especially in a typical cyber-physical value chain, there is so much data that is blended, processed and merged to establish, for example, the digital twin of a manufactured product. I can either naively trust all the data in the digital twin, or establish some tools for trust scoring on the data. 
Then I can make risk-based decisions and make up my own mind about whether I trust the data or not. This is very important, especially when machine learning comes into play. When I feed machine learning algorithms and get machine-learning labels out of them, it's even more important to find out: can I trust the machine learning algorithm? Is there a benchmark? What was the training data — was it biased? And even if I trust the world's best machine learning algorithm, what is the input data — was it fake cars or real BMWs? If I can check the provenance of the input data, then when I get the output labels I can establish risk scores, trust them, and use them for decisions — for autonomous driving, for driver-assistance systems, for risk propositions, for traffic-control systems, for mapping systems. I need to trust the data and the provenance of the data being processed. That, from our perspective, is the big opportunity with this technology. Right, I think you raise a very important point there with regard to the verification of trust, and that is a concept we can definitely touch upon later in this podcast. For now, let's shift the focus back onto Industry 4.0. You've seen the industry evolve for some time now; in your opinion, what are the four or five ingredients that organizations need to consider to successfully bring any new technology to a domain? From my perspective, first it starts with simplicity: find the really simple use cases — few supply-chain actors, few data sets, and few systems that I can connect to and still have proper business value. I would like a precise integration of a few systems, not a very complex boil-the-ocean approach. So it's simplicity, it's also education. 
As a startup we prefer to engage with people and businesses that are already educated about the technology — that's a very important ingredient, because otherwise we have to do one or two years of additional education and bring nothing to the market in the meantime. So it's simplicity, it's education, and it's the business case. If there's no business case, it's difficult. What we see right now is that the business case can be diffuse, because this is an ecosystem technology: it's more a systemic business case where everyone benefits, but "how do I benefit, what's my business case, what's in it for me?" is unclear. If there are regulatory requirements for provenance, for auditability, for back-to-birth traceability, then these compliance requirements are the business case, because today there is a lot of paperwork, a lot of manual work, a lot of quality inspectors. If I can digitize all of this, or avoid fines or penalties for non-compliance, then compliance-process cost reduction is the business case for short-term implementation. So I mentioned simplicity, education and compliance; I'd like to mention two more. One is the ecosystem — the full ecosystem — because this technology doesn't make sense if just one company introduces it. It's very important to have a consistent ecosystem where everyone shares the same common goal, for example to reduce compliance costs. And last but not least, very important, because we mentioned trust domains: someone needs to do the enterprise identity verification at the start, and this is normally not in place. It's still a big question: who says that Porsche is Porsche, that Siemens is Siemens? There is a solution to implement that by bootstrapping the trust domain. 
This is fantastic for us, because we don't need to verify everyone ourselves: we can do a combination of DIDs and X.509 signing certificates, and then we can establish digital identities and so-called authorized trading partner credentials. This is how we can bring it to production; otherwise we would need to solve the problem of enterprise identity verification, which in principle is solvable, but in practice is not solved, because there's no infrastructure in place and no consistent credentials exist. That's the reason why we love the authorized-trading-partner use case in the US so much. And there's a sixth ingredient I'd like to mention. When you have all these decentralized wallets, you need to go to every supply-chain actor and give them a wallet: you have to sell it to them, you have to integrate it with all the pharma companies and wholesalers, and then you have to validate and test it. That's a lot of work — even from a commercial perspective, an almost unbelievable amount of work. But in this US authorized-trading-partner use case there are so-called intermediaries, "men in the middle". The use case is regulated by the FDA under the US Drug Supply Chain Security Act, and every supply-chain participant wants to reduce compliance costs; they don't have, let's say, a more principled discussion about whether a wallet is a custodial or non-custodial wallet. What matters is to reduce compliance costs and establish a secure system. And the "man in the middle" is very interesting for us: there are so-called verification routing service providers that provide a lookup service, and there are only a few of them. If we integrate our solution with these verification routing service providers, the intermediaries, it's a fantastic route to market, because they are connected to all supply-chain actors. 
We only need to go to the very few VRS providers — for example SAP, rfxcel and TraceLink, just to name three of them — because the pharma companies are already connected to them. By integrating with the VRS providers, we can basically switch the entire industry over to our technology. That's fantastic, because otherwise we would have to go to every pharma company and do a lot of manual work to integrate our wallet; by integrating our APIs with a few intermediaries, it's a fantastic go-to-market and adoption path. Right, great insights there, Carsten — I think that's a great playbook you've put out for anyone looking to move to supply chain 2.0 or upgrade their supply-chain systems. One thing that's clear from our conversation so far is that we are living in a narrative-driven society. If you look at the institutions around us — the media, the politicians — we are almost always told to believe in a certain kind of truth without being able to verify it. And then all of a sudden comes this incredible piece of technology called blockchain, which gives us the ability not just to verify the truth, but to be sure that what's in front of you is what it claims to be — be it the provenance of raw materials, or NFTs, where we can now track and verify the authenticity of non-fungible assets that are scarce and unique, be it a piece of physical art or digital art. I'm curious how you see this move playing out in society — and by "this move" I mean the intersection of decentralized identifiers (DIDs) and NFTs. Mm. I think the verification of truth is about trustworthiness, and it all comes back again to risk scoring, because truth is not something binary — true or not true. 
It's kind of a convolution, because I have so many different data points. For example, in the pandemic: what is the truth? Is it a dataset that says there are more infections or fewer infections? What is the truth when I make a specific policy; what is its impact on infections? What is the risk for me personally, given my age? There are just a lot of possibilities. And in supply chains, in art, in e-commerce, there is always a lot of data, and the data are blended together. So I think truth is not binary. In the end, and this is still unexplored, there must be tools to assess the risks of using the data and of the attestations about the data. That's something still unexplored, because even in the DID and verifiable credential domain, people think a bit binary: either I have a driver's license or I don't; either I have a COVID vaccination or test certificate or I don't. But what can go wrong? I go to a test center. The test center can get my name wrong, or mix up my sample; the test center could have very poor quality management, or its lab equipment may not be maintained correctly. So the outcome itself comes with some probability. Then they give me a certificate about the test: how do they ensure they give it to the right person, and is that person properly authenticated? What I'm trying to say is that it's all about a risk matrix, about understanding it and having proper scoring. I think that is truth. But you mentioned NFTs and DIDs. A lot of people are putting art out as an NFT, a non-fungible token. First of all, what do I know about the artist? And what do I know about the art: is it a single existing piece, or is it a series of a million pieces of art with only tiny differences between them? 
If it is one of a million, it is worth much less compared to the scarcity of a single, unique piece. So again, you need to know the provenance. And even from a legal perspective: if someone puts an NFT on Ethereum, how can I make sure the same person is not selling the NFT on Bitcoin, Polkadot, Cardano or any other chain? How do we ensure that? It sits at the physical-digital intersection, with some legal questions too: how do I handle this? But anyway, that's the broader perspective in terms of truth. You mentioned NFTs and DIDs; from a technology perspective there are a lot of similarities, because an NFT is controlled by an owner, and then I can do ownership transactions. I can transfer it to a new owner, and I can also establish fractional ownership. For example, there is an expensive bottle of wine: I can give the NFT representing the bottle to one person, who has to present the NFT before receiving the bottle, or there can be multiple owners. DIDs do not have a concept of fractional ownership. In addition, when ownership of an NFT changes, I may be able to see the chain of custody, and that matters for some supply chain use cases; for luxury goods it can be connected to authenticity. I can do similar things with DIDs. I also control a DID, and I can even change its ownership by handing control of the DID to you. Then I have the service endpoint, and I can establish a connection between the two. It can be digital, and in addition I can describe the provenance of the art and the heritage of the artist with verifiable credentials. So we see a lot of intersection between DIDs and NFTs, especially on the question of the provenance of NFTs, and this is pretty much unexplored yet. I think it's uncharted territory. 
And my prediction is that there will be a lot of work going on in combining NFTs, DIDs and verifiable credentials in the not-so-distant future. Right. And finally, I now want to explore the cultural revolution of sorts that is unfolding in front of our eyes. You recently wrote an article on the principle of duality in science and art that is changing the course of tech and marketing. I personally think it's a fascinating read, and for our audience I'll post the link to the article in the description box below. I would personally like to add to it and call it a trifecta of sorts, because we're now seeing athletes jump on the bandwagon and get associated with emerging technologies. You have the newly crowned quarterback of the Jaguars, Trevor Lawrence, signing with Blockfolio, and you also have Socios now partnering with nine top football stars to endorse their product. So my question to you is: why is this a recipe for success when it comes to marketing in today's world? That's a perfect segue. From a technology perspective, there's a concept called crossing the chasm. You can have the best technology, but you need early adopters, and even that is not enough: you have only crossed the chasm when there is an early majority. If you don't reach an early majority, no one really uses the technology outside the lab or outside a few tests. As a technologist and entrepreneur, you are only successful when you can reach the early majority. And when you think about athletes and artists, they have quite some reach, and that can help you bootstrap across the chasm. If they are interested in the technology, they can leverage it for their own benefit or for the greater good of society, and that can help establish ecosystems and carry the message, the narrative, to the early majority. 
I think this is a fascinating duality: combining technology with the reach of athletes and artists. That's what's pretty fascinating. Brilliant. I think now it's time for the best part of the podcast: it's time for Frontier Fire, where we pose a series of rapid-fire questions to our guests on the pod. So Carsten, are you ready for the challenge? Yes. Brilliant, let's get started. I'm curious to know: what's the best application of physics in everyday life? Well, from my perspective I like statistical physics very much, because you can apply statistical physics to machine learning and to predictions for everyday life, for social, economic and technical questions. From my perspective, that's the best application: to forecast the future, to predict the future, to make better decisions for all of us. And what is the best business advice you've ever received? The best piece of advice, from my perspective, is really humility, because sometimes people think they can control something, that they own something, that they have found the big insight and now they can change the world. Usually it's not so easy, so I think it's very important to have a little humility. It's quite a beautiful mindset. And there's a movie about Alan Turing, and it's so fascinating what kind of personal struggles he had with his inner being while trying to open up, to innovate, to bring new mathematics and new cryptography, yet struggling in everyday life. This is fascinating because it's a microcosm: people can have the tools and the capabilities to do something great while still struggling with the tiny things. And if you transform that to the macro level, to planet Earth, it's the same for us humans. 
I think we all have the knowledge and all the tools and science to understand that we are destroying the planet, but we are struggling and cannot really change our course. That's how I like to connect science and the struggles of people with the greater climate change problems we face today. And speaking about struggles, what is the one thing people don't know about entrepreneurship? From my perspective: as a consultant it was super easy to sell things to enterprises. It was incredibly easy to sell a consulting project, a system integration project, a strategy project, and then to sell even more to clients doing the transformation and implementation. As an entrepreneur, especially one focused on B2B, it's just the opposite: it's super difficult to sell emerging technology when the business case is not clear, because, especially in Europe and Germany, companies want a crystal-clear business case before they start investing. They don't invest in a hypothesis, or in building the capabilities to work with a technology. And I think the second big thing is that a lot of these technologies are ecosystem technologies, and this requires a different approach, because you cannot just sell to one company and be successful. You have to sell to an ecosystem, with all its dependencies, because it's a fusion of technologies with dependencies between domains. Those are tough challenges, but having that ecosystem innovation approach in mind is probably a prerequisite. Finally, what's your advice to anyone listening to this podcast? 
My advice is basically: never give up, and be very flexible, because when you are pushing forward decentralized identity, you have to be very flexible. You cannot focus on just one domain and one value proposition, even though that's what venture capitalists want to see, because it's unclear where to start. Being able to pivot between different domains and propositions, that flexibility, is very important, as is the ability to execute, to learn fast, and then to move into another business domain. Carsten, it was an absolute pleasure speaking with you today. Thank you so much for shedding light on the increasingly important role of provenance in the world of tomorrow. I hope to have you again on this podcast, and I wish you and your team at Spherity the very best of luck going forward. Thank you so much. Yeah, thank you for having me; it was a fantastic experience being on your Frontier show. That was Dr. Carsten Stöcker. Carsten will be speaking at the European Identity and Cloud Conference, EIC, and you can get your tickets to the event via the link in the description box below. I hope you enjoyed this conversation, which dabbled in NFTs, provenance and the cultural revolution. If you think anyone would benefit from this information, please go on and share it with them. Until next time, this is me, Raj Hegde, and I hope to see you all again on this incredible journey to redefine the 'I' in identity. Stay safe, stay happy. 
",https://www.kuppingercole.com/watch/frontier-talk-podcast-3-decentralized-provenance,,Post,,Explainer,,Supply Chain,,,,,,2021-05-12,,,,,,,,,,,,, diff --git a/_data/standards.csv b/_data/standards.csv index efa77e87..67e2d55a 100644 --- a/_data/standards.csv +++ b/_data/standards.csv @@ -154,7 +154,7 @@ Decentralized Identifiers,WebofTrustInfo,,,,Decentralized Identifiers,,,,rwot07- Decentralized Identifiers,"SRI International, NIST, FISMA",,,,Decentralized Identifiers,,,,,Cryptography Review of W3C VC Data Model and DID Standards and Implementation Recommendations,"Cryptography used by U.S. government entities in operational systems must conform to relevant federal government standards and requirements, including the Federal Information Security Management Act (FISMA) and National Institute of Standards and Technology (NIST) standards for use of cryptography. As part of its in-depth technical due-diligence to enable operational capabilities for DHS/CBP, DHS/PRIV and DHS/USCIS, the U.S. Department of Homeland Security’s Silicon Valley Innovation Program (SVIP) sponsored independent nonprofit research center SRI International to conduct a cryptographic review of the W3C Verifiable Credentials Data Model and W3C Decentralized Identifiers standards. The review provided constructive feedback and recommendations for technology developers and W3C standards developers to increase their level of compliance with federal government standards.",,https://web.archive.org/web/20230319062836/https://www.csl.sri.com/papers/vcdm-did-crypto-recs/,,Paper,,,Literature,,,,,,,,2023-03-19,,,,,,,,,,,,, Decentralized Identifiers,Legendary Requirements,,,,,,,,,did:directory,"The DID Directory is a public directory of DID methods, provided by Legendary Requirements, long time advocates for decentralized identity and its emerging technologies, such as the Decentralized Identifiers from the World Wide Web Consortium.

Decentralized Identifiers (DIDs) enable identity-based services without dependence on a trusted third party. Instead of requiring centralized identity verification services, like Facebook, Google or the Department of Motor Vehicles, DIDs can be created by anyone, anywhere, and be used for any purpose.",,https://diddirectory.com/,,Directory,,,About DID Methods,,,,,,,,,,,,,,,,,,,,, Decentralized Identifiers,DIDWG,,,,Decentralized Identifiers,,,,,DID Specification Registries,This table summarizes the DID method specifications currently in development. The links will be updated as subsequent Implementer’s Drafts are produced.,,https://w3c-ccg.github.io/did-method-registry/#the-registry,,registry,,,About DID Methods,,,,,,,DID Working Group,2023-05-14,,,,,,,,,,,,, -Decentralized Identifiers,Transmute,,,Margo Johnson,Decentralized Identifiers,,,,,DID:Customer,"While we are committed to providing optionality to our customers, it’s equally important to communicate the selection criteria behind these options so that customers can consider the tradeoffs of underlying DID-methods alongside the problem set they’re solving for.","Transmute builds solutions that solve real business problems. For this reason, we support a number of different decentralized identifier (DID) methods. While we are committed to providing optionality…",https://medium.com/transmute-techtalk/did-customer-4ca8b7957112,https://miro.medium.com/v2/resize:fit:1200/1*MVDiykjv5WUBP4PUWweB5w.jpeg,Post,,,About DID Methods,,,,,,,,2020-10-30,,,,,,,,,,,,, +Decentralized Identifiers,Transmute,,,Margo Johnson,Decentralized Identifiers,,,,,DID:Customer,"While we are committed to providing optionality to our customers, it’s equally important to communicate the selection criteria behind these options so that customers can consider the tradeoffs of underlying DID-methods alongside the problem set they’re solving for.","Transmute builds solutions that solve real business problems. 
For this reason, we support a number of different decentralized identifier (DID) methods. While we are committed to providing optionality…",https://medium.com/transmute-techtalk/did-customer-4ca8b7957112,,Post,,,About DID Methods,,,,,,,,2020-10-30,,,,,,,,,,,,, Decentralized Identifiers,WebOfTrustInfo,,,"Joe Andrieu, Shannon Appelcline, Amy Guy, Joachim Lohkamp, Drummond Reed, Markus Sabadello, Oliver Terbu, Kai Wagner",Decentralized Identifiers,,,,rwot9-prague,A Rubric for Decentralization of DID Methods,"The communities behind Decentralized Identifiers (DIDs) bring together a diverse group of contributors, who have decidedly different notions of exactly what “decentralization” means. For some, the notion of a DID anchored to DNS is anathema, for others, DIDs that cannot be publicly verified are problematic. This debate about decentralization is a continuation of a similar, ongoing argument in cryptocurrency circles: the question of whether or not bitcoin or ethereum is more decentralized is a nearly endless source of argument. Rather than attempting to resolve this potentially unresolvable question, we propose a rubric — which is a scoring guide used to evaluate performance, a product, or a project — that teaches how to evaluate a given DID method according to one’s own requirements. Our goal is to develop a guide that minimizes judgment and bias. Rather than advocating particular solutions, the rubric presents a series of criteria which an evaluator can apply to any DID method based on their particular use cases. We also avoid reducing the evaluation to a single number because the criteria tend to be multidimensional and many of the options are not necessarily good or bad: it is the obligation of the evaluator to understand how each response in each criteria might illuminate favorable or unfavorable consequences for their needs. 
Finally, this rubric allows evaluating aspects of decentralization of a DID method, but it is not exhaustive, and does not cover other issues that may affect selection or adoption of a particular method, such as privacy or efficiency.","RWOT9 in Prague, The Czech Republic (September 2019) - rwot9-prague/decentralized-did-rubric.md at master · WebOfTrustInfo/rwot9-prague",https://github.com/WebOfTrustInfo/rwot9-prague/blob/master/draft-documents/decentralized-did-rubric.md,,Paper,,,About DID Methods,,,,,,,,2019-09-06,,,,,,,,,,,,, Decentralized Identifiers,IDCommons,,"https://iiw.idcommons.net/13D/_We_evaluated_7_DID_methods_with_the_W3C_DID_Rubric!_did:btcr,_did:sov,_did:ion,_did:web,_did:key,_did:peer,_did:ethr",,Decentralized Identifiers,,,,IIW,DID Method Rubric v1.0,This rubric presents a set of criteria which an Evaluator can apply to any DID Method based on the use cases most relevant to them. We avoid reducing the Evaluation to a single number because the criteria tend to be multidimensional and many of the possible responses are not necessarily good or bad. It is up to the Evaluator to understand how each response in each criteria might illuminate favorable or unfavorable consequences for their needs.,,https://w3c.github.io/did-rubric/,,Guidance,Draft,,About DID Methods,,,,,,,,2022-01-11,,,,,,,,,,,,, Decentralized Identifiers,IDCommons,,"https://iiw.idcommons.net/13D/_We_evaluated_7_DID_methods_with_the_W3C_DID_Rubric!_did:btcr,_did:sov,_did:ion,_did:web,_did:key,_did:peer,_did:ethr","Walid Fdhila, Markus Sabadello","did:btcr, did:sov, did:ion, did:web, did:key, did:peer, did:ethr, Decentralized Identifiers",,,,IIW,DID Methods Evaluation Report,This report evaluates a selection of DiD methods using the guidelines specified in the W3C DiD method Rubric V1.0 (draft 06 January 2021). The evaluation reflects the authors’ opinion based on documents and source code that are publicly available. 
The report mainly includes a comprehensive evaluation.,"Web word processing, spreadsheets and presentations",https://docs.google.com/document/d/1jP-76ul0FZ3H8dChqT2hMtlzvL6B3famQbseZQ0AGS8//,,Report,,,About DID Methods,,,,,,,,2021-04-04,,,,,,,,,,,,, @@ -654,6 +654,7 @@ KERI,DIF,,,,,,,,,Q&A about KERI’s Security model and Guarantees - Part II Secu KERI,,,,Samuel Smith,,,,,,W3C DID Security Concerns,**Certificate Transparency Solution**
- Public end-verifiable append-only event log with consistency and inclusion proofs
- End-verifiable duplicity detection = ambient verifiability of duplicity
- Event log is third party infrastructure but it is not trusted because logs are verifiable.
- Sparse Merkle trees for revocation of certificates
- (related EFF SSL Observatory),,https://github.com/SmithSamuelM/Papers/blob/master/presentations/W3C_DID_Security_Concerns.pdf,,,,,Development,,,,,,,,2020-01-14,,,,,,,,,,,,, KERI,DIF,,https://hackmd.io/orhyiJkLT721v4PCPkvQiA?both,,,,,,,Implementation Notes for KERI,"The interpretation of the data associated with the digest or hash tree root in the seal is independent of KERI. This allows KERI to be agnostic about anchored data semantics. Another way of saying this is that seals are data agnostic; they don’t care about the semantics of its associated data. This better preserves privacy because the seal itself does not leak any information about the purpose or specific content of the associated data. Furthermore, because digests are a type of content address, they are self-discoverable. This means there is no need to provide any sort of context or content specific tag or label for the digests. Applications that use KERI may provide discovery of a digest via a hash table (mapping) whose indexes (hash keys) are the digests and the values in the table are the location of the digest in a specific event. To restate, the semantics of the digested data are not needed for discovery of the digest within a key event sequence.",,https://github.com/decentralized-identity/keri/blob/master/implementation.md,,,,,Development,,,,,,,,2020-05-16,,,,,,,,,,,,, KERI,IDCommons,,,"Samuel Smith, Dave Huseby",,,,,IIW,KERI and ADS Key State Provenance Logs Kumbaya (KEL and ADPL),"This was a meeting of the minds between myself and Sam Smith and Adrian Gropper that was hugely successful. We all decided to use the term ""endorser"" for what we all called ""registrar""/""witness""/""notary"". We also realized that the KERI proposal for encoding is good enough for authentic data provenance logs and we will be using the KERI encoding. 
Sam has modified the spec for KERI key event logs to include scripting capabilities needed in the authentic data economy for doing things like cross-chain atomic swaps for selling non-fungible authentic data (NFADs).

The result is that there is grand convergence on the encoding and file format for key event provenance logs that will be supported by both KERI networks and the broader authentic data economy.",,https://iiw.idcommons.net/24H/_KERI_and_ADS_Key_State_Provenance_Logs_Kumbaya_(KEL_and_ADPL),,Session Notes,,,Development,,,,,,,,2021-05-06,,,,,,,,,,,,, +KERI,Jolocom,,,,,,,,,Jolocom’s latest contributions to DIF,"Jolocom added support for an [off-chain element based on KERI](https://github.com/decentralized-identity/keri/blob/master/kids/KERI_WP.pdf). This is in addition to the Jolocom DID method (did:jolo and did:keri), which supports the Jolocom-Lib, our own SDK and the Jolocom SmartWallet.",,https://jolocom.io/blog/jolocoms-contributions-to-dif/,,,,,Development,,,,,,,,2021-01-19,,,,,,,,,,,,, KERI,SSI-Meetup,,https://www.youtube.com/watch?v=izNZ20XSXR0,,,,,,,Key Event Receipt Infrastructure (KERI): A secure identifier overlay for the internet – Sam Smith – Webinar 58,,,https://ssimeetup.org/key-event-receipt-infrastructure-keri-secure-identifier-overlay-internet-sam-smith-webinar-58/,,,,,Presentations,,,,,,,,2020-05-19,,,,,,,,,,,,, KERI,,,,Samuel Smith,,,,,,KERI Overview,"**Separation of Control** Shared (permissioned) ledger = shared control over shared data.
* Shared data = good, shared control = bad.
* Shared control between controller and validator may be problematic for governance, scalability, and performance.
KERI = separated control over shared data.
* Separated control between controller and validator may provide better decentralization, more flexibility, better scalability, lower cost, higher performance, and more privacy at comparable security.",,https://raw.githubusercontent.com/SmithSamuelM/Papers/master/presentations/KERI2_Overview.web.pdf,,,2.54,,Presentations,,,,,,,,2020-10-22,,,,,,,,,,,,, KERI,,,,Samuel Smith,,,,,,The Duplicity Game: or why you can trust KERI,"**Inconsistency vs. Duplicity**
- inconsistency: lacking agreement, as two or more things in relation to each other
- duplicity: acting in two different ways to different people concerning the same matter
**Internal vs. External Inconsistency**
- Internally inconsistent log = not verifiable.
- Log verification from self-certifying root-of-trust protects against internal inconsistency.
Externally inconsistent log with a purported copy of log but both verifiable = duplicitous.
Duplicity detection protects against external inconsistency.",,https://raw.githubusercontent.com/SmithSamuelM/Papers/master/presentations/DuplicityGame_IIW_2020_A.pdf,,,,,Presentations,,,,,,,,2020-05-09,,,,,,,,,,,,,