mirror of
https://github.com/Decentralized-ID/decentralized-id.github.io.git
synced 2024-12-25 15:19:26 -05:00
482 lines
2.4 MiB
main,parent,name,source,founders,related,location,serving,policy,event,title,text,description,link,image,type,status,section,sector,industry,market,focus,projects,tech,standard,date,code,twitter,youtube,blog,feed,discord,crunchbase,linkedin,docs,devtools,app,telegram,forum
Animo,,Animo,,Timo Glastra; Ana Goessens,Digibyte; ESSIFlab,"European Union, Netherlands, Utrecht",Europe,,,Animo,"A Change In How We Handle Verification<br><br>The world is set up in a way where everyone continually needs to prove aspects of themselves. Organisations need personal information about their customers to verify their identity to give them access to services. Institutions like colleges, governments and banks need to verify information to assist people in their day to day life. To do this in the digital world people currently fill out countless forms, create accounts for every service and send copies of sensitive documents over email. This has resulted in people's information being scattered around the web, given to and stored by countless organisations, with no easy way to control and manage it.<br><br>Animo is working to change this. Through a relatively new technology called verifiable credentials, it is possible for users to store personal information about themselves in a digital wallet on their phone. Organisations that need to verify some information about the user (e.g. age, address, membership, qualification, etc.) can send a request and, with user permission, get the information they need without having to store any personal data themselves. The user proves aspects of their digital identity while keeping control over any personal information.","Our aim We build digital infrastructure that makes the world function as it should. Without borders, without vendor lock-in, without limitations. We work for a future where every individual is in control of their life. What we do At Animo we work with developers and organizations to improve digital interaction using verifiable credentials. We create solutions where exchanging data is privacy preserving and frictionless. SEE HOW IT WORKS Who we are Our team works hard to solve the most difficult problems without taking shortcuts. 
At Animo we understand the value in using open source and open standards to get the job done, working together makes innovation easy. MEET THE TEAM Our projects We have worked with some great organizations to make their products easier, faster and safer. Interested to see what we are working on? Take a look at our projects and updates to see our work. ALL PROJECTS AND UPDATES H I G H L I G H T S Animo receives EU grant to work on open source SSI development. READ MORE Trusted by Demo The future of digital verification is private, secure and centered around the end-user. Our demo will let you experience how easy it is!",https://animo.id/,,Company,,Company,,,Enterprise,ID; Software,,,,2020,https://github.com/animo,https://twitter.com/AnimoSolutions,,https://medium.com/@AnimoSolutions,https://medium.com/feed/@AnimoSolutions,https://discord.gg/vXRVNh3DYD,,https://www.linkedin.com/company/animosolutions/,https://docs.agent-cli.animo.id/,,,,
Anonyome,,Anonyome,,Steve Shillingford,,"USA, Utah, Salt Lake City",USA,,,Anonyome Labs,"Anonyome Labs was created to give people control and freedom over their personal and private information. We believe that people should be able to determine how, what, and with whom they share their personal details. We build the tools necessary to empower our brand partners’ users and end consumers with the necessary capabilities to protect and control their digital information. Anonyome Labs puts control back into users’ hands through mobile and desktop applications.","Privacy is progress Privacy will be the defining topic of this decade. We believe personal freedom hinges on safety and security, and this liberty is essential in our expanding digital world. We provide scalable mobile and desktop solutions that empower users to protect their private information. The challenge In today’s world millions of consumers manage much of their lives online, requiring personally identifiable information at every turn. Consumers need access to these online conveniences while also protecting their personal information. Businesses need better ways of interacting with their customers without the risks associated with collecting their personal data. The solution To meet the challenges facing businesses and consumers, Anonyome Labs provides a platform that enables a next generation approach to security and privacy. A cornerstone of this platform includes limited disclosure digital identities, that we call “Sudos”. Using and interacting with them reduces the amount of personally identifiable information (PII) needed to navigate today’s digital world. Both consumers and businesses maintain everyday relationships, communications, commerce, and more without unnecessary PII disclosure or collection. The platform is complete with all the components necessary to build secure and private market offerings. 
These scalable components include secure and private calling, messaging, video, browsing, purchasing, and more. The Sudo Platform enables businesses to empower their users through privacy and cyber safety capabilities. We provide a variety of tools that can be rapidly integrated into B2C product and service offerings. To show how the platform can be used, we provide a reference consumer application called MySudo. Sudo Platform The Complete Privacy Toolkit Sudo Platform is a set of easy-to-use privacy solutions that can be integrated into your existing and new products. Sudo Platform APIs and SDKs are quick to learn and simple to use. MySudo Talk, text, email, browse and purchase all in one app Check out the MySudo app, which offers users safety and security in the digital world. Create and manage Sudos for privacy protection online, on the phone, or wherever technology takes you. “Disposable emails, phone numbers and prepaid cards aren’t new. But Sudo does a good job at bringing them all together.” TechCrunch “Sudo is an all-in-one platform for calls, texts, emails and browsers that is customizable and secure.” Fast Company “Never worry about spam again: Sudo supplies disposable phone numbers, email addresses.” Digital Trends From our app store “Can’t live without it. I don’t understand how I manage to survive without this app, it is absolutely indispensable.” comechingones “So easy! I thought this app was so easy to use and It’s so awesome to have an avenue for creating a secure identity account! Good job! I would recommend!” Hippieuser “Awesome!!! Easy to use and private! Win win!!!!” iwishitwasyou katiesweet2010 Let’s chat Want to learn how to integrate our technology into your product stack? Please contact us, we’d love to hear from you.",https://anonyome.com/,,Company,,Company,,,Consumer,Privacy,,,,2014,,https://twitter.com/AnonyomeLabs,,https://anonyome.com/blog/,https://anonyome.com/feed/,,https://www.crunchbase.com/organization/anonyome-labs,,,,,,
Auth0,Okta,Auth0,,Eugenio Pace; Federico Jack; Matias Woloski,,"USA, Washington, Seattle",,,,Auth0,"Auth0 is a cloud identity management SaaS application for the web, mobile, IoT, and internal software","From improving customer experience through seamless sign-on to making MFA as easy as a click of a button – your login box must find the right balance between user convenience, privacy and security. That’s why Okta and Auth0 have joined forces. Because we know together we can help you build a better solution for Customer Identity (CIAM) that will reduce security and compliance risks, improve your UX, and help your developers maximize their time. Basically, we make your login box awesome. Get Gartner’s 2022 overview of leading Access Management vendors. Learn more. Let’s take a look at everything you can do. Optimize for user experience and privacy. Use social login integrations, lower user friction, incorporate rich user profiling, and facilitate more transactions. Registration Anonymous User Bot Detection Registration Login Directory SSO Social Integrations Access Progressive Profiling Transactions Step-up Auth Auth Factors convenience privacy security report The Total Economic Impact of Auth0 11.7ᴹ Total benefit 548% ROI <6mo Payback Time powered by Forrester® ↗ whitepaper Build vs Buy: Guide to Identity Management 6 Signs You Need to Move From DIY to an Identity Management Solution video The Auth0 Product Tour A short tour through Auth0’s extensibility and uses for B2B, B2C, and B2E.",https://auth0.com,,Company,,Company,,,Consumer; Enterprise,ID; IAM,,,,2013,,,,https://auth0.com/blog/,https://auth0.com/blog/rss.xml,,https://www.crunchbase.com/organization/auth0,,,,,,
Auth0,,Okta,,Frederic Kerrest; Todd McKinnon,10000+ Organizations; JetBlue; Nordstrom; Siemens; Slack; T-Mobile; Takeda; Teach for America; Twilio,"USA, California, San Francisco",,,,Okta,Everything you need to build or integrate authentication and user management,"Okta is the leading independent identity provider. The Okta Identity Cloud enables organizations to securely connect the right people to the right technologies at the right time. With more than 7,000 pre-built integrations to applications and infrastructure providers, Okta provides simple and secure access to people and organizations everywhere, giving them the confidence to reach their full potential. More than 10,000 organizations, including JetBlue, Nordstrom, Siemens, Slack, T-Mobile, Takeda, Teach for America, and Twilio, trust Okta to help protect the identities of their workforces and customers.<br>",https://okta.com,,Company,,Company,,,Enterprise,IAM,,,,2009,https://github.com/originalmy,https://twitter.com/okta,https://www.youtube.com/OktaInc,https://www.okta.com/blog/; https://developer.okta.com/blog/,https://developer.okta.com/feed.xml,,https://www.crunchbase.com/organization/okta,https://www.linkedin.com/company/okta-inc-/,,,,,
Bonifii,CULedger,Bonifii,,Darrell O'Donnell; John Ainsworth; Julie Esser,Sovrin Steward; Indicio; Over 70 Partners,"USA, Colorado, Denver",,,,Bonifii,"Bonifii is an innovative financial technology company that proactively protects credit union members from becoming victims of financial fraud by providing safe, secure, and convenient identity verification solutions.","Bonifii is a credit union-owned CUSO (credit union service organization) that focuses on delivering innovative applications on a global distributed ledger (DLT) or blockchain platform for credit unions. Blockchain has already been substantiated as a critical element of digital transformation. In working through a national consortium of credit unions and trusted industry partners, CULedger is uniquely positioned to help credit unions serve the digital needs of its members.<br><br>CULedger’s credit union-specific distributed ledger technology gives the credit union industry the edge it needs to remain competitive in the rapidly changing financial services industry. CULedger is not just about having a seat at the table as it relates to this technology. Credit unions will be able to implement the technology and utilize the current and future applications that run on it.<br><br>The development of CULedger was made possible through the efforts of many partners including the 70+ credit unions, CUSOs, and industry partners that made a contribution toward the ""research to action"" initiative, Best Innovation Group, the Credit Union National Association, the National Credit Union CEO Roundtable, The Mountain West Credit Union Association and Evernym (www.Evernym.com). Evernym developed the Sovrin Platform (www.sovrin.org).",https://bonifii.com/,,Company,,Company,,,Clients,Banking,,,,2017,,,,https://www.bonfii.com/resources-blog,https://www.bonfii.com/resources-blog?format=rss,,https://www.crunchbase.com/organization/culedger,,,,,,
Bonifii,,CULedger,,Darrell O'Donnell; John Ainsworth; Julie Esser,Sovrin Steward; Bonifii,,,,,CULedger,"CULedger is a credit union-owned CUSO (credit union service organization) that is creating the premier platform of digital exchange for financial cooperatives globally. In working through a national consortium made up of credit unions and trusted industry investors, CULedger has pioneered new developments related to global self-sovereign decentralized identity, MemberPassTM, that will further enhance the trust credit unions have with their members.<br><br>CULedger provides advantages to credit unions and their members by reducing risks associated with cybersecurity and fraud, improving member experience, streamlining internal processes and reducing administrative and operational costs. To learn more about MemberPass, visit www.memberpass.com or follow the company on the CULedger Facebook, LinkedIn or Twitter.",,https://culedger.com,Sovrin Steward,Organization,Rebrand,Company,,,Consumer,Banking,,,,2017,,https://twitter.com/CULedger/,https://www.youtube.com/channel/UCPcopipop1XTBdYkM2VHPfw,,,,https://www.crunchbase.com/organization/culedger,https://www.linkedin.com/company/27238176/,,,,,
Danube,,Danube,,Markus Sabadello,Sovrin Steward; DIF; DHS; RWoT; IIW; ESSIFLab,"European Union, Austria, Wien, Vienna",Europe,,IIW; RWoT,Danube Tech,"Danube Tech works on advanced Information and Communication Technologies (ICTs) in the field of digital identity and personal data. Following the NSA surveillance scandal, the fall of the Safe Harbor agreement, the E.U.'s new General Data Protection Regulation (GDPR), and several other developments, questions around control, privacy, and economic value of personal data are leading to new legal frameworks, business models, and technological architectures and protocols.<br><br>Danube Tech and its partners are working on several emerging technologies, including: 1. The XDI protocol, 2. The FreedomBox personal server, and 3. Blockchain-based identifier registration services.<br><br>Grown out of a background of Internet freedom activism and grassroots hacker culture, we continue to explore digital developments at the edge of important political and social questions. We contribute to ongoing discourse about anonymity vs. veronymity, centralization vs. decentralization, as well as sovereign and user-centric digital identity.","Danube Tech works on advanced Information and Communication Technologies (ICTs) in the field of digital identity and personal data. We explore questions around control, privacy, and economic value of personal data that are leading to new legal frameworks, business models, and technological architectures and protocols.<br><br>Danube Tech and its partners are now focused on developing technologies and products for the W3C Decentralized Identifiers (DIDs) standard. We are building the bridges that interconnect all decentralized identity networks globally. 
This enables interoperable identity applications and services for everyone.",https://danubetech.com/,,Company,,Company,,,Enterprise,ID; Data; Privacy,,Universal Resolver; BTCR; Indy; ERC725,DID; Verifiable Credentials; OAuth; ,2015,https://github.com/danubetech; https://github.com/projectdanube,https://twitter.com/peacekeeper,,https://medium.com/@markus.sabadello,https://medium.com/feed/@markus.sabadello,,https://www.crunchbase.com/organization/danube-tech,https://www.linkedin.com/company/danube-tech,,,,,
Danube,German Blockchain Association,,https://web.archive.org/web/20181117025930/https://www.bundesblock.de/wp-content/uploads/2018/10/ssi-paper.pdf,,,,,,,New Position Paper: Self Sovereign Identity defined,"In a SSI proof-of-concept during the first half of 2018, 3 banks, an insurance company, the Austrian Post, and an institution representing notaries has cooperated to implement a range of use cases based on DIDs, Verifiable Credentials, Sovrin, and the XDI protocol. The use cases included:<br> * digital ID onboarding for existing clients,<br> * SSO for new clients,<br> * sharing of KYC data between organizations,<br> * dynamic data verification (change-of-address),<br> * secure communication (e-mail with ID confirmation),<br> * change of identity service providers,<br> * Personal ID verification in a peer-to-peer marketplace<br><a href=""https://www.Hyperledger.org/blog/2018/08/15/developer-showcase-series-markus-sabadello-Danube-tech"">Developer Showcase Series: Markus Sabadello, Danube Tech</a><br> I have worked on digital identity technologies for a long time, the question of who we are, how we present ourselves, and what do others know about us in the digital world. There’s this concept of user-centric identity, and more recently self-sovereign identity, which places individuals at the center of their online relationships and transactions, and gives us all the ability to create, manage, use, and destroy our online identities according to our own rules.",,https://serverprofis.bundesblock.de/new-position-paper-self-sovereign-identity-defined/,,Paper,,Meta,,,,,,,,2018-11-15,,,,,,,,,,,,,
Danube,UDHR,,,,,,Global,,,The Universal Declaration of Human Rights,"The Universal Declaration of Human Rights (UDHR) is a document that acts like a global road map for freedom and equality – protecting the rights of every individual, everywhere. It was the first time countries agreed on the freedoms and rights that deserve universal protection in order for every individual to live their lives freely, equally and in dignity.
The UDHR was adopted by the newly established United Nations on 10 December 1948, in response to the “barbarous acts which […] outraged the conscience of mankind” during the Second World War. Its adoption recognized human rights to be the foundation for freedom, justice and peace.
Work on the UDHR began in 1946, with a drafting committee composed of representatives of a wide variety of countries, including the USA, Lebanon and China. The drafting committee was later enlarged to include representatives of Australia, Chile, France, the Soviet Union and the United Kingdom, allowing the document to benefit from contributions of states from all regions, and their diverse religious, political and cultural contexts. The UDHR was then discussed by all members of the UN Commission on Human Rights and finally adopted by the General Assembly in 1948.",,https://www.amnesty.org/en/what-we-do/universal-declaration-of-human-rights/,,Paper,,Policy,Cross,,,Humanitarian,,,,1948,,,,,,,,,,,,,
Danube,ICCPR,,,,,,Global,,,International Covenant on Civil and Political Rights,"ICCPR is an international human rights treaty adopted in 1966. The UK agreed to follow ICCPR in 1976. It enables people to enjoy a wide range of human rights, including those relating to:
- freedom from torture and other cruel, inhuman or degrading treatment or punishment
- fair trial rights
- freedom of thought, religion and expression
- privacy, home and family life
- equality and non-discrimination",,https://www.equalityhumanrights.com/en/our-human-rights-work/monitoring-and-promoting-un-treaties/international-covenant-civil-and,,Paper,,Policy,Cross,,,Humanitarian,,,,1966,,,,,,,,,,,,,
Danube,ICESCR,,,,,,Global,,,"International Covenant on Economic, Social and Cultural Rights","The International Covenant on Economic, Social and Cultural Rights (ICESCR) is a multilateral treaty adopted by the United Nations General Assembly (GA) on 16 December 1966 through GA. Resolution 2200A (XXI), and came in force from 3 January 1976.[1] It commits its parties to work toward the granting of economic, social, and cultural rights (ESCR) to the Non-Self-Governing and Trust Territories and individuals, including labour rights and the right to health, the right to education, and the right to an adequate standard of living. As of July 2020, the Covenant has 171 parties.[3] A further four countries, including the United States, have signed but not ratified the Covenant.",,"https://en.wikipedia.org/wiki/International_Covenant_on_Economic,_Social_and_Cultural_Rights",,Paper,,Policy,Cross,,,Humanitarian,,,,1966,,,,,,,,,,,,,
Danube,CRPD,,,,,,Global,,,Convention on the Rights of People with Disabilities,"The United Nations Convention on the Rights of Persons with Disabilities (CRPD) is an international human rights treaty adopted in 2006 that reaffirms that all persons with disabilities must enjoy all human rights and fundamental freedoms.
It clarifies that all persons with disabilities have the right to participate in civil, political, economic, social and cultural life of the community.",,https://www.edf-feph.org/un-crpd/,,Paper,,Policy,Cross,,,Humanitarian,,,,2023,,,,,,,,,,,,,
Danube,ECHR,,,,,,Europe,,,European Convention on Human Rights,"The European Convention on Human Rights (ECHR) protects the human rights of people in countries that belong to the Council of Europe.
All 47 Member States of the Council, including the UK, have signed the Convention. Its full title is the ‘Convention for the Protection of Human Rights and Fundamental Freedoms’.
What is the Council of Europe?
Formed in 1949, the Council of Europe is completely separate from the European Union and much larger, with 47 members compared to the EU’s 28. The UK became a Council member 24 years before it joined the EU. The UK’s membership of the Council would be unaffected if it left the EU",,https://www.equalityhumanrights.com/en/what-european-convention-human-rights,,Paper,,Policy,Cross,,,Humanitarian,,,,2017-04-19,,,,,,,,,,,,,
Danube,CFREU,,,,,,Europe,,,Charter of Fundamental Rights of the European Union,"The Charter of Fundamental Rights of the European Union brings together the most important personal freedoms and rights enjoyed by citizens of the EU into one legally binding document. The Charter was declared in 2000, and came into force in December 2009 along with the Treaty of Lisbon",,https://www.citizensinformation.ie/en/government_in_ireland/european_government/eu_law/charter_of_fundamental_rights.html,,Paper,,Policy,Cross,,,Humanitarian,,,,2023-01-31,,,,,,,,,,,,,
Danube,HRHP,,,,,,Europe,,,Human Rights Handbook for Parliamentarians,"Human rights have pervaded much of the political discourse since the Second World War. While the struggle for freedom from oppression and misery is probably as old as humanity itself, it was the massive affront to human dignity perpetrated during that War, and the need felt to prevent such horror in the future, which put the human being back at the centre and led to the codification at the international level of human rights and fundamental freedoms. Article 1 of the Charter of the United Nations declares “promoting and encouraging respect for human rights and for fundamental freedoms for all without distinction as to race, sex, language, or religion” as one of the purposes of the Organization.
The Universal Declaration of Human Rights, adopted by the United Nations General Assembly in 1948, was the first step towards achieving this objective. It is seen as the authoritative interpretation of the term “human rights” in the Charter of the United Nations. The Universal Declaration together with the International Covenant on Civil and Political Rights and the International Covenant on Economic, Social and Cultural Rights, both adopted in 1966, constitute what has become known as the International Bill of Human Rights. Since 1948, human rights and fundamental freedoms have indeed been codified in hundreds of universal and regional, binding and non-binding instruments, touching almost every aspect of human life and covering a broad range of civil, political, economic, social and cultural rights. Thus, the codification of human rights has largely been completed. As the Secretary-General of the United Nations, Mr. Kofi Annan, has recently pointed out, today’s main challenge is to implement the adopted standards",,https://www.refworld.org/docid/46cea90d2.html,,Paper,,Policy,Cross,,,Humanitarian,,,,2005-08-01,,,,,,,,,,,,,
DigitalBazaar,,DigitalBazaar,,Dave Longley; Manu Sporny,IETF; Web Payments IG; VCTF; CCG; DHS; Veres One; W3C; GS1; SecureKey; TradeLens; Sovrin Steward; Founding Sovrin Steward; USC&B,"USA, Virginia, Blacksburg",USA,,,Digital Bazaar,"Digital Bazaar, Inc. develops technology, services, and products that help integrate Linked Data, identity, and payments into the core architecture of the Web.","We have over a decade of extensive experience with web standards at the World Wide Web Consortium (W3C) and the Internet Engineering Task Force including leadership roles in the Web Payments Interest Group as well as the Verifiable Claims Task Force and the Credentials Community Group at the W3C.<br><br>Digital Bazaar is deeply involved in the latest Web research and development standards groups including XHTML+RDFa, HTML5+RDFa, Semantic Web, OpenID Connect, and WebID.<br><br>Digital Bazaar also oversees technical development of core Web technologies across a wide variety of technology areas and has been a primary driving force in getting open identity and Linked Data technologies like JSON-LD and RDFa adopted at companies like Google, Microsoft, Yahoo!, Facebook and agencies in the US Federal Government.",https://digitalbazaar.com/,https://i.imgur.com/v2ZuWeL.jpg,Company,,Company,,,Enterprise,ID; Payments,,Encrypted Data Vaults; Linked Data,XHTML+RDFa; HTML5+RDFa; Semantic Web; OpenID Connect; WebID; JSON-LD; RDFa; Verifiable Credentials; DID,2004,https://github.com/digitalbazaar,https://twitter.com/digitalbazaar,https://www.youtube.com/channel/UCZv6VnzDx2pj_slpqKxTvUQ,,,,https://www.crunchbase.com/organization/digital-bazaar,https://www.linkedin.com/company/digital-bazaar-inc-/,,,,,
Disco,,Disco,,Evin McMullen; Sarah Ruddy,,"USA, New York, NYC",USA,,,Disco.xyz,"Disco is your identity for the metaverse.<br>Our friendly tools make it easy for you to carry your data from web2 to Web3, under your ownership and control. We believe in the power of equality, ownership and joyful experiences. We are building autonomy and freedom for all blockchains, all apps and all people.","Disco brings fun to the Metaverse with self-sovereign identity. Disco enables users to enjoy nuanced Web3 reputation associated to public identifiers across chains and web2, while maintaining privacy and user autonomy. Disco profiles coming Spring 2022",https://www.disco.xyz/,,Company,,Company,Web3,,Consumer,Reputation,,Cryptography; governance frameworks,,2022,,,,https://mirror.xyz/0xaf115b18eE30734f6CeA1C56BE76615df046e010,https://disco.mirror.xyz/feed/atom,,,https://www.linkedin.com/company/disco-xyz/,,,,,
Dock,,Dock,,Elina Cadouri; Nick Macario,,"USA, California, San Francisco",USA,,,Dock,"Dock was founded with a mission to solve universal problems with existing data solutions: data silos and gatekeepers, untrusted and inaccurate information, incompatibilities across platforms, inefficiencies with verifying data, and lack of control and privacy for users.<br><br>In today’s world, accurate and individually-owned data is more important than ever. Our ability to navigate through society depends heavily on being able to accurately capture and prove various pieces of information that represent who we are and our accomplishments.<br><br>These pieces of information can be anything from a passport proving our identity and citizenship, a graduation diploma proving our education, or a vocational license proving our ability to work in a designated field. Digital credentials are virtual representations of these important pieces of data which are essential in our lives and careers, but there are many problems with how this data is captured, shared, and controlled.<br><br>Dock is open and permissionless across our technology, network and governance. By enabling any organization or developer to issue via Dock, we can work together across markets and industries to unlock a better future world powered by secure, individually-owned verifiable credentials.","There is a problem in the digital economy. Paper and PDFs are easy to fake. Verifying the authenticity of a document or certificate is slow and manual. And if you don't verify them, you risk fraud. That's why world-class organisations use Verifiable Credentials to verify documents instantly. Verifiable Credentials are documents that contain a crypto signature: a permanent stamp that allows anyone to confirm you issued that credential. They are fraud-proof and verifiable with one click, creating instant trust between people and organisations in the digital economy. 
Dock provides organizations with all the infrastructure and tools to issue and verify credentials on the blockchain. Create your identity on blockchain and issue your first Verifiable Credentials in seconds. Certs intuitive no-code dashboard lets you customize and issue certificates in a few clicks. It's the preferred solution for those who want to issue VCs without having to touch any code. "We are confident that Dock is able to support us in scaling up our projects regarding SSI solutions within government and beyond” Do you want to issue Verifiable Credentials from your existing system? Certs API enables developers to easily and instantly issue, verify, manage, and revoke Verifiable Credentials and Decentralized Identities on the Dock Blockchain. “A decentralised option that maintains highest levels individual data privacy and integrity." Build a Verifiable Credentials wallet inside your app, and allow your users to receive, store and manage their credentials and DOCK tokens. Built for React Native applications with added support for Polkadot-JS. Available for iOS and Android. “Together with the Dock team we are bringing digital empowerment to the people.” Dock’s substrate-based blockchain provides an open-source, decentralized, and low-cost platform for organizations and developers to build Decentralized Identity and data applications for the Web3. 
Easy-to-use and open-source framework especially built for developers and enterprises to develop and scale DID products with cutting-edge innovations and quick upgrades Incorporating standards from the industry-leading World Wide Web Consortium (W3C) and VCDM to facilitate data exchange with other platforms seamlessly Tamper-proof data management that is exceedingly secure and cryptographically verifiable ensuring trust and privacy in data exchange and management Integrated with Parity’s Frontier, deploy smart contacts written on solidity and interact with them using existing Ethereum libraries such as Web3 or ethers.js With the same consensus as Polkadot, Dock’s blockchain is especially designed to build enterprise-grade products with high efficiency, scalability, and speed Built with Nominated Proof of Stake model that is validator-friendly, ultra-low-cost, and energy-efficient with lowest carbon footprint Dock’s technology stack unlocks endless use cases from a wide variety of sectors including DeFi, supply chain, healthcare, metaverse, human resource, academic institutions, trading platforms and many more. Read below how your organization can benefit from our technology. Ensure compliance and simplify access to financial services Empower learners with secure verifiable credentials Create safer, ethical, and more efficient supply chains Streamline healthcare credential verification and monitoring Easily issue and verify data while protecting their privacy Provide credentials that are cryptographically verifiable Enable login access to platforms and apps without storing any personal data Provide a privacy preserving yet cryptographically provable identity Working with some of the best names in the Verifiable Credentials and blockchain If you think we can build something together, contact us here Download Dock’s Wallet App and take back control of your DOCK tokens. Send, receive, and manage your DOCK tokens without the involvement of a third party. 
Check out what we have built and what we are working on, on Dock’s Roadmap. Help further the growth and adoption of the Dock network and join our Grant Program. Design and develop projects that reach specific objectives and be rewarded. Help increase the brand awareness of Dock by becoming a Dock Captain. Create your own content, videos, and more, or repurpose Dock’s content and be rewarded.",https://dock.io,,Company,,Company,Web3,,Consumer,Bridge Web2 Profiles,,Ethereum,DID,2017,https://github.com/docknetwork,https://twitter.com/docknetwork,https://www.youtube.com/channel/UC8vcF6sIhussJ6nZsSid_cA,https://blog.dock.io/,https://blog.dock.io/rss/,,https://www.crunchbase.com/organization/dock-io,https://www.linkedin.com/company/docknetwork/,,,,,
|
||
EnergyWeb,,EnergyWeb,,Ana Trbovic,,"European Union, Germany, Berlin",Europe,,,EnergyWeb,We build open-source Web3 technologies that help companies navigate the energy transition,Energy Web technology is powering decarbonization solutions in dozens of countries,https://www.energyweb.org/,,Company,,Company,Energy,,Enterprise,Green energy,,,,2017,,,,https://energywebx.medium.com/,https://medium.com/feed/@energywebx,https://discord.com/invite/psraNwqGqp,https://www.crunchbase.com/organization/energy-web-foundation,https://www.linkedin.com/company/energywebx/,,https://lab.energyweb.org/,,,
|
||
Evernym,Avast,Evernym,,Jason Law; Timothy Ruff,Sovrin Steward; Founding Sovrin Steward; DIF; ESSIFLab,"USA, Utah, Draper",USA,,,Evernym,"When you work with Evernym, you work with the world’s leading expert in decentralized identity. With deep skills in digital identity, cryptography, privacy, security and new governance frameworks, we are the original developers of Hyperledger Indy and the creator of the Sovrin Network and the Sovrin Foundation.<br><br>We are passionate about open source and open standards, so there’s no vendor lock-in to our solutions. We believe in true data interoperability and delivering the highest levels of security and privacy in the market, and our software makes it easy and simple to connect, share, and rely on trusted digital information.","While the risk of fraud and data misuse is increasing, decentralized identity and credentials are meeting the demands of businesses across the digital identity value chain with: - Enhanced security - Privacy & user experience with the ability to easily consent - Shareable & verifiable claims without having to disclose sensitive data With this report, access promising use cases, risks and considerations, and expert recommendations on creating value for the fully decentralized future. Gartner®, Innovation Insight for Decentralized Identity and Verifiable Claims, 18 August 2021, Michael Kelley, David Mahdi. Gartner does not endorse any vendor, product or service depicted in its research publications, and does not advise technology users to select only those vendors with the highest ratings or other designation. Gartner research publications consist of the opinions of Gartner’s research organization and should not be construed as statements of fact. Gartner disclaims all warranties, express or implied, with respect to this research, including any warranties of merchantability or fitness for a particular purpose. GARTNER is a registered trademark and service mark of Gartner, Inc. 
and/or its affiliates in the U.S. and internationally, and is used herein with permission. All rights reserved.",http://www.evernym.com,https://www.evernym.com/wp-content/uploads/2017/06/evernymBarebell_new2.png,Company,,Company,,,Enterprise,ID; privacy; security,VCI,,Verifiable Credentials; DID,2013,,https://twitter.com/evernym,https://www.youtube.com/c/Evernym,https://www.evernym.com/blog/,https://www.evernym.com/feed/,,https://www.crunchbase.com/organization/evernym,https://www.linkedin.com/company/evernym-inc-/,,,,,
|
||
Evernym,,Avast,,Jason Law; Timothy Ruff,,"Czech Republic, Prague",,,,Avast,Avast is a security software development company that protects people from threats on the internet.,,https://www.avast.com/,,Company,,Company,Cybersecurity,,Consumer; Enterprise,Virus Protection,,,,1988,,,,https://blog.avast.com/,https://blog.avast.com/rss.xml,,https://www.crunchbase.com/organization/evernym,,,,,,
|
||
Evernym,Evernym,,,,Verityflow,,,,,Creating a new verification flow in Verity Flow,"Evernym product manager Candice Ward shows the process of creating a custom verification workflow in Evernym's no-code visual interface, Verity Flow.",,https://www.youtube.com/watch?v=VYuoi_LMxiA,,Video,,HowTo,,,,,,,,2021-08-26,,,,,,,,,,,,,
|
||
Evernym,Evernym,,,,Verityflow,,,,,Verifying credentials using Verity Flow,"Evernym product manager Candice Ward demonstrates the process of requesting and verifying credentials using Evernym's no-code visual interface, Verity Flow.<br><br>See also: Part 2 - Creating a custom verification prompt: [https://youtu.be/VYuoi_LMxiA](https://youtu.be/VYuoi_LMxiA)",,https://www.youtube.com/watch?v=9d2qmzw4bxy,,Video,,HowTo,,,,,,,,2021-08-26,,,,,,,,,,,,,
|
||
Evernym,Evernym,,,,Verityflow,,,,,Verity Flow: Evernym's no-code solution for issuing and verifying digital credentials,"On our August 2021 webinar, Evernym's product team provided a first look at Verity Flow, our new no-code solution for issuing and verifying credentials.<br><br>We covered:<br><br>- An overview of Verity Flow, including a demo and what you can expect from our upcoming release<br>- How it’s used today, by 1,500+ lab accounts within the IATA Travel Pass ecosystem<br>- An update on our product roadmap, including support for the cheqd network and a new identity verification capability",,https://www.youtube.com/watch?v=nafqcqiycjy,,Video,,HowTo,,,,,,,,2021-08-26,,,,,,,,,,,,,
|
||
Evernym,PRNewswire,,,,,LONDON,,,,Sovrin Foundation Launches First Dedicated Self-Sovereign Identity Network,"Evernym, Inc. announced today at the Ctrl-Shift Personal Information Economy conference that it has donated the intellectual property for the Sovrin Identity Network—the world's first and only dedicated self-sovereign identity platform—to a newly-formed nonprofit organization. The Sovrin Foundation, which is run by a group of internationally recognized identity experts, has a mission to empower everyone with a digital identity which they fully own and control.","Sovrin Foundation Launches First Dedicated Self-Sovereign Identity Network Sep 29, 2016, 02:00 ET LONDON, Sept. 29, 2016 /PRNewswire-USNewswire/ -- Evernym, Inc. announced today at the Ctrl-Shift Personal Information Economy conference that it has donated the intellectual property for the Sovrin Identity Network—the world's first and only dedicated self-sovereign identity platform—to a newly-formed nonprofit organization. The Sovrin Foundation, which is run by a group of internationally recognized identity experts, has a mission to empower everyone with a digital identity which they fully own and control. ""Imagine a world where fraud is reduced, logins are simpler and more secure, governments can slash red tape, and healthcare practitioners can provide care with patients' immediate consent,"" said Dr. Phillip Windley, Sovrin Foundation's inaugural Chair. ""Beyond these applications, the potential is limitless when global impact is considered. Developing nations will finally have an identity solution to underpin birth registration, land ownership, vaccination and refugee tracking."" The underlying problem Sovrin solves is that the Internet was designed to identify machines, but has no standard way to identify people. This new platform utilizes distributed ledger technology, a close cousin to Bitcoin's underlying blockchain, but specifically tailored to identity. 
Sovrin imparts not only full control to the user over their identity, but absolute sovereignty: no one can read it, use it, change it, or turn it off without the user's explicit consent. When identity is ""self-sovereign"", it becomes a hub for many types of interactions like secure messaging, data sharing, and the management of consent. These capabilities enable businesses to transition from being identity providers—typically a cost center—to being identity consumers, and putting users in control leads to higher customer satisfaction. ""Governments and private industry waste hundreds of billions a year on inefficient and inaccurate identity proofing measures, which rarely if ever put the consumer first,"" Timothy Ruff, Evernym's CEO, said. ""We recognized that a completely new platform was needed to enable universal digital identity, and for it to be trusted it needs to belong to the world and not to us."" To learn more visit http://www.sovrin.org. About The Sovrin Foundation Founded in September 2016, the Sovrin Foundation is a private-sector, international non-profit body for coordinating the global, stable operation of the Sovrin Identity Network. Supported by a Board of Trustees, Technical Governance Board, Executive Director and Staff, the Sovrin Foundation is the first of its kind. Sovrin's partners include global, national and local businesses, nonprofits, government, and civic organizations, along with developers, volunteers, health providers, donors, and more. For more information about Sovrin, visit http://www.sovrin.org or follow us on Twitter: @SovrinID and #Sovrin. SOURCE The Sovrin Foundation",https://www.prnewswire.com/news-releases/sovrin-foundation-launches-first-dedicated-self-sovereign-identity-network-300336702.html,,Press,,Meta,,,,,,,,2016-09-29,,,,,,,,,,,,,
|
||
Evernym,Evernym,,,Samuel M. Smith; Dmitry Khovratovich,,,,,,Identity System Essentials,"The purpose of this white paper is to describe the essential characteristics of an identity system that provides sovereignty, security and privacy. Here the meaning of identity is derived from the characteristics of the identity system, that is, what the identity system provides. Instead of defining identity a priori, this white paper describes an identity system and then defines identity within the context of that identity system. Many of the features of the identity system have been influenced and inspired by other proposed systems such as Open Reputation. This paper argues that an identity system that simultaneously provides a high degree of sovereignty, security and privacy is best obtained via an open platform that employs distributed consensus protocols and modern cryptographic techniques.",,https://www.evernym.com/wp-content/uploads/2017/02/identity-system-essentials.pdf,,Whitepaper,,Meta,,,,,,,,2017-02,,,,,,,,,,,,,
|
||
Evernym,Evernym,,,,Aries; Trinsic; IBM; IDramp; Esatus,,,,,Evernym’s Connect.Me,"Connect.Me<br>Our consumer digital wallet app<br>Enable customers and end users to manage all of their digital credentials from the safety of their own phone<br>Engage in structured two-way messaging over secure and private channels<br>Eliminate excess data collection with zero-knowledge proof technology, and other cutting-edge privacy features",,https://www.evernym.com/connectme/,,Product,,Product,,,,,,,,2021-09-27,,,,,,,,,,,,,
|
||
Evernym,DHS,,,,,,,,,News Release: DHS S&T Awards $749K to Evernym for Decentralized Key Management,"“Managing public and private cryptographic keys in existing public key infrastructure as well as permissioned and permission-less blockchains continues to be a difficult challenge,” said S&T Identity Management Program Manager Anil John. “Through this project, Evernym will push the limits of the emerging decentralized key management system technology to deliver a high level of comfort to the public and American businesses as they integrate blockchain technologies into their technology portfolio.”",,https://www.dhs.gov/science-and-technology/news/2017/07/20/news-release-dhs-st-awards-749k-evernym-decentralized-key,,Press,,Press,,,,,,,,2017-07-20,,,,,,,,,,,,,
|
||
Evernym,Evernym,,,,Trinsic; IBM; Lissi; esatus,,,,,Evernym’s Verity,"Our flagship product for verifiable credential exchange<br>Issue and verify digital credentials<br>Easily integrate with back-end systems, using our REST API and SDKs in Java, Node.Js, Python, and .NET<br>Build for scale, with enterprise-grade architecture designed to support millions of users.<br>Enable open ecosystems and true data portability, with a solution based on open standards and interoperability",,https://www.evernym.com/verity/,https://evernym.wpenginepowered.com/wp-content/uploads/2021/10/verity-product.png,Product,,Product,,,,,,,,2021-10-10,,,,,,,,,,,,,
|
||
Evernym,Globalnewswire,,,,,,,,,IOTA and Evernym Launch Collaboration Aimed at Making the Internet of Things More Secure,,"“Evernym and IOTA are both intensively working toward achieving the same goal,” said IOTA founder David Sønstebø. “That is, a world where distributed ledgers facilitate the secure and efficient exchange of resources and data between all connected entities. This is a natural pairing and the world should pay attention to the exciting products that result from it.”",https://globenewswire.com/news-release/2017/08/31/1106292/0/en/IOTA-and-Evernym-Launch-Collaboration-Aimed-at-Making-the-Internet-of-Things-More-Secure.html,,Press,,Press,,,,,,,,2017-08-31,,,,,,,,,,,,,
|
||
Evernym,Globalnewswire,,,,,,,,,Evernym rolls with auto industry association MOBI to promote SSI in automotive and IoT,,"Cars, like people, have a digital identity problem that Evernym, a technology company focused on digital identity, wants to help solve. Cars that connect online will soon need to assert their own identities and be able to verify people’s identities in ways unthinkable just a few years ago. Is this replacement component a safe one? Can I let it access the car’s network? Is this person authorized to change my settings or drive me?",https://globenewswire.com/news-release/2018/10/05/1617425/0/en/Evernym-rolls-with-auto-industry-association-MOBI-to-promote-SSI-in-automotive-and-IoT.html,,Press,,Press,,,,,,,,2018-10-05,,,,,,,,,,,,,
|
||
Evernym,Evernym,,,,Hyperledger Foundation; Sovrin,,,,,Evernym's contributions to Hyperledger and Sovrin,,Evernym's contributions to Hyperledger and Sovrin. Video contents are listed here: https://wiki.hyperledger.org/display/indy/Evernym+Sprint+Demos,https://www.youtube.com/playlist?list=PLRp0viTDxBWGLdZk0aamtahB9cpJGV7ZF,,Meta,,Playlist,,,,Development,,,,2020-05-22,,,,,,,,,,,,,
|
||
Evernym,Globalnewswire,,,,,,,,,15 Industry Leaders Join Evernym’s Global Accelerator to Build the Future of Digital Identity.,,"Founding members of the Accelerator include industry leading organizations ATB Financial, IAG, Irish Life, the International Federation of Red Cross, Spark New Zealand, Truu and three provincial and state governments. Collectively, these organizations represent the interests of 100's of millions of individuals worldwide.",https://globenewswire.com/news-release/2018/11/07/1647044/0/en/15-Industry-Leaders-Join-Evernym-s-Global-Accelerator-to-Build-the-Future-of-Digital-Identity.html,,Press,,Press,,,,,,,,2018-11-07,,,,,,,,,,,,,
|
||
Factom,,Accumulate,,,,,,,,Accumulate Network,"Accumulate’s story starts with the founding of Factom in 2014, a data publishing layer atop major blockchains. In 2021, Factom was acquired by Inveniam Capital Partners, bringing along lead engineers Paul Snow and Jay Smith. Inveniam Capital Partners created the Defi Devs subsidiary to be lead developers in the Accumulate community.<br><br>The Accumulate protocol is based on many of the best concepts that came out of the Factom protocol, including its data and identity focus, while combining the components in a new and unique configuration.<br><br>The Accumulate protocol is designed by Paul Snow. Paul Snow is the Chief Blockchain Scientist at Inveniam and Defi Devs. Previously, he was the CEO and chief architect of the Factom protocol and co-author of the Factom White Paper, developing and implementing a “multi-leader” consensus algorithm for the blockchain network. Of note, he was founder and chief architect for DTRules, an open-source project providing decision table-based rules engines. He is listed as inventor on many of Factom’s 40+ patents, both issued and in progress, which serve as a foundation for Accumulate.",,https://accumulatenetwork.io/,,Company,,Company,Web3,,,Data,,Blockchain,"DID,Verifiable Credentials",2021-08,,https://twitter.com/accumulatehq,,https://accumulatenetwork.io/blog/,https://accumulatenetwork.io/feed/,https://discord.gg/X74hPx8VZT,https://www.crunchbase.com/organization/accumulate-358f,https://www.linkedin.com/company/accumulatenetwork/,https://accumulatenetwork.io/whitepaper/,https://docs.accumulatenetwork.io/,,,
|
||
Gataca,,Gataca,,Irene Hernandez; Samuel Gómez,ESSIFLab,"USA, Massachusetts, Boston",Europe,,,Gataca,"Gataca is a cybersecurity company founded in Boston, MA, at the heart of MIT’s entrepreneurship and innovation ecosystem. It started as an academic research study seeking to reduce the risk of doing business online. As victims of the Equifax data breach later that year, the topic became very personal.<br><br>We built Gataca because we knew there had to be a better way to protect our data.",,https://gataca.io/,,Company,,Company,Enterprise,ID,,Personal Data,,,DID,2018,,https://twitter.com/gataca_id,https://www.youtube.com/channel/UCaoK-LYmCPiXThYpLOShgvg/,https://gataca.io/blog/,,,https://www.crunchbase.com/organization/gataca-4a8f,https://www.linkedin.com/company/gataca/,https://developer.global.id/documentation/index.html,https://developer.global.id/,,,
|
||
Gataca,Gataca,,,,,,,,,"Decentralized Finance & Self-sovereign Identity: A tale of decentralization, a new paradigm of trust",We are aware that DeFi’s growth is explosive and inevitable yet its growth needs to be sustainable and responsible. This can be done with SSI.,,https://gataca.io/blog/decentralized-finance-self-sovereign-identity-a-tale-of-decentralization-a-new-paradigm-of-trust/,,Post,,Explainer,,DWeb,DeFi,,,,,2021-05-07,,,,,,,,,,,,,
|
||
Gataca,Gataca,,,,,,,,,SSI Essentials: Everything you need to know about Decentralized Identity,"Solving the identity paradox: the tradeoff between privacy, security, & user experience",,https://gataca.io/blog/ssi-essentials-everything-you-need-to-know-about-decentralized-identity/,,Post,,Explainer,,,,,,,,2021-11-29,,,,,,,,,,,,,
|
||
Gataca,Gataca,,,,,,,,,GATACA joins EU Commission’s Early Adopters Program as SSI provider in the Spanish group,"In Spain, three universities will pioneer the issuance of digital Academic Diplomas. The issuance will be performed 100% online, where students will authenticate themselves using a digital ID previously issued by FNMT (the Royal Mint of Spain) and stored in their mobile wallets.",,https://gataca.io/blog/gataca-joins-the-european-commission-s-early-adopters-program-as-the-ssi-technology-provider-in-the-spanish-group/,,Post,,Meta,,,,Real World,,,,2021-04-12,,,,,,,,,,,,,
|
||
Gataca,CyberNews,,,,,,,,,"Jose San Juan, GATACA: “blockchain technology has become the protagonist of the world we live in”","For the past 4 years, GATACA has focused the majority of its efforts on building an interoperable, secure, and user-friendly product for the European region. We not only plan to continue to focus on the needs of our clients and regulatory, as well as standardization demands from the market but to take our SSI tech to the next level.",,https://cybernews.com/security/jose-san-juan-gataca-blockchain-technology-has-become-the-protagonist-of-the-world-we-live-in/,,Interview,,Meta,,,,,,,,2023-04-24,,,,,,,,,,,,,
|
||
Gataca,iGrantio,,Twitter,,ValidatedID; Danube; Waltid; DXCTechnology; CIMEA_Naric; identyum; ThalesDigiSec; Posteitaliane,,,,,Congrats to the 11 wallet providers for being conformant to @EU_EBSI,We are glad to be among the first few along with [@ValidatedID](https://mobile.Twitter.com/ValidatedID) [@Danube](https://mobile.Twitter.com/Danube) [@GATACA_ID](https://mobile.Twitter.com/GATACA_ID) [@walt_id](https://mobile.Twitter.com/walt_id) [@DXCTechnology](https://mobile.Twitter.com/DXCTechnology) [@CIMEA_Naric](https://mobile.Twitter.com/CIMEA_Naric) [@identyum](https://mobile.Twitter.com/identyum) [@ThalesDigiSec](https://mobile.Twitter.com/ThalesDigiSec) [@posteitaliane](https://mobile.Twitter.com/posteitaliane),,https://mobile.twitter.com/igrantio/status/1532036324882104321/photo/1,,Tweet,,Meta,,,,,,,,2022-07-01,,,,,,,,,,,,,
|
||
Gataca,Gataca,,,,,,,,,This is how GATACA achieves blockchain interoperability,blockchain agnosticism is possible due to our DID registry component: all incoming activity is delegated to the DID registry with specific connections to each blockchain so that the rest of our technology components do not have to participate in the process. Other components need not know where the information persists from; they delegate that special knowledge to the DID registry and continue to perform their regular activities as usual.,,https://gataca.io/blog/this-is-how-gataca-achieves-blockchain-interoperability,,Post,,Standards,,,,,,DID Registry,,2021-03,,,,,,,,,,,,,
|
||
Gataca,eSSIFlab,,,,,,,,,Verifier Universal Interface by Gataca España S.L.,This draft version can be found at [https://Gataca-io.GitHub.io/verifier-apis/](https://Gataca-io.GitHub.io/verifier-apis/) and has been built using ReSpec.<br>This draft version for VUI includes today 6 APIs:<br><br>- Presentation Exchange<br>- Consent Management<br>- Schema resolution<br>- Issuer resolution<br>- ID resolution<br>- Credential status resolution<br>,"Verifier Universal Interface (VUI) is an interoperability working group that aims at building a complete set of standard APIs for Verifier components in SSI ecosystems As different technology providers build SSI solutions, it becomes critical to ensure interoperability between these solutions. Available standards for SSI still have important gaps, leading us to an ecosystem of full-stack providers whose approach to interoperability is building proprietary plug-ins for each one of the other available solutions. This approach to interoperability is not scalable. The underlying problem is that building standards take time. That is the reason that we propose a practical and focused approach to enable scalable interoperability in the SSI community. We propose to start with a specific SSI component, namely the Verifier component, and lead the definition of the minimum set of standard APIs necessary to implement or interoperate with such module. That is, a role-centric approach to standardization at API level. To date, 12 organisations are contributing to this initiative. The VUI working group has already drafted a first version of a generic spec that integrates existing standards and interop efforts and fills the gaps to provide a complete set of APIs. This draft version can be found at https://bit.ly/3h5VE7P and has been built using ReSpec. 
This draft version for VUI includes today 6 APIs: - Presentation Exchange - Consent Management - Schema resolution - Issuer resolution - ID resolution - Credential status resolution Next steps As next steps, the Working Group (WG) needs to take this ground work to a more mature level. That is, to further define the specification by achieving consensus in the broader community, and bridging perspectives from DIF, W3C, EBSI, and Aries. The WG is organized in Working Packages (WP), one for each interface. Any participant can lead or contribute to WP, which shall integrate at least 2 Implementors and 1 Integrator. Implementors are responsible for defining the API, a set of interoperability tests, and service endpoints for Integrators to execute those tests. The WG has launched a survey in the broad SSI community and two of the 6 interfaces have been selected as initial WPs: Presentation Exchange Issuer Resolution Ready to contribute? To subscribe to this WG please refer to https://groups.io/g/vui Country: Spain Further information: https://Gataca.io Team: Gataca Spain GitLab: https://gitlab.grnet.gr/eSSIF-lab/infrastructure_2/Gataca",https://essif-lab.eu/verifier-universal-interface-by-gataca-espana-s-l/,,Spec,,Standards,,,,,,Verifier API,,2021-04-09,,,,,,,,,,,,,
|
||
GlobalID,,GlobalID,,Alka Gupta; Greg Kidd; Mitja Simcic,,"USA, California, San Francisco",USA,,,Global ID,"At GlobaliD, we’re building a universal identity solution that is easy to use, ties users to unique names and transcends borders and institutions.",,https://www.global.id/,,Company,,Company,Enterprise,ID,SSI,,VCI,,,2016,https://github.com/globalid,https://twitter.com/myglobal_id,https://www.youtube.com/channel/UCnMJDT8IXrg4Y5RDP4W0aOw,https://medium.com/global-idd,https://medium.com/feed/global-idd,,https://www.crunchbase.com/organization/global-id,https://www.linkedin.com/company/global-id-inc/,,,,,
|
||
GlobalID,GlobalID,,Medium,,Future Proof,,,,,Everyone will have an ID wallet,"how ID wallets work within the digital identity ecosystem, briefly explains the trust triangle, and previews the GlobaliD Wallet which will be released later this year","FUTURE PROOF EP 18 — Everyone will have an ID wallet In this episode, we speak with Justin Downey, product marketing manager at GlobaliD about ID wallets. Justin explains how ID wallets work within the digital identity ecosystem, briefly explains the trust triangle, and previews the GlobaliD Wallet which will be released later this year. Past episodes: - EPISODE 17 — Digital wallets of tomorrow will be PRIVATE - EPISODE 16 — How XUMM Wallet is changing the game - EPISODE 15 — Olympic hopeful Lila Lapanja is a GlobaliD ambassador - EPISODE 14 — What we learned at Solana Breakpoint - EPISODE 13 — DeFi and Identity: Compliance in a decentralized world - EPISODE 12 — The future of GlobaliD Groups - EPISODE 11 — The XRP Card and the future of communities - EPISODE 10 — How to decentralize identity and empower individuals - EPISODE 09 — Understanding GlobaliD’s identity platform - EPISODE 08 — Owning your identity and data with VP of Design Antoine Bonnin - EPISODE 07 — Understanding the future of fintech with Ayo Omojola - EPISODE 06 — Establishing trust and safety in tomorrow’s networks - EPISODE 05 — How ZELF combines the power of payments and messaging - EPISODE 04 — The future of blockchain with the creator of Solana - EPISODE 03 — Should we trust Facebook? - EPISODE 02 — JP Thieriot on why Uphold isn’t delisting XRP - EPISODE 01 — The SEC’s crypto turf war and why XRP isn’t a security Have a question for us? A topic you’d like covered? A guest you’d like to see? Let us know!",https://medium.com/global-id/episode-18-everyone-will-have-an-id-wallet-da5ac358ad60,,Episode,,Explainer,,,,,,,,2022-09-14,,,,,,,,,,,,,
|
||
GlobalID,GlobalID,,Medium,,Future Proof,,,,,How to decentralize identity and empower individuals,"If the internet decentralized information and crypto decentralized money and payments, then verifiable credentials will decentralize identity. In this episode, we chat with Dev Bharel, the software architect leading the charge around verifiable credentials at GlobaliD.","FUTURE PROOF EP 10 — How to decentralize identity and empower individuals If the internet decentralized information and crypto decentralized money and payments, then verifiable credentials will decentralize identity. In this episode, we chat with Dev Bharel, the software architect leading the charge around verifiable credentials at GlobaliD. Past episodes: - EPISODE 09 — Understanding GlobaliD’s identity platform - EPISODE 08 — Owning your identity and data with VP of Design Antoine Bonnin - EPISODE 07 — Understanding the future of fintech with Ayo Omojola - EPISODE 06 — Establishing trust and safety in tomorrow’s networks - EPISODE 05 — How ZELF combines the power of payments and messaging - EPISODE 04 — The future of blockchain with the creator of Solana - EPISODE 03 — Should we trust Facebook? - EPISODE 02 — JP Thieriot on why Uphold isn’t delisting XRP - EPISODE 01 — The SEC’s crypto turf war and why XRP isn’t a security Have a question for us? A topic you’d like covered? A guest you’d like to see? Let us know!",https://medium.com/global-id/episode-10-how-to-decentralize-identity-and-empower-individuals-3e154612a85,,Episode,,Explainer,,,,,,,Verifiable Credentials,2022-09-16,,,,,,,,,,,,,
|
||
GlobalID,GlobalID,,Medium,,Future Proof,,,,,Understanding GlobaliD’s identity platform,"within the context of a self-sovereign identity, that means that I, as the holder of that credential, I'm the only one that gets to decide who gets to see it, which is a pretty wild concept","FUTURE PROOF EP 09—Understanding GlobaliD’s identity platform In this episode, we chat with Vadim Slavin, Director of GlobaliD’s Credentials Platform, who provides an insightful overview of how GlobaliD’s identity platform works and what makes it unique. Past episodes: - EPISODE 08 — Owning your identity and data with VP of Design Antoine Bonnin - EPISODE 07 — Understanding the future of fintech with Ayo Omojola - EPISODE 06 — Establishing trust and safety in tomorrow’s networks - EPISODE 05 — How ZELF combines the power of payments and messaging - EPISODE 04 — The future of blockchain with the creator of Solana - EPISODE 03 — Should we trust Facebook? - EPISODE 02 — JP Thieriot on why Uphold isn’t delisting XRP - EPISODE 01 — The SEC’s crypto turf war and why XRP isn’t a security Have a question for us? A topic you’d like covered? A guest you’d like to see? Let us know!",https://medium.com/global-id/episode-09-understanding-globalids-identity-platform-b241a63ff5e0,,Episode,,Explainer,,,,,,,,2022-09-16,,,,,,,,,,,,,
|
||
GlobalID,GlobalID,,Medium,,GlobaliD 101,,,,,Bring Your Own Identity,"At first, accessing all of your accounts on the internet meant you had to create a username and password for each company or service you were trying to interact with.<br><br>Now, you can access many websites by using your existing social media accounts from Facebook, Twitter or LinkedIn. You can even log in with your Google and Apple accounts as well.","GlobaliD 101: Bring Your Own Identity So far, in the GlobaliD 101 series we’ve explored: - Part 1: What a smart and humanistic approach to digital identity would look like - Part 2: The Trust Triangle — the system of issuers, holders, and verifiers that illustrates how identity works (and how it’s broken, today) - Part 3: Why the ID Wallet is the first step toward achieving a new vision for digital identity - Part 4: Why every company is an identity company At first, accessing all of your accounts on the internet meant you had to create a username and password for each company or service you were trying to interact with. Now, you can access many websites by using your existing social media accounts from Facebook, Twitter or LinkedIn. You can even log in with your Google and Apple accounts as well. This concept is called Bring Your Own Identity (BYOI or BYO Identity): - A form of digital authentication in which an end user’s username and password are managed by a third party. - The approach leverages Single Sign-On (SSO) technology to make authentication more simple and convenient for users. It’s also beneficial for companies since it allows visitors to quickly register using their existing credentials. An improved sign-on experience can result in as much as a 45% increase in customer registrations. The simplicity of BYO Identity means businesses convert more of their website visitors into customers instead of losing them when they’re asked to create a new account. 
But there are drawbacks as well: - Many users worry they’re trading convenience for privacy. Every time you log in to a third-party website using Facebook or Google, they gain access to your data and leverage it to sell ads. - It’s not the most secure authentication method. Anybody can create a social media account, but these companies don’t verify your attributes. So, using social identities to log in to third-parties means companies can’t be 100% certain about the identity of their customers. Even with these concerns, BYO Identity is a step in the right direction towards a future where interoperable and portable identities are commonplace. There is a real opportunity to combine technology that is being developed with the concept of BYO Identity that will create a new identity framework where you own and control your data. - By creating an account with an identity service provider who leverages verifiable credentials technology, you will confirm your attributes are real, and companies will rest easy knowing you’re identifying yourself truthfully. - Your identity will be decentralized, not federated. Identity service providers like GlobaliD don’t have access to your data, so they can’t store it. You’ll be able to browse, log in and transact without your data being leveraged by Big Tech. - This makes the process of identity convenient, and portable. In the future, many identity companies will be built on this decentralized approach using verifiable credentials. Rather than being stuck with a certain provider or platform, you’ll be able to easily transfer your identity from one service provider to another. If you’d like to learn more about our current BYO identity tools for individuals and businesses, like our Digital ID Wallet or Global Onboarding, visit our website or contact our sales team. Follow us on Twitter, LinkedIn and YouTube to learn more about GlobaliD. 
You can also subscribe to the GlobaliD Insider to stay up-to-date regarding the latest company developments and more in the world of self-sovereign identity.",https://medium.com/global-id/globalid-101-bring-your-own-identity-5b9927008190,,Post,,Explainer,,,,,,,,2022-08-30,,,,,,,,,,,,,
GlobalID,GlobalID,,Medium,,GlobaliD 101,,,,,Device-based identity,"That way, your sensitive Personal data is stored locally on your own device rather than hosted on some company’s server that becomes a target for hackers.","GlobaliD 101: Device-based identity - Part 1: What a smart and humanistic approach to digital identity would look like - Part 2: The Trust Triangle — the system of issuers, holders, and verifiers that illustrates how identity works (and how it’s broken, today) - Part 3: Why the ID Wallet is the first step toward achieving a new vision for digital identity - Part 4: Why every company is an identity company - Part 5: What is Bring Your Own Identity? - Part 6: Reusable Identity Historically, our digital identities have been based on what we know. Do you know your username, email, and password? Then you can log in. Do you know your social security number, home address, and mother’s maiden name? Then you can sign up for the service. You can probably see why this is a problematic way to deal with something as important as your identity. If someone nefarious finds out what you know, not only could they access your existing accounts, they could open up new ones in your name. With data breaches on the rise at a mammoth scale, that proposition becomes likelier by the day. Anyone who has had their identity stolen knows just how painful the process of getting things back in order can be. For some, it unfortunately becomes a lifelong pursuit. A much more secure way of managing our digital identities is to base them on what we have. For instance, do you have access to your smartphone? It’s immediately clear why such a framework is far more secure than the former. Would-be criminals would need to physically have your phone in their possession. They’d also need a way to get in — a tough ask if you have a PIN or fingerprint security set up. That’s something that might be possible for the FBI but likely outside the wheelhouse of most fraudsters. 
Traditional digital identities still based on what we know have gotten the memo. It’s why two-factor authentication is highly recommended if you want to keep your accounts secure. Now, it’s also about what you have. But that transition is still only a half measure. Eventually, it makes sense to shift toward a completely device-based identity. That way, your sensitive Personal data is stored locally on your own device rather than hosted on some company’s server that becomes a target for hackers. With device-based identity, you’re in complete control of your data as well as your private keys. This also opens the door for additional features that should become commonplace in the future such as identity portability and reusability. We’ll be able to bring our identities with us from one service to the next. The reputation and social connections you create on one platform will be easily transferable if you ever decide to venture elsewhere. We’ll also be able to verify our identities once and never again, re-using that verified identity for any additional services we sign up for. That minimizes the amount of data we share with other parties while still allowing businesses to trust that their customers are who they say they are. Best of all, it reduces friction for everyone. No more selfies. No more taking photos of your passport. Set up your identity once, and you’re good to go. If you’d like to learn more about GlobaliD, visit our website, contact our sales team or follow us on Twitter, LinkedIn and YouTube.",https://medium.com/global-id/globalid-101-device-based-identity-c6096a5b0890,,Post,,Explainer,,,,,,,,2022-10-05,,,,,,,,,,,,,
GlobalID,GlobalID,,Medium,,GlobaliD 101,,,,,Every company is an identity company,"At first, every company was a tech company. Every business needed a digital strategy. Back in 2017, Goldman Sachs’ CEO famously stated that the storied investment bank was actually a tech company: “We are a technology firm. We are a platform.”<br><br>Not long after, every company was a fintech company. Every business needed a way to manage money and payments. As Andreessen Horowitz’s Angela Strange wrote in 2019, “I believe the next era of financial services will come from seemingly unexpected places… Fintech is eating the world.”<br><br>Fast forward to today, and every company is an identity company. Every business needs to connect with customers and users as well as manage their data, privacy, and trust in a compliant way.<br>","GlobaliD 101: Every company is an identity company In the first three parts of the GlobaliD 101 series, we’ve explored: - Part 1: What a smart and humanistic approach to digital identity would look like - Part 2: The Trust Triangle — the system of issuers, holders, and verifiers that illustrates how identity works (and how it’s broken, today) - Part 3: Why the ID Wallet is the first step toward achieving a new vision for digital identity At first, every company was a tech company. Every business needed a digital strategy. Back in 2017, Goldman Sachs’ CEO famously stated that the storied investment bank was actually a tech company: “We are a technology firm. We are a platform.” Not long after, every company was a fintech company. Every business needed a way to manage money and payments. As Andreessen Horowitz’s Angela Strange wrote in 2019, “I believe the next era of financial services will come from seemingly unexpected places… Fintech is eating the world.” Fast forward to today, and every company is an identity company. Every business needs to connect with customers and users as well as manage their data, privacy, and trust in a compliant way. 
In other words, every company is a verifier as part of the Trust Triangle. The problem is that most companies are focused on their core business — they’re not experts in the domain of identity, security, and data management. With the way digital identity works today, this becomes an incredibly expensive exercise, not only for businesses and institutions but also for society at large. Home Depot is in the business of selling home improvement supplies both in retail stores and online, but since 2014, the company has spent nearly $200 million in relation to a data breach that impacted over 52 million of its customers. We’ve all seen the headlines — Home Depot is hardly alone in this. Businesses and institutions have taken on much of the cost of an archaic approach to digital identity: - $1 million per year spent on password support costs alone - $6 million average loss from credential stuffing - $7-$30 million spent on data protection compliance - 6%-9% of bank revenue spent on compliance - $60 million per year spent by financial institutions on KYC (Know Your Customer) - $163 billion in U.S. unemployment fraud in 2021 That’s just the tip of the iceberg. Identity needs don’t just increase expenditure and bloat for established firms, they also serve as barriers to entry for new upstarts, serving as a bottleneck for innovation. There’s also the flipside to this, where companies and institutions that want to maintain a more frictionless experience for users must face the reality of diminished trust on their platforms — from bots to fake news, contributing to society’s growing trust deficit. What if there was another way? What if companies didn’t have to choose between focusing on their core business and becoming an identity company? What if companies could trust their customers and users without having to collect and store sensitive Personal information? What if platforms didn’t have to pick between real users and a more frictionless experience? 
What if companies could empower their users along the way? That’s the GlobaliD vision. You control your data and bring your identity with you. Businesses leverage machine-readable verifiable credentials to be sure who their customers are. Developers offload onboarding and unlock multi-factor, passwordless authentication instantly. Everybody wins. If you’d like to learn more about our enterprise solutions for verifiers, visit our website or contact our sales team. Follow us on Twitter, LinkedIn and YouTube to learn more about GlobaliD. You can also subscribe to the GlobaliD Insider to stay up-to-date regarding the latest company developments and more in the world of self-sovereign identity.",https://medium.com/global-id/globalid-101-every-company-is-an-identity-company-a851beed999d,,Post,,Explainer,,,,,,,,2022-08-24,,,,,,,,,,,,,
GlobalID,GlobalID,,Medium,Heather Dahl; Ken Ebert; Indicio,GlobaliD 101,,,,,How digital identity should work. Part 1:,"In this episode, we’re joined by CEO Heather Dahl and CTO Ken Ebert from Indicio, the market leader in developing trusted digital ecosystems. Heather and Ken discuss how new identity-based technology can help people, governments and companies develop greater digital trust in a modern society.","GlobaliD 101: How digital identity should work What is your identity? It’s the ability to represent to the world who you are. That can cover everything from what you wear to who you associate with to what country you are from. Your identity is a collection of attributes that describe you. In practice, proving your identity is also the key to unlocking your social and economic potential — physically and digitally. Society has always been built on trust, and sometimes, we need to know who we’re dealing with. As such, your identity is core to who you are and what you’re able to do, whether that’s buying something online, opening a bank account, or starting a business. The problem is that the way we deal with identity hasn’t caught up to the modern world. Part of the reason is that our most credible forms of identifying documents like driver’s licenses and passports still live in the analog world. The pandemic further shone a light on those limitations with places like the U.S. still reliant on paper vaccination cards, which are inefficient, difficult to verify, and easy to counterfeit. One of the issues with analog identifying documents is that not everyone has them. The reality is that our current system excludes 1.7 billion people from basic financial services, many of whom lack traditional forms of identity. For instance, migrant workers may not even have a home address. Things aren’t much better in today’s digital world, where an abundance of online accounts means that our identity and Personal data are scattered across servers vulnerable to attack. 
Outside of just giving away your email and phone number or accepting tracking cookies on your browser, some services collect more official forms of identity. Have you ever had to send a picture of your driver’s license or insert your passport number when buying something online? The result? In just the first half of 2019, an astounding 4.1 billion records were compromised. Meanwhile, we don’t own the digital identities we create. The Facebooks and Googles of the world do and profit mightily from our data. And because they own our data on their proprietary platforms, we can’t easily bring our identity and data with us if we decide to go somewhere else. The reputation you created on Facebook Marketplace as a long-time seller is stuck on Facebook. If you ever decide to sell on eBay, you’re starting from zero. The fragmentation of your digital identity extends well beyond popular websites. A pillar of the United States’ traditional financial system is the credit score — a system entirely predicated on centralized digital identity that you have no control over. Anyone who’s moved to the U.S. from abroad understands the challenge of trying to get a mortgage or even open a bank account — even if you had great credit in your home country. Do you want to know the worst part? The digital identities described above aren’t even that credible in the first place. Most social media platforms are more concerned with expanding their user base than verifying accounts are owned by real people, contributing to society’s growing trust deficit. What we need is a human-centric approach to digital identity, one that is easier, safer, and cheaper than the one we have today. We need a digital identity that works for people and organizations alike. Your digital identity should be: - Self-sovereign. We should own and control our identity and data. Further, we should be able to decide who we share our data with. - Private, secure and encrypted. Our data should be private and safe, always. 
You should be confident that only you have access to the information you create, save, and share. Third-party entities and bad actors should never have the opportunity to see your information in the first place. - Interoperable and portable. Our identities should be premised on globally accepted standards just like the internet is built on interoperable protocols that power the web and email. They shouldn’t be locked into proprietary, closed ecosystems dictated by corporations or governments. Moreover, we should be able to bring our identity with us to whatever platform we choose. Remember when your cell phone number was locked into your mobile service provider? Today, our phone numbers are portable. You can bring your phone number with you no matter what provider you choose. The same will be the case for our digital identities. - Built on verifiable credentials. You should be able to verify your identity once, receive a machine-verifiable credential, and reuse that credential many times over. This means you won’t have to redundantly verify your identity and re-share your data each time you interact with a new business or service. The best part is that those services never need to see your Personal information to know it is true. That way, businesses can trust that you are who you say you are, and don’t need to store and manage your Personal data on their servers. Fewer servers holding your data means a more secure identity. - Usable. What good is a fancy digital identity if it is impossible to use in your daily life? Digital identity and the associated credentials are going to take years to be adopted by 100% of establishments. That’s why it is crucial to make safer digital identity useful in the contexts we are living in today. That might mean making it easier to store and share a picture of your ID card. Tomorrow it could mean applying for a bank account. Next year, it might mean doing your taxes. 
Human-centric digital identity must meet the moment, wherever it may be. - Inclusive. Identity is a human right. Anyone, anywhere should be able to create one. Notably, your identity should grant you access to basic services such as banking and payments. It’s clear that the way we handle our identities today is broken. What’s incredibly exciting is that a convergence of developments across fintech, regtech, and Web3 now enables a smarter, better, and more inclusive framework. Human-centric digital identity is the key to a future that works for us, allowing us to set new standards for how we deal with issues like financial inclusion, communication and censorship, and even the integrity of our democratic elections. Our identities are the building blocks for a modern society and economy. We owe it to ourselves and each other to get this right. If you’d like to learn more about how digital identity can work for you and stay up to date with the latest updates, sign up for the monthly GlobaliD Insider newsletter.",https://medium.com/global-id/globalid-101-how-digital-identity-should-work-fc53ede7b86f,,Episode,,Explainer,,,,,,,,2022-07-05,,,,,,,,,,,,,
GlobalID,GlobalID,,Medium,,GlobaliD 101,,,,,ID wallets,Why the ID Wallet is the first step toward achieving a new vision for digital identity,"GlobaliD 101: ID wallets So far in the GlobaliD 101 series, we’ve explored: - Part 1: What a smart and humanistic approach to digital identity would look like - Part 2: The Trust Triangle — the system of issuers, holders, and verifiers that illustrates how identity works (and how it’s broken, today) Now, we’re getting to the exciting part. We know identity is broken. We also know what great digital identity that works for you, the individual, looks like. But how do we get there? The first step is to make digital identity incredibly convenient, fun, and useful for end users. These are the people who will go out and interact with their communities, purchase goods and services, and build businesses. In the physical world, we rely on identifying documentation. That might be your driver’s license, your insurance card, or your passport. In our daily lives, we’ll keep the most commonly used forms of ID in our wallet. In the digital future, things won’t look all that different. You’ll keep digital forms of relevant documents in your digital ID wallet. It’s easy to understand the benefits of digitization: - It’s convenient. All your relevant documents are in one accessible place. The only thing you’ll need to keep in your actual wallet is cash. As the digital identity ecosystem develops, your interactions and transactions will become seamless. - It’s yours. This is your ID wallet. You own and control your identity and your data. - It’s secure. If you lose your wallet, it’s gone, and your information is out there. That’s not the case with a locked smartphone. Plus, you’ll have a backup. Another aspect is the ability to selectively share only relevant data. If you’re buying a bottle of wine, you don’t need to share your entire driver’s license. You don’t even need to share your actual date of birth. 
You only need to share the fact that you’re over 21 (for those in the U.S.). Big Tech companies like Apple are already making progress on this front, piloting a program that allows users to keep their driver’s license in their Apple Wallet. The difference with Apple’s approach, of course, is that their solution isn’t portable or interoperable. In regular person terms, it just means that you’re stuck on Apple’s closed ecosystem — no surprise. While progress is being made, these are still early days. Apple’s pilot, for instance, will be rolled out in a limited number of states. All of which means that people are still sending photos of their driver’s license to their Airbnb host for identity verification. That’s crazy insecure! What if instead, you could securely and selectively share your ID with an expiring link? Your name and photo will be visible but other private details will be blurred out. Your Airbnb host now believes you are who you say you are because they also trust the issuer of your digital credential. Likewise, you don’t have to worry that a copy of your full license is on a stranger’s phone forever. Everyone rests easy, and you enjoy the heck out of your vacation. The same applies across all your online interactions and transactions. No more taking photos of your ID for each new platform or service. No more digging through the safe for your passport. No more calling customer support because you lost your insurance card. Life is just easier with a digital ID wallet. Venture into the digital future confidently What’s even more exciting is where we go from here. 
With people using a trusted ID wallet, businesses (verifiers) can rethink how they manage identity and trust in the context of their customers, an effort that costs businesses $592 trillion per day. The World Wide Web Consortium (W3C) also reached a major milestone this month around decentralized identifier (DID) standards, which will become a formal recommendation despite pushback from Big Tech. Just like the internet, open and interoperable protocols will allow any company or project to easily integrate these digital identity standards, paving the way toward mainstream adoption and far-reaching accessibility. Since your digital identity is built on open standards, your identity will also be portable like your mobile phone number is today. Rather than being stuck with a certain provider or platform, you’ll be able to easily transfer your identity from one ID wallet to another. Software developers will compete for your patronage by developing the best possible products. Imagine a world where you own your identity and your data rather than corporations or the government. You finally hold the keys to your own destiny. You share only what you need to in a secure and private fashion. Rather than logging onto your Facebook account, you log on as you. And unlike your Facebook account, which isn’t all that credible, you’ll be able to do important things with your digital identity, such as remotely opening up a bank account. Since digital formats are far more flexible in the context of expressing trust and reputation, this will have a profound impact on financial inclusion for the billions of people who lack traditional identifying documentation. In the future, we’ll see a convergence between our physical and digital identities with everything easily managed in one place. With one identity, you’ll be able to purchase celebratory drinks with your friends or hang out in the Metaverse. It all starts with your ID wallet. 
Follow us on Twitter, LinkedIn and YouTube to learn more about GlobaliD. You can also subscribe to the GlobaliD Insider to stay up-to-date regarding the latest company developments and more in the world of self-sovereign identity.",https://medium.com/global-id/globalid-101-id-wallets-68fa77e6d0d7,,Post,,Explainer,,,,,,,,2022-08-02,,,,,,,,,,,,,
GlobalID,GlobalID,,Medium,,,,,,,What is the trust triangle?,"The Trust Triangle — the system of issuers, holders, and verifiers that illustrates how identity works (and how it’s broken, today)","In the first part of the GlobaliD 101 series, we gave an overview of how our identity systems haven’t caught up to the modern world. We also provided suggestions for how digital identity should work.<br>The thing is, identity isn’t something your average person thinks about much. Typically, it’s a means to an end. If you want to take a trip to Europe, you need to bring your passport. Want to celebrate with a bottle of wine? You need to show your driver’s license.<br>In part 2 of the GlobaliD 101 series, we’re going to explain how the process of identity actually works, and we’re going to do it with a really nifty concept we call the Trust Triangle.<br>Here’s how the Trust Triangle works: Any identity framework is built on three pillars — issuers, holders, and verifiers. Together, these three pillars form the Trust Triangle.",https://medium.com/global-id/globalid-101-what-is-the-trust-triangle-260e85e1c640,,Post,,Explainer,,,,,,,,2022-06-21,,,,,,,,,,,,,
GlobalID,GlobalID,,Medium,,,,,,,Why self-sovereign identity matters,"your digital identity represents you as a unique real-life person in a secure digital format. In fact, we likely have many different virtual identities across a spectrum of platforms and services.","Why self-sovereign identity matters It’s one thing to identify yourself in person with an ID. Identifying yourself online? That’s a whole different story. Here’s what you need to know about digital identity, today — as well as why you should be super excited about the momentum surrounding the self-sovereign identity (SSI) movement. Table of contents: In most cases in the real world, you can easily and reliably prove your identity by presenting your driver’s license, passport, or credit card. A police officer, bank clerk, or liquor vendor can clearly confirm your details by glancing over the document — or in higher-security instances, could further scan the document to guarantee authenticity. In all, the process only takes a moment and by the end of it, your ID is returned to you and back in your wallet. It’s not an easy protocol to replicate online — when we’re only present as a digital entity. On the internet, your digital identity represents you as a unique real-life person in a secure digital format. In fact, we likely have many different virtual identities across a spectrum of platforms and services. The hope is that all these disparate identities link back to the person they’re supposed to represent — in this case, the real you. As more and more of our social interactions and economic transactions migrate to the digital realm, so too have the stakes increased — and with that, comes more sophisticated criminals and scammers. As the threat of cyber attacks and data breaches continues to balloon, it’s never been more critical to have a holistic comprehension of what exactly your digital identity is and the role it plays. 
NOTE: While a digital identity can also represent entities like institutions or applications, for the purposes of this piece, we will only refer to digital identity in the context of Personal identities. 1. Digital identity 101 One way to think about a digital identity is that it’s a digital representation of a real-life person’s set of identifying attributes. That could mean Personal information such as your date of birth, your home address, or your mother’s maiden name. Or it could mean a secret passcode, a photograph, or even facial biometric data. Those identifying attributes are then organized in a way such that a software application is able to recognize and authenticate that you are the person you are claiming to be. In other words, a digital identity allows us to build trust online such that we can interact and transact much like we do in the real world. I) A deeper dive on identifiable attributes As our parents may have told us when we were young, we’re all special and unique individuals. Every human being has a specific name, birth date, demographic, and biometric profile. We also have various documentation linked to us — such as a social security number, government-issued ID, or passport, but also things like your insurance policies, medical records, or utility bills. And then online, we have email addresses and social media accounts. All of these are identifiers that can be linked to you as a person. But there is another dimension of digital identity. While navigating the internet, our activity — with or without our knowledge — is often tracked. (You’ve probably heard of or are familiar with the term “cookie.”) And because that activity can be linked to the identifiers mentioned above (email, social accounts, etc.), our online behavior (or shadow data) can be traced back to us. That could mean the websites you browse, your search history, things you’ve downloaded, or even items purchased. 
This kind of tracking allows platforms and services to deliver algorithmic content feeds, targeted ads, or in general, simply a more bespoke user experience. The downside, of course, includes not only malicious actors who might abuse that power but also the mere fact that we’re unwittingly contributing to an ever-growing database of Personal behavioral records that we generally have little agency over. II) On the internet, no one’s truly anonymous Given all that sensitive and valuable Personal data, companies wield immense responsibility when it comes to protecting their customers. In order to protect user privacy, firms will employ certain technological mechanisms and processes like tokenization and cryptography in order to “anonymize” their data sets. That way, PII is essentially scrubbed of any data that could link that info to the person at hand. That’s a good start, but it’s not a foolproof solution. Would-be fraudsters could, with access to enough data, still connect the dots even if the data’s been scrubbed — allowing them to map out a relatively detailed profile of your digital activity and online history. 2. Understanding self-sovereign identity (SSI) Self-sovereign identity (SSI), as a concept, dates back many years but as a movement has only started to build momentum more recently. As its name suggests, a self-sovereign identity puts the users at the center and grants them sole ownership and exclusive administrative rights of their identities. In other words, rather than having companies, governments, or online platforms manage and leverage (and monetize) the Personal information linked to your identity, you control how your data serves you. (We’ll be providing a more in-depth breakdown of SSI, exactly how it works, and the technical specifications behind it in a future piece.) NOTE: Self-sovereign identities can also represent other entities, but for the purpose of this article, we will only refer to SSI in the context of Personal identities. 
I) Multiple channels, one autonomy Whenever we log on, we engage with a countless variety of websites, platforms, and services. At each juncture, we rely on third-party points of entry or authorization to allow us to proceed, interact, or transact. Along the way, these intermediaries also gain access and insight into our Personal data and behavior and in many cases, become privy to information irrelevant for the necessary authorizations. That’s the way the world works, today. SSI, on the other hand, shifts that power back to the users, allowing us to authenticate and selectively attest only the required piece of identity information. In fact, it eliminates the need to share and expose Personal data altogether for most situations — while providing users with the same access and the businesses the same level of trust (or greater) as traditional protocols. With SSI, users are presented with a greater level of control and ownership and enhanced flexibility. And that’s just the beginning. A world built around SSI means that every individual will be empowered with their own globally accessible digital identity, providing everyone access to the modern economy — compared to the billions today who lack access even to a basic bank account. 3. The million-dollar question Here’s the thing. Addressing the problem of digital identities is one of the looming challenges of our times. Your identity is your key to the modern world, allowing you to actively participate and engage with society and the global economy. And fixing identity will set the tone for this next chapter that we’re entering into when it comes to the convergence of our digital and physical realities. It affects how we interact with our family, friends, and communities; how we receive and distribute information; and how we buy and sell goods and services. 
It also sets new precedents and norms for how we move forward collectively — in how we deal with issues like financial inclusion, fake news and botnets, and even the outcome of our democratic elections. Here’s what we should be thinking about: - Rights: As users of online services and platforms, we have the right to the privacy and protection of our data, particularly Personally identifiable information (PII). Preventive measures can go a long way and autonomy over our data could mitigate the cases of identity abuse. - Responsibility: As members of societies, both offline and online, we share a responsibility to cultivate a safe environment where fraudsters cannot hide in anonymity and where new contacts or transactions can be attested to ensure trust. - Value: With a secure, reliable, and interoperable digital or self-sovereign identity, users can safely enter into interactions with a variety of businesses, organizations and other users. From service providers, organizations, banks, healthcare services, and education institutions to online platforms, news publishers, and social media, smarter identity frameworks will enhance their ability to deliver better and more innovative products and services. Because that’s the thing. Our digital identity is more than just a set of identifiers in digital format. It’s the building block for a modern society and economy. We owe it to ourselves and each other to get this right. Join a growing trusted community and experience how trusted identity works for you.",https://medium.com/global-id/why-self-sovereign-identity-matters-8fd2c982ca2e,,Post,,Explainer,,,,,,,,2020-04-29,,,,,,,,,,,,,
|
||
GlobalID,GlobalID,,Medium,,Future Proof,,,,,Telling our story with the new GlobaliD website,about the brand new GlobaliD website and how it contextualizes our role in the world of digital identity while allowing us to tell our story more effectively.,"FUTURE PROOF EP 20 — Telling our story with the new GlobaliD website Trey Steinhoff, Product Marketing Director at GlobaliD, joins us to talk about the brand new GlobaliD website and how it contextualizes our role in the world of digital identity while allowing us to tell our story more effectively. Visit https://www.global.id to explore the new site. Past episodes: - EPISODE 19 — Making decentralized identity mainstream - EPISODE 18 — Everyone will have an ID wallet - EPISODE 17 — Digital wallets of tomorrow will be PRIVATE - EPISODE 16 — How XUMM Wallet is changing the game - EPISODE 15 — Olympic hopeful Lila Lapanja is a GlobaliD ambassador - EPISODE 14 — What we learned at Solana Breakpoint - EPISODE 13 — DeFi and Identity: Compliance in a decentralized world - EPISODE 12 — The future of GlobaliD Groups - EPISODE 11 — The XRP Card and the future of communities - EPISODE 10 — How to decentralize identity and empower individuals - EPISODE 09 — Understanding GlobaliD’s identity platform - EPISODE 08 — Owning your identity and data with VP of Design Antoine Bonnin - EPISODE 07 — Understanding the future of fintech with Ayo Omojola - EPISODE 06 — Establishing trust and safety in tomorrow’s networks - EPISODE 05 — How ZELF combines the power of payments and messaging - EPISODE 04 — The future of blockchain with the creator of Solana - EPISODE 03 — Should we trust Facebook? - EPISODE 02 — JP Thieriot on why Uphold isn’t delisting XRP - EPISODE 01 — The SEC’s crypto turf war and why XRP isn’t a security Have a question for us? A topic you’d like covered? A guest you’d like to see? 
Let us know!",https://medium.com/global-id/episode-20-telling-our-story-with-the-new-globalid-website-c38278b3e14c,,Episode,,Meta,,,,,,,,2022-09-14,,,,,,,,,,,,,
|
||
GlobalID,GlobalID,,Medium,,Indicio,,,,,GlobaliD connects to the Indicio Network,"The Indicio Network will enable the issuance and verification of credentials on the GlobaliD platform and in the app, allowing individuals to port their credentials for authentication and authorization into any participating use case scenario — including vaccine passports.","GlobaliD connects to the Indicio Network GlobaliD, the trust platform that allows anyone to verify identities, create and join groups, communicate, and make payments, today announced that it will be using the Indicio Network as part of their mission to give users full control and ownership of their portable identity and data. The Indicio Network will enable the issuance and verification of credentials on the GlobaliD platform and in the app, allowing individuals to port their credentials for authentication and authorization into any participating use case scenario — including vaccine passports. In addition to developers building capabilities for signup, verification, messaging, wallet, and cards, the GlobaliD consumer app and web experience allow anyone to create and manage groups that support these same functions natively without the need to code a third-party offering. These verifiable credentials are core to GlobaliD’s Trust Platform — in contrast to less trusted social media, messaging, conferencing, and other legacy apps and offerings in the marketplace. “To address the widespread trust deficit in our society, we need private, secure, and transparent forms of identity in a portable and persistent manner,” says Greg Kidd, co-founder and CEO of GlobaliD. 
“GlobaliD is the portable and preferred solution for situations where trust is encouraged or required.” Addressing the world’s trust deficit — one identity at a time Decentralized identity, sometimes referred to as ‘self-sovereign identity’ (SSI), is an alternative to the current centralized and federated systems that collect and control user identity information. Thought of as a way to return the internet to its open roots, this democratizing framework puts individuals back in control of their digital lives, allowing them to manage their own identity and data without reliance on a third party. This peer-to-peer interaction is not only safe and secure, avoiding the creation of honeypots of large amounts of data collected by multiple entities, it’s also the most privacy-preserving approach to online interactions and compliant with global data regulations. “Unlike other proprietary solutions that claim privacy and security within a single siloed use case, GlobaliD’s portability framework, powered by Indicio, ensures that trusted credentials are both re-usable and user controlled,” says Kidd. “With GlobaliD, individuals no longer need to rely on corporations or governments for garnering levels of trust needed to act in everyday life situations.” Trust as a service Indicio’s mission is to not only create and maintain the Indicio Network for companies like GlobaliD, but also to provide the essential professional services for enterprises around the world to build decentralized identity solutions. “GlobaliD’s app, platform, and SDKs are a fast-track to a more secure digital world because they make verifiable credentials simple and easy to use across a range of vital services,” says Heather Dahl, CEO of Indicio. 
“The real upshot is that people can protect their privacy and share their information at a distance — two things that are increasingly important to the efficiency of the global economy in the grip of a worldwide pandemic.” Learn more about the Indicio Network and Indicio.tech’s range of services for global enterprises to build decentralized identity solutions at Indicio.tech. Go to Global.iD to claim your GlobaliD, get verified, create and join groups, communicate, pay and get paid online. Indicio.tech is a professional services firm specializing in decentralized identity architecture, engineering, and consultancy. Indicio provides expert guidance to a global community of clients on the use of verifiable credentials to build digital identity solutions. The decentralized networks and tools created by Indicio make verifiable credentials easy to adopt, simple to deploy, and reliable to use. As a Public Benefit Corporation, Indicio is committed to advancing decentralized identity as a public good that enables people to control their identities online and share their data by consent. Indicio believes in privacy and security by design, interoperability, and supports the open source goals of the decentralized identity community. GlobaliD is a trust platform that seamlessly integrates digital identity, communications, and payments — the core building blocks for the next chapter of the internet. Unlike existing offerings, GlobaliD’s open, portable, and interoperable solutions put individuals back in control of their digital lives rather than governments or corporations, while allowing developers and businesses to easily take part in building the future. GlobaliD has offices in the U.S. and Europe and its digital identity framework has been recognized by the World Economic Forum and the Brookings Institute.",https://medium.com/global-id/globalid-connects-to-the-indicio-network-2ad5688d72fd,,Post,,Meta,,,COVID,,,,,2021-01-21,,,,,,,,,,,,,
|
||
GlobalID,GlobalID,,Medium,,,,,,,The future of COVID credentials on GlobaliD,"With our new platform, we’ll also be releasing a brand new COVID credential. Users will be able to validate the authenticity of their digital vaccine record from around the world and store the proof of the validation as a credential in their GlobaliD app.","The future of COVID credentials on GlobaliD With many places such as restaurants, schools, and gyms now checking for COVID credentials in the U.S., we wanted to remind our users that GlobaliD currently has a quick and easy method for storing your vaccination record card with your digital identity. Simply add the COVID-19 Vaccination Record Card self-declaration (Identity tab → [+] → Personal) in your GlobaliD app. You will be asked to take photos of your vaccination record card. Only you have access to the stored imagery, which you can share as necessary to prove that you’ve gotten your vaccination. Of course, there are limits to this approach. The main issue is that there isn’t a convenient way to verify the authenticity of the vaccination card, a growing problem as some companies, schools, and the government have implemented vaccination mandates. This implementation was always going to be a stop-gap solution. And so today, we’re super excited to talk about the future of where the GlobaliD credentials platform is going. The new and improved GlobaliD credentials platform The existing GlobaliD verifications platform is undergoing a major upgrade — a new self-sovereign identity (SSI) framework, of which the first implementation has already been built. The primary benefit of this new system is interoperable, portable, verifiable credentials that groups can issue and that users can carry with them across platforms and borders. This new framework will not only be much more scalable but also industry-backed — thanks to the work of the Linux Foundation’s Cardea Project, which is working on global standardization around COVID credentials. 
(GlobaliD is a founding member and part of the steering committee.) In case you missed the announcement, this is what the Cardea Project is all about: Cardea is a complete ecosystem for the exchange of privacy-preserving digital credentials, open sourced as a project in Linux Foundation Public Health. Launched by Indicio.Tech, Cardea provides an easily verifiable, trustworthy, unalterable proof of health tests or vaccination that can be shared in a privacy-preserving way. Cardea easily integrates with existing health systems to ensure trusted data sources for credentials and uses decentralized identity technology to enable better control of data for individuals. With our new platform, we’ll also be releasing a brand new COVID credential. Users will be able to validate the authenticity of their digital vaccine record from around the world and store the proof of the validation as a credential in their GlobaliD app. The user would then be able to present this proof in the interoperable format within the Cardea ecosystem and beyond. However, GlobaliD will not stop there. Because GlobaliD also enables verification of government IDs, verification of ownership of the vaccine digital record will also be possible by comparing personal information from both documents in a privacy-preserving way. Indicio.Tech, the firm that launched Cardea (and also a GlobaliD partner), is already working on active pilot implementations in Aruba and Canada. An interoperability hackathon On September 9, GlobaliD will be participating in an interoperability hackathon — Cardea’s Interop-athon: Cardea, the COVID credential project hosted at Linux Foundation Public Health, is going to host an “Interop-athon” on September 9, 8:00 am to 12:00 pm Mountain Time. As Cardea is now being commercially deployed to share COVID-19 test results, vaccination, and trusted traveler credentials, it is important to facilitate and showcase the interoperability among these projects. 
To this end, Cardea will host a four-hour interoperability “hackathon style event”. The maintainers of Cardea will stand up a test environment including an Issuer, Mediator, Government, and Verifier Agents for participants to test against. Participants can also bring their own decentralized network! GlobaliD will be demoing an early implementation of our latest version of the COVID credential. If you’re interested in this space, be sure to join us:",https://medium.com/global-id/the-future-of-covid-credentials-on-globalid-7a19a882cf90,,Post,,Meta,,,COVID,,,,,2021-08-30,,,,,,,,,,,,,
|
||
GlobalID,GlobalID,,YouTube,Metaco Talks,,,,,,The Sovereignty Stack: Re-thinking Digital Identity for Web3.0 w/ Greg KIDD [METACO TALKS #23],"Greg is a serial entrepreneur who is probably best known for founding and taking public Dispatch Management Services Corp, the world’s largest on demand dispatch network for urgent deliveries. In a highly interesting career so far, Greg was also Chief Risk Officer at Ripple Labs and a senior analyst for the Board of Governors of the Federal Reserve in Washington. In his latest venture Global ID, Greg is acting on his long-held belief that people’s identity should be truly portable and owned by individuals themselves rather than corporations or governments.",,https://www.youtube.com/watch?v=cygggz2pt1i,,Video,,Meta,,,,,,,,2021-10-04,,,,,,,,,,,,,
|
||
GlobalID,GlobalID,,Medium,Calvin Burrows,anchain.ai,,,,,Introducing PRIVATE,PRIVATE is a new technical framework developed in collaboration with AnChain.AI that carves out a path toward regulatory compliance for non-custodial wallets while preserving user privacy. (PRIVATE stands for Privacy Preservation through Resolution of Identity via Verification and Attestation for Travel Rule CompliancE),"Introducing PRIVATE We’re thrilled to announce the publication of the PRIVATE white paper. PRIVATE is a new technical framework developed in collaboration with AnChain.AI that carves out a path toward regulatory compliance for non-custodial wallets while preserving user privacy. (PRIVATE stands for Privacy Preservation through Resolution of Identity via Verification and Attestation for Travel Rule CompliancE) Until now, efforts around regulatory compliance in the digital asset space have focused on centralized, custodial solutions. That includes Coinbase’s recently announced TRUST platform, a coalition that also includes industry heavyweights such as Fidelity, Circle, and Gemini. That’s a great start. The explosion of mainstream and institutional interest in digital assets, fueled in part by the rapid rise of DeFi and NFTs last year, has shone a bright spotlight on what is still a nascent space. But in order for these innovative new ecosystems to make a long-term, positive impact for end users, they’ll need to comply with existing and upcoming regulations. SEC Chief Gary Gensler argued last summer that developments in decentralized finance likely fall under the scope of his agency’s oversight. That begins with addressing regulations for custodial services such as exchanges, but it also requires answering the question of non-custodial wallets, which, today, serve as the primary portal into the Web3 universe. Why non-custodial wallets matter Non-custodial wallets rest at the heart of how blockchain technology works. 
They’re software applications — or in some cases, hardware devices — that allow you to directly interact with their corresponding blockchain. You can hold funds, receive tokens, or initiate transactions. In essence, it’s a digital wallet where you, the user, directly control your digital assets. It’s not so different from having cash in your wallet or assets stored away in a safe in your house. Contrast that with custodial wallets, which act more like your typical bank account. A service provider, such as a crypto exchange like Coinbase, manages your funds for you. Because these services mirror things we’re already used to, the path toward regulatory compliance is also more straightforward. A service provider that is already managing user funds has direct touchpoints for also managing a user’s identity and trust. The same cannot be said for non-custodial wallets, which represent a unique framework for how we understand the management of digital value and, as such, require a specialized approach to regulatory compliance while preserving the very tenets of privacy and self-sovereignty which DeFi services are based around. That requirement became all the more pressing last October, when the Financial Action Task Force (FATF) updated their recommendations, which encompassed everything from DeFi to stablecoins to wire transfers. Specifically, recommendation 179c updated their guidance on non-custodial wallets. The updated guidance requires that customer information be collected for virtual asset transfers in order to enforce anti-money laundering rules like sanction screening and transaction monitoring. That’s where PRIVATE comes in. Enter PRIVATE The PRIVATE framework’s secret sauce is decentralized identity. Perhaps unsurprisingly, decentralized identity and decentralized finance are a natural fit. The first benefit is privacy. Identity verification can be achieved without revealing a wallet’s true owner. 
As such, PRIVATE allows for regulatory compliance while fully preserving user privacy. The second benefit is control. Rather than relying on a corporation or government agency, users own their corresponding digital identity just as they own their non-custodial wallet. You might call it a non-custodial identity, preserving the decentralized spirit of non-custodial wallets and Web3 offerings in general. Moreover, PRIVATE is not designed to replace custodial solutions and corresponding compliance platforms such as TRUST. Instead, PRIVATE complements these existing approaches by allowing custodians to seamlessly and compliantly interact with the Web3 world. What’s next Here’s the reality. Regulations are coming. On March 31st, the European Union Parliament passed measures to end anonymous crypto transactions. The proposal requires crypto service providers to collect and share information about all parties involved during asset transfers. This is one of the first major attempts at establishing formal requirements for crypto exchanges, and it certainly will not be the last. As such, providing solutions that simultaneously satisfy DeFi stakeholders who prioritize decentralization, and regulatory entities who prioritize oversight is going to play a critical role in the development of Web3. GlobaliD and AnChain.AI have begun development of an end-to-end solution around the PRIVATE technical framework. This is no small task and we look forward to collaborating with anyone who shares our commitment toward preserving individual autonomy and privacy while abiding by the rules of the road. If you’re interested in contributing to the development of PRIVATE, please reach out: privateframework@global.id See also:",https://medium.com/global-id/introducing-private-65fce62c6a8e,,Framework,,Product,,Compliance,,,PRIVATE,,,2022-04-05,,,,,,,,,,,,,
|
||
GlobalID,GlobalID,,Medium,,,,,,,Building a more inclusive and equitable future,"When we first launched the GlobaliD Wallet powered by Uphold back in June of 2020, the goal was clear. Our vision has always been that anyone, anywhere should be able to create and own their self-sovereign digital identity. And along with that identity, they should get a wallet, providing them access to basic financial services.","Building a more inclusive and equitable future When we first launched the GlobaliD Wallet powered by Uphold back in June of 2020, the goal was clear. Our vision has always been that anyone, anywhere should be able to create and own their self-sovereign digital identity. And along with that identity, they should get a wallet, providing them access to basic financial services. These are the bare necessities of anyone trying to operate in the modern world, and the release of the original GlobaliD Wallet was a huge step toward achieving that vision. But there have also been limitations when it comes to wallet access, where users living in certain countries or states have not been able to participate in the full GlobaliD experience. This is no fault of Uphold, which continues to expand its services to more and more jurisdictions. Instead, it’s the nature of any custodial offering in a highly regulated space. What we’ve learned over the last two years is that in order to truly achieve our vision, we need to transition to a non-custodial solution for the GlobaliD Wallet. That way, anyone who creates an identity with GlobaliD can get a non-custodial GlobaliD Wallet. Beyond access, moving to a non-custodial solution gives us more control over our own destiny with exciting prospects for new products and features — including future debit card programs. It also means our users will be able to own and control their money along with their identity, furthering our mission toward building a more self-sovereign future. So what will these changes look like? 1. 
Effective July 15, 2022, you will need to access your wallet directly through Uphold rather than GlobaliD. Rest assured, the contents of your wallet won’t change. - On July 15, 2022, the GlobaliD Wallet powered by Uphold will be removed from the GlobaliD app. - GlobaliD Wallet users will still be able to access their wallet through Uphold’s website and app. - If you don’t have separate login credentials with Uphold, use the email address associated with your GlobaliD to log in to Uphold using the “forgot password” flow to establish a password there. 2. The new non-custodial GlobaliD Wallet will soft launch in the near future — stay tuned. - This is still a work in progress and will be a staggered release so stay tuned, but this will be a pivotal step towards achieving our vision of providing anyone, anywhere with an identity and a wallet. 3. Debit card programs tied to the custodial GlobaliD Wallet, including the GlobaliD and XRP Mastercard® Debit Card, will cease operation on July 15, 2022. - As both of these debit cards are tied to the custodial GlobaliD Wallet powered by Uphold, they will also cease operation on July 15, 2022. - We have plans for future card programs, but it’s too early to discuss those details at this time. - XRP Rewards will be ending along with the debit card program. Your XRP Rewards will continue to accumulate through July 15 and will be available in your Uphold account after the card program ends. 4. The XRP Army Group on GlobaliD will live on. - Despite the XRP Card program ending, we will continue to support the XRP Army Group on GlobaliD. - As a core group of early adopters on the platform that have made a huge contribution to GlobaliD’s ongoing success, we will continue to engage with the XRP Army Group. Group members will receive exclusive early access to new products and features such as the new non-custodial GlobaliD Wallet and potential future card and rewards programs. 
Making these changes wasn’t an easy decision, but we believe that it’s the right strategic move in order to build a best-in-class identity and wallet offering for a more inclusive and equitable future. In the coming weeks we will be announcing and releasing new tools, like our aforementioned non-custodial wallet as well as new privacy-preserving ways for people to use their identity. If you’d like to stay up to date on these and future card releases, subscribe to our GlobaliD Insider newsletter. If you have more questions about this change, your wallet, or your card, you can refer to the FAQ below. If you have additional questions, please reach out to our Customer Care team. For questions regarding your Uphold wallet, you can reach out to Uphold Support. Frequently Asked Questions Why is this happening? - Over the last two years we’ve learned that in order to truly achieve our vision of self-sovereign identity and inclusive finance, we need to transition to a non-custodial solution for the GlobaliD Wallet. The initial non-custodial solution will not support a card program at launch. What happens to my funds in my wallet? - Your funds are safe. You will be able to access them directly through the Uphold mobile or web application. You will not be able to access them via GlobaliD after July 15th. What happens to my rewards? - All of your paid-out rewards will be accessible via Uphold in the Portfolio section. Click the dropdown next to “XRP” and you will see “Reward — XRP.” What happens to the money that was funding my card? - All of your funds will still be available through Uphold in the Portfolio section. What do I do with my card? - The card will stop working on July 15th, 2022. After that point, you will no longer be able to use it to make payments or collect new rewards and are free to dispose of the card if you wish to do so. How do I access my Uphold account? - From the Uphold website, you can log in with the email address you registered with GlobaliD. 
If you don’t have a password, click on “Forgot password?” and proceed with Uphold’s password recovery flow to create a password. You will now be able to access your Uphold account with your email and password. How do I use my Uphold wallet? - Please visit the Uphold help center to learn more about your Uphold wallet and account. Do I need to do anything to disconnect the XRP card or close it? - No. After July 15th the card will automatically deactivate. Will GlobaliD support a card program in the future? - We are looking into different ways to support card programs on the new non-custodial GlobaliD Wallet, but there are currently no short-term plans to start supporting this feature. If you’d like to stay up to date on product updates and future card releases, subscribe to our GlobaliD Insider newsletter.",https://medium.com/global-id/building-a-more-inclusive-and-equitable-future-745f897a2c2b,,Post,,Product,,,,Wallets,,,,2022-06-29,,,,,,,,,,,,,
|
||
GlobalID,GlobalID,,Medium,,,,,,,GlobalID Introduces Trustees for Key Recovery,"Trustees can be friends or family members from your contact list. Once selected, each Trustee is granted a shard of your private key. Restoring your lost Identity requires approval from the majority of your Trustees.","Introducing GlobaliD Trustees — Account recovery without a private key No one likes losing their phone. Recovering access to your accounts and sensitive data can range from straightforward to downright difficult. In some cases, if you’ve lost your private key, you’re just out of luck. At GlobaliD, that’s not good enough — so we’re flipping the script. Rather than rely on “what you know,” it’s about “who you know.” With GlobaliD Trustees you can recover your Identity even if you don’t know your private key. Trustees can be friends or family members from your contact list. Once selected, each Trustee is granted a shard of your private key. Restoring your lost Identity requires approval from the majority of your Trustees. Here’s how it works: - Select 3 Trustees to act as custodians for your Identity - If you lose access to your Identity, simply initiate a restore request - Contact your Trustees to exchange recovery codes - Once you have majority approval, you’re all set! To see GlobaliD Trustees in action, a few of us internally whipped up a quick video to show you the flow: Special thanks to Chalen, a UX Designer at GlobaliD (and video wunderkind), for producing, shooting, and editing the video! Try GlobaliD Trustees for yourself — download GlobaliD on iOS or Android. If you have any questions, please contact us at: support@global.id Further reading:",https://medium.com/global-id/introducing-globalid-trustees-account-recovery-without-a-private-key-66142a21cba6,,Post,,Product,,,,,,,,2021-06-08,,,,,,,,,,,,,
|
||
Hyland,Thoma Bravo,Hyland,,Packy Hyland,,"USA, Ohio, Westlake",USA,,,Hyland,"Hyland is a privately held company and a leading content services provider. We enable thousands of organizations to focus on what they do best and deliver better experiences to the people they serve.<br><br>The power of connection drives Hyland. From connecting technology systems and data to connecting co-workers, teams and global communities, Hyland believes in transforming digital interactions into meaningful outcomes for customers, partners and our own employees.<br>","- Planning your 2023 digital transformation efforts? This 5-minute self-assessment provides unique, tailored recommendations on the technologies that can best transform your business processes. Start now! Hyland is a Customers' Choice Thanks to high ratings from our end-users, Hyland is recognized as a 2022 Gartner® Peer Insights™ Customers' Choice for Content Services Platforms - At Hyland, we believe technology should transform the way you work, so you can be more informed, empowered and connected through every interaction and in every relationship with everyone you serve. DISCOVER HYLAND Explore Hyland's expertise in your industry Digital transformation is more crucial now than ever. Here's what you'll need to modernize your processes: "There’s so much more to OnBase than making the documents electronic. The solution has provided so many opportunities for us, and the reporting mechanism has been fabulous." "OnBase and Guidewire provide the foundation for us to compile data-driven analysis and models which enables us to provide better care to the claimant, allowing them to recover and return to work faster." “We created efficiencies, increased accuracy and lowered costs through structured processing. This allows us to continue to move quickly to provide great service to our members.” I have 100% more confidence in the security of [HR] information. 
Hyland news - Hyland names Bob Dunn Vice President of Global Partner Programs - Hyland named one of top companies in Cleveland by LinkedIn - Hyland Healthcare announces support for AWS for health initiative - Hyland joins Gartner Peer Insights Customer First Program for content services platforms - Hyland named a Leader in Content Platforms Evaluation Popular blog posts - Unstructured data: A missing link to healthcare interoperability - 3 reasons OnBase and Episys are better together - 3 examples of how digitizing HR leads to organizational success - Make the jump to Hyperdrive by soaring through the cloud - 4 ways a content services platform improves HR compliance and security",https://www.hyland.com/,,Company,,Company,Enterprise,Credentials,,Content,Hyland Credentials,,,1991,,,,https://blog.hyland.com/,https://blog.hyland.com/feed/,,https://www.crunchbase.com/organization/hyland-software,https://www.linkedin.com/company/hyland-software/,,,,,
|
||
HylandCreds,Hyland,HylandCreds,,,LearningMachine,,USA,,,Hyland Credentials,"Your organization will find that an engagement with Hyland Credentials is a lot more than buying software; we address the needs of your whole organization. We look forward to working with you to make secure, digital credentialing an enduring part of your institution’s legacy.","Hyland Credentials. Get a complete system to issue digital credentials in a blockchain-secured format that is easily shareable and instantly verifiable anywhere in the world. Overview A new generation of digital credentials offers transformative convenience and security for all stakeholders through the use of open standards and blockchain-based verification. Blockchain Security Blockchains offer a new public infrastructure for verifying credentials in a manner far more durable, secure, and convenient than relying upon a single authority. Blockcerts Benefits The open standard for blockchain-based records ensures interoperability, recipient ownership, vendor independence, and choice of any blockchain. Industry Solutions Every sector issues credentials with specific needs and form factors. Hyland Credentials has unique solutions that enable your organization to develop branded templates, automate credential issuance, and learn from your credential data. We help organizations transform the way they issue credentials. MIT MIT offers digital diplomas to all graduating students, including undergraduate, graduate, and PhD-level programs. Malta Malta implemented a nationwide initiative for educational credentials to be offered as Blockcerts across their various education providers. 
FSMB Federation of State Medical Boards was the first professional medical organization to issue blockchain-based records.",https://www.hylandcredentials.com/,,Product,,Company,,ID,,,,Credentials,,2020-02-05,,https://twitter.com/HylandCredent,,https://www.hylandcredentials.com/blog/,,,https://www.crunchbase.com/organization/learning-machine,,,,,,Http://community.blockcerts.org
HylandCreds,HylandCreds,,,,,,,,,Badges and Blockcerts,"Education and training providers have long been wrestling with the legacy of the credit hour and how to adapt credentialing to a modern world that values skills more than time spent in the classroom. This is in part why the industry has seen an explosion of traditional and alternative providers that are experimenting with new credential formats appropriate for the information age. One of the questions we most frequently encounter at Learning Machine from these providers is: What are the differences between different credential formats? The implicit question behind that one is: When should I use different types of digital credentials, and why?","Badges and Blockcerts In education and workforce development, it’s important to understand the differences between digital credential formats and how to combine them for greatest impact. Education and training providers have long been wrestling with the legacy of the credit hour and how to adapt credentialing to a modern world that values skills more than time spent in the classroom. This is in part why the industry has seen an explosion of traditional and alternative providers that are experimenting with new credential formats appropriate for the information age. One of the questions we most frequently encounter at Learning Machine from these providers is: What are the differences between different credential formats? The implicit question behind that one is: When should I use different types of digital credentials, and why? To answer these questions, Learning Machine Research is currently preparing a “Digital Credentials Comparison Report” with the Federation of State Medical Boards which outlines the technical differences between credential formats and their pragmatic implications. Findings from this Report will be presented by the FSMB and Learning Machine at the IMS Global Learning Consortium quarterly summit on February 6, 2019. 
In the meantime, this blog post presents a quick summary of the differences between two of the most popular new digital credential formats: Open Badges and Blockcerts. This should help leaders at credentialing institutions make informed decisions about when and why to use each type of digital credential. Open Badges 2011 saw the birth of Open Badges, which digitally and visually convey the achievement of a specific skill. Similar to the Scouts movement, which uses a small fabric symbol to represent specific achievements, digital badges were designed to convey a singular achievement through a digital image and a hosted set of data. Initially spearheaded by the Mozilla Foundation, the Open Badges standard is now maintained by the IMS Global Learning Consortium, ensuring interoperability between platforms. The atomization of achievement enabled by digital badges is intended to open up new and novel pathways toward larger educational or professional goals. Carving up learning and achievement into “bite-size” elements facilitates the pursuit of education beyond traditional 2- and 4-year programs and toward a paradigm of lifelong learning from multiple education and training providers. In this way, badges are perfect for low-stakes credentials, or “micro-credentials.” While insufficient for situations which require high-stakes validation (such as, for example, verifying a passport at a border), micro-credentials can effectively reward milestones of personal achievement and be combined with other achievements to eventually become important elements of a high-stakes credential. In many ways, digital micro-credentials have been an early signal indicating the desire on the part of education providers and employers to digitize all types of credentials. However, the security limitations of digital badges have limited the range of appropriate use cases. For instance, because badge data and badge display are hosted separately, the display could easily be tampered with. 
Further, because recipients do not control any cryptographic keys connected to their badges, they don’t really have technical ownership of them. Despite these limitations, however, the security level provided by Open Badges is appropriate for their intended use cases: micro-credentials documenting small steps along the road of greater achievement. Blockcerts In response to the desire for high-stakes credentials in a digital format, the development of Blockcerts began in 2015 as part of a project by the MIT Media Lab. The intent was to leverage the power of the blockchain as a global notary for the verification of digital records. Formally launched in 2016, all of the reference libraries were published under an MIT open source license, making the code free to use by anyone wanting to build their own applications for issuing, receiving, and verifying Blockcerts. Most significantly, the open standard includes an open source Universal Verifier that will verify any Blockcert issued by any institution, anywhere in the world. Anyone can use the Blockcerts Universal Verifier at blockcerts.org or spin up their own Universal Verifier from the open-source code available there. Rather than using a simple image format, like badges, Blockcerts were designed as software (JSON files) that could potentially embody any type of data and generate any type of display. These records are cryptographically signed by the issuer, include recipient keys, and are registered on a blockchain for later verification. In summary, Blockcerts are fundamentally different from badges by offering the following innovations: - Tamper evidence - Issuer and recipient ownership - Flexible form factor - Online and offline sharing with verification - Independent verification A common use case for Blockcerts is a university diploma or transcript. Let’s say Jane has recently graduated from college and receives an official copy of her academic record in a digital format that contains her keys. 
She can then choose to present her record to anyone—like a potential employer—who can independently verify the issuer of the diploma, the time of issuance, and its status (valid, expired, or revoked). That employer could even verify that the diploma was issued to Jane, and not to someone else. Never before have digital records been this secure or convenient to use. Further, Jane’s academic record could reference any number of other records, like badges, that she may have earned along the way. Once records become software, all kinds of operations become possible. The end result of blockchain-secured records is a reduction in overhead related to verification, a streamlined transfer of information, improved ability for learners to share their data, and an easier movement between education providers, states, and countries — all of which contribute to a dramatic reduction of fraud and greater convenience for everyone involved. Note that Blockcerts fundamentally differs from a National Student Clearinghouse model of credential transfer because it doesn’t rely on a centralized authority to store, send, and verify the credentials on anyone’s behalf. Instead, the student or worker becomes their own “lifelong registrar,” able to store, access, and verify any Blockcert issued to them by any provider anywhere in the world. Institutions or individuals looking to verify Blockcerts don’t need any special software; they don’t need to be part of a “credentials consortium” or join any special network or pay any fees. This is the breakthrough of decentralized credentials enabled by secure, recipient-owned digital records using the global open standard. Spectrum of Security Official records are issued in various ways, each suited for a different purpose. For instance, sometimes paper is appropriate for situations where security needs are low and usage happens within a rapid time frame, like a ticket to enter an event venue. 
On the other end of the spectrum is Blockcerts, the highest level of security for the most important records people wish to use and keep for a lifetime. Learning Machine is excited to lead the way in helping organizations become issuers of Blockcerts with an easy-to-use product interface. To make the Learning Machine Issuing System even more useful as a credentialing platform, we are releasing a new set of features this January (2019) to enable the issuance of IMS-compliant Open Badges. By allowing education and training providers to issue both standards-compliant Blockcerts and Open Badges in one place, we are helping them consolidate the systems they use for credentialing and creating major credentialing efficiencies for recipients, who can now receive and store all their records in the same way. Looking Forward Beyond the Learning Machine Issuing System, we’re also excited to continue our work with international standards bodies. In addition to co-chairing the W3C Credentials Community Group and being part of the IMS Executive Board for digital credentials, we have also joined the steering committee of the Decentralized Identity Foundation (DIF), where we continue to collaborate with industry leaders to create data standards that ensure the interoperability of digital records between vendors. Everyone sees the power of trustworthy digital records, particularly when they protect privacy and promote convenience for everyone involved. It’s up to us, however, to determine whether those digital records will be accessible and verifiable independent of a particular vendor’s infrastructure. That is the power of using open standards like Blockcerts. If you would like to discuss how Open Badges and Blockcerts can enhance your institution’s credentialing operation, please reach out to us.",https://www.hylandcredentials.com/badges-and-blockcerts/,,Post,,Explainer,,,,,,,,2018-12-13,,,,,,,,,,,,,
HylandCreds,HylandCreds,,,,,,,,,Digital Identity,"A framework for organizing the categories of digital identity and an analysis of where disruptive innovation is most likely to succeed. [...] Learning Machine has made the strategic choice to disrupt paper documents with verifiable digital records (software), rather than competing directly within the traditional identity space. The following analysis explains why. Note that some startup challengers will be named as exemplars in their categories, which is not meant to imply any criticism of those companies. In fact, many of these companies are collaborating behind the scenes on data standards that will form a common foundation for future interoperability.","Digital Identity A framework for organizing the categories of digital identity and an analysis of where disruptive innovation is most likely to succeed. Digitizing people’s identity to streamline their interactions with a digitally connected world is a movement full of opportunity, but also fraught with danger. While creating convenience and expanding access to services is universally desired, asymmetrical power relationships can lead to predatory practices. Whether this is a government centralizing data or a company driven by profit and expansion, the misuse of personal data is a growing cause of concern. This concern has resulted in a movement that uses concepts like “self-sovereignty” to denote a raft of practices intended to protect individuals: data minimization, decentralization, consent, ownership, and limited access are just a few. This is a noble and timely movement, but one in which startups are challenging very powerful and wealthy incumbents. Incumbents generally are not motivated to disrupt themselves, so what strategy can effectively disrupt the entrenched digital identity market? 
As Clay Christensen famously wrote 20 years ago, disruptive innovation is a process by which a new thing transforms an existing market with simplicity, convenience, and affordability where complication and high cost were the norms. This type of innovation can be hard to spot at first due to a lack of features or immediate usefulness, but it contains something new and valuable that sustains its growth over time. Just look at how portable camera phones seemed worthless at first, and then they grew to disrupt the entire camera industry. With disruptive innovation in mind, let’s look at the digital identity problem space from a business strategy perspective. This starts by recognizing that digital identity is not one monolithic sector, but rather a collection of different categories in competition with each other. Access management, regulatory compliance, and Internet accounts are typically considered the three constitutive parts of the identity space. These sectors are where all the money is spent, where research and development are targeted, and where most public attention is focused. However, at Learning Machine, we believe there is an important fourth category: Documents. Together, these categories form an easy-to-remember acronym: ACID. Learning Machine has made the strategic choice to disrupt paper documents with verifiable digital records (software), rather than competing directly within the traditional identity space. The following analysis explains why. Note that some startup challengers will be named as exemplars in their categories, which is not meant to imply any criticism of those companies. In fact, many of these companies are collaborating behind the scenes on data standards that will form a common foundation for future interoperability. We wish all companies fighting for self-sovereignty to have success. Access Access Management facilitates secure login to various ecosystems. 
This service is ultimately about providing login security, which necessarily creates layers of difficulty. Examples: - Incumbent: Microsoft Active Directory, Okta - Challengers: Uport, Evernym Analysis: Providing login IDs that are rooted to a blockchain and recipient-owned is an altruistic goal because it seeks to limit the leakage of personal information. However, this space already has difficulty layers in place for security, and today, managing cryptographic keys is hardly considered easy. Increasing the difficulty level and responsibility of users will make it hard to win at scale because the public generally values convenience overall. People want to be empowered, yes, but they also like having a safety net. Other well-funded technologies, like artificial intelligence, may provide better alternatives to passwords in the short term. Compliance Regulatory Compliance for various industries requires knowing their customer (KYC), which is part of a larger anti-money laundering (AML) initiative. Examples: - Incumbent: Experian, LexisNexis - Challengers: Civic, Secure Key Analysis: Current KYC blockchain startups are essentially just providing another vendor-controlled network. In other words, a person’s blockchain-anchored profile will only work within that vendor’s proprietary ecosystem. More importantly, these startups are relying on data that originates from incumbents like Experian, LexisNexis, and others — they are simply putting that data on a blockchain. Having a core dependency upon one’s competition doesn’t seem like a formula for effective disruption. Internet Identity for web-based accounts is how we use online services and social networks. Preferences are configured by people and experiences are tuned to each person’s activity history. This space has been widely covered in the news, in part due to recent violations of trust, privacy, and what many feel to be a dishonest representation of the service. 
While users may establish separate profiles for different services, these services gather data from beyond their walls to develop expanded profiles that are monetized in ways that violate common conceptions of privacy and trust. Due to the imbalance of power in these relationships, predatory practices often emerge where people are increasingly exploited for payment with dollars or personal data. Examples: - Incumbent: GAFA (Google, Amazon, Facebook, Apple, etc.) - Challengers: Web 3.0 (Blockstack, Decentralized Apps, Tokens, etc.) Analysis: Creating an entirely new Internet (Web 3.0) where people enter into a relationship with a service maintaining full control over their identity is an astonishingly ambitious goal formed in response to a real problem. The struggle here will be the high switching costs for regular people to move away from the traditional web, along with the vast wealth of these competitors. The extremely low number of early adopters confirms this friction. A decentralized web would be amazing, but it seems like a very long road that will be strewn with startup casualties along the way. ICOs and tokens are interesting because they have created a new funding mechanism to extend the runway for many of these initiatives, but that doesn’t change the uphill dynamics. Documents Official documents (paper) still run the world. Processes like certification, testing, inspection, health, employment, finance, personal ID, and education still create paper trails to elicit trust and rely on postage to send these trusted artifacts. Just look at how the Apostille Process (notarization) is still the de facto standard for international law. Even PDFs are essentially just paper equivalents with most of the same limitations. Today, these documents are relatively easy to fake and difficult to transmit or verify. 
And yet, official records are potentially a powerful form of social currency because they allow people to authentically represent their various achievements, experiences, and characteristics (aka attestations or verifiable claims). Examples: - Incumbents: Paper, wax, postage - Challengers: Learning Machine Analysis: We believe that replacing physical credentials (documents) with programmable ones (software) is the most effective way for a startup to disrupt the identity space. The processes surrounding paper to make it certified aren’t convenient and paper inherently lacks powerful features like self-attestation and built-in machine readable data. Further, public blockchains provide a global verification network that can ensure the integrity, ownership, and authenticity of these new digital records. Of course there are other digital approaches to the document space, like Adobe’s Document Cloud/Blue Ribbon. Their pre-blockchain approach has wide adoption and is a precursor to documents as software. Replacing a centuries-old technology with a natively digital object has dramatically more value for issuers, recipients, and verifiers, making this a winnable category in the short term and potentially disruptive to the entire identity space long term. This approach is easy to adopt and the Learning Machine Issuing System is currently being rolled out across entire countries. That’s innovative disruption. Summary When we examine the various parts of the digital identity sector, we have to ask where smaller companies can most likely make an impact. Historically, this doesn’t happen by going toe-to-toe with established incumbents. Rather, innovative disruption happens by developing something that has 10x value, a process that typically must start outside the mainstream. One way disruption gains traction is when it surfs a secular trend with powerful momentum. 
In the case of digital identity, public decentralized blockchains are providing that momentum and the most successful approaches will be fully aligned with that power. From our point of view, competing in the traditional identity space with a blockchain solution offers little added value today and is sometimes at odds with the underlying values of public blockchains—decentralization, permissionless innovation, and inclusiveness. For instance, KYC is necessarily centralized, hierarchical and exclusionary, which makes using a blockchain an unusual tool of choice. On the other hand, replacing documents with software has clear 10x advantages that empower individuals with social currency for a lifetime. This approach is full of winnable markets and completely aligned with the values of decentralization, greatly increasing the effectiveness of these records as well as the transformative potential for this strategy.",https://www.hylandcredentials.com/digital-identity/,https://www.hylandcredentials.com/wp-content/uploads/2018/10/Screen-Shot-2018-10-09-at-5.02.47-PM.png,Post,,Explainer,,,,,,,,2018-10-06,,,,,,,,,,,,,
HylandCreds,HylandCreds,,,,,,,,,Flexible Systems,"Top-down initiatives to reconstruct entire sectors like digital identity are too brittle to succeed. Flexible systems require a different mindset. [...] unlike every other sector (media, communications, engineering, finance, etc.), official records largely rely on old formats like paper, wax, and PDF for certification, all of which are hard to verify and easy to fake. This is a big reason why public blockchains are exciting, because they have the power to prove the authenticity, ownership, and integrity of a natively digital record. The combination of strong cryptography and public blockchains provides a new technical infrastructure that gives people the ability to manage their own records of achievement in a format that is digital, easily shared, and instantly verifiable using a global verification network.","Flexible Systems Top-down initiatives to reconstruct entire sectors like digital identity are too brittle to succeed. Flexible systems require a different mindset. Official records are one of the most powerful forms of social currency. They allow people to demonstrate proof of their abilities, accomplishments, and experiences in a way that helps them gain entrance to new realms of economic possibility. However, unlike every other sector (media, communications, engineering, finance, etc.), official records largely rely on old formats like paper, wax, and PDF for certification, all of which are hard to verify and easy to fake. This is a big reason why public blockchains are exciting, because they have the power to prove the authenticity, ownership, and integrity of a natively digital record. The combination of strong cryptography and public blockchains provides a new technical infrastructure that gives people the ability to manage their own records of achievement in a format that is digital, easily shared, and instantly verifiable using a global verification network. 
Of course, exciting new frontiers can also bring out some impatient instincts. The Instinct to Centralize Anytime the world gets a powerful new technology with the potential to make old dreams come true, the instinct is to harness the power by immediately attempting to pre-architect all desired outcomes in one step. This tempting instinct creeps into both the strategic and technical arenas, ultimately distracting from long-term transformation with a short-term eagerness. Strategic Mistakes Centralization can be powerful and effective in many ways, and so it’s no surprise that strategic plans for new technologies often begin with the word “Universal” — as in universal ID cards, universal academic records, and so on. The problem with instituting this type of top-down control is that it presumes to understand all possible situations, now and in the future, for how the system must operate. This type of vision is not only brittle, it ignores the unique traits of the technology and imposes non-existent limitations. Let’s look at two examples. A universal ID card for a country usually proposes to place all personally identifiable information in one government controlled system to access a variety of social benefits. While this might feel efficient for a government, overconsolidation of data creates a honey pot that ultimately threatens the safety of citizens and misses a larger opportunity: public blockchains make it possible for citizens to transact and share specific attributes about themselves, without the vulnerability of storing all their data within a single database. A universal academic record proposed by schools typically sounds like this: 1. Write data to a secure common student record 2. Participate in verification revenue over time 3. Accelerate admissions and matriculation with a standard set of definitions. This is basically the Clearing House model, which isn’t new and doesn’t require a blockchain. 
This mindset also misses a profound reality that education now comes from a proliferation of alternative providers over a lifetime, and that the blockchain enables every student to act as their own lifelong registrar. In both examples, the desire is for a top-down business model. This is powerful, but misses the longer-term possibility of real transformation that generates an entirely new set of conditions for social organization, and even monetization. Technical Mistakes New technical initiatives often feel pressure to provide immediate value, to deliver the dream in too few steps. This eagerness is typically expressed in one of two ways. Let’s look at these examples as they relate to public, decentralized blockchains which act as a simple foundational layer for further layers of innovation. Centralized control over the technology provides greater freedom to invent, extend features, and move quickly. Often called Distributed Ledger Technology (DLT), these initiatives may be somehow using a blockchain data structure, but in the context of a shared database. While a shared database may provide great value, it doesn’t provide the iconic features of immutability, permissionless innovation, or censorship resistance. Simply using a shared database would be more effective than creating a DLT system for many, if not most, use cases. Smarter chains can also seem like another shortcut to wish fulfillment. By having more features built at the chain level, more functionality arrives out of the box, before applications for the chain ever get developed. The problem lies in how this functionality both limits unforeseen future applications and creates a larger surface area for failure, which is the last thing you want in a foundation layer. A More Durable Approach Long-lasting technologies are built like geological strata that layer on top of previous achievements. Generally, each layer addresses a specific need. 
Over time, this stack of technologies adds up in ways that weren’t possible to predict and are far more transformative than previously imagined. Investor Fred Wilson recently posted a reflection on this process: “First, apps inspire infrastructure. That infrastructure enables new apps.” This architectural separation of concerns, between infrastructure and apps, is what enables the perpetual momentum of growth that accrues over time. Good planning starts with the realization that new initiatives operate within social and technical ecosystems that are not entirely ours to control. There are realities and forces at work we must collaborate with, along with patterns of success for which we have some good rules of thumb. The first rule is that the foundational layers of infrastructure should be simple. Let’s use the Internet as an example. By all measures, it is a “dumb” network, meaning that it doesn’t do anything but transmit packets of information. However, it is also this simplicity that makes it strong and capable of supporting all manner of sophisticated apps that come and go over time. Bitcoin has a similar heritage because the network has a similar simplicity, only moving packets of value. While that simplicity is frustrating for some, this industrial layer of strength provides the solid foundation, and the space, necessary for future innovations at higher application layers. It is important to keep in mind the many parts of a technology stack when trying to understand technology news and vendor offerings. Without a framework, the myriad of new initiatives will seem overwhelming and perhaps contradictory. Some vendors are offering blockchains, others are promoting data standards, and a few are actually providing applications that you can use, like Learning Machine. Each of these layers interrelate and directly contribute (or not) to a desired set of values or outcomes. 
So let’s outline the layers of the Learning Machine technology stack, starting with the application (the product). - Application: Learning Machine provides an online system to easily create, issue, and manage digital records that are recipient owned and instantly verifiable with any public blockchain. - Data Standard: Blockcerts and the Verifiable Claims specification are the Open Standard for how to create, issue, and verify blockchain-based records. It is the reason records issued by Learning Machine are interoperable and can be anchored to any chain. The Blockcerts Universal Verifier also ensures that these records can verify regardless of chain or vendor. - Blockchain: This foundational layer is public infrastructure, like the Internet, that is supported by a global network. Learning Machine recommends rooting records to public decentralized chains with enough scale and adoption to last a lifetime. Keeping these layers separate allows for progress and flexibility to happen at each layer, which is essential for growing better at their core functions — the blockchain layer provides secure verification; the Data standard allows for interoperability; the Application allows for convenience and real-world usage. Understanding how to better facilitate flexible systems is important for pursuing the end goal, which is to empower people with the social currency of their records. To be effective, any currency requires ownership, trust, convenience, independence, and interoperability. Contact us if you would like to learn more about issuing trustworthy digital records that empower people with real social currency.",https://www.hylandcredentials.com/flexible-systems,,Post,,Explainer,,,,,,Blockchain,,2018-10,,,,,,,,,,,,,
HylandCreds,HylandCreds,,,,,,,,,Hosted ≠ Verified,"Have you ever seen a person’s certification listed on LinkedIn, and then followed the link to the actual credential? It typically resides on the domain of a software vendor, or on the domain of the issuing institution, with the intention of communicating authenticity. While hosted credentials provide convenience for both credential holders and verifiers, hosting is not enough to provide a secure basis for verification.","Hosted ≠ Verified Credentials hosted on an institution’s domain may be convenient, but hosting alone is not sufficient to provide security, reliability, or longevity for recipients and verifiers. Credentialing has been undergoing a renaissance in recent years, encouraged by the unbundling of education and a proliferation of new education providers targeting niche outcomes. This trend has yielded an explosion of digital certificates, micro-credentials, badging, and other innovative symbols of accomplishment. Have you ever seen a person’s certification listed on LinkedIn, and then followed the link to the actual credential? It typically resides on the domain of a software vendor, or on the domain of the issuing institution, with the intention of communicating authenticity. While hosted credentials provide convenience for both credential holders and verifiers, hosting is not enough to provide a secure basis for verification. Easy to Spoof Many of these new credentials are simply a web page. However, as we all know from email phishing scams, websites are easy to fake and a slightly altered domain name can be hard to spot. If a motivated imposter wanted to set up a website to make fake credentials look real, this wouldn’t require much effort. The case of Open Badges is slightly different. These are typically image files with information attached, and easily shareable as a discrete object. However, when verification occurs, it is not the visible badge which is being checked. 
Rather, verification is checking the hosted version of that badge, not the display that is in hand. This means the display of a badge could be completely changed and it would still successfully verify. This is what we mean when we say a credential is not “tamper evident.” In both cases, what you have are credential displays that are easily spoofed. While this level of security may be fine for temporary or low-stakes accomplishments, it’s fundamentally problematic for higher-stakes credentials like diplomas, transcripts, identity documents, and licenses. Below are two major drawbacks of relying on hosted credentials for long-term verification. Inconsistent Availability Beyond being an untrustworthy display, websites simply aren’t reliable for the long term. Sites go down, links get moved, and so on. For instance, when Open Badge vendors go down, none of the credentials issued through those platforms will remain usable or even visible. Imagine applying for a job and only having a 404 error page when the employer clicks on your credential. It’s hard to believe that some educational institutions are trusting startups for hosting credentials that need 100% availability. Unlikely to Survive Even if your organization chooses to host everything itself, the maintenance of online records is a huge responsibility, and the risk of going down, causing harm, and suffering reputational damage is likely. Plus, very few organizations will last for a lifetime. Don’t you want your graduates to have the confidence that proof of their accomplishments will work for the long term, even if your organization should change or disappear? This is certainly the case with credentials that have value beyond getting one near-term job. In short, hosting credentials provides a convenient way for people to share a link, but it doesn’t provide confidence for verifiers. If new credentials are going to gain the gravitas of traditional records, they will have to grow into a more secure format. 
This is why Learning Machine provides a Blockcerts-compliant issuing system designed for issuing digital records in an independently verifiable format via any blockchain–public or private. We recommend using public blockchains for their longevity, security, and immutability. Governments, companies, and school systems with an eye toward the future are beginning to move in this direction. Better Credentials Valuable credentials shouldn’t have ongoing dependency upon an issuer or vendor in order to be viewed, shared, or verified. This is what public blockchains help to correct by providing a verification network that has no single point of failure. People can hold and share their digital credentials, and this new public infrastructure allows for those credentials to have a durable and long-lasting source of independent verification. “It’s self-sovereign, trustworthy, transparent, and impossible to destroy because it’s not simply stored on a database in some government building.” Evarist Bartolo Minister for Education and Employment, Malta In addition to being more durable, this type of decentralized verification is instant, free, and extremely detailed when using the Blockcerts Open Standard. In addition to checking for evidence of tampering, the Blockcerts open source verification process also checks issuer signatures, recipient ownership, date of expiry, and revocation. If your organization is issuing important records or certifications of accomplishment, you should be planning when to adopt more secure practices to protect credential owners and to protect your organization from potential liability, ongoing responsibility for credential maintenance, and reputational damage. If you would like to learn more about how Blockcerts can become an integral part of your organization’s long-term strategic credentialing plan, reach out to us at contact@learningmachine.com.",https://www.hylandcredentials.com/hosted-%e2%89%a0-verified/,,Post,,Explainer,,,,,,,,2018-09,,,,,,,,,,,,,
|
||
HylandCreds,HylandCreds,,,,,,,,,Remaking Credentials,"When desktop computers came into the workplace 25 years ago, the problem of paper remained. How could print layouts be shared and displayed across a variety of electronic machines and operating systems? The most prominent solution that emerged was Adobe’s Portable Document Format (PDF), a proprietary solution released in the 1990’s as a way to share documents that included text formatting and in-line images. Even though more features were added over time, in essence, PDFs operated as a paper analogue for computers.","Remaking Credentials As we move from a world of discrete paper repositories to a world of interconnected digital systems, we need official records that are natively digital to reap the full benefits of electronic exchange. When desktop computers came into the workplace 25 years ago, the problem of paper remained. How could print layouts be shared and displayed across a variety of electronic machines and operating systems? The most prominent solution that emerged was Adobe’s Portable Document Format (PDF), a proprietary solution released in the 1990’s as a way to share documents that included text formatting and in-line images. Even though more features were added over time, in essence, PDFs operated as a paper analogue for computers. As the adoption of PDFs became more widespread, this format was adopted by some organizations as an alternative to paper for embodying and conferring official records to recipients. Since PDFs are not hard to alter/edit, they needed to be “sealed.” So, digital signatures from the issuing institution were added, which rely on a centralized party, like Adobe, to verify the entity behind the signature. While this method gained modest traction, it hasn’t created a new normal for the peer-to-peer exchange of official records. In fact, the Apostille (notary) process is still the standard for transmitting official records internationally. 
Further, plenty of vendors have entire business models to be the trusted middleman for sending or verifying records. The end result of PDFs has been a failure for trusted records. Desktop computers may be able to display them, but little else. People must still pay money and wait a long time to have their records sent. Relying parties must spend time and money to make sure these records are legitimate. Basically, these PDFs are no more functional than paper — they are very large files (slow), not enriched with metadata, and not easily machine readable. All of this prevents the transformational benefits and efficiencies of a truly electronic exchange. PDFs + Blockchain? With the rise of decentralized systems, blockchains have become famous for enabling a new level of security and peer-to-peer exchange for digital assets. Not to miss a marketing opportunity, some software vendors have added blockchain timestamping to their PDF credential service. This is a process by which a document is registered on a blockchain, at a specific point in time, to prove that a certain version existed. The question here is what added value does timestamping actually provide in this scenario? Very little. Blockchains were made to enable decentralized systems where digital assets are cryptographically owned by recipients and function peer-to-peer without relying upon any vendor or third party. So, unless a software provider has gone to lengths that make both of those goals real, no fundamental benefit is being realized from using a blockchain. The vendor still provides all of the assurances, and the blockchain is simply providing redundancy. Further, proprietary approaches that aren’t open-source, or based on open data standards, are doomed to a short lifespan. Even when a PDF has been digitally signed and blockchain timestamped, it doesn’t suddenly become useful as a software object, beyond the mere ability to view it. Official records as stand-alone objects are completely uninspiring. 
We need to do better and we have the technology to do so. Official records can be made as software to interrelate with other systems in reliable and dynamic ways. This is how we reach the automation, speed, analysis, and discovery that everyone desires. PDFs were a capstone for the age of paper. They are not the way to enter a truly digital age. Natively Digital Credentials JSON is the default choice for transmitting data on the Internet and within web applications. While originally named for moving JavaScript objects, it is now used as a standard format across all popular programming languages. The most common use cases are for web APIs that send data between 3rd party systems, or to communicate within a system between a server and a user’s browser. As the de facto standard for transmitting data, JSON must be the starting point for any type of official credential that seeks to take full advantage of the web and electronic exchange. This is why JSON was the starting point for Blockcerts, the open standard for blockchain-based credentials, launched with MIT in 2016. The primary question was how to fully equip a JSON file with the properties needed to operate as a modern credential. In addition to being instantly verifiable using a blockchain as a global notary, a few design principles were always priorities: - Open source - Reliance on open standards - Recipient ownership - Minimize resource requirements (computation, cost, etc.) - Must be viable without any proprietary product - Blockchain-agnostic These minimal requirements resulted in a solution now regarded as the most secure, interoperable, and standards-based way to issue and verify natively digital records. Committed to eventual alignment with the W3C’s Verifiable Credentials Specification, these JSON files are digitally signed by an issuer and anchored to a blockchain for later verification. 
Even the visual presentation layer has been cryptographically sealed, so parties looking at the credential know all of the machine readable data is fully integrated with what they are seeing on screen. Further, each credential has an embedded cryptographic key unique to a recipient, allowing the recipient to prove ownership of the credential. The potential for computer systems to organize, filter, combine, and understand digital credentials is limitless — for systems of both issuers and verifiers. Imagine having an HR system that automatically verified, organized, and used machine learning to help derive insight about a pool of applicants. In Summary While PDF documents are digital, they carry all the same limitations of paper. They are inert and heavy files whose value is confined within their own display, which is of little value in a world connected by computer systems. It’s not hard to see how the PDF-or-JSON difference might get lost when they look very similar on screen. However, the difference is profound. Understanding that chasm starts with appreciating the full range of function inherent in software objects, their readiness for other systems and processes beyond mere display. Each digital credential has the capacity to interlock with different networks and economies like machine parts, maximizing the value of those credentials in different ways for everyone involved.",https://www.hylandcredentials.com/remaking-credentials/,,Post,,Explainer,,,,,,Credentials,,2019-04-19,,,,,,,,,,,,,
|
||
HylandCreds,HylandCreds,,,,,,,,,Why Use a Blockchain?,"Everyone wants digital records to be shareable and verifiable, but it is only now that we have the technical infrastructure to reliably accomplish that goal. The innovation that makes this possible is blockchain-enabled networks that synchronize around a single truth. While digital signatures and public key infrastructure (PKI) are important pieces of a secure credentialing solution, it is the addition of a decentralized verification network that adds the highest level of security, longevity, and recipient ownership to digital records.","Why Use a Blockchain? Blockchains offer a new public infrastructure for verifying credentials in a manner far more durable, secure, and convenient than relying upon a single authority. Everyone wants digital records to be shareable and verifiable, but it is only now that we have the technical infrastructure to reliably accomplish that goal. The innovation that makes this possible is blockchain-enabled networks that synchronize around a single truth. While digital signatures and public key infrastructure (PKI) are important pieces of a secure credentialing solution, it is the addition of a decentralized verification network that adds the highest level of security, longevity, and recipient ownership to digital records. Traditional solutions for verifying digital records, including PKI, have typically relied on a trusted third party (TTP) to transmit or provide verification. This might be a vendor, an issuer, or a certificate authority. Unfortunately, in these cases, the TTP operates within limited jurisdictions and precariously maintains a single point of failure. This means that if the TTP is ever compromised, loses their records, or stops functioning, verification is no longer possible. Some minimize the risk of such a failure, but catastrophic failures happen all the time across every geographic region, leaving people stranded and exposed. 
Disaster Examples: - War: In Syria, civil war left major institutions of government and education destroyed. Millions of people can no longer prove who they are or what their skills are because the only institutions who could verify this information are no longer functioning, or have lost their records. - Natural: In 2017, Hurricane Maria hit Puerto Rico. Critical infrastructure was wiped out by the hurricane, causing loss of high-stakes records. These included vital records (birth, death, and marriage certificates), driver’s licenses, property titles, and address and tax records. - Technical: In the United States, the Equifax hack demonstrated how a single honey pot of personal information, like social security numbers, can leave citizens completely exposed. The point is that disasters are common and can happen anywhere, to any trusted third party. Entrusting a single entity with the power to protect and verify those records creates a brittle system with poor security and longevity. It is insufficient for high-stakes records that need to be accessed and verified reliably for a lifetime. A better alternative is having this same trusted authority backed up thousands of times, across the globe, and accepted across jurisdictions because the data isn’t controlled by any single company or government. That is what public blockchains have enabled. Even better, using an open standard (like Blockcerts) to anchor records to blockchains creates an ecosystem of globally portable, interoperable records that can easily be recovered if disaster strikes. Blockchains and Decentralization Every decade or two, a new computing platform comes along that changes how we live. Personal computers, the Internet, and smartphones are all examples of fundamental innovation. What’s hard to comprehend about new platforms is that they are initially inferior to older platforms in most ways, but they also bring about some profound new capabilities. 
Today, decentralized software, enabled by blockchains, is the fundamental innovation. While these platforms are sometimes counterintuitive and lack many features, they offer something that has never existed before: Trust. Instead of having to trust a government, or a large company, or even the other people on the network, the only thing that needs to be trusted is math. That bedrock characteristic opens up the door for new types of software to be developed where trust is essential, like money, property, or official records. Further, because trust is built into the platform itself, it can be run by a global network with thousands of participants, rather than a single company like Facebook. A blockchain is a way of storing an identical copy of data across the entire network, so when some piece of data needs to be verified, there is a global consensus supporting that fact. Replication of data provides durability, and decentralization resists censorship. Technical Benefits The main difference between PKI and blockchains is simply that, with blockchains, verification authority is being decentralized. We can call this DPKI. The technical benefits of this are independent timestamping and a globally redundant network for instant verification. Independent timestamping is a security enhancement beyond traditional PKI. A blockchain provides its own timestamp for when each credential was conferred to a recipient, which is a type of transaction. This ultimately gives Issuers the ability to rotate their issuing keys without undermining the ability to verify those transactions. Verification requires checking that the credential originated from a particular Issuer while that issuing key was valid, which requires knowledge of the timestamp beyond anything written into the credential itself. If a private key is compromised, nothing prevents an attacker from issuing fake credentials and backdating the content. 
Even if an Issuer publicly revoked those fake credentials, an independent verifier would not know the difference between a valid and invalid credential. With blockchain-based independent timestamping, the time of the transaction is recorded, thus rendering the backdating attack impossible. A global verification network with thousands of computers that all contain the same copy of historical transactions removes the vulnerability of relying upon a single authority. The effect is improved availability, the capacity to independently verify, and redundancy that avoids single points of failure. It’s also important to point out that education providers are not surrendering any authority in this situation. Schools still issue, store, and host the records as they always have; they are simply gaining a level of security that didn’t exist before. Overall, blockchains offer promising new features which help to achieve security goals while enabling individuals to hold their own official records, independent of any authority. This is the cross-jurisdictional verification infrastructure needed in today’s globalized world.",https://www.hylandcredentials.com/why-use-a-blockchain/,,Post,,Explainer,,,,,,Blockchain,,2018-09,,,,,,,,,,,,,
|
||
HylandCreds,HylandCreds,,,,MIT; Learning Machine,,,,EDUCAUSE 2018,Credential (n.),"Blockcerts was publicly announced at EDUCAUSE in 2016 to an immense room of attendees consisting of CIOs, Registrars, and IT professionals — a presentation which, over the last two years, kicked off a wave of experimentation, press, and interest for using blockchains as a new infrastructure of trust.<br>Further, MIT has been using the Learning Machine Issuing System to issue official diplomas as Blockcerts to graduates at all levels across the Institute. This October we returned to EDUCAUSE 2018 with Mary Callahan, MIT Registrar, to share a progress report on issuing blockchain-based digital diplomas to graduates over the last year and a half.","Credential (n.) From the Latin credere: “to believe, trust” Paper documents have been used throughout history to represent aspects of an individual’s identity or qualifications, providing the bearer of that credential a certain amount of credit when asserting a claim. Today, these take the form of birth certificates, academic records, titles, deeds, licenses, and various other instruments that allow people to authentically represent something about themselves to the world. Unfortunately, these paper documents have been losing currency due to being easy to fake and hard to verify. While various seals, watermarks, and complexity have been added over the years, we stand at a moment in history where fraud is rampant and bearer instruments have lost most of their efficacy. Former FBI agent Allen Ezell, and John Bear, Ph.D., have written a book that focuses on corruption within academic credentialing titled, “Degree Mills: The Billion-Dollar Industry That Has Sold Over a Million Fake Diplomas,” a book that will crumple any belief which maintains trust in traditional formats. 
Of course, loss of trust in these formats has resulted in byzantine processes for the transmission and verification of records that are inconvenient and expensive for everyone involved. This is why MIT and Learning Machine started working together in 2016, to make a new kind of digital record that restores trust in credentials like academic records. The result was launching Blockcerts.org — the open standard for securing digital records by using a blockchain as a global notary system to verify authenticity. The goal of this resource is to provide people with the ability to store their own records and use them directly in the world when they see fit. Further, relying parties can use the open-source verifier to instantly check these credentials, a process that generates a hash of the local document and compares it to a hash on the blockchain. When everything matches, and it has not expired or been revoked, the credential is verified. EDUCAUSE Blockcerts was publicly announced at EDUCAUSE in 2016 to an immense room of attendees consisting of CIOs, Registrars, and IT professionals — a presentation which, over the last two years, kicked off a wave of experimentation, press, and interest for using blockchains as a new infrastructure of trust. Further, MIT has been using the Learning Machine Issuing System to issue official diplomas as Blockcerts to graduates at all levels across the Institute. This October we returned to EDUCAUSE 2018 with Mary Callahan, MIT Registrar, to share a progress report on issuing blockchain-based digital diplomas to graduates over the last year and a half. Chris Jagers, Learning Machine CEO, kicked off the presentation by talking about the power of open standards as well as explaining the technology behind public decentralized blockchains. Driven by inclusion, security, and trust across borders, open decentralized blockchains provide a new public infrastructure similar to the Internet — a network not controlled by any company or government. 
Mary Callahan, Senior Associate Dean and MIT Registrar, followed by presenting a summary of experience and data from issuing digital diplomas over the last year, all of which was organized by four core motivations: to empower students with ownership, reduce fraud, increase immediacy of information, and to help students build a lifelong portfolio of credentials. The presentation was followed by a variety of questions, including how to future-proof these digital records. This brought the conversation back to Blockcerts, because open standards are the best way to be prepared for an unknown technology future. While proprietary formats may sometimes gain quick adoption, they get wiped out when open standards take hold and begin to grow. Ultimately digital records should be trustworthy, recipient owned, and vendor independent. If your institution is interested in becoming an issuer of these digital records, we would love to talk with you.",https://www.hylandcredentials.com/credential-n/,,Post,,Meta,,,,,,,,2020-01-01,,,,,,,,,,,,,
|
||
HylandCreds,DHS,,,,LearningMachine,,,,,DHS Awards 159K for Infrastructure to Prevent Credential Fraud,"Phase 1 award project “Leveraging **Learning Machine**’s Commercial Offering in Public Infrastructure for Fraud Prevention” will adapt their current commercial technology using the open-source Blockcerts standard to support emerging global World Wide Web Consortium (W3C) security, privacy and interoperability standards such as decentralized identifiers (DID) and verifiable credentials for credential issuance and verification solutions. The proposed approach enables credential user and DID provider independence from vendor-specific accounts to access credentials and promotes holder control and interoperability.","FOR IMMEDIATE RELEASE S&T Public Affairs, 202-254-2385 WASHINGTON – The Department of Homeland Security (DHS) Science and Technology Directorate (S&T) has awarded $159,040 to Learning Machine Technologies, Inc. based in New York, to develop blockchain security technology to prevent credential fraud. Government agencies issue, validate, and verify credentials for a variety of purposes. For example, DHS operational components, such as U.S. Customs and Border Protection, the Transportation Security Administration, and U.S. Citizenship and Immigration Services, issue, validate or verify eligibility requirements; licenses and certifications for travel, citizenship, and immigration status; employment eligibility; and supply chain security. Current processes are often paper-based and do not facilitate data exchange and use among systems, making them potentially susceptible to loss, destruction, forgery and counterfeiting. S&T is exploring the application of blockchain and distributed ledger technology (DLT) to issue credentials digitally to enhance security, ensure interoperability and prevent forgery and counterfeiting. 
Learning Machine Technologies’ Phase 1 award project “Leveraging Learning Machine’s Commercial Offering in Public Infrastructure for Fraud Prevention” will adapt their current commercial technology using the open-source Blockcerts standard to support emerging global World Wide Web Consortium (W3C) security, privacy and interoperability standards such as decentralized identifiers (DID) and verifiable credentials for credential issuance and verification solutions. The proposed approach enables credential user and DID provider independence from vendor-specific accounts to access credentials and promotes holder control and interoperability. “Standards-based interoperability is critical to implementing innovative, fraud resistant approaches to digital issuance of currently paper-based credentials,” said Anil John, S&T's Silicon Valley Innovation Program (SVIP) Technical Director. “By adapting their existing platform to build support for emerging W3C global standards, Learning Machine will enable organizations to deploy solutions without vendor or platform lock-in concerns.” The Phase 1 award was made under S&T’s SVIP Other Transaction Solicitation Preventing Forgery & Counterfeiting of Certificates and Licenses seeking blockchain and DLT solutions to fulfill common needs across DHS missions. SVIP is one of S&T’s programs and tools to fund innovation and work with private sector partners to advance homeland security solutions. Companies participating in SVIP are eligible for up to $800,000 of non-dilutive funding over four phases to develop and adapt commercial technologies for homeland security use cases. For more information on current and future SVIP solicitations, visit https://www.DHS.gov/science-and-technology/svip or contact DHS-silicon-valley@hq.DHS.gov. For more information about S&T’s innovation programs and tools, visit https://www.DHS.gov/science-and-technology/business-opportunities. 
###",https://www.dhs.gov/science-and-technology/news/2019/11/12/news-release-dhs-awards-159k-prevent-credential-fraud,,Post,,Meta,,,,,,,,2019-11-12,,,,,,,,,,,,,
|
||
HylandCreds,Hyland,,,,LearningMachine,,,,,Hyland acquires blockchain-credentialing provider Learning Machine,"Hyland, a leading content services provider, announced its acquisition of Learning Machine, an innovator in blockchain-anchored digital credentialing solutions. The acquisition was effective February 1, 2020.","Hyland acquires blockchain-credentialing provider Learning Machine Hyland, a leading content services provider, announced its acquisition of Learning Machine, an innovator in blockchain-anchored digital credentialing solutions. The acquisition was effective February 1, 2020. Learning Machine is a pioneer in leveraging blockchain technology to authenticate documents and content. Its credentialing solution facilitates the creation and sharing of blockchain-secured digital records that are recipient owned, vendor independent and verifiable anywhere. The Learning Machine Issuing System allows any organization to easily design their records, import recipient data, issue records and manage the entire credentialing lifecycle. The system allows governments, companies and educational institutions to issue blockchain records at scale, rooted in any blockchain they choose. “This acquisition is a major step toward our goal of revolutionizing the way organizations electronically exchange trusted records,” said Bill Priemer, president and CEO of Hyland. “The addition of Learning Machine’s digital credentialing solutions to Hyland’s content services platform will enable our customers to generate and manage digital documents that are both easily shareable and instantly verifiable.” “The use of blockchain technology for digital credentialing has become an increasingly urgent need as governments, educational institutions and organizations seek to combat fraud, mitigate risk and relieve administrative burdens associated with the exchange of content,” said Chris Jagers, CEO of Learning Machine. 
“This acquisition creates significant value for Learning Machine customers who will gain the full benefit of Hyland’s notable support, partnership and accountability.” One of the most prevalent uses of the Learning Machine technology today is the issuing of digitally secured diplomas and transcripts for and by higher education institutions. With over 900 of these institutions already leveraging the Hyland content services platform, these customers stand to benefit from Learning Machine’s ability to share and deliver authenticated content. Additionally, Hyland looks forward to accelerating the diversification of Learning Machine solutions across the vertical markets and geographies in which Hyland is already well positioned to help organizations deliver better experiences to the people they serve. Hyland will continue to support Learning Machine’s current solutions and customers as it integrates the technology into existing platform offerings. For more information about Hyland and its leading content services platform, visit Hyland.com. For more information about Learning Machine’s digital credentialing system, visit LearningMachine.com. Hyland is a leading content services provider that enables thousands of organizations to deliver better experiences to the people they serve. Find us at Hyland.com. About Learning Machine Learning Machine is a leading provider of blockchain credentialing solutions based on the Blockcerts open standard, enabling customers around the world to issue verifiable digital records at scale. Find us at LearningMachine.com.",https://news.hyland.com/hyland-acquires-blockchain-credentialing-provider-learning-machine,,Post,,Meta,,,,,,,,2020-02-05,,,,,,,,,,,,,
|
||
HylandCreds,Hyland,,,,,,,,,"Hyland, Dataswift and Case Western Reserve University partner to advance web-based verifiable credential storage","The initial phase of the partnership involved building a web-based interface that enables users to easily store and manage their verifiable credentials by uploading them to a user-owned, encrypted personal data account (PDA), an innovative privacy-preserving solution developed by Dataswift, another strategic partner of xLab. That account is the storage system of a personal data server legally owned by users themselves and comes with a Data Passporting function that can be called upon by any application, allowing users to license their data on demand, quickly and securely, with any relevant party.","Hyland, Dataswift and Case Western Reserve University partner to advance web-based verifiable credential storage Students in CWRU’s xLab assist in developing credential storage in personal data accounts Hyland, Dataswift and students from Case Western Reserve University’s xLab initiative have partnered to advance the development of open standards for blockchain-anchored digital credentials. “This partnership has incredible potential, as digital credentials become increasingly accepted as an easy, secure way to share information,” said Valt Vesikallio, SVP, global services at Hyland and an executive sponsor of the project. “We’re excited to be sharing in this work with students at Case Western Reserve University, a world-class institution in our back yard.” The partnership has been beneficial for all parties, as Case Western Reserve students have gained valuable real-world experience in their field of study, while the University has gained partner companies that expand co-curricular opportunities for its students. Hyland, meanwhile, has expanded its potential pipeline of future developers and hopes to expand the number of CWRU’s students and eventual graduates working at the company. 
“Our students are proud and excited to work on such a meaningful project and with a well-known and highly regarded company in Hyland,” said Youngjin Yoo, the faculty director of the university’s xLab. “They’re aware of the potential value and impact of credentials and are playing a key role in the development of these real-world use cases, experience that will help them in their future careers as well.” The initial phase of the partnership involved building a web-based interface that enables users to easily store and manage their verifiable credentials by uploading them to a user-owned, encrypted personal data account (PDA), an innovative privacy-preserving solution developed by Dataswift, another strategic partner of xLab. That account is the storage system of a personal data server legally owned by users themselves and comes with a Data Passporting function that can be called upon by any application, allowing users to license their data on demand, quickly and securely, with any relevant party. Currently, sharing such private documents securely is difficult, costly and time-consuming, and often relies on third parties that alienate the user from ownership of their own data. CWRU’s xLab was founded in 2019 with the aim of fueling the transformation of Northeast Ohio’s digital economy and building digital intelligence in the region. It partners with Northeast Ohio corporations to assist in the implementation of business models for the new digital economy, by way of a multi-year strategic engagement. Students work on company-specific digital innovation challenges in a class instructed by a digital design faculty member. “We’re excited to be working with Case Western Reserve University students on this important endeavor,” said Natalie Smolenski, Head of Business Development for Hyland Innovation. 
“This is a step forward for self-sovereign identity; the technology makes it much easier for data owners to share their personal records securely online and for third parties to access and verify the documents they need.” Hyland is a leading content services provider that enables thousands of organizations to deliver better experiences to the people they serve. Find us at Hyland.com.",https://news.hyland.com/hyland-dataswift-and-case-western-reserve-university--partner-to-advance-web-based-verifiable-credential-storage/,,Post,,Meta,,,,,,,,2021-11-16,,,,,,,,,,,,,
HylandCreds,HylandCreds,,,,DHS; LearningMachine,,,,,Learning Machine wins DHS Grant to align Blockcerts with the W3C specifications for Verifiable Credentials and Decentralized Identifiers.,"Today, **Learning Machine** is proud to announce that we have won Phase-1 funding for our response to the open call “Preventing Forgery & Counterfeiting of Certificates and Licenses through the use of Blockchain and Distributed Ledger Technology.” The purpose of the call was to develop vendor-neutral technology solutions that prevent the forgery and counterfeiting of official records for immigration, travel, visas, and other use cases pertaining to national and citizen security. Our grant application addressed DHS requirements by proposing an upgrade to the Blockcerts open standard, making it capable of issuing W3C Verifiable Credentials.","Future Proof Learning Machine wins DHS Grant to align Blockcerts with the W3C specifications for Verifiable Credentials and Decentralized Identifiers. When Blockcerts was incubated at MIT, it was the first open-source project in the world that demonstrated how to create, issue, and verify a digital record using a blockchain to ensure the integrity of that record. At the time, Bitcoin was widely considered the most viable blockchain, and the W3C Verifiable Credentials specification was still nascent. Nevertheless, the project moved forward with a commitment to the principles of openness, synchronization with other data standards, recipient control, vendor independence, and viability for any blockchain. Since the launch of Blockcerts, major strides have been made in the Self-Sovereign Identity (SSI) space thanks to the diligent work of groups like the W3C, Rebooting Web of Trust, the Internet Identity Workshop, and the Decentralized Identity Foundation, all of which have built upon 20+ years of hard work from many different companies, organizations, and individuals. 
Today, with the emergence of the Verifiable Credentials specification, Decentralized Identifiers (DIDs), a Universal Resolver, and other important components of self-attesting digital credentials, the world has a set of tools and specifications which lay the groundwork for a growing consensus about methods and formats that can reliably assert a digital claim. Most importantly, these standards are not owned by any one vendor or institution, making them an infrastructure that enables open innovation. The W3C credential standards are analogous to TCP/IP or GPS: open protocols that enabled the internet and geolocation revolutions. Governments are playing an increasingly critical role in the verifiable credentials ecosystem by funding fundamental research. An important example of governments taking the lead in this way is the Silicon Valley Innovation Program, part of the U.S. Science & Technology directorate within the Department of Homeland Security. SVIP offers a variety of grants to help develop new technologies and accelerate their time to market. Today, Learning Machine is proud to announce that we have won Phase-1 funding for our response to the open call “Preventing Forgery & Counterfeiting of Certificates and Licenses through the use of Blockchain and Distributed Ledger Technology.” The purpose of the call was to develop vendor-neutral technology solutions that prevent the forgery and counterfeiting of official records for immigration, travel, visas, and other use cases pertaining to national and citizen security. Our grant application addressed DHS requirements by proposing an upgrade to the Blockcerts open standard, making it capable of issuing W3C Verifiable Credentials. 
The open-source reference implementation, targeted for 2020, will include: - Updating the Blockcerts schema to a Verifiable Credentials-based format - Updating the Blockcerts signature/verification scheme to conform to the latest JSON-LD signature suite format - Updating Blockcerts credential issuance and verification - Incorporating a cost-efficient DID method for issuers All of these upgrades to the Blockcerts open standard will also be included in Learning Machine’s SaaS product for issuing digital credentials. By becoming fully aligned with the W3C, Blockcerts (and, by extension, Learning Machine customers) will benefit from many security and feature upgrades. View the DHS Press Release about Learning Machine. Benefits The Blockcerts roadmap has always aimed to enable the issuance and verification of an ever-wider range of credentials, along with related privacy-enhancing measures. These are largely achieved by alignment with Verifiable Credentials and the Decentralized Identifier specifications, which promise the following benefits: More Flexibility Verifiable Credentials allows for flexible data schemas at its core, allowing for a wider range of credentials all backed by a greater range of security and privacy-protecting features. Greater Decentralization The use of DIDs removes the need to rely on issuer-hosted profiles and revocation lists, which creates unwanted dependency on the issuing institution. This enhances auditability of credentials and has many security benefits for key management. Most importantly, however, it ensures that credentials issued by an institution will continue to verify even if that institution no longer maintains its own hosting infrastructure–critical for the long-term ownership and verification of records across time and geographic boundaries. Improved Privacy and Security Improvements include: - New strategies help to avoid correlation of data between credentials. 
Currently, data aggregation is dangerous because even anonymized data can be correlated to individuals. Working together, the Verifiable Credentials and DIDs specifications make it much more difficult for any actor to correlate data without the data subject’s knowledge or consent. - Enabling the selective disclosure of credential data allows individuals to choose which data points they share with whom, rather than sharing an entire record that includes data that might not be relevant to the transaction at hand. This conforms to the principle of “data minimization,” a key component of self-sovereign identity. A Global Standard The W3C specification offers a world-wide data standard which catalyzes global alignment and thereby facilitates interoperability for all digital claims made on the web or shared peer-to-peer. At Learning Machine, we’re proud to help bring these standards into an open-source reference implementation at Blockcerts.org, as well as within the world’s leading commercial system for issuing and managing blockchain credentials. Our ability to translate these complex technology standards into convenient products will make it easy for governments, education providers, companies, and others to issue a full range of Verifiable Credentials.",https://www.hylandcredentials.com/future-proof,,Post,,Meta,,,,,,,,2020-01-01,,,,,,,,,,,,,
HylandCreds,Hyland,,,,Hyland,,,,,Study: Optimizing use of content is critical to enhancing customer experiences,"According to a new commissioned study conducted by Forrester Consulting, organizations recognize that “content is critical to improving the customer experience, but few are able to leverage its full potential.”","Forrester Study | March 2019 Content At Your Service: How modern content services platforms power digital transformation Study: Optimizing use of content is critical to enhancing customer experiences According to a new commissioned study conducted by Forrester Consulting, organizations recognize that “content is critical to improving the customer experience, but few are able to leverage its full potential.” If your organization struggles with using its content, you’re probably familiar with the roadblocks at the heart of the issue, like a lack of budget and difficulty migrating content from older systems. But with new technology trends changing the paradigm for user and customer interactions, your organization can’t afford not to pursue a better digital transformation strategy. What can you do? Forrester, a leading consulting and research firm, makes a case for using a content services approach to digital transformation in this new study. It will help your organization define what a successful content services strategy looks like through four competency pillars: - Agile adaptivity - Intelligent automation - Tailored solutions - Reimagining business models and processes Is your organization ready to experience higher revenue gains, while providing better experiences for employees and customers? Forrester offers insights and a set of key recommendations, so you will be ready to begin your content services-enabled digital transformation today.",https://www.hyland.com/en/learn/it-programs/forrester-content-at-your-service-wp,,Study,,Meta,,,,,,,,2023-01-01,,,,,,,,,,,,,
HylandCreds,WebOfTrustInfo,,,,RWot9; Learning Machine; Blockcerts,,,,,"Blockcerts v3 release, a Verifiable Credentials implementation","As the standards around Verifiable Credentials are starting to take form, different flavors of ""verifiable credentials-like"" data structures need to make necessary changes to leverage on the rulesets outlined and constantly reviewed by knowledgeable communities such as the W3C. The purpose of this paper is to identify all of the changes needed for Blockcerts to comply with the Verifiable Credentials (VCs) and Decentralized Identifiers (DIDs) standards and to expand upon the additional benefits of using a blockchain in combination with Verifiable Credentials. This paper is meant to act as an explainer in which a formal specification can be created.<br>This paper proposes multiple implementation options for several properties. The intention is that we can engage the Blockcerts / Verifiable Credential communities and see what fits best.",,https://github.com/WebOfTrustInfo/rwot9-prague/blob/master/draft-documents/BlockcertsV3.md,,Proposal,,Standards,,,,,,,"Blockcerts,Verifiable Credentials",2016-12-31,,,,,,,,,,,,,
HylandCreds,HylandCreds,,,,,,,,,A Passport to Greater Opportunity,"One of the earliest challenges of statecraft was developing a legible view of its populations. Translating local complexities into simple and summary descriptions was necessary to enable traditional state functions like taxation and planning. This need to describe impelled various standardization efforts, including permanent last names, land registries, and population surveys, which gave society a visible shape that could be centrally recorded and used within legal frameworks to wield state power.","A Passport to Greater Opportunity Verifiable digital credentials are a critical component of addressing global workforce challenges related to education and mobility — all while following the principles of Good ID. One of the earliest challenges of statecraft was developing a legible view of its populations. Translating local complexities into simple and summary descriptions was necessary to enable traditional state functions like taxation and planning. This need to describe impelled various standardization efforts, including permanent last names, land registries, and population surveys, which gave society a visible shape that could be centrally recorded and used within legal frameworks to wield state power. Migration has historically challenged these standardization efforts and contributed to the reasons governments sought to limit movement. However, as today’s nation-states transition from local industrial economies to a global digital economy, priorities are changing. Building a modern workforce that is competitive and attractive requires a citizenry empowered with digital tools, continuous skills development, opportunity-driven mobility, and the flexibility to compete on a global scale. The pull of this economy is already evident within the field of education. 
Today, nearly 5 million students travel outside their countries of origin to be educated, a number which is expected to grow to 7 million by 2030. HolonIQ’s visualization of UNESCO data below illustrates the complexity of this student flow between nations. What tools can help sustain this trend? At Learning Machine, we believe verifiable digital credentials offer a critical solution. Blockchains and Digital Identity Official credentials represent an important part of who we are and how we interact with the world. Specific domains like education and employment, as well as digital identity more broadly, are the realms in which people must build a record about themselves in order to access opportunities. However, today over a billion people globally still have no way to prove their identities. This is particularly daunting considering that every type of service in the 21st century–including access to government services–will have to be accessed digitally. The challenge ahead is to develop the wisdom and will to create new identity systems that are transformationally inclusive without being radically invasive. Historically, greater state control has often directly diminished citizens’ rights, a trade-off considered necessary, up to a point, to achieve administrative aims. Yet, recent massive breaches of personal data and trust have ignited a public demand for options that better protect personal privacy. The urgency to implement better identity systems has given rise to movements like Good ID and Self-Sovereign Identity. These initiatives advocate for approaches that enable individuals to reliably assert personal claims via a digital medium, without violating their own privacy, security, or ownership of data. The rise of public blockchains is notable because the technology breaks the old pattern of sacrificing rights for protection.
Decentralized blockchains simultaneously increase public security and individual privacy by assuring the authenticity of digital assets. Operating as a global notary, blockchains offer instant mathematical verification of digital asset ownership and integrity, like money or credentials. This results in a reduction of fraud while increasing efficiency, conserving time and money at a massive scale. Strong cryptography combined with public blockchains has created the technical infrastructure to make personal achievement legible and trustworthy, which is essential for training, recruiting and retaining a competitive workforce. By replacing paper records with blockchain-secured digital credentials (Blockcerts), society gains a faster, simpler, and more secure way to validate official records about our identity. Benefits Beyond Traditional Systems Some governments already have robust systems in place that make earnest attempts to respect personal data and provide robust verification, which is to be commended. Where Blockcerts adds additional value is in the following areas: - Decentralized Verification of Credentials. Rather than querying a vendor or government database, which could be hacked or taken down, Blockcerts queries a global blockchain directly to determine whether a credential is valid. This provides lifelong vendor-independent verification with the highest level of confidence. - Highest Available Digital Document Security. Blockcerts verification registers four things: 1) Whether a certificate has been tampered with in any way; 2) Whether it was actually issued by the authority it claims; 3) Whether it has been revoked by that authority and why; and 4) Whether a certificate has expired. - Simplified Verification. Rather than relying upon a bureaucratic process, Blockcerts allows relying parties to embed the open-source verification into their digital systems, or simply check the Blockcert using the Universal Verifier.
Blockcerts can also be verified by scanning a QR code displayed on the credential for in-person interactions, like at a job site, interview, or inspection. - International Portability and Verifiability. Blockcerts can be shared anywhere in the world for free and instant verification. The high level of security and standardized digital format (JSON) makes each Blockcert a “passport” to opportunity by creating trust between people who live and work in different institutions and geographies. All of these attributes work together to help prevent fraud, enhance operational efficiency, and empower participants to be owners of their official records for a lifetime. Summary We now have more tools than ever before to overcome global social challenges. Many of these tools derive from the power of the centralized state to manage and effect change. Other tools, such as decentralized blockchains, offer instant and cross-border verification of important records. It is the strategic blending of these tools that enables real progress. Historically, many state-initiated attempts at social engineering have failed, at times catastrophically. James C. Scott argues in Seeing Like a State that the worst disasters occur at the confluence of four factors: strong state power, rigid idealism, authoritarian regimes, and a passive society. His warning implies that well-intended initiatives should build in protections by activating society with tools for personal power, growth, and engagement. As we stand at the edge of instrumenting our world with digital tools, we should keep in mind that the instruments we use also shape our view of the world. Verifiable credentials enabled by blockchains are instruments that encourage positive social change by aligning state power with individual success. That is a vision of a world worth pursuing.",https://www.hylandcredentials.com/passport-greater-opportunity/,,Post,,Standards,,,,,,,,2020-01-01,,,,,,,,,,,,,
IBM,,IBM,,Thomas J. Watson,DIF; SecureKey; Indy; Sovrin Foundation; Mooti,"USA, New York, Armonk",USA,,,IBM,"We bring together all the necessary technology and services, regardless of where those solutions come from, to help clients solve the most pressing business problems.","IBM joined DIF because we believe it will take open community and standards to achieve the vision of self-sovereign identity. For example, members of DIF are focused on the establishment of an open web platform standard within the W3C standards organization called Decentralized Identifier (DID). A DID will provide a standard global resource naming scheme for identity.<br><br>There is a global Internet standard for naming resources called a uniform resource identifier or URI. When you type https://www.IBM.com into your browser, a URI ensures you always end up at IBM’s website. Similarly, we need one standard to identify an individual, as well.<br>",https://www.ibm.com/us-en/,,Company,,Company,Enterprise,IT,,,,,Verifiable Credentials; DID,1911-06-06,https://github.com/IBM,https://twitter.com/IBM,https://www.youtube.com/ibm,https://www.ibm.com/blogs/blockchain/,https://www.ibm.com/blogs/blockchain/feed/atom/,,https://www.crunchbase.com/organization/ibm,https://www.linkedin.com/company/ibm/,,,,,
IBM,IBM,,,Philip Duffy,,,,,,Building a digital trust ecosystem for mining in British Columbia,"The Mines Digital Trust Ecosystem wallet uses verifiable credentials which are enhanced digital versions of physical credentials. The Mines Digital Trust Ecosystem is built on technology that is highly transparent, secure, tamper-proof, and immutable. From the moment information is stored, it cannot be changed. Credentials can be revoked and re-issued as business processes dictate.","Share this post: Responsible practices to preserve our planet require innovation, agility, and collaboration. Consumers, investors, producers, and governments around the world are choosing to do business with those that demonstrate a commitment to sustainability. In the mining sector, British Columbia is committed to increased transparency and trust related to where products come from and how they are produced. This includes provenance related attributes for supply chain, tracing, and environmental, social and governance (ESG) reporting. “While there is tremendous progress already underway in this space [Responsible Sourcing Blockchain Network]” says Alex Kaplan, Global Leader for IBM Digital Credentials. “What I’m most excited about is what comes next and where we could go together.” Charting the course The government of British Columbia is leading the way by creating a digital service and convening an ecosystem that brings together producers, purchasers and investors of those raw materials to scale trusted credentialing in the mining space. As part of this initiative the government is convening the digital trust ecosystem led by BC’s Ministry of Energy, Mines, and Low Carbon Innovation (EMLI). In partnership with broader digital trust efforts from the BC Ministry of Citizens’ Services, there is extensive digital trust work taking place within the province. 
Learn how IBM Blockchain helps government agencies respond to new disruption Blockchain technology is part of the core infrastructure of this initiative because it is a catalyst for sustainable development, as it enables the trusted exchange and automation of minerals data across all participating members. Leveraging the technical and consultative expertise of IBM Blockchain, a pilot digital trust ecosystem is being activated that will allow BC natural resource producers to share verifiable evidence of where materials came from and the producer’s certified sustainable environmental, social and governance (ESG) practices including the reduction of greenhouse gas emissions. In addition, IBM and EMLI are partnering to create a long-term vision of how the technology and ecosystem will address market needs and a governance model to accelerate future adoption. The founding members of the digital trust community will be working together over the coming months to build a governing charter for the ecosystem and its process, support onboarding, and expand the services. Making it real: Digital credentials in action This collaboration will use the existing OrgBook BC service. OrgBook BC started in 2017 as an early collaboration and exploration with IBM and Digital ID & Authentication Council of Canada (DIACC) around registries data, then evolved to begin using verifiable credentials, leveraging Hyperledger Aries and Indy technologies. The BC government and IBM helped found and contributed to the Trust Over IP (ToIP) Foundation focused on digital trust. ToIP launched in May 2020 as a confluence of efforts in the digital identity, verifiable credential, blockchain technology, and secure communications spaces to converge and create an interoperable architecture for decentralized digital trust.
“Simply put,” says Ryan Forman, Executive Director, Strategic Initiatives Branch, EMLI, “the province of BC is leveraging their investment in open source distributed ledger technology, involvement in the ToIP, and industrial emissions data to enable mining operators to easily share third-party verified information about company performance.” The vision is to enable multiple sectors of the economy to contribute credentials and provincially-held data, going well beyond just provincial data. The digital ecosystem is in the early-adopters stage, and IBM, together with the province of BC, is working with an international advisory committee to develop the strategy and approach. BC has been designing an enterprise-grade BC Wallet for Business, which is a first for a government to establish for the business community. This will enable the province to issue credentials directly to companies in BC, providing self-sovereign control of the data to mining operators. The Mines Digital Trust Ecosystem wallet uses verifiable credentials which are enhanced digital versions of physical credentials. The Mines Digital Trust Ecosystem is built on technology that is highly transparent, secure, tamper-proof, and immutable. From the moment information is stored, it cannot be changed. Credentials can be revoked and re-issued as business processes dictate. Moving forward with mining in BC Leveraging technology like blockchain gives mining operators, regulators, and customers the potential to get their greenhouse gas reductions verified and share those credentials in a way that can be trusted. But the technology alone is not enough. In order for this ecosystem to become a viable solution adopted beyond the pilot phase, and championed by its ecosystem participants in the market, it will require both a long-term vision of how the technology will address market needs and a governance model that allows for future growth.
EMLI is partnering with IBM to engage with the founding members of the mines digital trust community, building a governing charter for the community and its process. This partnership will also support onboarding and expand the wallet services. “I’m truly excited to be part of this important initiative that clearly demonstrates BC’s leadership and commitment to supporting leading-edge innovation in lowering the carbon footprint of our natural resource industries,” says Gerri Sinclair, BC’s Innovation Commissioner. I personally look forward to sharing more over the coming months as we co-develop a governance strategy that addresses the business, operational, legal and technology aspects of the Mines Digital Trust Ecosystem. Please tune into the demonstration of this work which will be part of the United Nations Global Innovation Hub at the COP26 conference where the current state interoperability solution will be demonstrated. Blockchain solutions that transform industries Join blockchain innovators who are transforming industries around the world. Let’s put smart to work. Find your blockchain solution",https://www.ibm.com/blogs/blockchain/2021/11/building-a-digital-trust-ecosystem-for-mining-in-british-columbia/,,Post,,Ecosystem,Public,,,Physical Creds,,,,2021-11-02,,,,,,,,,,,,,
IBM,IBM,,,Kal Patel,,,,,,Moving forward with Digital Health Passes,"We envision a future that will include multiple Health Pass solutions, giving organizations and consumers the ability to choose which to utilize. This is why my team and I have put an emphasis on the interoperability of our solution. In addition, easy communication between state and federal health systems will reduce necessary investment and increase access to Digital Health Passes. In the near future we envision a user from any state being able to use their Health Pass in New York or any other state of their choice.","Share this post: Having two daughters as nurses during the early stages and height of the pandemic made for tough weeks and months for myself and my family. The uncertainty of the virus and the inability to secure PPE for my daughters was a time I hope to never relive. The past year of course has brought unforeseen changes to daily life as we knew it for all of us. Effects of the COVID-19 pandemic span the globe, and no one has been untouched by the impact of this disease. Fortunately, the last five months has brought hope, with vaccines from multiple companies proving to be highly effective and the distribution of doses steadily ramping up. Both private and public sectors have pivoted and worked diligently to assuage the challenges we have all endured. With a significant portion of the U.S. population vaccinated and many more having received their first dose at this point, we must now shift our thinking to how we can responsibly and efficiently reopen economies. This will be crucial in getting citizens back to work, enabling business to take advantage of pent-up demand, and ultimately restoring economic prosperity to the many who have been financially impacted by COVID-19. 
Learn how innovative companies and individuals use blockchain for social good Impact and moving the needle I have been a technologist my whole life, and the capabilities of new industry and public partnerships in helping humanity continue to amaze me. Technology has played a crucial role in assisting schools and offices to swiftly transition to remote work and education. It’s now time for us to take advantage of these advancements once again, by responsibly bringing back employees, students, and consumers. I am the IBM Delivery Executive for the NY Digital Health Pass — Excelsior. Digital Health Passes are powerful technology-enabled solutions that can help restore normalcy for society. IBM’s Digital Health Pass, underpinned by blockchain and leveraged in NY, sits at the nexus of data security and healthcare. Users are able to verify their health status without sharing personal health data with any third party. There has been tremendous hype around Digital Health Passes or “vaccine passports”, but we have converted vision to reality. Our partnership with the state of New York has brought Excelsior Pass, a New York State branded Health Pass, to the market; it is the first state Health Pass to be rolled out in the United States. Making vision a reality with Excelsior Pass Imagine the streets and restaurants of New York City humming again. Baseball stadiums refilled with fans for summer games, and Broadway back to entertaining the flood of tourists and New Yorkers that flock to see iconic shows. If you live in NY, the Excelsior Pass could be the ticket to this and much more in the summer of 2021. The state of New York contracted IBM to implement a Digital Health Pass to aid state officials in expediting the reopening of businesses. Over the last two months my team and I have worked to make this a reality. And it’s been amazing to work on a solution that will likely help millions of people and businesses.
This undertaking sits at the intersection of healthcare and technology, two sectors I am deeply passionate about. The Excelsior Pass launched on 25 March, making the tool available to all 19 million New Yorkers and all businesses across the state. The application has three components: the portal, wallet, and scanner. Those who have been vaccinated or tested can visit the portal, a website for users to receive their Excelsior Pass. After they have successfully received their pass, they can download the NYS Excelsior Pass Wallet application to their smartphone as a place to store their valid credential. The credential is what people will show to businesses accepting Excelsior Pass and looks quite similar to an airline boarding pass. The third and final component is the scanner. Businesses can download the NYS Excelsior Pass Scanner to a smartphone or tablet and scan the QR codes of patrons’ passes to verify they have been tested or vaccinated. In order to ensure equitable access to the application, the verifiable and tamper-proof passes can be presented either digitally on a smartphone or printed to ensure the pass is genuine and the holder’s current COVID-19 health status complies with necessary guidelines. The use of this tool is optional, but we hope the emphasis on data security will bring confidence to business owners and citizens of New York in using this powerful tool to help jumpstart the economy. Moving forward with Digital Health Passes Seeing an initial positive response from the market gives promise that the implementation of Health Passes is a viable route to restarting economies. Moving forward, as other states start to roll out their own Health Pass solutions, it will be crucial to ensure interoperability between platforms. The emphasis on interoperability will allow other states’ Immunization Information Systems (IIS) to seamlessly connect with Excelsior Pass or any other digital health pass.
We envision a future that will include multiple Health Pass solutions, giving organizations and consumers the ability to choose which to utilize. This is why my team and I have put an emphasis on the interoperability of our solution. In addition, easy communication between state and federal health systems will reduce necessary investment and increase access to Digital Health Passes. In the near future we envision a user from any state being able to use their Health Pass in New York or any other state of their choice. The expansion doesn’t stop there, as more countries begin to implement similar solutions and international travel begins to pick up, this emerging Digital Health Pass ecosystem will continue to grow. My team sees our technical expertise and network design framework (including proven governance and incentive models) as a key differentiator in helping governments and organizations build and fine tune their programs. The implementation of Excelsior Pass in New York and other emerging health pass systems will help create a standard for vaccine and test verification. This will give companies, schools, and businesses the option to utilize a standard system as opposed to having to build and customize their own. Ultimately, this will result in significant savings for any organization toying with the idea of implementing a health pass system. As we move towards the hopeful end of the pandemic, we are excited to see the impact and value digital health passes will bring to empower all facets of the economy to fully restart. I’m optimistic about the future and am looking forward to the days when I can get back on the road, meet with my colleagues, and continue to solve the world’s problems through technological solutions. Blockchain healthcare and life sciences solutions Tackle issues of trust, transparency and data integrity with blockchain-based networks and solutions. 
Helping you build trust in our healthcare system",https://www.ibm.com/blogs/blockchain/2021/05/why-digital-health-passes-are-the-smart-and-responsible-way-forward/,,Post,,Ecosystem,Public,,COVID,,,,"DID,Verifiable Credentials",2021-05-11,,,,,,,,,,,,,
|
||
IBM,NYTimes,,,,,,,,,New York’s Vaccine Passport Could Cost Taxpayers $17 Million,"The state’s contract with IBM details a Phase 2 of the Excelsior Pass, which could include uses that some advocates say raise privacy concerns.","New York’s Vaccine Passport Could Cost Taxpayers $17 Million The state’s contract with IBM details a Phase 2 of the Excelsior Pass, which could include uses that some advocates say raise privacy concerns. New York officials introduced the Excelsior Pass app earlier this year as the country’s first government-issued vaccine passport, designed to help jump-start the state’s economy. But newly obtained documents show that the state may have larger plans for the app and that the cost to taxpayers may be much higher than originally stated. The state’s three-year contract with IBM — obtained by an advocacy group and shared with The New York Times — to develop and run the pass establishes the groundwork for a future where at least 10 million people in the state would have an Excelsior Pass. It would provide them with a QR code that would not only verify their vaccination status but could also include other Personal details like proof of age, driver’s license and other health records. The total cost could end up being as high as $17 million, much more than the $2.5 million the state had initially said it cost to develop the program. The contract also requires IBM to deliver to the state a “road map” to scale the digital health pass to 20 million individuals — the entire population of New York. The ambitious vision contrasts with the limited uses for the pass that the state has so far described to residents. Roughly two million New Yorkers have downloaded the pass as of Monday, the state said, up from 1.1 million two weeks before. Tens of thousands of people who want passes, the state said, have been unable to download them because of a variety of technical delays, user mistakes and data entry errors. 
The contract estimates that two-thirds of the adult population of the state will download passes by 2024. The contract also reserves $2.2 million for the optional implementation of a Phase 2 of the project, the nature of which is not disclosed. The state expects that the federal government will reimburse all funds. Vaccine passports have become a political flash point in the nation’s recovery from the virus, with some states, including Georgia, Alabama, Arizona and Florida, banning their use over concerns regarding the sharing of Personal information. But New York has taken a different approach. Gov. Andrew M. Cuomo approved the contract under pandemic emergency powers he was granted by the Legislature last year, which allowed him to skirt normal procurement laws. Since the contract was signed in March, legislators have scaled back the governor’s powers. Thousands are already flashing the pass at Yankees and Mets games and comedy clubs in New York, as well at the door of a small number of bars and restaurants, to prove their vaccination status or show recent test results. 
The program is voluntary and optional: Paper cards, the state has said, must also be accepted as proof of vaccination. The pass may also become largely obsolete when the state makes most virus restrictions optional in the coming weeks. But there are growing concerns among civil liberties and government watchdog groups that those without smartphones, and those who cannot or do not want to get a pass, will not have equal access to whatever uses will exist for the app. Albert Fox Cahn, the executive director of the Surveillance Technology Oversight Project, which filed the Freedom of Information Request to obtain the contract, said that he felt it was “indefensible” that the state had only publicly disclosed the initial cost of the Excelsior Pass, when the contract reveals the framework for a more ambitious, costlier effort. The minimum cost of the contract is almost $10 million over the three years. The state can cancel the contract for “convenience or cause” with a month’s notice at any point. “Given the millions they plan to invest in expanding it, and the three-year term of the contract, we have to ask, what comes next?” Mr. Cahn said. “Because this is such a charged issue, the state should be going above and beyond the level of transparency we normally use in government procurement, not trying to lower the bar.” Freeman Klopott, a state budget office spokesman, said that the state had only discussed the $2.5 million initial cost because that was what it has already spent. “Like many state contracts, this contract includes options that the state may or may not undertake, including additional budget capacity that may not be needed,” he said. Mr. Klopott added that the state was negotiating the scope and cost of Phase 2 of the Excelsior Pass with IBM. 
And he added that the contract’s estimate of 10 million passes by 2024 was not a forecast, but rather “the highest usage threshold established for pricing.” The state did not specify what the plans were for Phase 2, but officials have said they are looking at ways to increase the app’s use. The contract, between IBM and the New York State Office of Information Technology Services, runs from Jan. 25, 2021, to March 19, 2024. In addition to a fixed $2.5 million in development costs, it estimates that New York will pay IBM up to $12.3 million in licensing fees over three years and $2.2 million for a second phase of development. The state will pay IBM at least $200,000 monthly in licensing costs for the next three years, which includes the right to upload two million passes. Those costs will go up as more passes are downloaded. The contract estimates that five million people will be using the pass by the end of the second year, for a monthly fee of $350,000. Ten million people, it estimates, will have passes by the next year, costing the state $600,000 monthly. The current scope of the pass is limited: It primarily holds vaccine information and recent Covid-19 test information for people who had vaccines or tests in New York State, though New Yorkers can ask their doctors to add out-of-state vaccinations to the registry. It does not work for those who had their vaccines out of state, or at a veteran’s facility or other federal site. But Eric Piscini, the vice president of emerging business networks at IBM, said in a recent interview that the state was considering broader uses. He said discussions were underway to expand the pass into a broader digital wallet that could store driver’s license information, other health information and more. He also said that other states and foreign governments were exploring ways to integrate the Excelsior Pass into their own verification systems, as New York seeks to include records from other states in its system. 
The state and IBM have both said that the app does not track location information, and that venues that scan the app do not retain any identifiable information about visitors. But much about the privacy protections in the app remains unclear, including the exact nature of the “blockchain technology” that IBM pledges is securing New Yorkers’ Personal information. As a result, dozens of lawmakers, along with the New York Civil Liberties Union, are backing legislation that would explicitly protect vaccine passport information from law enforcement and federal immigration authorities and ensure that people who can’t get vaccinated for medical reasons can still participate in society. The bill is expected to come to a vote before the end of the legislative term. “We have to remember that the vaccine is huge, it’s a game changer, for public health and our ability to reopen,” said Donna Lieberman, the executive director of the N.Y.C.L.U. “But it can’t be a game changer that ignores the inequities that are built into health care or that ends up destroying our privacy by creating private and government databases of highly Personal information.”",https://www.nytimes.com/2021/06/09/nyregion/excelsior-pass-vaccine-passport.html,,Post,,Ecosystem,Public,,COVID,,,,"DID,Verifiable Credentials",2021-06-09,,,,,,,,,,,,,
|
||
IBM,IBM,,,Glenn Finch,IBM Digital Health Pass,USA: NYS,,,,Opening New York State for business with the power of blockchain,Excelsior Pass Plus expands travel and commerce opportunities for New Yorkers by enabling compatibility with New York State’s Excelsior Pass platform which has generated three million passes since its launch in March that provide digital proof of vaccination or a negative test result.,"Share this post: What excites me the most about being part of the team at IBM is the work we do for our clients that truly makes a difference in individual lives and provides for smarter and safer interactions with each other and our planet. The urgency to reopen all areas of the economy safely as we navigate the global pandemic is a recent example. People are eager to get back to gathering with others and doing all the things that are part of daily life — from going to the office, restaurants, sporting events and concerts, to traveling within the U.S. and abroad. So, they need an easy, trusted way to show proof of vaccination. That’s why I’m so excited to share with you that recently we were privileged to be part of a First-of-a-Kind partnership that launched Excelsior Pass Plus across the state of New York to support the safe and secure return of tourism and commerce to New York. Trusted proof of vaccination Excelsior Pass Plus expands travel and commerce opportunities for New Yorkers by enabling compatibility with New York State’s Excelsior Pass platform which has generated three million passes since its launch in March that provide digital proof of vaccination or a negative test result. New Yorkers will be able to display their Excelsior Pass Plus at hundreds of businesses and locations. This includes Broadway theatres, Madison Square Garden, Barclays Center, Yankee Stadium, and many other major venues that require proof of vaccination, as well as when traveling to areas where SMART Health Cards are accepted. 
Accelerate your COVID-19 response with new levels of trust and visibility Excelsior Pass Plus is a result of the strategic partnership between New York State and a coalition of public and private organizations which will enable New Yorkers to safely access and retrieve a verifiable, secure digital copy of their COVID-19 vaccination record using the SMART Health Cards Framework — making their interstate and international travel and commerce experiences safer, contactless and more seamless. Digital credentials with blockchain Health credentials — another term for “health cards” or “health pass” — are easier for everyone to work with when they’re digital and there’s no worry of damaging, tampering with or losing a paper card. Perhaps an even more compelling reason to go digital is the recent rise in fraudulent vaccination cards being intercepted by U.S. federal agents. Digital credentials are the answer — but this highly Personal information has to remain private and secure. So, organizations are turning to blockchain as a proven way to enable a secure and trusted digital credentials platform and improve services. New York State’s Excelsior Pass Plus leverages IBM Digital Health Pass powered by IBM Digital Credentials, a blockchain-based platform that anchors digital credentials in trust and provides individuals and organizations with the core capabilities they need to securely issue, manage and verify digital credentials. Proof of vaccination or a negative test result is auditable, traceable and verifiable — in seconds. Protecting their privacy, individuals remain in control of their own Personal data that they store in an encrypted wallet on their smartphone and share, at their choosing, with an organization through a secure QR code as trustworthy proof of health status. New Yorkers now have a better way to move forward and do what’s important to them, with confidence their credentials are safe and valid. 
Meanwhile, the open, secure architecture of IBM Digital Credentials allows other states to join the effort based on their own criteria for ultimate flexibility and interoperability. The result? A foundational platform to help create a secure and interwoven ecosystem enabling governments, businesses and people nationwide to get back to business smarter, and safer. Learn more about how to navigate the digital world confidently with IBM Digital Credentials. Blockchain solutions that transform industries Join blockchain innovators who are transforming industries around the world. Let’s put smart to work. Find your blockchain solution Find your blockchain solution",https://www.ibm.com/blogs/blockchain/2021/08/opening-new-york-state-for-business-with-the-power-of-blockchain/,,Post,,Ecosystem,Public,,COVID,,"Excelsior Pass, Smart Cards",,Verifiable Credentials,2021-08-24,,,,,,,,,,,,,
|
||
IBM,IBM,,,Anthony Day,,,,,,3 key areas of enterprise blockchain adoption in 2021,"Government policies vary on the topic, standards are only just starting to emerge, and citizens and enterprises are rightly focused on preserving privacy and equality with our national and international responses. IBM is supporting countries like Germany as well as the State of New York to issue trusted, privacy-preserving credentials.","Share this post: Many businesses are seeing the COVID-19 pandemic as a watershed for technology and innovation investment. Technology budgets have been reprioritised with a laser focus on near-term return on investment as a necessity for most. At the same time remote working, lockdowns and supply chain challenges have accelerated digital transformations that otherwise might have taken a decade to achieve. Organisations also face mounting pressure to enhance their sustainability and ESG performance to such a level that small, incremental change will not be sufficient. So where does this leave the role of blockchain? We see enterprise and government clients focusing on the following three areas: provenance, identity and tokenization, and at a recent Blockchain Opportunity Summit we learned about three contemporary examples of just how blockchain technology can help to address some of the world’s most challenging issues. Let’s take a look. Register for the Blockchain Opportunity Summit Provenance — Groupe Renault’s XCEED compliance platform Automotive supply chains are about as complex as it gets, with large OEMs needing to manage a global, multi-tiered network of suppliers and yet maintain visibility and adherence to an ever-increasing array of standards. 
Recently Groupe Renault along with Faurecia, Knauf Industries, Simoldes, and Coşkunöz, in association with IBM, have announced a new partnership to scale XCEED (eXtended Compliance End-to-End Distributed), a blockchain-based platform that can trace the compliance of thousands of parts assembled in a vehicle in near real time. The initial focus countries will be France, Spain and Turkey, but the platform is open to any OEM. It is easy to onboard suppliers of any size, and protects companies’ confidentiality, intellectual property and data ownership while ensuring Renault, its customers and regulators can get full transparency of parts and materials used across the life of a vehicle. It’s a far cry from leafy greens and other food supply chain applications, showing the successful application of blockchain in increasingly complex supply chain use cases. Identity — IBM Digital Health Pass COVID has escalated the consideration of digital health data and self-sovereign identity to a level never seen previously. Having worked with governments, airlines, sports and entertainment venues, large employers, academia, and many others over the last 12+ months, it is clear that verification of health credentials is a highly challenging and controversial topic. Government policies vary on the topic, standards are only just starting to emerge, and citizens and enterprises are rightly focused on preserving privacy and equality with our national and international responses. IBM is supporting countries like Germany as well as the State of New York to issue trusted, privacy-preserving credentials. It’s also important to note that “health passports” exist on a spectrum of sophistication and in most cases do not yet include tethering to a verifiable Personal ID capability, so a second form of identity is required alongside the certificate to authenticate the holder. 
Furthermore, many national solutions (and public perceptions) are focused on vaccination certifications, where we need to be looking broader to include testing, proof of recovery or other methods to allow for inclusion of those who haven’t, won’t or can’t be vaccinated against COVID-19. Public-private partnerships will be essential if we are to achieve this at speed and scale. Tokenization — IPwe’s marketplace for Intellectual Property (IP) The recent announcement with IPwe was exciting for a number of reasons. IP management and patents are complex domains that, while relatively democratised in terms of the application process, suffer from a significant amount of manual and legal effort to manage. Here’s where the tokens come in: IPwe sees significant potential in issuing patents as non-fungible tokens (NFTs) to allow them to be more easily sold, traded, commercialised or otherwise monetised, bringing new liquidity to this asset class for investors and innovators. This should also make the transaction process far simpler across a potential global audience. For small businesses, representing patents as digital assets is particularly powerful because it allows IP to be treated as collateral or assurance of an organization’s value, also allowing it to be more easily leveraged when seeking funding — in the “traditional” finance world, or even in the world of DeFi (decentralised finance). This initiative is also going to require a significant focus on interoperability between different NFT marketplaces which operate on different blockchain protocols. I’m genuinely excited to see this partnership and marketplace progress. Finally, if you’re looking for some light-hearted listening on the topic of tokenization, I got together with a few blockchain leaders from a range of disciplines to talk about the good, the bad and the very, very ugly of the NFT space today. You can check it out here. 
Turning strategy into business outcomes IBM Blockchain Services can help bring your ideas to life. Explore the use of blockchain and digital assets in your business. Connect with the blockchain experts",https://www.ibm.com/blogs/blockchain/2021/04/3-key-areas-of-enterprise-blockchain-adoption-in-2021/,,Post,,Explainer,,,,,,,"DID,Verifiable Credentials",2021-04-03,,,,,,,,,,,,,
|
||
IBM,IBM,,,Anouk Brumfield,,Global,,,,Automating workplace vaccination verification — a path out of the pandemic,The Department of Labor’s Occupational Safety and Health Administration (OSHA) recently released a rule on requiring all employers with 100 or more employees to ensure their workforce is fully vaccinated or require any workers who remain unvaccinated to produce a negative test result on at least a weekly basis before coming to work. This rule impacts ~80 million workers — every company in the S&P 500,"Share this post: Workplace vaccination mandates are coming for employers. In the United States, the Department of Labor’s Occupational Safety and Health Administration (OSHA) recently released a rule on requiring all employers with 100 or more employees to ensure their workforce is fully vaccinated or require any workers who remain unvaccinated to produce a negative test result on at least a weekly basis before coming to work. This rule impacts ~80 million workers — every company in the S&P 500 and most companies in the Russell 2000. By now, we all know that implementing a workplace vaccination policy requires balancing employee privacy with responsible return to work employer initiatives. Policy that makes sense, keeps things simple and addresses questions like: - What should I accept as proof of vaccination? - How do I know if it’s valid? - How do I make this process as simple as possible for my workforce and visitors? - How should I manage requests for medical and religious exemptions? - How do I keep up with changing requirements for booster shots & the growing list of approved vaccines? Transforming digital identity into trusted identity Getting ready for the mandate We have built a new verification solution (Workplace Credentials) to help employers quickly collect and validate vaccination credentials, process exceptions and religious/medical exemptions to support their unique return to workplace processes and privacy policies. 
It automatically calculates a score of 0 to 100 for each submitted proof of vaccination based on employer specific policies and can operate stand alone or integrated with workplace applications like Workday, PeopleSoft, Work.com and ServiceNow. This solution is already in use by companies and government in the United States and Canada. Verification solution in action To explain how this works, I would like to give you an example. Sarah, who is the HR manager and her team define the workplace policy for vaccinations (1) and configures the application rules to reflect government guidance and company policy (2). Michael, who is the employee of this company receives an email from his employer explaining the policy to return to the workplace and need to provide proof of vaccination (3). Michael will then sign in with his workplace username and password and is directed to the application (4) and enters information about the vaccination he received and uploads his proof document which can include CDC card, record from the state Immunization Information Systems (IIS), digital SMART health card (5). Once his credentials have been evaluated informing him that he is all set to return to the workplace he receives an email (6), and the HR and badging systems are updated to reflect his status (7). As the employer’s HR manager, Sarah receives regular progress updates as employees submit their proof of vaccination (8) On the journey, together Of course, no two companies are alike, and this kind of flexibility is needed in workplace vaccine verification policies so employers can specify what’s important to them in determining an overall score, which of course can vary in the places around the world that they operate in. 
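To make the policy-driven scoring idea concrete, here is a minimal sketch of how a 0-to-100 score could be computed from employer-defined rules. Every field name, weight and rule below is invented for illustration; this is not the actual Workplace Credentials scoring logic.

```python
# Hypothetical sketch of policy-based credential scoring (0-100).
# All rule names and weights are illustrative, not IBM's implementation.

def score_proof(proof: dict, policy: dict) -> int:
    """Score a submitted vaccination proof against employer policy rules."""
    score = 0
    # Accepted document types carry different weights (e.g. a digitally
    # signed SMART Health Card is stronger evidence than a photographed card).
    score += policy["doc_type_weights"].get(proof["doc_type"], 0)
    # Approved vaccine brand and completed dose count each add points.
    if proof["vaccine"] in policy["approved_vaccines"]:
        score += policy["vaccine_points"]
    if proof["doses"] >= policy["required_doses"]:
        score += policy["dose_points"]
    return min(score, 100)

policy = {
    "doc_type_weights": {"smart_health_card": 50, "iis_record": 40, "cdc_card": 25},
    "approved_vaccines": {"Pfizer-BioNTech", "Moderna", "Janssen"},
    "vaccine_points": 25,
    "required_doses": 2,
    "dose_points": 25,
}
proof = {"doc_type": "smart_health_card", "vaccine": "Moderna", "doses": 2}
print(score_proof(proof, policy))  # 100
```

Because the rules live in the policy object rather than the code, an employer could tighten or relax requirements — booster doses, newly approved vaccines, jurisdiction-specific thresholds — without touching the scoring logic, which mirrors the configurability described above.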
What I hear as well in our client discussions, is a need for speed and simplicity in deployment, not only for an employer’s responsible workplace practices and peace of mind, but also to achieve compliance in their ability to do business with their clients, especially government. Our implementation time for the above solution is approximately three weeks, so really another key reason I was energized to tell this story. But there’s so much more to share — if you’d like to chat with our teams driving this work today to see if there might be a fit for your organization, we’re ready. Turning strategy into business outcomes IBM Blockchain Services can help bring your ideas to life. Explore the use of blockchain and digital assets in your business. Connect with the blockchain experts",https://www.ibm.com/blogs/blockchain/2021/11/automating-workplace-vaccination-verification-a-path-out-of-the-pandemic/,,Post,,Explainer,,,COVID,,,,"DID,Verifiable Credentials",2021-11-11,,,,,,,,,,,,,
|
||
IBM,IBM,,,Tim Olson,,Global,,,,Blockchain for trusted security labels,"A blockchain-based self-sovereign identity (SSI) network in conjunction with W3C verifiable credentials would provide an open, governable, system-independent means of issuing, holding, presenting, and verifying trusted security labels for any entity at scale — person or non-person. These blockchain-based security labels may be used by both MLS and non-MLS systems as a trusted basis for access control and authorization decisions to reduce risk exposure.","Share this post: Blockchain makes it possible to securely and at-scale identify and label any subject and object entity with cryptographically verifiable security credentials. When literally everything is labeled with verifiable, authoritative, machine-readable security credentials (such as classification level, access category and others), multi-level security (MLS) systems can enforce mandatory and discretionary access controls and other MLS-specific isolation. They can also audit policies that enable information of different classifications and access categories to be stored, accessed, and processed on shared storage, compute, and networking infrastructure while simultaneously assuring the data and other resource objects are only accessed by authorized subjects. Trusted security labels reduce infrastructure costs, promote assured information sharing, and provide a means to comply with ever-expanding data privacy and security rules and regulations. 
Learn more about blockchain today The problem: Shared infrastructure and unlabeled data elevates security risk exposure As businesses look to cut costs and increase efficiencies by migrating their applications to the cloud, digitizing their operations, making data-driven analytics-based decisions, and monetizing their data, they increase their security risk exposure by: - Multi-tenant cloud infrastructures that share compute, storage, and networking resources amongst multiple different organizations - Multiple incompatible classifications of data collected, processed, stored, and accessed. Different classifications such as Personally identifiable information (PII), public, sensitive, confidential, proprietary and others, require different storage, handling, audit and access controls. - Proliferation of data protection controls, audit requirements and non-compliance penalties - Expanding digital business networks — partnering with organizations and service-providers of unknown or uncertain security risks. Can they be trusted to protect your shared data? Historically, the government and other risk conscious industries that generate and handle highly classified and sensitive data, have relied on secure computing platforms and multi-level security (MLS) systems to facilitate secure sharing of data. A foundational security control for MLS is OS-level mandatory access control (MAC) that enforces security access policies using security labels applied to all system resources and users. By comparing the security label of the accessing subject to the accessed object, the OS either allows or denies access. Using MAC and other MLS-specific security controls, data of different classifications and access categories can be co-located on the same storage, compute, and network infrastructure yet subjects are only able to see and access appropriately labeled objects. All object accesses are logged and auditable. 
But MLS is complex and difficult to implement and maintain for a number of reasons including: - Modern systems are large and complex. The number of objects and subjects and their potential interaction combinations makes it difficult to create and maintain labels and policies using traditional OS-provided utilities. Technical documentation recommends minimizing the number of categories and labels for performance reasons. - Modern systems are networked — they don’t work in isolation. They need to work with subjects and objects located remotely. But availability and trustworthiness of external entities and externally supplied labels is suspect. - A high degree of reliance and responsibility on the system security administrator who is operating all these utilities and defining these labels and policies. Lack of oversight and governance to ensure labels and rules are correct and properly applied and maintained. - Policies and labels are simplistic, and rule based. For example, security labels for IBM Z Systems are limited to the association of a hierarchical security level, and zero or more non-hierarchical associated categories. The labels are just system-level attributes — no source attribution or digital signature. The blockchain-enabled trusted security labeling solution A blockchain-based self-sovereign identity (SSI) network in conjunction with W3C verifiable credentials would provide an open, governable, system-independent means of issuing, holding, presenting, and verifying trusted security labels for any entity at scale — person or non-person. These blockchain-based security labels may be used by both MLS and non-MLS systems as a trusted basis for access control and authorization decisions to reduce risk exposure. 
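The label comparison at the heart of mandatory access control can be illustrated in a few lines. This is a generic sketch of the classic dominance check — hierarchical level plus non-hierarchical category set — not the labeling scheme of any particular MLS system:

```python
# Illustrative MAC dominance check: access is allowed only when the
# subject's hierarchical level is at least the object's AND the subject
# holds every category attached to the object. Level names are examples.

LEVELS = {"UNCLASS": 0, "CONF": 1, "SECRET": 2, "TS": 3}

def mac_allows(subject_label, object_label) -> bool:
    s_level, s_cats = subject_label
    o_level, o_cats = object_label
    return LEVELS[s_level] >= LEVELS[o_level] and o_cats <= s_cats

# A TS-cleared subject with category PROJX may read an unclassified file...
print(mac_allows(("TS", {"PROJX"}), ("UNCLASS", set())))  # True
# ...but a CONF-cleared subject may not read a SECRET file.
print(mac_allows(("CONF", set()), ("SECRET", set())))     # False
```

The complexity the bullet points describe comes not from this comparison, which is trivial, but from creating, governing and trusting the labels themselves at scale — which is exactly the gap the blockchain-based approach below targets.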
A blockchain distributed identity ledger binds digital identities of all person and non-person entities (NPEs) to their private/public key pairs and distributes them throughout the identity network without the use of third-party certificate authorities. Digitally signed, verifiable credentials asserting security attributes such as clearance, classification, role, categories and others, can then be issued by an authoritative source entity to a known entity using their digital identity and key pair. Using its own digital identity, a holding entity countersigns the issued credential and maintains it in their digital wallet or other credential repository. The holding entity presents its credentials to verifiers, such as an MLS system or some other identity and access enforcement point. Prior to authorizing access to a system object, a policy enforcement point such as an MLS system, uses its own digital identity to request security credentials from both the subject and object, validates the digital signatures, and then applies and enforces its security policy. For performance reasons, these credentials may be cached and/or used to populate traditional MLS system security label attributes. The above figure illustrates a simple example. An MLS stores both classified and unclassified files. - Security label for the file in the MLS is populated using signed credentials asserting the file is unclassified. The label takes the form of a W3C verifiable credential. - Bill requests access to the file. He includes his DID and a signed nonce in his request. - The MLS OS Agent accesses the identity network and using Bill’s DID as the key, looks up Bill’s public key, service endpoints, and other publicly identifying metadata in Bill’s DID document on the identity network blockchain, and authenticates the signed nonce. - The MLS OS Agent using the appropriate DID document-identified service endpoint for Bill, looks up Bill’s clearance keyed to his DID. 
Bill’s clearance is provided in the form of a W3C verifiable credential issued by a trusted clearance authority. - The MLS OS Agent, using the issuing clearance authority’s own DID-keyed public key on the identity network blockchain, authenticates the clearance authority’s signature. - The MLS OS Agent checks the security label of the requested file. - The MLS OS Agent compares Bill’s clearance (TS) to the file’s classification (Unclass) and grants access, recording all pertinent details (including subject and object DIDs) for auditability. The value to you Blockchain-based digital identities in conjunction with W3C verifiable credentials provide portable, trusted security labels that will lower security risk and enhance data shareability. Collapse IT infrastructure and costs. Trusted security labels, in the form of digitally signed verifiable credentials, can be used by MLS systems to collapse IT infrastructure — eliminating the need to segregate information of different classification on different infrastructure. MLS systems with trusted security labels can make multi-tenant cloud infrastructures more trustworthy by keeping customer data co-located but separate. Trusted security labels can be used with database systems to provide MLS databases. Trusted security labels make widespread MLS viable by providing governability, portability, and visibility to traditional security labels. Secure data access. Trusted security labels can be used by any access control policy enforcement point, not just MLS, to make more assured, defendable and granular access decisions. - Facilitate cross domain information transfers. Use trusted security labels at High Assurance Controlled Interfaces or guards to enforce cross-domain policies. - Public-key enable all your applications and implement application-layer MAC to comply with privacy and data security rules and regulations. Facilitate data sharing and re-use.
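The comparison step in the walkthrough above can be sketched in a few lines of Python. This is a simplified illustration under assumed level names and credential shapes, not a description of any IBM implementation; in practice the credentials would first be verified cryptographically.

```python
# Toy sketch (illustrative names and shapes): the final comparison step,
# expressed as a Bell-LaPadula-style dominance test.

LEVELS = {'Unclass': 0, 'Confidential': 1, 'Secret': 2, 'TS': 3}

def dominates(subject_label, object_label):
    # Read access requires the subject level to be at least the object level
    # and the subject to hold every category attached to the object.
    return (LEVELS[subject_label['level']] >= LEVELS[object_label['level']]
            and set(object_label['categories']) <= set(subject_label['categories']))

# Labels as they might look after signature checks have already succeeded.
bill_clearance = {'subject': 'did:example:bill', 'level': 'TS', 'categories': []}
file_label = {'subject': 'did:example:file', 'level': 'Unclass', 'categories': []}

print(dominates(bill_clearance, file_label))  # True: TS dominates Unclass
```

A real policy enforcement point would layer this check on top of signature validation against the issuer keys resolved from the ledger.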
Verifiable credentials can be issued for any purpose — not just security. Label your data assets with metadata to facilitate discovery for analytics — accuracy, usage restrictions and others. Explore more about how blockchain can be deployed as your trusted security labeling solution through the IBM Developer blockchain hub. I look forward to more great conversations on the advantages of blockchain as a trusted security labeling solution. Learn how industries are revolutionizing business with IBM Blockchain",https://www.ibm.com/blogs/blockchain/2019/11/blockchain-for-trusted-security-labels/,,Post,,Explainer,,Security,,,,,"DID,Verifiable Credentials",2019-11-05,,,,,,,,,,,,,
|
||
IBM,IBM,,,Milan Patel,,,,,,How do we start tackling the existing identity problem,,"Identity and control of personal identity is top of mind, given recent events as well as the European Union’s General Data Protection Regulation (GDPR). A lot of our identity is shared without our explicit consent, gets stored in locations we are unaware of, and when compromised creates tremendous setbacks. Almost everything we do in the digital world is username and password driven. With decentralized identity, you reduce risk by associating credentials typically used for in-person interactions, as instruments for virtual interactions where it is difficult to verify who or what is on the other side of the screen. Offline, in-person identification is also riddled with fraud as people falsify and use expired documents, which puts everyone at risk. Decentralized identity enables more secure and trusted exchanges of identity in the physical world. IBM has made some recent announcements over the past few months regarding the vision and recent activity in this identity space, and we are working with partners to shape our focus on trusted identity solutions. Listen to my recent podcast, where I talk about where identity is going and how blockchain and emerging identity networks are driving change. I also get into some of the business and legal aspects that are essential in transforming identity explained in this video. Imagine a new way Imagine applying for a loan and quickly being vetted by banks by only sharing the information that is pertinent, removing the majority of manual verification. This would reduce costs and cut the application time from weeks to days. Imagine going into a new country and becoming ill due to something you ate and being able to receive healthcare at a local clinic because you are able to identify yourself with a globally accepted identifier. 
You are able to provide not only who you are, but also your medical history so physicians know exactly what medicine to give. Imagine going to a bar where all that is required is a credential from the DMV, which indicates only that you are over 21 and a photo ID. You don’t have to provide unnecessary information such as your address or exact birthdate. Imagine data controllers and enterprises that can mitigate the liability of holding personally identifiable information, by only requesting the required information to establish trust in a relationship. GDPR will require these data controllers to justify why the information is being collected and for how long it needs to be held. Decentralized identity allows data controllers to remain relevant and meet regulation requirements as data privacy becomes further regulated. Imagine replacing your physical wallet with a digital one, for online and offline interactions. This digital wallet sits on devices at the edges of the network, such as your phone and laptop. You control where credentials get stored and have the ability to manage them with your devices. Trusted, known issuers within the identity network cryptographically attest and issue credentials directly into your digital wallet. You can then control which pieces of information are shared about you and who they are shared with, and only with your explicit consent. Why now? The advent of blockchain technology, along with various public breaches in identity, has created an opportunity to transform how relationships between people and institutions are established and maintained. Blockchain enables point-to-point cryptographic exchanges of identity at the edges of the network, at the devices. If a world existed where individuals controlled their identity, the creation of digital certificates would not be at scale with public key infrastructure (PKI) rooted within certificate authorities. 
As key generation sits with identity owners in a decentralized PKI model, rooting trust will require a web of relationships with the ability to scale; blockchain provides immutability of identity owner and key relationships, instilling that trust in every relationship. How and what information is provided is also critical. Blockchain enables the ability to share the minimum amount of information while still ensuring trust in all these possible relationships. Establishing the foundation Before we can imagine this new world, identity networks need to be established. A critical component in these early days is making it easier for individuals and organizations to participate in a capacity that meets their identity needs. In the same light, business and legal components need to accompany technical roadmaps from the onset. Business in a digital era will require collaboration in three facets, and IBM is establishing the foundation of identity networks by focusing on these areas. 
",https://www.ibm.com/blogs/blockchain/2018/06/how-do-we-start-tackling-the-existing-identity-problem/,,Post,,Explainer,,,,,,,"DID,Verifiable Credentials",2018-06-04,,,,,,,,,,,,,
|
||
IBM,IBM,,,Kal Patel,,,,,,internet’s next step: era of digital credentials,"Imagine being able to rid your wallet of a driver’s license, an insurance card, a student or employee ID and more. Imagine not having to worry about losing your passport and vaccination records on a trip abroad, or about authenticity of designer shoes you just purchased","Imagine being able to rid your wallet of a driver’s license, an insurance card, a student or employee ID and more. Imagine not having to worry about losing your passport and vaccination records on a trip abroad, or about the authenticity of the designer shoes you just purchased. This and much more is possible with the introduction of verifiable digital credentials. Credentials have been around for decades, if not centuries. The idea of obtaining documentation that proves a qualification, competence or authority is not, by itself, a novel idea. In fact, it is these long-established, deep-seated practices we often think may not be transformed by the shifts in technology. Yet it is precisely those daily activities that we habitually continue doing without much thought that can be, and in many instances already are, profoundly impacted by digitizing credentials. In short, the era of digital credentials is here. The internet has been around for decades, and I’ve been using a smart phone for years. Why is this the first I’m hearing of digital credentials? These are likely some of the questions that come to mind when initially hearing about digital credentials. The truth is, while the internet and smart phones have made significant progress over the last two decades, the blockchain technology enabling verifiable digital credentials has only recently matured to meet the required standards for broad-scale application and adoption. Transform digital identity into trusted identity with blockchain Using blockchain technology, IBM Digital Credentials gain permissionability, immutability, and verifiability. 
Digital credentials, or even just attributes of a credential, can be shared using QR codes or private and public keys. In most credentialing use cases, there will be three participants — the issuer, the holder, and the verifier. The issuer will instate the credential to the blockchain, thereby certifying the qualification or the validity of an assertion. The holder can then store that credential in their digital wallet. When needed, the holder can physically or digitally present credentialed information to a verifier who needs to validate that the holder’s credential is trustworthy. Digital credentials eliminate the hassle of managing multiple physical documents, mitigate fraud risk, and allow holders to selectively share only necessary data with the requesting verifier. The applications of a secure and trusted issuer-holder-verifier transaction pattern are boundless and will positively impact every industry. First wave and future use cases The adoption of digital credentials will come in waves, with the initial implementation of more apparent use cases. As the benefits are realized by the entities involved with the initial waves of adoption, we believe companies and governments with fringe use cases will take a chance on digital credentials. We see the primary use cases being related to occupational and professional licensure, recreational permits, learning credentials and vaccine verifications. As these initial use cases take hold, we anticipate credentials for verifying the authenticity of physical products to be a subsequent phase of adoption. This would give individuals and institutions the ability to verify the authenticity of parts, clothing items, sports equipment and more. Some of the primary use cases include medical credentials, driver’s licenses, and health records. Medical credentials are a prime candidate to be digitized, immutable and verifiable. 
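The issuer-holder-verifier pattern described above can be sketched in a few lines. This is a toy: a production system would use asymmetric signatures (for example Ed25519) so that the verifier needs only the issuer public key; an HMAC over a shared issuer secret stands in here purely to keep the sketch dependency-free, and all names are illustrative.

```python
import hmac, hashlib, json

# Toy issuer-holder-verifier flow. Real deployments use asymmetric
# signatures; the shared ISSUER_SECRET below is hypothetical demo material.
ISSUER_SECRET = b'dmv-demo-secret'

def issue(claims):
    # Issuer certifies the claims by signing a canonical serialization.
    payload = json.dumps(claims, sort_keys=True).encode()
    sig = hmac.new(ISSUER_SECRET, payload, hashlib.sha256).hexdigest()
    return {'claims': claims, 'sig': sig}  # credential the holder stores

def verify(credential):
    # Verifier recomputes the signature over the presented claims.
    payload = json.dumps(credential['claims'], sort_keys=True).encode()
    expected = hmac.new(ISSUER_SECRET, payload, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, credential['sig'])

cred = issue({'subject': 'did:example:holder', 'over_21': True})
print(verify(cred))                # True
cred['claims']['over_21'] = False  # tampering breaks verification
print(verify(cred))                # False
```

Note how tampering with any claim invalidates the credential, which is what lets a verifier trust a presentation without contacting the issuer for every check.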
In the medical industry, the verification of records for new physicians is time-consuming and costly. The process of verifying a new member of a hospital often involves 12-25 or more independent organizations. This results in the onboarding process for new hires taking 4-6 months. Due to this lengthy process, hospitals lose USD 7,500 to 9,000 daily. With medical credentials being stored on blockchain, the hiring hospital would be able to verify a new hire’s certifications in minutes rather than months. Similar to medical credentials, we also anticipate the credentialing of health records and driver’s licenses to occur in the first wave of adoption. Allowing individuals to digitally hold their health records or driver’s licenses would improve the safety and security of their identity. We are often presenting more information to a third party than required. For example, if a verifying party needs to validate your age, there should be no reason to display any additional information such as your address or your full name. With digital credentials stored on blockchain, you would have the ability to verify your age without visually presenting any information, even your date of birth! IBM Digital Credentials for Learners We have proven the capabilities of the IBM Digital Credentials platform through the success of applying the platform to our Learning Credential Network use case. We started the journey of digitizing credentials in the most opportune industry, education. Being able to obtain and hold a digital record of your diploma is what may immediately come to mind. However, the digitization of learning records goes much further. Students and employees around the world who may not have the opportunity to attend 4-year or even 2-year colleges now have new methods to acquire skills through online courses, on-the-job training, skills-based experiential learning, and more. 
The implementation of a digital credentialing system, such as the IBM Digital Credentials platform empowers these employees and students to have verifiable proof of skills they have gained and are ready to use in a job. This subsequently creates a skills-based economy as well as benefits for leaders across various industries. Digitizing learning credentials has broadened the talent pool, created more diversity in the workplace, and allowed for easier access to individuals with niche skills. Companies, students, employees, and educational institutions who have leveraged the learning application of the IBM Digital Credentials platform are already seeing benefits from our solution. Credentialing for government entities The IBM digital credentialing team has also partnered with a local government in the state of New York to bring the power of the IBM Digital Credentials platform to their social services programs. For an individual or family to verify their eligibility for these programs, there are several steps that need to be taken and a significant amount of documentation that needs to be provided. It’s an extremely time consuming and tedious (and often duplicative) process for individuals to have to present verification of their income, address, identification and more. Given that those eligible for one program are likely eligible for others, the local government decided to partner with IBM to create digital credentials to quickly verify the eligibility of participation in multiple social programs. This will ultimately save significant time for both constituents and government workers who assist in these processes. This initiative started with emergency rental relief eligibility verification but is looking to help expedite social service verification across all programs. Turning strategy into business outcomes IBM Blockchain Services can help bring your ideas to life. Explore the use of blockchain and digital assets in your business. 
Connect with the blockchain experts",https://www.ibm.com/blogs/blockchain/2021/12/the-internets-next-step-the-era-of-digital-credentials/,,Post,,Explainer,Public,,"Education,Healthcare",,,,Verifiable Credentials,2021-12-01,,,,,,,,,,,,,
|
||
IBM,IBM,,,Dan Gisolfi,Sovrin,Global,,,,Self-sovereign identity: Why blockchain?,"Several years ago, the Sovrin vision was introduced using a dot metaphor to describe a future whereby individuals would be able to take back control of their identity and participate at a peer-to-peer level with their online and offline relationships. Today the landscape of supporting open communities — network, code and standards — to achieve this vision has begun to mature at a rate whereby early adopters can begin to validate applicability and build that most important bridge across the technology adoption lifecycle chasm.","One of the most common questions I get when talking to customers and analysts about the self-sovereign identity (SSI) movement is, “Why blockchain?” This question tends to stem from the notion that data associated with a person’s identity is destined to be stored, shared and used for verification on some form of distributed ledger technology. My hope is that this article will help to debunk that notion and provide a basic foundational understanding of how distributed ledger technology is being used to solve our identity infrastructure dilemma and resolve the impacts of the internet lacking an identity layer. Busting the myth of on-chain PII One of the most common myths surrounding blockchain and identity is that blockchain technology provides an ideal distributed alternative to a centralized database for storing personally identifiable information (PII). There are several flavors of this perception: (a) use blockchain to store the data; (b) use a blockchain as a distributed hash table (DHT) for PII data stored off-chain. Yes, blockchain can technically support the placement of PII on the chain or be used to create attestations on the chain that point to off-chain PII storage. Just because technology can be applied to solve a specific problem does not mean that it is the proper tool for the job. 
This misconception about PII storage in the early stages of the blockchain technology adoption lifecycle is so pervasive that it recently inspired a Twitter thread dedicated to the debate on why putting hashed PII on any immutable ledger is a bad idea. From GDPR compliance, to correlation, to the cost of block read/write transactions, the debate continues. Blockchain technology is much more than a distributed storage system. My intent herein is to help the inquisitive identity solution researcher debunk beliefs about PII storage approaches by gaining an understanding for how blockchain can be used as an infrastructure for identity attestations. My hope is this article will offer a helpful aid towards that education and awareness. The SSI initiative is a perfect counterpunch to detrimental PII management practices. An SSI solution uses a distributed ledger to establish immutable recordings of lifecycle events for globally unique decentralized identifiers (DIDs). Consider the global domain name system (DNS) as an exemplar of a widely accepted public mapping utility. This hierarchical decentralized naming system maps domain names to the numerical IP addresses needed for locating and identifying computers, services or other connected devices, with the underlying network protocols. Analogous to the DNS, an SSI solution based on DIDs is compliant with the same underpinning internet standard, universally unique identifiers (UUIDs), and provides the mapping of a unique identifier such as a DID to an entity — a person, organization or connected device. However, the verifiable credentials that are associated with an individual’s DID and PII are never placed on a public ledger. A verifiable credential is cryptographically shared between peers at the edges of the network. 
The recipient of a verifiable credential, known as a verifier, in a peer-to-peer connection would use the associated DID as a resource locator for the sender’s public verification key so that the data in the verifiable credentials can be decoded and validated. No PII on ledger, then why blockchain? So, what problem is blockchain solving for identity if PII is not being stored on the ledger? The short answer is that blockchain provides a transparent, immutable, reliable and auditable way to address the seamless and secure exchange of cryptographic keys. To better understand this position, let us explore some foundational concepts. Encryption schemes Initial cryptography solutions used a symmetrical encryption scheme, which uses a secret key that can either be a number, a word or a string of random letters. Symmetrical encryption blends a secret key and the plain text of a message in an algorithm-specific manner to hide a message. If the sender and the recipient of the message have shared the secret key, then they can encrypt and decrypt messages. A drawback to this approach is the requirement of exchanging the secret encryption key between all recipients involved before they can decrypt it. Asymmetrical encryption, or public key cryptography, is a scheme based on two keys. It addresses the shortcomings of symmetrical encryption by using one key to encrypt and another to decrypt a message. Since malicious persons know that anyone with a secret key can decrypt a message encrypted with the same key, they are motivated to obtain access to the secret key. To deter malicious attempts and improve security, asymmetrical encryption allows a public key to be made freely available to anyone who might want to send you a message. The second private key is managed in a manner so that only the owner has access. 
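To make the key-distribution problem concrete, here is a minimal toy symmetric cipher, a repeating-key XOR, for illustration only and never for real use: encryption and decryption are the same operation with the same shared secret, which is exactly why that secret must somehow be exchanged safely first.

```python
from itertools import cycle

# Teaching toy, not a real cipher: XOR each byte of the message with the
# next byte of a repeating shared key. Applying the same operation twice
# with the same key restores the original message.
def xor_cipher(data: bytes, key: bytes) -> bytes:
    return bytes(b ^ k for b, k in zip(data, cycle(key)))

shared_key = b'secret'  # both parties must somehow obtain this safely
ciphertext = xor_cipher(b'meet at noon', shared_key)
print(xor_cipher(ciphertext, shared_key))  # b'meet at noon'
```

Asymmetric schemes remove this shared-secret requirement by splitting it into a key pair, at the cost of introducing the public-key discovery problem discussed next.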
A message that is encrypted using a public key can only be decrypted using a private key, while a message encrypted using a private key can be decrypted using a public key. Unfortunately, asymmetric encryption introduces the problem of discovering a trusted and authentic public key. Today the most pervasive technique for public key discovery in communications based on a client-server model is the use of digital certificates. A digital certificate is a document that binds metadata about a trusted server with a person or organization. The metadata contained in this digital document includes details such as an organization’s name, the organization that issued the certificate, the user’s email address and country, and the user’s public key. When using digital certificates, the parties required to communicate in a secure encrypted manner must discover each other’s public keys by extracting the other party’s public key from the certificate obtained by the trusted server. Trust chains A trusted server, or certificate authority, uses digital certificates to provide a mechanism whereby trust can be established through a chain of known or associated endorsements. For example, Alice can be confident that the public key in Carol’s digital certificate belongs to Carol because Alice can walk the chain of certificate endorsements from trusted relationships back to a common root of trust. Our current identity authentication scheme on the internet is based on asymmetric encryption and the use of a centralized trust model. Public key infrastructure (PKI) implements this centralized trust model by inserting reliance on a hierarchy of certificate authorities. These certificate authorities establish the authenticity of the binding between a public key and its owner via the issuance of digital certificates. 
As the identity industry migrates beyond authentication based on a current client-server model towards a peer-to-peer relationship model, based on private encrypted connections, it is important to understand the differences between symmetric and asymmetric encryption schemes: Symmetric encryption uses a single key that needs to be shared among the people who need to receive the message. Asymmetrical encryption uses a public/private key pair to encrypt and decrypt messages. Asymmetric encryption tends to take more setup and processing time than symmetric encryption. Asymmetric encryption eliminates the need to share a symmetric key by using a pair of public-private keys. Key discovery and sharing in symmetric key encryption can be addressed using inconvenient and expensive methods: Face-to-face key exchange Reliance on a trusted third party that has a relationship with all message stakeholders Asymmetric encryption eliminates the problem of private key exchange, but introduces the issue of trusting the authenticity of a publicly available key. Nevertheless, similar methods can be used for the discovery and sharing of trusted public keys: Face-to-face key exchange Reliance on a trusted third party that has a relationship with all message stakeholders Certificates that provide digitally signed assertions that a specific key belongs to an entity Rebooting the web of trust What if we wanted to avoid this centralized reliance on a trust chain of certificate authorities? What if we could leverage distributed ledger technology as a transparent and immutable source for verifying and auditing the authenticity of the binding between a public key and its owner? An alternative to the PKI-based centralized trust model, which relies exclusively on a hierarchy of certificate authorities, is a decentralized trust model. A web of trust, which relies on an individual’s social network to be the source of trust, offers one approach to this decentralized alternative. 
However, the emergence of distributed ledger technology has provided new life to the web of trust vision. Solutions using SSI can leverage distributed ledger as the basis for a new web of trust model that provides immutable recordings of the lifecycle events associated with the binding between a public key and its owner. Decentralized PKI in a nutshell As explained earlier and depicted in the diagram below, in a PKI-based system Alice and Bob need to establish a way to exchange and store their public keys. Conversely, in a blockchain-based web of trust model, the storage of public keys is managed on the public ledger. As participants in a global identity network, Alice and Bob create their unique DIDs, attach their public keys and write them to the public ledger. Now any person or organization that can discover these DIDs will be able to acquire access to the associated public keys for verification purposes. Conclusion My hope is that this article has provided you with a basic understanding and appreciation for why blockchain offers a powerful infrastructure for identity attestations. The SSI movement uses a blockchain to address several solution requirements, but the most basic is the secure and authentic exchange of keys, which was not possible using PKI. Minimally, you should now be armed with enough awareness of decentralized identity principles to establish some doubt about those advocates who champion the use of blockchain for the storage of personal data. 
",https://www.ibm.com/blogs/blockchain/2018/06/self-sovereign-identity-why-blockchain/,,Post,,Explainer,,,,,,"Indy,Sovrin","DID,Verifiable Credentials",2018-06-13,,,,,,,,,,,,,
|
||
IBM,IBM,,,Dan Gisolfi ; Milan Patel ; Rachel Radulovich,Sovrin,Global,,,,Decentralized Identity Introduction,"An ecosystem model whereby users generate and manage their own digital identity without relying on a central repository.<br>• Identity is derived through distributed certified credentials<br>• Trust Frameworks: Global Public and Domain Specific (Business, Legal, Technical)<br>• Built for security and scale: push identity to the edges of the networks<br>• Built using Hyperledger Indy",,https://www.ibm.com/downloads/cas/opeqyel7,,Presentation,,Explainer,,,,,,"Indy,Sovrin","DID,Verifiable Credentials",2018-09-10,,,,,,,,,,,,,
|
||
IBM,IBM,,,,,Global,,,,Finance Use Case,"David Vincent wants to apply for a loan online. His bank’s<br>know-your-customer process for obtaining a loan needs to be<br>compliant with federal regulations. As such, David is required to<br>present a government-issued citizen ID and proof of employment.<br>Let’s compare how David could use a Decentralized Identity Network<br>or a Consortium Identity Network to make the process easier and more<br>secure for him, seamlessly protecting his identity.",,https://www.ibm.com/downloads/cas/wg5edxn9,,Presentation,,Explainer,,,Finance,,,,"DID,Verifiable Credentials",2018-08-21,,,,,,,,,,,,,
|
||
IBM,IBM,,,,,Global,,,,Government Use Case,"Laura Barnes has graduated from college and has her first job. She’s<br>decided to lease and insure a new car. The car dealer and insurance<br>company both require Laura to present proof of employment and a<br>driver’s license.<br>A few weeks after getting her new car, she gets pulled over for a<br>traffic violation. The officer asks her to present proof of her driver’s<br>license, auto registration and insurance. Let’s compare how Laura<br>could use a Decentralized Identity Network or a Consortium Identity<br>Network to make the process easier and more secure for her,<br>seamlessly protecting her identity.<br>In a Decentralized Identity Network, the participants would be…",,https://www.ibm.com/downloads/cas/ebywbqvn,,Presentation,,Explainer,Public,,,,,,"DID,Verifiable Credentials",2018,,,,,,,,,,,,,
|
||
IBM,IBM,,,,,Global,,,,Healthcare Use Case,"Alice arrives at the clinic and needs to provide the order, her proof of insurance and her driver’s license. In a Decentralized Identity Network, the participants would be...",,https://www.ibm.com/downloads/cas/r9ywplkl,,Presentation,,Explainer,,,Healthcare,,,,"DID,Verifiable Credentials",2018,,,,,,,,,,,,,
|
||
IBM,IBM,,,,Sovrin,,,,,Towards Self Sovereign Identity,Credit goes to the Sovrin foundation and Hyperledger Indy who produced most of the slides (or some variation) in this presentation.,Credit goes to the Sovrin foundation and Hyperledger Indy who produced most of the slides (or some variation) in this presentation.,https://www.slideshare.net/alehors/towards-self-sovereign-identity-20180508,,Presentation,,Explainer,,,,,,"Indy,Sovrin","DID,Verifiable Credentials",2018-05-08,,,,,,,,,,,,,
|
||
IBM,IBM,,,,,,,,,Innovation Insight for Decentralized Identity and Verifiable Claims: A Gartner Report,"While the risk of fraud and data misuse is increasing, decentralized identity and credentials are meeting the demands of businesses across the digital identity value chain with:<br><br>Enhanced security<br>Privacy & user experience with the ability to easily consent<br>Shareable & verifiable claims without having to disclose sensitive data<br>With this report, access promising use cases, risks and considerations, and expert recommendations on creating value for the fully decentralized future.",,https://www.ibm.com/account/reg/us-en/signup?formid=urx-51223,,Report,,Explainer,,,"Security, Privacy",Machine Readable,,,,2021-08-18,,,,,,,,,,,,,
|
||
IBM,IBM,,,,,,,,,What is a vaccine passport?,A vaccine passport is a physical or digital health credential to confirm a person has been vaccinated for a particular contagious disease to enable travel.,"Let’s start with vaccine passports – also called digital health passports or green certificates. Many travelers are familiar with the yellow card, or Carte Jaune, which is an official vaccination record created by the World Health Organization. This document, named for the yellow paper it's traditionally printed on, is a public health tool that has been used for international travel since the 1930s and is typically carried with a passport. It shows customs authorities that a traveler has been vaccinated against certain diseases, such as yellow fever, typhoid or cholera. Although vaccination cards like yellow cards are still being used and remain a popular way to document immunizations, many governments are considering creating modern, digital vaccine passports that are harder to forge. With the public health threat posed by the COVID-19 pandemic, multiple countries are exploring whether vaccine passports and health passes could serve as proof of COVID-19 vaccination to restore confidence in international travel and help people resume their normal activities. Israel was the first country to issue a modern vaccine passport with the launch of Green Pass in February 2021. As of May 2021, Israel, China, Bahrain and Japan are the only countries that have issued vaccine passports to vaccinated people for international travel and other uses. Australia and multiple countries in the European Union, such as Denmark and Greece, have committed to developing programs, while other countries are still weighing their options. In the United States, the Biden administration and leaders at the Centers for Disease Control and Prevention (CDC) have stated the federal government will not support or issue vaccine passports for Americans. 
Vaccine passports are taking advantage of the increasing prevalence of new, secure digital credentialing technology. Beyond vaccine passports for international travel, it’s also being applied in other settings. For example, organizations that gather people in groups are looking for digital alternatives to both paper vaccination cards and test results. In some cases, this means identifying whether individuals have been tested or vaccinated in a voluntary and privacy-preserving manner. Digital health passes – not to be confused with vaccine passports – are a voluntary, convenient option for individuals to share their health status, such as if they have been vaccinated or tested negative for COVID-19. Rather than having to remember to carry around multiple documents, people with digital health passes can share a scannable QR code on their smartphone or print a paper copy of their credential that confirms their status, while personal information remains securely encrypted in a digital wallet on the individual’s phone. With COVID-19 vaccine rollouts underway around the world, digital health passes are one of many tools governments, private companies, nonprofits and industry groups are considering to help people return to their favorite activities. For example, in March 2021, New York State launched Excelsior Pass, a free, secure and voluntary digital health pass to support the safe reopening of New York. Even after the COVID-19 pandemic subsides, digital credentialing technology will remain a useful tool for individuals to show they have received any necessary vaccinations or other aspects of their health status. Digital credentials could also become a useful way for schools to manage student vaccination records or for employers to oversee any medical clearances or vaccinations required for job sites. With a digital health pass, people don’t have to worry about carrying around sensitive health records, like vaccine certificates, that could get misplaced. 
All they would need is their smartphone or a printed certificate that can be easily reprinted from a computer or mobile device if lost. The technology underlying digital health passes is designed for users to manage their personal health data and control what they share with whom and for what purpose. Only the verified credential is shared with others while the underlying data remains private and protected. Organizations have different needs when it comes to understanding people’s health status and verifying re-entry. An airline screening travelers for international flights might have stricter requirements than an outdoor stadium screening sports fans. Digital health passes make it easier for organizations to design rules that fit their specific needs. Although many people will enjoy the convenience of using a digital health pass on their smartphone, some people may not have a compatible mobile device or prefer not to use one. Additionally, phones can be forgotten at home or batteries can deplete at inopportune times. Designers of digital health credentialing technology recognize these limitations, and many have added additional features like printable certificates that help people access their credentials, if needed, from a desktop computer or other device. As lockdowns and other restrictions become less common, returning to pre-pandemic activities will require coordination from different organizations. Private sector businesses that want to welcome people back to their venues need straightforward ways to verify people’s health status voluntarily according to local regulations and their own policies. Healthcare organizations need simple ways to issue digital health credentials that other organizations can trust. Digital health passes can simplify the process for issuers of COVID-19 and other health credentials, such as pharmacies, labs and providers, and the verifiers who are checking the credentials, like an airline gate agent. 
Instead of having to follow a one-size-fits-all process, digital health passes give organizations a chance to customize their processes according to their specific rules. For example, an outdoor stadium might decide to admit fans who have received a negative COVID-19 test within 72 hours or proof of vaccination. An international flight to Europe might require travelers to show proof of receiving a COVID-19 vaccine. Digital health passes can accommodate the requirements of both organizations. Similarly, healthcare organizations can issue credentials to individual holders. The security and privacy built into digital health passes help make it simpler for issuers to provide credentials that are trustworthy without a lot of extra work on their part. Read about the Excelsior Pass program in New York Digital health passes are designed so that personal health information is encrypted using a digital wallet that can be accessed on a smartphone. The user has control over their information and how that information is shared. That control is maintained through secured digital credentials. Pharmacies, labs and providers can issue secured health credentials (QR code), such as a COVID-19 test result or vaccination record, for individuals to add to their digital health wallet. Using a QR code minimizes exposure of your underlying health information to third parties without your knowledge or consent during verification. Those are securely stored in the user’s digital health wallet. Credentials simply provide a voluntary way to share health credentials in a secure manner with an employer, airline or amusement park. How can these credentials be trusted? Some passes use a technology called blockchain, which uses a decentralized identity architecture. For example, it allows individuals to become active participants by giving them control over their data and the ability to choose how it will be used. 
Blockchain makes it so there’s no need to have a central database of sensitive health information. It helps organizations check the authenticity and validity of COVID-19 health credentials while the holder maintains control of their underlying personal health information. Learn more about how it works A digital health wallet is a secured digital alternative to COVID-19 paper vaccination or test results and provides a convenient option for an individual to manage and share their vaccination status or a negative test result for the COVID-19 coronavirus. Here's an example of how a digital health wallet could work: Step 1: You get a COVID-19 test at your local pharmacy. The pharmacy issues a verifiable credential based on your negative test result and sends it to you. Step 2: You receive the credential and add it to the digital wallet on your smartphone. Step 3: Now let’s say you want to board a flight or attend a sporting event where you will have the option to use IBM Digital Health Pass. If you choose to do so, the airline personnel or event staff verifies your credential by scanning the QR code in your digital health wallet before you enter. It’s that simple. Going to jobsites When more people return to sharing offices, warehouses and other indoor co-working spaces, employers will need to confirm people are following health guidelines. Upon arrival, the employee might need to present the QR code in their digital health wallet app to be scanned to gain entry to the building. This isn’t too different from using an employee ID to gain access to a jobsite. Traveling for work or fun Cruiselines, airlines and hotels might use a digital health pass to verify individuals’ health status before they travel. An airline might have different screening requirements than a cruiseline, but travelers can voluntarily choose what data to share and with whom. 
Catching a concert or game COVID-19 taskforces are prioritizing health and safety so fans can return to enjoying their favorite teams and bands in person. In addition to checking tickets for a concert or a sports game, ticket takers, ushers or security screeners could ask to see a health pass as well before admitting people into a venue. Designed to provide organizations with a smart way to bring people back to a physical location, such as a workplace, school, stadium or airline flight. Explore how new technologies can support the complex challenges of vaccine management and distribution to reduce risks and support safety. Use Watson Works to develop plans that help organizations re-open and stay open by prioritizing employee health, safety and productivity. Learn about IBM’s vaccine management solutions for efficiency, security, and supply chain resiliency. Explore an example of how digital health credentials could work at a ballpark. Find out how blockchain can be used to solve challenges in the healthcare industry. Learn how vaccination credentials can support long-term vaccine efforts. Discover how digital credentials are helping us safely and effectively re-open today and why they’re here to stay. Hear the lessons health experts learned from the first two months of COVID-19 distribution. Take a closer look at how individuals can use this technology to get back to normal activities.",https://www.ibm.com/topics/vaccine-passport,,Topic,,Explainer,,,COVID,,,,,2021-01-01,,,,,,,,,,,,,
|
||
IBM,IBM,,,,,,,,,indy-ssivc-tutorial,"A turnkey, Docker-based tutorial to help developers get acquainted with Self-Sovereign Identity and Verifiable Credentials.",,https://github.com/ibm-blockchain-identity/indy-ssivc-tutorial,,Code,,HowTo,,,,,,"Python, Apache-2.0",,2019-03-14,,,,,,,,,,,,,
|
||
IBM,IBM,,,Sharath Kumar R K; Corville Allen; Marie Wallace; Manjula Hosurmath,,,,,,Get started with IBM Digital Health Pass,"How can you bring people back to physical locations such as the workplace or airports without compromising on safety protocols? And, how can you ensure that the information being shared is secure? IBM Digital Health Pass can help. Digital Health Pass is an open standards-based platform that allows the secure, privacy-preserving, and verifiable exchange of data between organizations and their patients, employees, customers, and citizens, to drive agile and responsive businesses. Data is exchanged as verifiable credentials that, in combination with sophisticated cryptographic and obfuscation techniques, make data tamper-proof so that it can be trusted by all parties<br>",,https://developer.ibm.com/tutorials/getting-started-on-ibm-digital-health-pass/,,Post,,HowTo,,,,,,,"DID,Verifiable Credentials",2022-02-22,,,,,,,,,,,,,
|
||
IBM,IBM,,,,,NYS,,,,New York State and IBM Digital Health Pass Pilot,,,https://newsroom.ibm.com/new-york-state-and-ibm-digital-health-pass-pilot,,Press,,Meta,,,COVID,Pilot,,,"DID,Verifiable Credentials",2021-03-11,,,,,,,,,,,,,
|
||
IBM,ID2020,,Medium,,Good Health Pass Collaborative; Airports Council International (ACI); Commons Project Foundation; Covid Credentials Initiative; Evernym; Hyperledger; International Chamber of Commerce (ICC); Linux Foundation Public Health; Lumedic; Mastercard; Trust Over IP Foundation,,,,,Good Health Pass a new Cross Sector Initiative to restore Global Travel,"ID2020 announced the launch of the Good Health Pass Collaborative along with more than 25 leading individual companies and organizations in the technology, health, and travel sectors — including the Airports Council International (ACI), Commons Project Foundation, Covid Credentials Initiative, Evernym, Hyperledger, IBM, International Chamber of Commerce (ICC), Linux Foundation Public Health, Lumedic, Mastercard, Trust Over IP Foundation, and others.","Good Health Pass: A New Cross-Sector Initiative to Restore Global Travel and Restart the Global Economy Today, ID2020 announced the launch of the Good Health Pass Collaborative along with more than 25 leading individual companies and organizations in the technology, health, and travel sectors — including the Airports Council International (ACI), Commons Project Foundation, COVID-19 Credentials Initiative, Evernym, Hyperledger, IBM, International Chamber of Commerce (ICC), Linux Foundation Public Health, Lumedic, Mastercard, Trust Over IP Foundation, and others. The Good Health Pass Collaborative is an open, inclusive, cross-sector initiative to create a blueprint for interoperable digital health pass systems that will help restore global travel and restart the global economy. The COVID-19 pandemic has impacted every segment of the global economy, but none as profoundly as travel and tourism. Last year, airlines lost an estimated $118.5 billion USD with related impacts across the economy in excess of $2 trillion USD. 
In conjunction with the announcement, the Collaborative also released its first white paper, entitled, Good Health Pass: A Safe Path to Global Reopening. Collaboration Among a New Ecosystem of Players “There’s one thing the world agrees on — we need to address the health concerns today to support a return to normalcy,” said Ajay Bhalla, President of Cyber & Intelligence at Mastercard. “Delivering a global, interoperable health pass system can only happen if we come together in a way that meets the needs of everyone involved. This Collaborative will be critical in helping to define how we connect the pieces that will bring travel back safely, spark job creation and jumpstart the world’s economic engine.” Various efforts are currently underway to develop digital health credentials systems — both vaccination and test certificates — for international travel. Yet, despite this race to market, it is unlikely that a single solution will be implemented universally — or even across the entire travel industry. Thus, it is critical that solutions are designed from the onset to be interoperable — both with one another and across institutional and geographic borders. The Good Health Pass Collaborative is not intended to supplant existing efforts but rather to help weave them together, fill gaps where they may exist, and facilitate collaboration among a new ecosystem of stakeholders, many of whom have never worked together before. “Fragmentation is a risk we simply cannot ignore,” said ID2020 Executive Director Dakota Gruener. “To be valuable to users, credentials need to be accepted at check-in, upon arrival by border control agencies, and more. We can get there — even with multiple systems — as long as solutions adhere to open standards and participate in a common governance framework. 
But without these, fragmentation is inevitable, and travelers — and the economy — will continue to suffer needlessly as a result.” Global Travel & Digital Health Credentials COVID-19 test results are already required for entry at some airports and at international borders. But existing paper-based certificates are easy to lose, unnecessarily expose sensitive personal information, and are prone to fraud and counterfeiting. By contrast, digital health credentials can be printed (e.g., as a QR code) or stored on an individual’s mobile phone. They enhance user privacy and “bind” an individual’s identity to their test result or vaccination certificate, thus enabling real-time, fraud-resistant digital verification. “Our health data consists of the most sensitive personal information, deserving of the strongest privacy,” said Dr. Ann Cavoukian, Executive Director of the Global Privacy & Security By Design Centre. “Release of our health data must be under our personal control. The Good Health Pass does just that: With Privacy by Design embedded throughout, you control the release of your digital health data, and to whom; all de-identified and decentralized. Privacy and functionality: Win/Win!” The World Health Organization recently convened the Smart Vaccination Certificate Consortium to establish standards for vaccination certificates, but no analogous effort currently exists for test certificates. Given that it is expected to take years for vaccines to be universally available globally, widespread testing will remain an essential public health tool — and one that must continue alongside vaccination to ensure a safe and equitable return to public life. The Good Health Pass Collaborative has defined four primary requirements that digital health credential systems for international travel must satisfy: - Cross-border: Solutions must work at airports, airlines, ports-of-call, and borders worldwide and comply with international and local regulations. 
- Cross-industry: Solutions will require the collaboration of the travel, health, governments, and technology sectors. - Secure & privacy-protecting: Solutions must comply with all relevant security, privacy, and data protection regulations and must be able to bind the presenter of the credential to the credential itself at the required level of assurance. - Frictionless: Solutions must seamlessly integrate into testing and travel processes, thus enhancing and streamlining the experience for individuals and airlines alike. Solutions must not add new material costs for travelers. Optimally, validation processes will be contactless to maintain or enhance hygiene. The Collaborative welcomes the participation of policymakers and representatives of government agencies; companies in the health, technology, and travel sectors; and civil society organizations who share a commitment to safely restoring international travel and economic activity while simultaneously ensuring that equity, privacy, and other civil liberties are protected. If you are interested in learning more, please visit the Good Health Pass website at goodhealthpass.org. 
Endorsing Organizations - Affinidi - Airports Council International (ACI) - Airside - analizA - AOKpass - Bindle Systems - BLOK Solutions - CLEAR - The Commons Project Foundation - Covid Credentials Initiative (CCI) - Daon - Evernym - Global Privacy & Security by Design Centre - Grameen Foundation - Hyperledger - IBM - IDramp - International Chamber of Commerce (ICC) - iProov - Linux Foundation Public Health - Lumedic - Mastercard - MIT SafePaths - National Aviation Services (NAS) - Panta - PathCheck Foundation - Prescryptive Health - SITA - STChealth - Trust Over IP Foundation - ZAKA",https://medium.com/id2020/good-health-pass-a-new-cross-sector-initiative-to-restore-global-travel-and-restart-the-global-8b59eb1050a0,,Post,,Meta,,,COVID,,,,,2021-02-09,,,,,,,,,,,,,
|
||
IBM,SecureKey,,,,,,,,,IBM and SecureKey Technologies to Deliver Blockchain-Based Digital Identity Network for Consumers,"IBM (NYSE: IBM) and SecureKey Technologies today announced they are working together to enable a new digital identity and attribute sharing network based on IBM Blockchain. The network will be designed to make it easier for consumers to verify they are who they say they are, in a privacy-enhanced, security-rich and efficient way. When launched later this year, consumers can use the network to instantly verify their identity for services such as new bank accounts, driver’s licenses or utilities.","Las Vegas – IBM InterConnect – 20 March 2017: IBM (NYSE: IBM) and SecureKey Technologies today announced they are working together to enable a new digital identity and attribute sharing network based on IBM Blockchain. The network will be designed to make it easier for consumers to verify they are who they say they are, in a privacy-enhanced, security-rich and efficient way. When launched later this year, consumers can use the network to instantly verify their identity for services such as new bank accounts, driver’s licenses or utilities. To create a highly secure, global and enterprise-ready ecosystem for sharing identity requires both advanced federated identity technology and blockchain technology specifically designed for regulated industries. Together SecureKey and IBM are developing a digital identity and attribute sharing network using IBM’s Blockchain service which is built on top of the Linux Foundation’s open source Hyperledger Fabric v1.0. As a permissioned blockchain, the Hyperledger Fabric is an essential component in delivering services that comply with regulations where data protection and confidentiality matter. The network is currently in the testing phase in Canada, and once it goes live later in 2017 Canadian consumers will be able to opt-in to the new blockchain-based service using a mobile app. 
Consumers – or network members – will be able to control what identifying information they share from trusted credentials to the organizations of their choice, for those organizations to quickly and efficiently validate the consumer’s identity and arrange new services. For example, if a consumer has proven their identity with their bank and a credit agency, they can grant permission to share their data with a utility to create a new account. Since the bank and the credit agency have already gone through extensive verification of the consumer’s identity, the utility can choose to rely on the fact that the information is verified, and the consumer can be approved for new services. “What IBM is building with SecureKey and members of the digital identity ecosystem in Canada, including major banks, telecom companies and government agencies, will help tackle the toughest challenges surrounding identity,” said Marie Wieck, general manager, IBM Blockchain. “This method is an entirely different approach to identity verification, and together with SecureKey, we have a head start on putting it on the blockchain. This is a prime example of the type of innovation permissioned blockchain networks can accelerate.” Hyperledger Fabric is by far the most advanced permissioned-blockchain technology available today, in my opinion, both in protecting user data and allowing us to work within the context of industry and country privacy laws,” said Greg Wolfond, founder and CEO, SecureKey Technologies. “Among the many contributors to Hyperledger Fabric including SecureKey, IBM is a standout innovator that has proven that they can rapidly bring blockchain solutions to production. We are very excited to enter into this formal agreement that will benefit consumers around the world. Canada’s leading banks, including BMO, CIBC, Desjardins, RBC, Scotiabank and TD joined the digital identity ecosystem in October, 2016, investing $27M collectively in SecureKey. 
The Digital ID and Authentication Council of Canada (DIACC) and the Command Control and Interoperability Center for Advanced Data Analytics (CCICADA), a research center of excellence funded by the U.S. Department of Homeland Security Science & Technology Directorate, have also provided funding to bring the new approach to digital identity to market. SecureKey’s leadership in identity is evidenced by its association with industry leaders and regulators such as DIACC, Privacy By Design, NIST, FIDO, OIX, Kantara and the Linux Foundation. “Our goal for this partnership is to accelerate the pace at which we can develop a service to help consumers better manage, protect and control their digital assets and identity, and ultimately provide our customers with greater convenience and a better overall experience,” said Andrew Irvine, Head of Commercial Banking and Partnerships, BMO Bank of Montreal. “Implementing forward thinking innovation is key to ensuring our clients have the best possible experience in today’s digital environment,” said Todd Roberts, Senior Vice President, Innovation, CIBC. “We are pleased to continue working with SecureKey to implement leading edge technology that protects our clients’ security and privacy in the digital ecosystem.” “We believe that combining SecureKey’s expertise and innovation in identity and the technological knowledge and leadership of Hyperledger Fabric and IBM Blockchain’s High Security Business Network will be foundational in delivering a great identity solution for consumers in Canada and also help pave the way at the international level,” said Patrice Dagenais, Vice president, Payment and Business Partnerships for Desjardins group. “Collaborating with partners like SecureKey and IBM in the development and implementation of solutions that make our clients’ interactions secure and seamless is essential to meeting evolving expectations in a digital world,” said Eddy Ortiz, VP, Solution Acceleration and Innovation, RBC. 
“Canada has an important opportunity to innovate with emerging technologies like blockchain to advance digital identity in Canada.” “Scotiabank is embracing digital technologies like blockchain to offer a superior customer experience and to make it easier for customers to bank with us whenever they want and wherever they are,” said Mike Henry, Executive Vice President, Retail Payments, Deposits and Unsecured Lending, Scotiabank. “We are pleased to work with SecureKey and other innovative partners to provide Canadian consumers with an easy and secure privacy-enhanced digital ID process.” “Helping Canadians control the security of their personal data to reduce the risk of fraud online, in person, or over the phone is innovating with purpose,” said Rizwan Khalfan, Chief Digital Officer, TD. “We are thrilled to work with SecureKey and its partners in the creation of an innovative identity ecosystem designed to allow our customers to digitally and securely validate their identity, when and how they want to.” About SecureKey Technologies SecureKey is a leading identity and authentication provider that simplifies consumer access to online services and applications. SecureKey enables next generation privacy-enhancing identity and authentication network for conveniently connecting people to critical online services using a digital credential they already have and trust. SecureKey is headquartered in Toronto, with offices in Boston and San Francisco. For more information, please visit www.SecureKey.com. About IBM IBM is the leader in secure open-source blockchain solutions built for the enterprise. As an early member of the Linux Foundation’s Hyperledger Project, IBM is dedicated to supporting the development of openly-governed blockchains. IBM has worked with more than 400 clients across financial services, supply chains, IoT, risk management, digital rights management and healthcare to implement blockchain applications delivered via the IBM Cloud. 
For more information about IBM Blockchain, visit www.IBM.com/blockchain. InterConnect is IBM’s cloud and cognitive conference where more than 20,000 developers, clients and partners are being introduced to the latest advancements in cloud computing through 2,000 sessions, labs and certifications. IBM is positioning both enterprise and startup clients for success with a complete portfolio of cloud services and marquee partnerships, supporting a wide range of applications including: big data, analytics, blockchain and cognitive computing. For more information, visit: https://www.IBM.com/cloud-computing/. Engage in the conversation through @IBMCloud and #ibminterconnect. For more information, please contact: Sarah Kirk-Douglas, Director of Marketing SecureKey Technologies +1 905 251 6502 | sarah.douglas@SecureKey.com Holli Haswell IBM +1 720 396 5485 hhaswell@us.IBM.com",https://securekey.com/?securekey_pr=ibm-securekey-technologies-deliver-blockchain-based-digital-identity-network-consumers,,Press,,Meta,,,,,,,"DID,Verifiable Credentials",2017-03-20,,,,,,,,,,,,,
|
||
IBM,IBM,,,,,,,,,IBM Verify App,,,https://doc.ibmsecurity.verify-creds.com/,,App,dead,Product,,,,,Verify,,,2020,,,,,,,,,,,,,
|
||
IBM,IBM,,,,,,,,,Digital Health Pass,"the digital wallet can allow individuals to maintain control of their personal health information and share it in a way that is secured, verifiable, and trusted. Individuals can share their health pass to return to the activities and things they love, without requiring exposure of the underlying personal data used to generate the credential.",,https://www.ibm.com/products/digital-health-pass,,Product,,Product,,,COVID,,,,"DID,Verifiable Credentials",2022-06-03,,,,,,,,,,,,,
|
||
IBM,IBM,,,,Verity,,,,,IBM Security Verify,"Modernized, modular IBM Security™ Verify solution provides deep, AI-powered context for both consumer and workforce identity and access management. Protect your users and apps, inside and outside the enterprise, with a low-friction, cloud-native, software-as-a-service (SaaS) approach that leverages the cloud. For legacy, on-prem apps, the Verify Access version provides a smooth path to cloud, so you can transition at your own pace.",,https://www.ibm.com/products/verify-identity,,Product,,Product,,,,,,,,2020-07,,,,,,,,,,,,,
|
||
IBM,IBM,,,,Verity,,,,,IBM Verify App,"With IBM Verify Credentials, you can begin your journey of exploring the benefits of decentralized identity. We have provided an interactive experience centered around the challenge of proving your identity while opening a financial account. Additionally, we will walk you through the development of your first end-to-end decentralized identity solution.<br><br>You will first obtain two credentials: one issued by a fictional government and one from IBM HR, your fictional employer. You will then use those credentials to open a financial account with BigBlue Credit Union.<br><br>Once you’ve experienced this interactive exploration, you can build your own decentralized identity applications that emulate other issuances and verifications of credentials.<br>","With IBM Verify Credentials, you can begin your journey of exploring the benefits of decentralized identity. We have provided an interactive experience centered around the challenge of proving your identity while opening a financial account. Additionally, we will walk you through the development of your first end-to-end decentralized identity solution. You will first obtain two credentials: one issued by a fictional government and one from IBM HR, your fictional employer. You will then use those credentials to open a financial account with BigBlue Credit Union. Once you’ve experienced this interactive exploration, you can build your own decentralized identity applications that emulate other issuances and verifications of credentials. Step 1: Prepare Create your account to deploy and manage agents. Then download the mobile app via TestFlight or Google Play Store to manage credentials on mobile devices and the IBM Verify Credentials Chrome Extension to interact with the ecosystem from your desktop. 
Step 2: Explore Once you establish an account and configure the mobile app and browser extension, use provided sample apps to get your first verifiable credential from a government institution and IBM HR. You will then use that issued credential to prove who you are to BigBlue Credit Union. Step 3: Develop Clone the IBM Verify Credentials Samples to start developing your decentralized identity application. The samples, combined with the OpenSSI Web SDK, provide a simplified experience to programmatically issue and verify credentials. Step 4: Promote When you develop a decentralized identity application, the next step is to tell everyone about what you’ve done – and the value you’ve discovered. Ask your peers to obtain a decentralized identity from the application you built!",https://www.ibm.com/docs/en/sva/9.0.2.1?topic=verify-application,https://doc.ibmsecurity.verify-creds.com/img/prepare-explore-develop-promote.png,Product,,Product,,,,,,,,2021-03-05,,,,,,,,,,,,,
|
||
IBM,IBM,,,,,Global,,,,IBM Digital Health Pass,"Built on IBM Blockchain technology, Digital Health Pass is designed to enable organizations to verify health credentials for employees, customers and visitors entering their site based on criteria specified by the organization. It can allow an individual to manage their information through an encrypted digital wallet on their smartphone and maintain control of what they share, with whom and for what purpose. It’s one solution in our Watson Works suite of workplace solutions.
|
||
","Overview What is IBM Digital Health Pass? IBM® Digital Health Pass is designed to help businesses efficiently verify multiple types of COVID-19 health credentials for employees, customers, fans and travelers entering their site based on their own criteria. Privacy is key. The digital wallet can allow individuals to maintain control of their Personal health information and determine what they share, with whom and for what purpose. Get advice for your industry Employers How your COVID-19 taskforce can bring employees back to the workplace with IBM Digital Health Pass (02:27) Employers To help address COVID-19, Digital Health Pass offers an end-to-end vaccination and COVID-19 test verification solution that is compliant with employee privacy and trust. Sports and entertainment Sports and entertainment Stadiums, amusement parks and concert venues can welcome fans by setting the criteria for COVID-19 health credentials and entry requirements. Travel and transportation Travel and transportation Cruise ships, airlines, hotels and travel authorities could implement Digital Health Pass to verify COVID-19 health credentials for travelers prior to a visit. Public health Public health As federal, state and local agencies roll out COVID-19 testing and vaccination programs, verifiable digital credentialing can help support businesses. Colleges and universities Colleges and universities Digital Health Pass can provide students, faculty and visitors with a convenient option to share COVID-19 test results or vaccination status. Why Digital Health Pass? Trust and transparency Privacy and security Data-driven Flexible and agile Features With Digital Health Pass, your organization can: Respect user privacy The technology minimizes the need for you to collect or store Personal data and helps you meet HIPAA, GDPR and CCPA regulations. 
Choose a trusted end-to-end solution Comprehensive technology includes verification of vaccinations and COVID-19 tests, test scheduling, access to testing partners* and near real-time reporting. Verify multiple credentials Use the IBM Verify app to confirm different types of COVID-19 health credentials, such as IBM Digital Health Pass, Good Health Pass, SMART® Health Card and EU Digital COVID Certificate. How it works Digital Health Pass is designed for various entities For individuals An individual can receive vaccination and COVID-19 test credentials, load them into their smartphone and share their health credentials with an organization. For verifiers Check the health and safety of employees and individuals upon entrance—whether it’s the workplace, a stadium, airport or elsewhere. Next steps Learn how you can manage and execute verification policies for COVID-19. Disclaimer *Participating businesses need to be registered with the Digital Health Pass network",https://www.ibm.com/watson/health/resources/digital-health-pass-blockchain-explained/,,Product,,Product,,,COVID,,,,Verifiable Credentials,2021-05-24,,,,,,,,,,,,,
|
||
IBM,IBM,,,,,,,,,indy-tutorial-sandbox,"A turnkey, Docker-based sandbox that enables quick and easy exploration of Hyperledger Indy concepts.",,https://github.com/ibm-blockchain-identity/indy-tutorial-sandbox,,Code,,Resources,,,,,,"Makefile,Apache-2.0",,2019-03-14,,,,,,,,,,,,,
|
||
IBM,IBM,,,,,,,,,openssi-websdk,Official libraries for using IBM Verify Credential Account Service APIs.<br><br>Currently supported languages include: Node.js,,https://github.com/ibm-blockchain-identity/openssi-websdk,,Code,,Resources,,,,,,"Javascript, Apache-2.0",,2022-06-01,,,,,,,,,,,,,
|
||
IBM,IBM,,,,,,,,,trust-your-supplier,"This repository demonstrates how Decentralized Identity concepts can be applied in an ecosystem where a supplier collects a digital credential from an LEI issuer, and leverages that credential to build a relationship (and further credentials) with an IBM Food Trust Network founder, the Trust Your Supplier Network and the IBM Food Trust Network.
|
||
|
||
Play with these samples to learn how to integrate the openssi-websdk into your own website.
|
||
|
||
For more information on the technology enabling these samples, take a look at our docs.",,https://github.com/ibm-blockchain-identity/trust-your-supplier,,Code,,Resources,,,,,,"Javascript, Apache-2.0",,2020-08-20,,,,,,,,,,,,,
|
||
IBM,IBM,,,,,,,,,verify-creds-samples,Sample issuer/verifier apps built using the openssi-websdk,,https://github.com/ibm-blockchain-identity/verify-creds-samples,,Code,,Resources,,,,,,"Javascript, Apache-2.0",,2022-06-14,,,,,,,,,,,,,
|
||
IBM,IBM,,,Luc Desrosiers; Ricardo Olivieri,,,,,,Oracles: Common architectural patterns for Hyperledger Fabric,"In a previous article, we showed you two mechanisms for implementing off-chain logic that maintain trust, visibility, and transparency as qualities of service for a blockchain network. The first approach extended smart contracts by having peers in the blockchain network invoke third-party services collocated with them, while the second approach extended smart contracts by having these invoke a third-party trusted service that resides outside of the blockchain network. These third-party trusted services are commonly referred to as oracles. In this article, we explore the second approach further by presenting three common architectural patterns that can be used in the context of a Hyperledger Fabric network.",,https://developer.ibm.com/articles/oracles-common-architectural-patterns-for-fabric/,,Post,,Resources,,,,,,"Fabric, Oracles","DID,Verifiable Credentials",2019-03-11,,,,,,,,,,,,,
|
||
IBM,IBM,,,,IBM Blockchain Pulse,,,,,Blockchain newsletter: Emerging coronavirus variants spur blockchain innovations in healthcare,"Get a first look at the Gartner report for decentralized identity and verifiable claims. Access promising use cases, risks and considerations, and expert recommendations on creating value for a fully decentralized future.","Share this post: Get a first look at the Gartner report for decentralized identity and verifiable claims. Access promising use cases, risks and considerations, and expert recommendations on creating value for a fully decentralized future. Here’s your complimentary access to Gartner’s Innovation Insights. Delta variant refocuses attention on vaccine passports The surge of COVID-19 cases due to the Delta SARS-CoV-2 variant is driving organizations to expand the use of vaccine passports with interoperability and extensibility in mind. Compatibility with open standards organizations and frameworks can enable cross-border recognition for vaccine passports. The same technology can provide user control over access to other health records, from lab test results to genomic data. A travel technology company has integrated digitized credentials like airline tickets with IBM Digital Health Pass to simplify travel. Receive your free access of Gartner’s Innovation Insights Outside of healthcare, using the core blockchain-based self-sovereign verified credentialing technology behind our vaccine passport, governments can offer broader services like digital driver’s licenses or other digital identities and organizations can offer digital employee identification. Identity and credentials outside of healthcare The need to have identification information and credentials in digital form is pressing, because modern information systems are geared to digital formats. However, we’re in a time when identities are often stolen and credentials can be counterfeited. 
To move forward securely and confidently, you need the kind of full-featured support offered by IBM Blockchain. Blockchain and healthcare efficiency Industry leaders Aetna, Anthem, Cleveland Clinic and IBM are joining forces to launch Avaneer Health, a new venture that uses blockchain technology to improve efficiencies in the American healthcare system. The project is an outgrowth of the 2019 Healthcare Utility Network collaboration between Aetna, PNC Bank, IBM, Anthem and HCSC. Watch, read and listen White paper: Digital health credentials for COVID-19 and beyond Read this recent Frost & Sullivan report to learn how digital credentials are helping organizations and economies re-open safely and why they’re here to stay. Event: Blockchain Expo North America 2021 Attend this virtual conference September 29–30 to explore blockchain innovations. Catch Shyam Nagarajan, Executive Partner, IBM Blockchain Services on the Day 1 Keynote and Ryan Rugg, Americas Blockchain Partner, IBM, on the Day 2 panel on central bank digital currencies. Webinar: Validating Personal identity information with digital credentials Join our webinar: Proving you are you – Digital credentials powered by blockchain, which will be held Wednesday, October 13 at 12:00 PM (EDT), also available for later playback. Blog: Opening New York State for business with blockchain Read the story of the Excelsior Pass Plus, IBM Blockchain and digital credentialing coming together to help New York re-open its economy. Our solutions and how to get started No matter where you are in your adoption journey or what industry you’re in, we’re here to help you use blockchain technology to reach your business goals. Still not sure where to start? Schedule time to talk with one of our experts specific to your industry, and they can help guide you in the right direction. We’ll be back next month with more news you can use from IBM Blockchain. 
In the meantime, if someone forwarded you this email and you’d like to subscribe, sign up here. Blockchain solutions that transform industries Join blockchain innovators who are transforming industries around the world. Let’s put smart to work. Find your blockchain solution",https://www.ibm.com/blogs/blockchain/2021/09/blockchain-newsletter-emerging-coronavirus-variants-spur-blockchain-innovations-in-healthcare/,,Report,,Standards,,,COVID,,,,"DID,Verifiable Credentials",2021-09-28,,,,,,,,,,,,,
|
||
IBM,IBM,,,Jerry Cuomo,,,,,,"Paving the Road to Self-Sovereign Identity with Blockchain, Open Standards",,"October 10, 2017 | Written by: Jerry Cuomo Categorized: Blockchain | security Share this post: Imagine a world in which you always have peace of mind that your Personal information is safe. Imagine a world in which your information cannot be shared without your clear, explicit consent at the time of the transaction; where you decide who can access what information, when, and for how long. In this world, you can even later choose to revoke that privilege. You are in control. Every person, organization, or thing can have its own truly-independent digital identity that no other person, company, or government can take away. Today, we are not in control of our identity. Our Personal information lives in centralized repositories outside of our control. Information is often shared without our awareness. On a daily basis, we see stories of security breaches and identity theft that erode our confidence and trust. At IBM we are focused on leading a global shift to decentralized identity that is built on blockchain technology. Blockchain provides distributed ledger technology as the foundation for decentralized identity. In this solution, trust is not rooted in any single point of control but is shared across participants in a network where each person has varying degrees of permission to view data. Beyond just the technology, however, we must work as a community to establish standards and evolve regulations to work in a decentralized world. That is why today we are excited to announce IBM has joined the Decentralized Identity Foundation (DIF) as a complement to our current stewardship in the Hyperledger Project. Today, the Hyperledger Project has also announced that they are joining DIF as we together join like-minded organizations such as Microsoft, Evernym, the Sovrin Foundation, and others who aspire to make the vision of self-sovereign identity a reality. 
IBM joined DIF because we believe it will take open community and standards to achieve the vision of self-sovereign identity. For example, members of DIF are focused on the establishment of an open web platform standard within the W3C standards organization called Decentralized Identifier (DID). A DID will provide a standard global resource naming scheme for identity. There is a global Internet standard for naming resources called a uniform resource identifier or URI. When you type https://www.IBM.com into your browser, a URI ensures you always end up at IBM’s website. Similarly, we need one standard to identify an individual. In addition to a distributed ledger and global standards, one of the most significant contributions of blockchain-based identity management will be to enable verifiable claims. Verifiable claims are tamper-proof, cryptographic statements of truth. For example, let’s say “Sam” is applying for a car loan with “Acme Bank.” The bank needs to know that Sam is trusted and can afford the car. Today, he would fill out a loan application and provide his personal information. In the new world of self-sovereign identity, this is no longer necessary. If Sam’s employer is a provider of verifiable claims in the blockchain identity network, the employer can attest that Sam is employed with them and makes more than $50,000 a year. Since he does business with three other banks and these banks are also providers on the network, he can give consent for his employer and the three banks to validate his claim with Acme Bank. Acme can issue a new loan to Sam with minimal information, all shared with Sam’s explicit consent. Using this process, not all personal information needs to be shared, such as his exact salary; instead, the network validates that it is above a certain threshold. 
Today, we are at a transformative juncture in Personal identity made possible by blockchain and open standards through the work of organizations like DIF and the Hyperledger Project. IBM is already pioneering new digital identity and attribute sharing networks built on open standards through our partnership with SecureKey. We are currently piloting a network in Canada designed to make it easier for consumers to verify they are who they say they are, in a privacy-enhanced, secure and more efficient way using the IBM Blockchain Platform.",https://www.ibm.com/blogs/think/2017/10/self-sovereign-id-blockchain/,,Post,,Standards,,,,,,,"DID,Verifiable Credentials",2017-10-10,,,,,,,,,,,,,
|
||
IDramp,,IDramp,,Mike Vesey,,"USA, Iowa, Indianola",USA,,,IDramp,"The Identity Fabric for Cloud Directed Business<br>As your company grows, IDramp adapts to your changing needs.",,http://idramp.com,,Company,,Company,Enterprise,ID,SSI,,VCI,,,2016,https://github.com/idramp,https://twitter.com/identityramp,https://www.youtube.com/channel/UCjAZo4oNMynl7nha0Iq-6VA,https://idramp.com/id-news/,https://idramp.com/feed/,,https://www.crunchbase.com/organization/idramp,https://www.linkedin.com/company/identity-ramp/,,,,,
|
||
IDramp,IDramp,,,,Oracle,,,,HGF 2021,"Hyperledger Forum Recap – Identity Proofing, and Passwordless User-friendly Digital Identity","IDramp presented with Oracle at [Hyperledger Global Forum](https://events.linuxfoundation.org/Hyperledger-global-forum/) June 2021. The event focused on enterprise use of blockchain technologies using the 15 projects that fall under the Hyperledger “greenhouse”. Keynotes and speakers shared their insights on the current state of enterprise blockchain adoption across several hot topics including central bank digital currencies (CBDCs), non fungible tokens (NFTs), and most importantly– identity.","Hyperledger Forum Recap – Identity Proofing, and Passwordless User-friendly Digital Identity IDramp presented with Oracle at Hyperledger Global Forum June 2021. The event focused on enterprise use of blockchain technologies using the 15 projects that fall under the Hyperledger “greenhouse”. Keynotes and speakers shared their insights on the current state of enterprise blockchain adoption across several hot topics including central bank digital currencies (CBDCs), non fungible tokens (NFTs), and most importantly– identity. IDramp CEO, Mike Vesey presented with Mark Rakhmilevich, Senior Director, Blockchain Product Management at Oracle. In their session, titled “Identity Proofing Solution Combining HL Indy and Fabric”, Mike and Mark presented the benefits and ease of integrating an identity proofing solution based on Hyperledger Indy, Hyperledger Fabric, while leveraging the Oracle blockchain and how using two separate distributed ledgers makes the solution stronger. A few key points they discussed: - Adding verifiable credentials to proven identities transforms existing identity processes as we know, protecting the privacy of the user. - A properly implemented privacy-preserving system also has the byproduct of creating a secure and easy to use identity, something that is lacking in many existing systems today. 
- A tight identity proofing system can eliminate bad actors, reduce fraud and really strengthen the customer experience. - This is a repeatable process that customers, employers and other end users can go through to gain access to different services, and it actually provides a much better customer experience by taking out a lot of the complexity of usernames and passwords across different interfaces for different systems. When asked by a session attendee about the types of verification, Mike described how the system can scale in multiple verticals: “The framework that we showed here is very flexible, so it can easily be adapted to leverage whatever additional back-end verification might be necessary depending on where you’re creating the identity. If you are opening a bank account or getting a library card, there are all kinds of different requirements, and each organization can decide the level of verification necessary and then plug in those mechanisms.” They concluded the presentation with a demonstration, highlighting the Hyperledger Fabric piece of the solution that leverages the Oracle blockchain platform, which is Fabric-based. Using the context of the public sector, they showed how a government can affect multiple services by using consistent identity and processes. A key requirement for agencies and departments in the public sector is the ability to authenticate users against a single set of credentials for multiple applications. They don’t want users to have to set up identities for each application separately in its own silo. There needs to be an environment where a single set of credentials can be used across multiple applications. The demo showed how this type of decentralized identity system can be deployed using existing investments and can help downstream systems gather information and make better decisions. 
Mike went on to say, “The really important takeaway from the presentation and the demonstration here is how the mix of technologies forms a really simple solution. It’s very easy to integrate and build this dynamic verification system, and we’re providing some significant benefit to the end user by making it simpler and easier for them to interact. What really makes the project and the solution special is that we’re not trying to fix everything with a single tool; we’re using best-of-class solutions, as Mark indicated earlier, to really provide the best experience for both data integrity and security as well as the user experience.” View the slides from this session. IDramp delivers tools and services for enterprise identity with a zero trust approach. The IDramp platform transforms how business is done. Bridging legacy corporate Identity and Access Management technology with flexible and easy-to-use distributed ledger technology, companies and governments can change the way they interact with their customers and employees. Orchestrate your systems today Contact us for a demo on the IDramp suite of tools and services",https://idramp.com/hyperledger-forum-recap-identity-proofing-and-passwordless-user-friendly-digital-identity/,,Post,,Ecosystem,,,,,,,,2021-07-02,,,,,,,,,,,,,
|
||
IDramp,IDramp,,,,Doc Searls; Katherine Druckman,,,,,IDramp Identity Solutions – Reality 2.0 Podcast,"Doc Searls and Katherine Druckman talk to Mike Vesey, CEO of IDramp, about verifiable credentials, decentralization, and real-world identity solutions.","IDramp Identity Solutions – Reality 2.0 Podcast Doc Searls and Katherine Druckman talk to Mike Vesey, CEO of IDramp, about verifiable credentials, decentralization, and real-world identity solutions. About Reality 2.0 Podcast – Join Privacy and Open Source advocates, Doc Searls and Katherine Druckman, as they navigate the new digital world, covering topics related to digital privacy, cybersecurity, digital identity, as well as Linux and open source and other current issues. Doc Searls – co-author of The Cluetrain Manifesto (Basic Books, 2000, 2010), author of The Intention Economy: When Customers Take Charge (Harvard Business Review Press, 2012), a fellow of the Center for Information Technology & Society (CITS) at the University of California, Santa Barbara, and an alumnus fellow of the Berkman Klein Center for Internet & Society at Harvard University. He continues to run ProjectVRM, which he launched at the BKC in 2006, and is a co-founder and board member of its nonprofit spinoff, Customer Commons. He was recently editor-in-chief of the long-running premier open source publication, Linux Journal. Katherine Druckman – Katherine is a digital privacy and open source software enthusiast and advocate, longtime Digital Director for the late and highly esteemed Linux Journal, as well as a decorative arts history and wine enthusiast. She is currently an enthusiastic Drupal engineer. Mike Vesey, IDramp CEO – Mike has created several companies that provide transformational digital solutions for the global enterprise. He has developed award-winning products in unified communications, service operations, security, and data management. 
Mike co-founded WebCentric Communications in order to develop innovative telecommunications solutions for modernizing call center integration. WebCentric was awarded a patent for its ‘click to dial’ technology, which is widely used in call centers today. Mike went on to co-found DBVisions Inc to develop an enterprise grade content management and data security platform. The DBVisions platform was eventually acquired by a leading content management system provider. Michael then founded VCI Inc to focus on identity and security integration. VCI developed the first enterprise Single Sign on solution for Microsoft Live Meeting and a range of related collaboration products. VCI was the exclusive Microsoft partner for deploying Live Meeting into global enterprise environments. VCI has developed and maintained complex identity management integrations with some of the world’s largest organizations. Mike’s success led him to design IDramp, a decentralized integration fabric focused on identity orchestration, password elimination, and service delivery. The platform includes support for groundbreaking Self Sovereign Identity, verifiable credentials, and distributed ledger networks that make it easy for organizations to implement state-of-the-art Zero Trust identity protection. Orchestrate your systems today Contact us for a demo on the IDramp suite of tools and services",https://idramp.com/idramp-and-identity-solutions-reality-2-0-podcast/,,Post,,Explainer,,,,,,,,2022-02-21,,,,,,,,,,,,,
|
||
IDramp,IDramp,,,Mike Vesey,,,,,,Lessons From the School of Cyber Hard Knocks Podcast,"Passwords and zero-trust and pink locker rooms, oh my! In this episode, Mike discusses IDramp, what self-sovereign identity is, why we still have passwords today, zero-trust, what the near future holds, pink locker rooms!, his path to IDramp, and as always, his toughest lesson learned.","Lessons From the School of Cyber Hard Knocks Podcast This podcast is about successful cyber leaders and their toughest lessons in the cyber battlegrounds. School of Cyber Hard Knocks Podcast – IDramp CEO Mike Vesey: Pink Locker Rooms Passwords and zero-trust and pink locker rooms, oh my! In this episode, Mike discusses IDramp, what self-sovereign identity is, why we still have passwords today, zero-trust, what the near future holds, pink locker rooms!, his path to IDramp, and as always, his toughest lesson learned. For more School of Cyber Hard Knocks Podcast episodes please visit Runsafe Security Orchestrate your systems today Contact us for a demo on the IDramp suite of tools and services",https://idramp.com/school-of-cyber-hard-knocks-podcast-mike-vesey-pink-locker-rooms/,,Episode,,Explainer,,,,,,,,2021-12-30,,,,,,,,,,,,,
|
||
IDramp,IDramp,,,,,,,,,Orchestrate your identity management strategy,"It’s time to stop buying expensive bridges to Failureland. It’s time to shift our perspective on identity management away from what clearly doesn’t work and won’t work and instead, employ technologies that make the systems we have work better. It is time to focus on the technologies that simplify identity management, can be easily integrated, and provide a path to evolution at a pace and cost that meet business and government needs.","Orchestrate your identity management strategy Public sector has often lagged behind the marketplace when it comes to digital innovation. But when it comes to identity management, it’s now a leader, placing identity management at the center of zero trust initiatives. As Carole House, the Cybersecurity and Secure Digital Innovation director for the White House National Security Council, told a recent virtual conference, “Identity sits at the heart of any zero trust implementation.” But does zero trust sit at the heart of current commercial solutions dominating the marketplace, used by the federal government? We don’t think so. While identity management is supposed to be a solution, it’s increasingly turned into its own, special kind of headache: It’s often difficult to deploy and operate, is too rigid to encompass the diversity of essential business applications and is underpowered to meet emerging security approaches like zero trust and integrate with new services. Worse, the solution to what should already be a solution to this mess is to re-platform every few years, convinced that, this time, it will be different. You hold out hope that it will be finished on time, it will reduce friction and not add to the poor user experience, that it will meet all your business needs and accommodate new services and technologies — and while doing all this, it won’t cost a fortune. It’s time to stop buying expensive bridges to Failureland. 
It’s time to shift our perspective on identity management away from what clearly doesn’t work and won’t work and instead employ technologies that make the systems we have work better. It is time to focus on the technologies that simplify identity management, can be easily integrated, and provide a path to evolution at a pace and cost that meet business and government needs. This approach is called identity orchestration. Why more of the same legacy management won’t work For the past two decades, digital businesses have used monolithic centralized and federated platforms to manage identity. Known as “walled gardens,” these platforms absorb, hold, and control immense amounts of customer data. As platforms, they are complicated to operate and slow to change. Unsurprisingly, an industry of identity providers has created many variations on these systems, all promising to deliver newer and better solutions and all competing against each other with essentially the same product features. In parallel, cloud applications are embedding identity management features in their centralized directories, making digital identity harder to manage and protect. Modern businesses are now using multiple cloud providers and hundreds or even thousands of online services. A single centralized identity management platform strategy is no longer viable. Businesses need to manage many identity management features across many service providers. They need to reduce the risks of identity data sprawl across multiple centralized directories. They need consistent trust policies that provide a secure, smooth customer experience across all services. Adopting and removing features and services must happen at the speed of business. For all these reasons, the generation of centralized, monolithic walled gardens is not sustainable. 
New versions of the same centralized approaches don’t solve the underlying problems in digital identity management: fragile security, the reliance on centralized storage of Personally-identifying information (PII) for verification, and all the privacy and consent headaches this creates; they just add more cost and more complex implementation roadmaps. Identity Orchestration makes your legacy system work for you Complex operations, slow migrations, poor user experience, and the vulnerabilities of identity sprawl can be solved through a simple decentralized identity orchestration strategy. With this approach, an ID-orchestration fabric is used to quickly add and remove features, tailor the customer experience, and provide consistent trust policies across any range of service providers. Instead of focusing on one centralized platform with extended customization, the orchestration strategy focuses on no-code integration, and rapid deployment for flexible, secure user experiences. The fabric automates complex integration and policy management across service providers to reduce operational cost and increase business velocity and security. Decentralized identity orchestration gives you a way to easily solve these challenges without needing platform upgrades or advanced development skills. It automates trust policies and integration across disparate providers with zero code. It unifies your identity landscape into an agile fabric that allows you to quickly design tailored user experiences that are more secure and friendly. And, critically, it provides a simple, elegant way to easily manage the continuous verification required by zero-trust security approaches. Adapt now to manage Web 3.0 If walled-garden platforms and centralization have failed to remedy identity issues in Web 2.0, they are going to struggle to get a grip on the massive scaling of identity in Web 3.0 to encompass machines and even non-digital objects. 
To put it bluntly, Web 3.0 is not going to be secure without decentralized, portable identities with robust privacy control features. Orchestration is going to be indispensable to managing these emerging and fast-moving digital ecosystems and securing your business. The time to build a way to manage all this is now. The benefit will start with the end of expensive centralized platforms that cannot deliver the protection, flexibility, and privacy we need online today. Decentralized identity orchestration is your smart exit strategy, an off-ramp from more centralized expense—and a gateway to the future. This article was first written by IDramp for biometricupdate.com. To see the original post CLICK HERE. Begin your digital identity transformation now! Orchestrate your systems today Contact us for a demo on the IDramp suite of tools and services",https://idramp.com/orchestrate-your-identity-management-strategy/,,Post,,Explainer,,,,,,,,2022-05-18,,,,,,,,,,,,,
|
||
IDramp,IDramp,,,,,,,,,Zero Trust & Decentralized Identity Podcast,"They explore low-code/no-code orchestration services, what to consider when making long-term complex identity decisions, and what the US is doing to protect Americans from sophisticated cyber threats after the White House issued Executive Order 14028 on Improving the Nation’s Cybersecurity.","Zero Trust & Decentralized Identity Podcast On this week’s State of Identity, host Cameron D’Ambrosi welcomes Mike Vesey, CEO at IDramp for an action-packed discussion surrounding zero-trust frameworks, identity orchestration, and interoperability. They explore low-code/no-code orchestration services, what to consider when making long-term complex identity decisions, and what the US is doing to protect Americans from sophisticated cyber threats after the White House issued Executive Order 14028 on Improving the Nation’s Cybersecurity. Host : Cameron D’Ambrosi, Managing Director at Liminal Guest: Mike Vesey, CEO at IDramp About State of Identity (SOI) – is the identity industry’s leading podcast. Each week host Cameron D’Ambrosi brings together the greatest minds in identity for an open discussion on the present and future technologies, companies, people, and paradigms that define who we are in the modern world, and how the world defines us. This podcast was first published for liminal.co. To visit the original post CLICK HERE. Orchestrate your systems today Contact us for a demo on the IDramp suite of tools and services",https://idramp.com/zero-trust-decentralized-identity-podcast/,,Episode,,Explainer,,,,,,,,2022-05-23,,,,,,,,,,,,,
|
||
IDramp,IDramp,,,,Trust Stamp,,,,,Trust Stamp partners with IDramp to transform multi-factor biometric authentication,"Trust Stamp (Nasdaq: IDAI, Euronext Growth: AIID ID), the Privacy-First Identity Company™ providing AI-powered trust and identity services used globally across multiple sectors, announces a partnership with IDramp, a leader in decentralized identity orchestration products and services. Together, they will launch an innovative biometric multi-factor authentication (“MFA”) offering that can be augmented with a range of leading access management, social sign-on, and bring-your-own identity services, all through IDramp’s no-code platform.","Atlanta, GA, May 19, 2022 (GLOBE NEWSWIRE) — Trust Stamp (Nasdaq: IDAI, Euronext Growth: AIID ID), the Privacy-First Identity Company™ providing AI-powered trust and identity services used globally across multiple sectors, announces a partnership with IDramp, a leader in decentralized identity orchestration products and services. Together, they will launch an innovative biometric multi-factor authentication (“MFA”) offering that can be augmented with a range of leading access management, social sign-on, and bring-your-own identity services, all through IDramp’s no-code platform. Built on the Company’s advanced biometric tokenization technology, Trust Stamp’s transformative approach to Biometric MFA™ streamlines trust assurance with a simple selfie. With a global rise in cybercrime associated with digital operations, the high security and ease of use of Biometric MFA™ make it a powerful addition to authentication processes at all levels of risk, from standard account access to financial transaction authentication. Paralleling Trust Stamp’s streamlined privacy-first identity offerings, IDramp delivers dynamic Zero Trust identity orchestration through passwordless credentials on a no-code basis. 
Organizations can leverage leading identity solutions across providers from one location, enabling rapid custom implementation of robust multi-factor authentication flows. IDramp simplifies identity orchestration across disparate systems to strengthen and accelerate identity assurance. Trust Stamp Chief Commercial Officer Kinny Chan comments, “IDramp uniquely complements Trust Stamp’s own Biometric MFA and custom end-to-end identity workflow solutions with a platform that enables frictionless migration between identity providers. By unifying top identity services in one no-code platform, IDramp delivers the best in identity authentication while addressing complex and evolving assurance needs across individual touchpoints for efficient, fraud-resistant digital operations. Trust Stamp’s biometric authentication and tokenization technology delivered through IDramp’s platform fills a pressing market need for robust security, flexibility, and speed in establishing trust. This partnership expands the reach of our biometric technology to deliver meaningful value to IDramp’s impressive client base. With a shared focus on data privacy, protection, security, and usability, we look forward to our continued strategic work with the IDramp team.” IDramp CEO Mike Vesey comments, “Our customers manage digital ID across a wide variety of disparate environments. They need robust fraud protection that is flexible and easy to use. Trust Stamp transforms digital identity with world-class biometric security, bulletproof data protection, and state-of-the-art fraud detection. Combined with IDramp’s decentralized orchestration platform, Trust Stamp will plug and play into any combination of multi-cloud, multi-IDP, and even Web3.0 environments. This powerful combination provides unmatched agility and superior Zero Trust fraud protection for any digital ecosystem. 
Zero code, no passwords, and no expensive, slow-moving migrations required.” About Trust Stamp Trust Stamp, the Privacy-First Identity Company™, is a global provider of AI-powered identity services for use in multiple sectors including banking and finance, regulatory compliance, government, real estate, communications, and humanitarian services. Its technology empowers organizations with advanced biometric identity solutions that reduce fraud, protect personal data privacy, increase operational efficiency, and reach a broader base of users worldwide through its unique data transformation and comparison capabilities. Located in seven countries across North America, Europe, Asia, and Africa, Trust Stamp trades on the Nasdaq Capital Market (Nasdaq: IDAI) and Euronext Growth in Dublin (Euronext Growth: AIID ID). Founded in 2016 by Gareth Genner and Andrew Gowasack, the company now employs over 100 people. About IDramp IDramp provides identity orchestration for a multi-cloud, decentralized, Web3.0 world. We automate the composable enterprise so your organization can deploy applications and services wherever you want, using the identity features you need. IDramp provides Zero Trust control over disparate multi-cloud environments, ID systems and applications. Combine traditional identity management with the latest Web3.0 innovation and blockchain identity. Design distinct user experiences with any combination of features, including biometrics, fraud detection, MFA, document proofing and much more. IDramp is built on open standards. As a founding and steering member of the Trust Over IP Foundation, Linux Foundation Cardea project, Indicio trustee and node operator, and Sovrin network steward, IDramp is committed to open source interoperability for state-of-the-art security, privacy and agility. 
Safe Harbor Statement: Caution Concerning Forward-Looking Remarks All statements in this release that are not based on historical fact are “forward-looking statements” including within the meaning of the Private Securities Litigation Reform Act of 1995 and the provisions of Section 27A of the Securities Act of 1933, as amended, and Section 21E of the Securities Exchange Act of 1934, as amended. The information in this announcement may contain forward-looking statements and information related to, among other things, the company, its business plan and strategy, and its industry. These statements reflect management’s current views with respect to future events based on information currently available and are subject to risks and uncertainties that could cause the company’s actual results to differ materially from those contained in the forward-looking statements. Investors are cautioned not to place undue reliance on these forward-looking statements, which speak only as of the date on which they are made. The company does not undertake any obligation to revise or update these forward-looking statements to reflect events or circumstances after such date or to reflect the occurrence of unanticipated events. Orchestrate your systems today Contact us for a demo on the IDramp suite of tools and services",https://idramp.com/trust-stamp-partners-with-idramp-to-transform-multi-factor-biometric-authentication/,,Post,,Meta,,,,,,,,2022-07-19,,,,,,,,,,,,,
|
||
IDramp,TalkCMO,,,,Qiqochat,,,,,IDramp and QiqoChat Announce Verifiable Credentials for Online Collaboration,"QiqoChat has really stepped up in this time of need to provide an incredible online event user-experience, enabling a re-creation of the IIW experience throughout our Covid travel restrictions. This week they announced the launch of a Verifiable Credentials integration with the QiqoChat platform.","IDramp and QiqoChat have launched the world’s first implementation of verifiable personal identity credentials for virtual conferences and collaboration. This new form of digital identity provides QiqoChat customers with a self-sovereign privacy-focused solution protected by state-of-the-art cryptographic blockchain security. QiqoChat participants can now have full control over their digital identity, but what are the other benefits? Verifiable credentials allow people to bring their own identity to any online service. User experience is improved by eliminating usernames and passwords. Privacy is increased by removing any need to share personal data with 3rd party services like Google or Facebook. Security is fortified by not storing personal data in central databases. All personal data remains on the user device. Cost and liability are reduced by removing the need for monolithic identity infrastructure. Performance is increased by removing the need to move all user authorization traffic through one centralized location. “Qiqochat is leading the way in adoption of verifiable credentials. It is an innovative collaboration platform focused on emulating in-person experiences online. Personal credentials share that goal by making your digital identity verifiable and private. Just like your driver’s license or birth certificate. Verifiable credentials are a perfect complement to QiqoChat or any online service. 
IDramp allows service providers to adopt verifiable credentials quickly without the need to re-platform or develop code. IDramp is built on open standards for verifiable credentials and is compatible with all other standards-based providers. Deploying verifiable credentials to QiqoChat required only a few hours of configuration and testing but the impact is profound,” said Mike Vesey, CEO of IDramp. “The community of professionals working on data privacy & consumer protection has been an early adopter of QiqoChat. During regional and global conferences, they have used the platform to share ideas and deliberate about the future of user-centric identity. Through these conferences, we’ve learned how solutions like IDramp can be tremendously empowering for Internet users. We are thrilled to implement this initial partnership with IDramp so that we can begin to explore what becomes possible when we let users take control of their own identity on our platform.” – Lucas Cioffi, CEO of QiqoChat",https://talkcmo.com/news/idramp-and-qiqochat-announce-verifiable-credentials-for-online-collaboration/,,Press,,Meta,,,,,,,,2021-01-05,,,,,,,,,,,,,
|
||
IDramp,IDramp,,,,Oracle,,,,,Passwordless Credential Orchestration Manager is Now Available in the Oracle Cloud Marketplace,"This new service offers password elimination, identity proofing, and orchestration capabilities for any Oracle ecosystem.","Enable Passwordless Zero Trust for Oracle services and applications today IDramp Announces Passwordless Credential Orchestration Manager is Now Available in the Oracle Cloud Marketplace DES MOINES, Iowa, November 24, 2021 – IDramp, a leading provider of Zero Trust identity orchestration services, today announced their new Passwordless Credential Orchestration Manager (PCO) service is now available on the Oracle Cloud Marketplace. This new service offers password elimination, identity proofing, and orchestration capabilities for any Oracle ecosystem. It operates with Oracle Cloud Infrastructure (OCI) and applications using Oracle blockchain. The Oracle Cloud Marketplace provides a broad range of partner solutions for accelerating and optimizing cloud and hybrid deployments. Oracle customers can easily secure their applications using zero-trust passwordless identity orchestration from PCO. IDramp PCO uses verifiable credential cryptography to remove the need for usernames and passwords that are prone to identity theft. It simplifies delivery of services by providing unified trust policies and rapid deployment of applications across disparate systems. Oracle customers can create verifiable digital credentials with PCO using Oracle Identity Cloud or any other OCI data source. PCO is a Zero Trust identity fabric that is easy to connect with all OCI applications and third-party services. PCO allows OCI assets to be more portable, stronger, and easier to access. The Passwordless Credential Orchestration Manager is an ideal solution for Oracle customers wanting a simpler, safer, and more efficient way to access the breadth of applications and services offered by Oracle and Oracle Cloud Marketplace. 
“It frees them from password vulnerabilities, big-bang migrations, and being tied to a single provider,” said Mike Vesey, CEO of IDramp. “Oracle customers use anywhere from a few to hundreds of applications, each offering unique business solutions. Now they can deploy these vital services with cutting-edge Zero Trust security and unprecedented flexibility that moves at business speed.” The Oracle Cloud Marketplace is a one-stop shop for Oracle customers seeking trusted business partners that offer unique business solutions to extend Oracle Cloud Applications. Oracle Cloud Infrastructure is a next-generation enterprise cloud that delivers next-generation security across a comprehensive portfolio of services and applications. The IDramp Passwordless Credential Orchestration Manager is also available as a stand-alone product that works with any leading platform or diverse cloud environment. The PCO system architecture and design stem from decades of experience in enterprise identity, security, and service delivery. IDramp is a pioneer in using verifiable credentials for decentralized Zero Trust in the enterprise. With deep roots in the open-source community, IDramp is also a founding member of the Trust over IP Foundation, member of the Linux Foundation Cardea steering committee, member of the Good Health Pass Collaborative, a Steward of the Sovrin Foundation, and a node operator on the Indicio Network. About IDramp With IDramp, you can orchestrate passwordless identity using decentralized zero trust technologies that work with existing identity systems. Secure the future with IDramp today. Orchestrate your systems today Contact us for a demo on the IDramp suite of tools and services",https://idramp.com/idramp-passwordless-credential-orchestration-manager-is-now-available-in-the-oracle-cloud-marketplace/,,Product,,Product,,,,,,,,2021-11-29,,,,,,,,,,,,,
|
||
Indicio,,Indicio,,Frances Donegan-Ryan; Heather Dahl; Ken Ebert,,"USA, Washington, Seattle",USA,,,Indicio,"Indicio is Empowering Trust<br><br>Indicio provides companies the ability to create and manage Trusted Data Ecosystems for the exchange of high-value information and data assets, the creation of marketplaces, and the development of new business models around trusted data. <br><br>Specializing in financial, healthcare, and travel markets, Indicio’s global decentralized network and its software and information management products enable customers all over the world to issue, hold, and verify data through encrypted digital credentials. <br><br>Our software and infrastructure allow companies to confirm data authenticity repeatedly and efficiently from its source without the expense or risk of direct integrations. Privacy-by-design architecture simplifies data compliance and deploys continuous Zero-Trust security, boosting bottom-line profit, mitigating costly risks, and enhancing an institution’s reputation for information privacy.<br><br>Contact us for quick implementation of trusted digital ecosystems today.","Indicio’s public benefit mission is to advance decentralized identity. How did we do in 2022? By Trevor Butterworth... Employment verification made easy Issue, verify, and scale tamper-proof, privacy-preserving digital employee credentials. Build, innovate, and scale with Indicio on Google Cloud One-click procurement to begin creating, sharing, and verifying data. 
The next step in scalable self-sovereign identity from the leaders in open-source decentralized identity Prove Anything A complete starter kit to easily adopt open source decentralized verifiable digital credentials, integrate them into your existing systems, and build complete Trusted Digital Ecosystems that you fully own Issuer and Verifier Simple software to connect, issue, and verify credentials; APIs available Maintenance and Updates Managed updates and comprehensive testing to ensure maximum performance Mobile App and Mediator Software for users to download, store, and use a credential on mobile devices Decentralized Ledger Network Run on the Indicio Networks or any public or private Hyperledger Indy-based network Verifiable Credential Templates for creating verifiable credentials using open source standards Support and Training Continuous customer support and field-leading training from industry experts Machine Readable Governance Agent software to establish trusted issuers and automate information flows via governance files Indicio implements gold standard credential types, such as AnonCreds for privacy-preserving selective disclosure and predicate proofs. Indicio uses JSON-LD for publicly shareable credentials Introducing Holdr+ Indicio’s new mobile app to hold, connect, and communicate using your verifiable digital credentials What will you do with verifiable digital credentials? 
Indicio customers are using verifiable credentials to… - Lower KYC and onboarding costs - Create seamless travel experiences - Manage and share trusted device and asset data - Portable health information without direct integration Success story: An award-winning verifiable credential solution for travel SITA, the leading global provider of technology to the air transport industry, and the island of Aruba’s Health Department chose Indicio to develop a privacy-preserving digital health credential for visitors to prove they had tested negative for COVID-19. Watch the demonstration video by SITA to see how verified data created a Trusted Digital Ecosystem. Indicio provides everything you need to take advantage of verified credential technology, decentralization, and trusted data Customized Solutions Adopt verifiable credentials at your own pace, built from open standards on open source technology, without being locked-in to a particular vendor or relying on expensive solutions. The Indicio Network A MainNet designed for mission critical deployments, TestNet for building, TempNet for stress testing, and a DemoNet for demonstrations. All with continuous technical support from expert staff. Hosting We provide hosting for enterprise-grade solutions, managed nodes as a service, and customized public and private networks—all with continuous, expert, technical support. Learning Academy Indicio is the leading provider of instructor-led training in open source decentralized identity. Experience our hands-on, customizable workshops for every skill level. Business and Marketing Get help on every step of your journey adopting open source verifiable credentials. Get your project from pilot to production. A leader in open source digital identity Indicio’s memberships and active partnerships Indicio leads and is actively involved in many community standards groups and projects, promoting interoperability, innovation, and open source methodology. 
Indicio strongly believes in the development of open source technology for decentralized identity applications, viewing it as key to adoption and scale. Indicio regularly contributes critical technology, knowledge, and insights to open source community projects. Latest from our blog Indicio Public Benefit Report 2022 Governments Go Digital (Identity) Proven Works — The Future of Employment Verification",https://indicio.tech/,,Company,,Company,Enterprise,ID,SSI,,VCI,Indy,,2020-04-15,,,,https://Indicio.tech/blog/,https://Indicio.tech/feed/,,https://www.crunchbase.com/organization/Indicio-tech,https://www.linkedin.com/company/indiciotech/,,,,,
|
||
Indicio,Indicio,,,,,,,,,Become a Node Operator,"we’ve seen a rapid rise in demand for robust, stable, and professionally maintained networks to support decentralized identity solutions. It’s not a surprise: decentralized identity’s moment has arrived. That’s why we’ve been hard at work creating Hyperledger Indy networks upon which developers all over the world are building, testing, and launching their solutions.","Join the growing list of forward-thinking companies and organizations across the globe who are actively building the future of digital identity. This is your chance to be a part of the newest and most dynamic network in decentralized identity technology, open for innovative developers and companies eager to bring their solutions to market. At Indicio, we’ve seen a rapid rise in demand for robust, stable, and professionally maintained networks to support decentralized identity solutions. It’s not a surprise: decentralized identity’s moment has arrived. That’s why we’ve been hard at work creating Hyperledger Indy networks upon which developers all over the world are building, testing, and launching their solutions. Powering these networks are Node Operators— companies and teams from around the world and from various industries who are designing and launching decentralized identity solutions. What is a Node Operator? At the heart of a decentralized identity ecosystem lies the distributed ledger— a distributed database made up of multiple copies of a ledger, hosted by various nodes. In practice at Indicio.tech, this means companies and organizations, together as a community, volunteer to run a copy of the ledger on a server that is under their authority. On the Indicio Network, we call these “Node Operators.” Together, these copies make up a verifiable data registry, from which credential issuers and verifiers can prove important information. 
Set your solutions up for success by becoming a Node Operator Be where the action is happening We’re creating a community of doers, made up of companies worldwide who are creating digital identity solutions for use cases of all kinds, including banking, education, supply chain, travel, and humanitarian efforts. As a node operator, you’ll be on the frontline of innovation, playing a leading role in this world-changing digital transformation. Get access to resources Node Operators are eligible to receive a complimentary business support package for their first year in the program, including architectural guidance, best practice checks, an account-dedicated Slack channel, and a dedicated network engineer monitoring your environment and assisting you with your needs. We also help our node operators prepare their presentations and marketing materials for webinars and informational events. Learn by doing There’s no better way to get trained on how a decentralized identity ecosystem works than to play a critical role in the ecosystem itself. Supporting one of the nodes on the network gets your team a front-row view of how a network functions from the inside. We’ve seen firsthand how operating a node speeds up a company’s ability to develop and deploy their own solutions. Take part in community events Indicio hosts community events, such as monthly Node Operator sync-ups and spotlights, giving our Node Operators a platform to showcase, demonstrate, and discuss their solutions. We help keep our node operators up-to-speed by discussing new open source tools, improvements, network updates, and standards progress, as well as help them identify business opportunities. Make identity simpler The decentralized identity world can be daunting for newcomers and veterans alike. There are myriad working groups, governance bodies, standards organizations, and cross-industry initiatives. 
While these all play a vital role in the development and adoption of the technology, they can often lead to “information overload” and distract your team from developing a refined, commercial-ready product. We’re here to help our Node Operators make sense of the tools and information available to them in the community, saving them valuable time, money, and resources. We don’t just talk the talk. We understand business demands and work closely with Node Operators to get to market fast. Concerned running a node might be too challenging? Our “Node Operator as Service” option can take care of your network needs, leaving you free to focus on building your identity solution and participate in the Node Operator community. Indicio can host your node on a service of your choice, maintaining it with business-critical updates. Apply today and join a community of builders leading the way in digital identity innovation.",https://indicio.tech/be-a-part-of-the-most-dynamic-network-community-in-decentralized-identity/,,Post,,Ecosystem,,,,,,,,2021-02-17,,,,,,,,,,,,,
|
||
Indicio,Indicio,,,,,,,,,Indicio launches blockchain-enabled network for identity,"“Our clients asked for a stable, fully-staffed network based on Hyperledger Indy— one that could provide the Service Level Agreements their customers need for mission-critical workloads,” said Heather Dahl, CEO of Indicio. “Today, we are excited to announce that this MainNet is open for business.”“This is the network we need to accelerate adoption of passwordless zero trust ecosystems for enterprise customers” said Mike Vesey, President of [IDramp](https://IDramp.com), a leader in decentralized identity and a Genesis Node Operator on the Network.","Professionally-staffed MainNet supports mission-critical, enterprise-grade decentralized identity market solutions Technology provider Indicio.tech, a public benefit corporation advancing decentralized identity software and solutions, today announced the public availability of the Indicio MainNet, a professionally-staffed decentralized identity network designed for global enterprises that need a reliable platform to develop and scale identity services and products. The development of the Hyperledger Indy-based network follows on the successful deployment of the Indicio TestNet, a market leader in decentralized identity networks. The Indicio MainNet uses distributed ledger technology—multiple identical databases spread across different nodes—to enable the use of privacy-preserving verifiable digital credentials. This provides the foundation for flexible, portable, and permanent digital identities that are always under the control of the identity holder—the individual—and which provide an evolutionary leap forward in security. “Our customers asked for a stable, fully-staffed network based on Hyperledger Indy— one that could provide the Service Level Agreements their customers need for mission-critical workloads,” said Heather Dahl, CEO of Indicio. 
“Today, we are excited to announce that this MainNet is open for business.” “This is the network we need to accelerate adoption of passwordless zero trust ecosystems for enterprise customers,” said Mike Vesey, President of IDramp, a leader in decentralized identity and a Genesis Node Operator on the Network. “Our customers are developing service delivery ecosystems that require world-class support and leading-edge features managed by a team with deep technical experience. The Indicio network provides exactly that.” “The Indicio Network enables GlobaliD to deliver a digital identity platform that puts you in control of your identity and your data,” says Mitja Simcic, CTO of GlobaliD, one of the first companies to use Indicio’s MainNet. “Most digital identity platforms take ownership and control of your digital identity and your data for their own purposes. For instance, social media companies make money from selling your data to unauthorized third parties. Indicio is creating an ecosystem for providers that are working to make this practice obsolete. This network is bringing real change to real people, all over the world.” The Value of Decentralized Identity Decentralized identity allows individuals to control their own data and solves the privacy and security issues that undermine current models for handling identity online. This privacy-preserving model for identity, where everyone controls their own information, makes it easy for companies and organizations to comply with data privacy laws, makes business partner integrations more secure, and does away with the need for third parties to manage and hold personally identifiable information (PII). It is important to note that as part of Indicio’s governance, no personal data, such as names, addresses, or birth dates, are written to any of the Indicio Network ledgers. 
Instead, machine-readable cryptographic information identifies the issuer of the credential and the details that demonstrate the credential is authentic. With just a few writes to the Indicio MainNet, millions of credentials can be issued, all pointing to the same few ledger writes, making the system easily scalable. How to use the Indicio MainNet Anyone using technology to verify a verifiable credential that is presented to them may access the Indicio MainNet for free. Several wallets currently in production now point to the Indicio Network, enabling credentials to be issued on, and read from, the Indicio Network. Global innovators interested in becoming part of the Indicio Network are welcome to become an Indicio Node Operator. This diverse, supportive, and collaborative network of dynamic companies works together to support a copy of the ledger while helping to advance decentralized identity. Learn more about the other benefits of becoming a Node Operator. Finally, those that want to use the publicly available ledger as a platform for an identity solution may write directly to the Indicio Network. Go here to learn more about how you can write to the Indicio Network today! As part of our commitment to advancing decentralized identity, Indicio is committed to being a resource hub by providing enterprise-grade open source tools so that everyone can start building solutions today. In addition to the public MainNet, Indicio offers a TestNet, and builds Private Networks. Indicio’s customizable, instructor-led training programs are an excellent introduction to understanding how decentralized identity works, and scale to all levels of expertise. A note on environmental impact As a public benefit corporation, Indicio takes environmental impact seriously. The use of a distributed ledger in decentralized identity does not involve “proof of work” or mining, both of which entail substantial energy costs. 
Instead, with the optimal network size being 25 or fewer nodes, writing to the Indicio MainNet requires energy comparable to logging into a website or sending a form. Much of the activity in a decentralized identity ecosystem takes place off ledger. This all makes decentralized identity a low-energy consumption practice.<br><br>ABOUT US<br><br>Indicio.tech provides technology development services for decentralized identity and offers a complete software ecosystem for business, consumer, and mobile applications to issue, verify, and exchange verifiable digital credentials. Founded on the belief in privacy and security by design, Indicio supports the open source and interoperability goals of the decentralized identity community. As a Public Benefit Corporation, Indicio is committed to advancing decentralized identity as a public good that enables people to control their identities online and share their data by consent. Identity and application teams rely on Indicio’s simplicity, extensibility, and expertise to make identity work for everyone.",https://indicio.tech/indicio-launches-blockchain-enabled-network-for-identity/,,Post,,Ecosystem,,,,,,Indy,,2021-03-17,,,,,,,,,,,,,
Indicio,Indicio,,,,,,,,,Indicio Tech: Why we converted to a public benefit corporation,"The idea of a benefit corporation begins with long-simmering dissatisfaction in the argument that the only responsibility or duty a company had was to increase its profits, a claim that had been forcefully made by University of Chicago economist Milton Friedman in the New York Times Magazine in 1970.","In December, Indicio.tech reincorporated as a public benefit corporation, joining a worldwide movement committed to aligning profit with a positive material impact on society. For Indicio, it has always been clear that decentralized identity benefits the public—that is what brought us, the founders, together. It solves a massive structural flaw in the architecture of life online: The lack of an effective way to encode uniqueness and thereby verify individual identity; and it does so in a way that removes the need for third parties to control and store personally identifying information. Decentralized identity allows people to give meaningful consent to sharing their data in a maximally private and secure way. It answers the deep disquiet over the misappropriation of personal data that has been given a voice in data privacy regulation—and it makes compliance with such laws easy. All of these are public “goods.” Now, add in decentralized identity’s capacity to help those who have no formal, legal identity, those who are stateless, those who are refugees—a number estimated at over a billion people—to prove that they exist, secure access to health and financial services, and establish rights over property. To dream this big we have to formulate achievable, incremental steps to get there. We have to create the technology and infrastructure that can realize these public goods; we have to make the tech interoperable and, wherever possible, open source. We have to make it as easy as possible to understand, use, and adopt. 
We have to build use cases and help others build use cases to reveal its value. As Indicio grew, and as we saw decentralized identity as an ecosystem that needed to be seeded and cultivated, the public benefit corporate model became more and more compelling as a way of ensuring that our beliefs and values were baked into this mission. But we also saw the benefit corporation as a way of encoding a positive and inclusive culture inside our company. If each team member is genuinely valued for the work they do, they will give their best to our customers; they will become the most effective advocates for our mission.<br><br>A brief overview of the benefit corporation movement<br><br>The idea of a benefit corporation begins with long-simmering dissatisfaction in the argument that the only responsibility or duty a company had was to increase its profits, a claim that had been forcefully made by University of Chicago economist Milton Friedman in the New York Times Magazine in 1970. Arguing that only an individual had responsibilities, and a corporation couldn’t be a person, Friedman defined a new era of shareholder supremacy in business. In practical terms, the easiest way to see whether a business was acting responsibly was to see if its share value was increasing, a simple metric that had profound consequences for the way a business or corporation was run. The CEO’s job became defined by what he or she did to increase their company’s share price. Shareholders didn’t need to buy into the reasons why the business was founded, or the vision of its founders, or even the value the company provided its customers and society: share price higher, company good. There was no obligation to think strategically beyond the short term, or to consider the welfare of the community, the environment, or the company’s employees. 
Dissatisfaction with the inflexibility of this model from the business side and growing public interest in economic and environmental sustainability and social responsibility helped to open up a legal middle way between for-profit and nonprofit corporations. The “benefit” corporation was the result, and the first benefit corporation legislation was introduced in Maryland in 2010. Simply put, profit and public benefit can be combined in a way that allows company directors to balance shareholder and stakeholder interests in the pursuit of that public benefit. Many states now offer similar legislation. In Delaware, where Indicio is incorporated, such corporations are called public benefit corporations. The case for benefit corporations has been most forcefully put by one of the best-known B-Corps, Patagonia. In registering as the first California benefit corporation in 2012, founder Yvon Chouinard said, “Benefit corporation legislation creates the legal framework to enable mission-driven companies like Patagonia to stay mission-driven through succession, capital raises, and even changes in ownership, by institutionalizing the values, culture, processes, and high standards put in place by founding entrepreneurs.”<br><br>The social impact of technology<br><br>It’s not surprising that environmental impact has been central to defining the B-Corp movement and the companies that have embraced it.[1] But decentralized identity offers a similar opportunity for tech companies to think about the social impact of technology. We need to set standards for what the public should expect from technology companies and from decentralized identity. We need independent third parties, like B Lab, which was instrumental in creating the B-Corp model, to help codify and provide independent certification that we—and other tech companies—are walking the walk on digital identity, data privacy, and security when we build and govern decentralized identity infrastructure. 
At a time when “Big Tech” is looking more 19th century than 21st century in the way it acts—“Big Tech faces its Standard Oil moment” was an end-of-2020 headline in the Financial Times—a transformational technology like decentralized identity gives us an organic opportunity for a reset. We have the means to give people control of their identities, the right to share their data, and to give the identity-less legal agency in the world. We believe this will trigger a new wave of innovation that will benefit businesses and organizations too. But we believe, most of all, that it’s the right thing to do. A public benefit corporation is not just the way to do this, it’s the way to create a meaningful conversation in business about the role of technology in people’s lives—and to hold us accountable for all this talk.<br><br>[1] The use of a distributed ledger in decentralized identity does not involve “proof of work” or mining, both of which entail substantial energy costs. Instead, with the optimal network size being 25 or fewer nodes, writing credentials to the ledger requires energy comparable to logging into a website or sending a form. Much of the activity in a decentralized identity ecosystem takes place off ledger. This all makes decentralized identity a low-energy consumption practice.",https://indicio.tech/because-decentralized-identity-can-make-life-better-why-we-converted-to-a-public-benefit-corporation/,,Post,,Ecosystem,,,,,,,,2021-01-11,,,,,,,,,,,,,
Indicio,Indicio,,,Tim Spring ,Anonyome,,,,,Node Operator Spotlight: Anonyome,"Each of the capabilities of the Sudo Platform is attached to a persona. This includes masked email and masked credit cards, private telephony, private and compartmentalized browsing (with ad/tracker blocker and site reputation), VPN, password management, decentralized identity and more.","A distributed ledger is a database that has copies distributed across a network of servers (nodes), all of which are updated simultaneously. A network like this is the foundation of decentralized identity, a way of generating robust trust and collaboration free of the security risks of centralized databases. We call the companies and organizations that support an Indicio Network node on a server that is under their control “Node Operators.” Recently we caught up with Paul Ashley, CTO and Co-CEO of Anonyome Labs, a current Node Operator of Indicio, to discuss their current projects, some goals for the future, and where they think decentralized identity is heading.<br><br>Tell us about Anonyome: how did it start, where did it start, and who makes up your team?<br><br>The goal of Anonyome Labs is to shift the control of personal information back to normal users. Everything we do is recorded, collected, mined, profiled, stored, targeted and sold. The balance of power has shifted to the cabal of tech giants and data miners who overtly or covertly monitor and control what is seen, clicked, and cared about. At Anonyome Labs we build the tools that shift control of personal and private information from the big data miners back to the user. Anonyome Labs was founded in 2014 and is headquartered in Woodside, California, with teams in Salt Lake City, Utah, and Gold Coast, Australia. Anonyome Labs has about 70 employees – the teams have deep enterprise and consumer expertise across identity, cyber security, authentication, authorization, privacy and cryptography – with hundreds of granted patents. 
What are some of the products/services (Self-Sovereign Identity or not) that you currently offer? Who are your target customers? What sets you apart from the competition?<br><br>Anonyome Labs created the Sudo Platform to provide enterprise software developers with capabilities to add persona (Sudo)-based identity, privacy and cyber safety features to their applications. The Sudo Platform provides these enterprise software developers with mobile and web SDKs, sample apps, documentation and UI Kits to accelerate their application development. Each of the capabilities of the Sudo Platform is attached to a persona. This includes masked email and masked credit cards, private telephony, private and compartmentalized browsing (with ad/tracker blocker and site reputation), VPN, password management, decentralized identity and more. In addition, Anonyome Labs created the MySudo mobile application to put the same identity, privacy, and cyber security capabilities into the hands of normal users for their interactions with the online and offline world. Each user is able to create a number of personas (Sudos) and, with each of them, have access to various Sudo Platform capabilities.<br><br>What Self-Sovereign Identity/Decentralized Identity products/services are on your roadmap?<br><br>A key offering of the Sudo Platform is Decentralized Identity-based services. This includes both client (Edge Agent) and server (Cloud Agent) offerings. This allows the enterprise to become a Decentralized Identity Verifiable Credential Issuer and/or Validator. And it allows the enterprise’s users to take part in a decentralized identity ecosystem – by giving them a mobile wallet/agent to manage decentralized identities, connections and verifiable credentials.<br><br>What motivated your work in Decentralized Identity? Why did you become a node operator? What appeals to you in this field? 
We believe that Decentralized Identity is the most important innovation in identity to help normal users have control over their personal information as they interact with the online world. Given Anonyome’s focus on privacy and cyber safety, it was a natural extension to our Sudo Platform to add Decentralized Identity services. Anonyome Labs became a founding steward of the Indicio decentralized identity network in anticipation of using that network for our customers’ enterprise applications.<br><br>Where do you see the future of Self-Sovereign Identity/Decentralized Identity?<br><br>It is our belief that decentralized identity will become the core foundational technology of future privacy and cyber safety capabilities. Over time we will transition from the current privacy-invasive technologies to new systems founded on decentralized identity.<br><br>For more information about the Sudo Platform or any of their other products, go to Anonyome.com",https://indicio.tech/node-operator-spotlight-anonyome/,,Post,,Ecosystem,,,,,,,,2021-10-13,,,,,,,,,,,,,
Indicio,Indicio,,,Tim Spring ,IDramp,,,,,Node Operator Spotlight: IDramp,"Recently we caught up with Karl Kneis, COO of IDramp, and Eric Vinton, Chief Business Officer of IDramp, one of the first companies to become an Indicio Node Operator, to discuss their current projects, some goals for the future, and where they think decentralized identity is heading.","A distributed ledger is a database that has copies distributed across a network of servers (nodes), all of which are updated simultaneously. A network like this is the foundation of decentralized identity, a way of generating robust trust and collaboration free of the security risks of centralized databases. We call the companies and organizations that support an Indicio Network node on a server that is under their control “Node Operators.” Recently we caught up with Karl Kneis, COO of IDramp, and Eric Vinton, Chief Business Officer of IDramp, one of the first companies to become an Indicio Node Operator, to discuss their current projects, some goals for the future, and where they think decentralized identity is heading.<br><br>Tell us about IDramp: how did it start, where did it start, and who makes up your team?<br><br>IDramp was born from years of frontline experience in enterprise identity management and service delivery. With IDramp we wanted to reduce the pain and vulnerabilities that surround digital identity passwords, platform migration, operation, and service delivery. The cost and resource requirements of managing and replacing identity platforms can be astronomical. Operation requires special skills and complex customization. Migrations can take years to complete and often fail. Service delivery can be slow and require premium resources.<br><br>Our experience found that adopting decentralized, Zero-Trust identity principles will reduce cost while increasing security and accelerating the speed of service delivery. 
We founded IDramp to help remove passwords, automate expensive tasks, reduce the need for advanced skills, and simplify the adoption of new solutions, all while improving overall security through decentralized Zero Trust. Instead of reinventing identity management platforms every few years with mammoth projects, organizations can use IDramp to enjoy continuous adoption of new services and solutions at the speed of business. Decentralized verifiable credentials can easily be adapted to any service or system for advanced Zero-Trust protection and password elimination. No coding or long-term platform projects are required. People appreciate the improved privacy and simplified experience of passwordless ecosystems. Security authorities appreciate the reduced data liability and the stronger protection of Zero Trust credentials. Our team’s deep experience working through generations of multinational digital identity projects gives IDramp a unique perspective. We excel at solving complex problems with simple, effective solutions that improve the bottom line.<br><br>What are some of the products/services (Self-Sovereign Identity or not) that you currently offer? Who are your target customers? What sets you apart from the competition?<br><br>Our premier product is the IDramp platform. It caters to the public sector, enterprise and SMB customers across all industries. It provides service orchestration with zero-trust decentralized identity and password elimination. While IDramp is a zero-code solution, we also provide robust APIs that can be used to extend capabilities into any custom application or ecosystem experience. The APIs offer a limitless palette of design opportunities for application development. We also provide a free digital identity wallet to securely share personal information, such as education certifications, health data, or employment credentials. 
The wallet provides multi-wallet stewardship capabilities that allow people to manage credentials for other people or things. This feature can be used to manage family credentials or eldercare use cases, for example. IDramp is built on open standards for interoperability. It operates automatically across any standards-based digital identity network. While the IDramp wallet offers robust capabilities, any standards-based identity wallet can be used with the IDramp suite of tools. Recently, we co-developed a series of groundbreaking IDramp-based apps with security software provider Bak2.life. These apps include: - Bouncer Zoom Event Attendee Security — extends Zoom meeting security with email 2FA or verifiable credentials for all participants. - Return to Life — provides a simple way for organizations to offer safe access to events and facilities based on verifiable health credentials, digital ticketing or custom credentials tailored to business needs. - Webcast Security Portal — provides end-to-end protection and access control for multiple webcast providers, including Zero-Trust, passwordless verifiable credentials.<br><br>What motivated your work in decentralized identity? Why did you become a node operator?<br><br>Decentralized identity reduces data liability, increases privacy, and improves security and human experience. It is a natural complement to our suite of Zero-Trust passwordless solutions. Decentralized design has always been core to the IDramp strategy. Adopting new standards in decentralized identity helps our customers achieve the best possible protection across their ecosystems. The problems and challenges of enterprise security have been getting worse and worse over the past decade—Zero Trust identity provides much-needed relief. However, the next iteration of Zero Trust will require a decentralized network to remove the need for centralized databases that carry inherent risks and increased costs. 
Being a Node Operator helps IDramp provide a more comprehensive Zero Trust service to our customers.<br><br>Where do you see the future of Self-Sovereign Identity/Decentralized Identity?<br><br>Secure identity is a high priority because a mistake with personal data can be very expensive. Terms like “SSI” and “decentralized” will eventually fade into globally accepted standard terms for digital identity. As decentralized identity becomes the preferred security standard, new threats and attacks will be developed and new Zero-Trust solutions will be required. With IDramp, organizations can stay ahead of the rapidly changing digital identity security landscape and avoid expensive technical detours that slow business and leak revenue.<br><br>For more information about the IDramp platform or any of their other products, go to IDramp.com.",https://indicio.tech/node-operator-spotlight-idramp/,,Post,,Ecosystem,,,,,,,,2021-10-05,,,,,,,,,,,,,
Indicio,GlobalID,,Medium,,,,,,,Making decentralized identity mainstream w Heather Dahl and Ken Ebert (Indicio),"how new identity-based technology can help people, governments and companies develop greater digital trust in a modern society.","FUTURE PROOF EP 19 — Making decentralized identity mainstream In this episode, we’re joined by CEO Heather Dahl and CTO Ken Ebert from Indicio, the market leader in developing trusted digital ecosystems. Heather and Ken discuss how new identity-based technology can help people, governments and companies develop greater digital trust in a modern society. Past episodes: - EPISODE 18 — Everyone will have an ID wallet - EPISODE 17 — Digital wallets of tomorrow will be PRIVATE - EPISODE 16 — How XUMM Wallet is changing the game - EPISODE 15 — Olympic hopeful Lila Lapanja is a GlobaliD ambassador - EPISODE 14 — What we learned at Solana Breakpoint - EPISODE 13 — DeFi and Identity: Compliance in a decentralized world - EPISODE 12 — The future of GlobaliD Groups - EPISODE 11 — The XRP Card and the future of communities - EPISODE 10 — How to decentralize identity and empower individuals - EPISODE 09 — Understanding GlobaliD’s identity platform - EPISODE 08 — Owning your identity and data with VP of Design Antoine Bonnin - EPISODE 07 — Understanding the future of fintech with Ayo Omojola - EPISODE 06 — Establishing trust and safety in tomorrow’s networks - EPISODE 05 — How ZELF combines the power of payments and messaging - EPISODE 04 — The future of blockchain with the creator of Solana - EPISODE 03 — Should we trust Facebook? - EPISODE 02 — JP Thieriot on why Uphold isn’t delisting XRP - EPISODE 01 — The SEC’s crypto turf war and why XRP isn’t a security Have a question for us? A topic you’d like covered? A guest you’d like to see? Let us know!",https://medium.com/global-id/episode-19-making-decentralized-identity-mainstream-1d9d8734a14f,,Episode,,Explainer,,,,,,,,2022-09-14,,,,,,,,,,,,,
Indicio,Indicio,,,,,,,,,Identity Blockchains and Energy Consumption,"A decentralized network using a blockchain-based distributed ledger means you can use [Peer DIDs](https://identity.foundation/peer-did-method-spec/) to move most “transactions” and their cryptographic proofing off ledger. This means that for those peer-to-peer interactions, identity blockchains don’t need to do any ledger transactions at all.","Bitcoin has given blockchain the carbon footprint of Godzilla; but when it comes to identity, blockchain-based distributed ledgers are light on energy use and long on benefits.<br><br>Blockchain has become synonymous with cryptocurrency, and crypto is rapidly becoming to energy consumption what crack cocaine once was to addiction. Headlines ranging from bitcoin miners stealing electricity to “Bitcoin consumes ‘more electricity than Argentina’” have generated much heat but not always a lot of light (this article from Harvard Business Review offers a nuanced view of the energy consumption controversy). The problem is that this mental shortcut can leave the impression that the energy-intensive computation required to validate bitcoin transactions — which is known as “proof of work” — is a process required by all blockchains, thereby making the technology environmentally unfriendly in general. It isn’t, and here’s why: - An identity blockchain like the Indicio Network uses signatures rather than mathematical computation to generate proof. No complex mathematical processes are needed. You either accept the signature or you don’t. - A write to the ledger (and one write can be the basis for millions of identity credentials) or a lookup on the ledger uses no more energy, and possibly less, than browsing a web page. - A decentralized network using a blockchain-based distributed ledger means you can use Peer DIDs to move most “transactions” and their cryptographic proofing off ledger. No personally identifying information is written to the public ledger – ever. 
This means that for those peer-to-peer interactions, identity blockchains don’t need to do any ledger transactions at all. As most of our digital interactions are on a one-to-one basis, there is no need for them to take place on the blockchain; the blockchain is simply the root of trust for the identities of the parties issuing credentials: once these identities have been looked up and confirmed by each party, everything else happens peer-to-peer. And with Peer DIDs, each communication is cryptographically unique — a huge advancement in privacy and security requiring no more energy than, say, using encrypted email. Although harder to quantify, the energy saved from using a technology that enables you to trust information online is also something to be taken into account. The same goes for more efficient and effective usability and much better risk mitigation. But the point doesn’t require this detailed analysis to hold true: not all blockchains are Bitcoin, and identity blockchains using Peer DIDs are low-energy consumers. That’s why we run the Indicio Network and believe in and advocate for this technology, and that’s why it would be a huge loss if a low-energy use of blockchain were to be mistakenly seen as having the carbon footprint of Godzilla.",https://indicio.tech/identity-blockchains-and-energy-consumption/,,Post,,Explainer,,,,,,Bitcoin,PEER:DID,2021-10-19,,,,,,,,,,,,,
Indicio,YouTube,,,,,,,Biden's Cybersecurity Executive Order,,Houston we have a Problem – An Identity Problem in the Oil and Gas industry,"- President Biden’s cybersecurity executive order<br>- The security landscape for global enterprises<br>- Decentralized identity, what it is and how it fortifies existing data infrastructure<br>- Case study: applying zero trust and decentralized identity to energy",,https://www.youtube.com/watch?v=iat3gyryfpe,,Video,,Explainer,,,,,,,,2021-08-24,,,,,,,,,,,,,
Indicio,Indicio,,,,GlobalID,,,,,21 Industry leaders from five continents join Indicio Network consortium to drive global adoption of decentralized identity,"[GlobaliD](https://global.id/), USA; [Uphold](https://uphold.com/), Portugal; [ID Ramp](https://IDramp.com/), USA; [Cynjatech](https://www.cynja.com/), USA; [Finclusive](https://finclusive.com/), USA; [Xertify](https://xertify.co/), Colombia; [Snowbridge Inc.](https://www.snowbridge.se/), Taiwan; Entrustient, USA; [Bot Ventures, Inc](https://botventures.io/)., Canada; [BlockSpaces](https://blockspaces.io/), USA; [Blockster Labs](https://blockster.global/), [Anonyome Labs](https://Anonyome.com/), Australia; [Selfd.id](https://selfd.id/), Romania; [Liquid Avatar Technologies](https://liquidavatar.com/), Canada; [Snapper Future Tech](https://snapperfuturetech.com/), India; [Lorica Identity](https://loricaidentity.com/), USA; [BizSecure](https://bizsecure.com/), USA; [Networks Synergy](https://www.synergy.kz/), Kazakhstan; Absolutely Zero Cyber, USA; [Cysecure](https://cysecure.us/), USA; [VERSES Labs](https://www.verses.io/), USA","Indicio.tech, a public benefit corporation, today announced the twenty-one companies backing its global network for blockchain-based decentralized identity. With each company hosting a copy of Indicio’s public ledger, the Indicio Network enables companies and organizations around the world to provide privacy-preserving ways of proving identity and authenticity to people, businesses, and even the Internet of Things. “We’re thrilled to be a founding Node Operator on the Indicio Network,” said Greg Kidd, co-founder and CEO of GlobaliD. “Indicio’s enterprise grade network is a core part of GlobaliD’s vision for giving individuals ownership of their digital identity in a privacy-preserving way. 
With verifiable credentials, which are reusable and user-controlled, users and groups can carry their trusted credentials wherever they go.” “Our customers require an enterprise-grade network to enable Zero Trust identity and passwordless authentication,” said Mike Vesey, CEO of IDramp, also a founding Node Operator. “Indicio’s dedicated service and support provides the reliability and performance expected for production-ready decentralized services. This dynamic community of experts is helping transform the future of digital trust for business.” The Indicio Network is composed of three networks, a MainNet, for deploying products and services, a TestNet for development, and a DemoNet for pilot and product demonstration—all three networks host the latest monitoring and service tools. “Together, we’re working to build a better digital world,” said RJ Reiser, Chief Business Development Officer, Liquid Avatar Technologies. “Indicio Node Operators are creating a transformational change in digital identity, one that empowers users to manage, control, and even benefit from their digital identity and online data.” Indicio Node Operators are spread over five continents: GlobaliD, USA; Uphold, Portugal; ID Ramp, USA; Cynjatech, USA; Finclusive, USA; Xertify, Colombia; Snowbridge Inc., Taiwan; Entrustient, USA; Bot Ventures, Inc., Canada; BlockSpaces, USA; Blockster Labs, Anonyome Labs, Australia; Selfd.id, Romania; Liquid Avatar Technologies, Canada; Snapper Future Tech, India; Lorica Identity, USA; BizSecure, USA; Networks Synergy, Kazakhstan; Absolutely Zero Cyber, USA; Cysecure, USA; VERSES Labs, USA Great companies interested in becoming an Indicio Network Node Operator can apply here.",https://indicio.tech/21-industry-leaders-from-five-continents-join-indicio-network-consortium-to-drive-global-adoption-of-decentralized-identity/,,Post,,Meta,,,,,,,,2021-07-06,,,,,,,,,,,,,
Indicio,Bonifii,,,,,,,,,Bonifii increases financial inclusion with GlobaliD digital wallet and Indicio Network,"Bonifii, the financial industry’s first verifiable exchange network for financial cooperatives, today announced the Bonifii credential, a decentralized digital identity that provides underserved individuals with access to traditional banking services in a way that maximizes their privacy and security. Bonifii created the digital credential in partnership with GlobaliD, a trust platform and digital wallet. The credential is underpinned by the Indicio Network, a global network built on Hyperledger Indy for decentralized digital identity using distributed ledger technology (DLT).","Bonifii increases financial inclusion with GlobaliD digital wallet and Indicio Network Privacy-preserving credential helps onramp underbanked to traditional banking services Denver, CO — (October 25, 2021) — Bonifii, the financial industry’s first verifiable exchange network for financial cooperatives, today announced the Bonifii credential, a decentralized digital identity that provides underserved individuals with access to traditional banking services in a way that maximizes their privacy and security. Bonifii created the digital credential in partnership with GlobaliD, a trust platform and digital wallet. The credential is underpinned by the Indicio Network, a global network built on Hyperledger Indy for decentralized digital identity using distributed ledger technology (DLT). The Bonifii credential transforms the way new accounts are created by streamlining the delivery of information needed to open an account at a traditional financial institution. By enabling an end-to-end digital online application process, the credential offers a secure and meaningful entry point into accounts with financial institutions for millions of underbanked people, giving them a pathway to achieving financial stability. 
“Now, financial institutions that use the Bonifii credential can achieve higher levels of assurance than traditional application methods. The identity of the account applicant can be verified from a variety of attributes that create trust and assurance,” said John Ainsworth, CEO and President of Bonifii. “This type of digitally verifiable KYC reduces fraud, increases financial inclusion, and provides friction-free interactions between account holders and financial institutions.” The FDIC reported in 2019 that over 12 percent of Hispanic households, nearly 14 percent of Black households, and over 16 percent of American Indian/Alaska Native households in the U.S. don’t have access to a mainstream checking account. The FDIC also reports that while these figures have been trending downward, the number of unbanked households will likely increase in the aftermath of the ongoing Covid-19 pandemic. “Real financial inclusion will only be possible with fraud-resistant mechanisms that can adapt to peoples’ real-life situations and economic activities,” said Ainsworth. “Bonifii combines the availability of the GlobaliD wallet and services that run on the publicly available Indicio Network to ensure secure, privacy-preserving, scalable access to millions of underbanked people. This combination of technology also minimizes the risk of illicit activity, reduces the widespread problem of fraud, and simplifies the challenge of compliance within the U.S. financial system.” “Our partnership with Bonifii and Indicio is about the three i’s—inclusion, innovation, and interoperability,” said Greg Kidd, co-founder and CEO of GlobaliD. 
“With a simple, universal credential, anyone can now access traditional financial services—all of which is powered by a fundamentally self-sovereign solution.” Bonifii chose to partner with GlobaliD due to their deep experience in secure, private, portable, digital identity and payments, their experience with the Indicio Network, and their existing use of digital money transaction platform Uphold. Uphold also relies on GlobaliD to sign up and log in their customers. In turn, Uphold provides GlobaliD users an easy way to hold assets, send funds to other GlobaliD users, and spend money against their GlobaliD wallet. “Access to traditional banking services will transform the lives of millions of people. The Bonifii credential will help people currently without the traditional paper documents required to open an account and, at the same time, provide financial institutions with enhanced protection from fraud,” said Heather Dahl, CEO, Indicio. “Indicio is committed to further supporting deployments that enable financial inclusion and protect customers’ privacy and institutions from fraud. Our mission is to enable innovators, like Bonifii and GlobaliD, to create trusted data ecosystems to help improve the world.” For more information about the Bonifii credential visit https://Bonifii.com ### About Bonifii – https://Bonifii.com Denver-based Bonifii is the financial industry’s first verifiable exchange network designed to enable trusted digital transactions using open standards and best-of-breed security technologies. Bonifii empowers credit unions to change the way they interact with their members by enabling a seamless user experience in every financial transaction through a secure, private, trusted and transparent resolution of the entities’ identity. To learn more about Bonifii, visit www.Bonifii.com, email us at [email protected] or follow the company on the Bonifii blog, LinkedIn or Twitter. 
About GlobaliD – https://global.id GlobaliD is a trust platform that seamlessly integrates digital identity, communications, and payments — the core building blocks for the next chapter of the internet. Unlike existing offerings, GlobaliD’s open, portable, and interoperable solutions put individuals back in control of their digital lives rather than governments or corporations, while allowing developers and businesses to easily take part in building the future. GlobaliD has offices in the U.S. and Europe and its digital identity framework has been recognized by the World Economic Forum and the Brookings Institution. About Indicio – https://Indicio.tech/ Indicio provides development and hosting for Trusted Data Ecosystems. Enterprise, consumer, and mobile applications run on the Indicio Network and use its comprehensive ecosystem of software to issue, verify, and exchange verifiable digital credentials. The company develops, runs, and hosts multiple networks using the latest in Hyperledger Indy network monitoring tools and resources. It led the creation of Cardea, a complete architecture for verifiable and secure health records for Linux Foundation Public Health and runs comprehensive instructor-led educational training workshops. These power a growing ecosystem that solves fundamental problems in online verification, identity, privacy, and zero trust security. Media contact Information Julie Esser, SVP Client Engagement [email protected] 608.217.0678",https://bonifii.com/2021/10/bonifii-increases-financial-inclusion-with-globalid-digital-wallet-and-indicio-network/,,Post,,Meta,,,,,,,,2021-10-22,,,,,,,,,,,,,
Indicio,Indicio,,,,,,,,,Decentralized Identity opens the doors for safe travel and tourism,Machine readable governance enabled businesses and venues to trust that tourists had been tested on arrival by Aruba’s health department. Visitors using the digital Aruba Happy Traveler Card could be swiftly and reliably verified with a phone app. This freed both businesses and the government from the burden of mechanically collecting data with the attendant risk of error or fraud.,"Learn how Indicio and SITA worked together using privacy-preserving technology to reshape contactless health information sharing. Proof of testing or vaccination has become central to how we reopen travel to countries, admit visitors, and bring tourism economies back to life. Providing privacy and control for people is the key to establishing public confidence in a system for proving one’s health status. A digital proof of a Covid negative test or vaccination must be designed to protect individual privacy. It should enable a medical or test center to directly provide that information to an individual—and involve no one else storing or managing their data. It should be tamper proof and incapable of being faked. And it should be easy to download and quick to use. This is why Indicio.tech, a public benefit corporation that provides decentralized identity software solutions, and SITA, the leading global technology provider for the air transport industry, have used open source, privacy-by-design technology to build a solution that allows airports, airlines, and all elements of the tourist economy to use verifiable digital credentials to safely and securely return to life. How to reopen travel and tourism and preserve privacy Trusted data ecosystems use distributed ledger technology, cryptography, and a new way to provide an individual with control of their digital information. 
This means identity credentials that contain health information are issued directly to that person’s digital wallet, without any handoff to or management by third parties. Trusted organizations can quickly issue millions of credentials without any of the information they contain being collected and stored in a third-party database. Then, when the person decides they want to share all or just part of the information, such as the specific details of their test status, the authenticity and original source of that information can be definitively proven. This makes the digital credential compliant with health and data privacy law (HIPAA, GDPR). The advantages of a Trusted Data Ecosystem are that it can: - Replace paper cards with fully digitized identity information - Increase efficiency by automating many tasks involved in presenting personal health status - Ensure consent and control when sharing personal data - Allow a user to select which information they want to disclose while obscuring the rest - Enhance security through privacy-by-design, user-friendly digital records, and tamper-evident distributed ledger technology - Avoid the problem of fraudulent health cards or paper forms being presented - Scale to include millions of participants, including employees, travelers, and residents, with just a few writes to a public ledger and an inexpensive mobile application - Speed recovery of reopening venues and countries Open and manage public spaces Indicio’s identity ecosystem is built using Cardea, a complete ecosystem for the exchange of privacy-preserving digital credentials, open sourced as a project in Linux Foundation Public Health. Based on Hyperledger Indy and Aries open source technology, its flexible design means it can be easily adapted and deployed by any company, government, or organization that needs a privacy preserving digital credential for managing access. 
Indicio’s implementation of Cardea for SITA and the Government of Aruba features a mobile interface in the form of a mobile app for users and a second mobile app for use by venues to receive and verify credentials from users. Software called mediator agents and enterprise agents allow for scaling and automation of the credential issuing and verification processes. Distributed ledger technology provides cryptographic assurance that the data within any given credential has not been tampered with or altered. Cardea’s architecture protects privacy and aids compliance by separating issuers, holders, and verifiers of credentials. Issuers cannot know where credentials were used by those holding them, and verifiers (receivers) of credentials are able to limit the amount of data they receive and retain. Successful test deployment in Aruba The island of Aruba and global air transport technology provider SITA came to Indicio to create a trusted traveler system that makes it easy for visitors to share their health status privately and securely using their mobile device. Aruba is focused on finding innovative ways to strengthen its tourism industry while minimizing the risk of Covid-19 infection from visitors. Unlike immunity passports, the verifiable digital credential system from Indicio allows visitors to share a trusted proof of their health status. This trust is possible because the traveler has shared their health status and had it verified by a public health agency. Once a test result is approved, the traveler is issued a second credential by the public health agency to confirm that they have tested negative. This credential contains no medical data whatsoever and is used only to prove the person’s test status. The Happy Traveler Card, as this credential is called in Aruba, is verified by hotels, restaurants, and entertainment venues that the traveler visits. It is an easy way for even their smallest businesses to ensure the safety and health of their guests. 
The Happy Traveler Card in action Machine readable governance enabled businesses and venues to trust that tourists had been tested on arrival by Aruba’s health department. Visitors using the digital Aruba Happy Traveler Card could be swiftly and reliably verified with a phone app. This freed both businesses and the government from the burden of mechanically collecting data with the attendant risk of error or fraud. The Cardea ecosystem enables Aruba to move toward a privacy-first approach to supporting their tourism industry, which in 2019 accounted for 98.3% of Aruba’s GDP and supported 47,000 jobs—99% of all employment on the island. Build on our experience for your solution The tourism and hospitality identity solution for SITA is highly replicable for use cases in any industry and easy to integrate with existing systems. With a professionally staffed global network for verifiable digital credentials supported by many of the leading companies in this space, Indicio is building the future of Trusted Data Ecosystems. Open source and interoperable software means your solution is scalable and sustainable. Our expert team of architects and engineers can create and customize a solution quickly for businesses, governments, and organizations who need a privacy-first identity solution and they can deploy it in weeks. To learn more or schedule a free one-on-one consultation to find out how you can benefit from a Trusted Data Ecosystem, contact us.",https://indicio.tech/decentralized-identity-opens-the-doors-for-safe-travel-and-tourism/,,Post,,Meta,,,,,,,,2021-06-23,,,,,,,,,,,,,
Indicio,Indicio,,,,Aruba Health Department; SITA,,,,,Finalist for CRN Social Impact Award,Indicio worked with SITA and the Aruban government to develop a decentralized identity solution for managing Covid testing and vaccination for travelers to the tourism-dependent island.,"Indicio a Finalist in CRN Awards for social impact By Tim Spring The CRN Tech Impact awards are given to IT vendors, distributors, and resellers for their social and environmental impact. This year, Indicio is a finalist in the Social Impact Project Category for its work with SITA, the world’s leading provider of IT to the air transport sector, on verifiable credentials for travel. Indicio worked with SITA and the Aruban government to develop a decentralized identity solution for managing Covid testing and vaccination for travelers to the tourism-dependent island. The goal was to avoid the need for airlines or governments to directly integrate with health care providers, while providing travelers with a tamper-and-fraud proof, privacy-preserving way to prove their health data (and thereby comply with health data privacy requirements). After successful trials, the code was donated to Linux Foundation Public Health as a privacy-preserving way for public health authorities to share health data. Known as the Cardea Project, the codebase continues to be developed to address other health data sharing needs. As a Public Benefit Corporation, Indicio has made the advancement of decentralized identity its public benefit mission. The social impact of Cardea is enormous. There is now a complete open source decentralized ecosystem for issuing, holding, and sharing health data in a privacy-preserving way. At the same time, SITA’s application of the same digital technology promises to transform our experience of travel from check in to border control. To read more about some of the work that was done you can read the original press release here or read the Cardea white paper. 
Other finalists in the social impact category are Epson UK and Clevertouch Technologies and Interactive AV Solutions. For the full list of categories and finalists, visit CRN. Indicio is honored to be nominated and looks forward to the announcement of the winners in September. In the meantime, to keep up with Indicio be sure to subscribe to our newsletter for more news and updates from our community!",https://indicio.tech/the-crn-tech-impact-awards/,,Post,,Meta,,Travel,COVID,,,,,2022-06-23,,,,,,,,,,,,,
Indicio,Indicio,,,,,,,,,How to Create a Trusted Digital World,"We’ve completed what we set out to do two years ago: finish the technology’s foundation and create a full, open source ecosystem around it. You can issue, share, and verify data through verifiable digital credentials today. You can layer this on top of your existing identity and access management systems now.","After announcing our oversubscribed seed round, we reflect on what we achieved and what’s to come By Heather Dahl It was a race toward the next big thing—except it felt like everyone was jogging in circles. All of the talk was about decentralized identity solving all the big online problems—verification, privacy, security—but where were the solutions? We had been talking and jogging for years and the finish line for the technology wasn’t getting closer. It was time to up the pace, to lead a group of talented, like-minded individuals—and to get the technology to a stage where enterprises and organizations could start implementing working solutions. That was the sense of opportunity and need that drove the creation of Indicio two years ago. There was a lot to do. It would have been overwhelming if we hadn’t broken it down into a roadmap of small steps, each building on the other, each extending the technology, each adding a community of builders, each bringing more customers and collaborators into the mix, and all feeding a virtuous cycle of innovation and growth. The initial investment was conservative; but customer revenue was instant and scale was rapid. We focused on building solutions, starting with our customers’ simplest problems and use cases. And when those solutions worked, and our customers saw what the technology could deliver, they asked for more and to deliver new features; they began to think of new applications and opportunities. The success of decentralized identity technology lay in each of our customers being successful. 
Each successful customer deployment was a victory, each advanced the technology, and each created new opportunities. This was our strategy: the more we built solutions, the more customers we’d have, and the more they’d ask us to do. It worked; indeed, the investment community liked this strategy so much, our seed round ended up oversubscribed. Actionable data now Normally, a startup would use a seed round to launch a business; but we were already up and running: we had customers, products, and services. That’s what excited our investors: if we had come so far on so little, how much faster would we go with more? Well, we’re excited too. We’ve completed what we set out to do two years ago: finish the technology’s foundation and create a full, open source ecosystem around it. You can issue, share, and verify data through verifiable digital credentials today. You can layer this on top of your existing identity and access management systems now. We’ve also developed machine-readable governance to simplify managing complex information flows and governance processes offline and in ways that meet the needs of regulatory authorities. Governance for decentralized identity has often been a sticking point in the marketplace because it reads like a bigger problem than the problem decentralized identity is supposed to solve. Not any more. Even the language around this technology has shifted to resonate with what the market understands and needs. It’s not just about engineers talking to engineers anymore; it’s about the value we create in trusted digital interaction, the efficiencies that come with verification, and the protean capacity of the communication protocols to manage this interaction in new, secure ways; it’s about actionable data and digital relationships. In short, we’ve made it easy and cost-effective to begin a powerful process of digital transformation that creates trust. 
These flexible, customizable “Trusted Digital Ecosystems” can be deployed rapidly by being layered on top of your existing identity systems. This means creating digital transformation in weeks rather than years. They are easy—surprisingly easy—to use. They deliver value fast and set you up to manage the wider digital change coming with web 3 and the spatial web. Creating magic In many ways, closing this seed round is the end of the beginning: we’ve warmed up for the real race and we’re going to run it fast. Because we now have the foundational interoperable framework to drive use and adoption of verifiable credentials and decentralized identity, we can start demonstrating the real power behind this technology to businesses, governments, and organizations. History has shown that with a small set of robust, interoperable components, you can create magic. And that’s what we’re going to do with this seed investment. Finally, none of this would be possible without a stellar team all pulling together. Our architects and engineers are at the top of their game because the business side is at the top of its game. It means our passion and expertise are in sync. It means we approach problems and build solutions holistically. The result is that people like working with us. We make their needs ours, we give them everything, and together we get things done.",https://indicio.tech/how-to-create-a-trusted-digital-world/,,Post,,Meta,,,,,,,,2022-04-28,,,,,,,,,,,,,
Indicio,IDramp,,,,,,,,,IDramp Offers Market-Ready Decentralized Identity Platform on the Indicio Network,"IDramp, a leader in decentralized identity products and services, announced today that it now provides market-ready solutions leveraging the Indicio Network, a professionally-run decentralized network for identity.<br>IDramp provides enterprise and government customers with digital wallets via the Passport mobile application, and on-boarding services with the IDramp Service Delivery Platform. The company has established itself as a market leader in decentralized identity. IDramp selected the Indicio Network for its reliability and expert support services.","IDramp Offers Market-Ready Decentralized Identity Platform on the Indicio Network IDramp provides enterprise and government customers with digital wallets via the Passport mobile application, and on-boarding services with the IDramp Service Delivery Platform. The company has established itself as a market leader in decentralized identity. IDramp selected the Indicio Network for its reliability and expert support services. IDramp has long recognized the complexity of an ever-growing collection of disconnected protocols, standards, regulations, and technologies. The IDramp Service Delivery Platform offers decentralized identity that simplifies the experience, removing centralized data, allowing businesses to focus on their business rather than managing technology. “Part of what adds value to our customers is quick and easy integration into their various legacy ecosystems,” says Mike Vesey, CEO of IDramp. “Without having to lift and shift anything, businesses and organizations of all shapes and sizes can use decentralized identity to improve user experience by eliminating any need for passwords, increasing privacy by removing the need to share personal data with third-party services like Twitter or Facebook and improving data protection by not storing personal data in central databases. 
This reduces cost by reducing investments in monolithic identity platforms. Having an enterprise grade network to deliver decentralized services for our customers is critical. We are excited to have Indicio Network providing IDramp customers with a dependable, reliable, and robust enterprise ready network with experienced staff at the helm. Indicio provides the best-in-class distributed network that our customers need.” Indicio’s fully-monitored decentralized identity network debuted in July with experienced concierge-class expert support to help companies design, build, and bring decentralized identity products to market quickly. As a Genesis Node Operator, IDramp helped launch the Indicio Network, contributing dedicated server space, ongoing governance contributions, and community leadership. The IDramp platform allows for the simple management of decentralized identity verifiable credentials in one easy-to-use platform that can be linked to virtually any existing application. These credentials are tamper-proof, and with simple security reporting and flexible APIs, improving business productivity and security oversight. “We are thrilled to see the growing Indicio community begin to run their customer workloads on our network,” says Heather Dahl, CEO of Indicio. “IDramp offers solutions that help people move from a world of centralized digital existence with a multitude of logins, passwords, and profiles in someone else’s database, to one where identity is digital and decentralized, yet controlled by the individual. Decentralized identity allows us to protect our privacy and share our information at a distance – two things that are increasingly important to the efficiency of the global economy, and critically important in the context of a worldwide pandemic.” “IDramp and Indicio are helping to grow a community focused on delivering decentralized identity solutions that will drive adoption of this empowering technology,” says Karl Kneis, COO of IDramp. 
“Now IDramp can provide a suite of simple-to-use, professional grade tools and services, all connected to a state-of-the-art network. This community effort will drive our solutions to scale management of verifiable credentials at the speed our customers need. Decentralized identity is new, but it is the most secure and trustworthy means of data sharing yet devised. Now it’s also easy to use.” About IDramp IDramp is a service delivery platform that helps businesses manage verifiable credentials and identity integration within diverse application ecosystems. We combine verifiable credentials with well-known identity management capabilities for MFA, consents, access rules, directory integration, and analytics. Businesses use IDramp to create trust ecosystems without complex upgrades or specialized technical skills. IDramp is built on open standards for verifiable credentials and established identity protocols. As a founding and steering member of the Trust Over IP Foundation IDramp is committed to delivering state of the art security, privacy and assurance for all trust ecosystems. About Indicio.tech Indicio.tech is a professional services firm specializing in decentralized identity architecture, engineering, and consultancy. Indicio provides expert guidance to a global community of clients on the use of verifiable credentials to build digital identity solutions. The decentralized networks and tools created by Indicio make verifiable credentials easy to adopt, simple to deploy, and reliable to use. As a Public Benefit Corporation, Indicio is committed to advancing decentralized identity as a public good that enables people to control their identities online and share their data by consent. 
Indicio believes in privacy and security by design, interoperability, and supports the open source goals of the decentralized identity community.",https://idramp.com/idramp-offers-market-ready-decentralized-identity-platform-on-the-indicio-network/,,Post,,Meta,,,,,,,,2021-01-01,,,,,,,,,,,,,
Indicio,Indicio,,,,New York VC Network,,,,,Indicio named by New York VC Network in their list of the most exciting early-stage teams,Indicio is proud to have been named by [New York VC Network](https://www.vcnet.nyc/) in their recently compiled list of the most exciting early-stage teams that they’ll be following closely this year!,"Indicio is proud to have been named by New York VC Network in their recently compiled list of the most exciting early-stage teams that they’ll be following closely this year! By Tim Spring This exclusive list is made up of recently founded companies that are now raising their Seed or Series A and have applied to the VC matching program in the past six months. Based on the space the teams are working in, they are divided into 5 categories: HealthTech, Fintech, B2B Solutions, Consumer Tech, and Top Scorers in ESG. The New York VC Network Rating Committee consists of current and former VCs, angel investors, exited entrepreneurs, and Fortune 500 employees in M&A roles. The rating for companies is primarily based on two criteria. The first is the team’s past track record, mostly relying on founder profiles, past achievements, and their ability to gather a talented team around them. The second is the company’s current traction/scalability, based on the chosen market, current traction, and ability to scale. The focus here was not on raised capital, but for insight these companies had already raised $3.4M on average (the median being $1.4M) and continued to receive more after applying. More information on the full list, including companies, full company profiles, and contact information is available in the full announcement from New York VC Network.",https://indicio.tech/indicio-named-by-new-york-VC-network-in-their-list-of-the-most-exciting-early-stage-teams/,,Post,,Meta,,,,,,,,2022-01-28,,,,,,,,,,,,,
Indicio,Indicio,,,,,,,,,Indicio Named Finalist in IDC’s Inaugural Best in Future of Digital Infrastructure North America Awards,"The finalists have effectively used digital infrastructure across on-prem, edge and public cloud platforms to transform their most important business processes and to launch new digital business innovations. They are to be congratulated for their vision and industry leadership!","Indicio Named Finalist in IDC’s Inaugural Best in Future of Digital Infrastructure North America Awards By Tim Spring Seattle WA – September 14, 2021 – Indicio today announced it has been named a finalist in the inaugural IDC Future Enterprise, Best in Future of Digital Infrastructure North America Awards in the Ubiquitous Deployment category. The new awards were designed to highlight companies that demonstrate innovation and excellence in using cloud-centric computing to enable digital infrastructure resiliency, ensure consistent and ubiquitous workload placement and cost optimization across hybrid or multiple clouds, and take full advantage of autonomous operations. The Indicio Network is the world’s only professionally-managed, enterprise-grade Hyperledger Indy-based network for decentralized identity. Indicio facilitates a consortium of 23 diverse, forward-thinking companies on five continents that are driving the use of decentralized identity to improve privacy and security in fintech, healthcare, travel, and the Internet of Things (IOT). Node Operators include GlobalID, Liquid Avatar, IDramp, Bonifii, BizSecure, Entrustient, Blockspaces, Lorica Identity, and Networks Synergy. Learn more about the Indicio Node Operator Consortium membership. 
“Being named a finalist in the IDC Future of Enterprise awards recognizes not just the accomplishments of Indicio’s dedicated professionals who designed and built the network, but also everything our network’s Node Operator community has done to support its establishment and maturity in such a short period of time,” said Heather Dahl, CEO and co-founder of Indicio.tech. “We created this network to enable businesses to use the power of decentralized identity to create Trusted Data Networks. These transform the way we authenticate, share, and verify data in a secure, privacy-preserving way so that information from people and connected things can be trusted. We’re seeing our Node Operators apply this technology to an extraordinary range of use cases—and they’re able to do that because they have a professionally supported enterprise-grade decentralized network to build on.” Finalists joining Indicio in the Ubiquitous Deployment category are Toyota Financial Services for their Digital Infrastructure Transformation and US Air Force Cloud One. These organizations were recognized for their help to improve business agility and resiliency using outcome-driven infrastructure governance and portability enabled by subscription-based infrastructure consumption strategies and shared cloud management control planes. These initiatives often allow internal IT staff to offload infrastructure maintenance and support across widely dispersed locations by shifting to remote and intelligent vendor support and continuous technology refresh agreements. “We were overwhelmed by the number of thoughtful and strategic initiatives submitted and congratulate all the finalists named in our inaugural IDC Future Enterprise Best in Future of Digital Infrastructure North American Awards program,” said Mary Johnston Turner, Research Vice President for the Future of Digital Infrastructure – Agenda research efforts at IDC. 
“The finalists have effectively used digital infrastructure across on-prem, edge and public cloud platforms to transform their most important business processes and to launch new digital business innovations. They are to be congratulated for their vision and industry leadership!” Indicio Node Operators are responsible for supporting a copy of the Indicio ledger on the Indicio Network as well as guiding strategy and ecosystem development. The number and the business and geographic diversity of Node Operators are the foundation of the network’s stability and resilience. Indicio supports the network with dedicated engineering staff and field-leading monitoring tools. To learn more about becoming a Node Operator, visit our website. To learn more about the IDC annual awards, please visit here. About Indicio Indicio provides development and hosting for Trusted Data Ecosystems. Enterprise, consumer, and mobile applications run on Indicio’s network and use its comprehensive ecosystem of software to issue, verify, and exchange verifiable digital credentials. Founded on the belief in reducing fraud, privacy by design, and user-friendly security, Indicio supports the open source and interoperability goals of the decentralized identity community. As a Public Benefit Corporation, Indicio is committed to advancing Trusted Data Ecosystems as a public good that enables people to control their identities online and share their data by consent. Identity and application teams rely on Indicio’s simplicity, extensibility, and expertise to make trusted data work for everyone.",https://indicio.tech/indicio-named-finalist-in-idcs-inaugural-best-in-future-of-digital-infrastructure-north-america-awards/,,Post,,Meta,,,,,,,,2021-09-14,,,,,,,,,,,,,
Indicio,Indicio,,,,,,,,,Indicio.Tech Incorporates as a Public Benefit Corporation,"Indicio joins companies such as Patagonia and Kickstarter in embracing a corporate model that aligns shareholders and stakeholders around a shared mission to deliver a material benefit to society, not only through products and services and how they are made and delivered, but through prioritizing the welfare of employees, diversity and inclusion, and environmental impact.","New structure supports the company’s mission, values, and its belief that identity technology should serve the public interest. Decentralized identity is a transformational technology that can protect an individual’s privacy, enable their consent in data sharing, and provide a pathway to formal identity for hundreds of millions of people currently without any legal means of proving who they are. Indicio.tech was founded to advance decentralized identity through providing the kind of professional services and critical infrastructure that can catalyze adoption of this technology. Today, in recognition of the role it can play in building and shaping a technology for the greater good, Indicio announces that it has reincorporated as a public benefit corporation (PBC). Indicio joins companies such as Patagonia and Kickstarter in embracing a corporate model that aligns shareholders and stakeholders around a shared mission to deliver a material benefit to society, not only through products and services and how they are made and delivered, but through prioritizing the welfare of employees, diversity and inclusion, and environmental impact. “When it comes to our digital lives, it is hard to think of a technological advance more beneficial to the public than decentralized identity,” says Heather Dahl, CEO of Indicio. “It will transform people’s digital lives by giving them control over who they are online and who they share information with. 
It will create identity for the hundreds of millions of people who currently lack formal, legal identity, which means giving them a way to access financial and health services. The advances in identity technology help us recover some of the lost, early idealism of the internet as a benefit to everyone. And while we know one company can’t save the world, we can take a stand about how the world can be a better place. Decentralized identity is our stand.” As a Delaware PBC, the company will operate under the same management structure and corporate and tax laws it does today and with the same commitment to strong growth and profitability. “Decentralized identity needs a variety of business models to rapidly scale,” says Dahl. “And we think for Indicio, the PBC model combines the best attributes of the traditional for-profit corporation with the public mission orientation of a nonprofit. We need to be agile. We need to be sustainable. We need to be innovative. And we need all of these qualities to be directed, without compromise, toward advancing decentralized identity.” “For Indicio, becoming a PBC means honoring the idealism of the open source community that brought decentralized identity technology into existence,” says Ken Ebert, CTO. “This means open sourcing the infrastructure that we build, and making interoperability the compass point that directs how we build for others. Indicio has already begun doing this by open-sourcing its monitoring tools package and the company is about to release more tools and services that will make it easier for companies to develop and use decentralized identity solutions.” As a PBC, Indicio will continue to pioneer architectural solutions and deliver superlative development and engineering support to its list of global customers, and it will do so by cultivating a company culture where employees and interns can get the professional development and mentoring they need in order to consistently deliver their best. 
“When we reflect on the values that inspired our launch, propelled our growth, and delivered for our customers, we want to bake them into our company,” says Dahl. “We want to hold ourselves accountable to those values, and we want to be held publicly accountable for them. That’s a powerful feature of the PBC model. And just as it has enabled credible, third-party assessment on whether a company is delivering on its environmental commitments, we see it as providing a path for identity technology to be assessed in a similar way. There’s a long way to go, but at a time when technology is under increasing criticism, we have a chance to build better and audit better from the beginning.” Indicio joins a growing number of companies worldwide embracing the public benefit corporate model, recognizing that businesses can build greater long-term value by committing to stakeholders, employees, and communities. So far, 35 states and the District of Columbia have passed legislation enabling public benefit corporations (sometimes called benefit corporations), and many countries have followed with similar legislation. Indicio’s PBC status will position the company as a leader among trusted identity platform builders as it advances the technology and the industries it serves, and connects the growing field of decentralized identity vendors. Indicio will set out its public benefit goals in the coming weeks. ### About Indicio Indicio.tech is a professional services firm specializing in decentralized identity architecture, engineering, and consultancy. Indicio provides expert guidance to a global community of customers on the use of verifiable credentials to build digital identity solutions. The decentralized networks and tools created by Indicio make verifiable credentials easy to adopt, simple to deploy, and reliable to use. 
As a Public Benefit Corporation, Indicio is committed to advancing decentralized identity as a public good that enables people to control their identities online and share their data by consent. Indicio believes in privacy and security by design, interoperability, and supports the open source goals of the decentralized identity community.",https://indicio.tech/indicio-becomes-a-public-benefit-corporation/,,Post,,Meta,,,,,,,Public Benefit Corporation,2020-12-30,,,,,,,,,,,,,
|
||
Indicio,Indicio,,,,,,,,,Paving the way to safer travel,"Indicio.tech, together with SITA and the Aruba Health Department, are trialing the Aruba Health App, a pilot that makes it easy for visitors to share a trusted traveler credential – based on their health status – privately and securely on their mobile device. This credential will provide access to participating hospitality sites on the island.","SITA, INDICIO PAVE WAY TO SAFER TRAVEL EXPERIENCE WITH LAUNCH OF ARUBA HEALTH APP Using blockchain, the app creates a secure travel credential that is accepted by airlines, hotels and hospitality partners without sharing private health information ORANJESTAD – 5 May 2021 – Indicio.tech, together with SITA and the Aruba Health Department, are trialing the Aruba Health App, a pilot that makes it easy for visitors to share a trusted traveler credential – based on their health status – privately and securely on their mobile device. This credential will provide access to participating hospitality sites on the island. Aruba is focused on rebuilding its tourism industry in the wake of the COVID-19 pandemic while ensuring that the risk of infection from people visiting the island is minimized. The island has been less impacted by the pandemic than many other countries globally and is open to most tourists. To ensure their safety, and that of the island’s residents, all tourists are required to provide a negative PCR test taken 72 hours before flying. Using the Aruba Health App, visitors to the island who have provided the required health tests to the Aruba government will be issued with a unique trusted traveler credential, using blockchain technology. This credential then can be verified by hotels, restaurants, and entertainment venues through the unique QR code on a visitor’s mobile device without sharing any private data. 
The digital credential also enables the Aruba government to restrict visitors from leaving their hotel rooms until they have received a negative PCR test result. Unlike immunity passports, these verifiable digital credentials are part of a technology known as decentralized identity, an evolutionary leap forward in individual privacy protection and security. It allows users to share only a trusted verification that they have the relevant documentation to complete their transaction, without having to share personal information. This credential can be used across the journey. This trial is yet another step towards SITA developing a more durable, secure travel credential that could combine all travel documents such as passport, visa and health information into a single credential that puts the traveler’s privacy first. This credential will provide verification that a traveler has the right documentation, making border crossings more automated, with checks done before departure. Diana Einterz, SITA President for the Americas, said: “Giving travelers the ability to share verifiable health data with relevant stakeholders throughout their journey will help expedite the industry’s recovery. It is vital to ensure we open borders safely and securely, and this trial puts us one step closer to a single travel token that will give passengers more control and convenience by allowing them to securely share their credentials with governments, airports and airlines from their mobile device.” A recent poll from IATA highlighted that 78% of passengers who took part in the survey would only use a travel credential app if they have full control over their data. Heather Dahl, CEO of Indicio, said: “With the decentralized identity ecosystem we’ve built with SITA for Aruba, we’ve created a path to a better future. We can reopen economies and restart travel without people having to give up their privacy. We’re not just solving a pandemic problem; we’re solving a privacy and security problem. 
That’s because this technology was designed from the outset to respect a person’s right to control their own data and identity. Aruba and SITA have taken a global lead on privacy-first identity technology for travel, and their pioneering efforts are laying the foundation for a fairer world to come.” Dangui Oduber, Aruba’s Minister of Tourism, Public Health and Sport, said: “The Aruba Health App is fundamental in balancing the dual challenges of reopening our island to tourism while managing the risks of COVID-19. By providing a trusted traveler credential, we can be sure that visitors have the right documentation needed to move freely around the island while making the verification of that trusted status easy without having to divulge personal information. That is a revolutionary step forward.” To see how the Aruba Health App, using blockchain, creates a secure travel credential, watch this video. About Indicio Indicio.tech provides development and hosting services for decentralized identity. Enterprise, consumer, and mobile applications run on Indicio’s network and use its comprehensive ecosystem of software to issue, verify, and exchange verifiable digital credentials. Founded on the belief in privacy and security by design, Indicio supports the open source and interoperability goals of the decentralized identity community. As a Public Benefit Corporation, Indicio is committed to advancing decentralized identity as a public good that enables people to control their identities online and share their data by consent. Identity and application teams rely on Indicio’s simplicity, extensibility, and expertise to make identity work for everyone. Contact us here. About SITA SITA is the IT provider for the air transport industry, delivering solutions for airlines, airports, aircraft and governments. Our technology powers more seamless, safe and sustainable air travel. 
Today, SITA’s solutions drive operational efficiencies at more than 1,000 airports while delivering the promise of the connected aircraft to more than 400 customers on 18,000 aircraft globally. SITA also provides the technology solutions that help more than 60 governments strike the balance of secure borders and seamless travel. Our communications network connects every corner of the globe and bridges 60% of the air transport community’s data exchange. SITA is 100% owned by the industry and driven by its needs. It is one of the most internationally diverse companies, with a presence in over 200 countries and territories. For further information, go to www.SITA.aero",https://indicio.tech/paving-the-way-to-safer-travel/,,Post,,Meta,,Travel,,,,,,2021-05-05,,,,,,,,,,,,,
|
||
Indicio,Indicio,,,,Privatyze,,,,,Privatyze collaborates with Indicio to build a decentralized data marketplace,"“In a data-driven economy, we need a marketplace for data that’s inclusive and not exploitative,” said Heather Dahl, CEO of Indicio. “That means that participants can meaningfully consent to data transactions and do so in a way that enables zero-trust security. This just isn’t possible without decentralized solutions, and we are excited to work with Privatyze on building this solution.","The Privatyze team is excited to announce a collaboration with IndicioID to develop a robust, decentralized data marketplace. By Tim Spring Privatyze, an innovative startup for privacy-respecting data monetization solutions, has announced a collaboration with Indicio, a global leader in developing the infrastructure for trusted data ecosystems, to help guide the Privatyze team as they develop a robust, decentralized data marketplace using the Indicio Network. The new Privatyze platform will provide an open and secure peer-to-peer environment for market participants to trade data directly, with full user consent and support for enrollment and discovery. The result will be a more efficient, secure, and transparent platform than any offered by traditional data markets. Users can be compensated for the use of their data. It will mean that those with data to trade will be in complete control of the process, removing the data privacy and security problems of third-party data control. The use of verifiable credentials to manage authentication and decentralized identifiers to generate unique P2P encryption for every transaction will provide the zero-trust level of assurance needed for participation. “We are excited to partner with Indicio to deliver a world-class data monetization platform that meets the stringent security and privacy requirements of our customers,” said Madison Majeed, CEO of Privatyze. 
“With the rise of big data and the increasing demand for secure ways to monetize this information, this partnership represents an important step forward in the decentralization of data and the evolution of privacy-respecting technologies.” “In a data-driven economy, we need a marketplace for data that’s inclusive and not exploitative,” said Heather Dahl, CEO of Indicio. “That means that participants can meaningfully consent to data transactions and do so in a way that enables zero-trust security. This just isn’t possible without decentralized solutions, and we are excited to work with Privatyze on building this solution. This collaboration represents an important step towards the development of decentralized data markets and will help to promote data ownership, transparency, and privacy protection for all participants.” About Privatyze: Privatyze is a San Diego-based technology startup on a mission to end the Surveillance Data Economy, and siloed-centralized data monopolies, known to many as Surveillance Capitalism. To do this they’re enabling everyday people to take ownership of their digital footprint, data, and privacy and turn it into a valuable digital asset. Privatyze is empowering users to take control of their data by giving them the freedom to collect and store their own data and information, verify its validity, and take it to the marketplace, where they can claim their piece of the $300 billion big data and digital advertising industries. Privatyze was grown out of Launch Factory, which brings together talented founders, elite advisors, seed capital, vetted technology, and an accelerator program to give entrepreneurs the unfair advantage they’re looking for. Launch Factory partner Bill Orabone said, “Privatyze is capitalizing on exactly the type of wide-ranging, big thinking opportunity we seek. 
Everyone knows that data privacy is a huge problem and Privatyze’s approach hits directly at its center.” Learn more about Privatyze at Privatyze.io About Indicio: Indicio builds software and infrastructure needed to create and manage Trusted Data Ecosystems for the exchange of high-value information, data assets, the creation of marketplaces, and the development of new business models around trusted data. TDEs simplify data compliance through privacy-by-design architecture and continuous Zero-Trust security. Specializing in financial, healthcare, and travel markets, Indicio’s global decentralized network and information management products enable customers all over the world to issue, hold, and verify data through encrypted digital credentials that can repeatedly and efficiently confirm data authenticity from its source without the expense or risk of direct integrations. Indicio TDEs boost bottom-line profit, mitigate costly risks and enhance a company’s reputation for information privacy. Originally published on EIN news",https://indicio.tech/privatyze-collaborates-with-indicio-to-build-a-decentralized-data-marketplace/,,Post,,Meta,,,,,,,,2022-03-03,,,,,,,,,,,,,
|
||
Indicio,Indicio,,,,,,,,,What Indicio’s Seed Funding Means for Decentralized Identity Technology,"Our [new funding](https://apnews.com/press-release/AccessWire/technology-business-4dbf651613d76693bc479321f7b041f5) will be used to refine the open-source, decentralized-identity technology stack. We have the basic technology for a functional ecosystem, now we improve that functionality by adding all the features, user interfaces, and management tools that make it easier to deploy, use, and monitor.","Maturity and universality are now the goals Since the launch of Indicio two years ago, we have focused on identifying and filling in the gaps in decentralized identity technology that limited its functionality. Launching an open-source mobile agent and developing machine-readable governance were critical to delivering real-world solutions. As we saw how these solutions worked to solve problems, we came to understand how these components worked in concert to deliver what we now call a Trusted Digital Ecosystem. A Trusted Digital Ecosystem is our shorthand for the simultaneous impact of decentralized identity technology on authentication, communication, governance, privacy, and security. A Trusted Digital Ecosystem can be simple—indeed, as we advise our customers, it’s always better to start by solving the simplest use case first, see how the technology works, and then expand. Therefore, having a core set of interoperable components is essential. They can be arranged to manage increasingly complex interactions in any kind of informational ecosystem, and to facilitate interaction across ecosystems. Technology stack: From basic to universal Our new funding will be used to refine the open-source, decentralized-identity technology stack. We have the basic technology for a functional ecosystem, now we improve that functionality by adding all the features, user interfaces, and management tools that make it easier to deploy, use, and monitor. 
We will add new engineering team members to help design, implement, and test this new software. These engineering efforts will synergize with Indicio’s future technology roadmap which will focus on expanding the adoption of Trusted Digital Ecosystems in the marketplace. Our goal is for Trusted Digital Ecosystems to serve as complete solutions in many vertical markets and to be as easy to deploy as they are easy to use. Continuing our open-source community support The new funding will also enable us to continue and extend our work with the open-source community by providing leadership to working groups, code maintenance and review, and our own direct code contributions. We see open source as the key to both expanding decentralized identity and to continued innovation. At the same time, and following on our successful training workshops for the Hyperledger Foundation, we believe that ongoing education is critical. We have created the most comprehensive, hands-on training available in decentralized identity, whether in the basics or in the more technical aspects of the stack, and we will continue to refine these offerings—as well as develop new courses focused on how to implement complete solutions. Indicio has also pioneered “interop-a-thon” events, where companies and organizations come together to test their products and solutions for interoperability. With two interop-a-thons under our belt (hosted through the Cardea Project at Linux Foundation Public Health), we are more certain than ever of their importance to finding glitches, spurring adoption of standardized protocols, and fostering confidence in the technology. Interop-a-thons also provide a chance to see the future potential of verifiable credential systems that are interoperable, where, for example, a credential issued by a government can be used in a wide variety of contexts and thereby gain in value to issuers, users, and verifiers alike. 
It’s one thing to claim this as a possibility; it’s another to see it happen. We will devote more resources in the coming year to interop-a-thons because they are one of the clearest ways of visualizing the value of decentralized identity as a network of networks—and the clearest way to accelerate this happening.",https://indicio.tech/what-indicios-seed-funding-means-for-decentralized-identity-technology/,,Post,,Meta,,,,,,,,2022-05-04,,,,,,,,,,,,,
|
||
Indicio,SITA,,,,Aruba Health Department,,,,,"SITA, Indicio pave way to safer travel experience with launch of Aruba Health App","SITA, together with [Indicio.tech](https://Indicio.tech/) and the Aruba Health Department, are trialing the Aruba Health App, a pilot that makes it easy for visitors to share a trusted traveler credential – based on their health status – privately and securely on their mobile device. This credential will provide access to participating hospitality sites on the island.","SITA, together with Indicio.tech and the Aruba Health Department, are trialing the Aruba Health App, a pilot that makes it easy for visitors to share a trusted traveler credential – based on their health status – privately and securely on their mobile device. This credential will provide access to participating hospitality sites on the island. Aruba is focused on rebuilding its tourism industry in the wake of the COVID-19 pandemic while ensuring that the risk of infection from people visiting the island is minimized. The island has been less impacted by the pandemic than many other countries globally and is open to most tourists. To ensure their safety, and that of the island’s residents, all tourists are required to provide a negative PCR test taken 72 hours before flying. Using the Aruba Health App, visitors to the island who have provided the required health tests to the Aruba government will be issued with a unique trusted traveler credential, using blockchain technology. This credential then can be verified by hotels, restaurants, and entertainment venues through the unique QR code on a visitor’s mobile device without sharing any private data. The digital credential also enables the Aruba government to restrict visitors from leaving their hotel rooms until they have received a negative PCR test result. 
Unlike immunity passports, these verifiable digital credentials are part of a technology known as decentralized identity, an evolutionary leap forward in individual privacy protection and security. It allows users to share only a trusted verification that they have the relevant documentation to complete their transaction, without having to share personal information. This credential can be used across the journey. This trial is yet another step towards SITA developing a more durable, secure travel credential that could combine all travel documents such as passport, visa and health information into a single credential that puts the traveler’s privacy first. This credential will provide verification that a traveler has the right documentation, making border crossings more automated, with checks done before departure. Diana Einterz, SITA President for the Americas, said: “Giving travelers the ability to share verifiable health data with relevant stakeholders throughout their journey will help expedite the industry’s recovery. It is vital to ensure we open borders safely and securely, and this trial puts us one step closer to a single travel token that will give passengers more control and convenience by allowing them to securely share their credentials with governments, airports, and airlines from their mobile device.” A recent poll from IATA highlighted that 78% of passengers who took part in the survey would only use a travel credential app if they have full control over their data. Heather Dahl, CEO of Indicio, said: “With the decentralized identity ecosystem we’ve built with SITA for Aruba, we’ve created a path to a better future. We can reopen economies and restart travel without people having to give up their privacy. We’re not just solving a pandemic problem; we’re solving a privacy and security problem. That’s because this technology was designed from the outset to respect a person’s right to control their own data and identity. 
Aruba and SITA have taken a global lead on privacy-first identity technology for travel, and their pioneering efforts are laying the foundation for a fairer world to come.” Dangui Oduber, Aruba’s Minister of Tourism, Public Health and Sport, said: “The Aruba Health App is fundamental in balancing the dual challenges of reopening our island to tourism while managing the risks of COVID-19. By providing a trusted traveler credential, we can be sure that visitors have the right documentation needed to move freely around the island while making the verification of that trusted status easy without having to divulge personal information. That is a revolutionary step forward.” To see how the Aruba Health App, using blockchain, creates a secure travel credential, watch this video.",https://www.sita.aero/pressroom/news-releases/sita-indicio-pave-way-to-safer-travel-experience-with-launch-of-aruba-health-app/,,Press,,Meta,,Travel,COVID,,Aruba Health App,,,2021-05-05,,,,,,,,,,,,,
|
||
Indicio,Indicio,,,,,,,,,Indicio’s Associateships: A starting point for the next generation of professionals,"Learning new skills in a new industry and technology is always good for growth, both personally and professionally. It’s particularly important for those in college to be able to dip their toes into different worlds off campus and get a sense of the kind of work they want—or don’t want—to pursue in life.","“I can’t think of anything I’ve done that I’ve been more proud of.” By Tim Spring Learning new skills in a new industry and technology is always good for growth, both personally and professionally. It’s particularly important for those in college to be able to dip their toes into different worlds off campus and get a sense of the kind of work they want—or don’t want—to pursue in life. Indicio is committed to finding and cultivating the next generation of technical and business professionals through our part-time Associates Program. Associates work side-by-side with experienced staff on all our key projects. It’s an intense learning experience with a ton of opportunities to learn new skills and take on responsibilities. Our unique position as an early-stage startup means there is much to do and our associates quickly get to see the impact of their work. It is not uncommon for new associates to be given a crash course on decentralized identity, maybe some relevant exercises to bring them up to speed, and then have them working on client projects within a week or two of accepting an offer. The team is, of course, always behind them to help with the quick transition and any questions they might have, but we believe our associates have what it takes to fix any problems a client may experience, and to interact with them to ensure our reputation for customer satisfaction at every step of the way. Specifically, this year saw the implementation of a more structured 90-day review and mentorship program that we are particularly excited about. 
While mentorship programs are fairly common, at Indicio we don’t require our associates to stay inside their team and, in fact, we encourage them to experience other parts of the business. An associate on the technical team might have a mentor in finance to give them a more well-rounded view of the business and help with their career development. We hold 90-day reviews so associates and their managers can mark progress towards longer-term goals and provide a chance for the associate to both receive and give feedback on the program. What former Indicio associates say about their experience “The Associate Program at Indicio has been great for me. I got the position while I was still in school, and I was able to work while I finished my degree. When I started, I only knew JavaScript, and I’d never worked on a mobile app before. I learned React Native on the job, and now I work on apps every day. After school, I went from associate to full-time without missing a beat, so the Indicio Associate Program totally jump-started my career.” — Sabrina Jensen, Mobile Software Engineer, Indicio, former associate “I’ve appreciated the balance of challenge and support I’ve received as an Indicio associate. I’ve been able to work on client projects, take ownership of my tasks, learn new technical skills, and more with the help of a mentor and a team who is happy to answer questions and help out with issues.” — Char Howland, Software Engineer, Indicio, former associate “My associate experience at Indicio has been phenomenal. Initially, I was a little nervous; this was my first time programming for work rather than just for school. Beyond that, it was my first “office job” and my first time working from home. A few weeks into the program, after I’d gone through the orientation and trainings, I was given what turned out to be my favorite project I’ve ever worked on. 
Without going into too much detail, I was given a very basic structure and some project requirements and told to get to work. Of course, the team was super willing to offer help and guidance when I ran into snags, but in large part, I was given responsibility for the project. I’ve been working on that project as time permits ever since, and there’s always more to polish and update with it, but the bulk of the project is functional and working smoothly. I can’t think of anything I’ve done that I’ve been more proud of.” — Micah Peltier, Software Engineer Associate In the coming year, we’ll be running more events for associates, so they can get to know each other better outside of client projects; we’ll be providing more professional development, so that we can help associates structure and progress in their careers; and, critically, we’ll be giving them the time to find, develop, and work on passion projects. If you are interested in becoming an Indicio associate, keep an eye on our careers page!",https://indicio.tech/indicios-associateships-a-starting-point-for-the-next-generation-of-professionals/,,Post,,Product,,,,,,,,2022-03-08,,,,,,,,,,,,,
|
||
Indicio,AccessWire,,,,,,,,,"Indicio Launches Proven, A Complete Trusted Digital Ecosystem For Sharing Actionable, Trustworthy Data","Indicio Proven is how you get actionable data without sacrificing privacy or security, said Heather Dahl, CEO of Indicio. ""What makes data actionable is that it can be trusted. You can prove its source. You can prove it hasn't been faked or tampered with. Decentralized identity has long been seen as the solution to the interconnected problems of verification, privacy, and security. With Indicio Proven, the marketplace now has a range of ready-to-use products to implement that solution and create Trusted Digital Ecosystems for sharing and verifying data at any scale.""","Indicio Launches Proven, A Complete Trusted Digital Ecosystem For Sharing Actionable, Trustworthy Data Authenticate and share high-value data, make it immediately actionable, preserve privacy and enhance security with Indicio Proven™, a complete, open source solution for using decentralized verifiable credential technology. SEATTLE, WA / AccessWire / July 20, 2022 / Indicio, the market leader in developing Trusted Digital Ecosystems to verify and exchange high-value information, today announced the launch of Indicio Proven™, its flagship solution for authenticating and sharing high-value data while preserving privacy and enhancing security. Indicio Proven is an off-the-shelf, end-to-end system that delivers open source technology to help companies, organizations, and public sector agencies deploy and configure their own interoperable trusted digital ecosystems using verifiable credentials. Proven data means actionable data. Indicio Proven is a solution that moves at the speed of business. Traditional processes for verifying digital data and identity are complex, costly, and ineffective at dealing with the challenges of digital commerce in an age of increasing fraud and friction. Proven can be quickly integrated into existing systems in a cost effective way. 
Companies can develop customer-centric solutions to meet the demands of an evolving digital marketplace. ""Indicio Proven is how you get actionable data without sacrificing privacy or security,"" said Heather Dahl, CEO of Indicio. ""What makes data actionable is that it can be trusted. You can prove its source. You can prove it hasn't been faked or tampered with. Decentralized identity has long been seen as the solution to the interconnected problems of verification, privacy, and security. With Indicio Proven, the marketplace now has a range of ready-to-use products to implement that solution and create Trusted Digital Ecosystems for sharing and verifying data at any scale."" Indicio Proven makes decentralized identity technology simple. It provides complete scalable components needed to get up and running fast: - Issuer and Verifier Agents: Simple software to connect, issue, and verify credentials; integration APIs available - Mobile App and Mediator: Software to enable users to download, store, and use a credential on mobile devices - Machine-Readable Governance: Agent software to establish trusted issuers and automate information flows via governance files - Distributed Ledger Network: Configuration and deployment on existing Indicio Networks or any Hyperledger Indy-based distributed ledger network or a custom, public or private network - Verifiable Credential Schema: A flexible template for creating a verifiable credential using open source and interoperable standards - Support and Training: Continuous customer support, field-leading training covering every aspect of Proven and Trusted Digital Ecosystems - Maintenance and Updates: Managed updates and comprehensive testing to ensure maximum performance Indicio Proven is built on the Hyperledger Indy and Hyperledger Aries codebases, the most widely used open source code for interoperable, decentralized identity solutions, leveraging AnonCreds and W3C credentials - and with years of contributions from Indicio and 
the active developer community, the most robust and advanced. This makes Proven interoperable with other systems and components, so companies can build at a pace that meets their needs and scale as fast as they desire. And they get to do this on the systems they already own. ""Indicio Proven is the ‘easy button’ for adopting verifiable credentials,"" said Ken Ebert, CTO of Indicio. ""There can be a steep learning curve to building with open source, and we designed Proven to flatten that curve by providing everything a team would need to remove roadblocks. It means we provide expert professional hosting, support, training, and scheduled updates, and it means enterprises can immediately start issuing verifiable credentials and launch their own solutions quickly."" To discuss how you can use Indicio Proven to build Trusted Digital Ecosystems, contact us here. SOURCE: Indicio",https://www.accesswire.com/viewarticle.aspx?id=708970,,Press,,Product,,,,,Indicio Proven,,,2022-07-20,,,,,,,,,,,,,
|
||
Indicio,Indicio,,,,,,,,,Indicio expands workshops and introduces a new certification program,"Looking to get up to speed in decentralized digital identity and verifiable credentials? Our custom trainings and workshops are designed to provide key insights into fundamental use cases, dive deep into the impact of the technology, and offer both technical and nontechnical audiences the opportunity to ask the questions they need to continue their education in decentralized identity. Because nothing beats learning by doing.","Responding to strong demand, Indicio now offers certification upon completion of its virtual, instructor-led decentralized identity training. Technology provider Indicio.tech, a public benefit corporation advancing decentralized identity, today announced a new professional certification program. Program participants completing Indicio’s popular instructor-led workshops, including the newest workshops on the open source Hyperledger Aries Mediator Agent and Mobile Agent, are awarded certificates of completion issued in the form of a verifiable credential. “We first envisioned Indicio training as the gateway to understanding decentralized identity,” says Heather Dahl, CEO of Indicio. “Decentralized technology, its terminology, and its technical process can be confusing to grasp on your own, even though the actual process of issuing, holding, and verifying credentials is surprisingly easy. So our thought was that we should give people the opportunity to learn about decentralized identity through hands-on activities guided by an experienced instructor. It turned out that this delivered the ‘aha’ moment for workshop participants and inspired them to want to know more.” “Building this into a certificate program recognizes the need and the demand for systematic training in our field,” says Dahl. 
“And it only makes sense the certificate is issued in the form of a verifiable credential, using the Entrustient platform, an innovator in credentials for governments, education, employment, travel, health, and finance industries.” “We’re thrilled to support the issuance of Indicio’s training credentials for participants who’ve successfully demonstrated skills attainment and program completion,” says Tim Dutta, Chairman and CEO of Entrustient. “Indicio’s workshops are designed to enable the next generation of trained professionals to be fluent in decentralized identity technology. By using Entrustient’s innovative, redundant, and decentralized distributed blockchain ledger platform, recipients of these credentials will now have full agency and the ability to provide a proof presentation of their earned accomplishments to any third-party verifying organization that relies on certifications from Indicio, the original credential issuer.” Expanding decentralized identity curriculum The evolution of Indicio’s training workshops comes as companies and organizations are increasingly seeing decentralized identity as the solution to the longstanding challenge of verifying identity online, a critical process that has only become more urgent with the impact of the Covid pandemic on all aspects of our lives. Designed for participants ranging in technical experience, skill level, and area of specialization, Indicio’s workshops provide the groundwork needed for both business and engineering team members to build and run decentralized networks and the applications that run on them. Participants will receive a Certification of Completion and a verifiable credential that can be shared with peers and networks across social platforms and add ‘Decentralized Identity’ to their resumes. 
Indicio’s instructor-led workshops provide: - Hands-on learning covering each step of decentralized identity: These courses cover topics ranging from the fundamentals to network operations and agent mediation, with more courses expected to be added throughout the year. Additionally, self-paced labs and demos are also available, providing further hands-on experience. - Certificate of completion: The instructor-led courses conclude with a certificate of completion and professional emblem for sharing with networks and display on social media. - Continuing professional development: By participating in the Indicio certification community, individuals can use their training completion certification to gain access to networking opportunities and ongoing community events within the identity community. Indicio will soon be expanding its certification program with workshops on Mobile Agents, including mobile application user interface design and user experience best practices. These workshops target design, graphics, and product teams, providing the opportunity to learn more about the growing field of decentralized identity. Future planned workshops also include technical writing, communicating, and governance for decentralized identity-based products and networks. “Since Indicio launched its training program, we continue to expand our courses to meet the demand for more and more technical knowledge,” says Ken Ebert, Indicio CTO. “But people also want to understand the business value of decentralized identity and learn about user experience and design. We are lucky to have some of the best engineers and business experts in decentralized identity on our team. They’re actively building identity solutions for global enterprises—so who better to teach and learn from?” To learn more about our workshops and how to enroll, click here. 
ABOUT US Indicio.tech provides technology development services for decentralized identity, and offers a complete software ecosystem for business, consumer, and mobile applications to issue, verify, and exchange verifiable digital credentials. Founded on the belief in privacy and security by design, Indicio supports the open source and interoperability goals of the decentralized identity community. As a Public Benefit Corporation, Indicio is committed to advancing decentralized identity as a public good that enables people to control their identities online and share their data by consent. Identity and application teams rely on Indicio’s simplicity, extensibility, and expertise to make identity work for everyone.",https://indicio.tech/indicio-expands-decentralized-identity-workshops-and-introduces-new-certification-program/,,Product,,Product,,,,,,,,2021-03-08,,,,,,,,,,,,,
|
||
Indicio,Indicio,,,,,,,,,Prove it all with Indicio Proven,"Sometimes called “self-sovereign identity,” or “user-centric identity,” or “reusable identity,” the open source technology behind Proven provides an authoritative way to authenticate any data without having to check in with the source of that data — or go through a third party.","Indicio launches its marketplace solution to using and scaling verifiable digital credentials using open source technology. By Trevor Butterworth Today, we launch Proven™ — a complete, decentralized ecosystem for using verifiable credentials to create, share, and verify data of any kind while preserving privacy and enhancing security. It means the future’s here. Sometimes called “self-sovereign identity,” or “user-centric identity,” or “reusable identity,” the open source technology behind Proven provides an authoritative way to authenticate any data without having to check in with the source of that data — or go through a third party. “Verifiable credentials are a new foundation for trust in digital interaction,” says Heather Dahl, CEO of Indicio. “They are a simple, powerful solution to the internet’s missing verification layer for people — but they go way beyond that. They can be used to verify the identity of devices and digital objects; they can verify any kind of data associated with an identity. And they deliver breakthrough privacy and security features. This technology does so much that we say it creates “Trusted Digital Ecosystems.” Proven is designed to be a complete starter kit for creating your own Trusted Digital Ecosystem. It contains all the components needed to create, share, and verify data through verifiable credentials, along with continuous upgrades and customer support from Indicio’s experienced engineering team from integration through implementation. “We wanted to make it easy to start using verifiable credentials by giving people an off-the-shelf solution,” says Ken Ebert, CTO of Indicio. 
“To do that the product had to be complete. It couldn’t leave the customer searching for components and struggling with compatibility. It couldn’t leave the customer dependent on proprietary tech or struggling to master open source codebases. As leaders in the open source community, we believe open source is critical for adoption, scale, and innovation, but an unfamiliar codebase is still going to be a heavy lift for any development team. Proven removes this obstacle, while remaining fully open source so a customer can develop on it to meet their needs.” “We all believe in the power of this technology to solve chronic problems in digital interaction,” says Dahl, “but we also see Proven as a gateway to opportunity. We’ve seen our customers win awards for building Trusted Digital Ecosystems. Now, we want to seed that innovation as widely as possible. With our experience as market leaders in decentralized identity, we know what works. We know what’s needed. Proven is it.” To learn more about how you can implement Proven, you can get in touch here.",https://indicio.tech/prove-it-all-with-indicio-proven/,,Product,,Product,,,,,,,,2022-07-20,,,,,,,,,,,,,
|
||
Indicio,Indicio,,,,,,,,,Introducing the Indicio DemoNet—a new decentralized network for product demonstration,"The Indicio DemoNet joins the [Indicio TestNet](https://Indicio.tech/Indicio-testnet/), which is used for developing new technology releases, and the [Indicio MainNet](https://Indicio.tech/Indicio-mainnet/), which hosts mission-critical products and services. With the DemoNet, Indicio now provides a full suite of networks for decentralized identity development and deployment.","Business-critical decentralized identity product demonstrations now have a dedicated platform at Indicio. Indicio.tech, a public benefit corporation providing hosting and build services for decentralized identity ecosystems, today announced it has launched a new decentralized network to support business-critical demonstrations. The Indicio DemoNet joins the Indicio TestNet, which is used for developing new technology releases, and the Indicio MainNet, which hosts mission-critical products and services. With the DemoNet, Indicio now provides a full suite of networks for decentralized identity development and deployment. “The DemoNet completes our network offerings, filling an important gap in the journey from proof of concept to pilot to deployment,” said Heather Dahl, Indicio CEO. “Companies and organizations need a dedicated platform to demonstrate and showcase their technology before it makes the leap to public and commercial release. This is a critical moment for development teams, and Indicio has their back.” Typically, a company will develop on the TestNet, demonstrate on the DemoNet, and then launch on the MainNet. Unlike the TestNet, which is subject to repeated resets, the DemoNet provides the stability needed for product demonstrations. Indicio’s ecosystem of networks speeds up the time to launch—everything is in one place and supported by our industry leading team of engineers. 
The launch of the DemoNet comes after Indicio’s recent announcement of Cardea, a complete open-source project for verifiable health credentials that is now housed at Linux Foundation Public Health. Indicio now provides businesses and organizations with a full ecosystem of decentralized identity products, all built on the Hyperledger community’s open source tools, a dynamic, collaborative Node Operator community supporting its networks, and a wide range of training and support services. Contact us to learn more.",https://indicio.tech/introducing-the-indicio-demonet-a-new-decentralized-network-for-product-demonstration/,,Post,,Resources,,,,,,,,2021-05-20,,,,,,,,,,,,,
|
||
Indicio,Indicio,,,,,,,,,One woman’s open-source journey to decentralized identity,"Noha Abuaesh, a Bahrain-based computer scientist, has been exploring decentralized identity for the last year, often with assistance from Indicio.tech’s open-source tools and free communications channels.","Noha Abuaesh, a Bahrain-based computer scientist, has been exploring decentralized identity for the last year, often with assistance from Indicio.tech’s open-source tools and free communications channels. She took a moment to answer some questions about her work and her journey in the decentralized identity space. (The conversation has been edited for length and clarity) What’s your background and how did you come to be interested in decentralized identity? I graduated computer engineering in 2007. My graduation project was in robotics. I worked as a technical writer for a while before I completed a master’s in computer science in 2014. My thesis was on embedded systems. Then, I became a dedicated homemaker until late 2018, when I decided to explore my career possibilities. I felt like one of the seven sleepers, (who slept for about 300 years to avoid persecution; they appear in both the Quran and Christian tradition —ed.) if you know their story. Because in the insane world of computer science, if you snooze for four years—or even less—it can feel like you have been sleeping for centuries! I really didn’t know where to start. Long story short, I took a couple of professional courses, then a nanodegree on blockchain development. I built some projects on Ethereum and Bitcoin. But it wasn’t until last year when I came through a LinkedIn post on self-sovereign identity. I was intrigued. So, I started reading about it, took an edX course on self-sovereign identity and Hyperledger technologies. I was blown away with the potential of this field. I knew I wanted to continue there. I thought, if I am to plan for the next five years of my career, I want to be in this area. 
What kind of decentralized identity work are you doing now? I am now building my first SSI wallet using Hyperledger Indy, experimenting with what can be done with it and exploring its possible use cases. It is command line-based, just a proof-of-concept kind of a thing. I can work on the interface later (or maybe never). There are many cool general-purpose SSI wallets out there with pretty cool user interfaces. What are your goals in exploring decentralized identity? Anyone who learns about decentralized technology knows the enormous potential it has in many different applications. Decentralized identity will touch the lives of billions of people in the future. It promises to be the magic wand to a lot of the problems many people are living with now. Who doesn’t want to be part of that? I am looking to be a field expert in this area, inshallah. I am really hoping I can do something with it for the good of humanity. What Indicio resources have you used in your explorations? I used Indicio’s TestNet. They have a strong support system. They also have documentation with very clear steps for what to do and what to expect when you run your software development kit. I couldn’t find that anywhere else. I made use of that and other articles as well. At some point, I left a note on their website, not really expecting anything. I was surprised to find their response in my mailbox a couple of days later. They offered a FREE 30-minute session to answer my questions! What have been your challenges or obstacles to overcome? Installing the Indy SDK on Windows was challenging for me. Either because it is the very first thing I faced when I decided to work on this, or because it really is a cumbersome process. Nevertheless, I got that working at the end, thank God. The community is relatively small. At first, my questions didn’t seem to get any attention when I posted them in the public help group. 
When you compare that with other developer communities that, sometimes, answer your questions within an hour or two, well — it was disappointing. It is really tough when you are new to something and you don’t know where to go for assistance. However, Indicio’s engineering team, I have to say, are doing a great job serving the community in this area. Another challenge I faced was outdated documentation. It took me some time to get some of the information together. Also, some features are not very well-documented. At times, I had to refer to Jira threads to know what is going on. I am so glad they are keeping these public! What’s next for you as the field continues to accelerate? Well, to be honest, I really am not sure what my next step will be. But I am certain that the future holds good things both for me, and for decentralized identity. * * * Ready to begin your decentralized identity journey? Look no further than the Indicio TestNet, which provides an independent and reliable decentralized network for the exchange of verifiable credentials. Beyond the technology, Indicio also can provide concierge-level support and training to make your journey a successful one.",https://indicio.tech/one-womans-open-source-journey-to-decentralized-identity-with-the-help-of-indicio-tech/,,Post,,Resources,,,,,,,,2021-03-29,,,,,,,,,,,,,
|
||
Indicio,Indicio,,,,,,,,,Why the Indicio TestNet is the Best Way to Explore Decentralized Identity,"The Indicio Network contains three networks: a MainNet for hosting solutions, a DemoNet for demonstrating products, and a TestNet for development and experimentation. We’ve made the TestNet free for developers to use — making it ideal for exploring, building, testing, and demonstrating your ideas.","The Indicio TestNet is a robust platform that’s free to use for developers, making it the best way to dive into the powerful technology behind decentralized identity. By Tim Spring The Indicio Network contains three networks: a MainNet for hosting solutions, a DemoNet for demonstrating products, and a TestNet for development and experimentation. We’ve made the TestNet free for developers to use — making it ideal for exploring, building, testing, and demonstrating your ideas. Using the open-source technology of Hyperledger Indy, Ursa, and Aries, the Indicio TestNet has a stable 100% uptime, is monitored by professional staff, and offers limited technical support (for greater technical support we offer a range of highly competitive plans). Why Indicio? There are several decentralized networks that offer some form of testing network. But here’s why the Indicio TestNet is the best option: - It’s free - It’s professionally staffed and monitored: We don’t rely on volunteers to keep our networks running — our networks are supported by engineers whose job is to ensure they are always stable and accessible. - It’s stable: We understand the critical importance of network stability; the Indicio TestNet clocks at 100% uptime. - Tools are already set up for ease of use: We have a monitoring tool and a scanning tool already implemented; they can tell you which nodes are live and what has been recently written to the network at a glance. - Cross-network test ready: We see a future of interoperable credentials and networks. 
Our TestNet is ready to test your interoperability with other products from other networks. Issue and verify on multiple networks, explore the possibilities! - Straightforward governance: We believe in simple and streamlined governance with clear principles and rules. You’re here to build — not hear us philosophize! - Node Operator Program: For those that want to really dive into decentralized identity we offer the unique opportunity to host a node on the network. This will give you hands-on experience running part of the network as well as significantly more technical support. See the perks here: https://Indicio.tech/node-operator-program/ - Community resources: Because our network is based on open-source technology, we have a repository of additional resources we can point new users to for general information and assistance. This provides a more immediate place to seek help with your project in addition to the more hands-on support offered by our technical team.",https://indicio.tech/why-the-indicio-testnet-is-the-best-way-to-explore-decentralized-identity/,,Post,,Resources,,,,,,,,2022-02-01,,,,,,,,,,,,,
|
||
Indicio,Indicio,,,,,,,,,Indicio completes Hyperledger Indy DID Method—A Milestone in the Evolution of DID Interop,The Indy DID Method paves the way for Hyperledger Indy credentials to scale globally by allowing Indy networks to seamlessly interoperate and create a “network-of-networks” effect.,"The completion of the Indy DID Method by Indicio paves the way toward a network of networks. Verifiable credentials issued on multiple networks can now be verified by any agent that supports did:indy, affirming Hyperledger Indy and Hyperledger Aries as the most advanced framework for interoperable, decentralized identity By Ken Ebert Network interoperability has taken a major leap forward with the release of the Indy DID Method for Hyperledger Indy-based networks. With this new upgrade, completed by Indicio, verifiable credentials issued on a specific Hyperledger Indy network can now be resolved by any agent supporting the did:indy method regardless of which Hyperledger Indy network the verifying agent might use to anchor its issuing DIDs, schemas, or credential definitions. The Indy DID Method paves the way for Hyperledger Indy credentials to scale globally by allowing Indy networks to seamlessly interoperate and create a “network-of-networks” effect. The Indy DID Method was also needed to bring Hyperledger Indy—the most popular open-source codebase for creating robust distributed ledger networks for identity—into sync with the more recent World Wide Web Consortium (W3C) Decentralized Identifier (DID) Specification. The Indy DID Method originally began as a community development effort within Hyperledger Indy. Earlier this year, the government of British Columbia, Canada, announced a “Code with Us” challenge, to push the effort to completion. The challenge was won by Indicio and, thanks to the hard work of our talented engineers, the Indy DID Method is now available to the entire Indy community. 
It’s hard to overstate why this is a really important step forward for decentralized identity adoption. It means adding the potential of scale to every deployment, which is something we and all our customers want. With interest in verifiable credential technology increasing every day, the timing could not be better. It’s also important to recognize that this is the kind of rapid innovation that can be achieved in open source technology when a nonprofit-led community, a government, and an enterprise collaborate. We are enormously grateful to the government of British Columbia for sponsoring this “Code with Us” challenge. We also applaud it: This is a model for open source infrastructural innovation that governments everywhere should learn from and follow. The next step is for networks and agent frameworks to incorporate did:indy into production software stacks. This community adoption will increase the viability of the Indy and Aries project stack and position it to be the globally dominant way to issue and share verifiable credentials in a multi-ledger world. *** The Indicio team would like to thank BC Gov for funding this work and Dominic Wörner, another contributor to the Code With Us challenge, for his work on Indy VDR. - Where to find the work: PR to Indy Node: https://GitHub.com/Hyperledger/indy-node/pull/1740 - PR to Indy VDR: https://GitHub.com/Hyperledger/indy-vdr/pull/84 - Indy HIPE about did:indy: https://GitHub.com/Hyperledger/indy-hipe/tree/main/text/0164-did-indy-method - Demo: https://GitHub.com/Indicio-tech/did-indy-demo - Where to ask questions: Daniel Bluhm (Indy Node questions), Discord: dbluhm#9676, GitHub: https://GitHub.com/dbluhm - Dominic Wörner (Indy VDR questions), Discord: domwoe#9301, GitHub: https://GitHub.com/domwoe",https://indicio.tech/indicio-completes-hyperledger-indy-did-method-a-milestone-in-the-evolution-of-decentralized-identity-network-interoperability/,,Post,,Standards,,,,,,,DID:Indy,2022-05-10,,,,,,,,,,,,,
|
||
Indicio,Indicio,,,,TOIP,,,,,Machine Readable Governance is the Key to Scaling Decentralized Trust,"We’re not convinced that “constraint” is the right theoretical approach for an emerging technology, especially one that is being deployed in different sectors for different use cases. To underscore this, we want to address a particular constraint implied by ToIP’s design concepts that is likely to be fatal to any deployment.","Where do you put a trust registry in a decentralized digital ecosystem? Not where it turns into a wrench The Trust over IP Foundation has just published a long document describing a set of design principles “to inform, guide, and constrain the design of… decentralized digital trust infrastructure.” We’re not convinced that “constraint” is the right theoretical approach for an emerging technology, especially one that is being deployed in different sectors for different use cases. To underscore this, we want to address a particular constraint implied by ToIP’s design concepts that is likely to be fatal to any deployment. This follows from the design concept of “transitive trust,” which can be summarized by the deduction that If A trusts B and B trusts C, then A can trust C. In other words, if a Verifier trusts an Issuer, it should logically trust a Holder bearing a digital credential that is verified as being from that Issuer. This is how passports work. To scale this “trust triangle” for ecosystems where there are many, many issuers of digital credentials, ToIP proposes that the triangle must become a “governance trust diamond,” where a governance authority rules on which Issuers can be trusted by Verifiers. This sounds reasonable and straightforward; someone, inevitably, is going to set the rules for an ecosystem and we need to acknowledge that someone in the architecture. How could any verifier know all the possible issuers of a particular kind of credential (say a lab test result) in anything but a very small network? 
Wouldn’t the simplest way be to ping a trust registry or a rules engine under the control of a governance authority to get that information? Yes and no. Yes, because all ecosystems are going to need governance; no, because governance handled through a centralized trust registry or rules engine will, at best, be inefficient, and at worst, be unworkable. If it doesn’t work offline, it doesn’t work. The fundamental problem with a centralized trust registry is that it’s dependent on real-time calling and this makes the whole system dependent on being able to make those calls. What happens when the connection goes down — or the Internet connection is weak or intermittent? You can’t have a trusted ecosystem that is only capable of delivering trust some of the time. There is, however, a simple solution to this fatal system error—decentralize the governance so that the trust registry rules are cached locally, in the software for issuers, holders, and verifiers. This means these rules will work offline. We call this “machine-readable governance.” Instead of calling the trust registry to verify in real time, governance authorities publish their rules in files that can be quickly uploaded and propagated across participants in a network. This has the added benefit of making verification quicker as there is no need to check in with an intermediary. Think of machine-readable governance as a “smart” trust registry — it makes the governance authority portable. There’s also another significant benefit to using machine-readable governance: it allows for more complex governance interactions such as “A trusts B and B trusts C, but A only trusts C for some purposes or in some contexts.” A machine-readable governance file makes these “if this, then that” governance rules easy to implement without any sharing of private information with a trust registry. Diamond of Trust or Ring of Power? 
We understand that in any ecosystem for verification and data sharing, there needs to be a governance function—where people get to enact governance as to who can participate and how. But it’s not clear that it is wise to encode this function in a third-party entity that functions as the sole source of truth for the entire network. What if some participants want to reject some or all the governance—should they be excluded from the ecosystem? Another advantage in avoiding a single centralized trust registry model is that it allows multiple governance authorities to coordinate governance rules in hierarchical ways, such as national rules superseding state or local rules. These multiple “opinions” are all available to verifiers who can then choose which combination is important as they evaluate presented credentials. The buck stops at the verifier, and nuanced interpretation is possible. This makes an ecosystem capable of mapping to the governance requirements that exist in the real world. The phone home problem. A centralized trust registry also raises the problem inherent to any centralized exchange: It knows your business—who’s verifying whom, and who’s using which credentials where. This kind of surveillance runs counter to the spirit of decentralized and self-sovereign identity—especially when you combine it with the next point: For whom the Trust Registry tolls. A centralized trust registry opens the door to monopolistic business practices and rent seeking. If you allow a third party to erect a toll booth, it will charge a toll. Great for the third party, not so great for everyone using the road. Unsurprisingly, when third-party trust registries and rules engines are created in the real world, this is what happens. Where sovereign authorities must be in control. In our experience, national governments and global enterprises want to be in control of the things they are supposed to control and are held accountable for. 
That’s why they prefer machine-readable governance to governance by third parties. For all these reasons, we recommend ToIP add the concept of machine-readable governance to its design principles and explore the many ways it can be implemented. A machine-readable governance reference implementation For those interested in machine-readable governance, Indicio will shortly make a reference implementation available through the Linux Foundation Public Health’s open source Cardea Project for digital health credentials. On March 17, Cardea will then run a second “Interop-a-thon” for companies using Hyperledger Aries Agents to practice implementation in interoperable environments. Keep checking in with Cardea for more details—or join the Cardea Community Group! And, if you want to discuss how machine-readable governance could solve your information flow needs now, then contact us!",https://indicio.tech/where-do-you-put-a-trust-registry-in-a-decentralized-digital-ecosystem-not-where-it-turns-into-a-wrench/,,Post,,Standards,,,,Governance,,Trust Registry,,2022-02-15,,,,,,,,,,,,,
|
||
Indicio,Indicio,,,,,,,,,Scale Your Decentralized Identity Solution by Upgrading to the Indy DID Method,"Again, the Indy DID Method is not an optional upgrade. It’s a major development that delivers interoperability.","Indicio takes the stress out of managing this essential upgrade with two new integration packages By James Schulte The Hyperledger Indy DID Method is a major step forward in interoperability. As Indicio’s CTO, Ken Ebert, put it, “The Indy DID Method paves the way for Hyperledger Indy credentials to scale globally by allowing Indy networks to seamlessly interoperate and create a ‘network-of-networks’ effect.” But this can only happen if all those currently using Hyperledger Indy and Hyperledger Aries update their operating systems, nodes, and agents to use the new Indy DID method. We can’t overemphasize this enough: If you want interoperability between Indy networks, you really have to have this. These code changes build in the resolution of DIDs, schemas, and other ledger objects to the network which contains them. Without these code changes it is very difficult to support multiple networks. We understand this can be a time-consuming process and, in the case of upgrading agents, it could be a heavy lift. So why not let the authors of the code implement it for you? Enter Indicio’s DID:Indy Integration Service Packages! Two packages: One for agents and one for networks Upgrading customer agents is the most complex part of the process and where you’ll benefit most from knowledgeable implementation. We’ll also provide all the training you need to use the new update. Agent Package - Cloud Agent updates for issuers, verifiers, and holders - Mobile Agent updates for holders We can also upgrade your networks if you want to save time—or if you don’t have a network operations person to do this work for you. Networks Package - Full network operating system updates - Node software updates for each running node Again, the Indy DID Method is not an optional upgrade. 
It’s a major development that delivers interoperability. We’re here to make it simple and stress free. Contact us for further information!",https://indicio.tech/scale-your-decentralized-identity-solution-by-upgrading-to-the-indy-did-method/,,Post,,Standards,,,,,,,DID:Indy,2022-05-23,,,,,,,,,,,,,
|
||
Indicio,Indicio,,,,,,,,,The Perfect Signature Style is the Enemy of the One that Works Today,BBS+ signature styles are not going to be ready for deployment anytime soon. This is precisely why you should build today and in a way that allows you to add them later.,"BBS+ signature styles are not going to be ready for deployment anytime soon. This is precisely why you should build today and in a way that allows you to add them later. Sam Curren, Senior Architect New technology is inevitable: some of it will be evolutionary, some of it will be revolutionary; some of it will eat your business, and some of it might change the world and make life better. How do you know when to wait and when to jump? This is the dilemma for many people looking at decentralized identity right now. Is it ready to be implemented, are there “off-the-shelf” products I can use—or will it all be so much better next week or in a month or in a year? The dilemma seems to divide the decentralized identity community. Standards groups and initiatives advocate for the best possible solution, in the hope that it will eventually exist, while companies building solutions—like Indicio—say “build now because what we have works and works well and can be added to later. The ‘better’ may never come but the good—especially if open source—will continually get better and be relatively easy to upgrade.” But we believe our position is not just a matter of business logic: There’s a massive downside to letting failing technology—our current centralized and federated ways of managing identity—continue to fail businesses and consumers, citizens and governments. This downside is vastly greater than any of the differences between decentralized identity technologies that can be used today, those in development, and those hypothesized as being available sometime in the future. Don’t turn BBS+ into a minus This issue is not abstract. 
Right now, there is much discussion around JSON-LD BBS+ being “the” standard for managing verifiable credential key signatures in decentralized identity systems. The Good Health Pass, for example, recommends BBS+ for Covid digital credentials. BBS+ is good, and Indicio is excited about adding it to its options for customers building decentralized solutions. But we can’t do so because BBS+ is still under development and it’s unclear when the final version will be available. Meanwhile, we have JSON-ZKP-CL Signatures that provide the ingredient BBS+ is working to add: privacy-preserving predicate (zero-knowledge) proofs and blinded identity binding. Predicate proofs mean that you are able to generate a proof of something—such as age—without having to disclose the actual information, and they are a boon to preserving privacy. When thinking about BBS+, it is important to remember that credential format is just one part of a larger system that must be developed. Governance, Issuance and Verification agents, Holder apps, and more all need to be implemented; user experience must be developed; business relationships created: Decentralized identity is an ecosystem of infrastructure, software and governance working together as a product. All of these things can be deployed using existing production-ready credential formats. And the gains made now will translate into the future adoption of BBS+. The bigger point is this: Decentralized identity is at a breakthrough point. Governments in Canada and Germany have decided that verifiable credentials are the way forward; pilots and consumer products are being unveiled on a weekly basis. This is not the moment to say, “let’s wait;” this is the moment to say “let’s scale.” At Indicio, we’ve shown how to make decentralized ecosystems work to solve real problems for lots of customers. In building, we’ve advanced the tech. In advancing the tech, we’ve built more solutions. 
This is the virtuous cycle of innovation and scale that we’re creating. We will add BBS+ into our products when it is available. But until then, we’re going to build solutions that BBS+ can be added to—and we think you should too.",https://indicio.tech/the-perfect-signature-style-is-the-enemy-of-the-one-that-works-today/,,Post,,Standards,,,,,,BBS+,,2021-11-22,,,,,,,,,,,,,
|
||
Indicio,Indicio,,,,,,,,,Trust Registry or Machine-Readable Governance?,"The world will move towards decentralized identity if we make it easy for them to do so—and easy means, above all, fast. The solution is machine-readable governance—a smart way of implementing rules for how to manage trust.","The world will move towards decentralized identity if we make it easy for them to do so—and easy means, above all, fast. The solution is machine-readable governance—a smart way of implementing rules for how to manage trust. If you want a high-speed train to go fast, you need the right kind of track. It needs to be laser-straight, have few, if any, crossings, and be free of slower freight trains. Unfortunately, the U.S. has, mostly, the wrong kind of rails: lots of crossings, lots of freight trains, and lots of curvy and unaligned tracks. One section of the Northeast Corridor can’t handle train speeds above 25 mph. And while billions will soon be spent on new high-speed trains that are lighter, more capacious, and more energy efficient, they will still run on the same rails at the same speeds. As we race ahead with decentralized identity networks—Ontario’s announcement of its Digital ID program is the most visible sign yet that we are in an accelerating phase of a paradigm shift on identity—we face lots of infrastructural choices, the answers to which could put us in an Amtrak-like bind. If you think of a decentralized identity network as a set of rails that allow information to be issued, shared, and verified, this process should be as frictionless and fast as possible; and it is, because it is powered by software—called agents—that enable consent and trust at every point in the system. Once you decide that an issuer of a verifiable credential is trustworthy, verifying their credentials is straightforward. You can also apply all kinds of rules at the agent level to govern more complex information requirements in a frictionless, automatic way. 
A verifier agent could be programmed to accept only certain kinds of tests from a laboratory, or only tests from approved laboratories at a national or international level. The ability to do this instantaneously is essential to adoption. This is why machine-readable governance, which takes place at the agent layer, is integral to the successful deployment of any kind of decentralized trusted data ecosystem: It’s a real-time way to handle governance decisions—the Boolean choreography of ‘if this, then that’—in the most frictionless way possible. This also means that a network can organize itself and respond as locally as possible to the constant flux of life and changes in information and rules. Not everybody wants the same thing or the same thing forever. Machine-readable governance therefore functions as a trust registry—literally a registry of whom to trust as issuers and verifiers of credentials—and as a set of rules as to how information should be combined, and for whom, and in which order. It can also power graphs—sets of connections—between multiple registries. This means that different authority structures can conform to existing hierarchical governance structures—or self-organize. Some entities may publish their ‘recipe’ for interaction, including requirements for verification, while others may simply refer to other published governance. When everyone knows each other’s requirements, we can calibrate machine-readable governance to satisfy everyone’s needs in the most efficient way possible. Choreographing this complex workflow at the agent level delivers the speed needed by users. The elements of machine-readable governance Machine-readable governance is composed of elements that help to establish trust and enable interoperability: trusted participants, schemas (templates for structuring information in a credential), and rules and flows for presenting credentials and verifying them. Machine-readable governance can be hierarchical. 
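The elements listed above can be sketched as a small, cacheable governance file. The following Python example is purely illustrative: the field names, DIDs, and file layout are hypothetical, not a published standard, but it shows how locally cached trusted issuers, schemas, and ‘if this, then that’ rules let a verifier agent decide with no real-time call to a central registry.

```python
# Hypothetical machine-readable governance file, as a governance
# authority might publish it. All field names and DIDs are invented
# for illustration; this is not a standardized format.
GOVERNANCE = {
    'authority': 'did:example:health-authority',
    'trusted_issuers': ['did:example:lab-1', 'did:example:lab-2'],
    'schemas': ['covid-test-result-v1'],
    'rules': [
        {'if': {'schema': 'covid-test-result-v1', 'test_type': 'PCR'},
         'then': 'accept'},
        {'if': {'schema': 'covid-test-result-v1', 'test_type': 'self-test'},
         'then': 'reject'},
    ],
}

def verify_offline(credential, governance):
    # The governance file is cached locally by the agent, so this
    # check needs no connectivity and no central trust registry.
    if credential['issuer'] not in governance['trusted_issuers']:
        return False
    for rule in governance['rules']:
        cond = rule['if']
        if (cond['schema'] == credential['schema']
                and cond['test_type'] == credential['test_type']):
            return rule['then'] == 'accept'
    return False  # default-deny anything the rules do not cover

cred = {'issuer': 'did:example:lab-1',
        'schema': 'covid-test-result-v1', 'test_type': 'PCR'}
assert verify_offline(cred, GOVERNANCE)
```

Because the file is plain data, an authority can publish it once and agents can pre-cache it, which is what makes offline verification and fast lookups possible.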
Once a governance system is published, other organizations can adopt and then amend or extend the provided system. In the diagram above, Government A has published a complete set of governance rules. Government B selected Schema 1 for use and added its own rule and flow to the governance from Government A. Federal Medical Assn. C created its own list of trusted issuers (C1, C2), selected Schema 1 for use, and layered customized governance on top of the governance that Government A publishes. State Medical Assn. D has taken the layered governance selected by Federal Medical Assn. C and duplicated everything except its list of issuers. If we have this fantastic, high-speed way to verify in decentralized networks where, then, is the Amtrak problem? It lies in the belief that the best way to do governance is to divert all traffic through a centralized trust registry. This trust registry will be run by a separate organization, third party business, or consortium which will decide on who is a trusted issuer of credentials—and all parties will consult this single source of trust. Here’s why this isn’t a good idea: First, the point of high-speed rails is speed. If you must ping the trust registry for every look up, then you have created a speed limit on how fast verification can take place. It will slow down everything. Second, a trust registry creates a dependence on real-time calling when the system needs to be able to function offline. A downloadable machine-readable governance file allows pre-caching, which means no dependence on spotty connectivity. Given that we want interoperable credentials, it’s a little bit naïve and first-world-ish to assume the connection to the trust registry will always be on. Third, a centralized trust registry is unlikely to be free or even low cost, based on non-decentralized real-world examples. Being centralized it gets to act as a monopolist in its domain, until it needs to interact with another domain and another trust registry. 
Before you know it, we will need a trust registry of trust registries. With each layer of bureaucracy, the system gets slower and more unwieldy and more expensive. This kind of centralized planning is likely to only benefit the trust registry and not the consumer. And it can all be avoided if governments and entities just publish their rules. The kicker is that as the trust registries currently envisioned cannot yet accommodate rules for choreographing presentation and verification, it’s literally a case of ripping up the high-speed track and replacing it with slower rails. Yes, the analogy with Amtrak isn’t exact. The tracks that crisscross the U.S. are legacy tech while decentralized identity is entirely new infrastructure. But trust registries are an example of legacy thinking, of bolting on structures conceived for different systems and different infrastructural capacities. We can, with machine-readable governance, have smart trust registries that map to the way governments, local, federal, and global, actually make decisions and create rules. We also move further away from a model of trust that depends on single, centralized sources of truth, and toward zero trust-driven systems that enable fractional trust—lots of inputs from lots of sources that add up to more secure decision making. But most of all, we use the rails we’ve built to share information in a secure, privacy-preserving way in the fastest, most efficient way possible.",https://indicio.tech/trust-registry-or-machine-readable-governance/,,Post,,Standards,,,,Governance,,,,2021-09-28,,,,,,,,,,,,,
|
||
Jolocom,,Jolocom,,Joachim Lohkamp,W3C; DIF; INATBA; ESSIFLab; EBSI; T-Labs; IOTA,"European Union, Germany, Berlin",Europe,,,JoloCom,"Jolocom builds global infrastructure to support decentralized digital identity management. Smart agents own and control the data that defines them, a prerequisite for self-sovereign identity. ",,https://www.jolocom.com,,Company,,Company,Enterprise,ID,SSI,,Smart Wallet,"Ethereum,BigchainDB","Verifiable Credentials,DID,Social Linked Data",2002,https://github.com/jolocom,https://twitter.com/getjolocom,https://www.youtube.com/channel/UCmpF6TdeLM2H6XcpZI2ceBg,https://stories.jolocom.com/,https://stories.jolocom.com/feed,,https://www.crunchbase.com/organization/jolocom,https://www.linkedin.com/company/jolocom/,https://jolocom-lib.readthedocs.io/en/latest/,,,,
|
||
Jolocom,Jolocom,,,,Solid,,,,,Trusted Data Sharing with Social Linked Data (Solid) and Ethereum,"At the core of Solid is the WebID, which Jolocom integrates with the Ethereum blockchain, to build a self-sovereign digital identity that allows you to represent yourself and to enrich your data with semantic meaning. Besides that and storing data, it also lets other applications ask for your data. Solid authenticates the DApps (Decentralized Applications) through Access Control Lists (ACLs) and if you’ve given access permission to the requester of the data, the Solid server delivers it.","Trusted Data Sharing with Social Linked Data (Solid) and Ethereum This post intends to give the reader a perspective on how Jolocom brings trusted data sharing to IoT (AGILE is an H2020 project). It should provide essential value to the user, not only but especially in the context of the Internet of Things, and not least to benefit from the EU’s General Data Protection Regulation (GDPR). The original idea of the World Wide Web To start with, the vision of Jolocom aligns with the original idea of the World Wide Web, which was distributed: everyone would have their own node (e.g. home page), everyone would share their content (e.g. blog posts), and everyone would own their own data. The web consisted of nodes connected through links with no center. Jolocom wants to help reclaim this vision that everyone owns their own node (digital identity) and that every node can communicate with any other node, with no intermediation (e.g. centralized platform). The dominating power of a few Today a handful of companies dominate vast parts of the web’s activities — Facebook for social networking, Google for searching, Paypal for payments or eBay for auctions, Samsung/IBM for IoT — and they actually own the data their users have provided and generated. Ergo these companies have unprecedented insight and power over us. 
They can influence and nudge us without our knowledge, which not only gives them a huge competitive advantage, but also interferes with fundamental values of society and the right to privacy. Social Linked Data (Solid) and Blockchain (Ethereum) Jolocom uses a decentralized software architecture that is very promising. It was initiated by Tim Berners-Lee who invented the web and gave it to us as a gift, free and open source. His new project is called Solid (“social linked data”) and it allows you to own your own data, while also using it with only the applications you want to use. With Solid, you simply store your data in your own Personal Data Store (PDS; in Jolocom’s case: a Solid Server), which is hosted wherever you wish. At the core of Solid is the WebID, which Jolocom integrates with the Ethereum blockchain, to build a self-sovereign digital identity that allows you to represent yourself and to enrich your data with semantic meaning. Besides that and storing data, it also lets other applications ask for your data. Solid authenticates the DApps (Decentralized Applications) through Access Control Lists (ACLs), and if you’ve given access permission to the requester of the data, the Solid server delivers it. Here’s a concrete example. You might store data from your IoT devices or sensors in your own PDS: the sort of data about yourself that would normally be uploaded directly from your IoT device to a third party. That way, if someone built a new DApp, to offer specialized services to people, you could join it by using your WebID. To share information with others (individuals or organisations), you simply give them permission to access the appropriate information in your PDS. The data in your PDS would remain your own, in every sense of the word: fully under your control, stored where you choose, and usable only by an Organization’s WebID that you’ve given permission to. 
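The permission flow just described can be sketched in a few lines of Python. This is a toy model only: real Solid servers express ACLs in the Web Access Control ontology (Turtle documents), not Python dicts, and the WebIDs and resource paths below are invented for illustration.

```python
# Toy model of Solid-style access control: the Personal Data Store (PDS)
# consults an Access Control List (ACL) before releasing any data.
# All WebIDs and resource paths are illustrative placeholders.
ACL = {
    '/pds/iot/heart-rate': {'https://clinic.example/profile#me'},
    '/pds/profile/address': {'https://clinic.example/profile#me',
                             'https://shop.example/profile#me'},
}

STORE = {
    '/pds/iot/heart-rate': '72 bpm',
    '/pds/profile/address': 'Berlin',
}

def request_data(resource, requester_webid):
    # Serve the resource only if this WebID was granted access;
    # otherwise nothing leaves the PDS.
    if requester_webid in ACL.get(resource, set()):
        return STORE[resource]
    return None

assert request_data('/pds/iot/heart-rate',
                    'https://clinic.example/profile#me') == '72 bpm'
assert request_data('/pds/iot/heart-rate',
                    'https://shop.example/profile#me') is None
```

Revoking access is then just removing a WebID from the ACL entry; the data itself never moves.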
The fantastic thing about Solid is that it does all this without having to centralize information in hands that we can’t — and too often also should not — fully trust. General Data Protection Regulation (GDPR) Users are becoming increasingly aware of the need for and importance of strong data rights. Governments are slowly adapting to this, with the upcoming EU General Data Protection Regulation as the first move towards a market in which businesses will have to adapt with new business models and technical infrastructure. With the decentralized web as an answer to these needs, users will be able to use services they want to interact with, data will be stored in their own private location, and they will be able to switch between them. This will allow and encourage a market with a significantly lowered barrier to innovate, one in which collaboration between players is much more favourable than competition. Without the main competitive advantage of data, network effects and vendor lock-in will become virtually obsolete. We help businesses create and participate in collaborative decentralized ecosystems where the value generated by their services benefits the ecosystem as a whole. GDPR compliance is now mandated by May 2018. This means businesses are now required to show exactly how the data they collect is used and enables them to freely take this data with them to different services. Conclusion Social Linked Data with its decentralized architecture has the properties to profoundly enrich trust, data portability, and privacy. At the same time it will step up usability to a whole new level for both the user and service providers, while simultaneously becoming compliant with GDPR. Author: Joachim Lohkamp, Jolocom https://Twitter.com/JockelLohkamp",https://stories.jolocom.com/trusted-data-sharing-with-social-linked-data-solid-and-ethereum-in-the-internet-of-things-iot-7dc242944624,,Post,,Ecosystem,,DWeb,Web3,,,Ethereum,,2017-06-20,,,,,,,,,,,,,
|
||
Jolocom,Jolocom,,,,T-Labs; BigchainDB; IOTA; Riddle+Code,,,,,PRESS RELEASE: T-Labs (Deutsche Telekom) announces project with major blockchain startups,"Benefiting from the expertise in Berlin, T-Labs partnered with BigchainDB, IOTA, Jolocom and Riddle & Code to abstract the complexity of blockchain development for enterprises. With the prototype, developers can combine different DLTs to enable decentralized storage, identity management, smart contracts and payments. This allows enterprises to build a decentralized back-end in a matter of minutes.","PRESS RELEASE: T-Labs (Deutsche Telekom) announces project with major blockchain startups The blockchain group from the Deutsche Telekom Innovation Laboratories (T-Labs) launched its prototype operating stack service this week at the Bosch Connected World (BCW) 2018 conference and hackathon. The service was created to simplify the decision-making process for developers wondering which blockchain technology to use… Dear Reader, We have moved this article to Jolocom Logbook, our official new blog since 1st July 2020. For the full story, visit Jolocom.io/blog/press-release-t-labs-deutsche-telekom-announces-project-with-major-blockchain-startups",https://stories.jolocom.com/press-release-t-labs-deutsche-telekom-announces-project-with-major-blockchain-startups-e6ac451d8b3,,Press,,Ecosystem,,,,,,,,2020-07-04,,,,,,,,,,,,,
|
||
Jolocom,DWebMeetup,,archive,,,,,,DWebMeetup,Jolocom's lightning talk at DWeb meetup - Self-sovereign Identity In Germany,"A brief video introduction to use cases, strategies and challenges of the four German SDI projects.","112 Views. Uploaded by Unknown on March 26, 2021.",https://archive.org/details/jolocom-at-dweb-march-self-sovereign-identity-in-germany,,Video,,Ecosystem,,,,,Recap,,,2021-03-26,,,,,,,,,,,,,
|
||
Jolocom,Jolocom,,,,,,,,,5 years of decentralizing identities,"Joachim kicked off the conversation with a recap of Jolocom stemming from its initial founding in 2014. Why did Joachim decide to found Jolocom as a decentralized identity company? Read the full story here. Beyond the tech, Joachim also touched on the importance of community and building the space into a vibrant network of individuals committed to the values of decentralization. In this spirit, this was also the year he worked with Brewster Kahle and Wendy Hanamura of the Internet Archive to found Get Decentralized.<br>","We were elated to do just that with 50+ partners, friends, teammates — new and former — community members and +1s of all kinds at our new home in Betahaus Kreuzberg. 2014 Joachim kicked off the conversation with a recap of Jolocom stemming from its initial founding in 2014. Why did Joachim decide to found Jolocom as a decentralized identity company? Read the full story here. Beyond the tech, Joachim also touched on the importance of community and building the space into a vibrant network of individuals committed to the values of decentralization. In this spirit, this was also the year he worked with Brewster Kahle and Wendy Hanamura of the Internet Archive to found Get Decentralized. 2015-2016 Eugeniu then came on to speak about early Jolocom products from a developer’s point of view. Where we originated with the Freedom Box, a privacy-oriented Personal server running SOLID (social linked data) to enable Personal data storage wherever you wish, to our development of our first lightweight wallet, Little Sister (the opposite of Big Brother), to the decentralized identity Wallet, library, and protocol that Jolocom is proud to showcase today. It was during this time (January 2016) that we joined the Agile Horizon 2020 project to provide a trusted data sharing provider using SOLID and Ethereum for IoT devices. Want to learn more about our early work with Horizon 2020 and our tech of the day? 
Check here. 2017 Kai then expanded by providing an overview of some of our more recent work with partners and other community members like: - Deutsche Telekom T-Labs and Riddle&Code to build a fully decentralized e-mobility ecosystem - Stad Antwerpen, the Flemish government innovation procurement department, VICTOR, and Digipolis to bring decentralized identity services to municipal employees and citizens in Flanders - INATBA, the International Association of Trusted Blockchain Applications, of which Kai is a newly elected Board Member - Bundesblock, of which Joachim was a founding member, and with whom we authored the Self-Sovereign Identity Position Paper #SSIPaper with other leaders in the SSI space Many, many more were mentioned during the event. For more information on who we are working with, visit our partners page. 2018 Ira took the stage next to talk about design at Jolocom. Key to creating products usable by people are good UX and UI. Ira highlighted how she created a new visual identity for Jolocom with input from the team and built the interfaces you see when you interact with Jolocom — both online and off! — today. Ira also gave an introduction to #DWebDesign, one of three DWeb meetup communities. For more on upcoming DWeb Design events, visit the DWeb Berlin page on Meetup. For more on past events and words from Ira, check out our Design stories. 2019 Evolved from the GetD community but with its rebrand and launch in March 2019, DWeb continues to be a thriving community with chapters in Berlin and San Francisco with new branches (soon to be!) cropping up in Toronto and more. Ellie elaborated on Jolocom’s work as curators of the DWeb Berlin community with a look back at some of our past events, and a look ahead at the culminating event — DWeb Camp — slated to take place July 2019. Registration is open now. For more: Lastly, the newest members of our team introduced themselves and what they do here at Jolocom. 
On the development side, that meant a brief hello from Charles, Sean Li and Mina, our newest developer from Cairo, and from Sean, our technical writer. Visit our team page to find out more about who we are! From there, guests were encouraged to try out our demos, contribute feedback to DWeb Berlin, and get to know one another — which they did, until almost midnight. We want to thank everyone who came, and offer a special thanks to Ana Gogole from Moeco.io for taking so many of the above photos! See you next year!",https://jolocom.io/blog/5-years-of-decentralizing-identities/,,Post,,Meta,,,,,,,,2019-05-29,,,,,,,,,,,,,
|
||
Jolocom,Jolocom,,,,,,,,,Eight Years of Jolocom,"On our 8th birthday, we are grateful to look back on milestones, developments, and challenges that we have overcome. Beyond the tech, we believe in the importance of community and building a space for individuals and companies that are committed to the values of decentralization. We are happy to share our story. ","Jolocom’s role is to empower everyone and everything capable of having a self-sovereign identity to freely communicate and share information with each other. Based on this principle, our company was founded back in 2014. While a lot has changed since then, our mission still remains the same. Read the full story of why Joachim Lohkamp decided to found Jolocom as a decentralized identity company here. On our 8th birthday, we are grateful to look back on milestones, developments, and challenges that we have overcome. Beyond the tech, we believe in the importance of community and building a space for individuals and companies that are committed to the values of decentralization. We are happy to share our story. We especially and sincerely thank all of our partners, communities, advisors and individuals who have been contributing to what Jolocom is today. 2015-2016 These years saw the development of our first lightweight wallet, Little Sister (the opposite to Big Brother), the decentralized identity Wallet, library, and protocol. It was during this time (January 2016) that we joined the Agile Horizon 2020 project to provide a trusted data sharing provider using SOLID and Ethereum for IoT devices. Want to learn more about our early work with Horizon 2020 and our tech of the day? Check here. 2017 As cannot be stressed often enough, we strongly believe in the spirit of community and are thus proud to work with a bunch of partners that inspire and motivate us further. 
Some of those include: - Deutsche Telekom, T-Labs and Riddle&Code to build a fully decentralized e-mobility ecosystem - Bundesblock, of which Joachim was an initiator and founding member, and with whom we authored the first-of-its-kind Self-Sovereign Identity Position Paper #SSIPaper, with many leaders from the SSI space - Stad Antwerpen, the Flemish government innovation procurement department, and Digipolis to bring decentralized identity services to municipal employees and citizens in Flanders - INATBA, the International Association of Trusted Blockchain Applications, of which Kai is a Board Member For more information on who we are working with, visit our partners page. 2018 Jolocom released its technical whitepaper, a comprehensive introduction to the Jolocom Protocol for digital identity. Those principles also found their way into our next edition of the SmartWallet (replacing the alpha version from March 2017), which was released in March 2018, running on our decentralized identity protocol. Furthermore, we joined the “Blockchain on the Move” Project, partnering with the Flemish government, the goal being to return control over their identity data back to citizens. 2019 One of our principles is that of interoperability. To apply it in practice, we took part in an interoperability-focused proof of concept. The scope of the project was to achieve interoperability across a multistep use case, called OSIP 2.0. In the same year, we had the pleasure of traveling to the headquarters of Deutsche Telekom in Bonn for the official launch of Xride. This fully decentralized e-mobility pilot was initiated by T-Labs and built in collaboration with Riddle & Code, Bundesdruckerei, Simple Mobility, and Jolocom. Find out more about our role in this project here. Started in 2014, the DWeb community took off in February 2019 and soon formed a thriving network of Nodes with Jolocom as the leader of DWebBLN Node (formerly Digital Identity Meetups Berlin). 
2020 Starting the new decade right, Jolocom participated as SSI technology provider in 4 of 11 competing regional showcase projects developing ecosystem solutions. The Schaufenster Sichere Digitale Identitäten (SDI) innovation competition was funded by the German Federal Ministry for Economic Affairs and Energy. Eventually, our projects ONCE, SDIKA, and ID-IDEAL were selected for a three-year implementation phase. The SDI projects are special because they bring a broad and diverse group of stakeholders to the table who are working together to kick-start an ecosystem for decentralized identities. Interoperability is at their core, so that the use of digital identification does not remain inefficient. Furthermore, the Jolocom SmartWallet passed a GDPR compliance audit, a major milestone on the way to becoming fully production-ready. 2021 In spring we began the implementation phase for the project ONCE, followed by ID-Ideal in the summer of 2021. In the initial phase, the projects focused extensively on the development, implementation and integration of the technical components. Soon, however, the implementation of the first use cases from the areas of public administration, mobility and tourism will come to the fore. Over the years, Jolocom pulled off a great number of project implementations. The focus here is of course on SSI. But Jolocom has also been able to acquire a lot of knowledge through active work in the field of technology and in networks of various bodies (e.g. W3C, DIF, INATBA) and consortia (e.g. eSSIF-Labs, SDI lighthouse projects). All that, in turn, helped to build out our core solution stack. A great example is the ConDIDi project, where Jolocom built a use case for decentralized conference participant management together with the TIB Leibniz in Hannover, which can be tried out here. Tech developments and projects with Jolocom are constantly evolving. 
To Jolocom the community matters, and we believe that core protocols need to be shared to enable an open infrastructure. This is why Jolocom implemented two essential building blocks for SSI (DIDCommv2 and KERI) in Rust, and donated them to DIF (Decentralized Identity Foundation). Also, Jolocom implemented did:stack:v2 for the Stacks Foundation. Jolocom’s SSI integration for Stacks addresses the issue of properly verifiable and secure information from different parties. 2022 The year started with the launch of the Jolocom Agent to further complement the full Jolocom end-to-end solution. The Agent is for creating, issuing and managing verifiable credentials, and for defining, performing and managing verifications. The Agent provides a Graphical User Interface for ease of use and for rapidly building out your use case. An API is available for seamless integration into your backend. Of course much more has happened. A good way of keeping track of our progress, partnerships, community and more is getting the monthly Jolocom SSI Digest in your inbox. What’s next: besides the launch of multiple use cases in the second half of 2022, we are also working hard to ship the Jolocom SDK 2.0 to you. Stay tuned!",https://jolocom.io/blog/eight-years-of-jolocom/,,Post,,Meta,,,,,,,,2022-05-24,,,,,,,,,,,,,
|
||
Jolocom,Jolocom,,,,,,,,,FAQ,"We get a lot of questions about what we are doing and what our solution is all about. We think now is a good time to do our first round of FAQ. This post is intended to be a living document. We will update it frequently to keep it current and relevant. So in case you have questions that are not covered here or on our webpage, feel free to drop them as a comment directly under this post or contact us via Twitter.","FAQ We get a lot of questions about what we are doing and what our solution is all about. We think now is a good time to do our first round of FAQ. This post is intended to be a living document. We will update it frequently to keep it current and relevant. So in case you have questions that are not covered here or on our webpage, feel free to drop them as a comment directly under this post or contact us via Twitter. Questions covered are: - What problem does Jolocom solve? - Who is the target group? - What is the SmartWallet? - What are claims and verified claims? Where is the difference and why should I care? - How does Jolocom treat my data and what is the Personal store? - If developers are interested, what can they use? - What are the advantages for services/apps/dapps? - What do you store on the Blockchain? - How can I verify my information? - Can I also verify my own claims or claims from others?…coming soon - Can I find the app in the app store? - What is your Roadmap? …coming soon - How can I contribute? …coming soon Let’s get started! 🚀 What problem does Jolocom solve? It’s 2017. When we look around us we see a world which is dominated by data slavery. In this world a few big companies own a big share of your data. Although you are the one generating this data, you neither control nor own it, and you are definitely not the one monetizing it. 
You pay with your privacy and navigate the chaotic digital landscape with numerous usernames and passwords, supplying the data silos of big corporations with a never-ending stream of your personal data. At Jolocom we want to change this. We think that we need to move away from data slavery to a world of data sovereignty where you are in control of your data. Taking control of your data starts with your personal information — with your identity. We are developing a solution to realise this vision. Who is the target group? Jolocom’s user-facing app is the SmartWallet. This app is geared towards data-conscious people who want to take back control of their data footprint and effortlessly navigate the digital world. What is the SmartWallet? The SmartWallet is our user-facing app. Think of the SmartWallet like your normal physical wallet where you carry your IDs and money, but with smart functionalities on top. The SmartWallet is an app that lets you manage your identity-related data like email address, phone number, or ID card. Get an overview of your personal details and verified information easily with our app. Like with the physical wallet in the real world, you can use the SmartWallet in the digital world to identify yourself or pay for things (currently only ether is supported). So the next time you register with a car sharing company, you can use the SmartWallet app and log in with one click instead of creating usernames and passwords. Privacy lies at our heart, so we make sure that your data stays private and you are always aware and in control of which data you share with whom. So in the case of the car sharing company, you would be prompted with a screen which shows exactly the information requested by the company, leaving you the choice to accept or decline their request. All your data is stored and managed in your ‘Personal store’ which you control. Jolocom has no access to this store, nor does anybody else without your permission. 
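The permission flow described above can be sketched in a few lines. This is a hypothetical illustration only, not actual Jolocom code: the class and field names (PersonalStore, respond, grants, etc.) are invented for the sketch.

```python
from dataclasses import dataclass, field

# Hypothetical sketch of the consent screen logic described above:
# a service requests specific attributes, the wallet shows exactly
# that request to the user, and data leaves the Personal store only
# after explicit approval. Names are illustrative, not a real API.

@dataclass
class PersonalStore:
    attributes: dict
    grants: dict = field(default_factory=dict)  # service -> granted attribute names

    def respond(self, service: str, requested: list, approved: bool) -> dict:
        # Nothing is shared unless the user explicitly approves the request.
        if not approved:
            return {}
        shared = {k: self.attributes[k] for k in requested if k in self.attributes}
        self.grants[service] = list(shared)   # remember who got what...
        return shared

    def revoke(self, service: str) -> None:
        self.grants.pop(service, None)        # ...so access can be revoked later

store = PersonalStore({'name': 'Alice', 'email': 'a@example.org', 'drivers_license': 'B'})
# The wallet would display only the requested field to the user before sharing:
shared = store.respond('car-sharing-co', ['drivers_license'], approved=True)
```

The point of the sketch is the asymmetry: the service names what it wants, but the user's store decides what is released and keeps a record of every grant, which is what makes later revocation and the at-a-glance overview of connections possible.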
If you grant another party access to information in your store, like your driver’s license in our car sharing company example, you can always revoke it. Moreover, you can always quickly and easily check which service has access to which data. This provides you with an instant overview of your connections. As our app supports ether, the SmartWallet enables you to plug into the Ethereum infrastructure seamlessly. So when other applications provide smart contract functionality to support their service offering, you can access them conveniently with our app. What are claims and verified claims? Where is the difference and why should I care? A claim is a piece of information that you or others say about you, like your email address for example. Now when you communicate your email address to your friend, she will likely assume that this claim is true because she has known you for many years and your relationship is based on mutual trust. However, when you deal with parties where no previous relationship and no trust is in place, providing a claim is likely not enough. You would want to trust the information without necessarily trusting the party involved. This is where verification comes in handy. A verified claim is a piece of information about you which was checked and cryptographically signed by another party. It is always transparent who the other party is — and when you accept or provide the verified claim, you can choose to trust this information without knowing the owner of the claim itself. So when you register with a car sharing service and present a verified claim of, let’s say, your driver’s licence, which was signed by an official authority, they will likely accept it. Because you communicate these verified claims digitally, it makes your life easier. Now you can freely move between different applications and on-board on new ones with just a couple of clicks. How do you treat my data and what is the Personal store? 
Your data is stored in your Personal store. As we have decentralized the whole logic to manage your identity details, permissions etc., we don’t execute any processes for you. You can navigate the digital world independently. For this, you need to enable your Personal store. This is done during our onboarding phase. The Personal store can run anywhere you feel comfortable with, but it should ideally be available 24/7. A common solution to this can be a hosting provider (e.g. one located in Iceland), a freedom box, or, if you already have your own web server, you can set up your Personal store there. Don’t worry, we tried to make this process as easy as possible and pre-selected some suggestions for you. We think a Personal store is very important. Especially if we think ahead to a world where dapps just rent your data with your permission instead of owning it. If developers are interested, what can they use? Developers can implement the functionality of Single-Sign-On (Jolocom SmartLogin) on their web application so that users with a SmartWallet can easily onboard and log in. The advantage is that with this implementation neither you nor your customers/users need to take care of usernames and passwords anymore but can use verified information from the user. Moreover, developers can use the Jolocom SDK to ‘connect’ the functionality of their smart contracts to the SmartWallet. This has the big advantage that users with a SmartWallet can use these smart contracts easily. It also removes the need for the app/dapp developer to implement ether transaction functionality or educate/onboard new users. This also allows the developer to display method descriptions of the smart contracts used to the SmartWallet user in a very user-friendly way, which brings transparency to the whole process, even for non-technical people. What are the advantages for services/apps/dapps? 
The advantage for a service or dapp is that you can control which minimum information you require from users of your application and whether this information has to be verified. This removes a lot of headaches, like checking whether a user is a human and not a bot, running your own verification department to check driver’s licences, or storing sensitive user data. Note that this functionality is made available to you through our SmartLogin solution. Moreover, using our Ethereum integration (Jolocom SDK) you can implement smart contract logic and make it available for your users through our SmartWallet. It builds a bridge to the blockchain world for your users. What do you store on the Blockchain? We don’t store anything on the blockchain by default. However, we have Ethereum integrated in our solution which gives you the opportunity to create an identity contract with the SmartWallet app. When you do this, the only information that we would store is your identifier and your public key. No personal information like your name or email address will ever be stored on the blockchain. How can I verify my information? When you sign up, you can get your phone number and email address instantly verified with Jolocom. For verification of e.g. your ID card or driver’s licence, you would need to go to a partner which carries out the verification process and signs your claim (e.g. a bank for ID card verification). We are currently working on establishing a network of these partners so that you can find one conveniently in your city. Remember that you would ideally just do the verification process once and be able to reuse the signed claim on every following interaction that requires the verified information in question. Can I also verify my own claims or claims from other people? Coming soon. Can I find the app in the app store? Currently only the web application is available in an alpha release. The app for iOS and Android will be published soon. What is your Roadmap? 
Coming soon. How can I contribute? Coming soon.",https://stories.jolocom.com/faq-34d24e2579d2,,Post,,Meta,,,,,,,,2017-10-12,,,,,,,,,,,,,
|
||
Jolocom,Jolocom,,,,,,,,,ID-Ideal as an integrated solution,"The starting point is the fact that every user has 70 digital identities online. Why not introduce one single secure digital ID solution to merge all of those identities? ID-Ideal is one way of many, offering an integrated solution or a middle way so that many identities can be supplemented by a single, secure digital ID solution.","In the ID-Ideal project, Jolocom brings sovereign identities and SSI into a single wallet. To improve efficiency and interoperability in the digital space, Jolocom presents the ID-Ideal project. The starting point is the fact that every user has 70 digital identities online. Why not introduce one single secure digital ID solution to merge all of those identities? ID-Ideal is one way of many, offering an integrated solution or a middle way so that many identities can be supplemented by a single, secure digital ID solution. The project is part of the competitive innovation program “Showcase Secure Digital Identities” (SSDI) funded by Germany’s Federal Ministry for Economic Affairs and Energy (BMWi) and one of four projects that qualified for the implementation phase. Jolocom is a partner in three of the four SDI implementation projects, to which it will lend its expertise in self-sovereign identity and years of experience in developing digital identity wallets. Other projects include “ONCE” and the “SDIKA” project. The SDI projects are special because they bring competitors around the table who are working together on one solution. In addition, they achieve interoperability, so that the use of digital identification no longer remains inefficient. ID-Ideal’s implementation phase began in May 2021. Current developments relating to the European eIDAS regulation are also taken into account, with the clear aim of providing future-orientated solutions on the subject of digital identities. 
Citizens should be able to act from the comfort of their home – be it when changing their place of residence, applying for a care center or other notifications of changes. The solution should not be used regionally but across Germany and Europe. Based on the so-called ID-Ideal Trust Framework, the technical, legal, and semantic interoperability of services and applications is regulated. The aim is to stimulate the creation of an ID ecosystem. If many ID services can coexist, the overarching exchange of digital evidence becomes possible. This is how we actively create trust between actors inside the digital space – one of the basic principles of Jolocom. The idea behind ID-Ideal is to create a basis for existing and future identity services. To achieve this, interoperability between the individual ecosystems is necessary. For this reason, the project mainly focuses on key aspects: A) Trust Framework: harmonize various ID services and create standards for secure digital identities B) High relevance to everyday life and very good usability to increase the incentive C) Establishing a TrustNest initiative: an open community that promotes certification, exchange, and further development Would you like to find out more about the ID-Ideal project? You can find its official homepage here: https://id-ideal.hs-mittweida.de/ and more information at the BMWi: https://www.digitale-technologien.de/DT/Navigation/DE/ProgrammeProjekte/AktuelleTechnologieprogramme/Sichere_Digitale_Identitaeten/Projekte_Umsetzungsphase/IDideal/IDideal.html Partner HTW Dresden, Hochschule Mittweida, Landeshauptstadt Dresden, Stadtverwaltung Leipzig, Stadtverwaltung Mittweida, Jungheinrich AG, EXXETA AG, EECC GmbH, Fraunhofer FIT, Jolocom GmbH, AUTHADA GmbH, evan GmbH, KAPRION Technologies GmbH, Stromdao GmbH, SQL Projekt AG",https://jolocom.io/blog/id-ideal-as-an-integrated-solution/,,Product,,Product,,,,,IDIDeal,,,2021-11-17,,,,,,,,,,,,,
|
||
Jolocom,Jolocom,,,,,,,,,Jolocom goes Consulting,"Our consulting focus is of course on decentralized digital identity (aka self-sovereign identity). Our advantage is that Jolocom has been able to build up enormous know-how through projects over the past 7 years. Not only was knowledge built up in the technology area, but also an extensive network through active work in committees (e.g. W3C, DIF, INATBA, ESIF/EBSI), associations (e.g. Bundesblock) and consortia (e.g. the Secure Digital Identities showcase projects).","So Jolocom does consulting now? In fact, Jolocom has always done consulting. Alongside our technical competence, our projects and our activities in committees, associations and clubs, we have of course also advised our customers over the past 7 years. Due to Jolocom’s growth, we have decided to expand our consulting practice further in order to support our customers even better and to separate the business areas more clearly. Below is an excerpt of our activities from the last 7 years: What can Jolocom do that others cannot? Our consulting focus is of course on decentralized digital identity (aka self-sovereign identity). Our advantage is that Jolocom has been able to build up enormous know-how through projects over the past 7 years. Not only was knowledge built up in the technology area, but also an extensive network through active work in committees (e.g. W3C, DIF, INATBA, ESIF/EBSI), associations (e.g. Bundesblock) and consortia (e.g. the Secure Digital Identities showcase projects). Jolocom covers all areas and levels, from first contact with the topic of digital identities, through strategy development and technology selection, to implementation. This enables efficient first-hand consulting without time-consuming and costly detours. Consulting from start to finish, into operation without detours. 
Our principles – an open platform We are convinced that decentralized identity must be an open platform with uniform standards, independent of any single solution. Products and solutions must be interoperable. Only in this way can the potential of the technology be realized for individuals, the public sector and private industry. This principle guides us in our consulting and in the development of our freely available platform. In consulting we are therefore technology-agnostic and pursue the goal of avoiding vendor lock-in and other dependencies. What does Jolocom Consulting offer? You surely know this consultant phrase: “Every customer is unique”. That is true. Nevertheless, we have defined categories of consulting activities to make orientation easier. - Your soil – building knowledge: “Does your company have the knowledge required to correctly assess the impact and opportunities of decentralized identities?” Building up knowledge is important before racking your brain over use cases. Knowledge can be conveyed through talks, workshops or coaching sessions. - Your ideas – ideas and use cases: “Which SSI use cases can help simplify business processes or develop new ones, and are they feasible?” In this phase we can work with you to develop, evaluate and prioritize use cases. - Grow ideas – elaboration and planning: “What do I need, and what does it mean to implement a use case?” Once you have identified a promising application, we can support you with strategy development, business case creation, budget and project planning, architecture design, buy/build/join decisions, vendor selection, tendering, consortium formation/search/joining, etc. 
- Implement ideas – implementation: “How can I make sure that the implementation goes as planned?” Jolocom Consulting supports you regardless of which products or technical partners you choose. Jolocom Consulting assists with project and consortium management and as a specialist, so that the solution goes into operation as planned and its potential is realized. For all questions regarding consulting: (hello[at]Jolocom.com).",https://jolocom.io/blog/consulting-ssi/,,Product,,Product,,,,,,,,2021-09-28,,,,,,,,,,,,,
|
||
Jolocom,Jolocom,,,,SSI Position Paper,,,,,A universal identity layer we can only build together,"We have recently published a paper that aims to take the first step towards the discussion of self-sovereign identity based on a shared consensus. A shared consensus of the concept and terminology as well as important topics such as standardization, privacy and security. That paper is Self-Sovereign Digital Identity: A position paper on blockchain enabled identity and the road ahead. Our decision to write it as a community was first motivated by the lack of objective material on the topic and then the resulting confusion and skepticism we ran into discussing it.","We have recently published a paper that aims to take the first step towards the discussion of self-sovereign identity based on a shared consensus. A shared consensus of the concept and terminology as well as important topics such as standardization, privacy and security. That paper is Self-Sovereign Digital Identity: A position paper on blockchain enabled identity and the road ahead. Our decision to write it as a community was first motivated by the lack of objective material on the topic and then the resulting confusion and skepticism we ran into discussing it. ‧ ‧ ‧ A typical question for a self-sovereign identity company: “It sounds like you are all doing the same thing, how will this ever work?” Over the last two years, we have seen more and more companies join the community wanting to build a decentralized identity solution, where individuals truly own and control their data. With more actors, the question of competition came up frequently. We have since been busy explaining the idea of a universal identity layer both as Jolocom but also as a community. A great illustration of the difference between the universal identity layer and competing identity platforms is that of email and messengers. 
Your email account allows you to send messages to everyone, no matter whether they use Gmail, GMX, Posteo, a company email or something else. When you open a messenger app on your phone, the world looks totally different – you can only communicate with people that also use that exact messenger. As a result, people have to use multiple messenger apps to stay connected with their friends. This can get out of hand quickly, leaving you with numerous siloed apps that ultimately all do the same thing: they send a message (text, emoji, photo, etc.) to a friend. The world of digital identity looks a lot like the messenger world today, forcing users to create multiple siloed identities throughout their digital life. The idea of a universal identity layer is to build for identity what email has provided for communication — a protocol that can be used by everyone based on open and interoperable standards. A universal identity layer is only possible if we collectively build and maintain the building blocks of self-sovereign identity as an open source commons for everyone to benefit from. The #SSIpaper is published at a very important time in the development of self-sovereign identity. With the emergence of the Decentralized Identity Foundation in 2017 and the earlier efforts by W3C and others, we are now leaving the stage of research and proof of concepts and rapidly entering a new phase of beta versions and, soon, production systems. To achieve the vision of a universal identity layer, we need to make sure that these systems don’t just allow their users to own and control their identity but also work openly, hand-in-hand across all associated technical layers. For self-sovereign identity to be credible and trustworthy, it can’t be owned or controlled by any company. It must be built and maintained by a global community that shares the vision of a decent, decentralized web. 
While we see great efforts towards interoperability and active discussion in the self-sovereign identity community, the wider world is very much at the beginning of this journey. Self-sovereign identity offers something radically different: a new type of platform that’s not strong because of exclusivity, but through its radical openness and interoperability. Given our active involvement in spreading this message in Germany and Europe on behalf of both Jolocom and the German Blockchain Association, we compiled a first resource for those interested in learning more about self-sovereign identity and the prospect of a universal identity layer. ‧ ‧ ‧ With authors from the German self-sovereign identity community, we started to assemble a first draft of this paper. We then contacted our wider community and went through two phases of extensive peer review. This gave the paper contributions from 26 individuals and 12 different identity companies. The result is a position paper that presents a consensus within this emerging industry on the status quo of self-sovereign identity and the road ahead, providing targeted calls to action for all stakeholders. Readers are encouraged to provide feedback on social media using the hashtag #SSIpaper. And all feedback, questions and comments using #SSIpaper are uploaded automatically to the Bundesblock website. See what’s been said so far: http://bit.ly/ssipaper_feedback Follow the live discussion ↗",https://jolocom.io/blog/a-universal-identity-layer-we-can-only-build-together/,https://jolocom.io/wp-content/uploads/2020/04/10-principles-of-ssi-icons-1024x640.png,Post,,Resources,,,,,,,,2018-10-23,,,,,,,,,,,,,
|
||
Jolocom,Jolocom,,,,,,,,,"A Decentralized, Open Source Solution for Digital Identity and Access Management","The protocol logic encodes a granular, claim-based model of identity that is highly generalized and unrestrictive in scope in order to accommodate a multiplicity of potential use cases and a broad range of subjects of identity (users), including individual persons as well as non-person entities like organizations (e.g. companies, governmental bodies), IoT devices (e.g. hardware/software agents), and autonomous agents (e.g. DAOs).",,https://jolocom.io/wp-content/uploads/2019/12/jolocom-whitepaper-v2.1-a-decentralized-open-source-solution-for-digital-identity-and-access-management.pdf,,Whitepaper,,Resources,,,,,,,,2019-12,,,,,,,,,,,,,
|
||
Lissi,Neosfer GmbH,Lissi,,,,"European Union, Germany",,,,Lissi,We provide software tools for trusted interactions between organisations and their customers.,,https://lissi.id/,,Company,,Company,Enterprise,ID,SSI,,,,,2019-06,,https://twitter.com/lissi_id,,https://lissi-id.medium.com/,https://medium.com/feed/@lissi-id,,,https://www.linkedin.com/company/lissi/,,,,,
|
||
Lissi,Lissi,,Medium,,,,,EIDAS,,eIDAS and the European Digital Identity Wallet,"The vast majority of citizens regularly use the internet. According to statista, for 16–24-year-olds, the European average of daily internet users amounts to 95 per cent in 2020. Even for the age group of 55–64 years, the percentage of daily users is as high as 69 per cent on an EU average. Hence, access to digital services is expected. This includes services offered by governments and the private sector alike.","eIDAS and the European Digital Identity Wallet: Context, status quo and why it will change the world. In 2021 the European Commission announced the European digital identity wallet. This article explains the basic concepts, highlights the significance of this development and provides an overview of the status quo. The vast majority of citizens regularly use the internet. According to statista, for 16–24-year-olds, the European average of daily internet users amounts to 95 per cent in 2020. Even for the age group of 55–64 years, the percentage of daily users is as high as 69 per cent on an EU average. Hence, access to digital services is expected. This includes services offered by governments and the private sector alike. The difference between foundational and contextual identity When speaking about “digital identity” we need to differentiate between a foundational and contextual identity. A foundational identity has a legal context and uniquely identifies a natural person. A contextual identity exists depending on a particular context and is not directly subject to government regulations. While a person generally only has one foundational identity, he or she can have hundreds of contextual identities. Foundational Identities are also referred to as government-issued, eID, regulated-, foundational-, base-, or core identity. Foundational or regulated identities are issued by an authoritative body of a government. A classic example is a passport. 
It grants rights and privileges in a global context and is subject to a highly regulated environment. The Pan Canadian Trust Framework defines a foundational identity as follows: “A foundational identity is an identity that has been established or changed as a result of a foundational event (e.g., birth, person legal name change, immigration, legal residency, naturalized citizenship, death, organization legal name registration, organization legal name change, or bankruptcy)” PCTF V1.4. Contextual identity: also referred to as non-regulated-, private- or pseudonymous identity. The Pan Canadian Trust Framework defines a contextual identity as follows: “A Contextual Identity is an identity that is used for a specific purpose within a specific identity context (e.g., banking, business permits, health services, drivers licensing, or social media). Depending on the identity context, a contextual identity may be tied to a foundational identity (e.g., a drivers licence) or may not be tied to a foundational identity (e.g., a social media profile)”. Hence, one needs to know the context of the identity in question to understand who we are talking about. If we just say “follow @earthquakebot to get immediate information about earthquakes 5.0 or higher” you don’t know where to go and search for this bot. The missing context is that the bot exists within the authoritative domain of the Twitter platform. However, on other platforms, this name might already be taken or used for other purposes. Identification and authentication Before we dive deeper into the topic of the eIDAS regulation we want to explain two key concepts, which the regulation is aiming to improve: identification and authentication. Identification asks: Who are you? This implies the person or organisation you are interacting with doesn’t know you yet and has a legitimate reason or even the obligation to identify the natural person it’s interacting with. 
Current means of identification include officially notified eID means as well as offerings from the private market such as postal service, video- or photo identification of your physical ID documents in combination with a photo or video of you. Currently, there are multiple eID implementations within Europe, however not every member state has notified an eID for cross-border usage. Authentication asks: Is it you again? This implies that you had a previous interaction with the person or organisation you are interacting with, so they already know you. Current means of authentication include the username (mostly an email) in combination with a password or a single sign-on (SSO) service also referred to as “social login” provided by big technology companies. Passwords are cumbersome to remember, especially considering that users should use different passwords for different services. While “social logins” are more convenient and user-centric, they also come with critical drawbacks, since they lead to a high dependency on the “social login” provider and a lock-in within their ecosystem. Interoperability is missing and oftentimes the business models of these providers are based on surveillance practices. In the early stages of the web, we mainly used postal ident for identification and only passwords for authentication. In the second and current iteration of the web, we use photo- or video identification for the verification of regulated identities or notified eID means provided by the member state. For authentication, we use a combination of passwords and “social logins”. In the third iteration of the internet, “Web3”, we will use digital wallets for both identification and authentication. A key differentiator is the control over identifiers. Until now users were only able to choose an identifier within an authoritative domain, such as email addresses, usernames on social media platforms or telephone numbers. 
Ultimately the legal entity governing the domain in which the identifier is used has full control over the usage of these identifiers. That’s different with decentralised identifiers (DIDs), which are created and controlled by users. The eIDAS regulation (electronic IDentification, Authentication and trust Services) instructs all relevant stakeholders regarding the use of electronic signatures, electronic transactions and their involved bodies as well as their embedding processes to provide a safe way for users to conduct business online. The first version of the European regulation came into effect in 2014. In June 2021 the European Commission proposed a revised version, “eIDAS 2.0”, which is currently in draft. This revision was initiated due to the current limitations as described in more detail in the impact assessment: 1) Optional eID notification for member states. 2) Limited options to exercise data protection rights. 3) Strong limitations to public services in practice. 4) No level playing field for trust service providers from different member states. More information about the findings on the implementation and application of the revised eIDAS regulation was published by the European Parliamentary Research Service. The European Digital Identity (EUDI) Wallet is an application for citizens running on a mobile device or in a cloud environment, which can be used to receive and store digital credentials and interact with third parties to get access to digital services. The wallet will be provided to citizens by all member states. Its usage is optional for citizens. The graphic above illustrates that there are multiple issuers of identity information. This information can be received, stored and presented by the EUDI Wallet. Entities requesting information from a citizen can be public institutions or representatives of those, or commercial entities which are required by law to identify their customers, such as banks or airlines. 
The wallet will enable: 1) both identification and authentication 2) the verification of third parties 3) the storage and presentation of verified identity data and 4) the creation of qualified electronic signatures Currently, the intention for the EUDI Wallet is to reach the level of assurance (LoA) “high”. The LoA represents the degree of confidence that can be placed in a presented credential and its trustworthiness. Similar to how the European General Data Protection Regulation (GDPR) forced the internet to recognise the data protection rights of users, the eIDAS regulation will set the foundation for digital identity and identity wallets on a global scale. Very large platform providers will be mandated to accept the digital identity wallet. The Digital Markets Act classifies a platform as such once it reaches 45 million monthly active users in the European Union, which is equivalent to 10 per cent of European citizens. This solves the initial problem of a two-sided market in which both issuers and consumers of identity data want the other party to be present before joining. It also expands the scope of the regulation from initially regulated identities only to also include contextual identities — at least the access to them via means of authentication. While some European member states such as Sweden or Estonia already have an advanced framework for digital identities, which is used by the majority of citizens, this isn’t the case for all member states. Those who lag behind have the opportunity to leapfrog existing infrastructure. Furthermore, there is a massive opportunity for Europe as a whole to standardise user-centric processes for identification and authentication while preserving citizen control and privacy. This will facilitate access to digital services from the public and private market alike. The harmonisation of legislation and technology on a European level will enable public bodies and private market participants to better reach European consumers. 
The regulation has the chance to significantly improve processes via automation, verified data, flexibility and the availability of a common infrastructure. It furthermore has the potential to reintroduce organisations with a direct encrypted communication interface to consumers without an intermediary. A shared infrastructure for all member states with easy access for private entities would also greatly facilitate information exchange between ecosystems, which are currently separated and fragmented. Infrastructure with a suitable legal framework would benefit all stakeholders by providing much-needed trust and security for digital interactions. The European Commission has set itself a tough timeline by planning to mandate member states to offer a EUDI Wallet at the beginning of 2024. The next big milestone will be the announcement of technical specifications as part of the eIDAS toolbox in October 2022. Hence, from the adoption of the legislation in early 2023 until the availability of the wallets, there is only a one-year period for member states to implement the wallet based on the defined standards. These standards are defined in the eIDAS Toolbox. You can find more information about the timeline published by the German research team accompanying the Showcase Digital Identity projects in Germany. The outline of the toolbox was published by the eIDAS expert group in February 2022. You can find it here. Who is working on the eIDAS 2.0 toolbox? The eIDAS regulation is revised by an expert group consisting of representatives from the 27 member states. The work of the eIDAS expert group is divided into four working groups (WG): The WG Provision and exchange of identity attributes is concerned with the set, format, issuance and validity of person identification data. 
The WG Functionality and security of the wallets also takes into consideration the APIs and protocols for the communication between the stakeholders as well as the creation and usage of qualified electronic signatures. The WG Reliance on the wallet/identity matching is concerned with the unique identification process, the authenticity of credentials received by the relying party and its authentication. The WG Governance is concerned with the accreditation of certification bodies, the trusted lists, the list of certified European Digital Identity Wallets, security breaches as well as business models and fee structures. What’s the status quo of the eIDAS toolbox? The current outline of the toolbox contains information about the objectives of the EUDI Wallet, the roles of the actors of the ecosystem, the wallet’s functional and non-functional requirements as well as potential building blocks. However, it currently doesn’t provide any further information regarding a technical architecture and reference framework, common standards and technical specifications or common guidelines and best practices. These components will be added later. There are multiple possible directions regarding the technological design of the EUDI Wallet. This primarily includes (de)centralized public key infrastructures, certificates such as X.509 certificates or verifiable credentials, as well as communication protocols such as OpenID Connect or DIDComm. However, at this point, the final choice is still unclear. The toolbox technical architecture will result in a single connection interface for relying parties as stated in the outline: “To ensure that the EUDI Wallet can be used in a seamless way by trust service providers and relying parties alike, a common authentication protocol shall be specified, ensuring interoperability (…).” If you want to know more about how the toolbox process is defined, you can find a detailed description in the summary of the first meeting of the eIDAS expert group. 
There will be at least four pilot implementations of the European digital identity wallet, which are funded by the European Commission as part of the Digital Europe Programme. Each pilot implementation should cover the use cases driving licence, diploma, payment authentication and eHealth as well as use cases in other areas such as digital travel credentials and social security. Such scenarios may also demonstrate further functionalities of the wallet, for example qualified electronic signatures. For one pilot implementation, at least three member states have to collaborate. While stakeholders from the private sector can also participate, the application must be submitted by the member states. The funding opportunity was announced in February 2022. With the application deadline of 17.05.2022, interested parties only have very limited time to form a consortium for a joint application. The objectives of the call are as follows: - Support the piloting of the European Digital Identity Wallet - Promote the development of use cases - Test the interoperability and scalability of use cases - Trial user journeys and collect feedback for updates - Promote the opportunities of the EUDI Wallet - Help build the necessary expertise and infrastructure The announcement of the funding and tender opportunity can be found here. In the following, we would like to summarise feedback from diverse experts and highlight the most important aspects which need further attention. However, there are also other aspects in need of improvement which aren’t listed here. Anti-coercion Coercion is the practice of persuading someone to do something by using force or threats. Since there is a big imbalance of power between big corporations or governments and users/citizens, safeguards against abuses of this system for tracking, profiling or targeted advertising are of the utmost importance. 
When the only way to get access to a service is to surrender personal data to a third party, there isn’t much an individual can do against it. The regulation currently doesn’t address this issue adequately. Potential solutions could be to require information requests to carry a non-repudiable digital signature from the verifier, so that inadequate requests can be proven, as well as an anonymous complaint mechanism to report this bad behaviour, as pointed out by Drummond Reed in the Manning publication “Self-Sovereign Identity”. Privacy: There are very positive principles included in the current draft, such as the explicit prohibition for issuers of a European Digital Identity Wallet to collect more information about the user than the minimum required to provide the service. However, it also includes a unique and persistent identifier of a wallet/citizen. The European Data Protection Supervisor recommends alternative ways to replace the proposed unique and persistent identifier by stating: “This interference with the rights and liberties of the data subject is not necessarily trivial; in some Member States, unique identifiers have been considered unconstitutional in the past due to a violation of human dignity. Therefore, the EDPS recommends exploring alternative means to enhance the security of matching.” Transparency of the Toolbox process: Since the eIDAS expert group solely consists of representatives from the member states, security or privacy experts from the private sector have very limited options to participate in the legislative process. The current draft also includes 28 instances of statutory instruments, which clarify further details at a later stage, making it impossible to conduct a holistic risk and privacy assessment, according to an article by Epicenter. 
Evernym, an Avast company, also points out that remote wallet deletion, the limitation of only holding credentials from qualified trust service providers as well as high barriers to entry for the private market can significantly stifle the positive impact of the regulation. The revision of the eIDAS regulation brings major opportunities with it. The European Commission has clearly identified the need to act and provide a holistic solution for the digital identities of natural and legal entities within the European Union. The eIDAS framework has the potential to be a global vanguard in creating trusted relationships for all stakeholders while also preserving privacy, security and transparency for its citizens. While going in the right direction, the technical details are still unclear. Without further information about the potential technical implementations and their consequences, a concluding assessment isn’t possible. There is a high risk that the planned pilot projects will develop in different technical directions, making future interoperability much more difficult. It’s also necessary to address the coercion and privacy concerns explained above. The limited options of participation for data protection and civil society experts also stifle public trust in the process. Given the global consequences of the GDPR, the eIDAS trust framework will likely have an even more profound impact on the daily lives of European citizens and beyond. Hence, it’s essential to get this right. Currently, it’s too early to draw conclusions. The publication of the final toolbox in October 2022 will include technical aspects and more detailed legal and business prerequisites. But one aspect is clear already: wallets will be the future. If you have further questions regarding identity wallets don’t hesitate to reach out to us via info@lissi.id — Your Lissi Team. 
About Lissi: Lissi provides convenient applications for companies and organisations to receive, organise and share trusted data from end users while respecting privacy and data sovereignty. This includes the Lissi Wallet as well as our applications for organisations. You can find more information on our Website.",https://lissi-id.medium.com/eidas-and-the-european-digital-identity-wallet-context-status-quo-and-why-it-will-change-the-2a7527f863b3,,Post,,Explainer,Public,,,,EUDI Wallet,,,2022-03-17,,,,,,,,,,,,,
|
||
Lissi,Lissi,,Medium,,,,,EIDAS,,EUDI Wallet: Illustration of the eIDAS roles and functions,In the graphic below we reorganised and regrouped the stakeholders to map the requirements for the eIDAS toolbox architecture onto an SSI framework (Self-Sovereign Identity framework).,"EUDI Wallet: Illustration of the eIDAS roles and functions In June 2021, the EU Commission presented a new draft of the eIDAS regulation. The aim is to provide all citizens and businesses in the EU with digital wallets with which they can not only identify and authenticate themselves, but also store a variety of other documents (such as diplomas) and present them in a verifiable manner. In recent months, a group of experts has presented a first outline for the architecture of an “eIDAS Toolbox” describing the reference architecture. The current version of the toolbox of the revised eIDAS regulation already defines new roles within the framework as well as their functions. In the graphic below we reorganised and regrouped the stakeholders to map the requirements for the eIDAS toolbox architecture onto an SSI framework (Self-Sovereign Identity framework). The graphic shows very clearly how well the requirements for the eIDAS toolbox can be implemented with SSI technology. This is also supported by the paper “Digital Identity: Leveraging the SSI Concept to Build Trust” by the European Union Agency for Cybersecurity ENISA. We also added the arrow from the different issuers to the trust registries, since they need to provide information to these registries. Member states are now requested by the EU Commission to implement the first pilot use cases on top of the reference architecture by the end of the year. We expect many member states to implement use cases on top of an architecture similar to the graphic above. We used the graphic below as the basis for our infographic, which was published as part of the current eIDAS Toolbox document on page 8. We would be delighted to hear your feedback. 
Do you think the reorganisation makes sense? Which roles or functions are missing? Your Lissi team",https://lissi-id.medium.com/eu-id-wallet-illustration-of-the-eidas-roles-and-functions-6cb7bb6bca39,,Post,,Explainer,Public,,,,EUDI Wallet,,,2022-03-04,,,,,,,,,,,,,
|
||
Lissi,Lissi,,Medium,,,,,EIDAS,,Trust in the digital space,"Would we rather have a high level of security or self-sovereignty? Unfortunately, the two aspects are at different ends of the spectrum. If we only allow pre-verified and approved parties to retrieve identity data, as currently envisaged by the [eIDAS regulation](https://lissi-id.medium.com/eidas-and-the-european-digital-identity-wallet-context-status-quo-and-why-it-will-change-the-2a7527f863b3), this severely restricts usage","Trust in the digital space This article describes why and how the Lissi Wallet Beta, available for iOS and Android, uses certificates to authenticate organisations. This article is also available in German. The problem Imagine you go to an event and just before the entrance you see a QR code with the heading “Check-in here” along with the organiser’s logo. As you scan the QR code with your wallet, you are asked for your payment information, among other things. But should you present this information? When we communicate with third parties over the internet, it is not always clear whether the other party is really who they say they are. This problem also exists with established communication channels such as websites and emails, among others. Phishing refers to the fraudulent capture of data to gain access to bank accounts or similarly sensitive accounts or information. A permanent communication channel that allows users to identify the communication partner to enable a trustworthy exchange of information is essential to protect users from phishing. Context is important We often base our trust in an interaction on the context in which we are communicating. For example, we trust a link in an internal employee portal more than a link in a promotional email. The principle is the same when a contact wants to connect with users and the connection request is displayed in the wallet. 
Depending on the context in which the connection request is initiated, a different level of trust can be assigned. The context helps us to establish trust but is not sufficient on its own. Often the context is missing or attackers specifically try to exploit it. Authentication of organisations Wallet users must be able to check the authenticity of organisations they connect to. However, the organisation must first be identified and verified. Once the organisation has the required certificates it can be validated in the user’s wallet. Hence, before the wallet can verify the organisation, a trusted party must certify the organisation. Certification authorities are organisations that are entrusted with signing digital certificates. They verify the identity and legitimacy of the organisation and the person requesting a certificate. If the check is successful, a signed certificate is issued. This certificate can then be verified by the user’s application such as a browser or wallet to authenticate the organisation. Trust on different levels An encrypted communication channel between individuals and organisations allows sensitive information to be exchanged without third parties being able to read it. However, this is not sufficient, as the identity of the other party must be verified beforehand. To ensure that the contact is really a public authority, for example, we use certificates to verify their identity. Consequently, there are two levels of trust. On the lower level, there is a cryptographically secured communication channel. This is supplemented by certificates issued by different certificate authorities or trust domains. Certificates and trust domains The basis for trustworthiness is that the certification authority implements organisational and technical measures at an appropriate security level and establishes rules for all participants in the trust domain. 
The specific requirements for the certificates depend on the use case and the legal framework in which a transaction takes place. Thus, the certificates used can differ depending on the level of trust required for each use case. Regulated certificate authorities act as issuers of certificates that certify the legitimacy of the domain holder and the security properties of the certificate. The signatures of the certificate authorities essentially serve to confirm the legitimacy of the certificate holder’s identity and to create trust in online data transmissions. Generic requirements for certificate authorities acting as a certification authority with the security level “high” are described by the Federal Office for Information Security of Germany in the Technical Guideline TR-03145-1. Certificate verification in the Lissi Wallet We would now like to transfer the approach of certificate verification, which we have known so far from web browsers, to the world of SSI wallets and have integrated a corresponding verification concept into our Lissi Wallet. The Lissi Wallet checks the certificates sent by the contact or agent. If an extended validation certificate is sent, the Lissi Wallet checks that the name of the contact/agent matches the name in the certificate. The contact is displayed as verified only if there is a valid extended validation certificate and the name of the contact/agent matches the name in the certificate. Display of trusted contacts for users in Lissi Wallet When a new contact request is made, users are asked whether they want to connect to the contact. In addition to the display of whether a contact could be verified, a recommendation for action is also given to the user. Further information on the contact’s certificate can also be displayed. When users receive a connection request (fig. 1), a new proof (fig. 2) or an information request (fig. 3) in the Lissi Wallet, the wallet displays whether the contact is verified. 
The function is available in the Lissi Wallet for Android and iOS. We welcome your feedback. The trade-off between self-sovereignty and maximum security Would we rather have a high level of security or self-sovereignty? Unfortunately, the two aspects are at different ends of the spectrum. If we only allow pre-verified and approved parties to retrieve identity data, as currently envisaged by the eIDAS regulation, this severely restricts usage. Allowing users to share their data on their own responsibility offers more flexibility and freedom, but also potential for attack. About Lissi: Lissi provides convenient applications for companies and organisations to receive, organise and share trusted data from end users while respecting privacy and data sovereignty. This includes the Lissi Wallet as well as our applications for organisations. You can find more information on our Website.",https://lissi-id.medium.com/trust-in-the-digital-space-7762471351cf,,Post,,Explainer,Public,,,,,,,2022-08-04,,,,,,,,,,,,,
|
||
Lissi,Lissi,,Twitter,,,,,EIDAS,,@lissi_id The European Digital Identity #Wallet #EUDI will have a modular framework,"for the user interface, data storage, cryptographic protocols, sensitive cryptographic material and eID mean modules. ""[Requirements and Solution CNECT/LUX/2022/OP/0011](http://etendering.ted.europa.eu/cft/cft-documents.html?cftId=10237)""",,https://mobile.twitter.com/lissi_id/status/1536645378451333127,https://pbs.twimg.com/media/fvncyiqwaaa-gzx?format=jpg&name=4096x4096,Tweet,,Explainer,,,,,EUDI Wallet,,,2022-06-14,,,,,,,,,,,,,
|
||
Lissi,Lissi,,Medium,,Digital Technologies Forum,,,,,Digital Technologies Forum now includes the Lissi demo,[german] Digital Technologies Forum is a networking platform and exhibition space for selected research projects and innovations in the field of digital technologies from Germany. The forum offers outstanding research projects a platform for more visibility and promotes exchange and knowledge transfer at national and international level.,"Lissi demonstrator at the Digital Technologies Forum About the Digital Technologies Forum The Digital Technologies Forum is a networking platform and exhibition space for selected research projects and innovations in the field of digital technologies from Germany. The forum offers outstanding research projects a platform for greater visibility and promotes exchange and knowledge transfer at the national and international level. The forum’s events and demonstrators focus on the technical interfaces and societal tensions of current technology trends: the Internet of Things, big data, artificial intelligence, and security and trust in the digital space. The forum’s showroom provides an exhibition space for projects from the technology programmes of the Federal Ministry for Economic Affairs and Climate Action as well as other federal funding programmes. More information about the Digital Technologies Forum is available on its website. Lissi demo in the forum’s showroom The Lissi demo is part of the topic area “Trust in the Digital Space”. It illustrates several use cases from the user’s perspective and gives a practical insight into the interactions. You can try the demo yourself here: https://lissi.id/demo The demonstration illustrates how use cases and credentials from the public and private sectors can be combined. The Lissi team at main incubator GmbH is the consortium lead of the IDunion consortium. 
IDunion is one of the four projects in the Showcase Digital Identities programme, which is funded by the Federal Ministry for Economic Affairs and Climate Action. The four funded showcase projects are supported by an accompanying research team. The goal of IDunion is to create an ecosystem for trusted digital identities that is operated according to European values and can be used worldwide. It covers digital identities for natural persons, legal entities and things (IoT). About Lissi: Lissi provides convenient applications for organisations to enable trusted interactions with users. This includes the Lissi Wallet as well as our applications for organisations.",https://lissi-id.medium.com/lissi-demonstration-im-forum-digitale-technologien-82d5f0c07a5d,,Post,,Meta,,,,,Lissi Connect,,,2022-04-27,,,,,,,,,,,,,
|
||
Lissi,Lissi,,Medium,,GAIA-X,,,,,Lissi demonstrates authentication for Gaia-X Federation Services,"Gaia-X creates an open, federated digital ecosystem for data infrastructure for decentralized cloud and edge workloads and data sharing capabilities. As part of the core services, the so-called Gaia-X Federation Services (GXFS) are targeting the areas of Identity & Trust, Federated Catalogue, Sovereign Data Exchange and Compliance as open-source reference implementations.","Lissi demonstrates authentication for Gaia-X Federation Services You can use the Lissi Wallet for the Authentication (Login) into the Gaia-X Federated Services Platform here: https://Lissi-demo.gxfs.dev/ Collaboration between Gaia-X Federation Services (GXFS) and IDunion Gaia-X creates an open, federated digital ecosystem for data infrastructure for decentralized cloud and edge workloads and data sharing capabilities. As part of the core services, the so-called Gaia-X Federation Services (GXFS) are targeting the areas of Identity & Trust, Federated Catalogue, Sovereign Data Exchange and Compliance as open-source reference implementations. The project is funded by the governments of France and Germany to support the data-driven business models for the European economy. Part of every digital service solution is decentralized identity and access management, which enables participants of Gaia-X Federations to manage their participants in a self-sovereign manner. The identity layer determines how GDPR-compliant interactions between stakeholders can be offered, established and trusted based on SSI (Self-Sovereign Identity) principles without the need for a centralized controller. Such a decentralized implementation has been developed by IDunion. Despite the agnostic approach of Gaia-X, both projects collaborate to form a holistic solution and are aligned in regards to their principles of data sovereignty, openness and user control. 
Easy authentication via Lissi The demo is available via https://Lissi-demo.gxfs.dev/. The steps shown below provide guidance on how the process works. In this context authentication is the process of verifying the already known identity of a principal (user). Traditionally a centralized identity provider is used to identify and authenticate a user, and you trust them by default. While there are single sign-on solutions from globally operating technology companies, these authentication mechanisms make users dependent on the provider while also introducing comprehensive surveillance risks. Therefore, Gaia-X goes another way with personal wallets such as the Lissi Wallet to enable users to manage their identity by themselves as well as offering a passwordless authentication method, which doesn’t depend on a single centralized service. While this implementation is done via Lissi Connect to bridge newly evolving SSI technologies with existing standards like OpenID Connect, other vendors or open-source integrations can also be used simultaneously. Demonstration at the Hannover Fair The first draft of the integration was demonstrated at the Hannover Fair, an international industry trade exhibition. The final integration with any wallets is currently in progress and will be presented to the public at a later stage. Benefits for users: - Convenience: self-managed and controlled identity and login without password - User-centric: use the same application for interacting with multiple stakeholders - Order: using a credential for authentication and authorization - Transparency: of interactions with GDPR conformity Benefits for organizations: - Domain independent: shared trusted infrastructure with European values and regulatory conformance. 
- Streamlined UX: for authentication and authorization - Risk reduction: interacting with verified participants - no vendor lock-in: usage of open standards - Independence: decentralized user and access management About Gaia-X Federation Services: The Gaia-X Federation Services (GXFS) represent the minimum technical requirements needed to build and operate a cloud-based, self-determined data infrastructure ecosystem. On the basis of technical specifications, services are developed based on an open source code. These will be further developed into operational services by the Gaia-X community and continuously improved. Led by eco, the GXFS-DE project is also funded by the German Federal Ministry of Economic Affairs and Climate Action and is in close exchange with the Gaia-X Association for Data and Cloud (AISBL) and the French funded project GXFS-FR. You can find more information on our Website. About eco — Association of the Internet Industry: With over 1,000 member companies, eco is the largest Internet industry association in Europe. Since 1995 eco has been highly instrumental in shaping the Internet, fostering new technologies, forming framework conditions, and representing the interests of members in politics and international committees. eco’s key topics are the reliability and strengthening of digital infrastructure, IT security and trust, as well as ethically-oriented digitalisation. eco advocates for a free, technologically-neutral, and high-performance Internet. You can find more information on our Website. About Lissi: Lissi provides convenient applications for companies and organizations to receive, organize and share trusted data from end users while respecting privacy and data sovereignty. This includes the Lissi Wallet as well as our applications for organisations. You can find more information on our Website. 
About IDunion IDunion aims to create an open ecosystem for decentralized identity management, which can be used worldwide and is based on European values and regulations. The project is funded by the German Federal Ministry of Economic Affairs and Climate Action. You can find more information on our website.",https://lissi-id.medium.com/lissi-demonstrates-authentication-for-gaia-x-federation-services-819e9bbe70ad,,Post,,Meta,,,,,,,,2022-08-17,,,,,,,,,,,,,
|
||
Lissi,Lissi,,Medium,,,,,,,Lissi Connect Demo,"[Originally in German] The login is only the start of the mutual customer relationship: users do not want to monitor and maintain dozens of communication interfaces, but prefer a solution that brings these aspects together. This includes not only login services, but also newsletters, letters from the bank, digital proofs of any kind and other relevant personal information. The media discontinuity and the fragmentation of current systems pose a major challenge for users and organizations. However, once stored in the user's wallet, this information can be easily managed, sorted and presented as needed.","Lissi Connect Demo. Lissi Connect simplifies user authentication and the exchange of digital credentials. It is time to retire the password. Users today expect the option to log in to platforms without a password. Service providers look for integrations that users accept and that are easy to integrate. Above all, large technology corporations have established themselves as providers of single sign-on solutions. While these login options are convenient for users, they create a strong dependency on central service providers, which frequently analyze user behavior and sell it to advertisers. The login is only the start of the shared customer relationship. Users do not want to monitor and maintain dozens of communication interfaces; they prefer a solution that brings these aspects together. This includes not only login services, but also newsletters, letters from the bank, digital credentials of any kind and other relevant personal information. The media discontinuity and the fragmentation of today's systems pose a major challenge for users and organizations. 
Once stored in the user's wallet, however, this information can be easily managed, sorted and presented as needed. This improves data management for users and opens up entirely new possibilities for communication between organizations and end users. The direct connection with Lissi Connect: Lissi Connect enables passwordless user authentication. In addition to authenticating users, Lissi Connect offers the ability to issue and request digital credentials. It is a Platform-as-a-Service (PaaS) solution that can easily be integrated into existing systems. Control over the customer interface always remains with the two parties that initially established the communication channel. Our Lissi Connect demo: To get a feel for the user experience, we have provided a login demo in which you can test passwordless registration and sign-in. We have already tested the application with our first partners and offer it free of charge for testing purposes. Interested? Feel free to send us an email at info@Lissi.id Your Lissi Team",https://lissi-id.medium.com/lissi-connect-demo-d6db29db7755,,Post,,Product,,,,,Lissi Connect,,,2022-01-20,,,,,,,,,,,,,
|
||
Lissi,Lissi,,Medium,,,,,,,The Lissi Wallet is now available in 12 languages!,"Languages supported: Arabic • English • French • German • Italian • Korean • Polish • Portuguese • Russian • Romanian • Spanish • Turkish<br><br>> the language is only a small part of the whole user experience. The task is to design a universal cockpit, which people can navigate regardless of their social background. Similar to a dashboard in a car, which doesn’t look too different wherever you go. In order to achieve this, we have to standardize the icons, colours and user-flows to a certain degree. However, on the other hand, they need to be adjusted to the target audience.","The Lissi Wallet is now available in 12 languages! About the Lissi wallet The Lissi wallet is a simple but powerful tool to manage digital interactions. It can be used to establish connections to third parties and to exchange information. You can find a more detailed explanation here or on our website. It’s currently available as an open beta version for iOS and Android. The importance of multi-language support Within the identity community, we spend considerable time ensuring interoperability between different solutions. We engage in conversations in a variety of standardization bodies to enable a seamless user experience on a global scale. Frankly speaking, we are not there just yet. But we are well on our way to enabling access to services regardless of where an entity is based or which social background an individual has. While regulation as well as technical and educational hurdles remain, it’s crucial to increase the accessibility of products to different cultures and languages. We have already received feedback from multiple stakeholders in the public and private sector saying that multi-language support is essential for the execution of various use cases. There are several nations which have multiple official languages. 
If our end-user facing products are not available in the most common languages, it creates entry barriers so big that not even a piloting of these use cases would make sense. Hence, we took note and worked hard to ensure the Lissi wallet is available in the languages of communities which currently explore the self-sovereign identity concept. The Lissi wallet now supports the following languages: - Arabic - English - French - German - Italian - Korean - Polish - Portuguese - Russian - Romanian - Spanish, and - Turkish. Challenges remain When designing a wallet, the language is only a small part of the whole user experience. The task is to design a universal cockpit, which people can navigate regardless of their social background. Similar to a dashboard in a car, which doesn’t look too different wherever you go. In order to achieve this, we have to standardize the icons, colours and user-flows to a certain degree. However, on the other hand, they need to be adjusted to the target audience. Let’s take the colour red as an example. In Western cultures, red is associated with excitement, danger, urgency and love, whereas the same colour evokes danger and caution in the Middle East. In India, it’s associated with purity, while in China it symbolizes luck and happiness. Finding the right balance between standardization and necessary adjustments for the target audience will require knowledge about the cultural differences, feedback and time. When it comes to language, it creates its own set of difficulties. Differences can be observed in the usage of genders, the left-to-right or right-to-left reading, the information density or the usage of tenses, just to name a few. Furthermore, there isn’t a common terminology used within the community, which makes a translation into different languages even more challenging. Hence, our translation won’t be perfect. 
While we worked with native speakers, the context is often difficult to explain without demonstrating the user-flow and an actual use-case. Languages also change depending on the use-case or the subject in question. Nevertheless, we are looking forward to making the Lissi wallet even more accessible by adding additional languages and improving our current translation with your feedback. Which language would you like us to support next? We are always looking for translators for additional languages, so reach out to us to get our winter 2021 edition of Lissi Merchandise! Cheers. The Lissi team.",https://lissi-id.medium.com/the-lissi-wallet-is-now-available-in-12-languages-f88e56b04e19,,Post,,Product,,,,,Lissi Wallet,,,2021-02-05,,,,,,,,,,,,,
|
||
Lissi,Lissi,,Medium,,,,,,,The Lissi Wallet now supports additional cards and convenience features,"The Lissi Wallet now supports .pkpass files, as well as other custom cards, in addition to verifiable credentials (first screen). Any card in the wallet that has a bar code or QR code can now be easily stored digitally in the Lissi Wallet (second screen).","The Lissi Wallet now supports additional cards and convenience features. Import of additional cards The Lissi Wallet now supports .pkpass files, as well as other custom cards, in addition to verifiable credentials (first screen). Any card in the wallet that has a bar code or QR code can now be easily stored digitally in Lissi Wallet (second screen). These can be customer cards, membership cards or, for example, gift cards. If required, the barcode can then be shown to the merchant instead of carrying the card itself. Often tickets, such as a boarding pass for flights or health passes, which are required for entry abroad, are issued as a .pkpass file. These passes can be easily imported into the wallet and can be presented with the Lissi Wallet at the airport or on the train if needed (third screen). Currently, this feature is only available for Android. Automatically accept incoming connections and credentials and display information about interactions Connection requests and the acceptance of new credentials can now be automated. Users can activate the automatic acceptance of new connections within the settings or after the establishment of three connections. Regardless of whether the connection is accepted automatically or not, users are informed about the connection setup (first screen). The automatic acceptance of new credentials can be activated for individual contacts as desired (second screen). 
When users receive a new credential, a green banner informs them that the credential has been successfully stored in the wallet (third screen). In addition, users are informed that information has been successfully presented to a contact (fourth screen). Want to try it out yourself? Head over to www.Lissi.id/demo. Cheers, Your Lissi Team",https://lissi-id.medium.com/the-lissi-wallet-now-supports-additional-cards-and-convenience-features-465aeedf5f5c,,Post,,Product,,,,,Lissi Wallet,,,2021-11-09,,,,,,,,,,,,,
|
||
Lissi,Lissi,,Medium,,Indicio,,,,,The Lissi wallet supports the Indicio Network,"we are committed to not only provide individuals with the choice of their favourite wallet, but also organisations with the choice of their network. We are delighted to announce that the latest version of the Lissi wallet also supports the Indicio Network.","The Lissi wallet supports the Indicio Network Digital identity is now a fundamental requirement to function in a world that has shifted to remote-first. To empower individuals and to protect their self-sufficiency, the concept of self-sovereign identity (SSI) was developed. It grants the individual agency over their interactions and data by putting the data-subject back into the driver-seat. Self-sovereign identity and its use-cases There are plenty of potential use-cases. Our identity isn’t something we can easily explain or which can easily be summarized by single data sets. It highly depends on the context in which it operates and is different depending on the person with whom we interact. One category of use-cases is the proof of certification or qualification of an individual. This proof is required for applications for jobs, grants or the participation in special events. Depending on the requirements, an individual can collect all necessary certificates and present them directly to the relying party. Another big category of use-cases is to enable access to information, buildings or resources. When e.g. information is labeled as “internal only”, the authorized people should be able to access it. Organizations can issue credentials to the right target audience and only approve access for those individuals, which have a valid credential. Wallets and Networks Two elemental components of an SSI-ecosystem are the wallets for end-users and the networks for organisations on which issued credentials are anchored to. 
In this article, we will explain how these relate to each other, what their specific goals are and how they interact with each other by providing the example of the Lissi wallet and the Indicio Network. About the Lissi Wallet In order to store and possess the credential, an individual requires an application with a graphical user interface to receive, store and present these credentials to third parties. A wallet is a key management application, which hides all the complicated technical aspects from the user and provides the required guidance to securely interact with trusted contacts. The Lissi wallet offers an intuitive interface and provides the necessary information and flexibility to navigate through an increasingly complex digital environment. Furthermore, it automatically recognizes the network, which an organisation used to verify the authenticity of the credential. Hence, while organizations can choose a suitable network, the user is informed about the network, but doesn’t need to do anything to receive the credential or interact with the organisation. Lissi is your smart agent, which takes unnecessary workload from you, provides you with the information to make informed decisions while always offering a choice. While the Lissi team is also involved in the formation of the IDunion network with its main focus in Europe, we recognize that organisations around the world have different needs. The networks, which are necessary to verify the authenticity of issued credentials, need to be adjusted to different regulatory requirements and the specific demands of their target audience. The network is a distributed and publicly readable database, which contains the public identifier of a legal entity. Therefore, we are committed to not only provide individuals with the choice of their favourite wallet, but also organisations with the choice of their network. We are delighted to announce that the latest version of the Lissi wallet also supports the Indicio Network. 
About the Indicio Network Indicio.tech created the Indicio Network to meet the needs of companies and organizations that want a reliable and robust network to build, test, demo, and launch their identity solutions — all the while supported by a team of engineers with extensive experience in decentralized identity, its architecture, and its use cases. Indicio believes that this “concierge” approach to running a network will accelerate the development, adoption, and use of verifiable digital credentials. This means: - Professional staffing: Indicio’s engineers are among the most experienced in the decentralized identity field and are ready to answer questions, support network operations, and help to get products ready for launch. - Stability for demonstrations: Indicio supports developers at every step of the way from building to testing and public demonstrations. - Cross-network test readiness: Indicio sees the future as one of interoperable credentials and networks. Its network is the perfect platform for testing interoperability by issuing and verifying on multiple networks. - Easy Node Operator onboarding: For those interested in joining a network as a node operator, Indicio has simplified the process, offers training for all levels, and a suite of network tools and resources. With the Covid pandemic driving urgent need for decentralized identity solutions, Indicio is committed to delivering superlative infrastructure and technical support, and to making decentralized identity as easy to use as possible, whether as a node operator, a developer, an issuer or a verifier. About Indicio Indicio.tech is a professional services firm specializing in decentralized identity architecture, engineering, and consultancy. Indicio provides expert guidance to a global community of clients on the use of verifiable credentials to build digital identity solutions. 
The decentralized networks and tools created by Indicio make verifiable credentials easy to adopt, simple to deploy, and reliable to use. As a Public Benefit Corporation, Indicio is committed to advancing decentralized identity as a public good that enables people to control their identities online and share their data by consent. Indicio believes in privacy and security by design, interoperability, and supports the open source goals of the decentralized identity community. This article was co-written by the Lissi and Indicio team. Cheers",https://lissi-id.medium.com/the-lissi-wallet-supports-the-indicio-network-e2247f895d39,,Post,,Product,,,,,Lissi Connect,,,2021-01-20,,,,,,,,,,,,,
|
||
Lissi,Lissi,,google play,,Verity,,,,,Lissi by Main Incubator,"Lissi is your digital wallet. You can use it to store digital ID cards, proofs and other credentials, which are issued by companies and institutions. You can use these credentials to identify yourself to various online services, log in, shop online, gain access to buildings and much more.<br><br>The Lissi-wallet enables you to:<br>- Establish private and secure connections with other entities<br>- Receive, store and manage verified credentials<br>- Present digital proofs of your credentials<br>- Log in without a password at third-party providers<br>- Store .pkpass files (boarding pass, concert tickets etc.)","Lissi is your digital wallet. You can use it to store digital ID cards, proofs and other credentials, which are issued by companies and institutions. You can use these credentials to identify yourself to various online services, log in, shop online, gain access to buildings and much more. The Lissi-wallet enables you to: - Establish private and secure connections with other entities - Receive, store and manage verified credentials - Present digital proofs of your credentials - Log in without a password at third-party providers - Store .pkpass files (boarding pass, concert tickets etc.) The potential use-cases are endless. From proving that you have reached a certain age for age-restricted products to presenting your academic credentials to a potential employer. Identity is versatile - so is Lissi. Your information isn’t stored on a central database or any cloud service. Instead, the Lissi Wallet stores your encrypted identity data locally on your phone. Hence, you have full control over your data and only you decide with whom you want to share it. The Lissi Wallet is developed in Germany by Neosfer GmbH, a 100-percent subsidiary of Commerzbank AG. Our team also leads the IDunion consortium. The Lissi Wallet currently supports the IDunion, Sovrin, BCovrin and Indicio networks. 
For further information please visit our website www.Lissi.id Neosfer GmbH Eschersheimer Landstr. 6, 60322 Frankfurt am Main",https://play.google.com/store/apps/details?id=io.lissi.mobile.android,,Product,,Product,,,,,,,,2022-12-08,,,,,,,,,,,,,
|
||
MagicLabs,,MagicLabs,,Arthur Jen; Jaemin Jin; Sean Li,,"USA, California, San Francisco",USA,,,Magic Labs,"Magic is a developer SDK that you can integrate into your application to enable passwordless authentication using magic links - similar to Slack and Medium.<br><br>When users want to sign up or log in to your application:<br><br> User requests a magic link sent to their email address<br> User clicks on that magic link<br> User is securely logged into the application<br><br>If it's a web application, users are logged into the original tab, even if the user clicked on the magic link on a different browser or mobile device!",,https://magic.link/,,Company,,Company,Enterprise,ID,SSI,,,Passwordless,,2018,https://github.com/MagicLabs,https://twitter.com/magic_labs,https://www.youtube.com/channel/UCe9Itc4HfUnqXO4wJk9mo3Q/,https://medium.com/magiclabs,https://medium.com/feed/magiclabs,,https://www.crunchbase.com/organization/fortmatic-inc,,https://docs.magic.link/,,,,
|
||
MagicLabs,MagicLabs,,Medium,,,,,,,3 Types of Passwordless Authentication for Web 3.0,"Passwordless authentication is a fundamental shift in how people will access their tools and information online, and it will provide more security, prevent billions in losses, and create greater transparency.","3 Types of Passwordless Authentication for Web 3.0 This article was written by Mike Truppa, a content developer and blockchain expert at Webstacks, a website and marketing operations agency helping high-growth SaaS, FinTech, and Blockchain startups scale. Passwordless authentication is the future of online security, and promises a future where users don’t need to remember username and password combinations, spend time resetting passwords, or worry about the security of their personal and financial information being stolen. Passwordless authentication is a fundamental shift in how people will access their tools and information online, and it will provide more security, prevent billions in losses, and create greater transparency. Let’s explore the different types of passwordless technology and compare a few companies offering passwordless authentication software. What is Passwordless Authentication? Passwordless authentication is a method for verifying an internet user’s identity without requiring a password. Passwordless authentication methods in use today include magic links, one-time passwords (OTP), biometric authentication, and public-private key pairs using blockchain technology. Is two-factor authentication (2FA) passwordless authentication? Because the nature of two-factor authentication (2FA) is to add an additional layer of security to passwords, it can sometimes be mis-categorized as passwordless authentication. However, 2FA methods such as SMS-based authentication would still be considered a one-time password, which is a form of passwordless authentication. 
3 Types of Passwordless Authentication that Eliminate Single Points of Failure from Centralized PAP-based Authentication Today’s password authentication protocols (PAP) are designed with centralized intermediaries or organizations that maintain a database of username-password pairs to prove a user’s identity. The central point of failure of PAP-based authentication puts people at risk of hacks, data breaches, identity theft, fraud, and leaks, all of which can be mitigated with passwordless authentication. 1. Public-Key Cryptography and Blockchain Authentication Public-key cryptography is a form of authentication based on public and private key pairs, and it is already broadly used today, including in WebAuthn and machine-to-machine communication. Public-key cryptography has exploded in popularity in the last decade in large part because of public blockchains like Bitcoin, Ethereum, and Solana that use public-private cryptography to secure blockchain transactions of digital assets and Non-Fungible Tokens (NFTs). Because blockchain technology is built on top of public-key cryptography, the two can be confused as one and the same. However, public-key cryptography doesn’t necessitate authentication with a blockchain. For example, although Magic enables Web 3.0 platforms to connect to public blockchains like Ethereum, throughout the entire authentication flow there is no interaction with the underlying blockchain; no consensus is involved or required to prove the user’s identity. How does blockchain authentication work to prove a person’s identity? Instead of using the traditional method of typing in a username and password, blockchain authentication uses public-key cryptography for self-sovereign identity management. When a user creates a wallet account on the blockchain, they receive a private key which only they know, and it is paired with a public key that connects them to the wallet address. 
To access Web 3.0 applications or complete blockchain transactions, the user signs transaction requests using their private key which authenticates their account access. How are blockchains secured using public-key authentication? Blockchains have a variety of security mechanisms to protect the integrity of the blockchain and secure user’s information. Bitcoin’s Proof-of-Work and Ethereum 2.0’s soon to be Proof-of-Stake consensus mechanisms ensure censorship resistant networks that are practically impossible to hack. To hack (i.e. modify transactions on a blockchain’s distributed ledger) a malevolent user would need to control 51% of Bitcoin’s hash power, or more than 33% of Ethereum’s stake. For example, the top four Bitcoin mining pools which power Bitcoin’s Proof-of-Work consensus, control ~60% of the mining power, and to manipulate the network, all four of these independent miners would need to collude. As long as someone does not have access to your private key, it is highly unlikely for someone to access your wallet or impersonate the identity tied to your public-private key pair. 2. Decentralized Authentication Decentralized authentication means no single centralized platform, organization, person, or entity is needed to verify your identity. While blockchain authentication has proven to be a strong use case for decentralized authentication, the two are not the same. You don’t need blockchains to use decentralized authentication methods. What is an ITF? Identity Trust Fabric (ITF) is a decentralized mechanism for establishing trust between credentialed users. ITFs act as middlemen by interacting directly with a centralized intermediary. For example, an ITF could handle all the identification and access requests needed from a centralized party. ITFs decrease the risks of sending your confidential information to an organization. What are the tradeoffs between decentralized authentication and blockchain authentication? 
The main argument for using decentralized authentication methods like ITFs instead of blockchain authentication is the speed and cost of using blockchains. However, with the emergence of lightning fast layer one blockchains like Solana, layer 2 solutions built to help Ethereum scale transaction throughput like Polygon, blockchains are quickly becoming a faster, cheaper alternative to traditional decentralized authentication protocols. ETH 2.0 brought Proof-of-Stake (PoS) and sharding to the scaling conversation. These aren’t bad options as they do increase the L1 transaction throughput, but to reach scalability where there are millions of transactions on the network on any given day, PoS and sharding simply aren’t enough. 3. Distributed Authentication Distributed authentication is a collection of hosts interconnected by a single network. While distributed authentication is the leading choice based on the adoption across the industry, it poses a high amount of security threats. Two Common Flaws in Distributed Authentication Two main flaws with distributed authentication are: - Unconstrained delegation - Unbalanced authority What is unconstrained delegation? Unconstrained delegation allows some entity to authenticate you as an individual and also authenticate on your behalf (i.e. impersonate, act as you) to another party. While unconstrained delegation has benefits such as allowing administrators to update database servers from a web server, it creates an area of exploitation where a hacker with access to admin credentials can unilaterally compromise the system. Unconstrained delegation can lead to data breaches, exposing millions of confidential usernames and passwords, causing fraud and potentially billions of damages every year. What is unbalanced authority? Unbalanced authority is when a specific centralized party or system has information that identifies specific principles within the system (e.g. users). 
Unbalanced authority occurs between enterprise businesses where an external business partner is trusted inside the system, allowing them to access company resources. When the access granted is over-provisioned, it gives external companies access to too much sensitive information, which can harm the internal organization and its customers. What type of passwordless authentication does Magic use? Magic uses public-private key authentication. While the authentication flow doesn’t involve interacting with blockchain, Magic’s authentication allows users to interact with blockchains after they are authenticated by binding the authentication to 16+ different blockchain key generation schemes. Borrowing security principles from blockchain hardware wallets like Ledger, Magic secures accounts using a combination of hardware wallet security and AWS’s Delegated Key Management. Software developers can use Magic’s plug-and-play Software Development Kit (SDK) to quickly add magic links secured with public-private key authentication to their application. A magic link is a special URL that represents a login URL, typically emailed to users at login. This link contains an embedded token that authorizes users without requiring a username or password. Magic also supports other login methods like SMS, Social Logins, WebAuthn and MFA. The Type of Passwordless Authentication You Choose Will Be Different for Each Application’s Security Requirements Passwordless authentication removes the need to remember passwords or rely on password managers, and improves upon the security benefits of password-based authentication. Scalable passwordless authentication tools like Magic help software developers reduce the complexity of securing their applications, while simultaneously hardening security using the best aspects of public-private key cryptography. 
With the mainstream adoption of blockchain technology transforming every business sector, having the option to bind authentication with 16+ blockchain key generation schemes helps today’s Web 2.0 companies prepare for the future of Web 3.0. Passwordless authentication isn’t a zero-sum game. Every business has different needs, and not every type of passwordless solution will fit within the regulatory and compliance needs of each business.",https://medium.com/magiclabs/types-of-passwordless-authentication-for-web-3-958062e9d265,,Post,,Explainer,,,,,,Passwordless,,2021-12-30,,,,,,,,,,,,,
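The magic-link pattern the article above describes (request a link by email, click it, get logged in) can be sketched end to end. This is not Magic's SDK or its actual implementation, only a minimal stdlib-only illustration of the pattern; all names (`request_magic_link`, `verify_magic_link`, the example.com URL, the in-memory store) are hypothetical:

```python
import hashlib
import secrets
import time

# Hypothetical single-process store: token_hash -> (email, expires_at).
# A real service would use a database and an email provider.
PENDING = {}
TTL_SECONDS = 15 * 60  # links expire after 15 minutes

def request_magic_link(email: str) -> str:
    """Create a single-use login link for the given email address."""
    token = secrets.token_urlsafe(32)  # unguessable, single-use token
    token_hash = hashlib.sha256(token.encode()).hexdigest()
    PENDING[token_hash] = (email, time.time() + TTL_SECONDS)
    # In practice this URL would be emailed to the user.
    return f"https://example.com/login?token={token}"

def verify_magic_link(token: str):
    """Return the authenticated email, or None if the token is invalid."""
    token_hash = hashlib.sha256(token.encode()).hexdigest()
    entry = PENDING.pop(token_hash, None)  # pop => token is single-use
    if entry is None:
        return None
    email, expires_at = entry
    if time.time() > expires_at:
        return None
    return email

link = request_magic_link("user@example.com")
token = link.split("token=")[1]
assert verify_magic_link(token) == "user@example.com"
assert verify_magic_link(token) is None  # replay is rejected
```

Storing only the token's hash means a leaked server-side store cannot be replayed as login links, and popping the entry on first use enforces the single-use property the pattern depends on.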
|
||
MagicLabs,MagicLabs,,Medium,,,,,,,Developers: SMS Authentication is Challenging,"SMS (Short Message Service) messaging¹, despite a number of material challenges, has broad adoption, international regulations, and support across platforms.","Developers: SMS Authentication is Challenging Phones are ubiquitous; the largest segment of the world’s computing base. However, despite significant market adoption of a few operating systems, interoperable standards for messaging are rare, and often segmented. SMS (Short Message Service) messaging¹, despite a number of material challenges, has broad adoption, international regulations, and support across platforms. This post details the use of SMS as an authentication mechanism. What is a high quality SMS login system? - Easy for Users, Hard for Attackers - Works globally, across all cellular carriers, even in lossy service environments. - Enrollment, opting out, and authentication are beautiful, simple processes. - Confidence the user has access to their phone, and the phone number is valid. - When users change their phone number, they don’t stop using the service; they can migrate to a new phone number smoothly. - When an attacker pretends to be a user, they are prevented from taking over the account. - A user should not be easily duped into helping their attackers. SMS can be temporarily undeliverable SMS delivery is not guaranteed, and many implementations provide no mechanism through which a sender can determine whether an SMS message has been delivered. 💡 Allow users to request a new code as part of the product. Use a different code for each message. SMS can be permanently undeliverable Users can request to stop receiving SMS from a particular sender, often by replying with ‘STOP’. Users will no longer receive messages. In the United States, FCC affirms text messages are covered under the “Telephone Consumer Protection Act”, and users have a variety of rights, including to Opt-Out. 
💡 Notify users when their phone number is undeliverable: either in-app, or via email Messages can come from unfamiliar sources SMS standards make spoofing phone numbers difficult. However, no easy way exists for consumers to authenticate numbers or associate them with businesses. Messages appear with only a number to identify them. Users are habituated to ignore sender ids, or react with suspicion when numbers are changed. 💡 Include information about the sender in your message “Your ACME.co Code: 123–123”, or use Domain-Bound Codes Users can be on fraudulent sites Some sites trick users into entering authenticator codes for other sites. A common ploy asks for a user’s phone number, and prompts the user to enter the code they receive. The attacker simply forwards the collected code to the target, and successfully poses as the end user. - User Logs in to Fraud Site. Provides User phone number - Fraud Site forwards request to Real Site - Real Site sends User a SMS challenge. However, User thinks it comes from the Fraud Site - User enters correct SMS onto Fraud Site - Fraud Site / Attacker uses correct SMS to log into legitimate site - Attacker now has legitimate session on real site 💡 Include information about the sender in your message Your ACME.co Code: 123–123 or use Domain-Bound Codes 💡 Monitor for automations and headless browsers attempting your site’s login flow Users can change their phone number Users, particularly those outside of the United States, change their phone numbers often, giving rise to the popularity of messaging applications. 💡 Facilitate self-service recovery of SMS logins through alternative channels Attackers request control over phone numbers SIM-swapping attacks are social engineered takeovers of a user’s telecom contract. Calling customer support and transferring phone numbers between phones is common practice for consumers, and is exploited by attackers to capture SMS messages. 
Users can, though rarely, defend themselves, and unfortunately many users remain susceptible to these risks. 💡 Many SMS vendors provide carrier information in their API responses. If the carrier changes for a given number, send a confirmation email. Domain bound codes, an emerging solution The emerging standard for SMS security is to use Domain-Bound Codes for authenticating and protecting SMS messages. Messages are formatted to describe their sender, and allow automated tools to read those messages to auto-fill or protect users. Major mobile operating systems support or plan to support domain-bound codes. 123-456 is your ACME.co code. @acme.co #123-456 Enhance SMS-delivered code security with domain-bound codes — Apple Developer Providing a good SMS user experience SMS login flows can be complex to build and manage, but a few considerations will make the experience as smooth as possible for your users. Allow users to copy-paste into your SMS input box - Diverse interfaces exist for mobile devices, and users may not type in codes using a keyboard. Allowing paste makes your service more accessible, and a smoother end user experience. Using numeric codes? Label your input box as `numeric` - Phone soft keyboards use information about the input box to render the most usable keyboard for the use case. Showing a numeric keypad helps make entering codes as easy as possible. Supporting iOS users? Tag login boxes with textContentType - Operating systems such as iOS make it easy to fill in one-time codes from SMS messages. Apple uses a text content tag of textContentType=.oneTimeCode to allow users to auto-fill new SMS codes into the page. Building with Google Play? Consider auto-verification with the SMS Retriever API - Android’s Google Play Services offer a collection of advanced SMS tools for verification of SMS codes, including supporting background verification. 
- With the SMS Retriever API, it is possible to build almost silent user and device verification, however, fallback support for traditional SMS is required, and informing users about what is occurring is critical for building user trust and comfort. Building Web Applications? Use `autocomplete=""one-time-code""` - Many browsers facilitate SMS message autofill through input attributes such as autocomplete=""one-time-code"". This can provide smooth user experiences cross-platform, and allow your product to take advantage of built-in browser functionality. Using Magic for SMS authentication With challenges ranging from usability, deliverability, internationalization, fraud, bots, social engineering, and multi-device support, the simple user experience of SMS login comes with complexity for developers. Magic makes authentication easy for you and your users. Supporting a broad array of use-cases with a beautifully designed developer experience, getting started with SMS login is easier than ever. Learn more about SMS Login and Magic Join Magic’s Discord: https://discord.com/invite/magiclabs Follow Magic on Twitter: https://Twitter.com/magic_labs ¹Note: the terms ‘SMS’, ‘message’, and ‘text’ are used colloquially to refer to ‘Short Message Service messages’",https://medium.com/magiclabs/building-sms-authentication-c2cabccbd5f8,,Post,,Explainer,,,,,,SMS,,2021-10-27,,,,,,,,,,,,,
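The domain-bound code format described in the post puts a machine-readable `@domain #code` line at the end of the message so auto-fill tools can tie the code to a site. A minimal sketch of that format follows; the helper names are illustrative, not part of any Magic API:

```python
# Sketch of building/parsing a domain-bound SMS code message.
# The trailing '@domain #code' line follows the format Apple documents
# for auto-fill; these helpers are illustrative, not a Magic API.
def build_sms(domain, code):
    # Human-readable sentence first, machine-readable binding last.
    return '{c} is your {d} code.\n@{d} #{c}'.format(d=domain, c=code)

def parse_binding(message):
    # Auto-fill tools only need to read the final line.
    tokens = message.strip().splitlines()[-1].split()
    domain = next(t[1:] for t in tokens if t.startswith('@'))
    code = next(t[1:] for t in tokens if t.startswith('#'))
    return domain, code
```

For example, `build_sms('acme.co', '123-456')` yields a message whose last line is `@acme.co #123-456`, which `parse_binding` recovers as `('acme.co', '123-456')`.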
MagicLabs,MagicLabs,,Medium,,,,,,,"Building a low-code, opinionated approach to plug & play login","Magic Login Form represents a new onboarding experience for end-users, so we wanted to revamp our own onboarding experience for developers to match. Learning about auth can quickly derail any developer’s good day. Striking the balance between good UX and good security can just boggle the mind.","Building a low-code, opinionated approach to plug & play login It feels so long ago that Magic unveiled its first auth solution in April 2020. JAMstack was having a moment, and so were NFTs. The world had just begun to reckon with shutdowns and social distancing. A surge in remote work showed us that online identity was overdue for a refactor. Centralized infrastructures were being challenged everywhere. Back then, all it took was one line of code to implement Magic. We used to hear lots of positive feedback about our whole developer experience. Those docs, you know? So clean. A year since, the world of web development is again at a threshold. We’re inundated with feedback from users that want a multitude of sign-in options. They want to feel secure, they want to own their data. They desire convenience and seamlessness. Providing an auth experience that serves every user, no matter their technical acumen or accessibility needs, is a costly undertaking for app creators. That’s because building a Magic implementation never really was just one line-of-code. You still have to create buttons, composed into forms, connected to a server. Model user accounts, measure conversion rates, but wait… did I aria-label that button right? Hold on, we need a combobox? Now add social logins to the mix: what the heck is OAuth? Or WebAuthN? This login page is turning into infinite story points! As we added more and more choices for sign-in, we heard feedback that Magic was harder to use, especially for no-code builders. So, what happened? 
And why should auth — something that every app needs — be so difficult to build and maintain? That’s a question that’s been bugging me for some time now. I lead the engineering team for developer experience at Magic, so we aimed to set a new standard to help our customers build auth more quickly, more securely, more accessibly, and more user friendly-y. Occam’s auth The aha moment came from the simple realization that most modern auth flows follow a discrete pattern: authorization and callback. You prompt a user to authorize themselves, traditionally with an email + password. Or, a more modern (and more secure) approach would use social logins, or Magic’s own passwordless email/SMS flows. Once a user has submitted their proof-of-identity (“authorization”), the app has to then verify this information (“callback”). In the case of social logins, this requires checking a one-time code built around some fancy, math-y cryptographic stuff. Or, using Magic’s passwordless SDKs, you just call getRedirectResult for social logins and loginWithCredential for email/SMS. Building auth for the web essentially boils down to two big function calls. Noticing this, however, presents an opportunity to do what we engineers love to do best: abstract! But we weren’t going to make just any abstraction. We want a new paradigm that speaks to the power of web development today and uses web primitives in such a way that the solution can slot into just about any tech stack. We’re especially excited about no-code and low-code platforms like Webflow and Bubble, so we made it a priority to support those tools as natively as possible. Introducing (truly) plug & play auth Today, we’re introducing a new way to implement Magic auth for the web: Magic Login Form. We think it delivers on the promise of Magic as the easiest, most flexible, and most extensible auth solution available. That’s because we want your frontend implementation to be as simple as copy & paste. 
Everything you need to start securely authenticating your users with any of Magic’s sign-in methods is two <script> tags away: That’s all it takes to connect your app to Magic’s entire suite of auth features. You get a beautiful, accessible login screen with UI best-practices built-in — we’ll even remember which auth method users previously signed-in with. And better yet, your implementation is future-proof and automatically updates with Magic’s service. So, when Magic adds support for your favorite social login provider, you don’t need to deploy an update. Your users will see the latest changes automatically. All of this happens inside of an <iframe> hosted on your domain, so users aren't left questioning what service they're interacting with, reducing the risk of phishing. At Magic, we think developer experience is user experience. So we’re trying to remove as many barriers between you and your creativity as possible. With Login Form, you can stop worrying about auth and start focusing on what matters to your users and your business. Though it’s still not quite “one line of code” for everything auth, it’s a hell of a lot closer than we’ve seen anywhere else, and we’re excited about its potential to improve the auth experience for everyone on the web, long into the future. The first prototype At Magic, we promote a culture of creative experimentation, and we put this into practice during bi-weekly “demo days.” Everyone on the team has an opportunity to share something they’re working on — whether it’s related to a milestone project, or just a blossoming idea. Some of our best features sparked this way, usually based on pure intuition and user empathy. This takes a lot of introspection as a team. If we’re knowledgable of ourselves, it tends to manifest in great products for our users. So, demo day is also an opportunity for us to invest in each other. 
When Login Form made its first appearance at demo day — unplanned and off-the-cuff — it looked like this: The inspiration for that demo had simmered for a while. When we talked about this “big idea,” to make auth simple and clean and future-proof, it was sandwiched between phrases like “pie-in-the-sky” and “someday…” But, to produce that working proof-of-concept was a matter of hours (it helps, of course, that we already had a universe in which to bake our apple pie). Demo day was a hit, but that’s only where the real work began. The “real” work For the developer experience team, Login Form meant so much more than a pre-packaged UI. It represented a whole new, opinionated implementation approach. Building a login page is pretty easy to “get”, even if you’re not an engineer. We’ve seen a thousand login pages before. But we still had to explain this implementation approach in a way that our product designers and marketing extraordinaries could relate to — we needed to help them tell a story. So, we went back to the drawing board. It didn’t take long to find consensus on a UX pattern. Again, it’s a login page (a damn good login page). We started with a few design goals: - Login Form should be adaptive to a developer’s Magic Dashboard settings, creating a seamless development experience. If you add some Google creds to your Magic account, then Login Form should instantly reflect that. - Logging in should be quick, easy, and frictionless — users should never feel lost in sea of sign-in options. So, we want to remember what sign-in method a user has previously used for an app, then we can focus them on the right form automatically. - Good design is inclusive design. Our entire login experience should reflect UI best practices and accessibility standards, above and beyond simple compliance. - The design should be extensible. While we’re super proud of our initial release, we’re already thinking about ways to make Magic Login Form even better. 
Simplicity and flexibility help ensure we’ve got room to grow moving forward. A new onboarding experience for developers Magic Login Form represents a new onboarding experience for end-users, so we wanted to revamp our own onboarding experience for developers to match. Learning about auth can quickly derail any developer’s good day. Striking the balance between good UX and good security can just boggle the mind. Even building on top of a solution like Magic can quickly spiral into a thousand-thousand esoteric questions. UX tends to be the last box on the auth checklist. So, how do we show-off the “easy button”? We started by looking at our own sign-up experience on Magic Dashboard. After you’ve completed our passwordless email flow for the first time, you see a screen like this: When we created npx make-magic, we sought to speed-up development of new projects using Magic. When we added this screen to our sign-up flow, however, we saw a mixed response. Some, especially those from a JAMstack background, were happy to see familiar tooling options. Others were unsure about what npx make-magic was doing in their system, and why they were being asked to start there. One developer was confused upon seeing npx, thinking that Magic worked exclusively for the NodeJS ecosystem—an impression we wanted to correct. The easiest decision for us was to strike this page from our sign-up flow completely. We replaced this piece of the onboarding puzzle with a new Login Form settings page. From here, developers can access an interactive preview of their customized form. We also introduced a new featured card to Magic Dashboard’s home page, surfacing this new implementation approach with a beautiful, eye-catching design. Now what? Getting started with Magic Login Form is super easy. Log into your Magic Dashboard account and go to Login Form. Try a demo for your very own plug & play login page. You’ll also see a link to download a working implementation using your actual API keys! 
We’ve also written some documentation to help you build a plug & play login experience from the ground up. And, of course, we have added a template to our CLI scaffolding tool to generate a working implementation in under a minute. Simply run the following command in your preferred shell: npx make-magic --template plug-and-play So long, and thanks for all the fish! We hope you enjoyed this peek behind-the-curtain of the all-new Magic Login Form (fun fact: our internal code name was “Auth-in-a-box”). By the way, it’s totally free to try. If you’re interested in getting more involved, join the Magic Discord server, where you can provide feedback or connect with a vibrant community of developers. And one more thing: we’re hiring!",https://medium.com/magiclabs/building-a-low-code-opinionated-approach-to-plug-and-play-login-21bb30dca9a4,,Post,,Meta,,,IAM,,,,,2021-10-27,,,,,,,,,,,,,
MagicLabs,MagicLabs,,Medium,,,,,,,Decrypt trusts Magic to onboard record new user growth with the launch of reader tokens and rewards,I’ll cover how the Decrypt team streamlined development and onboarding to launch Decrypt Tokens and Drops — along with how crypto DNA and a leading product and team drove their trust in Magic.,"Decrypt trusts Magic to onboard record new user growth with the launch of reader tokens and rewards “CryptoPunks NFTs Are Worth Nearly $2 Billion Now” “Non-Fungible Tokens (NFT): Beginner’s Guide” These two headlines are just a glimpse into how Decrypt helps demystify the decentralized web. On the site, you can find everything from insights into the biggest events shaping the crypto industry to content that meets readers where they are in their journey of learning about blockchain and cryptocurrencies. As a fan of the publication, I was thrilled to talk with Luke Hamilton, Senior Software Engineer. In this post, I’ll cover how the Decrypt team streamlined development and onboarding to launch Decrypt Tokens and Drops — along with how crypto DNA and a leading product and team drove their trust in Magic. Incentivizing readers to earn as they learn Decrypt is the gateway into the decentralized world. What’s unique about the reader experience is its clear-eyed storytelling and accessibility. Decrypt content caters to various levels of existing crypto knowledge. Back in 2019, the team had ambitions to deepen the reader experience and connect further to their ethos of education through using decentralized technology. Fast forward two years of behind-the-scenes work by their product team, that vision is now brought to life with the launch of Decrypt Token (DCPT), a new reader token that lives inside the Decrypt mobile app. According to the team, “The token is our way of participating in the decentralized industry we cover, and experimenting with how cryptocurrency can spur reader engagement.” So, what’s in it for readers? 
The opportunity to earn as they learn. Every week, unique digital rewards — called Drops — become available and redeemable with Decrypt tokens. It’s an ingenious way to assign value to actions, thank readers for their loyalty, all while also encouraging hands-on interaction with crypto. Novel reader tokens and rewards inside a mobile app During the development process, Luke and the team tackled the question of how to authenticate users. They explored whether to build their own authentication or integrate with an existing auth solution. Internally, there was already substantial discussion about going passwordless for the tokens MVP (minimal viable product). For Decrypt, the biggest benefits to passwordless login were improvements to UX and that several forward-thinking brands had started to make passwordless login increasingly popular with users. At the time, work on traditional username and password flow had already begun. But after assessing where the app was and what was left to do in the critical path, Luke realized they had not yet solved issues around account recovery. After evaluating the large effort it’d require their lean engineering team to build out forgot password, reset password, and so on, Luke decided not to continue to invest in a traditional auth flow and instead, implement Magic. Natural excitement for Magic sparked at first discovery because of the companies’ shared philosophies on crypto. “Choosing Magic further bolstered the credibility of our ethos.” The product also stood out as the passwordless auth solution with future-proof crypto and identity tech under the hood. “The clear aha moment came when we realized Magic does exactly what we wanted, functionally and philosophically: frictionless login for users and decentralized identity aligned to our mission. 
There weren’t that many competing solutions or anything remotely comparable; it’s fully-featured and production-ready.” Luke estimates that continuing to build password-based auth would have added “at least another month” to their roadmap. In just a week, email magic links were up and running in Decrypt’s mobile app for their private beta. The process of implementation was streamlined primarily thanks to Magic’s simple documentation and proactive customer support. Luke counts the responsiveness to his questions and genuine partnership mindset as most impressive, highlighting the customer-centric team behind the product. Successful user adoption at scale Decrypt has seen remarkable success in the months immediately following their beta and public launches. Over 50,000 users have already downloaded their mobile app to engage with content and earn tokens, and Decrypt is expecting hundreds of thousands more users by the end of 2021. Luke is certain Magic played a key role in appealing to broader audiences who want to get their hands on the latest technology and trends in crypto. Sign up and login is easy and fast; users can set up their in-app wallet in literally a few seconds. “User growth is the best proof there is. Less friction for users leads to higher conversion.” He’s also the first to recommend Magic to colleagues and is now solely building auth with Magic for his passion projects as well. “I’m working on my own rewards app and had to switch from Auth0 to use Magic.” The coolest part: innovation around tokens is just getting started. To kick off the first season of Decrypt Drops, the team launched an exclusive NFT (non-fungible tokens) series. In the future, Drops will evolve to include a host of different rewards and functionality. Together with Magic, Decrypt is reaching record reader growth and engagement. 
Through seamless onboarding and innovative experiences powered by blockchain technology, the Decrypt and Magic teams are co-advocating for the future of the web.",https://medium.com/magiclabs/decrypt-trusts-magic-to-onboard-record-new-user-growth-with-the-launch-of-reader-tokens-and-rewards-14d791e582d5,,Post,,Meta,Web3,DWeb,DeFi,Onboarding,,,,2021-05-27,,,,,,,,,,,,,
MagicLabs,MagicLabs,,Medium,,,,,,,Magic Product Updates: December Edition,"Since our last product update, we’ve launched a multifaceted set of capabilities that enable you to do more with Magic.","Magic Product Updates: December Edition Happy holidays to the Magic community! Since our last product update, we’ve launched a multifaceted set of capabilities that enable you to do more with Magic. In this post, I’ll cover the latest highlights and improvements. Multi-factor Authentication Developers can now enable multi-factor authentication (MFA) for their users! This capability allows customers to add a layer of security to their end-user accounts. This means a secondary factor is validated along with the existing primary factor to log in to an account. Typically, the primary factor is an email, and a secondary factor is a phone number or mobile device authenticator. The idea is that both factors will need to be compromised to breach an account. The benefits of enabling MFA include: - MFA reduces the risk of a compromised account or stolen NFTs by requiring users to provide multiple credentials to access their accounts. - It protects users from theft. By requiring multiple authentication methods, MFA adds a layer of security from a stolen laptop or device. - MFA is one of the most straightforward and robust security methods a developer can enable. Magic makes enabling MFA simple with one click in the dashboard. - It helps your users meet regulatory compliance standards. You should enable MFA if your users must meet HIPAA, PCI, or CJIS compliance standards. Magic offers MFA through mobile authenticator apps like Authy or Google Authenticator. Email and SMS primary factors are currently supported. Magic will add support for WebAuthN and social login primary factors in the future. To get started, head to the dashboard and enable MFA, or read more about the integration here. Custom Email Provider Have you wanted to customize the sender of your email magic link login? 
With Magic, you can now route emails through your Simple Mail Transfer Protocol (SMTP) server. Enabling the custom email provider gives you complete control over where your app’s login email is sent from, as well as the name of the sender. Magic will send email magic links through your SMTP server as soon as you configure the custom email provider. Disabling the custom email provider will restore sending emails from noreply@trymagic.com. Magic’s custom email provider is compatible with leading SMTP servers. Please visit our docs for more information on how to get started. Teams We believe when it comes to building great apps, collaboration is critical. That’s why we are introducing Teams. Every Magic developer is given a Personal team where you can invite up to two collaborators to help integrate Magic, update branding, or manage your users. As a collaborator, you will have access to any teams you have been invited to and your Personal workspace. Teams consist of two basic permission levels: a team owner and collaborators. The team owner has complete control over their account and is responsible for billing and managing team members. Collaborators have access to Dashboard functionality to collaborate on any project within the owner’s account. To add members to your project, head to your Magic Dashboard and look for My Team to get started. Magic Login Form: Privacy Policy and Terms of Service Our Magic Login Form enables a developer to integrate passwordless login with just 2 script tags. Seamlessly link to your application’s Privacy Policy and Terms of Service and have them shown within the Magic Login Form to newly registering users and returning users. For more information on embedding a URI, review our script options here. SMS Login for SDKs SMS Login support has been expanded to our Mobile, Flutter, iOS, and Android SDKs! This release enables developers to easily integrate SMS Login into their applications on any of the supported platforms. 
Thank you As 2021 comes to a close, I want to thank you for your support over this last year! At Magic, we are focused on helping our fast-growing developer community solve complex authentication, decentralized identity, and blockchain problems. So I want to invite you to join the Magic community on Discord, say hello, share product ideas and help others learn in this technology space.",https://medium.com/magiclabs/magic-product-updates-december-edition-7a24a3dcd4e4,,Post,,Meta,,,Developers,,,,,2021-12-28,,,,,,,,,,,,,
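The mobile-authenticator factor mentioned in the product update (Authy, Google Authenticator) is typically TOTP, specified in RFC 6238. As a rough illustration of what those apps compute from a shared secret, here is a stdlib-only sketch; this is not Magic's implementation, and the function name and defaults are ours:

```python
# Illustrative TOTP (RFC 6238) sketch: the algorithm authenticator apps
# such as Google Authenticator implement. Not Magic's code.
import base64
import hashlib
import hmac
import struct
import time

def totp(secret_b32, at=None, digits=6, step=30):
    # Shared secrets are conventionally exchanged base32-encoded.
    key = base64.b32decode(secret_b32, casefold=True)
    # Both sides derive the same counter from the current 30s window.
    counter = int((time.time() if at is None else at) // step)
    digest = hmac.new(key, struct.pack('>Q', counter), hashlib.sha1).digest()
    # Dynamic truncation (RFC 4226): pick 4 bytes at a digest-derived offset.
    offset = digest[-1] & 0x0F
    code = struct.unpack('>I', digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % (10 ** digits)).zfill(digits)
```

With the RFC 6238 test secret (ASCII "12345678901234567890", i.e. base32 `GEZDGNBVGY3TQOJQGEZDGNBVGY3TQOJQ`) and time T = 59 s, the 8-digit code is `94287082`, matching the RFC's SHA-1 test vector.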
MagicLabs,MagicLabs,,Medium,,,,,,,Magic Raises $27M to Future-Proof Authentication,"Magic makes it plug and play for developers to add secure, passwordless login, like magic links and WebAuthN, to their applications. Users are no longer exposed to password-related risks from the very start.","Magic Raises $27M to Future-Proof Authentication Today, we’re thrilled to announce that Magic has raised $27 million in Series A funding, bringing our total funding to $31 million. This round is led by Northzone, with participation from Tiger Global, Placeholder, SV Angel, Digital Currency Group, CoinFund, and Cherubic — along with a roster of more than 80 stellar angel investors, including: - Alexis Ohanian — Co-founder of Reddit, Initialized Capital - Balaji Srinivasan — Ex-CTO at Coinbase, Co-founder of Earn.com - Ben Pruess — President at Tommy Hilfiger, Ex-VP at Adidas - Casey Neistat — YouTuber (12M subscribers) - Guillermo Rauch — CEO of Vercel & Next.js - Jacob Jaber — CEO of Philz Coffee - Jason Warner — CTO of GitHub - Kayvon Beykpour — Head of Consumer Product at Twitter, Periscope - Naval Ravikant — Co-founder of AngelList - Roham Gharegozlou — CEO of Dapper Labs - Ryan Hoover — Founder of Product Hunt, Weekend Fund - Sahil Lavingia — CEO of Gumroad - Scott Belsky — CPO of Adobe, Author of “The Messy Middle” - Soona Amhaz — General Partner at Volt Capital / TokenDaily - Varsha Rao — CEO at Nurx, Ex-Head of Global Ops at Airbnb This new capital will help us double down on empowering developers and future-proofing our technology, to ensure Magic is the most secure, seamless, and scalable way to onboard users sans passwords. Since launching on Product Hunt in April 2020, Magic has been in hyper-growth mode. This year, we went from a few people in a San Francisco loft to a 30+ all-remote team spread around the world. We’ve over 10X’d the number of developers building with Magic and our community continues to grow at a fast clip each month. 
Now, we’re securing millions of user identities for companies of all sizes and verticals. Trailblazing customers like UserVoice, Decrypt, Polymarket, Fairmint, and more integrate Magic as their core auth flow. We’ve helped our customer base expedite time-to-market, boost conversion rates, reach more audiences, level up security, and reduce cost. And we’re just getting started. Our vision is to build the passport of the internet in order to safeguard the trust between users and internet services. The legacy model User trust is one of the biggest challenges of the internet. Despite explosive growth in the number of people now connected to the internet — over 5.1 billion users, 67% of the planet — user trust is at an all-time low. Why? The current user trust model of the internet is fundamentally broken. A majority of the internet ecosystem has been trading user security, trust, and privacy in exchange for convenience and unsustainable profit growth. These dynamics at play resemble a teetering Jenga tower about to collapse. We are ensnared in a cybertrust paradox: relying on both a handful of mega-corporations and relative geopolitical stability for access to vital online services — sometimes forcefully so. These corporations may: - Go out of business and stop providing services - Get hacked and cause massive damage to businesses and users - Restrict critical access due to geopolitical motivations - Exploit user privacy and compete with businesses built on their own platform due to misaligned incentives - Ignore compatibility with modern tech stacks like Jamstack, blockchain, and other forms of decentralized infrastructure Big tech companies become centralized custodians, amassing troves of user identity data, creating single-points-of-failure with “too big to fail” level risks. With motivations to expand and maintain growth at all costs, they acquire more companies and absorb even more user identities. 
Close to 80% of all recorded acquisitions happened in the last 8 years alone. This problem compounds itself. One password leak makes other compromises easier, and the rate of lost or stolen passwords is only accelerating, as more companies are moving online due to the pandemic. Facebook’s most recent data breach compromised phone numbers and personal data, making it easier for hackers to impersonate users and scam them into handing over login credentials. In this instance, over 500 million users’ data were leaked. To hedge against these risks, companies are under more pressure to keep data safe and act swiftly and transparently in a cyberattack. So they turn to their developers to implement authentication in-house. This often ends up being extremely expensive, involving building large teams to continuously address a multitude of security, compliance, infrastructure, reliability, and scale challenges. Despite these resources, 66% of breaches took months or even years to discover in the first place. Data breaches and lost/stolen passwords are a looming challenge of our times. Traditional forms of authentication haven’t changed much in decades and passwords are already obsolete. Now more than ever, we need digital identity infrastructure that’s secure and sustainable — that scales with modern internet ecosystems. The future At Magic, we believe the solution starts with developers. Instead of deferring responsibilities to end-users to improve their own security hygiene with things like password managers, Magic makes it plug and play for developers to add secure, passwordless login, like magic links and WebAuthN, to their applications. Users are no longer exposed to password-related risks from the very start. So, what makes Magic authentication unique? Instead of usernames and passwords, Magic uses public and private keys to authenticate users under the hood. 
A decentralized identifier is signed by the private key to generate a valid authentication token that can be used to verify user identity. Traditionally, usernames are publicly recognizable identifiers that help pinpoint a user, whereas passwords are secrets that were created by the user and are supposed to be something only they know. You can think of public and private keys as materially improved versions of usernames and passwords. The public key is the identifier and the private key is the secret. Instead of being created by users and prone to human error (e.g. weak/reused passwords), the key pair is generated via elliptic curve cryptography that has proven itself as the algorithm used to secure immense value sitting on mainstream blockchains like Bitcoin and Ethereum. Using blockchain key pairs for authentication gives Magic native compatibility with blockchain, supporting over a dozen blockchains. This enables blockchain developers to use Magic SDK to provide user-friendly onboarding experiences to mainstream users and tap into the potential of the rapidly expanding blockchain industry that is growing 56.1% year over year and projected to reach $69.04 billion by 2027. The key pairs are also privacy-preserving (no personally identifiable information) and exportable. This allows user identity to be portable and owned by users themselves (self-sovereignty). The world is already moving towards this direction with novel solutions from companies like Workday and Microsoft. We’re first committed to enabling a passwordless future, by providing developers with the easiest way to integrate passwordless login methods into their applications, paving the way to eventually encourage worldwide adoption of decentralized identity. We’re hiring! To accelerate our momentum, we are growing our team! We are hiring across the board — engineering, product, research, and marketing. 
We are a diverse team with experience working at leading tech companies such as Stripe, Docker, Amazon, Auth0, Box, and Apple. You can check out our openings on our careers page. If you’re excited about our mission and don’t see a role that matches your skills, please write to careers@magic.link and share how you can help. To our customers, community, and investors: we’re incredibly grateful for your support. Absolutely thrilled to be on this journey together and can’t wait to share what’s in store for you all! Onward! 🔥",https://medium.com/magiclabs/magic-raises-27m-to-future-proof-authentication-79d8c63b2813,,Post,,Meta,,,,,,,WebAuthN,2021-08-09,,,,,,,,,,,,,
|
||
MagicLabs,MagicLabs,,dropbox,,,,,,,Magic: A Key-Based Authentication System For Self-Sovereign Identity,"Since Magic’s authentication protocol is based on key pairs provided by decentralized blockchain networks, it is platform-independent and thus able to provide authentication service without having to rely on centralized identity providers.","Build and deploy self-sovereign identity solutions, with the technology and go-to-market resources powering the largest implementations of digital credentials in production. Unlock the trust, security, and privacy benefits of verifiable credentials with our industry-leading platform. Start building real-world solutions today, with our software, training materials, product documentation, and world-class customer success team. Evernym is helping us develop our strategic response to the self-sovereign identity market opportunity. With access to Evernym’s insight, tools, and expertise, we’ll be able to rapidly experiment with this technology and its potential applications to the benefit of both individuals and business. Discovering Evernym and its tech in the last two weeks has totally changed my view on data privacy and the sharing of information. Evernym has put Truu at the forefront of decentralized healthcare identity. We have worked closely with Evernym to enable doctors to control their own portable digital identities at a higher level of trust than current standards. Evernym has given us very clear first steps to explore self sovereign identity. With experts on hand, we are able to learn quickly and develop real solutions that directly benefit our customers, partners and the wider business We just did a great live demo of verifiable credential exchange at the Blockchain in Pharma Supply Chain conference in Boston. It was very well received so thanks to this community for the support to make this possible! When you work with Evernym, you work with the world’s leading expert in verifiable credential technology. 
With deep skills in digital identity, cryptography, privacy, security, and governance, we are the original developers of Hyperledger Indy, the creator of the Sovrin Network and the Sovrin Foundation, a WEF Technology Pioneer, and a co-founder of cheqd, the Trust over IP Foundation, the Decentralized Identity Foundation, and the Good Health Pass Collaborative. We’re a mission-driven company, committed to continually raising the bar when it comes to privacy. All of our products are meticulously architected around privacy, incorporating cutting-edge cryptography and zero-knowledge proofs for data minimization. We never cut corners when it comes to building for the trusted future we want to see. Five safeness checks for ensuring that your digital identity systems are secure, private, flexible, and non-correlatable. It’s not enough to “talk the talk.” Here’s how our core principles have informed our entire direction. Not all digital credential solutions are created equal – here’s what makes Evernym’s solution safe, private, and open.",https://www.dropbox.com/s/3flqaszoigwis5b/magic%20whitepaper.pdf?dl=0,,Whitepaper,,Meta,,,,,,,,2020-07-09,,,,,,,,,,,,,
|
||
MagicLabs,MagicLabs,,Medium,,,,,,HackOn2.0,Magic at HackOn2.0,"At Magic, we love to be where developers hang out. As a Developer Advocate, it’s especially fun to connect with devs dreaming up big ideas and hacking them into reality. Back in April, the HackOn2.0 team reached out to me to talk about getting Magic involved in their next hackathon. We jumped at the opportunity and were so glad to support HackOn2.0’s vibrant community.","Magic at HackOn2.0 At Magic, we love to be where developers hang out. As a Developer Advocate, it’s especially fun to connect with devs dreaming up big ideas and hacking them into reality. Back in April, the HackOn2.0 team reached out to me to talk about getting Magic involved in their next hackathon. We jumped at the opportunity and were so glad to support HackOn2.0’s vibrant community. HackOn 2.0 is a week-long, global digital hackathon organized by Aditya Oberai, Rishabh Bansal, and team to bring developers’ ideas from inception to reality. This year’s edition focused on fostering innovation and also raised awareness of mental health and diversity. A lot of people dedicated time to participate! 8,298 people registered. In total, there were 370 teams of 2–4 members each. Participants built their hackathon projects based on these tracks: Magic was proud to be a platinum sponsor. In addition, I was personally psyched to also be a mentor during the hackathon. The learning session set up for May 27th helped participants understand what Magic is and how to get started with it. For the mentor session, you can catch the full video recording here. “Best Hack built with Magic” was one of the categories for participants, in addition to the primary tracks. The participants were allowed to choose more than one sponsored track. We had the following prizes for participants: It was exciting to see that 15% of all projects were built with Magic! When it came to evaluating projects, selecting the winners was not an easy task. 
All of the projects were amazing and well aligned with the primary track. In the end, we landed on the top 3 hacks that were focused on solving real problems and utilized the best possible tech stack in this short time. A key criterion was how well each project implemented its Magic integration. Without further ado, here are the winners — along with the problem they are solving, the technology used, important links, and their team members. Winners 1. WeCare by Eternals Problem it solves WeCare predicts the health risks to your body based on your daily health data. The ongoing pandemic made everyone learn the importance of being healthy and how daily activities and diet play a major role in that. Team Eternals wanted to alert the users of WeCare about any risk associated with a deficiency or unhealthy lifestyle by showing graphs and predictions using machine learning. WeCare has a smart predict feature that helps uncover latent diseases and predict the risk of future ones. Patients will be able to review medical conditions and reach their treatment goals much faster, lowering the risk of serious health complications. Technology used ReactJS, Python, Django, Apollo, GraphQL, Heroku, Firebase, Magic Team Members Harmanjit Singh, Kunal Khullar, Gurleen Kaur, and Paritosh Arora Important Links Frontend Backend Video demo 2. ShopZop by Etherbots Problem it solves The Covid-19 pandemic has hit small and medium-size businesses hard, as they relied on their in-shop customers and had very little or no digital presence at all. And with the lockdown and movement restrictions, these businesses have suffered a lot. Team Etherbots came together to help connect shop owners with customers. A simple solution was to harness the power of an application that almost every smartphone user in India uses — WhatsApp! It increases their online reach and brings in some revenue. 
Technology used ReactJS, NodeJS, GraphCMS, GraphQL, WhatsApp, Magic Team Members Ayush Jain and Bhuvanesh Shathyan Important Links A video demo of the product. 3. Debately by tier-4 Problem it solves Team tier-4 believes that debates help us have healthy discussions and solve problems within society, helping us build a diverse and inclusive society. In the last few years, the quality of online and TV debates has degraded, so the tier-4 team members paired up to build an AI-based audio chat for the web using React, Magic, WebRTC, and Symbl.ai to conduct and practice healthy debates. Debately enables individuals from all backgrounds, irrespective of race, religion, and region, to come together and debate on controversial or sensitive topics, share their opinions, or simply improve their language and communication skills while maintaining anonymity. Technology used ReactJS, WebRTC, FastAPI, MongoDB, SymblAI, Firebase, Magic Team Members Sanket Shevkar, Pranav Pathare, Aaryan Srivastava, and Gatij Taranekar Important Links HackOn2.0 not only let us see how Magic fits into people’s projects and which features perform well, but it also allowed us to connect with individuals, get to know them, and understand their perspectives on solving real-world problems across topics like diversity and inclusion, mental health and awareness, Covid impact, and open innovation. We were inspired to see participants tackle a specific problem statement, work with different technologies, and successfully build a live demo in a few days. All while working remotely, too! Overall, it was a very fun three days of meeting with community members. We gathered feedback and better understood the use cases for Magic. It was also a great opportunity to collect more insights on improving product documentation for a better developer experience. We look forward to participating in upcoming hackathons and further engaging with the developer community. 
If you have any upcoming hackathons/meetups/conferences and want us to be part of your event, reach out to me on our Discord server with the details!",https://medium.com/magiclabs/magic-at-hackon2-0-9187d0e24d86,,Post,,Recap,,,,,,,,2021-08-02,,,,,,,,,,,,,
|
||
Mattereum,,Mattereum,,Rob Knight; Vinay Gupta,,"United Kingdom, England, London",,,,Mattereum,"Mattereum has the tools to make physical goods flow around the world as easily as information using Ethereum blockchain smart contracts.<br><br>We help people to sell, organise, and communicate about their property with confidence. Strong product knowledge ensures everyone derives maximum value from objects they buy, sell, and own.<br><br>Mattereum’s first customer is legendary actor William Shatner’s company Third Millenia. Using Mattereum Asset Passports to authenticate important collectibles and memorabilia from Mr Shatner’s long and distinguished career will ensure that these items continue to grow in value by retaining their provenance indefinitely.",,https://mattereum.com/,,Company,,Company,Enterprise; Web3,DWeb,IOT,,,"Ethereum,Smart Contracts",,2017-08-04,https://github.com/mattereum,https://twitter.com/mattereum/,https://www.youtube.com/channel/UCPJMRiyPFgquezFZT850_Ew,https://medium.com/humanizing-the-singularity,https://medium.com/feed/humanizing-the-singularity,,https://www.crunchbase.com/organization/mattereum,https://www.linkedin.com/company/mattereum/,,,,https://t.me/mattereum,
|
||
Mattereum,Mattereum,,Medium,,,,,,,Build Back Better!,"VR is not ready for artistic reasons, far more than for technical ones. So my prediction is this: the Metaverse is going to fail.","Build Back Better! Vinay Gupta, CEO of Mattereum and Ethereum launch co-ordinator in 2015 looks at how to survive the Crypto Winter and come out the other end with new purpose Around 2017 I was looking at starting a VC fund. I didn’t know exactly how to do it: I’d worked in a fund in the 1990s but there was a lot about the business I did not understand at the time. As part of the research, I put about 100 people through their first VR experience. I wanted to figure out if it was time to invest in VR projects. It was market research. I came to the conclusion that we were years away from VR being a workable technology. The hardware was clearly ready: Oculus Rift was a little clunky but it was the Apple IIe of VR: the thing that tells you The Revolution is Coming. That was fine. But the software was appalling. Back in the day they made movies by putting a camera on a tripod and doing a stage play in front of it. It took decades for movie-making to mature. A whole new art form (cinematography) had to be invented. Shocking and amazing things happened decade after decade for a century to get us to where cinema is now. Those changes were often driven by technology, but they were delivered by actors and directors portraying the human experience in new ways, creating ever-more-compelling works. VR is not ready for artistic reasons, far more than for technical ones. I could go on about this for some hours, but the lack of a “virtual cinematography” makes most VR experiences as compelling as watching CCTV footage of mall parking lots. It does not matter if I’m looking at the wheat fields of Gondor and scouting for Nazgul, if it’s not telling me a story, I don’t want to know. Also violence is overwhelmingly overpowering in VR. 
It’s literally just too much and reducing the violent content of our video games to the point that it doesn’t traumatize people to play (Arizona Sunshine, I’m looking at you here) is also going to take time. So my prediction is this: the Metaverse is going to fail. The pandemic is more-or-less over, people are back out in the world again, and everybody loves this place! The stay-at-home culture which VR is fundamentally rooted in doesn’t provide the lived experiences which people want. There is something very much like VR which people do want, but until VR has solved the artistic problems of storytelling in the virtual medium it’s just chat rooms with better graphics. I have a Meta Quest 2, and it’s gathering dust: there’s just nothing to *do* in there. Not yet. On the other hand, we have inflation. To inflate. The state of things being inflated. The bubble of all bubbles. Folks don’t remember inflation. Realistically the US inflation rate is about 12% right now, if it’s calculated using the same measures used back in the day. If that stays up for six years, each dollar in circulation loses half of its value. The whole economy goes through what the Ethereum community has gone through in the last few weeks, over a couple of years. It is devastating. In theory wages adjust to keep up with inflation, and interest rates too. Practically speaking without strong labour unions to negotiate, workers get poorer year-on-year. Pair that up with the global supply chain crisis, including food, and you’ve got a recipe for global disaster. The middle class don’t just get to retreat into the virtual world. There’s nothing there that anybody wants. No, we have to stand our ground and fight for the real world: it’s where we live. The metaverse is not going to save us. 
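The compounding arithmetic here is easy to check; the functions below are just the standard purchasing-power calculation, not figures from the post:

```python
import math

# Purchasing power of one dollar after n years of steady inflation.
def purchasing_power(rate, years):
    return 1.0 / (1.0 + rate) ** years

# Years for a dollar to lose half its value at a given inflation rate.
def years_to_halve(rate):
    return math.log(2) / math.log(1.0 + rate)
```

At 12% a year, purchasing_power(0.12, 6) is roughly 0.51, so halving takes about six years, which the rule of 72 (72 / 12 = 6) approximates.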
If we re-inflate the tech-hype bubble around the metaverse, it’ll produce a brief flare of innovation, some very inflated prices for assets which are inherently extremely volatile, and another flame-out like the one that we’re in, just a couple of years down the road (at most). I quite liked play-to-earn as a concept. It didn’t seem inherently bonkers to me that the age-old practice of gold farming could be modernized by crypto into something that allowed essentially independent operators to do it in a DeFi way. But it’s stuck in the Perez spin cycle, along with everything else, and people in developing world countries are losing their entire livelihood in this chaos. We have to change how this works. We have to take the hard path now: we have to start building the kind of wealth that Warren Buffett can understand, the rebuilding that outfits like Southwest Airlines specialized in. We have to learn how to make money.",https://medium.com/humanizing-the-singularity/build-back-better-8a1be80623b2,,Post,,Explainer,,Metaverse,,,,,,2022-07-07,,,,,,,,,,,,,
|
||
Mattereum,Mattereum,,Medium,,,,,,,The Best of Both Worlds: Solving Doctorow’s Oracle Problem,"So while the blockchain space is in some abstract sense perfectly private and perfectly reliable, things are weakest at the joints. The on-ramps and off-ramps are parts of the real world, they’re tied to physical reality and KYC/AML/CTF regulations.","The Best of Both Worlds: Solving Doctorow’s Oracle Problem Cory Doctorow has been thinking about Web3 lately. He just wrote a long piece about the Oracle Problem. In this article we’re going to get to the bottom of the Oracle Problem, and how Mattereum’s solution to the Oracle Problem — multi party risk splitting with full legal enforceability — lets customers buy and sell valuable physical items as NFTs with strong buyer protection. Solving the Oracle Problem in specific domains opens up huge possibilities for trade. Let’s go through Cory’s critique, and then present some answers to his questions. The Inevitability of Trusted Third Parties The search for a crypto use-case continues. doctorow.medium.com In one section of “The inevitability of trusted third parties” Cory Doctorow quotes extensively from one of my comments on the Well’s State of the World 2022. I said: Here is an NFT on OpenSea which gives the owner the right to physically take delivery of a 1oz gold bar currently vaulted in Singapore, or its financial value. https://opensea.io/assets/0x495f947276749ce646f68ac8c248420045cb7b5e/47824387705324153400210554042155132922682187088261737780213014306821163188225 [...] Here is the full legal text of one of the Ricardian contracts: https://passport.mattereum.com/ntfa.20210319.20.alpha.004.619263/06_carbon/assets/out/certification-contract.html This one is a contract between the NFT owner and my company, which guarantees that we have bought-and-retired carbon credits to cover the physical mining of the gold bullion that is being sold. 
It also covers the CO2 emissions of the NFT issuing process. Clause 20 has the arbitration machinery. We've worked fairly closely with the UK government on arbitration rules for blockchain asset disputes. https://www.pinsentmasons.com/out-law/news/new-dispute-rules-envisage-direct-to-blockchain-enforcement-arbitral-decisions The rules themselves are here: https://35z8e83m1ih83drye280o9d1-wpengine.netdna-ssl.com/wp-content/uploads/2021/04/Lawtech_DDRR_Final.pdf We get a name check on page 4. So what's being built out here is a very tightly bound legal framework for buying and selling physical goods, with suites of Ricardian contracts creating legally-enforceable claims about what the goods are **DRAWN ON THIRD PARTIES**. Those third parties do not benefit from the sale of the goods themselves, they make a living providing legal warranties on the goods - they're essentially third party inspectors with no economic interest in the situation other than by selling insurance on the fact that something (for example) contains no slave labour. Cory then continues: But the more I think about this smart contract, the fewer reasons I can find for it to be a smart contract instead of just, you know, a contract. As Vinay Gupta — the creator of this system and someone I have a lot of respect for — says, right there in the text, the entire system relies on third party arbiters to make sure that the parties involved don’t rip each other off, that no one uses slave labor, that the carbon is all offset through various schemes and so on. The point is, all of that could be a deception. The only reason to trust it is that you trust the auditors who have signed the scheme. I don’t mean to pooh-pooh the role of auditors here. Auditors are great. In fact, we’re in the middle of several kinds of serious economic crises because we allowed auditors to become an oligopoly of hopelessly conflicted megafirms, and they are cheating like crazy, and so much of the world is broken as a result. 
We know that the big auditing firms are riddled with fraud. We know that carbon offsets are among the most fraudulent instruments that companies make. I don’t get how blockchain fixes any of this. How blockchain fixes some of this So let’s talk about how blockchain improves the situation: not a blanket fix, but an economically-efficient risk reduction across the board. Alan Turing named the Oracle Problem in 1937, and people have been wrestling with it in the Ethereum space since the early days of the platform. The Oracle Problem If you’re reading this article, you probably already realize that smart contracts have incredible potential to improve… medium.com The state of the art in Ethereum Oracles is pretty sophisticated and pretty successful. What Is the Blockchain Oracle Problem? Why Can’t Blockchains Solve It? The blockchain oracle problem is one of the most important barriers to overcome if smart contracts on networks like… blog.chain.link Part of the reason the Oracle Problem is hard in blockchain is the old computer security adage, “things are weakest at the joints”. The pathway by which information from the physical world comes into the blockchain smart contract is a potentially-weak joint between physical world systems and ultra-reliable blockchain systems. The blockchain systems are almost transcendentally reliable: this is why the strain concentrates on the Oracles, and on key management (we’ll come back to key management below). The computer does not break, but you can get at its inputs. Blockchain reliability is not a magical process The blockchain systems are theoretically reliable because of massive redundancy. It’s a bit like flight computers, where there are five independent systems. 
Five Where Only One is Needed: How Airbus Avoids Single Points of Failure — Right Attitudes In my case study of the Boeing 737 MAX aircraft’s anti-stall mechanism, I examined how relying on data from only one… www.rightattitudes.com All of the flight computers have to agree for it not to be A Problem. In the blockchain space there are thousands of computers forming the consensus on the current state of the system. This is very inefficient, but it is very effective. Given inputs will produce the same outputs very, very reliably. This redundancy method produces insulation from the statistical problems of the real world. This is a very important point. A non-redundant system can get hit by a cosmic ray or a stray particle produced by normal radioactive decay. It can go down from a power outage and go offline. Non-redundant systems are flakey. There is no cure for this. Redundant systems are inefficient. There is no cure for this either. This is the law. The massively redundant blockchain systems are decoupled from the randomness of life to form a statistical approximation of a perfect system. You can program the Ethereum metacomputer — a virtual machine made of thousands of regular computers in a redundant array — as if it was perfect. You can just totally ignore how the physical world works when programming this virtual machine: does not get hacked, does not have physical hardware problems. Any individual node in the metacomputer which develops issues drops out of the consensus. The tens of thousands of remaining machines go on. As a programmer, the massive redundancy means you completely stop worrying about hardware problems or failovers. There is no “error handling.” Nothing ever crashes. It just works. There is a downside to this redundancy. The archaic Bitcoin proof of work mechanism would cost about $750,000,000 to carbon offset every year, if we even thought CO2 offsetting worked. There’s good redundancy and bad redundancy. 
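The redundancy argument can be made concrete with a toy majority vote; the replica counts and fault model below are invented for illustration and are nothing like a real consensus protocol’s machinery:

```python
from collections import Counter

# Evaluate the same function on many replicas; a faulty minority
# returns garbage, but the consensus answer is the majority output.
def replicated_eval(fn, x, replicas=1000, faulty=30):
    results = []
    for i in range(replicas):
        out = fn(x)
        if i < faulty:  # this minority of replicas misbehaves
            out = out + 1
        results.append(out)
    # The most common result wins, as in an idealized consensus.
    return Counter(results).most_common(1)[0][0]
```

replicated_eval(lambda x: x * 2, 21) still returns 42 even with 30 bad replicas; the cost is that 1000 machines did the work of one, which is exactly the inefficiency-for-reliability trade described above.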
It’s unfeasibly expensive to do this the wrong way. But the Avalanche system runs on less than 500 tons of CO2 a year, it’s hundreds of times faster than Bitcoin or Ethereum, and it’s been available for a year. So massive redundancy is one half of what produces the effective decoupling of the blockchain from physical reality: the system is so redundant that there can be chaos in the physical world and the blockchain soldiers right on. It is not efficient (yet — we’ll cover proof of stake systems below) but it is elegant. Now let’s look at the other half of what separates the blockchain from physical reality. The other half of the decoupling of the blockchain from physical reality is True Names. You can download the Metamask wallet, create a key pair, and proceed to perform transactions on the blockchain without ever giving anybody any additional information about you at all: you’re an unaccountable ghost. Your legal identity is not tied to your actions. This kind of power was a Cypherpunk dream for decades. Vernor Vinge’s True Names explores the boundary between the cryptographic metaverse and the physical world in minute detail. It is a massively important formative work. In that world, nobody can bind your metaverse avatar to your physical body: the system protects your privacy. But if the link is made, and your True Name is cracked, you can be shut down — or even killed — in the physical world. In the blockchain environment, nobody has to know your True Name. This turns out to be a lot more of a mixed blessing than it was in SF novels. Of course, in practice, the True Name protection on the blockchain is weak for three reasons: - If the actions you take benefit you in the real world (“I mail ordered this book”) then you are bound to your name and address for delivery. If you cash out money you want to spend, you’re tied by banking. If you rent an apartment, you’re tied to the address. 
- Funding a cryptocurrency wallet through crypto exchanges requires KYC/AML/CTF checks, just like regular banking or even more strict. This enables a “follow the money” approach to de-anonymizing people who later receive those funds, even if those people believe themselves to be anonymous. The crypto cash is tied to your real identity from the get go. - You’re still going to have to pay taxes. So while the blockchain space is in some abstract sense perfectly private and perfectly reliable, things are weakest at the joints. The on-ramps and off-ramps are parts of the real world, they’re tied to physical reality and KYC/AML/CTF regulations. Furthermore, when a user wants blockchain software to act on their behalf in ways which touch the real world, the blockchain’s security model breaks: the system acting as an extension of your will is no longer just the massively redundant blockchain. Now it’s a bunch of messy, regulated, human or Web2 systems. You and your software agents are no longer entirely anonymous because the real world requires named actors with KYC. Any system connected to the real world is going to have some of the problems of the real world connected to it too. That’s just life. Of course, there is the key management problem, and the software reliability problem. If you lose your key, or more likely somebody hacks the device on which you store your key, your blockchain identity and all its assets now belong to someone else. Hardware wallets help a lot, but no system is perfect. And if the software you write and upload to the chain — the smart contracts — have a bug? God help you. It’s a machine, and it’s utterly unforgiving. It’s more like programming aeroplane avionics systems or satellites than web pages. Here be dragons! The best of both worlds: Oracles that work So we’ve looked at the problems of connecting blockchain systems to the real world: reliability and anonymity break down, and the core benefits of the blockchain are diluted. 
But there are pragmatic systems that find advantages in connecting the blockchain and the real world. I’ll briefly cover two: Chainlink (a big Web3 project of some note), and Mattereum (my own company, as described in the Well State of the World quote above.) Chainlink Consider the Chainlink oracle system. What Is an Oracle in Blockchain? “ Explained | Chainlink Blockchain oracles are entities that connect blockchains to external systems, thereby enabling smart contracts to… chain.link A dazzling array of partner organizations with access to high quality data about things like weather conditions, stock prices and plane departures use a cryptographic and financial architecture to publish their data into the blockchain. They’re standing on their reputations: if they input bad data, they’re going to pay. There are complex “staking” arrangements coming this year which further automate this access to data. Staking Is Coming To The Chainlink Network In 2022 In a recent presentation about the future of the Chainlink network, Chainlink Co-founder Sergey Nazarov detailed… chainlinktoday.com So that’s the True Names side of the bridge: in the Chainlink system, people publishing data are identified by their True Name. Only worthy partners are allowed. The security side is more complex: Trusted Computing infrastructure signs transactions, multiple data feeds are integrated into an oracle feed, no data is single-pathed. It’s not quite the fully redundant system of the blockchain, but it’s a lot of cryptographic hardware and a lot of data source diversity. Not quite the same statistical guarantees as the blockchain itself makes, but a lot better than you would get with a conventional API-based price feed or other oracle. It’s a bridge: not as squishy and unreliable as real-world APIs are, with lots of single points of failure that hackers could attack. But not as utterly, relentlessly infallible as the Ethereum metacomputer. 
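One way to picture the multi-feed integration is median aggregation, a common choice in oracle networks for resisting outliers; the feed values below are made up for the sketch:

```python
from statistics import median

# Combine several independent feeds; the median ignores a minority of
# wildly wrong (or malicious) values without any trusted coordinator.
def aggregate_feeds(feeds):
    return median(feeds)
```

aggregate_feeds([100.1, 99.9, 100.0, 100.2, 5.0]) returns 100.0: the corrupted 5.0 feed is simply outvoted, so no single data source is a single point of failure.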
Better than what we have now: Chainlink manages the Oracle Problem well enough to produce a highly functional system, without some kind of Science Fiction fix. It’s just a lot of hard work and good engineering by teams of smart people over years, building the most fraud-resistant and functional thing they can. Actual Machines, not Fucking Magic, as they say. So that’s one end of the pragmatic solutions to the Oracle Problem in the blockchain space: highly automated systems for doing things like price feeds using trusted computing hardware to guarantee data integrity. Mattereum Now let’s look at my own company, Mattereum. Mattereum operates all the way at the other end of the automation landscape: we rely on domain specialists such as curators, art historians and collectors to evaluate physical items. These physical items are being sold as NFTs! The model is: - An issuer puts the physical item in a vault with a set of third party evaluations provided by those experts - The issuer then publishes an NFT which gives the owner of the NFT the right to the item in the vault, akin to a coat check, but secure - The NFT owner can cash in the NFT for the item in the vault. It’s a system for doing real world trade in valuable things with a huge reduction in the risk of fraud: better expert opinions than one would usually get, and intermediate owners (e.g. dealers, auctioneers) are never touching the physical item. It stays securely in the vault. We’ve worked on quite a few assets already, such as a selection of props and memorabilia owned by legendary actor William Shatner, a 350,000 year-old hand axe, and a signed first edition by Gonzo icon Hunter S. Thompson. These physical asset evaluations are performed by curators with relevant domain expertise and presented as legal warranty contracts — pen and ink type legalese — bound into the blockchain using a legal/technical construct called a Ricardian Contract. 
Not a lot of people seem to have heard of Ricardian Contracts these days: they were widely understood 5 years ago, but the number of people in the blockchain space has grown by something like 10000x since then and the Old Wisdom has become diluted. Invented in the 1990s by Ian Grigg for managing a physical gold reserve with a cryptocurrency system, the Ricardian Contract is the key to understanding the new blockchain ecosystems. What are Ricardian Contracts? A Comprehensive Guide The new EOS Ricardian contracts and the introduction of a new type of digital agreement raised many eyebrows as well as… 101blockchains.com Ricardian contracts are paper contracts, which use international arbitration law to appoint technically sophisticated arbitrators to manage cases involving a hybrid legal/technical system. Those arbitration laws carry hundreds of billions a year of settlements, so they’re well used and well understood and stable: the system has run since 1958. Using those systems to make legalese contracts signed and paid on the blockchain arbitrable in the real world by real courts is the Ricardian Contract breakthrough. It’s another bridge: an extremely robust legal framework for getting enforceable legal rulings in complex areas like maritime law or civil engineering, carefully paired up to blockchain smart contracts. What does the blockchain bring to the table in this instance? Well, the payments are taken on chain, using digital signatures, which are universally accepted as legal signatures these days. Actual statutes legislate that in almost all countries. Those statutes in each country are one part of the bridge, built out 20 years ago for general purpose use of digital signatures. But what this means is we have an instant and immediate settlement: the warranties on the physical object, and the ownership transfer, happen in a single atomic moment with the technical, legal, and financial transaction perfectly coordinated because they are the same transaction. 
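The single-transaction property rests on binding the legal prose into the on-chain record, typically by hash. A minimal sketch of that binding, with illustrative field names rather than Mattereum’s actual schema:

```python
import hashlib
import json

# Hash the human-readable contract and embed the digest in the machine
# record, so signing the transaction also signs the exact legal text.
def ricardian_record(legal_text, buyer, price_wei):
    digest = hashlib.sha256(legal_text.encode()).hexdigest()
    return json.dumps({'contract_sha256': digest,
                       'buyer': buyer,
                       'price_wei': price_wei}, sort_keys=True)

# Anyone can later check that a given prose contract is the one bound
# into the record: any edit to the text changes the digest.
def verify_record(record, legal_text):
    expected = hashlib.sha256(legal_text.encode()).hexdigest()
    return json.loads(record)['contract_sha256'] == expected
```

Because the digest travels inside the same signed payload as the payment, there is no separate paperwork step that could fall out of sync with the transfer.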
Remember I said “things are weakest at the joints” — well, in this case, there are no joints. The legal, technical and financial transaction are a single transaction, there’s no way to wind up with a thing half-insured or stuck in an indeterminate state waiting for more paperwork. It’s a “performative speech act.” You say “make it so” and it is so! That will stand up in court in 160+ countries because of the arbitration framework described above. It’s not an end-run around “the system” — it is an extension of “the system.” It’s a bridge! The legal system and the blockchain system can, with due care, work as one system. In the beginning, Cory asked “why isn’t this just a contract?” The answer is “it is a contract, but it’s a contract embedded in a Ricardian context — one half in the world of automated smart contracts, and the other half in the world of courts.” If you do it right, when you put legal contracts on to the blockchain, you get the best of both worlds, and that’s what we did. Some examples of our work. A gold mood ring, one of the first issued, recently sold as an NFT for $65,000. https://passport.Mattereum.com/goldmoodstonering.1975/ https://www.PRNewswire.com/news-releases/the-mood-ring---a-cultural-icon-re-emerges-as-smart-wearable-nfts-301414759.html An original Andy Warhol print of Alexander the Great https://opensea.io/assets/0x1E65Ad4fB60844925Ed95Dec8460d8DE807B2BC4/0 https://passport.Mattereum.com/Alexander_Screenprint_Warhol_01/ About Mattereum London-based Mattereum was established in 2017 by a trans-disciplinary team with a track record in designing and launching nation state-level infrastructure and headed by former Ethereum release coordinator Vinay Gupta. With blockchain-enabled authentication and tokenization of physical assets with built-in legal guarantees, Mattereum removes the fear, uncertainty, and doubt that has plagued digital commerce for decades. 
Follow us as we bring the Mattereum Protocol to an expanding variety of markets ranging from memorabilia, gold, wine, to prized classical instruments, and more. More at: http://www.Mattereum.com Twitter: https://Twitter.com/Mattereum",https://medium.com/humanizing-the-singularity/the-best-of-both-worlds-solving-doctorows-oracle-problem-3287cda2e48b,,Post,,Explainer,,,,,,,,2022-01-31,,,,,,,,,,,,,
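The coat-check mechanics this record describes — an NFT that is the right to a vaulted item, with expert warranties travelling with it, and ownership, warranties and payment settling in one atomic step with "no joints" — can be sketched abstractly. The following Python sketch is illustrative only: the names (`Warranty`, `VaultedAsset`, `settle_sale`, `redeem`) are hypothetical and are not part of the Mattereum Protocol or any real smart-contract API.

```python
from dataclasses import dataclass

@dataclass
class Warranty:
    warrantor: str  # the expert staking their reputation and money
    claim: str      # e.g. authenticity, provenance, condition
    stake: int      # financial backing for the claim

@dataclass
class VaultedAsset:
    description: str
    warranties: list   # expert warranties bound to the asset
    nft_owner: str     # current holder of the coat-check token
    in_vault: bool = True

def settle_sale(asset: VaultedAsset, buyer: str, payment_ok: bool) -> list:
    """Atomic settlement: token, warranties and payment move together.
    Either everything happens, or nothing does -- no partial state."""
    if not payment_ok:
        raise RuntimeError("payment failed: no partial state is created")
    asset.nft_owner = buyer          # ownership transfer...
    return list(asset.warranties)    # ...and the warranties travel with it

def redeem(asset: VaultedAsset, caller: str) -> str:
    """Cash in the NFT for the physical item held in the vault."""
    if caller != asset.nft_owner or not asset.in_vault:
        raise RuntimeError("only the current NFT owner can redeem a vaulted item")
    asset.in_vault = False
    return f"{asset.description} released to {caller}"
```

The point the sketch makes is the one in the text: because sale, warranty transfer and redemption are single operations, there is no intermediate state where the item is half-insured or awaiting paperwork.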
|
||
Mattereum,Mattereum,,Medium,Ian Simmons,,,,,,The Blockchain Sleep of Reason: Why the blockchain works for the young and old people hate it,There is practically no boomer control of bitcoin or Web3,"THE BLOCKCHAIN SLEEP OF REASON Why the blockchain works for the young and old people hate it By Vinay Gupta, CEO, Mattereum The younger generation suck at explaining the blockchain. No wonder there’s so much fear, uncertainty and doubt in the space right now. Let me explain what’s going on, but this time we’re going to leave the really important parts of the picture in the frame. You’ve got to start in 2008 with the global financial collapse. Since then, interest rates have hovered around zero as governments pumped cash into the global economy to keep it running. For a long time, no inflation. The extreme medicine was working. Then covid, and 5% inflation. During that period with no inflation, and tons of money printing, there was very little economic growth. If there had been, inflation would have started then — the economy picks right up, credit risk goes down because lending is less risky, and prices start to rise on the cash. So what happens to people who are turning 20 or graduating college in 2009? Nothing. Nobody will lend them money to start businesses, VC becomes extremely conservative and risk-averse, and Silicon Valley starts to consolidate. Life is slowly getting harder, and the young are v. poor. Actually take a look at this thing: There’s just a titanic concentration of wealth in the hands of older people, and the young are living on scraps. This is absolutely brutal: feels like civilization is going backwards. The young are terrified. So in this situation what the young need is economic opportunity and THEY ARE NOT GOING TO GET IT IN ANY RESOURCE POOL THAT THE BOOMERS CONTROL. There’s just no room to expand: whatever you want to do, there’s a tax: boomers control media distribution, VC finance, real estate. 
Most of the easy niches in the dotcom world are already taken: people shop, they read the news, they listen to music, they date — and big dotcoms are already in all of those niches. Nobody is going to overthrow reddit and FB anytime soon; they’re solidly locked into place now. This is a simple problem of access to capital: land, patents, intellectual property, distribution networks for physical products (Target, Walmart). The young don’t have enough cash to BUY INTO THE SYSTEM. They’re so poor they can’t even get exploited. Now do you see Bitcoin’s fix? Bitcoin allows its adherents an escape from the boomer-ridden conventional economic landscape. What is bitcoin’s core feature? IT’S TOO HARD FOR BOOMERS TO USE. So you can build an economy for the young, a parallel state to the Boomer Dollar dominated US political landscape. This is one of those Voice, Loyalty and Exit things. The young complain: nobody hears them. They try and tough it out in existing cultural blocs: still can’t make rent. But to exit the boomer-controlled old economy into the btc/Web3 space? Wow: a huge self-financed bubble grows. There is practically no boomer control of bitcoin or Web3. I’m 50, and I’m one of the oldest people in the space — Ian Grigg, David Chaum, Joe Lubin, Don Tapscott, a handful of others — but the old guys are rare and (apart from Joe) not that powerful. Young blockchain frontier. Now, young people, they are not so smart. I was young myself. They make mistakes. Big ones. Now we have Oh So Many young people running around with a huge pile of capital that they basically created by believing each other’s stories. It’s a creative pact. A NEW SOCIAL CONTRACT. So this New Social Contract (NSC), let’s examine things like Bored Apes. In this contract, kids who grew up with an iPad in their hand acquiring virtual property in games and trading it with friends, go from renting virtual property from Farmville to creating and owning it. Why is an Ape valuable? 
The younger generation decided it was valuable. Why isn’t this golf course valuable? The people that thought golf was at the centre of the universe are all too old to play much. This is a generational turnover of wealth, an escape from a locked-up world. Because there’s practically no law in (for example) defi or the NFT game. The ICOs ran afoul of old school financial regulation. But the NFTs are bang in the middle of open ground: you draw a thing, the government says you own it outright if it isn’t Mickey Mouse. You sell it. So now you have tons of cultural production — young people making art for young people — but the distribution channel is OpenSea, and that’s also owned by… young people. And their fees are light. The younger generation are Exiting the Boomer Machine to live on their own terms. 25 years ago, John Perry Barlow (I met him once!) wrote A Declaration of the Independence of Cyberspace: Does any of this sound familiar to you? Now let’s discuss the New Social Contract’s critics. Who’s getting so het up here? Why all the hatred for Web3 these days? Let me tell you where the action is: it’s the people who are waiting around to inherit their parents’ houses who are telling you that this NFT thing is no good and YOU SHOULD GET BACK INSIDE OF THE MACHINE WHICH WILL ONE DAY PAY MY PENSION. Because this “defection” hurts them. The more the young define their own system with their own rules, the more the old are missing out on the deal. But they’re mostly not watching this: Warren Buffett is an exception, and he called BTC “rat poison” remarkably early. But he’s the rat. But mostly the old missed it. Then the bridge generation, the one doing the complaining, clogging the airwaves with NFT hate and trying to coerce artists and brands into staying away. What do they have to lose? The answer is: they’re late to the party, and were hoping to WIN AT THE OLD GAME in the future. 
The well positioned chattering classes, the media and academics, want to inherit the system that the young are refusing to join and actively working to undermine. They don’t have real power yet but they know if the system stays up, in 20 years they’ll run the next TED/Davos. And these slightly younger defenders of the Status Quo are fighting like hell to defend the Status Quo only because they expect to inherit it. If the cultural production system burns to the ground before they inherit, they get nothing. All the value’s skipping a generation. So when you see somebody wailing on NFTs, ask the hard question: “how old are they, and what is their stake in the old system?” This is not to say that NFTs and crypto are perfect: young people made them without much reference to the Old Social Contract and they worked very fast. I was under 30 in the 1990s when I got involved in cryptocurrency. By the time Ethereum rolled round I was one of the oldest members of the team: a stabilizing influence, perhaps. But I was on that side because I had nothing to lose: I had no property, no stake. Open Source guy! Now I’m plotting a global environmental and economic reform which, even if it works, I will not likely see the end of unless I live for a very long time. I’m working for future generations now. (Medium.com/humanizing-the…) I should probably also mention http://myhopeforthe.world/ which is a rough summary of how I spent my life before lucking into Ethereum and becoming the Launch Coordinator for the 2015 launch.",https://medium.com/humanizing-the-singularity/the-blockchain-sleep-of-reason-b28f3bd3f83b,,Post,,Explainer,Web3,,,,,,,2022-06-09,,,,,,,,,,,,,
|
||
Mattereum,Mattereum,,Medium,,,,,,,Winter Has Come,"crypto wins by solving problems that nobody else can solve, profitably. It has to win at three levels to survive:<br>- Ordinary people have to use it<br>- It has to generate actual value, not just move value around<br>- Governments have to tolerate it or use it themselves — either one will do<br>","Winter Has Come Vinay Gupta, CEO of Mattereum and Ethereum launch co-ordinator in 2015 looks at the latest Crypto Winter and how we got here. Once again, crypto winter is upon us. Depending on how you count, this is my fifth or eighth. Back in the 1990s there was ITAR and the crypto wars, where the US Federal Government basically killed the industry in America, back when crypto was a thing you did to emails, not a shorthand for privately issued anonymous digital cash. The early days of bitcoin, when it hit thirty-something dollars then crashed to four and stayed there. The 2016 DAO crisis. I’ve seen it all. I want to talk about how we wound up where we are today, and how we get off this nightmare ride of boom-and-bust cycles which are psychologically trashing our industry and making it so hard to build enduring value. We can stop the bleeding, but we have to be smart. The psychology of the boom-bust innovation cycle. Here’s the piece I wrote in 2017 about the 2018 crypto winter. Some of you may remember it. It lays out the same basic case I’m going to make here: crypto wins by solving problems that nobody else can solve, profitably. It has to win at three levels to survive: - Ordinary people have to use it - It has to generate actual value, not just move value around - Governments have to tolerate it or use it themselves — either one will do If we can’t hit all three of those criteria, WANGMI. We’re building a technology that needs a billion regular users to survive. The constants are huge: big engineering teams, big marketing budgets, complex legal and regulatory work. 
It all costs money and the only thing that can sustain the industry is real economic growth. It is still Warren Buffett’s world — literally, he owns it. His way of analyzing the world is to look at the fundamental valuation and profitability of assets. He’s so good at it, his company has roughly the same market cap as Ethereum had a couple of years ago. But Buffett’s company doesn’t 10x up and 4x down every couple of years: it’s plodded along fairly reliably for decades, incrementally growing value. Now, you’ve gotta ask yourself, “why can’t Warren Buffett see crypto?” And it’s not because he’s old or doesn’t get tech. No, it’s because he’s smart about financial fundamentals. That’s all that boring stuff like: - How much food will come out of the ground? - What’s the global market for machine tools going to look like next year? - How fast are people moving to the cities? Thinking about this stuff really carefully is how you figure out when a real-world asset is under-valued. Buying things which are under-valued is how Warren Buffett makes money. But Warren Buffett won’t buy bitcoin at any price. So we have to think about that. What do we know that Warren Buffett doesn’t? Now let me introduce another big thinker about finance: Carlota Perez. Perez is good at valuing bubbles. Specifically, she’s good at thinking about how bubbles create value in the long run even though they all pass through horrible periods like this. Source: https://avc.com/2015/02/the-carlota-perez-framework/ Source: https://www.lesswrong.com/posts/oaqKjHbgsoqEXBMZ2/s-curves-for-trend-forecasting What Perez says, roughly, is that bubbles build infrastructure. There’s a massive surge of irrational enthusiasm during which water flows uphill, capital flows into the most improbable things, and the future seems so close you can touch it. Warren Buffett hates this stuff because it involves guessing about the future. 
During that initial lift, during the innovation burn period, the G-force is like sitting in a plane at take-off. It hits you right in the base of your spine. Here we go! Then a little later nearly everyone goes bankrupt. Then a bunch of grown-ups who know how to run businesses move in, buy up all the assets cheap, and turn all of this technological potential into profitable businesses. It’s a fundamental change in the nature of the people that run the industry: it goes from innovators to operators. Risk tolerance goes way down because the technology already works but it needs people who can operate it profitably. My company, Mattereum, is in the middle of that process of transformation right now as our business team takes more and more responsibility for running the business, and the original innovators are now more integrated into product development and sales. Usually companies don’t make that transition under their own power — somebody has to come in and fire the CEO — but in our case we knew enough about how businesses really run to make those changes under our own power (without anyone firing me). I’ll introduce a few members of our new senior leadership team in another Medium post shortly. So we’ve been through basically six rounds of the Perez cycle in crypto in the past 15 years. These things follow each other thick and fast. An enormous amount of underlying technological, financial and even social innovation has happened. - Bitcoin - Altcoins - Ethereum - ICOs - Defi - NFTs And at the end of that we’re left with a rough copy of about a third of the global financial system. You could just about put together a mortgage loan and buy a house as an NFT if you really put your mind to it. It would take work and connections, but you could just about buy real estate as an NFT today, with financing. Mattereum is of course going to make that a lot easier real soon, but let’s deal with things as they are rather than as they’re going to be. 
So in every one of these rounds of the Perez cycle we can see the future as clearly as all the folks who went bust installing “dark fiber” optic cables in the ground. About 20 years ago companies ploughed billions into the ground to lay fiber optic cables that nobody needed or would pay for. It was THE FUTURE. Then everybody went bankrupt, grown-ups bought the assets for a song, and slowly started to operate the fibre profitably. Same thing happened with airlines: again and again and again Southwest Airlines acquired bankrupt airlines for a song, with planes and routes, and operated them profitably. In crypto we run this cycle super fast and with the same people in the room on each cycle. But the field keeps growing and changing, and all these ups and downs result in a most “interesting” phenomenon: the All Time High. During the All Time High, at crypto events, every single person in the room has made money. Everybody. You bought in yesterday, you’ve already made $5. You bought in six months ago and you look like a genius. The money just seems to take care of itself. During a dip the “newbies” have lost money; the least socially connected 10% or 20% of the people in the room bought, then it dipped, and the old hands are buying at the current price. The newbies are a bit twitchy but the old hands are all looking greedy rather than fearful, and life goes on. During a crash, and all-importantly after a crash, 70% of the people in the room have lost money. Some of them have lost 70% of what they invested: they took real money out of bricks-and-mortar assets and bought hype tickets, and then got rekt, to use current parlance. And in these periods the old hands aren’t buying anything; they’re sitting on their winnings and waiting for the rally somewhere on down the line. The NFT space is now in this condition. At NFT.NYC in June 2022 the mood was mordant. People have lost a lot of money, and they wonder if they’re ever going to make it back. 
Remember 2018, Ethereum people? The ICO craze had blown over because the massive promises made by those projects had failed to materialize, and people were gloomy as all hell. A long time passed until Defi Summer in 2020, and then the 2021 NFT take-off a year ago. Most of the new folks in the space haven’t seen crypto winter before. 70% or so of the people in the room have lost money. The mood is down, down, down. Welcome to the Perez Cycle, kids. I’m sorry to be seeing you down here in the gutter one more time, but you’ll find a lot of old friends here, survivors of previous rounds of the cycle. We stick together.",https://medium.com/humanizing-the-singularity/winter-has-come-e56fcb667cfa,,Post,,Explainer,,,,,,,,2022-07-06,,,,,,,,,,,,,
|
||
Mattereum,Mattereum,,Medium,,,,,,,Bringing Truth to Market with Trust Communities & Product Information Markets,"With product information markets enabled by Trust Communities, we can incentivize truth in markets by allowing experts to make money on their expertise and face consequences for erroneous claims. Simple concept, powerful implications.","Bringing Truth to Market with Trust Communities & Product Information Markets The incentives in the global marketplace are utterly broken, giving rise to drastic inefficiencies, environmental harm and human rights abuses around the world. How do we create a market economy that doesn’t consume the world? Intro: Fix the Incentives, Fix the World As it currently stands, centralized institutions are incentivized not to secure truth in their products or services but rather to shield themselves from potential liabilities. While many companies offer warranties on their products, these are limited almost entirely to the primary markets. As soon as an object begins trading in the secondary markets, liability dissipates. The only other sources of truth available beyond the manufacturer are the specialist firms which rate and certify objects of a particular domain: fine art, collectible cards, instruments, etc. However, these institutions are limited in their capacity by their lack of a shared record of an object’s history. Best case there’s an entry in a single database. Worst case: a single paper certificate. This disconnected certification system and lack of initiative and coordination in securing product information has incredibly adverse effects on society: rampant counterfeiting, environmental harm, and human rights abuses. So how do we break out of the silos of separate “truths” and crowdsource expertise to build a better marketplace that doesn’t absolutely consume the world? Our answer: Trust Communities What are Trust Communities? 
Built on the Mattereum Asset Passport Before diving into this concept it is important to discuss the core component of the Mattereum Protocol: the Mattereum Asset Passport (MAP). In short, a MAP is a bundle of legal warranties tied to an object. While these warranties can vary with the object in question, the initial warranty is often some form of identification. Other warranties in a MAP may include authentication methods, carbon offsets, anti-slavery certification, tokenization (or connections to any smart contract or software system), and many others. These warranties are essentially “legal lego” of various contract terms that will range greatly between different asset classes and will accrue around assets over time. The warranties within a MAP are not informal handshakes on the internet but rather legally-enforceable terms of a contract that the buyer opts into, evidenced by cryptographic signatures which bind counterparties to the underlying legal agreement. All claims are backed by financial stake, giving all warrantors accountability and skin-in-the-game for their assessments. This framework also provides access to dispute resolution protocols in the event of systemic or commercial fallout via an arbitration clause in the contract. Later this year we will integrate UNN.finance risk management pools so that there is a DeFi-native way of supporting product sales using Mattereum. We are also building bridges with the regular insurance industry. This approach means a single asset can be protected by both DeFi-native and fiat-native risk management protocols. Once a MAP has been generated, we must then have a system in place to incentivize and secure this object warranty system over time without centralized institutions. 
Building Product Information Markets with Blockchain Commerce on the internet is both a technological and social phenomenon, so any proposed system must address the fundamental social and technical challenges that go into curating and securing product information. Socially, in the absence of centralized trust we must incentivize experts to apply their domain-specific knowledge to supply warrantied information around goods and services. Technically, we need a shared, open database for recording the provenance of an object and a programmable transaction system which powers the aforementioned incentive mechanism. This system must also overcome the transaction costs of bringing in experts (who are not dealers) to rapidly and efficiently deploy their expertise. Mattereum Trust Communities With blockchain and smart contracts, we can power commons-based peer production of product information with built-in crypto-economic incentives and clear lines of accountability. With product information markets enabled by Trust Communities, we can incentivize truth in markets by allowing experts to make money on their expertise and face consequences for erroneous claims. Simple concept, powerful implications. Let’s look at some examples. Eliminating Counterfeit Goods According to a joint report by OECD and EUIPO based on 2016 data, counterfeit and pirated goods accounted for over $500 billion in international trade in that year alone: up to 3.3% of world trade! The markets affected include common consumer goods such as clothing and electronics, B2B products like chemicals and construction materials, luxury fashion, fine art, and even pharmaceuticals, food, and medical equipment. This scale of fraud doesn’t just burn the top 1% with a forged Monet or Picasso discovered in their collections. This can have a real impact on public health, safety, and infrastructure. The product information market method could adapt to ALL of these industries at the consumer and business level. 
The main difference is where the Asset Passport is generated in the product lifecycle. With consumers, we can generate MAPs to secure the storied value of unique or limited objects such as art, instruments, cultural artifacts, or precious metals, leveraging a network of third-party, independent certifiers with relevant expertise to stake their assessments via warranties, challenge any existing claims, and even earn from the continued sale of these items. At the business level, we could embed MAPs within corporate supply chains even before the manufacturing stage to have a truly transparent and information-rich view of the entire product lifecycle, from design and specifications, to production, to purchase and resale. Cryptographically verifiable by any and all parties. Think of something like Kickstarter: you put money in at the design stage and get an NFT. You can resell that NFT as the product approaches physical delivery. The new owner of the NFT will get the final product when it ships: ideal for rarities and exciting new drops. Most industries feature third-party expertise outside of the key corporations such as certification firms and engineering tear-downs. With Mattereum Trust Communities, we can build product information markets that aggregate and incentivize the use of this underutilized knowledge, restoring value, increasing liquidity, and bringing peace of mind to trade at all scales. Powering the “Lean Green” Circular Economy Before we can ever achieve an ultra-efficient and sustainable circular economy of goods, we must first solve for the deep lack of trust in online transactions. Just as liability dissipates when goods reach the secondary market, so does their value. E-commerce platforms like eBay and Amazon provide incredible reach with their two-sided marketplaces, but the frequent fraud, dodgy dispute resolution procedures, and high fees severely hinder the valuation of secondhand goods. 
The “doubt discount” runs far and wide in these markets, where the vast majority of all material goods reside. As an example, let’s look at a useful item in virtually any setting: the Honda EU20i portable electric generator. By reputation, these generators run forever: it’s the Rolls-Royce of small generators. On eBay, they’re roughly 60% discounted from new, even if they’ve only been run a few hundred hours. Enter our Trust Community experts: Examine the generator. Write a warranty on its expected future lifespan. If it breaks down before that, the buyer gets paid. The simplest form: “I’ll buy this generator for $1000 if it breaks down in the next two years.” This gives the expert the chance to repair it and resell it. Simple market mechanisms. The result is that secondhand generators are worth a lot more if they last. Garbage units can’t be warrantied, so they don’t attract higher resale prices. This produces a flight to quality in the market, reducing the overall number of failed generators sitting in the scrap heaps. This has two benefits: those goods which are produced in a re-identifiable form and can be warrantied will sell for higher prices initially, because of more stable, higher secondhand value, and these goods will provide more hours of service for lower landfill cost than low-quality goods. The “secondhand circular economy” then kicks in. There are fewer unused cheap-and-nasty generators gathering dust in garages, because people buy a good generator secondhand when they need it, and sell it again for nearly what they paid for it — a rational price given how long these things last. Now imagine this system for any product of lasting utility. Using product information markets, the secondhand marketplace suddenly evolves from the place where value goes to die to the ideal means of acquiring goods of tested, durable worth. As a society, we could do more with less, increasing profitability in peer-to-peer economies and greatly reducing waste. 
Revealing the True Costs of Trade Trade globalization is rife with negative externalities. For the purposes of this article, let’s look at two examples of suffering imbued within the global machinery of trade, and how Trust Communities and Product Information Markets can help combat this systemic injustice. Climate With unchecked carbon emissions being an accelerant of climate change — which will have cascading effects throughout our agriculture, infrastructure, and general quality of life now and in the future — we can view this by-product of trade as contributing to suffering in the world. Despite the threat, there’s plenty of greenwashing and CSR/ESG theatre from the corporate sphere, as well as gaslighting of the populace. Global coordination to avert the worst-case scenarios in the future is a combined policy, social, and technological challenge: a veritable “wicked problem.” Policy and social awareness are beyond the scope of this article, so let’s look at supportive solutions on the tech side. Even though we have carbon calculators for individuals and businesses alike as well as marketplaces for purchasing carbon offsets, we still lack a fine-grained accounting of CO2 and energy costs across all of industry. (See, for example, the carbon footprint calculator for individuals and households at www.carbonfootprint.com.) For assets secured by the Mattereum Protocol, carbon offsets can be purchased through third-party marketplaces and be warrantied within the Asset Passport. For products which emit carbon throughout their lifecycle — such as gas-powered vehicles — multiple offsets can be purchased and warrantied over time. In a future where net-zero products gain a premium over non-net-zero goods and services due to demand and possibly mandate, Trust Communities can help facilitate Product Information Markets for end-to-end carbon accounting of trade and push us further to a greener future. 
Decentralizing this process is necessary to maintain accountability for all parties involved — in particular corporations and governments which have the largest carbon footprints and therefore the most responsibility. Slavery The most pernicious and hidden suffering of trade globalization lies with the millions of slave laborers in the world today. While exact numbers are of course impossible to count, estimates put the total number of slaves in the world today at 20–40 million, with 10 million estimated to be children and the vast majority being women and girls. The clothes on our backs. The minerals in our smartphones. The likelihood of anyone reading or writing this not owning a product touched by a slave is incredibly slim. The truth is that throughout the global supply chain, there are shadowy areas far removed from the corporate offices or warehouses where material and labor are sourced by stripping the basic rights and dignity of other human beings. (See Slavery Footprint — ‘How many slaves work for you?’ — at slaveryfootprint.org.) While fair trade products are prevalent, they are limited in number and scope. In reality, slavery-free trade and decent labor conditions should be the standard mode of industry rather than a unique offering. Similar to climate, the path forward for meaningful systemic change will be a combination of progressive policy, social awareness, and technological innovation. On the tech front, Mattereum is actively researching and developing methods of warrantying anti-slavery certification. In practice, an independent, expert individual or firm within a Trust Community could investigate the supply chain of a particular product for any labor abuses, either supplying the warranty or checking an active claim if there is any suspicion about its authenticity. 
The exact implementation of this is a work in progress, but we are joined in the effort by human rights activist and anti-slavery expert Helen Burrows to design the process in the context of the Mattereum Protocol. This work is a culmination of decades spent thinking about how to manage scarce resources on the planet such that people’s basic necessities of living are secured, even in the worst of situations. For a longer read, Mattereum CEO Vinay Gupta distilled his insights on the technological and social transformation of material culture in his book, The Future of Stuff, where he explores how material culture has made monsters of us all and how we can build a better path forward where humanity can live in equilibrium with the planet at the expense of no one. (The Future of Stuff by Vinay Gupta is available at www.amazon.com.) Better Markets Make a Better World The potential of Product Information Markets as described here is a reformation, rethinking, and possibly a path to redemption for the markets consuming the world today. If taken to the nth degree, Product Information Markets could create a multi-dimensional pricing mechanism beyond supply and demand: energy costs, material costs, CO2 emissions, labor hours, worker conditions, etc. Imagine if we were able to index and filter markets along these lines. We could virtually eliminate fraud, piracy, environmental harm, and root out human rights abuses all across the world. With this information at hand, people could engage in material culture in a way that does not conflict with their core principles. They can even configure tools to prevent them from contributing to negative externalities like accidentally buying products with unchecked carbon emissions or slaves in the supply chain: automated morality systems that overcome the mental transaction costs of simply not doing harm to society. 
This vision of a market economy which incentivizes truth and prices out societal harm does not require central command economies, heavy bureaucracy, or some undiscovered paradigm-shifting technology. It simply demands a will to change, and the technology to make sure that people’s decisions are correctly implemented. About Mattereum London-based Mattereum was established in 2017 by a trans-disciplinary team with a track record in designing and launching nation state-level infrastructure and headed by former Ethereum release coordinator Vinay Gupta. With blockchain-enabled authentication and tokenization of physical assets with built-in legal guarantees, Mattereum removes the fear, uncertainty, and doubt that has plagued digital commerce for decades. Follow us as we bring the Mattereum Protocol to an expanding variety of markets ranging from memorabilia, gold, wine, to prized classical instruments, and more. More at: http://www.Mattereum.com Twitter: https://Twitter.com/Mattereum Telegram: https://t.me/Mattereum",https://medium.com/humanizing-the-singularity/bringing-truth-to-market-with-trust-communities-product-information-markets-d09fb4a6e780,,Post,,Meta,,,IOT,,,,,2021-09-29,,,,,,,,,,,,,
|
||
Mattereum,Mattereum,,Medium,,,,,,,Decentralizing and Securing Collectible Card Grading Services with the Mattereum Protocol,"Avoid costly industry deadlocks and gatekeeping with a peer-produced, decentralized alternative to centralized collectible grading and authentication services with the Mattereum Protocol","Decentralizing and Securing Collectible Card Grading Services with the Mattereum Protocol Avoid costly industry deadlocks and gatekeeping with a peer-produced, decentralized alternative to centralized collectible grading and authentication services with the Mattereum Protocol On March 30, 2021, the collectible sports card world was shaken by the news that the gold-standard card grading and authentication service, PSA, had decided to close its doors to new submissions following a vast uptick in demand in recent months that had created an insurmountable backlog. Competing firms may enjoy a new wave of users from this demand, but as they operate on the same business model and processes — accepting incredibly steep submission and grading fees and using trained in-house personnel — this new window of profitability and growth may inevitably lead down the same path as PSA: incapacitation and immense loss of value. How can the collectible card world achieve and sustain a level of scalability, resilience, and quality in the expert certification process that can adapt to the ebbs and flows of a growing, global industry? Enter the Mattereum Protocol, a peer-produced, decentralized alternative to centralized grading and authentication firms that elegantly brings assets from zero history to storied value with built-in legal warranties and dispute resolution. Before diving into how Mattereum addresses these issues, let’s first unpack the incumbent business model. Monopolies | High Fees | Gated Markets Grading companies wield immense monopolistic power. A small number are held as industry standards. A favorable grading from them is much sought after. 
An unfavorable grading from them is fairly damning. A high rating by a firm like PSA can be the difference between a card valued at $500 versus $50,000 or more. Most of the card sales that make headlines with multi-million dollar auctions feature a PSA grade. In fact, a profitable cottage industry of specialist companies has arisen to offer predictions of what the PSA rating for a card may be. Occupying this indispensable position, grading companies can charge what they like for their services, comfortable in the knowledge that collectors and dealers will pay it. There is a fundamental imbalance in power here. Disputes over a grading, while not impossible, are very difficult, as grading companies are insulated by their Ts&Cs, which state that grading conforms to THEIR standards and that graders are authorized to exercise their judgement within these bounds. This is a double-edged sword. While it ensures judgement by established criteria, it also grants their subjective opinions on certain matters a great deal of weight. In the legal world, we call these sorts of arrangements contracts of adhesion: situations in which a single, powerful firm takes charge of the contract drafting process and is able to bend the agreements in its favor. This is a common instrument (among others) wielded by companies with monopolistic and gatekeeping tendencies. Yet, despite the value and reputation these firms have in the collectibles industry, the actual mechanics of their business are difficult to scale, which can be a stated justification for egregiously high fees. And so it goes. We propose a different way of doing things. The Mattereum Protocol in Practice Peer production of goods, services, and infrastructure has become a powerful phenomenon in the 21st century. 
With the monumental success of peer-produced operating systems (Linux), encyclopedias (Wikipedia), and financial infrastructures (Bitcoin, Ethereum, and other blockchains), it is not a stretch to imagine other industry verticals that could benefit from a similar paradigm shift. The ecosystem that has grown around collectible card grading and authentication is certainly ripe for disruption in a way that encourages economic decentralization and genuine market efficiency. To accomplish this, the Mattereum Protocol leverages an integrated legal and software system consisting of three main components: Mattereum Asset Passports, Trust Communities, and Real-World Asset Tokens. (For a more detailed guide to the protocol, we highly recommend referring to the Mattereum Product Walkthrough. For purposes of this article, we will mostly highlight the certification process.) The Mattereum Asset Passport In short, an Asset Passport is a bundle of legal warranties tied to an object. While these warranties can vary in accordance with the object in question, the initial warranty is often some form of identification (with data points provided by the owner). Other warranties that may accompany an asset like a sports card include a reference to the particular tagging system used for in-person authentication of the item (such as scannable NFC chips affixed to the object) or the token smart contract if the card has been tokenized and “uploaded” to an NFT marketplace. In practice, the Asset Passport aggregates warranties and object information using established best practices. The Asset Passport provides the basis for securing an object’s provenance and value over time through protocols rather than firms. The following are some of the benefits this structure provides for the sports card industry. 1. Skin in the game All expert certification of a card is backed by a legal warranty as well as financial stake, adding weight to an expert’s assessment. 
Even if they are not employed by one of the big grading companies (and wouldn’t this be a great arrangement for former staff of these organizations?), the fact that these experts are willing to put skin in the game and put their own money behind their assessments will not be overlooked by serious collectors. 2. Trust Communities By virtue of Mattereum’s method of bringing together multiple third-party sources of information on objects, it is possible for a card’s Asset Passport to contain several independent assessments of its authenticity, provenance, and condition. This would enable buyers to get a broader perspective not limited to centralized firms. Experts can also hold each other to account to safeguard against erroneous claims. Through this distributed framework of crowdsourced expertise, trust becomes a feature powered by community, not simply a risk to be mitigated. 3. Dispute Resolution Buyers of the card have much clearer and more impartial paths of recourse through the Mattereum Protocol if they have doubts about any information in a card’s Asset Passport. Our dispute resolution procedures ensure that suitably knowledgeable arbitrators will decide each case, and that these arbitrators will be neutral rather than defending the interests and reputation of a company. Fortunately, this mechanism is globally enforceable across 150+ countries under the 1958 New York Convention on Arbitration. 4. Composability A Mattereum Asset Passport is more than just a grading. It’s an authentication mechanism, a record of provenance, a tokenization engine, and many other things ON TOP OF also being a condition and grading report. This comprehensive record of the card is a living document that aggregates all legal and software components in a single interface immortalized on blockchain and distributed file storage networks. 
New forms of ownership, commerce, and engagement around collectible cultural artifacts can be designed to create more utility and possibility around an otherwise benign object. From zero history to storied value Instead of buckling under the weight of growth and unscalable business models, the collectible cards market can adopt a peer-produced, decentralized alternative that provides the same guarantees at cheaper prices without centralized operations. With blockchain-enabled authentication and tokenization of physical assets with built-in legal guarantees, Mattereum removes the fear, uncertainty, and doubt that has plagued digital commerce for decades. Follow us as we bring the Mattereum Protocol to an expanding variety of markets ranging from memorabilia, gold, wine, to prized classical instruments, and more. More at: http://www.Mattereum.com Twitter: https://Twitter.com/Mattereum Telegram: https://t.me/Mattereum",https://medium.com/humanizing-the-singularity/decentralizing-and-securing-collectible-card-grading-services-with-the-mattereum-protocol-ead040351c2,,Post,,Meta,,,IOT,,,,,2021-05-19,,,,,,,,,,,,,
|
||
Mattereum,Mattereum,,Medium,,,,,,,"FOS Ep. 5: Ian Grigg on Crypto, Identity, Community, and Building Positive-Sum Systems","Ian Grigg is one of the most influential builders in the crypto space, having built digital asset systems since the nineties. We discuss his invention of the Ricardian contract framework, what makes cryptonetworks successful, identity as communal phenomenon, and the importance of building positive-sum systems.",,https://medium.com/humanizing-the-singularity/fos-ep-5-ian-grigg-on-crypto-identity-community-and-building-positive-sum-systems-17ef316703b9,,Episode,,Meta,,,,,,,,2021-08-25,,,,,,,,,,,,,
|
||
Mattereum,Mattereum,,Medium,,,,,,,Countering Marketplace Deception with Mattereum’s Trust-as-a-Service Platform,"Marketplace deception is everywhere, at great cost and risk to consumers and businesses. Regulation alone won’t fix it. Can Mattereum Asset Passports and Product Information Markets help secure trust in B2B and B2C trade?","Countering Marketplace Deception with Mattereum’s Trust-as-a-Service Platform Marketplace deception is everywhere, at great cost and risk to consumers and businesses. Regulation alone won’t fix it. Can Mattereum Asset Passports and Product Information Markets help secure trust in B2B and B2C trade? On October 13, 2021, the Federal Trade Commission issued a Notice of Penalty Offenses to over 700 companies, putting pressure on American businesses to disengage from deceptive practices such as fake reviews and false endorsements or else face civil penalties. FTC Puts Hundreds of Businesses on Notice about Fake Reviews and Other Misleading Endorsements (www.ftc.gov) The list of companies on the notice includes some of the largest companies in the world across a range of industries, such as Alphabet, Inc. (Google), Amazon, Apple, Microsoft, Shell Oil, Starbucks, McDonalds, and many others. A quick skim through the list gives the impression that almost any household name company actively deceives consumers as part of their ongoing business strategy, at least according to the FTC. This form of marketplace deception is not limited to B2C relationships. On October 14, 2021, Reuters reported that aerospace giant Boeing had notified the Federal Aviation Administration (FAA) that it had discovered defective parts for its 787 Dreamliner fleet which were sourced by a supplier and manufactured by another company. 
Boeing finds new defect in continuing struggle to produce Dreamliner 787 (www.reuters.com) These forms of marketplace deception are seemingly omnipresent in trade at all scales. While regulation may be able to get many businesses to engage more authentically with consumers and other businesses, some of these entities are of such a size that they can simply absorb civil penalties en masse and proceed with business as usual. To combat this endemic deception of consumers, we need a combined effort of effective regulation and technological solutions to secure trust in digital commerce. More specifically, we need to establish standards for consumer protection, and implement the protocols capable of meeting them. The Mattereum Protocol is well-suited for tackling the challenge of holding companies to account for their stated claims, specifically by offering buyers warrantied claims around their purchased goods powered by an incentivized network of third-party expert certifiers. Let’s explore how Mattereum as a trust-as-a-service platform can help create more authentic relationships between businesses and consumers and between businesses themselves. How do we build Trust-as-a-Service? Ultimately, Mattereum is building a system to secure truth in trade at all scales: documenting and offsetting negative externalities, creating a circular economy of reuse, recycling, and upcycling of goods, and designing incentives which align profitability with sustainability. Let’s unpack the Mattereum approach and explore how it would work in B2C and B2B contexts. Asset Passports: Living Product Documentation The Mattereum Asset Passport (MAP) is the core mechanism of the Mattereum Protocol. In short, a MAP is a bundle of legal warranties tied to an object. While these warranties can vary with the object in question, the initial warranty is often some form of identification. 
Other warranties in a MAP may include authentication methods, carbon offsets, anti-slavery certification, tokenization (or connections to any smart contract or software system), and many others. These warranties are essentially “legal lego” of various contract terms that will range greatly between different asset classes and will accrue around assets over time. All claims are cryptographically signed, secured, and backed by financial stake, giving all warrantors accountability and skin in the game for their assessments. This framework also provides access to dispute resolution protocols in the event of systemic or commercial fallout via an arbitration clause in the contract. Asset Passports are not a static structure but a dynamic, living documentation that evolves throughout the product lifecycle. Once this initial documentation is established, we need a suitable incentive mechanism in place to supply and secure warrantied product information without relying wholly on centralized institutions. Product Information Markets: Breaking Out of the Silos of Separate “Truths” In both B2C and B2B contexts, trust is heavily centralized. We source our product knowledge from companies, either directly or through ratings programs they design, and companies source their materials and product information from trade partners or distant multiple-degree connections in their supply chains. Instead of relying on companies to secure trust and accountability when they are incentivized to stack the deck in their favor and shield themselves from liabilities, we propose a more decentralized, networked solution that can bring in third-party expertise to supply and secure warranties in a structure we call product information markets (PIMs). In short, a PIM is a method of incentivizing truth in markets by allowing industry experts to make money from their niche knowledge and face consequences for erroneous claims. 
For the crypto-savvy, the PIM model makes use of a cryptoeconomic system — or protocol-based mechanism design — to incentivize a particular set of behaviors within the network, in this case the supplying and securing of product information over time. Together, the living product documentation and bundled warranties of the MAPs and the incentive structure enabled by PIMs can help create more authentic B2C and B2B relationships which don’t rely on deceptive business practices at the expense and detriment of many. Mattereum Trust-as-a-Service: B2C The Mattereum model is a win-win for businesses and their customers. Any of the 700+ companies listed in the FTC Notice of Penalty Offenses — ranging from tech giants to telecoms to food services — would benefit from embracing a more decentralized approach to securing information and accountability around the sale of goods and services. Fake reviews and shady endorsements simply don’t work well within the Mattereum Protocol. By design, any and all faulty information has consequences. Companies can take initiative by integrating the Mattereum Protocol into their launch process or wait for their customers to do so down the line. The former option is certainly ideal. An Asset Passport can be generated at any point throughout a product’s lifecycle. Of course, having a MAP at the beginning of the cycle, at the manufacturing or even the design stage, would allow for much more information-rich documentation over the course of time, but MAPs can be created even years after initial product release. Instead of putting the burden on companies to create and implement their own trust frameworks, they can instead plug their operations into an existing protocol. This makes adoption easier than a patchwork, disconnected solution. There is a potentially huge long-term effect in this approach. 
If a credibly-neutral, autonomous, decentralized third-party system for warrantying product information takes off, it will put pressure on businesses to improve the quality and authenticity of their offering. Failure to adapt to the new paradigm will result in a flight of customers to more provably trustworthy competitors. This is key: product information markets turn the trustworthiness of an enterprise into a competitive advantage in the marketplace while also maintaining regulatory compliance. All in the same system. Mattereum Trust-as-a-Service: B2B As above, so below. While the FTC notice highlights a severe misalignment in the average business-consumer relationship with a list of companies that looks like a library catalog, this trust problem also extends to the deals which happen much farther upstream to the corporate supply chain. Between mineral and materials sourcing, manufacturing, and distribution, the sheer scale of supply chains makes it difficult to document product information before it reaches digital or physical storefronts. The only other sources of truth available beyond the manufacturer are the specialist firms which rate and certify objects of a particular domain: fine art, collectible cards, instruments, vehicles, etc. However, these institutions are limited in their capacity by their lack of a shared record of an object’s history. Best case there’s an entry in a single database. Worst case: a single paper certificate. This disconnected certification system and lack of initiative and coordination in securing product information creates opportunities for even the world’s largest companies — such as Boeing — to be supplied defective or counterfeit parts, components, or ingredients at real risk to public health and safety. Had Boeing integrated MAPs within their supply chain and production process, they could have paired their incredibly detailed product specifications with warranties supplied by third-party engineering firms and other entities. 
Clear lines of accountability throughout a vast web of B2B deals. While we delineate B2C and B2B for explanatory purposes, ultimately the benefits of provable authenticity cascade throughout the entire system. If a business sources materials from verifiable and transparent sources, the company will be less likely to perpetuate faulty parts or information downstream to its own customers. The goal of Mattereum’s trust-as-a-service approach is simple in theory but profound in its potential: to power a market economy that doesn’t prey on individuals and institutions, while aligning profitability with sustainability. The cost and optics of civil penalties will get us nowhere. Let’s try something different. About Mattereum London-based Mattereum was established in 2017 by a trans-disciplinary team with a track record in designing and launching nation state-level infrastructure and headed by former Ethereum release coordinator Vinay Gupta. Mattereum is building an innovative trust-as-a-service platform for securing trust and liquidity in the sale of physical assets, creating durable secondary markets, and removing negative externalities of trade. Follow us as we bring the Mattereum Protocol to an expanding variety of markets and industries. More at: http://www.Mattereum.com Twitter: https://Twitter.com/Mattereum Telegram: https://t.me/Mattereum",https://medium.com/humanizing-the-singularity/countering-marketplace-deception-with-mattereums-trust-as-a-service-platform-2615dc2c47be,,Post,,Product,,,IOT,,,,,2021-10-20,,,,,,,,,,,,,
|
||
Mattereum,Mattereum,,Medium,,,,,,,Introduction to Smart Property,How can we streamline and improve the techno-social protocols around commerce so we can better maintain equilibrium with our planet and ourselves?,"Introduction to Smart Property How can we streamline and improve the techno-social protocols around commerce so we can better maintain equilibrium with our planet and ourselves? This article is a companion piece to the first episode of The Future of Stuff podcast. Listen to Vinay unpack the idea of smart property in the debut episode 🎙 Humanitarian engineer Buckminster Fuller once described an approach to building economic systems which he referred to as ephemeralization, defining it as our ability to “do more and more with less and less until eventually we can do everything with nothing.” Now, such a thing seems impossible with the numerous technological and social constraints that come to mind, but the idea of “doing more with less” as a design principle — if followed responsibly — could eventually lead us to a future where atoms and bits dance in perfect synchrony, benefiting everyone at the expense of no one. With that north star, we can start prototyping the systems of tomorrow with the tools of today. To that end, let’s explore a potential techno-social medium for building this future: smart property. What is Smart Property? In the online age, people are used to summoning entities around the world to meet their everyday needs. A few taps on the scrying mirror of a smartphone can marshal human and autonomous agents to provide all manner of services. Transportation. Goods. Housing. Entertainment. Education. Nearly anything. This ability to program value flows and social interactions around goods and services presents an incredibly powerful design space which we can refer to as smart property. In practice, smart property is property that can be bought, sold, collateralized, and accessed via software APIs and search engines. 
While we already have this in a sense with housing (AirBnB), transportation (Uber), distribution (Amazon), and other areas, these systems are often controlled by centralized corporations operating under the illusory banner of “platform,” resulting in ongoing concerns around privacy, security, and labor practices. At Mattereum, we believe smart property is an inevitable evolution of commerce, but how this system is implemented and the motivations behind it are paramount. Smart property can be how we achieve equilibrium with the planet and with each other, or it could be co-opted by the incumbent powers-that-be, with results that can be read in dystopian science fiction. Smart Property Can Fix Society’s Inventory Problem The world is suffering from a severe misallocation of resources, especially on a long-term time horizon. At the core, this is a system design problem. Between planned obsolescence and trade globalization in the pursuit of corporate profit, the things we produce are not actually built to last, and they rely on an unimaginably complex global infrastructure powered by distant (or not so distant) horrific labor practices and environmental costs. The IPCW industrial cycle, designed by Mattereum CEO Vinay Gupta and a team at the Rocky Mountain Institute, provides an almost mandala-like mapping of industry. Its four main areas are Investment (of different capital types), Production, Consumption, and Waste. One way to explain this odd paradox of gluttony and scarcity in the world today is to map where in this cycle we have innovated over the last half century, and where we have not. Investment and Production witnessed a transformative leap in operational efficiency and scale in the post-WWII era, yet similar progress has not been made in optimizing Consumption and managing Waste. Tightening these feedback loops will require better policy, technology, and social awareness. 
We don’t know what we don’t know about our stuff and figuring it out is critical to achieving sustainability and improving quality of life for all. Smart property as a design framework can help us generate, access, and act upon the necessary product information to build secondhand markets of durable quality goods and facilitate truly efficient p2p commerce. Imagine a digital wallet interface akin to an MMORPG’s character inventory in which the assets are actual objects which can be owned in common, shared, leased, bought, or sold with a few taps on a screen. This is possible. Smart Property Streamlines Complexity in Commerce It is important to emphasize that smart property is not about embedding sensor devices in everything under the sun (Internet of Things, smart cities) and more about designing the social protocols around goods and services enabled by a tight integration between law and software to streamline, structure, and secure the numerous interactions and transactions around a good or service. Oddly enough, with smart property it is the space and social interaction around an object which is where the magic happens. The ‘ground’ is the key design space rather than the ‘figure’ of an object. On the first episode of The Future of Stuff podcast by Mattereum, Vinay illustrated the potential of smart property in multiple examples, including one that would seem random at first glance, but is profound in its implications. The Logistical Nightmare of Wedding Planning If the idea of smart property is to automate and streamline transactions and interactions between various parties around a good or service, reducing complexity and costs to those involved, then wedding planning makes for a surprisingly fitting use case. Venue. Catering. Travel. Entertainment. Fashion. So many moving pieces in constant motion. 
What if there were a system which allowed one to aggregate these pieces via software protocols, creating order in the chaos and vastly reducing mental and transaction costs? As Vinay explains, someone in charge of planning the event could use a smart property system to “build a multi-path matrix of possible solutions,” securing various options leading up to the day of the event as a safeguard against any failure or setback in the chain. “It’s this ability to take processes where there are tons and tons of dependencies, turn the dependencies into option contracts, group the option contracts together until you have a solution, and then once you have a solution, execute all of the options. Bringing that kind of capability out of heavy engineering and finance into ordinary everyday people’s lives will be completely revolutionary.” Of course, the application of smart property in event planning is not limited to weddings. Think of all the events we attend throughout our lives which involve networks of contracted parties doing the work of creating these dynamic social experiences. Concerts. Award ceremonies. Conferences. Imagine booking and planning these events being as easy as using AirBnB. Smart Property is a Critical Use Case of Blockchain It is imperative that a smart property system be decentralized and operated on a peer-to-peer basis. There’s a real risk of smart property systems as envisioned by Mattereum being co-opted by the incumbent Big Tech companies and absorbed into monopolies and oligopolies, resulting in economic centralization with severe consequences for society (price distortions, resource misallocation, etc.). We do not want a world in which everybody owns nothing and Big Tech companies lease the majority of material goods to the populace under contractual agreements which favor them at the expense of the user at every turn. No recourse. No agency. 
If antitrust legislation and governmental efforts to prevent monopolies or keep them in check fail (and such legislation tends to wax and wane across administrations), then people would be at the mercy of market forces (such mercy does not exist). Building smart property systems with p2p protocols is critical to ensuring that the already terrible distribution of wealth and power is not made even worse for future generations. Blockchains provide cryptographically secure and verifiable transaction networks for the seamless and borderless exchange of value. A global, programmable medium of value transfer open to all is a design space we are still trying to wrap our heads around. And it is currently the most compelling technological solution for smart property. Traditional client-server databases are silos which lack the interoperability necessary for coordinating and streamlining complex business transactions across multiple contexts. Having a shared financial infrastructure instead of a patchwork of protocols that struggle to maintain consensus can vastly reduce the complexity and cost of trade. However, neither blockchain nor any other computer system for that matter can power smart property alone. There needs to be a way of “programming” the material world and the social agreements between people. Bringing the programmable value of blockchain into the real world requires a tight integration between the legal systems that enforce property rights and the software systems which facilitate commerce. Enter Mattereum. Prototyping Smart Property with the Mattereum Protocol Mattereum is building the legal-transactional framework that secures trust in digital commerce. Bridging the programmable value of blockchain-based digital assets with the programmable agreements of contract law, combining the affordances of both, presents a compelling design framework for smart property. 
The Mattereum Protocol leverages an integrated legal and software system consisting of three main components: Mattereum Asset Passports, Real-World Asset NFTs, and Trust Communities. Here’s how the protocol works, in brief: (For a more detailed guide of the protocol, we highly recommend referring to the Mattereum Product Walkthrough.) - Asset Passports are a bundle of legal warranties tied to a particular object. Essential warranties include identification, confirmation of the presence and capabilities of verification technologies such as NFC tags, vault status, carbon offsets, anti-slavery certification, and tokenization. The heart of the Mattereum Protocol, Asset Passports are powerful, composable tools for securing property rights in virtual and physical space. - Real-World Asset NFTs are unique digital tokens that denote the right to take physical custody of a particular object. While sharing features with the NFTs now gaining popularity in the cultural mainstream in the creative industries, these tokens are different in that they are backed by an underlying physical asset, complete with warranties, insurance, and legal enforceability to create trust in trade. They are paired with objects via a warranty in the Asset Passport. - Trust Communities are the networks of expert certifiers that accrue around each object (artwork, memorabilia, precious metals) throughout its lifetime. All expert certification of an object is backed by a legal warranty as well as financial stake, adding weight to an expert’s assessment. Built upon the composable, legal-tech framework of the Asset Passport, Trust Communities restore value, increase liquidity, and bring peace of mind to digital commerce. So far, the Mattereum Protocol has tokenized a wide range of objects, ranging from gold-backed NFTs, to pop culture memorabilia, to historical/cultural artifacts. Currently, these assets and arrangements are only a simple display of smart property. 
An asset is secured in a vault and can be physically delivered to the NFT bearer upon burning the token, with all the necessary legal warranties attached to the property. A transferable, cryptographic voucher of sorts. Mattereum is researching and developing tools and methodologies to build upon this groundwork, streamlining the passporting and tokenization process for a wider range of asset classes and exploring new opportunities in this exciting design space. In the future, there will be smart legal templates which create a streamlined user experience around generating the Asset Passport and minting digital assets, automated custodians that reduce the risk and complexity around custody of physical goods (and even provide corporate legal status of certain objects to empower collective ownership, asset governance, amongst other possibilities), advanced search capabilities with robust metadata schemas of all goods and services on the protocol, and APIs that allow others to build and expand on these foundations. The idea of smart property is to use technology to create a better relationship between ourselves and the planet. The digitization of commerce is a trend with no stopping point, so we might as well be good at it. Eventually, the technological and social systems which underlie trade will become so coordinated that we will move into an era of property rights so fluid and dynamic that it will be more akin to “streaming property” than owning property. How we build this system is crucial to preventing injustice and inequity in the world, and Mattereum is aiming to build the protocols to enable this future such that everyone can benefit at the expense of no one. In the spirit of Buckminster Fuller, let’s build a future of “more with less” where everything simply works in stunning techno-social grace. 
About Mattereum London-based Mattereum was established in 2017 by a trans-disciplinary team with a track record in designing and launching nation state-level infrastructure and headed by former Ethereum release coordinator Vinay Gupta. With blockchain-enabled authentication and tokenization of physical assets with built-in legal guarantees, Mattereum removes the fear, uncertainty, and doubt that has plagued digital commerce for decades. Follow us as we bring the Mattereum Protocol to an expanding variety of markets ranging from memorabilia, gold, and wine to prized classical instruments, and more. More at: http://www.Mattereum.com Twitter: https://Twitter.com/Mattereum Telegram: https://t.me/Mattereum",https://medium.com/humanizing-the-singularity/introduction-to-smart-property-ecb446268f23,,Post,,Product,,,IOT,,,,,2021-09-28,,,,,,,,,,,,,
Mattr,,Mattr,,Jack Holt; Kyle Leach; Murray McKerlie,DHS; Sovrin Foundation,"USA, Texas, Austin",USA,,,Mattr,"Restoring trust in digital interactions<br><br>From business people to developers, from policy makers to individuals, Mattr is building tools and working alongside communities who want to transform the internet into a web of trust and restore trust and confidence in digital interactions.<br><br>We think the exciting new world of verifiable data and decentralised identity can be applied to solve many real-world problems we experience today, creating whole new opportunities for value creation. We make our products simple, accessible, and easy to use to help stimulate innovation by the people that understand their context best.<br><br>If you have a digital trust or verifiable data problem that you would like help with, let’s have a discussion on how we can help.",,http://Mattr.global,,Company,,Company,Enterprise,ID,SSI,,,BBS Signatures,"DID,Verifiable Credentials,Schema.Org,JSON-LD,Linked Data",2019,https://github.com/Mattrglobal/,https://twitter.com/MattrGlobal,https://www.youtube.com/channel/UCQ19LrZwBsotEb8M2kxWRtw,https://medium.com/Mattr-global; https://Mattr.global/resources/,https://Mattr.global/feed/resources/,,https://www.crunchbase.com/organization/Mattr,https://www.linkedin.com/company/Mattrglobal/,https://learn.Mattr.global/,,,,
Mattr,Mattr,,Medium,Nader Helmy,,,,,,IIW32: BBS+ and Beyond,"One common theme this year was the continued development and adoption of BBS+ signatures, a type of multi-message cryptographic digital signature that enables selective disclosure of verifiable credentials. This development is possible due to the fact that BBS+ signatures are a ledger-independent approach to selective disclosure; effectively, no custom logic or bespoke infrastructure is needed for these digital signatures to be created, used and understood.","IIW32: BBS+ and beyond The Internet Identity Workshop continues to be a central nucleus for thoughtful discussion and development of all things related to digital identity. The most recent workshop, which was held in mid-April, was no exception. Despite the lack of in-person interaction due to the ongoing global pandemic, this IIW was as lively as ever, bringing together a diverse set of stakeholders from across the globe to share experiences, swap perspectives, and engage in healthy debates. One common theme this year was the continued development and adoption of BBS+ signatures, a type of multi-message cryptographic digital signature that enables selective disclosure of verifiable credentials. We first introduced this technology at IIW30 in April 2020, and have been inspired and delighted by the community’s embrace and contribution to this effort across the board. In the year since, progress has been made in a variety of areas, from standards-level support to independent implementations and advanced feature support. We thought we’d take a moment to round up some of the significant developments surrounding BBS+ signatures and highlight a few of the top items to pay attention to going forward. Over the past few months, the linked data proofs reference implementation of BBS+ published a new release that introduces a variety of improvements in efficiency and security, including formal alignment to the W3C CCG Security Vocab v3 definitions. 
In addition, support for JSON-LD BBS+ signatures was added to the VC HTTP API, making it possible to test this functionality in an interoperable way with other vendors participating in an open environment. An important element in enabling BBS+ signatures is using what’s known as a pairing-friendly curve; for our purposes we use BLS12–381. We have seen some promising signs of adoption for this key pair, with multiple Decentralized Identifier (DID) methods — both did:indy from Hyperledger and did:ion from DIF — indicating they intend to add or already have support for these keys, allowing BBS+ signatures to be issued across a variety of decentralized networks and ecosystems. This development is possible due to the fact that BBS+ signatures are a ledger-independent approach to selective disclosure; effectively, no custom logic or bespoke infrastructure is needed for these digital signatures to be created, used and understood. In addition, the Hyperledger Aries project has been hard at work developing interoperable and ledger-agnostic capabilities in open source. The method used to track interop targets within the cohort and ultimately measure conformance against Aries standards is what’s known as an Aries Interop Profile (AIP). A major upcoming update to AIP will add support for additional DID methods, key types and credential formats, as well as introducing Aries support for JSON-LD BBS+ signatures as part of AIP 2.0. This will allow Aries-driven credential issuance and presentation protocols to work natively with BBS+ credentials, making that functionality broadly available for those in the Aries community and beyond. There have also been a number of exciting developments when it comes to independent implementations of BBS+ signatures. Animo Solutions has recently implemented JSON-LD BBS+ signatures support into the popular open-source codebase Hyperledger Aries Cloud Agent Python (ACA-Py). 
In another independent effort, Trinsic has contributed an implementation of JSON-LD BBS+ credentials which they have demonstrated to be working in tandem with DIDComm v2, a secure messaging protocol based on DIDs. Implementations such as these help to demonstrate that open standards are transparent, can be understood and verified independently, and can be implemented with separate languages and tech stacks. They also set the groundwork for demonstrating real testing-driven interoperability via mechanisms such as the VC HTTP API and AIP 2.0. We are continuously looking to improve the documentation of these specs and standards so that their implications and nuances can be more broadly understood by builders and developers looking to engage with the technology. On the cryptographic side of things, progress is also being made in hardening the existing BBS+ specification as well as expanding BBS+ to support more advanced privacy-preserving features. A significant development in this area is the work of cryptographer Michael Lodder who has been actively conducting research on an enhanced credential revocation mechanism using cryptographic accumulators with BBS+. This approach presents a promising alternative to existing solutions that allow authoritative issuers to update the status of issued credentials without compromising the privacy of the credential holder or subject who may be presenting the credential. We see this as another application of BBS+ signatures in the context of verifiable credentials that carries a lot of potential in pushing this technology to an even more robust state. There was also initial discussion and tacit agreement to create a new cryptography-focused working group at Decentralized Identity Foundation. 
As the new WG drafts its charter, the first work item of this group will be the BBS+ Signatures spec which defines the cryptographic scheme known as BBS+ agnostic of its application in areas such as linked data signatures or verifiable credentials. In the future, this WG will likely evolve to include other crypto-related work items from the community. This is just the tip of the iceberg when it comes to the momentum and development building around this technology in the community. We couldn’t be more excited about the future of BBS+ signatures, especially as we gear up to tackle the next set of hard problems in this area including privacy-preserving subject authentication and revocation using cryptographic accumulators. If you’re interested we encourage you to get involved, either by contributing to the Linked Data Proofs specification, checking out our reference implementations, or participating in the new WG at DIF, to name but a few of the many ways to engage with this work. We look forward to doing this retrospective at many IIWs to come, documenting the ever-growing community that continues to champion this technology in dynamic and interesting ways.",https://medium.com/Mattr-global/iiw32-bbs-and-beyond-1a41634c15b0,,Post,,Ecosystem,,,Recap,,,BBS+,,2021-05-05,,,,,,,,,,,,,
Mattr,Mattr,,Medium,,,,,,,Digital Wallets,The reframing of the user as a first-class citizen and their empowerment as ‘holder’ represents a shift towards a new paradigm. Such a paradigm offers users greater sovereignty of their own information and empowerment to manage their digital identity. Users are able to exercise their new role in this ecosystem by utilizing a new class of software known as digital wallets.,"Learn Concepts: Digital Wallets In order to coordinate the authentication needs of apps and services on the web, many of today’s users will leverage services such as password managers. These tools help users keep track of how they’ve identified themselves in different contexts and simplify the login process for different services. In many ways, the need to overlay such services in order to preserve non-negotiable security properties reflects the broken state of identity on the internet today. Users of these apps (i.e. the data subjects) are often an afterthought when a trust relationship is established between data authorities and apps or services consuming and relying on user data. Asymmetry in the nature of the relationships between participants largely prevents users from asserting their data rights as subjects of the data. Users are left to deal with the problems inherent in such a model, foisting upon them the responsibility of implementing appropriate solutions to patch over the shortcomings of identity management under this legacy model. The emerging web of trust based upon self-certifying identifiers and user-centric cryptography is shifting this fundamental relationship by refashioning the role of the user. This role (known in the VC data model as a “holder”) is made central to the ecosystem and, importantly, on equal footing with the issuers of identity-related information and the relying parties who require that data to support their applications and services. 
The reframing of the user as a first-class citizen and their empowerment as ‘holder’ represents a shift towards a new paradigm. Such a paradigm offers users greater sovereignty of their own information and empowerment to manage their digital identity. Users are able to exercise their new role in this ecosystem by utilizing a new class of software known as digital wallets. Digital wallets are applications that allow an end user to manage their digital credentials and associated cryptographic keys. They allow users to prove identity-related information about themselves and, where it’s supported, choose to selectively disclose particular attributes of their credentials in a privacy-preserving manner. Wallets and Agents When working with technology standards that are inherently decentralized, it’s important to establish a common context and consensus in our choice of terminology and language. Convergence on key terms that are being used to describe concepts within the emerging decentralized identity and self-sovereign identity technologies allows participants to reach a shared understanding. Consequently, participating vendors are able to understand how they fit into the puzzle and interoperability between vendor implementations is made possible. Through dedicated research and careful coordination with the broader technical community, the Glossary Project at DIF offers a useful definition for both wallets and agents. Wallets Provide storage of keys, credentials, and secrets, often facilitated or controlled by an agent. Agents An agent is a software representative of a subject (most often a person) that controls access to a wallet and other storage, can live in different locations on a network (cloud vs. local), and can facilitate or perform messaging or interactions with other subjects. The two concepts are closely related, and are often used interchangeably. 
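The wallet/agent split described above can be pictured as a small data-structure sketch. This is a toy illustration only; every class, method, and field name below is hypothetical, not a standard API.

```python
# Toy sketch of the wallet/agent split: the wallet stores credentials,
# the agent controls access to it and handles interactions with other
# parties. All names here are illustrative, not a standard API.
class Wallet:
    # Simple storage for credentials (the 'wallet' role).
    def __init__(self):
        self._credentials = {}

    def store(self, cred_id, credential):
        self._credentials[cred_id] = credential

    def retrieve(self, cred_id):
        return self._credentials[cred_id]

class Agent:
    # Software representative of the subject; mediates access to the wallet.
    def __init__(self, wallet):
        self.wallet = wallet

    def present(self, cred_id, requested_claims):
        # Disclose only the claims the relying party asked for.
        cred = self.wallet.retrieve(cred_id)
        return {k: v for k, v in cred.items() if k in requested_claims}

wallet = Wallet()
wallet.store('licence', {'name': 'Alice', 'dob': '1990-01-01', 'address': '1 Main St'})
agent = Agent(wallet)
print(agent.present('licence', {'dob'}))  # -> {'dob': '1990-01-01'}
```

Note how the agent, not the wallet, decides what leaves the device: the wallet stays simple storage, matching the Glossary Project definitions above.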
In short, the Glossary Project found that an agent is most commonly a piece of software that lets you work with and connect to wallets. Wallets can be simple, while agents tend to be more complex. Agents often need access to a wallet in order to retrieve credentials, keys, and/or messages that are stored there. At Mattr, we tend to use the terms ‘digital wallet’ or simply ‘wallet’ to holistically describe the software that is utilized by end-users from within their mobile devices, web browsers, or other such user-controlled devices or environments. A digital wallet can be thought of as a kind of agent, though we try to make the distinction between the software that sits on a user’s device and the data managed and logic facilitated by a cloud-based platform in support of the wallet’s capabilities. We like the term ‘wallet’ because it is analogous to real-world concepts that by and large parallel the primary function of a wallet; to store and retrieve identity-related information. User-centric Design As end users have often found themselves the casualty of the information systems used by the modern web, there has been little opportunity to allow users to directly manage their data and negotiate what data they wish to withhold or disclose to certain parties. Under the new web of trust paradigm, the rights of the data subject are codified in standards, processes, and protocols guaranteeing the user the power to exercise agency. The interjection of the wallet to support end-users as data subjects on equal footing with issuers of identity information and relying parties provides an indispensable conduit and control point for this information that enables new opportunities for user-centric design. The innovation in this area is only just beginning and there is no limit to the kinds of new experiences application developers can design and deliver to users. 
Some examples include: - Allowing users to synchronize their data across multiple applications - Allowing users to self-attest to a piece of data or attest to data self-asserted by peers - Allowing a user to explicitly give consent around how their data may be used - Allowing users to revoke their consent for access to the continued use of and/or persistence of a particular piece of data - Allowing users to opt-in to be discoverable to other verified users, provided they can mutually verify particular claims and attributes about themselves - Allowing users to opt-in to be discoverable to certain service providers and relying parties, provided they can mutually verify particular claims and attributes about themselves These are just a handful of the potential ways that developers can innovate to implement user-centric experiences. Mattr offers the tools necessary to create new kinds of wallet and authentication experiences for users and we’re excited to see what developers come up with when given the opportunity to create applications and services inspired by these new standards and technologies.",https://medium.com/Mattr-global/learn-concepts-digital-wallets-c88318055e42,,Post,,Explainer,,,,Wallets,,,,2020-12-23,,,,,,,,,,,,,
Mattr,Mattr,,Medium,,,,,,,Selective Disclosure,"An important principle that we want to achieve when designing any system that involves handling Personally Identifiable Information (PII) is to minimize the data disclosed in a given interaction. When users share information, they should be able to choose what and how much they share on a case-by-case basis, while the relying parties receiving the information must be able to maintain assurances about the presented information’s origin and integrity.","Learn Concepts: Selective Disclosure An important principle that we want to achieve when designing any system that involves handling Personally Identifiable Information (PII) is to minimize the data disclosed in a given interaction. When users share information, they should be able to choose what and how much they share on a case-by-case basis, while the relying parties receiving the information must be able to maintain assurances about the presented information’s origin and integrity. This process is often referred to as selective disclosure of data. As technologists, by having solutions that easily achieve selective disclosure, we can drive a culture based on the minimum information exchange required to enhance user privacy. Privacy and Correlation Selective disclosure of information is particularly relevant when evaluating approaches to using verifiable credentials (VCs). Because authorities are able to issue credentials to a subject’s digital wallet, the subject is able to manage which data they disclose to relying parties as well as how that disclosure is performed. This presents an opportunity for those designing digital wallets to consider the user experience of data disclosure, particularly as it relates to the underlying technology and cryptography being used for data sharing. 
The problem of user privacy as it relates to digital identity is a deep and complicated one, however the basic approach has been to allow users to share only the information which is strictly necessary in a particular context. The VC Data Model spec provides some guidance on how to do so, but stops short of offering a solution to the issue of managing user privacy and preventing correlation of their activities across different interactions: Organizations providing software to holders should strive to identify fields in verifiable credentials containing information that could be used to correlate individuals and warn holders when this information is shared. A number of different solutions have been deployed to address the underlying concerns around selective disclosure. Each solution makes a different set of assumptions and offers different tradeoffs when it comes to usability and convenience. Approaches to Selective Disclosure When it comes to solutions for selective disclosure of verifiable credentials, there are many different ways to tackle this problem, but three of the most common are: - Just in time issuance — contact the issuer at request time either directly or indirectly for a tailored assertion - Trusted witness — use a trusted witness between the provider and the relying party to mediate the information disclosure - Cryptographic solutions — use a cryptographic technique to disclose a subset of information from a larger assertion Just in time issuance Just in time issuance, a model made popular by OpenID Connect, assumes the issuer is highly available, which imposes an infrastructure burden on the issuer that is proportional to the number of subjects they have information for and where those subjects use their information. Furthermore, in most instances of this model, the issuer learns where a subject is using their identity information, which can be a serious privacy problem. 
Trusted witness Trusted witness shifts this problem to be more of a presentation concern, where a witness de-anonymizes the subject presenting the information and presents an assertion with only the information required by the relying party. Again, this model requires a highly available party other than the holder and relying party present when a subject wants to present information, one that must be highly trusted and one that bears witness to a lot of PII on the subject, leading to privacy concerns. Cryptographic solutions Cryptographic solutions offer an alternative to these approaches by solving the selective disclosure problem directly at the core data model layer of the VC, providing a simpler and more flexible method of preserving user privacy. There are a variety of ways that cryptography can be used to achieve selective disclosure or data minimization, but perhaps the most popular approach is using a branch of cryptography often known as Zero-Knowledge Proofs, or ZKPs. The emergent feature of this technology is that a prover can prove knowledge of some data without exposing any additional data. Zero-knowledge proofs can be achieved in a flexible manner with verifiable credentials using multi-message digital signatures such as BBS+. Traditional Digital Signatures Traditional digital signatures look a bit like this. You have a message (virtually any kind of data for which you want to establish integrity) and a keypair (private and public key) which you use to produce a digital signature on the data. By having the message, public key, and the signature; verifiers are able to evaluate whether the signature is valid or not, thereby establishing the integrity of the message and the authenticity of the entity that signed the message. In the context of verifiable credentials, the entity doing the signing is the issuer of the credential, while the entity doing the verification is the verifier. 
The keypair in question belongs to the issuer of the credential, which allows verifiers to establish the authority on that credential in a verifiable manner. Multi-message Digital Signatures Multi-message digital signature schemes (like BBS+), on the other hand, are able to sign an array of messages, rather than a single message over which the entire digital signature is applied. The same mechanism is used wherein a private key produces a digital signature over the messages you wish to sign, but now you have the flexibility of being able to break a message up into its fundamental attributes. In the context of verifiable credentials, each message corresponds to a claim in the credential. This presents an opportunity for selective disclosure due to the ability to derive and verify a proof of the digital signature over a subset of messages or credential attributes. In addition to the simple ability to sign and verify a set of messages, multi-message digital signatures have the added capability of being able to derive a proof of the digital signature. In the context of verifiable credentials, the entity deriving the proof is the credential subject or holder. This process allows you to select which messages you wish to disclose in the proof and which messages you want to keep hidden. The derived proof indicates to the verifier that you know all of the messages that have been signed, but that you are only electing to disclose a subset of these messages. The verifier, or the entity with which you’re sharing the data, is only able to see the messages or credential claims which you have selectively disclosed to them. They are still able to verify the integrity of the messages being signed, as well as establish the authenticity of the issuer that originally signed the messages. 
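The sign / derive-proof / verify-proof flow described above can be sketched in a few lines. To stay runnable with only the standard library, this toy uses a salted-hash redaction scheme with an HMAC standing in for the issuer's signature; it is emphatically not BBS+ (which relies on pairing-based cryptography, hides the undisclosed messages entirely, and is verifiable with the issuer's public key alone), but the interface is the same: sign all claims, derive a proof over a subset, verify the proof.

```python
import hashlib, hmac, json, secrets

# Toy sketch of the sign / derive-proof / verify-proof interface above.
# NOT BBS+: a salted-hash redaction scheme plus an HMAC stands in for
# the issuer's multi-message signature so the example runs on the stdlib.

ISSUER_KEY = b'demo-issuer-key'  # hypothetical; real issuers use an asymmetric keypair

def sign(messages):
    # Issuer: salt and hash each claim, then sign the list of hashes.
    salts = [secrets.token_hex(8) for _ in messages]
    digests = [hashlib.sha256((s + m).encode()).hexdigest()
               for s, m in zip(salts, messages)]
    signature = hmac.new(ISSUER_KEY, json.dumps(digests).encode(),
                         hashlib.sha256).hexdigest()
    return {'salts': salts, 'digests': digests, 'signature': signature}

def derive_proof(messages, credential, disclose):
    # Holder: reveal only the selected claims (with their salts).
    return {'disclosed': {i: (credential['salts'][i], messages[i]) for i in disclose},
            'digests': credential['digests'],
            'signature': credential['signature']}

def verify_proof(proof):
    # Verifier: check the signature over all hashes, then check each
    # disclosed claim against its committed hash.
    expected = hmac.new(ISSUER_KEY, json.dumps(proof['digests']).encode(),
                        hashlib.sha256).hexdigest()
    if not hmac.compare_digest(expected, proof['signature']):
        return False
    return all(hashlib.sha256((salt + msg).encode()).hexdigest() == proof['digests'][i]
               for i, (salt, msg) in proof['disclosed'].items())

claims = ['name=Alice', 'dob=1990-01-01', 'address=1 Main St']
cred = sign(claims)
proof = derive_proof(claims, cred, disclose=[1])  # share only the date of birth
assert verify_proof(proof)
```

A real BBS+ proof improves on this toy in two key ways: the verifier never sees commitments to the hidden claims, and verification needs only the issuer's public key rather than a shared secret.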
This provides a number of privacy guarantees to the data subject because relying parties are only evaluating the proof of the signature rather than the signature itself.",https://medium.com/Mattr-global/learn-concepts-selective-disclosure-4b9bf4e5c887,,Post,,Explainer,,,,,,,,2020-12-23,,,,,,,,,,,,,
Mattr,Mattr,,Medium,,,,,,,Semantic Web,The semantic web is a set of technologies whose goal is to make all data on the web machine-readable. Its usage allows for a shared understanding around data that enables a variety of real-world applications and use cases.,"Learn Concepts: Semantic Web With so much data being created and shared on the internet, one of the oldest challenges in building digital infrastructure has been how to consistently establish meaning and context for this data. The semantic web is a set of technologies whose goal is to make all data on the web machine-readable. Its usage allows for a shared understanding around data that enables a variety of real-world applications and use cases. The challenges to address with the semantic web include: - vastness — the internet contains billions of pages, and existing technology has not yet been able to eliminate all semantically duplicated terms - vagueness — imprecise concepts like ‘young’ or ‘tall’ make it challenging to combine different knowledge bases with overlapping but subtly different concepts - uncertainty — precise concepts with uncertain values can be hard to reason about; this mirrors the ambiguity and probabilistic nature of everyday life - inconsistency — logical contradictions create situations where reasoning breaks down - deceit — intentionally misleading information spread by bad actors; this can be mitigated with cryptography to establish information integrity Linked Data Linked data is the theory behind much of the semantic web effort. It describes a general mechanism for publishing structured data on the internet using vocabularies like schema.org that can be connected together and interpreted by machines. Using linked data, statements encoded in triples (subject → predicate → object) can be spread across different websites in a standard way. These statements form the substrate of knowledge that spans across the entire internet. 
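The triple structure just described is easy to picture in code. A minimal sketch follows; the IRIs are illustrative examples, not authoritative vocabularies, and the query helper is a hypothetical stand-in for a real pattern-matching language like SPARQL.

```python
# Minimal sketch of linked-data triples (subject -> predicate -> object).
# The IRIs are illustrative examples, not authoritative vocabularies.
triples = [
    ('https://example.org/alice', 'http://schema.org/name', 'Alice'),
    ('https://example.org/alice', 'http://schema.org/knows', 'https://example.org/bob'),
    ('https://example.org/bob', 'http://schema.org/name', 'Bob'),
]

def query(graph, subject=None, predicate=None, obj=None):
    # Return every triple matching the given (optional) pattern,
    # loosely analogous to a single SPARQL triple pattern.
    return [t for t in graph
            if (subject is None or t[0] == subject)
            and (predicate is None or t[1] == predicate)
            and (obj is None or t[2] == obj)]

# Who does Alice know?
print(query(triples, subject='https://example.org/alice',
            predicate='http://schema.org/knows'))
# -> [('https://example.org/alice', 'http://schema.org/knows', 'https://example.org/bob')]
```

Because every statement shares the same shape, triples published by different sites can be merged into one graph and queried uniformly, which is the core idea behind linked data.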
The reality is that the bulk of useful information on the internet today is unstructured data, or data that is not organized in a way which makes it useful to anyone beyond the creators of that data. This is fine for the cases where data remains in a single context throughout its lifecycle, but it becomes problematic when trying to share data across contexts while retaining its semantic meaning. The vision for linked data is for the internet to become a kind of global database where all data can be represented and understood in a similar way. One of the biggest challenges to realizing the vision of the internet as a global database is enabling a common set of underlying semantics that can be consumed by all this data. A proliferation of data becomes much less useful if the data is redundant, unorganized, or otherwise messy and complicated. Ultimately, we need to double down on the usage of common data vocabularies and common data schemas. Common data schemas combined with the security features of verifiable data will make fraud more difficult, making it easier to transmit and consume data so that trust-based decisions can be made. Moreover, the proliferation of common data vocabularies will help make data portability a reality, allowing data to be moved across contexts while retaining the semantics of its original context. Semantic Web Technologies The work around developing semantic web technology has been happening for a very long time. The vision for the semantic web has been remarkably consistent throughout its evolution, although the specifics around how to accomplish this and at what layer has developed over the years. W3C’s semantic web stack offers an overview of these foundational technologies and the function of each component in the stack. The ultimate goal of the semantic web of data is to enable computers to do more useful work and to develop systems that can support trusted interactions over the network. 
The shared architecture as defined by the W3C supports the ability for the internet to become a global database based on linked data. Semantic Web technologies enable people to create data stores on the web, build vocabularies, and write rules for handling data. Linked data are empowered by technologies such as RDF, SPARQL, OWL, and SKOS. - RDF provides the foundation for publishing and linking your data. It’s a standard data model for representing information resources on the internet and describing the relationships between data and other pieces of information in a graph format. - OWL is a language which is used to build data vocabularies, or “ontologies”, that represent rich knowledge or logic. - SKOS is a standard way to represent knowledge organization systems such as classification systems in RDF. - SPARQL is the query language for the Semantic Web; it is able to retrieve and manipulate data stored in an RDF graph. Query languages go hand-in-hand with databases. If the Semantic Web is viewed as a global database, then it is easy to understand why one would need a query language for that data. By enriching data with additional context and meaning, more people (and machines) can understand and use that data to greater effect. JSON-LD JSON-LD is a serialization format that extends JSON to support linked data, enabling the sharing and discovery of data in web-based environments. Its purpose is to be isomorphic to RDF, which has broad usability across the web and supports additional technologies for querying and language classification. RDF has been used to manage industry ontologies for the last couple decades, so creating a representation in JSON is incredibly useful in certain applications such as those found in the context of Verifiable Credentials (VCs). The Linked Data Proofs representation of Verifiable Credentials makes use of a simple security protocol which is native to JSON-LD. 
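As a hand-rolled illustration of the JSON-LD idea just introduced, the sketch below shows how an @context maps short terms to full IRIs. It covers only flat term-to-IRI mappings; a real processor implements the full W3C JSON-LD expansion algorithms.

```python
# Hand-rolled illustration of a JSON-LD @context mapping short terms to
# IRIs. This covers only flat term-to-IRI mappings; a conformant
# processor implements the full W3C JSON-LD expansion algorithms.
doc = {
    '@context': {'name': 'http://schema.org/name',
                 'knows': 'http://schema.org/knows'},
    'name': 'Alice',
    'knows': 'https://example.org/bob',
}

def expand_terms(document):
    # Replace each short term with the IRI its @context assigns it.
    context = document.get('@context', {})
    return {context.get(k, k): v for k, v in document.items() if k != '@context'}

expanded = expand_terms(doc)
# {'http://schema.org/name': 'Alice', 'http://schema.org/knows': 'https://example.org/bob'}
```

Two documents using different short terms but the same @context IRIs expand to identical statements, which is what lets data keep its meaning as it moves between contexts.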
The primary benefit of the JSON-LD format used by LD-Proofs is that it builds on a common set of semantics that allow for broader ecosystem interoperability of issued credentials. It provides a standard vocabulary that makes data in a credential more portable as well as easy to consume and understand across different contexts. In order to create a crawl-able web of verifiable data, it’s important that we prioritize strong reuse of data schemas as a key driver of interoperability efforts. Without it, we risk building a system where many different data schemas are used to represent the same exact information, creating the kinds of data silos that we see on the majority of the internet today. JSON-LD makes semantics a first-class principle and is therefore a solid basis for constructing VC implementations. JSON-LD is also widely adopted on the web today, with W3C reporting it is used by 30% of the web and Google making it the de facto technology for search engine optimization. When it comes to Verifiable Credentials, it’s advantageous to extend and integrate the work around VCs with the existing burgeoning ecosystem of linked data.",https://medium.com/Mattr-global/learn-concepts-semantic-web-250784d6a49f,,Post,,Explainer,,,,,,Semantic web,,2020-12-23,,,,,,,,,,,,,
|
||
Mattr,Mattr,,Medium,,,,,,,The State of Identity on the Web,"This cycle perpetuates the dominance of a few major IdPs and likewise forces users to keep choosing from the same set of options or risk losing access to all of their online accounts. In addition, many of these IdPs have leveraged their role as central intermediaries to increase surveillance and user behavior tracking, not just across their proprietary services, but across a user’s entire web experience. OIDC Credential Provider allows you to extend OIDC to allow IdPs to issue reusable VCs about the end-user instead of simple identity tokens with limited functionality. It allows end-users to request credentials from an OpenID Provider and manage their own credentials in a digital wallet under their control. This article discusses how the success of Open ID Connect shaped the state of identity on the web, how new web standards enable a new model, and describes a bridge between those worlds: OIDC Credential provider.","The State of Identity on the Web The evolution of identity on the web is happening at a rapid pace, with many different projects and efforts converging around similar ideas with their own interpretations and constraints. It can be difficult to parse through all of these developments while the dust hasn’t completely settled, but looking at these issues holistically, we can see a much bigger pattern emerging. In fact, many of the modern innovations related to identity on the web are actually quite connected and build upon each other in a myriad of complementary ways. The rise of OpenID Connect The core of modern identity is undoubtedly OpenID Connect (OIDC), the de-facto standard for user authentication and identity protocol on the internet. It’s a protocol that enables developers building apps and services to verify the identity of their users and obtain basic profile information about them in order to create an authenticated user experience. 
Because OIDC is an identity layer built on top of the OAuth 2.0 framework, it can also be used as an authorization solution. Its development was significant for many reasons, in part because it came with the realization that identity on the web is fundamental to many different kinds of interactions, and these interactions need simple and powerful security features that are ubiquitous and accessible. Secure digital identity is a problem that doesn’t make sense to solve over and over again in different ways with each new application, but instead needs a standard and efficient mechanism that’s easy to use and works for the majority of people. OpenID Connect introduced a convenient and accessible protocol for identity that required less setup and complexity for developers building different kinds of applications and programs. In many ways, protocols like OIDC and OAuth 2.0 piggy-backed on the revolution that was underfoot in the mid-2000s as developers fled en masse from web-based systems heavily reliant on technologies like XML (and consequently identity systems built upon these technologies like SAML), for simpler systems based on JSON. OpenID built on the success of OAuth and offered a solution that improved upon existing identity and web security technologies which were vulnerable to attacks like screen scraping. This shift towards a solution built upon modern web technologies with an emphasis on being easy-to-use created ripe conditions for adoption of these web standards. OIDC’s success has categorically sped up both the web and native application development cycle when it comes to requiring the integration of identity, and as a result, users have now grown accustomed to having sign-in options aplenty with all their favorite products and services. 
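The redirect-based flow described above can be sketched as the relying party constructing an authorization request URL that sends the user’s browser to the IdP. The endpoint, client id, and redirect URI below are hypothetical placeholders, not values from any real provider:

```python
from urllib.parse import urlencode
import secrets

def build_auth_request(authorize_endpoint, client_id, redirect_uri):
    # Parameters defined by OAuth 2.0 / OpenID Connect Core.
    params = {
        'response_type': 'code',             # authorization code flow
        'scope': 'openid profile',           # the openid scope marks this as an OIDC request
        'client_id': client_id,
        'redirect_uri': redirect_uri,
        'state': secrets.token_urlsafe(16),  # CSRF protection, echoed back by the IdP
        'nonce': secrets.token_urlsafe(16),  # binds the resulting ID token to this request
    }
    return authorize_endpoint + '?' + urlencode(params)

url = build_auth_request(
    'https://idp.example.com/authorize',     # hypothetical IdP endpoint
    'my-client-id',                          # hypothetical client registration
    'https://rp.example.com/callback',
)
```

After the user authenticates at the IdP, the browser is redirected back to the relying party’s redirect_uri with a code that is exchanged server-side for an ID token, which is how the IdP ends up sitting directly between both parties.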
It’s not intuitively clear to your average user why they need so many different logins and it’s up to the user to manage which identities they use with which services, but the system works and provides a relatively reliable way to integrate identity on the web. Success and its unintended consequences While OIDC succeeded in simplicity and adoption, what has emerged over time are a number of limitations and challenges that have come as a result of taking these systems to a global scale. When it comes to the market for consumer identity, there are generally three main actors present: - Identity Providers - Relying Parties - End-Users The forces in the market that cause their intersection to exist are complex, but can be loosely broken down into the interaction between each pair of actors. In order for an End-User to be able to “login” to a website today, the “sweet spot” must exist where each of these sets of requirements are met. The negotiation between these three parties usually plays out on the relying party’s login page. It’s this precious real-estate that drives these very distinct market forces. Anti-competitive market forces In typical deployments of OIDC, in order for a user to be able to “login” to a relying party or service they’re trying to access online, the relying party must be in direct contact with the Identity Provider (IdP). This is what’s come to be known as the IdP tracking problem. It’s the IdP that’s responsible for performing end-user authentication and issuing end-user identities to relying parties, not the end-users themselves. Over time, these natural forces in OIDC have created an environment that tends to favour the emergence and continued growth of a small number of very large IdPs. These IdPs wield a great deal of power, as they have become a critical dependency and intermediary for many kinds of digital interactions that require identity. 
This environment prevents competition and diversity amongst IdPs in exchange for a convenience-driven technology framework where user data is controlled and managed in a few central locations. The market conditions have made it incredibly difficult for new IdPs to break into the market. For example, when Apple unveiled their “Sign in with Apple” service, they used their position as a proprietary service provider to mandate their inclusion as a third party sign in option for any app or service that was supporting federated login on Apple devices. This effectively guaranteed adoption of their OpenID-based solution, allowing them to easily capture a portion of the precious real-estate that is the login screen of thousands of modern web apps today. This method of capturing the market is indicative of a larger challenge wherein the environment of OIDC has made it difficult for newer and smaller players in the IdP ecosystem to participate with existing vendors on an equal playing field. Identity as a secondary concern has primary consequences Another key issue in the current landscape is that for nearly all modern IdPs, being an identity provider is often a secondary function to their primary line of business. Though they have come to wear many different hats, many of the key IdPs’ primary business function is offering some service to end-users (e.g. Facebook, Twitter, Google, etc.). Their role as IdPs is something that emerged over time, and with it has surfaced a whole new set of responsibilities whose impact we are only just beginning to grapple with. Due to this unequal relationship, developers and businesses who want to integrate identity in their applications are forced to choose those IdPs which contain user data for their target demographics, instead of offering options for IdP selection based on real metrics around responsible and privacy-preserving identity practices for end-users. 
This cycle perpetuates the dominance of a few major IdPs and likewise forces users to keep choosing from the same set of options or risk losing access to all of their online accounts. In addition, many of these IdPs have leveraged their role as central intermediaries to increase surveillance and user behavior tracking, not just across their proprietary services, but across a user’s entire web experience. The net result of this architecture on modern systems is that IdPs have become a locus for centralized data storage and processing. The privacy implications associated with the reliance on a central intermediary who can delete, control, or expose user data at any time have proven to be no small matter. New regulations such as GDPR and CCPA have brought user privacy to the forefront and have spurred lots of public discourse and pressure for companies to manage their data processing policies against more robust standards. The regulatory and business environment that is forming around GDPR and CCPA is pushing the market to consider better solutions that may involve decentralizing the mode of operation or separating the responsibilities of an IdP. Identity Provider lock-in Lastly, in today’s landscape there is an inseparable coupling between an End-User and the IdP they use. This effectively means that, in order to transfer from say “Sign In With Google” to “Sign In With Twitter,” a user often has to start over and build their identity from scratch. This is due to the fact that users are effectively borrowing or renting their identities from their IdPs, and hence have little to no control in exercising that identity how they see fit. This model creates a pattern that unnecessarily ties a user to the application and data ecosystem of their IdP and means they must keep an active account with the provider to keep using their identity. 
If a user can’t access their account with an IdP, say by losing access to their Twitter profile, they can no longer log in to any of the services where they’re using Twitter as their IdP. One of the problems with the term Identity Provider itself is that it sets up the assumption that the end-user is being provided with an identity, rather than the identity being theirs or under their control. If end-users have no real choice in selecting their IdP, then they are ultimately subject to the whims of a few very large and powerful companies. This model is not only antithetical to anti-trust policies and legislation, it also prevents data portability between platforms. This has made it abundantly clear that the paradigm shift on end-user privacy practices needs to start by giving users a baseline level of choice when it comes to their identity. A nod to an alternative model Fundamentally, when it comes to identity on the web, users should have choice; choice about which services they employ to facilitate the usage of their digital identities along with being empowered to change these service providers if they so choose. The irony of OpenID Connect is that the original authors did actually consider these problems, and evidence of this can be found in the original OIDC specification: in chapter 7, entitled “Self Issued OpenID Provider” (SIOP). Earning its name primarily from the powerful idea that users could somehow be their own identity provider, SIOP was an ambitious attempt at solving a number of different problems at once. It raises some major questions about the future of the protocol, but it stops short of offering an end-to-end solution to these complex problems. 
As it stands in the core specification, the SIOP chapter of OIDC was really trying to solve 3 significant, but distinct problems, which are: - Enabling portable/transferable identities between providers - Dealing with different deployment models for OpenID providers - Solving the Nascar Problem SIOP has recently been of strong interest to those in the decentralized or self-sovereign identity community because it’s been identified as a potential pathway to transitioning existing deployed digital infrastructure towards a more decentralized and user-centric model. As discussion is ongoing at the OpenID Foundation to evolve and reboot the work around SIOP, there are a number of interesting questions raised by this chapter that are worth exploring to their full extent. For starters, SIOP questions some of the fundamental assumptions around the behaviour and deployment of an IdP. OpenID and OAuth typically use a redirect mechanism to relay a request from a relying party to an IdP. OAuth supports redirecting back to a native app for the end-user, but it assumes that the provider itself always takes the form of an HTTP server, and furthermore it assumes the request is primarily handled server-side. SIOP challenged this assumption by questioning whether the identity provider has to be entirely server-side, or if the provider could instead take the form of a Single-Page Application (SPA), Progressive Web Application (PWA), or even a native application. In creating a precedent for improving upon the IdP model, SIOP was asking fundamental questions such as: who gets to pick the provider? What role does the end-user play in this selection process? Does the provider always need to be an authorization server or is there a more decentralized model available that is resilient from certain modes of compromise? 
Although some of these questions remain unanswered or are early in development, the precedent set by SIOP has spurred a number of related developments in and around web identity. Work is ongoing at the OpenID Foundation to flesh out the implications of SIOP in the emerging landscape. Tech giants capitalize on the conversation Although OIDC is primarily a web-based identity protocol, it was purposefully designed to be independent of any particular browser feature or API. This separation of concerns has proved incredibly useful in enabling adoption of OIDC outside of web-only environments, but it has greatly limited the ability for browser vendors to facilitate and mediate web-based login events. A number of large technology and browser vendors have picked up on this discrepancy, and are starting to take ownership of the role they play in web-based user interactions. Notably, a number of new initiatives have been introduced in the last few years to address this gap in user privacy on the web. An example of this can be found in the W3C Technical Architecture Group (TAG), a group tasked with documenting and building consensus around the architecture of the World Wide Web. Ahead of the 2019 W3C TPAC in Japan, Apple proposed an initiative called IsLoggedIn, effectively a way for websites to tell the browser whether the user was logged in or not in a trustworthy way. What they realized is that the behavior of modern web architecture results in users being “logged in by default” to websites they visit, even if they only visit a website once. Essentially as soon as the browser loads a webpage, that page can store data about the user indefinitely on the device, with no clear mechanism for indicating when a user has logged out or wishes to stop sharing their data. They introduced an API that would allow browsers to set the status of user log-ins to limit long term storage of user data. 
It was a vision that required broad consensus among today’s major web browsers to be successful. Ultimately, the browsers have taken their own approach in trying to mitigate the issue. In 2019, Google created their Privacy Sandbox initiative to advance user privacy on the web using open and transparent standards. As one of the largest web browsers on the planet, Google Chrome seized the opportunity provided by an increased public focus on user privacy to work on limiting cross-site user tracking and pervasive incentives that encourage surveillance. Fuelled by the Privacy Sandbox initiative, they created a project called WebID to explore how the browser can mediate between different parties in a digital identity transaction. WebID is an early attempt to get in the middle of the interaction that happens between a relying party and an IdP, allowing the browser to facilitate the transaction in a way that provides stronger privacy guarantees for the end-user. As an overarching effort, it’s in many ways a response to the environment created by CCPA and GDPR where technology vendors like Google are attempting to enforce privacy expectations for end-users while surfing the web. Its goal is to keep protocols like OIDC largely intact while using the browser as a mediator to provide a stronger set of guarantees when it comes to user identities. This may ultimately give end-users more privacy on the web, but it doesn’t exactly solve the problem of users being locked into their IdPs. With the persistent problem of data portability and limited user choices, simply allowing the browser to mediate the interaction is an important piece of the puzzle but does not go far enough on its own. Going beyond the current state of OpenID Connect Though it is a critical component of modern web identity, OIDC is not by any means the only solution or protocol to attempt to solve these kinds of problems. 
A set of emerging standards from the W3C Credentials Community Group aim to look at identity on the web in a very different way, and, in fact, are designed to consider use cases outside of just consumer identity. One such standard is Decentralized Identifiers (DIDs) which defines a new type of identifier and accompanying data model featuring several novel properties not present in most mainstream identifier schemes in use today. Using DIDs in tandem with technologies like Verifiable Credentials (VCs) creates an infrastructure for a more distributed and decentralized layer for identity on the web, enabling a greater level of user control. VCs were created as the newest in a long line of cryptographically secured data representation formats. Their goal was to provide a standard that improves on its predecessors by accommodating formal data semantics through technologies like JSON-LD and addressing the role of data subjects in managing and controlling data about themselves. These standards have emerged in large part to address the limitations of federated identity systems such as the one provided by OIDC. In the case of DIDs, the focus has been on creating a more resilient kind of user-controllable identifier. These kinds of identifiers don’t have to be borrowed or rented from an IdP as is the case today, but can instead be directly controlled by the entities they represent via cryptography in a consistent and standard way. When combining these two technologies, VCs and DIDs, we enable verifiable information that has a cryptographic binding to the end-user and can be transferred cross-context while retaining its security and semantics. As is the case with many emerging technologies, in order to be successful in an existing and complicated market, these new standards should have a cohesive relationship to the present. 
To that end, there has been a significant push to bridge these emerging technologies with the existing world of OIDC in a way that doesn’t break existing implementations and encourages interoperability. One prominent example of this is a new extension to OIDC known as OpenID Connect Credential Provider. Current OIDC flows result in the user receiving an identity token which is coupled to the IdP that created it, and can be used to prove the user’s identity within a specific domain. OIDC Credential Provider allows you to extend OIDC to allow IdPs to issue reusable VCs about the end-user instead of simple identity tokens with limited functionality. It allows end-users to request credentials from an OpenID Provider and manage their own credentials in a digital wallet under their control. By allowing data authorities to be the provider of reusable digital credentials instead of simple identity assertions, this extension effectively turns traditional Identity Providers into Credential Providers. The credentials provided under this system are cryptographically bound to a public key controlled by the end-user. In addition to public key binding, the credential can instead be bound to a DID, adding a layer of indirection between a user’s identity and the keys they use to control it. In binding to a DID, the subject of the credential is able to maintain ownership of the credential on a longer life cycle due to their ability to manage and rotate keys while maintaining a consistent identifier. This eases the burden on data authorities to re-issue credentials when the subject’s keys change and allows relying parties to verify that the credential is always being validated against the current public key of the end-user. The innovations upon OIDC mark a shift from a model where relying parties request claims from an IdP, to one where they can request claims from specific issuers or according to certain trust frameworks and evaluation metrics appropriate to their use case. 
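The key-binding idea described above can be sketched as follows. This is an illustration, not the OIDC Credential Provider wire format: HMAC over a shared secret stands in for a real digital signature scheme (such as Ed25519), and the DID is a made-up placeholder. The point is that the subject identifier is signed together with the claims, so the credential stays bound to the end-user rather than to the provider’s session:

```python
import hmac, hashlib, json

ISSUER_SECRET = b'issuer-signing-key'  # placeholder for the issuer's private key

def issue_credential(subject_did, claims):
    # The issuer signs the subject identifier together with the claims,
    # binding the credential to the end-user's DID.
    payload = {'subject': subject_did, 'claims': claims}
    message = json.dumps(payload, sort_keys=True).encode()
    proof = hmac.new(ISSUER_SECRET, message, hashlib.sha256).hexdigest()
    return {**payload, 'proof': proof}

def verify_credential(credential):
    # Any change to the subject or claims invalidates the proof.
    payload = {k: v for k, v in credential.items() if k != 'proof'}
    message = json.dumps(payload, sort_keys=True).encode()
    expected = hmac.new(ISSUER_SECRET, message, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, credential['proof'])

vc = issue_credential('did:example:123', {'degree': 'BSc'})
```

Because the proof covers a DID rather than a raw key, the subject can rotate the keys behind that DID without the issuer having to re-issue the credential, which is the layer of indirection described above.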
This kind of policy-level data management creates a much more predictable and secure way for businesses and people to get the data they need. OIDC Credential Provider, a new spec at the OpenID Foundation, is challenging the notion that the identity that a user receives has to be an identity entirely bound to its domain. It offers traditional IdPs a way to issue credentials that are portable and can cross domains because the identity/identifier is no longer coupled to the provider as is the case with an identity token. This work serves to further bridge the gap between existing digital identity infrastructure and emerging technologies which are more decentralized and user-centric. It sets the foundation for a deeper shift in how data is managed online, whether it comes in the form of identity claims, authorizations, or other kinds of verifiable data. Broadening the landscape of digital identity OIDC, which is primarily used for identity, is built upon OAuth 2.0, whose primary use is authorization and access. If OIDC is about who the End-User is, then OAuth 2.0 is about what you’re allowed to do on behalf of and at the consent of the End-User. OAuth 2.0 was built prior to OIDC, in many ways because authorization allowed people to accomplish quite a bit without the capabilities of a formalized and standardized identity protocol. Eventually, it became obvious that identity is an integral and relatively well-defined cornerstone of web access that needed a simple solution. OIDC emerged as it increasingly became a requirement to know who the end-user (or resource owner) is and for the client to be able to request access to basic claims about them. Together, OIDC and OAuth 2.0 create a protocol that combines authentication and authorization. While this allows them to work natively with one another, it’s not always helpful from an infrastructure standpoint to collapse these different functions together. 
Efforts like WebID are currently trending towards the reseparation of these concepts that have become married in the current world of OpenID, by developing browser APIs that are specifically geared towards identity. However, without a solution to authorization, it could be argued that many of the goals of the project will remain unsatisfied whenever the relying party requires both authentication and authorization in a given interaction. As it turns out, these problems are all closely related to each other and require a broad and coordinated approach. As we step into an increasingly digital era where the expectation continues to evolve around what’s possible to do online, the role of identity becomes increasingly complex. Take, for example, sectors such as the financial industry dealing with increased requirements around electronic Know-Your-Customer (KYC) policies. In parallel with the innovation around web identity and the adoption of emerging technologies such as VCs, there has been a growing realization that the evolution of digital identity enables many opportunities that extend far beyond the domain of identity. This is where the power of verifiable data on the web really begins, and with it an expanded scope and structure for how to build digital infrastructure that can support a whole new class of applications. A new proposed browser API called Credential Handler API (CHAPI) offers a promising solution to browser-mediated interactions that complements the identity-centric technologies of OIDC and WebID. It currently takes the form of a polyfill to allow these capabilities to be used in the browser today. Similar to how SIOP proposes that the user be able to pick their provider for identity-related credentials, CHAPI allows you to pick your provider, but not just for identity — for any kind of credential. 
In that sense, OIDC and CHAPI are solving slightly different problems: - OIDC is primarily about requesting authentication of an End-User and receiving some limited identity claims about them, and in certain circumstances also accessing protected resources on their behalf. - CHAPI is about requesting credentials that may or may not describe the End-User. Additionally, credentials might not even be related to their identity directly and are instead used for other related functions like granting authorization, access, etc. While OIDC offers a simple protocol based upon URL redirects, CHAPI pushes for a world of deeper integration with the browser that affords several usability benefits. Unlike traditional implementations of OIDC, CHAPI does not start with the assumption that an identity is fixed to the provider. Instead, the end-user gets to register their preferred providers in the browser and then select from this list when an interaction with their provider is required. Since CHAPI allows for exchanging credentials that may or may not be related to the end-user, it allows for a much broader set of interactions than what’s provided by today’s identity protocols. In theory, these can work together rather than as alternative options. You could, for instance, treat CHAPI browser APIs as a client to contact the end-user’s OpenID Provider and then use CHAPI to exchange and present additional credentials that may be under the end-user’s control. CHAPI is very oriented towards the “credential” abstraction, which is essentially a fixed set of claims protected in a cryptographic envelope and often intended to be long-lived. A useful insight from the development of OIDC is that it may be helpful to separate, at least logically, the presentation of identity-related information from the presentation of other kinds of information. To extend this idea, authenticating or presenting a credential is different from authenticating that you’re the subject of a credential. 
You may choose to do these things in succession, but they are not inherently related. The reason this is important has to do with privacy, data hygiene, and best security practices. In order to allow users to both exercise their identity on the web and manage all of their credentials in one place, we should be creating systems that default to requesting specific information about an end-user as needed, not necessarily requesting credentials when what’s needed is an authentic identity and vice versa. Adopting this kind of policy would allow configurations where the identifier for the credential subject would not be assumed to be the identifier used to identify the subject with the relying party. Using this capability in combination with approaches to selective disclosure like VCs with JSON-LD BBS+ signatures will ensure not only a coherent system that can separate identity and access, but also one that respects user privacy and provides a bridge between existing identity management infrastructure and emerging technologies. An emergent user experience Using these technologies in tandem also helps to bridge the divide between native and web applications when it comes to managing identity across different modalities. Although the two often get conflated, a digital wallet for holding user credentials is not necessarily an application. It’s a service to help users manage their credentials, both identity-related and otherwise, and should be accessible wherever an end-user needs to access it. In truth, native apps and web apps are each good at different things and come with their own unique set of trade-offs and implementation challenges. Looking at this emerging paradigm where identity is managed in a coherent way across different types of digital infrastructure, “web wallets” and “native wallets” are not necessarily mutually exclusive — emerging technologies can leverage redirects to allow the use of both. 
The revolution around digital identity offers a new paradigm that places users in a position of greater control around their digital interactions, giving them the tools to exercise agency over their identity and their data online. Modern legislation focused on privacy, portability, security and accessible user experience is also creating an impetus for the consolidation of legacy practices. The opportunity is to leverage this directional shift to create a network effect across the digital ecosystem, making it easier for relying parties to build secure web experiences and unlocking entirely new value opportunities for claims providers and data authorities. Users shouldn’t have to manage the complexity left behind by today’s outdated identity systems, and they shouldn’t be collateral damage when it comes to designing convenient apps and services. Without careful coordination, much of the newer innovation could lead to even more fragmentation in the digital landscape. However, as we can see here, many of these technology efforts and standards are solving similar or complementary problems. Ultimately, a successful reinvention of identity on the web should make privacy and security easy; easy for end-users to understand, easy for relying parties to support, and easy for providers to implement. That means building bridges across technologies to support not only today’s internet users, but enabling access to an entirely new set of stakeholders across the globe who will finally have a seat at the table, such as those without access to the internet or readily available web infrastructure. As these technologies develop, we should continue to push for consolidation and simplicity to strike the elusive balance between security and convenience across the ecosystem for everyday users. Where to from here? 
Solving the challenges necessary to realize the future state of identity on the web will take a collective effort of vendor collaboration, standards contributions, practical implementations and education. In order to create adoption of this technology at scale, we should consider the following as concrete next steps we can all take to bring this vision to life: - Continue to drive development of bridging technologies that integrate well with existing identity solutions and provide a path to decentralized and portable identity (e.g. formalization of OIDC Credential Provider to extend existing IdPs) - Empower users to exercise autonomy and sovereignty in selecting their service provider, as well as the ability to change providers and manage services over time (e.g. selection mechanisms introduced by SIOP and WebID) - Adopt a holistic approach to building solutions that recognizes the role of browser-mediated interactions in preserving user privacy (e.g. newer browser developments such as CHAPI and WebID) - Build solutions that make as few assumptions as necessary in order to support different types of deployment environments that show up in real-world use cases (e.g. evolution of SIOP as well as supporting web and native wallets) - Ensure that the development of decentralized digital identity supports the variety and diversity of data that may be managed by users in the future, whether that data be identity-related or otherwise. Taking these steps will help to ensure that the identity technologies we build to support the digital infrastructure of tomorrow will avoid perpetuating the inequalities and accessibility barriers we face today. By doing our part to collaborate and contribute to solutions that work for everybody, building bridges rather than building siloes, we can create a paradigm shift that has longevity and resilience far into the future. 
We hope you join us.",https://medium.com/Mattr-global/the-state-of-identity-on-the-web-cffc392bc7ea,,Post,,Explainer,,,,,,,OIDC,2021-03-14,,,,,,,,,,,,,
|
||
Mattr,Mattr,,Medium,,,,,,,Verifiable Data,"refers to the authenticity and integrity of the actual data elements being shared. *Also covers Verifiable Relationships, Verifiable Processes, Verifiable Credentials, along with Semantics and Schemas.*","Learn Concepts: Verifiable Data The ability to prove the integrity and authenticity of shared data is a key component to establishing trust online. Given that we produce so much data and are constantly sharing and moving that data around, it is a complex task to identify a solution that will work for the vast majority of internet users across a variety of different contexts. The fundamental problem to address is how to establish authority on a piece of data, and how to enable mechanisms to trust those authorities in a broad set of contexts. Solving this problem on a basic level allows entities to have greater trust in the data they’re sharing, and for relying parties to understand the integrity and authenticity of the data being shared. We use the overarching term verifiable data to refer to this problem domain. Verifiable data can be further expanded into three key pillars: - Verifiable data - Verifiable relationships - Verifiable processes Verifiable data This refers to the authenticity and integrity of the actual data elements being shared. Verifiable relationships This refers to the ability to audit and understand the connections between various entities as well as how each of these entities is represented in data. Verifiable processes This describes the ability to verify any digital process such as onboarding a user or managing a bank account (particularly with respect to how data enables the process to be managed and maintained). These closely-related, interdependent concepts rely on verifiable data technology becoming a reality. 
Verifiable Credentials The basic data model of W3C Verifiable Credentials may be familiar to developers and architects that are used to working with attribute-based credentials and data technologies. The issuer, or the authority on some information about a subject (e.g. a person), issues a credential containing this information in the form of claims to a holder. The holder is responsible for storing and managing that credential, and in most instances uses a piece of software that acts on their behalf, such as a digital wallet. When a verifier (sometimes referred to as a relying party) needs to validate some information, they can request from the holder some data to meet their verification requirements. The holder unilaterally determines if they wish to act upon the request and is free to present the claims contained in their verifiable credentials using any number of techniques to preserve their privacy. Verifiable Credentials form the foundation for verifiable data in the emerging web of trust. They can be thought of as a container for many different types of information as well as different types of credentials. Because it is an open standard at the W3C, verifiable credentials are able to be widely implemented by many different software providers, institutions, governments, and businesses. Due to the wide applicability of these standards, similar content integrity protections and guarantees are provided regardless of the implementation. Semantics and Schemas The authenticity and integrity-providing mechanisms presented by Verifiable Credentials provide additional benefits beyond the evaluation of verifiable data. They also provide a number of extensibility mechanisms that allow data to be linked to other kinds of data in order to be more easily understood in the context of relationships and processes. One concrete example of this is the application of data schemas or data vocabularies. Schemas are a set of types and properties that are used to describe data. 
In the context of data sharing, schemas are an incredibly useful and necessary tool in order to represent data accurately from the point of creation to sharing and verification. In essence, data schemas in the Verifiable Credential ecosystem are only useful if they are strongly reused by many different parties. If each implementer of Verifiable Credentials chooses to describe and represent data in a slightly different way, it creates incoherence and inconsistency in data and threatens to diminish the potential of ubiquitous adoption of open standards and schemas. Verifiable Credentials make use of JSON-LD to extend the data model to support dynamic data vocabularies and schemas. This allows us to not only use existing JSON-LD schemas, but to utilize the mechanism defined by JSON-LD to create and share new schemas as well. To a large extent this is what JSON-LD was designed for: the adoption and reuse of common data vocabularies. This type of Verifiable Credential is best characterized as a kind of Linked Data Proof. It allows issuers to make statements that can be shared without loss of trust because their authorship can be verified by a third party. Linked Data Proofs define the capability for verifying the authenticity and integrity of Linked Data documents with mathematical proofs and asymmetric cryptography. They provide a simple security protocol which is native to JSON-LD. Due to the nature of linked data, they are built to compactly represent proof chains and allow a Verifiable Credential to be easily protected on a more granular basis: on a per-attribute basis rather than a per-credential basis. This mechanism becomes particularly useful when evaluating a chain of trusted credentials belonging to organizations and individuals. A proof chain is used when the same data needs to be signed by multiple entities and the order in which the proofs were generated matters. 
One example is a notary counter-signing a proof that had been created on a document. Where order needs to be preserved, a proof chain is represented by including an ordered list of proofs with a “proof chain” key in a Verifiable Credential. This kind of embedded proof can be used to establish the integrity of verifiable data chains. Overall, the ability for data to be shared across contexts whilst retaining its integrity and semantics is a critical building block of the emerging web of trust.",https://medium.com/Mattr-global/learn-concepts-verifiable-data-4515a62c8e40,,Post,,Explainer,,,,,,"Verifiable Relationships,Verifiable Processes,Semantics,Schemas",Verifiable Credentials,2020-12-23,,,,,,,,,,,,,
|
||
Mattr,Mattr,,Medium,,,,,,,Web of Trust 101,"The emerging “Web of Trust” is an idea that has been around since the dawn of the internet. To explain what motivated its creation, let’s take a look at how trust on the internet functions today.","Learn Concepts: Web of Trust 101 The original vision for the World Wide Web was an open platform on which everyone could freely communicate and access information. It was built on the decentralized architecture of the internet, used open standards, and functioned as an accessible platform that would inherit and amplify the fundamentally decentralized nature of the network that underpinned it. However, the reality today has fallen far short of its founding vision. The modern internet is largely centralized and siloed. The vast majority of web traffic belongs to a few powerful corporations that control the distribution of data through platforms designed to selectively serve up information based on in-depth analysis of their users’ data. The lack of an identity system native to the internet over time has created an imbalance of power that erodes users’ digital rights. Several decades after the web was introduced, most of us are now accustomed to widespread spam, fraud, abuse, and misinformation. We don’t have any real agency over how our data is used, and the corporations controlling our data have shown their inability to properly shoulder the responsibility that comes with it. We’re locked into this system, with no reasonable ability to opt out. As a result, the modern internet has made it incredibly difficult to establish trust with others online, creating many barriers to participation that often leave everyday users out of the value chain. Information and data, and the value they create, are no longer freely accessible by the users creating it — most of whom are utterly unaware of the limited agency they have in accessing it. 
To fix this fundamental problem of digital trust, we need to begin by building a system that allows users to control their identities and to move their personal data freely from one online platform to another without fear of vendor lock-in. Evolution of Digital Trust The emerging “Web of Trust” is an idea that has been around since the dawn of the internet. To explain what motivated its creation, let’s take a look at how trust on the internet functions today. Though we may not always be aware, we rely on a basic form of security practically every day we use the internet. HTTPS, the secure browsing protocol for the World Wide Web, uses a common infrastructure based on digital signatures to allow users to authenticate and access websites, and protect the privacy and integrity of the data exchanged while in transit. It is used to establish trust on all types of websites, to secure accounts, and to keep user communications, identity, and web browsing private. This is all based on the usage of cryptographic keys, instead of passwords, to perform security and encryption. Public key cryptography is a cryptographic technique that enables entities to securely communicate on an insecure public network (the internet), and reliably verify the identity of users via digital signatures. It is required for activities where simple passwords are an inadequate authentication method and more rigorous proof is required to confirm the identity of the parties involved in the communication and to validate the information being transferred. The type of Public Key Infrastructure (PKI) currently used by the internet primarily relies on a hierarchical system of certificate authorities (CAs), which are effectively third parties that have been designated to manage identifiers and public keys. Virtually all internet software now relies on these authorities. 
Certificate authorities are responsible for verifying the authenticity and integrity of public keys that belong to a given user, all the way up to a ‘self-signed’ root certificate. Root certificates are typically distributed with applications such as browsers and email clients. Applications commonly include over one hundred root certificates from dozens of PKIs, thereby bestowing trust throughout the hierarchy of certificates which lead back to them. The concept is that if you can trust the chain of keys, you can effectively establish secure communication with another entity with a reasonable level of assurance that you’re talking to the right person. However, the reliance on certificate authorities creates a centralized dependency for practically all transactions on the internet that require trust. This primarily has to do with the fact that current PKI systems tightly control who gets to manage and control the cryptographic keys associated with certificates. This constraint means that modern cryptography is largely unusable for the average user, forcing us to borrow or ‘rent’ identifiers such as our email addresses, usernames, and website domains through systems like DNS, X.509, and social networks. And because we need these identities to communicate and transact online, we’re effectively beholden to these systems which are outside of our control. In addition, the usability challenges associated with current PKI systems mean that much of web traffic today is unsigned and unencrypted, such as on major social networks. In other words, cryptographic trust is the backbone of all internet communications, but that trust rarely trickles down to the user level. A fully realized web of trust instead relies on self-signed certificates and third party attestations, forming the basis for what’s known as a Decentralized Public Key Infrastructure (DPKI). 
DPKI returns control of online identities to the entities they belong to, bringing the power of cryptography to everyday users (we call this user-centric cryptography) by delegating the responsibility of public key management to secure decentralized datastores, so anyone and anything can start building trust on the web. A Trust Layer for the Internet The foundational technology for a new DPKI is a system of distributed identifiers for people, organizations, and things. Decentralized identifiers are self-certifying identifiers that allow for distributed discovery of public keys. DIDs can be stored on a variety of different data registries, such as blockchains and public databases, and users can always be sure that they’re talking to the right person or entity because an identifier’s lookup value is linked to the most current public keys for that identifier. This creates a kind of even playing field where the standards and requirements for key management are uniform across different users in an ecosystem, from everyday users to large corporations and everything in between. This will, in the first place, give users far greater control over the manner in which their personal data is being used by businesses, allowing them to tweak their own experience with services to arrive at that specific trade-off between convenience and data protection that best suits their individual requirements. But more importantly, it will allow users to continue to federate data storage across multiple services while still delivering the benefits that come from cross-platform data exchange. In other words, it gives them the ability to manage all their data in the same way while being able to deal with data differently depending on the context they are in. This also allows them to move their personal data freely from one online platform to another without losing access to the services they need, and without fear of vendor lock-in. 
Eventually, this will allow for portability not only of data but of the trust and reputation associated with the subjects of that data. For instance, a user might be able to transfer their reputation score from one ride-sharing service to another, or perhaps use the trust they’ve established in one context in another context entirely. This emerging decentralized web of trust is being forged by a global community of developers, architects, engineers, organizations, hackers, lawyers, activists, and more working to push forward and develop web standards for things like credential exchange, secure messaging, secure storage, and trust frameworks to support this new paradigm. The work is happening in places like the World Wide Web Foundation, W3C Credentials Community Group, Decentralized Identity Foundation, Trust Over IP Foundation, Linux Foundation’s Hyperledger project, and Internet Engineering Task Force, to name a few.",https://medium.com/Mattr-global/learn-concepts-web-of-trust-101-77120941ea6c,,Post,,Explainer,,,,,,web of trust,,2020-12-27,,,,,,,,,,,,,
|
||
Mattr,Personal,,,Damien Bowden,,,,,,CREATE AN OIDC CREDENTIAL ISSUER WITH Mattr AND ASP.NET CORE,This article shows how to create and issue verifiable credentials using Mattr and an ASP.NET Core application. The ASP.NET Core application allows an admin user to create an OIDC credential issuer using the Mattr service. The credentials are displayed in an ASP.NET Core Razor Page web UI as a QR code for the users of the application.,,https://damienbod.com/2021/05/03/create-an-oidc-credential-issuer-with-Mattr-and-asp-net-core/,,Post,,HowTo,,,,,aspnet,,OIDC,2021-05-03,,,,,,,,,,,,,
|
||
Mattr,Mattr,,Medium,,,,,,,Issuing credentials directly to the Mattr mobile wallet,"If you’re already using a secure mechanism to authenticate your users, then setting up OIDC capability isn’t necessary. As we’ve explored, sending credentials using secure DID messaging directly or via a QR code or deep-link is safe, convenient and allows users to obtain their credentials directly.","Issuing credentials directly to the Mattr mobile wallet Summary: We explore how to issue credentials using secure messaging. At Mattr, we’ve pioneered a way to request and receive credentials using OpenID Connect (OIDC) capability. However, if you already have a robust mechanism in place to authenticate users, then setting up additional OIDC capability is unnecessary. Sending credentials using secure Decentralized Identifier (DID) messaging or directly with a QR code is a safe, convenient alternative. In this article, we’ll explore this alternative method in more detail. The Mattr mobile wallet supports two main channels for issuing a credential: - OpenID Credential Provider - Secure DID messaging Note: We’re building DID messaging on the JOSE stack to facilitate signing and encryption. OpenID Credential Provider If you haven’t yet authenticated a user, using OpenID Credential Provider offers a secure way to authenticate a user at the point of credential creation. It involves setting up and configuring an OpenID Provider to work alongside the Mattr VII OIDC Bridge Extension — simple if you’re already using OIDC infrastructure, but more complex to set up from scratch. Secure DID messaging If you’ve already authenticated a user through another method, issuing a credential through a secure DID message is a reliable alternative to OIDC. This approach works well if you’re authenticating users through a website login or even in person (like a classroom or training centre). Let’s see how this might work in practice. 1. 
Authentication Before issuing a credential, you need to authenticate the user. The most common way to do this is having a user log in to a session on your website. 2. Linking Now that you’ve authenticated the user, you need to link their DID to the user’s session. This DID will be generated by the wallet they are using to hold the credential. You can obtain it in a few different ways: - If the user already has a credential you’ve issued, and you trust they are still in control of the subject DID in the credential, you can create a new credential based off the DID inside the credential. - If the user needs to link their DID from their mobile wallet, you can use a DID Auth flow to make sure you’re obtaining a validated DID that the user can prove they own. - If you need to verify credential data from the user as part of the transaction anyway, you’ll need to use the Holder DID from the Verifiable Presentation as the determining DID. For very simple use cases like demo and testing, if a user has the Mattr mobile wallet they can use a Public DID — they can simply copy the DID and pass it to you out-of-band. 3. Constructing the credential and message Now that the DID is ‘known’ and we’ve authenticated the user, a Verifiable Credential is created using the Mattr VII platform. This credential is then packaged into a secure DID message format to be delivered to the recipient. Because the subject DID is known, the DID message can be encrypted to ensure the data is safe in transit. Use the messaging endpoints to easily perform this step. 4. Delivery The Mattr mobile wallet can read DID messages delivered as a secure DID message, QR code or deep-link. Sending a secure DID message is an easy way to push messages to mobile wallet holders. Once the message has been encrypted, it can be sent to the subject DID and the Mattr VII platform will route the encrypted message to the holder. 
For QR codes and deep-links, DID messages are typically too large to be reliably read by most smartphones. To solve this, we embed a URL to an endpoint hosting the DID message. Then, the Mattr mobile wallet simply follows a redirect to obtain the message. 5. Storing the credential Once the Mattr mobile wallet receives the message, the user can view the credential to make sure it’s correct, then store the credential in the wallet. At this stage, the wallet will also perform some checks to verify the credential, including: - Validation of the credential proof - Resolving the issuer DID document - Checking that the issuer is publishing a valid DID-to-Domain linkage credential. The checks are clearly visible to the user, and there’s assurance that when it comes time for a user to present their credential, it’ll be accepted by trusted verifiers. Wrap up If you’re already using a secure mechanism to authenticate your users, then setting up OIDC capability isn’t necessary. As we’ve explored, sending credentials using secure DID messaging directly or via a QR code or deep-link is safe, convenient and allows users to obtain their credentials directly. Try this out for yourself now on the Mattr VII platform and in the Mattr mobile wallet. Mattr VII trial Sign up today for a free trial of our Mattr VII platform to experience the powerful capabilities that can be deployed in the context of your organization.",https://medium.com/Mattr-global/issuing-credentials-directly-to-the-Mattr-mobile-wallet-8e8cab931e2e,,Post,,HowTo,,,,,,,,2021-08-13,,,,,,,,,,,,,
|
||
Mattr,Personal,,,Damien Bowden,,,,,,Present and Verify Verifiable Credentials in ASP.NET Core Using Decentralized Identities and Mattr,"This article shows how to use verifiable credentials stored in a digital wallet to verify a digital identity and use it in an application. For this to work, trust needs to exist between the verifiable credential issuer and the application which requires the verifiable credentials to verify. A decentralized blockchain database is used and Mattr is used as an access layer to this ledger and blockchain. The applications are implemented in ASP.NET Core.",https://damienbod.com/2021/05/10/present-and-verify-verifiable-credentials-in-asp-net-core-using-decentralized-identities-and-Mattr/,,Post,,HowTo,,,,Wallets,,Aspnet,Verifiable Credentials,2021-05-10,,,,,,,,,,,,,
|
||
Mattr,MyCreds,,,,ARUCC,,,,,"ARUCC is pleased to announce a partnership between Digitary, its service partner, and Mattr, a friend of MyCreds™","These two international organizations are combining their talents to deliver SSI (self-sovereign identity) and Verifiable Credentials for the ARUCC MyCreds™ virtual wallet. This groundbreaking work means the Canadian MyCreds™ credential wallet along with other international members of the Digitary global network will be able to reach an even higher bar of service delivery for mobile learners, creating a triangle of trust that includes them and the Canadian colleges and universities.","ARUCC is pleased to announce a partnership between Digitary, its service partner, and Mattr, a friend of MyCreds™. These two international organizations are combining their talents to deliver SSI (self-sovereign identity) and Verifiable Credentials for the ARUCC MyCreds™ virtual wallet. This groundbreaking work means the Canadian MyCreds™ credential wallet along with other international members of the Digitary global network will be able to reach an even higher bar of service delivery for mobile learners, creating a triangle of trust that includes them and the Canadian colleges and universities.",https://mycreds.ca/2021/04/14/bridging-today-and-tomorrow-ensuring-self-sovereignty-for-learners-through-aruccs-mycreds/,,Post,,Meta,,,,,,,,2021-04-14,,,,,,,,,,,,,
|
||
Mattr,DHS,,,,,,,,,DHS Awards $200K for Issuing and Validating Essential Work and Task Licenses,"Mattr is currently building an extensive set of foundational capabilities in a software-as-a-service (SaaS) platform for verifiable credential issuance, verification, and storage. An essential worker or a person performing an essential task would receive various credentials and attestations from many issuers containing relevant assertions about their essential work or task status. Their solution also offers the option to validate the information further by using either public or private registries of authoritative verifiable information.","FOR IMMEDIATE RELEASE S&T Public Affairs, 202-254-2385 E-mail: STMedia@hq.DHS.gov WASHINGTON – The Department of Homeland Security (DHS) Science and Technology Directorate (S&T) has awarded $200,000 to Mattr LIMITED, a woman-owned start-up based in Auckland, New Zealand, to develop a capability to digitally issue and validate essential work and task licenses for United States Citizenship and Immigration Services (USCIS). The Phase 1 award was made under S&T’s Silicon Valley Innovation Program (SVIP) re-release of its Preventing Forgery & Counterfeiting of Certificates and Licenses solicitation, which sought standards-based blockchain and distributed ledger technology (DLT) solutions to fulfill additional needs across DHS missions. The COVID-19 global pandemic has amplified the role of essential workers in ensuring the continuity of operations in emergency response, supply chain fulfillment, essential business, and other previously overlooked areas of interest―and the need for them to self-identify in the performance of their duties. In addition, the need for individuals to interact in-person with DHS to conduct official tasks, duties, and appointments while ensuring public health and safety still exists during this global pandemic. 
Mattr is currently building an extensive set of foundational capabilities in a software-as-a-service (SaaS) platform for verifiable credential issuance, verification, and storage. An essential worker or a person performing an essential task would receive various credentials and attestations from many issuers containing relevant assertions about their essential work or task status. Their solution also offers the option to validate the information further by using either public or private registries of authoritative verifiable information. “The ability for workers and individuals conducting essential tasks to assert their respective eligibilities in a manner that safeguards their individual privacy and civil liberties while ensuring public health is a critical need,” said Anil John, SVIP technical director. “Mattr’s platform brings the modular building blocks to address this need by its support for World Wide Web Consortium (W3C) verifiable credentials and decentralized identifier standards. They will adapt and enhance their platform by supporting privacy respecting, ledger independent selective disclosure of information, and integration with existing federated identity protocols to provide a complete solution.” Given the reality that certain areas of the economy will need to continue to operate in parallel for an extended period of time while effective counter-measures are being developed, the ability for workers and individuals conducting essential tasks to assert their respective eligibility in a manner that safeguards their individual privacy and civil liberties while ensuring public health is a critical need. About SVIP SVIP is one of S&T’s programs and tools to fund innovation and work with private sector partners to advance homeland security solutions. Companies participating in SVIP are eligible for up to $800,000 of non-dilutive funding over four phases to develop and adapt commercial technologies for homeland security use cases. 
For more information on current and future SVIP solicitations, visit https://www.DHS.gov/science-and-technology/svip or contact DHS-silicon-valley@hq.DHS.gov. For more information about S&T’s innovation programs and tools, visit https://www.DHS.gov/science-and-technology/business-opportunities. ###",https://www.dhs.gov/science-and-technology/news/2020/10/09/news-release-dhs-awards-200k-issuing-and-validating-essential-work-and-task-licenses,,Post,,Meta,,,,,,,,2020-10-09,,,,,,,,,,,,,
|
||
Mattr,Mattr,,,,IATA,,,,,Mattr has a series of Videos about their work with IATA,"This is a significant undertaking for both IATA and the other parties involved. As part of Mattr’s role in supporting this initiative, we developed a series of educational videos in partnership with IATA to explain the value and mechanics of a decentralised identity ecosystem.","Developing the future of digital identity in aviation Mattr & IATA collaboration At Mattr, we’re exploring how different sectors can take advantage of verifiable data and digital trust. It’s this mission that led us to our recent partnership with the International Air Transport Association (IATA) on their Digital Identity for Distribution initiative. Mattr is working with IATA to demonstrate how digital identity can be used to enable the secure identification and authentication of organisations involved in the travel distribution chain to improve security and reduce the level of fraud risk for both airlines and travel intermediaries. How IATA supports the aviation industry IATA is the trade association for the airline industry, representing 290 airlines around the world. One of IATA’s roles is to support airlines, travel agents, industry bodies and other parties to operate more effectively by managing a set of commonly recognised codes and identities which help to create an efficient distribution ecosystem for all players. Transforming identity management in the airline industry Current identification standards in airline distribution are based on technology concepts developed in the 1960s and 1970s. They have served the industry exceptionally well and continue to be used extensively today. However, the evolution of technology over the past 50 years has meant a transition from closed, private networks to open, public infrastructure and a need to review the legacy identification standards. 
Using solutions such as Mattr VII, which is built on open standards including W3C Decentralized Identifiers and Verifiable Credentials, can help enable industry bodies like IATA to solve identity challenges in a complex ecosystem. At the same time, this also ensures that members of such ecosystems are always in control of their own identity and can reliably trust the information shared between organisations. As aviation starts to place customers at the focus of a shop/order/pay ecosystem regardless of distribution channel, IATA has identified some key issues: - Airlines can't fully identify all parties in the distribution value chain - Current industry coding systems can't scale to cover all parties - Current codes do not provide end-to-end security and offer loopholes for fraudulent use and impersonation of identities. To address these issues, the Digital Transformation Advisory Council (DTAC) endorsed a digital strategy with B2B identity management as the highest priority. The DTAC is comprised of senior digital transformation representatives from airlines and advises IATA on industry digital transformation. This is a significant undertaking for both IATA and the other parties involved. As part of Mattr’s role in supporting this initiative, we developed a series of educational videos in partnership with IATA to explain the value and mechanics of a decentralised identity ecosystem. See our video series below. How this benefits the aviation sector This digital identity approach will give all parties in the aviation sector the ability to quickly verify who they’re doing business with, reduce fraud and provide end-to-end security in the transaction process. Sellers, such as Travel Agencies, will be in full control of their own identities and information and will only have to disclose (to the airline or supplier) the relevant information required to request a tailor-made offer and complete a transaction. 
Find out more about this initiative To learn more about Digital Identity for Distribution, visit the IATA Innovation Hub. Watch our video series The videos below are a series of short explainers and demonstrations of what a digital identity solution could look like in the airline industry: Get in touch with us Our technologies give you powerful ways to build trust and prove things about people. If you’ve got a use case and want to see what it might look like, we’d love to talk to you about it.",https://Mattr.global/solutions/iata/,,Post,,Meta,,,,,,,,2022-01-01,,,,,,,,,,,,,
|
||
Mattr,Mattr,,,Nader Helmy,,,,,,Why we’re launching Mattr VII,"Inspired by the seven states of matter, our platform gives builders and developers all the tools they need at their fingertips to create a whole new universe of decentralized products and applications. We provide all the raw technical building blocks to allow you to create exactly what you have in mind. Mattr VII is composable and configurable to fit your needs, whether you’re a well-established business with legacy systems or a start-up looking to build the next best thing in digital privacy. Best of all, Mattr VII is use-case-agnostic, meaning we’ve baked minimal dependencies into our products so you can use them the way that makes the most sense for you.","Why we’re launching Mattr VII Nader Helmy • Mar 26, 2021 • 6 min read It’s no secret we need a better web. The original vision of an open and decentralised network that’s universally accessible continues to be a north star for those working to design the future of digital infrastructure for everyday people. Despite the progress that has been made in democratising access to massive amounts of information, the dire state of cybersecurity and privacy on the internet today presents significant barriers to access for too many of our most vulnerable populations. We started Mattr because we believe that standards, transparency, and openness are not only better for users; they make for stronger systems and more resilient networks. We recognise that a decentralised web of digital trust, based on transparency, consent, and verifiable data, can help us address critical challenges on a global scale. It represents a significant opportunity to give people real agency and control over their digital lives. Our story At its inception, we chose “Mattr” as a moniker because we strongly believed that the movement towards more decentralised systems will fundamentally change the nature of data and privacy on the internet. 
Matter, in its varying states, forms the building blocks of the universe, symbolically representing the capacity for change and transformation that allows us all to grow and adapt. In another sense, people matter, and the impact of decisions we make as builders of technology extends beyond ourselves. It’s a responsibility we take seriously, as Tim Berners-Lee puts it, “to preserve new frontiers for the common good.” We proudly bear the name Mattr and the potential it represents as we’ve built out our little universe of products. In September 2020, we introduced our decentralised identity platform. Our goal was to deliver standards-based digital trust to developers in a scalable manner. We designed our platform with a modular security architecture to enable our tools to work across many different contexts. By investing deeply in open standards and open source communities as well as developing insights through collaboration and research, we realised that developers want to use something that’s convenient without compromising on flexibility, choice, or security. That’s why we launched our platform with standards-based cryptography and configurable building blocks to suit a broad array of use cases and user experiences in a way that can evolve as technology matures. At the same time, we’ve continued to work in open source and open standards communities with greater commitment than ever to make sure we’re helping to build a digital ecosystem that can support global scale. We launched Mattr Learn and Mattr Resources as hubs for those interested in these new technologies, developing educational content to explore concepts around decentralised identity, offering guided developer tutorials and videos, and providing documentation and API references. 
We also unveiled a new website, introduced a novel approach to selective disclosure of verifiable credentials, built and defined a new secure messaging standard, developed a prototype for paper-based credentials to cater for low-tech environments, and made a bridge to extend OpenID Connect with verifiable credentials. We’ve consistently released tools and added features to make our products more secure, extensible, and easy to use. In parallel, we also joined the U.S. Department of Homeland Security’s SVIP program in October to help advance the goals of decentralised identity and demonstrate provable interoperability with other vendors in a transparent and globally-visible manner. Zooming out a bit, our journey at Mattr is part of a much larger picture of passionate people working in collaborative networks across the world to make this happen. The bigger picture It has been an incredible year for decentralised and self-sovereign identity as a whole. In light of the global-scale disruption of COVID-19, the demand for more secure digital systems became even more critical to our everyday lives. Start-ups, corporations, governments, and standards organisations alike have been heavily investing in building technology and infrastructure to support an increasingly digital world. We’re seeing this innovation happen across the globe, from the work being done by the DHS Silicon Valley Innovation Program to the Pan-Canadian Trust Framework and New Zealand Digital Identity Trust Framework. Many global leaders are stepping up to support and invest in more privacy-preserving digital security, and for good reason. Recent legislation like GDPR and CCPA have made the role of big tech companies and user data rights increasingly important, providing a clear mandate for a wave of change that promises to strengthen the internet for the public good. 
This provides an incredible catalyst for all the work happening in areas such as cryptography, decentralised computing and digital governance. Just in the last year, we’ve seen the following advancements: - Secure Data Storage WG created at DIF and W3C to realise an interoperable technology for encrypted and confidential data storage - Decentralised Identifiers v1.0 specification reached “Candidate Recommendation” stage at the W3C, establishing stability in anticipation of standardisation later this year - Sidetree protocol v1.0 released at DIF, providing a layer-2 blockchain solution for scalable Decentralised Identifiers built on top of ledgers such as Bitcoin and Ethereum - DIDComm Messaging v2.0 specification launched at DIF, a new protocol for secure messaging based on Decentralised Identifiers and built on JOSE encryption standards - Self-Issued OpenID (SIOP) became an official working group item at the OpenID Foundation, advancing the conversation around the role of identity providers on the web - Google’s WebID project started developing features to allow the browser to mediate interactions between end-users and identity providers in a privacy-preserving way For more information on how all of these technologies are interconnected, read our latest paper, The State of Identity on the Web. In addition, as part of our involvement with the DHS SVIP program, in March of this year we participated in the DHS SVIP Interoperability Plugfest. This event saw 8 different companies, representing both human-centric identity credentials as well as asset-centric supply chain traceability credentials, come together to showcase standards-compliance and genuine cross-vendor and cross-platform interoperability via testing and live demonstrations. The full presentation, including demos and videos from the public showcase day, can be found here. These are just a handful of the significant accomplishments achieved over the last year. 
It’s been incredibly inspiring to see so many people working towards a common set of goals for the betterment of the web. As we’ve built our products and developed alongside the broader market, we’ve learned quite a bit about how to solve some of the core business and technical challenges associated with this new digital infrastructure. We’ve also gained a lot of insight from working directly with governments and companies across the globe to demonstrate interoperability and build bridges across different technology ecosystems. Launching Mattr VII We’ve been hard at work making our decentralised identity platform better than ever, and we’re proud to announce that as of today, we’re ready to support solutions that solve real-world problems for your users, in production — and it’s open to everybody. That’s why we’re rebranding our platform to Mattr VII. Inspired by the seven states of matter, our platform gives builders and developers all the tools they need at their fingertips to create a whole new universe of decentralised products and applications. We provide all the raw technical building blocks to allow you to create exactly what you have in mind. Mattr VII is composable and configurable to fit your needs, whether you’re a well-established business with legacy systems or a start-up looking to build the next best thing in digital privacy. Best of all, Mattr VII is use-case-agnostic, meaning we’ve baked minimal dependencies into our products so you can use them the way that makes the most sense for you. Starting today, we’re opening our platform for general availability. Specifically, that means if you’re ready to build a solution to solve a real-world problem for your users, we’re ready to support you. Simply contact us to get the ball rolling and to have your production environment set up. Of course, if you’re not quite ready for that, you can still test drive the platform by signing up for a free trial of Mattr VII to get started right away. 
It’s an exciting time in the Mattr universe, and we’re just getting started. We’re continuing to build out features to operationalise and support production use cases. To this end, in the near future we will be enhancing the sign-up and onboarding experience as well as providing tools to monitor your usage of the platform. Please reach out to give us feedback on how we can improve our products to support the solutions you’re building. We’re excited to be in this new phase of our journey with Mattr VII. It will no doubt be another big year for decentralised identity, bringing us closer to the ultimate goal of bringing cryptography and digital trust to every person on the web. -- This blog was originally posted on Medium.",https://Mattr.global/resources/articles/why-launch-Mattr-vii/,,Post,,Meta,,,,,,Mattr VII,,2021-03-26,,,,,,,,,,,,,
|
||
Mattr,Mattr,,Medium,,,,,,,Adding DID ION to Mattr VII,"Different types of DIDs can be registered and anchored using unique rules specific to the set of infrastructure where they’re stored. Since DIDs provide provenance for keys which are controlled by DID owners, the rules and systems that govern each kind of DID method have a significant impact on the trust and maintenance model for these identifiers.","Adding DID ION to Mattr VII Since the beginning of our journey here at Mattr, decentralization and digital identity have been central to our approach to building products. As part of this, we’ve supported Decentralized Identifiers (or DIDs) since the earliest launch of our platform. We’ve also considered how we might give you more options to expand the utility of these identities over time. An important milestone The W3C working group responsible for Decentralized Identifiers recently published the DID v1.0 specification under “Proposed Recommendation” status. This is a significant milestone as DIDs approach global standardization with the pending approval of the W3C Advisory Committee. DIDs are maturing, but so is the environment and context in which they were originally designed. With a complex ecosystem consisting of dozens of different methodologies and new ones emerging on a regular basis, it’s important to balance the potential of this decentralized approach with a realistic approach for defining the real utility and value of each DID method. For example, the DID Method Rubric provides a good frame of reference for comparing different approaches. Different types of DIDs can be registered and anchored using unique rules specific to the set of infrastructure where they’re stored. Since DIDs provide provenance for keys which are controlled by DID owners, the rules and systems that govern each kind of DID method have a significant impact on the trust and maintenance model for these identifiers. 
This is the key thing to remember when choosing a DID method that makes sense for your needs. Our supported DID methods In Mattr VII, by supporting a variety of DID methods — deterministic or key-based DIDs, domain-based DIDs, and ledger-based DIDs — we are able to provide tools which can be customized to fit the needs of individual people and organizations. - Key-based DIDs — Largely static, easy to create, and locally controlled. This makes them a natural choice for applications where there’s a need to manage connections and interactions with users directly. - DIDs anchored to web domains — These have a different trust model, where control over the domain can bootstrap a connection to a DID. This makes a lot of sense for organizations with existing domain names that already transact and do business online, and can extend their brand and reputation to the domain of DIDs. - Ledger-based DIDs — These offer a distributed system of public key infrastructure which is not centrally managed or controlled by a single party. While ledgers differ in their governance and consensus models, they ultimately provide a backbone for anchoring digital addresses in a way which allows them to be discovered and used by other parties. This can be a useful feature where a persistent identifier is needed, such as in online communication and collaboration. There is no single DID method or type of DID which (at the moment) should be universally applied to every situation. However, by using the strengths of each approach we can allow for a diverse ecosystem of digital identifiers enabling connections between complex networks of people, organizations and machines. To date, we’ve provided support for three main DID methods in our platform: DID Key, DID Web, and DID Sovrin. These align with three of the central types of infrastructure outlined above. Introducing DID ION We’re proud to announce that as of today we’ve added support for DID ION, a DID method which is anchored to IPFS and Bitcoin. 
We’ve supported the development of the Sidetree protocol that underpins DID ION for some time as it has matured in collaboration with working group members at the Decentralized Identity Foundation. With contributions from organizations such as Microsoft, Transmute, and SecureKey, Sidetree and DID ION have emerged as a scalable and enterprise-ready solution for anchoring DIDs. The core idea behind the Sidetree protocol is to create decentralized identifiers that can run on any distributed ledger system. DID ION is an implementation of that protocol which backs onto the Bitcoin blockchain, one of the largest and most used public ledger networks in the world. Sidetree possesses some unique advantages not readily present in other DID methods, such as low cost, high throughput, and built-in portability of the identifier. This provides a number of benefits to people and organizations, especially in supporting a large volume of different kinds of connections with the ability to manage and rotate keys as needed. We have added end-to-end capabilities for creating and resolving DIDs on the ION network across our platform and wallet products. Although DID ION is just one implementation of the Sidetree protocol, we see promise in other DID methods using Sidetree and will consider adding support for these over time as and when it makes sense. We’ll also continue to develop Sidetree in collaboration with the global standards community to ensure that this protocol and the ION Network have sustainable futures for a long time to come. At the same time, the community around DID Sovrin is developing a new kind of interoperability by designing a DID method that can work for vast networks of Indy ledgers, rather than focusing on the Sovrin-specific method that’s been used to date. As DID Sovrin gets phased out of adoption, we’re simultaneously deprecating standard support for DID Sovrin within Mattr VII. 
We’ll be phasing this out shortly with upcoming announcements for customers building on our existing platform. If you’ve got any use cases that utilize DID Sovrin or want to discuss extensibility options, please reach out to us on any of our social channels or at info@Mattr.global and we’ll be happy to work with you. Looking ahead We believe this is a big step forward in providing a better set of choices when it comes to digital identity for our customers. From the start, we have designed our platform with flexibility and extensibility in mind, and will continue to support different DID methods as the market evolves. We look forward to seeing how these new tools can be used to solve problems in the real world and will keep working to identify better ways to encourage responsible use of digital identity on the web.",https://medium.com/Mattr-global/adding-did-ion-to-Mattr-vii-d56bdb7a2fde,,Post,,Product,,,,,,,DID:ION,2021-09-17,,,,,,,,,,,,,
|
||
Mattr,Mattr,,,,,,,,,Adding support for revocation of Verifiable Credentials leveraging the Revocation List 2020 draft from the CCG.,"Integrating revocation into our platform brings us one step closer to building a fully realized verifiable data ecosystem, an environment where verifiers can have more confidence and trust in the decisions they’re making and people can participate in the sharing and exchange of information without eroding their basic privacy. We look forward to continuing to collaborate with the community and gathering feedback from industry to enhance and extend different ways to accomplish revocation while respecting users’ digital rights.","Adding support for revocation of Verifiable Credentials The Mattr team is excited to announce a critical new addition to our product capabilities. We’re continuing to build out an extensive suite of features to support the exchange of Verifiable Credentials (VCs), leveraging the best efforts of the open-source community along with a number of distinct product innovations. These innovations include our recent work related to using BBS+ signatures for privacy-preserving selective disclosure and our earlier work on the OIDC Credential Provider spec. We’ve also been busy sharing new tools for checking the syntax of a JSON-LD credential during development. In this product release, we are focused on one of the fundamental capabilities in any credential-based system: the ability to provably revoke a credential when it’s no longer valid. Using verifiable data in combination with open standards not only improves the quality of the data exchanged in an ecosystem, it also enables the authority (issuer) on any piece of information to maintain that data throughout its lifecycle. In practice, this means that credential issuers can manage the status of a credential directly, using the same general mechanism as the one used for issuing VCs. 
Credentials are typically stored by the user (subject) in some kind of digital wallet where they are able to manage their credentials and control when and how to share their data. When accessing services, the user may consent to present their VCs to a relying party (verifier). The relying party needs to be able to verify the credential is genuine and tamper free. They also need to be able to easily validate whether a presented credential has been revoked or not. For example, say you’re issued a digital driver’s license, then you go and get several speeding tickets. The department that issued your license determines you have breached the terms of your license and consequently, suspends your license. In doing so, the credential status is changed, and next time a relying party checks the status they will be able to see that you are no longer entitled to drive. If a car rental office is checking your driving status in order to loan you a vehicle, they’d like to be able to verify if the digital license is still valid and legitimate. User journey viewing a credential & presentation request with a revoked credential. In general, we want to accomplish all of these goals while minimizing the burden on data issuers and verifiers, and preserving the autonomy of credential holders in deciding how to store and disclose their data. We also want to remain flexible around where the revocation list is stored and managed, so we opted to implement an approach that’s extensible to different types of infrastructure supported by the issuer. The resulting solution contrasts with others that have tended to be tightly coupled with a particular kind of infrastructure such as a distributed ledger. We believe revocation should be built in a simple, transparent, and standardizable manner, which is why we built our approach on the W3C CCG’s Credential Revocation List. 
Practically, the information regarding whether a credential has been revoked is represented in the credential status property of a VC, as defined in the W3C spec. When a credential is first created, the issuer can include in the credential status field a reference to a publicly available revocation list. This list can be updated by the issuer at any time, so resolving this reference will always get you the latest revocation status. The retrieved revocation list is in the form of a JSON-LD based verifiable credential, making it straightforward to validate. Holders and verifiers can simply check this revocation list to determine if a specific VC has been revoked or not. Since the revocation list contains the revocation status of many credentials, credential subjects get “herd privacy” with regards to the issuer, meaning the issuer doesn’t know where an issued credential is being used. This protects the holder against one form of surveillance and potential leakage of their personal data. Some use cases will require solutions that offer an even greater level of privacy for credential subjects, such as cryptographic accumulators. We are actively working on a number of improvements in this area which will offer enhanced privacy for credential holders when it’s required and simplify the process of relying parties validating a credential’s status. We’ve implemented this functionality using a simple optional flag on our VC issuance API, allowing issuers to toggle between credentials that support revocation and those that don’t. On the holder side, our Mobile Wallet App automatically checks the revocation status when opened, alerting the user of any changes and giving them a warning if it’s revoked. For verifiers, validating credential status is a standard part of our VC verification API, so validation will automatically fail if the credential is revoked. 
Integrating revocation into our platform brings us one step closer to building a fully realized verifiable data ecosystem, an environment where verifiers can have more confidence and trust in the decisions they’re making and people can participate in the sharing and exchange of information without eroding their basic privacy. We look forward to continuing to collaborate with the community and gathering feedback from industry to enhance and extend different ways to accomplish revocation while respecting users’ digital rights. Sign up on our website to get access to our platform and check out Mattr Learn to start issuing revocable credentials.",https://Mattr.global/resources/articles/adding-support-for-revocation-of-verifiable-credentials/,,Post,,Product,,,Revocation,,,BBS+,"OIDC,Verifiable Credentials",2020-10-21,,,,,,,,,,,,,
|
||
Mattr,Mattr,,Medium,,,,,,,Adding support for Secure DID Messaging,"We are excited to announce a new addition to our Mattr VII platform capabilities. As we continue to build out an extensive suite of features to support the exchange of data such as Verifiable Credentials, we have now added secure Decentralized Identifier messaging capabilities to enable entirely new ways to communicate using our platform.","Adding support for Secure DID Messaging We are excited to announce a new addition to our Mattr VII platform capabilities. As we continue to build out an extensive suite of features to support the exchange of data such as Verifiable Credentials, we have now added secure Decentralized Identifier (DID) messaging capabilities to enable entirely new ways to communicate using our platform. The common and well-understood ways to interact with verifiable credentials have typically been mechanisms such as scanning QR codes or sharing deep links. In this release, we have focused on adding an option to these approaches that provides an even greater level of transparency and user control. With this new capability, you will be able to facilitate more intuitive user flows that make issuing, verifying, and communicating around verifiable credentials a seamless and efficient process for users. While utilizing existing frameworks like push notifications, secure DID messaging maintains a high level of privacy-preserving security. It does this by leveraging a decentralized ecosystem that ensures control of the data in a message remains solely with the participants exchanging the information, and no one else. Utilizing the JSON Web Message (JWM) specification, this new capability allows for encrypted messages to be sent and received in a way that hides their content from all but the cryptographically authorized recipients of the message. That way, the sender of the message can be confident they are only disclosing their details and message with the intended parties. 
There are two key pieces of information about a recipient that are fundamental to facilitating secure DID-based messaging on the platform. The first of these is the same as any messaging framework, in that an address or endpoint is needed to understand where to send the message. Unlike traditional messaging capabilities that require you to utilize centralized identifiers created and controlled by service providers, such as email addresses and phone numbers, our DID-based messaging solution allows you to facilitate interactions between parties simply by using a Decentralized Identifier and its associated DID Document. The second piece of information you need is the recipient’s public key, which the platform obtains from the resolved DID document. This public key is then used to encrypt a message to the recipient. These capabilities ensure that: - a message is delivered to the correct recipient, and - only the intended recipient can view the content and other metadata of the message. A unique DID is also created by the wallet specifically for each unique interaction with a particular party or organization — further preserving a wallet holder’s privacy and anonymity between the various interactions they may have with issuers and relying parties alike. Once the recipient’s DID is known, a message is formatted as a JSON Web Message (JWM). In this release, we have focused on adding support for 3 main types of messages: - Offering a credential — rather than the user having to scan a QR code, a message can be sent directly to them that will initiate the credential offer flow within their wallet. - Notification about a change in the revocation status of a credential — a mechanism to ensure wallet holders are proactively informed about any status changes for credentials they hold in their wallet, even if they’re not online when the status update occurs. 
- Starting a credential verification flow (presentation request) — allows a holder to present a credential to a verifier directly, particularly useful in situations where there isn’t a co-location of parties present in the interaction. To take advantage of this capability as a wallet user, all you need to do is opt into receiving messages by enabling notifications in your wallet. At this point, a private cloud-based inbox will be created to store messages that are sent to your device. Messages remain encrypted in the inbox and are only persisted in storage while they are in an unread state. Once the user views the message and its payload is extracted into their wallet, the encrypted message is removed from the inbox and only the payload itself is retained in the user’s wallet. To give an example, let’s say you call up your bank and they are required to sight a form of government-issued identification in order to revalidate their anti-money laundering (AML) records. Given the bank already has an established relationship with you from your prior communications, they will likely have your DID stored in their system. Using this information, they push a message containing a presentation request to you. This request is then processed by the wallet on your phone and triggers a push notification to make you aware of the update. Upon receiving the push notification containing the message, the wallet app will decrypt and extract the payload. In this instance, it contains a presentation request that asks you to select the credential to share back to the bank as proof of your identity. Once you have shared the credential presentation to match their request, the bank verifies that it satisfies their requirements giving them assurance that they are talking to the correct person and that they have met their audit requirements from an AML perspective. This allows them to continue the call with you. 
Bringing secure messaging into our platform helps facilitate a more intuitive user experience with mobile wallets and helps simplify some of the otherwise complex issuance and verification flows that might otherwise involve scanning multiple QR codes or accessing different embedded links. We look forward to continuing our collaboration with the community on the underlying messaging and security standards as well as gathering feedback from our customers and the broader ecosystem around ways to enhance and extend the different types of messages, along with ways to digest them into different kinds of interactions and experiences. Sign up on our website today to get access to Mattr VII and check out Mattr Learn for a detailed walkthrough to start utilizing messaging as part of a verifiable credential flow.",https://medium.com/Mattr-global/adding-support-for-secure-did-messaging-befb75a72feb,,Post,,Product,,,,,,,DID,2021-05-06,,,,,,,,,,,,,
|
||
Mattr,Mattr,,,,,,,,,DID Extensibility on the Mattr Platform,"DID Web helps to bridge the gap between the way that trust is established on the internet today, namely using domains, and new and emerging ecosystems using DIDs. When using DID Web, rather than anchoring a DID to a decentralized ledger such as a blockchain, the DID is instead associated with a specific domain name, and subsequently anchored to the web host registered with that domain via DNS resolution. Effectively, this allows a DID using this scheme to be resolved as simply as one resolves a web URL, every time they click on a link. For example, we’ve set up a DID Web using our own domain, which can be resolved at did:web:Mattr.global.","DID Extensibility on the Mattr Platform At Mattr we’ve been busy building the next generation of solutions for verifiable data and digital trust. Earlier this month we introduced our platform and added experimental support for a new, privacy-preserving method for selective data disclosure. Today, we’ve reached another milestone that gives our users even more choice and transparency by the addition of a new way to use Decentralized Identifiers (DIDs). Modularity and extensibility are key design principles underpinning our core platform architecture. The Mattr Platform is designed to support a wide range of distinct pluggable components, providing our customers with confidence that their technology choices will continue to evolve with new innovations as they emerge. When it comes to DIDs, there are currently over 50+ DID Methods registered with the W3C. Each DID Method defines a CRUD model that describes how a particular DID scheme works with a specific verifiable data registry such as a distributed ledger or blockchain. The initial group of DID methods was quite small, and has expanded significantly over time as more solutions emerge in this space. 
While all of these new DID methods theoretically adhere to the DID core specification, each method makes a different set of choices that affect the underlying trust model at play. For instance, DID methods have distinct rules about who gets to add new transactions, what input data is required, where DIDs are anchored, who can view or monitor the DIDs, and more. In short, there are many factors that affect the choice around which DID method to use, and it’s not a trivial decision. We believe that DIDs, when deployed responsibly, can be extremely effective at preserving user privacy, enhancing transparency and consent, enabling data portability, and enforcing user control. To learn more about our approach, read our blog, “Intro to DIDs for people”. In addition to our current support for DID Key (static key-based identifier) and DID Sovrin (ledger-based identifier), we are now proud to add DID Web (domain-based identifier) to our list of supported DID methods. DID Web helps to bridge the gap between the way that trust is established on the internet today, namely using domains, and new and emerging ecosystems using DIDs. When using DID Web, rather than anchoring a DID to a decentralized ledger such as a blockchain, the DID is instead associated with a specific domain name, and subsequently anchored to the web host registered with that domain via DNS resolution. Effectively, this allows a DID using this scheme to be resolved as simply as one resolves a web URL, every time they click on a link. For example, we’ve set up a DID Web using our own domain, which can be resolved at did:web:Mattr.global. Users in the emerging world of DIDs can use this mechanism to bootstrap trust by using the reputation associated with public domains. 
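The resolution rule described above can be sketched in a few lines. This follows the DID-to-URL mapping in the W3C did:web method specification; the helper name is illustrative and not part of the Mattr platform:

```javascript
// Map a did:web identifier to the URL of its DID document,
// per the W3C did:web method specification (illustrative sketch).
function didWebToUrl(did) {
  const prefix = 'did:web:';
  if (!did.startsWith(prefix)) throw new Error('not a did:web identifier');
  const parts = did.slice(prefix.length).split(':');
  const host = parts[0].replace('%3A', ':'); // %3A encodes an optional port
  return parts.length === 1
    ? `https://${host}/.well-known/did.json` // bare domain
    : `https://${[host, ...parts.slice(1)].join('/')}/did.json`; // with path
}
```

Under this mapping, did:web:Mattr.global is fetched from the .well-known path of the Mattr.global domain over HTTPS.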
While this solution may not work in every circumstance and lacks some of the resilience and censorship guarantees afforded by DID methods with less centralized dependencies, DID Web provides a practical and useful pathway to adoption, particularly for entities whose data and identity are already public. When used in parallel with more natively decentralized mechanisms, we can help to ensure that the web remains free and open while providing a path for legacy systems to remain interoperable with the emerging distributed web of trust. By adding support for new DID methods such as DID Web, we are creating optionality and choice for our users. Our products will always be ledger-agnostic. We will also continue to offer support for DIDs which are not anchored to any ledger. We aim to bridge the gap between approaches that are built on top of ledgers and those using domains, key registries, and other methods to establish trust. We are also actively investigating how to add support for more scalable solutions that use ledgers on an ad-hoc basis, such as DID methods based on the layer two Sidetree Protocol. This open-source protocol provides an abstract layer that sits on top of DLT infrastructure. The Platform Drivers part of our architecture provides DID Method support in the form of pluggable integrations that prevent vendor lock-in and enable user choice. To find out more about how the Mattr Platform supports a broad spectrum of DID methods, check out our documentation on Mattr Learn and sign up to get started.",https://Mattr.global/resources/articles/did-extensibility-on-the-Mattr-platform/,,Post,,Product,,,,,,DNS,DID:WEB,2020-10-07,,,,,,,,,,,,,
|
||
Mattr,Mattr,,,,,,,,,Introducing the Mattr Platform,"Here at Mattr, we have been hard at work building a suite of products to serve the next generation of digital trust. We’ve designed our products based on a few key principles: extensible data formats, secure authentication protocols, a rigorous semantic data model, industry-standard cryptography, and the use of drivers and extensions to allow modular and configurable use of the platform over time. By combining our core capabilities with extensions and drivers, our platform offers developers convenience without compromising flexibility or choice.","Introducing the Mattr Platform Here at Mattr, we have been hard at work building a suite of products to serve the next generation of digital trust. We’ve designed our products based on a few key principles: extensible data formats, secure authentication protocols, a rigorous semantic data model, industry-standard cryptography, and the use of drivers and extensions to allow modular and configurable use of the platform over time. By combining our core capabilities with extensions and drivers, our platform offers developers convenience without compromising flexibility or choice. The Mattr Platform delivers digital trust in a scalable manner. Our belief is that a modular security architecture is one which can work across many different contexts. When it comes to trust, context is everything, and we know our users each have their own unique requirements and expectations when it comes to their digital interactions. We provide flexible and configurable building blocks for trust on the web in order to create a digital ecosystem that can support global scale. The platform consists of 3 main components: - Platform Core - Platform Extensions - Platform Drivers Our platform provides the capabilities needed for digital trust through a set of modular and flexible building blocks known as our Platform Core. 
This includes the ability to establish and use DIDs, sign and encrypt messages, manage the verifiable credentials lifecycle, and share privacy-preserving verifiable presentations. Platform Core is designed as a set of simple APIs that are available for all of our users, with operational tools and documentation. We’ve designed the platform to have cryptographic agility and flexibility built in at a fundamental level. Platform Drivers are pre-configured integrations that allow our capabilities to be pluggable and extensible over time, preventing vendor lock-in and enabling user choice. They identify key areas where flexibility, choice, and optionality are desirable and surface them to the user to promote more resilient security architectures for the future. They are typically surfaced to the user as pluggable parameters in our Platform Core. Extensibility is a key component of our platform architecture. Platform Extensions are higher level capabilities that plug in to our platform, providing convenient and easy-to-access application logic, such as service orchestration and workflow. They are built on top of our Platform Core, allowing users to smoothly onboard and extend our platform as well as enabling Mattr’s digital trust infrastructure to integrate with digital services and protocols that exist outside of our products. They are modular components in terms of logic and configuration, operating independently of Platform Core as an extensible set of APIs. Finally, we offer a growing number of Developer Tools to simplify the user experience by providing additional interfaces and ways to interact with our platform. These tools are free and mostly optional to use, though they do simplify setting up the infrastructure needed to get started experimenting with the platform. Some tools, like some of the features exposed by Mattr’s Mobile Wallet, may be required to use certain features of the platform. 
Our Developer Tools are designed to work natively with Platform Core as well as our Platform Extensions. Over the past 6 months, we have been working in close collaboration with a number of preview customers to create a great developer experience and identify features that are important for a wide variety of use cases. We’ve been working with partners from industry and government to make sure we’ve built a solution for the problems that matter to you. Check out Mattr Learn to find out more about our platform, view our API documentation, and follow our tutorials to start using the platform today.",https://Mattr.global/resources/articles/introducing-the-Mattr-platform/,,Post,,Product,,,,,Mattr Platform,,,2020-09-17,,,,,,,,,,,,,
|
||
Mattr,Mattr,,Medium,,,,,,,Rendering credentials in a human-friendly way,"For example, this update formats address fields to make them more readable; formats names and proper nouns where possible; makes URLs, telephone numbers and email addresses clickable; highlights images and icons for better trust and brand signaling; and creates basic rules for language localization that adjust to a user’s device settings.","Rendering credentials in a human-friendly way At Mattr we’re always dreaming up ways to make decentralized identity and verifiable credentials easy and intuitive to use for as many people as possible. From the start, it’s been a core part of our mission to make sure that end users understand the implications of decentralized identity and the control it affords them over their data and their privacy. This model offers users greater sovereignty over their own information by empowering individuals as both the holder and subject of information that pertains to them. Users are able to exercise their new role in this ecosystem by utilizing a new class of software known as digital wallets. We first released our Mobile Wallet for smartphones in June 2020, with a simple user interface to allow people to interact with and receive credentials from issuers as well as present credentials to relying parties. 
In the interim, we have developed a number of improvements and features to the Mobile Wallet to support advanced capabilities such as: - Authenticating to Identity Providers over OpenID Connect to receive credentials via OIDC Bridge - Deriving privacy-preserving selective disclosure presentations from credentials using BBS+ signatures - Establishing a secure DID messaging inbox for users to receive encrypted messages and push notifications These changes have not only made the wallet more functional; they’ve also evolved to better protect users’ best interests — giving them privacy-by-design and surfacing the information and context that they need to confidently make decisions underpinned by the security of decentralized identity. This journey has led us to realize the importance of creating a wallet experience that places users front and center. As these systems create more opportunity for user-driven consent and identity management, they’ve simultaneously created a demand for a wallet that can not only perform the technical operations required, but do so in a user-friendly way that surfaces the information that truly matters to people. Our latest feature release to the Mattr Mobile Wallet is a step in this direction. With Human-Friendly Credentials, we have added the capability to render different kinds of credentials uniquely in the wallet interface according to the information they contain. Until now, the end user experience for verifiable credentials has been largely consistent across different categories of credentials and issuers. In other words, a credential containing medical data from your doctor looks exactly the same as an education credential from your university or a concert ticket from a music venue: they all appear to the user as a long list of claims. In this release we change that. 
Thanks to the semantic information encoded in verifiable credentials, the wallet is now able to understand and interpret certain kinds of credentials to render them to the user in a way that makes the data easier to understand. JSON-LD verifiable credentials have the ability to support common data vocabularies and schemas which are published on the web. For example, if a credential contains a claim describing the name of an individual, the claim can be defined via reference to an existing data vocabulary found here: https://schema.org/Person Human-Friendly Credentials allow the wallet to start intelligently displaying known credential types and data types. This shows up in a variety of different ways in a user’s dataset. For example, this update formats address fields to make them more readable; formats names and proper nouns where possible; makes URLs, telephone numbers and email addresses clickable; highlights images and icons for better trust and brand signaling; and creates basic rules for language localization that adjust to a user’s device settings. This logic allows the wallet to create a kind of information hierarchy that starts to draw out the important aspects of data included in a credential, so users can make trust-based decisions using this information. These rules are applied to any kind of credential the wallet encounters. Whilst all of this is incredibly helpful for users, we have gone a step further: displaying entire credentials in a completely unique UI according to their type. The ‘type’ property of a credential expresses what kind of information is in the credential — is it a degree, a medical record, a utility bill? The usage of common credential types across different implementations and ecosystems is necessary for semantic interoperability on the broader scale. The wallet should be able to recognize these common credential types and display them to the user in a friendly way. 
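As a rough sketch of how such type-based rendering can work (the renderer table, credential type name, and claim fields here are illustrative assumptions, not the wallet internals):

```javascript
// Fall back to a generic list of claims for unknown credential types.
function genericListOfClaims(credential) {
  return Object.entries(credential.credentialSubject)
    .map(([claim, value]) => `${claim}: ${value}`)
    .join('\n');
}

// Custom renderers keyed by credential type; a wallet could register one
// per known type. 'EducationCourse' is an illustrative type name.
const renderers = {
  EducationCourse: (c) => `Course: ${c.credentialSubject.courseName}`,
};

// Dispatch on the credential's `type` array: use a custom renderer for a
// known type, otherwise show the plain list of claims.
function renderCredential(credential) {
  const knownType = credential.type.find((t) => renderers[t]);
  return knownType
    ? renderers[knownType](credential)
    : genericListOfClaims(credential);
}
```

The design choice is simply a lookup table over the credential's declared types, which keeps the generic rendering path as a safe default.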
In this update, we have added custom rendering for both Personal Identity Credentials as well as Education Courses. These are common types we see occurring naturally across the decentralized identity landscape, and now the Mattr Mobile Wallet is able to recognize and display them properly. An important note to consider is that Human-Friendly Credentials only work for known data and credential types. As the ecosystem matures, we expect to add more data types and credential types in the future to support an even broader set of human-related processes and actions. In a general sense, we will also continue to iterate on our digital wallet offerings to provide a greater degree of flexibility and control for end users. We believe it’s a vital component to a healthy digital trust ecosystem. To check out these changes yourself, download the latest version of our Mobile Wallet to get started. For more information on Human-Friendly Credentials, check out our tutorials and video content on Mattr Learn. For everything else related to Mattr, visit https://Mattr.global or follow @MattrGlobal on Twitter.",https://medium.com/Mattr-global/rendering-credentials-in-a-human-friendly-way-e47f4a32fd4b,,Post,,Product,,,,,,,,2021-06-01,,,,,,,,,,,,,
|
||
Mattr,Mattr,,Medium,Nader Helmy,,,,,,Using privacy-preserving ZKP credentials on the Mattr Platform,"By leveraging pairing-friendly elliptic-curve cryptography in the context of Linked Data Proofs, our approach provides an unprecedented way to perform zero-knowledge proofs using the semantics of JSON-LD. This allows credential issuers to tap into vast data vocabularies that exist on the web today, such as schema.org and Google Knowledge Graph, making user data more context-rich without sacrificing security and privacy of the user in the process. Not only is this approach more interoperable with existing implementations of the VC data model and semantic web technologies, it also doesn’t rely on any external dependencies to operate (like a distributed ledger), meaning it’s far more efficient than other approaches based on CL-signatures and zk-SNARKs. We’ve open-sourced our LD-Proofs suite for VCs including performance benchmarks so you can check it out yourself.","Using privacy-preserving ZKP credentials on the Mattr Platform Mattr is proud to announce we’ve added support for privacy-preserving verifiable credentials on our platform using BBS+ signatures. Using a technique to implement selective disclosure, we’ve added the ability to generate credentials that support zero knowledge proofs without revealing any unnecessary information about the end-user, or placing any added burden on issuers, in the process. Since we first introduced and open-sourced JSON-LD BBS+ Signatures at IIW30 in April of this year, we’ve received lots of engagement, feedback and contributions from the broader technical community to further develop the implementations and specifications we presented. You can read more about our approach to privacy-preserving verifiable credentials on our introductory blog post. 
One of the benefits of using the BBS+ cryptographic scheme to sign credentials is the ability to derive a zero knowledge proof from the signature, where the party generating the proof can choose to partially disclose statements from the original message. When enabled, this feature allows issuers to create a credential that effectively enforces minimal data disclosure using the Mattr Platform and a compliant digital wallet. To support this functionality, we generate the keys required to support these signatures and create a Decentralized Identifier (DID) with the keys referenced in the DID Document. BBS+ signatures require what’s called a pairing-friendly curve; we use BLS12–381. This DID can be referenced in credentials to establish the issuer of the data, a common practice to allow a verifier or relying party to trace the root of trust in a credential. To issue a ZKP-enabled credential, simply use our API endpoint to create a new DID Key with type set to BLS 12–381. Then, create a Verifiable Credential (VC) using your new DID Key as the issuer DID. Our platform will automatically detect this capability is available in your DID and create a ZKP-enabled BBS+ credential for you. You can use the platform this way to create a privacy-enabled credential, or you can create a regular credential by providing a DID with a different key type — you have the option. On the user side, you can hold ZKP-enabled credentials in your wallet alongside all of your other credentials. We’ve designed this process in a way that minimizes friction to the user. In future updates, our Mobile Wallet App will be able to detect if BBS+ signatures are being used in a credential. When you get a request to verify some information contained in one of these privacy-enabled credentials, it will derive a new presentation that selectively discloses the required info using a zero-knowledge proof. 
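The two steps above might be assembled like this; the endpoint paths and body fields are assumptions made for illustration based on the description, not the verified Mattr API surface:

```javascript
// Build request payloads for the two-step flow described above:
// 1) create a DID backed by a BLS 12-381 key pair (needed for BBS+),
// 2) issue a credential with that DID as issuer.
// Paths and field names are illustrative assumptions, not the documented API.
function buildZkpIssuanceRequests(tenantSubdomain, issuerDid, claims) {
  return {
    createDid: {
      method: 'POST',
      url: `https://${tenantSubdomain}/v1/dids`,
      body: { method: 'key', options: { keyType: 'bls12381g2' } },
    },
    createCredential: {
      method: 'POST',
      url: `https://${tenantSubdomain}/v1/credentials`,
      body: { issuer: { id: issuerDid }, credentialSubject: claims },
    },
  };
}
```

A real client would send these with an authenticated HTTP POST and read the issuer DID for step two out of the response to step one.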
The platform will then allow verification of the proof using the same interface as any other type of presentation. Our integrated approach treats zero-knowledge proofs as an extension of VCs, rather than an entirely new framework with a separate set of dependencies. We have built BBS+ Signatures and privacy-enabled credentials into our platform for anybody to experiment with, in what we think is a significant milestone for standards-based credential solutions on the market today. As a technology, BBS+ digital signatures can be used to sign more than just verifiable credentials. Combining these technologies is quite effective, though they can also be treated as modular or separate components. We’ve open-sourced software for creating and verifying BBS+ signatures in browser environments as well as node.js, and we’ve also published a library for generating BLS 12–381 keypairs for signing and verifying BBS+ Signatures. By leveraging pairing-friendly elliptic-curve cryptography in the context of Linked Data Proofs, our approach provides an unprecedented way to perform zero-knowledge proofs using the semantics of JSON-LD. This allows credential issuers to tap into vast data vocabularies that exist on the web today, such as schema.org and Google Knowledge Graph, making user data more context-rich without sacrificing security and privacy of the user in the process. Not only is this approach more interoperable with existing implementations of the VC data model and semantic web technologies, it also doesn’t rely on any external dependencies to operate (like a distributed ledger), meaning it’s far more efficient than other approaches based on CL-signatures and zk-SNARKs. We’ve open-sourced our LD-Proofs suite for VCs including performance benchmarks so you can check it out yourself. We’re excited to finally make these powerful privacy features easily accessible for everyone, and we can’t wait to see what you build with it. 
To get started, sign up now on our website and follow our tutorials on Mattr Learn to start creating ZKP-enabled verifiable credentials on the Mattr Platform. Additional Links Open-source: - Node JS BBS+ Signatures — BBS+ signatures implementation for node.js environments - WASM JS BBS+ Signatures — BBS+ signatures implementation for browser & node.js environments - BLS 12–381 Key Pair JS — crypto keys for signing/verifying BBS+ signatures - BBS+ JSON-LD Signatures JS — uses BBS+ signatures & BLS 12–381 keypair in a Linked Data Proofs suite (for use in VC implementations) Specifications: - BBS+ JSON-LD Signatures Spec — specifies linked data suite for BBS+ signatures - BBS+ Signatures Spec — definition of BBS+ signatures scheme",https://medium.com/Mattr-global/using-privacy-preserving-zkp-credentials-on-the-Mattr-platform-4c9e351a2fc3,,Post,,Product,,,,,,ZKP,"JSON-LD,LinkedData",2020-09-17,,,,,,,,,,,,,
|
||
Mattr,Personal,,,Damien Bowden,,,,,,Implement Compound Proof BBS+ Verifiable Credentials Using ASP.NET Core and Mattr,The ZKP BBS+ verifiable credentials are issued and stored on a digital wallet using a Self-Issued Identity Provider (SIOP) and OpenID Connect. A compound proof presentation template is created to verify the user data in a single verify. Code: [https://GitHub.com/swiss-ssi-group/MattrAspNetCoreCompoundProofBBS](https://GitHub.com/swiss-ssi-group/MattrAspNetCoreCompoundProofBBS),"This article shows how Zero Knowledge Proofs BBS+ verifiable credentials can be used to verify credential subject data from two separate verifiable credentials implemented in ASP.NET Core and Mattr. The ZKP BBS+ verifiable credentials are issued and stored on a digital wallet using a Self-Issued Identity Provider (SIOP) and OpenID Connect. A compound proof presentation template is created to verify the user data in a single verify. Code: https://GitHub.com/swiss-ssi-group/MattrAspNetCoreCompoundProofBBS Blogs in the series - Getting started with Self Sovereign Identity SSI - Create an OIDC credential Issuer with Mattr and ASP.NET Core - Present and Verify Verifiable Credentials in ASP.NET Core using Decentralized Identities and Mattr - Verify vaccination data using Zero Knowledge Proofs with ASP.NET Core and Mattr - Challenges to Self Sovereign Identity - Implement Compound Proof BBS+ verifiable credentials using ASP.NET Core and Mattr What are ZKP BBS+ verifiable credentials BBS+ verifiable credentials are built using JSON-LD and make it possible to support selective disclosure of subject claims from a verifiable credential, compound proofs of different VCs, zero knowledge proofs where the subject claims do not need to be exposed to verify something, private holder binding, and tracking prevention. The specification and implementation are still a work in progress. Setup The solution is set up to issue and verify the BBS+ verifiable credentials. 
The credential issuers are implemented in ASP.NET Core as well as the verifiable credential verifier. One credential issuer implements a BBS+ JSON-LD E-ID verifiable credential using SIOP together with Auth0 as the identity provider and the Mattr API which implements the access to the ledger and implements the logic for creating and verifying the verifiable credential and implementing the SSI specifications. The second credential issuer implements a county of residence BBS+ verifiable credential issuer like the first one. The ASP.NET Core verifier project uses a BBS+ verify presentation to verify that a user has the correct E-ID credentials and the county residence verifiable credentials in one request. This is presented as a compound proof using credential subject data from both verifiable credentials. The credentials are presented from the Mattr wallet to the ASP.NET Core verifier application. The BBS+ compound proof is made up from the two verifiable credentials stored on the wallet. The holder of the wallet owns the credentials and can be trusted to a fairly high level because SIOP was used to add the credentials to the Mattr wallet which requires a user authentication on the wallet using OpenID Connect. If the host system has strong authentication, the user of the wallet is probably the same person for whom the credentials were intended and issued. We can only prove that the verifiable credentials are valid, we cannot prove that the person sending the credentials is also the subject of the credentials or has the authorization to act on behalf of the credential subject. With SIOP, we know that the credentials were issued in a way which allows for strong authentication. Implementing the Credential Issuers The credentials are created using a credential issuer and can be added to the user's wallet using SIOP. An ASP.NET Core application is used to implement the Mattr API client for creating and issuing the credentials. 
Auth0 is used for the OIDC server and the profiles used in the verifiable credentials are added here. The Auth0 server is part of the credential issuer service business. The application has two separate flows for administrators and users, or holders of the credentials and credential issuer administrators. An administrator can sign in to the credential issuer ASP.NET Core application using OIDC and can create new OIDC credential issuers using BBS+. Once created, the callback URL for the credential issuer needs to be added to the Auth0 client application as a redirect URL. A user can log in to the ASP.NET Core application and request the verifiable credentials only for themselves. This is not authenticated on the ASP.NET Core application, but on the wallet application using the SIOP flow. The application presents a QR Code which starts the flow. Once authenticated, the credentials are added to the digital wallet. Both the E-ID and the county of residence credentials are added and stored on the wallet. Auth0 Auth pipeline rules The credential subject claims added to the verifiable credential use the profile data from the Auth0 identity provider. This data can be added using an Auth0 auth pipeline rule. Once defined, if the user has the profile data, the verifiable credentials can be created from the data. 
function (user, context, callback) {
  const namespace = 'https://damianbod-sandbox.vii.Mattr.global/';
  context.idToken[namespace + 'name'] = user.user_metadata.name;
  context.idToken[namespace + 'first_name'] = user.user_metadata.first_name;
  context.idToken[namespace + 'date_of_birth'] = user.user_metadata.date_of_birth;
  context.idToken[namespace + 'family_name'] = user.user_metadata.family_name;
  context.idToken[namespace + 'given_name'] = user.user_metadata.given_name;
  context.idToken[namespace + 'birth_place'] = user.user_metadata.birth_place;
  context.idToken[namespace + 'gender'] = user.user_metadata.gender;
  context.idToken[namespace + 'height'] = user.user_metadata.height;
  context.idToken[namespace + 'nationality'] = user.user_metadata.nationality;
  context.idToken[namespace + 'address_country'] = user.user_metadata.address_country;
  context.idToken[namespace + 'address_locality'] = user.user_metadata.address_locality;
  context.idToken[namespace + 'address_region'] = user.user_metadata.address_region;
  context.idToken[namespace + 'street_address'] = user.user_metadata.street_address;
  context.idToken[namespace + 'postal_code'] = user.user_metadata.postal_code;
  callback(null, user, context);
}

Once issued, the verifiable credential is saved to the digital wallet like this:

{
  ""type"": [ ""VerifiableCredential"", ""VerifiableCredentialExtension"" ],
  ""issuer"": {
    ""id"": ""did:key:zUC7GiWMGY2pynrFG7TcstDiZeNKfpMPY8YT5z4xgd58wE927UxaJfaqFuXb9giCS1diTwLi8G18hRgZ928b4qd8nkPRdZCEaBGChGSjUzfFDm6Tyio1GN2npT9o7K5uu8mDs2g"",
    ""name"": ""damianbod-sandbox.vii.Mattr.global""
  },
  ""name"": ""EID"",
  ""issuanceDate"": ""2021-12-04T11:47:41.319Z"",
  ""credentialSubject"": {
    ""id"": ""did:key:z6MkmGHPWdKjLqiTydLHvRRdHPNDdUDKDudjiF87RNFjM2fb"",
    ""family_name"": ""Bob"",
    ""given_name"": ""Lammy"",
    ""date_of_birth"": ""1953-07-21"",
    ""birth_place"": ""Seattle"",
    ""height"": ""176cm"",
    ""nationality"": ""USA"",
    ""gender"": ""Male""
  },
  ""@context"": [
    ""https://www.w3.org/2018/credentials/v1"",
    ""https://w3id.org/security/bbs/v1"",
    { ""@vocab"": ""https://w3id.org/security/undefinedTerm#"" },
    ""https://Mattr.global/contexts/VC-extensions/v1"",
    ""https://schema.org"",
    ""https://w3id.org/VC-revocation-list-2020/v1""
  ],
  ""credentialStatus"": {
    ""id"": ""https://damianbod-sandbox.vii.Mattr.global/core/v1/revocation-lists/dd507c44-044c-433b-98ab-6fa9934d6b01#0"",
    ""type"": ""RevocationList2020Status"",
    ""revocationListIndex"": ""0"",
    ""revocationListCredential"": ""https://damianbod-sandbox.vii.Mattr.global/core/v1/revocation-lists/dd507c44-044c-433b-98ab-6fa9934d6b01""
  },
  ""proof"": {
    ""type"": ""BbsBlsSignature2020"",
    ""created"": ""2021-12-04T11:47:42Z"",
    ""proofPurpose"": ""assertionMethod"",
    ""proofValue"": ""qquknHC7zaklJd0/IbceP0qC9sGYfkwszlujrNQn+RFg1/lUbjCe85Qnwed7QBQkIGnYRHydZiD+8wJG8/R5i8YPJhWuneWNE151GbPTaMhGNZtM763yi2A11xYLmB86x0d1JLdHaO30NleacpTs9g=="",
    ""verificationMethod"": ""did:key:zUC7GiWMGY2pynrFG7TcstDiZeNKfpMPY8YT5z4xgd58wE927UxaJfaqFuXb9giCS1diTwLi8G18hRgZ928b4qd8nkPRdZCEaBGChGSjUzfFDm6Tyio1GN2npT9o7K5uu8mDs2g#zUC7GiWMGY2pynrFG7TcstDiZeNKfpMPY8YT5z4xgd58wE927UxaJfaqFuXb9giCS1diTwLi8G18hRgZ928b4qd8nkPRdZCEaBGChGSjUzfFDm6Tyio1GN2npT9o7K5uu8mDs2g""
  }
}

For more information on adding BBS+ verifiable credentials using Mattr, see the documentation, or a previous blog in this series. Verifying the compound proof BBS+ verifiable credential The verifier application needs to use both E-ID and county of residence verifiable credentials. This is done using a presentation template which is specific to the Mattr platform. Once created, a verify request is created using this template and presented to the user in the UI as a QR code. The holder of the wallet can scan this code and the verification begins. The wallet will use the verification request and try to find the credentials on the wallet which match what was requested. 
If the wallet has the data from the correct issuers and the holder of the wallet consents, the data is sent to the verifier application using a new presentation verifiable credential using the credential subject data from both of the existing verifiable credentials stored on the wallet. The webhook or an API on the verifier application handles this and validates the request. If all is good, the data is persisted and the UI is updated using SignalR messaging. Creating a verifier presentation template Before verifier presentations can be sent to the digital wallet, a template needs to be created in the Mattr platform. The CreatePresentationTemplate Razor page is used to create a new template. The template requires the two DIDs used for issuing the credentials from the credential issuer applications.

public class CreatePresentationTemplateModel : PageModel
{
    private readonly MattrPresentationTemplateService _MattrVerifyService;

    public bool CreatingPresentationTemplate { get; set; } = true;
    public string TemplateId { get; set; }

    [BindProperty]
    public PresentationTemplate PresentationTemplate { get; set; }

    public CreatePresentationTemplateModel(MattrPresentationTemplateService MattrVerifyService)
    {
        _MattrVerifyService = MattrVerifyService;
    }

    public void OnGet()
    {
        PresentationTemplate = new PresentationTemplate();
    }

    public async Task<IActionResult> OnPostAsync()
    {
        if (!ModelState.IsValid)
        {
            return Page();
        }

        TemplateId = await _MattrVerifyService.CreatePresentationTemplateId(
            PresentationTemplate.DidEid,
            PresentationTemplate.DidCountyResidence);
        CreatingPresentationTemplate = false;
        return Page();
    }
}

public class PresentationTemplate
{
    [Required]
    public string DidEid { get; set; }

    [Required]
    public string DidCountyResidence { get; set; }
}

The MattrPresentationTemplateService class implements the logic required to create a new presentation template. 
The service gets a new access token for your Mattr tenant and creates a new template using the credential subjects required and the correct contexts. BBS+ and frames require specific contexts. The CredentialQuery2 has two separate Frame items, one for each verifiable credential created and stored on the digital wallet.

public class MattrPresentationTemplateService
{
    private readonly IHttpClientFactory _clientFactory;
    private readonly MattrTokenApiService _MattrTokenApiService;
    private readonly VerifyEidCountyResidenceDbService _verifyEidAndCountyResidenceDbService;
    private readonly MattrConfiguration _MattrConfiguration;

    public MattrPresentationTemplateService(IHttpClientFactory clientFactory,
        IOptions<MattrConfiguration> MattrConfiguration,
        MattrTokenApiService MattrTokenApiService,
        VerifyEidCountyResidenceDbService VerifyEidAndCountyResidenceDbService)
    {
        _clientFactory = clientFactory;
        _MattrTokenApiService = MattrTokenApiService;
        _verifyEidAndCountyResidenceDbService = VerifyEidAndCountyResidenceDbService;
        _MattrConfiguration = MattrConfiguration.Value;
    }

    public async Task<string> CreatePresentationTemplateId(string didEid, string didCountyResidence)
    {
        // create a new one
        var v1PresentationTemplateResponse = await CreateMattrPresentationTemplate(didEid, didCountyResidence);

        // save to db
        var template = new EidCountyResidenceDataPresentationTemplate
        {
            DidEid = didEid,
            DidCountyResidence = didCountyResidence,
            TemplateId = v1PresentationTemplateResponse.Id,
            MattrPresentationTemplateReponse = JsonConvert.SerializeObject(v1PresentationTemplateResponse)
        };
        await _verifyEidAndCountyResidenceDbService.CreateEidAndCountyResidenceDataTemplate(template);

        return v1PresentationTemplateResponse.Id;
    }

    private async Task<V1_PresentationTemplateResponse> CreateMattrPresentationTemplate(string didId, string didCountyResidence)
    {
        HttpClient client = _clientFactory.CreateClient();
        var accessToken = await _MattrTokenApiService.GetApiToken(client, ""MattrAccessToken"");
        client.DefaultRequestHeaders.Authorization = new AuthenticationHeaderValue(""Bearer"", accessToken);
        client.DefaultRequestHeaders.TryAddWithoutValidation(""Content-Type"", ""application/json"");

        var v1PresentationTemplateResponse = await CreateMattrPresentationTemplate(client, didId, didCountyResidence);
        return v1PresentationTemplateResponse;
    }

    private async Task<V1_PresentationTemplateResponse> CreateMattrPresentationTemplate(
        HttpClient client, string didEid, string didCountyResidence)
    {
        // create presentation, post to presentations templates api
        // https://learn.Mattr.global/tutorials/verify/presentation-request-template
        // https://learn.Mattr.global/tutorials/verify/presentation-request-template#create-a-privacy-preserving-presentation-request-template-for-zkp-enabled-credentials
        var createPresentationsTemplatesUrl = $""https://{_MattrConfiguration.TenantSubdomain}/v1/presentations/templates"";

        var eidAdditionalPropertiesCredentialSubject = new Dictionary<string, object>();
        eidAdditionalPropertiesCredentialSubject.Add(""credentialSubject"", new EidDataCredentialSubject
        {
            Explicit = true
        });

        var countyResidenceAdditionalPropertiesCredentialSubject = new Dictionary<string, object>();
        countyResidenceAdditionalPropertiesCredentialSubject.Add(""credentialSubject"", new CountyResidenceDataCredentialSubject
        {
            Explicit = true
        });

        var additionalPropertiesCredentialQuery = new Dictionary<string, object>();
        additionalPropertiesCredentialQuery.Add(""required"", true);

        var additionalPropertiesQuery = new Dictionary<string, object>();
        additionalPropertiesQuery.Add(""type"", ""QueryByFrame"");
        additionalPropertiesQuery.Add(""credentialQuery"", new List<CredentialQuery2>
        {
            new CredentialQuery2
            {
                Reason = ""Please provide your E-ID"",
                TrustedIssuer = new List<TrustedIssuer>{
                    new TrustedIssuer
                    {
                        Required = true,
                        Issuer = didEid // DID used to create the oidc
                    }
                },
                Frame = new Frame
                {
                    Context = new List<object>{
                        ""https://www.w3.org/2018/credentials/v1"",
                        ""https://w3id.org/security/bbs/v1"",
                        ""https://Mattr.global/contexts/VC-extensions/v1"",
                        ""https://schema.org"",
                        ""https://w3id.org/VC-revocation-list-2020/v1""
                    },
                    Type = ""VerifiableCredential"",
                    AdditionalProperties = eidAdditionalPropertiesCredentialSubject
                },
                AdditionalProperties = additionalPropertiesCredentialQuery
            },
            new CredentialQuery2
            {
                Reason = ""Please provide your Residence data"",
                TrustedIssuer = new List<TrustedIssuer>{
                    new TrustedIssuer
                    {
                        Required = true,
                        Issuer = didCountyResidence // DID used to create the oidc
                    }
                },
                Frame = new Frame
                {
                    Context = new List<object>{
                        ""https://www.w3.org/2018/credentials/v1"",
                        ""https://w3id.org/security/bbs/v1"",
                        ""https://Mattr.global/contexts/VC-extensions/v1"",
                        ""https://schema.org"",
                        ""https://w3id.org/VC-revocation-list-2020/v1""
                    },
                    Type = ""VerifiableCredential"",
                    AdditionalProperties = countyResidenceAdditionalPropertiesCredentialSubject
                },
                AdditionalProperties = additionalPropertiesCredentialQuery
            }
        });

        var payload = new MattrOpenApiClient.V1_CreatePresentationTemplate
        {
            Domain = _MattrConfiguration.TenantSubdomain,
            Name = ""zkp-eid-county-residence-compound"",
            Query = new List<Query>
            {
                new Query
                {
                    AdditionalProperties = additionalPropertiesQuery
                }
            }
        };

        var payloadJson = JsonConvert.SerializeObject(payload);
        var uri = new Uri(createPresentationsTemplatesUrl);

        using (var content = new StringContentWithoutCharset(payloadJson, ""application/json""))
        {
            var presentationTemplateResponse = await client.PostAsync(uri, content);
            if (presentationTemplateResponse.StatusCode == System.Net.HttpStatusCode.Created)
            {
                var v1PresentationTemplateResponse = JsonConvert
                    .DeserializeObject<MattrOpenApiClient.V1_PresentationTemplateResponse>(
                        await presentationTemplateResponse.Content.ReadAsStringAsync());

                return v1PresentationTemplateResponse;
            }

            var error = await presentationTemplateResponse.Content.ReadAsStringAsync();
        }

        throw new Exception(""whoops something went wrong"");
    }
}

public class EidDataCredentialSubject { 
[Newtonsoft.Json.JsonProperty(""@explicit"", Required = Newtonsoft.Json.Required.Always)] public bool Explicit { get; set; } [Newtonsoft.Json.JsonProperty(""family_name"", Required = Newtonsoft.Json.Required.Always)] [System.ComponentModel.DataAnnotations.Required] public object FamilyName { get; set; } = new object(); [Newtonsoft.Json.JsonProperty(""given_name"", Required = Newtonsoft.Json.Required.Always)] [System.ComponentModel.DataAnnotations.Required] public object GivenName { get; set; } = new object(); [Newtonsoft.Json.JsonProperty(""date_of_birth"", Required = Newtonsoft.Json.Required.Always)] [System.ComponentModel.DataAnnotations.Required] public object DateOfBirth { get; set; } = new object(); [Newtonsoft.Json.JsonProperty(""birth_place"", Required = Newtonsoft.Json.Required.Always)] [System.ComponentModel.DataAnnotations.Required] public object BirthPlace { get; set; } = new object(); [Newtonsoft.Json.JsonProperty(""height"", Required = Newtonsoft.Json.Required.Always)] [System.ComponentModel.DataAnnotations.Required] public object Height { get; set; } = new object(); [Newtonsoft.Json.JsonProperty(""nationality"", Required = Newtonsoft.Json.Required.Always)] [System.ComponentModel.DataAnnotations.Required] public object Nationality { get; set; } = new object(); [Newtonsoft.Json.JsonProperty(""gender"", Required = Newtonsoft.Json.Required.Always)] [System.ComponentModel.DataAnnotations.Required] public object Gender { get; set; } = new object(); } public class CountyResidenceDataCredentialSubject { [Newtonsoft.Json.JsonProperty(""@explicit"", Required = Newtonsoft.Json.Required.Always)] public bool Explicit { get; set; } [Newtonsoft.Json.JsonProperty(""family_name"", Required = Newtonsoft.Json.Required.Always)] [System.ComponentModel.DataAnnotations.Required] public object FamilyName { get; set; } = new object(); [Newtonsoft.Json.JsonProperty(""given_name"", Required = Newtonsoft.Json.Required.Always)] [System.ComponentModel.DataAnnotations.Required] 
public object GivenName { get; set; } = new object(); [Newtonsoft.Json.JsonProperty(""date_of_birth"", Required = Newtonsoft.Json.Required.Always)] [System.ComponentModel.DataAnnotations.Required] public object DateOfBirth { get; set; } = new object(); [Newtonsoft.Json.JsonProperty(""address_country"", Required = Newtonsoft.Json.Required.Always)] [System.ComponentModel.DataAnnotations.Required] public object AddressCountry { get; set; } = new object(); [Newtonsoft.Json.JsonProperty(""address_locality"", Required = Newtonsoft.Json.Required.Always)] [System.ComponentModel.DataAnnotations.Required] public object AddressLocality { get; set; } = new object(); [Newtonsoft.Json.JsonProperty(""address_region"", Required = Newtonsoft.Json.Required.Always)] [System.ComponentModel.DataAnnotations.Required] public object AddressRegion { get; set; } = new object(); [Newtonsoft.Json.JsonProperty(""street_address"", Required = Newtonsoft.Json.Required.Always)] [System.ComponentModel.DataAnnotations.Required] public object StreetAddress { get; set; } = new object(); [Newtonsoft.Json.JsonProperty(""postal_code"", Required = Newtonsoft.Json.Required.Always)] [System.ComponentModel.DataAnnotations.Required] public object PostalCode { get; set; } = new object(); } When the presentation template is created, the following JSON payload is returned. This is what is used to create verifier presentation requests. The context must match the context values of the credentials on the wallet. You can also verify that the trusted issuer matches and that the two Frame objects are created correctly with the required values. 
{ ""id"": ""f188df35-e76f-4794-8e64-eedbe0af2b19"", ""domain"": ""damianbod-sandbox.vii.Mattr.global"", ""name"": ""zkp-eid-county-residence-compound"", ""query"": [ { ""type"": ""QueryByFrame"", ""credentialQuery"": [ { ""reason"": ""Please provide your E-ID"", ""frame"": { ""@context"": [ ""https://www.w3.org/2018/credentials/v1"", ""https://w3id.org/security/bbs/v1"", ""https://Mattr.global/contexts/VC-extensions/v1"", ""https://schema.org"", ""https://w3id.org/VC-revocation-list-2020/v1"" ], ""type"": ""VerifiableCredential"", ""credentialSubject"": { ""@explicit"": true, ""family_name"": {}, ""given_name"": {}, ""date_of_birth"": {}, ""birth_place"": {}, ""height"": {}, ""nationality"": {}, ""gender"": {} } }, ""trustedIssuer"": [ { ""required"": true, ""issuer"": ""did:key:zUC7GiWMGY2pynrFG7TcstDiZeNKfpMPY8YT5z4xgd58wE927UxaJfaqFuXb9giCS1diTwLi8G18hRgZ928b4qd8nkPRdZCEaBGChGSjUzfFDm6Tyio1GN2npT9o7K5uu8mDs2g"" } ], ""required"": true }, { ""reason"": ""Please provide your Residence data"", ""frame"": { ""@context"": [ ""https://www.w3.org/2018/credentials/v1"", ""https://w3id.org/security/bbs/v1"", ""https://Mattr.global/contexts/VC-extensions/v1"", ""https://schema.org"", ""https://w3id.org/VC-revocation-list-2020/v1"" ], ""type"": ""VerifiableCredential"", ""credentialSubject"": { ""@explicit"": true, ""family_name"": {}, ""given_name"": {}, ""date_of_birth"": {}, ""address_country"": {}, ""address_locality"": {}, ""address_region"": {}, ""street_address"": {}, ""postal_code"": {} } }, ""trustedIssuer"": [ { ""required"": true, ""issuer"": ""did:key:zUC7G95fmyuYXNP2oqhhWkysmMPafU4dUWtqzXSsijsLCVauFDhAB7Dqbk2LCeo488j9iWGLXCL59ocYzhTmS3U7WNdukoJ2A8Z8AVCzeS5TySDJcYCjzuaPm7voPGPqtYa6eLV"" } ], ""required"": true } ] } ] } The presentation template is ready and can be used now. This is just a specific definition used by the Mattr platform. This is not saved to the ledger. 
Creating a verifier request and presenting the QR code Now that we have a presentation template, we initialize a verifier presentation request and present it as a QR code for the holder of the digital wallet to scan. The CreateVerifyCallback method creates the verification and returns a signed token, which is added to the QR code. The challengeId is encoded in Base64 because it is used in the URL to request or handle the webhook callback. public class CreateVerifierDisplayQrCodeModel : PageModel { private readonly MattrCredentialVerifyCallbackService _MattrCredentialVerifyCallbackService; public bool CreatingVerifier { get; set; } = true; public string QrCodeUrl { get; set; } [BindProperty] public string ChallengeId { get; set; } [BindProperty] public string Base64ChallengeId { get; set; } [BindProperty] public CreateVerifierDisplayQrCodeCallbackUrl CallbackUrlDto { get; set; } public CreateVerifierDisplayQrCodeModel(MattrCredentialVerifyCallbackService MattrCredentialVerifyCallbackService) { _MattrCredentialVerifyCallbackService = MattrCredentialVerifyCallbackService; } public void OnGet() { CallbackUrlDto = new CreateVerifierDisplayQrCodeCallbackUrl(); CallbackUrlDto.CallbackUrl = $""https://{HttpContext.Request.Host.Value}""; } public async Task<IActionResult> OnPostAsync() { if (!ModelState.IsValid) { return Page(); } var result = await _MattrCredentialVerifyCallbackService .CreateVerifyCallback(CallbackUrlDto.CallbackUrl); CreatingVerifier = false; var walletUrl = result.WalletUrl.Trim(); ChallengeId = result.ChallengeId; var valueBytes = Encoding.UTF8.GetBytes(ChallengeId); Base64ChallengeId = Convert.ToBase64String(valueBytes); VerificationRedirectController.WalletUrls.Add(Base64ChallengeId, walletUrl); // https://learn.Mattr.global/tutorials/verify/using-callback/callback-e-to-e#redirect-urls //var qrCodeUrl = $""didcomm://{walletUrl}""; QrCodeUrl = $""didcomm://https://{HttpContext.Request.Host.Value}/VerificationRedirect/{Base64ChallengeId}""; return 
Page(); } } public class CreateVerifierDisplayQrCodeCallbackUrl { [Required] public string CallbackUrl { get; set; } } The CreateVerifyCallback method uses the host as the base URL for the callback definition, which is included in the verification. An access token is requested for the Mattr API; this is used for all the requests. The last issued template is used in the verification. A new DID is created, or the existing DID for this verifier is used, to attach the verify presentation on the ledger. The InvokePresentationRequest is used to initialize the verification presentation. This request uses the templateId, the callback URL and the DID. Part of the body payload of the response is signed, and this is returned to the Razor page to be displayed as part of the QR code. The signed token is long, so the QR code contains a didcomm redirect rather than the token value directly. /// <summary> /// https://learn.Mattr.global/tutorials/verify/using-callback/callback-e-to-e /// </summary> /// <param name=""callbackBaseUrl""></param> /// <returns></returns> public async Task<(string WalletUrl, string ChallengeId)> CreateVerifyCallback(string callbackBaseUrl) { callbackBaseUrl = callbackBaseUrl.Trim(); if (!callbackBaseUrl.EndsWith('/')) { callbackBaseUrl = $""{callbackBaseUrl}/""; } var callbackUrlFull = $""{callbackBaseUrl}{Mattr_CALLBACK_VERIFY_PATH}""; var challenge = GetEncodedRandomString(); HttpClient client = _clientFactory.CreateClient(); var accessToken = await _MattrTokenApiService.GetApiToken(client, ""MattrAccessToken""); client.DefaultRequestHeaders.Authorization = new AuthenticationHeaderValue(""Bearer"", accessToken); client.DefaultRequestHeaders.TryAddWithoutValidation(""Content-Type"", ""application/json""); var template = await _VerifyEidAndCountyResidenceDbService.GetLastPresentationTemplate(); var didToVerify = await _MattrCreateDidService.GetDidOrCreate(""did_for_verify""); // Request DID from ledger V1_GetDidResponse did = 
await RequestDID(didToVerify.Did, client); // Invoke the Presentation Request var invokePresentationResponse = await InvokePresentationRequest( client, didToVerify.Did, template.TemplateId, challenge, callbackUrlFull); // Sign and Encode the Presentation Request body var signAndEncodePresentationRequestBodyResponse = await SignAndEncodePresentationRequestBody( client, did, invokePresentationResponse); // fix strange DTO var jws = signAndEncodePresentationRequestBodyResponse.Replace(""\"""", """"); // save to db var vaccinationDataPresentationVerify = new EidCountyResidenceDataPresentationVerify { DidEid = template.DidEid, DidCountyResidence = template.DidCountyResidence, TemplateId = template.TemplateId, CallbackUrl = callbackUrlFull, Challenge = challenge, InvokePresentationResponse = JsonConvert.SerializeObject(invokePresentationResponse), Did = JsonConvert.SerializeObject(did), SignAndEncodePresentationRequestBody = jws }; await _VerifyEidAndCountyResidenceDbService.CreateEidAndCountyResidenceDataPresentationVerify(vaccinationDataPresentationVerify); var walletUrl = $""https://{_MattrConfiguration.TenantSubdomain}/?request={jws}""; return (walletUrl, challenge); } The QR code is displayed in the UI. Once the QR code is created and scanned, the SignalR client starts listening for messages returned for the challengeId. Validating the verification callback After the holder of the digital wallet has given consent, the wallet sends the verifiable credential data back to the verifier application in an HTTP request. This is sent to a webhook or an API in the verifier application, where it needs to be verified correctly. In this demo, only the challengeId is used to match the request; the payload itself is not validated, which it should be in a production deployment. The callback handler stores the data to the database and sends a SignalR message to inform the waiting client that the verify has been completed successfully. 
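The challengeId round trip used in this flow (UTF-8 bytes encoded to Base64 for the redirect URL, with the same encoding applied again in the callback handler to find the waiting SignalR connection) can be sketched outside C#. A minimal Python sketch; the host name is a placeholder:

```python
import base64

def encode_challenge_id(challenge_id: str) -> str:
    # Mirrors Convert.ToBase64String(Encoding.UTF8.GetBytes(...)) in the demo.
    # Note: standard Base64 output may contain '+' and '/', which the demo
    # places in the URL path unescaped; urlsafe_b64encode would avoid that.
    return base64.b64encode(challenge_id.encode("utf-8")).decode("ascii")

def qr_code_url(host: str, challenge_id: str) -> str:
    # The signed request token is too long to embed in a QR code directly,
    # so the QR code holds a didcomm redirect; the redirect endpoint looks
    # up the stored wallet URL by the Base64 challengeId.
    return (f"didcomm://https://{host}/VerificationRedirect/"
            f"{encode_challenge_id(challenge_id)}")

url = qr_code_url("verifier.example.com", "abc")
```

Because the same deterministic encoding is applied on both sides, the callback handler can recompute the Base64 value from the raw challengeId in the payload and match it against the key stored when the QR code was created.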
private readonly VerifyEidCountyResidenceDbService _verifyEidAndCountyResidenceDbService; private readonly IHubContext<MattrVerifiedSuccessHub> _hubContext; public VerificationController(VerifyEidCountyResidenceDbService verifyEidAndCountyResidenceDbService, IHubContext<MattrVerifiedSuccessHub> hubContext) { _hubContext = hubContext; _verifyEidAndCountyResidenceDbService = verifyEidAndCountyResidenceDbService; } /// <summary> /// { /// ""presentationType"": ""QueryByFrame"", /// ""challengeId"": ""nGu/E6eQ8AraHzWyB/kluudUhraB8GybC3PNHyZI"", /// ""claims"": { /// ""id"": ""did:key:z6MkmGHPWdKjLqiTydLHvRRdHPNDdUDKDudjiF87RNFjM2fb"", /// ""http://schema.org/birth_place"": ""Seattle"", /// ""http://schema.org/date_of_birth"": ""1953-07-21"", /// ""http://schema.org/family_name"": ""Bob"", /// ""http://schema.org/gender"": ""Male"", /// ""http://schema.org/given_name"": ""Lammy"", /// ""http://schema.org/height"": ""176cm"", /// ""http://schema.org/nationality"": ""USA"", /// ""http://schema.org/address_country"": ""Schweiz"", /// ""http://schema.org/address_locality"": ""Thun"", /// ""http://schema.org/address_region"": ""Bern"", /// ""http://schema.org/postal_code"": ""3000"", /// ""http://schema.org/street_address"": ""Thunerstrasse 14"" /// }, /// ""verified"": true, /// ""holder"": ""did:key:z6MkmGHPWdKjLqiTydLHvRRdHPNDdUDKDudjiF87RNFjM2fb"" /// } /// </summary> /// <param name=""body""></param> /// <returns></returns> [HttpPost] [Route(""[action]"")] public async Task<IActionResult> VerificationDataCallback() { string content = await new System.IO.StreamReader(Request.Body).ReadToEndAsync(); var body = JsonSerializer.Deserialize<VerifiedEidCountyResidenceData>(content); var valueBytes = Encoding.UTF8.GetBytes(body.ChallengeId); var base64ChallengeId = Convert.ToBase64String(valueBytes); string connectionId; var found = MattrVerifiedSuccessHub.Challenges .TryGetValue(base64ChallengeId, out connectionId); //test Signalr //await 
_hubContext.Clients.Client(connectionId).SendAsync(""MattrCallbackSuccess"", $""{base64ChallengeId}""); //return Ok(); var exists = await _verifyEidAndCountyResidenceDbService.ChallengeExists(body.ChallengeId); if (exists) { await _verifyEidAndCountyResidenceDbService.PersistVerification(body); if (found) { //$""/VerifiedUser?base64ChallengeId={base64ChallengeId}"" await _hubContext.Clients .Client(connectionId) .SendAsync(""MattrCallbackSuccess"", $""{base64ChallengeId}""); } return Ok(); } return BadRequest(""unknown verify request""); } The VerifiedUser ASP.NET Core Razor page displays the data after a successful verification. It uses the challengeId to get the data from the database and displays it in the UI for the next steps. The next steps of the verifier process can be implemented using these values. This would typically include creating an account and setting up a phishing-resistant authentication method for high security, or at least one with a second factor. Notes The Mattr BBS+ verifiable credentials look really good and support selective disclosure and compound proofs. The implementation is still a WIP; Mattr is investing in this at present and will hopefully complete and improve all the BBS+ features. Until BBS+ is implemented by the majority of SSI platform providers and the specs are completed, I do not see how SSI can be adopted, unless of course all converge on some other standard. Convergence would help improve some of the interop problems between the vendors. 
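The demo's callback handler only matches the challengeId and does not validate the payload itself. A hedged sketch of the kind of checks a production verifier might add follows; the claim URIs mirror the sample payload shown earlier, and known_challenges stands in for the database lookup (none of this is Mattr API code):

```python
# Hypothetical callback-validation checks; the required-claims set would
# normally be derived from the presentation template.
REQUIRED_CLAIMS = {
    "http://schema.org/family_name",
    "http://schema.org/given_name",
    "http://schema.org/date_of_birth",
}

def validate_callback(body, known_challenges):
    """Return (ok, reason) for a verification callback payload."""
    # 1. The challenge must belong to a presentation request we created.
    if body.get("challengeId") not in known_challenges:
        return (False, "unknown verify request")
    # 2. The platform's verification result must be positive.
    if body.get("verified") is not True:
        return (False, "presentation failed verification")
    # 3. Every claim the template asked for must actually be present.
    missing = REQUIRED_CLAIMS - set(body.get("claims", {}))
    if missing:
        return (False, "missing claims: " + ", ".join(sorted(missing)))
    return (True, "ok")
```

Only after all checks pass should the data be persisted and the SignalR success message sent.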
Links https://learn.Mattr.global/tutorials/verify/using-callback/callback-e-to-e https://Mattr.global/get-started/ https://learn.Mattr.global/ Generating a ZKP-enabled BBS+ credential using the Mattr Platform https://learn.Mattr.global/tutorials/dids/did-key https://gunnarpeipman.com/httpclient-remove-charset/ Where to begin with OIDC and SIOP https://Anonyome.com/2020/06/decentralized-identity-key-concepts-explained/ Verifiable-Credentials-Flavors-Explained https://learn.Mattr.global/api-reference/ https://w3c-CCG.GitHub.io/ld-proofs/ Verifiable Credentials Data Model v1.1 (w3.org)",https://damienbod.com/2021/12/13/implement-compound-proof-bbs-verifiable-credentials-using-asp-net-core-and-Mattr/,,Post,,Resources,,,,,,BBS+,"Verifiable Credentials,SOIP,OIDC",2021-12-13,,,,,,,,,,,,,
|
||
Mattr,Mattr,,,Emily Fry; Tobias Looker,,,,,,New to JSON-LD? Introducing JSON-LD Lint,"The rise in popularity of javascript (due to its natural language monopoly in web-browsers) led to a mass exile from XML and shift over to JSON as the preferred data representation format. In the process, certain valuable features of XML were lost, in particular those that provide a standardised semantic syntax. JSON-LD defines this missing layer of syntax, which improves semantic reasoning around data. This is critical for maintaining data quality and trust in data, which is particularly important as we increase our reliance on digital infrastructure, IOT and AI.","New to JSON-LD? Introducing JSON-LD Lint JSON-LD, based on the ubiquitous JSON technology, is rapidly gaining adoption on the web. JSON-LD is an innovation relevant to both business minds and developers alike. For those unfamiliar with this technology, this short video is a great introduction. At Mattr we use JSON-LD in a variety of ways. For example, the platform generates credentials using this technology so that they can be inherently understood and referenced. Despite its growing adoption, the success of standards-based technologies like JSON-LD tends to depend on how quickly and easily developers can understand it. Developers rely on tools such as compilers, IDEs (integrated development environments) like Visual Studio Code, and linters to provide them with guidance and feedback as they code. These tools are essential for facilitating developer productivity and education. When it comes to JSON-LD, many have observed that there are limited educational tools and resources available. The lack of training wheels in the space creates a barrier to entry, or results in developers breaking things along the way. Having been on this journey ourselves, we want to make it easier for developers to pick up JSON-LD. That’s why we have developed a linter, which we are open-sourcing today. 
Specifically, we are open-sourcing a mono-repo of packages (“JSON-LD Lint”) designed to lint/process JSON-LD documents. These packages are: - JSON-LD Lint Core — A typescript/javascript library containing the core linting engine for JSON-LD documents - JSON-LD Lint CLI — A command line tool for linting/processing JSON-LD documents. - JSON-LD Lint VSCode Extension — A VS Code extension aimed at providing an improved development experience within VS Code when dealing with JSON-LD documents (coming soon to the VSCode marketplace). We hope that these packages will help more developers to understand and adopt this technology. As always, we appreciate your feedback and welcome your involvement in progressing this project further! Head along to our GitHub to get involved. You can also gain access to Mattr’s sandbox platform to issue your own JSON-LD credentials today. FAQ What is JSON-LD and why is it on the rise? The rise in popularity of javascript (due to its natural language monopoly in web-browsers) led to a mass exile from XML and shift over to JSON as the preferred data representation format. In the process, certain valuable features of XML were lost, in particular those that provide a standardised semantic syntax. JSON-LD defines this missing layer of syntax, which improves semantic reasoning around data. This is critical for maintaining data quality and trust in data, which is particularly important as we increase our reliance on digital infrastructure, IOT and AI. What is a Linter? Developers are renowned for building tools that make their job easier — whether it be through automating previously manual processes or designing tools that help to catch their mistakes. The number of tools available has grown in tandem with the open source movement. In general, a linter is a tool that analyzes some input (often source code) and flags errors, bugs, stylistic issues, and suspicious constructs. 
It provides developers with feedback around detected issues with their code/input and often includes information on how it could be fixed.",https://Mattr.global/resources/articles/new-to-json-ld-introducing-json-ld-lint/,,Post,,Resources,,,,,,,JSON-LD,2020-10-09,,,,,,,,,,,,,
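To make the linting idea concrete, a toy lint pass over a JSON-LD document might flag a missing @context or a misspelled keyword. The following minimal Python sketch is unrelated to the actual JSON-LD Lint implementation and checks only a tiny subset of what a real linter would:

```python
import json

# JSON-LD keywords a toy linter knows about (small subset of the spec).
KNOWN_KEYWORDS = {"@context", "@id", "@type", "@value", "@language", "@graph"}

def lint_jsonld(text: str) -> list:
    """Return a list of human-readable warnings for a JSON-LD document."""
    warnings = []
    try:
        doc = json.loads(text)
    except json.JSONDecodeError as exc:
        return [f"not valid JSON: {exc}"]
    if isinstance(doc, dict):
        if "@context" not in doc:
            warnings.append("missing @context: terms will not expand to IRIs")
        for key in doc:
            # Flag likely keyword typos such as "@contxt" or "@tpye".
            if key.startswith("@") and key not in KNOWN_KEYWORDS:
                warnings.append(f"unknown keyword {key!r}")
    return warnings
```

A real linter would additionally expand the document against its context, check term definitions, and report line/column positions, which is where the core engine of a project like this earns its keep.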
|
||
Mattr,CCG,,,Daniel Hardman,,,,,,"credential definitions, credential manifests, BBS+, etc","When Tobias first described Mattr's approach to BBS+ signatures, one of my takeaways was that this changed the Indy mechanism of cred defs in two wonderful ways:<br>1. It eliminated the need for lots of keys (only one key, Y, needs to be declared as a credential signing key, instead of a set of keys, Y[0]..Y[n])<br>2. It made it possible to store a cred def somewhere other than a ledger<br>I was very happy about this.<br>However, I have since heard several smart people summarize the breakthrough as: ""We don't need credential definitions at all. You just use the assertionMethod key in your DID doc to sign credentials, and that's all you need."" I believe this is oversimplifying in a way that loses something important, so I wanted to open a conversation","(For those who have never heard of/ understood the thing that Hyperledger Indy calls a ""credential definition"", let me first define the term. A credential definition is a public statement by an issuer, announcing to the world, ""I plan to issue credentials that match schema X. I will sign them with key(s) Y[0]..Y[n], and I will revoke them with the following mechanism: Z."" Because cred defs are not discussed in the VC spec, they have been viewed as a symptom of unnecessary divergence from standards - although they don't violate the VC spec in any way, either. Indy stores cred defs on a ledger, but this is not an essential property, just a convenience.) When Tobias first described Mattr's approach to BBS+ signatures, one of my takeaways was that this changed the Indy mechanism of cred defs in two wonderful ways: 1. It eliminated the need for lots of keys (only one key, Y, needs to be declared as a credential signing key, instead of a set of keys, Y[0]..Y[n]) 2. It made it possible to store a cred def somewhere other than a ledger I was very happy about this. 
However, I have since heard several smart people summarize the breakthrough as: ""We don't need credential definitions at all. You just use the assertionMethod key in your DID doc to sign credentials, and that's all you need."" I believe this is oversimplifying in a way that loses something important, so I wanted to open a conversation about it. In doing so, I am NOT arguing that cred defs should be required for all VCs, and I am also NOT arguing that credential defs should live on a ledger (I love that Mattr's removed that requirement). I am instead suggesting that they are highly desirable for *some* VCs no matter what the signature format of the VCs is, and that they should become a welcomed part of the ecosystem for all of us (without any introduction of other Indy-isms). VCs CAN absolutely be issued ad-hoc. That is, any controller of a DID can build a credential on the spur of the moment, inventing (or referencing) whatever schema they like, and using any key from the appropriate verification method in their DID doc to sign. And VCs issued in this ad-hoc way can be verified by simply looking for the schema a verifier hopes to see. This totally works. But there are several useful properties that we give up when we operate in this ad-hoc fashion, that we would retain if we used credential definitions: 1. Discoverability (not of individual VCs, but of the VC-publication activities and intentions of institutions) 2. A stable target for reputation 3. A formal versioning strategy As an approximation, credential definitions can provide, for VCs, the same sort of publication formality that a Debian repo provides for Linux artifacts, or that an app store provides on a mobile platform. Is it possible to publish artifacts without such mechanisms? Absolutely. But by publicizing and regularizing the behavior of software ""issuers"", they have a powerful effect on the integrity and predictability/trust of the ecosystem as a whole. 
(I admit in advance that this analog is imperfect. App stores are centralized. I'm not arguing for centralization as a defining characteristic of VC issuance.) Re. discoverability: without a cred def, there is no piece of data that describes the publication activities and intentions of an institution - there are only individual pieces of evidence (VC instances) that suggest those intentions. I may see a credential for a PhD, signed by Harvard and issued to Alice. But I don't know whether Harvard plans to use that schema with its next PhD credential. Harvard is not on the record anywhere as having any intention to stick with that schema. *With* a cred def, discoverability for such matters is at least conceivable. (DIF credential manifests are imagined to be published on a company's web site, possibly under .well-known. This accomplishes a similar purpose. I believe it does so in a way that conflates some other issues, but perhaps we could merge cred defs into/with cred manifests at some point...) Re. reputation: Tying the reputation/gravitas of a credential just to its issuer is incorrect. Harvard's credentials about academic achievements of its students are likely to have a stellar reputation; Harvard's credentials that let a member of the campus custodial staff into the laundry room of a dorm may be highly suspect. This is NOT just because the problem domain is different; instead, the types of vetting and assurance that precede issuance may differ radically, *even if the same key signs both credentials*. You could say, ""Well, right; we'll tie the reputation of the VC to the issuer plus the schema."" But that's not quite right, either. In the US, there's been a move by the federal government to push some states to improve the procedures they use to vet holders before they issue driver's licenses. 
States that comply get to announce that their driver's licenses now carry the ""Real ID"" endorsement, and are considered secure enough to be used to board a flight in domestic travel. So, credential reputation is affected by the Real ID change, but the schemas and the signers of the credentials don't change. I suggest that the correct association for reputation should be issuer+intention/process+schema - which happens to be the scope of credential definitions. This is approximately like the reputation we see in the app store, where Google may have a great general reputation, but not all apps by Google have the same number of stars - and not all successive versions of the same app by Google have the same reputation, either. Just like one-off builds of a software artifact, ad-hoc VCs (e.g., Alice wants to testify to the world Bob is a skilled birdwatcher, because she's observed him on Audubon Society outings) may not need reputation. But I think most VCs that are long-lived and human-centric and intended for repeated use are worthy of a more stable and nuanced target for reputation than just the issuer or the schema. Re. versioning: Suppose Alice, Bob, and Carol all have PhD VCs from Harvard, issued a day apart, in that order. Alice's cred uses schema A, Bob's uses schema B, and Carol's uses schema A. What can a verifier conclude about the schema Harvard's using for PhDs? There's not an orderly progression of schema versions (it goes A --> B --> A), and there's no public record that explains the variation. Did a sysadmin deploy a patch after Alice's PhD was issued, then back it out after discovering a problem? Who knows. I think this will confuse and frustrate verifiers. Imagine if this kind of variation occurred during a rollout of COVID vaccination creds that was trying to unfreeze global travel... Indy credential definitions are immutable, versioned, and use semver semantics. 
Without any Indy baggage, we could say the same thing about cred defs in the larger ecosystem. This would force issuers to behave in a rational way, and to communicate the semantic shift associated with evolutions to their issuing behavior. Of course, issuers could operate ad-hoc, without a cred def - but if they used one, we'd have much greater predictability in the use cases where it matters. So, that's the short form of my reasoning on why cred defs still have value, for ANY credential format, even if we simplify them and move them off a ledger. How we represent cred defs (e.g., in [an evolution? of] DIF's cred manifest format, or in some new format, or whatever) isn't what I care about here. I think they need to be immutable/tamper-proof. That's all. And using them all the time feels like overkill. But I think they could provide real value to the ecosystem if we explored them instead of thinking of them as an obnoxious dependency. What do you think? Discuss on a community call? Received on Monday, 1 February 2021 08:31:47 UTC",https://lists.w3.org/archives/public/public-credentials/2021feb/0010.html,,Archive,,Standards,,,,,,BBS+,,2021-01-30,,,,,,,,,,,,,
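The properties argued for in the email above (immutability, tamper-evidence, semver versioning, and a scope of issuer plus process plus schema) can be illustrated with a small data structure. This is a hypothetical sketch, not any standard or Indy format:

```python
from dataclasses import dataclass, asdict
import hashlib
import json

@dataclass(frozen=True)  # frozen=True approximates the immutability requirement
class CredDef:
    issuer_did: str         # who signs
    schema_id: str          # what shape the claims take
    signing_key_id: str     # a single BBS+ key, not a key set
    revocation_method: str  # how revocation is announced
    version: str            # semver: issuers bump this on any change

    def content_id(self) -> str:
        """Tamper-evident identifier: hash of the canonicalized content."""
        payload = json.dumps(asdict(self), sort_keys=True).encode("utf-8")
        return hashlib.sha256(payload).hexdigest()

cd_v1 = CredDef("did:example:harvard", "PhD-schema", "key-1",
                "revocation-list-2020", "1.0.0")
```

Because the identifier is derived from the content, any change to the issuing behavior (the Real ID style process change, say) forces a new version with a new identifier, giving verifiers the orderly A to B progression the email argues for.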
Mattr,Mattr,,Medium,Nader Helmy,,,,,,A solution for privacy-preserving Verifiable Credentials,"Here at Mattr, we are piloting an approach to ZKPs based on BBS+ signatures. Beyond the privacy and security benefits of ZKPs in general, this approach has a number of additional benefits compared to the ZKP implementations that exist today.","A solution for privacy-preserving Verifiable Credentials The recent ratification of Verifiable Credentials (VCs) as a new standard at the W3C defines a powerful new data model for interoperability of identity technologies. As a standard, it also represents a disruptive shift in the future design options of digital systems, towards ones that feature more portable and user-centric digital identity, often referred to as ‘self-sovereign’ or ‘decentralized identity’. The basic data model of verifiable credentials may be familiar to developers and architects who are used to working with attribute-based credentials and data technologies. The issuer, or the authority on some information about a subject (e.g. a person), issues a credential containing this information in the form of claims to a holder. The holder is responsible for storing and managing that credential, and in most instances is a piece of software that acts on behalf of the subject, such as a digital wallet. When a verifier, sometimes referred to as a relying party, needs to validate some information, they can request from the holder some data to meet their verification requirements. Depending on the capabilities of the underlying technology, the holder is free to present the claims contained in their verifiable credentials using any number of techniques to preserve their privacy. The concept of issuing authorities and verifiers or relying parties has been around on the web for quite a long time. 
It is a model adopted by certificate authorities which are used to securely browse websites as well as protocols like OpenID Connect that are used to manage identity claims about a subject. The real innovation of the verifiable credentials standard is that it pushes for the introduction of a layer between relying parties or verifiers and issuing authorities — what’s known in the VC data model as a ‘holder’. The introduction of this layer signals a shift towards a new paradigm, giving users greater control over their own information and also making it more convenient for a user to manage their digital identity. One of the important principles that we want to achieve when designing any system that involves handling PII is to minimize the data disclosed in a given interaction. When users share information, they should be able to choose what and how much they share on a case-by-case basis (often referred to as selective disclosure), while the relying parties receiving the information must be able to maintain assurances about the presented information’s origin and integrity. As technologists, by having solutions that easily achieve selective disclosure, we can drive a culture based on the minimum information exchange required to enhance user privacy. When it comes to solutions, there are many different ways to tackle this problem, but three of the most common are: - Just in time issuance — Contact the issuer at request time either directly or indirectly for a tailored assertion containing only the information required by the relying party. - Trusted witness — Use a trusted witness between the prover and the relying party to mediate the information disclosure. - Cryptographic solutions — Use a cryptographic technique to disclose a subset of information from a larger assertion. While each solution is perfectly valid in different scenarios, these approaches have some important trade-offs. 
Just in time issuance, a model made popular by OpenID Connect, assumes the issuer is highly available, which imposes an infrastructure burden on the issuer that is proportional to the number of subjects they have information for and where those subjects use their information. Furthermore, in most instances of this model, the issuer learns where a subject is using their identity information, which can be a serious privacy problem. Trusted witness shifts this problem to be more of a presentation concern, where a witness de-anonymizes the subject presenting the information and presents an assertion with only the information required by the relying party. Again, this model requires a highly available party other than the holder and relying party present when a subject wants to present information, one that must be highly trusted and one that bears witness to a lot of PII on the subject, leading to privacy concerns. Cryptographic solutions offer an alternative to these approaches by solving the selective disclosure problem directly at the core data model layer of the VC, providing a simpler and more flexible method of preserving user privacy. There are a variety of ways that cryptography can be used to achieve selective disclosure or data minimization, but perhaps the most popular approach is using a branch of cryptography known as Zero-Knowledge Proofs, or ZKPs. The emergent feature of this technology is that a prover can prove knowledge of some data without exposing any additional data. Naturally, there has been a lot of interest in combining verifiable credentials with zero-knowledge proofs. Some promising implementations of ZKPs have emerged in the open-source digital identity community based on the usage of CL-signatures and zk-SNARKs. While these approaches have provided a lot of thought-leadership in the VC ecosystem, they have traditionally come with their own set of notable drawbacks. 
These include new cryptography, new infrastructure dependencies, as well as an increase in the computational effort required to implement them. Here at Mattr, we are piloting an approach to ZKPs based on BBS+ signatures. Beyond the privacy and security benefits of ZKPs in general, this approach has a number of additional benefits compared to the ZKP implementations that exist today. - Highly performant and compact - Minimal pre-setup or external dependencies (such as credential definitions) - Interoperable with existing schema technologies - Compliant with emerging standards To expand on these ideas, we believe that ZKPs, when applied in the context of verifiable credentials, can and should be implemented in one of the two existing assertion formats that are defined for use within the standard today, namely JSON-LD and JWTs. It has often been said in the community that there are three ‘types’ of verifiable credentials: JWTs, Linked Data Proofs, and ZKP-based credentials. This is a false equivalence. While the first two are assertion formats, which are essentially alternative ways to represent the same information, ZKPs are not; instead, they are a capability afforded by a particular branch of cryptography. In truth, ZKP-capable verifiable credentials are best characterized as an emergent feature of the kind of digital signatures they use, rather than being regarded as their own distinct ‘type’ of credential. More information on this topic can be found here. Beyond the need to avoid unnecessary complexity associated with creating another ‘type’ of credential, there are a number of benefits ZKP-based credentials can gain from utilizing the JSON-LD data model the way it was originally designed. There has been some effort in the community to get CL-signatures-based VCs to work with JSON-LD; however, this approach still relies on a separate resource called a ‘credential definition’ which is stored on a decentralized ledger. 
Very few ledgers support this kind of object, which limits the utility of this solution. In our approach, the entire context of the credential, including its associated schemas, is included in the VC via JSON-LD. In essence, data schemas in the VC ecosystem are only useful if they are strongly reused by many different parties in the ecosystem. By utilizing JSON-LD according to current specification definitions, our approach enables us not only to use existing JSON-LD schemas, but to utilize the mechanism defined by JSON-LD to create and share new schemas as well. To a large extent this is what JSON-LD was designed for: the adoption and reuse of common data vocabularies. There is a strong need for privacy which has motivated much of the development around ZKPs. On the other side of the spectrum, there is the strong need for semantic interoperability which can make VCs more broadly useful. Our ZKP implementation exists at the intersection of these often competing goals. We want to create an approach to achieving selective disclosure which is highly-performant, minimal in size, cryptographically secure, and most importantly, compatible with existing standards. As shown above, this solution relies on the underlying cryptography of BBS+ signatures. While the cryptography is readily available for anyone to use today (even for use outside of verifiable credentials), what we have done is create an approach which combines Linked Data Proofs with BBS+ signatures. This solution is defined in a new specification as well as a new cryptographic signature suite. The new signature suite will be added to the Linked Data Cryptosuite Registry to maintain interoperability with existing Linked Data Signature schemes. We can also envision a similar specification and cryptographic suite that would combine JWT/JWS with BBS+ signatures. 
For the sake of completeness and further interoperability, we would encourage this approach to be developed in parallel with the linked data approach we are using. There is nothing about the BBS+ digital signature scheme that mandates the use of a particular VC format, or, indeed, the use of a VC at all. We have found great synergy by combining these technologies together, however there is nothing to prevent them from being used as modular or interchangeable components. Our implementation is ready to be utilized as-is with existing JSON-LD libraries and processors. We invite developers and architects to collaborate with us in further developing this emerging work around ZKPs. You can get involved in a number of different ways: by reading the specification and providing feedback in the form of issues and comments; developing open source libraries that utilize and consume the JSON-LD based crypto suite; and contributing examples of ZKP-based VCs to the W3C VC Examples repository, to name a few. We have published multiple open-source libraries ready to be used with VC-JS, an open-source JavaScript library for working with verifiable credentials. We are excited to bring the power of ZKPs to the rest of the credential ecosystem, and we hope that this work enables the broad usage of privacy-preserving technologies. Specifications: - JSON-LD BBS+ specification: https://w3c-CCG.GitHub.io/ldp-bbs2020/ - BBS+ signature scheme: https://Mattrglobal.GitHub.io/bbs-signatures-spec/ - BBS+ cryptography: https://eprint.iacr.org/2016/663.pdf Open-source libraries:",https://medium.com/Mattr-global/a-solution-for-privacy-preserving-verifiable-credentials-f1650aa16093,,Post,,Standards,,,,,,"BBS+,ZKP",,2020-07-17,,,,,,,,,,,,,
Mattr,Mattr,,,,,,,,IIW30,BBS+ signatures,"Mattr is proud to announce we’ve added support for privacy-preserving verifiable credentials on our platform using BBS+ signatures. Using a technique to implement selective disclosure, we’ve added the ability to generate credentials that support zero knowledge proofs without revealing any unnecessary information about the end-user, or placing any added burden on issuers, in the process. Since we first introduced and open-sourced JSON-LD BBS+ Signatures at IIW30 in April of this year, we’ve received lots of engagement, feedback and contributions from the broader technical community to further develop the implementations and specifications we presented. You can read more about our approach to privacy-preserving verifiable credentials on our introductory blog post.<br><br>","Using privacy-preserving ZKP credentials on the Mattr Platform Mattr is proud to announce we’ve added support for privacy-preserving verifiable credentials on our platform using BBS+ signatures. Using a technique to implement selective disclosure, we’ve added the ability to generate credentials that support zero knowledge proofs without revealing any unnecessary information about the end-user, or placing any added burden on issuers, in the process. Since we first introduced and open-sourced JSON-LD BBS+ Signatures at IIW30 in April of this year, we’ve received lots of engagement, feedback and contributions from the broader technical community to further develop the implementations and specifications we presented. You can read more about our approach to privacy-preserving verifiable credentials on our introductory blog post. One of the benefits of using the BBS+ cryptographic scheme to sign credentials is the ability to derive a zero knowledge proof from the signature, where the party generating the proof can choose to partially disclose statements from the original message. 
When enabled, this feature allows issuers to create a credential that effectively enforces minimal data disclosure using the Mattr Platform and a compliant digital wallet. Issuers can create ZKP-enabled credentials that allow the user to selectively disclose data. To support this functionality, we generate the keys required to support these signatures and create a Decentralized Identifier (DID) with the keys referenced in the DID Document. BBS+ signatures require what’s called a pairing-friendly curve; we use BLS12–381. This DID can be referenced in credentials to establish the issuer of the data, a common practice to allow a verifier or relying party to trace the root of trust in a credential. To issue a ZKP-enabled credential, simply use our API endpoint to create a new DID Key with type set to BLS 12–381. Then, create a Verifiable Credential (VC) using your new DID Key as the issuer DID. Our platform will automatically detect this capability is available in your DID and create a ZKP-enabled BBS+ credential for you. You can use the platform this way to create a privacy-enabled credential, or you can create a regular credential by providing a DID with a different key type — you have the option. On the user side, you can hold ZKP-enabled credentials in your wallet alongside all of your other credentials. We’ve designed this process in a way that minimizes friction to the user. In future updates, our Mobile Wallet App will be able to detect if BBS+ signatures are being used in a credential. When you get a request to verify some information contained in one of these privacy-enabled credentials, it will derive a new presentation that selectively discloses the required info using a zero-knowledge proof. The platform will then allow verification of the proof using the same interface as any other type of presentation. Our integrated approach treats zero-knowledge proofs as an extension of VCs, rather than an entirely new framework with a separate set of dependencies. 
We have built BBS+ Signatures and privacy-enabled credentials into our platform for anybody to experiment with, in what we think is a significant milestone for standards-based credential solutions on the market today. As a technology, BBS+ digital signatures can be used to sign more than just verifiable credentials. Combining these technologies is quite effective, though they can also be treated as modular or separate components. We’ve open-sourced software for creating and verifying BBS+ signatures in browser environments as well as node.js, and we’ve also published a library for generating BLS 12–381 keypairs for signing and verifying BBS+ Signatures. By leveraging pairing-friendly elliptic-curve cryptography in the context of Linked Data Proofs, our approach provides an unprecedented way to perform zero-knowledge proofs using the semantics of JSON-LD. This allows credential issuers to tap into vast data vocabularies that exist on the web today, such as schema.org and Google Knowledge Graph, making user data more context-rich without sacrificing security and privacy of the user in the process. Not only is this approach more interoperable with existing implementations of the VC data model and semantic web technologies, it also doesn’t rely on any external dependencies to operate (like a distributed ledger), meaning it’s far more efficient than other approaches based on CL-signatures and zk-SNARKs. We’ve open-sourced our LD-Proofs suite for VCs including performance benchmarks so you can check it out yourself. We’re excited to finally make these powerful privacy features easily accessible for everyone, and we can’t wait to see what you build with it. To get started, sign up now on our website and follow our tutorials on Mattr Learn to start creating ZKP-enabled verifiable credentials on the Mattr Platform. 
Additional Links Open-source: - Node JS BBS+ Signatures — BBS+ signatures implementation for node.js environments - WASM JS BBS+ Signatures — BBS+ signatures implementation for browser & node.js environments - BLS 12–381 Key Pair JS — crypto keys for signing/verifying BBS+ signatures - BBS+ JSON-LD Signatures JS — uses BBS+ signatures & BLS 12–381 keypair in a Linked Data Proofs suite (for use in VC implementations) Specifications: - BBS+ JSON-LD Signatures Spec — specifies linked data suite for BBS+ signatures - BBS+ Signatures Spec — definition of BBS+ signatures scheme",https://Mattr.global/resources/articles/using-privacy-preserving-zkp-credentials-on-the-Mattr-platform/,,Post,,Standards,,,,,,BBS+,,2021-04-20,,,,,,,,,,,,,
Mattr,Mattr,,Medium,Nader Helmy,,,,,,JWT vs Linked Data Proofs: comparing Verifiable Credentials,"Linked Data Proofs offer more flexibility and are thus more scalable for global decentralized networks. Plus, because they natively work with JSON-LD, they encourage adoption of an open-world data model and re-usage of schemas that makes JSON-LD so powerful. JWTs, in contrast, offer a simple and straightforward way to express data with a limited semantic vocabulary. Using JWTs with JSON-LD provides a potential compromise between the two approaches, but loses much of the flexibility provided by Linked Data Security.","JWT vs Linked Data Proofs: comparing Verifiable Credentials Verifiable Credentials, a standard at the W3C as of late last year, is a verifiable data model which can be represented in multiple different assertion formats. Essentially, these formats, or ‘types’ of verifiable credentials, are just alternative ways to represent the same information. The data model described by VCs does not dictate a particular rendering or assertion format, however there are clear trade-offs between the different implementation choices that offer a number of useful insights. Both of the existing VC formats build off of JSON, the primary serialization format used on the web today. The first of these is JSON Web Token, or JWT, and the other is Linked Data Proofs. The ‘Linked Data’ in Linked Data Proofs refers to JSON Linked Data, or JSON-LD. While Linked Data Proofs are designed specifically to work with JSON-LD, the JWT-based assertion format can be used with either JSON-LD or plain JSON. Comparison JWTs have the benefit of already being widely used in today’s identity technologies, most notably in the framework used by OAuth 2.0 and OpenID Connect. Because of this, there are a number of existing software libraries and tools that developers can use immediately to begin building out their implementations. 
In addition, due to the fact that JWT-based credentials rely on a shared assertion format with existing identity technologies, it may be an easier mental model for newcomers to adopt when starting to experiment with VCs. JWTs are, however, limited in other ways. While they are efficient at representing information, they do very little to allow humans and software to understand the context of the data they represent. As the VC standard continues to mature and adoption increases, what’s starting to emerge is a web of verifiable data. The data in this ‘web’ originates from many different sources and many different contexts, so it’s important that we have some common standards to maintain the hygiene and quality of the data we are using. The widespread usage of a format such as JWT leads to deep data quality issues when you want to build an ecosystem on consistent, high-quality linked data. Because JWT poorly represents the context of data, its utility in the context of rich data supply chains is quite limited. Fortunately, we have an alternative model that overcomes this significant limitation. Linked Data Proofs offer a number of improvements on top of JSON. The primary benefit of the JSON-LD format used by LD-Proofs is that it builds on a common set of semantics that allow for broader ecosystem interoperability. It provides a standard vocabulary that makes data more portable as well as easy to consume and understand across different contexts. In order to create a crawlable web of verifiable data, it’s important that we prioritize strong reuse of data schemas as a key driver of interoperability efforts. Without it, we risk building a system where many different data schemas are used to represent the same exact information, creating the kinds of data silos that we see on the internet today. JSON-LD makes semantics a first-class principle and is therefore a solid basis for constructing VC implementations. 
JSON-LD is widely adopted on the web today, with W3C reporting it is used by 30% of the web and Google making it the de facto technology for search engine optimization. When it comes to verifiable credentials, it would be advantageous to extend and integrate the work around VC’s with the existing burgeoning ecosystem of linked data. The security model behind these two approaches also teases out some important differences. JWT-based VCs use the existing JOSE framework for encryption and security. ‘JOSE’ stands for Javascript Object Signing and Encryption, and it was largely created to provide security guarantees around JSON-based identity technologies. Alternatively, VCs based on Linked Data Proofs use Linked Data Signatures for security. Linked Data Signatures provide a simple security protocol which is native to JSON-LD. They are built to compactly represent proof chains and allow a VC to be easily protected on a more granular basis; per-attribute, instead of per-credential. These features support a much more robust security model which has broader implications downstream from VCs, especially in terms of size and efficiency. The limitations of the JWT approach come from the fact that JWT was originally designed to consume plain JSON rather than JSON-LD. In this permutation, JWTs offer a very limited semantic vocabulary that negatively affects data portability. Using JWTs with JSON is not suited for many complex use cases that need a more expressive data format. While it’s true that JWTs can be used with JSON-LD to achieve some of the open-world data modeling features that make JSON-LD so useful, this approach suffers because it does not support the security features offered by Linked Data Proofs. In order to add protection to JWTs, it’s necessary to perform additional pre- and post-processing on the data. In contrast, protecting a JSON-LD based VC is as simple as passing a valid VC to a Linked Data Signatures implementation and generating a digital signature. 
To see a nuanced breakdown of these approaches, reference this chart. What about zero-knowledge proofs? Among the technologies that have often been used in conjunction with verifiable credentials are zero-knowledge proofs, or ZKPs. ZKPs are needed for non-correlatable selective disclosure of credential attributes. They allow a user to dynamically generate any number of proofs that minimally disclose information in order to satisfy credential presentation requests. Unlike more simple VC approaches, when using a credential that supports ZKPs, the original credential almost never leaves the Holder’s wallet. The piece of information that’s disclosed to external parties is the credential presentation which is dynamically generated as-needed, rather than the credential itself. In discussions around the different VC assertion formats, it has often been stated that ZKP-based credentials represent a third ‘type’ of credential. In reality, ZKP-based credentials are not a new type of credential. Instead, ZKPs are an emergent property of certain digital signature schemes that can be represented in multiple different ‘types’ of VCs. Conflating these issues presents an interoperability barrier between ZKP-based credentials and standard JWT-based credentials or Linked Data Proof credentials. The adoption of ZKPs in the decentralized identity community has largely suffered because of misconceptions about what a ZKP is, and how it fits into the overarching VC data types. An assertion format is a conceptually different level of abstraction than a particular kind of digital signature scheme. Although this distinction is obvious to low-level programmers, it may not be immediately obvious to many of the stakeholders in a VC ecosystem. This is partially because, until now, there have been no existing open-source implementations that attempt to bridge the gap. 
Our solution, which uses JSON-LD based credentials with ZKPs, creates an open-source reference implementation that demonstrates how these technologies can work together. See here for our approach to solving this problem. Interoperability So, what will it take to make the different VC assertion formats compatible with one another? To answer this question, we revisit the foundation of both of these formats: JSON. In order to get interoperability and compatibility between these types of VCs, we need to make sure that their data models share a common underlying approach that can be easily translated from one type to another. In practice, this means that what is needed for VC interoperability is JSON processing, along with pre-configured ‘@context’ definitions. Having dynamically resolvable contexts and doing JSON-LD processing is not strictly necessary for an interoperable VC ecosystem. However, these features may add additional semantic data capabilities that are useful for many kinds of implementations. As long as the community agrees to these minimum requirements, people can implement VCs however they want. A recent draft specification around VC JSON Schemas demonstrates how to use JSON Schema to represent credential schemas in parallel to the method used by JSON-LD. This approach can allow JSON Schema and JSON-LD to work symbiotically in the VC ecosystem. VC JSON Schemas is motivated by an attempt to create convergence around the semantics of verifiable credentials. It does not address the fact that JOSE signatures and Linked Data Signatures work differently on a cryptographic level. However, it does demonstrate that a credential represented in JSON can be mapped to a functionally similar credential in JSON-LD. The Verdict Linked Data Proofs offer more flexibility and are thus more scalable for global decentralized networks. 
Due to their native compatibility with JSON-LD, they encourage adoption of an open-world data model and reusable schema definitions that make JSON-LD so powerful. JWTs, in contrast, offer a simple and straightforward way to express data with a limited semantic vocabulary. Using JWTs with JSON-LD provides a potential compromise between the two approaches, but loses much of the flexibility provided by Linked Data Security. By providing a bridge for different VC assertion formats at least on a semantic level, we can eschew much of the cryptographic incompatibility that exists on the layers below. While these implementation differences will remain in a diverse and healthy ecosystem, they should not prove to be a hindrance to the implementation of standards-based verifiable credentials.",https://medium.com/Mattr-global/jwt-vs-linked-data-proofs-comparing-VC-assertion-formats-a2a4e6671d57,,Post,,Standards,,,,,,,"JWT,LinkedData",2020-05-09,,,,,,,,,,,,,
Mattr,Mattr,,Medium,,,,,,,OpenID Connect Credential Provider,"Introducing OpenID Connect Credential Provider, an extension to OpenID Connect which enables the end-user to request credentials from an OpenID Provider and manage their own credentials in a digital wallet. This specification defines how an OpenID Provider can be extended beyond being the provider of simple identity assertions into being the provider of credentials, effectively turning these Identity Providers into Credential Providers.","Introducing OIDC Credential Provider OpenID Connect (OIDC) is a hugely popular user authentication and identity protocol on the web today. It enables relying parties to verify the identity of their users and obtain basic profile information about them in order to create an authenticated user experience. In typical deployments of OpenID Connect today, in order for a user to be able to exercise the identity they have with a relying party, the relying party must be in direct contact with what’s known as the OpenID Provider (OP). OpenID Providers are responsible for performing end-user authentication and issuing end-user identities to relying parties. This effectively means that an OpenID Provider is the Identity Provider (IdP) of the user. It’s the reason we often see buttons that say “Login with Google” or “Login with Facebook” during the login journey in an application or service. The website or application you want to use must first authenticate who you are with a provider like Google or Facebook which controls and manages that identity on your behalf. In this context we can think of the IdP as the “man in the middle.” This relationship prevents users from having a portable digital identity which they can use across different contexts and denies users any practical control over their identity. 
It also makes it incredibly easy for IdPs like Google or Facebook to track what users are doing, because the “man in the middle” can gather metadata about user behavior with little agency over how this identity data is shared and used. In order to allow users to have practical control over their identity, we need a new approach. Introducing OpenID Connect Credential Provider, an extension to OpenID Connect which enables the end-user to request credentials from an OpenID Provider and manage their own credentials in a digital wallet. This specification defines how an OpenID Provider can be extended beyond being the provider of simple identity assertions into being the provider of credentials, effectively turning these Identity Providers into Credential Providers. To maximize the reuse of existing infrastructure that’s deployed today, OIDC Credential Provider extends the core OpenID Connect protocol, maintaining the original design and intent of OIDC while enhancing it without breaking any of its assumptions or requirements. Instead of using OIDC to provide simple identity assertions directly to the relying party, we can leverage OIDC to offer a Verifiable Credential (VC) which is cryptographically bound to a digital wallet of the end-users choice. The digital wallet plays the role of the OpenID Client application which is responsible for interacting with the OpenID Provider and manages the cryptographic key material (both public and private keys) used to prove ownership of the credential. The credentials issued to the wallet are re-provable and reusable for the purposes of authentication. This helps to decouple the issuance of identity-related information by providers and the presentation of that information by a user, introducing the user-controlled “wallet” layer between issuers and relying parties. 
Essentially, a wallet makes a request to an OpenID provider in order to obtain a credential, and then receives the credential back into the wallet so the user can later use it to prove their identity to relying parties. The interaction consists of three main steps: - The Client sends a signed credential request to the OpenID Provider with their public key - The OpenID Provider authenticates and authorizes the End-User to access the credential - The OpenID Provider responds to the Client with the issued VC In this new flow, the credential request extends the typical OpenID Connect request in that it expresses the intent to ask for something beyond the identity token of a typical OIDC flow. Practically, what this means is that the client uses a newly defined scope to indicate the intent of the request. The Client also extends the standard OIDC Request object to add cryptographic key material and proof of possession of that key material so that the credential can be bound to the wallet requesting it. Though the credential can be bound to a public key by default, it can also support different binding mechanisms, e.g. the credential can optionally be bound to a Decentralized Identifier (DID). In binding to a DID, the subject of the credential is able to maintain ownership of the credential on a longer life cycle due to their ability to manage and rotate keys while maintaining a consistent identifier. This eases the burden on data authorities to re-issue credentials when keys change and allows relying parties to verify that the credential is always being validated against the current public key of the end-user. The request can also indicate the format of the requested credential and even ask for specific claims present within the credential. This is designed to allow multiple credential formats to be used within the OIDC flow. 
On the provider side, OpenID Connect Providers are able to advertise which capabilities they support within the OIDC ecosystem using OpenID Connect Provider Metadata. This approach extends the metadata to support additional fields that express support for binding to DIDs, for issuing VCs, and advertising which DID methods, credential formats, credentials, and claims they are offering. This information can be utilized by the end-user’s digital wallet to help the user understand whether or not they wish to proceed with a credential request. In order to create a way for the wallet or client to connect to the OpenID Provider, the spec also defines a URL which functions as a Credential Offer that the client can invoke in order to retrieve and understand the types of credential being offered by the provider. The client registers the ‘openid’ URI scheme in order to be able to understand and render the offer to the user so they can make an informed decision. The sum of these changes means that OpenID Connect can allow users to have a portable digital identity credential that’s actually under their control, creating an opportunity for greater agency in digital interactions as well as preventing identity providers from being able to easily track user behavior. The OpenID Connect Credential Provider specification is in the process of being contributed to the OpenID Foundation (OIDF) as a work item at the A/B Working Group, where it will continue to be developed by the community behind OpenID Connect. Mattr is pleased to announce that our OIDC Bridge Platform Extension now uses OIDC Credential Provider under the hood to facilitate issuing credentials with OpenID Connect. OIDC Bridge hides the complexity associated with setting up infrastructure for credential issuance and simply requires configuration of a standard OpenID Provider. 
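As a rough sketch of how a wallet might act on that advertised metadata before starting a flow (the field names mirror the examples in this document, while the helper function itself is hypothetical):

```python
# Illustrative provider metadata, shaped like the extension fields this
# specification adds to the standard openid-configuration document.
metadata = {
    'credential_supported': True,
    'credential_endpoint': 'https://server.example.com/credential',
    'credential_formats_supported': ['w3cvc-jsonld', 'jwt'],
    'dids_supported': True,
    'did_methods_supported': ['did:ion:', 'did:sov:'],
}

def can_request(meta, fmt, did=None):
    '''Decide whether a wallet can sensibly start a credential request
    for the given format, optionally bound to the given DID.'''
    if not meta.get('credential_supported'):
        return False
    if fmt not in meta.get('credential_formats_supported', []):
        return False
    if did is not None:
        if not meta.get('dids_supported'):
            return False
        if not any(did.startswith(m) for m in meta.get('did_methods_supported', [])):
            return False
    return True
```

For example, a wallet holding a did:ion identifier could proceed with a w3cvc-jsonld request against this provider, but a did:web wallet could not.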
We also simplify the process of verifying credentials issued over OIDC Credential Provider by allowing the wallet to respond to requests, present credentials, and prove ownership and integrity of their credentials via OIDC. This new set of capabilities allows OpenID Providers greater flexibility around which claims end up in a credential, and allows for the support of many different credential types with a straightforward authentication journey for end-users. Our Mobile Wallet supports the ability to invoke credential offers using OIDC Credential Provider as well as creating credential requests and receiving credentials from an OpenID Provider. To find out more, check out our tutorials on Mattr Learn, read the spec, or watch a recording of our presentation on this spec from the recent Internet Identity Workshop.",https://medium.com/Mattr-global/introducing-oidc-credential-provider-7845391a9881,,Post,,Standards,,,,,,,OIDC,2020-12-15,,,,,,,,,,,,,
|
||
Mattr,Mattr,,GitHub,T. Looker ; J. Thompson ; A. Lemmon ; K. Cameron<br>,,,,,,OIDC Credential Provider,is “an extension to OpenID Connect which enables the end-user to request credentials from an OpenID Provider and manage their own credentials in a digital wallet.”,"|OpenID Connect Credential Provider||April 2021| |Looker, et al.||Informational||[Page]| OpenID Connect 1.0 is a simple identity layer on top of the OAuth 2.0 protocol. It enables relying parties to verify the identity of the End-User based on the authentication performed by an Authorization Server, as well as to obtain basic profile information about the End-User.¶ OpenID Providers today within OpenID Connect assume many roles, one of these is providing End-User claims to the relying party at the consent of the End-User such as their name or date of birth, providers performing this function are often referred to as being claims providers. However, the need for End-Users to be able to provide a variety of claims to a relying party from different providers is only increasing as many business processes that span multiple logical domains such as KYC and education move towards digital channels.¶ However, assuming a direct integration between the relying party and the claims providers leads to a difficult experience for End-Users to manage. Instead End-Users need a way to consolidate the different identities and claims they have available with various claims providers into one place where they can manage their release from. 
In doing this, a layer of indirection is created between the relying party and the claims provider through the introduction of a new party that we refer to in this specification as being the ""holder"".¶ In OpenID Connect today the existing ways to communicate End-User claims to relying parties are the id_token and the userinfo endpoint, however these mechanisms alone are unsuitable for the style of indirect presentation of claims to relying parties via a holder, as the relying party must be able to authenticate the authority of the holder to be presenting the claims on behalf of the End-User. Instead, in order to support this style of flow, this specification defines a new vehicle for communicating End-User claims called a ""credential"". In addition to this definition this specification defines how an existing OpenID Provider can be extended to issue ""credentials"" to holders.¶ OpenID Connect 1.0 is a simple identity layer on top of the OAuth 2.0 protocol. It enables relying parties to verify the identity of the End-User based on the authentication performed by an Authorization Server, as well as to obtain basic profile information about the End-User.¶ OpenID Providers today within OpenID Connect assume many roles, one of these is providing End-User claims to the relying party at the End-User's consent such as their name or date of birth. OpenID providers performing this function are often referred to as being claims providers. However, the need for End-Users to be able to provide a variety of claims from different providers is only increasing as many business processes that span multiple logical domains such as KYC and education move towards digital channels.¶ However, assuming a direct integration between the relying party and the claims providers leads to a difficult experience for End-Users to manage. 
Instead End-Users need a way to consolidate the different identities and claims they have available with various claims providers into one place where they can manage their release from. In doing this, a layer of indirection is created between the relying party and the claims provider through the introduction of a new party that we refer to in this specification as being the ""holder"".¶ In OpenID Connect today the existing ways to communicate End-User claims to relying parties are the id_token and the userinfo endpoint. However, these mechanisms alone are unsuitable for the style of indirect presentation of claims to relying parties via a holder as the relying party must be able to authenticate the authority of the holder to be presenting the claims on behalf of the End-User. Instead, in order to support this style of flow, this specification defines a new vehicle for communicating End-User claims called a ""credential"". In addition to this definition this specification defines how an existing OpenID Provider can be extended to issue ""credentials"" to holders.¶ To reiterate, this specification defines a protocol where a Credential Issuer (OP) may provide a Credential (set of claims) to a Credential Holder (acting as an RP) of which that Credential Holder (acting as an OP) controls and may present onward to Verifiers (RPs).¶ Note that the protocol for a Credential Holder to present a Credential to a Credential Verifier is outside the scope of this specification.¶ The key words ""MUST"", ""MUST NOT"", ""REQUIRED"", ""SHALL"", ""SHALL NOT"", ""SHOULD"", ""SHOULD NOT"", ""RECOMMENDED"", ""NOT RECOMMENDED"", ""MAY"", and ""OPTIONAL"" in this document are to be interpreted as described in RFC 2119 @!RFC2119.¶ In the .txt version of this document, values are quoted to indicate that they are to be taken literally. When using these values in protocol messages, the quotes MUST NOT be used as part of the value. 
In the HTML version of this document, values to be taken literally are indicated by the use of this fixed-width font.¶ All uses of JSON Web Signature (JWS) JWS and JSON Web Encryption (JWE) JWE data structures in this specification utilize the JWS Compact Serialization or the JWE Compact Serialization; the JWS JSON Serialization and the JWE JSON Serialization are not used.¶ This specification uses the terms defined in OpenID Connect Core 1.0; in addition, the following terms are also defined:¶ A set of claims about the End-User (subject) which is cryptographically bound to the credential holder in an authenticatable manner based on public/private key cryptography.¶ An OpenID Connect Authentication Request that results in the End-User being authenticated by the Authorization Server and the Client (Credential Holder) receiving a credential about the authenticated End-User.¶ A role an entity performs by holding credentials and presenting them to RPs on behalf of the consenting End-User (subject of the credentials). A Credential Holder serves the role of an OP (when presenting credentials to RPs) and an RP (when receiving credentials from Credential Issuers).¶ A role an entity performs by asserting claims about one or more subjects, creating a credential from these claims (cryptographically binding them to the holder), and transmitting the credential to the Credential Holder. A Credential Issuer is an OP that has been extended in order to also issue credentials.¶ An entity about which claims are made. Example subjects include human beings, animals, and things. In many cases the Credential Holder of a credential is the subject, but in certain cases it is not. For example, a parent (the Credential Holder) might hold the credentials of a child (the subject), or a pet owner (the Credential Holder) might hold the credentials of their pet (the subject). 
Most commonly the subject will be the End-User.¶ A role an entity performs by receiving one or more credentials for processing. A verifier is an RP that has been extended to receive and process credentials.¶ This specification extends the OpenID Connect protocol for the purposes of credential issuance. Here, credential issuance refers to a protocol where a Credential Holder (acting as an RP) makes a request to a Credential Issuer (acting as an OP) to have a credential issued to it, so that it may later present this credential, at the consent of the end user, to Credential Verifiers (RPs). The steps in the credential issuance protocol are as follows:¶ The Credential Holder (acting as an RP) sends a Credential Request to the Credential Issuer (acting as an OP).¶ The Credential Issuer authenticates the End-User and obtains authorization.¶ The Credential Issuer responds with a Credential.¶ These steps are illustrated in the following diagram:¶
+----------+                                        +----------+
|          |                                        |          |
|          |---(1) OpenID Credential Request------->|          |
|          |                                        |          |
|          |        +--------+                      |          |
|          |        |        |                      |          |
|Credential|        |  End-  |<--(2) AuthN & AuthZ->|Credential|
|  Holder  |        |  User  |                      |  Issuer  |
|   (RP)   |        |        |                      |   (OP)   |
|          |        +--------+                      |          |
|          |                                        |          |
|          |<--(3) OpenID Credential Response-------|          |
|          |                                        |          |
+----------+                                        +----------+
¶ Note - How the Credential Holder then exercises presentation of this credential with a Credential Verifier is outside the scope of this specification; however, that exchange looks like the following.¶ The Credential Verifier (acting as a relying party) sends an OpenID Request to the Credential Holder (acting as an OP).¶ The Credential Holder authenticates the End-User and obtains authorization.¶ The Credential Holder responds with a Credential Presentation.¶
+----------+                                                            +----------+
|          |                                                            |          |
|          |---(1) OpenID Connect Credential Presentation Request------>|          |
|          |                                                            |          |
|          |        +--------+                                          |          |
|          |        |        |                                          |          |
|Credential|        |  End-  |<--(2) AuthN & AuthZ--------------------->|Credential|
| Verifier |        |  User  |                                          |  Holder  |
|   (RP)   |        |        |                                          |   (OP)   |
|          |        +--------+                                          |          |
|          |                                                            |          |
|          |<--(3) OpenID Connect Credential Presentation Response------|          |
|          |                                                            |          |
+----------+                                                            +----------+
¶ A Credential Request is an OpenID Connect authentication request made by a Credential Holder that requests the End-User to be authenticated by the Credential Issuer, and that consent be granted for a credential containing the requested claims about the End-User to be issued to it.¶ The following section outlines how an OpenID Connect Authentication Request is extended in order to become a valid Credential Request.¶ The simplest OpenID Connect Credential Request is an ordinary OpenID Connect request that makes use of one additional scope, openid_credential.¶ A non-normative example of the Credential Request.¶
HTTP/1.1 302 Found
Location: https://server.example.com/authorize?
    response_type=code
    &scope=openid%20openid_credential
    &client_id=s6BhdRkqt3
    &state=af0ifjsldkj
    &redirect_uri=https%3A%2F%2Fclient.example.org%2Fcb
    &credential_format=w3cvc-jsonld
¶ When a request of this nature is made, the access_token issued to the Credential Holder authorizes it to access the credential endpoint to obtain a credential from the Credential Issuer.¶ A Credential Request uses the OpenID and OAuth2.0 request parameters as outlined in section 3.1.2.1 of OpenID Connect core, except for the following additional constraints.¶ REQUIRED. A Credential Request MUST contain the openid_credential scope value in the second position directly after the openid scope.¶ REQUIRED. Determines the format of the credential returned at the end of the flow; values supported by the OpenID Provider are advertised in their openid-configuration metadata, under the credential_formats_supported attribute.¶ OPTIONAL. Used when making a Signed Credential Request, defines the key material the Credential Holder is requesting the credential to be bound to and the key responsible for signing the request object. The value is a JSON Object that is a valid JWK.¶ OPTIONAL. 
Defines the relationship between the key material the Credential Holder is requesting the credential to be bound to and a decentralized identifier. Processing of this value requires the CI to support the resolution of decentralized identifiers, which is advertised in their openid-configuration metadata, under the dids_supported attribute. The value of this field MUST be a valid decentralized identifier.¶ Public/private key pairs are used by a requesting Credential Holder to establish a means of binding to the resulting credential. A Credential Holder making a Credential Request to a Credential Issuer must prove control over this binding mechanism during the request; this is accomplished through the extended usage of a signed request defined in OpenID Connect Core.¶ It is RECOMMENDED that a Credential Request flow use the authorization code flow as defined in OpenID Connect core.¶ Successful and Error Authentication Responses are handled in the same manner as in OpenID Connect Core 1.0, with the code parameter always being returned with the Authorization Code Flow.¶ On request to the Token Endpoint, the grant_type value MUST be authorization_code, in line with the Authorization Code Flow, and the code value included as a parameter.¶ The following is a non-normative example of a response from the token endpoint, whereby the access_token authorizes the Credential Holder to request a credential from the credential endpoint.¶ { ""access_token"": ""eyJhbGciOiJSUzI1NiIsInR5cCI6Ikp..sHQ"", ""token_type"": ""bearer"", ""expires_in"": 86400, ""id_token"": ""eyJodHRwOi8vbWF0dHIvdGVuYW50L..3Mz"" }¶ The Credential Endpoint is an OAuth 2.0 Protected Resource that, when called, returns Claims about the authenticated End-User in the form of a credential. 
To obtain a credential on behalf of the End-User, the Credential Holder makes a request to the Credential Endpoint using an Access Token obtained through OpenID Connect Authentication whereby the openid_credential scope was granted.¶ Communication with the Credential Endpoint MUST utilize TLS. See Section 16.17 for more information on using TLS.¶ The Credential Endpoint MUST support the use of HTTP POST methods defined in RFC 2616 [RFC2616].¶ It is recommended that the Credential Endpoint SHOULD enforce presentation of the OAuth2.0 Access Token to be sender-constrained (DPoP). However, the Credential Endpoint MAY also accept Access Tokens as per OAuth 2.0 Bearer Token Usage [RFC6750].¶ The Credential Endpoint SHOULD support the use of Cross-Origin Resource Sharing (CORS) [CORS] and/or other methods as appropriate to enable JavaScript clients to access the endpoint.¶ The Credential Holder may provide a signed request object containing the sub to be used as the subject for the resulting credential. When a sub claim is present within the request object, an associated sub_jwk claim MUST also be present, with which the request object MUST be signed, thereby proving control over the sub.¶ The Credential Holder may also specify the credential_format they wish the returned credential to be formatted as. If the Credential Issuer receiving the request does not support the requested credential format, it MUST return an error response, as per [TODO]. If the credential_format is not specified in the request the Credential Issuer SHOULD respond with their preferred or default format. (Note if we are going to have a default we need to specify it or is it at the discretion of the Credential Issuer to determine this?)¶ When a signed request is not provided the Credential Issuer will use the sub associated with the initial Credential request, where possible. If a sub value is not available the Credential Issuer MUST return an error response, as per [TODO].¶ OPTIONAL. 
A valid OIDC signed JWT request object. The request object is used to provide a sub the Credential Holder wishes to be used as the subject of the resulting credential as well as provide proof of control of that sub.¶ A non-normative example of a Signed Credential request.¶ POST /credential HTTP/1.1 Host: https://issuer.example.com Authorization: Bearer <access-token> Content-Type: application/json { ""request"": <signed-jwt-request-obj> }¶ Where the decoded payload of the request parameter is as follows:¶ { ""aud"": ""https://issuer.example.com"", ""iss"": ""https://wallet.example.com"", ""sub"": ""urn:uuid:dc000c79-6aa3-45f2-9527-43747d5962a5"", ""sub_jwk"" : { ""crv"":""secp256k1"", ""kid"":""YkDpvGNsch2lFBf6p8u3"", ""kty"":""EC"", ""x"":""7KEKZa5xJPh7WVqHJyUpb2MgEe3nA8Rk7eUlXsmBl-M"", ""y"":""3zIgl_ml4RhapyEm5J7lvU-4f5jiBvZr4KgxUjEhl9o"" }, ""credential_format"": ""w3cvc-jwt"", ""nonce"": ""43747d5962a5"", ""iat"": 1591069056, ""exp"": 1591069556 }¶ format : REQUIRED. The proof format the credential was returned in. For example w3cvc-jsonld or w3cvc-jwt. credential : REQUIRED. A cryptographically verifiable proof in the defined proof format. 
Most commonly a Linked Data Proof or a JWS.¶ { ""format"": ""w3cvc-jsonld"", ""credential"": <credential> }¶ Formats of the credential can vary, examples include JSON-LD or JWT based Credentials, the Credential Issuer SHOULD make their supported credential formats available at their openid-configuration metadata endpoint.¶ The following is a non-normative example of a Credential issued as a W3C Verifiable Credential 1.0 compliant format in JSON-LD.¶ { ""@context"": [ ""https://www.w3.org/2018/credentials/v1"", ""https://www.w3.org/2018/credentials/examples/v1"" ], ""id"": ""http://example.gov/credentials/3732"", ""type"": [""VerifiableCredential"", ""UniversityDegreeCredential""], ""issuer"": ""did:key:z6MkjRagNiMu91DduvCvgEsqLZDVzrJzFrwahc4tXLt9DoHd"", ""issuanceDate"": ""2020-03-10T04:24:12.164Z"", ""credentialSubject"": { ""id"": ""urn:uuid:dc000c79-6aa3-45f2-9527-43747d5962a5"", ""jwk"": { ""crv"":""secp256k1"", ""kid"":""YkDpvGNsch2lFBf6p8u3"", ""kty"":""EC"", ""x"":""7KEKZa5xJPh7WVqHJyUpb2MgEe3nA8Rk7eUlXsmBl-M"", ""y"":""3zIgl_ml4RhapyEm5J7lvU-4f5jiBvZr4KgxUjEhl9o"" }, ""givenName"": ""John"", ""familyName"": ""Doe"", ""degree"": { ""type"": ""BachelorDegree"", ""name"": ""Bachelor of Science and Arts"" } }, ""proof"": { ""type"": ""Ed25519Signature2018"", ""created"": ""2020-04-10T21:35:35Z"", ""verificationMethod"": ""did:key:z6MkjRagNiMu91DduvCvgEsqLZDVzrJzFrwahc4tXLt9DoHd#z6MkjRagNiMu91DduvCvgEsqLZDVzrJzFrwahc4tXLt9DoHd"", ""proofPurpose"": ""assertionMethod"", ""jws"": ""eyJhbGciOiJFZERTQSIsImI2NCI6ZmFsc2UsImNyaXQiOlsiYjY0Il19..l9d0YHjcFAH2H4dB9xlWFZQLUpixVCWJk0eOt4CXQe1NXKWZwmhmn9OQp6YxX0a2LffegtYESTCJEoGVXLqWAA"" } }¶ The following is a non-normative example of a Credential issued as a JWT¶ 
ewogICJhbGciOiAiRVMyNTYiLAogICJ0eXAiOiAiSldUIgp9.ewogICJpc3MiOiAiaXNzdWVyIjogImh0dHBzOi8vaXNzdWVyLmVkdSIsCiAgInN1YiI6ICJkaWQ6ZXhhbXBsZToxMjM0NTYiLAogICJpYXQiOiAxNTkxMDY5MDU2LAogICJleHAiOiAxNTkxMDY5NTU2LAogICJodHRwczovL3d3dy53My5vcmcvMjAxOC9jcmVkZW50aWFscy9leGFtcGxlcy92MS9kZWdyZWUiOiB7CiAgICAgImh0dHBzOi8vd3d3LnczLm9yZy8yMDE4L2NyZWRlbnRpYWxzL2V4YW1wbGVzL3YxL3R5cGUiOiAiQmFjaGVsb3JEZWdyZWUiLAogICAgICJodHRwczovL3d3dy53My5vcmcvMjAxOC9jcmVkZW50aWFscy9leGFtcGxlcy92MS9uYW1lIjogIkJhY2hlbG9yIG9mIFNjaWVuY2UgYW5kIEFydHMiCiAgfQp9.SflKxwRJSMeKKF2QT4fwpMeJf36POk6yJV_adQssw5c¶ And the decoded Claim Set of the JWT¶ { ""iss"": ""https://issuer.example.com"", ""sub"": ""urn:uuid:dc000c79-6aa3-45f2-9527-43747d5962a5"", ""sub_jwk"" : { ""crv"":""secp256k1"", ""kid"":""YkDpvGNsch2lFBf6p8u3"", ""kty"":""EC"", ""x"":""7KEKZa5xJPh7WVqHJyUpb2MgEe3nA8Rk7eUlXsmBl-M"", ""y"":""3zIgl_ml4RhapyEm5J7lvU-4f5jiBvZr4KgxUjEhl9o"" }, ""iat"": 1591069056, ""exp"": 1591069556, ""https://www.w3.org/2018/credentials/examples/v1/degree"": { ""https://www.w3.org/2018/credentials/examples/v1/type"": ""BachelorDegree"", ""https://www.w3.org/2018/credentials/examples/v1/name"": ""Bachelor of Science and Arts"" } }¶ TODO improve this section¶ Decentralized identifiers are a resolvable identifier to a set of statements about the did subject including a set of cryptographic material (e.g public keys). 
Using this cryptographic material, a decentralized identifier can be used as an authenticatable identifier in a credential, rather than using a public key directly.¶ A Credential Holder submitting a signed Credential Request can request that the resulting credential be bound to the Credential Holder through the usage of decentralized identifiers by using the did field.¶ Prior to submitting a credential request, a Credential Holder SHOULD validate that the CI supports the resolution of decentralized identifiers by retrieving their openid-configuration metadata to check if an attribute of dids_supported has a value of true.¶ The Credential Holder SHOULD also validate that the CI supports the did method to be used in the request by retrieving their openid-configuration metadata to check if an attribute of did_methods_supported contains the required did method.¶ A CI processing a credential request featuring a decentralized identifier MUST perform the following additional steps to validate the request.¶ Validate that the value in the did field is a valid decentralized identifier.¶ Resolve the did value to a did document.¶ Validate that the key in the sub_jwk field of the request is referenced in the authentication section of the DID Document.¶ If any of these steps fail, then the CI MUST respond to the request with the Error Response parameter, section 3.1.2.6. 
with Error code: invalid_did.¶ The following is a non-normative example of requesting the issuance of a credential that uses a decentralized identifier.¶ { ""response_type"": ""code"", ""client_id"": ""IAicV0pt9co5nn9D1tUKDCoPQq8BFlGH"", ""sub_jwk"" : { ""crv"":""secp256k1"", ""kid"":""YkDpvGNsch2lFBf6p8u3"", ""kty"":""EC"", ""x"":""7KEKZa5xJPh7WVqHJyUpb2MgEe3nA8Rk7eUlXsmBl-M"", ""y"":""3zIgl_ml4RhapyEm5J7lvU-4f5jiBvZr4KgxUjEhl9o"" }, ""did"": ""did:example:1234"", ""redirect_uri"": ""https://Client.example.com/callback"", ""credential_format"": ""w3cvc-jsonld"" }¶ The following is a non-normative example of a credential endpoint response for the request shown above.¶ { ""format"": ""w3cvc-jsonld"", ""credential"": { ""@context"": [ ""https://www.w3.org/2018/credentials/v1"", ""https://www.w3.org/2018/credentials/examples/v1"" ], ""id"": ""http://example.gov/credentials/3732"", ""type"": [""VerifiableCredential"", ""UniversityDegreeCredential""], ""issuer"": ""https://issuer.edu"", ""issuanceDate"": ""2020-03-10T04:24:12.164Z"", ""credentialSubject"": { ""id"": ""did:example:1234"", ""degree"": { ""type"": ""BachelorDegree"", ""name"": ""Bachelor of Science and Arts"" } }, ""proof"": { ""type"": ""Ed25519Signature2018"", ""created"": ""2020-04-10T21:35:35Z"", ""verificationMethod"": ""https://issuer.edu/keys/1"", ""proofPurpose"": ""assertionMethod"", ""jws"": ""eyJhbGciOiJFZERTQSIsImI2NCI6ZmFsc2UsImNyaXQiOlsiYjY0Il19..l9d0YHjcFAH2H4dB9xlWFZQLUpixVCWJk0eOt4CXQe1NXKWZwmhmn9OQp6YxX0a2LffegtYESTCJEoGVXLqWAA"" } } }¶ An OpenID provider can use the following meta-data elements to advertise its support for credential issuance in its openid-configuration defined by OpenID-Discovery.¶ credential_supported Boolean value indicating that the OpenID provider supports the credential issuance flow.¶ credential_endpoint A JSON string value indicating the location of the OpenID providers credential endpoint.¶ credential_formats_supported A JSON array of strings identifying the 
resulting format of the credential issued at the end of the flow.¶ credential_claims_supported A JSON array of strings identifying the claim names supported within an issued credential.¶ credential_name A human-readable string to identify the name of the credential offered by the provider.¶ dids_supported Boolean value indicating that the OpenID provider supports the resolution of decentralized identifiers.¶ did_methods_supported A JSON array of strings representing Decentralized Identifier Methods that the OpenID provider supports resolution of.¶ The following is a non-normative example of the relevant entries in the openid-configuration metadata for an OpenID Provider supporting the credential issuance flow¶ { ""dids_supported"": true, ""did_methods_supported"": [ ""did:ion:"", ""did:elem:"", ""did:sov:"" ], ""credential_supported"": true, ""credential_endpoint"": ""https://server.example.com/credential"", ""credential_formats_supported"": [ ""w3cvc-jsonld"", ""jwt"" ], ""credential_claims_supported"": [ ""given_name"", ""last_name"", ""https://www.w3.org/2018/credentials/examples/v1/degree"" ], ""credential_name"": ""University Credential"" }¶ In certain instances it is advantageous to be able to construct a URL which points at an OpenID Connect provider and is invocable by a supporting OpenID Connect client.¶ The URL SHOULD use the scheme openid to allow supporting clients to register intent to handle the URL.¶ The URL SHOULD feature the term discovery in the host portion of the URL, identifying that the intent of the URL is to communicate discovery-related information.¶ The URL SHOULD feature a query parameter with key issuer whose value corresponds to a valid issuer identifier as defined in OpenID Connect Discovery. 
This identifier MUST be a URL using the https:// scheme which, when concatenated with the string /.well-known/openid-configuration and dereferenced by an HTTP GET request, results in the retrieval of the provider's OpenID Connect Metadata.¶ The following is a non-normative example of an invocable URL pointing to the OpenID Provider that has the issuer identifier of https://issuer.example.com¶ openid://discovery?issuer=https://issuer.example.com¶",https://Mattrglobal.github.io/oidc-client-bound-assertions-spec/,,Spec,,Standards,,,,,,,OIDC,2021-04-20,,,,,,,,,,,,,
|
||
Mattr,CCG,,GitHub,Dave Longley ; Manu Sporny,Mattr,,,,,Revocation List 2020,"This specification describes a privacy-preserving, space-efficient, and high-performance mechanism for publishing the revocation status of Verifiable Credentials.","This specification describes a privacy-preserving, space-efficient, and high-performance mechanism for publishing the revocation status of Verifiable Credentials. This document is experimental and is undergoing heavy development. It is inadvisable to implement the specification in its current form. An experimental implementation is available. It is often useful for an issuer of verifiable credentials [[VC-DATA-MODEL]] to link to a location where a verifier can check to see if a credential has been revoked. There are a variety of privacy and performance considerations that are made when designing, publishing, and processing revocation lists. One such privacy consideration happens when there is a one-to-one mapping between a verifiable credential and a URL where the revocation status is published. This type of mapping enables the website that publishes the URL to correlate the holder, time, and verifier when the status is checked. This could enable the issuer to discover the type of interaction the holder is having with the verifier, such as providing an age verification credential when entering a bar. Being tracked by the issuer of a driver's license when entering an establishment violates a privacy expectation that many people have today. Similarly, there are performance considerations that are explored when designing revocation lists. One such consideration is where the list is published and the burden it places from a bandwidth and processing perspective, both on the server and the client fetching the information. In order to meet privacy expectations, it is useful to bundle the status of large sets of credentials into a single list to help with herd privacy. 
However, doing so can place an impossible burden on both the server and client if the status information is as much as a few hundred bytes in size per credential across a population of hundreds of millions of holders. The rest of this document proposes a highly compressible, bitstring-based revocation list mechanism with strong privacy-preserving characteristics, that is compatible with the architecture of the Web, is highly space-efficient, and lends itself well to content distribution networks. As an example of using this specification to achieve a number of beneficial privacy and performance goals, it is possible to create a revocation list that can be constructed for 100,000 verifiable credentials that is roughly 12,500 bytes in size in the worst case. In a case where a few hundred credentials have been revoked, the size of the list is less than a few hundred bytes while providing privacy in a herd of 100,000 individuals. This section outlines the core concept utilized by the revocation list mechanism described in this document. At the most basic level, revocation information for all verifiable credentials issued by an issuer are expressed as simple binary values. The issuer keeps a bitstring list of all verifiable credentials it has issued. Each verifiable credential is associated with a position in the list. If the binary value of the position in the list is 1 (one), the verifiable credential is revoked, if it is 0 (zero) it is not revoked. One of the benefits of using a bitstring is that it is a highly compressible data format since, in the average case, large numbers of credentials will remain unrevoked. This will ensure long sections of bits that are the same value and thus highly compressible using run-length compression techniques such as ZLIB [[RFC1950]]. The default bitstring size is 16KB (131,072 entries), and when only a handful of verifiable credentials are revoked, the compressed bitstring size is reduced down to a few hundred bytes. 
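The size claim above is easy to check with Python's zlib; in this sketch the bit layout (most significant bit first within each byte) is an illustrative assumption:

```python
import base64
import zlib

# A 131,072-bit (16KB) revocation bitstring with no revocations.
bitstring = bytearray(131072 // 8)

# Revoke a handful of credentials by flipping isolated bits
# (MSB-first layout within each byte is an assumption of this sketch).
for index in (12, 40000, 94567):
    bitstring[index // 8] |= 0x80 >> (index % 8)

# Long runs of identical bits compress extremely well, so the encoded
# list comes out at a few hundred bytes at most instead of 16KB.
encoded = base64.b64encode(zlib.compress(bytes(bitstring))).decode('ascii')
print(len(encoded))
```

The same run-length behavior holds in the worst case described above: even a densely revoked list stays close to the raw bitstring size, while the sparse common case shrinks by orders of magnitude.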
Another benefit of using a bitstring is that it enables large numbers of verifiable credential revocation statuses to be placed in the same list. This specification utilizes a minimum bitstring length of 131,072 (16KB). This population size ensures an adequate amount of herd privacy in the average case. If better herd privacy is required, the bitstring can be made to be larger. The following sections outline the data model for this document. When an issuer desires to enable revocation for a verifiable credential, they MAY add a status property that uses the data model described in this specification.
|Property||Description|
|id|| The constraints on the id property are listed in the Verifiable Credentials Data Model specification. The value is expected to be a URL that identifies the status information associated with the verifiable credential. |
|type|| The type property MUST be RevocationList2020Status. |
|revocationListIndex|| The revocationListIndex property MUST be an arbitrary size integer greater than or equal to 0, expressed as a string, that identifies the bit position of the status of the verifiable credential. |
|revocationListCredential|| The revocationListCredential property MUST be a URL to a verifiable credential. When the URL is dereferenced, the resulting verifiable credential MUST have a type that includes RevocationList2020Credential. |
{ ""@context"": [ ""https://www.w3.org/2018/credentials/v1"", ""https://w3id.org/VC-revocation-list-2020/v1"" ], ""id"": ""https://example.com/credentials/23894672394"", ""type"": [""VerifiableCredential""], ""issuer"": ""did:example:12345"", ""issued"": ""2020-04-05T14:27:42Z"", ""credentialStatus"": { ""id"": ""https://dmv.example.gov/credentials/status/3#94567"", ""type"": ""RevocationList2020Status"", ""revocationListIndex"": ""94567"", ""revocationListCredential"": ""https://example.com/credentials/status/3"" }, ""credentialSubject"": { ""id"": ""did:example:6789"", ""type"": ""Person"" }, ""proof"": { ... } } When a revocation list is published, the result is a verifiable credential that encapsulates the revocation list. 
The following section describes the format of the verifiable credential that encapsulates the revocation list: |Property||Description| |id|| The verifiable credential that contains the revocation list MUST express an id property that matches the value specified in the revocationListCredential property of the associated RevocationList2020Status. | |type|| The verifiable credential that contains the revocation list MUST express a type property that includes the RevocationList2020Credential value. | |credentialSubject.type|| The type of the credential subject, which is the revocation list, MUST be RevocationList2020. | |credentialSubject.encodedList|| The encodedList property of the credential subject MUST be the GZIP-compressed [[RFC1952]], base-64 encoded [[RFC4648]] bitstring values for the associated range of verifiable credential status values. | { ""@context"": [ ""https://www.w3.org/2018/credentials/v1"", ""https://w3id.org/VC-revocation-list-2020/v1"" ], ""id"": ""https://example.com/credentials/status/3"", ""type"": [""VerifiableCredential"", ""RevocationList2020Credential""], ""issuer"": ""did:example:12345"", ""issued"": ""2020-04-05T14:27:40Z"", ""credentialSubject"": { ""id"": ""https://example.com/status/3#list"", ""type"": ""RevocationList2020"", ""encodedList"": ""H4sIAAAAAAAAA-3BMQEAAADCoPVPbQsvoAAAAAAAAAAAAAAAAP4GcwM92tQwAAA"" }, ""proof"": { ... } } The following section outlines the algorithms that are used to generate and validate revocation lists as described by this document. The following process, or one generating the exact output, MUST be followed when producing a RevocationList2020Credential: generate a compressed bitstring by passing the list of issued credentials to the bitstring generation algorithm, then create a RevocationList2020Credential whose credentialSubject has its type set to RevocationList2020 and its encodedList property set to the compressed bitstring. The following process, or one generating the exact output, MUST be followed when validating a verifiable credential that is contained in a RevocationList2020Credential: - Let credential be a verifiable credential containing a credentialStatus entry that is a RevocationList2020Status. - Dereference the revocationListCredential property of the status entry to retrieve the RevocationList2020Credential. - Let compressed bitstring be the value of the encodedList property of the RevocationList2020Credential. - Let revocation list index be the integer value of the revocationListIndex property of the RevocationList2020Status. - Generate a revocation bitstring by passing compressed bitstring to the bitstring expansion algorithm. - Let revoked be the value of the bit at position revocation list index in the revocation bitstring. - Return true if revoked is 1, false otherwise. The following process, or one generating the exact output, MUST be followed when generating a revocation list bitstring. The algorithm takes an issuedCredentials list as input and returns a compressed bitstring as output. - Let bitstring be a list of bits with a minimum size of 131,072 bits, where each bit is initialized to 0 (zero). - For each bit in bitstring, if there is a corresponding revocationListIndex value in a revoked credential in issuedCredentials, set the bit to 1 (one); otherwise set the bit to 0 (zero). - Generate the compressed bitstring by compressing and base-64 encoding bitstring, and return it. 
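To make the generation and validation algorithms above concrete, here is a minimal Python sketch. It is illustrative only: the function names are invented, it uses ZLIB compression per the [[RFC1950]] reference above (the specification's own examples use GZIP), and it assumes a big-endian bit order within each byte.

```python
import base64
import zlib

# Minimum list size from the spec: 131,072 entries = 16KB uncompressed.
BITSTRING_BYTES = 131072 // 8

def generate_encoded_list(revoked_indices):
    # Start with an all-zero bitstring: no credentials revoked.
    bits = bytearray(BITSTRING_BYTES)
    # Set the bit at each revoked credential's revocationListIndex
    # (big-endian bit order within each byte is an assumption here).
    for index in revoked_indices:
        bits[index // 8] |= 0x80 >> (index % 8)
    # Compress the bitstring, then base64url-encode the result.
    return base64.urlsafe_b64encode(zlib.compress(bytes(bits))).decode('ascii')

def is_revoked(encoded_list, index):
    # Expansion: base64-decode, then decompress back to the raw bitstring.
    bits = zlib.decompress(base64.urlsafe_b64decode(encoded_list))
    # The credential is revoked if its bit is 1.
    return bool(bits[index // 8] & (0x80 >> (index % 8)))

encoded = generate_encoded_list([94567])   # index from the example above
assert is_revoked(encoded, 94567)          # revoked
assert not is_revoked(encoded, 94568)      # neighbouring index unaffected
```

Note how small the result is: with only one credential revoked, the compressed, encoded list is well under a hundred characters, compared to 16KB uncompressed.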
The following process, or one generating the exact output, MUST be followed when expanding a compressed revocation list bitstring. The algorithm takes a compressed bitstring as input and returns an uncompressed bitstring as output. - Let compressed bitstring be the base-64 decoded value of the input. - Return the result of decompressing compressed bitstring. This section details the general privacy considerations and specific privacy implications of deploying this specification into production environments. This document specifies a minimum revocation bitstring length of 131,072, or 16KB uncompressed. This is enough to give holders an adequate amount of herd privacy if the number of verifiable credentials issued is large enough. However, if the number of issued verifiable credentials is a small population, the ability to correlate an individual increases because the number of allocated slots in the bitstring is small. Correlating this information with, for example, the geographic region a request came from can also help to correlate individuals that have received a credential from the same geographic region. It is possible for verifiers to increase the privacy of the holder whose verifiable credential is being checked by caching revocation lists that have been fetched from remote servers. By caching the content locally, less correlatable information can be inferred from verifier-based access patterns on the revocation list. The use of content distribution networks by issuers can increase the privacy of holders by reducing or eliminating requests for the revocation lists from the issuer. Often, a request for a revocation list will be served by an edge device and thus be faster and reduce the load on the server, as well as cloaking verifiers and holders from issuers. There are a number of security considerations that implementers should be aware of when processing data described by this specification. Ignoring or not understanding the implications of this section can result in security vulnerabilities. 
While this section attempts to highlight a broad set of security considerations, it is not a complete list. Implementers are urged to seek the advice of security and cryptography professionals when implementing mission critical systems using the technology outlined in this specification. Write security considerations. There are a number of accessibility considerations implementers should be aware of when processing data described in this specification. As with any web standards or protocols implementation, ignoring accessibility issues makes this information unusable to a large subset of the population. It is important to follow accessibility guidelines and standards, such as [[WCAG21]], to ensure all people, regardless of ability, can make use of this data. This is especially important when establishing systems utilizing cryptography, which have historically created problems for assistive technologies. This section details the general accessibility considerations to take into account when utilizing this data model. Write accessibility considerations. There are a number of internationalization considerations implementers should be aware of when publishing data described in this specification. As with any web standards or protocols implementation, ignoring internationalization makes it difficult for data to be produced and consumed across a disparate set of languages and societies, which would limit the applicability of the specification and significantly diminish its value as a standard. This section outlines general internationalization considerations to take into account when utilizing this data model. Write i18n considerations.",https://w3c-ccg.github.io/VC-status-rl-2020/,,Spec,,Standards,,,Revocation,,,VC,,2020-04-05,,,,,,,,,,,,,
|
||
Mattr,Mattr,,,,,,,,,Verifiable Credential based Authentication via OpenID Connect,"At MATTR, we’ve been working hard on an exciting opportunity with the Government of British Columbia (BC Gov) in Canada. In June 2019, the BC Gov Verifiable Organisations Network team put out a “Code With Us” development bounty to integrate KeyCloak, their chosen enterprise Identity and Access Management (IAM) solution, with a new W3C standard called Verifiable Credentials. This work led to a solution that enables the use of Verifiable Credentials (VC) as a means of authentication that is interoperable with OpenID Connect (OIDC). We call this work VC-AuthN-OIDC. The output is an adapter that bridges these standards and enables a whole new set of capabilities through a simple extension of most modern IAM solutions.",,https://mattr.global/verifiable-credential-based-authentication-via-openid-connect/,https://mattr.global/wp-content/uploads/2019/10/0_Kcm1VBTjAxZP9Dkk-1024x465.png,post,,Standards,,,,,,,,2019-12-10,,,,,,,,,,,,,
|
||
Meeco,,Meeco,,Katryna Dow,MyData; Respect Network; DIF,"Australia, Melbourne, Victoria",Europe,GDPR,,Meeco,"Meeco gives people and organisations the tools to access, control and create mutual value from Personal data<br><br>Privately, securely and with explicit consent","Put your customers in control of their Personal data, identity and digital assets Unlock the power of permissioned Personal data and digital assets with enterprise infrastructure that has privacy, security and convenience built in. Reduce cost and meet data compliance requirements on a range of use cases, from decentralised identity to digital asset management. Deploy new business models built on digital trust and evolve existing applications from Web2 to Web3 with our platform for Personal identity and data ecosystems. Trust is a key enabler of connected digital communities. It is central to delivering sustainable outcomes across financial services, mobility, health, education, environment, public administration, employment and eCommerce. Seamless experiences are underpinned by tools that deliver interoperability. Citizens, employees, students, patients and customers can securely transact across networks and ecosystems. Hybrid infrastructure will support the transition from Web2 to Web3, delivering security, convenience and decentralised services. Enterprise customers can complete their Web3 transition with Secure Value Exchange by Meeco. Offering secure data storage through to self-sovereign identity and verifiable credentials, SVX is a complete toolkit for enterprise customers to deploy trusted Personal data ecosystems.",https://meeco.me,,Company,,Company,Enterprise,ID,,Personal Data,,,,2012-08-23,https://github.com/Meeco,https://twitter.com/meeco_me,https://www.youtube.com/user/MeecoMe,https://blog.meeco.me/,,,https://www.crunchbase.com/organization/meeco,https://www.linkedin.com/company/meeco-me/,,https://dev.meeco.me/,https://app.meeco.me/,,
|
||
Meeco,Meeco,,,,,,,EU Data Strategy,,European Strategy for Data,"A Meeco Review of the European Strategy for Data Communication from the European Commission on February 19th, 2020",,https://media.meeco.me/public-assets/reports/Meeco_Review_of_European_Strategy_for_Data.pdf,,Report,,Ecosystem,Public,,,,,,,2020-02-19,,,,,,,,,,,,,
|
||
Meeco,Meeco,,,,,,,EU Data Governance Act,,EU Data Governance Act,"We welcome the regulation as a needed common ground for clarifying the role of data intermediaries, building trust in these intermediaries and setting the direction for data governance, including the emergence of digital human rights. In this context we offer the following suggestions:<br>1. Explicitly include individuals as active participants in the definitions [...]<br>2. Clarify the scope of the data sharing services (Art. 9 (2)) and extend it to include services that empower the data subject beyond compliance.<br>3. Foster the growth of intermediaries, which offer new technologies and have the greatest likelihood of success in Europe if supported by the Data Governance Act.<br>4. Open silos and implement soft infrastructure such as standards & open APIs to accelerate uptake and interoperability between data sharing services.<br>5. Foster eco-systems and demonstrate the value through practical use-cases.<br>6. Create a level playing field for sustainable data sharing by providing funding to pioneers at the forefront of developing data eco-systems<br><br>","The proposed European Data Governance Act is another progressive indication that the EU is seeking to develop a more equitable digital economy. However, where we go from here depends on how the European Union is able to use the Data Governance Act to strike a balance between the existing tech giants and data platforms and an entirely new range of services designed to enable the collection, protection and exchange of data. Currently, a handful of global players enjoy a virtual monopoly on the exploitation of data. Unlocking these data silos and regulating for data mobility and interoperability will provide the vital infrastructure required for meeting the challenges of the next century, including timely and informed decision making. 
At Meeco we believe that enabling citizens, students, patients, passengers and consumers to more equitably join the value chains fuelled by data will ultimately lead to greater trust and Personalisation, resulting in a more prosperous society. However, this will require new commercial models, enforceable regulation such as the Data Governance Act and the digital tools to transform our connected society. We believe this will lead to significant benefits to including Personalised health and education, increased financial literacy and better financial decisions, more informed consumer choices which also contribute to protecting our environment. Meeco is endorsing the Data Governance Act as a founding member of Data Sovereignty Now; a coalition of leading Europe-based technology companies, research institutions and not-for-profit organisations. We are working together to ensure that the control of data remains in the hands of the people and organisations that generate it in order to play a key role in not only securing the rights of individuals over their data, but also providing significant stimulus for the digital economy. Meeco is also a member of MyData Global and was amongst the first 16 organisations to be awarded the MyData Operator designation in 2020. We join in the goal towards developing interconnected and human-centric data intermediaries to meet the Personalisation and equity challenges of open digital society. We welcome the regulation as a needed common ground for clarifying the role of data intermediaries, building trust in these intermediaries and setting the direction for data governance, including the emergence of digital human rights. In this context we offer the following suggestions: - Explicitly include individuals as active participants in the definitions: define the key roles in data sharing (Art. 
2 Definitions) so that data rights holders (data subject) and technical data holders (controller or processor) can be separated and acknowledge the type of data sharing where individuals are active participants in the transactions. - Clarify the scope of the data sharing services (Art. 9 (2)) and extend it to include services that empower the data subject beyond compliance. - Foster the growth of intermediaries, which offer new technologies and have the greatest likelihood of success in Europe if supported by the Data Governance Act. - Open silos and implement soft infrastructure such as standards & open APIs to accelerate uptake and interoperability between data sharing services. - Foster eco-systems and demonstrate the value through practical use-cases. The EU data sharing ecosystem is formative; therefore, it is imperative to demonstrate utility and benchmark best practices that contribute to a more sustainable, healthy, resilient and safe digital society. - Create a level playing field for sustainable data sharing by providing funding to pioneers at the forefront of developing data eco-systems, this includes start-ups, scale-ups alongside established enterprises. Included is a Meeco white paper detailing practical use-cases aligned to our response, including the barriers the Data Governance Act can address to make data work for all. Meeco is a global leader in the collection, protection & permission management of Personal data and decentralised identity. Our award-winning patented API platform & tools enable developers to create mutual value through the ethical exchange of Personal data. Privately, securely and always with explicit consent. Data Sovereignty Now is a coalition of partners who believe that Data Sovereignty should become the guiding principle in the development of national and European data sharing legislation. 
Data Sovereignty is the key driver for super-charging the data economy by putting the control of Personal and business data back in the hands of the people and organisations which generate it. The foundation members include aNewGovernance, freedom lab, INNOPAY, International Data Spaces Association, iSHARE, Meeco, University of Groningen, MyData Global, SITRA, The Chain Never Stops and TNO. MyData Global is an award-winning international non-profit. The purpose of MyData Global is to empower individuals by improving their right to self-determination regarding their Personal data, based on the MyData Declaration. MyData Global has over 100 organisation members and more than 400 individual members from over 40 countries, on six continents.",https://blog.meeco.me/eu-data-governance-act/,,Review,,Ecosystem,Public,,,,,,,2021-02-16,,,,,,,,,,,,,
|
||
Meeco,Meeco,,,,,,,EU Data Strategy,,Meeco Review of the European Data Strategy,"This document has been written to give the reader a snapshot of the new European Union (EU) Strategy for Data published as an EU communication on February 19th, 2020. In this document the authors express their opinion by way of commentary on the topic of Personal data management and analysis of the strategy that the EU will adopt.<br><br>Meeco’s review points the reader to some of the most important elements of the EU’s position on various data issues, as well as the key elements of its strategy. We have taken care to include all direct excerpts between quotation marks and to reference them clearly back to the original communication document, by way of footnotes and demonstrated through relevant case studies.","The EU wants to become a leading role model for a society empowered by data to make better decisions – for individuals, in business and the public sector. It intends to be a major actor in the new Data Economy, holding its own against other world powers. The EU is committing to build a supportive environment in Europe from a regulatory and legal perspective. It will participate alongside Member States and private enterprise to build next generation technology and infrastructure solutions such as Cloud at the Edge Computing, Quantum Computing and of course Blockchain. – European Strategy for Data, A Meeco review of the European Strategy for Data Communication from the European Commission on February 19th, 2020. Europe will continue to place the individual citizen at the centre of the data equation, by promoting a human-centric approach to Personal data management. This document has been written to give the reader a snapshot of the new European Union (EU) Strategy for Data published as an EU communication on February 19th, 2020. 
In this document the authors express their opinion by way of commentary on the topic of Personal data management and analysis of the strategy that the EU will adopt. Meeco’s review points the reader to some of the most important elements of the EU’s position on various data issues, as well as the key elements of its strategy. We have taken care to include all direct excerpts between quotation marks and to reference them clearly back to the original communication document, by way of footnotes and demonstrated through relevant case studies. The production of data is literally exploding, with volumes growing massively from 33 zettabytes in 2018 to 175 zettabytes forecast in 2025. Much of this growth will be driven by the Internet of Things but not solely, as individuals, businesses and organisations fully understand the true potential of data-usage. In promoting more effective data usage, the EU believes that we can solve social and environmental issues for example and in doing so make for a more prosperous and sustainable society. “Data is the lifeblood of economic development” – page 14 European Strategy for Data € 4 – 6 Billion to be invested in total in 9 common European data spaces and a European federation of Cloud infrastructure and services. In addition, the strategy calls out the need for Personal data spaces that are ethical, compliant and human-centric. Industrial (manufacturing) data space, to support the competitiveness and performance of EU industry and to capture the potential value of non-Personal data in manufacturing. Green Deal data space, to use the major potential of data in support of the Green Deal priority actions on climate change, circular economy, zero-pollution, biodiversity, deforestation and compliance assurance. Mobility data space, to position Europe at the forefront of the development of an intelligent transport system and to facilitate access, pooling and sharing of data from existing and future transport and mobility databases. 
Health data space, which is essential for advances in preventing, detecting and curing diseases, as well as improvements in the accessibility, effectiveness and sustainability of the healthcare systems. Financial data space, to stimulate enhanced data sharing, innovation, market transparency and sustainable finance, as well as access to finance for European businesses and a more integrated market. Energy data space, to promote availability and cross-sector sharing of data, in a manner that facilitates innovative solutions and decarbonisation of the energy system. Agriculture data space, to drive sustainable and competitive agricultural performance, by processing and analysing data that helps a tailored and precise approach to production at the farm level. Public Administration data space, to boost transparency & accountability in public spending, fight corruption and support law enforcement needs and enable innovative ‘gov tech’, ‘reg tech’ and ‘legal tech’ applications. Skills data space, to reduce the skills mismatches between the education & training system and labour market needs, with a strong link to the EU’s digital skill enhancement programs and investments. We have also taken the opportunity by way of Case Studies, to highlight leading examples of Personal data management, and how their approach to Personal data management fits in with the European strategy for data. The Case Studies comprise five key perspectives: Business, Legal, Technology, Barriers and Catalysts. Through this lens we have addressed the commercial opportunities enabled through new technologies designed to meet regulatory compliance. We also explore the barriers that exist to build scale, together with the strategies that can accelerate adoption. Human-centric Personal data solutions allow individuals to derive more benefit from the Personal data that they exchange with the people and organisations they trust. 
For enterprise and organisations, they bring better compliance with Personal data rights and regulations, new business model opportunities and improved operating efficiencies. Everyone wins! Still, barriers to human-centric data solutions exist such as embedded data practices and business models, despite hefty fines and flat growth. These can be broken down in future by catalysts such as greater investment in European alternatives to existing social and surveillance economy platforms. Personal financial management requires a human-centric approach based on compliance, tax regulation and Personal circumstance and information. But accessing and reviewing the relevant data and information is both costly and time-consuming. In wealth advisory, use of Personal data management products supporting secure financial collaboration between advisors and clients facilitates transaction readiness and significantly decreases time-to-value. The upfront effort to collect relevant Personal information on a Personal data management platform can be a real barrier to sharing it with a wealth advisor, adversely impacting the quality of the advice. Giving Europeans easy, secure and timely access to machine-readable financial data (banks, brokers, insurers and government) and initiatives such as PSD2 and Open Banking are amongst catalysts that will help to increase adoption of this best practice. The use of a Privacy-by-Design Digital Safe is a way in which enterprises can provision added-value services for their individual customers. In the case of a bank, this is akin to the extension of a traditional bank safe, in that it serves to store Personal and valuable things (data, documents, Personal information), it requires two sets of keys to open the safe and only the registered owner of the safe can access the contents. The Digital Safe serves as the foundation for building a whole portfolio of data-enabled services that generate great value for individual customers. 
It also serves to position the enterprise or organisation one step ahead of GDPR. A common barrier to enterprise solutions for Personal data management is that many business managers do not yet fully appreciate the business opportunities that Personal data management offers. A catalyst that can serve to offset this barrier amongst others would be tax and financial incentives for European enterprises that adopt human-centric technologies and demonstrably contribute to the success of the EU Data Strategy. People and businesses need to make claims about Personal information and status or business information daily by using physical documents issued by trusted organisations, such as passports, audited tax returns, diplomas or marriage certificates. In a digital world this becomes much more complicated. The use of digital Verifiable Credentials and blockchain technology allows the individual or business to manage their credentials, once issued by a trusted source, on multiple occasions and with different 3rd parties. Two organisations that are actively driving the standards around Verifiable Credentials and Decentralised Identity are the World Wide Web Consortium (W3C) and the Decentralized Identity Foundation (DIF). The single most important barrier to the development of Verifiable Credentials is the effort required to help individuals, enterprises and businesses understand blockchain and distributed technology. Catalysts that would help achieve more widespread adoption of Verifiable Credentials include a clear governance framework to clarify the roles of issuer, recipient and verifier, as well as standards for interoperability that help to read, use and revoke Verifiable Credentials. Data is one of those rare commodities that can be used and re-used by multiple parties, across different sectors and concomitantly or at different times. But this means data needs to be interoperable and reusable. 
Two organisations that are offering Thought Leadership and promoting the governance and standards required in this domain are MyData.org and The Kantara Initiative. A definite barrier to interoperability and reusability of Personal data is that enterprises and organisations continue to believe that they own their clients’ Personal data and therefore make no effort to render it interoperable and reusable. Catalysts that would help achieve more widespread adoption of Verifiable Credentials include a clear governance framework to clarify the roles of issuer, recipient and verifier, as well as standards for interoperability that help to read, use and revoke Verified Credentials. There is a wide spectrum of potential data-enabled business models that could benefit from the European data strategy. At one end, the model predicated on monetisation of Personal data by virtue of the transfer of control or data to a 3rd party. At the other end of the spectrum, a model predicated on collaboration between the individual data subject and 3rd parties creating an equitable share in the value generated. Market forces will ultimately determine which model if any prevails. The barriers to such disruptive business models include first and foremost the slow rate of adoption of human-centric tools, allowing individuals to collaborate with enterprises and organisations. Beyond incentives for the offer and uptake of human-centric tools, a catalyst that can help promote new business models would be the reduction in reliance on current platforms that monetise data versus surfacing intent insight that rewards individual participation. Despite all the federation and unity that still holds Europe together, there is a degree of fragmentation amongst Member States with regards to adapting their legal frameworks to the host of issues involved in data management in Europe. 
There are 8 priority issues that will need to be solved by all Member States together if Europe is to leverage the size of its internal data market. These include: The EU strategy for data is built on four pillars: Amongst the various actions that support these four pillars, it is worthwhile highlighting the very considerable investment that will be dedicated to ensuring that Europe reaches sovereignty in the technology that is necessary to support data management and usage. The EU intends to reach beyond its borders in all matters pertaining to international data flows, in order to protect the rights of its citizens and the competitiveness of its enterprises. It also sees an opportunity to attract storage and processing of data from other countries and regions, based on its high standards in data regulation such as GDPR, as well as ongoing policy formulation. By any standards, Europe is a vast entity. In 2017 it had 27.5 million active enterprises, employing 150 million people. And the total population for the 28 Member States was close to 450 million inhabitants. There is a sense of the huge opportunity of a single European data market that works properly. This document has been written to give the reader a snapshot of the new European Union (EU) Strategy for Data published as an EU communication on February 19th, 2020. In this document the authors express their opinion by way of commentary on the topic of Personal data management and analysis of the strategy that the EU will adopt. Meeco’s review points the reader to some of the most important elements of the EU’s position on various data issues, as well as the key elements of its strategy. We have taken care to include all direct excerpts between quotation marks and to reference them clearly back to the original communication document, by way of footnotes and demonstrated through relevant case studies.",https://www.meeco.me/data,,Review,,Ecosystem,Public,,,,,,,2019,,,,,,,,,,,,,
|
||
Meeco,Meeco,,HelloUser,,,,,,,"Hello, User: Episode 13 with Katryna Dow","Welcome to lucky episode number 13! Your new host Aubrey Turner, Executive Advisor at Ping, is thrilled to welcome Katryna Dow, CEO & Founder of the award-winning data platform Meeco. Katryna discusses Meeco’s mission to enable everyone on the planet access to equity and value in exchange for the data and information they share. She talks about why she saw a need for Meeco’s services, what we need to know as we approach a more “physigital” world, and how her vision all started with a Tom Cruise film.","Description: Welcome to lucky episode number 13! Your new host Aubrey Turner, Executive Advisor at Ping, is thrilled to welcome Katryna Dow, CEO & Founder of the award-winning data platform Meeco. Katryna discusses Meeco’s mission to enable everyone on the planet access to equity and value in exchange for the data and information they share. She talks about why she saw a need for Meeco’s services, what we need to know as we approach a more “physigital” world, and how her vision all started with a Tom Cruise film. Key Takeaways: [1:34] Katryna talks about her journey of founding Meeco, and how she was inspired by Tom Cruise’s movie Minority Report. In early 2012 she sat down and wrote a Manifesto, and asked the question: what would happen if everyday people had the power to make really good decisions on data, the way that social networks, government, and enterprise do? How can we create meaningful value and make better decisions with our data? [8:12] Katryna shares some of her concerns around modern privacy and where she sees things evolving, both good and bad. [9:35] Technology is neutral. It’s what we do with it that gives it bias and can make it either creepy or cool. [11:33] What does Katryna mean when she says it starts with trust by design? [17:22] The next wave may be just starting to bring people and things into the direct value chain, through wearables or IoT devices for example. 
[18:31] How can we create better digital onboarding for employees, knowing that even post-COVID-19 our world will not go back to how it was in December 2020? One thing that Katryna is sure of is that we must lean into innovation rather than doing nothing and waiting to see. [36:13] We must make sure we are paying attention to the misalignment between law and technology, especially when it comes to ethics and the safety of children growing up in a digital-forward world. Quotes: - “I think the challenge for any kind of technology and regulation is a lag factor, not a lead factor.”—Katryna - “The line between creepy and cool is one of the things we are always trying to address from a technology point of view.”—Katryna - “There isn’t really the option to not find better ways of digitally engaging.”—Katryna Mentioned in This Episode: Ping Identity, Aubrey Turner, Katryna Dow, Meeco, “How COVID-19 has pushed companies over the technology tipping point—and transformed business forever”",https://hellouser.libsyn.com/episode-13-with-katryna-dow,,Episode,,Explainer,,,,,,,,2022-01-11,,,,,,,,,,,,,
Meeco,Ubisecure,,,,,,,,,"Katryna Dow - Data minimisation: value, trust and obligation","She is the founder and CEO of Meeco, a Personal data and distributed ledger platform that enables people to securely exchange data via the API-of-Me with the people and organisations they trust. Katryna has been pioneering Personal data rights since 2002, when she envisioned a time when Personal sovereignty, identity and contextual privacy would be as important as being connected. Now within the context of GDPR and Open Banking, Distributed Ledger, Cloud, Artificial Intelligence and the Internet of Things have converged to make Meeco both possible and necessary. For the past three years, Katryna has been named as one of the Top 100 Identity Influencers.","with Katryna Dow, founder and CEO of Meeco. Katryna talks to Oscar about her career (including inspiration from Minority Report), Meeco’s Personal data & distributed ledger platform, the importance of data minimisation to inspire trust in organisations, and cultural differences in attitudes towards digital identity. [Scroll down for transcript] “The greatest way to overcome this privacy paradox is transparency.” “Where regulators have moved to increase the data transparency and data rights of individuals, these need to actually be part of the solution architecture.” Katryna Dow is the founder and CEO of Meeco; a Personal data & distributed ledger platform that enables people to securely exchange data via the API-of-Me with the people and organisations they trust. Katryna has been pioneering Personal data rights since 2002, when she envisioned a time when Personal sovereignty, identity and contextual privacy would be as important as being connected. Now within the context of GDPR and Open Banking, distributed ledger, cloud, AI and IoT have converged to make Meeco both possible and necessary. Find out more about Meeco at Meeco.me. For the past three years, Katryna has been named as one of the Top 100 Identity Influencers. 
She is the co-author of the blockchain identity paper ‘Immutable Me’ and co-author/co-architect of Meeco’s distributed ledger solution and technical White Paper on Zero Knowledge Proofs for Access, Control, Delegation and Consent of Identity and Personal Data. Katryna speaks globally on digital rights, privacy and data innovation. Follow Katryna on her blog at katrynadow.me, on LinkedIn and on Twitter @katrynadow. We’ll be continuing this conversation on LinkedIn and Twitter using #LTADI – join us @Ubisecure! Go to our YouTube to watch the video transcript for this episode. Or subscribe with your favorite app by using the address below Podcast transcript Let’s Talk About Digital Identity, the podcast connecting identity and business. I am your host, Oscar Santolalla. Oscar Santolalla: Hi and thanks for joining today. Today, we’re going to have a very interesting conversation about how many technologies and business ideas converge into products that help people directly to protect their data and their identity. For that we have a very special guest. Our guest today is Katryna Dow. She is the founder and CEO of Meeco, a Personal data and distributed ledger platform that enables people to securely exchange data via the API-of-Me with the people and organisations they trust. Katryna has been pioneering Personal data rights since 2002, when she envisioned a time when Personal sovereignty, identity and contextual privacy would be as important as being connected. Now within the context of GDPR and Open Banking, Distributed Ledger, Cloud, Artificial Intelligence and the Internet of Things have converged to make Meeco both possible and necessary. For the past three years, Katryna has been named as one of the Top 100 Identity Influencers. Hello, Katryna. Katryna Dow: Hello, Oscar. That introduction makes me feel I’m going backwards and forwards in time at the same time. Oscar: Very nice talking with you now Katryna. It’s super interesting having this conversation with you. 
I know there are so many things we can talk about. And so, I would like to hear from you what was your journey to this world of digital identity? Katryna: So, I don’t know where to start because I’m not sure it’s something that I ever consciously woke up one day and went, “Oh, you know, I really want to work in the identity space.” And I think that may be true for a lot of people that maybe you’ve even interviewed previously. It actually unfolds out of something that is either driven by something you’re trying to do in society or related to commerce or related to access to services. And then all of a sudden you have this question of who or what are you? Are you supposed to be here? Are you allowed to have access to this place or this thing? And now you have access, what are you allowed to do? And kind of that idea of us coming up against some kind of perimeter where we need to prove or state who we are and then have access. It happens to us every single day in every way whether or not it’s jumping on a bus or paying something or opening a door or joining a call with a password. This idea of this entrance is something that is a theme for us in the physical world, in the digital world. And therefore I think, one of the things is when we say identity it really just means so many things. And it has so many applications to so many aspects of our life. And I think rarely does it mean the same thing in every circumstance or certainly the same thing to everybody. Oscar: But your origins were in the legal, in the tech, how you started? Katryna: Yeah. So, I think where it started for me, actually is I’m a big Sci-Fi fan and I guess the film that really changed the trajectory of my career and led to me starting Meeco was the film Minority Report. And I don’t know if you remember that or if listeners remember that film. It actually came out at around I think 2002. 
And in fact, Jaron Lanier, who went on to write a number of books specifically around privacy and Personal data, he was the technical consultant on that film. And it featured lots of fantastic, super futuristic technologies. And the interesting thing if you listen to Jaron Lanier speak now, for the film he invented all of these cool things where everyone’s data was being aggregated and everyone could be tracked and traced. And his aim was actually to scare people in terms of what that future would look like but the technology looks so cool. I think people then started from that time on starting to implement it. And for me, I remember watching that film and there are a couple of key scenes where you have Tom Cruise running through a mall with all of these digital messages being streamed into his eyes, all these advertisers, based on his digital footprints. And I just remember being in the cinema just staring at the screen and thinking, “You know, is this really the future?” And in some ways, it may look like a marketer’s dream. But how do we navigate that? How do we navigate that in decision making? How do our brains cope with that? How do our senses cope with that? How does our sense of self cope with that? And I remember coming out of the cinema and thinking, “I really enjoyed the film. But it freaked me out and someone should do something about this.” And it never dawned on me that it would be something that sort of changed the direction of my life in years to come. But I could say that there was absolutely that moment, walking out of the cinema, standing on a street corner, looking up to the sky and thinking, “Is this the future we’re heading towards?” And then it really started to bother me. So there were many, many aspects of that film around the idea of access to our information, our digital identity, our physical identity, our freedom of movement. 
There were just so many parts of that that then started to bubble up for me to be problems to solve or things to understand, which eventually led to me starting Meeco which was almost a decade later. Oscar: So super interesting. You are here because of, among other reasons, because of Minority Report. Super, super interesting. I would like to hear, as you start saying about the problems that have been appearing during these years, and now, you have some solutions for those and they are behind your company, Meeco. So, I’d like to jump into that. So, tell us about Meeco. What are you doing? What are use cases that you are working with? Katryna: Yeah. So, one of the things that we say that Meeco is simply the API-of-Me. And the reason we say that is pretty much everything that we do today, even the fact that you and I having this phone call, this digital phone call, you’re recording it, we’re co-creating data and information. So, everything we do creates data. And so we’re creating this digital twin or this digital replica of ourselves every minute of the day. And I guess the idea first came from the perspective that my background at that time that I started the company, I was working in financial services, I was working in strategy. You know I was noticing that a lot of the products and services that we were creating every day were not really helping the people that we were trying to help. They weren’t helping customers. And a big part of that was because we were building things that we thought people wanted or needed without necessarily understanding what was happening in people’s lives. And it seemed to me that data or understanding what was going on for somebody was a really important part of it. So that was one aspect. The other aspect was obviously this thing with Minority Report that had been driving me crazy. Another component of it was the idea that we’d already seen significant business value in organisations being able to integrate digitally. 
So the idea that you could have a financial institution talking to an insurance company or talking to a government department through a technical integration obviously led to really great outcomes. And so, for me, I was thinking OK, if we’re becoming more and more digital ourselves, wouldn’t it make sense that we could integrate directly and hence the idea of the API-of-Me. If we could bring our digital identity, or the relevant parts of our digital identity, and our data together into an API and then exchange that directly, plug that into an organisation and exchange that directly with people or organisations we trust. And why? Because that would give us a greater chance of having some context and control over the kind of information we share. But it would also mean that we could start to connect into more meaningful relationships where by using that information, hopefully we could make better decisions and have better outcomes. So the idea initially was around this idea of if you have access to data and information. And early 2013, I wrote a manifesto which is around the idea that up until now, governments, organisations, social networks, have had the ability to bring data together. What if you and I had that same power? So that was kind of the early concept. And at that time, to sort of validate whether or not the world was going that way, Facebook was about to do its IPO. So I think their IPO was around May 2012. And of course, there was massive growth in the organisation and so it was one of the things that I wanted to look at in terms of validating this hypothesis. So, if we were able to have access to our data, if we have secure tools and a method of exchanging or sharing that, if we wanted to use data to make better Personal decisions, better family decisions, better society decisions, what was happening in the data space? 
And the first thing I did was look at Facebook’s IPO documents and I jumped straight to the risk area to look at the risks that they were signalling to the market that may impact either the reputation of the company or their share price in the future. And there were four things that really jumped out at me as worthy problems to solve and the first one was privacy. This idea that individual users of the platform at some point might be concerned about their privacy. The second thing was that people may actually come to understand the value of their data and there was a risk that maybe the business model that Facebook was recommending may be something that individuals may see was not in their best interest which led to another risk around the possibility of regulation. So regulation that would protect data rights or protect your privacy. And then the fourth really important thing at that time was that Facebook was moving from desktop more and more to mobile, and so many aspects of their business model were becoming more opaque, you know, they’re under the bonnet. And so a lot of what was going on around data collection and data use in advertising was not so visible on a small screen. And so for me, I was looking at those four things and I thought, “Well what would be the opposite of that?” The opposite of that would be a platform that was privacy by design, security by design. It would be a platform that didn’t read or mine or sell Personal data. It would be a platform that took into account the importance of regulation and protections. And it was a platform where the focus of the data use was for the specific benefit of the data controller. And it was sort of each of those factors that became the inspiration of saying, OK, why don’t we start to build this as a technology platform? Oscar: Yes. And these four risks that Facebook identified at that time, the time of the IPO, you analysed them. 
So, it means that now Meeco is a product that somehow is addressing these four risks. From the point of view of Facebook, they are risks. From the point of view of the people they are what would protect us. So, now you have a product, Meeco as a product could enable an individual to take control of their identity and data. Katryna: So we started with that Oscar. We started with the idea – well in fact, if I go back to the 2012, 2013 I think like any start-up story or the idea of you get one thing right, you get five things wrong. So, the early days were absolutely pioneering. You know we were trying to understand how we could realise the vision of those four things. And we certainly developed a product, in fact, early 2014 we developed a platform that could do all of those things that I just described. However, what we realised was that we were still really early from a market point of view. And one of the things that was missing was helping organisations, helping government, helping society understand that shift in power towards individuals being part of the value chain, being able to collaborate directly, but also use cases that focused on mutual outcomes. So, if I fast forward to where we are now, it’s not that any of our technology has necessarily changed but I wouldn’t describe Meeco as a technology now. I think what I would describe Meeco more accurately is a really important set of tools or a toolkit. And so, what we’ve done over the last couple of years and really importantly more so through COVID is realise that yes, you can build a product and you can try to bring a product like that to market and focus on either a single use case or set of use cases, or you could take a step back and think what if you took the components of that product and turned those into tools so that you could actually help solve these problems in lots of products. And really that’s what our focus is. So we have a suite of APIs. 
We focus on collection and protection of data, so encryption, the exchange, so fine-grained consent, the ability for you and I to connect and for me to be able to decide what I want to share and for how long, and a whole range of modular solutions that are available for a bank to use to build a more privacy enhancing experience for their customers. So a good example of that is KBC bank here in Belgium that had our technology inside three of their retail bank brands and that’s specifically to allow their customers to have a digital vault where all of the data is directly under the customer’s control and the bank can’t read that data and we can’t read that data. And it’s up to the customer to decide how they want to use that data. So, that’s an example of how our privacy by design, security by design APIs could develop something specifically for an end customer. And another example also here in Belgium is a really exciting product that we will co-launch in a few months called Mix It and it’s actually a media platform specifically for children. And as we often say the features of Mix It are all the things it doesn’t do, so no tracking, parental controls that make sure that there’s no content a child has access to that hasn’t been approved by a guardian, a parent. And again, it’s all our backend technology, our APIs, our encryption, our privacy by design, security by design capability. But the whole idea of what you can do with that is really what our focus is now. So for us, it’s here are these foundational tools that give people access and control and then how might you start to build applications that can be more inclusive, more privacy enhancing and ultimately create better outcomes for the service provider and definitely for the individual customer. Oscar: Sure. I understand more. So, nowadays Meeco has, as you said a toolkit, a set of tools that address different problems. 
For instance, who is the customer of Meeco and who is the one who is going to use the API, implement the API or integrate. It’s an organisation, correct? Katryna: Well, yes, we often say that we play this interesting position of kind of Switzerland, this neutral territory. Because if we go back to our original manifesto which still guides every decision we make, our focus is on empowering individuals – there’s no question about that. But we recognise, and this was part of our learning process, that you can actually make a much bigger difference if you’re able to take that capability and that power in those tools to where customers are already, where citizens are, where students are, where patients are, rather than expecting people to dramatically change their behaviour. And I think that’s one of the big challenges, you know with technology in general but certainly around privacy and helping everyday people to understand the benefits of being able to have greater control or transparency over their information. So in that regard, we’re very, very clear that our end-customer or the person we serve is always the individual that has those digital rights, always. However, we recognise one of the ways to do that is to enable applications to be built that make people data happy, or regulators data happy, or families data happy, or cities data happy. And what we mean by that is take all of those four foundational ideas that we had right at the beginning of building the company and our technology and make sure that they’re available to be embedded in existing applications but to give very different outcomes. So those different outcomes are greater transparency. Those outcomes are around privacy, security. 
Those outcomes are around ensuring that individuals recognise, before they say yes to something, whether or not that is going to result in a better outcome, or whether or not it enables them to do something with information in terms of decision making or access that they haven’t been able to do previously. So, our focus is always on how do we serve the customer, the patient, the student, the citizen and therefore how do we make these tools available into the various technologies that people are using every single day by adding some additional choices for them around that control and transparency. Oscar: And going – now, focusing on the identity of the set of tools that Meeco has, so how is the identity? How is the identity that is created? And could you tell us about that? What is created? Katryna: Sure. So there are a number of things that we do from an identity point of view. So, the most independent set of capabilities support self-sovereign identity. So we follow the W3C Standards with regards to verifiable credentials and SSI, so the ability for us to generate a wallet for verified credentials to be able to use to prove your identity. We also assist with a key store to help people bootstrap and get themselves out of trouble if they were to lose that identity in some way, by being able to regenerate keys and take some of the complexity out of SSI. And then the other thing that we do is we can marry that together also with a secure data storage, or secure enclave, where data can be saved and controlled and then it can also be integrated into other solutions. So, in terms of identity that’s one aspect. The other part of identity is to be able to embed our applications and tools inside existing infrastructure. So for instance, into your mobile banking applications so that you would log on to the bank exactly as you do now. 
You would rely on all the bank’s security architecture, except that you would find yourself transferring to a secure space inside a secure space that was completely controlled by you in terms of your data and information. So, sometimes it’s a case of identity standing alone and independent. And other times it’s about layering that identity into secure spaces that give you the ability to work within an existing environment. So for instance, within the context of your banking application or in an ecosystem, so for instance within the context of the relationship that you may have through your mobile banking identity with the other service providers, along with the ability for you to control the information that you want to share, or being able to have access to data from other parts of your life and bring that together with the information that is available to you from that service provider. And I guess the best example of that, the simple example of that, is obviously what’s happening with Open Banking. Oscar: OK, interesting. So you said the way you are addressing is thinking of the individual but you are operating with the organisation mostly. And what about this balance between the privacy needs that the people have and the data needs of the organisation? Katryna: That’s an excellent question because one of the things that we know is that organisations are always looking for more data as a means of being able to Personalise outcomes and so there’s often a strong argument to say why they would like more data. But we also recognise, and I was actually reading a report just this week, it was published by the Australian government and it’s a post COVID study – and by post COVID I mean it’s very recent. It’s not something that was done before March. It’s been done within the last month or so, looking at trust, specifically trust and privacy. 
And one of the really interesting statistics that jumped out at me, and this is in the Australian context, is that up to 81% of Australians in the research felt that when organisations ask for data that’s not directly related to the service that they are involved in, that they consider it misuse. So I think this is this paradox that we have here. So we have organisations that want access to more data in order to understand the customer better or the patient better or the student better or a city better to plan you know mobility or access to services or public transport. So you see that there is this interest in having the data but at the same time without that sense of transparency or there being a clear value proposition, it’s leading to greater mistrust. And so, this is one of the ways that we believe very strongly our technology addresses this privacy paradox – by bringing the individual directly into that value chain and making it clear what information you want and why, and how that is going to create some beneficial outcome. And the beneficial outcome doesn’t always have to be for the individual. A perfect example is how you may want to share data for city planning, so that public transport runs more effectively. Or, how you may want to participate in a clinical trial to make sure that something like a vaccine is more effective. Or how you may want to take intergenerational information that’s been important for you because you have small children, and you may have had an illness with one of your parents and you recognise that there is a great benefit in actually understanding the health and well-being of the family from an intergenerational point of view. So, one of the things we always talk about is equity and value. And we’re not talking about money. We’re not talking about grabbing your data and selling it because in some ways that’s a bit of a race to the bottom. 
Because the ability to help people make better decisions, or to contribute to decisions in their family or their society, is actually just one of the ways that people can collaborate and use their data. So, the important thing is that transparency and if we go back to that statistic of 81% of people mistrusting and being concerned, we actually find it flips over to be greater than 86% when organisations are very transparent around why they are asking for something, how they’re going to use it and then what the value proposition or the benefit is for you to participate. And so, the greatest way to overcome this privacy paradox is actually transparency for organisations to say, “In order for us to serve you, these are the things we would need to know and this is why and this is how we will use this information to help.” Which leads to another really important factor which is data minimisation. Because when you then are able to enable the customer to directly collaborate with you then you only need to collect what you need for a specific period of time for a specific outcome. And so, not only does that reduce the burden for the organisation in terms of data collection, risk, fraud, saving unnecessary data, the compliance of unnecessary data. But it also reinforces the trust because now what you’re asking for is what you need and then you’re fulfilling an outcome which, if that is satisfactory, leads to the opportunity to build on that. And so, by taking this perspective and allowing the individual to be actually part of that collaboration, it actually does lead to better outcomes for both parties and it also starts to manage for an organisation their compliance responsibilities and also the risks associated with over collecting data. Oscar: Yes, as you said transparency is very important, so people feel that they are revealing some of their data but for a good purpose. 
And yeah, that also made me remember that sometimes on both commercial and public sites you are asked for a long list of data – if it already feels like too much, it’s already tedious. And at this point, with people getting aware of these breaches, companies misusing data, it already feels like “Oh, it’s too much.” So, data minimisation is super important as you said. But also, I was thinking when the organisations tell you, OK, we are going to use this data of you for these reasons and that is for the let’s say, public benefit. But how does this have to be communicated? Just in a form or should there be some easier way for people to… Katryna: Yeah, I think it needs to be embedded in a really beautiful and simple digital experience and I think that’s the key thing. And look, lots of our research and studies have validated this. Many of the use cases we’ve built together with our partners that are deploying our technology. They show time and time and time again that when you build that transparency into a digital experience and you make it clear what you need to collect and why and the purpose and you make that understandable, really easy for an end-user to process, that again makes it much more likely that the individual will be able to make an informed decision and engage. So, part of the idea of making that simple and clear is that it leads to people being able to make good decisions and timely decisions. The more complexity there is the more likely – you know what it feels like if you don’t understand something or something seems like it’s just a wall of terms and conditions or makes you feel uncomfortable or afraid, the easiest thing to do is to say no. And so, the focus is always on that simplicity and making it really clear what that decision is about. 
And also, if you marry that together with the idea of data minimisation it comes back to your comment, yeah, there are so many times that you apply for a service and there are so many questions that are asked that don’t seem to be relevant at all. And I think we’ve got so used to answering those questions or filling out those forms without stepping back and going, “Hold on, I’m subscribing right now to a media platform – does it really matter whether I’m a man or a woman? Yeah? Does it really matter what decade I was born or what date I was born?” And you understand that some of those things may be really important if you’re focusing on KYC, you know where you have to know the customer, you’re providing a regulated service and you need to be sure that it’s really you. But there are many, many services where we’ve got used to giving away really intimate, important information about ourselves that doesn’t really make too much of a difference with the service and what we’re trying to achieve. And I think that’s a big part of helping organisations to recognise that if you then focus on the things you really need for Personalising the service in a way that has a good outcome, then you’re much more likely to have people participate in sharing that information in that context. Oscar: Yeah, yeah definitely as you said, very simple, very engaging and well-designed user experiences. Now, bringing a bit of attention to the differences in geography. So, you’re coming from Australia, now you are living in Brussels. Here in Finland, we’re in the European Union so things are a bit different, I guess. You might know more the difference. What are the cultural differences around identity and data privacy, how people think differently, act differently? Katryna: Look, I think one of the things that’s very different, like a big difference, between Australia and Belgium or Australia and EU is we don’t have the idea of a sort of national identity card. 
So, I’ve been going through the process for the last year and only about a week ago received the final part of my visa which resulted in an identity card. So, I think that’s one of the biggest differences and that seems to be just normal in a European context. You have an identity card, you can use it at the border, you can use it to access the government service. There’s information coded on that card which is completely Personally identifiable, your name, your date of birth, the rights that you have within the context of the country in terms of living and working. So I think one of the big differences is this idea of national identity. And of course, we have that in Australia because we have a passport of course, or driver’s license which is issued state by state. Or we have what we call a Medicare card which is the access to a medical system. So, we have those identifiers and of course those could be federated that make it very easy from a government perspective to know who you are in that context. But I think that’s one of the biggest differences. I think the other difference is that there is more bureaucracy and process that I notice in comparison to maybe how things are done in Australia but in some ways the integration of that bureaucracy and process also enables you to do more. So, on the one hand, again if I come back to the idea of the identity card, it then connects you with a lot of services where you’re able to be recognised immediately and that eligibility is recognised immediately. Whereas in the context of where I come from in Australia, some of those things are more siloed and so you have a specific identity that works for one part of an essential service but doesn’t work in another. And for me, I’m not sure, Oscar, I’m not sure whether I think one is better than the other. I think I can see positives and negatives for both systems. 
And I think for me it also comes back to this idea of maybe governance, separation of concerns and the data minimisation. You know you can have a great federated system if you manage the governance of that well and the access management. Or you can have a highly siloed system that is not very well managed. And despite it not being connected to other things, you have somebody’s identity very much at risk even though it’s only in one siloed database or something. So I think for me that’s more around the security architecture than necessarily the way the system works. Oscar: OK. Well, interesting. And going now into thinking of business opportunity, business opportunities of people who are listening to this work in let’s say organisations and they want to follow this principle that you have been talking about now, keeping privacy first, focusing on the individual – what are opportunities that organisations have today? Katryna: I think the interesting thing is – and there are always three categories when it comes to thinking of any emerging technology. So there is the group of people that are interested in things because of the innovation, right? So, what’s happening? What’s available? What’s possible? How do we keep our organisation or our cohort at the edge of what’s happening from an innovation point of view? Then you have the middle where things may become mainstream and one of the things that creates that mainstream adoption often is regulation, or the way things are done. Meaning, a really good example for instance is Estonia and their digital identity scheme and the way that many of the services that you want, including your tax return or doing anything with the government, are through that integrated system. And so the adoption is directly related to the motivation for somebody to get a rebate on their tax, OK? So, if you need to do that then you adopt. 
And then you often have the last group, which is interested in adopting technologies because there is no other choice, meaning that they face a fine from a regulatory point of view or maybe they have been fined, or they’ve had a data breach, or they’ve had a privacy problem, or there’s a loss of trust. So I think for us, what we found over the last couple of years is that more and more we were working with that first group. So, a really good example with KBC here in Belgium is that for the last five years they have been voted the most trusted brand in Belgium, so not just bank, bank brand, but trusted brand. So it makes sense that they would focus on doing something that would give their customers a greater sense of that trust, privacy and control. And last year they were also voted the best digital bank. So that’s a really good example of an organisation that has a strategy around trust and therefore makes decisions for its customers that are in line with that. However, what we see increasingly now are things like Open Banking, GDPR, in Australia the Consumer Data Right, in California the CCPA. So what we’re starting to see is where regulators have moved to increase the data transparency and the data rights of individuals, that these need to actually be part of the solution architecture. And that’s really where Meeco comes in, is finding this lovely balance between innovating for the customer and providing better outcomes but also recognising the obligations that you have from a compliance point of view, not only to your customers to protect them, not only to your organisation to do the right thing, but also because you’re working within a regulated market where it’s critical that you are operating within whatever those guidelines are – you know data guidelines, financial guidelines. 
So what we find increasingly is the case now, is that organisations are looking for these kinds of tools and they’re looking for how they can be embedded in existing applications to give greater transparency to customers, but also to create better outcomes for customers and actually innovate with some amazing new capabilities, that weren’t possible without that foundation of trust and control. Oscar: Yes, definitely. Very good examples, as you said the one bank in Belgium is the most trusted brand. Wow, that’s a really great example also. Katryna, last question for you is going to be, if you will leave us with a tip, some practical advice that anybody can use to protect our digital identities. Katryna: OK. So, this is maybe a slightly different way of coming at this answer. But I always like to think of my digital identity as if it was a part of my physical self, you know an extra finger, an extra toe. And if you start to think of your digital identity in the same way that you might think of your physical self and somebody actually wanted you to you know put your finger on a reader or have access to your physical self in some way, there is a moment where we always stop and think, yeah? If somebody wants to touch you physically, you think about it, yeah? Oscar: Yes. Katryna: You have a response. And I think what’s happened all too often is we’ve become so acclimatised that somebody asks for something digitally and we just give it. We don’t stop and have that little pause where we think, “OK, why? Or why am I doing this? Or what is it needed for?” And I think a good practice, a practice that I do myself all the time is if I’m asked for something digital, I stop for a moment, I think, if somebody was asking this of me in a physical sense, would the answer still be yes or no? And it may well be that the answer is yes, but you are completing that process in a very, I guess conscious way. You know, you’re clear about what you’re doing and why you’re doing it. 
And if the answer is no, then it just gives you an extra 30 seconds to consider whether or not it’s the right thing to be doing digitally. You know, is this the right website to be putting my credit card into? Or is this the right service for me to be providing my date of birth and my nationality? Whatever it is, and it may well be yes, but it’s a good habit to cultivate to just stop, think about it in a physical sense and then work out whether or not you want to complete the transaction. Oscar: Oh yes. So you do it every day. Katryna: Every day. Every day. Oscar: Wow. Katryna: I think there’s a part of my brain that goes crazy whenever I go to do anything digitally, and another part of my brain goes, “Oh, here she goes again.” Oscar: Well, it’s a good one. It’s a good one. Yeah, I haven’t heard that example. It’s definitely a good practice. I will try it definitely. Thanks for that. Well, thanks a lot Katryna. It’s very fascinating talking with you. Please let us know how people could learn more about you, the work you are doing at Meeco or how to get in touch, etc. Katryna: Yes, Oscar. It’s very simple. You can visit our website which is Meeco, M-E-E-C-O.me and there you’ll find some information for developers, so if there are things we’ve talked about today that inspire you, you want to build a product or a service and you care about your customers, you care about their privacy, you care about their digital rights then we would love to talk to you. We’d love to see how we can help. You may be an end-customer and you’re interested to see the type of applications that are already using our technology, which include, as I said, products in the financial space, in the media space for children. So you may just be interested in choosing products that give you greater control and choice. 
Or you may be an organisation that is starting to look at your overall digital strategy, where the things that you care about are getting this balance between transparency and happy customers and also doing things correctly from a regulatory point of view – then it’s a way of bringing all those things together. Oscar: OK. Perfect. Again, Katryna, it was a pleasure talking with you and all the best! Katryna: OK. Thank you so much, Oscar. Oscar: Thanks for listening to this episode of Let’s Talk About Digital Identity produced by Ubisecure. Stay up to date with episodes at Ubisecure.com/podcast or join us on Twitter @Ubisecure and use the #LTADI. Until next time. [End of transcript]",https://www.ubisecure.com/podcast/data-minimisation-meeco-katryna-dow/,,Episode,,Explainer,,,,,,,,2020-10-14,,,,,,,,,,,,,
|
||
Meeco,Meeco,,,,,,,,,Data Privacy: does anyone care?,The compelling data and research suggest that my original question now needs to be reframed. People most certainly do care about their data privacy. The question now is: how are organisations going to bridge this data privacy chasm?,"We’ve all heard the refrain… no-one cares about privacy anymore. I confess that sometimes I feel the same way, particularly when I see my own family members oblivious to what they are sharing. I’ve even done it myself. Then I realise that it’s not that they don’t care about data privacy, it’s just they don’t have any agency to do anything about it. I recently gave a talk at Kuppinger Cole European Identity Conference 2021 so had cause to dig into this question afresh. If the newspaper and magazine headlines are anything to go by then data privacy is still very much a live issue. But what about real people? Surely there must be some examples that would help. I looked at two recent events that could act as a litmus test of public sentiment. The first was Apple’s decision as part of its update to iOS 14.5 that allowed users, for the first time, to control whether or not to allow apps to track their data. Pretty convincing data but maybe it was skewed towards Apple users. Maybe if I looked at something more serious like helping to fight the Covid pandemic, I’d see a completely different picture? According to the latest Edelman Trust Barometer, even fighting a global pandemic is not sufficient reason to share data. The willingness to share data with Governments to help contain Covid has actually decreased over the last 12 months. Perhaps not surprising given how some Governments have not been too adept in handling privacy concerns¹. Finally, I was convinced by some recent research by KPMG – Corporate Data Responsibility – in August 2021. The research was conducted in the US but I suspect it can be translated across the world. 
One set of statistics stood out: The compelling data and research suggest that my original question now needs to be reframed. People most certainly do care about their data privacy. The question now is: how are organisations going to bridge this data privacy chasm? This is a real opportunity for organisations to step up and take a lead. An opportunity for organisations to action rebuilding trust and becoming data sustainable for the future. There are some immediate steps every organisation should start with: - Analyse your own ethics around data collection and use. Consider implementing a code of data ethics. - Be transparent and explicit around data collection and use and do it in a way that is upfront, easy and clear for everyone to understand. - Consider using privacy enhancing technologies to anonymise data or make use of synthetic data. - Give people access and control over their data, empowering them to gain value and equity by sharing. - People openly admit they don’t know how to protect their Personal data and they are rightly peeved that organisations aren’t doing much to help. Take the lead now in establishing corporate data responsibility. Meeco can help provide the infrastructure you need to bridge the data privacy chasm. Reference: [1] https://www.theguardian.com/technology/2020/jul/20/uk-government-admits-breaking-privacy-law-with-test-and-trace-contact-tracing-data-breaches-coronavirus ________ About the author A highly strategic, technical and commercially astute executive. Jason Smith has over 20 years of experience starting, growing and managing businesses, the last 10 years of which have been with data businesses. Prior to joining Meeco, Jason led a global project within Experian as part of their global data innovation team (‘Dx’) focusing on consent, data sharing & privacy. He also co-led projects on digital ID, privacy enabling technologies and consumer Personal data management applications. 
Jason also contributed to Experian’s response to the 2020 EU Data Strategy consultation. Previously, he established a research data lab as part of ScaleUpNation, in Amsterdam, using machine learning & network science to research ‘what lies behind a successful scale-up’. Prior to that, Jason co-founded and was CEO of Blurrt, social media data intelligence software using AI natural language processing for sentiment and emotional analysis as well as topic clustering. Blurrt achieved a number of UK technology awards and firsts – notably real-time analysis of political debates and sports matches using tweets which were broadcast live. Jason has written, presented and produced 3 radio documentaries for the BBC on technology (‘BeingSocial’ on social media & data; ‘Becoming Artificial’ on AI & what it means for humans and ‘Going Viral’ on the use of technology during the first Covid lockdown). In addition, Jason has written and been invited to speak on data & AI. He is a member of the European AI Alliance and was previously recognised by TechCityInsider as one of the top 200 tech entrepreneurs in the UK. Outside of work, he cycles.",https://blog.meeco.me/data-privacy-does-anyone-care/,,Post,,Explainer,,,Privacy,,,,,2021-09-15,,,,,,,,,,,,,
|
||
Meeco,Meeco,,,,,,,,,Opening the domestic card rails to innovation,"Enabling Australian FinTechs a direct avenue to discuss how they partner and collaborate to access the eftpos payments network through the Committee resulted in recommendations that covered a number of central themes, including consultation and engagement, regulation, and technology and solutions – Ben Tabell, eftpos Chief Information Officer and Committee Chair","Earlier this year, eftpos, in collaboration with FinTech Australia, established the eftpos FinTech Advisory Committee. The Committee was established as a way of giving Australian FinTechs a direct avenue to discuss how they partner and collaborate to access the eftpos payments network for the betterment of Australia's digital economy. Over the past five months, ten leading FinTech companies joined the Committee, chaired by Ben Tabell, eftpos' Chief Information Officer. Together, the Committee collaborated to help create an initial report on how to best leverage the eftpos digital roadmap, API programs along with a variety of industry topics. Enabling Australian FinTechs a direct avenue to discuss how they partner and collaborate to access the eftpos payments network through the Committee resulted in recommendations that covered a number of central themes, including consultation and engagement, regulation, and technology and solutions – Ben Tabell, eftpos Chief Information Officer and Committee Chair. Meeco is honoured to have been one of the companies invited to join the Committee alongside Assembly Payments, Bleu, Monoova, Sniip, Verrency, Ezypay, Azupay, POLi and Paypa Plane. The aim of the Committee is to advise eftpos on ways the company can build on its efforts to make it easier for FinTechs to access the eftpos network, products and services. The focus is to enable FinTechs to build experiences that can work across a broad range of connected devices in the digital economy. 
eftpos has now released the report in collaboration with FinTech Australia, delivering ten recommendations on how Australian fintechs can best leverage the eftpos digital roadmap and API programs. Of the recommendations in the report, Meeco is especially interested in the inclusion of data as the new currency, mobile wallets and digital identity. These map directly to the work Meeco has had the privilege to explore and validate together with eftpos over the past year. This includes a micropayments Proof-of-Technology using Meeco's multipurpose wallet decentralised on Hedera and a pilot that is now underway with eftpos' identity broker solution, connectID, for credentials verification as part of employee onboarding. We're delighted that this work with eftpos and Hedera Hashgraph has resulted in us being selected as a FinTech Australia Finnies finalist in the ""Excellence in Blockchain/Distributed Ledger"" category. The Finnies event and announcement of winners is now delayed to September, due to the rolling COVID restrictions in Australia. The eftpos FinTech Advisory Committee Report, Innovating on the domestic card rails, is now available to download. We would like to thank eftpos and FinTech Australia for the opportunity to have contributed to the Committee and the report. We hope you find it interesting and useful.",https://blog.meeco.me/opening-the-domestic-card-rails-to-innovation/,,Post,,Explainer,,,Payments,,,,,2021-07-29,,,,,,,,,,,,,
|
||
Meeco,Meeco,,,,,,,,,"Understanding the difference between Identity, Authentication and Authorisation",- Identity is the answer to the “Who am I?” or “What am I?” question.<br>- Authentication is about asking can I trust who or what this is?<br>- Authorisation follows authentication to determine what services are available to the trusted party.,"On August 7–8th 2018, technology experts from Australia and around the globe gathered in Canberra for the Digital ID Show. The event was co-located with the Technology in Government Expo and the Cyber Security in Government Conference. Overall, more than 2,500 attendees wandered among 120+ exhibitors and listened to 120+ speakers from a range of industries. The Digital ID Show focused on the bigger questions underpinning digital identity and what that means for the roll out of digital identity in Australia. These questions are not only being asked in Australia, several countries including the UK, Netherlands, Canada and USA are reviewing similar ideas. The concept is to provide users with a better way to prove ID, establish trust, reduce fraud, fight crime and terrorism, and streamline services and online payments. There are many different players involved, both from the government, as well as the private sector. One thing everyone agrees on is that innovation and collaboration between all stakeholders is vital. In a world where everything is connected 24/7, where the power of AI is increasing, it is critical to get the foundations right. The recent data breaches and Facebook's data privacy scandals highlight the impact that technology can have on our Personal lives, including disastrous consequences if we do not take the right measures. Technology is neutral. It is neither good nor evil. It does not have a concept of ethics, it only does what we set it up to do. We must be the ones asking the right questions to position technology to succeed with the right outcomes in society. 
A range of regulations is being implemented across the world to set a legislative framework around the use of Personal data. Laws like the General Data Protection Regulation (GDPR) in Europe are setting a standard for data privacy legislation. In Australia, the Open Banking changes will come into effect by July 2019, permitting consumers to allow other financial companies and third parties access to their banking information. This will open new opportunities for businesses, as well as fairer and better products and services for customers. If implemented properly, these new laws can protect our Personal data while enabling Personalisation at scale, all with explicit and informed consent. If we fail to put in place the right measures, it can lead to a digital dark age where no trust is established between consumers, organisations and public institutions. Solving the question of Digital Identity is an important first step, but the efforts should not stop there. In her talk entitled “Do you really need to know who I am?”, Meeco founder Katryna Dow outlined the differences between Identity, Authentication and Authorisation and how emerging solutions like progressive disclosure and Zero Knowledge Proofs can help solve some of the identity and Personal data challenges we face in the digital world. Identity is the answer to the “Who am I?” or “What am I?” question. Authentication is about asking: can I trust who or what this is? Authorisation follows authentication to determine what services are available to the trusted party. With minimum information and transparency, users can still get maximum value from many services. A great example is the difference between asking for your date of birth versus asking if you are over 18 and therefore eligible to enter a night club, drive or purchase alcohol. Many services do not need to know your exact identity to provide you with their services. 
A trusted persona (backed by your real identity) can enable services to be provided in a more privacy-preserving way. The service provider just needs to know that they can trust the party that vouches for you. When service providers establish trust by being transparent with their customers and are clear about what they intend to do with the data they collect, research shows people are actually willing to share more data. “It is critical to understand the consequences that poor foundations have in the future. By understanding the differences between identity, authentication and authorisation, governments and service providers can design for privacy and reduce the collection of Personal data. The result of getting the foundation design right is a more secure digital society, one with less fraud and identity theft”. To learn more about Meeco’s vision for how Personal data can unlock value and enable trusted Personalised services, read our White Paper, Zero Knowledge Proofs of the Modern Digital Life. If you have any questions or want to find out more, chat with us on Telegram or follow on Twitter.",https://blog.meeco.me/do-you-really-need-to-know-who-i-am/,,Post,,Explainer,,,,,,,,2018-08-10,,,,,,,,,,,,,
|
||
Meeco,DataSovereigntyNow,,,,,,,,,"'When you invent the ship, you invent the shipwreck'","Katryna [Dow] believes that it is vital for the future of the internet that people and organisations obtain control of their data. This concept is called ‘data sovereignty’, but achieving that means rethinking the digital infrastructure the current internet is built upon. She calls for a soft infrastructure that consists of agreements between public and private-sector parties about the access to data.","In this edition of the Data Sharing Journal, INNOPAY’s Mariane ter Veen discusses data sovereignty with Katryna Dow, founder and CEO of Meeco. Katryna believes that it is vital for the future of the internet that people and organisations obtain control of their data. This concept is called ‘data sovereignty’, but achieving that means rethinking the digital infrastructure the current internet is built upon. She calls for a soft infrastructure that consists of agreements between public and private-sector parties about the access to data.",https://datasovereigntynow.org/2021/01/18/when-you-invent-the-ship-you-invent-the-shipwreck/,,Post,,Explainer,,,,,,,,2021-01-18,,,,,,,,,,,,,
|
||
Meeco,,,,,"EFTPOS, Hedera",,,,,Meeco announced as Finnies 2021 finalist,Meeco’s submission results from our work in collaboration with [eftpos](https://www.eftposaustralia.com.au) and [Hedera Hashgraph](https://hedera.com/). The Australian payments giant eftpos recently announced joining the Hedera Governing Council after successfully conducting tests to determine the [feasibility of a digital Australian dollar stablecoin for micropayments](https://www.finextra.com/newsarticle/37360/australias-eftpos-to-set-up-hedera-hashgraph-node-for-micropayments). Meeco plays an important part in this world-leading initiative as the wallet provider for the proof-of-technology.,,https://blog.meeco.me/meeco-announced-as-finnies-2021-finalist/,,Post,,Meta,,,,,,,,2021-05-23,,,,,,,,,,,,,
|
||
Meeco,Meeco,,,,,,,,Mydata 2016,Advancing human-centric Personal data: MyData 2016 powered by MeCast,"For conference attendees, whether participating in-person, or engaging with the MyData community in the comfort of your home or office, MyData 2016 powered by MeCast is the hub of Personal data activity, conversation, thought leadership and action throughout August and September.","Well, ‘powered by MeCast’ is just part of it. The entire event is actually a little bigger. We’ve written about this before, but to recap quickly, MyData 2016 is a three-day conference, starting on the 31st of August, hosted in the beautiful Finnish capital, Helsinki. The conference brings together thought leaders, entrepreneurs, academics, policy makers and executives. The thing bringing them together is that they’re all focused on developing the ecosystem and advancing collaborative approaches to human-centric Personal data products, services and frameworks. In collaboration and in support of our ongoing partnership, Meeco is proud to announce the launch of MyData 2016 powered by MeCast. Many of you are probably familiar with MeCast by now, but for those who aren’t, MeCast is social in your control and on your terms. Just like Meeco, MeCast is Privacy by Design. It’s within your control, now and forever. The app makes it quick and easy for you to simultaneously post to your social networks, whilst also creating a back-up and searchable Personal timeline of all your social posts. For conference attendees, whether participating in-person, or engaging with the MyData community in the comfort of your home or office, MyData 2016 powered by MeCast is the hub of Personal data activity, conversation, thought leadership and action throughout August and September. So let’s get practical, what can you do with the app? 
The MyData 2016 app enables posting to: - Yammer - Meeco And better yet, if you’re not able to fly all the way to Helsinki and experience the midnight sun, you can engage with speakers and topics through both MeCast and the Participate tab powered by Screen.io. This is your opportunity to ask some of the leading thought leaders, those driving the progression of the Personal data economy, the questions you’ve always wanted to ask. It’s also worth noting that the app will be used to communicate the agenda. There’s nothing worse than missing your favourite speaker, so here’s how to get started: Connect to your networks – Simply tap on the MeCast tab in the App and follow the prompts to connect your social accounts. Search your timeline – Use MeCast to search your timeline for key words and #tags. Searches will return the post including the networks you have posted to. Keep a back up of your posts – MeCast makes social sharing easy, it also helps you control your own story by backing up your posts (including photos) to your own Personal data store. The MeCast Timeline gives you an automatic back up on your mobile device. If you are a Meeco Member, you can add your MeCast Timeline to your Meeco dashboard to view and search your Personal Timeline on other devices. For more information about MeCast visit www.mecast.me. After the conference, download the full MeCast app from iTunes. Whether we see you in person, or engage in friendly Twitter banter, we look forward to MeCast powering MyData 2016. Here's a sneak peek",https://blog.meeco.me/advancing-human-centric-personal-data/,,Post,,Meta,,,,,,,,2016-08-16,,,,,,,,,,,,,
|
||
Meeco,Meeco,,,,Katryna Dow,,,,,Katryna Dow Named as a Top 100 Identity Influencer,"Katryna has been pioneering Personal data rights since 2002, when she envisioned a time when Personal sovereignty, identity and contextual privacy would be as important as being connected. Now within the context of GDPR and Open Banking, distributed ledger, cloud, AI and IoT it has never been so important to make sure identity works for us all.","Meeco is proud to announce that Katryna Dow, our CEO and Founder, has been named by One World Identity in the OWI Top 100 Identity Influencers 2019 for the third year running. OWI is an independent market intelligence and strategy firm focused on identity, trust, and the data economy. Starting in 2017, OWI have published the top 100 Identity Influencers recognising pioneers leading the charge to improve identity. It’s a who’s who of the identity world — the minds shaping the future of how data (big and small) is collated, treated, protected and shared. Katryna was nominated and selected as part of this inspirational group of professionals, including entrepreneurs, innovators and leaders across public and private sector enterprises. Making the list is a significant achievement given the high calibre of influencers, which include Tim Cook (Apple), Mark Zuckerberg (Facebook), and identity veteran Kim Cameron (Microsoft). Of note this year, more than 30% of the Top 100 are women. These influencers are finding support and opportunities to raise their profiles through initiatives such as Women in Identity. Women in Identity was founded by Emma Lindley (Visa, and Top 100), Pamela Dingle (Microsoft, and Top 100), Collette D'Alessandro (Ping) and Sue Dawes (formerly OIX). Together, their vision was to improve diversity in the identity industry. They realised that without diversity within identity companies, identity solutions would likely be less inclusive — the key thing identity solutions need to be! 
They wanted to raise the profiles and accomplishments of women founding companies through to leadership roles in multinationals. Since 2017, organisations including Microsoft, Barclays, Post Office, Capital One, HSBC, GBG, KuppingerCole, Identiverse and OWI have provided help through funding interns, space for meet-ups and hosting events. The 2019 Top 100 Identity Influencers come from a variety of backgrounds. However they share a common goal: to solve the identity challenge. Increasingly, our physical lives are shadowed by our digital actions. If we want to achieve a balance between privacy, security and utility we need to get the architecture of identity right. Some of these influencers are championing user-empowered and decentralised solutions, whilst others are focussed on the commercial returns of centralised control. Katryna has been pioneering Personal data rights since 2002, when she envisioned a time when Personal sovereignty, identity and contextual privacy would be as important as being connected. Now within the context of GDPR and Open Banking, distributed ledger, cloud, AI and IoT it has never been so important to make sure identity works for us all. “Every child born in today’s digital world will live a dual physical and digital life. How we enable privacy and security for their digital twin is as important as the rights afforded to them in the physical world. Protecting their identity, enabling trusted authentication and authorisation of services is the key challenge for us to solve in an increasingly connected world. If we don’t get this right, we risk a generation born into digital slavery rather than delivering on the promise of empowerment through technology” – Katryna Dow, World Government Summit, 2019. Meeco’s vision is to create a place for everyone on the planet to get equity and value in exchange for the data and identity information they share. 
As individuals gain the legal rights to manage and control their data, organisations need to re-think how they collect, store and exchange their customers’ identity and information for mutual value. Meeco has been pioneering new commercial models and ways to enable individuals to achieve this, by being directly part of the value chain via the API-of-Me. Katryna will be speaking more on this and other identity-related topics at OWI’s KNOW 2019 Conference in Las Vegas this March 24th – 27th.",https://blog.meeco.me/katryna-dow-named-top-100-identity-influencer/,,Post,,Meta,,,,,,,,2019-02-17,,,,,,,,,,,,,
|
||
Meeco,Meeco,,,,Eftpos; Hedera,,,,Finnies 2021,Meeco announced as Finnies 2021 finalist,Meeco’s submission results from our work in collaboration with [eftpos](https://www.eftposaustralia.com.au) and [Hedera Hashgraph](https://hedera.com/). The Australian payments giant eftpos recently announced joining the Hedera Governing Council after successfully conducting tests to determine the [feasibility of a digital Australian dollar stablecoin for micropayments](https://www.finextra.com/newsarticle/37360/australias-eftpos-to-set-up-hedera-hashgraph-node-for-micropayments). Meeco plays an important part in this world-leading initiative as the wallet provider for the proof-of-technology.,"Awards such as FinTech Australia's Finnies are a testament to Australian innovation. Australia is often considered a great test market. Despite the relatively small population compared to the USA or Europe, the high adoption of new technologies makes Australia a great market to test, validate and chart a pathway for solutions to take off globally. Indeed, Australia is proudly home to a lot of innovative start-ups, which makes the start-up industry in particular worth considering. The synergy of technological shifts, market opportunities and a vigorously growing digital culture has developed several FinTech Unicorns. In October 2020, Meeco was the proud recipient of the FinTech Australia Finnie for Excellence in Industry Collaboration & Partnerships, for our push into Belgium and collaboration with KBC Bank. So we are more than thrilled to keep the momentum going when we received news that we had been shortlisted again this year. We are proud to announce we made it to the finalists for FinTech Australia's Finnies 2021. This year, Meeco was nominated in the Excellence in Blockchain/Distributed Ledger category, for the development of our decentralised identity wallet. 
Contenders for the award this year include AgUnity, BOULEVARD Global, CommChain, Hutly, Oxen, Pellar Technology and RelayPay. “Excellence in Blockchain/Distributed Ledger” finalist Meeco's submission results from our work in collaboration with with eftpos and Hedera Hashgraph. The Australian payments giant eftpos recently announced joining the Hedera Governing Council after successfully conducting tests to determine the feasibility of a digital Australian dollar stablecoin for micropayments. Meeco plays an important part in this world-leading initiative as the wallet provider for the proof-of-technology. “When we were running the eftpos micropayments PoC with Hedera Hashgraph, we needed a wallet partner. We had had great experience of working with Meeco on our connectID PoC and needed aspects of digital identity for the micropayments PoC too, so they were a natural fit.This enabled us to combine the provisioning of an eftpos debit card, identity credentials and a stable coin in one wallet. The PoC was very successful and resulted in a technical platform that demonstrates micropayment capability in a way that no other payments provider has. Our next phase is to look at ways to commercialise this capability.” Rob Allen, Entrepreneur in Residence, eftpos Everything will be Tokenized Earlier this month, Meeco was invited to share our vision of a tokenised work at an FSClub webinar hosted by Z/Yen's Chairman, Professor Michael Mainelli. Meeco's CEO and Founder, Katryna Dow delivered the presentation titled Everything Will Be Tokenized: The Future of Identity. Katryna shared some of the exiting things we have been working on along with where the tokenised world is heading. “We're on track towards a world where everything can be tokenized. Tokenization plays a critical part in enabling more equitable value creation for people, organisations and things. 
Providing the means to issue and store value, trace provenance, and most importantly achieve consensus to instantly trust. In order for this tokenized world to emerge we will need the infrastructure for people and their digital twins to participate. This will include digital identity, verifiable credentials and payments.""Katryna Dow, CEO & Founder, Meeco Since 2012, Meeco has been pioneering the access, protection and value exchange of Personal data and identity. Over the past year we have been working towards making much of this possible, and this webinar featured some of the use-cases and practical steps along the way, such as: - Meeco’s wallet leveraging OpenID & W3C standards, designed to incentivise collaboration between stakeholders whilst maximising privacy - eftpos Australia determining feasibility of a digital Australian dollar stablecoin for micropayments - Hedera Hashgraph: the fast, fair, secure and stable Distributed Ledger Technology ensuring decentralised governance at scale. Over the past year, we have been working in conjunction with Hedera to launch a multi-purpose wallet that will enable decentralised identity, verifiable credentials, micropayments and tokens. The wallet will be available as a white label solution, with SDK and supported by Meeco's Secure Value Exchange credential solution. We look forward to sharing more news soon along with updates to our developer portal. In the meantime, the category winners for the FinTech Australia Finnies 2021 will be announced in-person this year on 21 July, 2021 at The Forum, Melbourne. From all of us at Meeco, huge congratulations to all the finalists. 🙌👏🙌👏🙌👏 In closing, a very special thank you to Rob Allen at eftpos and Tim McHale at Hedera, for enabling us to stand on the shoulders of giants. It’s an honour to be counted in such great company, thank you! 🙏",https://blog.meeco.me/meeco-announced-as-finnies-2021-finalist/,,Post,,Meta,,,,,,,,2021-05-23,,,,,,,,,,,,,
Meeco,Meeco,,,,VELA,,,,,Meeco announces VELA Solutions partnership,"Together, Meeco and VELA have created a secure digital credentialing platform to help individuals and businesses adapt to the changing work environment and modernise their HR practices.","As we head into the last month of 2020, this most extraordinary year, we’re delighted to announce a new partnership focussed on empowering work. VELA Solutions has selected Meeco to power their workforce digital credentials platform. Together, Meeco and VELA have created a secure digital credentialing platform to help individuals and businesses adapt to the changing work environment and modernise their HR practices. The post COVID-19 world will be more digital. Whilst many digital technologies already existed, to support working from home, telemedicine or signing a contract, necessity accelerated adoption this year. Now, as we look to 2021, with the promise of being able to work together again, there's an opportunity to rethink the way we seek and start work in a more trusted and digital way. New technical standards are emerging which focus on establishing trust in the digital world. One such standard is Verifiable Credentials, the method by which credentials can be issued, verified and exchanged. This includes the ability to cryptographically prove that a credential is authentic, valid and trustworthy. The benefits for people and organisations of this technology include: - Faster and more convenient to verify education and training records - Reduced risk and fraud - Credential portability and re-use - Verification and revocation - Improved compliance - Increased safety - Privacy management Founded by Andrew Scott and Michael Derwin, VELA Solutions is the culmination of their extensive experience across workforce management, organisational development and HR Tech. Their motivation to build VELA draws on their Personal stories. For Andrew, the importance of identity was something he learnt at a much younger age. 
In his late teens whilst living away from home to study, the share house he was living in was destroyed by fire. He escaped only with the jeans he was wearing and with no way to identify himself. Without proof of identity, he was not able to access basic things like health services or his bank account. Living rough was the only option until he could get back on his feet. This experience has shaped his desire to ensure people can benefit from decentralised identity, also referred to as Self Sovereign, Self-Managed or Portable Identity. “We believe that this technology will fundamentally change the way businesses and employees work together — it's just a matter of time.” – Andrew Scott, Director – VELA Solutions Michael has always had a passion for making technology work for people. He is no stranger to pioneering the change he wants to see in the world. His first foray was in the 1990s, using the early internet to support the education of Australian outback students using ""School of The Air"" – something we now know as open learning. More recently he co-founded the talent management software company, Adepto, which was successfully acquired by Degreed earlier this year. “We are now in the age where we can connect hardware and software technologies and truly empower individuals to take charge of their digital self.” – Michael Derwin, Director – VELA Solutions The maturing of decentralised identity solutions underscored the decision to find an experienced partner. Enter Meeco. We have been pioneering the collection, protection and exchange of Personal data and identity since 2012. The decentralised credentials technology developed for VELA draws on our expertise and uses the W3C Verifiable Credentials Standard, which means, we will deliver workforce solutions that are interoperable and globally supported. “This is a significant achievement towards empowering people to securely collect, verify and share their workforce credentials. 
The mutual benefits of risk management and value creation demonstrate the power of human-centric technology. Adopting this approach gives mutual control over the collection and use of Personal data, and gives workforce eco-system partners a competitive edge.” – Katryna Dow, CEO & Founder, Meeco “The changing landscape around data privacy is putting the individual in charge of their data — this will fundamentally change the way companies need to operate.” Michael Derwin, Director, VELA Solutions VELA and Meeco are united by our vision to get trusted workforce data into the hands of people so they can use their verified work data to get work, utilise their skills and experience, and progress their careers. To support these workforce challenges, we have developed a secure platform with verified credential portability making it faster and easier to recruit, onboard and maintain employee and contractor professional credentials. Our differentiator will be the ability to configure implementations to support both decentralised wallets and integration into existing enterprise systems. This is achieved with Meeco's patented vault technology. Meeco's suite of APIs, SDKs and documentation are available to our eco-system partners to ensure integrations support corporate governance, risk tracking and reduction, improved safety and compliance outcomes, whilst always empowering people. The VELA solution can be applied across a range of sectors such as health, mining, education, aviation and government. To find out more and register to keep updated on our progress, visit VELA Solutions. Also stay tuned, in the coming weeks Meeco will announce our distributed ledger partner, along with our commitment to co-developing tools to support the issuers, verifiers and enterprises in this exciting new workforce eco-system.",https://www.meeco.me/blog/meeco-announces-vela-solutions-partnership,,Post,,Meta,,,Human Resources,,,,,2020-11-29,,,,,,,,,,,,,
Meeco,Meeco,,,,,,,,,Meeco expands UX and Design team along with a new office in Adelaide,"Meeco now has illustrious neighbours such as the Australian Space Agency, the Australian Institute for Machine Learning, the Aboriginal Art and Cultures Centre and coming soon, Amazon and MIT.","Meeco has been enabling organisations and people to collect, protect and exchange Personal data since the Meeco manifesto came to life in 2012. Since then, Meeco has been steadily growing in size and recognition. 2020 was an interesting year! Despite worldwide shutdowns and uncertainty, Meeco doubled in size in the last year, adding to our development team in Melbourne. Meeco also secured new partnerships in Europe along with the Finnie 2020 award in “Excellence in Industry Collaborations & Partnerships” to prove it. Now as we enter 2021, we're keeping the momentum going and are thrilled to announce the expansion of our UX, Design and Testing team in Adelaide, Australia. The new meeps will significantly bolster Meeco's design chops, bringing more diversity to the team and addressing the need for more women working on digital identity and Personal data solutions. This new team is Meeco's Design/Digital hub, cementing Meeco's commitment to delivering beautiful design and digital work made in-house. The first task of the Meeco Design Team will be breathing some new life into its website and digital communication. The new meeps are excited to bring in their understanding of data collection, user interactions and how they can bring them together to create thoughtful design solutions. They add to Meeco's belief that an holistic approach to designing processes is key to delivering unique and powerful memories for people. Our new office The team has taken residence in Stone&Chalk Adelaide, keeping up with the tradition in Melbourne and Sydney. Stone&Chalk is located on Adelaide's remarkable Lot Fourteen. 
With a $150 million investment from the Federal Government, the Adelaide City Deal builds on Adelaide’s global reputation in innovation, culture and the arts. – lotfourteen.com.au Meeco now has illustrious neighbours such as the Australian Space Agency, the Australian Institute for Machine Learning, the Aboriginal Art and Cultures Centre and coming soon, Amazon and MIT. The Design Team will be led by Mars El-Bougrini, Meeco’s Chief of Design and veteran meep of 7 years. “We’re very excited to be growing the Meeco design and testing team here in Adelaide. With the addition of the talented Yolanda, Ai and Himani, Meeco will continue to go from strength to strength.” Mars El-Bougrini, Meeco’s Chief of Design Meeco continues to build strong partnerships worldwide, helping organisations and developers solve hard problems while respecting data rights through holistic data solutions. Stay tuned for exciting new announcements we have in the works! Cheers Team Meeco",https://blog.meeco.me/meeco-expands-ux-and-design-team-along-with-a-new-office-in-adelaide/,,Post,,Meta,,,,,,,,2021-03-22,,,,,,,,,,,,,
Meeco,Meeco,,,,RegTech Association of Australia,,,,,Meeco joins RegTech Association of Australia,"History has shown us that at times of national emergency, Personal rights and freedoms are often traded for the greater good. Whilst these measures may be appropriate in the eye of the storm, society is often left with the legacy of less than optimal freedoms when life becomes more stable. One of the tools we have at our disposal is the rise of RegTech; specifically, the ability to implement greater monitoring, governance and separation of concerns that can help society balance between what we need to know, and the hard-fought freedoms of global citizens.","Meeco joins RegTech Association of Australia In the past weeks we have experienced a tectonic shift in our global societal, health and financial priorities. Now more than ever we need to be connected with access to data we can rely on and trust. History has shown us that at times of national emergency, Personal rights and freedoms are often traded for the greater good. Whilst these measures may be appropriate in the eye of the storm, society is often left with the legacy of less than optimal freedoms when life becomes more stable. One of the tools we have at our disposal is the rise of RegTech; specifically, the ability to implement greater monitoring, governance and separation of concerns that can help society balance between what we need to know, and the hard-fought freedoms of global citizens. “This is a golden age for the RegTech opportunity. There’s unprecedented focus on RegTech, the fines for regulatory non-compliance are growing and there’s prediction that globally the industry will spend in excess of $127 billion by 2024.” – Deborah Young, Welcome to 2020 address Through and post COVID-19, the impact of increasing and tighter regulations on Personal data and individuals’ data rights will be felt more and more in terms of the cost of compliance implementation or the consequences of non-compliance fines. 
Enterprises and organisations can choose to offset these costs by turning compliance into an operating model innovation opportunity. This can be done in partnership with patients and customers by the application of intelligent technology when collecting and using Personal data. This is at the heart of Meeco’s value proposition. It also sits at the heart of our commitment to collaboration, industry solutions and co-operative eco-systems. Since our launch in 2014, Meeco has strongly identified with the growing need for orderly, structured and controlled Personal data management. As an award-winning pioneer in this field, Meeco has always been closely connected to the issues involved in Personal data rights and the way in which they can be managed, in step with a growing host of regulatory measures. This is particularly true for consent and permission management, and equally for Personal data security and fraud management. Meeco’s API-of-Me platform allows organisations to help their customers create secure ways to collect, share and manage their Personal data. The added benefit of Meeco’s secure data enclaves is that it also alleviates many of the compliance and cost burdens of regulations such as the General Data Protection Regulation in Europe, Consumer Data Rights regulation in Australia or the California Consumer Privacy Act, 2018 in the United States. The immediate consequence of the expansion of this type of regulation is the increased protection of data rights and Personal data for individuals. Whilst these regulations may be aligned to other initiatives such as Open Banking, to enable better financial services decisions, the opportunities and benefits are as relevant in healthcare, education, transport and retail. These are positive steps forward, particularly considering the sharp increase in mismanagement of Personal data and increase in fraudulent activity that we have witnessed over the last few years. 
There are numerous examples of regulatory fines handed down to enterprises that have willingly or unwillingly mismanaged their customers' Personal data. Of note, just two of these events across 2018-19 resulted in fines of USD$5 billion imposed by the US Federal Trade Commission on Facebook following the Cambridge Analytica scandal in 2018, and a USD$700 million settlement agreed in 2019 by Equifax subsequent to the mishandling of users’ Personal information impacting 150 million customers in 2017. Additional impacts of these new regulations are the substantially increased operational costs related to the compliant collection, storage and management of their customers’ Personal data. Indeed, the rising need for risk management and compliance has fed an ever-increasing need for more and better-quality customer data. This has fuelled the advent of RegTech, with market size estimates reaching as high as USD$55 billion by 2025, on the back of a CAGR of 52.8% between 2019 and 2025[1]. RegTech companies are providing businesses with applications and tools to remove human error, make better decisions and provide evidence of compliance. These valuable capabilities enable businesses to reduce cost and optimise operational processes without compromising compliance. But is this enough in a world of shrinking margins and diminishing trust? We have seen a significant downturn over the past decade in consumer trust in the way large enterprises and digital platforms use and protect customers’ Personal data. Compliance with regulations and the imposition of severe penalties for breach are now the baseline expectations of a public that has become increasingly aware of the value of their Personal data, and the risk of its exposure. Clear evidence of this trend was already visible in 2017, when Deloitte found that “. . . consumers are less likely than ever to complete feedback surveys – thanks to privacy concerns”[2]. 
For too long businesses have used the Personal data of their customers to their own advantage, whilst meeting the minimum acceptable compliance obligations. Customers deserve more from the businesses with whom they choose to transact. It’s time to look beyond compliance and provide innovative solutions that give all stakeholders, including customers, the value and privacy they deserve. At Meeco we believe in a world where people are empowered and equipped with the legal, business and technological tools to gain value from the use of their Personal data. Meeco has been championing this cause since our founding manifesto. We see that businesses can use their compliance obligations as the basis for creating new opportunities for value generation. In fact, KPMG endorsed this same view in 2018 when it stated that “Through direct improvements and freeing resources, RegTech also has the potential to: - Provide valuable business insight - Provide customers with better and faster service - Drive new products and services”[3]. To further this aim, Meeco has recently joined the RegTech Association. We believe that the compliance-based foundation of Meeco’s products makes us a great fit within this global community. Through this partnership we hope to contribute to the protection of Personal data assets in compliance with the relevant regulations. In January this year, members of the Meeco team had the privilege to attend a European Trade Mission led by AusTrade which included Michelle East, the CEO of Certainty Compliance and member of the RegTech Advisory Committee. Michelle demonstrated first-hand her passion for helping raise the profile of the important role RegTech has in our changing technology landscape. Michelle’s commitment to developing a trusted eco-system, along with prior experience collaborating with Lisa Schutz, were key motivating factors in Meeco’s decision to join the RegTech community. 
The Australian RegTech organisation is headed up by its CEO Deborah Young, together with Alison Shapiera heading up engagement, supported by the experienced Board comprised of Julian Fenwick (Chair) and directors Lisa Schutz, Harold Lucero, Jasper Poos and Peter Deans. On the decision to join the RegTech Association, Meeco’s CEO Katryna Dow said: “We're very much looking forward to contributing to the RegTech community and sharing our experience from different parts of the world. We've been privileged to work with organisations like KBC Bank in Belgium and Nexia Wealth in Australia, both great examples of businesses empowering their customers with innovative Personal data management and compliance tools”. KBC recently implemented a Privacy-by-Design solution which guarantees their customers complete privacy and control. Nexia Wealth provides their customers with a secure platform for sharing data and documents with trusted advisers. These are just some of the ways Meeco is helping organisations use compliance to be at the forefront of data innovation.",https://blog.meeco.me/innovate-beyond-compliance/,,Post,,Meta,,,,,,,,2020-03-23,,,,,,,,,,,,,
Meeco,Meeco,,,,OWI,,,GDPR,,Meeco Positioned as Leader in Personal Identity Products in One World Identity’s Identity Industry Landscape,"Meeco has been recognized as a leader in Personal Identity Products. Meeco is a GDPR compliant multi-sided Personal data platform to offer Personalised solutions and channels to build trust. Organisations can incorporate Meeco’s secure data enclave inside existing applications to create value for customers by simplifying onboarding and Personalising experiences. This balance of Me2B and B2Me, enabling new privacy enhanced business models to emerge, is core to Meeco’s unique position in the emerging Personal data market.","Meeco Recognised for Personal Data in Identity Industry Meeco announces move from R&D to Q2 2020 planned release of a range of new Personal data and identity products. The range of new services will include a Developer Portal with access to Meeco's API Platform, together with a Verified Claims Wallet. Both new products are designed to support enterprise and developers to rapidly prototype Personal data use-cases, validate customer value and minimise investment risk ahead of deploying data compliant solutions. This update builds on the One World Identity (OWI) announcement in November 2019, recognising Meeco as a leader in Personal Identity Products, and featured in the annual Identity Landscape Map. OWI is a market intelligence and strategy firm focused on identity, trust, and the data economy. Each year, OWI designs an Identity Landscape, providing a comprehensive and holistic view of leaders in the identity space. As the identity industry is rapidly developing, OWI’s landscape provides an unparalleled overview of how digital identity applications are evolving and the companies and markets shaping next-generation digital identity. With over 400 companies and 35 market segments, the 2019 Identity Landscape visually depicts a growing and maturing industry. 
The new, unique landscape format allows companies to touch multiple market segments, reflecting on the dynamic nature of digital identity applications. The OWI team selected 415 identity companies from a pool of over 2,000 based on several factors. - Each company must be an identity company OR have a distinguishable line of business focused on identity. - Each company must be at least 3 years old or have raised $3 million. - Each company must have a functioning product in the market. “Since 2017, the number of identity companies has more than quadrupled, from 500 companies to over 2,000,” said Travis Jarae, CEO and Founder of OWI. “With the wave of data breaches and privacy scandals, there is a rapid expansion of identity products and solutions. The OWI team interacts with identity companies every day, from startups to enterprise. We’re proud to share the Identity Landscape each year to distill how new companies, products, and solutions are shaping the future of identity.” Meeco has been recognized as a leader in Personal Identity Products. Meeco is a GDPR compliant multi-sided Personal data platform to offer Personalised solutions and channels to build trust. Organisations can incorporate Meeco's secure data enclave inside existing applications to create value for customers by simplifying onboarding and Personalising experiences. This balance of Me2B and B2Me, enabling new privacy enhanced business models to emerge, is core to Meeco’s unique position in the emerging Personal data market. Digital identity is the core of digital transformation. From Personal to professional applications, identity is the foundation for how we connect, engage, and interact in the digital economy. As there is increasing consumer demand for privacy and security, digital identity is no longer a nice-to-have; it is a pillar of success. 
The OWI Identity Landscape is a tool to help companies keep track of market growth and trends and understand the strategic importance of digital identity moving into 2020 and beyond. OWI will be releasing a more detailed research report delving into the details of each market segment and how companies within the industry overlap and intersect in upcoming months. Click here to download a complimentary copy of OWI’s 2019 Industry Landscape About Meeco: Meeco has been at the forefront of the emerging Personal data economy since 2012, having developed an award-winning Personal data sharing platform and Consent Engine for GDPR compliance. Meeco enables customers to aggregate Personal data across their life, including identity, financial, social, health and IoT, then share it directly with the people and organisations they trust. Meeco was featured in the Personal Identity Products category of the OWI Map, however Meeco's services are extending into other parts of the landscape, for example the provision of ""secure data enclaves"" through their Microsoft partnership and European data sovereignty solutions. “We're proud to be featured alongside Mastercard, Apple and Visa” says founder Katryna Dow “as this demonstrates the place we have carved in the market since founding the company in 2012. Black Swan events such as COVID-19 demonstrate how increasingly critical it is to be able to access verified Personal and public data. We believe enabling citizens and customers to participate directly and transparently creates a faster path to trust. In turn, this approach results in service providers being better equipped to deploy vital Personalised services such as healthcare, education and financial help.” Meeco is taking registrations for early access to the Developer Portal and will post relevant announcements through April 2020. About OWI: OWI is a market intelligence and strategy firm focused on identity, trust, and the data economy. 
Through advisory services, events and research, OWI helps a wide range of public and privately held companies, investors and governments stay ahead of market trends, so they can build sustainable, forward-looking products and strategies. OWI was founded in 2015 and is the host of the KNOW Identity Conference that was scheduled to take place in Vegas in April, but will now take place later in 2020.",https://blog.meeco.me/meeco-positioned-as-leader-in-personal-identity-products-in-one-world-identitys-identity-industry-landscape/,,Post,,Meta,,,,,,,,2020-03-16,,,,,,,,,,,,,
Meeco,Meeco,,,,,,,,,Meeco Terms & Conditions Update - Feedback Welcome,"At Meeco, our mission is to develop the tools to enable people to collect, protect and securely exchange Personal data. We launched our first service in 2014 backed by Terms & Conditions we were proud to share. Starting with that first version, we've continued to invite feedback before implementing updates. We take our governance seriously, which starts with transparent and easy to understand terms of service.","At Meeco, our mission is to develop the tools to enable people to collect, protect and securely exchange Personal data. We launched our first service in 2014 backed by Terms & Conditions we were proud to share. Starting with that first version, we've continued to invite feedback before implementing updates. We take our governance seriously, which starts with transparent and easy to understand terms of service. This is the fifth update since then. Our last major update was back in 2018 prior to the introduction of the General Data Protection Regulation. Through that update we were able to strengthen data rights, and extend the GDPR protections to everyone using Meeco. This V5 update paves the way for a range of exciting new Meeco services, including applications like mIKs-it, designed to make digital services safer and more secure for kids. We've also been busy building tools for developers. Tools to support amazing start-ups like My Life Capsule, who are helping people manage, prepare, capture and share data to connect and organise families across generations. We're also deploying a range of new decentralised identity capabilities to support partners like VELA Solutions. VELA provide a digital credentialing platform to enable the secure storing and sharing of verifiable credentials. Over the next fourteen days, we would love your feedback or questions on any of the changes. 
Our Terms & Conditions reflect our commitment to giving people and organisations the tools to access, control and create mutual value from Personal data. Here's a high level summary of the changes: - Introduction of data administration roles for parents and guardians - Description of new and expanded Meeco Services - Information about Meeco’s entities in Belgium and the United Kingdom - Expanded terms to include children using Meeco Services - Prohibited use to protect children - Additions to protect Your data rights - Updates to increase Your data security - Expansion of Your commencement and termination rights - Introduction of terms regarding subscriptions and payments - Additions to meaning of words and legal terms. If you would like to share your feedback, simply email support@Meeco.me and include “Update to Terms and Conditions” in the subject heading. All going to plan, our new Terms & Conditions will apply by Monday 8th March 2021! Thank you for taking the time to read this update and we look forward to your comments. 🙏",https://blog.meeco.me/meeco-terms-conditions-update-feedback-welcome/,,Post,,Meta,,,,,,,,2021-02-18,,,,,,,,,,,,,
Meeco,Meeco,,,,mIKs-it; VELA; My Life Capsule,,,,,Support Centre for Data Sharing interview with Meeco,"Meeco’s inception, its work so far and recent growth. Some of the exciting projects discussed include [mIKs-it, the safe multimedia app for children](https://miks-it.com/), developing a decentralised identity and verifiable credentials wallet and how innovators like [VELA Solutions](https://vela.solutions/) are transforming workforce management and [My Life Capsule](https://mylifecapsule.com/) are helping their customers be prepared for a family emergency.","The Support Centre for Data Sharing (SCDS) initiative focuses on researching, documenting, and reporting about the data sharing practices, EU legal frameworks, access and distribution technology that are relevant to organisations, and that imply novel models, and legal or technological challenges. ""Whilst privacy is paramount, you can't have a digital economy if everyone is locking their entire digital footprint away” – Katryna Dow, CEO & Founder, Meeco This is one of the many topics touched on in the latest Support Centre for Data Sharing interview. Raymonde Weyzen and Esther Huyer interview Meeco's CEO & Founder Katryna Dow and Chief Commercial Officer Jason Smith. Some of the challenges that come with data sharing are data privacy and data control. However, the paradox of data sharing is that it generally means the recipient needs the data in order to fulfil an obligation, such as deliver a service, validate identity, deliver a package or customise an experience. So the issues are often not about sharing, but about trust and transparency. Helping customers understand why the data is required and providing evidence that it is being used as intended is a great way to establish trust. Another way to boost trust is to focus on designing services that minimise the amount of data collected whilst maximising the value created. 
In this thought-provoking interview Raymonde asks about Meeco's inception, its work so far and recent growth. Some of the exciting projects discussed include mIKs-it, the safe multimedia app for children, developing a decentralised identity and verifiable credentials wallet and how innovators like VELA Solutions are transforming workforce management and My Life Capsule are helping their customers be prepared for a family emergency. Other questions and topics covered include: - How the idea of Meeco was conceived - Why is there a need for data sharing? - What is the data sharing lifecycle at Meeco? - Examples of use cases at Meeco - What specific licensing or standards to share data are used at Meeco? - In order to share data properly going forward, do we need more or less regulation? - Where would you see Meeco and ultimately the digital world in 10 years from now? And finally, the most difficult question we weren't prepared for ""What would be the working title of a movie starring Meeco?"" To find the answer to this and more, click below to listen or watch the interview. 👇 Huge thanks to the Support Centre for Data Sharing team for all the great work they are doing to help people understand the value of data sharing 👏 We very much appreciated the opportunity to share Meeco's perspective. 🙏",https://blog.meeco.me/support-centre-for-data-sharing-interview-with-meeco/,,Post,,Meta,,,,,,,,2021-06-23,,,,,,,,,,,,,
|
||
Meeco,Meeco,,,,,,,,,"Zero Knowledge Proofs of the modern digital life for access, control, delegation and consent of identity and Personal data","The Meeco solution provides access, control, delegation and consent from the perspective of the individual user. Meeco enables people (data subjects) to provide their own verified records and controlled consent. This API-of-Me allows Meeco to provide a meta-data driven attribute wallet with no knowledge of the data to any authenticated identity of a user, which in turn enables an auditable Personal-event chain of data interactions at scale.",,https://media.meeco.me/public-assets/white_papers/Meeco_Zero%20Knowledge%20Proofs%20of%20the%20modern%20digital%20life_V1.0_20180513.compressed.pdf,,Whitepaper,,Meta,,,,,API-of-Me,,,2018-05-13,,,,,,,,,,,,,
|
||
Meeco,HelloUser,,,,,,,,,Hello User,"In the digital world, identity has evolved far beyond its old definitions. It’s the way we consume products. Our ability to vote. Our financial security. Digital identity can be created quickly, accessed broadly and even stolen...easily. And it doesn’t just live online. Welcome to Hello, User: the podcast that covers modern identity across every facet of our lives, from Personal to public to professional.",,https://podcasts.apple.com/us/podcast/hello-user/id1541385551,,Podcast,,Resources,,,,,,,,2020-11-09,,,,,,,,,,,,,
|
||
Microsoft,,Microsoft,,Bill Gates; Paul Allen; Subur Khan,DIF; ID2020; ID2020 Founder; VCI Founder,"USA, Washington, Redmond",USA,,,Microsoft,Microsoft enables digital transformation for the era of an intelligent cloud and an intelligent edge. Its mission is to empower every person and every organization on the planet to achieve more.,,https://www.microsoft.com/en-us/,,Company,,Company,Enterprise,,"IT, IAM, ID, SSI",,VCI,,"WebAuthN,Secure Data Storage",1975,https://github.com/microsoft,https://twitter.com/Microsoft,https://www.youtube.com/microsoft,https://techcommunity.microsoft.com/t5/identity-standards-blog/bg-p/IdentityStandards,https://techcommunity.microsoft.com/gxcuf89792/rss/board?board.id=IdentityStandards,,https://www.crunchbase.com/organization/microsoft,https://www.linkedin.com/company/microsoft/,,,,,
|
||
Microsoft,IDPro,,,Leo Sorokin,DIF,,,,,A Peek into the Future of Decentralized Identity,"As digital transformation sweeps across the globe, it has affected everyone – from citizens to employees, from corporations to governments. Digital identity is a foundational enabler for business processes in the digital economy. Decentralized identity is the next evolution of digital identity capabilities and brings with it an opportunity to streamline how people interact with other institutions, physical objects, and with one another. This paper considers the future world of decentralized identity and offers clarity around the benefits of decentralized identity, terminology, sample scenario, and a sample technical implementation, while also addressing some of the limitations of this model. This paper further grounds the reader in the current state of decentralized identity capabilities while outlining the evolution of identity practices from past to present.","Abstract As digital transformation sweeps across the globe, it has affected everyone – from citizens to employees, from corporations to governments. Digital identity is a foundational enabler for business processes in the digital economy. Decentralized identity is the next evolution of digital identity capabilities and brings with it an opportunity to streamline how people interact with other institutions, physical objects, and with one another. This paper considers the future world of decentralized identity and offers clarity around the benefits of decentralized identity, terminology, sample scenario, and a sample technical implementation, while also addressing some of the limitations of this model. This paper further grounds the reader in the current state of decentralized identity capabilities while outlining the evolution of identity practices from past to present. 
Keywords: Self-sovereign identity, Digital wallet, Digital Card, Decentralized Identity How to Cite: Sorokin, L., (2022) “A Peek into the Future of Decentralized Identity (v2)”, IDPro Body of Knowledge 1(7). doi: https://doi.org/10.55621/idpro.51 A Peek into the Future of Decentralized Identity (v2) Leo Sorokin © 2022 IDPro, Leo Sorokin Introduction Digital identity is rapidly gaining criticality in our world as organizations digitally transform. Identity plays a pivotal role in a digital transformation and can empower both governments and businesses to provide secure whilst restricted access to data for any stakeholder whether employee, partner, customer, or citizen. Digital identity is becoming a vital component of security in a world with data proliferation on a myriad of devices and a network perimeter that is ever-more challenging to define. One active area under development in the identity space is the concept of decentralized identity. Decentralized identity is a fundamental shift from account-based credentials toward verifiable credentials and is a major philosophical as well as technical change in the way identity-related information is acquired and presented. The World Wide Web Consortium (W3C) is working on publishing standards around Verifiable Credentials and Decentralized Identifiers.1,2 However, as with any technology standard, it must be broadly adopted by the community for it to be useful at scale. Today, a person’s digital identity (and associated Personal data) is strewn across many online services, with access to such services being primarily performed via a username and password. Such an account-based credential is usually provisioned directly by the service provider, or by a large and rather centralized identity provider (IdP), such as Google, Facebook, or Twitter with which a service provider application will federate. 
This account-based federated model, however, has some significant limitations: the IdP may stop offering its services to third-parties; the identity supported by this IdP may be compromised thus impacting every service provider application that uses that identity; the IdP may track an individual’s activities across multiple services; and an IdP may decommission the account being used for authentication. There are many challenges with the federated identity model, but going back to identity silos where each service provider provisions and manages its own set of credentials for its users, resulting in users having to manage dozens of such account-based credentials is not ideal either. Decentralized identity strives to place the individual at the center of digital identity experiences by attempting to insert the individual at the center of identity data exchange. At its simplest, decentralized identity attempts to map physical wallets and the physical cards within them to a very similar concept in the digital world – a digital wallet with digital cards. Today, there are many that are very excited about the potential of this model as well as many that are skeptical. Although decentralized identity and the concepts underpinning it attempt to solve the challenges we have had with digital identity over the past few decades, it is still too early to predict how individuals, governments, and corporations will approach it, and how each of these actors will be able to derive value from it. Decentralized Identity Benefits A decentralized identity system can be used to replace a traditional username and password during a typical authentication sequence. This is perhaps the first use-case most will think about. However, authenticating in a passwordless manner is possible today even without any decentralized identity components. As such, the true value of decentralized identity can be more easily understood during authorization. 
During authorization, the service provider may mitigate risk by requiring the individual to present one or more digitally signed attestations commensurate with the level of risk that specific transaction entails and the level of value being obtained. This capability could be leveraged to increase trust between the parties, improve the user experience for the individual, while at the same time lowering costs for the business. The purpose of decentralized identity is to empower individuals to own and control their digital identity and how their identity data is accessed and used. The premise behind decentralized identity decouples it from the notion of a username and password or the traditional account-based model. A digital identity is not yet another username and password-based account that is provisioned and maintained by a third party. With a decentralized identity model, the individual can be both authenticated and authorized to perform a transaction with one service, and then present the same identity information to another entity with which the individual might prefer to interact. In addition, the individual can become their own identity provider, which is more difficult to accomplish with the centralized or federated models we have today. Decentralized digital identity and the Personal data associated with it should enable the individual to have more control over how that data is accessed and used. As a byproduct of this philosophy, Personal data should be presented by the individual to service providers on an as-needed basis, with specific terms of use. This principle is fundamental to decentralized identity. In a decentralized identity ecosystem, there is no one single central authority; value is exchanged in a more peer-to-peer manner. Since the individual controls and owns their Personal data, they are the ones to enable other parties to access it by granting them specific permissions. 
This is in stark contrast to today’s reality where Personal data may be shared and stored by third parties outside the individual’s control with the individual having no means of specifying the terms of use under which the identity-related information is shared. In a decentralized identity environment, it may be possible to possess a digital card for a drivers’ license, credit card, or even a passport, and have them available on a mobile device. In another scenario, it may help when traveling abroad while having to visit a doctor. Today, it would be very cumbersome and not practical to share medical history and medications with a doctor, other than through a simple verbal explanation. However, with a healthy decentralized identity ecosystem of issuers and verifiers, it would be possible to share important medical information in a digital privacy-preserving manner, thus enabling the doctor to make a better medical decision and provide the patient with a much better service. An additional example is a mortgage lender that may need the homeowner to provide proof of active property insurance. To that end, the homeowner can present the property insurance information to the mortgage lender and the lender can periodically verify the current status of the insurance policy on its own without unnecessarily burdening the homeowner with having to constantly present this documentation to the lender for verification on a recurring schedule. While centralized or federated identity might also support these use cases, decentralized identity might be better suited for them. Decentralized identity may enable new business models and value exchange. It may pave the path for fully digital-only experiences that remove the requirement for individuals to present themselves in-person to perform high value transactions. Decentralized identity may also enable a better in-person user experience in a variety of situations without requiring a person to carry a physical wallet at all. 
There are also potential benefits for businesses to streamline how they might verify and build trust with their customers. There is definite potential here, but only time and the market will tell if the great expectations for decentralized identity will be fully realized in practice over the long term. Decentralized Identity Terminology The following are the primary components involved in a decentralized identity experience. These definitions have been simplified to make it easier to understand the actors and how they interact: Self-sovereign identity is a term that describes a digital movement that is founded on the principle that an individual should own and control their identity without the intervening administrative authorities. Verifiable credentials are attestations that an issuer makes about a subject. Verifiable credentials are digitally signed by the issuer. Issuer is the entity that issues verifiable credentials about subjects to holders. Issuers are typically a government entity or corporation, but an issuer can also be a person or device. Holder is the entity that holds verifiable credentials. Holders are typically users but can also be organizations or devices. Verifier is the entity that verifies verifiable credentials so that it can provide services to a holder. Verifiable presentations are the packaging of verifiable credentials, self-issued attestations, or other such artifacts that are then presented to verifiers for verification. Verifiable presentations are digitally signed by the holder and can encapsulate all the information that a verifier is requesting in a single package. This is also the place where holders can describe the specific terms of use under which the presentation is performed. User agent or digital agent is the software application that holders use (typically a mobile device app) that receives verifiable credentials from issuers, stores them, and presents verifiable credentials to verifiers for verification. 
Identity hub or repository is the place where users can store their encrypted identity-related information. An identity hub can be anywhere – on the edge, on the cloud, or on your own server. Its purpose is to store Personal data. Some implementations may allow other entities to access the identity hub of the user if the user specifically grants such access. You can think of an identity hub as the individual’s Personal data store. Decentralized Identifier (DID) is an identifier that is created and anchored in a decentralized system such as a blockchain or ledger and can represent any entity in the ecosystem – an issuer, a holder, a verifier, and even an identity hub. Digital cards represent verifiable credentials that users collect over time and are stored as part of the user agent or the identity hub of the user. It’s somewhat simpler to refer to them as digital cards rather than verifiable credentials when speaking about them. Digital wallet represents a digital metaphor for a physical wallet and is generally represented by the combination of the user agent and the underlying capabilities of the computing device, such as secure storage and secure enclaves on a mobile phone. The digital wallet contains digital cards. DPKI is a decentralized public key infrastructure and is usually implemented via an immutable blockchain or ledger – a place where DIDs can be registered and looked up alongside the associated public keys of the DID and its metadata. DPKI can be described more generally as the verifiable data registry, as the DPKI is just one of many possible implementations for a verifiable data registry. While this paper refers to DPKI, the reader should be aware that a verifiable data registry need not necessarily be “decentralized”. Universal resolver is an identifier resolver that works with any decentralized identifier system through DID drivers. 
The purpose of a universal resolver is to return a DID document containing DID metadata when given a specific DID value. This capability is very useful because DIDs can be anchored on any number of disparate DPKI implementations. The figure below highlights some of the terminology just outlined with the major actors and their relationships. It also represents the sample scenario we will cover later in this document. Figure - Verifiable Credential Issuance and Presentation It is essential to note that no Personally identifiable information should be stored on the decentralized public key infrastructure. Personal identity data is stored as part of the individual’s digital wallet or identity hub in a secure location. Usually, the holder will present verifiable credentials to verifiers during a business transaction in real-time, like the way we currently present our passport at a border crossing. However, in more advanced scenarios, some implementations may enable the holder to grant a verifier-specific access to data in the holder’s identity hub. That way, the verifier can access data that the individual has allowed access to, instead of the individual having to manually present verifiable credentials to the verifier on a recurring schedule. Nevertheless, the more traditional approach still requires the holder to present verifiable credentials to the verifier explicitly, but the verifier will have the ability to periodically check the status of the credential, such as whether or not the credential has been revoked by the issuer, on its own without burden to the holder. Now that you are armed with an understanding of the terminology, let’s take a closer look at a sample scenario. Decentralized Identity Scenario The example below is meant to provide an end-to-end use-case of the value and utility of a decentralized identity ecosystem. 
It is not a comprehensive or exhaustive description of all that is possible with decentralized identities as it represents just one possible decentralized identity flow. Suppose Sam wants to purchase vehicle insurance from Example Insurance, but to get a good rate, Example Insurance requires proof that Sam is a graduate of A University. In our decentralized identity scenario, the actors are as follows: Sam as the verifiable credential subject and holder. A University as the verifiable credential issuer. Example Insurance as the verifiable credential verifier. The following sequence of steps represents a flow where the end-goal is for Sam to receive a digital diploma from A University and then present it for verification to Example Insurance in order to claim the automobile insurance discount: Sam receives an email from A University congratulating Sam on graduating while also providing a QR code Sam can use to scan with Sam’s mobile phone. Sam has an app on Sam’s phone that is registered to handle such a request. This app represents Sam's digital wallet that will hold all the digital cards that were collected over time. Sam scans the QR code, the digital wallet app launches, and Sam is informed that in order to receive Sam’s digital diploma Sam needs to sign in to the A University website. In our case, Sam presses on the link and enters Sam’s existing credentials to authenticate on the University's website; if Sam didn't have such a credential, Sam may be asked to come in person to the registrar's office to do ID proofing and receive their credentials. Once Sam provides their existing credentials, Sam is informed that Sam can go ahead and accept this digital card from A University. Once Sam accepts the card, Sam is asked to secure this operation with a biometric, such as a fingerprint, face, or even a PIN. After Sam performs this action, the card is now securely stored in Sam's digital wallet. 
Sam can inspect the card, view the data that the card has about Sam (which was attested to by the university), such as Sam’s full name, major, graduation date, and issue date. Also, Sam can view the activity that this card was involved in, such as when it was issued, to whom it was presented, and how it was used - all of this can be done from the digital wallet app on Sam's phone. Each such activity can be considered as a digital receipt or verifiable history that Sam can use to track who has (or had) access to the data for this card. These digital receipts are stored locally along with the card in Sam's digital wallet, which is always under Sam's control. More generally, we can also refer to this digital card as a verifiable credential. Now, to claim Sam’s discount, Sam navigates to the Example Insurance website on Sam’s mobile phone and notices the Verify Credentials button. This is a deep link and when Sam presses it, the digital wallet app opens with a permission request. The permission request indicates that Example Insurance needs to receive an A University alumni digital card for Sam to get Sam’s discount. Note that Sam doesn't have to authenticate to Example Insurance with a username and password nor use a federated IdP. Sam can simply present the digital diploma Sam already possesses in Sam’s digital wallet. In our scenario, Sam only presents Sam’s A University alumni digital card to Example Insurance, but Sam could also present other digital cards Sam has in Sam’s digital wallet such as a digital card that proves Sam is a resident of a specific territory or to prove Sam’s current address. Once Sam authorizes the permission request with Sam’s biometric such as a fingerprint scan, Example Insurance now receives the digital card and is able to verify that it was indeed issued to Sam by A University, and it is indeed Sam who is presenting this digital card to Example. Once Example Insurance completes the verification, it can now offer a discount to Sam! 
Sam can now view that Sam’s digital wallet app has a receipt for this card, indicating that this card was presented to Example Insurance on a given date and for a specified purpose with Example’s terms and conditions. Some implementations may further enable Sam to revoke the access Example Insurance has to view Sam’s digital card. This revocation action may generate another receipt that clearly indicates the date and time Sam revoked Example's access to Sam’s digital card. Once again, Sam can accomplish all this from Sam’s digital wallet app on Sam’s mobile phone, and all the digital cards that Sam collects over time and Sam’s associated receipts are under Sam's control. Sam can collect many such digital cards in Sam’s digital wallet and at some point may even need to present multiple cards, such as if Sam wants to attend an advanced enterprise architecture training academy, proving both that Sam is an A University alumnus and a certified enterprise architect. The academy can then instantly verify both credentials presented and enable Sam to access Sam’s advanced training material. It is important to clarify that Sam sends a verifiable presentation to Example Insurance. The verifiable presentation contains a nested artifact which is the verifiable credential Sam has received from A University. In this manner, Example Insurance, acting as the verifier, can verify the following two critical elements: Based on the digital signature of the verifiable credential, Example Insurance verifies that the verifiable credential is authentic and was indeed issued by A University to Sam. Based on the digital signature of the verifiable presentation, Example Insurance verifies that it is indeed Sam who is performing this credential presentation. After Example Insurance has verified the above, it is able to confidently present Sam with Sam’s vehicle insurance discount. 
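The two verification checks just described can be sketched in code. This is an illustrative sketch only: the DID values are invented, and cryptographic signature checking is abstracted into a placeholder helper rather than implemented against a real key registry.

```python
# Illustrative sketch of the two checks Example Insurance performs.
# DID values are invented; signature_valid() stands in for real
# cryptographic verification against keys resolved from each DID.

def signature_valid(signed_obj: dict, signer_did: str) -> bool:
    # Placeholder: a real verifier resolves signer_did to a public key
    # via the verifiable data registry and checks the digital signature.
    return signed_obj.get('signer') == signer_did

def verify(presentation: dict, expected_issuer: str) -> bool:
    credential = presentation['verifiableCredential']
    holder = presentation['holder']
    # Check 1: the nested credential is authentic and issued by A University.
    if not signature_valid(credential, credential['issuer']):
        return False
    if credential['issuer'] != expected_issuer:
        return False
    # Check 2: the presentation itself is signed by the credential subject,
    # i.e., it is indeed Sam who is presenting it.
    if not signature_valid(presentation, holder):
        return False
    return credential['subject'] == holder

presentation = {
    'holder': 'did:example:sam',
    'signer': 'did:example:sam',
    'verifiableCredential': {
        'issuer': 'did:example:univ-a',
        'signer': 'did:example:univ-a',
        'subject': 'did:example:sam',
    },
}
print(verify(presentation, 'did:example:univ-a'))
```

A presentation whose outer signer differs from the nested credential subject fails the second check, which is what stops a stolen credential from being replayed by someone else.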
Decentralized Identity Technical Implementation The following sequence is a technical explanation of the same scenario presented above. It outlines the steps that must be taken to setup the decentralized identity experience as well as the verifiable credential issuance and presentation flows. However, this scenario assumes that the decentralized public key infrastructure (DPKI) has already been setup and will not be detailed here. Setup A University represents the issuer. A generates a decentralized identifier (DID) tied to a public/private key pair and registers their DID on the DPKI. The private key is stored by the A University IT team in a Key Vault or Hardware Security Module. The corresponding public key is published to a decentralized ledger such as a blockchain so that anyone can find it. A University IT publishes a DID document that associates its DID to the registered public Domain Name System (DNS) domain, such as A.edu. This represents a domain linkage verifiable credential. A University IT can host this file on their website which both proves ownership of the domain and the specific DID. The verifier (such as Example Insurance) can use this DID document to confirm the DID ownership for A University and ensure that the verifiable credential it receives is indeed issued by A University and not by some other issuer claiming to be A University. A University IT develop a contract that describes the requirements for the issuance of the verifiable credential. For example, A University IT can specify which attestations should be self-issued directly by the user, and which other verifiable credentials, if any, the individual must first provide. In our scenario, the IT team has mandated that the student authenticate with a federated IdP that supports the OpenID Connect protocol, so that it will be able to receive a security token and extract claims from it, such as first name, last name, and student number. 
The issuer will then be able to map it to attributes it will issue in the verifiable credential. Importantly, A University will indicate the schema(s) to which the verifiable credential will conform, so that other verifiers around the world will be able to consume the content of the verifiable credential those verifiers receive. Finally, A University IT administrators can setup and customize the branding of the soon-to-be-issued verifiable credential cards such as card color, logos, icons, images, and helpful text. The administrators can customize the helpful text strings via metadata that will appear as part of the cards based on the attestations issued with the card for credential data. This will help design the look and feel of verifiable credential alumni cards issued by A University, and ensure the issued digital cards reflect the brand of the university. In the future, these graphical elements should be standardized so that students enjoy a consistent digital card visual rendering experience regardless of which vendor develops the user agent or digital agent the student chooses to use. Verifiable Credential Issuance The credential issuance request flow begins when Sam scans a QR code using Sam’s mobile phone. The purpose of the issuance request is for Sam’s user agent to retrieve the requirements for credential issuance as dictated by the issuer and to display the appropriate UX to the user via the user agent. As such, the QR code is displayed on the A University website and scanning the QR code opens Sam's digital wallet mobile app and triggers an issuance request retrieval operation from the user agent to A University. Once the user agent receives the issuance request from A University, it begins the flow to issue the credential. The issuance request is digitally signed by A University and the user agent can verify the authenticity of such a request. 
The issuance request includes a reference to the contract that describes how the user agent should render the UX and what information Sam needs to provide in order to be given a verifiable alumni credential. After the user agent verifies that the request is genuine, it renders the UX to Sam. Because of the specific requirement that A has for issuing digital alumni cards in our scenario, Sam needs to sign in with Sam’s existing A University account, which, in turn, will issue a security token to the user agent with claims such as Sam's first name and last name, degree, and graduation date. (Note that during setup above, the issuer can be configured to accept security tokens from any trusted and compliant OpenID Connect identity provider and the user agent will use this identity provider during the issuance process.) Therefore, when the individual presses ‘Login to A University’ on the user agent, the user agent can redirect the individual to authenticate with the IdP, and it is there the individual can perform standard authentication tasks such as entering their username and password, performing Multi Factor Authentication (MFA), accepting terms of service, or even paying for their credential. All this activity occurs on the client side via the user agent (e.g., a mobile app). When the user agent finally receives the security token from the IdP, it can pass it along to the issuer which can then extract claims from it, as mentioned above, and inject these as attributes into the resulting verifiable credential, potentially enriching the claims with information obtained from other sources. As well, after the individual authenticates with the IdP, the user agent can display additional input fields that the individual is free to self-select. After the individual has provided all the required information, the user agent can verify that it has all the necessary issuer requirements fulfilled, and it can go ahead and ask if Sam would like to accept the card. 
In our scenario, when Sam accepts the card, Sam is asked to use a biometric gesture such as a fingerprint scan. This action generates a private/public key pair for Sam’s DID whereby the private key is stored on the mobile phone in a secure enclave, and the public key is published to a distributed ledger. Finally, the issuer receives all the required information alongside Sam’s DID and issues the digital card to Sam who then receives the verifiable credential, which is a JSON Web Token (JWT) following the W3C standard for verifiable credentials. The JWT includes both the DID of the subject, Sam, and the DID of the issuer, A University, as well as the type of the credential, and any attestations such as first name, last name, major, and graduation date. It also contains a way to find out the credential's revocation status in case the credential is later revoked by the issuer - A University. This verifiable credential is digitally signed by the issuer's DID. Once the user agent validates the verifiable credential received from A University, it inserts this digital card into Sam's digital wallet as a card Sam can now present to other organizations such as Example Insurance. Verifiable Credential Presentation When Sam visits the Example Insurance website on their mobile phone to receive a discount on their vehicle insurance, Sam presses the ‘Verify Credentials’ button on the Example website (which is a deep link) or simply scans a QR code generated by Example via their mobile phone. This generates a presentation/verification request for Sam to verify Sam’s A University alumni status. The request describes the type of card(s) that Sam should present to Example Insurance, such as Sam’s digital alumni card from A University, and this request is digitally signed by the verifier's DID, which in our case, is Example Insurance. The presentation request can also include Example's terms of service. 
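The issued credential described above can be sketched as a JWT. The DIDs, attribute names, and status URL below are invented for illustration, and the signature segment is omitted; a real credential follows the W3C Verifiable Credentials JWT encoding and is signed with the issuer key.

```python
import base64
import json

def b64url(data: bytes) -> str:
    # Base64url encoding without padding, as used for JWT segments.
    return base64.urlsafe_b64encode(data).rstrip(b'=').decode()

# Hypothetical payload: issuer and subject DIDs plus the attested claims.
payload = {
    'iss': 'did:example:univ-a',          # issuer DID (A University)
    'sub': 'did:example:sam',             # subject DID (Sam)
    'vc': {
        'type': ['VerifiableCredential', 'AlumniCredential'],
        'credentialSubject': {
            'givenName': 'Sam',
            'major': 'Computer Science',
            'graduationDate': '2022-05-01',
        },
        'credentialStatus': {             # where revocation can be checked
            'type': 'StatusList',
            'id': 'https://a.edu/status/3',
        },
    },
}

header = {'alg': 'EdDSA', 'typ': 'JWT'}
# Signature omitted in this sketch; a real JWT appends a third segment
# signed with the private key held in the issuer key vault.
unsigned_jwt = b64url(json.dumps(header).encode()) + '.' + b64url(json.dumps(payload).encode())
print(unsigned_jwt.count('.'))
```

The user agent validates a token of this shape against the issuer DID before inserting the card into Sam's wallet.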
After the signature of the request is verified by the user agent, Sam is presented with a UI on the user agent indicating that Example Insurance is requesting permission to see Sam’s A University alumni card with a reason as to why Example needs to see it (such as for Sam to be able to receive their discount). After Sam approves the request with a biometric gesture, such as with a fingerprint scan on the mobile phone, the verification response, which is essentially a presentation of a credential response (also known as a verifiable presentation), is sent to Example Insurance. The response is signed by Sam's private key and includes the verifiable credential issued by A University to Sam nested inside the JWT payload. Example Insurance attempts to match the person performing the presentation of the credential with the subject of the nested verifiable credential to ensure that it is indeed Sam who is presenting it to Example Insurance, and not anybody else. Therefore, the DID of Sam is present in both the outer JWT payload since Sam is performing the presentation of the credential, as well as inside the nested JWT payload as the subject of the verifiable credential issued by A University. Once Example Insurance confirms that the DID in the presentation matches the subject of the issued credential, Sam is both authenticated to the Example Insurance website and authorized to claim Sam’s discount! This is much better than simply possessing a username and password, since, in this mechanism, Example Insurance knows that the person presenting this credential is the same person to whom the card was issued. With a username and password, someone else can use it to impersonate you. In this architecture, however, this is significantly harder to do. Someone else will need to take control of Sam's private key stored on Sam’s phone's secure enclave to be able to accomplish this malevolent task. 
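The holder-binding check described above can be sketched as follows. The payload shapes are simplified and illustrative; a real verifier would first cryptographically validate the signatures on both the outer presentation JWT and the nested credential JWT:

```python
# Sketch of the check Example Insurance performs: the DID that signed the
# outer presentation must match the subject DID of the nested credential.
def holder_matches_subject(presentation_payload: dict) -> bool:
    holder_did = presentation_payload["iss"]            # who signed the VP
    credential = presentation_payload["vp"]["verifiableCredential"][0]
    subject_did = credential["sub"]                     # to whom the VC was issued
    return holder_did == subject_did

presentation = {
    "iss": "did:example:sam",                   # Sam signs the presentation
    "aud": "did:example:example-insurance",     # intended verifier
    "vp": {"verifiableCredential": [
        {"iss": "did:example:a-university", "sub": "did:example:sam"},
    ]},
}
```

If someone else (holding a different private key, hence a different DID) replayed Sam's credential, the outer `iss` would not match the nested `sub` and the check would fail.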
At last, Example Insurance can extract the data it requires from the verifiable credential such as Sam's first name, last name, major, graduation date, and go ahead and present Sam with Sam’s vehicle insurance discount! The credential verification flow completes when Sam stores a signed receipt by Example Insurance that will be associated with the card in Sam’s wallet. Sam now has a single place where Sam can view all the websites where Sam has presented Sam’s alumni card over time. In our scenario, the receipt includes information about Example Insurance, the reason Example needed to receive the card, Example's terms and conditions, and the date the receipt was generated. This signed receipt is associated with the card in Sam's digital wallet and will always be under Sam's possession. Some implementations may further enable Sam to go ahead and decide to revoke Example's access to Sam’s A University digital alumni card. Example should thus implement the necessary revocation measures to ensure it complies with Sam's request. The verifier should then cease to use the data from the card Sam presented to it. Sam can later prove that Sam issued a revocation request if such a need arises, and this can help with General Data Protection Regulation (GDPR) compliance. Scenario Summary In our simple use-case above, the issuer of a verifiable credential was A University, but in other contexts, the issuer can be an employer, a government agency, a device, a daemon process, or even the individual. Likewise, a verifier can also be any of the previously mentioned actors. The decentralized identity ecosystem is very broad and the standards allow for opportunities to unlock a more flexible, secure, and privacy-preserving way to perform digital interactions in a myriad of contexts. The components presented in the flow above are based on open standards. 
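A receipt like the one described for Sam's wallet might be structured as below. Every field name here is invented for illustration, since the article does not prescribe a concrete format:

```python
# Hypothetical structure for the signed receipt Sam stores after
# presenting the alumni card to Example Insurance.
presentation_receipt = {
    "verifier": "did:example:example-insurance",
    "reason": "Verify alumni status for a vehicle insurance discount",
    "termsOfService": "https://example-insurance.example/terms",
    "date": "2020-10-30",
    "credentialPresented": "AlumniCredential",
    # plus a signature by Example Insurance over the fields above,
    # so Sam can later prove the presentation (and any revocation request)
}
```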
The verifiable credentials issuance and presentation flows depend on the foundational specification of the W3C Verifiable Credentials Standard, and the decentralized systems, such as blockchains and ledgers, are based on W3C Decentralized Identifiers work. The purpose of the decentralized ledger technology is to support a decentralized public key infrastructure (DPKI). The DPKI anchors DIDs and their public keys and thus enables ownership of DIDs to be validated without relying on only a few privileged identity providers or certification authorities. The Decentralized Identity Foundation is leading the effort on decentralized identity, but more work remains to fully define the space.3 For example, the decentralized identity community is discussing how to enable better privacy preservation by empowering Sam to present Sam’s age in a privacy-preserving way without unnecessarily disclosing Sam’s exact date of birth to the verifier. Another area under discussion is how to empower Sam to perform self-owned key recovery in case Sam loses or damages Sam’s phone, so that Sam can more easily retrieve all Sam’s previously acquired digital cards back onto a different device or onto a different user agent in a more seamless manner. Decentralized Identity Limitations While decentralized identity has the potential to improve an individual’s productivity and digitize existing business processes for governments and corporations, it does have known limitations and areas where further research or investigation would be required. A decentralized identity ecosystem can only be successful when it achieves critical mass adoption by governments, businesses, and individuals. When Apple released the first iPhone, it ushered in a new and immediate change in the user experience the moment the purchaser took possession of their new device. 
In contrast, an individual may not gain much benefit in obtaining a verifiable credential from an issuer unless they can then use that verifiable credential with many verifiers. A digital passport, for example, is only useful to a citizen if it can be used at most airports and border crossings around the world. Organizations may hesitate to be issuers or verifiers of verifiable credentials unless there is already a healthy ecosystem in place, but that ecosystem cannot develop unless there are entities willing to issue and verify these new credentials. Decentralized identity is a digital identity. Without the necessary technology to hold a digital wallet, such as on a mobile phone or some sort of computing device, it will be very difficult for the promise of digital identity to be realized by all individuals around the world. If an individual loses their device or decides to share their device with others without proper precautions, it can become a challenge to recover their data onto a different device or to prove who performed a specific interaction. Asking the average person to understand this and to safeguard their private key material remains a significant challenge to decentralized key management. In most decentralized identity use-cases, the developers assume all parties involved have access to the Internet. That may not be the case. Scenarios that take the individual away from Internet access leave open the question of how verifiable credentials can be verified. Verifying verifiable credentials requires looking up information on the DPKI, or at the very least, checking if a credential that is being presented has been revoked, and that requires network connectivity. In purely disconnected offline environments this poses a challenge, and a potential hurdle to decentralized identity adoption in specific contexts and situations. 
The promise of decentralized identity is to empower individuals to own and control their digital identity and personal data. However, if a person provides a verifiable credential containing personal data to the service provider, the service provider is able to copy this data to its own databases for marketing purposes or to be able to continue providing services to the user. The individual can attempt to revoke access that the service provider has to the verifiable credential but there is no guarantee that the service provider will honor such a request and delete all the data it has stored about the user. This would be a very challenging problem to solve via strictly technological measures and would most likely require legal and policy frameworks in place to ensure everyone’s personal data is protected, to ensure audit records are kept, and to establish a documented process for dispute management and resolution. Final Words Decentralized identity can enable entirely new business opportunities and empower citizens to be more in control of their identity and personal data. Today, IT administrators need to perform cryptographic key exchange ceremonies to establish trust between two organizational entities. This does not scale when doing business with dozens or perhaps hundreds of other vendors in a more ad-hoc manner. Today, when a bank issues a credit card to a customer, that customer can use that credit card to make purchases with almost any merchant worldwide. In such a scenario, it is not feasible to expect every merchant to exchange cryptographic keys a priori with every possible bank that issues credit cards. A decentralized identity ecosystem can enable a similar concept to credit card associations by introducing governance authorities and frameworks for many different trust communities in a wide array of industry verticals. 
As a result, merchants, or other verifiers, can avoid setting up multiple trust federations – they can simply ask the issuer to present additional proofs that the issuer is indeed a member of a specific governance authority with which the verifier already has an established trust relationship. One of the major hurdles for adopting blockchain today in enterprise scenarios is the lack of a decentralized identity infrastructure. After all, it’s not very logical to have a decentralized blockchain network if all the identities on it are still relying on centrally controlled accounts. Furthermore, in a decentralized identity ecosystem, consumers will be more easily able to track which websites they visit online and with whom they transact. You will know which businesses have your personal data, and you will be able to revoke access to it should you so desire. Instead of sharing paper documents or physical cards, you will be able to share digital documents and digital cards in a fully digital, privacy-preserving, and auditable manner. For organizations, this may reduce GDPR-related risk since personal data will be stored in the identity hub under the individual’s control, while the organization will only have access to specific data as granted by the user. Furthermore, the individual may have the opportunity to revoke access to their data, and this may simplify the GDPR compliance for an organization as well as streamline such requests for the individual. As well, GDPR compliance may be eased for an organization as it will be able to possess cryptographic proof as evidence that the individual has indeed provided them with specific data. As discussed, the digital wallet contains a digital agent app with which the user interacts. Such digital or user agents are mostly based on open source software. The individual can download a user agent from a commercial corporation, or perhaps even a government entity. 
An individual may even develop their own user agent from existing open source software. Conceptually, an individual must trust the user agent and it should be under the individual’s control. While it is extremely challenging to attempt to predict how the decentralized identity landscape will evolve given its nascent state, current trends are indicating government interest to ease the burden on citizens and businesses via government-issued digital IDs. Tailwinds from the unprecedented global COVID-19 pandemic are urging government institutions to streamline citizen and business access to government-provided services. As well, increasingly stringent regulatory compliance requirements and further demand by users for better user experience and increased convenience may further drive demand for digital identity in the form of verifiable credential exchange. Finally, verifiable credentials may prove very useful in situations where the same credential must be presented both online in digital transactions as well as in offline in-person interactions, since this can result in increased business efficiencies for the enterprise and a more consistent and simplified user experience. Conclusion Decentralized identity is a conceptual shift from the way the identity and access management community has been approaching identity in the past, yet it is able to co-exist with the account-based identity model that has existed for decades. Decentralized identity can add a lot of value to transactions that require high assurance of trust to make authorization decisions. If an individual continues to authenticate with a website using a traditional “account”, it does not preclude the individual from having to present verifiable credentials in order to, say, transfer large sums of money to another individual or organization. 
This offers the possibility to unlock a myriad of new opportunities for digital commerce and enable consumers, employees, and citizens around the world to transact on the web in a more secure, safe, and privacy-preserving manner. It may pave the path for a digital wallet with digital cards, like the way we all use a physical wallet and physical cards today. Verifiable credentials are easy to reason over because many of them will simply be digital representations of the physical cards we already carry in our wallets every day. We are still in the early days of decentralized identity. It is not a technology that a single company can simply release to the market. It requires both standards as well as collaboration between the private and public sector to have a healthy ecosystem of issuers, holders, and verifiers. When we finally reach critical mass adoption, digital experiences may look and feel much different from the experiences of today. Decentralized identity is an exciting development in the identity space, and it has the potential to offer more trustworthy digital experiences and unlock more value for everyone.

Change Log

|Date|Change|
|---|---|
|2020-10-30|V1 published|
|2022-02-28|Editorial changes only (changed example business names to non-Microsoft specific ones)|

Author Bio

Leo Sorokin has over 10 years of experience in various solution architecture and enterprise architecture roles with large organizations in the financial, manufacturing, and software industries. He is currently a Cloud Solutions Architect at Microsoft helping the largest Canadian organizations adopt cloud technology. Leo has extensive experience with identity, service-oriented architecture, application integration, cloud-native application and hybrid-cloud architecture, as well as security software architecture. Leo is also TOGAF® 9 Certified, a Microsoft Certified Azure Solutions Architect and holds a Computer Science degree from York University. 
Leo has also taught technology related courses in several educational institutions. ""Verifiable Credentials Data Model 1.0,"" W3C Recommendation, World Wide Web Consortium (W3C), 19 November 2019, https://www.w3.org/TR/VC-data-model/.↩︎ “Decentralized Identifiers (DIDs) v1.0,” W3C Working Draft, World Wide Web Consortium (W3C), 27 October 2020, https://www.w3.org/TR/did-core/.↩︎ Decentralized Identity Foundation (DIF), [Online]. Available: https://identity.foundation/.↩︎",https://bok.idpro.org/article/id/51/,,Paper,,Explainer,,,,,,,"DID,Verifiable Credentials",2020-10-30,,,,,,,,,,,,,
|
||
Microsoft,Microsoft,,,Joy Chik,,,,,,5 identity priorities for 2021—strengthening security for the hybrid work era and beyond,"In this paradigm, individuals can verify a credential with an ID verification partner once, then add it to Microsoft Authenticator (and other compatible wallets) and use it everywhere in a trustworthy manner.",,https://www.microsoft.com/security/blog/2021/01/28/5-identity-priorities-for-2021-strengthening-security-for-the-hybrid-work-era-and-beyond/,,Post,,Explainer,,,,,,,,2021-01-28,,,,,,,,,,,,,
|
||
Microsoft,Microsoft,,ID Standards Blog,Pamela Dingle,,,,,,Decentralized Identity: The Basics of Decentralized Identity,"At the most basic level, decentralized identity is the story of three standardized documents: a proclamation, a letter of introduction, and an endorsement.","Here in part three of our decentralized identity series, I’ll describe the key parts of a decentralized identity architecture without diving too far into the technical details. It takes a village for this kind of ecosystem to work – as you’re about to see – and the concepts discussed here are industry standards that anyone can research and implement. If I succeed, you’ll be able to explain the design pattern behind this architecture and have enough information to look up the underlying specifications, if you choose. Quick background: We started this series with the 5 guiding principles for decentralized identities. Then, we looked at why we think that the Direct Presentation model gives us advantages in moving towards our goals. Now comes the FUN part – we get to dig into the technical mechanisms that we think could underlie decentralized trust for individuals. Part I: The Five Guiding Principles Part II: The Direct Presentation model Part III: The Basics of Decentralized Identity <-- You are here Part IV: Deep Dive: Verifiable Credentials Part V: Deep Dive: Anchored Decentralized Identifiers A story of three documents At the most basic level, decentralized identity is the story of three standardized documents: a proclamation, a letter of introduction, and an endorsement. That’s it! The rest is just tying up loose ends. Because I hate to leave you all in suspense, I’ll give you the technical names for each of these documents up front - but don’t panic! We’ll define them in context right away. A DID document is a proclamation that helps strangers verify communications are authentic. 
A verifiable credential is a letter of introduction by an issuer containing authoritative statements about a subject. A verifiable presentation is an endorsement of a subject at the time a verifiable credential is passed to a verifier. You can think of this architecture as a feudal letter passing scheme, with just as much intrigue but a lot more math involved. We open our feudal saga in a mythical land, long ago… Chapter 1: The proclamation Rose Abbey is known throughout the land for its beautiful gardens. The main building is something of a famous landmark: everyone knows where to find it, and those who do can find a proclamation nailed to the door with a wax seal attached that only this abbey can produce: A paper with a wax seal in the shape of a flower that says: “Hear ye hear ye: All messages from Rose Abbey will be sealed with wax mixed with rose petals that can only be found in the Abbey. It looks like this” There are two interesting properties of this proclamation document. First is the provenance of the proclamation. It’s clearly a proclamation from Rose Abbey because it’s nailed right to the door, for all to see. Anyone can find Rose Abbey on a map, walk to the door, and read the document to learn about the wax that verifies a message is from the Abbey. The second is the verification method within the document. The proclamation provides the recipe that explains how anyone can compare the exact composition of wax affixed to a given message to the one on this proclamation, and verify the message was created using wax from the roses unique to the Abbey. Let’s go one step further and say that any attempt to melt and re-form the wax changes the color so that only the Rose Abbey artisans are able to achieve a unique seal from specific raw ingredients. What will Rose Abbey do with this magic superpower? Rose Abbey needs shoes for their monks. Cobbler Jan is the best around, working from her pushcart which she wheels around the village. 
Luckily, Jan has a proclamation too - but Cobbler Jan isn’t as rich or established as Rose Abbey. Jan doesn’t have a fixed, well-known storefront. Instead of hanging on a fancy door, Cobbler Jan’s proclamation is posted among other merchants’ proclamations on a board in the town square: A paper with a wax seal in the shape of a boot that says: “Hear ye hear ye: All messages from Cobbler Jan will be sealed with a handmade one-of-a-kind wax seal made in the cobbler’s shop. It looks like this” With these proclamations, Rose Abbey and Cobbler Jan can start corresponding with a guarantee of message integrity, each signing messages with the resource only they control, and verifying the other’s message using the public proclamation document. As long as Cobbler Jan can reliably find Rose Abbey’s proclamation and Rose Abbey can reliably find Cobbler Jan’s proclamation, they should be able to evaluate the unique signature represented by each other’s wax and know they’re the true originators of the message. A DID document is the digital equivalent of a proclamation nailed to the door of Rose Abbey. DID documents contain statements about how to securely interact with an entity. Message recipients who want to validate a message need to know two things: 1) how the DID document can be located and 2) what exact proclamation to use. In the case of Rose Abbey, this is easy; they use the time-tested “front-door” method to locate their proclamation, which involves opening a map, looking for a building called Rose Abbey, then proceeding to the front door of the largest building at that address. If you get to an address, look at the front door, and the proclamation doesn’t say “Rose Abbey,” you know you’re in the wrong place or don’t have the right proclamation. In the real world, you would encode both pieces of information into a DID. 
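In real syntax, a DID encodes exactly these two pieces of information: `did:<method>:<method-specific-id>`. A minimal parser, using the article's fictional methods as examples:

```python
# Sketch: a DID is self-contained, carrying both the "how" (the method,
# i.e. the instructions for finding the DID document) and the "what"
# (the method-specific identifier).
def parse_did(did: str) -> tuple[str, str]:
    scheme, method, method_specific_id = did.split(":", 2)
    assert scheme == "did"
    return method, method_specific_id

parse_did("did:front-door:RoseAbbey")    # the "front-door" method
parse_did("did:town-square:CobblerJan")  # the "town-square" method
```

Whatever the method, the DID document it resolves to has the same standardized contents, which is what lets verifiers treat all DIDs uniformly.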
The amazing thing about the DID is that it’s self-contained, holding both the “how” and the “what.” For example, the DID for Rose Abbey could be did:front-door:RoseAbbey. Anyone who sees a message from did:front-door:RoseAbbey knows to get a map, find Rose Abbey, and to read the proclamation containing the identifier did:front-door:RoseAbbey on the largest building. In Cobbler Jan’s case, you would visit the commons to find their proclamation among those posted. Cobbler Jan has fewer resources than Rose Abbey but still has a DID. A message signed using the “town-square” method such as did:town-square:CobblerJan is just as valid, but the instructions are different. Instead of opening a map and looking for a direct address for Cobbler Jan, the town square method instructions say to find the town square on the map, and search for the proclamation posted at the commons with a matching identifier. Note that no matter what method the DID uses, the DID document produced has the same standardized contents. Chapter 2: Verifiable credentials: The letter of introduction Now that Rose Abbey and Cobbler Jan know they can correspond securely, they negotiate for the contract to make shoes, and Jan is awarded the job. Rose Abbey creates a new credential document that Cobbler Jan can use to purchase supplies. This is a document standardized throughout the land – here's a before and after of Rose Abbey filling out the document: A form-based letter of introduction with blanks where four values can be filled in. The letter says “To whom it may concern, blank (the issuer) says blank (the subject) is blank (claims). Signed, blank (a signature).” The same form-based letter of introduction filled in to say “To whom it may concern, did:front-door: RoseAbbey says did:town-square: CobblerJan is an authorized buyer of leather. Signed, wax seal in the shape of a flower.” Cobbler Jan is now in possession of an authoritative statement that can be used to buy leather for the Rose Abbey boot order! 
But notice this letter doesn’t say “the bearer of this document is an authorized buyer.” It states that Cobbler Jan is an authorized buyer. Because of this, leather merchants won’t release leather to anyone who presents this credential, only to someone who can prove they’re the person explicitly named. Cobbler Jan can’t sell the letter or give it away because they would have to give up the secret recipe to their handmade wax. The modern equivalent to this letter of introduction is a verifiable credential, or VC. In this case, Rose Abbey has wrapped an entitlement to buy leather into a container that is coded to their DID and signed cryptographically. Anyone who wants to confirm the VC really came from Rose Abbey can use the embedded issuer DID in the VC to find a public cryptographic key in the associated DID document and do the math to confirm. Time to go buy some leather! Chapter 3: Verifiable presentation: The endorsement This brings us to the morning that Cobbler Jan arrives at the leather shop. They must do two things: - Produce the letter of introduction signed by Rose Abbey that shows entitlement to buy on behalf of the Abbey. - Prove in the moment that the entitled party mentioned by Rose Abbey is the same party that is present in the shop. Cobbler Jan accomplishes these two steps by opening the document from Rose Abbey and stamping an endorsement onto the credential, attaching the cobbler’s own unique wax seal only they can produce to prove they’re the same Cobbler Jan that the Rose Abbey credential refers to. The previously shown letter of introduction with an additional note sealed by a wax seal in the shape of a boot that says “Endorsed by Cobbler Jan on April 22” This is the most important part! The leather merchant could be looking at fraudulent documents that look convincing. Only by verifying the wax on both seals against the wax in the proclamations of each party can the merchant be sure they’re not giving away leather to the wrong party. 
If, for example, Cobbler Fred (a competitor) creates a fake letter of introduction claiming to be from Rose Abbey using his own identifier and authoritative seal, when the leather merchant looks up the seal that is authoritative for Rose Abbey they’ll see that it doesn’t quite match the seal on the endorsement that Cobbler Fred handed over. One difference between our feudal world and modern times is that Cobbler Fred might be smart and talented enough with an inkpot and quill to steal the original letter, and alter it convincingly while keeping the seal intact. Luckily, modern digital verification doesn’t rely on human perception, so this risk is mitigated today. This endorsement of Rose Abbey’s letter of introduction could happen many times. Cobbler Jan could visit various leather merchants and prove each time that the entity referred to in the letter of introduction is in fact the person there in the moment. As long as everyone protects their secret wax formulas, and everyone faithfully validates the documents, there’s reasonable protection against counterfeit letters. The modern equivalent of this endorsement process is a verifiable presentation, or VP. A VP can be considered a “proof of possession.” Here’s how it works in some truly medieval pseudocode:

If:
- a VP and VC are presented together
- and the issuer of the VP and the subject of the VC are both Cobbler Jan
- and the VP signature can be validated with Cobbler Jan’s DID document verification method
- and the VC signature can be validated with Rose Abbey’s DID document verification method

then it can be assumed that the VC was issued to the entity that issued the VP.

The big picture And that, my friends, is the basic model behind DID. We have fancier terms and we can use modern cryptography instead of wax seals with unique ingredients, but the fundamentals are the same. 
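The medieval pseudocode can be restated as a small Python sketch. The signature checks are stubbed out with a placeholder; a real implementation would resolve each DID document and verify the cryptographic proofs:

```python
# Sketch of the VP "proof of possession" check. The check_signature
# callable stands in for real DID resolution plus signature verification.
def vc_belongs_to_presenter(vp: dict, vc: dict, check_signature) -> bool:
    return (vp["issuer"] == vc["subject"]               # same party named
            and check_signature(vp, did=vp["issuer"])   # Jan's seal holds
            and check_signature(vc, did=vc["issuer"]))  # Abbey's seal holds

trusted = vc_belongs_to_presenter(
    vp={"issuer": "did:town-square:CobblerJan", "proof": "boot-seal"},
    vc={"issuer": "did:front-door:RoseAbbey",
        "subject": "did:town-square:CobblerJan", "proof": "flower-seal"},
    check_signature=lambda doc, did: True,  # stub: assume both seals verify
)
```

Cobbler Fred's fake letter would fail here: the issuer of his VP would not match the subject of a genuine Rose Abbey VC, and his forged seal would not validate against Rose Abbey's published verification method.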
All parties involved must safely manage the ingredients needed to produce a signature that can’t be replicated, and must all proclaim a signature verification method using a mechanism that’s well-known. When credentials successfully validate, they guarantee message integrity, and that guarantee allows secure conversation to begin, a key layer for trusted commerce. Credentials can be issued to a subject, then presented and endorsed by that subject, enabling the exchange of goods and services as a result. But where is the decentralization? If you’ve made it this far, you may have noticed we aren’t talking a lot about decentralized concepts. You might also be itching to point out that this story is just a tale of a secure messaging scheme based on asymmetric cryptography, and that signed messages have been around for almost as long as Rose Abbey. You’re absolutely right. There's nothing about the model or standards we’ve defined that forces decentralization or excludes centralization. The DID and VC specifications are intentionally agnostic to the properties of the specific trust anchors that might be implemented, which is a big part of the reason we think they can be successful. The difference between any old, secure messaging scheme based on asymmetric cryptography and this messaging scheme based on asymmetric cryptography, is the layer of abstraction that allows old and new, decentralized and centralized - and anything we come up with in the future - to work alongside each other. We’ll talk more about this in the next installment of this series. To quickly recap: DID documents are the posted proclamations that enable VCs. VCs are the “fill in the blank” authoritative statements of attribution of entitlement that are issued against a subject identifier. VPs are endorsements at the time of presentation that prove that the entity showing the credential is, in fact, the named party. 
The system looks something like this: A diagram showing rose abbey on the left with a title of “Issuer”, Cobbler Jan in the middle with the title of “Holder” and a leather shop on the right with the title of “Verifier”. Below each entity is a DID document and there are arrows that show 1) a verifiable credential passing from Rose Abbey to Cobbler Jan, 2) Cobbler Jan verifying the DID document of Rose Abbey, 3) a verifiable presentation passing from Cobbler Jan to the leather shop, 4) the leather shop verifying Cobbler Jan’s DID document and 5) the leather shop verifying Rose Abbey’s DID document. A few important things to note here: Everyone has a DID document (even the leather shop, which we didn’t cover in-depth) and the content is standardized. But the method by which the document is fetched can differ. DID documents could be world-readable and searchable like Rose Abbey, but others could be shared privately at the time of first interaction, or even ephemerally created once and then discarded. Every entity can make its own choices about which method works best in each context. We can also support emerging options for DID document storage without requiring a rewrite of the entire architecture. Of course, there’s a lot more to end-to-end architecture than just the documents that fly around. How are the documents delivered? How are the exact contents and types of letters negotiated to match the context? Can you selectively disclose part of the letter without revealing all the details? How does everyone know the rules in any given village? And, how do people know who to trust and for what? The answers are… stay tuned! Feudal metaphors were not built in a day. 
I’ll be back in the next installment to discuss fascinating topics, like whether Rose Abbey uses couriers on horseback to deliver their documents, or how Cobbler Jan can choose between African or European carrier pigeons to send different portions of the letter of introduction, depending on whether or not a given monk wants to be sure their shoes are made only from vegan leather. If you can’t wait that long, you can cheat a little and – it’s a long list but that’s the fun of it. If you want to learn more about these topics and the technologies that fill out all the steps in the process, you’re in luck! Our next installment in this series is a DEEP DIVE! We’ll take our feudal tale and outline the industry stack we’ve implemented at Microsoft, including W3C Decentralized Identifiers, the OpenID Foundation OIDC4SSI Transport family, and the W3C VC Data Model 1.1. If you’re still here after this fantastical feudal adventure, I salute you and thank you for joining me on our quest. Learn more about Microsoft identity:",https://techcommunity.microsoft.com/t5/identity-standards-blog/decentralized-identity-the-basics-of-decentralized-identity/ba-p/3071980,,Post,,Explainer,,,,,,,"Verifiable Credentials, DID",2022-03-24,,,,,,,,,,,,,
|
||
Microsoft,Microsoft,,ID Standards Blog,Pamela Dingle,,,,,,Decentralized Identity: Verifiable Credentials Deep Dive,"To understand the world of verifiable credentials and verifiable presentations, you need to understand the ecosystem in which they are expected to be used. The VC Data Model v1.1 defines the roles different entities play in the data format, as well as two processes: credential issuance and credential presentation. You are probably familiar with the concept of issuers or verifiers in the physical world, but the role of holder could be new to you. The specification defines a holder as an entity that can “possess a verifiable credential and generate verifiable presentations”. Another way to think about this is the place you store your credentials until you are ready to take them out and use them for a purpose.","Welcome to part four of our decentralized identity series! The goal for this segment of our larger story is to show you what a concrete VC (aka Verifiable Credential) looks like and to describe enough of the terms and concepts that you can further research the W3C VC Data Format 1.1 specification easily. There are a lot of links in this post; most of them will take you right to the corresponding part of the specification. If you missed our earlier installments in the series, you can find them here: Part I: The Five Guiding Principles Part II: The Direct Presentation model Part III: The Basics of Decentralized Identity Part IV: Deep Dive: Verifiable Credentials <-- You are here Part V: Deep Dive: Decentralized Identifiers General Model – and what is a Holder? To understand the world of verifiable credentials and verifiable presentations, you need to understand the ecosystem in which they are expected to be used. The VC Data Model v1.1 defines the roles different entities play in the data format, as well as two processes: credential issuance and credential presentation.
You are probably familiar with the concept of issuers or verifiers in the physical world, but the role of holder could be new to you. The specification defines a holder as an entity that can “possess a verifiable credential and generate verifiable presentations”. Another way to think about this is the place you store your credentials until you are ready to take them out and use them for a purpose. You might want to jump to the conclusion that a holder and a digital wallet are the same thing - a wallet can play the role of holder, but it has many other characteristics we will discuss some other time. Additionally, a wallet is usually a ""user-present"" technology, meaning that there is a human performing a ceremony as part of the credential interaction. While wallet interactions are the most talked-about use of VCs, there are also machine-to-machine interactions that do not have a user experience at all. Credential Issuance I have previously called a verifiable credential (VC) a letter of introduction – a statement of relationship between an issuer and a subject. A verifiable credential uses cryptographic proofs to bind an issuer statement about a subject to the subject's identifier. The resulting document can contain claims relative to the subject, and in some cases proofs of different kinds can be bundled together. Once issued, a verifiable credential can be held for potentially long periods of time and presented for multiple purposes and in multiple ways. VCs in a wallet wrapped by a VP and sent to verifiers Credential Presentation In the verifiable credential world, presentation involves the generation of a VP – a verifiable presentation. A VP is generated by a holder - it is a document that wraps a verifiable credential with a new credential that is both fresh and proves a relationship in the moment between the holder and the original VC subject.
Most often the subject/holder relationship is direct - meaning that the entity presenting the credential can prove they are the same entity named in the credential. To check this relationship, the verifier checks that the VC subject is the same as the VP issuer, AND that the signature of the VP is valid. VCs in a wallet wrapped by a VP and sent to verifiers All of this passing around of VCs and VPs is like a big game of hot potato, and different credentials may have different patterns of presentation frequency and breadth of presentation across many verifiers. A visual representation of overall VC/VP travel - VCs travel between Issuers and Holders, then VCs wrapped in VPs travel from holders to verifiers. What is in a Verifiable Credential? Now that you know how verifiable credentials move, we can look at what they contain. While the VC Data Model doesn't specify the protocols by which credentials travel between actors in the flow, the actors must be part of the data model, because their associated public/private keypairs are used for cryptographic signing operations. At the highest and geekiest level, a verifiable credential bundles claims together into a container called a credential, and imposes standardized rules around how parties interacting with the VC can derive confidence in different characteristics of that bundle - for example the data model shows how to verify that the created credential was made by the entity claiming to be the credential issuer. Canonical Data Format vs. Encoded Representations If you take all proofs and interpretation out of a verifiable credential, you have a base data format (the bundle I referenced earlier) that is simply referred to in the spec as “a credential”.
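The verifier check described above can be sketched in a few lines. This is a minimal illustration with invented DIDs; it assumes the signatures on both tokens have already been validated separately:

```python
def subject_is_holder(vp_payload: dict, vc_payload: dict) -> bool:
    # The relationship check from the text: the issuer (signer) of the VP
    # must be the same entity as the subject named in the wrapped VC.
    # Signature validity on both tokens is verified before this comparison.
    holder = vp_payload.get('iss')
    subject = vc_payload.get('sub')
    return holder is not None and holder == subject

# Invented DIDs, for illustration only:
assert subject_is_holder({'iss': 'did:example:alice'}, {'sub': 'did:example:alice'})
assert not subject_is_holder({'iss': 'did:example:mallory'}, {'sub': 'did:example:alice'})
```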
This credential data format is referred to as the “canonical form” and the assumption is that you can always get from a given verifiable credential representation to a canonical credential data format and back again, using the encoding & decoding rules laid out in section 6 of the VC Data Model 1.1. This is why the examples in the specification document have three tabs – one shows the canonical form, and the other two show each of the representations that can be used. While any number of representations could be additionally defined, there are two specified in the core data model: - JSON-encoded verifiable credentials (known as JWT-VCs) are encoded as signed JWTs (JSON Web Tokens). - JSON-LD encoded verifiable credentials (known as LDP-VCs) can use different proof formats, but the most common are Linked Data Proofs. Three boxes showing a credential (without verifiability) on top and two credentials (with verifiability) descending - a JSON encoded credential and a JSON-LD Encoded Credential A simple way to think about the representations versus the canonical credentials is that the representations are what add the verifiability to a verifiable credential. A canonical credential is the bundle of claims that purports to be from an issuer and purports to be about a subject, and contains a set of additional claims, but anyone can tamper with the content of that bundle. When the canonical credential is encoded into a representation however, the representation dictates what kind of proof structure can be used to evaluate various properties of the credential, with tamper-proofness being a required check (this happens through checking a cryptographic signature). Other checks could potentially happen, for example validating a selective disclosure of information rather than an entire credential.
All of my examples in this blog article will be JSON-encoded, partly because this is what Microsoft has implemented (and therefore what I’m more experienced in) but also because many of you reading this blog are probably familiar with JSON Web Tokens already, especially if you have worked with JWT-based implementations like OpenID Connect IDtokens, Federated Identity Credentials, or JWT bearer client authentication tokens. If you aren’t familiar with what a JSON Web Token is, check out this tutorial. A JWT-VC is (surprise surprise) a mashup of a JSON Web Token and a canonical VC. The JWT supplies verifiability through a cryptographic signature bound to an identifier that is labeled as iss (aka the issuer). JWTs have a very familiar structure. A visual representation of a JSON-encoded Verifiable Credential - 3 sections - the header contains the alg, typ and kid claims. The payload contains iss, nbf, exp, jti, sub claims and an object called VC, which contains a @context, type, and the credential subject object. That object contains various claims. The last of the 3 major sections is the signature. A JWT-VC has three parts, and the payload contains what I would call envelope information: the data needed to know who the credential is bound to, who made the credential, when it was made and how it can be identified. Additionally, there is a JSON object called “VC”. Claims information is embedded inside the VC object. A JWT-VC uses an external proof, meaning in this case that signature data is not embedded inline with the credential; the signature is detached from the credential. Here’s an example of a JSON-encoded verifiable credential that my university might issue to communicate my relationship and some additional information. Note that this is NOT a real-world Microsoft-issued example; it is a construct meant to highlight characteristics of the specification.
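The three-part structure described above can be built and taken apart with nothing but the standard library. This is a runnable toy along the same lines, not the post's actual example: the DIDs and claims are invented, and an HMAC (HS256) signature stands in for the asymmetric algorithms (e.g. ES256) a real JWT-VC would use:

```python
import base64, hashlib, hmac, json

def b64url(raw: bytes) -> str:
    # JWT segments are unpadded base64url (RFC 7515)
    return base64.urlsafe_b64encode(raw).rstrip(b'=').decode()

def unb64url(seg: str) -> bytes:
    # restore padding before decoding
    return base64.urlsafe_b64decode(seg + '=' * (-len(seg) % 4))

# Toy header and payload shaped like the envelope described in the text
header = {'alg': 'HS256', 'typ': 'JWT', 'kid': 'did:web:example.example#key-1'}
payload = {
    'iss': 'did:web:example.example',     # who made the credential
    'sub': 'did:example:holder',          # who it is bound to
    'nbf': 1640000000,                    # when it becomes valid
    'jti': 'urn:uuid:00000000-0000-0000-0000-000000000001',  # how it is identified
    'vc': {'credentialSubject': {'alumniOf': 'Example University'}},
}
signing_input = b64url(json.dumps(header).encode()) + '.' + b64url(json.dumps(payload).encode())
# The external proof: the signature is detached from the credential body
signature = hmac.new(b'demo-secret', signing_input.encode(), hashlib.sha256).digest()
token = signing_input + '.' + b64url(signature)

decoded_header, decoded_payload = (json.loads(unb64url(s)) for s in token.split('.')[:2])
```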
If you see ways in which my example is not conformant, please point it out in the comments, for extra street cred! A valid JSON object formatted as a JWT-VC with a header, payload and signature section The JWT-VC payload combines two different kinds of claims: - The three-letter claims (iss, nbf, exp, jti, sub in this example) come from RFC 7519 (the JWT specification). These claims are required to validate the JWT and would be expected by any conformant JWT library or product. - The VC claim at the first level of the JWT comes from the W3C VC Data Model 1.1. The VC claim contains the following standardized claims: - credentialSubject is the container for authoritative attribute statements about a subject (for example, the “achieved” object in my example, which is custom to the NorthAmericanUniversityGraduationRecord credential type and shows the type of accreditation earned). - @context contains a set of URIs that point to machine-readable definitions conforming to the W3C JSON-LD @context definition. In this version of the spec, even if you aren’t using JSON-LD, you need to respect this format and ensure that the values set in @context additionally contain a reference to the VC Data Model v1.1 base context so that JWT-VC credentials won’t break JSON-LD parsers. In our example, we included the base context but also a reference to “global-diplomas.org/accreditation”, which is where the credential schema for the credential in the type attribute would be found. - type is another property that should be constructed according to JSON-LD conventions. According to Section B.2 of the VC Data Model 1.1, @type is used to “indicate which set of claims the verifiable credential contains”. Unlike the @context property, which is always a set of URIs, @type can be a single string or an unordered set. Our NorthAmericanUniversityGraduationRecord type helps verifiers to reliably ask for a credential with predictable contents.
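Pulling the pieces above together, the vc claim of the post's fictional example would be shaped roughly like this. The global-diplomas.org context URI, the credential type, and the achievement values are all illustrative, not a real schema:

```python
# A sketch of the vc claim, rendered as a Python dict
vc_claim = {
    '@context': [
        'https://www.w3.org/2018/credentials/v1',     # required base context
        'https://global-diplomas.org/accreditation',  # where the credential schema would live
    ],
    'type': ['VerifiableCredential', 'NorthAmericanUniversityGraduationRecord'],
    'credentialSubject': {
        # the achieved object is custom to this credential type
        'achieved': {'name': 'Bachelor of Science'},
    },
}
```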
In order to know how a canonical credential corresponds to a JWT-VC, you need to spend quality time in section 6.3.1 of the VC data model. Here’s an example of how our JWT-VC would look decoded into the canonical data format: Diagram mapping our example JWT-VC object to a canonical credential Note that in the JWT-VC, both the jti and the sub properties map to a property called “id” in the credential data model: one property applies to the parent credential object and the other applies to a credentialSubject object. Distinctions, Differences and Gotchas The JWT-VC representation imposes rules on how a VC can be constructed, and in some cases the choice of representation has important consequences. Single Signature/Issuer Signed JWTs are structured to have only one signature, and that single signature is affiliated with one issuer. This is an external proof (in VC terms) and it wraps the entire assertion. Single Subject RFC 7519 defines a JWT sub claim as containing a single value, which maps to the id value of a credentialSubject object. There can only be one JWT sub claim, which means that a JWT-VC can’t have more than one credentialSubject object within it. Other encodings do not have this limitation, allowing multiple credentialSubject objects within a single verifiable credential. Nbf, not iat In the VC data model, issuance date is defined as “the date and time when a credential becomes valid”. In JWT-land, there is a difference between the moment the token was issued (iat), and the earliest moment that any verifier should consider the token as valid (nbf). The JWT authors defined two separate values to help account for clock skew, because if the system clocks aren't perfectly aligned, the token could arrive before it was even issued. The VC data model JSON encoding maps the issuanceDate claim in the abstract credential to the JWT “not before” claim rather than the JWT “issued at” claim. Remember, the Verifiable Credential is just half of the Story!
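The section 6.3.1 mapping just described can be sketched as a small decoding function. This is simplified for illustration; the spec defines more mappings (exp to expirationDate, for instance) and more edge cases than shown here:

```python
import datetime

def jwt_vc_to_canonical(jwt_payload: dict) -> dict:
    # Fold the JWT registered claims back into canonical credential properties
    vc = dict(jwt_payload.get('vc', {}))
    if 'jti' in jwt_payload:
        vc['id'] = jwt_payload['jti']            # id on the parent credential object
    if 'iss' in jwt_payload:
        vc['issuer'] = jwt_payload['iss']
    if 'nbf' in jwt_payload:
        # issuanceDate comes from nbf (not iat), as the text explains
        ts = datetime.datetime.fromtimestamp(jwt_payload['nbf'], datetime.timezone.utc)
        vc['issuanceDate'] = ts.strftime('%Y-%m-%dT%H:%M:%SZ')
    if 'sub' in jwt_payload:
        subject = dict(vc.get('credentialSubject', {}))
        subject['id'] = jwt_payload['sub']       # id on the credentialSubject object
        vc['credentialSubject'] = subject
    return vc
```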
A subject might have a signed JWT-VC stored with a holder, but to present it to a verifier, the holder needs to also construct a JWT-VP to show that the subject & holder have a legitimate relationship to this particular verifiable credential, and that the subject wants this particular VC to be presented to this particular verifier, at this particular time. Our JWT-VC claims that a globally unique subject (did:ion:pamelarosiedee) has earned a degree; now we need to convince a verifier that did:ion:pamelarosiedee is right here, right now, present in the transaction and is actively passing their credential along. Cryptographically speaking, a JWT-VP represents a proof of possession: a verifiable way to know for sure that the entity presenting the diploma is the same entity that the diploma was issued to in the first place. Going back to my last blog entry, a verifiable presentation is an endorsement. It wraps our letter of introduction (VC) and adds an extra proof to the mix – a time-specific proof that the exact subject listed in the verifiable credential is present and part of a real-time identity transaction. If my employer does all their validation checks properly, they will know that an entity with a globally unique identifier of did:ion:pamelarosiedee (me) is claiming to be the subject of an authoritative educational achievement issued by did:web:ucalgary.ca (my university) in the moment in time when the employer needs that information. The validation checks enable cryptographic trust, meaning that the data hasn’t been tampered with, but don’t be fooled into thinking that cryptographic trust is the only trust decision that must be made! What if did:web:ucalgary.ca wasn’t a real university, for example? I could go register the domain “dingleuniversity.com” and create a degree for myself that could be generated and presented with perfect cryptographic trust!
The verifier needs to be able not only to know that the documents are not tampered with, but also that the issuers of the documents are acceptable business partners in a given context. This is where trust federations, trust registries and trust frameworks come in, topics that are out of scope for this article but that we will return to. Heading back to our example, what would a verifiable presentation look like if I wanted, on the 8th of August 2022, to present my University degree to my employer? A JWT-VP issued by my decentralized identifier and audienced for Microsoft’s decentralized identifier that contains the verifiable credential we’ve already studied A Closer look at JWT-VP The example above is a JSON-encoded verifiable presentation, or JWT-VP. The “how-to” of JSON-encoding into a JWT-VP is blended into section 6 of the spec, so there is no easy section to refer to, but there are a few significant differences if you read closely: No Subject The subject property isn’t needed in a VP, because the issuer is asserting a thing about itself; in other words, a JWT-VP is self-issued (you will hear this term again). Wraps a Related Verifiable Credential Remember, the JWT-VP must prove a relationship between the presented credential and the entity doing the presentation. By far the most common relationship to prove is the “this is my credential” relationship, also called “Subject is the Holder”. If I stole my sister’s diploma and uploaded it to my employer to verify, the burden is on the verifier to detect the fraud and reject my claim to the uploaded credential. If the verifier does a bad job, the fraud could work. In verifiable credentials we use cryptographic proofs to be sure the presenter of the credential is the subject of the credential. If the issuer that signs the JWT-VP is the same entity as the subject listed in the encapsulated JWT-VC, you have a strong proof of legitimate possession.
In our JWT-VP world this check is straightforward because the JWT issuer and subject claims are always single-valued. Validation logic for this use case is currently non-normative, but do not interpret non-normative as not important! Constrained to an Audience While verifiable credentials are an open-ended assertion, verifiable presentations are targeted, using a property called aud. This critical security component prevents an attacker from stealing a presentation made to one verifier and reusing it at another verifier. Creating a presentation without an audience is like broadcasting your endorsement – as long as nobody records your broadcast you might be ok, but if anyone does record that broadcast, they can replay it at will! Imagine if you uploaded your diploma to your social network and openly endorsed it as yours to get a pretty badge, but somebody recorded you endorsing your diploma and used that recording to apply for 3 different jobs in your name! Not only is it imperative to use an audience in all presentations, but it is imperative not to accept broadcasts; that is, verifiers should not accept an un-audienced presentation, because it could be a replay. This does not mean that audience needs to be a strict match in the way that a subject/holder match is performed – it could be that the presentation is legitimately audienced to a group of verifiers rather than a single verifier, but every degree of freedom allowed in the audience introduces replay risk. Time-Limited and Single-Use The momentary nature of a verifiable presentation is what makes it safe to hold and store verifiable credentials. A JWT-VP represents the exact moment where a subject intends to share a credential. These VPs are only valuable if they can reliably represent the true intention and only the true intention of the subject in the moment they intended it and no other. Once a JWT-VP is signed and represents that intention, the token itself becomes extremely powerful.
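The audience rule above, plus the replay concern it guards against, can be sketched as two verifier-side checks. All names here are invented; real validation involves more steps (signatures, validity windows, subject/holder matching):

```python
import time

def audience_ok(vp_payload: dict, my_verifier_id: str) -> bool:
    # Never accept an un-audienced presentation - it could be a replayed
    # broadcast. A list-valued aud is allowed, but it must name us.
    aud = vp_payload.get('aud')
    if aud is None:
        return False
    if isinstance(aud, list):
        return my_verifier_id in aud
    return aud == my_verifier_id

class ReplayCache:
    # Remember each incoming token identifier (jti) for as long as its token
    # is valid, so a captured presentation cannot be reused inside its window.
    def __init__(self):
        self._seen = {}  # jti -> exp (unix seconds)

    def first_use(self, jti: str, exp: float, now: float = None) -> bool:
        now = time.time() if now is None else now
        # prune identifiers whose validity window has closed
        self._seen = {j: e for j, e in self._seen.items() if e > now}
        if jti in self._seen:
            return False  # replay detected
        self._seen[jti] = exp
        return True
```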
To keep abuse to a minimum, the JWT-VP exp property together with nbf creates a short window of validity. Verifiers need to track every incoming jti value (the unique token identifier) for as long as it is valid, so that they can detect any token that might fall through the cracks and get replayed. A short JWT-VP validity window reduces the number of active identifiers that have to be tracked by verifiers (helping performance), and also condenses the window of opportunity for an attacker to act; this also means a smaller horizon for risk/fraud tools to look for abuse trends. Note that many VCs will also have a validity window calculated by the exp property; however, that window can be much longer, in part because a stolen JWT-VC has less value by itself without a fresh JWT-VP to wrap it. How Does the VC Data Model Compare to Other Assertion Formats? If you examine a SAML assertion, an OpenID Connect IDtoken, and a VC/VP pair, all three formats have a lot in common: - They are all secure envelopes for sending sets of claims around the internet. - They all rely on cryptographic digital signatures to protect the integrity of the issuer’s document. - They all rely on a trust framework: some external system of trust to know which set of participant identifiers to transact with in the first place. There are also important differences. Separation of Issuance from Presentation SAML assertions and IDtokens are messages that carry claims bundled in the moment for one single audience & transaction. This differs from verifiable credentials, which are bundles of claims with longer lifetimes that can be held, stored, and presented to multiple audiences as a separate transaction. This separation between the bundling of claims and the use of the bundles is really what makes the VC data model unique. Advantages that may be unlocked because of this separation include: - Usage blinding – Issuers don’t necessarily know which verifiers their credential goes to.
- User-centric analytics – trends of credential use can be evaluated by the user across issuers & verifiers. - Historical presentation - expired statements can still be evaluated. - Selective disclosure - the subject chooses how much of the credential is revealed to the verifier. Subject Autonomy The VC data model requires credential subjects to be cryptographic participants. That is huge! IDtokens and SAML assertions communicate their assertion subject using an identifier that is only unique within the identity provider’s namespace. Verifiable credentials require a globally unique identifier for all cryptographic participants, and while the spec does not prescribe a mechanism for resolving identifiers to signature validation keys, the current commonly-accepted format for a globally unique identifier that can “resolve” into a public key for the purpose of digital signature validation is a decentralized identifier, or DID. We are going to cover the DID specification in detail in the next blog, so stay tuned for why that specification is so important. Thrilling Conclusion As you can see, VC and VP data formats work hard to make it possible for credentials to be issued, held, and presented at the discretion of individuals. A VC gives structure to the data and binds the data to a given globally unique subject identifier. A VP adds a timely endorsement to the credential that acts as a proof of possession for the credential. The great thing about the VC data model compared to other standards that do something similar is that VCs are a general model, and can describe many different types of credentials. Of course, for the purposes of interoperability, it is better if folks could agree on what credentials look like within a given community; for example, we probably don't want 100 different representations of what a university diploma looks like.
The fun part of the next few years (in my opinion) will be figuring out what common instantiations of credentials can and should become household names. There are guardrails we will want to place, and it would be lovely if we can figure out where guardrails need to be before vehicles go over the cliff, rather than after. Early adopters of this technology will have a huge part to play in socializing the kinds of credentials that become de facto standards. You may also have noticed that key management was a topic completely absent from this blog post, despite frequent use of the term ""cryptographic signing"". That's because the VC data model considers key management only obliquely. The spec states for example that ""relevant metadata about the issuer property is expected to be available to the verifier"". In a world where I might present my credential to verifiers with no pre-existing relationship to my issuer, how can that verifier discover the relevant metadata? This is where decentralized identifiers (or DIDs) come in. DIDs are not a required part of a verifiable credential, but they have properties that are very attractive. And you will never guess it, but the next blog in the series will be a deep dive on decentralized identifiers! They are a workhorse of a decentralized identity architecture, and an important reason why there is a chance that verifiable credential ecosystems can scale to represent individual people, not just organizations. DIDs are also the bridge between many different decentralized paradigms, between unanchored and static key encoding mechanisms, and between centralized services, through the use of methods like did:web. We will dig into how the mechanism works, and why different did methods might be useful at different times. While this blog is about the protocols, not products, if you want to try issuing a verifiable credential you can do it for free; see below for the link to get started.
Once you’ve tinkered with VCs for a while, let us know how it’s going and what tools you need. Learn more about Decentralized Identity:",https://techcommunity.microsoft.com/t5/identity-standards-blog/decentralized-identity-verifiable-credentials-deep-dive/ba-p/3690641,,Post,,Explainer,,,,,,,Verifiable Credentials,2022-12-09,,,,,,,,,,,,,
|
||
Microsoft,Office Garage IT Pro,,,,,,,,,"Digital Identity, use Verifiable Credentials with Blockchain","a special edition of Microsoft Mechanics with Microsoft’s Identity CVP Joy Chik, to cover a brand new solution called Verifiable Credentials that uses blockchain-based underpinnings and cryptographic keys to ensure that you’re in control of your own identity online","Digital Identity, use Verifiable Credentials with Blockchain A solution that uses blockchain-based underpinnings and cryptographic keys, to ensure that YOU are in control of your own identity online. Owning your identity is more relevant than ever. In the digital environment, it’s hard to retain ownership of your identity once you’ve shared it. Every time you give away personal information in exchange for a service — like bank account numbers, proof of education, or even employment, it’s now in the custody of those different institutions. You no longer control the data associated with them, and you can’t take it back. Verifiable Credentials is one of the most exciting and transformative areas of innovation. Verifiable Credentials and Decentralized Identifiers help you share your verifiable credentials without giving up your privacy. No one company or institution can control or store your information centrally — you can revoke your verifiable credentials at any time. Joy Chik, Microsoft’s Identity CVP, joins Jeremy Chapman to show you how it works and gives you the key steps to get up and running. QUICK LINKS: 00:07 — New solution: Verifiable Credentials 01:39 — What is it, and how do things change? 02:16 — See how it all works 08:04 — Get it up and running 13:26 — Where else are Verifiable Credentials applied? 14:08 — Links to learn more Link References: To learn more and get all of the tutorials, go to https://aka.ms/DIDForDevs. Download the Verifiable Credentials SDK and create your own DID at https://aka.ms/VCSDK.
Keep up to date with our Decentralized Identity blog at https://aka.ms/IdentityBlog/DID. Unfamiliar with Microsoft Mechanics? We are Microsoft’s official video series for IT. You can watch and share valuable content and demos of current and upcoming tech from the people who build it at Microsoft. - Subscribe to our YouTube: https://www.YouTube.com/c/MicrosoftMechanicsSeries?sub_confirmation=1 - Follow us on Twitter: https://Twitter.com/MicrosoftMechanics - Follow us on LinkedIn: https://www.linkedin.com/company/Microsoft-mechanics/ - Follow us on Facebook: https://facebook.com/microsoftmechanics/ Video Transcript: Hello, and welcome to a special edition of Microsoft Mechanics with Microsoft’s Identity CVP Joy Chik, to cover a brand new solution called Verifiable Credentials that uses blockchain-based underpinnings and cryptographic keys to ensure that you’re in control of your own identity online. So, Joy, it’s been a while. Welcome back to Microsoft Mechanics. It’s great to have you back and speaking about one of our favorite topics: identity. - Thank you. It is great to be back. - So, this whole topic of owning your own identity is more relevant than ever. Now, we’ve all heard stories or maybe even experienced times where our identities may have been compromised or even stolen, yet in the digital environment, it’s hard to retain ownership of your identity once you’ve shared it. - Right, and that problem compounds every time you give away personal information in exchange for a service, like bank account numbers and proof of education, or even employment, and once you give it away, you can’t take it back. It is part of your growing digital footprint. Think about all of the services that you sign into every day to enable day-to-day transactions, or your school ID, or employer-issued IDs. Once you share your information, it is now in the custody of those different institutions. It is hard to keep track of the different accounts.
You can no longer control the data associated with them, and you can’t take them back. - Okay, so how do things change, then, with Verifiable Credentials? - This is one of the most exciting and transformative areas of innovation that we are contributing to at Microsoft. A verifiable credential is a piece of information that a third party can validate digitally. In partnership with the open standards community, we are implementing an approach that’s based on W3C standards for verifiable credentials and decentralized identifiers to help you share your verifiable credentials without giving up your privacy. No one company or institution can control or store your information centrally. You can revoke your verifiable credentials at any time. - Okay, so how does it all work? - Let me explain with a real-world example. There are three parties in this chain of trust: the issuer, the subject, and a verifier. The issuer creates the verifiable credential. This can be an organization or entity that asserts information about you, like your employer or university. In this case, the issuer is the United States Department of Defense. The Navy MilGears program is piloting a solution to digitally issue verifiable credentials to each service member. They are the subjects who can now easily prove their skills to an accredited institution, what we call the verifier. This way, the subject can present credentials issued directly from the source, and as a result, hopefully accelerate their career. - Okay, so can we see it? - Yes, let me show you this in action. So, here, MilGears has made its verifiable credential service available to service members through their portal. To retrieve credentials as a service member, I just need to securely authenticate, and once I’m signed in, I can access my service history, training, education, and credentials. I’m going to pick up where I left off.
Now, if I scroll down, you’ll see the individual digital record that the DOD is attesting in the verifiable credential; in this case, my high school graduation status. To accept and activate the credential as a service member, I can scan the QR code with a supported authenticator app, and in this case, it’s Microsoft Authenticator. This stores a unique private key. Although you cannot see it on my screen, it has signed me in with biometric proof. Scanning the QR code stores the verifiable credential in the digital wallet on my device. Next, to complete my university application as a service member, I need to submit my military credentials. On a participating university site, in this case Trident, I just need to sign in with my usual username and a password, and in my case, I know I need to get my high school diploma verified, so I will click there. I will click here to verify, and that pops up a QR code. So, I will scan that, and if I’m there, I’m prompted to approve the request on my phone by tapping Accept, and once I approve it, the university can now access my verifiable credentials within seconds, and the beauty of this approach is your authenticator app keeps a record of every time you share your credentials. - Okay, so for this process to work, clearly, there must be incentives, then, for each party. - Yes, everyone benefits in this scenario. The issuer, in this case MilGears, gains a more efficient process by verifying service member information. They also don’t bear the risk of storing sensitive personal data. That data stays with the subject. The verifier, in this case the university, can trust the subject’s information because it comes directly from the issuing source. So, they save the time and the cost on background checks. And as a subject or service member, I have control over my information and can revoke access to it at any time.
As long as issuers and verifiers are participating in this approach, you can prove anything about yourself while retaining full ownership of your digital footprint. - And that’s great because for this approach to take off, there has to be a high level of convenience really for all parties involved, but what’s enabling the issuance and verification processes to work? - Under the covers, we’re using a blockchain-based distributed ledger with crypto keys as an intermediary. To participate in this process in Microsoft’s implementation, the issuer uses Azure AD with Azure Key Vault to register a key pair to the blockchain. The verifier does this using the Verifiable Credential SDK. During this process, each party gets a unique decentralized identifier, or DID, that gets written to the ledger along with their associated public key. The subject uses the authenticator app to generate a crypto key pair stored on their device. Once they accept that first credential, it is registered on the blockchain with the DID that is also associated with their public key. - Okay, so this then allows all three entities to be looked up in a trusted manner, all while protecting their privacy. - That’s right, so for example, when the service member approved the request for verification by scanning the QR code, it passed the credentials to the university. The university was then able to verify the service member’s DID and the issuer’s DID by looking up the public keys in the distributed ledger. - And all of this happens in just a few seconds. So, it’s very efficient and establishes the chain of trust across you, the issuer, and the verifier. But this is Mechanics, so we’re gonna show you the key steps to get all this up and running, starting with the issuer. Now, while this is based on open standards and could run anywhere, for Microsoft’s implementation you’ll need a couple of services in place as the issuer. First, an Azure subscription, and also Azure Active Directory Premium or trial. 
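The register-then-look-up flow described here can be sketched in a few lines of illustrative Python, with a plain dict standing in for the distributed ledger and a hash-based stand-in for real public-key signatures (all DIDs and key values are invented for the example):

```python
# Toy sketch of the ledger-lookup step: each party writes its DID and
# public key to a shared registry (a dict standing in for the distributed
# ledger), and a verifier resolves DIDs to check signatures. The
# hash-based 'signature' below is NOT real cryptography, just a stand-in.
import hashlib

LEDGER = {}  # did -> DID document

def register(did, public_key):
    # Issuer, verifier, and subject each publish a DID plus public key.
    LEDGER[did] = {'id': did, 'publicKey': public_key}

def sign(message, key):
    # Stand-in signature: hash of key material and message.
    return hashlib.sha256((key + '|' + message).encode()).hexdigest()

def verify(message, signature, did):
    # Resolve the DID on the ledger and check the signature with its key.
    doc = LEDGER.get(did)
    return doc is not None and sign(message, doc['publicKey']) == signature
```

A verifier never needs to contact the issuer in real time; resolving the DID on the ledger is enough to check who signed what.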
You’ll need to enable your Azure AD tenant for verifiable credentials, and this assigns you a decentralized identifier, or DID, and also equips your tenant with a service so that you can actually issue credentials, and you can learn more about all of this, by the way, at aka.ms/DIDForDevs, and also, you’ll need an Azure Key Vault and storage account. - That’s right, and then you take just three steps to configure and issue verifiable credentials. The first step is to connect your tenant to the distributed ledger. Here in the Azure Key Vault, you will generate and manage a pair of crypto keys, one private and one public. Azure AD then publishes the DID and then the public key to the distributed ledger, to manage signing, recovery, and updates for your DID. - Now, Azure AD customers can securely manage their keys to sign the verifiable credentials themselves. - That’s right. The second step is to configure the properties of your verifiable credential. To do this, you create a rules file, which is a simple JSON file that defines the properties for the verifiable credentials being issued. On the left are credential details from the Azure portal, and on the right is VS Code, with the two key files we need to create, one for rules and one for the card display. Let’s start with the rules file. You will see that it includes the credential issuer’s address that’s mapped to their domain address. Much like traditional identity tokens, an issuer can choose to set an expiration for the credential expressed in seconds. Next, it calls out the verifiable credential type and ties that to a learning record schema. So, applications can decipher the transcript and then map it to their own business process, and then finally, the issuer can declare what the user must present to receive such a credential. This can be self-attested information from filling out a digital form, for example. 
It can be one or more ID tokens, which can come from a traditional identity system like Facebook, Google, or a work account, or even verifiable credentials from some other issuers. - And I noticed that this one was actually branded as MilGears, so how did you do that? - For that, let’s take a look at our card display file to configure the look and the feel for the credential. As I mentioned before, verifiable credentials issued to you look just like the cards in your Microsoft Authenticator wallet. Issuers can customize them using the card configuration JSON file that you can see here. We’re able to specify the card’s color, icon, and the text strings to match the MilGears look and feel, and I can use other fields to let users know the purpose of the card, its attributes, and much more. - Okay, so that’s the issuer’s role, but let’s switch gears to the verifier. What do they need to do? - If you are the verifier, you still need to do a few things for this to work. To get started, you need to download the Verifiable Credentials SDK, and then create your own DID. You can find that at aka.ms/VCSDK. You also need to create a QR code or deep link and then publish it to your website or include it in an email to trigger the verification process. This QR code or deep link will trigger a process that you can create using the Verifiable Credential SDK. Let’s look at the code that is behind the request that our QR code links to. This is a presentation request to initiate the request for a subject’s credential. The verifier receives a response that they can trust by verifying three things. Let’s take a closer look at the response file, and you will see: Proof of presentation. You verify this by matching the signature of the presenter with the DID of the subject in the credential. Proof of authenticity. You also verify the signature of the presenter by looking up their public key on the ledger. Proof of issuance. 
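As a rough sketch of the two configuration files discussed above, here are illustrative Python dicts mirroring a rules file and a card-display file, plus a small expiry check; the field names only approximate the actual Azure AD schema, and the domain and values are invented:

```python
import time

# Illustrative shapes only -- these approximate the rules and card-display
# JSON files described above; the exact Azure AD schema may differ.
rules = {
    'issuer': {'domain': 'https://milgears.example'},  # hypothetical domain
    'validityInterval': 2592000,  # expiration expressed in seconds (30 days)
    'type': ['VerifiableCredential', 'LearningRecord'],
    # What the user must present to receive the credential:
    'attestations': {'idTokens': ['https://login.example/oidc']},
}

display = {
    'card': {'backgroundColor': '#1B3E6F', 'title': 'MilGears Transcript'},
    'claims': {'graduationStatus': 'High school graduation status'},
}

def is_expired(issued_at, rules, now=None):
    # An issuer can choose an expiration expressed in seconds.
    now = time.time() if now is None else now
    return now - issued_at > rules['validityInterval']
```

The display dict only controls look and feel; everything a verifier relies on lives in the rules and the signed credential itself.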
You verify the signature of the issuer by looking up their public key on the ledger. - All right, and just to be clear, for the user, it’s really easy. They’re just using the Authenticator app. - Yes, that’s right. Once the subject is issued the verifiable credential, it is available as another card in their wallet, and they can use it to prove information about themselves. - Right, and just like that, the circle of trust with this decentralized identity approach is established, and I like that you broke out of the conceptual realm to really show how this gets implemented with the real example of using the Department of Defense, but where else do you see verifiable credentials being used? What’s next? - Well, starting with skills verification, beyond what I just showed you, imagine even broader applications like presenting verifiable credentials on your LinkedIn profile. How cool would that be to help speed up the hiring and onboarding process? - Very cool. I can see this really speeding up the job application process. - And of course, beyond skills, there’s a broad range of scenarios where verifiable credentials can create more efficiency. Think of industries like healthcare and finance, where both privacy and validating the information that people share are just so important. - Okay, so for the people that are watching this that wanna try it out, where do they go to learn more? - To get started, you can go to aka.ms/DIDForDevs to get all of the tutorials. We’re also looking for customers with a skills verification use case to join our private preview, and our Decentralized Identity blog at aka.ms/IdentityBlog/DID is a great way to keep up-to-date on our progress in this space. - Thanks, Joy, for being on the show today. It’s always fantastic to have you on, and thank you so much for joining us today. 
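Recapping the verifier’s three checks from earlier (proof of presentation, proof of authenticity, proof of issuance), here is an illustrative sketch over a toy credential envelope; hash-based signatures and a dict stand in for real crypto and the distributed ledger, and every DID and key is invented:

```python
import hashlib

# Toy ledger mapping DIDs to public keys (all values invented).
LEDGER = {'did:ex:dod': 'issuer-key', 'did:ex:alice': 'subject-key'}

def toy_sig(payload, key):
    # Stand-in for a real digital signature.
    return hashlib.sha256((key + '|' + payload).encode()).hexdigest()

def verify_presentation(presentation):
    cred = presentation['credential']
    subject_key = LEDGER.get(presentation['presenter'])
    issuer_key = LEDGER.get(cred['issuer'])
    if subject_key is None or issuer_key is None:
        return False  # a DID that does not resolve cannot be trusted
    return (
        # Proof of presentation: the presenter is the credential subject.
        presentation['presenter'] == cred['subject']
        # Proof of authenticity: presenter signature checks via the ledger.
        and toy_sig(cred['claim'], subject_key) == presentation['proof']
        # Proof of issuance: issuer signature checks via the ledger.
        and toy_sig(cred['claim'], issuer_key) == cred['proof']
    )
```

All three checks need only ledger lookups, which is why the whole exchange can complete in seconds without contacting the issuer.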
This is a popular topic, and we hope that we’ve answered all of your questions and made it real for you, and don’t forget to subscribe to Microsoft Mechanics and keep watching the latest updates in this area and much more. Thanks for watching. We’ll see you next time.",https://officegarageitpro.medium.com/digital-identity-use-verifiable-credentials-with-blockchain-e3927e7b3cfc,,Post,,Explainer,,,,,,,,2021-03-02,,,,,,,,,,,,,
|
||
Microsoft,Personal,,,Damien Bowden,,,,,,Getting started with Self Sovereign Identity SSI,"Self-sovereign identity is an emerging solution built on blockchain technology for solving digital identities which gives the management of identities to the users and not organisations. It makes it possible to solve consent and data privacy of your data and makes it possible to authenticate your identity data across organisations or revoke it. It does not solve the process of authenticating users in applications. You can authenticate into your application using credentials from any trusted issuer, but this is vulnerable to phishing attacks. FIDO2 would be a better solution for this together with an OIDC flow for the application type. Or if you could use your credentials together with a registered FIDO2 key for the application, this would work. The user data is stored in a digital wallet, which is usually stored on your mobile phone. Recovery of this wallet does not seem so clear but a lot of work is going on here which should result in good solutions for this. The credentials DIDs are stored to a blockchain and to verify the credentials you need to search in the same blockchain network.","The blog is my getting started with Self Sovereign identity. I plan to explore developing solutions using Self Sovereign Identities, the different services and evaluate some of the use cases in the next couple of blogs. Some of the definitions are explained, but mainly it is a list of resources, links for getting started. I’m developing this blog series together with Matteo and will create several repos, blogs together. 
- Getting started with Self Sovereign Identity SSI - Creating Verifiable credentials in ASP.NET Core for decentralized identities using Trinsic - Verifying Verifiable Credentials in ASP.NET Core for Decentralized Identities using Trinsic - Create an OIDC credential Issuer with Mattr and ASP.NET Core - Present and Verify Verifiable Credentials in ASP.NET Core using Decentralized Identities and Mattr - Verify vaccination data using Zero Knowledge Proofs with ASP.NET Core and Mattr - Challenges to Self Sovereign Identity - Create and issue verifiable credentials in ASP.NET Core using Azure AD - Implement Compound Proof BBS+ verifiable credentials using ASP.NET Core and Mattr What is Self Sovereign Identity SSI? Self-sovereign identity is an emerging solution built on blockchain technology for solving digital identities which gives the management of identities to the users and not organisations. It makes it possible to solve consent and data privacy of your data and makes it possible to authenticate your identity data across organisations or revoke it. It does not solve the process of authenticating users in applications. You can authenticate into your application using credentials from any trusted issuer, but this is vulnerable to phishing attacks. FIDO2 would be a better solution for this together with an OIDC flow for the application type. Or if you could use your credentials together with a registered FIDO2 key for the application, this would work. The user data is stored in a digital wallet, which is usually stored on your mobile phone. Recovery of this wallet does not seem so clear but a lot of work is going on here which should result in good solutions for this. The credentials DIDs are stored to a blockchain and to verify the credentials you need to search in the same blockchain network. What are the players? 
Digital Identity, Decentralized identifiers (DIDs) A digital identity can be expressed as a universal identifier which can be owned and can be publicly shared. A digital identity provides a way of showing a subject (user, organisation, thing), a way of exchanging credentials to other identities and a way to verify the identity without storing data on a shared server. This can be all done across organisational boundaries. A digital identity can be found using decentralized identifiers (DID) and this has working group standards in the process of specifying this. The DIDs are saved to a blockchain network which can be resolved. https://w3c.github.io/did-core/ The DIDs representing identities are published to a blockchain network. Digital wallet A digital wallet is a database which stores all your verified credentials which you added to your data. This wallet is usually stored on your mobile phone and needs encryption. You want to prevent all third party access to this wallet. Some type of recovery process is required, if you use a digital wallet. A user can add or revoke credentials in the wallet. When you own a wallet, you would publish a public key to a blockchain network. A DID is returned representing the digital identity for this wallet and a public DID is saved to the network which can be used to authenticate anything interacting with the wallet. Digital wallets seem to be vendor locked at the moment which will be problematic for mainstream adoption. Credentials, Verifiable credentials https://www.w3.org/TR/vc-data-model/ A verifiable credential is an immutable set of claims created by an issuer which can be verified. A verifiable credential has claims, metadata and proof to validate the credential. A credential can be saved to a digital wallet, so no data is persisted anywhere apart from the issuer and the digital wallet. This credential can then be used anywhere. The credential is created by the issuer for the holder of the credential. 
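A minimal credential in the W3C Verifiable Credentials data model shape referenced above might look like the following illustrative Python dict (the DIDs and values are made up, and the signature value is elided):

```python
# Minimal sketch of the W3C VC data model: metadata, claims about a
# subject, and a proof block a verifier can validate. Values illustrative.
credential = {
    '@context': ['https://www.w3.org/2018/credentials/v1'],
    'type': ['VerifiableCredential', 'AlumniCredential'],
    'issuer': 'did:example:university',          # metadata
    'issuanceDate': '2021-03-29T00:00:00Z',      # metadata
    'credentialSubject': {                       # the claims
        'id': 'did:example:holder',
        'alumniOf': 'Example University',
    },
    'proof': {                                   # validates the credential
        'type': 'Ed25519Signature2018',
        'verificationMethod': 'did:example:university#key-1',
        'jws': '...',                            # signature value elided
    },
}

def claims_of(vc):
    # Everything outside credentialSubject is metadata or proof.
    return {k: v for k, v in vc['credentialSubject'].items() if k != 'id'}
```

The claims, metadata, and proof mentioned in the text map directly onto these three parts of the document.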
This credential is presented to the verifier by the holder from a digital wallet and the verifier can validate the credential using the issuer DID which can be resolved from the blockchain network. Networks The networks are different distributed blockchains with verifiable data registries using DIDs. You need to know how to resolve each DID, issuer DID to verify or use a credential and so you need to know where to find the network on which the DID is persisted. The networks are really just persisted distributed databases. Sovrin or other blockchains can be used as a network. The blockchain holds public key DIDs, DID documents, ie credentials and schemas. Energy consumption This is something I would like to evaluate, and if this technology was to become widespread, how much energy would this cost. I have no answers to this at the moment. YouTube videos, channels An introduction to decentralized identities | Azure Friday An introduction to Self-Sovereign Identity Intro to SSI for Developers: Architecting Software Using Verifiable Credentials Decentralized identity explained Books, Blogs, articles, info Self-Sovereign Identity: The Ultimate Beginners Guide! 
Decentralized Identity Foundation SELF-SOVEREIGN IDENTITY PDF by Marcos Allende Lopez https://en.wikipedia.org/wiki/Self-sovereign_identity https://decentralized-id.com/ https://github.com/Animo/awesome-self-sovereign-identity Organisations https://identity.foundation/ https://github.com/decentralized-identity People Drummond Reed @drummondreed Rieks Joosten Oskar van Deventer Alex Preukschat @AlexPreukschat Danny Strockis @dStrockis Tomislav Markovski @tmarkovski Riley Hughes @rileyphughes Michael Boyd @michael_boyd_ Marcos Allende Lopez @MarcosAllendeL Adrian Doerk @doerkadrian Mathieu Glaude @mathieu_glaude Markus Sabadello @peacekeeper Ankur Patel @_AnkurPatel Daniel Ƀrrr @csuwildcat Matthijs Hoekstra @mahoekst Kaliya-Identity Woman @IdentityWoman Products https://docs.trinsic.id/docs https://docs.microsoft.com/en-us/azure/active-directory/verifiable-credentials/ Companies Specs https://w3c.github.io/did-core/ https://w3c.github.io/vc-data-model/ https://www.w3.org/TR/vc-data-model/ Links https://github.com/swiss-ssi-group https://www.hyperledger.org/use/aries https://github.com/Evernym what-is-self-sovereign-identity https://techcommunity.microsoft.com/t5/identity-standards-blog/ion-we-have-liftoff/ba-p/1441555",https://damienbod.com/2021/03/29/getting-started-with-self-sovereign-identity-ssi/,,Post,,Explainer,,,,,,,"Verifiable Credentials,DID",2021-03-29,,,,,,,,,,,,,
|
||
Microsoft,Microsoft,,ID Standards Blog,Pamela Dingle,,,,,,The Direct Presentation model,"A credential is issued, and then held for a long period of time with intermittent voluntary presentations to many different verifiers.","Take a look in your physical wallet. Maybe you have some credit cards and probably identification of some kind. Do you have a roadside assistance card, perchance? The cards we carry and present every single day feel different than many common digital credentials (such as federated credentials) that identify us and communicate our attributes online, but that gap is narrowing. As the identity ecosystem looks at emerging paradigms for trust and individual agency in our online interactions, we see use cases where the federated identity model we know so well in the real world might work to provide more digital autonomy and control. We call this model the “presentation model” because an end user collects and controls the presentation of their credential when and where they choose. The presentation model does not replace online federated models, but instead, each model can be used where most valuable. Part one of our series introduced our 5 guiding principles for decentralized identities. In this and subsequent blogs, we will continue to dive deeper into the technical basics of decentralized identity. Read on for part two of this five-part series on decentralized identity. Part I: The Five Guiding Principles Part II: The Direct Presentation model <-- You are here Part III: The Basics of Decentralized Identity Part IV: Deep Dive: Verifiable Credentials Part V: Deep Dive: Anchored Decentralized Identifiers When we use identity-related credentials in the real world, they often have different properties than digital credentials. There are two separate ceremonies in most cases: issuance and presentation. 
If you have spent any time at a department of motor vehicles or a passport office, you’ve seen what an issuance ceremony can be – it is usually an intensive process, but the tradeoff is that once you have a credential like a driver’s license or passport, you can use that credential for multiple purposes, at multiple places, over a long period of time. You, the “holder” of the credential, can choose to present your credential as a (mostly) voluntary exchange, in which the verifier of your credential can grant you access to some kind of resource - for example, they might allow you into a nightclub, or allow you to check out a library book. Let’s use a concrete real-world scenario: roadside assistance. Many of us are members of automotive clubs which advertise national and international coverage in the case of vehicle mishaps. After a sign-up song and dance and giving them my annual fee, they issue me a card I stuff into my wallet and forget about. Until one fateful night when my car breaks down in a location I am unfamiliar with. In this case, a miraculous thing happens – I can pull my auto club card from my wallet, and present that card to a garage that has never heard of me before, and a tow truck will arrive to help me. This issue-then-present credentialling model that has long powered the physical world is ready for adoption in the digital world too. The technical name for the model at work here is called the Direct Presentation model. A credential is issued, and then held for a long period of time with intermittent voluntary presentations to many different verifiers. To put that back into our example, my auto club issues me a card, which I hold in my wallet for years (maybe decades!), and I might choose to show that card to a garage when I need a tow, or possibly to a store to get a discount. Now let’s switch gears (see what I did there?) to look at the digital world as you use it today. 
Rather than giving me a card I can store in my wallet, websites (and other services) create a user account. My resource access is tied to this account, and I must prove I own the account by authenticating to the website. After I have authenticated, there is a limited amount of time where the website “knows” who I am and can vouch on my behalf to applications and to other domains, introducing me to resources while sharing data about me. This process of an authoritative Identity Provider who redirects me to resources and introduces me as a valid account within their domain with certain attributes and entitlements is called the Federated Presentation model. Going back to my night on the highway, instead of presenting my road service card to the garage, I’d have to borrow their point of sale (POS) terminal to authenticate to the auto club website, then be redirected to the garage website - with my card information included in the redirect. Both of these models got me my tow truck. They both have important qualities, and there are a number of use cases where either could do the job. But each has a few use cases where they can really shine. In the case of federated presentation, enterprise scenarios such as single sign-on (SSO) provide a compelling combination of convenience and security. Likewise, the digital version of the direct presentation model has great applications - where users benefit from increased agency in their dealings with businesses, governments, and even each other. We’ll cover the key qualities of the direct presentation model in the rest of this blog, and then in our next blog, we can talk through what high level architectural requirements we foresee. Remember, my car has broken down, I’m in a remote and unfamiliar place, and I need to leverage the roadside assistance that I have purchased in advance to get service from a garage that I have never previously done business with. 
Common qualities of the Direct Presentation model (Real-world and digital) My choice of when and what gets presented As I sit in my broken-down car, I have options. I might have roadside assistance coverage from multiple providers. I might even have a coupon for a free tow I picked up from the roadside diner or prefer to use a credit card. If I do, I can debate which offers the best deal, and present what works for me. This sounds like a simple thing, but it is tougher to do in the federated model because I can’t look through my wallet to remember which clubs I’m a member of or what coupons I’ve squirreled away. Instead, the garage would have to try to help me remember which auto clubs I might be affiliated with by showing me logos until I see one I recognize. Everything goes through me When I interact with the garage, I’m the person presenting my credential and I’m the person getting a receipt back. No real-time interaction is needed with the auto club, which is lucky because it’s the middle of the night. But I am aware of the interactions in real time, and if I want to, I can compare my receipts from all the breakdowns my car has had, see trends in what kinds of transactions I’ve been a part of, and even find places where I’ve been overcharged or where the promises of the transaction haven’t materialized. This is because I was an agent in the direct presentation. In a federated model, the logs of what data was passed about me are likely available but I would have to collect those logs from many different places – if they’re made available to me - to see the full picture. Present it anywhere Imagine the case where every garage had to have a separate contract with every regional auto club whose members might want to use their services. The overhead of such a system would be high. Instead, the garages and regional clubs all agree to abide by a common association’s trust framework that establishes the rules and practices that everyone must follow. 
In the direct presentation model, standards help establish the rules which allow easy establishment of associations, and there’s a potential for direct presentation to make it easier to set up common trust frameworks. This is what allows me to present my membership card from Alberta on a dark highway in North Dakota – where both I and the driver can be confident in the transaction. If everyone uses a common presentation model, it opens the doors to make it easier for trust frameworks to be formed and governed. Unique properties of the Digital Direct Presentation model There are few things that couldn’t work for me as I sit in my broken-down car with a physical auto club card, but that could work if I were using a digital credential. Selective disclosure When I hand my auto club card to the tow truck operator on the dark highway, I cannot prevent that person from reading everything printed on that card. If the card were to show my home address, the truck driver would know where I lived. But with a digital credential, I can choose what to release (and what not to) during presentation – in this example, I can simply release the fact that I qualify for a tow, and nothing else. My wallet has a brain Another unique quality of a digital direct presentation is that because software can help, my wallet has a brain and can be a trusted advisor. As I sit in my car, shivering, I might make a bad choice about which garage to call. A digital wallet might cross-reference my choice in garages with the Better Business Bureau and suggest I choose a business with a rating higher than one star, saving me from a bad experience. But what IS the digital version of Direct Presentation? There are several important initiatives already underway that qualify as direct presentation – including mobile Driver’s License projects based on the ISO 18013 specification, as well as various ePassport initiatives. 
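The selective-disclosure idea described above can be sketched as follows; this is a deliberate simplification (the credential and claim names are invented), since real schemes such as BBS+ signatures keep the disclosed subset cryptographically provable rather than just filtered:

```python
# Sketch of selective disclosure: the wallet releases only the claims a
# verifier asks for (e.g. tow coverage), keeping the rest (like a home
# address) private. All credential contents here are invented examples.
wallet_credential = {
    'memberId': 'AC-12345',
    'towCoverage': True,
    'homeAddress': '221B Example St',
}

def present(credential, requested):
    # Release only the requested claims that the credential actually holds.
    return {k: credential[k] for k in requested if k in credential}
```

So a presentation built with present(wallet_credential, ['towCoverage']) tells the garage that a tow is covered without revealing where the holder lives.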
Vaccine passports are another case where the direct presentation model is emerging, with initiatives such as VCI and Good Health Pass (GHP) gaining traction. In the identity world, we at Microsoft see the W3C Verifiable Credentials specification as a data format with long-term potential to let the industry move towards multi-purpose direct presentation. We hope to see some of the models converge over time as they mature, and we’ll deep dive on each of these options over time. Federated Presentation and Direct Presentation are complementary You might be wondering: do we see the direct presentation model replacing federated models? No. We believe that federated and direct presentation models will co-exist, and that each model lends itself to solving different problems. Federated presentation is a modern workhorse when it comes to use cases like enterprise single sign-on, and there will continue to be many places where involving the user in a presentation ceremony will never make sense. The good news is that we have specifications that can help federated and direct models to co-exist, and we are working in the OpenID Foundation, the Decentralized Identity Foundation, and other standards bodies to find an interoperable balance. If you want a sneak peek, check out Kristina Yasuda and Torsten Lodderstedt at the recent OAuth Security Workshop, talking about some of their work that leverages the Self-Issued Identity Provider section of the OpenID Connect specification. This work truly bridges the worlds of Direct and Federated Presentation models. But wait, what does this have to do with Decentralization? If you’ve noticed that we’re not talking a lot about decentralized tech here, don’t be shocked. Microsoft sees unique and differentiated value in how decentralized technology can fulfill many of the criteria listed above, and we’re excited to dig deeper into what decentralized technology can deliver in later posts! But our goals center on both human and business outcomes. 
As such, we can and will use decentralized technology where it’s needed and useful, and we’ll do our best to discuss with the community the trade-offs and advantages in the process. What’s next? Now that we have a foundational understanding of the Direct Presentation model and its place in the digital world, our next installment will discuss how we build a reference architecture that could deliver on our principles using the Direct Presentation model. We’ll start with a high-level architecture and slowly drill into the various parts of that architecture, describing the standards at play and how we can see those standards pushing us closer to the 5 guiding principles. There are quite a lot of standards – we’ve constructed a handy linktree to let you easily index into the various specifications we are participating in and/or implementing. This is going to be fun!",https://techcommunity.microsoft.com/t5/identity-standards-blog/decentralized-identity-the-direct-presentation-model/ba-p/3071981,,Post,,Explainer,,,,,,,,2022-02-02,,,,,,,,,,,,,
|
||
Microsoft,Microsoft,,,Alex Simons,,,,,,The Five Guiding Principles for Decentralized Identities,"Our goal in sharing these principles and our commitments is to help our customers, partners, and the decentralized identity community understand what motivates and guides us and how we think about this exciting opportunity.","Three years ago, as part of Microsoft’s mission to empower people and organizations to achieve more, we announced that we were incubating a new set of decentralized identity technologies based on a simple vision: Each of us needs a digital identity we own, one which securely and privately stores all elements of our digital identity. This self-owned identity must be easy to use and give us complete control over how our identity data is accessed and used. During this incubation, customers and partners all around the world have helped us understand their challenges and the shortcomings of their existing identity systems. We’ve learned a ton through a set of successful proof of concepts partnering with Keio University,1 The National Health Service (UK),2 and the Government of Flanders.3 We’ve worked with our partners in the Decentralized Identity Foundation (DIF) and the open standards community to develop standards and demonstrate interoperability. Using these new open standards and all these learnings to guide us, we turned on the public preview of our new decentralized identity system—Microsoft Azure Active Directory Verifiable Credentials—in April 2021. That preview generated a ton of valuable feedback and gave us the opportunity to learn from all of you. Through all these interactions and investments, we have become even more excited about the opportunity to create a decentralized identity system that increases customer trust and adoption by minimizing data processing and providing the user much greater control of the specific identity data they share and how it will be used. 
Now we are well into the next phase of our plan, working on two parallel efforts: - Partner with the decentralized identity community to finalize a set of high-quality open standards that we can all support. - Deliver the first General Availability release of our decentralized identity service in parallel with these still-evolving standards. The 5 guiding principles In this new phase, we want to share the set of guiding principles that we will use to guide both efforts. Not all these principles will be realizable from the start, but we believe that all are necessary over time to realize the promise of decentralized identities: 1. Secure, reliable, and trustworthy - My digital identity must be secure. It must not be easy to forge or hack. No one must be able to use it to impersonate me. - I must always have a way to access, use, and securely recover my digital identity. - I must have access to a detailed log of all the times I’ve used my digital identity, who I used it with, and what it was used for. 2. Privacy protecting and in my control - My digital identity is under my control. It must only be used with my consent and when I consent; I must know who will use it and how it will be used. - I must be able to review which elements of my digital identity are being requested and I must have the option to only disclose the specific information necessary to support the consented use. - My use of my digital identity must be private. No one, other than the party I explicitly share it with, should know I am using it without my consent. - My digital identity must not be able to be used to track me across unrelated services or applications without my consent. - I must have the freedom to switch between the devices and applications of my choosing to manage my digital identity, and never be locked in. - I must be able to delete all aspects of my digital identity and any associated data and log files from wherever I choose to store them. 3. 
Inclusive, fair, and easy to use - My digital identity must be usable, available, and accessible regardless of my race, ethnicity, abilities, gender, gender identity, sexual orientation, national origin, socio-economic status, or political status. - My digital identity must be easy to use and use universal design principles to make it useful for people with a wide variety of abilities. 4. Supervisable - I must be able to designate trusted friends or family members who can access my digital identity as needed if I become incapacitated or pass away. - If I am a child, my digital identity must support appropriate parental or custodial oversight and control. 5. Environmentally responsible - Creating and using my digital identity must be environmentally sustainable and not cause long-term environmental harm. Microsoft’s commitments to the new digital identity system In building and running this new system, we are also making an additional set of commitments we believe are critically important: - Legitimate and lawful: This new digital identity system must be legitimate and lawful. We will strive to assure it doesn’t encourage illegal activity, enable corruption, or expose people to undue risk or unlawful access. We will strive to ensure the technology doesn’t cause or exacerbate unjust or disparate impacts on systemically marginalized members of society. - Interoperable and accessible: We will strive to ensure technical and policy interoperability among domestic and international stakeholders, ease of use, broad inclusion, and equity of access. We will work to ensure the system works across modalities, including using it online, in person, and over the phone. We will build the system based on open, non-proprietary, and accessible standards to assure broad interoperability. - Safe: We will strive to place user safety and security at the center of our decentralized identity system design. 
Looking forward Our goal in sharing these principles and our commitments is to help our customers, partners, and the decentralized identity community understand what motivates and guides us and how we think about this exciting opportunity. Visit Microsoft decentralized identity to learn more about the benefits and opportunities of a decentralized identity ecosystem based on open standards. And we hope you’ll read the next blog in our five-part series on decentralized identity, where Pamela Dingle demystifies the basics of direct presentation, decentralized identity, verifiable credentials, and anchored decentralized identifiers. It’s quite entertaining, as well. To learn more about Microsoft Security solutions, visit our website. Bookmark the Security blog to keep up with our expert coverage on security matters. Also, follow us at @MicrosoftSecurity for the latest news and updates on cybersecurity. 1. University to enable students to securely manage their own transcripts with Verifiable Credentials, Customer Stories, Microsoft. 16 March 2021. 2. With high levels of security and trust, the NHS rapidly meets clinical demands using verified credentials, Customer Stories, Microsoft. 15 March 2021. 3. How a decentralized identity and verifiable credentials can streamline both public and private processes, Customer Stories, Microsoft. 17 March 2021.",https://www.microsoft.com/en-us/security/blog/2021/10/06/microsofts-5-guiding-principles-for-decentralized-identities/,https://www.microsoft.com/en-us/security/blog/wp-content/uploads/2021/10/guiding-principles-of-decentralized-identities.jpg,Post,,Explainer,,,,,,,,2021-10-06,,,,,,,,,,,,,
|
||
Microsoft,Microsoft,,ID Standards Blog,Tim Cappalli,FIDO,,,,,"What's New in Passwordless Standards, 2021 edition!","The Web Authentication API (WebAuthN) Level 2 specification is currently a Candidate Recommendation at the W3C. ""Level 2"" essentially means major version number 2. The version 2.1 of the Client to Authenticator Protocol (CTAP) specification is a Release Draft at the FIDO Alliance. ","Hi everyone and welcome to chapter 14 of 2020! It’s been a little while since we talked about standards for passwordless so we’re excited to tell you about some new enhancements and features in FIDO2 land that you'll start seeing in the wild in the next few months! Specification Status The Web Authentication API (WebAuthN) Level 2 specification is currently a Candidate Recommendation at the W3C. ""Level 2"" essentially means major version number 2. The version 2.1 of the Client to Authenticator Protocol (CTAP) specification is a Release Draft at the FIDO Alliance. This means the spec is in a public review period before final publication. These new draft versions are on their way to becoming the next wave of FIDO functionality (as of the writing of this blog, we support Level 1 of WebAuthN and CTAP version 2.0). We think you might want to hear about what we think is especially fun about WebAuthN L2 and CTAP 2.1. Enterprise Attestation (EA) Enterprise Attestation is a new feature coming as part of WebAuthN L2 and CTAP 2.1 that enables binding of an authenticator to an account using a persistent identifier, similar to a smart card today. FIDO privacy standards require that ""a FIDO device does not have a global identifier within a particular website"" and ""a FIDO device must not have a global identifier visible across websites"". EA is designed to be used exclusively in enterprise-like environments where a trust relationship exists between devices and/or browsers and the relying party via management and/or policy. 
If EA is requested by a Relying Party (RP) and the OS/browser is operating outside an enterprise context (Personal browser profile, unmanaged device, etc), the browser is expected to prompt the user for consent and provide a clear warning about the potential for tracking via the persistent identifier being shared. Authenticators can be configured to support Vendor-facilitated and/or Platform-managed Enterprise Attestation. Vendor-facilitated EA involves an authenticator vendor hardcoding a list of Relying Party IDs (RP IDs) into the authenticator firmware as part of manufacturing. This list is immutable (aka non-updateable). An enterprise attestation is only provided to RPs in that list. Platform-managed EA involves an RP ID list delivered via enterprise policy (ex: managed browser policy, mobile application management (MAM), mobile device management (MDM)) and is enforced by the platform. Spec reference: CTAP 2.1 - Section 7.1: Enterprise Attestation WebAuthN L2 - Section 5.4.7: Attestation Conveyance Preference Authenticator Credential Management and Bio Enrollment Credential Management is part of CTAP 2.1 and allows management of discoverable credentials (aka resident keys) on an authenticator. Management can occur via a browser, an OS settings panel, an app or a CLI tool. Here's an example of how the Credential Management capability is baked into Chrome 88 on macOS (chrome://settings/securityKeys). Here I can manage my PIN, view discoverable credentials, add and remove fingerprints (assuming the authenticator has a fingerprint reader!) and factory reset my authenticator. Clicking on ""Sign-in data"" shows the discoverable credentials on the authenticator and allows me to remove them. This security key has an Azure AD account and an identity for use with SSH. Bio Enrollment allows the browser, client, or OS to aid in configuring biometrics on authenticators that support them. This security key has one finger enrolled. 
I can either remove the existing finger or add more. Here's an example of authenticator credential management via a CLI tool, ykman from Yubico. Spec references: CTAP 2.1 - Section 5.8: Credential Management CTAP 2.1 - Section 5.7: Bio Enrollment Set Minimum PIN Length and Force Change PIN CTAP 2.1 allows an enterprise to require a minimum PIN length on the authenticator. If the existing PIN does not meet the requirements, a change PIN flow can be initiated. An authenticator can also be configured with a one-time use PIN that must be changed on first use. This is an additional layer of protection when an authenticator is pre-provisioned by an administrator and then needs to be sent to an end user. The temporary PIN can be communicated to the end user out of band. We see this being used in conjunction with Enterprise Attestation to create a strong relationship between an authenticator and a user. Spec reference: CTAP 2.1 - Section 7.4: Set Minimum PIN Length Always Require User Verification (AlwaysUV) AlwaysUV is part of CTAP 2.1 and allows the user to configure their authenticator to always prompt for user verification (PIN, biometric, etc), even when the Relying Party does not ask for it. This adds an extra layer of protection by ensuring all credentials on the authenticator require the same verification method. Spec reference: CTAP 2.1 - Section 7.2: Always Require User Verification Virtual Authenticator DevTool This one is not tied to updates of either specification but we love it and wanted to share! Chrome and Edge (version 87+) now include a virtual authenticator as part of DevTools. It started as a Chromium extension back in 2019 and is now native! Oh, and the code is on GitHub! It is a great tool for testing, debugging and learning! Try it with one of the awesome WebAuthN test sites: Microsoft WebAuthN Sample App, WebAuthN.io, Yubico WebAuthN Demo. 
To access the tool, open Developer Tools ( F12 or Option + Command+ I ), click the Menu icon on the top right (…) then More tools and WebAuthN. Enabling the virtual authenticator environment will allow you to create a new authenticator by picking a protocol (CTAP2 or U2F), transport (USB, Bluetooth, NFC or internal), resident key (discoverable) and user verification support. As new credentials are created, you’ll see them listed and the sign count will increase as the credential is used. Want to know more? Here’s an amazing blog by Nina Satragno from the Chrome team over at Google who created this amazing DevTool! How we built the Chrome DevTools WebAuthN tab Wrap Up That rounds out the major features we believe will have the most impact. Here’s a few other enhancements and features that are important to mention! If you’d like to hear more about any of these enhancements/features (or anything else identity related, let's be honest), leave us a note in the comments. Thanks for reading! Tim Cappalli | Microsoft Identity | @timcappalli",https://techcommunity.microsoft.com/t5/identity-standards-blog/what-s-new-in-passwordless-standards-2021-edition/ba-p/2124136,,Post,,Explainer,,,,,,,"WebAuthN,CTAP",2021-02-11,,,,,,,,,,,,,
|
||
Microsoft,Microsoft,,,Melanie Maynes,,,,,,Why decentralization is the future of digital identities,"Turning credentials into digital form isn’t new, but decentralizing identity goes beyond that. It gives individuals the ability to verify their credentials once and use them anywhere as proof of attestation.",,https://www.microsoft.com/security/blog/2022/03/10/why-decentralization-is-the-future-of-digital-identities/,,Post,,Explainer,,,,,,,,2022-03-10,,,,,,,,,,,,,
|
||
Microsoft,Microsoft,,YouTube,,,,,,,Decentralized identity explained,What is decentralized identity? How does it give you more control over your digital identity and keep your information on the internet safer? This video explains in short what decentralized identity is and how it can replace usernames and passwords to verify you are who you say you are quickly and easily.,,https://www.youtube.com/watch?v=ew-_f-otdfi,,Video,,Explainer,,,,,,,,2020-08-14,,,,,,,,,,,,,
|
||
Microsoft,Personal,,,Damien Bowden,Mattr,,,,,Create an OIDC credential Issuer with Mattr and ASP.NET Core,"Whilst in Damien's blog post he showed how a verifiable credential can be issued to a so-called credential holder, this blog post will be about how we can verify such credentials as part of a business workflow. After an issuer has issued credentials to the holder and they have stored these into their wallet, a verifier can now ask a holder to verify themselves with a certain credential. A verifier can add policies to check for certain attributes but also add restrictions like a specific issuer DID. With this in place a verifier can create a verification request which will be sent to the credential holder. This step is very important because it is where a cryptographic challenge is generated that the holder must respond to. This guarantees that the holder is responding to exactly this specific verification request. After the verification request gets returned to the verifier, it needs to be verified against the ledger to make sure it is valid. The verification record does not only contain the attributes, but also some metadata such as the digital signature of the issuer of the credentials, revocation details, verification policies etc. which then get validated against their sources. The image below describes this trust-triangle between the issuer, holder and verifier.<br>","This article shows how to create and issue verifiable credentials using Mattr and ASP.NET Core. The ASP.NET Core application allows an admin user to create an OIDC credential issuer using the Mattr service. The credentials are displayed in an ASP.NET Core Razor Page web UI as a QR code for the users of the application. The user can use a digital wallet from Mattr to scan the QR code, authenticate against an Auth0 identity provider configured for this flow and use the claims from the id token to add the verified credential to the digital wallet. 
In a follow-up post, a second application will then use the verified credentials to allow access to a second business process. Code: https://GitHub.com/swiss-ssi-group/MattrGlobalAspNetCore Blogs in the series - Getting started with Self Sovereign Identity SSI - Creating Verifiable credentials in ASP.NET Core for decentralized identities using Trinsic - Verifying Verifiable Credentials in ASP.NET Core for Decentralized Identities using Trinsic - Create an OIDC credential Issuer with Mattr and ASP.NET Core - Present and Verify Verifiable Credentials in ASP.NET Core using Decentralized Identities and Mattr - Verify vaccination data using Zero Knowledge Proofs with ASP.NET Core and Mattr - Challenges to Self Sovereign Identity - Create and issue verifiable credentials in ASP.NET Core using Azure AD - Implement Compound Proof BBS+ verifiable credentials using ASP.NET Core and Mattr Setup The solution involves the Mattr API which handles all the blockchain identity logic. An ASP.NET Core application is used to create the digital identity and the OIDC credential issuer using the Mattr APIs and also present this as a QR code which can be scanned. An identity provider is required to add the credential properties to the id token. The properties in a verified credential are issued using the claims values from the id token, so a specific identity provider is required with every credential issuer using this technique. Part of the business of this solution is adding business claims to the identity provider. A Mattr digital wallet is required to scan the QR code, authenticate against the OIDC provider which in our case is Auth0 and then store the verified credentials to the wallet for later use. Mattr Setup You need to register with Mattr and create a new account. Mattr will issue you access to your sandbox domain and you will get access data from them plus a link to support. Once set up, use the OIDC Bridge tutorial to implement the flow used in this demo. 
The docs are really good but you need to follow the docs exactly. https://learn.Mattr.global/tutorials/issue/oidc-bridge/issue-oidc Auth0 Setup A standard trusted web application which supports the code flow is required so that the Mattr digital wallet can authenticate using the identity provider and use the id token values from the claims which are required in the credential. It is important to create a new application which is only used for this because the client secret is required when creating the OIDC credential issuer and is shared with the Mattr platform. It would probably be better to use certificates instead of a shared secret which is persisted in different databases. We also use a second Auth0 application configuration to sign into the web application but this is not required to issue credentials. In Auth0, rules are used to extend the id token claims. You need to add your claims as required by the Mattr API and your business logic for the credentials you wish to issue. function (user, context, callback) { const namespace = 'https://--your-tenant--.vii.Mattr.global/'; context.idToken[namespace + 'license_issued_at'] = user.user_metadata.license_issued_at; context.idToken[namespace + 'license_type'] = user.user_metadata.license_type; context.idToken[namespace + 'name'] = user.user_metadata.name; context.idToken[namespace + 'first_name'] = user.user_metadata.first_name; context.idToken[namespace + 'date_of_birth'] = user.user_metadata.date_of_birth; callback(null, user, context); } For every user (holder) who should be able to create verifiable credentials, you must add the credential data to the user profile. This is part of the business process with this flow. If you were to implement this for a real application with lots of users, it would probably be better to integrate the identity provider into the solution issuing the credentials and add a UI for editing the user profile data which is used in the credentials. 
This would be really easy using ASP.NET Core Identity and for example OpenIddict or IdentityServer4. It is important that the user cannot edit this data. This logic is part of the credential issuer logic and not part of the user profile. After creating a new Mattr OIDC credential issuer, the callback URL needs to be added to the Open ID connect code flow client used for the digital wallet sign in. Add the URL to the Allowed Callback URLs in the settings of your Auth0 application configuration for the digital wallet. Implementing the OpenID Connect credentials Issuer application The ASP.NET Core application is used to create new OIDC credential issuers and also display the QR code for these so that the verifiable credential can be loaded to the digital wallet. The application requires secrets. The data is stored to a database, so that any credential can be added to a wallet at a later date and also so that you can find the credentials you created. The MattrConfiguration is the data and the secrets you received from Mattr for you account access to the API. The Auth0 configuration is the data required to sign in to the application. The Auth0Wallet configuration is the data required to create the OIDC credential issuer so that the digital wallet can authenticate the identity using the Auth0 application. This data is stored in the user secrets during development. 
{ // use user secrets ""ConnectionStrings"": { ""DefaultConnection"": ""--your-connection-string--"" }, ""MattrConfiguration"": { ""Audience"": ""https://vii.Mattr.global"", ""ClientId"": ""--your-client-id--"", ""ClientSecret"": ""--your-client-secret--"", ""TenantId"": ""--your-tenant--"", ""TenantSubdomain"": ""--your-tenant-sub-domain--"", ""Url"": ""http://Mattr-prod.au.Auth0.com/oauth/token"" }, ""Auth0"": { ""Domain"": ""--your-Auth0-domain"", ""ClientId"": ""--your--Auth0-client-id--"", ""ClientSecret"": ""--your-Auth0-client-secret--"" }, ""Auth0Wallet"": { ""Domain"": ""--your-Auth0-wallet-domain"", ""ClientId"": ""--your--Auth0-wallet-client-id--"", ""ClientSecret"": ""--your-Auth0-wallet-client-secret--"" } } Accessing the Mattr APIs The MattrConfiguration DTO is used to fetch the Mattr account data for the API access and to use in the application. public class MattrConfiguration { public string Audience { get; set; } public string ClientId { get; set; } public string ClientSecret { get; set; } public string TenantId { get; set; } public string TenantSubdomain { get; set; } public string Url { get; set; } } The MattrTokenApiService is used to acquire an access token and is used for the Mattr API access. The token is stored to a cache and only fetched if the old one has expired or is not available. 
public class MattrTokenApiService { private readonly ILogger<MattrTokenApiService> _logger; private readonly MattrConfiguration _MattrConfiguration; private static readonly Object _lock = new Object(); private IDistributedCache _cache; private const int cacheExpirationInDays = 1; private class AccessTokenResult { public string AcessToken { get; set; } = string.Empty; public DateTime ExpiresIn { get; set; } } private class AccessTokenItem { public string access_token { get; set; } = string.Empty; public int expires_in { get; set; } public string token_type { get; set; } public string scope { get; set; } } private class MattrCrendentials { public string audience { get; set; } public string client_id { get; set; } public string client_secret { get; set; } public string grant_type { get; set; } = ""client_credentials""; } public MattrTokenApiService( IOptions<MattrConfiguration> MattrConfiguration, IHttpClientFactory httpClientFactory, ILoggerFactory loggerFactory, IDistributedCache cache) { _MattrConfiguration = MattrConfiguration.Value; _logger = loggerFactory.CreateLogger<MattrTokenApiService>(); _cache = cache; } public async Task<string> GetApiToken(HttpClient client, string api_name) { var accessToken = GetFromCache(api_name); if (accessToken != null) { if (accessToken.ExpiresIn > DateTime.UtcNow) { return accessToken.AcessToken; } else { // remove => NOT Needed for this cache type } } _logger.LogDebug($""GetApiToken new from oauth server for {api_name}""); // add var newAccessToken = await GetApiTokenClient(client); AddToCache(api_name, newAccessToken); return newAccessToken.AcessToken; } private async Task<AccessTokenResult> GetApiTokenClient(HttpClient client) { try { var payload = new MattrCrendentials { client_id = _MattrConfiguration.ClientId, client_secret = _MattrConfiguration.ClientSecret, audience = _MattrConfiguration.Audience }; var authUrl = ""https://auth.Mattr.global/oauth/token""; var tokenResponse = await client.PostAsJsonAsync(authUrl, payload); 
if (tokenResponse.StatusCode == System.Net.HttpStatusCode.OK) { var result = await tokenResponse.Content.ReadFromJsonAsync<AccessTokenItem>(); DateTime expirationTime = DateTimeOffset.FromUnixTimeSeconds(result.expires_in).DateTime; return new AccessTokenResult { AcessToken = result.access_token, ExpiresIn = expirationTime }; } _logger.LogError($""tokenResponse.IsError Status code: {tokenResponse.StatusCode}, Error: {tokenResponse.ReasonPhrase}""); throw new ApplicationException($""Status code: {tokenResponse.StatusCode}, Error: {tokenResponse.ReasonPhrase}""); } catch (Exception e) { _logger.LogError($""Exception {e}""); throw new ApplicationException($""Exception {e}""); } } private void AddToCache(string key, AccessTokenResult accessTokenItem) { var options = new DistributedCacheEntryOptions().SetSlidingExpiration(TimeSpan.FromDays(cacheExpirationInDays)); lock (_lock) { _cache.SetString(key, JsonConvert.SerializeObject(accessTokenItem), options); } } private AccessTokenResult GetFromCache(string key) { var item = _cache.GetString(key); if (item != null) { return JsonConvert.DeserializeObject<AccessTokenResult>(item); } return null; } } Generating the API DTOs using Nswag The MattrOpenApiClientSevice file was generated using Nswag and the Open API file provided by Mattr here. We only generated the DTOs using this and access the client then using a HttpClient instance. The Open API file used in this solution is deployed in the git repo. Creating the OIDC credential issuer The MattrCredentialsService is used to create an OIDC credentials issuer using the Mattr APIs. This is implemented using the CreateCredentialsAndCallback method. The created callback is returned so that it can be displayed in the UI and copied to the specific Auth0 application configuration. 
private readonly IConfiguration _configuration; private readonly DriverLicenseCredentialsService _driverLicenseService; private readonly IHttpClientFactory _clientFactory; private readonly MattrTokenApiService _MattrTokenApiService; private readonly MattrConfiguration _MattrConfiguration; public MattrCredentialsService(IConfiguration configuration, DriverLicenseCredentialsService driverLicenseService, IHttpClientFactory clientFactory, IOptions<MattrConfiguration> MattrConfiguration, MattrTokenApiService MattrTokenApiService) { _configuration = configuration; _driverLicenseService = driverLicenseService; _clientFactory = clientFactory; _MattrTokenApiService = MattrTokenApiService; _MattrConfiguration = MattrConfiguration.Value; } public async Task<string> CreateCredentialsAndCallback(string name) { // create a new one var driverLicenseCredentials = await CreateMattrDidAndCredentialIssuer(); driverLicenseCredentials.Name = name; await _driverLicenseService.CreateDriverLicense(driverLicenseCredentials); var callback = $""https://{_MattrConfiguration.TenantSubdomain}/ext/oidc/v1/issuers/{driverLicenseCredentials.OidcIssuerId}/federated/callback""; return callback; } The CreateMattrDidAndCredentialIssuer method implements the different steps described in the Mattr API documentation for this. An access token for the Mattr API is created or retrieved from the cache and DID is created and the id from the DID post response is used to create the OIDC credential issuer. The DriverLicenseCredentials is returned which is persisted to a database and the callback is created using this object. 
private async Task<DriverLicenseCredentials> CreateMattrDidAndCredentialIssuer() { HttpClient client = _clientFactory.CreateClient(); var accessToken = await _MattrTokenApiService .GetApiToken(client, ""MattrAccessToken""); client.DefaultRequestHeaders.Authorization = new AuthenticationHeaderValue(""Bearer"", accessToken); client.DefaultRequestHeaders .TryAddWithoutValidation(""Content-Type"", ""application/json""); var did = await CreateMattrDid(client); var oidcIssuer = await CreateMattrCredentialIssuer(client, did); return new DriverLicenseCredentials { Name = ""not_named"", Did = JsonConvert.SerializeObject(did), OidcIssuer = JsonConvert.SerializeObject(oidcIssuer), OidcIssuerId = oidcIssuer.Id }; } The CreateMattrDid method creates a new DID as specified by the API. The MattrOptions class is used to create the request object. This is serialized using the StringContentWithoutCharset class due to a bug in the Mattr API validation. I created this class using the blog from Gunnar Peipman. 
private async Task<V1_CreateDidResponse> CreateMattrDid(HttpClient client) { // create did , post to dids // https://learn.Mattr.global/api-ref/#operation/createDid // https://learn.Mattr.global/tutorials/dids/use-did/ var createDidUrl = $""https://{_MattrConfiguration.TenantSubdomain}/core/v1/dids""; var payload = new MattrOpenApiClient.V1_CreateDidDocument { Method = MattrOpenApiClient.V1_CreateDidDocumentMethod.Key, Options = new MattrOptions() }; var payloadJson = JsonConvert.SerializeObject(payload); var uri = new Uri(createDidUrl); using (var content = new StringContentWithoutCharset(payloadJson, ""application/json"")) { var createDidResponse = await client.PostAsync(uri, content); if (createDidResponse.StatusCode == System.Net.HttpStatusCode.Created) { var v1CreateDidResponse = JsonConvert.DeserializeObject<V1_CreateDidResponse>( await createDidResponse.Content.ReadAsStringAsync()); return v1CreateDidResponse; } var error = await createDidResponse.Content.ReadAsStringAsync(); } return null; } The MattrOptions DTO is used to create a default DID using the key type “ed25519”. See the Mattr API docs for further details. public class MattrOptions { /// <summary> /// The supported key types for the DIDs are ed25519 and bls12381g2. /// If the keyType is omitted, the default key type that will be used is ed25519. /// /// If the keyType in options is set to bls12381g2 a DID will be created with /// a BLS key type which supports BBS+ signatures for issuing ZKP-enabled credentials. /// </summary> public string keyType { get; set; } = ""ed25519""; } The CreateMattrCredentialIssuer implements the OIDC credential issuer to create the post request. The request properties need to be set up for your credential properties and must match claims from the id token of the Auth0 user profile. This is where the OIDC client for the digital wallet is set up and also where the credential claims are specified. If this is set up incorrectly, loading the data into your wallet will fail. 
The HTTP request and the response DTOs are implemented using the Nswag generated classes. private async Task<V1_CreateOidcIssuerResponse> CreateMattrCredentialIssuer(HttpClient client, V1_CreateDidResponse did) { // create VC, post to credentials api // https://learn.Mattr.global/tutorials/issue/oidc-bridge/setup-issuer var createCredentialsUrl = $""https://{_MattrConfiguration.TenantSubdomain}/ext/oidc/v1/issuers""; var payload = new MattrOpenApiClient.V1_CreateOidcIssuerRequest { Credential = new Credential { IssuerDid = did.Did, Name = ""NationalDrivingLicense"", Context = new List<Uri> { new Uri( ""https://schema.org"") // Only this is supported }, Type = new List<string> { ""nationaldrivinglicense"" } }, ClaimMappings = new List<ClaimMappings> { new ClaimMappings{ JsonLdTerm=""name"", OidcClaim=$""https://{_MattrConfiguration.TenantSubdomain}/name""}, new ClaimMappings{ JsonLdTerm=""firstName"", OidcClaim=$""https://{_MattrConfiguration.TenantSubdomain}/first_name""}, new ClaimMappings{ JsonLdTerm=""licenseType"", OidcClaim=$""https://{_MattrConfiguration.TenantSubdomain}/license_type""}, new ClaimMappings{ JsonLdTerm=""dateOfBirth"", OidcClaim=$""https://{_MattrConfiguration.TenantSubdomain}/date_of_birth""}, new ClaimMappings{ JsonLdTerm=""licenseIssuedAt"", OidcClaim=$""https://{_MattrConfiguration.TenantSubdomain}/license_issued_at""} }, FederatedProvider = new FederatedProvider { ClientId = _configuration[""Auth0Wallet:ClientId""], ClientSecret = _configuration[""Auth0Wallet:ClientSecret""], Url = new Uri($""https://{_configuration[""Auth0Wallet:Domain""]}""), Scope = new List<string> { ""openid"", ""profile"", ""email"" } } }; var payloadJson = JsonConvert.SerializeObject(payload); var uri = new Uri(createCredentialsUrl); using (var content = new StringContentWithoutCharset(payloadJson, ""application/json"")) { var createOidcIssuerResponse = await client.PostAsync(uri, content); if (createOidcIssuerResponse.StatusCode == 
System.Net.HttpStatusCode.Created) { var v1CreateOidcIssuerResponse = JsonConvert.DeserializeObject<V1_CreateOidcIssuerResponse>( await createOidcIssuerResponse.Content.ReadAsStringAsync()); return v1CreateOidcIssuerResponse; } var error = await createOidcIssuerResponse.Content.ReadAsStringAsync(); } throw new Exception(""whoops something went wrong""); } Now the service is completely ready to generate credentials. This can be used in any Blazor UI, Razor page or MVC view in ASP.NET Core. The services are added to the DI in the startup class. The callback method is displayed in the UI if the application successfully creates a new OIDC credential issuer. private readonly MattrCredentialsService _MattrCredentialsService; public bool CreatingDriverLicense { get; set; } = true; public string Callback { get; set; } [BindProperty] public IssuerCredential IssuerCredential { get; set; } public AdminModel(MattrCredentialsService MattrCredentialsService) { _MattrCredentialsService = MattrCredentialsService; } public void OnGet() { IssuerCredential = new IssuerCredential(); } public async Task<IActionResult> OnPostAsync() { if (!ModelState.IsValid) { return Page(); } Callback = await _MattrCredentialsService .CreateCredentialsAndCallback(IssuerCredential.CredentialName); CreatingDriverLicense = false; return Page(); } } public class IssuerCredential { [Required] public string CredentialName { get; set; } } Adding credentials to your wallet After the callback method has been added to the Auth0 callback URLs, the credentials can be used to add verifiable credentials to your wallet. This is fairly simple. The Razor Page uses the data from the database and generates a URL using the Mattr specification and the id from the created OIDC credential issuer. The claims from the id token or the profile data are just used to display the data for the user signed into the web application. This is not the same data which is used by the digital wallet. 
If the same person logs into the digital wallet, then the data is the same. The wallet authenticates the identity separately. public class DriverLicenseCredentialsModel : PageModel { private readonly DriverLicenseCredentialsService _driverLicenseCredentialsService; private readonly MattrConfiguration _MattrConfiguration; public string DriverLicenseMessage { get; set; } = ""Loading credentials""; public bool HasDriverLicense { get; set; } = false; public DriverLicense DriverLicense { get; set; } public string CredentialOfferUrl { get; set; } public DriverLicenseCredentialsModel(DriverLicenseCredentialsService driverLicenseCredentialsService, IOptions<MattrConfiguration> MattrConfiguration) { _driverLicenseCredentialsService = driverLicenseCredentialsService; _MattrConfiguration = MattrConfiguration.Value; } public async Task OnGetAsync() { //""license_issued_at"": ""2021-03-02"", //""license_type"": ""B1"", //""name"": ""Bob"", //""first_name"": ""Lammy"", //""date_of_birth"": ""1953-07-21"" var identityHasDriverLicenseClaims = true; var nameClaim = User.Claims.FirstOrDefault(t => t.Type == $""https://{_MattrConfiguration.TenantSubdomain}/name""); var firstNameClaim = User.Claims.FirstOrDefault(t => t.Type == $""https://{_MattrConfiguration.TenantSubdomain}/first_name""); var licenseTypeClaim = User.Claims.FirstOrDefault(t => t.Type == $""https://{_MattrConfiguration.TenantSubdomain}/license_type""); var dateOfBirthClaim = User.Claims.FirstOrDefault(t => t.Type == $""https://{_MattrConfiguration.TenantSubdomain}/date_of_birth""); var licenseIssuedAtClaim = User.Claims.FirstOrDefault(t => t.Type == $""https://{_MattrConfiguration.TenantSubdomain}/license_issued_at""); if (nameClaim == null || firstNameClaim == null || licenseTypeClaim == null || dateOfBirthClaim == null || licenseIssuedAtClaim == null) { identityHasDriverLicenseClaims = false; } if (identityHasDriverLicenseClaims) { DriverLicense = new DriverLicense { Name = nameClaim.Value, FirstName = 
firstNameClaim.Value, LicenseType = licenseTypeClaim.Value, DateOfBirth = dateOfBirthClaim.Value, IssuedAt = licenseIssuedAtClaim.Value, UserName = User.Identity.Name }; // get per name //var offerUrl = await _driverLicenseCredentialsService.GetDriverLicenseCredentialIssuerUrl(""ndlseven""); // get the last one var offerUrl = await _driverLicenseCredentialsService.GetLastDriverLicenseCredentialIssuerUrl(); DriverLicenseMessage = ""Add your driver license credentials to your wallet""; CredentialOfferUrl = offerUrl; HasDriverLicense = true; } else { DriverLicenseMessage = ""You have no valid driver license""; } } } The data is displayed using Bootstrap. If you use a Mattr wallet to scan the QR Code shown underneath, you will be redirected to authenticate against the specified Auth0 application. If you have the claims, you can add verifiable claims to your digital wallet. Notes The Mattr API has some problems and stricter validation would help a lot. But Mattr support is awesome, the team is really helpful, and you will end up with a working solution. It would also be awesome if the Open API file could be used without changes to generate a client and the DTOs. It would make sense if you could issue credentials from the data in the credential issuer application and not from the id token of the user profile. I understand that in some use cases, you would like to protect against any wallet taking credentials for other identities, but as a credential issuer I cannot always add my business data to user profiles from the IDP. The security of this solution depends entirely on the user profile data. If an unauthorized person can change this data (in this case, this could be the same user), then incorrect verifiable credentials can be created. The next step is to create an application to verify and use the verifiable credentials created here. 
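The credential offer URL that the Razor page renders as a QR code above follows Mattr's OIDC bridge deep-link convention, where the wallet is pointed at the tenant's issuer endpoint. The sketch below is illustrative only: the tenant subdomain and issuer id are hypothetical placeholders, and the exact deep-link format should be confirmed against the Mattr documentation.

```python
from urllib.parse import quote

def build_credential_offer_url(tenant_subdomain: str, issuer_id: str) -> str:
    # The Mattr wallet resolves an 'openid://discovery' deep link that
    # points at the OIDC credential issuer endpoint on the tenant.
    issuer_url = f'https://{tenant_subdomain}/ext/oidc/v1/issuers/{issuer_id}'
    # Percent-encode the issuer URL so it survives as a query parameter.
    return f'openid://discovery?issuer={quote(issuer_url, safe="")}'

url = build_credential_offer_url('damienbod.vii.mattr.global', 'a1b2c3')
```

Rendering this string as a QR code is then a pure UI concern; the wallet does the rest.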
Links https://Mattr.global/get-started/ https://learn.Mattr.global/ https://learn.Mattr.global/tutorials/dids/did-key https://gunnarpeipman.com/httpclient-remove-charset/",https://damienbod.com/2021/05/03/create-an-oidc-credential-issuer-with-Mattr-and-asp-net-core/,,Post,,HowTo,,,,,,ASP.NET,,2021-05-03,,,,,,,,,,,,,
Microsoft,Personal,,,Damien Bowden,,,,,,Create and issue verifiable credentials in ASP.NET Core using Azure AD,"This article shows how Azure AD verifiable credentials can be issued and used in an ASP.NET Core application. An ASP.NET Core Razor page application is used to implement the credential issuer. To issue credentials, the application must manage the credential subject data as well as require authenticated users who would like to add verifiable credentials to their digital wallet. The Microsoft Authenticator mobile application is used as the digital wallet.","This article shows how Azure AD verifiable credentials can be issued and used in an ASP.NET Core application. An ASP.NET Core Razor page application is used to implement the credential issuer. To issue credentials, the application must manage the credential subject data as well as require authenticated users who would like to add verifiable credentials to their digital wallet. The Microsoft Authenticator mobile application is used as the digital wallet. Code: [https://GitHub.com/swiss-ssi-group/AzureADVerifiableCredentialsAspNetCore](https://GitHub.com/swiss-ssi-group/AzureADVerifiableCredentialsAspNetCore) Blogs in this series Setup Two ASP.NET Core applications are implemented to issue and verify the verifiable credentials. The credential issuer must administrate and authenticate its identities to issue verifiable credentials. A verifiable credential issuer should never issue credentials to unauthenticated subjects of the credential. As the verifier normally only authorizes the credential, it is important to know that the credentials were at least issued correctly. As a verifier, we do not know who or what sends the verifiable credentials, but at least we know that the credentials are valid if we trust the issuer. It is possible to use private holder binding for a holder of a wallet, which would increase the trust between the verifier and the issued credentials. 
The credential issuer in this demo issues credentials for driving licenses using Azure AD verifiable credentials. The ASP.NET Core application uses Microsoft.Identity.Web to authenticate all identities. In a real application, the application would be authenticated as well, requiring 2FA for all users. Azure AD supports this well. The administrators would also require admin rights, which could be implemented using Azure security groups or Azure roles which are added to the application as claims after the OIDC authentication flow. Any authenticated identity can request credentials (a driving license in this demo) for themselves and no one else. The administrators can create data which is used as the subject, but cannot issue credentials for others. Azure AD verifiable credential setup Azure AD verifiable credentials are set up using the Azure docs for the REST API and the Azure verifiable credential ASP.NET Core sample application. Following the documentation, a display file and a rules file were uploaded for the verifiable credentials created for this issuer. In this demo, two credential subjects are defined to hold the data when issuing or verifying the credentials. 
{ ""default"": { ""locale"": ""en-US"", ""card"": { ""title"": ""National Driving License VC"", ""issuedBy"": ""Damienbod"", ""backgroundColor"": ""#003333"", ""textColor"": ""#ffffff"", ""logo"": { ""uri"": ""https://raw.GitHubusercontent.com/swiss-ssi-group/TrinsicAspNetCore/main/src/NationalDrivingLicense/wwwroot/ndl_car_01.png"", ""description"": ""National Driving License Logo"" }, ""description"": ""Use your verified credential to prove to anyone that you can drive."" }, ""consent"": { ""title"": ""Do you want to get your Verified Credential?"", ""instructions"": ""Sign in with your account to get your card."" }, ""claims"": { ""VC.credentialSubject.name"": { ""type"": ""String"", ""label"": ""Name"" }, ""VC.credentialSubject.details"": { ""type"": ""String"", ""label"": ""Details"" } } } } The rules file defines the attestations for the credentials. Two standard claims are used to hold the data, the given_name and the family_name. These claims are mapped to our name and details subject claims and holds all the data. Adding custom claims to Azure AD or Azure B2C is not so easy and so I decided for the demo, it would be easier to use standard claims which works without custom configurations. The data sent from the issuer to the holder of the claims can be sent in the application. It should be possible to add credential subject properties without requiring standard AD id_token claims, but I was not able to set this up in the current preview version. { ""attestations"": { ""idTokens"": [ { ""id"": ""https://self-issued.me"", ""mapping"": { ""name"": { ""claim"": ""$.given_name"" }, ""details"": { ""claim"": ""$.family_name"" } }, ""configuration"": ""https://self-issued.me"", ""client_id"": """", ""redirect_uri"": """" } ] }, ""validityInterval"": 2592001, ""VC"": { ""type"": [ ""MyDrivingLicense"" ] } } The rest of the Azure AD credentials are setup exactly like the documentation. 
Administration of the Driving licenses The verifiable credential issuer application uses a Razor page application which uses Entity Framework Core to access a Microsoft SQL Azure database. The administrator of the credentials can assign driving licenses to any user. The DrivingLicenseDbContext class is used to define the DbSet for driver licenses. public class DrivingLicenseDbContext : DbContext { public DbSet<DriverLicense> DriverLicenses { get; set; } public DrivingLicenseDbContext(DbContextOptions<DrivingLicenseDbContext> options) : base(options) { } protected override void OnModelCreating(ModelBuilder builder) { builder.Entity<DriverLicense>().HasKey(m => m.Id); base.OnModelCreating(builder); } } A DriverLicense entity contains the information we use to create verifiable credentials. public class DriverLicense { [Key] public Guid Id { get; set; } public string UserName { get; set; } = string.Empty; public DateTimeOffset IssuedAt { get; set; } public string Name { get; set; } = string.Empty; public string FirstName { get; set; } = string.Empty; public DateTimeOffset DateOfBirth { get; set; } public string Issuedby { get; set; } = string.Empty; public bool Valid { get; set; } public string DriverLicenseCredentials { get; set; } = string.Empty; public string LicenseType { get; set; } = string.Empty; } Issuing credentials to authenticated identities When issuing verifiable credentials using the Azure AD REST API, an IssuanceRequestPayload payload is used to request the credentials which are to be issued to the digital wallet. Verifiable credentials are issued to a digital wallet. The credentials are issued for the holder of the wallet. The payload classes are the same for all API implementations apart from the CredentialsClaims class, which contains the subject claims which match the rules file of your definition. 
public class IssuanceRequestPayload { [JsonPropertyName(""includeQRCode"")] public bool IncludeQRCode { get; set; } [JsonPropertyName(""callback"")] public Callback Callback { get; set; } = new Callback(); [JsonPropertyName(""authority"")] public string Authority { get; set; } = string.Empty; [JsonPropertyName(""registration"")] public Registration Registration { get; set; } = new Registration(); [JsonPropertyName(""issuance"")] public Issuance Issuance { get; set; } = new Issuance(); } public class Callback { [JsonPropertyName(""url"")] public string Url { get; set; } = string.Empty; [JsonPropertyName(""state"")] public string State { get; set; } = string.Empty; [JsonPropertyName(""headers"")] public Headers Headers { get; set; } = new Headers(); } public class Headers { [JsonPropertyName(""api-key"")] public string ApiKey { get; set; } = string.Empty; } public class Registration { [JsonPropertyName(""clientName"")] public string ClientName { get; set; } = string.Empty; } public class Issuance { [JsonPropertyName(""type"")] public string CredentialsType { get; set; } = string.Empty; [JsonPropertyName(""manifest"")] public string Manifest { get; set; } = string.Empty; [JsonPropertyName(""pin"")] public Pin Pin { get; set; } = new Pin(); [JsonPropertyName(""claims"")] public CredentialsClaims Claims { get; set; } = new CredentialsClaims(); } public class Pin { [JsonPropertyName(""value"")] public string Value { get; set; } = string.Empty; [JsonPropertyName(""length"")] public int Length { get; set; } = 4; } /// Application specific claims used in the payload of the issue request. /// When using the id_token for the subject claims, the IDP needs to add the values to the id_token! /// The claims can be mapped to anything then. 
public class CredentialsClaims { /// <summary> /// attribute names need to match a claim from the id_token /// </summary> [JsonPropertyName(""given_name"")] public string Name { get; set; } = string.Empty; [JsonPropertyName(""family_name"")] public string Details { get; set; } = string.Empty; } The GetIssuanceRequestPayloadAsync method sets the data for each identity that requested the credentials. Only a signed in user can request the credentials for themselves. The context.User.Identity is used and the data is selected from the database for the signed in user. It is important that credentials are only issued to authenticated users. Users and the application must be authenticated correctly using 2FA and so on. Per default, the credentials are only authorized on the verifier which is probably not enough for most security flows. public async Task<IssuanceRequestPayload> GetIssuanceRequestPayloadAsync(HttpRequest request, HttpContext context) { var payload = new IssuanceRequestPayload(); var length = 4; var pinMaxValue = (int)Math.Pow(10, length) - 1; var randomNumber = RandomNumberGenerator.GetInt32(1, pinMaxValue); var newpin = string.Format(""{0:D"" + length.ToString() + ""}"", randomNumber); payload.Issuance.Pin.Length = 4; payload.Issuance.Pin.Value = newpin; payload.Issuance.CredentialsType = ""MyDrivingLicense""; payload.Issuance.Manifest = _credentialSettings.CredentialManifest; var host = GetRequestHostName(request); payload.Callback.State = Guid.NewGuid().ToString(); payload.Callback.Url = $""{host}:/api/issuer/issuanceCallback""; payload.Callback.Headers.ApiKey = _credentialSettings.VcApiCallbackApiKey; payload.Registration.ClientName = ""Verifiable Credential NDL Sample""; payload.Authority = _credentialSettings.IssuerAuthority; var driverLicense = await _driverLicenseService.GetDriverLicense(context.User.Identity.Name); payload.Issuance.Claims.Name = $""{driverLicense.FirstName} {driverLicense.Name} {driverLicense.UserName}""; 
payload.Issuance.Claims.Details = $""Type: {driverLicense.LicenseType} IssuedAt: {driverLicense.IssuedAt:yyyy-MM-dd}""; return payload; } The IssuanceRequestAsync method gets the payload data and request credentials from the Azure AD verifiable credentials REST API and returns this value which can be scanned using a QR code in the Razor page. The request returns fast. Depending on how the flow continues, a web hook in the application will update the status in a cache. This cache is persisted and polled from the UI. This could be improved by using SignalR. [HttpGet(""/api/issuer/issuance-request"")] public async Task<ActionResult> IssuanceRequestAsync() { try { var payload = await _issuerService.GetIssuanceRequestPayloadAsync(Request, HttpContext); try { var (Token, Error, ErrorDescription) = await _issuerService.GetAccessToken(); if (string.IsNullOrEmpty(Token)) { _log.LogError($""failed to acquire accesstoken: {Error} : {ErrorDescription}""); return BadRequest(new { error = Error, error_description = ErrorDescription }); } var defaultRequestHeaders = _httpClient.DefaultRequestHeaders; defaultRequestHeaders.Authorization = new AuthenticationHeaderValue(""Bearer"", Token); HttpResponseMessage res = await _httpClient.PostAsJsonAsync( _credentialSettings.ApiEndpoint, payload); var response = await res.Content.ReadFromJsonAsync<IssuanceResponse>(); if(response == null) { return BadRequest(new { error = ""400"", error_description = ""no response from VC API""}); } if (res.StatusCode == HttpStatusCode.Created) { _log.LogTrace(""succesfully called Request API""); if (payload.Issuance.Pin.Value != null) { response.Pin = payload.Issuance.Pin.Value; } response.Id = payload.Callback.State; var cacheData = new CacheData { Status = IssuanceConst.NotScanned, Message = ""Request ready, please scan with Authenticator"", Expiry = response.Expiry.ToString() }; _cache.Set(payload.Callback.State, JsonSerializer.Serialize(cacheData)); return Ok(response); } else { 
_log.LogError(""Unsuccesfully called Request API""); return BadRequest(new { error = ""400"", error_description = ""Something went wrong calling the API: "" + response }); } } catch (Exception ex) { return BadRequest(new { error = ""400"", error_description = ""Something went wrong calling the API: "" + ex.Message }); } } catch (Exception ex) { return BadRequest(new { error = ""400"", error_description = ex.Message }); } } The IssuanceResponse is returned to the UI. public class IssuanceResponse { [JsonPropertyName(""requestId"")] public string RequestId { get; set; } = string.Empty; [JsonPropertyName(""url"")] public string Url { get; set; } = string.Empty; [JsonPropertyName(""expiry"")] public int Expiry { get; set; } [JsonPropertyName(""pin"")] public string Pin { get; set; } = string.Empty; [JsonPropertyName(""id"")] public string Id { get; set; } = string.Empty; } The IssuanceCallback is used as a web hook for the Azure AD verifiable credentials. When developing or deploying, this web hook needs to have a public IP. I use ngrok to test this. Because the issuer authenticates the identities using an Azure App registration, everytime the ngrok URL changes, the redirect URL needs to be updated. Each callback request updates the cache. This API also needs to allow anonymous requests if the rest of the application is authenticated using OIDC. The AllowAnonymous attribute is required, if you use an authenticated ASP.NET Core application. [AllowAnonymous] [HttpPost(""/api/issuer/issuanceCallback"")] public async Task<ActionResult> IssuanceCallback() { string content = await new System.IO.StreamReader(Request.Body).ReadToEndAsync(); var issuanceResponse = JsonSerializer.Deserialize<IssuanceCallbackResponse>(content); try { //there are 2 different callbacks. 1 if the QR code is scanned (or deeplink has been followed) //Scanning the QR code makes Authenticator download the specific request from the server //the request will be deleted from the server immediately. 
//That's why it is so important to capture this callback and relay this to the UI so the UI can hide //the QR code to prevent the user from scanning it twice (resulting in an error since the request is already deleted) if (issuanceResponse.Code == IssuanceConst.RequestRetrieved) { var cacheData = new CacheData { Status = IssuanceConst.RequestRetrieved, Message = ""QR Code is scanned. Waiting for issuance..."", }; _cache.Set(issuanceResponse.State, JsonSerializer.Serialize(cacheData)); } if (issuanceResponse.Code == IssuanceConst.IssuanceSuccessful) { var cacheData = new CacheData { Status = IssuanceConst.IssuanceSuccessful, Message = ""Credential successfully issued"", }; _cache.Set(issuanceResponse.State, JsonSerializer.Serialize(cacheData)); } if (issuanceResponse.Code == IssuanceConst.IssuanceError) { var cacheData = new CacheData { Status = IssuanceConst.IssuanceError, Payload = issuanceResponse.Error?.Code, //at the moment there isn't a specific error for incorrect entry of a pincode. //So assume this error happens when the users entered the incorrect pincode and ask to try again. Message = issuanceResponse.Error?.Message }; _cache.Set(issuanceResponse.State, JsonSerializer.Serialize(cacheData)); } return Ok(); } catch (Exception ex) { return BadRequest(new { error = ""400"", error_description = ex.Message }); } } The IssuanceCallbackResponse is returned to the UI. public class IssuanceCallbackResponse { [JsonPropertyName(""code"")] public string Code { get; set; } = string.Empty; [JsonPropertyName(""requestId"")] public string RequestId { get; set; } = string.Empty; [JsonPropertyName(""state"")] public string State { get; set; } = string.Empty; [JsonPropertyName(""error"")] public CallbackError? Error { get; set; } } The IssuanceResponse method is polled from a Javascript client in the Razor page UI. This method updates the status in the UI using the cache and the database. 
[HttpGet(""/api/issuer/issuance-response"")] public ActionResult IssuanceResponse() { try { //the id is the state value initially created when the issuance request was requested from the request API //the in-memory database uses this as key to get and store the state of the process so the UI can be updated string state = this.Request.Query[""id""]; if (string.IsNullOrEmpty(state)) { return BadRequest(new { error = ""400"", error_description = ""Missing argument 'id'"" }); } CacheData value = null; if (_cache.TryGetValue(state, out string buf)) { value = JsonSerializer.Deserialize<CacheData>(buf); Debug.WriteLine(""check if there was a response yet: "" + value); return new ContentResult { ContentType = ""application/json"", Content = JsonSerializer.Serialize(value) }; } return Ok(); } catch (Exception ex) { return BadRequest(new { error = ""400"", error_description = ex.Message }); } } The DriverLicenseCredentialsModel class is used for the credential issuing for the sign-in user. The HTML part of the Razor page contains the Javascript client code which was implemented using the code from the Microsoft Azure sample. 
public class DriverLicenseCredentialsModel : PageModel { private readonly DriverLicenseService _driverLicenseService; public string DriverLicenseMessage { get; set; } = ""Loading credentials""; public bool HasDriverLicense { get; set; } = false; public DriverLicense DriverLicense { get; set; } public DriverLicenseCredentialsModel(DriverLicenseService driverLicenseService) { _driverLicenseService = driverLicenseService; } public async Task OnGetAsync() { DriverLicense = await _driverLicenseService.GetDriverLicense(HttpContext.User.Identity.Name); if (DriverLicense != null) { DriverLicenseMessage = ""Add your driver license credentials to your wallet""; HasDriverLicense = true; } else { DriverLicenseMessage = ""You have no valid driver license""; } } } Testing and running the applications Ngrok is used to provide a public callback for the Azure AD verifiable credentials callback. When the application is started, you need to create a driving license. This is done in the administration Razor page. Once a driving license exists, the View driver license Razor page can be used to issue a verifiable credential to the logged-in user. A QR Code is displayed which can be scanned to begin the issue flow. Using the Microsoft Authenticator, you can scan the QR Code and add the verifiable credentials to your digital wallet. The credentials can now be used in any verifier which supports the Microsoft Authenticator wallet. The verify ASP.NET Core application can be used to verify and use the issued verifiable credential from the wallet. 
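The UI update flow described earlier (the Javascript client polling the issuance-response endpoint until the callback web hook has stored a terminal status in the cache) can be sketched language-agnostically. The Python stub below is illustrative only: the status names and the fetch stub are hypothetical stand-ins for the IssuanceConst values and the HTTP GET to /api/issuer/issuance-response.

```python
# Hypothetical terminal states, standing in for the IssuanceConst values.
TERMINAL = {'issuance_successful', 'issuance_error'}

def poll_issuance_status(fetch_status, max_polls: int = 10):
    # fetch_status stands in for a GET to the issuance-response endpoint;
    # the real client polls until the callback web hook has written a
    # terminal status for the request state into the cache.
    history = []
    for _ in range(max_polls):
        status = fetch_status()
        history.append(status)
        if status in TERMINAL:
            return status, history
    return 'timeout', history

# Stub: the wallet scan and the issuance arrive on later polls.
responses = iter(['not_scanned', 'request_retrieved', 'issuance_successful'])
result, seen = poll_issuance_status(lambda: next(responses))
```

As the article notes, replacing this polling with a SignalR push would remove the loop entirely.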
Links: https://docs.Microsoft.com/en-us/azure/active-directory/verifiable-credentials/ https://GitHub.com/Azure-Samples/active-directory-verifiable-credentials-dotnet https://www.Microsoft.com/de-ch/security/business/identity-access-management/decentralized-identity-blockchain https://didproject.azurewebsites.net/docs/issuer-setup.html https://didproject.azurewebsites.net/docs/credential-design.html https://GitHub.com/Azure-Samples/active-directory-verifiable-credentials https://identity.foundation/ https://www.w3.org/TR/VC-data-model/ https://daniel-krzyczkowski.GitHub.io/Azure-AD-Verifiable-Credentials-Intro/ https://dotnetthoughts.net/using-node-services-in-aspnet-core/ https://identity.foundation/ion/explorer https://www.npmjs.com/package/ngrok https://GitHub.com/Microsoft/VerifiableCredentials-Verification-SDK-Typescript",https://damienbod.com/2021/10/25/create-and-issuer-verifiable-credentials-in-asp-net-core-using-azure-ad/,,Post,,HowTo,,,,,,,,2021-10-25,https://GitHub.com/swiss-ssi-group/AzureADVerifiableCredentialsAspNetCore,,,,,,,,,,,,
Microsoft,Personal,,,Damien Bowden,Trinsic,,,,,Creating Verifiable credentials in ASP.NET Core for decentralized identities using Trinsic,"The National Driving license application is responsible for issuing driver licenses and administrating licenses for users who have authenticated correctly. The user can see his or her driver license and a verifiable credential displayed as a QR code which can be used to add the credential to a digital wallet. When the application generates the credential, it adds the credential DID to the blockchain ledger with the cryptographic proof of the issuer and the document. When you scan the QR Code, the DID will get validated and will be added to the wallet along with the requested claims. The digital wallet must be able to find the DID and the schema on the correct network, and needs to search for the ledger in the correct blockchain. A good wallet should take care of this for you. The schema is required so that the data in the DID document can be understood.",,https://damienbod.com/2021/04/05/creating-verifiable-credentials-in-asp-net-core-for-decentralized-identities-using-trinsic/,,Post,,HowTo,,,,,,QR Code,DID,2021-04-05,,,,,,,,,,,,,
Microsoft,XT Seminars,,,,,,,,,Issuing your own DIDs & VCs with Azure AD,,,https://www.xtseminars.co.uk/post/issuing-your-own-dids-vcs-with-azure-ad,,Post,,HowTo,,,,,,"Entra,AzureAD",,2021-06-01,,,,,,,,,,,,,
Microsoft,Personal,,,Damien Bowden,Mattr,,,,,Present and Verify Verifiable Credentials in ASP.NET Core using Decentralized Identities and Mattr,"This article shows how to use verifiable credentials stored in a digital wallet to verify a digital identity and use it in an application. For this to work, a trust needs to exist between the verifiable credential issuer and the application which requires the verifiable credentials to verify. A decentralized blockchain database is used and Mattr is used as an access layer to this ledger and blockchain. The applications are implemented in ASP.NET Core.","This article shows how to use verifiable credentials stored in a digital wallet to verify a digital identity and use it in an application. For this to work, a trust needs to exist between the verifiable credential issuer and the application which requires the verifiable credentials to verify. A decentralized blockchain database is used and Mattr is used as an access layer to this ledger and blockchain. The applications are implemented in ASP.NET Core. The verifier application Bo Insurance is used to implement the verification process and to create a presentation template. The application sends an HTTP POST request to create a presentation request using the DID Id from the OIDC credential Issuer created in the previous article. This DID is created from the National Driving license application which issues verifiable credentials, and so a trust needs to exist between the two applications. Once the credentials have been issued to a holder of the verifiable credentials and stored, for example, in a digital wallet, the issuer is no longer involved in the process. Verifying the credentials only requires the holder, the verifier and the decentralized database which holds the digital identities and documents. The verifier application gets the DID from the ledger and signs the verify request. The request can then be presented as a QR Code. 
The holder can scan this using a Mattr digital wallet and grant consent to share the credentials with the application. The digital wallet calls the callback API defined in the request presentation body and sends the data to the API. The verifier application hosting the API would need to verify the data and can update the application UI using SignalR to continue the business process with the verified credentials. Code https://GitHub.com/swiss-ssi-group/MattrGlobalAspNetCore Blogs in the series - Getting started with Self Sovereign Identity SSI - Creating Verifiable credentials in ASP.NET Core for decentralized identities using Trinsic - Verifying Verifiable Credentials in ASP.NET Core for Decentralized Identities using Trinsic - Create an OIDC credential Issuer with Mattr and ASP.NET Core - Present and Verify Verifiable Credentials in ASP.NET Core using Decentralized Identities and Mattr - Verify vaccination data using Zero Knowledge Proofs with ASP.NET Core and Mattr - Challenges to Self Sovereign Identity - Create and issue verifiable credentials in ASP.NET Core using Azure AD - Implement Compound Proof BBS+ verifiable credentials using ASP.NET Core and Mattr Create the presentation template for the Verifiable Credential A presentation template is required to verify the issued verifiable credentials stored on a digital wallet. The digital identity (DID) Id of the OIDC credential issuer is all that is required to create a presentation request template. In the application which issues credentials, ie the NationalDrivingLicense, a Razor page was created to view the DID of the OIDC credential issuer. The DID can be used to create the presentation template. The Mattr documentation is really good here: https://learn.Mattr.global/tutorials/verify/presentation-request-template A Razor page was created to start this task from the UI. This would normally require authentication as this is an administrator task from the application requesting the verified credentials. 
The code behind the Razor page takes the DID request parameter and calls the MattrPresentationTemplateService to create the presentation template and persist this in a database. public class CreatePresentationTemplateModel : PageModel { private readonly MattrPresentationTemplateService _MattrVerifyService; public bool CreatingPresentationTemplate { get; set; } = true; public string TemplateId { get; set; } [BindProperty] public PresentationTemplate PresentationTemplate { get; set; } public CreatePresentationTemplateModel(MattrPresentationTemplateService MattrVerifyService) { _MattrVerifyService = MattrVerifyService; } public void OnGet() { PresentationTemplate = new PresentationTemplate(); } public async Task<IActionResult> OnPostAsync() { if (!ModelState.IsValid) { return Page(); } TemplateId = await _MattrVerifyService.CreatePresentationTemplateId(PresentationTemplate.DidId); CreatingPresentationTemplate = false; return Page(); } } public class PresentationTemplate { [Required] public string DidId { get; set; } } The Razor page HTML template creates a form that posts the request to the server-rendered page and displays the templateId if the creation was successful. 
@page @model BoInsurance.Pages.CreatePresentationTemplateModel <div class=""container-fluid""> <div class=""row""> <div class=""col-sm""> <form method=""post""> <div> <div class=""form-group""> <label class=""control-label"">DID ID</label> <input asp-for=""PresentationTemplate.DidId"" class=""form-control"" /> <span asp-validation-for=""PresentationTemplate.DidId"" class=""text-danger""></span> </div> <div class=""form-group""> @if (Model.CreatingPresentationTemplate) { <input class=""form-control"" type=""submit"" readonly=""@Model.CreatingPresentationTemplate"" value=""Create Presentation Template"" /> } </div> <div class=""form-group""> @if (!Model.CreatingPresentationTemplate) { <div class=""alert alert-success""> <strong>Mattr Presentation Template created</strong> </div> } </div> </div> </form> <hr /> <p>When the templateId is created, you can use the template ID to verify</p> </div> <div class=""col-sm""> <div> <img src=""~/ndl_car_01.png"" width=""200"" alt=""Driver License""> <div> <b>Driver Licence templateId from presentation template</b> <hr /> <dl class=""row""> <dt class=""col-sm-4"">templateId</dt> <dd class=""col-sm-8""> @Model.TemplateId </dd> </dl> </div> </div> </div> </div> </div> The MattrPresentationTemplateService is used to create the Mattr presentation template. This class uses the Mattr API and sends a HTTP post request with the DID Id of the OIDC credential issuer and creates a presentation template. The service saves the returned payload to a database and returns the template ID as the result. The template ID is required to verify the verifiable credentials. The MattrTokenApiService is used to request an API token for the Mattr API using the credential of your Mattr account. This service has a simple token cache and only requests new access tokens when no token exists or the token has expired. The BoInsuranceDbService service is used to access the SQL database using Entity Framework Core. 
This provides simple methods to persist or select the data as required. private readonly IHttpClientFactory _clientFactory; private readonly MattrTokenApiService _MattrTokenApiService; private readonly BoInsuranceDbService _boInsuranceDbService; private readonly MattrConfiguration _MattrConfiguration; public MattrPresentationTemplateService(IHttpClientFactory clientFactory, IOptions<MattrConfiguration> MattrConfiguration, MattrTokenApiService MattrTokenApiService, BoInsuranceDbService boInsuranceDbService) { _clientFactory = clientFactory; _MattrTokenApiService = MattrTokenApiService; _boInsuranceDbService = boInsuranceDbService; _MattrConfiguration = MattrConfiguration.Value; } public async Task<string> CreatePresentationTemplateId(string didId) { // create a new one var v1PresentationTemplateResponse = await CreateMattrPresentationTemplate(didId); // save to db var drivingLicensePresentationTemplate = new DrivingLicensePresentationTemplate { DidId = didId, TemplateId = v1PresentationTemplateResponse.Id, MattrPresentationTemplateReponse = JsonConvert .SerializeObject(v1PresentationTemplateResponse) }; await _boInsuranceDbService .CreateDriverLicensePresentationTemplate(drivingLicensePresentationTemplate); return v1PresentationTemplateResponse.Id; } private async Task<V1_PresentationTemplateResponse> CreateMattrPresentationTemplate(string didId) { HttpClient client = _clientFactory.CreateClient(); var accessToken = await _MattrTokenApiService.GetApiToken(client, ""MattrAccessToken""); client.DefaultRequestHeaders.Authorization = new AuthenticationHeaderValue(""Bearer"", accessToken); client.DefaultRequestHeaders.TryAddWithoutValidation(""Content-Type"", ""application/json""); var v1PresentationTemplateResponse = await CreateMattrPresentationTemplate(client, didId); return v1PresentationTemplateResponse; } The CreateMattrPresentationTemplate method sends the HTTP Post request like in the Mattr API documentation. 
Creating the payload for the HTTP post request using the Mattr Open API definitions is a bit complicated. This could be improved with a better Open API definition. In our use case, we just want to create the default template for the OIDC credential issuer and so just require the DID Id. Most of the other properties are fixed values; see the Mattr API docs for more information. private async Task<V1_PresentationTemplateResponse> CreateMattrPresentationTemplate( HttpClient client, string didId) { // create presentation, post to presentations templates api // https://learn.Mattr.global/tutorials/verify/presentation-request-template var createPresentationsTemplatesUrl = $""https://{_MattrConfiguration.TenantSubdomain}/v1/presentations/templates""; var additionalProperties = new Dictionary<string, object>(); additionalProperties.Add(""type"", ""QueryByExample""); additionalProperties.Add(""credentialQuery"", new List<CredentialQuery> { new CredentialQuery { Reason = ""Please provide your driving license"", Required = true, Example = new Example { Context = new List<object>{ ""https://schema.org"" }, Type = ""VerifiableCredential"", TrustedIssuer = new List<TrustedIssuer2> { new TrustedIssuer2 { Required = true, Issuer = didId // DID used to create the OIDC credential issuer } } } } }); var payload = new MattrOpenApiClient.V1_CreatePresentationTemplate { Domain = _MattrConfiguration.TenantSubdomain, Name = ""certificate-presentation"", Query = new List<Query> { new Query { AdditionalProperties = additionalProperties } } }; var payloadJson = JsonConvert.SerializeObject(payload); var uri = new Uri(createPresentationsTemplatesUrl); using (var content = new StringContentWithoutCharset(payloadJson, ""application/json"")) { var presentationTemplateResponse = await client.PostAsync(uri, content); if (presentationTemplateResponse.StatusCode == System.Net.HttpStatusCode.Created) { var v1PresentationTemplateResponse = JsonConvert 
.DeserializeObject<MattrOpenApiClient.V1_PresentationTemplateResponse>( await presentationTemplateResponse.Content.ReadAsStringAsync()); return v1PresentationTemplateResponse; } var error = await presentationTemplateResponse.Content.ReadAsStringAsync(); } throw new Exception(""whoops something went wrong""); } The application can be started and the presentation template can be created. The ID is returned to the UI for the next step. Verify the verifiable credentials Now that a template exists to request the verifiable data from the holder of the data, which is normally stored in a digital wallet, the verifier application can create and start a verification process. A post request is sent to the Mattr APIs which creates a presentation request using a DID ID and the required template. The application can request the DID from the OIDC credential issuer. The request is signed using the correct key from the DID and the request is published in the UI as a QR Code. A digital wallet is used to scan the code and the user of the wallet can grant consent to share the Personal data. The wallet sends an HTTP POST request to the callback API. This API handles the request, validates the data and updates the UI using SignalR to move to the next step of the business process using the verified data. Step 1 Invoke a presentation request The InvokePresentationRequest method implements the presentation request. This method requires the DID Id of the OIDC credential issuer which will be used to get the data from the holder. The ID of the template created above is also required. A challenge is also used to track the verification. The challenge is a random value and is used when the digital wallet calls the API with the verified data. The callback URL is where the data is returned to. This could be unique for every request. The payload is created as the Mattr API docs define. 
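Serialized, the request body sent to the presentations requests endpoint looks roughly like the following sketch. All values here are illustrative placeholders, the exact property names come from the generated Mattr OpenAPI client, and the callback path shown matches the VerificationController used later in this post:

```json
{
  ""did"": ""did:key:z6Mk..."",
  ""templateId"": ""f93e1b..."",
  ""challenge"": ""nEqW3CjMq..."",
  ""callbackUrl"": ""https://<your-public-host>/api/Verification/DrivingLicenseCallback"",
  ""expiresTime"": 1638798000000
}
```

The expiresTime is an epoch timestamp in milliseconds, matching the Mattr_EPOCH_EXPIRES_TIME_VERIFIY constant used in the code below.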
The post request is sent to the Mattr API and a V1_CreatePresentationRequestResponse is returned if all is configured correctly. private async Task<V1_CreatePresentationRequestResponse> InvokePresentationRequest( HttpClient client, string didId, string templateId, string challenge, string callbackUrl) { var createDidUrl = $""https://{_MattrConfiguration.TenantSubdomain}/v1/presentations/requests""; var payload = new MattrOpenApiClient.V1_CreatePresentationRequestRequest { Did = didId, TemplateId = templateId, Challenge = challenge, CallbackUrl = new Uri(callbackUrl), ExpiresTime = Mattr_EPOCH_EXPIRES_TIME_VERIFIY // Epoch time }; var payloadJson = JsonConvert.SerializeObject(payload); var uri = new Uri(createDidUrl); using (var content = new StringContentWithoutCharset(payloadJson, ""application/json"")) { var response = await client.PostAsync(uri, content); if (response.StatusCode == System.Net.HttpStatusCode.Created) { var v1CreatePresentationRequestResponse = JsonConvert .DeserializeObject<V1_CreatePresentationRequestResponse>( await response.Content.ReadAsStringAsync()); return v1CreatePresentationRequestResponse; } var error = await response.Content.ReadAsStringAsync(); } return null; } Step 2 Get the OIDC Issuer DID The RequestDID method uses the Mattr API to get the DID data from the blockchain for the OIDC credential issuer. Only the DID Id is required. 
private async Task<V1_GetDidResponse> RequestDID(string didId, HttpClient client) { var requestUrl = $""https://{_MattrConfiguration.TenantSubdomain}/core/v1/dids/{didId}""; var uri = new Uri(requestUrl); var didResponse = await client.GetAsync(uri); if (didResponse.StatusCode == System.Net.HttpStatusCode.OK) { var v1CreateDidResponse = JsonConvert.DeserializeObject<V1_GetDidResponse>( await didResponse.Content.ReadAsStringAsync()); return v1CreateDidResponse; } var error = await didResponse.Content.ReadAsStringAsync(); return null; } Step 3 Sign the request using correct key and display QR Code To verify data using a digital wallet, the payload must be signed using the correct key. The SignAndEncodePresentationRequestBody uses the DID payload and the request from the presentation request to create the payload to sign. Creating the payload is a bit messy due to the OpenAPI definitions created for the Mattr API. An HTTP POST request with the payload returns the signed JWT wrapped in a strange data format, so we parse this as a string and manually extract the JWT payload. 
private async Task<string> SignAndEncodePresentationRequestBody( HttpClient client, V1_GetDidResponse did, V1_CreatePresentationRequestResponse v1CreatePresentationRequestResponse) { var createDidUrl = $""https://{_MattrConfiguration.TenantSubdomain}/v1/messaging/sign""; object didUrlArray; did.DidDocument.AdditionalProperties.TryGetValue(""authentication"", out didUrlArray); var didUrl = didUrlArray.ToString().Split(""\"""")[1]; var payload = new MattrOpenApiClient.SignMessageRequest { DidUrl = didUrl, Payload = v1CreatePresentationRequestResponse.Request }; var payloadJson = JsonConvert.SerializeObject(payload); var uri = new Uri(createDidUrl); using (var content = new StringContentWithoutCharset(payloadJson, ""application/json"")) { var response = await client.PostAsync(uri, content); if (response.StatusCode == System.Net.HttpStatusCode.OK) { var result = await response.Content.ReadAsStringAsync(); return result; } var error = await response.Content.ReadAsStringAsync(); } return null; } The CreateVerifyCallback method uses the presentation request, the DID response and the signing request to create a URL which can be displayed in a UI. The challenge is created using the RNGCryptoServiceProvider class which creates a random string. The access token for the API is returned from the client credentials OAuth request or from the in-memory cache. The DrivingLicensePresentationVerify class is persisted to a database and the verify URL is returned so that this could be displayed as a QR Code in the UI. 
/// <summary> /// https://learn.Mattr.global/tutorials/verify/using-callback/callback-e-to-e /// </summary> /// <param name=""callbackBaseUrl""></param> /// <returns></returns> public async Task<(string QrCodeUrl, string ChallengeId)> CreateVerifyCallback(string callbackBaseUrl) { callbackBaseUrl = callbackBaseUrl.Trim(); if (!callbackBaseUrl.EndsWith('/')) { callbackBaseUrl = $""{callbackBaseUrl}/""; } var callbackUrlFull = $""{callbackBaseUrl}{Mattr_CALLBACK_VERIFY_PATH}""; var challenge = GetEncodedRandomString(); HttpClient client = _clientFactory.CreateClient(); var accessToken = await _MattrTokenApiService.GetApiToken(client, ""MattrAccessToken""); client.DefaultRequestHeaders.Authorization = new AuthenticationHeaderValue(""Bearer"", accessToken); client.DefaultRequestHeaders.TryAddWithoutValidation(""Content-Type"", ""application/json""); var template = await _boInsuranceDbService.GetLastDriverLicensePrsentationTemplate(); // Invoke the Presentation Request var invokePresentationResponse = await InvokePresentationRequest( client, template.DidId, template.TemplateId, challenge, callbackUrlFull); // Request DID V1_GetDidResponse did = await RequestDID(template.DidId, client); // Sign and Encode the Presentation Request body var signAndEncodePresentationRequestBodyResponse = await SignAndEncodePresentationRequestBody( client, did, invokePresentationResponse); // fix strange DTO var jws = signAndEncodePresentationRequestBodyResponse.Replace(""\"""", """"); // save to db // TODO add this back once working var drivingLicensePresentationVerify = new DrivingLicensePresentationVerify { DidId = template.DidId, TemplateId = template.TemplateId, CallbackUrl = callbackUrlFull, Challenge = challenge, InvokePresentationResponse = JsonConvert.SerializeObject(invokePresentationResponse), Did = JsonConvert.SerializeObject(did), SignAndEncodePresentationRequestBody = jws }; await _boInsuranceDbService.CreateDrivingLicensePresentationVerify(drivingLicensePresentationVerify); 
var qrCodeUrl = $""didcomm://https://{_MattrConfiguration.TenantSubdomain}/?request={jws}""; return (qrCodeUrl, challenge); } private string GetEncodedRandomString() { var base64 = Convert.ToBase64String(GenerateRandomBytes(30)); return HtmlEncoder.Default.Encode(base64); } private byte[] GenerateRandomBytes(int length) { using var randonNumberGen = new RNGCryptoServiceProvider(); var byteArray = new byte[length]; randonNumberGen.GetBytes(byteArray); return byteArray; } The CreateVerifierDisplayQrCodeModel is the code behind for the Razor page to request a verification and also display the verify QR Code for the digital wallet to scan. The CallbackUrl can be set from the UI so that this is easier for testing. This callback can be any webhook you want or API. To test the application in local development, I used ngrok. The return URL has to match the proxy which tunnels to you PC, once you start. If the API has no public address when debugging, you will not be able to test locally. Step 4 Implement the Callback and update the UI using SignalR After a successful verification in the digital wallet, the wallet sends the verified credentials to the API defined in the presentation request. The API handling this needs to update the correct client UI and continue the business process using the verified data. We use SignalR for this with a single client to client connection. The Signal connections for each connection is associated with a challenge ID, the same Id we used to create the presentation request. Using this, only the correct client will be notified and not all clients broadcasted. The DrivingLicenseCallback takes the body with is specific for the credentials you issued. This is always depending on what you request. The data is saved to a database and the client is informed to continue. We send a message directly to the correct client using the connectionId of the SignalR session created for this challenge. 
[ApiController] [Route(""api/[controller]"")] public class VerificationController : Controller { private readonly BoInsuranceDbService _boInsuranceDbService; private readonly IHubContext<MattrVerifiedSuccessHub> _hubContext; public VerificationController(BoInsuranceDbService boInsuranceDbService, IHubContext<MattrVerifiedSuccessHub> hubContext) { _hubContext = hubContext; _boInsuranceDbService = boInsuranceDbService; } /// <summary> /// { /// ""presentationType"": ""QueryByExample"", /// ""challengeId"": ""GW8FGpP6jhFrl37yQZIM6w"", /// ""claims"": { /// ""id"": ""did:key:z6MkfxQU7dy8eKxyHpG267FV23agZQu9zmokd8BprepfHALi"", /// ""name"": ""Chris"", /// ""firstName"": ""Shin"", /// ""licenseType"": ""Certificate Name"", /// ""dateOfBirth"": ""some data"", /// ""licenseIssuedAt"": ""dda"" /// }, /// ""verified"": true, /// ""holder"": ""did:key:z6MkgmEkNM32vyFeMXcQA7AfQDznu47qHCZpy2AYH2Dtdu1d"" /// } /// </summary> /// <param name=""body""></param> /// <returns></returns> [HttpPost] [Route(""[action]"")] public async Task<IActionResult> DrivingLicenseCallback([FromBody] VerifiedDriverLicense body) { string connectionId; var found = MattrVerifiedSuccessHub.Challenges .TryGetValue(body.ChallengeId, out connectionId); // test Signalr //await _hubContext.Clients.Client(connectionId).SendAsync(""MattrCallbackSuccess"", $""{body.ChallengeId}""); //return Ok(); var exists = await _boInsuranceDbService.ChallengeExists(body.ChallengeId); if (exists) { await _boInsuranceDbService.PersistVerification(body); if (found) { //$""/VerifiedUser?challengeid={body.ChallengeId}"" await _hubContext.Clients .Client(connectionId) .SendAsync(""MattrCallbackSuccess"", $""{body.ChallengeId}""); } return Ok(); } return BadRequest(""unknown verify request""); } } The SignalR server is configured in the Startup class of the ASP.NET Core application. The path for the hub is defined in the MapHub method. public void ConfigureServices(IServiceCollection services) { // ... 
services.AddRazorPages(); services.AddSignalR(); services.AddControllers(); } public void Configure(IApplicationBuilder app, IWebHostEnvironment env) { // ... app.UseEndpoints(endpoints => { endpoints.MapRazorPages(); endpoints.MapHub<MattrVerifiedSuccessHub>(""/MattrVerifiedSuccessHub""); endpoints.MapControllers(); }); } The Hub implementation requires only one fixed method. The AddChallenge method takes the challenge ID and adds it to an in-memory cache. The controller implemented for the callbacks uses this ConcurrentDictionary to find the correct connectionId which is mapped to the challenges from the verification. public class MattrVerifiedSuccessHub : Hub { /// <summary> /// This should be replaced with a cache which expires or something /// </summary> public static readonly ConcurrentDictionary<string, string> Challenges = new ConcurrentDictionary<string, string>(); public void AddChallenge(string challengeId, string connnectionId) { Challenges.TryAdd(challengeId, connnectionId); } } The JavaScript SignalR client in the browser connects to the SignalR server and registers the connectionId with the challenge ID used for the verification of the verifiable credentials from the holder of the digital wallet. If a client gets a message that a verification has completed successfully and the callback has been called, it redirects to the verified page. The client listens for MattrCallbackSuccess messages. These messages are sent from the callback controller directly. The VerifiedUserModel Razor page displays the data and the business process can continue using the verified data. 
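The client-side wiring just described might look something like this minimal sketch. It assumes the signalR global from the official @microsoft/signalr browser bundle is loaded via a script tag, and that a challengeId variable is rendered into the page by the Razor view; the hub path, hub method and message names are taken from the server code above:

```javascript
// Builds the redirect target used after a successful verification
// (the 'challengeid' query parameter matches the VerifiedUser Razor page).
function verifiedUserUrl(challengeId) {
  return '/VerifiedUser?challengeid=' + encodeURIComponent(challengeId);
}

// Wire up the SignalR client only when the browser bundle is loaded.
if (typeof signalR !== 'undefined') {
  const connection = new signalR.HubConnectionBuilder()
    .withUrl('/MattrVerifiedSuccessHub') // hub path mapped in Startup
    .build();

  // The callback controller sends this message to exactly one connection.
  connection.on('MattrCallbackSuccess', (challengeId) => {
    window.location.href = verifiedUserUrl(challengeId);
  });

  connection.start().then(() => {
    // Associate this connection with the challenge of the displayed QR code.
    // 'challengeId' is assumed to be rendered into the page by the Razor view.
    connection.invoke('AddChallenge', challengeId, connection.connectionId);
  });
}
```

The redirect target mirrors the commented-out line in the callback controller, so the browser ends up on the VerifiedUser page for its own challenge only.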
public class VerifiedUserModel : PageModel { private readonly BoInsuranceDbService _boInsuranceDbService; public VerifiedUserModel(BoInsuranceDbService boInsuranceDbService) { _boInsuranceDbService = boInsuranceDbService; } public string ChallengeId { get; set; } public DriverLicenseClaimsDto VerifiedDriverLicenseClaims { get; private set; } public async Task OnGetAsync(string challengeId) { // use query param to get challenge id and display data if (challengeId != null) { var verifiedDriverLicenseUser = await _boInsuranceDbService.GetVerifiedUser(challengeId); VerifiedDriverLicenseClaims = new DriverLicenseClaimsDto { DateOfBirth = verifiedDriverLicenseUser.DateOfBirth, Name = verifiedDriverLicenseUser.Name, LicenseType = verifiedDriverLicenseUser.LicenseType, FirstName = verifiedDriverLicenseUser.FirstName, LicenseIssuedAt = verifiedDriverLicenseUser.LicenseIssuedAt }; } } } public class DriverLicenseClaimsDto { public string Name { get; set; } public string FirstName { get; set; } public string LicenseType { get; set; } public string DateOfBirth { get; set; } public string LicenseIssuedAt { get; set; } } Running the verifier To test the BoInsurance application locally, which is the verifier application, ngrok is used so that we have a public address for the callback. I install ngrok using npm. Without a license, you can only run your application over http. npm install -g ngrok Run ngrok from the command line using the URL of the application. I start the ASP.NET Core application at localhost port 5000. ngrok http localhost:5000 You should be able to copy the ngrok URL and use this in the browser to test the verification. Once running, a verification can be created and you can scan the QR Code with your digital wallet. Once you grant access to your data, the data is sent to the callback API and the UI will be redirected to the success page. Notes Mattr APIs work really well and support some of the flows for digital identities. 
I plan to try out the zero knowledge proof flow next. It is only possible to create verifiable credentials from data from your identity provider using the id_token. To issue credentials, you have to implement your own identity provider and cannot use business data from your application. If you have full control like with Openiddict, IdentityServer4 or Auth0, this is no problem, just more complicated to implement. If you do not control the data in your identity provider, you would need to create a second identity provider to issue credentials. This is then part of your business logic and not just an identity provider. This will always be a problem if using Azure AD or IdPs from large or medium-sized companies. The quality of the verifiable credentials also depends on how well the OIDC credential issuers are implemented, as these are still central databases for these credentials and are still open to all the problems we have today. Decentralized identities have the potential to solve many problems but still have many unsolved problems. Links https://learn.Mattr.global/tutorials/verify/using-callback/callback-e-to-e https://Mattr.global/get-started/ https://learn.Mattr.global/ https://learn.Mattr.global/tutorials/dids/did-key https://gunnarpeipman.com/httpclient-remove-charset/",https://damienbod.com/2021/05/10/present-and-verify-verifiable-credentials-in-asp-net-core-using-decentralized-identities-and-Mattr/,,Post,,HowTo,,,,,,ASP.NET,,2021-05-10,,,,,,,,,,,,,
|
||
Microsoft,Personal,,,Damien Bowden,Mattr,,,,,Verify Vaccination Data Using Zero Knowledge Proofs with ASP.NET Core and Mattr,"This article shows how Zero Knowledge Proof (ZKP) verifiable credentials can be used to verify a person's vaccination data implemented in ASP.NET Core and Mattr. The ZKP BBS+ verifiable credentials are issued and stored on a digital wallet using a Self-Issued Identity Provider (SIOP) and Open ID Connect. The data can then be used to verify if the holder has the required credentials, but only the required data is used and returned to the verification application.<br>",,https://damienbod.com/2021/05/31/verify-vaccination-data-using-zero-knowledge-proofs-with-asp-net-core-and-Mattr/,,Post,,HowTo,,,,,,ASP.NET,Verifiable Credentials,2021-05-31,https://GitHub.com/swiss-ssi-group/MattrZeroKnowledgeProofsAspNetCore,,,,,,,,,,,,
|
||
Microsoft,ML-Software,,,Matteo Locher,Trinsic,,,,,Verifying Verifiable Credentials in ASP.NET Core for Decentralized Identities using Trinsic,In this part we are going to look at how we can verify these credentials in order to continue some sort of business process. We will continue with the sample that Damien started and after obtaining our driver license we want to sign up for a new insurance. But we can only sign up at this insurance company if we can deliver proof of our driver license.,"This blog post is a continuation of Damien's blog post about the creation of verifiable credentials. In his blog post Damien showed how to set up an ASP.NET Core application to obtain a credential from the Trinsic platform. In this part we are going to look at how we can verify these credentials in order to continue some sort of business process. We will continue with the sample that Damien started and after obtaining our driver license we want to sign up for a new insurance. But we can only sign up at this insurance company if we can deliver proof of our driver license. The code for this can be found on GitHub. Whilst in Damien's blog post he showed how a verifiable credential can be issued to a so-called credential holder, this blog post will be about how we can verify such credentials as part of a business workflow. After an issuer has issued credentials to the holder and they have stored these in their wallet, a verifier can now ask a holder to verify themselves with a certain credential. A verifier can add policies to check for certain attributes but also add restrictions like a specific issuer DID. With this in place a verifier can create a verification request which will be sent to the credential holder. This step is very important because it is where a cryptographic challenge is generated that the holder must respond to. 
After the verification request gets returned to the verifier, it needs to be verified against the ledger to make sure it is valid. The verification record does not only contain the attributes, but also some metadata such as the digital signature of the issuer of the credentials, revocation details, verification policies etc. which then get validated against their sources. The image below describes this trust-triangle between the issuer, holder and verifier. Inside of the Trinsic studio you can now create a new organization. This can be on the same account as the issuer organization, but a different account works fine too. After you have created the organization you need to acquire the API-Key that is required to call the Trinsic API from the verifier application. For this example we did not create a template for the verification request. So there is nothing more to do in the Trinsic Studio. For this scenario we used Connectionless Verifications. These have the ability to create a verification request without having to create an enduring relationship with the credential holder. You can read more about this here. The verifier application will require a NuGet package offered by Trinsic to make the communication with the Trinsic API easier. Add the Trinsic.ServiceClients package to your project and add the service inside of your startup. Remember to put your API-Key into your user secrets and not in your app settings. public void ConfigureServices(IServiceCollection services) { services.AddTrinsicClient(options => { // For CredentialsClient and WalletClient // Insurance API Key // API key of Bo Insurance (Organisation which does the verification) options.AccessToken = Configuration[""Trinsic:ApiKey""]; }); services.AddScoped<IDriversLicenseVerificationService, DriversLicenseVerificationService>(); services.AddRazorPages(); services.AddControllers(); } The logic for the verification is encapsulated inside of the DriversLicenseVerificationService. 
Due to the limitation of Trinsic only allowing 50 credential exchanges (which include verification requests) there is also a MockService that can be used during development. When creating a verification request with Trinsic we are creating the policies at runtime instead of using a template in the Trinsic Studio, which makes it easier to change. In the policy below we require a certain list of attributes to be present in the credential and also restrict that the credential was issued by a certain issuer by supplying its DID. Otherwise any credential with the attributes present could be used for the verification request. public async Task<(string verificationId, string verificationUrl)> CreateVerificationRequest() { IList<VerificationPolicyAttributeContract> attributePolicies = new List<VerificationPolicyAttributeContract>() { new VerificationPolicyAttributeContract() { PolicyName = ""National Driver License Policy"", AttributeNames = new List<string>() { ""Issued At"", ""Name"", ""First Name"", ""Date of Birth"", ""License Type"" }, Restrictions = new List<VerificationPolicyRestriction>() { new VerificationPolicyRestriction { IssuerDid = _issuerDid, // Restrict by issuer identifier } } } }; // Optionally check if a revocable credential is valid at a given time var revocationRequirement = new VerificationPolicyRevocationRequirement() { ValidNow = true }; // Create the verification var verificationContract = await _credentialsServiceClient.CreateVerificationFromParametersAsync( new VerificationPolicyParameters { Name = ""Driver License Verification"", Version = ""1.0"", // Must follow Semantic Versioning scheme (https://semver.org), Attributes = attributePolicies, }); return (verificationId: verificationContract.VerificationId, verificationUrl: verificationContract.VerificationRequestUrl); } In our sample application, if a customer wants to sign up for a new insurance, we ask them to verify their driver license. 
During this step we call the Trinsic API to create a verification request. From the API call we get a URL that can be embedded inside a QR Code. This QR code can then be scanned by the credential holder inside of their wallet application and they can approve the request with their credential obtained from the issuer. Upon successful verification the form gets submitted to the backend for further processing. The goal of this blog post was to show how easy it is to work with verifiable credentials. If you read the docs and all the posts that are out there it might be overwhelming with all the terminology about blockchain and so on. I think Trinsic has done a good job of making this technology accessible to any developer. Yet there is, as always, room for improvement. More on this topic can be found on the Trinsic documentation page. If you like this blog post drop a comment or buy me a coffee at the bottom of the page.",https://ml-software.ch/posts/verifiying-verifiable-credentials-using-trinsic,,Post,,HowTo,,,,,,,,2021-04-13,,,,,,,,,,,,,
|
||
Microsoft,Microsoft,,ID Standards Blog,Alex Simons,,,,,,Announcing Azure AD Verifiable Credentials,We started on a journey with the open standards community to empower everyone to own and control their own identity. I’m thrilled to share that we’ve achieved a major milestone in making this vision real. Today we’re announcing that the public preview for Azure AD verifiable credentials is now available: organizations can empower users to control credentials that manage access to their information.,,https://techcommunity.microsoft.com/t5/azure-active-directory-identity/announcing-azure-ad-verifiable-credentials/ba-p/1994711,,Post,,Meta,,,,,,,Verifiable Credentials,2021-04-05,,,,,,,,,,,,,
|
||
Microsoft,Microsoft,,ID Standards Blog,Alex Simons,,,,,,Decentralized digital identities and blockchain: The future as we see it,"Over the last 12 months we’ve invested in incubating a set of ideas for using Blockchain (and other distributed ledger technologies) to create new types of digital identities, identities designed from the ground up to enhance Personal privacy, security and control. We’re pretty excited by what we’ve learned and by the new partnerships we’ve formed in the process. Today we’re taking the opportunity to share our thinking and direction with you. This blog is part of a series and follows on Peggy Johnson’s blog post announcing that Microsoft has joined the ID2020 initiative. If you haven’t already read Peggy’s post, I would recommend reading it first.",,https://techcommunity.microsoft.com/t5/azure-active-directory-identity/decentralized-digital-identities-and-blockchain-the-future-as-we/ba-p/1994714,,Post,,Meta,,,,,,Entra,,2021-02-18,,,,,,,,,,,,,
|
||
Microsoft,Microsoft,,,Alex Simons,,,,,,Decentralized digital identities and blockchain: The future as we see it,"Over the last 12 months we’ve invested in incubating a set of ideas for using Blockchain (and other distributed ledger technologies) to create new types of digital identities, identities designed from the ground up to enhance Personal privacy, security and control. We’re pretty excited by what we’ve learned and by the new partnerships we’ve formed in the process. Today we’re taking the opportunity to share our thinking and direction with you. This blog is part of a series and follows on Peggy Johnson’s blog post announcing that Microsoft has joined the ID2020 initiative. If you haven’t already read Peggy’s post, I would recommend reading it first.<br>","Decentralized digital identities and blockchain: The future as we see it Howdy folks, I hope you’ll find today’s post as interesting as I do. It’s a bit of brain candy and outlines an exciting vision for the future of digital identities. Over the last 12 months we’ve invested in incubating a set of ideas for using Blockchain (and other distributed ledger technologies) to create new types of digital identities, identities designed from the ground up to enhance Personal privacy, security and control. We’re pretty excited by what we’ve learned and by the new partnerships we’ve formed in the process. Today we’re taking the opportunity to share our thinking and direction with you. This blog is part of a series and follows on Peggy Johnson’s blog post announcing that Microsoft has joined the ID2020 initiative. If you haven’t already read Peggy’s post, I would recommend reading it first. I’ve asked Ankur Patel, the PM on my team leading these incubations to kick our discussion on Decentralized Identities off for us. His post focuses on sharing some of the core things we’ve learned and some of the resulting principles we’re using to drive our investments in this area going forward. 
And as always, we’d love to hear your thoughts and feedback. Best Regards, Alex Simons (Twitter: @Alex_A_Simons) Director of Program Management Microsoft Identity Division ———- Greetings everyone, I’m Ankur Patel from Microsoft’s Identity Division. It is an awesome privilege to have this opportunity to share some of our learnings and future directions based on our efforts to incubate Blockchain/distributed ledger based Decentralized Identities. What we see As many of you experience every day, the world is undergoing a global digital transformation where digital and physical reality are blurring into a single integrated modern way of living. This new world needs a new model for digital identity, one that enhances individual privacy and security across the physical and digital world. Microsoft’s cloud identity systems already empower thousands of developers, organizations and billions of people to work, play, and achieve more. And yet there is so much more we can do to empower everyone. We aspire to a world where the billions of people living today with no reliable ID can finally realize the dreams we all share like educating our children, improving our quality of life, or starting a business. To achieve this vision, we believe it is essential for individuals to own and control all elements of their digital identity. Rather than grant broad consent to countless apps and services, and have their identity data spread across numerous providers, individuals need a secure encrypted digital hub where they can store their identity data and easily control access to it. Each of us needs a digital identity we own, one which securely and privately stores all elements of our digital identity. This self-owned identity must be easy to use and give us complete control over how our identity data is accessed and used. We know that enabling this kind of self-sovereign digital identity is bigger than any one company or organization. 
We’re committed to working closely with our customers, partners and the community to unlock the next generation of digital identity-based experiences and we’re excited to partner with so many people in the industry who are making incredible contributions to this space. What we’ve learned To that end today we are sharing our best thinking based on what we’ve learned from our decentralized identity incubation, an effort which is aimed at enabling richer experiences, enhancing trust, and reducing friction, while empowering every person to own and control their Digital Identity. - Own and control your Identity. Today, users grant broad consent to countless apps and services for collection, use and retention beyond their control. With data breaches and identity theft becoming more sophisticated and frequent, users need a way to take ownership of their identity. After examining decentralized storage systems, consensus protocols, blockchains, and a variety of emerging standards we believe blockchain technology and protocols are well suited for enabling Decentralized IDs (DID). - Privacy by design, built in from the ground up. Today, apps, services, and organizations deliver convenient, predictable, tailored experiences that depend on control of identity-bound data. We need a secure encrypted digital hub (ID Hubs) that can interact with user’s data while honoring user privacy and control. - Trust is earned by individuals, built by the community. Traditional identity systems are mostly geared toward authentication and access management. A self-owned identity system adds a focus on authenticity and how community can establish trust. In a decentralized system trust is based on attestations: claims that other entities endorse – which helps prove facets of one’s identity. - Apps and services built with the user at the center. 
Some of the most engaging apps and services today are ones that offer experiences Personalized for their users by gaining access to their user’s Personally Identifiable Information (PII). DIDs and ID Hubs can enable developers to gain access to a more precise set of attestations while reducing legal and compliance risks by processing such information, instead of controlling it on behalf of the user. - Open, interoperable foundation. To create a robust decentralized identity ecosystem that is accessible to all, it must be built on standard, open source technologies, protocols, and reference implementations. For the past year we have been participating in the Decentralized Identity Foundation (DIF) with individuals and organizations who are similarly motivated to take on this challenge. We are collaboratively developing the following key components: - Decentralized Identifiers (DIDs) – a W3C spec that defines a common document format for describing the state of a Decentralized Identifier - Identity Hubs – an encrypted identity datastore that features message/intent relay, attestation handling, and identity-specific compute endpoints. - Universal DID Resolver – a server that resolves DIDs across blockchains - Verifiable Credentials – a W3C spec that defines a document format for encoding DID-based attestations. - Ready for world scale: To support a vast world of users, organizations, and devices, the underlying technology must be capable of scale and performance on par with traditional systems. Some public blockchains (Bitcoin [BTC], Ethereum, Litecoin, to name a select few) provide a solid foundation for rooting DIDs, recording DPKI operations, and anchoring attestations. While some blockchain communities have increased on-chain transaction capacity (e.g. blocksize increases), this approach generally degrades the decentralized state of the network and cannot reach the millions of transactions per second the system would generate at world-scale. 
To overcome these technical barriers, we are collaborating on decentralized Layer 2 protocols that run atop these public blockchains to achieve global scale, while preserving the attributes of a world class DID system. - Accessible to everyone: The blockchain ecosystem today is still mostly early adopters who are willing to spend time, effort, and energy managing keys and securing devices. This is not something we can expect mainstream people to deal with. We need to make key management challenges, such as recovery, rotation, and secure access, intuitive and fool-proof. Our next steps New systems and big ideas often make sense on a whiteboard. All the lines connect, and assumptions seem solid. However, product and engineering teams learn the most by shipping. Today, the Microsoft Authenticator app is already used by millions of people to prove their identity every day. As a next step we will experiment with Decentralized Identities by adding support for them into Microsoft Authenticator. With consent, Microsoft Authenticator will be able to act as your User Agent to manage identity data and cryptographic keys. In this design, only the ID is rooted on chain. Identity data is stored in an off-chain ID Hub (that Microsoft can’t see) encrypted using these cryptographic keys. Once we have added this capability, apps and services will be able to interact with users’ data using a common messaging conduit by requesting granular consent. Initially we will support a select group of DID implementations across blockchains and we will likely add more in the future. Looking ahead We are humbled and excited to take on such a massive challenge, but also know it can’t be accomplished alone. We are counting on the support and input of our alliance partners, members of the Decentralized Identity Foundation, and the diverse Microsoft ecosystem of designers, policy makers, business partners, hardware and software builders. 
Most importantly we will need you, our customers, to provide feedback as we start testing this first set of scenarios. This is our first post about our work on Decentralized Identity. In upcoming posts we will share information about our proofs of concept as well as technical details for key areas outlined above. We look forward to you joining us on this venture! Key resources: - Follow us at @AzureAD on Twitter - Get involved with Decentralized Identity Foundation (DIF) - Participate in W3C Credentials Community Group Regards, Ankur Patel (@_AnkurPatel) Principal Program Manager Microsoft Identity Division",https://www.microsoft.com/en-us/microsoft-365/blog/2018/02/12/decentralized-digital-identities-and-blockchain-the-future-as-we-see-it/,,Post,,Meta,,,,,,,,2018-02-12,,,,,,,,,,,,,
|
||
Microsoft,BitcoinMagazine,,,GIULIO PRISCO,Blockstack; Consensys; ID2020; uPort,,,,,"Microsoft Building Open Blockchain-Based Identity System With Blockstack, ConsenSys","The Microsoft strategist said that the Redmond, Washington, giant is working with Blockstack Labs and ConsenSys to leverage their current Bitcoin and Ethereum-based identity solutions, Blockstack and uPort. Through this open source collaboration, Microsoft and its partners intend to produce a cross-chain identity solution that can be extended to any future blockchains or new kinds of decentralized, distributed systems. In the coming weeks an open-source framework for developers will be made available on Azure.","Microsoft Building Open Blockchain-Based Identity System With Blockstack, ConsenSys Microsoft has announced that it is collaborating with Blockstack Labs, ConsenSys and developers across the globe on an open source, self-sovereign, blockchain-based identity system that allows people, products, apps and services to interoperate across blockchains, cloud providers and organizations. The United Nation's Sustainable Development Goals include giving everyone a legal identity by 2030. As a first step, the U.N. wants to develop scalable identity systems by 2020. The inaugural ""ID2020 Summit ‒ Harnessing Digital Identity for the Global Community,"" held at the United Nations headquarters in New York on May 20, brought together policymakers and technology companies to develop an action plan. “While we don’t profess to have solutions to these overwhelming problems today, we can start where the open source community is best: collaboration,” said Yorke Rhodes III, blockchain business strategist at Microsoft. 
“To progress toward these goals, we have been working with partners to address identity using the self-owned or self-sovereign qualities of blockchain technology.” The Microsoft strategist said that the Redmond, Washington, giant is working with Blockstack Labs and ConsenSys to leverage their current Bitcoin and Ethereum-based identity solutions, Blockstack and uPort. Through this open source collaboration, Microsoft and its partners intend to produce a cross-chain identity solution that can be extended to any future blockchains or new kinds of decentralized, distributed systems. In the coming weeks an open-source framework for developers will be made available on Azure. Blockstack ‒ an open source blockchain application stack ‒ permits building decentralized, serverless apps by plugging into Blockstack's services for identity, naming, storage and authentication. According to the Blockstack team, Blockstack is the largest, most popular blockchain identity system, with 50,000 registered identities that come with profiles and globally unique names. Identities can be registered for people, companies, websites, software packages and more. Profiles can contain both private and public information, which is attested to by the user and can be verified by peers and select authorities. “Microsoft will make it easy to deploy new Blockstack servers and infrastructure on the Azure cloud and plans to integrate Blockstack with some internal systems for identity and authentication,” notes the Blockstack blog. “With the Blockstack technology users are in complete control of their usernames and data and don’t need to trust any third party for their information. 
We appreciate Microsoft’s commitment to making the internet a more secure and user-centric place and to promote open-source software development.” In November Bitcoin Magazine reported that Microsoft had partnered with ConsenSys, a blockchain startup focused on Ethereum technology, founded in October 2014 by Ethereum Foundation’s co-founder Joseph Lubin. In December, Microsoft and ConsenSys announced Ethereum Blockchain as a Service (EBaaS) on Microsoft Azure, to provide a single-click cloud-based blockchain developer environment to Azure Enterprise clients and developers. In October, ConsenSys revealed that it was working on an identity management system called uPort. “[We] have started to integrate an ID and persona construct across all of our dApps,” noted the ConsenSys blog. “Soon a uPort persona will enable access to any dApp ConsenSys or other developers build. ConsenSys has begun efforts to work with various partners towards standardization of these components.” The company added that user-owned ID and data will be crucial for realizing the compelling vision of Web 3.0. “We’re also collaborating with ConsenSys on a cross-blockchain solution for global namespaces,” notes the Blockstack blog. “We believe that a global identity system should not be dependent on any particular blockchain and users should be able to migrate from one blockchain to another, if needed. Along these lines, we plan to work with ConsenSys to add Ethereum support to the Blockstack server.” Redmond Magazine notes that there are many unofficial identity systems in the social media world, including the systems operated by Google, Facebook and Microsoft itself, as well as various emerging blockchain-based platforms that have been proposed for the online world. But the U.N. and the companies that participated in the inaugural ID2020 Summit are more ambitious: They want to develop globally recognized identity systems for the real world. 
One-fifth of the world’s population ‒ one and a half billion people ‒ are without proper identification, and 50 million children are born every year without a birth certificate and a legal identity. These numbers are growing, which underlines the importance of the U.N. goal of giving everyone on the planet a solid and tamper-proof digital identity based on common, interoperable standards. According to John Farmer, director of technology and civic innovation at Microsoft, blockchain technology can offer three key features to an identity system: It's an immutable, trustless, and transparent agreed-upon network. “[We] can imagine a world where an individual can register their identity in a cross blockchain fashion, providing a single namespace for lookup regardless of blockchain of choice,” concludes the Microsoft announcement. “[We are] excited by the potential societal benefits that can be derived from an identity that transcends borders, blockchains, organizations and companies.”",https://bitcoinmagazine.com/articles/microsoft-building-open-blockchain-based-identity-system-with-blockstack-consensys-1464968713/,,Post,,Meta,,,,,,"Bitcoin,Ethereum",,2016-06-03,,,,,,,,,,,,,
|
||
Microsoft,BusinessInsider,,,Isobel Asher Hamilton,,,,,,Microsoft is quietly testing a project that aims to hand people complete control over their online data,"Foley reported that Bali's ""about"" page described itself as a ""new Personal data bank which puts users in control of all data collected about them... The bank will enable users to store all data (raw and inferred) generated by them. It will allow the user to visualize, manage, control, share and monetize the data.""<br><br>It also cited the concept of ""inverse privacy,"" a paper published by Microsoft researchers in 2014. It's the idea that someone else has access to your online data, but you don't.","- Microsoft is quietly working on a project codenamed ""Bali,"" which could give users much more control over their Personal data. - Bali was first spotted by a Twitter user, and reporters then found what looked like the project's website. - The website described Bali as a ""new Personal data bank which puts users in control of all data collected about them."" - When Business Insider tried to access the site, it had vanished. Microsoft is working on a research project which could give customers vast control over their Personal online data. Microsoft has been quietly testing the idea and even launched a beta website, according to reports. It comes at a time when privacy is high on the agenda following a series of scandals, including Facebook's Cambridge Analytica data breach last year. Reporters first got wind of the project from a tweet. Twitter user ""Longhorn"" said on Wednesday: ""Microsoft Bali is a project that can delete all your connection and account information (inverseprivacyproject). It's currently in private beta still."" ZDNet journalist Mary Jo Foley then found what looked like the Bali website. The site reportedly required a code to sign in, but visitors could request a code. 
PC Magazine also appears to have visited the site, but when Business Insider followed the link, the website failed to load. Foley reported that Bali's ""about"" page described itself as a ""new Personal data bank which puts users in control of all data collected about them... The bank will enable users to store all data (raw and inferred) generated by them. It will allow the user to visualize, manage, control, share and monetize the data."" It also cited the concept of ""inverse privacy,"" a paper published by Microsoft researchers in 2014. It's the idea that someone else has access to your online data, but you don't. Business Insider contacted Microsoft for comment.",https://www.businessinsider.com/microsoft-working-on-project-bali-to-give-people-control-over-data-2019-1,,Post,,Meta,,,,,,,,2019-01-04,,,,,,,,,,,,,
|
||
Microsoft,Wired,,,,,,,,,Microsoft's Dream of Decentralized IDs Enters the Real World,"“Beyond just controlling access, developers can further secure user data by encrypting that data using keys from their decentralized identifiers,” a Microsoft spokesperson told WIRED in a statement. “Based on such an approach, a bad actor may gain access to a system or datastore but can’t decrypt the data without keys that reside with the individual user.”","For years, tech companies have touted blockchain technology as a means to develop identity systems that are secure and decentralized. The goal is to build a platform that could store information about official data without holding the actual documents or details themselves. Instead of just storing a scan of your birth certificate, for example, a decentralized ID platform might store a validated token that confirms the information in it. Then when you get carded at a bar or need proof of citizenship, you could share those pre-verified credentials instead of the actual document or data. Microsoft has been one of the leaders of this pack—and is now detailing tangible progress toward its vision of a decentralized digital ID. At its Ignite conference today, Microsoft announced that it will launch a public preview of its “Azure Active Directory verifiable credentials” this spring. Think of the platform as a digital wallet like Apple Pay or Google Pay, but for identifiers rather than credit cards. Microsoft is starting with things like university transcripts, diplomas, and professional credentials, letting you add them to its Microsoft Authenticator app along with two-factor codes. It's already testing the platform at Keio University in Tokyo, with the government of Flanders in Belgium, and with the United Kingdom's National Health Service. 
""If you have a decentralized identifier I can verify, say, where you went to school, and I don’t need you to send me all of the information,"" says Joy Chik, corporate vice president for Microsoft's cloud and enterprise identity division. “All I need is to get that digital credential and because it’s already been verified I can trust it."" Microsoft will release a software development kit in the coming weeks that organizations can use to start building applications that issue and request credentials. And long-term, the company says, it hopes the system could be used around the world for everything from renting an apartment to establishing identity for refugees who are struggling without documents—a dream of virtually all decentralized identification efforts. In the NHS pilot, for example, health care providers can request access to professional certifications from existing NHS health care workers, who can in turn choose to allow that access, streamlining a process for transferring to another facility that previously required a much more involved back and forth. Under Microsoft's setup, you can also revoke access to your credentials if the recipient no longer needs access. “In the NHS system, at each hospital health care workers go to, it used to take months of effort to verify their credentials before they could practice,"" Chik says. “Now it literally takes five minutes to be enrolled in the hospital and starting to treat patients."" A big hurdle to widespread adoption of a decentralized ID scheme has been interoperability. Having 10 competing frameworks out there wouldn't make things easier for anyone. Currently there are some potential competitors, like an offering from Mastercard that's still in testing. Microsoft's ubiquity potentially makes it a good candidate to rally a critical mass of users. 
With this in mind, the company developed Azure Active Directory verifiable credentials off of open authentication standards, like the World Wide Web Consortium's WebAuthN. That should make it easier for customers to adopt the platform, and for other tech giants to support its use in their products as well. Currently, Microsoft is working with digital identity partners Acuant, Au10tix, Idemia, Jumio, Socure, Onfido, and Vu Security to pilot the platform, and Chik says the goal is to expand that list quickly over time.",https://www.wired.com/story/microsoft-decentralized-id-blockchain/,,Post,,Meta,,,,,,,,2021-03-02,,,,,,,,,,,,,
|
||
Microsoft,Microsoft,,,Peggy Johnson,ID2020,,,,,Partnering for a path to digital identity,"As discussions begin this week at the World Economic Forum, creating universal access to identity is an issue at the top of Microsoft’s agenda, and we think technology can be a powerful tool to tackle this challenge. It was last summer that Microsoft took a first step, collaborating with Accenture and Avanade on a blockchain-based identity prototype on Microsoft Azure. Together, we pursued this work in support of the ID2020 Alliance – a global public-private partnership dedicated to aiding the 1.1 billion people around the world who lack any legal form of identity. To say that we were encouraged by its mission would be an understatement. We were inspired by it.","In the U.S. and abroad, fundamental rights and services like voting, healthcare, housing and education are tethered to legal proof of identification – you can’t participate if you don’t have it. Yet nearly one in six people worldwide – the majority of them being women, children and refugees – live without it. The lack of legal documentation not only strips access to critical services, it puts those trapped in the “identity gap” at risk for larger issues including displacement and child trafficking. As discussions begin this week at the World Economic Forum, creating universal access to identity is an issue at the top of Microsoft’s agenda, and we think technology can be a powerful tool to tackle this challenge. It was last summer that Microsoft took a first step, collaborating with Accenture and Avanade on a blockchain-based identity prototype on Microsoft Azure. Together, we pursued this work in support of the ID2020 Alliance – a global public-private partnership dedicated to aiding the 1.1 billion people around the world who lack any legal form of identity. To say that we were encouraged by its mission would be an understatement. We were inspired by it. 
Today, we are excited to share that we are deepening our commitment to this issue by formally joining ID2020 as a founding member. In addition to a donation of $1 million, we will commit resources and expertise to further develop a secure, portable form of digital identity and help implement it across governments and agencies. In the coming months, Microsoft, our partners in the ID2020 Alliance, and developers around the globe will collaborate on an open source, self-sovereign, blockchain-based identity system that allows people, products, apps and services to interoperate across blockchains, cloud providers and organizations. We will lend the technical expertise of our Identity team to provide guidance as the project scales, empowering people with direct consent over who has access to their Personal information, and when to release and share data. We will also help establish standards that ensure this work is impactful and scalable. Our shared ambition with ID2020 is to start piloting this solution in the coming year to bring it to those who need it most, beginning with refugee populations. Amid a growing refugee crisis, we believe technology can play a powerful role when put in the hands of displaced people and the organizations that are supporting them. Over the last two years, Microsoft Philanthropies has donated $33 million in technology and funding to organizations that aid refugees and empower them to rebuild their lives. Closing the identity gap is an enormous challenge. It will take the work of many committed people and organizations coming together across different geographies, sectors and technologies. But it’s exciting to imagine a world where safe and secure digital identities are possible, providing everyone with an essential building block to every right and opportunity they deserve.",https://blogs.microsoft.com/blog/2018/01/22/partnering-for-a-path-to-digital-identity/,,Post,,Meta,,,,,,,,2018-01-22,,,,,,,,,,,,,
|
||
Microsoft,Newswire CA,,,,,,,,,Credivera Joins Microsoft Partner Network as Verifiable Credentials Provider,"Credivera, a global leader in the secure, open exchange of verifiable credentials and digital identity solutions, today announced that it has joined the Microsoft Partner Network. In addition, it has been selected by Microsoft as a Microsoft Entra Verified ID solution provider. Credivera joins a list of internationally based companies in the Microsoft Partner Network who are leading the development of innovative digital identity tools, empowering individuals to completely own and control their unique digital identity. ","Jul 26, 2022, 13:00 ET CALGARY, AB, July 26, 2022 /CNW/ - Credivera, a global leader in the secure, open exchange of verifiable credentials and digital identity solutions, today announced that it has joined the Microsoft Partner Network. In addition, it has been selected by Microsoft as a Microsoft Entra Verified ID solution provider. Credivera joins a list of internationally based companies in the Microsoft Partner Network who are leading the development of innovative digital identity tools, empowering individuals to completely own and control their unique digital identity. Recent market conditions, such as the emerging world of decentralized identity, the remote nature of today's global workforce, and the troubling increase in widespread identity theft, uniquely position Credivera as a trusted source of truth, supporting businesses and enterprises everywhere as they look to automate the verification of identity credentials for their workforce. ""We are in the business of verifiable career credentials and today's announcement is a major milestone for the entire Credivera team as we respond to the urgent demand for trusted digital identity and open standard solutions that enable secure, private information sharing. 
We're excited to represent Canada on a global stage within the Microsoft Partner Network alongside an esteemed list of companies and will continue to deliver innovative digital identity solutions for the workforce that return power and control into the hands of the individual, allowing each of us to own what we know and share what we want."" said Dan Giurescu, Credivera co-founder and Chief Executive Officer. Credivera's technology platform is built using Microsoft Azure SQL Database, Azure Active Directory, and is integrated with Microsoft Dynamics 365 Business Central and Power BI. Credivera also integrates with third-party HR and Safety programs, meaning that an individual's digital credentials, that are available in a Credivera digital wallet, are always accessible, always on, and always true for multiple contexts and scenarios. Beyond the advantages for individuals, key organizational benefits of the solution include enhanced systems productivity, a scalable global reach, definitive trust in fraud-free, valid workforce credentials, and eliminating any possibility of liability and risk. To learn more about how our digital identity verifications solutions can work for you, visit credivera.com/the-exchange/verifiable-credentials. To learn more about the Microsoft Partner Network, please visit partner.Microsoft.com. TerraHub Technologies Inc., known as Credivera commercially, is the world's first secure, open exchange for verifiable credentials. A leader in workforce management and digital identity, Credivera gives employees, employers, and organizations that issue credentials increased productivity and control of how important credentials are stored and shared. The Credivera Exchange optimizes Personal privacy and trust with up-to-date verifiable credentials secured in a digital wallet, resulting in reduced risk for all. 
Founded in 2017, with offices in Toronto and Calgary, Credivera supports regulated industries and global technology firms in over 30 countries worldwide.",https://www.newswire.ca/news-releases/credivera-joins-microsoft-partner-network-as-verifiable-credentials-provider-857742185.html,,Press,,Meta,,,,,,,,2022-07-26,,,,,,,,,,,,,
|
||
Microsoft,Microsoft,,,,Condatis,,,,,Condatis revolutionizes staff management with Microsoft Entra Verified ID,"At Edinburgh-based Condatis, as more employees transition from a hybrid work model to a full return to the office, they’re being greeted by a new, intuitive sign-in experience built on virtual, verifiable credentials that provide value-added access to office spaces and services. Whether someone is being onboarded, coming in as a temporary hire, or visiting a staff member, each person will see that some doors in the office will be open for them, and others won’t.",,https://customers.microsoft.com/en-us/story/1508854534910834689-condatis-partner-professional-services-entra-verified-id,,Testimonial,,Meta,,,,,,,Verifiable Credentials,2023-01-01,,,,,,,,,,,,,
|
||
Microsoft,Microsoft,,,,,,,,,Decentralized Identity Own and control your identity,"Microsoft cloud identity systems already empower developers, organizations, and billions of people to work, play, and achieve more, but there’s so much more we can do to create a world where each of us, even in displaced populations, can pursue our life goals, including educating our children, improving our quality of life, and starting a business. To achieve this vision, we need to augment existing cloud identity systems with one that individuals, organizations, and devices can own so they can control their digital identity and data. This self-owned identity must seamlessly integrate into our daily lives, providing complete control over what we share and with whom we share it, and—when necessary—provide the ability to take it back. Instead of granting broad consent to countless apps and services and spreading their identity data across numerous providers, individuals need a secure, encrypted digital hub where they can store their identity data and easily control access to it.",,https://query.prod.cms.rt.microsoft.com/cms/api/am/binary/re2djfy,,Whitepaper,,Meta,,,,,,,,2018,,,,,,,,,,,,,
Microsoft,Microsoft,,Microsoft Entra Verified ID documentation,,,,,,,Azure AD Verifiable Credentials architecture overview (preview),"It’s important to plan your verifiable credential solution so that in addition to issuing and/or validating credentials, you have a complete view of the architectural and business impacts of your solution. If you haven’t reviewed them already, we recommend you review Introduction to Microsoft Entra Verified ID and the FAQs, and then complete the Getting Started tutorial.<br><br>This architectural overview introduces the capabilities and components of the Microsoft Entra Verified ID service. ","Microsoft Entra Verified ID architecture overview Note Azure Active Directory Verifiable Credentials is now Microsoft Entra Verified ID and part of the Microsoft Entra family of products. Learn more about the Microsoft Entra family of identity solutions and get started in the unified Microsoft Entra admin center. It’s important to plan your verifiable credential solution so that in addition to issuing and/or validating credentials, you have a complete view of the architectural and business impacts of your solution. If you haven’t reviewed them already, we recommend you review Introduction to Microsoft Entra Verified ID and the FAQs, and then complete the Getting Started tutorial. This architectural overview introduces the capabilities and components of the Microsoft Entra Verified ID service. For more detailed information on issuance and validation, see Plan your Microsoft Entra Verified ID issuance solution and Plan your Microsoft Entra Verified ID verification solution. Approaches to identity: Today most organizations use centralized identity systems to provide employees with credentials. They also use various methods to bring customers, partners, vendors, and relying parties into the organization’s trust boundaries. These methods include federation, creating and managing guest accounts with systems like Azure AD B2B, and creating explicit trusts with relying parties. 
Most business relationships have a digital component, so enabling some form of trust between organizations requires significant effort. Centralized identity systems Centralized approaches still work well in many cases, such as when applications, services, and devices rely on the trust mechanisms used within a domain or trust boundary. In centralized identity systems, the identity provider (IDP) controls the lifecycle and usage of credentials. However, there are scenarios where using a decentralized architecture with verifiable credentials can provide value by augmenting key scenarios such as secure onboarding of employees’ and others’ identities, including remote scenarios. access to resources inside the organizational trust boundary based on specific criteria. accessing resources outside the trust boundary, such as accessing partners’ resources, with a portable credential issued by the organization. Decentralized identity systems In decentralized identity systems, control of the lifecycle and usage of the credentials is shared between the issuer, the holder, and relying party consuming the credential. Consider the scenario in the diagram below where Proseware, an e-commerce website, wants to offer Woodgrove employees corporate discounts. Terminology for verifiable credentials (VCs) might be confusing if you're not familiar with VCs. The following definitions are from the Verifiable Credentials Data Model 1.0 terminology section. After each, we relate them to entities in the preceding diagram. “An issuer is a role an entity can perform by asserting claims about one or more subjects, creating a verifiable credential from these claims, and transmitting the verifiable credential to a holder.” - In the preceding diagram, Woodgrove is the issuer of verifiable credentials to its employees. “A holder is a role an entity might perform by possessing one or more verifiable credentials and generating presentations from them. 
A holder is usually, but not always, a subject of the verifiable credentials they are holding. Holders store their credentials in credential repositories.” - In the preceding diagram, Alice is a Woodgrove employee. She obtained a verifiable credential from the Woodgrove issuer and is the holder of that credential. “A verifier is a role an entity performs by receiving one or more verifiable credentials, optionally inside a verifiable presentation for processing. Other specifications might refer to this concept as a relying party.” - In the preceding diagram, Proseware is a verifier of credentials issued by Woodgrove. “A credential is a set of one or more claims made by an issuer. A verifiable credential is a tamper-evident credential that has authorship that can be cryptographically verified. Verifiable credentials can be used to build verifiable presentations, which can also be cryptographically verified. The claims in a credential can be about different subjects.” “A decentralized identifier is a portable URI-based identifier, also known as a DID, associated with an entity. These identifiers are often used in a verifiable credential and are associated with subjects, issuers, and verifiers.” - In the preceding diagram, the public keys of the actors’ DIDs are made available via the trust system (Web or ION). “A decentralized identifier document, also referred to as a DID document, is a document that is accessible using a verifiable data registry and contains information related to a specific decentralized identifier, such as the associated repository and public key information.” In the scenario above, both the issuer and verifier have a DID, and a DID document. The DID document contains the public key, and the list of DNS web domains associated with the DID (also known as linked domains). 
Woodgrove (issuer) signs their employees’ VCs with its private key; similarly, Proseware (verifier) signs requests to present a VC using its key, which is also associated with its DID. A trust system is the foundation in establishing trust between decentralized systems. It can be a distributed ledger or it can be something centralized, such as DID Web. “A distributed ledger is a non-centralized system for recording events. These systems establish sufficient confidence for participants to rely upon the data recorded by others to make operational decisions. They typically use distributed databases where different nodes use a consensus protocol to confirm the ordering of cryptographically signed transactions. The linking of digitally signed transactions over time often makes the history of the ledger effectively immutable.” - The Microsoft solution uses the Identity Overlay Network (ION) to provide decentralized public key infrastructure (PKI) capability. As an alternative to ION, Microsoft also offers DID Web as the trust system. Combining centralized and decentralized identity architectures When you examine a verifiable credential solution, it's helpful to understand how decentralized identity architectures can be combined with centralized identity architectures to provide a solution that reduces risk and offers significant operational benefits. The user journey This architectural overview follows the journey of a job candidate and employee, who applies for and accepts employment with an organization. It then follows the employee and organization through changes where verifiable credentials can augment centralized processes. Actors in these use cases Alice, a person applying for and accepting employment with Woodgrove, Inc. Woodgrove, Inc, a fictitious company. Adatum, Woodgrove’s fictitious identity verification partner. Proseware, Woodgrove’s fictitious partner organization. Woodgrove uses both centralized and decentralized identity architectures. 
Steps in the user journey Alice applying for, accepting, and onboarding to a position with Woodgrove, Inc. Accessing digital resources within Woodgrove’s trust boundary. Accessing digital resources outside of Woodgrove’s trust boundary without extending Woodgrove or partners’ trust boundaries. As Woodgrove continues to operate its business, it must continually manage identities. The use cases in this guidance describe how Alice can use self-service functions to obtain and maintain their identifiers and how Woodgrove can add, modify, and end business-to-business relationships with varied trust requirements. These use cases demonstrate how centralized identities and decentralized identities can be combined to provide a more robust and efficient identity and trust strategy and lifecycle. User journey: Onboarding to Woodgrove Awareness: Alice is interested in working for Woodgrove, Inc. and visits Woodgrove’s career website. Activation: The Woodgrove site presents Alice with a method to prove their identity by prompting them with a QR code or a deep link to visit its trusted identity proofing partner, Adatum. Request and upload: Adatum requests proof of identity from Alice. Alice takes a selfie and a driver’s license picture and uploads them to Adatum. Issuance: Once Adatum verifies Alice’s identity, Adatum issues Alice a verifiable credential (VC) attesting to their identity. Presentation: Alice (the holder and subject of the credential) can then access the Woodgrove career portal to complete the application process. When Alice uses the VC to access the portal, Woodgrove takes the roles of verifier and the relying party, trusting the attestation from Adatum. Distributing initial credentials Alice accepts employment with Woodgrove. As part of the onboarding process, an Azure Active Directory (AD) account is created for Alice to use inside of the Woodgrove trust boundary. 
Alice’s manager must figure out how to enable Alice, who works remotely, to receive initial sign-in information in a secure way. In the past, the IT department might have provided those credentials to their manager, who would print them and hand them to Alice. This doesn’t work with remote employees. VCs can add value to centralized systems by augmenting the credential distribution process. Instead of needing the manager to provide credentials, Alice can use their VC as proof of identity to receive their initial username and credentials for centralized systems access. Alice presents the proof of identity they added to their wallet as part of the onboarding process. In the onboarding use case, the trust relationship roles are distributed between the issuer, the verifier, and the holder. The issuer is responsible for validating the claims that are part of the VC they issue. Adatum validates Alice’s identity to issue the VC. In this case, VCs are issued without the consideration of a verifier or relying party. The holder possesses the VC and initiates the presentation of the VC for verification. Only Alice can present the VCs she holds. The verifier accepts the claims in the VC from issuers they trust and validate the VC using the decentralized ledger capability described in the verifiable credentials data model. Woodgrove trusts Adatum’s claims about Alice’s identity. By combining centralized and decentralized identity architectures for onboarding, privileged information about Alice necessary for identity verification, such as a government ID number, need not be stored by Woodgrove, because they trust that Alice’s VC issued by the identity verification partner (Adatum) confirms their identity. Duplication of effort is minimized, and a programmatic and predictable approach to initial onboarding tasks can be implemented. User journey: Accessing resources inside the trust boundary As an employee, Alice is operating inside of the trust boundary of Woodgrove. 
Woodgrove acts as the identity provider (IDP) and maintains complete control of the identity and the configuration of the apps Alice uses to interact within the Woodgrove trust boundary. To use resources in the Azure AD trust boundary, Alice provides potentially multiple forms of proof of identification to sign in to Woodgrove’s trust boundary and access the resources inside of Woodgrove’s technology environment. This is a typical scenario that is well served using a centralized identity architecture. Woodgrove manages the trust boundary and, using good security practices, provides the least-privileged level of access to Alice based on the job performed. To maintain a strong security posture, and potentially for compliance reasons, Woodgrove must also be able to track employees’ permissions and access to resources and must be able to revoke permissions when the employment is terminated. Alice only uses the credential that Woodgrove maintains to access Woodgrove resources. Alice has no need to track when the credential is used since the credential is managed by Woodgrove and only used with Woodgrove resources. The identity is only valid inside of the Woodgrove trust boundary when access to Woodgrove resources is necessary, so Alice has no need to possess the credential. Using VCs inside the trust boundary: Individual employees have changing identity needs, and VCs can augment centralized systems to manage those changes. While employed by Woodgrove, Alice might need to gain access to resources based on meeting specific requirements. For example, when Alice completes privacy training, she can be issued a new employee VC with that claim, and that VC can be used to access restricted resources. VCs can be used inside of the trust boundary for account recovery. For example, if the employee has lost their phone and computer, they can regain access by getting a new VC from the identity verification service trusted by Woodgrove, and then use that VC to get new credentials. 
User journey: Accessing external resources Woodgrove negotiates a product purchase discount with Proseware. All Woodgrove employees are eligible for the discount. Woodgrove wants to provide Alice a way to access Proseware’s website and receive the discount on products purchased. If Woodgrove uses a centralized identity architecture, there are two approaches to providing Alice the discount: Alice could provide Personal information to create an account with Proseware, and then Proseware would have to verify Alice’s employment with Woodgrove. Woodgrove could expand their trust boundary to include Proseware as a relying party and Alice could use the extended trust boundary to receive the discount. With decentralized identifiers, Woodgrove can provide Alice with a verifiable credential (VC) that Alice can use to access Proseware’s website and other external resources. By providing Alice the VC, Woodgrove is attesting that Alice is an employee. Woodgrove is a trusted VC issuer in Proseware’s validation solution. This trust in Woodgrove’s issuance process allows Proseware to electronically accept the VC as proof that Alice is a Woodgrove employee and provide Alice the discount. As part of validation of the VC Alice presents, Proseware checks the validity of the VC by using the trust system. In this solution: Woodgrove enables Alice to provide Proseware proof of employment without Woodgrove having to extend its trust boundary. Proseware doesn’t need to expand their trust boundary to validate Alice is an employee of Woodgrove. Proseware can use the VC that Woodgrove provides instead. Because the trust boundary isn’t expanded, managing the trust relationship is easier, and Proseware can easily end the relationship by not accepting the VCs anymore. Alice doesn’t need to provide Proseware Personal information, such as an email. Alice maintains the VC in a wallet application on a Personal device. 
The only person that can use the VC is Alice, and Alice must initiate usage of the credential. Each usage of the VC is recorded by the wallet application, so Alice has a record of when and where the VC is used. By combining centralized and decentralized identity architectures for operating inside and outside of trust boundaries, complexity and risk can be reduced and limited relationships become easier to manage. Changes over time Woodgrove will add and end business relationships with other organizations and will need to determine when centralized and decentralized identity architectures are used. By combining centralized and decentralized identity architectures, the responsibility and effort associated with identity and proof of identity is distributed, risk is reduced, and the user doesn't risk releasing their private information as often or to as many unknown verifiers. Specifically: - In centralized identity architectures, the IDP issues credentials and performs verification of those issued credentials. Information about all identities is processed by the IDP, either storing them in or retrieving them from a directory. IDPs may also dynamically accept security tokens from other IDP systems, such as social sign-ins or business partners. For a relying party to use identities in the IDP trust boundary, they must be configured to accept the tokens issued by the IDP. How decentralized identity systems work In decentralized identity architectures, the issuer, user, and relying party (RP) each have a role in establishing and ensuring ongoing trusted exchange of each other’s credentials. The public keys of the actors’ DIDs are resolvable via the trust system, which allows signature validation and therefore trust of any artifact, including a verifiable credential. Relying parties can consume verifiable credentials without establishing trust relationships with the issuer. Instead, the issuer provides the subject a credential to present as proof to relying parties. 
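The statement above that the public keys of the actors’ DIDs are resolvable via the trust system can be illustrated for the did:web case. Below is a hedged sketch based on the did:web method specification rather than Microsoft’s implementation; it only handles the simple domain and path identifiers of the kind shown in this article.

```python
# Sketch: resolving a did:web identifier to the URL of its DID document,
# where a verifier or wallet would fetch the public keys and linked domains.
# Based on the did:web method spec; ports and percent-encoding are omitted.
def did_web_to_url(did: str) -> str:
    prefix = 'did:web:'
    if not did.startswith(prefix):
        raise ValueError('not a did:web identifier')
    parts = did[len(prefix):].split(':')
    domain = parts[0]
    if len(parts) == 1:
        # Bare domain: the DID document lives under /.well-known/
        return f'https://{domain}/.well-known/did.json'
    # Additional colon-separated segments map to URL path components
    path = '/'.join(parts[1:])
    return f'https://{domain}/{path}/did.json'

print(did_web_to_url('did:web:contoso.com'))
# prints https://contoso.com/.well-known/did.json
```

Once the DID document is fetched from that URL, the wallet can check the request signature against the keys it contains and confirm the linked domains match the DNS domain that served the document.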
All messages between actors are signed with the actor’s DID; DIDs from issuers and verifiers also need to own the DNS domains that generated the requests. For example: When VC holders need to access a resource, they must present the VC to that relying party. They do so by using a wallet application to read the RP’s request to present a VC. As a part of reading the request, the wallet application uses the RP’s DID to find the RP’s public keys using the trust system, validating that the request to present the VC hasn't been tampered with. The wallet also checks that the DID is referenced in a metadata document hosted in the DNS domain of the RP, to prove domain ownership. Flow 1: Verifiable credential issuance. In this flow, the credential holder interacts with the issuer to request a verifiable credential as illustrated in the following diagram. The holder starts the flow by using a browser or native application to access the issuer’s web frontend. There, the issuer website drives the user to collect data and executes issuer-specific logic to determine whether the credential can be issued, and its content. The issuer web frontend calls the Entra Verified ID service to generate a VC issuance request. The web frontend renders a link to the request as a QR code or a device-specific deep link (depending on the device). The holder scans the QR code or deep link from step 3 using a wallet app such as Microsoft Authenticator. The wallet downloads the request from the link. The request includes: DID of the issuer. This is used by the wallet app to resolve via the trust system to find the public keys and linked domains. URL with the VC manifest, which specifies the contract requirements to issue the VC. This can include id_token, self-attested attributes that must be provided, or the presentation of another VC. Look and feel of the VC (URL of the logo file, colors, etc.). 
The wallet validates the issuance requests and processes the contract requirements: Validates that the issuance request message is signed by the issuer’s keys found in the DID document resolved via the trust system. This ensures that the message hasn't been tampered with. Validates that the DNS domain referenced in the issuer’s DID document is owned by the issuer. Depending on the VC contract requirements, the wallet might require the holder to collect additional information, for example asking for self-issued attributes, or navigating through an OIDC flow to obtain an id_token. Submits the artifacts required by the contract to the Entra Verified ID service. The Entra Verified ID service returns the VC, signed with the issuer’s DID key, and the wallet securely stores the VC. For detailed information on how to build an issuance solution and architectural considerations, see Plan your Microsoft Entra Verified ID issuance solution. Flow 2: Verifiable credential presentation. In this flow, a holder interacts with a relying party (RP) to present a VC as part of its authorization requirements. The holder starts the flow by using a browser or native application to access the relying party’s web frontend. The web frontend calls the Entra Verified ID service to generate a VC presentation request. The web frontend renders a link to the request as a QR code or a device-specific deep link (depending on the device). The holder scans the QR code or deep link from step 3 using a wallet app such as Microsoft Authenticator. The wallet downloads the request from the link. The request includes: a standards-based request for credentials of a schema or credential type. the DID of the RP, which the wallet looks up in the trust system. The wallet validates the presentation request and finds stored VC(s) that satisfy the request. 
Based on the required VCs, the wallet guides the subject to select and consent to use the VCs. After the subject consents to use of the VC, the wallet generates a unique pairwise DID between the subject and the RP. Then, the wallet sends a presentation response payload to the Entra Verified ID service signed by the subject. It contains: The VC(s) the subject consented to. The pairwise DID generated as the “subject” of the payload. The RP DID as the “audience” of the payload. The Entra Verified ID service validates the response sent by the wallet. Depending on how the original presentation request was created in step 2, this validation can include checking the status of the presented VC with the VC issuer for cases such as revocation. Upon validation, the Entra Verified ID service calls back the RP with the result. For detailed information on how to build a validation solution and architectural considerations, see Plan your Microsoft Entra Verified ID verification solution. Key takeaways: Decentralized architectures can be used to enhance existing solutions and provide new capabilities. To deliver on the aspirations of the Decentralized Identity Foundation (DIF) and W3C design goals, the following should be considered when creating a verifiable credential solution: There are no central points of trust establishment between actors in the system. That is, trust boundaries aren't expanded through federation because actors trust specific VCs. The trust system enables the discovery of any actor’s decentralized identifier (DID). The solution enables verifiers to validate any verifiable credentials (VCs) from any issuer. The solution doesn't enable the issuer to control authorization of the subject or the verifier (relying party). The actors operate in a decoupled manner, each capable of completing the tasks for their roles. Issuers service every VC request and don't discriminate on the requests serviced. 
Subjects own their VC once issued and can present their VC to any verifier. Verifiers can validate any VC from any subject or issuer. Next steps Learn more about architecture for verifiable credentials",https://learn.microsoft.com/en-us/azure/active-directory/verifiable-credentials/introduction-to-verifiable-credentials-architecture,,Docs,,Product,,,,,,Entra,,2022-10-19,,,,,,,,,,,,,
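The presentation flow described in this record starts with the relying party’s web frontend asking the Verified ID service to generate a presentation request (step 2). The sketch below assembles such a request body. The payload shape follows Microsoft’s published samples for the Request Service API at the time of writing, but the DIDs, client name, and callback URL are illustrative placeholders, not real values.

```python
# Sketch of step 2 of the presentation flow: the relying party's web frontend
# builds the JSON body for a Verified ID presentation request. Field names
# follow Microsoft's published samples; all values here are placeholders.
import json

def build_presentation_request(verifier_did, credential_type, issuer_did, callback_url):
    '''Assemble the JSON body the RP frontend would POST to the Request Service.'''
    return {
        'authority': verifier_did,            # the RP's DID (the 'audience' of the response)
        'registration': {'clientName': 'Proseware discount portal'},
        'callback': {
            'url': callback_url,              # the service posts the verified result here
            'state': 'session-1234',          # correlates the callback with a browser session
        },
        'requestedCredentials': [{
            'type': credential_type,          # the credential type the wallet must satisfy
            'purpose': 'Prove Woodgrove employment for the partner discount',
            'acceptedIssuers': [issuer_did],  # only accept VCs issued by Woodgrove's DID
        }],
    }

body = build_presentation_request(
    'did:web:proseware.example', 'VerifiedEmployee',
    'did:web:woodgrove.example', 'https://proseware.example/api/verifier/callback')
print(json.dumps(body, indent=2))
```

The frontend would then render the service’s response link as a QR code or deep link for the wallet to scan, as described in steps 3 and 4.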
Microsoft,Microsoft,,Microsoft Entra Verified ID documentation,,,,,,,Configure your tenant for Microsoft Entra Verified ID,"Microsoft Entra Verified ID is a decentralized identity solution that helps you safeguard your organization. The service allows you to issue and verify credentials. Issuers can use the Verified ID service to issue their own customized verifiable credentials. Verifiers can use the service's free REST API to easily request and accept verifiable credentials in their apps and services. In both cases, you will have to configure your Azure AD tenant so that you can use it to either issue your own verifiable credentials, or verify the presentation of a user's verifiable credentials that were issued by another organization. In case you are both an issuer and a verifier, you can use a single Azure AD tenant to both issue your own verifiable credentials as well as verify those of others.","Configure your tenant for Microsoft Entra Verified ID Note Azure Active Directory Verifiable Credentials is now Microsoft Entra Verified ID and part of the Microsoft Entra family of products. Learn more about the Microsoft Entra family of identity solutions and get started in the unified Microsoft Entra admin center. Microsoft Entra Verified ID is a decentralized identity solution that helps you safeguard your organization. The service allows you to issue and verify credentials. Issuers can use the Verified ID service to issue their own customized verifiable credentials. Verifiers can use the service's free REST API to easily request and accept verifiable credentials in their apps and services. In both cases, you will have to configure your Azure AD tenant so that you can use it to either issue your own verifiable credentials, or verify the presentation of a user's verifiable credentials that were issued by another organization. 
In case you are both an issuer and a verifier, you can use a single Azure AD tenant to both issue your own verifiable credentials as well as verify those of others. In this tutorial, you learn how to configure your Azure AD tenant to use the verifiable credentials service. Specifically, you learn how to: - Create an Azure Key Vault instance. - Set up the Verified ID service. - Register an application in Azure AD. The following diagram illustrates the Verified ID architecture and the component you configure. Prerequisites - You need an Azure tenant with an active subscription. If you don't have an Azure subscription, create one for free. - Ensure that you have the global administrator or the authentication policy administrator permission for the directory you want to configure. If you're not the global administrator, you will need the Application Administrator permission to complete the app registration, including granting admin consent. - Ensure that you have the contributor role for the Azure subscription or the resource group that you will deploy Azure Key Vault in. Create a key vault Azure Key Vault is a cloud service that enables the secure storage and access of secrets and keys. The Verified ID service stores public and private keys in Azure Key Vault. These keys are used to sign and verify credentials. If you don't have an Azure Key Vault instance available, follow these steps to create a key vault using the Azure portal. Note By default, the account that creates a vault is the only one with access. The Verified ID service needs access to the key vault. You must configure your key vault with access policies allowing the account used during configuration to create and delete keys. The account used during configuration also requires permissions to sign so that it can create the domain binding for Verified ID. 
If you use the same account while testing, modify the default policy to grant the account sign permission, in addition to the default permissions granted to vault creators. Set access policies for the key vault A Key Vault access policy defines whether a specified security principal can perform operations on Key Vault secrets and keys. Set access policies in your key vault for both the Verified ID service administrator account, and for the Request Service API principal that you created. After you create your key vault, Verifiable Credentials generates a set of keys used to provide message security. These keys are stored in Key Vault. You use a key set for signing, updating, and recovering verifiable credentials. Set access policies for the Verified ID Admin user In the Azure portal, go to the key vault you use for this tutorial. Under Settings, select Access policies. In Add access policies, under USER, select the account you use to follow this tutorial. For Key permissions, verify that the following permissions are selected: Create, Delete, and Sign. By default, Create and Delete are already enabled. Sign should be the only key permission you need to update. - To save the changes, select Save. Set up Verified ID To set up Verified ID, follow these steps: In the Azure portal, search for Verified ID. Then, select Verified ID. From the left menu, select Getting started. Set up your organization by providing the following information: Organization name: Enter a name to reference your business within Verified IDs. Your customers don't see this name. Domain: Enter a domain that's added to a service endpoint in your decentralized identity (DID) document. The domain is what binds your DID to something tangible that the user might know about your business. Microsoft Authenticator and other digital wallets use this information to validate that your DID is linked to your domain. If the wallet can verify the DID, it displays a verified symbol. 
If the wallet can't verify the DID, it informs the user that the credential was issued by an organization it couldn't validate. Important The domain can't be a redirect. Otherwise, the DID and domain can't be linked. Make sure to use HTTPS for the domain. For example: https://contoso.com. Key vault: Select the key vault that you created earlier. Under Advanced, you may choose the trust system that you want to use for your tenant. You can choose from either Web or ION. Web means your tenant uses did:web as the DID method and ION means it uses did:ion. Important The only way to change the trust system is to opt-out of the Verified ID service and redo the onboarding. Select Save and get started. Set access policies for the Verified ID service principals When you set up Verified ID in the previous step, the access policies in Azure Key Vault are automatically updated to give service principals for Verified ID the required permissions. If you ever need to reset the permissions manually, the access policy should look like the following:
|Service Principal|AppId|Key Permissions|
|---|---|---|
|Verifiable Credentials Service|bb2a64ee-5d29-4b07-a491-25806dc854d3|Get, Sign|
|Verifiable Credentials Service Request|3db474b9-6a0c-4840-96ac-1fceb342124f|Sign|
Register an application in Azure AD Your application needs to get access tokens when it wants to call into Microsoft Entra Verified ID so it can issue or verify credentials. To get access tokens, you have to register an application and grant API permission for the Verified ID Request Service. For example, use the following steps for a web application: Sign in to the Azure portal with your administrative account. If you have access to multiple tenants, select the Directory + subscription. Then, search for and select your Azure Active Directory. Under Manage, select App registrations > New registration. Enter a display name for your application. For example: verifiable-credentials-app. 
For Supported account types, select Accounts in this organizational directory only (Default Directory only - Single tenant). Select Register to create the application. Grant permissions to get access tokens In this step, you grant permissions to the Verifiable Credentials Service Request Service principal. To add the required permissions, follow these steps: Stay in the verifiable-credentials-app application details page. Select API permissions > Add a permission. Select APIs my organization uses. Search for the Verifiable Credentials Service Request service principal and select it. Choose Application permissions, and expand VerifiableCredential.Create.All. Select Add permissions. Select Grant admin consent for <your tenant name>. You can choose to grant issuance and presentation permissions separately if you prefer to segregate the scopes to different applications. Service endpoint configuration - Navigate to the Verified ID service in the Azure portal. - Select Registration. - Notice that there are two sections: DID registration and Domain ownership verification. - Select each section and download the JSON file under each. - Create a website that you can use to distribute the files. If you specified https://contoso.com as your domain, the URLs for each of the files would look as shown below: https://contoso.com/.well-known/did.json https://contoso.com/.well-known/did-configuration.json Once you have successfully completed the verification steps, you are ready to continue to the next tutorial. If you have selected ION as the trust system, you will not see the DID registration section as it is not applicable for ION and you only have to distribute the did-configuration.json file.",https://learn.microsoft.com/en-us/azure/active-directory/verifiable-credentials/verifiable-credentials-configure-tenant,,Docs,,Product,,,,,,Entra,,2022-11-16,,,,,,,,,,,,,
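The service endpoint configuration step in this record requires hosting two JSON files under the linked domain’s `/.well-known/` path. The helper below is a minimal sketch that computes those URLs for a given domain and enforces the HTTPS requirement the doc states; it does not check live reachability or redirects.

```python
# Sketch of the service endpoint configuration step: compute where the two
# downloaded JSON files must be hosted for a given linked domain. The doc
# requires HTTPS and a non-redirecting domain; this only checks URL shape.
from urllib.parse import urlparse

def well_known_urls(domain: str) -> dict:
    parsed = urlparse(domain)
    if parsed.scheme != 'https':
        raise ValueError('Verified ID linked domains must use HTTPS, e.g. https://contoso.com')
    base = f'https://{parsed.netloc}'
    return {
        # did.json applies only when did:web is the trust system;
        # did-configuration.json must be distributed for both Web and ION.
        'did': f'{base}/.well-known/did.json',
        'did_configuration': f'{base}/.well-known/did-configuration.json',
    }

print(well_known_urls('https://contoso.com'))
```

For the doc’s example domain this yields the same two URLs listed above.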
Microsoft,Microsoft,,,,,,,,,DTDL models - Azure Digital Twins,"DTDL is based on JSON-LD and is programming-language independent. DTDL isn't exclusive to Azure Digital Twins, but is also used to represent device data in other IoT services such as IoT Plug and Play.","Learn about twin models and how to define them in Azure Digital Twins A key characteristic of Azure Digital Twins is the ability to define your own vocabulary and build your twin graph in the self-defined terms of your business. This capability is provided through user-provided models. You can think of models as the nouns in a description of your world. Azure Digital Twins models are represented in the JSON-LD-based Digital Twin Definition Language (DTDL). A model is similar to a class in an object-oriented programming language, defining a data shape for one particular concept in your real work environment. Models have names (such as Room or TemperatureSensor), and contain elements such as properties, telemetry, and relationships that describe what this type of entity in your environment does. Later, you'll use these models to create digital twins that represent specific entities that meet this type description. Digital Twin Definition Language (DTDL) for models Models for Azure Digital Twins are defined using the Digital Twins Definition Language (DTDL). You can view the full language specs for DTDL in GitHub: Digital Twins Definition Language (DTDL) - Version 2 Reference. This page includes detailed DTDL reference and examples to help you get started writing your own DTDL models. DTDL is based on JSON-LD and is programming-language independent. DTDL isn't exclusive to Azure Digital Twins, but is also used to represent device data in other IoT services such as IoT Plug and Play. Azure Digital Twins uses DTDL version 2 (use of DTDL version 1 with Azure Digital Twins has now been deprecated). The rest of this article summarizes how the language is used in Azure Digital Twins. 
Model overview Twin type models can be written in any text editor. The DTDL language follows JSON syntax, so you should store models with the extension .json. Using the JSON extension will enable many programming text editors to provide basic syntax checking and highlighting for your DTDL documents. There's also a DTDL extension available for Visual Studio Code. Here are the fields within a model interface: |Field||Description| |A Digital Twin Model Identifier (DTMI) for the model. Must be in the format | |Identifies the kind of information being described. For an interface, the type is | |Sets the context for the JSON document. Models should use | |[optional] Gives you the option to define a friendly name for the model. If you don't use this field, the model will use its full DTMI value.| |All remaining interface data is placed here, as an array of attribute definitions. Each attribute must provide a | Here's an example of a basic DTDL model. This model describes a Home, with one property for an ID. The Home model also defines a relationship to a Floor model, which can be used to indicate that a Home twin is connected to certain Floor twins. { ""@id"": ""dtmi:com:adt:dtsample:home; 1"", ""@type"": ""Interface"", ""@context"": ""dtmi:dtdl:context; 2"", ""displayName"": ""Home"", ""contents"": [ { ""@type"": ""Property"", ""name"": ""id"", ""schema"": ""string"" }, { ""@type"": ""Relationship"", ""@id"": ""dtmi:com:adt:dtsample:home:rel_has_floors; 1"", ""name"": ""rel_has_floors"", ""displayName"": ""Home has floors"", ""target"": ""dtmi:com:adt:dtsample:floor; 1"" } ] } Model attributes The main information about a model is given by its attributes, which are defined within the contents section of the model interface. Here are the attributes available in DTDL. 
A DTDL model interface may contain zero, one, or many of each of the following fields: Property - Properties are data fields that represent the state of an entity (like the properties in many object-oriented programming languages). Properties have backing storage and can be read at any time. For more information, see Properties and telemetry below. Telemetry - Telemetry fields represent measurements or events, and are often used to describe device sensor readings. Unlike properties, telemetry isn't stored on a digital twin; it's a series of time-bound data events that need to be handled as they occur. For more information, see Properties and telemetry below. Relationship - Relationships let you represent how a digital twin can be involved with other digital twins. Relationships can represent different semantic meanings, such as contains(""floor contains room""), cools(""hvac cools room""), isBilledTo(""compressor is billed to user""), and so on. Relationships allow the solution to provide a graph of interrelated entities. Relationships can also have properties of their own. For more information, see Relationships below. Component - Components allow you to build your model interface as an assembly of other interfaces, if you want. An example of a component is a frontCamera interface (and another component interface backCamera) that are used in defining a model for a phone. First define an interface for frontCamera as though it were its own model, and then reference it when defining Phone. Use a component to describe something that is an integral part of your solution but doesn't need a separate identity, and doesn't need to be created, deleted, or rearranged in the twin graph independently. If you want entities to have independent existences in the twin graph, represent them as separate digital twins of different models, connected by relationships. Tip Components can also be used for organization, to group sets of related properties within a model interface. 
In this situation, you can think of each component as a namespace or ""folder"" inside the interface. For more information, see Components below. Note The spec for DTDL also defines Commands, which are methods that can be executed on a digital twin (like a reset command, or a command to switch a fan on or off). However, commands are not currently supported in Azure Digital Twins. Properties and telemetry This section goes into more detail about properties and telemetry in DTDL models. For comprehensive information about the fields that may appear as part of a property, see Property in the DTDL V2 Reference. For comprehensive information about the fields that may appear as part of telemetry, see Telemetry in the DTDL V2 Reference. Note The writable DTDL attribute for properties is not currently supported in Azure Digital Twins. It can be added to the model, but Azure Digital Twins will not enforce it. For more information, see Service-specific DTDL notes. Difference between properties and telemetry Here's some guidance on conceptually distinguishing between DTDL properties and telemetry in Azure Digital Twins. - Properties are expected to have backing storage, which means that you can read a property at any time and retrieve its value. If the property is writable, you can also store a value in the property. - Telemetry is more like a stream of events; it's a set of data messages that have short lifespans. If you don't set up listening for the event and actions to take when it happens, there's no trace of the event at a later time. You can't come back to it and read it later.In C# terms, telemetry is like a C# event.In IoT terms, telemetry is typically a single measurement sent by a device. Telemetry is often used with IoT devices, because many devices either can't, or aren't interested in, storing the measurement values they generate. Instead, they send them out as a stream of ""telemetry"" events. 
In this case, you can't query the device at any time for the latest value of the telemetry field. You'll need to listen to the messages from the device and take actions as the messages arrive. As a result, when designing a model in Azure Digital Twins, you'll probably use properties in most cases to model your twins. Doing so allows you to have the backing storage and the ability to read and query the data fields. Telemetry and properties often work together to handle data ingress from devices. You'll often use an ingress function to read telemetry or property events from devices, and set a property in Azure Digital Twins in response. You can also publish a telemetry event from the Azure Digital Twins API. As with other telemetry, that is a short-lived event that requires a listener to handle. Schema As per DTDL, the schema for property and telemetry attributes can be of standard primitive types— integer, double, string, and boolean—and other types such as dateTime and duration. In addition to primitive types, property and telemetry fields can have these complex types: Object Map Enum - (telemetry only) Array They can also be semantic types, which allow you to annotate values with units. Basic property and telemetry examples Here's a basic example of a property on a DTDL model. This example shows the ID property of a Home. { ""@id"": ""dtmi:com:adt:dtsample:home; 1"", ""@type"": ""Interface"", ""@context"": ""dtmi:dtdl:context; 2"", ""displayName"": ""Home"", ""contents"": [ { ""@type"": ""Property"", ""name"": ""id"", ""schema"": ""string"" }, { ""@type"": ""Relationship"", ""@id"": ""dtmi:com:adt:dtsample:home:rel_has_floors; 1"", ""name"": ""rel_has_floors"", ""displayName"": ""Home has floors"", ""target"": ""dtmi:com:adt:dtsample:floor; 1"" } ] } Here's a basic example of a telemetry field on a DTDL model. This example shows Temperature telemetry on a Sensor. 
{ ""@id"": ""dtmi:com:adt:dtsample:sensor; 1"", ""@type"": ""Interface"", ""@context"": ""dtmi:dtdl:context; 2"", ""displayName"": ""Sensor"", ""contents"": [ { ""@type"": ""Telemetry"", ""name"": ""Temperature"", ""schema"": ""double"" }, { ""@type"": ""Property"", ""name"": ""humidity"", ""schema"": ""double"" } ] } Complex (object) type example Properties and telemetry can be of complex types, including an Object type. The following example shows another version of the Home model, with a property for its address. address is an object, with its own fields for street, city, state, and zip. { ""@id"": ""dtmi:com:adt:dtsample:home; 1"", ""@type"": ""Interface"", ""@context"": ""dtmi:dtdl:context; 2"", ""displayName"": ""Home"", ""extends"": ""dtmi:com:adt:dtsample:core; 1"", ""contents"": [ { ""@type"": ""Property"", ""name"": ""address"", ""schema"": { ""@type"": ""Object"", ""fields"": [ { ""name"": ""street"", ""schema"": ""string"" }, { ""name"": ""city"", ""schema"": ""string"" }, { ""name"": ""state"", ""schema"": ""string"" }, { ""name"": ""zip"", ""schema"": ""string"" } ] } }, { ""@type"": ""Relationship"", ""@id"": ""dtmi:com:adt:dtsample:home:rel_has_floors; 1"", ""name"": ""rel_has_floors"", ""displayName"": ""Home has floors"", ""target"": ""dtmi:com:adt:dtsample:floor; 1"", ""properties"": [ { ""@type"": ""Property"", ""name"": ""lastOccupied"", ""schema"": ""dateTime"" } ] } ] } Semantic type example Semantic types make it possible to express a value with a unit. Properties and telemetry can be represented with any of the semantic types that are supported by DTDL. For more information on semantic types in DTDL and what values are supported, see Semantic types in the DTDL V2 Reference. The following example shows a Sensor model with a semantic-type telemetry for Temperature, and a semantic-type property for Humidity. 
{ ""@id"": ""dtmi:com:adt:dtsample:sensor; 1"", ""@type"": ""Interface"", ""@context"": ""dtmi:dtdl:context; 2"", ""displayName"": ""Sensor"", ""contents"": [ { ""@type"": [""Telemetry"", ""Temperature""], ""name"": ""temperature"", ""unit"": ""degreeFahrenheit"", ""schema"": ""double"" }, { ""@type"": [""Property"", ""Humidity""], ""name"": ""humidity"", ""unit"": ""gramPerCubicMetre"", ""schema"": ""double"" } ] } Note ""Property"" or ""Telemetry"" must be the first element of the @type array, followed by the semantic type. Otherwise, the field may not be visible in Azure Digital Twins Explorer. Relationships This section goes into more detail about relationships in DTDL models. For a comprehensive list of the fields that may appear as part of a relationship, see Relationship in the DTDL V2 Reference. Note The writable, minMultiplicity, and maxMultiplicity DTDL attributes for relationships are not currently supported in Azure Digital Twins. They can be added to the model, but Azure Digital Twins will not enforce them. For more information, see Service-specific DTDL notes. Basic relationship example Here's a basic example of a relationship on a DTDL model. This example shows a relationship on a Home model that allows it to connect to a Floor model. { ""@id"": ""dtmi:com:adt:dtsample:home; 1"", ""@type"": ""Interface"", ""@context"": ""dtmi:dtdl:context; 2"", ""displayName"": ""Home"", ""contents"": [ { ""@type"": ""Property"", ""name"": ""id"", ""schema"": ""string"" }, { ""@type"": ""Relationship"", ""@id"": ""dtmi:com:adt:dtsample:home:rel_has_floors; 1"", ""name"": ""rel_has_floors"", ""displayName"": ""Home has floors"", ""target"": ""dtmi:com:adt:dtsample:floor; 1"" } ] } Note For relationships, @id is an optional field. If no @id is provided, the digital twin interface processor will assign one. Targeted and non-targeted relationships Relationships can be defined with or without a target. A target specifies which types of twin the relationship can reach. 
For example, you might include a target to specify that a Home model can only have a rel_has_floors relationship with twins that are Floor twins. Sometimes, you might want to define a relationship without a specific target, so that the relationship can connect to many different types of twins. Here's an example of a relationship on a DTDL model that doesn't have a target. In this example, the relationship is for defining what sensors a Room might have, and the relationship can connect to any type. { ""@id"": ""dtmi:com:adt:dtsample:room; 1"", ""@type"": ""Interface"", ""@context"": ""dtmi:dtdl:context; 2"", ""displayName"": ""Room"", ""extends"": ""dtmi:com:adt:dtsample:core; 1"", ""contents"": [ { ""@type"": [""Property"", ""Humidity""], ""name"": ""humidity"", ""unit"": ""gramPerCubicMetre"", ""schema"": ""double"" }, { ""@type"": ""Component"", ""name"": ""thermostat"", ""schema"": ""dtmi:com:adt:dtsample:thermostat; 1"" }, { ""@type"": ""Relationship"", ""@id"": ""dtmi:com:adt:dtsample:room:rel_has_sensors; 1"", ""name"": ""rel_has_sensors"", ""displayName"": ""Room has sensors"" } ] }, Properties of relationships DTDL also allows for relationships to have properties of their own. When defining a relationship within a DTDL model, the relationship can have its own properties field where you can define custom properties to describe relationship-specific state. The following example shows another version of the Home model, where the rel_has_floors relationship has a property representing when the related Floor was last occupied. 
{ ""@id"": ""dtmi:com:adt:dtsample:home; 1"", ""@type"": ""Interface"", ""@context"": ""dtmi:dtdl:context; 2"", ""displayName"": ""Home"", ""extends"": ""dtmi:com:adt:dtsample:core; 1"", ""contents"": [ { ""@type"": ""Property"", ""name"": ""address"", ""schema"": { ""@type"": ""Object"", ""fields"": [ { ""name"": ""street"", ""schema"": ""string"" }, { ""name"": ""city"", ""schema"": ""string"" }, { ""name"": ""state"", ""schema"": ""string"" }, { ""name"": ""zip"", ""schema"": ""string"" } ] } }, { ""@type"": ""Relationship"", ""@id"": ""dtmi:com:adt:dtsample:home:rel_has_floors; 1"", ""name"": ""rel_has_floors"", ""displayName"": ""Home has floors"", ""target"": ""dtmi:com:adt:dtsample:floor; 1"", ""properties"": [ { ""@type"": ""Property"", ""name"": ""lastOccupied"", ""schema"": ""dateTime"" } ] } ] } Components This section goes into more detail about components in DTDL models. For a comprehensive list of the fields that may appear as part of a component, see Component in the DTDL V2 Reference. Basic component example Here's a basic example of a component on a DTDL model. This example shows a Room model that makes use of a thermostat model as a component. 
[ { ""@id"": ""dtmi:com:adt:dtsample:room; 1"", ""@type"": ""Interface"", ""@context"": ""dtmi:dtdl:context; 2"", ""displayName"": ""Room"", ""extends"": ""dtmi:com:adt:dtsample:core; 1"", ""contents"": [ { ""@type"": [""Property"", ""Humidity""], ""name"": ""humidity"", ""unit"": ""gramPerCubicMetre"", ""schema"": ""double"" }, { ""@type"": ""Component"", ""name"": ""thermostat"", ""schema"": ""dtmi:com:adt:dtsample:thermostat; 1"" }, { ""@type"": ""Relationship"", ""@id"": ""dtmi:com:adt:dtsample:room:rel_has_sensors; 1"", ""name"": ""rel_has_sensors"", ""displayName"": ""Room has sensors"" } ] }, { ""@context"": ""dtmi:dtdl:context; 2"", ""@id"": ""dtmi:com:adt:dtsample:thermostat; 1"", ""@type"": ""Interface"", ""displayName"": ""thermostat"", ""contents"": [ { ""@type"": [""Property"", ""Temperature""], ""name"": ""temperature"", ""unit"": ""degreeFahrenheit"", ""schema"": ""double"" } ] } ] If other models in this solution should also contain a thermostat, they can reference the same thermostat model as a component in their own definitions, just like Room does. Important The component interface (thermostat in the example above) must be defined in the same array as any interfaces that use it (Room in the example above) in order for the component reference to be found. Model inheritance Sometimes, you may want to specialize a model further. For example, it might be useful to have a generic model Room, and specialized variants ConferenceRoom and Gym. To express specialization, DTDL supports inheritance. Interfaces can inherit from one or more other interfaces. You can do so by adding an extends field to the model. The extends section is an interface name, or an array of interface names (allowing the extending interface to inherit from multiple parent models). A single parent can serve as the base model for multiple extending interfaces. The following example re-imagines the Home model from the earlier DTDL example as a subtype of a larger ""core"" model. 
The parent model (Core) is defined first, and then the child model (Home) builds on it by using extends. { ""@id"": ""dtmi:com:adt:dtsample:core; 1"", ""@type"": ""Interface"", ""@context"": ""dtmi:dtdl:context; 2"", ""displayName"": ""Core"", ""contents"": [ { ""@type"": ""Property"", ""name"": ""id"", ""schema"": ""string"" }, { ""@type"": ""Property"", ""name"": ""name"", ""schema"": ""string"" } ] } { ""@id"": ""dtmi:com:adt:dtsample:home; 1"", ""@type"": ""Interface"", ""@context"": ""dtmi:dtdl:context; 2"", ""displayName"": ""Home"", ""extends"": ""dtmi:com:adt:dtsample:core; 1"", ""contents"": [ { In this case, Core contributes an ID and name to Home. Other models can also extend the Core model to get these properties as well. Here's a Room model extending the same parent interface: { ""@id"": ""dtmi:com:adt:dtsample:room; 1"", ""@type"": ""Interface"", ""@context"": ""dtmi:dtdl:context; 2"", ""displayName"": ""Room"", ""extends"": ""dtmi:com:adt:dtsample:core; 1"", ""contents"": [ { Once inheritance is applied, the extending interface exposes all properties from the entire inheritance chain. The extending interface can't change any of the definitions of the parent interfaces; it can only add to them. It also can't redefine a capability already defined in any of its parent interfaces (even if the capabilities are defined to be the same). For example, if a parent interface defines a double property mass, the extending interface can't contain a declaration of mass, even if it's also a double. Service-specific DTDL notes Not all services that use DTDL implement the exact same features of DTDL. There are some DTDL features that Azure Digital Twins doesn't currently support, including: - DTDL commands - The writableattribute on properties or relationships. Although this attribute can be set as per DTDL specifications, the value isn't used by Azure Digital Twins. 
Instead, these attributes are always treated as writable by external clients that have general write permissions to the Azure Digital Twins service. - The minMultiplicityand maxMultiplicityproperties on relationships. Although these attributes can be set as per DTDL specifications, the values aren't enforced by Azure Digital Twins. For a DTDL model to be compatible with Azure Digital Twins, it must also meet these requirements: - All top-level DTDL elements in a model must be of type Interface. The reason for this requirement is that Azure Digital Twins model APIs can receive JSON objects that represent either an interface or an array of interfaces. As a result, no other DTDL element types are allowed at the top level. - DTDL for Azure Digital Twins must not define any commands. - Azure Digital Twins only allows a single level of component nesting, meaning that an interface that's being used as a component can't have any components itself. - Interfaces can't be defined inline within other DTDL interfaces; they must be defined as separate top-level entities with their own IDs. Then, when another interface wants to include that interface as a component or through inheritance, it can reference its ID. Modeling tools and best practices This section describes additional considerations and recommendations for modeling. Use DTDL industry-standard ontologies If your solution is for a certain established industry (like smart buildings, smart cities, or energy grids), consider starting with a pre-existing set of models for your industry instead of designing your models from scratch. Microsoft has partnered with domain experts to create DTDL model sets based on industry standards, to help minimize reinvention and encourage consistency and simplicity across industry solutions. You can read more about these ontologies, including how to use them and what ontologies are available now, in What is an ontology?. 
Consider query implications While designing models to reflect the entities in your environment, it can be useful to look ahead and consider the query implications of your design. You may want to design properties in a way that will avoid large result sets from graph traversal. You may also want to model relationships that will need to be answered in a single query as single-level relationships. Validate models Tip After creating a model, it's recommended to validate your models offline before uploading them to your Azure Digital Twins instance. There is a language-agnostic DTDL Validator sample for validating model documents to make sure the DTDL is correct before uploading it to your instance. The DTDL validator sample is built on a .NET DTDL parser library, which is available on NuGet as a client-side library: Microsoft.Azure.DigitalTwins.Parser. You can also use the library directly to design your own validation solution. Version 4.0.8 of the parser library is the version that's currently recommended for compatibility with Azure Digital Twins. You can learn more about the validator sample and parser library, including usage examples, in Parse and validate models. Upload and delete models in bulk Here are two sample projects that can simplify dealing with multiple models at once: - Model uploader: Once you're finished creating, extending, or selecting your models, you need to upload them to your Azure Digital Twins instance to make them available for use in your solution. If you have many models to upload, or if they have many interdependencies that would make ordering individual uploads complicated, you can use this model uploader sample to upload many models at once. - Model deleter: This sample can be used to delete all models in an Azure Digital Twins instance at once. It contains recursive logic to handle model dependencies through the deletion process. 
Visualize models Once you have uploaded models into your Azure Digital Twins instance, you can use Azure Digital Twins Explorer to view them. The explorer contains a list of all models in the instance, as well as a model graph that illustrates how they relate to each other, including any inheritance and model relationships. Here's an example of what a model graph might look like: For more information about the model experience in Azure Digital Twins Explorer, see Explore models and the Model Graph. Next steps Learn about creating models based on industry-standard ontologies: What is an ontology? Dive deeper into managing models with API operations: Manage DTDL models Learn about how models are used to create digital twins: Digital twins and the twin graph Feedback Submit and view feedback for",https://learn.microsoft.com/en-us/azure/digital-twins/concepts-models,,Docs,,Product,,,IOT,,,,"DTDL,JSON-LD",2023-04-05,,,,,,,,,,,,,
Microsoft,Microsoft,,Microsoft Entra Verified ID documentation,,,,,,,Issue Azure AD Verifiable Credentials from an application,- Set up Azure Blob Storage for storing your Azure AD Verifiable Credentials configuration files.<br>- Create and upload your Verifiable Credentials configuration files.<br>- Create the verified credential expert card in Azure.<br>- Gather credentials and environment details to set up the sample application.<br>- Download the sample application code to your local computer.<br>- Update the sample application with your verified credential expert card and environment details.<br>- Run the sample application and issue your first verified credential expert card.<br>- Verify your verified credential expert card.,"Issue Microsoft Entra Verified ID credentials from an application Note Azure Active Directory Verifiable Credentials is now Microsoft Entra Verified ID and part of the Microsoft Entra family of products. Learn more about the Microsoft Entra family of identity solutions and get started in the unified Microsoft Entra admin center. In this tutorial, you run a sample application from your local computer that connects to your Azure Active Directory (Azure AD) tenant. Using the application, you're going to issue and verify a verified credential expert card. In this article, you learn how to: - Create the verified credential expert card in Azure. - Gather credentials and environment details to set up the sample application. - Download the sample application code to your local computer. - Update the sample application with your verified credential expert card and environment details. - Run the sample application and issue your first verified credential expert card. - Verify your verified credential expert card. The following diagram illustrates the Microsoft Entra Verified ID architecture and the component you configure. Prerequisites - Set up a tenant for Microsoft Entra Verified ID. 
- To clone the repository that hosts the sample app, install GIT. - Visual Studio Code, or similar code editor. - .NET 5.0. - Download ngrok and sign up for a free account. If you can't use ngrokin your organization, read this FAQ. - A mobile device with Microsoft Authenticator:Android version 6.2206.3973 or later installed.iOS version 6.6.2 or later installed. Create the verified credential expert card in Azure In this step, you create the verified credential expert card by using Microsoft Entra Verified ID. After you create the credential, your Azure AD tenant can issue it to users who initiate the process. Using the Azure portal, search for Verified ID and select it. After you set up your tenant, the Create credential should appear. Alternatively, you can select Credentials in the left hand menu and select + Add a credential. In Create credential, select Custom Credential and click Next: For Credential name, enter VerifiedCredentialExpert. This name is used in the portal to identify your verifiable credentials. It's included as part of the verifiable credentials contract. 
Copy the following JSON and paste it in the Display definition textbox { ""locale"": ""en-US"", ""card"": { ""title"": ""Verified Credential Expert"", ""issuedBy"": ""Microsoft"", ""backgroundColor"": ""#000000"", ""textColor"": ""#ffffff"", ""logo"": { ""uri"": ""https://didcustomerplayground.blob.core.windows.net/public/VerifiedCredentialExpert_icon.png"", ""description"": ""Verified Credential Expert Logo"" }, ""description"": ""Use your verified credential to prove to anyone that you know all about verifiable credentials."" }, ""consent"": { ""title"": ""Do you want to get your Verified Credential?"", ""instructions"": ""Sign in with your account to get your card."" }, ""claims"": [ { ""claim"": ""VC.credentialSubject.firstName"", ""label"": ""First name"", ""type"": ""String"" }, { ""claim"": ""VC.credentialSubject.lastName"", ""label"": ""Last name"", ""type"": ""String"" } ] } Copy the following JSON and paste it in the Rules definition textbox { ""attestations"": { ""idTokenHints"": [ { ""mapping"": [ { ""outputClaim"": ""firstName"", ""required"": true, ""inputClaim"": ""$.given_name"", ""indexed"": false }, { ""outputClaim"": ""lastName"", ""required"": true, ""inputClaim"": ""$.family_name"", ""indexed"": false } ], ""required"": false } ] }, ""validityInterval"": 2592000, ""VC"": { ""type"": [ ""VerifiedCredentialExpert"" ] } } Select Create. The following screenshot demonstrates how to create a new credential: Gather credentials and environment details Now that you have a new credential, you're going to gather some information about your environment and the credential that you created. You use these pieces of information when you set up your sample application. In Verifiable Credentials, select Issue credential. Copy the authority, which is the Decentralized Identifier, and record it for later. Copy the manifest URL. It's the URL that Authenticator evaluates before it displays to the user verifiable credential issuance requirements. 
Record it for later use. Copy your Tenant ID, and record it for later. The Tenant ID is the guid in the manifest URL highlighted in red above. Download the sample code The sample application is available in .NET, and the code is maintained in a GitHub repository. Download the sample code from GitHub, or clone the repository to your local machine: git clone https://GitHub.com/Azure-Samples/active-directory-verifiable-credentials-dotnet.git Configure the verifiable credentials app Create a client secret for the registered application that you created. The sample application uses the client secret to prove its identity when it requests tokens. Go to the App registrations page that is located inside Azure Active Directory. Select the verifiable-credentials-app application you created earlier. Select the name to go into the registration details. Copy the Application (client) ID, and store it for later. From the main menu, under Manage, select Certificates & secrets. Select New client secret, and do the following: In Description, enter a description for the client secret (for example, VC-sample-secret). Under Expires, select a duration for which the secret is valid (for example, six months). Then select Add. Record the secret's Value. You'll use this value for configuration in a later step. The secret’s value won't be displayed again, and isn't retrievable by any other means. Record it as soon as it's visible. At this point, you should have all the required information that you need to set up your sample application. Update the sample application Now you'll make modifications to the sample app's issuer code to update it with your verifiable credential URL. This step allows you to issue verifiable credentials by using your own tenant. Under the active-directory-verifiable-credentials-dotnet-main folder, open Visual Studio Code, and select the project inside the 1-asp-net-core-api-idtokenhint folder. Under the project root folder, open the appsettings.json file. 
This file contains information about your Microsoft Entra Verified ID environment. Update the following properties with the information that you recorded in earlier steps:Tenant ID: your tenant IDClient ID: your client IDClient Secret: your client secretIssuerAuthority: Your Decentralized IdentifierVerifierAuthority: Your Decentralized IdentifierCredential Manifest: Your manifest URL Save the appsettings.json file. The following JSON demonstrates a complete appsettings.json file: { ""AppSettings"": { ""Endpoint"": ""https://verifiedid.did.msidentity.com/v1.0"", ""VCServiceScope"": ""3db474b9-6a0c-4840-96ac-1fceb342124f/.default"", ""Instance"": ""https://login.microsoftonline.com/{0}"", ""TenantId"": ""12345678-0000-0000-0000-000000000000"", ""ClientId"": ""33333333-0000-0000-0000-000000000000"", ""ClientSecret"": ""123456789012345678901234567890"", ""CertificateName"": ""[Or instead of client secret: Enter here the name of a certificate (from the user cert store) as registered with your application]"", ""IssuerAuthority"": ""did:web:example.com..."", ""VerifierAuthority"": ""did:web:example.com..."", ""CredentialManifest"": ""https://verifiedid.did.msidentity.com/v1.0/12345678-0000-0000-0000-000000000000/verifiableCredentials/contracts/VerifiedCredentialExpert"" } } Issue your first verified credential expert card Now you're ready to issue your first verified credential expert card by running the sample application. From Visual Studio Code, run the Verifiable_credentials_DotNet project. Or, from your operating system's command line, run: cd active-directory-verifiable-credentials-dotnet/1-asp-net-core-api-idtokenhint dotnet build ""AspNetCoreVerifiableCredentials.csproj"" -c Debug -o .\\bin\\Debug\\netcoreapp3.1 dotnet run In another command prompt window, run the following command. This command runs ngrok to set up a URL on 5000, and make it publicly available on the internet. 
ngrok http 5000 Note On some computers, you might need to run the command in this format: ./ngrok http 5000. Open the HTTPS URL generated by ngrok. From a web browser, select Get Credential. Using your mobile device, scan the QR code with the Authenticator app. You can also scan the QR code directly from your camera, which will open the Authenticator app for you. At this time, you'll see a message warning that this app or website might be risky. Select Advanced. At the risky website warning, select Proceed anyways (unsafe). You're seeing this warning because your domain isn't linked to your decentralized identifier (DID). To verify your domain, follow Link your domain to your decentralized identifier (DID). For this tutorial, you can skip the domain registration, and select Proceed anyways (unsafe). You'll be prompted to enter a PIN code that is displayed on the screen where you scanned the QR code. The PIN adds an extra layer of protection to the issuance. The PIN code is randomly generated every time an issuance QR code is displayed. After you enter the PIN, the Add a credential screen appears. At the top of the screen, you see a Not verified message (in red). This warning is related to the domain validation warning mentioned earlier. Select Add to accept your new verifiable credential. Congratulations! You now have a verified credential expert verifiable credential. Go back to the sample app. It shows you that a credential was successfully issued. Verifiable credential names Your verifiable credential contains Megan Bowen for the first name and last name values in the credential. These values were hardcoded in the sample application, and were added to the verifiable credential at the time of issuance in the payload. In real scenarios, your application pulls the user details from an identity provider. The following code snippet shows where the name is set in the sample application. 
// file: IssuerController.cs
[HttpGet(""/api/issuer/issuance-request"")]
public async Task<ActionResult> issuanceRequest()
{
    ...
    // Here you could change the payload manifest and change the first name and last name.
    payload[""claims""][""given_name""] = ""Megan"";
    payload[""claims""][""family_name""] = ""Bowen"";
    ...
}
Next steps In the next step, learn how a third-party application, also known as a relying party application, can verify your credentials with its own Azure AD tenant verifiable credentials API service.",https://learn.microsoft.com/en-us/azure/active-directory/verifiable-credentials/verifiable-credentials-configure-issuer,,Docs,,Product,,,,,,"Entra,AzureAD",,2022-11-16,,,,,,,,,,,,,
|
||
Microsoft,Microsoft,,,,,,,,,Microsoft Entra Verified ID documentation,Verifiable credentials help you build solutions that empower customers to manage their own data.,,https://learn.microsoft.com/en-us/azure/active-directory/verifiable-credentials/,,Docs,,Product,,,,,,Entra,Verifiable Credentials,2023-01-24,,,,,,,,,,,,,
|
||
Microsoft,Microsoft,,Microsoft Entra Verified ID documentation,,,,,,,Plan your issuance solution,"This article covers the technical aspects of planning for a verifiable credential issuance solution. The Microsoft solution for verifiable credentials follows the World Wide Web Consortium (W3C) Verifiable Credentials Data Model 1.0 and Decentralized Identifiers (DIDs) V1.0 standards, so it can interoperate with non-Microsoft services. However, the examples in this content reflect the Microsoft solution stack for verifiable credentials.<br><br>Articles covering supporting technologies that aren't specific to issuance solutions are out of scope for this content. ","Plan your Microsoft Entra Verified ID issuance solution Note Azure Active Directory Verifiable Credentials is now Microsoft Entra Verified ID and part of the Microsoft Entra family of products. Learn more about the Microsoft Entra family of identity solutions and get started in the unified Microsoft Entra admin center. It’s important to plan your issuance solution so that in addition to issuing credentials, you have a complete view of the architectural and business impacts of your solution. If you haven’t done so, we recommend you view the Microsoft Entra Verified ID architecture overview for foundational information. Scope of guidance This article covers the technical aspects of planning for a verifiable credential issuance solution. The Microsoft solution for verifiable credentials follows the World Wide Web Consortium (W3C) Verifiable Credentials Data Model 1.0 and Decentralized Identifiers (DIDs) V1.0 standards, so it can interoperate with non-Microsoft services. However, the examples in this content reflect the Microsoft solution stack for verifiable credentials. Articles covering supporting technologies that aren't specific to issuance solutions are out of scope for this content. For example, websites are used in a verifiable credential issuance solution but planning a website deployment isn't covered in detail. 
Components of the solution As part of your plan for an issuance solution, you must design a solution that enables the interactions between the issuer, the user, and the verifier. You may issue more than one verifiable credential. The following diagram shows the components of your issuance architecture. Microsoft VC issuance solution architecture Azure Active Directory tenant A prerequisite for running the Microsoft Entra Verified ID service is that it's hosted in an Azure Active Directory (Azure AD) tenant. The Azure AD tenant provides an Identity and Access Management (IAM) control plane for the Azure resources that are part of the solution. Each tenant uses the multi-tenant Microsoft Entra Verified ID service, and has a decentralized identifier (DID). The DID provides proof that the issuer owns the domain incorporated into the DID. The DID is used by the subject and the verifier to validate the issuer. Microsoft Azure services The Azure Key Vault service stores your issuer keys, which are generated when you initiate the Microsoft Entra Verified ID issuance service. The keys and metadata are used to execute credential management operations and provide message security. Each issuer has a single key set used for signing, updating, and recovery. This key set is used for every issuance of every verifiable credential you produce. Microsoft Entra Verified ID Service is used to store credential metadata and definitions; specifically, the rules and display definitions for your credentials. Display definitions determine how claims are displayed in the holder’s wallet and also include branding and other elements. The display definition can be localized into multiple languages. See How to customize your verifiable credentials. Rules are an issuer-defined model that describes the required inputs of a verifiable credential. Rules also define trusted input sources, and the mapping of input claims to output claims stored in the VC. 
Depending on the type of attestation defined in the rules definition, the input claims can come from different providers. Input claims may come from an OIDC Identity Provider, from an id_token_hint, or they may be self-asserted during issuance via user input in the wallet. Inputs are a subset of the model in the rules file, provided for client consumption. The subset must describe the set of inputs, where to obtain the inputs, and the endpoint to call to obtain a verifiable credential. Microsoft Entra Verified ID service The Microsoft Entra Verified ID service enables you to issue and revoke VCs based on your configuration. The service: Provisions the decentralized identifier (DID). Each issuer has a single DID per tenant. Provisions key sets to Key Vault. Stores the configuration metadata used by the issuance service and Microsoft Authenticator. Provides a REST API interface for issuer and verifier web front ends Trust System Microsoft Entra Verified ID currently supports two trust systems. One is the Identity Overlay Network (ION), a Sidetree-based network that uses Bitcoin’s blockchain for decentralized identifier (DID) implementation. The DID document of the issuer is stored in ION and is used to perform cryptographic signature checks by parties to the transaction. The other trust system is DID Web, where the DID document is hosted on the issuer's web server. Microsoft Authenticator application Microsoft Authenticator is the mobile application that orchestrates the interactions between the user, the Microsoft Entra Verified ID service, and dependencies that are described in the contract used to issue VCs. It acts as a digital wallet in which the holder of the VC stores the VC, including the private key of the subject of the VC. Authenticator is also the mechanism used to present VCs for verification. 
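The rules-definition idea of mapping input claims to output claims can be sketched as follows. This is an illustrative Python sketch; the mapping table and function name are assumptions for illustration, not the service's actual schema:

```python
# Illustrative sketch: a rules-style claims mapping takes input claims
# (here, as if from an id_token_hint) and produces the output claims
# that end up stored in the VC. The mapping table is a made-up example.
def map_claims(mapping, input_claims):
    # mapping: {input_claim_name: output_claim_name}
    return {out: input_claims[src]
            for src, out in mapping.items() if src in input_claims}

mapping = {'given_name': 'firstName', 'family_name': 'lastName'}
result = map_claims(mapping, {'given_name': 'Megan', 'family_name': 'Bowen'})
print(result)  # {'firstName': 'Megan', 'lastName': 'Bowen'}
```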
Issuance business logic Your issuance solution includes a web front end where users request a VC, an identity store and/or other attribute store to obtain values for claims about the subject, and other backend services. A web front end serves issuance requests to the subject’s wallet by generating deep links or QR codes. Based on the configuration of the contract, other components might be required to satisfy the requirements to create a VC. These services provide supporting roles that don't necessarily need to integrate with ION or Microsoft Entra Verified ID issuance service. This layer typically includes: Open ID Connect (OIDC)-compliant service or services are used to obtain id_tokens needed to issue the VC. Existing identity systems such as Azure AD or Azure AD B2C can provide the OIDC-compliant service, as can custom solutions such as Identity Server. Attribute stores – These might be outside of directory services and provide attributes needed to issue a VC. For example, a student information system might provide claims about degrees earned. Additional middle-tier services that contain business rules for lookups, validating, billing, and any other runtime checks and workflows needed to issue credentials. For more information on setting up your web front end, see the tutorial Configure your Azure AD to issue verifiable credentials. Credential Design Considerations Your specific use cases determine your credential design. The use case will determine: the interoperability requirements; the way users will need to prove their identity to get their VC; the claims that are needed in the credentials; and whether credentials will ever need to be revoked. Credential Use Cases With Microsoft Entra Verified ID, the most common credential use cases are: Identity Verification: a credential is issued based on multiple criteria. 
This may include verifying the authenticity of government-issued documents like a passport or driver’s license and correlating the information in that document with other information such as: a user’s selfie and a verification of liveness. This kind of credential is a good fit for identity onboarding scenarios of new employees, partners, service providers, students, and other instances where identity verification is essential. Proof of employment/membership: a credential is issued to prove a relationship between the user and an institution. This kind of credential is a good fit to access loosely coupled business-to-business applications, such as retailers offering discounts to employees or students. One main value of VCs is their portability: Once issued, the user can use the VC in many scenarios. For more use cases, see Verifiable Credentials Use Cases (w3.org). Credential interoperability As part of the design process, investigate industry-specific schemas, namespaces, and identifiers to which you can align to maximize interoperability and usage. Examples can be found in Schema.org and the DIF - Claims and Credentials Working Group. Common schemas are an area where standards are still emerging. One example of such an effort is the Verifiable Credentials for Education Task Force. We encourage you to investigate and contribute to emerging standards in your organization's industry. Credential Type and Attributes After establishing the use case for a credential, you need to decide the credential type and what attributes to include in the credential. Verifiers can read the claims in the VC presented by the users. All verifiable credentials must declare their type in their rules definition. The credential type distinguishes a verifiable credential’s schema from other credentials, and it ensures interoperability between issuers and verifiers. To indicate a credential type, provide one or more credential types that the credential satisfies. 
Each type is represented by a unique string. Often, a URI is used to ensure global uniqueness. The URI doesn't need to be addressable. It's treated as a string. As an example, a diploma credential issued by Contoso University might declare the following types: |Type||Purpose| |Declares that diplomas issued by Contoso University contain attributes defined by the schema.org | |Declares that diplomas issued by Contoso University contain attributes defined by the U.S. Department of Education.| |Declares that diplomas issued by Contoso University contain attributes defined by Contoso University.| In addition to the industry-specific standards and schemas that might be applicable to your scenarios, consider the following aspects: Minimize private information: Meet the use cases with the minimal amount of private information necessary. For example, a VC used for e-commerce websites that offers discounts to employees and alumni can be fulfilled by presenting the credential with just the first and last name claims. Additional information such as hiring date, title, or department isn't needed. Favor abstract claims: Each claim should meet the need while minimizing the detail. For example, a claim named “ageOver” with discrete values such as “13”, “21”, “60”, is more abstract than a date of birth claim. Plan for revocability: We recommend you define an index claim to enable mechanisms to find and revoke credentials. You are limited to defining one index claim per contract. It is important to note that values for indexed claims aren't stored in the backend, only a hash of the claim value. For more information, see Revoke a previously issued verifiable credential. For other considerations on credential attributes, refer to the Verifiable Credentials Data Model 1.0 (w3.org) specification. Plan quality attributes Plan for performance As with any solution, you must plan for performance. The key areas to focus on are latency and scalability. 
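The 'Favor abstract claims' guidance above can be sketched in code. This illustrative Python sketch (the tier values 13/21/60 come from the ageOver example above; everything else is an assumption) derives an abstract ageOver claim from a birth date instead of disclosing the date itself:

```python
from datetime import date

# Illustrative sketch: issue the highest 'ageOver' tier the holder
# qualifies for, rather than putting the date of birth in the VC.
def age_over_claim(birth, today, tiers=(60, 21, 13)):
    # Compute completed years, accounting for whether the birthday
    # has occurred yet this year.
    age = today.year - birth.year - ((today.month, today.day) < (birth.month, birth.day))
    for tier in tiers:
        if age >= tier:
            return str(tier)
    return None  # holder is below every tier

print(age_over_claim(date(1990, 5, 1), date(2022, 11, 16)))  # '21'
```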
During initial phases of a release cycle, performance shouldn't be a concern. However, when adoption of your issuance solution results in many verifiable credentials being issued, performance planning might become a critical part of your solution. The following provides areas to consider when planning for performance: The Microsoft Entra Verified ID issuance service is deployed in West Europe, North Europe, West US 2, and West Central US Azure regions. If your Azure Active Directory tenant resides within EU, the Microsoft Entra Verified ID service will be in EU too. To limit latency, deploy your issuance frontend website and key vault in the region listed above that is closest to where requests are expected to originate. Model based on throughput: The Issuer service is subject to Azure Key Vault service limits. For Azure Key Vault, there are three signing operations involved in each VC issuance: One for the issuance request from the website One for the VC created One for the contract download Maximum signing performance of a Key Vault is 2,000 signings per ~10 seconds. This is about 12,000 signings per minute. This means your solution can support up to 4,000 VC issuances per minute. You can't control throttling; however, we recommend you read Azure Key Vault throttling guidance. If you are planning a large rollout and onboarding of VCs, consider batching VC creation to ensure you don't exceed limits. As part of your plan for performance, determine what you will monitor to better understand the performance of the solution. In addition to application-level website monitoring, consider the following as you define your VC issuance monitoring strategy: For scalability, consider implementing metrics for the following: Define the logical phases of your issuance process. 
For example: Initial request Servicing of the QR code or deep link Attribute lookup Calls to Microsoft Entra Verified ID issuance service Credential issued Define metrics based on the phases: Total count of requests (volume) Requests per unit of time (throughput) Time spent (latency) Monitor Azure Key Vault using the following: Monitor the components used for your business logic layer. Plan for reliability To plan for reliability, we recommend: After you define your availability and redundancy goals, use the following guides to understand how to achieve your goals: For frontend and business layer, your solution can manifest in an unlimited number of ways. As with any solution, for the dependencies you identify, ensure that the dependencies are resilient and monitored. In the rare event that the Microsoft Entra Verified ID issuance service or Azure Key Vault services become unavailable, the entire solution will become unavailable. Plan for compliance Your organization may have specific compliance needs related to your industry, type of transactions, or country of operation. Data residency: The Microsoft Entra Verified ID issuance service is deployed in a subset of Azure regions. The service is used for compute functions only. We don't store values of verifiable credentials in Microsoft systems. However, as part of the issuance process, personal data is sent and used when issuing VCs. Using the VC service shouldn't impact data residency requirements. If, as part of identity verification, you store any personal information, it should be stored in a manner and region that meets your compliance requirements. For Azure-related guidance, visit the Microsoft Trust Center website. Revoking credentials: Determine if your organization will need to revoke credentials. For example, an admin may need to revoke credentials when an employee leaves the company. 
Or if a credential is issued for a driver’s license, and the holder is caught doing something that would cause the driver’s license to be suspended, the VC might need to be revoked. For more information, see Revoke a previously issued verifiable credential. Expiring credentials: Determine if you will expire credentials, and if so under what circumstances. For example, if you issue a VC as proof of having a driver’s license, it might expire after a few years. If you issue a VC as a verification of an association with a user, you may want to expire it annually to ensure users come back annually to get the most updated version of the VC. Plan for operations When planning for operations, it is critical that you develop a schema to use for troubleshooting, reporting, and distinguishing the various customers you support. Additionally, if the operations team is responsible for executing VC revocation, that process must be defined. Each step in the process should be correlated so that you can determine which log entries can be associated with each unique issuance request. For auditing, we recommend you capture each attempt of credential issuing individually. Specifically: Generate unique transaction IDs that customers and support engineers can refer to as needed. Devise a mechanism to correlate the logs of Azure Key Vault transactions to the transaction IDs of the issuance portion of the solution. If you are an identity verification service issuing VCs on behalf of multiple customers, use the customer or contract ID for customer-facing reporting and billing, and for monitoring and mitigation. Plan for security As part of your design considerations focused on security, we recommend the following: For key management: Create a dedicated Key Vault for VC issuance. 
Limit Azure Key Vault permissions to the Microsoft Entra Verified ID issuance service and the issuance service frontend website service principal. Treat Azure Key Vault as a highly privileged system - Azure Key Vault issues credentials to customers. We recommend that no human identities have standing permissions over the Azure Key Vault service. Administrators should have only just-in-time access to Key Vault. For more best practices for Azure Key Vault usage, refer to Azure Security Baseline for Key Vault. For the service principal that represents the issuance frontend website: Define a dedicated service principal to authorize access to Azure Key Vault. If your website is on Azure, we recommend that you use an Azure Managed Identity. Treat the service principal that represents the website and the user as a single trust boundary. While it is possible to create multiple websites, there is only one key set for the issuance solution. For security logging and monitoring, we recommend the following: Enable logging and alerting of Azure Key Vault to track credential issuance operations, key extraction attempts, and permission changes, and to monitor and send alerts for configuration changes. More information can be found at How to enable Key Vault logging. Archive logs in a security information and event management (SIEM) system, such as Microsoft Sentinel, for long-term retention. Mitigate spoofing risks by using DNS verification to help customers identify issuer branding: domain names that are meaningful to end users, and trusted branding the end user recognizes. Mitigate distributed denial of service (DDOS) and Key Vault resource exhaustion risks. Every request that triggers a VC issuance request generates Key Vault signing operations that accrue towards service limits. We recommend protecting traffic by incorporating authentication or captcha before generating issuance requests. 
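The Key Vault service-limit arithmetic from the performance planning section above can be checked with a few lines. This is an illustrative Python sketch of the quoted figures (2,000 signings per ~10 seconds, three signings per issuance), not an official capacity calculator:

```python
# Illustrative sketch of the throughput figures quoted in the planning
# guidance: ~2,000 Key Vault signings per 10 seconds, and 3 signings
# per VC issuance (request, VC creation, contract download).
SIGNINGS_PER_10_SECONDS = 2000
SIGNINGS_PER_ISSUANCE = 3

signings_per_minute = SIGNINGS_PER_10_SECONDS * 6  # six 10-second windows
issuances_per_minute = signings_per_minute // SIGNINGS_PER_ISSUANCE

print(signings_per_minute, issuances_per_minute)  # 12000 4000
```

This reproduces the document's own numbers: about 12,000 signings per minute, supporting up to 4,000 VC issuances per minute.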
For guidance on managing your Azure environment, we recommend you review the Microsoft cloud security benchmark and Securing Azure environments with Azure Active Directory. These guides provide best practices for managing the underlying Azure resources, including Azure Key Vault, Azure Storage, websites, and other Azure-related services and capabilities. Additional considerations When you complete your POC, gather all the information and documentation generated, and consider tearing down the issuer configuration. This will help avoid issuing verifiable credentials after your POC timeframe expires. For more information on Key Vault implementation and operation, refer to Best practices to use Key Vault. For more information on Securing Azure environments with Active Directory, refer to Securing Azure environments with Azure Active Directory. Next steps Read the architectural overview Plan your verification solution Get started with verifiable credentials",https://learn.microsoft.com/en-us/azure/active-directory/verifiable-credentials/plan-issuance-solution,,Docs,,Product,,,,,,,,2022-10-19,,,,,,,,,,,,,
|
||
Microsoft,Microsoft,,Microsoft Entra Verified ID documentation,,,,,,,Plan your verification solution,"Microsoft’s Microsoft Entra Verified ID (Azure AD VC) service enables you to trust proofs of user identity without expanding your trust boundary. With Azure AD VC, you don't need to create accounts or federate with another identity provider. When a solution implements a verification exchange using verifiable credentials, it enables applications to request credentials that aren't bound to a specific domain. This approach makes it easier to request and verify credentials at scale.","Plan your Microsoft Entra Verified ID verification solution Note Azure Active Directory Verifiable Credentials is now Microsoft Entra Verified ID and part of the Microsoft Entra family of products. Learn more about the Microsoft Entra family of identity solutions and get started in the unified Microsoft Entra admin center. Microsoft’s Microsoft Entra Verified ID (Azure AD VC) service enables you to trust proofs of user identity without expanding your trust boundary. With Azure AD VC, you don't need to create accounts or federate with another identity provider. When a solution implements a verification exchange using verifiable credentials, it enables applications to request credentials that aren't bound to a specific domain. This approach makes it easier to request and verify credentials at scale. If you haven’t already, we suggest you review the Microsoft Entra Verified ID architecture overview. You may also want to review Plan your Microsoft Entra Verified ID issuance solution. Scope of guidance This content covers the technical aspects of planning for a verifiable credential (VC) verification solution using Microsoft products and services. The solution interfaces with a trust system, where currently supported trust systems are Identity Overlay Network (ION) or DID Web. ION acts as the decentralized public key infrastructure (DPKI) while DID Web is a centralized public key infrastructure. 
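The DID Web trust system mentioned above can be sketched as follows. This illustrative Python sketch follows the did:web method convention of hosting the DID document on the identifier's web server; error handling and port encoding are simplified, and the example DIDs are placeholders:

```python
from urllib.parse import unquote

# Illustrative sketch of did:web resolution: the DID document is fetched
# from the identifier's own web server rather than from a ledger like ION.
def did_web_to_url(did):
    if not did.startswith('did:web:'):
        raise ValueError('not a did:web DID')
    # Colon-separated segments after the method become URL path segments.
    parts = [unquote(p) for p in did[len('did:web:'):].split(':')]
    host, path = parts[0], parts[1:]
    if path:
        return 'https://' + host + '/' + '/'.join(path) + '/did.json'
    # A bare domain resolves to the well-known location.
    return 'https://' + host + '/.well-known/did.json'

print(did_web_to_url('did:web:example.com'))
# https://example.com/.well-known/did.json
```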
Supporting technologies that aren't specific to verification solutions are out of scope. For example, websites are used in a verifiable credential verification solution but planning a website deployment isn't covered in detail. As you plan your verification solution, you must consider what business capability is being added or modified. You must also consider what IT capabilities can be reused, and what capabilities must be added to create the solution. Also consider what training is needed for the people involved in the business process and the people that support the end users and staff of the solution. These topics aren't covered in this content. We recommend reviewing the Microsoft Azure Well-Architected Framework for information covering these topics. Components of the solution As part of your plan for a verification solution, you must enable the interactions between the verifier, the subject, and the issuer. In this article, the terms relying party and verifier are used interchangeably. The following diagram shows the components of your verification architecture. Microsoft Entra Verified ID service In the context of a verifier solution, the Microsoft Entra Verified ID service is the interface between the Microsoft components of the solution and the trust system. The service provisions the key set to Key Vault and provisions the decentralized identifier (DID). In the case of ION, the service writes the DID document to the distributed ledger, where it can be used by subjects and issuers. Azure Active Directory tenant The service requires an Azure AD tenant that provides an Identity and Access Management (IAM) control plane for the Azure resources that are part of the solution. Each Azure AD tenant uses the multi-tenant Microsoft Entra Verified ID service, and it issues a single DID document representing the verifier. If you have multiple relying parties using your verification service, they all use the same verifier DID. 
The verifier DID provides pointers to the public key that allows subjects and issuers to validate messages that come from the relying party. Azure Key Vault The Azure Key Vault service stores your verifier keys, which are generated when you enable the Microsoft Entra Verified ID issuance service. The keys are used to provide message security. Each verifier has a single key set used for signing, updating, and recovering VCs. This key set is used each time you service a verification request. The Microsoft key set currently uses Elliptic Curve Cryptography (ECC) SECP256k1. We're exploring other cryptographic signature schemas that will be adopted by the broader DID community. Request Service API Application programming interfaces (APIs) provide developers a method to abstract interactions between components of the solution to execute verification operations. Trust System Microsoft Entra Verified ID currently supports two trust systems. One is Identity Overlay Network (ION), a Sidetree-based network that uses Bitcoin’s blockchain for decentralized identifier (DID) implementation. The DID document of the issuer is stored in ION and is used to perform cryptographic signature checks by parties to the transaction. The other trust system is DID Web, where the DID document is hosted on the issuer's web server. Microsoft Authenticator application Microsoft Authenticator is the mobile application that orchestrates the interactions between the relying party, the user, the Microsoft Entra Verified ID issuance service, and dependencies described in the contract used to issue VCs. Microsoft Authenticator acts as a digital wallet in which the holder of the VC stores the VC. It's also the mechanism used to present VCs for verification. Relying party (RP) Web front end The relying party web front end uses the Request Service API to verify VCs by generating deep links or QR codes that are consumed by the subject’s wallet. 
Depending on the scenario, the front end can be a publicly accessible or internal website to enable end-user experiences that require verification. However, the endpoints that the wallet accesses must be publicly accessible. Specifically, it controls redirection to the wallet with specific request parameters. This is accomplished using the Microsoft-provided APIs. Business logic You can create new logic or use existing logic that is specific to the relying party and enhance that logic with the presentation of VCs. Scenario-specific designs The following are examples of designs to satisfy specific use cases. The first is for account onboarding, used to reduce the time, cost, and risk associated with onboarding new employees. The second is for account recovery, which enables an end user to recover or unlock their account using a self-service mechanism. The third is for accessing high-value applications and resources, specifically for business-to-business use cases where access is given to people that work for other companies. Account onboarding Verifiable credentials can be used to enable faster onboarding by replacing some human interactions. VCs can be used to onboard employees, students, citizens, or others to access services. For example, rather than an employee needing to go to a central office to activate an employee badge, they can use a VC to verify their identity to activate a badge that is delivered to them remotely. Rather than a citizen receiving a code they must redeem to access governmental services, they can use a VC to prove their identity and gain access. Other elements Onboarding portal: A web front end that orchestrates the Request Service API calls for VC presentation and validation, and the logic to onboard accounts. Custom logic / workflows: Specific logic with organization-specific steps before and after updating the user account. Examples might include approval workflows, other validations, logging, notifications, and so on. 
Target identity systems: Organization-specific identity repositories that the onboarding portal needs to interact with while onboarding subjects. The systems to integrate are determined based on the kinds of identities you want to onboard with VC validation. Common scenarios of identity verification for onboarding include: External Identities such as vendors, partners, suppliers, and customers, which in centralized identity systems onboard to Azure AD using APIs to issue business-to-business (B2B) invitations, or entitlement management assignment to packages. Employee identities, which in centralized identity systems are already onboarded through human resources (HR) systems. In this case, the identity verification might be integrated as part of existing stages of HR workflows. Design Considerations Issuer: Account onboarding is a good fit for an external identity-proofing service as the issuer of the VCs. Examples of checks for onboarding include: liveness check, government-issued document validation, address, or phone number confirmation, and so on. Storing VC Attributes: Where possible don't store attributes from VCs in your app-specific store. Be especially careful with Personal data. If this information is required by specific flows within your applications, consider asking for the VC to retrieve the claims on demand. VC Attribute correlation with back-end systems: When defining the attributes of the VC with the issuer, establish a mechanism to correlate information in the back-end system after the user presents the VC. The mechanism typically uses a time-bound, unique identifier in the context of your RP in combination with the claims you receive. Some examples: New employee: When the HR workflow reaches the point where identity proofing is required, the RP can generate a link with a time-bound unique identifier. The RP then sends it to the candidate’s email address on the HR system. 
This unique identifier should be sufficient to correlate information such as firstName, lastName from the VC verification request to the HR record or underlying data. The attributes in the VC can be used to complete user attributes in the HR system, or to validate accuracy of user attributes about the employee. External identities - invitation: When an existing user in your organization invites an external user to be onboarded in the target system, the RP can generate a link with a unique identifier that represents the invitation transaction and sends it to the external user’s email address. This unique identifier should be sufficient to correlate the VC verification request to the invitation record or underlying data and continue the provisioning workflow. The attributes in the VC can be used to validate or complete the external user attributes. External identities - self-service: When external identities sign up to the target system through self-service (for example, a B2C application) the attributes in the VC can be used to populate the initial attributes of the user account. The VC attributes can also be used to find out if a profile already exists. Interaction with target identity systems: The service-to-service communication between the web front end and your target identity systems needs to be secured as a highly privileged system, because it can create accounts. Grant the web front end the least privileged roles possible. Some examples include: To create a new user in Azure AD, the RP website can use a service principal that is granted the MS Graph scope of User.ReadWrite.All to create users, and the scope UserAuthenticationMethod.ReadWrite.All to reset their authentication method. To invite users to Azure AD using B2B collaboration, the RP website can use a service principal that is granted the MS Graph scope of User.Invite.All to create invitations. If your RP is running in Azure, use Managed Identities to call Microsoft Graph. 
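As a sketch of the least-privilege principle above (the Microsoft Graph scope names come from the text; the allow-list helper itself is hypothetical, not a Microsoft API):

```python
# Hypothetical guard: each onboarding scenario may only ever request the
# minimum Microsoft Graph scopes that the guidance above calls out for it.
ALLOWED_SCOPES = {
    'create_user': {'User.ReadWrite.All', 'UserAuthenticationMethod.ReadWrite.All'},
    'invite_b2b': {'User.Invite.All'},
}

def request_scopes(scenario, requested):
    allowed = ALLOWED_SCOPES.get(scenario, set())
    excess = set(requested) - allowed
    if excess:
        # Fail closed: never ask Graph for more than the scenario needs.
        raise ValueError('scopes exceed least privilege: ' + ', '.join(sorted(excess)))
    return sorted(requested)
```

A similar check at deployment time can catch a service principal that has drifted beyond the roles the relying party actually needs.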
Using managed identities removes the risks of managing service principal credentials in code or configuration files. To learn more about Managed identities, go to Managed identities for Azure resources. Accessing high-value applications inside organizations Verifiable credentials can be used as additional proof to access sensitive applications inside the organization. For example, VCs can be used to provide employees with access to line-of-business applications based on achieving specific criteria, such as a certification. Other elements Relying party web frontend: This is the web frontend of the application that is enhanced through Request Service API calls for VC presentation and validation, based on your business requirements. User access authorization logic: Logic layer in the application that authorizes user access and is enhanced to consume the user attributes inside the VC to make authorization decisions. Other backend services and dependencies: Represents the rest of the logic of the application, which typically is unchanged by the inclusion of identity proofing through VCs. Design Considerations Goal: The goal of the scenario determines what kind of credential and issuer is needed. Typical scenarios include: Authorization: In this scenario, the user presents the VC to make an authorization decision. VCs designed for proof of completion of a training or holding a specific certification are a good fit for this scenario. The VC attributes should contain fine-grained information conducive to authorization decisions and auditing. For example, if the VC is used to certify the individual is trained and can access sensitive financial apps, the app logic can check the department claim for fine-grained authorization, and use the employee ID for audit purposes. Confirmation of identity verification: In this scenario, the goal is to confirm that the same person who initially onboarded is indeed the one attempting to access the high-value application. 
A credential from an identity verification issuer would be a good fit and the application logic should validate that the attributes from the VC align with the user who logged in the application. Check Revocation: When using VCs to access sensitive resources, it is common to check the status of the VC with the original issuer and deny access for revoked VCs. When working with the issuers, ensure that revocation is explicitly discussed as part of the design of your scenario. User Experience: When using VCs to access sensitive resources, there are two patterns you can consider. Step-up authentication: users start the session with the application with existing authentication mechanisms. Users must present a VC for specific high-value operations within the application such as approvals of business workflows. This is a good fit for scenarios where such high-value operations are easy to identify and update within the application flows. Session establishment: Users must present a VC as part of initiating the session with the application. This is a good fit when the nature of the entire application is high-value. Accessing applications outside organization boundaries Verifiable credentials can also be used by relying parties that want to grant access or benefits based on membership or employment relationship of a different organization. For example, an e-commerce portal can offer benefits such as discounts to employees of a particular company, students of a given institution, etc. The decentralized nature of verifiable credentials enables this scenario without establishing federation relationships. Other elements Relying party web frontend: This is the web frontend of the application that is enhanced through Request Service API calls for VC presentation and validation, based on your business requirements. 
User access authorization logic: Logic layer in the application that authorizes user access and is enhanced to consume the user attributes inside the VC to make authorization decisions. Other backend services and dependencies: Represents the rest of the logic of the application, which typically is unchanged by the inclusion of identity proofing through VCs. Design Considerations Goal: The goal of the scenario determines what kind of credential and issuer is needed. Typical scenarios include: Authentication: In this scenario, a user must possess a VC to prove employment or a relationship with a particular organization. In this case, the RP should be configured to accept VCs issued by the target organizations. Authorization: Based on the application requirements, the applications might consume the VC attributes for fine-grained authorization decisions and auditing. For example, if an e-commerce website offers discounts to employees of the organizations in a particular location, they can validate this based on the country claim in the VC (if present). Check Revocation: When using VCs to access sensitive resources, it is common to check the status of the VC with the original issuer and deny access for revoked VCs. When working with the issuers, ensure that revocation is explicitly discussed as part of the design of your scenario. User Experience: Users can present a VC as part of initiating the session with the application. Typically, applications also provide an alternative method to start the session to accommodate cases where users don’t have VCs. Account recovery Verifiable credentials can be used as an approach to account recovery. For example, when a user needs to recover their account, they might access a website that requires them to present a VC and initiate an Azure AD credential reset by calling MS Graph APIs as shown in the following diagram. 
Note: While the scenario we describe in this section is specific to recovering Azure AD accounts, this approach can also be used to recover accounts in other systems. Other Elements Account portal: This is a web front end that orchestrates the API calls for VC presentation and validation. This orchestration can include Microsoft Graph calls to recover accounts in Azure AD. Custom logic or workflows: Logic with organization-specific steps before and after updating the user account. This might include approval workflows, other validations, logging, notifications, etc. Microsoft Graph: Exposes representational state transfer (REST) APIs and client libraries to access Azure AD data that is used to perform account recovery. Azure AD enterprise directory: This is the Azure AD tenant that contains the accounts that are being created or updated through the account portal. Design considerations VC Attribute correlation with Azure AD: When defining the attributes of the VC in collaboration with the issuer, establish a mechanism to correlate information with internal systems based on the claims in the VC and user input. For example, if you have an identity verification provider (IDV) verify identity prior to onboarding employees, ensure that the issued VC includes claims that would also be present in an internal system such as a human resources system for correlation. This might be a phone number, address, or date of birth. In addition to claims in the VC, the RP can ask for some information such as the last four digits of their social security number (SSN) as part of this process. Role of VCs with Existing Azure AD Credential Reset Capabilities: Azure AD has a built-in self-service password reset (SSPR) capability. Verifiable Credentials can be used to provide another way to recover, particularly in cases where users do not have access to, or have lost control of, the SSPR method, for example, when they’ve lost both their computer and mobile device. 
In this scenario, the user can reobtain a VC from an identity proof issuer and present it to recover their account. Similarly, you can use a VC to generate a temporary access pass that will allow users to reset their MFA authentication methods without a password. Authorization: Create an authorization mechanism such as a security group that the RP checks before proceeding with the credential recovery. For example, only users in specific groups might be eligible to recover an account with a VC. Interaction with Azure AD: The service-to-service communication between the web front end and Azure AD must be secured as a highly privileged system because it can reset employees’ credentials. Grant the web front end the least privileged roles possible. Some examples include: Grant the RP website the ability to use a service principal granted the MS Graph scope UserAuthenticationMethod.ReadWrite.All to reset authentication methods. Don’t grant User.ReadWrite.All, which enables the ability to create and delete users. If your RP is running in Azure, use Managed Identities to call Microsoft Graph. This removes the risks around managing service principal credentials in code or configuration files. For more information, see Managed identities for Azure resources. Plan for identity management Below are some IAM considerations when incorporating VCs into relying parties. Relying parties are typically applications. Authentication The subject of a VC must be a human. Presentation of VCs must be interactively performed by a human VC holder, who holds the VC in their wallet. Non-interactive flows such as on-behalf-of are not supported. Authorization A successful presentation of the VC can be considered a coarse-grained authorization gate by itself. The VC attributes can also be consumed for fine-grained authorization decisions. 
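A minimal sketch of the coarse- and fine-grained gates described above (the department and employeeId claim names mirror the earlier financial-app example; the policy functions themselves are hypothetical):

```python
# Coarse gate: the presentation itself must have been verified (and not revoked).
# Fine gate: individual claims from the VC drive the authorization decision.
def authorize_finance_app(presentation):
    if not presentation.get('verified'):
        return False                      # coarse-grained gate failed
    claims = presentation.get('claims', {})
    return claims.get('department') == 'Finance'

def audit_entry(presentation, decision):
    # Keep only what auditing needs, not the whole credential.
    return {
        'employee_id': presentation.get('claims', {}).get('employeeId'),
        'allowed': decision,
    }
```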
Determine if an expired VC has meaning in your application; if so check the value of the exp claim (the expiration time) of the VC as part of the authorization checks. One example where expiration is not relevant is requiring a government-issued document such as a driver’s license to validate if the subject is older than 18. The date of birth claim is valid, even if the VC is expired. Determine if a revoked VC has meaning to your authorization decision. If it is not relevant, then skip the call to the status check API (which is on by default). If it is relevant, add the proper handling of exceptions in your application. User Profiles You can use information in presented VCs to build a user profile. If you want to consume attributes to build a profile, consider the following. When the VC is issued, it contains a snapshot of attributes as of issuance. VCs might have long validity periods, and you must determine the age of attributes that you will accept as sufficiently fresh to use as a part of the profile. If a VC needs to be presented every time the subject starts a session with the RP, consider using the output of the VC presentation to build a non-persistent user profile with the attributes. This helps to reduce privacy risks associated with storing user properties at rest. If the subject’s attributes need to be persisted locally by the application, only store the minimal set of claims required by your application (as opposed to storing the entire content of the VC). If the application requires a persistent user profile store: Consider using the sub claim as an immutable identifier of the user. This is an opaque unique attribute that will be constant for a given subject/RP pair. Define a mechanism to deprovision the user profile from the application. Due to the decentralized nature of the Microsoft Entra Verified ID system, there is no application user provisioning lifecycle. Do not store Personal data claims returned in the VC token. 
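The expiration and minimal-claims guidance above might be modeled like this (a sketch; exp is the standard JWT expiration claim in epoch seconds, and the helper names are illustrative):

```python
import time

# Decide whether an expired VC still matters for this particular check.
def vc_usable(vc, expiration_relevant, now=None):
    now = time.time() if now is None else now
    if expiration_relevant and vc.get('exp', 0) <= now:
        return False   # expired, and expiration matters here: reject
    return True        # e.g. a date-of-birth claim stays valid past exp

# Persist only the claims the relying party actually needs,
# rather than the entire content of the VC.
def minimal_profile(vc_claims, needed=('sub', 'given_name')):
    return {k: vc_claims[k] for k in needed if k in vc_claims}
```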
Only store claims needed for the logic of the relying party. Plan for performance As with any solution, you must plan for performance. Focus areas include latency, throughput, and scalability. During initial phases of a release cycle, performance shouldn't be a concern. However, when adoption of your solution results in many verifiable credentials being verified, performance planning might become a critical part of your solution. The following provides areas to consider when planning for performance: The Microsoft Entra Verified ID issuance service is deployed in West Europe, North Europe, West US 2, and West Central US Azure regions. To limit latency, deploy your verification front end (website) and key vault in the region listed above that is closest to where requests are expected to originate from. Model based on throughput: VC verification capacity is subject to Azure Key Vault service limits. Each verification of a VC requires one Key Vault signature operation. Maximum signing performance of a Key Vault is about 2,000 signings per 10 seconds. This means your solution can support up to 12,000 VC validation requests per minute. You can't control throttling; however, we recommend you read Azure Key Vault throttling guidance so that you understand how throttling might impact performance. Plan for reliability To best plan for high availability and disaster recovery, we suggest the following: Microsoft Entra Verified ID service is deployed in the West Europe, North Europe, West US 2, and West Central US Azure regions. Consider deploying your supporting web servers and supporting applications in one of those regions, specifically in the ones from which you expect most of your validation traffic to originate. Review and incorporate best practices from Azure Key Vault availability and redundancy as you design for your availability and redundancy goals. 
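The throughput model above reduces to simple arithmetic (figures are taken from the text; real limits should be confirmed against the current Azure Key Vault service limits):

```python
# One VC verification = one Key Vault signing operation.
SIGNINGS_PER_WINDOW = 2000      # per ~10-second window, per the guidance above
WINDOW_SECONDS = 10

def max_validations_per_minute():
    per_second = SIGNINGS_PER_WINDOW / WINDOW_SECONDS
    return int(per_second * 60)
```

which yields the 12,000 validation requests per minute quoted above.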
Plan for security As you are designing for security, consider the following: All relying parties (RPs) in a single tenant have the same trust boundary since they share the same DID. Define a dedicated service principal for a website accessing the Key Vault. Only the Microsoft Entra Verified ID service and the website service principals should have permissions to use Key Vault to sign messages with the private key. Don't assign any human identity administrative permissions to the Key Vault. For more information on Key Vault best practices, see Azure Security Baseline for Key Vault. Review Securing Azure environments with Azure Active Directory for best practices for managing the supporting services for your solution. Mitigate spoofing risks by: Implementing DNS verification to help customers identify issuer branding. Using domains that are meaningful to end users. Mitigate distributed denial of service (DDOS) and Key Vault resource throttling risks. Every VC presentation request generates Key Vault signing operations that accrue towards service limits. We recommend protecting traffic by incorporating alternative authentication or captcha before generating issuance requests. Plan for operations As you plan for operations, we recommend that you capture each attempt of credential validation as part of your auditing. Use that information for auditing and troubleshooting. Additionally, consider generating unique transaction identifiers (IDs) that customers and support engineers can refer to if needed. As part of your operational planning, consider monitoring the following: For scalability: Monitor failed VC validation as a part of end-to-end security metrics of applications. Monitor end-to-end latency of credential verification. For reliability and dependencies: Monitor underlying dependencies used by the verification solution. Follow Azure Key Vault monitoring and alerting. 
For security: Enable logging for Key Vault to track signing operations, and to monitor and alert on configuration changes. Refer to How to enable Key Vault logging for more information. Archive logs in a security information and event management (SIEM) system, such as Microsoft Sentinel, for long-term retention. Next steps Learn more about architecting VC solutions Implement Verifiable Credentials",https://learn.microsoft.com/en-us/azure/active-directory/verifiable-credentials/plan-verification-solution,,Docs,,Product,,,,,,,,2022-10-19,,,,,,,,,,,,,
|
||
Microsoft,techmindfactory,,,Daniel Krzyczkowski,,,,,,Azure Active Directory VCs - preview introduction,"Once I discovered that documentation is available, I decided to create a small proof of concept. I have configured Verifiable Credentials according to the details in the documentation. I have an existing Azure AD B2C tenant so it was much easier because users have to sign in first before they can be issued a verifiable credential.","Azure Active Directory Verifiable Credentials - preview introduction Introduction Azure Active Directory Verifiable Credentials are now in a public preview mode (at the moment of writing this article). You can visit the official page to read more. On this website, you will find all the details about how to start using Verifiable Credentials with Azure Active Directory. There is also great documentation with all details required to set up Verifiable Credentials in your own Azure Active Directory tenant. Small proof of concept Once I discovered that documentation is available, I decided to create a small proof of concept. I have configured Verifiable Credentials according to the details in the documentation. I have an existing Azure AD B2C tenant so it was much easier because users have to sign in first before they can be issued a verifiable credential. In this short article, I decided to share the result (I do not want to write more documentation, because the one provided by the Azure AD Team is great) and confirm that this concept works as expected! Modified website with QR codes to issue Verifiable Credentials Below I present the modified node.js application which is used to display QR codes: Verifiable Credentials in the Microsoft Authenticator App Below I present the user experience in the Microsoft Authenticator App: Confirmed DID in the Identity Overlay Network (ION) Once I created my Verifiable Credential, I verified that it can be found and verified in the ION network. You can read more about it here. 
Summary In this article, I briefly presented a proof of concept related to Verifiable Credentials using Azure Active Directory. In the future, I plan to prepare a blog post series and describe the concepts and implementation in detail. If you want to read more about Azure Active Directory Verifiable Credentials, please check this documentation.",https://techmindfactory.com/azure-ad-verifiable-credentials-intro/,,Post,,Product,,,,,,Entra,,2021-04-07,,,,,,,,,,,,,
|
||
Microsoft,Microsoft,,ID Standards Blog,Ankur Patel,,,,,,Expanding the public preview of verifiable credentials,"more than 1,000 enterprises with premium subscriptions have issued and verified tens of thousands of verifiable credentials […] from remote onboarding at work, collaboration across business boundaries as well as enabling education beyond the campus [...] we are extending the public preview […] for all Azure Active Directory (Azure AD) Free and Premium users.",,https://techcommunity.microsoft.com/t5/azure-active-directory-identity/expanding-the-public-preview-of-verifiable-credentials/ba-p/3295508,,Post,,Product,,,,,,,Verifiable Credentials,2022-05-04,,,,,,,,,,,,,
|
||
Microsoft,Personal,,,Damien Bowden,,,,,,Implement Compound Proof BBS+ verifiable credentials using ASP.NET Core and Mattr,This article shows how Zero Knowledge Proofs BBS+ verifiable credentials can be used to verify credential subject data from two separate verifiable credentials implemented in ASP.NET Core and Mattr. The ZKP BBS+ verifiable credentials are issued and stored on a digital wallet using a Self-Issued Identity Provider (SIOP) and OpenID Connect. A compound proof presentation template is created to verify the user data in a single verify,,https://damienbod.com/2021/12/13/implement-compound-proof-bbs-verifiable-credentials-using-asp-net-core-and-Mattr/,,Post,,Product,,,,,,,,2021-12-13,,,,,,,,,,,,,
|
||
Microsoft,Microsoft,,ID Standards Blog,,,,,,,Microsoft Entra Verified ID now generally available,"Customers rely on Azure AD to secure access to corporate resources. However, enabling use of credentials for utility beyond the company (e.g. prove employment for bank loan) is complex and comes with compliance risk. In contrast, identity documents from our everyday lives, like a driver’s license or passport, are well suited for utility beyond travel (e.g. age or residency). We believe an open standards-based Decentralized Identity system can unlock a new set of experiences that give users and organizations greater control over their data—and deliver a higher degree of trust and security for apps, devices, and service providers. ",,https://techcommunity.microsoft.com/t5/microsoft-entra-azure-ad-blog/microsoft-entra-verified-id-now-generally-available/ba-p/3295506,,Post,,Product,,,,,,Entra,,2022-08-08,,,,,,,,,,,,,
|
||
Microsoft,Microsoft,,ID Standards Blog,,,,,,,Towards scalable decentralized identifier systems,"Today, we’re announcing an early preview of a Sidetree-based DID network, called ION (Identity Overlay Network) which runs atop the Bitcoin blockchain based on an emerging set of open standards that we’ve developed working with many of our partners in the Decentralized Identity Foundation. This approach greatly improves the throughput of DID systems to achieve tens-of-thousands of operations per second",,https://techcommunity.microsoft.com/t5/azure-active-directory-identity/toward-scalable-decentralized-identifier-systems/ba-p/560168,,Post,,Product,,,,,,,"DID,Verifiable Credentials",2019-05-13,,,,,,,,,,,,,
|
||
Microsoft,Microsoft,,ID Standards Blog,Pamela Dingle,,,,,,Why does standards certification matter?,"It’s a good month for identity certification at Microsoft! We are excited to have achieved two important goals: OpenID Certification for Azure Active Directory and also FIDO Certification for Windows 10. You may or may not know what these particular protocols do, but even if you don’t, it’s worth talking about what these certification programs accomplish.","It’s a good month for identity certification at Microsoft! We are excited to have achieved two important goals: OpenID Certification for Azure Active Directory and also FIDO Certification for Windows 10. You may or may not know what these particular protocols do, but even if you don’t, it’s worth talking about what these certification programs accomplish. The goal of certification in the standards world is to ensure conformance to protocols. In FIDO Certification, the tests are both physical and digital; for example, authenticators must prove that they are storing keys and secrets in a secure environment, such as a trusted platform module (TPM), and that the secure environment can only be used when a user gesture is performed. Resistance to physical attacks, such as side-channel attacks, must be demonstrated, as well as protocol conformance. A third party performs this certification, with the goal that anyone who uses a certified product can have reasonable confidence that the solution hasn’t cut any corners. The OpenID Certification is a different beast from FIDO Certification. Because OpenID Connect is a web protocol, there are fewer hidden parts; it’s easier for anyone to inspect and validate the protocol messages exchanged. The OpenID Certification process is therefore lighter weight and uses self-certification. With self-certification, those seeking certification run their own tests. The results of those tests are then published for scrutiny by all. 
In this case, the certifying organization is putting their reputation on the line. It isn’t a third party that claims adherence, it’s the owner of the implementation themselves. While those organizations could lie, most prioritize their reputation over any short-term gain that could come from misrepresentation. A lot of developers have been successfully using OpenID Connect with the Microsoft Identity Platform for years, so what’s the big deal? There are a couple of reasons why it matters. First, certification enables third-party vendors who are completely platform-agnostic to develop with confidence. This gets us closer to a world that requires as little custom connectivity as possible. Second, these tests sometimes catch things! The simple assurance of knowing that the development team has worked through all the edge-cases is valuable, even for established platforms. If you go back a decade to when Security Assertion Markup Language (SAML) implementations were being certified, certification was highly formalized, took a long time, and cost a lot of money. We have iterated on that pattern with OpenID Connect, creating a lightweight and more inclusive practice. I don’t think this is the final frontier for certification, however. I believe that we will see the kinds of standards that lend themselves to automation evolving towards inline “test-driven” certification, where simple checks are performed by underlying layers as part of the everyday software design lifecycle. Indeed, some projects are already using the OpenID Certification test suite in that way. Whether the tests are automated, manual, or process-driven, at the end of the day, the goal is to ensure that what is promised on the outside matches what is implemented on the inside. It takes a lot of time and attention to faithfully implement protocols and certify those implementations, but the effort is worth it. 
Congratulations to our engineering teams on both of our certification achievements!",https://techcommunity.microsoft.com/t5/identity-standards-blog/why-does-standards-certification-matter/ba-p/638937,,Post,,Product,,,,,,,"OpenID,FIDO,OpenID Connect",2019-05-23,,,,,,,,,,,,,
|
||
Microsoft,Microsoft,,,,,,,,,Azure Active Directory Verifiable Credentials,Verifiable credentials help you build solutions that empower customers to manage their own data.,,https://learn.microsoft.com/en-us/azure/active-directory/verifiable-credentials/,,Product,,Product,,,,,,AzureAD,,2022-07-08,,,,,,,,,,,,,
|
||
Microsoft,Microsoft,,,,,,,,,Decentralized identity,Discover the open standards-based solution for verified digital identity that gives people more control and convenience.,,https://www.microsoft.com/en-us/security/technology/own-your-identity,,Product,,Product,,,,,,,,2019-06-18,,,,,,,,,,,,,
|
||
Microsoft,Microsoft,,Azure,,,,,,,Azure Documentation,Find the languages and tools you need to develop on Azure.,,https://learn.microsoft.com/en-us/azure/?product=popular,,Documentation,,Product,,,,,,,,2023-02-25,,,,,,,,,,,,,
|
||
Microsoft,Microsoft,,,,,,,,,Microsoft Entra Verified ID. Verify once. Use everywhere,"Strengthen security and reduce costs with Microsoft Entra<br>Hear Joy Chik, Microsoft Corporate Vice President for Identity, share the latest identity and access announcements in governance, workload identities, strong authentication, and new tools for upgrading from Active Directory Federation Services (AD FS) to Azure AD.","Strengthen security and reduce costs with Microsoft Entra Hear Joy Chik, Microsoft Corporate Vice President for Identity, share the latest identity and access announcements in governance, workload identities, strong authentication, and new tools for upgrading from Active Directory Federation Services (AD FS) to Azure AD. Start your decentralized identity journey Enable more secure interactions with Verified ID, the industry-leading global platform from Microsoft. Quickly onboard employees, partners, and customers Digitally validate information with ID verification providers to ensure trustworthy self-service enrollment and faster onboarding. Access high-value apps and resources Quickly verify an individual’s credentials and status to grant least-privilege access with confidence. Provide self-service account recovery Replace support calls and security questions with a streamlined self-service process to verify identities. Work with a Microsoft Partner Ensure a smooth and secure verifiable credential experience, made possible by Microsoft partnerships with leading identity verification providers. Verified ID capabilities Help people control their digital identity Based on open standards, Verified ID automates verification of identity credentials and enables privacy-protected interactions between organizations and users. How customers are using Verified ID Verified ID is currently available for free.* - Verify and issue workplace credentials, education status, certifications, or any unique identity attributes. 
- Empower your users to own and control their digital identity for improved privacy. - Reduce organizational risk and simplify the audit process. - Provide developers with a seamless way to create user-centric, serverless apps. Start your decentralized identity journey by enabling Verified ID for free in the Microsoft Entra admin center. Quickly onboard and begin issuing and verifying credentials for customers by implementing Verified ID with one of our partners. *Microsoft Entra Verified ID is included with any Azure Active Directory subscription, including Azure AD Free. Related Microsoft Entra products Microsoft Entra Permissions Management Monitor permissions risks across your multicloud infrastructure. Azure Active Directory Help safeguard your organization with the Microsoft Entra identity and access management solution that connects people to their apps, devices, and data. Documentation and training Learn more about decentralized identity Reduce risk and empower people to own and control their identity data. Key technical concepts Understand digital direct presentation, verifiable credentials, and decentralized identifiers. Developer guide Create serverless apps that store data with users through the Microsoft verifiable credentials platform. Implementation partners Accelerate your decentralized identity transformation with help from our world-class partners.",https://www.microsoft.com/en-us/security/business/identity-access/microsoft-entra-verified-id,,Product,,Product,,,,,,,Verifiable Credentials,2023-01-01,,,,,,,,,,,,,
|
||
Microsoft,Microsoft,,Azure,,,,,,,DID Project - Azure Websites,"Verify once, use everywhere. Join our list and we'll let you know when our Public Preview is ready.","Verify once, use everywhere Join our list and we'll let you know when our Public Preview is ready. Faster onboarding, even remote Digitally validate any piece of information with ID verification services for trustworthy self-service enrollment and faster onboarding. Secure access to apps Verify credentials from a wide variety of trusted partners based on open standards. Self-service account recovery Replace support phone calls and security questions with a simpler, more secure process to verify their identity. Start issuing and accepting verifiable credentials in minutes With Azure AD verifiable credentials you can verify anything while respecting privacy. Digitally validate any piece of information about anyone and any business. Customer Stories Keio University Keio University is a leading research university in the process of implementing digital Student IDs to certify enrollment for eligibility and recruiting. National Health Service The National Health Service (NHS) in the UK is using verified credentials to support staff movement between NHS organizations, allowing staff to hold their own verified record of their employment, clearance, and other attributes on their smartphone. Government of Flanders Citizens will be able to request a verifiable credential with citizenship status across civic and private sector, including the citizen portal.",https://didproject.azurewebsites.net,,Project,,Product,,,,,,,"DID,Verifiable Credentials",2021-01-01,,,,,,,,,,,,,
Microsoft,Microsoft,,ID Standards Blog,Pamela Dingle,,,,,,"All about FIDO2, CTAP2 and WebAuthN","To understand how FIDO2 authenticators work, you need knowledge of two specifications in two different standards bodies. The WebAuthentication (aka WebAuthN) spec lives at W3C (where the browser makers meet) while the Client-to-Authenticator (aka CTAP2) spec lives at the FIDO Alliance (where hardware and platform folks have joined to solve the problem of Fast IDentity Online).","This is a great week to be working in Identity Standards, as we at Microsoft celebrate the release of our first ever WebAuthN Relying Party. This one relying party enables standards-based passwordless authentication at Xbox, Skype, Outlook.com and more. But what are the actual pieces of the puzzle and how do they fit? Read on for the big picture of how the W3C WebAuthN and FIDO2 CTAP2 specifications interact. We will start with the industry standards perspective, and then at the end we will summarize how Microsoft implements the various roles. To understand how FIDO2 authenticators work, you need knowledge of two specifications in two different standards bodies. The WebAuthentication (aka WebAuthN) spec lives at W3C (where the browser makers meet) while the Client-to-Authenticator (aka CTAP2) spec lives at the FIDO Alliance (where hardware and platform folks have joined to solve the problem of Fast IDentity Online). The Big Picture CTAP2 and WebAuthN define an abstraction layer that creates an ecosystem for strongly authenticated credentials. Any interoperable client (such as a native app or browser) running on a given “client device” can use a standardized method to interact with any interoperable authenticator – which could mean a platform authenticator that is built into the client device or a roaming authenticator that is connected to the client device through USB, BLE, or NFC. 
Authenticators securely create and locally store strong cryptographic keys at the request of clients, under the condition that the user must consent to the operation via the performance of a ‘user gesture’. Once these client-specific keys are created, attestations can be requested and sent to the clients for the purposes of registration and authentication; the type of signature performed by the private key reflects the user gesture performed. When CTAP and WebAuthN are drawn, it looks something like the picture below. The light blue dotted arrows are light blue and dotted because the exact way in which platform APIs are exposed to clients is an implementation choice. The cast of characters in a combined WebAuthN/CTAP2 dance are: Relying Parties & Clients Relying parties are web or native applications that wish to consume strong credentials. In the native case, the relying party running on the client device can also act as a WebAuthN client to make direct WebAuthN calls. In the web case, the entity that wants to consume the credential cannot directly interact with the WebAuthN API, and so must broker the deal through the browser. Do not confuse FIDO relying parties with federated relying parties, there is no single sign-on in the above picture. Client Device The client device is the hardware hosting a given strong authentication. Laptops and phones are examples of client devices. Platform Authenticator A platform authenticator is usually resident on a client device and cannot be accessed via cross-platform transport protocols such as USB, NFC or BLE. Examples of platform authenticators include built-in laptop fingerprint readers or facial recognition using smartphone cameras. Roaming Authenticator A roaming authenticator can connect to multiple client devices, and interaction must be negotiated over a supported transport protocol. Examples of roaming authenticators might include USB security keys, BLE-enabled smartphone applications, or NFC-enabled proximity cards. 
Authenticators may support CTAP1, CTAP2, or both protocols. CTAP2 Platform/Host The platform (also called the host in the CTAP2 spec) is the part of the client device that negotiates with authenticators. The platform is responsible for securely reporting the origin of the request and for calling the CTAP2 Cbor APIs. In cases where the platform is not CTAP2-aware, the clients themselves must take on more of the burden and the internals of this diagram might best be drawn a little differently. Many relying parties and clients can interact with many authenticators on a single client device. A user might install multiple browsers that support WebAuthN, and might simultaneously have access to a built-in fingerprint reader, a plugged-in security key, and a BLE-enabled mobile app. The WebAuthN API enables clients to make requests to authenticators - to create a key, get an assertion about a key, report capabilities, manage a PIN, and so on. About that Interoperability Thing Before there was WebAuthN and CTAP2, there was U2F and CTAP1. U2F is the FIDO Alliance’s universal second factor specification and there are a lot of authenticators that speak CTAP1 and manage U2F credentials. WebAuthN was designed to be interoperable with CTAP1 Authenticators, and U2F credentials can still be used, as long as no FIDO2-only functionality is required by the relying party. 
Some of the options that FIDO2 authenticators have already implemented and that WebAuthN relying parties might require include: - Keys for multiple accounts can be stored per relying party - Client PIN - Location (the authenticator returns a location) - HMAC-secret (this enables offline scenarios) Other options are cool and might be useful in the future, but haven't been seen in the wild yet: - Transactional approval - User Verification index (this allows servers to understand if locally stored biometrics have changed over time) - User Verification Method (the authenticator returns the exact method) - Biometric Performance Bounds (the relying party can specify acceptable false acceptance and false rejection rates) Future blog posts will explore the benefits and the inner working of these interoperability points (some of which are documented in the specification but have not been implemented anywhere). Microsoft Implementation The Microsoft FIDO2 implementation has been years in the making: software and services are implemented independently as standards-compliant entities. As of the Windows 10 October 2018 release, all Microsoft components are updated to use the latest WebAuthN Candidate Release, which is a stable release not expected to normatively change before the specification is finally ratified. Because Microsoft is among the first in the world to deploy FIDO2, some combinations of popular non-Microsoft components won’t be interoperable yet – but give it time. Please remember that alignment on specifications like this does not happen overnight. As an industry, we will get to a place where all the components speak all the specs with all the right extensions supported, and then things will be fun. Here is an approximate layout of where the Microsoft bits go: Current Microsoft WebAuthN/CTAP2 Functionality - Microsoft Account plays the part of a WebAuthN Relying Party. 
If you aren’t familiar with Microsoft Account, it is the login service for services like Skype, Xbox, Outlook, and many other sites. The login experience uses client-side javascript to trigger Edge to talk to the WebAuthN APIs. Microsoft Account requires that authenticators have the following capabilities: - Keys must be stored locally on the authenticator and not on a server somewhere - Offline scenarios must work (see: hmac-secret) - Users must be able to put keys for multiple user accounts on the same authenticator - Authenticators must be capable of unlocking a TPM with a clientPIN if necessary Note that these are the requirements as of today; for the authoritative and maintained list of the extension support needed to be considered “Microsoft-compatible”, please see the docs. Because Microsoft Account requires features and extensions unique to FIDO2 CTAP2 authenticators, this site will not accept CTAP1 (U2F) credentials. - Microsoft Edge plays the part of a WebAuthN Client. Edge can handle the UI for the above listed WebAuthN/CTAP2 features, and also supports the AppID extension. Edge is capable of interacting with both CTAP1 and CTAP2 authenticators, which means that it can facilitate the creation and use of both U2F and FIDO2 credentials, however Edge does not speak the U2F protocol, so relying parties must use the WebAuthN specification only. Edge on Android sadly does not support WebAuthN. - Authoritative information on Edge support for WebAuthN and CTAP can be found in the Dev Guide. - Windows 10 plays the part of the platform, hosting Win32 Platform WebAuthN APIs that enable clients to interact with Windows Hello in order for users to be prompted and interactions with authenticators to take place. The inner workings of Windows Hello are definitely worthy of a blog entry of their own. - Roaming Authenticators You might notice that there is no “Microsoft” roaming authenticator. 
That is because there is already a strong ecosystem of products that specialize in strong authentication, and every one of our customers (whether corporations or individuals) have different requirements for security, ease of use, distribution, and account recovery. If you want to see the ever-growing list of FIDO2 certified authenticators, you can find that list here: https://fidoalliance.org/certification/fido-certified-products/. The list contains built-in authenticators, roaming authenticators, and even chip manufacturers with certified designs, and this is just the start! While USB security keys are the most common roaming authenticator today, they may not be tomorrow; stay tuned for lots of innovation in the areas of NFC and BLE, and the integration of FIDO2 into smartphone apps, smart cards, fitness trackers, and who knows what else. While the diagram above is academically interesting, it is real-world interoperability and the ability for end users to leverage their authenticators at many services that will make Microsoft’s investment truly worthwhile. The industry is gunning for ubiquitous standards-based passwordless authentication, and by gum we’re on our way. What’s Next This overview covers the entities at play in a WebAuthN/CTAP2 interaction – but these roles are just the tip of the iceberg. In future blog posts we will dig into details of the interaction itself, including - How keys are stored and why resident keys are important - How exactly U2F/CTAP1 backwards compatibility works - What specific extensions do and what is the business value - Differences in how WebAuthN is used for first vs. 
second factor auth - How Developers can become WebAuthN and CTAP relying parties - How the security mitigations keep it all safe Stay tuned for more fun and excitement in the Identity Standards world!",https://techcommunity.microsoft.com/t5/identity-standards-blog/all-about-fido2-ctap2-and-WebAuthN/ba-p/288910,,Post,,Standards,,,,,,,"FIDO2,CTAP2,WebAuthN",2018-11-20,,,,,,,,,,,,,
Microsoft,Personal,,,Damien Bowden,Trinsic; Mattr; Evernym,,,,,Challenges to Self Sovereign Identity,"Authentication using SSI credentials would have to have the same level of security as the authenticator apps which you have for existing systems. This is not as safe as using FIDO2 in your authentication process, as FIDO2 is the only solution which protects against phishing. SSI authentication is also only as good as the fallback process, so if the fallback or recovery process allows a username or password login, then the level would be passwords.<br>","The article goes through some of the challenges we face when using or implementing identity, authentication and authorization solutions using self sovereign identity. I based my findings on implementing and testing solutions and wallets with the following SSI solution providers: Blogs in this series: - Getting started with Self Sovereign Identity SSI - Creating Verifiable credentials in ASP.NET Core for decentralized identities using Trinsic - Verifying Verifiable Credentials in ASP.NET Core for Decentralized Identities using Trinsic - Create an OIDC credential Issuer with Mattr and ASP.NET Core - Present and Verify Verifiable Credentials in ASP.NET Core using Decentralized Identities and Mattr - Verify vaccination data using Zero Knowledge Proofs with ASP.NET Core and Mattr - Challenges to Self Sovereign Identity - Create and issue verifiable credentials in ASP.NET Core using Azure AD - Implement Compound Proof BBS+ verifiable credentials using ASP.NET Core and Mattr History 2021-12-11 Added video explaining SSI phishing 2021-10-31 Updated phishing section after feedback. SSI (Self Sovereign Identity) is very new and a lot of its challenges will hopefully get solved and help to improve identity management solutions. Some definitions: - Digital Identity: This is the ID which represents a user, for example an E-ID issued by the state; this could be a certificate, hardware key, verifiable credential etc. 
- Identity: This is the user + application trying to access something which usually needs to be authenticated when using a protected user interactive resource. - Authentication: verifying the “Identity” i.e. application + user for user interactive flows. - Authorization: verify that the request presents the required credentials, specifying access rights/privileges to resources. This could mean no verification of who or what sent the request, although this can be built in with every request if required. Solutions exist for this in existing systems. The following diagram from the Verifiable Credentials Data Model 1.0 specification shows a good overview of verifiable credentials with issuers, holders and verifiers. The holder is usually represented through a wallet application which can be any application type, not just mobile applications. Level of security for user interaction authentication with SSI Authentication using SSI credentials would have to have the same level of security as the authenticator apps which you have for existing systems. This is not as safe as using FIDO2 in your authentication process, as FIDO2 is the only solution which protects against phishing. SSI authentication is also only as good as the fallback process, so if the fallback or recovery process allows a username or password login, then the level would be passwords. See this post for more details: The authentication pyramid Authentication Issuer The authentication process is not any better than in previous systems; every issuer needs to do this properly, and the trustworthiness of the issuer depends on it. If a verifier wants to use verifiable credentials from a certain issuer, then trust must exist between the verifier and the issuer. If the issuer of the credentials makes mistakes or does this in a bad way, then the verifier has this problem as well. It is really important that the credential issuer authenticates correctly and only issues credentials to correctly authenticated identities. 
SIOP (Self-Issued OpenID Provider) provides one solution for this. With this solution, every issuer requires its own specific IDP (identity provider) clients and OIDC (OpenID Connect) profiles for the credentials which are issued. When a credential issuer has a data leak or a possible security bug, maybe all credentials issued from this issuer would need to be revoked. This might mean that all data in any verifier applications created from this issuer also needs to be revoked or deleted. This is worse than before, where we had many isolated systems. With SSI, we have a chain of trust. A super disaster would be if a government which issues verifiable credentials had to revoke credentials due to a security leak or security bug; then all data and processes created from this would have to be evaluated and processed again, worst case deleted and new proofs required. Authentication/Authorization Verifier Every verifier application/process has a responsibility for its own authorization and the quality of its verification. Depending on what the verifier needs to do, a decision on the required verifiable credentials needs to be taken. A verifier must decide if it needs only to authorize the verifiable credential it receives from the wallet application, or if it needs to authenticate the digital identity used in the subject of the verifiable credential. If you only authorize a verifiable credential when you should be authenticating the digital identity used in the verifiable credential, this will probably result in false identities in the verifier application or processes run for the wrong user. Once the verifiable credential is accepted as trustworthy, an identity could be created from the verifiable credential and the digital identity if contained in the subject. Some solutions move the authentication of the digital identity to the wallet and the verifiers accept verifiable credentials from the wallet without authentication of the digital identity. 
This would mean that the wallet requires specific authentication steps and restrictions. Another solution would be to use a certificate which is attached to the digital identity for authentication on the verifier and on the wallet. SIOP and OpenID Connect would also provide a solution to this. Access to Wallets, Authentication & Authorization of the holder of the Wallet One of the biggest problems is to manage and define what verifiable credentials can be loaded into a wallet. Wallets also need the possibility to import and export backups and also share credentials with other wallets using DIDComm or something like this. Enterprise wallets will be used by many different people. This means that solutions which fix the identity to a specific wallet will not work if shared wallets or backups are to be supported. Wallets would also need to support or load verifiable credentials for other people who are not able to do this themselves or do not have access to a wallet. With these requirements it is very hard to prove that a verifiable credential belongs to the person using the wallet. Extra authentication is required on the wallet, and also the issuers and the verifier applications cannot know that the credential issued to wallet x really belongs to person x, which is the subject of the verifiable credential. Extra authentication would be required. Solutions with certificates or wallet authentication can help solve this, but no one solution will fit all use cases. If the relationship between the person using the wallet, the credentials in the wallet and how the verifiers use and trust the credentials is managed badly, many security problems will exist. People are people and will lose wallets, share wallets when they should not and so on. This needs to be planned for, and the issuer of verifiable credentials and the verifier of these credentials need to solve this correctly. 
This would probably mean that when using verifiable credentials from a wallet, the issuer application and the verifier application would need to authenticate the user as well as the credentials being used. Depending on the use case, the verifier would not always need to do this. Interoperability between providers & SSI solutions At present it is not possible to use any wallet with any SSI solution. Each solution is locked to its own wallet. Some solutions provide a way of exporting verifiable credentials and importing them again in a different vendor wallet, but not using wallet vendor 1 together with solution vendor 2. Solutions are also not supporting the same specifications and standards. Each implementation I tried supports different standards and has vendor specific solutions which are not compatible with other vendors. I have a separate wallet installed now for each solution I tried. It cannot be expected that users install many different wallets to use SSI. Also, if a government issues a verifiable credential with a certain wallet, it would be really bad if all verifiers and further credential issuers must use the same wallet. With JSON-LD and BBS+, the APIs between the wallets and the SSI services should work together, but vendor specific details seem to be used in the payloads. If SSI is to work, I believe any wallet which conforms to a standard x must work with any SSI service. Or the services and vendor implementations need a common standard for issuing and verifying credentials with wallets and the agents used in the wallets. Some type of certification process would probably help here. Phishing Users authenticating on HTTPS websites using verifiable credentials stored in their wallet are still vulnerable to phishing attacks. This can be improved by using FIDO2 as a second factor to the SSI authentication. The DIDComm communication between agents has strong protection against phishing. 
Self sovereign identity phishing scenario using HTTPS websites: - User opens a phishing website in the browser using HTTPS and clicks the fake sign-in - Phishing service starts a correct SSI sign-in on the real website using HTTPS and gets presented with a QR Code and the start link for a digital wallet to complete. - Phishing service presents this back to the victim on the phishing website. - Victim scans the QR Code using the digital wallet and completes the authentication using the agent in the digital wallet, DIDComm etc. - When the victim completes the sign-in using the out of band digital wallet and agent, the HTTPS website being used by the attacker gets updated with the session of the victim. This can only be prevented by using the browser client-side origin as part of the flow, signing this and returning this back to the server to be validated, thus preventing this type of phishing attack. This cannot be prevented unless the origin from the client browser is used and validated in the authentication process. The browser client-side origin is not used in the SSI login. Here’s the same problem description: Risk Mitigation for Cross Device Flows PII databases will still be created Credential Issuers require PII data to create a verifiable credential. This data is persisted and can be leaked like we see every day on the internet. Due to costs/laws/charges, verifiers will also copy credentials and create central PII databases as well. The applications doing verifications will also save data to a local database. For example, if an insurance company uses verifiable credentials to create a new account, it will save the data from the verifiable credential to a local database, as this data is required for its processes. This will have the PII data as well. So even with SSI solutions we will still share and have many databases of PII data. 
If BBS+ ZKP is used, which does not share the data with the verifier, just a verification, this is a big improvement compared to existing solutions. At the time of testing, this is not supported yet with any of the solutions I tried, but the specifications exist. Most solutions only support selective or compound proofs. If the verifier does not save this data, then SSI has added a big plus compared to existing solutions. A use case for this would be requesting a government document which only requires proof of the digital identity, and this can then be issued without storing extra PII data or requiring a shared IDP with the digital identity issuer. Complexity of the Self Sovereign Identity I find the complexity when implementing SSI solutions still very high. Implementation of a basic issue credential, verify credential process requires many steps. Dev environments also require some type of callback or webhook. ngrok solves this well, but the URL is different after each start and needs to be re-configured regularly, which then requires new presentation templates, which adds extra costs and so on. If integrating this in DevOps processes with integration testing or system testing, this would become more complex. Due to the overall complexity of setting this up and implementing it, developers will make mistakes and this results in higher costs. IDP or PKI solutions are easier to implement in a secure way. Trust When using the provided solutions, some of these systems are closed source and you have no idea how the solutions manage their data, or handle the secrets in their infrastructure, or how the code is implemented (closed source). For some solutions, I had to share my OpenID Connect client secret for a flow to complete the SSI SIOP credential issuer flow. This was persisted somewhere on the platform, which does not feel good. You also don’t know who they share their secrets with, and the company provider has different law requirements depending on your country of origin. 
Implementing a “full” SSI solution does not seem like a good option for most use cases. Open source software improves this trust at least, but is not the only factor. Notes: I would love feedback on this and will correct or change anything which is incorrect or which you believe I should add. These challenges are based on my experiences implementing the solutions and reading the specifications. I really look forward to the new SSI world and how this develops. I think SSI is cool, but people need to know when and how to use it; it is not the silver bullet to identity, but it does add some super cool new possibilities. Reviewers: - Matteo Locher @matteolocher Links: https://docs.microsoft.com/en-us/azure/active-directory/verifiable-credentials/ The authentication pyramid https://w3c.github.io/did-core/ https://w3c.github.io/vc-data-model/ https://www.w3.org/TR/vc-data-model/ https://identity.foundation/ https://github.com/swiss-ssi-group",https://damienbod.com/2021/10/11/challenges-to-self-sovereign-identity/,,Post,,Standards,,,Critique,,,,,2021-10-11,,,,,,,,,,,,,
Microsoft,XT Seminars,,,,,,,,,Introduction to the future of identity - DIDs & VCs,"In this blog, I want to start by thinking about identity in general and then explaining Decentralized Identifiers (DIDs) and Verifiable Credentials (VCs). I will show you how you can issue your own DIDs and VCs using the new Microsoft service in future blogs. This series' final blog will look at how DIDs can be anchored in decentralized transaction ledgers using ION and the Bitcoin blockchain.","Update 02 March 2022: Please read the introduction to see what's changed. Part 2 and beyond are new content. Part 1 in the series With the Microsoft Azure AD Verifiable Credentials (VCs) issuer service available in your tenant, it's time to understand what VCs are and how they work with Decentralized Identifiers (DIDs). VCs and DIDs provide a new paradigm for identity, a true step into the future. In this blog, I want to think about identity in general and then explain Decentralized Identifiers (DIDs) and Verifiable Credentials (VCs). There are four blogs in the series, and by following the four blogs, you will learn how to issue your own DIDs and VCs using the new Microsoft service. When I originally started writing this series of blogs, Microsoft provided an SDK and libraries that could be incorporated into an application to request the issuance and presentation of VCs. The libraries coded the cryptographic functions required to support VCs. Microsoft replaced the need for the libraries by implementing APIs to generate the necessary issuance and presentation requests. The APIs are also used for VC validation. The APIs are the recommended way of working with VCs, and the SDK became deprecated at the end of 2021. I have now published three new blogs which show how the Microsoft Request Service REST API is used. This first blog has not changed very much (apart from the intro), but blogs two, three and four are new. The blogs in the series are: 1. 
Introduction to the future of identity - DIDs & VCs (this one) 2. Issuing and verifying Verifiable Credentials with the Microsoft Azure AD services 3. How to run the Microsoft Azure AD Verifiable Credentials sample app 4. Creating your own Azure AD Verifiable Credentials Blog 4 will be published on 04/03/2022. By the end of blog 4, you will be issuing your own VCs with claims taken from different sources. For now, read on for an introduction to DIDs and VCs. Let's start by considering the question What is identity and access control? Identity is about knowing who someone is; the user is identified when they are authenticated. Access control manages access to resources based on the individual's credentials. I've used the term credential to refer to ""known"" information about the user. Traditionally an individual's credentials have been set by the systems that authenticate the user and/or control access to a resource. Identity and access control could be governed by a single entity, such as a website authenticating users via its own accounts database and controlling access to the site's resources. Alternatively, a central identity provider (IdP) can authenticate the user, and individual resources implement access control based on the user's credentials. If our systems use industry-standard authentication protocols such as SAML or OpenID Connect / OAuth 2.0, one organization can manage the authentication and another organization control access to the resources. This is a federated solution, and there are many industry players providing enterprise Identity and Access Management (IAM) services, including Microsoft, Okta, OneLogin and Ping Identity. These enterprise IAM services are full-featured, allowing an organization to manage their own users and application access. In addition to the enterprise IAM services, there is a range of consumer identity services such as Google (Gmail), Microsoft (MSA), Amazon, and Facebook. 
These consumer services allow self-service sign-up for the core service, for example, Gmail. However, they also provide federated sign-in for any resources that are configured to trust the IdP to authenticate users. Identity providers need to be able to uniquely identify a user and provide a method of authenticating that user. Traditionally this has been done with a username and password. When a user is registered with an IdP, the user will be identified by a username and password. When the user initially signs in, what do we know about them? Nothing. All the system knows is that the same user has returned. If it is required to know more information about the user, some form of identity proofing will need to be implemented. This could be as simple as verifying an email address by sending the user an email and requesting that they respond, through to asking the user to attend an interview with appropriate documents, and performing in-person verification. Your digital identity To live in the digital world, you will probably have a corporate user account and then accounts with multiple IdPs (Google, Facebook etc). We will have access to resources based on all these different accounts. Are we in charge of our identity? The answer to that is a BIG NO. When we leave an organization, we lose our corporate identity. A consumer IdP may go bust or block our account. Once again, we have lost our identity. In both cases, we have lost the ability to prove who we are to the different resources that trust the IdP. Of course, in a corporate world, it's a good thing that when we leave, we can no longer access the organization's systems, but what about other systems that we signed up to using our work identity? If the potential to lose your identity wasn't bad enough, the other major pain point is that you are constantly being asked to prove things about yourself. You open an online bank account. You have to prove who you are. Open a savings account with a different bank. 
Once again, you are sending off all the same documents as part of the required identity proofing. And it goes on and on. Identity in the real-world Do we continuously lose our identity as we move between jobs, countries, relationships, and hobbies in the real world? It may feel like it sometimes, but in reality, the answer is No. When we are born, we have biometric characteristics, our first set of credentials. Throughout our life, we gain other credentials—birth certificate, education achievements, driving licence, passport, marriage certificate and so on. When we are stopped for speeding, how do we prove who we are to the police officer? We present our driving licence. Provided the photo matches and the driving licence has the correct hologram, the police officer may accept this as proof of who we are. For extra surety, she may well check that the licence has not been revoked. Once she knows the document is genuine, she can trust all the credentials it holds, age, address, and so forth. Let's step back a moment and think about what's happened in this scenario. I had the licence in my wallet and presented it to the police officer. She validated the licence by checking: that it was about me, by matching the photo ID; that it came from an issuer she trusted, by examining the document hologram; and, for extra surety, that it had not been revoked. What is my identity? My image. Who can take that away? No one! Digital identity mimics real-world identity How would it be if we could digitally mimic the driving licence scenario? Well, now we can with DIDs and VCs. My image is replaced with a globally unique digital ID which I generate and own. I will go into the details of this digital ID later in this blog; for now, let's keep it simple. Of course, being a mere human, I cannot generate globally unique IDs, so I leave that to my user-agent (also called a wallet). Microsoft has implemented a user-agent to do this in the Microsoft Authenticator App that you can install on your phone. 
This is the same app that does all those great things, such as multi-factor authentication and passwordless sign in to Microsoft 365/Azure AD. Thank you, Microsoft, for not cluttering up our phones with yet another app. My globally unique ID consists of a cryptographic private/public key pair. The private key is securely held by the user-agent and never leaves the agent. Using the private key, my agent can digitally sign a message which includes the public key and send it to a recipient service. Using the public key (contained in the message), the recipient can validate the signature. The recipient now knows that the message's sender owns the private key. The private key is unique to the user and never leaves their possession. Consequently, the message has been signed by the user's globally unique digital identity (private key). This can be confirmed by anyone using the user's globally unique associated public key. This is a great start. The recipient service could store my public key and maybe some characteristics about my interaction with the service. When I return to the service, I supply another signed message, and the service can uniquely identify me as a returning user. We have just thrown away the need for usernames and passwords together with their associated vulnerabilities. We have also eliminated the requirements for an identity provider. The FIDO 2 standards are implemented in a similar way using public/private keys. Traditionally, the next step would be for the service to do some form of identity proofing and create a user profile that contains the appropriate credentials. Back to the driving license and the police officer. Once I have passed my driving test, I can apply to the Driving Licence Authority (DLA) to issue a licence. The DLA will ask me for specific documents, including my test certificate, my photo, DOB, proof of address etc. 
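The sign-and-verify exchange described above (the agent signs with the private key, the recipient validates with the public key) can be sketched with a toy example. This is NOT the cryptography real DID wallets use (production systems rely on vetted libraries and algorithms such as Ed25519 or ES256K); it is textbook RSA with deliberately tiny, hypothetical primes, shown only to illustrate the flow.

```python
import hashlib

# Toy key generation with tiny primes (insecure by design, illustration only)
p, q = 61, 53
n = p * q                      # public modulus, shared with everyone
phi = (p - 1) * (q - 1)
e = 17                         # public exponent (part of the public key)
d = pow(e, -1, phi)            # private exponent, held only by the user-agent

def sign(message, private_key, modulus):
    # Hash the message, then transform the digest with the private key
    digest = int.from_bytes(hashlib.sha256(message.encode()).digest(), 'big') % modulus
    return pow(digest, private_key, modulus)

def verify(message, signature, public_key, modulus):
    # Recompute the digest and undo the transform with the public key
    digest = int.from_bytes(hashlib.sha256(message.encode()).digest(), 'big') % modulus
    return pow(signature, public_key, modulus) == digest

msg = 'hello, I am returning user did:example:123'
sig = sign(msg, d, n)
assert verify(msg, sig, e, n)  # the recipient accepts the genuine message
# Any alteration of msg would (with overwhelming probability) change the
# digest, so the signature check would fail for a tampered message.
```

The point is the one made in the text: anyone holding the public key can confirm the message came from the holder of the private key, and the private key never leaves the wallet.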
Today they create a licence containing my photo to prove it belongs to me, appropriate credentials, and a hologram that confirms it was issued by the DLA. Let's go digital! I apply for a licence, submitting my digital ID (public key) rather than supplying my photo. I furnish the other documents necessary (which could be done digitally). The DLA creates a digital record that includes my credentials, my digital ID (the subject), the DLA's digital ID (the issuer) and is signed by the DLA. I store this in my user-agent. The police officer questions my speeding (I am sure my Fiat 500 was going that fast!). She asks me to show her my driving licence. I unlock my phone (with a biometric) and use the agent to submit my driving licence to be verified. My agent digitally signs the submission using my private key, which proves I am the licence owner (only I have the private key; it has never left my wallet). We now have the trio of issuer, holder and verifier, which is fundamental to verifiable credentials. A little like the Three Musketeers, all supporting each other. “All for one and one for all, united we stand divided we fall.” The issuer, holder and verifier all have their own unique digital identity. I left out a bit of detail above. When the police officer asks me to submit my licence, her systems will make a presentation request which is digitally signed by the verification system. I then have the opportunity to decide if it is safe to submit the credentials to that system. I have just introduced you to the world of verifiable claims (VCs). A key feature of VCs is that as the holder, I can submit a VC to multiple verifiers and, provided they trust the issuer, they can all accept my credentials. This avoids me having to prove my credentials to numerous services. Prove the credentials once to an issuer and use them everywhere. Don't you just love that? I know I do. 
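The issuer / holder / verifier trio just described can be sketched as a small data flow. All the DIDs, keys and claims below are hypothetical, and a keyed hash stands in for the issuer's real digital signature; the sketch only shows the shape of the exchange (issuer signs a record about the subject, the holder presents it, the verifier checks the proof and whether it trusts the issuer), not real VC cryptography.

```python
import hashlib
import json

def toy_sign(record, issuer_key):
    # Deterministically serialise the record, then hash it with the key.
    # A real issuer would use an asymmetric signature here.
    payload = json.dumps(record, sort_keys=True)
    return hashlib.sha256((issuer_key + payload).encode()).hexdigest()

# 1. The issuer (the DLA) creates and signs a credential about the holder
DLA_KEY = 'dla-secret-key'              # stand-in for the DLA's signing key
credential = {
    'issuer': 'did:example:dla',
    'subject': 'did:example:john',      # my digital ID, not my photo
    'claims': {'licence_class': 'B', 'over_21': True},
}
proof = toy_sign(credential, DLA_KEY)

# 2. The holder stores (credential, proof) in their wallet and later
#    presents both in response to a verifier's presentation request.

# 3. The verifier checks the proof AND that it trusts the issuer
trusted_issuers = {'did:example:dla': DLA_KEY}   # stand-in trust registry
def toy_verify(credential, proof):
    key = trusted_issuers.get(credential['issuer'])
    return key is not None and toy_sign(credential, key) == proof

assert toy_verify(credential, proof)
# A tampered credential no longer matches the issuer's proof:
assert not toy_verify(dict(credential, subject='did:example:mallory'), proof)
```

Note how the same (credential, proof) pair can be presented to any verifier that trusts `did:example:dla`, which is exactly the "prove once, use everywhere" property described above.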
If you are wondering how you trust the issuer, you will need to read the section below on DIDs. A world of endless possibilities. A verifier, via a presentation request, could ask for multiple VCs from different issuers or just a single credential. Using Zero-Knowledge Proof (ZKP), it will also be possible to prove to a verifier that you are over 21 without submitting your DOB. Here's a thought: does our IdP need a user accounts database? When a user signs up to use our service, we make a presentation request to the user. The user submits the appropriate VCs from issuers we trust, and then we issue a VC that proves the user is approved for access to our services. When the user subsequently signs in to our service, the VC that we issued is presented and we verify it. Do we need to hold details of the user in an accounts database? I'll leave you to ponder that question. This is an inspiring time for identity and open to ideas from many sources. You will be glad to know that there are emerging standards for VCs. If you want to know more, search the W3C for verifiable claims https://www.w3.org/TR/?title=verifiable and the Decentralized Identity Foundation https://identity.foundation/. Microsoft has recently made its verifiable claims issuance service available in public preview. In my next blog, I will take you through issuing your own VCs and show you what's going on under the hood. If you want to get started now, a good starting point is here. Before I leave this blog, I want to look at digital identity in more detail and, in particular, Decentralized Identifiers (DIDs). Decentralized Identifiers. I left you waiting for an answer as to how you could trust an Issuer. So here it is. The issuer, holder, and verifier are all signing messages with their private key. To validate the signatures, we need to know the member's public key, but we need more information than that. 
For example, we need information about the cryptographic algorithm used to sign the message and how we can validate that the public key belongs to a particular entity. All the relevant information is contained in a DID document. Here is an extract from a DID document: You will see all the details of the public key and a service endpoint where the issuer has published a JSON document. https://learn.xtseminars.co.uk/.well-known/did-configuration.json. This document binds the entity's DID document to the entity's domain. We now know that the DID belongs to learn.xtseminars.co.uk. When an entity creates a signed message, the DID document can be contained in the signed message or referenced. If it's referenced, then the DID document is stored on a decentralized public ledger (blockchain). The reference is in the form did:method:method-specific-identifier. The method identifies the underlying ledger technology and the method-specific-identifier uniquely identifies the DID document on the ledger. Microsoft is currently using the ION method to store their VC DIDs (more details in a later blog), and the DID is in this form: did:ion:EiBtLiugj5KjXko8o8Tczdg5KXN93Y3dn8TP5j6neJjpkw. Issuers and verifiers will almost certainly use linked domains to prove who they are. When it comes to a user, in most situations there is no need to publish their actual identifier. All the user needs to prove is that a VC was issued to them. The user's agent could create a new DID for every verifier that they interact with. A user's complete identity is represented by all the VCs that contain one of the user's DIDs as the subject. I can hear you thinking…. The VC was issued to a particular digital ID. How can a user present the VC when their digital ID has changed? Read my next blog, where I will dive in under the hood. Thank you for reading this blog, and stay tuned for the next one. 
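The did:method:method-specific-identifier form described above can be pulled apart mechanically. Here is a minimal parser sketch (it checks only the three-part shape, not the full W3C DID syntax rules for method names and identifier characters):

```python
def parse_did(did):
    # A DID is 'did' + ':' + method + ':' + method-specific-identifier.
    # Split at most twice so colons inside the identifier are preserved.
    parts = did.split(':', 2)
    if len(parts) != 3 or parts[0] != 'did' or not parts[1] or not parts[2]:
        raise ValueError('not a valid DID: ' + did)
    return parts[1], parts[2]

# The ION example from the text:
method, ident = parse_did('did:ion:EiBtLiugj5KjXko8o8Tczdg5KXN93Y3dn8TP5j6neJjpkw')
assert method == 'ion'   # the method names the underlying ledger technology
assert ident == 'EiBtLiugj5KjXko8o8Tczdg5KXN93Y3dn8TP5j6neJjpkw'
```

A resolver would then use the method to pick the right ledger driver and the identifier to locate the DID document on that ledger.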
Please let your friends and colleagues know about the blog via LinkedIn and Twitter; don't forget to include me so I see it! Twitter: @john_craddock and/or www.linkedin.com/in/johnxts. My next identity masterclasses for CET and EST are in March 2022. Why don't you join me for an action-packed week? Monday 7th - Friday 11th March 2022, 9:00 - 17:00 CET; Monday 14th - Friday 18th March 2022, 8:00 - 16:00 EST. Full details at https://learn.xtseminars.co.uk",https://www.xtseminars.co.uk/post/introduction-to-the-future-of-identity-dids-vcs,,Post,,Standards,,,,,,,,2022-03-02,,,,,,,,,,,,,
|
||
Microsoft,Microsoft,,ID Standards Blog,Pamela Dingle,,,,,,"To Understand WebAuthN, Read CredMan",take a cruise through the W3C Credential Management (aka CredMan) specification first. CredMan sets up the object model for the Credential object model that WebAuthN's PublicKeyCredential extends.,"The holidays are well and truly over, time to get serious - now is the perfect time to read specifications! If you are planning to read the WebAuthN specification, you can ease into the terminology in a simple way - take a cruise through the W3C Credential Management (aka CredMan) specification first. CredMan sets up the object model for the Credential object model that WebAuthN's PublicKeyCredential extends. This post will be an overview of the CredMan spec, geared for folks who want to call the API as clients, not for those few and proud who are tasked with implementation of the API within a user agent. #IdentityStandards CredMan Base Definitions CredMan unsurprisingly centers on the concept of a Credential. Actions on Credentials are requested by a relying party using JavaScript and fulfilled by a user agent (generally a browser). Credentials can be created, stored, retrieved for validation by a relying party, and so on. In addition to actions, CredMan defines standardized dictionaries that communicate context. Note that the CredMan API itself does not use the term ‘relying party’ but instead refers to a developer that would write code using the navigator.credentials JavaScript control. Since we are identity architects, we will assume that developed code is deployed and running as a service at a specific origin and that the developed code will call the CredMan API as part of user registration and authentication activities. The CredMan API defines three actions (and an object constructor): Initialize - Credential objects are instantiated with an id and type. Additional parameters can be specified by populating the CredentialData data dictionary. 
create() - The relying party instructs the user agent to make a new credential, based on parameters from the CredentialCreationOptions data dictionary. In the base definition the CredentialCreationOptions data dictionary is empty. get() - The relying party instructs the user agent to retrieve an existing credential, based on parameters in the CredentialRequestOptions data dictionary. In the base definition, CredentialRequestOptions contains the mediation parameter: it allows a relying party to instruct the user agent whether it must, may, or must not interact with the end user to gain explicit consent for the action. The mediation default value is ‘optional’, meaning that the relying party leaves the decision up to the user agent. store() - The relying party instructs the user agent to directly save an already instantiated credential object – for example a credential that might have been returned via a get() call and subsequently altered. In addition to the task-specific actions and data dictionaries noted above, one data dictionary is defined that can be optionally added to any of the other dictionaries: CredentialUserData - Describes human-friendly information that the user agent could pass on to help a user properly identify a credential. The base values include a name and an icon URL. User Mediation CredMan defines an action as user mediated if the action “takes place after gaining a user’s explicit consent”. Choosing an account from a credential chooser during a get() or confirming storage of a credential both count as user-mediated actions. 
Origin-bound credentials require user mediation by default, meaning that user agents must interact with the user in some decision-oriented way before taking actions like storing or retrieving credentials – however in the interests of creating a user experience that is contextually intelligent, options exist to change the circumstances where user mediation takes place: - A user agent might offer to persist consent for ongoing use of a credential. In this case, access is considered to be “silent” and the action is considered un-mediated. - Relying parties can choose to require mediation, require silent operation, or let the user agent choose. The mediation property in the CredentialRequestOptions structure is used for this purpose. - By default, CredMan does not forbid a user agent to offer to persist consent for credential creation, however there are obvious risks, and the specification lays out recommendations for informing end users in the case where a new credential is silently created. Types of Credential CredMan extends the credential base class for two uses: Passwords and Federation discovery information. The W3C WebAuthN spec adds a third extension for cryptographic public keys. PasswordCredential When a relying party invokes the PasswordCredential interface, the user agent engages to create, retrieve, or change a shared secret stored in a credential store such as a password manager. The credential store might be built into the user agent or might just be facilitated by the user agent. Note that the relying party does not need to know anything about how the credential store works, or even which user agent has been invoked. That is the beauty of standards! There are a ton of great examples in the spec, take a look through the link above. 
Here is a quick summary of how PasswordCredential modifies the Credential base class: - The PasswordCredential object now contains a string for password storage, and defines two different constructor instances – one that will take an HTML form element as input and one that takes a PasswordCredentialData object as input - CredentialRequestOptions (used to retrieve a password) gets an extra Boolean called password that is set to false by default. The password boolean must be explicitly set to true by the relying party for the user agent to know the relying party is ok with a PasswordCredential coming back from the call. - CredentialCreationOptions gets a new object called password of type PasswordCredentialInit, which can be either a PasswordCredentialData object or an HTMLFormElement object. PasswordCredentials are origin bound - the user agent stores a different shared secret per web origin, and takes on the responsibility to ensure that the secret-origin relationship is enforced. FederatedCredential Interacting with federated credentials also involves talking to a credential store, but instead of managing a shared secret, the user agent stores and retrieves the metadata necessary to initiate a federated authentication request to a remote identity provider. Whenever the user agent can supply a FederatedCredential to a requesting relying party, the relying party can bypass Identity Provider discovery prompts such as NASCAR interfaces and immediately initiate federation. This can represent a significant improvement in end user experience. FederatedCredential details of note: - The FederatedCredential object stores two new properties: provider and protocol. This creates the same internal construction as the FederatedCredentialInit data dictionary. - A property called “federated” is added to the Request Options, of type FederatedCredentialRequestOptions: It contains two lists: providers and protocols. 
The federated property must be present in some form for a FederatedCredential to be returned, and the relying party can additionally populate 0 or more specific protocols or providers to be more specific about credentials they will accept. FederatedCredential is also origin-bound. Stay Tuned for PublicKeyCredential. Now that you see how PasswordCredential and FederatedCredential work, imagine the same paradigm applying to a cryptographic key, where a client can request a user agent to create or retrieve a public key from a user-controlled cryptographic platform or device. W3C WebAuthN takes on this challenge. Stay tuned for the next blog entry, where we will describe how WebAuthN extends the CredMan API, and how a very small number of options and methods can grant relying parties access to a vast and growing set of highly secure user authentication options. #DeploymentLandscape Edge browser doesn’t support either PasswordCredential or FederatedCredential but both Firefox and Chrome have support; you could try this demo app. Google has useful sample code here and Mozilla’s API documentation is here.",https://techcommunity.microsoft.com/t5/identity-standards-blog/to-understand-WebAuthN-read-credman/ba-p/339652,,Post,,Standards,,,,,,,"WebAuthN,CredMan",2019-02-15,,,,,,,,,,,,,
|
||
Microsoft,Microsoft,,ID Standards Blog,Pamela Dingle,,,,,,Why WebAuthN will change the world,"With WebAuthN, any web entity can call a simple JavaScript API and ask for a cryptographically secure credential. What happens next is pretty cool – the world’s browsers have worked with the world’s operating system makers and the world’s hardware manufacturers, so that when a website asks for a credential, the browsers work with the underlying platform to securely locate compliant local hardware and talk to it!","A little over a month ago, W3C WebAuthN became a real internet specification. Most of you don’t know what WebAuthN is yet, but many of you will feel the impact in short order. In fact, I will go so far as to say that WebAuthN may change how we all authenticate to the resources we use every day. We live in a world where the best parts of our individual local hardware and software collection are rarely leveraged to make cloud security decisions. This is because there has never been a vendor-agnostic and privacy-preserving way for cloud resources to interact with individual hardware configurations in any generic way. Until now! With WebAuthN, any web entity can call a simple JavaScript API and ask for a cryptographically secure credential. What happens next is pretty cool – the world’s browsers have worked with the world’s operating system makers and the world’s hardware manufacturers, so that when a website asks for a credential, the browsers work with the underlying platform to securely locate compliant local hardware and talk to it! All of a sudden, there is a way for all the devices close to us to speak for us. Whether it is my fitness device, a built-in fingerprint reader, a soft token or a roaming security key, we now have credible alternatives to passwords, because the very proximity of my device makes it hard for an attacker to subvert: my devices need to be either built-in, plugged in, or wirelessly connected. 
The ratification of WebAuthN is only a first step. While we have agreement on how we can leverage what is locally connected, deployment is still ongoing and it will take time for all the pieces to be available in a way that can be used anywhere, by anyone. One day, your individual collection of devices will form a flexible, recoverable set of ‘authenticators’ that make it very easy for you to get to your cloud resources. We won’t overwhelm you with technology, but rather use what you already keep with you every day. The most amazing thing about WebAuthN (and companion specs also ratified at the FIDO Alliance) is how many different companies have had to form consensus before this specification could exist. It has been seven years of debate, proposals, interops, working group meetings, editorial tweaks, liaison work with other specifications, evangelism and working code to get us where we are. Whatever happens, keep an eye out for W3C WebAuthN and FIDO2. And raise a glass to your neighborhood standards engineer, they deserve it. Cheers, and congratulations on your ratification, W3C WebAuthN and FIDO2 CTAP2!",https://techcommunity.microsoft.com/t5/identity-standards-blog/why-WebAuthN-will-change-the-world/ba-p/482286,,Post,,Standards,,,,,,Javascript,WebAuthN,2019-04-19,,,,,,,,,,,,,
|
||
MyDex,,MyDEX,,William Heath,,"United Kingdom, England, London",Europe,,,MyDex,"When Mydex came into being in 2007, its founders made a number of important decisions — these decisions are what make Mydex remain unique even in today's blossoming Personal data ecosystem. It had to be free to individuals for life, the data had to be under their control, and Mydex as a company had to be self-sustaining and protect its core values. These prerequisites guided the evolution of Mydex: a Community Interest Company, a Social Enterprise and its range of trust platform services for citizen controlled storage and exchange of Personal data, identity and engagement, in a safe, secure and easy manner. We are working to improve outcomes for individuals and organisations alike.","Mydex CIC helps individuals and service providers improve their handling of Personal data. Our Personal data stores equip individuals with tools to collect, store, use and share their data to manage their lives better. They also help bona fide service providers reduce data processing costs, improve service and innovate. As a Community Interest Company we are legally committed to pursuing our mission of empowering individuals with their data. We are currently helping individuals and service providers use Personal data to better manage chronic health conditions, access debt advice, improve independent assisted living and assure identities. We are working with governments, local councils and communities to improve access to and increase the value delivered from public services. We plan to do much, much more.",https://mydex.org/,,Company,,Company,Enterprise,,,Personal Data,,,,2008,,https://twitter.com/MydexCIC,https://www.youtube.com/user/Mydexcic,https://medium.com/mydex,https://medium.com/mydex,,https://www.crunchbase.com/organization/mydex,,,,,,
|
||
MyDex,MyDex,,Medium,,,,,,,A critical fork in the data road?,"In its discussion of data portability the EU rightly recognises the economic importance of this issue, stressing that “market imbalances arising from the concentration of data restricts competition, increases market entry barriers and diminishes wider data access and use.”","A critical fork in the data road? We have been talking with the EU about some research they are doing into the role of ‘smart contracts’ in data portability. We won’t go into the details of that particular discussion here, but it raised some bigger questions that we think are worth sharing. This is an edited version of a document we sent them. Is the EU discussion about data portability missing a key point? In its discussion of data portability the EU rightly recognises the economic importance of this issue, stressing that “market imbalances arising from the concentration of data restricts competition, increases market entry barriers and diminishes wider data access and use.” However, the way it has framed the issue of ‘control’ of Personal data renders the biggest breakthrough opportunities for economic transformation — in productivity, service quality and innovation — largely invisible. Hidden in the details of data portability lies the potential for organisations to deposit ‘verified attributes’ or ‘verified credentials’ in individuals’ Personal data stores (sometimes called ‘wallets’). These verified attributes confirm data that has been carefully checked about individuals. Because they are cryptographically secure, they cannot be tampered with. When verified attributes are deposited in individuals’ Personal data stores, it becomes possible for individuals to bring this pre-verified data with them to their dealings with other service providers. These other service providers can rely on this data without having to regenerate it or check it. 
This process greatly speeds up completion of data-driven tasks and eliminates friction, effort, risk and cost from every step of Personalised service provision across every service dealing with individuals, including public administration, financial services, health, education, retail, transport, media and leisure. The closest economic parallel to this is the productivity revolution ushered in by Henry Ford’s moving assembly lines for the production of automobiles. Verified attributes are the standardised parts of service provision and Personal data stores are the assembly lines. Ford reduced the costs of making a car by over 90%. Similar productivity breakthroughs in service provision are being made possible by the portability of verified attributes. In addition, the ability to make individuals the point at which information about themselves is gathered is creating powerful new-to-the-world person-centric data assets. Currently, individuals’ data is dispersed across the hundreds of different organisations that collect data about them. Enabling this data to be unified in the individual’s Personal data store, under the control of that individual, is creating a data source whose richness surpasses any data asset ever created — while fully protecting individuals’ privacy. The economic potential of these new person-centric data assets is immense. As long as these two opportunities — of verified attributes and of individuals as the point of integration of their own data — remain overlooked, EU discussions about data portability and ‘control’ over data risk missing the economic opportunities that could be opened up. Two meanings of ‘control’ In this context, it is crucial that the EU recognises there are two distinct and different meanings to the word ‘control’ as it relates to individuals ‘controlling’ their data. The first, very limited, meaning relates to individuals exercising more control over the data that organisations collect about them. 
The second, broader, more expansive meaning of control is individuals being able to collect, store, use and share their own data for their own purposes — to make better decisions and to manage their lives better — independently of any organisation that may be collecting data about them. This subsumes the first, limited meaning of ‘control’ as one small part of a much bigger process. Only if the second, broader, more expansive meaning of the word ‘control’ is embraced will the full economic potential of data portability be unleashed. In this context, focusing only on the first, narrow meaning of ‘control’ is very limiting. The real defining characteristic of PIMS (Personal Information Management Services) is not the issue of control (which is just a means to an end). It is individuals’ ability to use their own data for their own purposes, e.g. to manage their lives better. Two forms of portability. The two meanings of the word ‘control’ imply two different forms of data portability, requiring differing data sharing infrastructure. A narrow, limited, interpretation of data portability assumes that data about an individual will be transferred from one data holding organisation to another. This is a strategic non-starter for two reasons. First, it ignores the mathematics of networks, which mean that, as such a system scales, it quickly generates a cost and complexity catastrophe. Figure 1 below illustrates this point. In a one-to-one data sharing network, each new node that joins must connect to every existing node, so the number of connections grows with the square of the number of participants (whereas, if the sharing is done via a hub like a Personal data store, adding a node adds just one new connection). With a network of just three participants, direct sharing is fine. But with just eight participants, the number of connections that are needed has already jumped to 28. 
And by the time it gets to 50 participants, the number of connections needed has jumped to 1225 (whereas, with the individual’s Personal data store at the centre, the number of connections that are needed has now risen to 50). This complexity catastrophe unfolds on multiple fronts, including: - Security, as each organisation exposes its systems to other organisations - Interoperability, as each organisation has to learn how to deal with other organisations’ different software formats, standards, and so on. - Governance, as each organisation has to check whether another connecting organisation is bona fide, and really has permission to access the data - Data protection, as individuals lose sight of who holds their data for what purposes In short, a strategy for data sharing which simply extrapolates forward from today’s reliance on organisation-centric databases is a certain recipe for a catastrophically costly, toxic snarl-up. At the same time, if this approach is adopted, individuals never have the opportunity to take control of their data as in the second, broader, more expansive meaning of the term. With organisation-centric data portability individuals are continually excluded from the workings of the data economy, which remain firmly in the hands of data holding organisations. The second interpretation of data portability is where copies of the data held by organisations, (including verified attributes) are deposited in the individual’s Personal data store, so that they can use their data independently of the organisation that originally generated this data. This requires new infrastructure — empowering individuals with their own Personal data stores. 
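The connection counts quoted above follow from simple arithmetic: a fully meshed network of n participants needs n*(n-1)/2 pairwise connections, while a hub-and-spoke model (each participant connecting only to the individual's personal data store) needs just n. A few lines confirm the figures in the text:

```python
def mesh_connections(n):
    # Every pair of participants needs its own connection
    return n * (n - 1) // 2

for n in (3, 8, 50):
    print(n, 'participants:', mesh_connections(n), 'direct links, versus', n, 'via a hub')

assert mesh_connections(3) == 3     # direct sharing is still manageable
assert mesh_connections(8) == 28    # the jump quoted in the text
assert mesh_connections(50) == 1225 # the complexity catastrophe
```

The quadratic growth of the mesh against the linear growth of the hub is the whole argument for person-centric infrastructure in numeric form.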
This second approach to data portability has been explicitly recognised by the EU — by the Article 29 Working Party Guidelines on the right to data portability (5 April 2017) which stated that: “Data subjects should be enabled to make use of a Personal data store, Personal information management system or other kinds of trusted third-parties, to hold and store the Personal data and grant permission to data controllers to access and process the Personal data as required.” The take-out from this is simple. If the EU overlooks the pivotal importance of new enabling infrastructure for data sharing it risks a) sleepwalking into a data sharing complexity catastrophe and b) missing the true economic potential of data portability. Distinctions between Personal data stores and PIMS. The two interpretations of ‘control’ and the two forms of data portability as outlined above raise further questions about the EU’s understanding of the PIMS concept. As stated above, in our view, the unique contribution of PIMS is to help individuals to use data to manage their lives better: the focus is on ‘use’. Examples of ‘using’ data may be making and implementing decisions about Personal finances; managing a health condition better, or laying out a Personal skills, training and career plan. All such services require data to be aggregated from many different service providers. Personal data stores do not, per se, provide such services. They are Personal data logistics enablers, providing the infrastructure that enables the necessary data collection, storage and data sharing for the provision of such services. A PDS enables data to be held by individuals independently of particular service providers and enables the maximum reuse of this data by many different services. 
To that degree, we see Personal data stores and PIMS as being part of the same movement towards citizen data empowerment but with different roles and functions — one enabling data access, the other helping better use of the data once it has been accessed. PIMS need Personal data stores to function more efficiently (and if PDS infrastructure is not used, it will involve PIMS in huge amounts of extra, duplicated effort, the costs and complexity of which could stifle the growth of the sector). Conclusion The gist of our input was that the EU needs to build the full economic opportunity into its discussions about data portability. To do so, it needs to grasp the differences between two meanings of control and the two forms of data portability along with the need for new citizen empowering data infrastructure. Let’s see what they come up with.",https://medium.com/mydex/a-critical-fork-in-the-data-road-1eb29c5a42a8,,Post,,Ecosystem,Public,,,,,Portability,,2020-09-22,,,,,,,,,,,,,
|
||
MyDex,MyDex,,Medium,,,,,National Data Strategy,,Achieving Change At Scale,"This is the third in a series of blogs providing edited extracts from Mydex CIC’s response to the UK Government [consultation around a new National Data Strategy](https://www.gov.uk/government/consultations/uk-national-data-strategy-nds-consultation). The first focused on [how to unleash the full potential of Personal data](https://Medium.com/MyDex/how-to-unleash-the-full-potential-of-data-3676db8d7c03), the second on why [every citizen should be provided with their own Personal data store](https://Medium.com/MyDex/how-to-unleash-the-full-potential-of-data-3676db8d7c03). This blog explains why this strategy can be quick and easy to implement.","Achieving Change At Scale This is the third in a series of blogs providing edited extracts from Mydex CIC’s response to the UK Government consultation around a new National Data Strategy. The first focused on how to unleash the full potential of Personal data, the second on why every citizen should be provided with their own Personal data store, common misconceptions that derail progress, and the scale of the social and economic opportunity. This blog explains why this strategy can be quick and easy to implement. To catch up on progress on our Macmillan My Data Store Pilot click here. In some people’s minds, the idea that every citizen should be provided with their own Personal data store generates visions of massively costly, risky and time-consuming IT projects that invariably overrun in terms of both time and money while failing to deliver their promises (the sad but common experience of many centralised Government IT projects). Providing individuals with their own Personal data store is not one of these projects. In fact, because it implements a completely different model, it avoids these problems. How to make it happen Providing every citizen with a Personal data store does not require a massively high cost, high risk IT procurement process. 
To the contrary, the strategy can be pursued in a way that minimises costs, risk and disruptions, and builds momentum incrementally. For example, the strategy identified by the Scottish Government: - Builds on what already exists. Organisations already hold huge amounts of data about citizens, including verified attributes. All they have to do is electronically share some of this data when requested. Providing individuals with Personal data stores does not stop organisations from collecting and using data to provide valuable services. Rather, it builds on their expertise and infrastructure to add a new, additional layer of capabilities and infrastructure. Verified Attributes are already used widely and frequently in the provision of public services. Citizens are already required to present proofs about themselves using documents provided by other parties (e.g. passport, driving licence, bank statement, official letter etc). Providing individuals with a Personal data store so that they can separately store and share Verified Attributes about themselves simply enables the same things to happen digitally, safely and securely in a more efficient manner. - Minimises risk and disruption. Providing individuals with a Personal data store does not require any significant changes to existing back office systems, or to organisations’ processes, culture, business models or operations. Personal data stores add a new connecting element that ‘joins the dots’ between previously separate data silos, where the Personal data store acts as a node for citizen-controlled information sharing. This does not require the dots themselves to change what they do or how they operate. - Builds momentum automatically and incrementally. Roll-out can proceed incrementally, taking one particular service at a time, allowing for a test-and-learn approach that builds momentum and impact over time. This avoids big leaps into the unknown. 
Incremental adoption can be built in to small, additional process changes. For example, by minting digital copies of birth certificates during the certificate creation process and placing them into the child’s attribute/Personal data store, the Government could create a system that provides every individual with a Personal data store from birth, and which builds momentum over time. - Generates compelling win-wins to gain active buy-in. Using Verified Attributes to improve the quality and cut the costs of public service provision generates powerful win-wins between service providers and citizens (who will find applying for and accessing these services much quicker and easier than before). As the richness of the data held in each individual’s Personal data store grows, incentives for other organisations to connect to the system also grow. - Sets the right course for the future, pump-priming further developments. Even as this approach enables improvement of prioritised services now, along the way, it also builds the infrastructure and capabilities to further improve these and other services as momentum builds. Central Government’s role In saying that every citizen should be provided with their own Personal data store, we are not suggesting they should be provided directly by the Government via some new centralised, nationalised Personal data store authority. Multiple different, competing providers can and should offer Personal data store services, so long as they conform to an essential set of design principles (see below). The particular role that only the Government can fulfil is to break a collective action logjam that currently blocks progress. This collective action problem is simple to state: while every service provider using data wants to be able to access more reliable data, more quickly and cheaply, none has any immediate incentive to make the data they hold available to others. 
Why should Organisation A invest time and effort helping Organisation B cut their costs and improve their outcomes? But if all such organisations made key data points available via the citizen’s Personal data store, all of them would benefit from the ability to access this data. Government can, and should, break this logjam by mandating public services to share Verified Attributes for free via Personal data stores in public services. This approach can also solve some internal data logistics and distribution issues and reduce internal operating costs and help eradicate use of paper and contribute to net zero targets. Design principles For such a mass-scale Personal data store infrastructure to work at scale however, it needs to operate by design principles that ensure its integrity. Learning from experience since our founding in 2007 we have identified what these key design principles are: - Fiduciary duty Personal data stores should be required to work in the best interests of their clients — the citizens. - Neutral and enabling Personal data store infrastructure should not be designed or used so as to favour one party’s vested interests over another (e.g. one data controller or relying party). - Distributed, not centralised Each Personal data store should be uniquely and separately encrypted and held by individuals for individuals. They should not result in the creation of a new centralised database holding all citizens’ data. - Zero-knowledge operations Personal data stores should not look into or seek to influence how individuals use their data. Their job is to promote and enable citizen agency, not to seek to control or influence citizens in any particular direction. - Aligned incentives Personal data stores’ business models should be structured so that they have no financial incentives to monetise, manipulate or seek control over individuals’ data or their uses of this data. They should earn their money from infrastructure provision and related services. 
- Separation of storage from use Personal data stores should focus on the provision of infrastructure enabling citizens to collect, store and share their data and should not seek to use this data to provide data-driven services (e.g. to provide financial advice, treat a medical condition, or assess an individual’s eligibility for a service). By enabling service providers to access individuals’ data (with permission) the role of the infrastructure is to enable more service providers to better access and use more, richer data in order to provide better, cheaper services. - Committed to interoperability Citizens should not be ‘locked in’ to any particular provider of Personal data stores. It should be part of the duty and responsibility of Personal data stores to enable interoperability to enable sharing of data between individuals and organisations and to enable individuals to move from one PDS provider to another. - Universality Every citizen should have access to their own Personal data store (their own ‘running data’ as it were), just as every citizen should have access to ‘running water’ and ‘running electricity’. - Independence For privacy protection and independence reasons, Personal data stores should not be owned or controlled by the state or under the control of existing service-providing data controllers. - Built to last Personal data store providers should have built-in safeguards against changes to role, functions and priorities — for instance, in the case of a change of leadership, stock market listing, merger, sale, acquisition or bankruptcy. The EU has already embraced many of these principles in its new Data Governance Act. The UK should do the same. Unleashing a positive feedback loop Every now and again, as societies and economies evolve, a need arises for citizens to be equipped with new capabilities. 
In the late nineteenth and early 20th centuries a need emerged for public sanitation to secure public health and for all children to learn how to read and write. Similar advances soon followed: universal access to running water, universal access to electricity, and universal access to basic health services. Each of these advances improved Personal wellbeing, making individuals’ lives easier and better. At the same time they also generated powerful positive social/economic externalities: a healthy, educated workforce is much more productive than an unhealthy, uneducated workforce. Generating such positive feedback loops between Personal wellbeing and social/economic efficiency and productivity is the hallmark of successful popular Government initiatives. Government action was needed in these cases because only the Government was able to intervene at the level needed to break collective action logjams and to realise the positive externalities including the universality of provision upon which the positive externalities are based. The modern digital age has created the need for the Government to play a similar role again: to equip citizens with the data they need to manage their lives better and to undertake interactions and transactions with service providers. A new positive feedback loop between citizen wellbeing and economic efficiency and productivity is now waiting to be unleashed.",https://medium.com/mydex/transforming-the-system-a-roadmap-for-practical-implementation-411e8821ed19,,Post,,Ecosystem,Public,,,,,,,2022-01-04,,,,,,,,,,,,,
|
||
MyDex,MyDex,,Medium,,,,,,,AI: The Emperor’s New Clothes?,"One reason the UK Government wants to abolish citizens’ rights to data protection is to create conditions for Artificial Intelligence (AI) to blossom. This, it says, will “bring incredible benefits to our lives”.","AI: The Emperor’s New Clothes? One reason the UK Government wants to abolish citizens’ rights to data protection is to create conditions for Artificial Intelligence (AI) to blossom. This, it says, will “bring incredible benefits to our lives”. This third blog in our series prompted by the Government’s proposed reforms (‘Data: A New Direction’) examines the flaws in the Government’s arguments. - You need lots of data to ‘do’ AI. But it doesn’t have to be Personal data. It can, and should, be anonymised data. So there is no need to reform data protection law to promote AI’s potential. - Most of the hopes (and fears) pinned on AI, along with many of its mis-uses and abuses, stem from a basic misconception about what AI is. AI does not replicate or substitute for human intelligence. Calculation is not the same as perception. Computation is not the same as intelligence. The real power of AI comes from its ability to do things that humans can’t do, rather than replace what they can do. Unfortunately, the Government is compounding the misconceptions. We think AI has great potential … to do what it is good at. But in perpetuating misconceptions about AI, the Government risks compounding the damage done by mis-applications of AI’s potential while distorting policy decisions (including those relating to data protection law and investment priorities). This blog examines these misconceptions. AI and Personal data The Government is proposing to abolish citizens’ rights to data protection in all but name. Its main justification for doing so is the “incredible” benefits it believes AI will bring. But it does not need to ‘reform’ data protection law to realise these benefits. 
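As a concrete illustration of the anonymisation point above, here is a minimal, hypothetical sketch of pseudonymising records before they reach an AI training pipeline. The field names and salt handling are our own assumptions, not anything Mydex or the Government has specified; and note that salted hashing is pseudonymisation rather than full anonymisation, so re-identification risk still needs assessing:

```python
# Hypothetical sketch: strip direct identifiers before model training.
# A salted hash replaces the name, so the pipeline can still link a
# person's records together without ever knowing who they are about.
import hashlib

SALT = b'secret-salt-stored-separately'  # assumption: managed and rotated elsewhere

def pseudonymise(record):
    # replace the identifier with a stable, non-reversible token
    token = hashlib.sha256(SALT + record['name'].encode()).hexdigest()[:16]
    return {
        'id': token,
        'age_band': record['age'] // 10 * 10,  # coarsen age to a 10-year band
        'outcome': record['outcome'],
    }

raw = {'name': 'Jane Example', 'age': 36, 'outcome': 'treated'}
clean = pseudonymise(raw)  # the name never leaves this function
```

The pattern-finding the blog describes only needs the coarsened, de-identified fields; the raw identifier stays behind with the data controller.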
Many AI-based services such as driverless cars or automated online translation services don’t need any Personal data at all. Others, like facial and voice recognition, may generate new Personal data about the person whose face or voice it is but they don’t actually need Personal data to work. Yes, they need huge amounts of data to identify the patterns needed for recognition. But they don’t need to know whose faces or voices they are: they need anonymised data, not Personal data. Many other AI-based services such as medical diagnoses or predictive modelling also need huge amounts of data to generate their insights and predictions and often this data starts out as Personal data (e.g. about identifiable people). But again, AI does not need to know who these people are to work. The data can and should be anonymised. Of course, there are many technical questions relating to how data should be anonymised. Can it be easily de-anonymised, for example? But this doesn’t alter the underlying fact: these applications don’t need Personal data to work, so rules about the collection and use of Personal data aren’t relevant to it and shouldn’t be allowed to be made relevant by using Personal data when they shouldn’t. The only time when Personal data does become important is if an organisation acts on the outcomes of AI’s number crunching when dealing with an identifiable individual. But this has got nothing to do with AI itself. This is about service provision, when normal rules and safeguards relating to the use of Personal data in service provision should apply. In short, the Government’s proposals for data protection reforms as they relate to AI are without foundation. Misunderstanding AI Unfortunately, there’s more: the Government’s proposals are based on a definition of AI that is confused, at least partially wrong and potentially highly damaging. Here is how its ‘Data: A New Direction’ paper defines artificial intelligence. 
“The use of digital technology to create systems capable of performing tasks commonly thought to require intelligence. AI is constantly evolving, but generally it involves machines using statistics to find patterns in large amounts of data, and the ability to perform repetitive tasks with data without the need for constant human guidance.” [Paragraph 63 (a)] This definition has three parts. The middle part — “using statistics to find patterns in large amounts of data” — is spot on. The third part — “performing repetitive tasks with data without the need for constant human guidance” — is simply wrong. That’s what all computer software does. Automation and AI are not the same thing, and to suggest they are is a fundamental misunderstanding. But the real dangers lie in the first part: the implication that AI is replicating human intelligence, which it is not. It’s true: AI can be trained (with great difficulty) to do some things (like recognise faces) that we commonly associate with human intelligence. But AI doesn’t undertake these tasks in the same ways as humans (calculation is not the same as perception), and it can only do a very small proportion of the things that human intelligence can do. In fact, far from being different versions of the same thing, AI and human intelligence are near opposites. AI is really good at doing some things that human beings are really bad at (such as using statistics to find patterns in large amounts of data, which is what makes it powerful and useful). But it is terrible at performing most of the “tasks commonly thought to require [human] intelligence”. Promoting AI in the hope that it can become a substitute for human intelligence is a recipe for disaster. Many of its errors and abuses stem from this mistake. Unfortunately, the misconceptions on which this mistake is based are now extremely common — which is why we take time out in this blog to explore them. 
AI finds it really difficult to ‘see’ things which we humans see instantly and automatically This is called the ‘bundling problem’. When we see a cup of tea on a saucer with a spoon on the side, that’s what we ‘see’: a cup of tea on a saucer with a spoon on the side. AI doesn’t see anything of the sort. It ‘sees’ a jumble of thousands of different pixels represented by different data points. A complete blur. (To see what AI ‘sees’ when looking at a picture of a panda, take a look at this article.) To see anything in this blur, AI has to do an enormous amount of computation, comparing one data point to another. It takes an immense amount of effort and training for AI even to ‘see’ an ‘edge’ (because that implies it can see some ‘thing’ which has an edge that separates it out from other ‘things’). It’s almost impossibly difficult for it to understand that the spoon and the saucer are different things to ‘the cup’. This is because, to it, they are all part of the same bundle of data points and it has no reason to unbundle them with no logical basis for doing so. Also, it does not have any ‘history’ or experience outside of what it has been fed by AI engineers or has already computed. It does not have the cultural or experiential history to understand the concepts ‘cup’, ‘saucer’ or ‘spoon’. We humans can do these things so quickly, automatically and effortlessly that we don’t even realise we are doing them. This is testimony to billions of years of evolutionary history — which AI does not have. AI does not have ‘general’ intelligence Much has been made recently about AI’s ability to ‘learn’ using things like neural networks. These feats are indeed impressive. But they are tightly constrained, because AI cannot apply what it has learned in one task to another. When Google’s Deep Mind project announced that its AI could beat a human expert at the game of Go it provoked a flurry of excited speculation. What next? 
But Google’s AlphaGo only works on a board of 19 x 19 squares. If it is asked to work on a board of 18 x 18 it is stumped, having to start all over again. It cannot ‘apply’ what it has learned from one board to another, never mind from one complex situation to another. This is no real problem for humans whose evolutionary survival rests on the flexibility and adaptability that the ability to apply lessons ‘learned’ from one experience to another brings. An ability that AI doesn’t have. AI cannot use analogy or metaphor Humans are instinctively able to see likenesses in things that are very different, and use this capability to aid their understanding. If I say “March came in like a lion and went out like a lamb”, you instinctively understand the implied references to ‘roaring’ and ‘fierceness’ as opposed to ‘gentleness’. For AI, this is a massive barrier. Because AI doesn’t work in the currency of concepts, only in correlations between data points, it cannot use analogy and metaphor to reason. AI cannot imagine or create Humans don’t only think about what is. They also think about what is not but could be. They use these imaginings to better understand their current situation and to see ways to change it. Because AI can only work with the data it has been fed with (it cannot crunch data it hasn’t got) it cannot imagine what could be. It is indeed possible for AI to give the impression that it is being ‘creative’ by acting on an instruction to generate new, random combinations of data. This can be useful. But it is mimicking the outputs of imagination, not recreating the processes behind it. AI cannot distinguish between correlation and cause and effect Humans instinctively reason by linking causes to effects (to the point when often they see causes or effects that aren’t actually there). But AI, which has no experience of living in a world where ‘causes’ have ‘effects’, can only identify correlations. 
(And, by the way, the bigger the data set, the more ‘correlations’ it will identify, most of which will be meaningless). AI cannot ‘level jump’ AI cannot stop to think. It cannot ‘step outside’ of what it has been programmed to do to ask: “hang on, am I asking the right question?”, or “hold on, does this really make sense?”. This is what makes ‘adversarial examples’ so dangerous in AI training. Adversarial examples can happen when someone changes just one pixel in a series of images to interfere with the correlations that are being computed. If one pixel is changed in a picture, humans see the same picture. But AI might see something entirely different: a computer instead of a cat; a gibbon instead of a panda. Crucially, AI has no way of knowing that it is making this mistake and carries on regardless — which can be incredibly dangerous. AI cannot switch attention to something more important AI cannot stop to think “hold on, should I really be focusing on this right now? Shouldn’t I be doing X instead?” Because AI doesn’t understand concepts (and has to be trained through millions of trials and errors to even ‘see’ a cat), and because it can only see correlations but not their relative significance, it cannot understand that one task may be more important than another — or even be aware that another, different task may exist. (It may be trained by human beings to allocate scores to things within a task, but it cannot make judgements as to the value or importance of different tasks.) AI cannot sense or respond to anything outside of the data sets it is provided with Humans have multiple different senses which are ‘always on’ — such as sight, sound, smell, touch, taste and so on. Our brains are monitoring our environments all the time to see if anything ‘out there’ needs paying attention to. If there isn’t, our brains relegate this vast inflow of information (billions of bits a second) to the unconscious so we can focus our attention on the task at hand. 
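The fragility of the ‘adversarial examples’ described above can be made concrete with a toy sketch. The numbers and the hand-built linear scorer below are purely hypothetical (real adversarial attacks perturb trained neural networks, typically via gradients), but the effect is the same: a change far too small for a human to notice flips the output, and the system has no way of knowing it has gone wrong:

```python
# Toy illustration of an adversarial example: a linear classifier over
# 100 'pixel' values. Changing one pixel by 0.02 flips the label, even
# though a human would see two essentially identical images.

def predict(pixels, weights, bias):
    # positive score -> 'panda', negative score -> 'gibbon'
    score = sum(p * w for p, w in zip(pixels, weights)) + bias
    return 'panda' if score > 0 else 'gibbon'

weights = [0.01] * 99 + [5.0]  # one pixel happens to carry huge weight
bias = -2.9                    # leaves the original score just above zero

original = [0.5] * 100
tampered = original[:]
tampered[-1] -= 0.02           # imperceptible one-pixel change

print(predict(original, weights, bias))  # panda
print(predict(tampered, weights, bias))  # gibbon
```

Because the classifier only computes correlations between data points, it cannot ‘step outside’ the calculation to notice that the two inputs are, for all human purposes, the same picture.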
Psychologists are still at a loss as to how we process multiple incommensurate senses (such as sight plus sound) to create single experiences. This ability to sense a background environment, to make constant judgements as to its significance, and to switch attention to another task is way beyond AI. Instead, for AI to work its wonders, it has to be fed with huge amounts of data by human beings, separately, task by task — by human beings who choose what data to feed it with, and who pre-empt the calculations that AI can make in doing so. AI does not have an experiential ‘history’ to draw upon like humans do. Take that cup of tea, for example. We humans know that if we turn a cup of tea upside down, the tea will pour onto the floor. AI doesn’t know this because it has never experienced gravity — a piece of knowledge that we humans simply take for granted. But, you might say, AI Personal assistants such as Siri and Alexa are ‘always on’, listening for the words that will ‘wake them up’. But that confirms the key point: they have been carefully trained to do this one, small, particular task. It’s testimony to the brilliance of AI engineers that they have managed to create driverless cars. But no piece of AI has the ability to decide that the car should change its destination, on the basis of new, external information (say that a friend has just been taken ill). It needs a human being to tell it to do this. AI cannot feel emotions or display human empathy It goes without saying that, as a machine, AI does not have emotions. It cannot feel pleased, excited, or hurt. It cannot use such information to modulate its actions. AI cannot use norms or values to assess situations or decisions As a piece of software, AI is not like a human being who is embedded within a society and culture and acutely aware of the norms or values by which this culture operates. For this reason, AI cannot make human decisions which are always, to some degree or other, value judgements. 
It can only make logical decisions, based on criteria programmed into it by human beings. But ‘Ah!’ you might say, feeling emotions or making value judgements have got nothing to do with how a piece of computer software goes about its business. But that is precisely the point. Real life human intelligence is embedded within and awash with emotional impacts and ethical and moral judgements. They are an essential part of how humans make decisions about what to do and what not to do. Indeed, when humans put their intelligence to work, much of the information they are dealing with is about other people’s actions, intentions, attitudes, beliefs, motivations, concerns and so on. Human intelligence is recursive (‘I know that you know’), not linear; social, not abstractly logical. As soon as the task at hand requires any assessment of any of such human factors, AI systems are at a loss. AI can only solve a tiny proportion of the problems human intelligence has to deal with AI works by making calculations, which means it can only find solutions to questions that can be computed. But a surprisingly small proportion of real world problems are actually computable. AI can only work with problems that are ‘algorithmically compressible’. That is, with problems that can be finally and completely solved by a finite series of logical ‘if-then’ questions. But the majority of the most important issues that humans have to apply their intelligence to are not ‘algorithmically compressible’: they do not have a definite answer that can be calculated. They involve judgement. In fact, for AI to be able to come up with its clever answers, the questions it is asked and the ways of addressing them must display key criteria that makes them calculable: e.g. with a clearly defined goal, a clear end point for the calculation, and a clearly defined way of getting there. That’s why AI promoters use games like chess and Go to sell their wares. 
Yet many of the issues taxing human intelligence have neither clear goals, nor clear end points, nor clear rules for reaching them. They are ‘wicked’, ‘infinite’ (i.e. forever unfolding) problems where, for example, it’s not actually clear whether we are asking the right question in the first place, whether we can rely on the information we’ve got, where the people involved have different perceptions, motives and goals, and where the ‘rules of the game’ may suddenly change halfway through. AI cannot answer the question “what is the best way to implement AI?” because this is a ‘wicked’ problem, not a computable one. Artificial stupidity? For hundreds of years, in their quest for immortality, Chinese alchemists sought to perfect the art of embalming — because, they believed, the closer they got to mimicking the appearance of life the closer they were getting to life itself. Today, AI is being used in many clever ways to mimic the appearance of human intelligence. But that doesn’t mean it is intelligent. It is as close to intelligence as embalming is to life. If we put all the limitations discussed above together, AI is actually quite stupid. When it comes to things like understanding common sense (meaning, significance, cause and effect), social relationships, or the physics of gravity it is much less intelligent than a toddler (it lacks what some AI researchers call ‘infant metaphysics’), which is why it makes so many awful mistakes. The fact that it is now common practice to equate AI with human intelligence, as the Government does in its proposals, does not mean these claims are right. It simply means many people are repeating a mistaken conventional wisdom, just as centuries ago millions of people sincerely believed they were at risk of harm perpetrated by witches’ use of black magic. Fact is, the term ‘artificial intelligence’ is not a description. It’s a marketing slogan designed to sell a product. 
After all, would people talk about AI in hushed tones of awe if it was simply called ‘large scale number crunching’ or ‘Artificial Stupidity’ (which it often is)? Don’t get us wrong. We like AI for what it can do. As a means of deploying clever statistical methods to find patterns in large quantities of data, AI can be extremely useful in tackling a wide range of tasks. We at Mydex are embracing AI to help individuals make better decisions and manage their lives better. But we like it for what it can do, not for what it can’t. The AI hype we see today is more of a sociological phenomenon than a technological one — similar to the railway mania that gripped Britain in the 1840s. At that time, this awe-inspiring new technology seemed to promise spectacular social and economic opportunities — and financial returns. Soon, every harebrained venture with the word ‘railway’ in it was attracting huge amounts of money, only for thousands to lose their fortunes when the bust came. The same is happening now, except that the trigger word is ‘AI’ rather than ‘railway’. AI mania is dangerous. Precious time and money are being squandered on inflated dreams and expectations. Vested interests are stoking the bandwagon for their own purposes. People are using AI to make bad decisions. And policy is being distorted to fit the hype (e.g. dreams of unprecedented ‘innovation and growth’), not the reality. That is what the UK Government is doing now, with its proposal to abolish the existing citizen right to object to automated processing of their data “if this processing significantly affects them”. Which raises the issue of consent: the subject of our next blog. Other blogs in this series are: - Why is Data Valuable? Exposes misconceptions about the value of data, and therefore where its biggest economic potential lies. - Five Myths about Data, Innovation and Growth Explains why the Government’s claims that its ‘reforms’ will promote innovation and growth are without foundation. 
- We Need To Talk About Consent Explains why ‘consent’ to data processing got to be a problem and how to address it — without having to change the law. - Data: A New Direction — But Which Direction? Analyses the core proposals made by the Government’s initiative to ‘reform’ data protection regulations.",https://medium.com/mydex/ai-the-emperors-new-clothes-91de9eed3650,,Post,,Ecosystem,Public,,,,,,,2020-11-12,,,,,,,,,,,,,
|
||
MyDex,MyDex,,Medium, Alan Mitchell,,,,Data: A New Direction,,Data: A New Direction — But Which Direction?,This is the fifth and final blog in our series about the UK Government’s proposals for data protection reform — “Data: A New Direction”. Previous blogs focused on the thinking behind the proposals. This blog summarises what the main proposals are.,"Data: A New Direction — But Which Direction? This is the fifth and final blog in our series about the UK Government’s proposals for data protection reform — “Data: A New Direction”. Previous blogs focused on the thinking behind the proposals. This blog summarises what the main proposals are. Stated plainly, the UK Government is planning to end data protection rights for UK citizens. Reforms proposed in its paper Data: A New Direction would shift the core operating principle of data protection regulations from citizen protection (that Personal data shall only be collected by organisations “for specified, explicit and legitimate purposes”) to a new principle that organisations should have the right to build and maintain databases about citizens without their consent. This Briefing Paper shows how the Government is planning to achieve this radical ‘new direction’ for data. (Paragraphs 57 and 58 of the Consultation, around which this ‘New Direction’ pivots are reproduced in the Addendum.) Background The Government is taking the opportunity of Brexit to ‘reform’ data protection law. “Now that we have left the EU, we have the freedom to create a bold new data regime,” says the Minister in his introduction. The stated intention of this “bold new data regime” is to “usher in a new golden age of growth and innovation right across the UK”. This, to be achieved by creating “a better balance between protecting individuals and not impeding responsible data use” [Paragraph 59] — a ‘better balance’ that ends citizen data protection rights in all but name, replacing them with corporate rights instead. 
The Minister’s introduction states that “The protection of people’s Personal data must be at the heart of our new regime. Without public trust, we risk missing out on the benefits a society powered by responsible data use has to offer.” But the content of the actual proposals does the opposite. What the law currently says The core principle of existing GDPR data protection regulations is that Personal data shall only be “collected for specified, explicit and legitimate purposes and not further processed in a manner that is incompatible with those purposes”. A key supporting principle is that of data minimisation: that Personal data shall be “adequate, relevant and limited to what is necessary in relation to the purposes for which they are processed”. There are six conditions for the lawful processing of Personal data but the two central ones are that: - the data subject has given consent to the processing of his or her Personal data for one or more specific purposes; - processing is necessary for the performance of a contract to which the data subject is party or in order to take steps at the request of the data subject prior to entering into a contract; Conditions for lawful processing envisage situations where “processing is necessary for the purposes of the legitimate interests pursued by the controller or by a third party”. But these legitimate interests come with a ‘balancing test’ to determine whether they should be “overridden by the interests or fundamental rights and freedoms of the data subject which require protection of Personal data”. 
What the UK Government is proposing On the grounds of addressing ‘consent fatigue’, the Government is proposing to: “create a limited, exhaustive list of legitimate interests for which organisations can use Personal data without applying the balancing test in order to give them more confidence to process Personal data without unnecessary recourse to consent.” [Paragraph 57] Paragraph 57 adds that “The processing would still have to be necessary for the stated purposes and proportionate.” But the ‘exhaustive’ list of exceptions to this rule provided in Paragraph 58 is so broad that organisations would have the right to process Personal data without individuals’ consent in the vast majority of use cases. In other words, Paragraph 57 renders the safeguard meaningless, leaving it for window dressing only. The pivotal clause is Paragraph 58 (i), which makes “Managing or maintaining a database to ensure that records of individuals are accurate and up to date, and to avoid unnecessary duplication” a ‘legitimate interest’ where organisations need not seek individuals’ consent for the processing of their data. This deft wrecking amendment turns the current rule — that Personal data should only be processed for specified, explicit purposes — on its head, promoting organisations’ right to collect data about individuals without their consent instead. No limit or restriction to organisations’ right to collect data for their databases is mentioned. The rest of the consultation extends the exceptions to cover most uses to which organisations put data, including for research, business innovation, web analytics, AI training, use of cookies, and data sharing. For example, Paragraph 58 (i) is supplemented by Paragraph 58(h) which includes “Using Personal data for internal research and development purposes, or business innovation purposes aimed at improving services for customers”. 
No definition for ‘internal research’ or ‘business innovation’ is provided, making them vague enough for Cambridge Analytica to claim that its activities were entirely lawful. Paragraph 58(c) has another exception. “Monitoring, detecting or correcting bias in relation to developing AI systems” would now also be a ‘legitimate interest’ where individuals’ consent for data processing is no longer needed. This might seem innocent. Desirable even. But practically speaking, the best way to use data to eliminate the risk of bias is to have as comprehensive and complete a database as possible — which means this clause could (and will) be used by some corporations as good reason to collect all possible data about every individual: it opens the door to total data surveillance. (It is also based on a false premise: that the main cause of bias in AI deployments is lack of access to data. This is not true. The main cause of bias in AI is poor management of the processes for testing and deployment: a people thing, not a data thing.) That this is the direction of travel intended by the Government is confirmed by other proposals which include unspecified measures that would “permit organisations to use Personal data more freely, subject to appropriate safeguards, for the purpose of training and testing AI responsibly” [Paragraph 81] and to ‘disapply’ individuals’ current right to object to automated processing [Paragraph 48] . The Government continues in the same direction with its proposals to reduce ‘consent fatigue’ as it relates to cookies. Paragraph 195 proposes to “permit organisations to use analytics cookies and similar technologies without the user’s consent.” (The term ‘analytics cookies’ is used in a variety of different ways without a single, clear definition.) 
Paragraph 197 would “permit organisations to store information on, or collect information from, a user’s device without their consent for other limited purposes” where ‘other limited purposes’ “could include processing that is necessary for the legitimate interests.” (In other words, the free-for-all created by Paragraph 58). Question 2.4.4 simply asks “To what extent do you agree that the requirement for prior consent should be removed for all types of cookies?” Again, a door opener to total data surveillance. Meanwhile Paragraph 43(b) seeks to expand the grounds for lawful processing of Personal data to include ‘research purposes’, with no stipulations on what may be included or not included in the definition of ‘research’. Implications The purpose of these ‘reforms’ seems to be to create a completely ‘free market’ for the trading of UK citizens’ data, without their consent. The Ministerial introduction talks of “secur[ing] the UK’s status as a global hub for the free and responsible flow of Personal data” (Note the word ‘responsible’ again. See below). To this end, Paragraph 51(g) (part of an extended discussion on lifting restrictions on the ‘further processing’ of Personal data) notes that “Innovative data uses may involve sharing Personal data sets between different controllers”. This opens the door to corporations trading peoples’ data without their knowledge and behind their backs. To this end, the Government intends to clarify “When Personal data may be re-used by a different controller than the original controller who collected the data, and whether this constitutes further processing.” If this ‘clarification’ is based on the new definition of ‘legitimate interest’, it could make UK citizens’ data a globally traded commodity over which they have no say. 
In sum, the net effect of the new regulations would be to turn data protection regulation on its head, effectively removing all main citizen rights and giving organisations carte blanche to collect and use Personal data as they wish, without individuals’ consent, thereby opening the door to unrestricted data surveillance and value extraction. All in the name of ‘innovation and growth’. The regulatory environment As part of this ‘New Direction’ for data, the Government is also seeking to compromise the independence of the regulator — the Information Commissioner’s Office. Key elements of the extended discussion on this subject are the proposals [Paragraph 326] to “place a new duty on it [the ICO] to have regard for economic growth and innovation when discharging its functions”. The absurdity of this concept becomes apparent if it is applied to other areas of regulatory enforcement. Should the Health and Safety Executive or Trading Standards Officers ‘have regard for economic growth and innovation when discharging their functions’, ‘balancing’ the requirements of health and safety and honesty in trading against the ‘needs’ for economic growth? What such rules and regulations do is create boundaries that channel innovation and economic growth in a certain direction: a direction that protects health and safety rather than undermines it. Likewise with data protection. When it comes to data, the Government wants to compromise rules and regulations that channel innovation and economic growth in a direction that protects citizens’ data to take the nation in “a new direction” (the title of its Paper) — one that exploits citizens’ data instead. Should the ICO have held back on fines on Cambridge Analytica on the grounds that it was promoting innovation and growth? 
This overt political interference in the enforcement of law is confirmed by [Paragraph 319] which introduces “a power for the Secretary of State for DCMS to prepare a statement of strategic priorities to inform how the ICO sets its own regulatory priorities”. The nature of the consultation process At 146 pages and over 50,000 words of dense, technical (and often obfuscatory) commentary, the consultation seems designed not to be read, the purpose being to hide its true intent, made possible by carefully chosen wrecking amendments that are hidden in a welter of often irrelevant detail. How many people have had the time or energy to read and inwardly digest the full document to grasp its implications? One of the stated justifications for the proposals is that current regulations are “unnecessarily complex or vague” and continuing to “cause persistent uncertainty”. Yet the consultation itself is both unnecessarily complex and vague. Its use of the word ‘responsible’ is a good example. The consultation highlights the difficulty of defining terms like ‘the public interest’ and has an extended discussion of the meaning of ‘fair processing’, concluding that “Comprehensively defining ‘outcome fairness’ in the context of AI within the data protection regime may [therefore] not be a feasible or effective approach”. But with the word ‘responsible’ it introduces a term new to data protection regulations, using it 52 times … without ever defining it. Rationale for reform The Government’s main justification for these ‘reforms’ (other than to rectify ‘confusion and vagueness’) is to address ‘consent fatigue’ and “unleash data’s power across the economy and society for the benefit of British citizens and British businesses” thereby “ushering in a new golden age of growth and innovation right across the UK”. Neither of these justifications stands up to scrutiny. ‘Consent fatigue’ is mainly caused by the widespread gaming of consent systems, compounded by lax regulator oversight. 
The problem can be better addressed without any changes to the law, as we show here. The ‘innovation and growth’ envisioned by the Government in this Consultation represents a deep misunderstanding of what makes data valuable; misconceptions about where the biggest opportunities for innovation lie and how to enable them; and a fundamental misunderstanding of the nature and potential of artificial intelligence. In short, because it is intellectually incoherent and flawed, it will not achieve its stated goal: to “unleash data’s power across the economy and society for the benefit of British citizens and British businesses”. In fact, it is almost certain to do the exact opposite. This blog series Other blogs in this series are: - Why is Data Valuable? Exposes misconceptions about the value of data, and therefore where its biggest economic potential lies. - Five Myths about Data, Innovation and Growth Explains why the Government’s claims that its ‘reforms’ will promote innovation and growth are without foundation. - AI: The Emperor’s New Clothes? Examines common misconceptions about AI, which the Government repeats and uses as justification for its proposed ‘reforms’. - We Need To Talk About Consent Explains why ‘consent’ to data processing got to be a problem and how to address it — without having to change the law. Addendum: Paragraphs 57 and 58 in full 57. The government therefore proposes to create a limited, exhaustive list of legitimate interests for which organisations can use Personal data without applying the balancing test in order to give them more confidence to process Personal data without unnecessary recourse to consent. The processing would still have to be necessary for the stated purposes and proportionate. For those activities not on the list, the balancing test would still be applied. The balancing test could also be maintained for use of children’s data, irrespective of whether the data was being processed in connection with an activity on the list. 
The government is mindful that Article 6(1)(f) of the UK GDPR recognises that particular care should be taken when data controllers are relying on the legitimate interests lawful ground to process data relating to children. 58. Any list would also need to be sufficiently generic to withstand the test of time, although the government envisages it could be updated via a regulation-making power. In that respect, the list would be similar to the approach in Section 8 of the Data Protection Act 2018 for the public tasks processing condition. For example, it could cover processing activities which are necessary for: - Reporting of criminal acts or safeguarding concerns to appropriate authorities - Delivering statutory public communications and public health and safety messages by non-public bodies - Monitoring, detecting or correcting bias in relation to developing AI systems (see section 1.5 for further details) - Using audience measurement cookies or similar technologies to improve web pages that are frequently visited by service users - Improving or reviewing an organisation’s system or network security - Improving the safety of a product or service that the organisation provides or delivers - De-identifying Personal data through pseudonymisation or anonymisation to improve data security - Using Personal data for internal research and development purposes, or business innovation purposes aimed at improving services for customers - Managing or maintaining a database to ensure that records of individuals are accurate and up to date, and to avoid unnecessary duplication",https://medium.com/mydex/data-a-new-direction-but-which-direction-da547b886ac0,,Post,,Ecosystem,Public,,,,,,,2021-06-08,,,,,,,,,,,,,
MyDex,MyDex,,Medium,,,,,,,Hidden in Plain Sight — the Transformational Potential of Personal Data,"Personal data stores apply the same economic logic to transform the costs of producing data driven services. [Verified attributes](https://Medium.com/MyDex/unleashing-the-potential-of-verified-attributes-fe001e01b091) are the digital equivalents of Henry Ford’s standardised parts. By enabling one organisation to instantly re-use data verified by another organisation they eliminate the need for vast amounts of duplicated effort and rework (re-creating each data point from scratch or checking its details, provenance etc).","Hidden in Plain Sight — the Transformational Potential of Personal Data This is the sixth and final in a series of blogs which provide edited extracts of key points made by Mydex CIC in its response to the UK Government consultation around a new National Data Strategy. This blog focuses on the scale of the economic (and social) opportunity — and why it is often overlooked. Previous blogs focused on how to unleash the full potential of Personal data, why every citizen should be provided with their own Personal data store, how to achieve these changes at scale, common misconceptions that derail progress and a review of the key components needed for the overall ecosystem to work. To catch up on progress on our Macmillan My Data Store Pilot click here. It’s odd to the point of bizarre, but it’s true. Today, there is endless hype about the enormous economic potential of data. That is why the UK Government is developing a National Data Strategy. Yet most debate (and therefore decision-making) about it demonstrates a deep misunderstanding of where this value lies and how it can be unleashed. For a National Data Strategy to be successful, it has to get its underlying economic logic right. 
Cost plus versus cost out Currently, nearly every Government proposal and policy relating to data (including Personal data) treats data as a corporate asset that the organisation has invested in. The organisation therefore needs to earn a return on this investment (it is assumed). Like any other product, data needs to be sold (or rented) for a margin. The entire focus is on measuring and the potential size of ‘the market for data’, the supposed value of different bits of data in this market, and how to enable this market to work better. It’s all about monetisation of data. At first glance, this seems logical. After all, if organisations have invested a lot of time and money creating data assets, it’s only sensible that they should find a way to cover these costs. What this misses, however, is the opportunity to take cost out of the system as a whole — to move it to a different cost baseline — where new ways of sharing and using data pay for themselves many times over without anyone having the need to sell or ‘monetise’ anything. Henry Ford’s mass production moving assembly line is a good example of the immense opportunities opened up by such system-wide ‘cost out’ approaches. Before the moving assembly line, cars were extraordinarily expensive items, made painstakingly by craftspeople who made each component separately. Because each component was hand made and therefore slightly different, to assemble them into a working machine, they had to re-work every component to make it fit. This required exquisite skill. The ability to do it well was a key source of the craftsperson’s added value. But it was also incredibly expensive. By relying on standardised components, Ford’s production lines eliminated the need for this rework. 
And by bringing each component to the worker when they needed it, his moving assembly line eliminated unnecessary time spent searching for, travelling to, or waiting for parts to arrive — thus reducing another layer of effort and speeding up outputs. When Henry Ford first experimented with his ‘cost out’ moving assembly line, people were astonished by the efficiency gains it delivered. Before the experiment, it took 29 workers 15 minutes to make a magneto (which helps start ignition). After the experiment it took 14 workers 5 minutes: an 85% leap in productivity. Similar productivity gains followed as he applied the same methods to the rest of the car. Car output soared from 68,773 in 1912 to 735,020 in 1917. The price of his cars fell by 90%. Nobody had seen just how much waste was embedded into how the old system worked. The waste was previously invisible to them. Given the extensive pollution, congestion and occasional carnage caused by the motor car, many people might say it ended up being more of a curse than a blessing. But we are not talking about the merits of the car itself here. We are talking about economic principles that powered an entire industrial revolution. The moving assembly line eliminated huge amounts of waste that were embedded into every sinew of how the previous system worked: the waste caused by the need to rework every component and by poor logistics which meant that the right things could not get to the right people at the right time. Personal data stores apply the same economic logic to transform the costs of producing data driven services. Verified attributes are the digital equivalents of Henry Ford’s standardised parts. By enabling one organisation to instantly re-use data verified by another organisation they eliminate the need for vast amounts of duplicated effort and rework (re-creating each data point from scratch or checking its details, provenance etc). 
And by enabling individuals to hold these verified attributes in readiness, and to share them safely and efficiently with service providers when they need to be used, Personal data stores act as data logistics engines bringing the right data components to the right places at the right time (the equivalent of Henry Ford’s moving assembly line). Our early experiments are showing similar efficiency and productivity gains to those realised by Henry Ford. There is now an opportunity to create order of magnitude reductions in the costs of providing data-driven services (initially in the public sector). Just to be clear: by order of magnitude reductions in costs, we don’t mean five or even ten per cent here or there but a complete system that is five or ten times more efficient. When dogs don’t bark In one of his famous detective stories Sherlock Holmes notices something crucial that no one else had: he notices something that hadn’t happened. Humans are hard-wired to notice things that happen. But it’s very difficult to become aware of things that are not happening — things that could or should be happening but aren’t. This happened with the motor car. Before Henry Ford’s revolution, the idea that owning and using a motor car would become a mass market open to ordinary people was literally laughable — when Ford suggested it was possible, people laughed at him. They said there was no demand for it. They were right. But the reason why there was ‘no demand’ was because cars were prohibitively expensive. The dog wasn’t barking. People weren’t noticing something that wasn’t happening: making cars affordable. When Ford made his breakthrough, the dog started barking. Instead of being an exclusive, privileged plaything of the very rich, motorised mobility was democratised. Result? Society was transformed as people started driving to work, to shops, to leisure destinations and to new homes in newly created suburbs. 
The 20th century mass ‘consumer / car economy’ was built on this breakthrough. Online search is another example. Before Google, very few people conducted searches for information because doing so was so prohibitively expensive in terms of time and effort. Another non-barking dog. Then Google made search very easy, and now most people conduct searches dozens of times a day, spawning a vast new market apparently out of thin air. Waking the data dog Today, it’s commonly said that individuals don’t want to manage their data. “There is no demand for it,” we are told again and again. But that’s because, under the current system, it’s prohibitively expensive in terms of time and effort to do so. How many people want to invest precious time, effort (and sometimes money) finding, collecting and presenting the information they need to get stuff done: filling in forms; proving facts about themselves; trying to join the dots created by organisations that work in isolated silos? No wonder they do it as little as possible. Until the opportunity to do things differently is presented to them, helping people handle their data better remains another non-barking dog. By applying those same principles of standardised parts and improved logistics — by waking that dog — Personal data stores have the potential to democratise how society uses Personal data, spreading the benefits to every citizen. Just like ‘the great car economy’, the way services are created and delivered will be transformed while making new ones possible — services that enable citizens to use their data to organise and manage their lives much better, to make and implement better decisions and to undertake associated administration across every aspect of their lives (money, home, education and career, travel, leisure etc) … all at a fraction of the previous cost and effort currently involved. 
Everything we’ve written in our last five blogs has been focused on waking the 21st century dog of low cost, high quality, mass produced, privacy protecting, Personalised, data driven services. Yet, time and time again we are told ‘there is no demand’ for a new Personal data logistics infrastructure that empowers citizens with their data, just as Ford empowered citizens with mobility. Why is this? Because they haven’t noticed that the dog is not barking. Their attention is focused on the current system as it is, not what it could be. They are simply not seeing the huge amounts of waste embedded in its workings (or if they do, they undertake ‘red tape reduction’ initiatives that reproduce the very causes of that waste) because they are not looking at the connections between the parts, only at how to improve the efficiency of each part in splendid isolation. They have not learned the lessons of mass production. A positive feedback loop All the really big service and quality of life breakthroughs of the past 100 years — including the provision of universal running water, sewerage, electricity, education and health services — have two things in common. First, they were first and foremost infrastructure projects, making something essential universally available at low cost. Second, they directly improved individuals’ lives and, at the same time, they also improved society’s health and resilience and the efficiency and productivity of the economy as a whole. They combined Personal and public benefit. Providing every citizen with their own Personal data store — another case of universal service provision — follows this pattern. The way it achieves this combined benefit is order-of-magnitude reductions in the friction, effort, risks and costs individuals and service providers incur when attempting to collect, store, share and use Personal data. 
In a post-Covid world trying hard to ‘build back better’ the need for infrastructure that enables verified data about people to be shared while protecting their privacy has never been more apparent. Government today has a once in a generation opportunity to do something truly transformative and rebuild in a way that benefits everyone. This is what the National Strategy for Personal Data can and should be aiming for.",https://medium.com/mydex/hidden-in-plain-sight-the-transformational-potential-of-personal-data-da47f666713e,,Post,,Ecosystem,Public,,,,,"Personal Data Stores,VC",,2021-04-12,,,,,,,,,,,,,
MyDex,MyDex,,Medium,,,,,,,Misconceptions that Kill Progress in Personal Data,"It is not possible to make good policy decisions about priorities for investments, grants, innovation and research projects or rules and regulations if the grounds for these decisions are faulty. Currently, effective policy making is hampered by widespread misunderstandings about where the biggest economic opportunities lie, the nature of issues such as control, and the role of citizens in the workings of the data economy.","Misconceptions that Kill Progress in Personal Data This is the fourth in a series of blogs which provide edited extracts of key points made by Mydex CIC in its response to the UK Government consultation around a new National Data Strategy. This blog focuses on common misconceptions that need to be addressed if we are to progress. Previous blogs focused on how to unleash the full potential of Personal data, on why every citizen should be provided with their own Personal data store, how to achieve these changes at scale, and the sheer size of the social and economic opportunity. To catch up on progress on our Macmillan My Data Store Pilot click here. It wasn’t so long ago that sober, intelligent, responsible adults believed the sun orbited the earth, that if you were ill the best cure was to have a leech suck your blood, and that it was the right thing to do to hunt and burn witches at the stake. They also gave patronage to alchemists and astrologists who claimed they could turn lead into gold and foretell futures. These conventional wisdoms were deeply and dangerously wrong, but the majority of the great and good went along with them without a second thought. If everybody believes it, it must be true. Right? Something similar is happening with debates about Personal data today. 
Mydex CIC’s submission on the UK Government’s proposed National Data Strategy highlighted three areas where this is the case: a naive belief in the magic cure-all powers of ‘Big Data’ and artificial intelligence; confusion about what citizens ‘controlling’ their data really means; and whether citizens are actually capable of doing so. This blog summarises what we said on these issues. Beware AI hype Many narratives about a national strategy for data follow a familiar refrain. Artificial intelligence can gain much better insights from data and make much better decisions than human beings can. The opportunities are endless. To enable the analytics that will drive these insights we need to gather together as much data as possible. There is so much wrong with this trope that we will need to devote future blogs to it. But basically, it contains three core errors. - Claims for what AI can actually do are being wildly inflated. AI is good at solving certain types of problems (which require lots of computation and where judgements based on context or values are not required). But most of the important problems our society faces do not fit this specification. - By the nature of the processes needed to deliver AI, policies designed to promote it automatically favour existing data monopolies, thereby exacerbating the extreme imbalances of power and reward that already exist. - AI operates by crunching huge amounts of data. But when it comes to actually applying any insights or decisions generated to specific individuals, we hit a transition point. At this point we are no longer dealing with ‘big’ statistical data or (misnamed) artificial ‘intelligence’, we are dealing with specific bits of data about specific people: we are dealing with Personal data. This last point needs some elaboration. Economically speaking, by far the most important and valuable uses of Personal data do not derive from ‘insights’ derived from ‘analytics’. 
Businesses like Google and Facebook may loom very large in people’s consciousness, but their business models are organised around advertising, which accounts for less than 2% of all economic activity. The really big uses of Personal data lie elsewhere: in public administration, health, financial services, education, transport, retailing and so on. The key point about uses of Personal data in these arenas is that they are operational. Service providers use data to plan, organise and implement activities and to undertake associated administration. Such activities include ensuring they are dealing with the right people, making decisions relating to eligibility and access to services, configuring service provision to the needs and circumstances of particular individuals, planning and organising the delivery of such services, keeping associated records, undertaking billing, dealing with queries, etc. A key characteristic of these activities is that completing each one successfully does not require the aggregation of large amounts of data. Instead, they each require the ability to access and use exactly the right data at the right time. The infrastructure and capabilities that make the construction of such data-driven services possible are not the aggregation of the biggest possible databases but the availability of safe, efficient data logistics: the ability to get exactly the right data to and from the right people and places in the right formats at the right times. Establishing such Personal data logistics infrastructure has almost nothing to do with ‘big data’ or ‘artificial intelligence’ and is entirely overlooked by AI enthusiasts. Personal data stores provide this infrastructure — infrastructure that is really needed to turbocharge the accelerated improvement of the big part of the data picture, operational uses of Personal data for the purposes of improved service provision. Do individuals really want to control their data? 
Most debate about the issue of citizen ‘control’ of their personal data is fundamentally confused. There are two very different types of ‘control’: - Individuals trying to control data that is collected about them and used by organisations - Individuals trying to use their data to manage their lives better, which means they have to be able to exercise control over it Virtually all current debate about ‘control’ focuses on the first, narrow, meaning. Virtually all of the value of exercising control derives from the second meaning, which is nearly universally ignored. Personal data stores empower individuals with the second type of positive control: the ability to use their own data for their own purposes; to add value in their lives. A second set of misplaced assumptions follows. When individuals seek to control data that organisations collect about them, the process is usually adversarial. The individual is trying to stop the organisation doing things with data that the individual doesn’t like. But when individuals seek to use their own data for their own purposes, on most occasions they positively want to share data with bona fide service providers because these service providers can help them add value. If a National Data Strategy frames debate and policy making about citizens controlling their data in the first narrow, adversarial way, it will never unleash the full potential of personal data. Are individuals capable of controlling their data? A related, common misconception is that individuals do not want to or are unable to exercise control over their data, because doing so is too difficult and complex. The opposite is true. Processes currently used by organisations to collect and use data manufacture complexity, thereby imposing significant burdens on individuals. 
Current burdens lie in four main areas: - Individuals are required to fill in multiple forms when seeking to access and use services, often having to provide the same information many times over, on occasion even to the same organisation. - Individuals are required to prove claims that they make about themselves (e.g. about their age or address), often involving the physical presentation of physical documents. - Overly complex and cumbersome consent processes, including privacy policies and terms and conditions that are evasively worded and difficult to read and understand. These are often deliberately made excessively complex by organisations seeking to induce individuals to consent to data sharing for reasons and purposes that go beyond accessing the data that’s needed to provide the service the individual is seeking. - The organisation-centric nature of today’s data ecosystem means that individuals have to manage each relationship with each organisation separately — so that the above burdens of information provision, proofs of claims and consent are multiplied many times over as individuals deal with multiple organisations. Most individuals have 100 or more such data relationships, so to effectively ‘control’ all their data, they need to undertake these processes 100 times over. The transaction costs are so high that hardly anyone bothers. Personal data stores greatly reduce and often eliminate these manufactured burdens. They do so by: - Standardising data sharing agreements around ‘safe by default’: only sharing data necessary for the purposes of service provision. - Automating data sharing processes so that they work for and on behalf of individuals ‘while they sleep’. For a parallel, think of standing orders and direct debits, which provide individuals with full and complete control over their money, but work in standardised, automated ways that mean the individual no longer has to think about their operation. 
- Creating single, centralised tools for the citizen for managing relationships with suppliers (equivalent to setting up, changing or ending direct debits). For example, consent management dashboards enable individuals to see and simply manage all their data sharing agreements with suppliers in one place (rather than having to log in to dozens of different separate accounts). Summary It is not possible to make good policy decisions about priorities for investments, grants, innovation and research projects or rules and regulations if the grounds for these decisions are faulty. Currently, effective policy making is hampered by widespread misunderstandings about where the biggest economic opportunities lie, the nature of issues such as control, and the role of citizens in the workings of the data economy. This is resulting in misallocation of available resources and overlooked opportunities. For a National Data Strategy to succeed, a fresh assessment of the assumptions that lie behind current debates about personal data is essential.",https://medium.com/mydex/misconceptions-that-kill-progress-in-personal-data-4736b1d883c6,,Post,,Ecosystem,Public,,,,,,,2021-04-12,,,,,,,,,,,,,
|
||
MyDex,MyDex,,Medium,,DigitalScot,,,,,MyDex CIC on working with the Scottish Government,Over the past months Mydex CIC has been [working for the Scottish Government](https://blogs.gov.scot/digital/2020/10/01/digital-identity-scotland-a-beta-industry-event/) on a strategy for implementing and scaling a system of ‘smart entitlements’ for the citizens of Scotland.,"A Way Forward for Personal Data Over the past months Mydex CIC has been working for the Scottish Government on a strategy for implementing and scaling a system of ‘smart entitlements’ for the citizens of Scotland. The Smart Entitlements concept is very simple. Its goal is to create a common, easy approach for citizens to access public services that is consistent across multiple service providers. To achieve this, it provides citizens with the ability to store their personal information in an Attribute (or personal data) Store which they own and control. Using this Attribute Store, citizens are able to collect, receive and keep verified data about themselves and subsequently allow its reuse (if relevant) when they seek to access or use other public services. Practically speaking, under the proposed system service providers would a) generate secure electronic tokens that verify facts about a citizen (such as proofs of their address, age, disability or educational qualification), b) provide these tokens to citizens to be held safely in the citizen’s own attribute/personal data store, so that c) citizens can share these tokens with other service providers, under their control, as and when they are needed. The benefits could be immense. - Citizens would have to spend much less time and effort trying to find, present and prove information — filling in endless forms — and would also find the process less stressful and yes, sometimes less humiliating. 
- Organisations could use the process to reduce unnecessary data processing, verification of data entered into forms, all the duplicated effort, and errors that come from forms, enabling them to do more with less. - And the Government itself would benefit from the creation of a safe, efficient, privacy-protecting data sharing infrastructure that would support its inclusive growth and wellbeing agenda, while providing it with the digital capabilities to respond to new situations as they arise — such as the COVID-19 pandemic. In a separate project we proved that, technically speaking, it is indeed feasible for service providers to generate verified attributes and for citizens to store them in an Attribute Store. (A ‘verified attribute’ is any piece of information about a person or performance that has been generated or checked by a responsible trustworthy body and made available to another party in such a manner as to be trustable as a specific attribute.) This approach is already being put into practice in other projects working with groups of organisations serving the same citizen. But is such an approach operationally feasible? Could it work at scale? And how to achieve that scale? A roadmap for success The good news is that we have identified a way to achieve all the above. It involves some technical things like creating metadata standards to make sure data is machine readable and to ensure clarity about levels of assurance about the data ( including descriptions of how the data was collected, created, protected and verified). But essentially, it boils down to identifying a good place to start (for example, focusing on attributes that are widely used by many services and are easy to mint and share) — and then getting on with it. Crucially, it is do-able, now. 
There are no critical technical, operational or legal obstacles stopping Scottish Government from being able to implement this approach with immediate effect which is why they are able to move ahead with their Beta Phase during 2021. The details of this approach are laid out in this report. But its core features are simple. First, Scottish Government agencies, departments and initiatives such as the National Entitlement Card, local authorities, housing associations and the Scottish Qualifications Authority have all the data that’s needed to deliver critical mass. Other bodies and agencies, such as Social Security Scotland, can always add additional data to the system as it grows and matures, as well as accessing verified attributes (with citizens’ permissions) for their own service provision purposes. By enabling these agencies and departments to share data via citizens’ Attribute (Personal data) Stores, the recommended approach minimises dependencies and maximises agency for the citizen. It can be done, now, by the Scottish Government without needing the permission or cooperation of other bodies. And it enables the Government to independently pursue its own agenda and policies. Second, it can be done without major disruption and risk. The recommended strategy does not require any significant changes to existing back office systems that provide attributes or consume them. Instead, it connects these existing systems with a new piece of infrastructure (Attribute Stores) that join the data dots, adding a new layer of capability, flexibility and opportunity as they do so. At the same time, it can proceed incrementally, taking one step at a time, so that the risks of big leaps into the unknown are avoided. Instead, it allows for a test-and-learn approach that builds momentum, impact and benefits over time. 
Third, taking this incremental step-by-step approach brings immediate benefits for the uses of data that are initially prioritised while setting the right course for the future. The more service providers, verified data points and citizens that are involved, the more it builds the infrastructure and capabilities needed to bring further benefits. Once this infrastructure and processes have been put in place, an increasing number and range of attributes and attribute providers can be added incrementally, thereby expanding the number of use cases that can be supported and benefits that can be realised. Fourth, in doing this, it generates powerful win-wins. All the key stakeholders involved — citizens, service providers and Government — benefit from doing it. The Scottish Government is now launching a Beta Phase to take this initiative to its next steps. The opportunities it presents are huge.",https://medium.com/mydex/a-way-forward-for-personal-data-6251d1503bdd,,Post,,Ecosystem,Public,,,,,,,2022-11-12,,,,,,,,,,,,,
|
||
MyDex,DigitalScot,,,,,,,,,MyDex is working with the Scottish Government,MyDex is a community interest company that has been working on building *real products in the real world*. They [wrote about the ongoing work](https://medium.com/mydex/proving-verified-attributes-work-3f9ca813d43f) enabling public sector organizations to give citizens verifiable attributes they keep in their own data stores and can prove to other parties without the issuing organization in the middle.,"Digital Identity Scotland – Prototype draws to a close, May 13, 2020, by Digital Identity Scotland. Mike Crockart, Delivery Lead for the Digital Identity Scotland Programme, provides an update as the work on a prototype draws to a close… “The pandemic and our country’s response has rightly dominated all of our lives in recent weeks. For many of us it has meant a change to where we are working; how we are working; what we are working on; and for some whether we are working at all. For Digital Identity, however, it has brought into sharp focus the potential benefits of an improved method for digitally proving identity or entitlements in a new world. Particularly where accessing services online may not only be the easiest but also the safest way to do so. As a result, we have continued our work – albeit remotely – and it has been an exciting time for the programme, building and testing an attribute-led approach to support simple, safe and secure access to public services. In partnership with Mydex CIC and DHI (Digital Health and Social Care Institute), we have successfully developed a fully working prototype, including linking a separate credential provider (Okta UK Ltd) as an example authentication service. This has proven the technical feasibility and provided ample opportunity for testing the usability of the proposed service. 
With Young Scot and Independent Living Fund as example Relying Parties, we developed associated user journeys and iterated based on user feedback. The goal is to simplify access to services and reduce tedious processes for users, for example, repeatedly providing personal information that has been verified as accurate elsewhere, such as age or disability, whilst maintaining high levels of privacy and security. A key aim of the prototype was to test concepts with users. We wanted to know how they interpreted a Scottish Government branded authentication credential (registration/login) and if they understood that it is reusable across the whole of the Scottish public sector. We also introduced the concept of creating a citizen-controlled attribute store into which they could add verified information, and could then choose to share selected trusted facts about themselves with other organisations to speed up application processes. Our findings were that users broadly understood that the credential was reusable across services; users were familiar with 2-factor authentication via SMS and authenticator apps, though many regarded this as an inconvenience despite awareness of the benefits to security; and there was generally support for creating and using an attribute store. However, it became clear early on that we need to do more to explain the difference between facts asserted by users about themselves and facts verified as being true by a trusted third party. We will also work with users to understand how best to outline the benefits this will give them and service providers in speeding up their access while reducing the risk of their privacy or security being breached. The prototype has added a great deal of knowledge to the wider Digital Identity Programme around technical feasibility, usability and user perceptions. 
Our plan now is to conduct more user research with the prototype (which we will have access to for 12 months), testing and iterating associated wireframes. This will help us understand how best to convey the attributes model to users so they have full trust and confidence in using the service, particularly as to how it gives them full control of their data. We will be looking to test with a broader demographic to understand whether this impacts findings. There is also more work to do to ensure the service is fully inclusive and promote how the service removes barriers to access for all. I touched on the COVID-19 outbreak above. As a result of it, user testing moved from being conducted face to face to remotely, and impacted the number of users we were able to test with, particularly ILF users, so we look forward to continuing this testing over the coming weeks and months. We have recorded demonstrations of each prototype iteration to seek insight from stakeholders and inform development work with relying parties. I look forward to now moving into the next phase of the programme where we take all we’ve learnt into a plan for developing a Beta/Live service. We will be further engaging with providers of public services across Scotland that might be both relying parties and attribute providers, as we look to build a simple, safe and secure digital identity service that removes friction, effort, risk and cost to both individuals and public organisations – a lasting mantra from David Alexander at Mydex CIC.” Hilary Kidd, Smart Services Director at Young Scot, adds: “At Young Scot, we are thrilled to have ensured that young people have played a key role in user-testing and informing the development of the prototype as part of a potential attribute-led approach. 
We are looking forward to working closely with Digital Identity Scotland and partners in the next phase.” Next steps We will publish final reports and other documentation about the prototype in the coming weeks and will be scheduling a virtual National Stakeholder Group meeting to seek views. We also plan to hold an Industry Engagement Day in early summer. This would be for potential suppliers of all components of a future Beta/Live service e.g credential providers, brokers, attribute providers and identity providers. We will publish more details and invitations here in due course. Get in touch! As always, please feel free to get in touch at digitalidentityscotland@gov.scot if you have any feedback or questions. Tags: attributes, digital identity, digital public services",https://blogs.gov.scot/digital/2020/05/13/digital-identity-scotland-prototype-draws-to-a-close/,,Post,,Ecosystem,Public,,,,,,,2020-05-13,,,,,,,,,,,,,
|
||
MyDex,MyDex,,Medium, Alan Mitchell,,,,National Data Strategy,,Not Just Personal Data Stores,This is the fifth in a series of blogs which provide edited extracts of key points made by Mydex CIC in its response to the UK Government [consultation around a new National Data Strategy](https://www.gov.uk/government/consultations/uk-national-data-strategy-nds-consultation). This blog focuses on the main ingredients needed to unleash the full potential of personal data — in addition to personal data stores.,"Not Just Personal Data Stores This is the fifth in a series of blogs which provide edited extracts of key points made by Mydex CIC in its response to the UK Government consultation around a new National Data Strategy. This blog focuses on the main ingredients needed to unleash the full potential of personal data — in addition to personal data stores. Previous blogs focused on how to unleash the full potential of personal data, why every citizen should be provided with their own personal data store, how to achieve these changes at scale and common misconceptions that derail progress. To catch up on progress on our Macmillan My Data Store Pilot click here. If you’ve read the other blogs in this series you might think everything we’re saying boils down to a single recommendation “provide every citizen with their own personal data store”. That’s definitely pivotal. But it’s not the whole story. Empowering citizens with their own personal data stores only works in the context of a broader ecosystem, where other parties contribute other functions. Adding genuine personalised value The most obvious of these is bona fide service providers, who use data to provide valued services. A personal data store is a generic enabler; a piece of infrastructure. Every bona fide service provider is a domain expert in their particular field. That is why they exist. To provide their services they need to access and use the right data, and they need their own systems to do this. 
Personal data stores do not eliminate or replace existing organisation-centric systems. They add another complementary layer. We explore some of the implications of this here. A second obvious fact is that value isn’t generated by personal data in isolation. Usually, it’s generated by making the right connections between and combinations of personal and non-personal data. A simple example: information about the weather is utterly impersonal. But a weather service that provides a forecast for where I am right now, and where I intend to travel tomorrow may use personal data to add value: my location and my future plans. Many of our problems today are created by providers of such services using the need to connect personal with non-personal data as an excuse to grab personal data that they can monetise for other purposes. Manipulation of the content and presentation of privacy policies and terms and conditions for these ends has to stop. But the personal data-driven connections and combinations are essential to ongoing value creation. This is generating the need for two important bits of enabling infrastructure: - ‘smart directories’ that create connections between the individual’s personal data (my particular circumstances, interests, plans etc) and external data about the world out there, so that all possible extraneous noise is filtered out and only the right information is presented. This is the essence of data logistics: exactly the right information flowing to and from the right people at the right times. - ‘smart search’. Currently, search services are driven primarily by search term plus (perhaps) a tiny bit of filtering that a search engine conducts using information it has gathered about you. Usually, this filtering isn’t done to help you find what you want, but to herd you in the direction that is most profitable for the search engine. 
Smart search is a genuine service to the individual, using information about the individual that the individual is happy to volunteer (and only that information, almost certainly behind a shield of anonymity) to help filter search results so that the individual finds exactly what they are looking for as quickly as possible. Smart search is a big ask that would turn the existing industry on its head. But it is desperately needed, and only made possible by the prior existence of a personal data store infrastructure. Additional enabling infrastructure For such services to work efficiently and effectively, further enabling infrastructure needs to be developed. Data directories In a ‘MUMU’ world, where there are many users of data using data from many different sources, service providers need to know what data exists and where to find it. For this, we need data directories. Personal data directories create their own special design requirements: they need to protect rather than abuse individuals’ data and invade their privacy. They should not hold the data itself, but provide the routing to that data along with information relating to the nature and type of data that is available. In the context of public services, in the long term this directory should list all of the personal data held by the Government and where this data is held. It should also specify what level of assurance relates to that data (for example, was it generated for the purposes of an email newsletter or to verify a person’s identity?) The directory should be seen as a service to both service providers and citizens, being used to signpost citizens to where they can obtain the attributes they need to complete applications and other processes where they need to provide or point to data. Many technicalities need to be addressed. For example, a data directory might not point the searcher to the original source of the data, but to another place where the data is easier to access. 
The directory will need to make clear how, and on what basis, the attribute in question remains up to date and valid (e.g. is updating automatic, periodic or occasional?). A personal data store itself could publish its own directory entries; the same approach could apply for any data source. Metadata standards For data’s full potential to be unleashed, it must be reliable and trustworthy. This creates the need for large amounts of ‘metadata’ — data about the data that is being presented and used. For example, a data user may want or need to know how the data was collected, created, protected and verified. At all times, users need to have confidence that the data they are using is correct and verified and not modified since being generated, whether in storage or in transit. They also need to know how to link it with other data and if necessary synchronise the timelines of data. To avoid the huge costs and complexities created by the need to check these things — or to regenerate data where it is not possible to check these things — we need metadata standards: mechanisms by which ‘data about this data’ flows with the data to establish its trustworthiness and reliability. As with data directories, many technicalities need to be addressed. Data points need to be defined unambiguously and precisely targeted to the context of the transactions they are used in. The metadata should be machine readable and avoid concatenation of data, rather than dealing with specific attributes and composite attribute structures. (For example, machine readable date of birth as opposed to a composite of date of birth, postcode and full address.) There needs to be clear, standardised processes and procedures for accessing attributes. And so on. If you are interested, the strategy report Mydex CIC wrote for Scottish Government on its proposed Smart Entitlements programme examines these technicalities in detail. 
There is even work being undertaken on this as part of the Identity and Attributes Trust Framework Alpha in the policy paper ‘Understanding attributes’, which must surely be influenced by and influence the national data strategy. Real as opposed to ‘faux’ strategies Successful strategies do many things. They have a clear idea of what ‘good’ looks like and of how to get there, including who needs to do what and how these different parties’ contributions are going to be realised and coordinated. They need a sense of order — identifying those things that need to be put in place first, thereby enabling other things to happen later. They need to be crystal clear about what is not in scope or allowable as well as what is. Yet all too often, we are presented with faux ‘strategies’ that are nothing more than pious wish lists of things that would be nice, along with some vague descriptions of resulting things that would need to happen. Often they are worded in deliberately vague ways so as to please as many parties and create as much wriggle-room as possible. If you want a good example of faux strategy, take a look at the UK Artificial Intelligence Council’s ‘AI Roadmap’. The UK’s citizens need and deserve a genuine personal data strategy, formulated and implemented to achieve genuine, lasting benefits. This blog has outlined some of the ingredients of such a strategy.",https://medium.com/mydex/not-just-personal-data-stores-f2070eada6be,,Post,,Ecosystem,Public,,,,,,,2021-06-25,,,,,,,,,,,,,
|
||
MyDex,MyDex,,Medium,,,,,,,Public Spending: One Way to Solve Many Problems,"“We will identify where there may be shared interest, duplication or overlap in intended policy outcomes over multiple portfolios. Where there is, we will look to develop a more effective and efficient cross-government solution.”","Public Spending: One Way to Solve Many Problems Like most governments, the Scottish Government is currently going through the difficult process of deciding how to spend a shrinking amount of money on a growing list of problems. Given its “tight budgetary envelope”, says the Government’s Investing in Scotland’s Future paper, “it is imperative that public spending is deployed as efficiently and effectively as possible” via policy interventions that help “maintain the affordability of public services over the medium to long-term”. Within this context it is seeking to prioritise three things: - To support progress towards meeting poverty targets - To address climate change - To secure a stronger, fairer, greener economy Now, we would say it wouldn’t we? But it’s true (and it applies to almost any Government you can think of, not just Scotland). On every issue identified — whether it’s efficiency savings in public spending, maintaining the affordability of public services, tackling child poverty, addressing climate change, or securing a stronger, fairer, greener economy — one, single action could help address them all: building a personal data logistics infrastructure that empowers citizens and enables the safe, efficient sharing of the data citizens and service providers need to access and provide public services. 
Guiding principles The Scottish Government says its approach to these challenges will be “heavily informed” by the principles of the Christie commission’s report on the future of public services, which says the Government should: - Empower individuals and communities - Integrate service provision - Prevent negative outcomes from arising - Become more efficient by reducing duplication and sharing services Providing each individual with their own personal data store, which they can use to safely collect and store data about themselves, and share it under their control, would do all these. By enabling the right data to get to and from the right people and organisations at the right times, this citizen-empowering data infrastructure could radically reduce transaction costs for both citizens and service providers. It would both cut costs and enable service improvement. Let’s give a quick run through those key headline priorities. Improving service efficiencies In work previously done by Mydex CIC for the Scottish Government we showed how a system by which public service providers deposit ‘verified attributes’ (cryptographically secure tokens confirming facts about people) in their personal data stores and that enables citizens to share this data as and when needed, could strip out huge amounts of friction, effort, risk and cost for both citizens and public services. In particular, it would help eliminate endemic duplication of effort (e.g. recreating or checking data that has already been generated and confirmed, or simply rekeying data that already exists) and errors caused by inaccurate or out of date data (including the cost of rectifying those errors). This system works like this: 1. Services holding different types of data about an individual deposit verified copies of this data in the individual’s personal data store. 2. This data remains under the individual’s control in their PDS, and is kept up to date and accurate via a secure API link. 3. 
When the citizen needs to provide some of these data points to a different service provider they simply say ‘Yes’, and the data can flow accordingly — enabling them to bring their data with them to new service relationships, without having to fill in forms. Scottish Government has already formally recognised the power of this approach by deciding to establish its new Scottish Attribute Provider Service, which is based on these principles. Yet from what we can see, the potential of this new service is being overlooked, remaining unmentioned in the Government’s spending review. This needs to change. This is something that is entirely within the gift of the Scottish Government, and has already been agreed. Now all that needs to be done is for it to be implemented. Tackling child poverty Current approaches to help families in poverty are grossly inefficient. Much attention has been paid to the ‘poverty premium’ where those in poverty have to pay more for basics such as energy because they cannot fit into organisations’ preferred administrative systems — e.g. paying by meter rather than by direct debit. But hardly any attention is paid to the ‘poverty punishment’ — the fact that people in poverty are burdened with significant amounts of extra friction, effort, cost and stress in their lives because, to access the support they need, they need to jump through multiple hoops finding out about and seeking to access sources of support, finding and proving the information needed to prove eligibility, and repeating the same basic processes (including providing basically the same information again and again) each time they wish to apply for a new service. As a result, many people do not apply — or do not complete applications — for services they are entitled to. Which means they are punished if they do (taking on extra time, effort and stress) or if they don’t (missing out on services that they are entitled to). 
This is compounded by the fact that, as Scottish Government reports have emphasised, poverty is complex and multifaceted. People in poverty face multiple challenges, not just one challenge. They are quite likely to have issues relating to employment, mobility (e.g. the difficulty of getting to work without a car), childcare, housing, physical and mental health, etc. — and every single one of the services addressing these issues operates as a separate silo, which means the situation faced by the family is never addressed in a joined up way. Using the infrastructure we’ve just talked about, the Scottish Government could provide those wrestling with poverty with the means to easily and quickly find out about available sources of support, access and provide the information they need to prove eligibility for benefits and services while cutting service providers’ administration and service delivery costs. It enables poverty to be addressed as a whole, in a joined up way, efficiently and effectively. This is something that it is within the power of the Scottish Government to do, now. And it is something that would quickly and noticeably improve people’s lives whilst making services more efficient. Why not do it, now? Addressing climate change We’ve talked about the crucial role of data in helping to tackle climate change here. The key point is that, as well as helping to reduce paper use, providing much better data logistics — getting the right information to and from the right people and organisations — is the organising tip of a huge iceberg of physical logistics, because it is data that is used to organise the delivery of physical services. The more efficiently data is used, the bigger the multiplier effect on physical, carbon-emitting activities.
Amory Lovins, the veteran energy campaigner, notes that while attention naturally focuses on renewable energy replacements for fossil fuels (which of course are essential), the biggest immediate carbon emission reductions could come from re-designing the ways in which existing systems and services work. For example, he explains, far less energy is needed to pump heat or cold through fat, straight pipes than skinny, long and crooked ones, because there is less friction. “In our house we save 97% of the pumping energy by properly laying out some pipes. Well, if everyone in the world did that to their pipes and ducts, you would save about a fifth of the world’s electricity, or half the coal-fired electricity.” Using better data logistics to cut waste from decision making and implementation is a key part of the system redesign we now need. It is just waiting to be done. Securing a stronger, fairer economy The same Personal data store-based data logistics infrastructure is also key to the creation of a stronger, fairer economy. - Stronger: This infrastructure enables individuals to amass rich new sets of data about themselves in their Personal data stores; data assets that simply weren’t possible to create when individuals’ data was dispersed across multiple separate organisations’ data silos. The infrastructure enables individuals to bring these rich new data assets with them to their relationships with service providers — data that will inform research and innovation in ways that were simply not possible before. - Fairer: The same infrastructure pre-distributes the power of data by enabling individuals to build their own Personal data assets independently of the organisations that have traditionally collected data about them. Instead of concentrating data power primarily into the hands of large private corporations, Personal data stores build citizen-inclusion and empowerment into how a data-driven economy works.
Conclusion In its Review the Government says: “We will identify where there may be shared interest, duplication or overlap in intended policy outcomes over multiple portfolios. Where there is, we will look to develop a more effective and efficient cross-government solution.” The Personal data infrastructure we have described does exactly that. It also says: “As well as challenging portfolios, we will also examine discrete opportunities for longer-term, large-scale public service reform and transformation that leads to both beneficial outcomes for our citizens and the realisation of more fiscally sustainable delivery mechanisms.” In her recent speech to the National Economic Forum, Scottish Finance and Economy Secretary Kate Forbes said that: - Scotland needs to be “a country that boosts productivity” - “We need everybody to be empowered to participate, and everybody to share in the successes that we have” - “We need to ensure we redesign services from the perspective of their users and whilst big talk about delivery might not capture the headlines, it is the delivery that is absolutely critical” What more can we say? The citizen-empowering data logistics infrastructure we are talking about fits all these goals and criteria exactly. We would say it, and we are saying it. Whichever way we turn on the Scottish Government’s public spending review or national economic strategy, there is one, single, simple, low-cost thing it can do to make progress on all fronts at the same time: invest in the infrastructure we’ve been talking about. The Scottish Government has already recognised this infrastructure’s potential, with its decision to go ahead with SAPS, as discussed above. All it needs to do now is recognise how strategically important this move is — and to build it into its spending and investment priorities for the years ahead.",https://medium.com/mydex/public-spending-one-way-to-solve-many-problems-3ac394e46a9e,,Post,,Ecosystem,Public,,,,,,,2022-04-07,,,,,,,,,,,,,
MyDex,MyDex,,Medium,,,,,,,Deploying Personal Data Stores at Scale,"The big question now is how to enable this to happen at scale, safely, securely and efficiently. One key element of this is useful, easy-to-use interfaces, the taps and switches that mean people can use the infrastructure without having to think much about it.","DEPLOYING Personal DATA STORES AT SCALE An important change is beginning to sweep the world of Personal data. For many years, people have debated the question ‘what if individuals — e.g. citizens, customers — were able to assert genuine control over their own data?’ Now the debate is moving on to how to make this happen, at scale. Look at some recent developments. In recent weeks: - The UK Government’s proposed legislation to reform UK data protection laws includes (amongst many less positive things) new provisions for data intermediaries, including “Personal information management systems, which seek to give data subjects more control over their Personal data.” - The UK Department of Culture, Media and Sport’s latest paper on its new National Identity and Attributes Trust framework specifically mentions citizens being able to hold verified attributes in their own Personal data stores. - The Scottish Government’s proposed Scottish Attribute Provider Service includes a provision “where people can choose to save their Personal information securely to an individual ‘locker’ (a digital attribute store), in order to reuse when they wish to apply to other services”. - The UK Government, via its Government Digital Service, is to provide a One Log-in for Government which includes the concept of a Personal data store to enable citizen control over how their data is being shared across Government.
Tom Read, GDS Chief Executive Officer, said that “One Log-in for government is currently the organisation’s most important piece of work.” - Scottish Government has just signed a contract with Mydex CIC to improve recruitment of citizens to participate in projects to co-design public services, ensuring privacy via the use of its Inclued platform and Personal data stores. - The UK NHS and BBC are now experimenting with Personal data stores for health and media consumption records. In other words, multiple different parties and people are converging on the same solution — of providing citizens with their own Personal data stores — to solve multiple different problems. The big question now is how to enable this to happen at scale, safely, securely and efficiently. One key element of this is useful, easy-to-use interfaces, the taps and switches that mean people can use the infrastructure without having to think much about it. We’ve written about this here. But operational deployment at scale presents its own challenge. It’s one thing to build something in a lab to illustrate an idea’s potential. It’s quite another to make the transition to 24/7/365 operations, working at scale in the real world. Answering the question ‘how’ requires robust answers to many hard questions relating to infrastructure resilience, security, system architecture, governance, trustworthiness, business model and legal compliance. Here’s a checklist of these questions in a little more detail. Are its underlying design principles fit for purpose, robust and built to last? We talk about this issue in detail here. Is the individual’s data really secure? It’s very easy to make promises about data security, but very difficult to keep these promises permanently, especially when a system is operating at scale. Here are some of the safeguards that need to be built.
- All data should be encrypted in motion and at rest - Distributed architecture: every PDS is separately and individually encrypted (which means the system is not creating a massive centralised database that becomes a honeypot for hackers) - No knowledge operations. Every individual should hold their own private key to their own data: the PDS operator does not have access to this private key, and cannot look into the Personal data stores it provides or make any changes to who can send or collect data from it; only the individual can. - Everything the company does relating to the security of information management should be independently assessed and certified. To be legitimate, the PDS provider should be certified under ISO 27001 for information security management. The Mydex platform meets all these criteria. How does the PDS operator cover its costs and make its money? - Is the PDS’s business model open and transparent? Does it, for example, publish a public price tariff, where what organisations are paying, for what, is open for all to see? - How does this business model affect the PDS provider’s incentives? For example, some PDS providers have generated business models where they take a ‘cut’ every time data is shared. This generates an incentive for the PDS provider to maximise the amount of data that is shared, thereby creating a potential conflict of interest between it and the citizens it is supposed to be serving. To make their offerings attractive to organisations that want to retain control, other PDS providers have created halfway-house ‘Personal data stores’ which remain inside the organisation’s operational boundaries, where the individual signs in using the organisation’s systems, and where the organisation places restrictions on what data the individual can share with whom.
Such faux Personal data stores may generate revenue streams for the technology provider, but they generate a conflict of interest with the citizen that defeats the object of having a Personal data store in the first place. - Does the PDS provider’s business model create revenue streams that are stable, e.g. designed to last in perpetuity? Mydex’s business model is designed to be open, to avoid conflicts of interest and to be stable. The model is very simple. Organisations pay a fee to connect to the platform, to enable safe, efficient data sharing with individuals. There is no limit on what data can be delivered or accessed, by whom, or for what purpose; that is under the control of the individual at all times. Does the PDS provider have governance structures designed to ensure its trustworthiness in perpetuity? In a new ‘market’ like this, many would-be PDS providers are start-ups that are hungry for funding. Many of them seek funding from venture capitalists who, by definition, are seeking ‘an exit’ in the form of an IPO or trade sale. This brings two dangers. First, it incentivises the PDS provider to create a business model that focuses on financial extraction — making money out of citizens and/or organisations — rather than genuine service provision. Second, it means that any promises it makes in terms of commitments to privacy, data protection, business model or anything else may only last until the venture is sold. For a PDS provider to be legitimate, its business model and governance must include legally enforceable guarantees that mean it cannot simply go back on its promises in the event of ownership of the organisation changing hands. That is why Mydex has chosen to be a Community Interest Company — because CIC status builds in legal requirements on the company to stay true to its mission of empowering citizens with their own data. Is the IT infrastructure robust and capable of operating at scale? Many people operating in IT today have a ‘hacker mindset’.
They love writing bits of code that do cool things, and every time they come across a new technical challenge they automatically start writing another, separate bit of code. Hackers are often brilliant at creating cool point solutions. But as these point solutions add up, they generate complexity and chaos. People with the hacker mindset are not good at building robust, integrated, efficient solutions that operate at scale. For that, we need an engineering, infrastructure-building mindset that is always asking ‘how does this piece fit with everything else that has already been built? Will it work stably and reliably, at volume?’ Can the system scale without generating mounting risks, costs or complexity? - Providing a million Personal data stores that are being used to store and share data, day in and day out, is very different to building a demo in a lab. Having robust software development, testing and deployment systems is essential if the flaws of the hacker mindset are to be avoided. - If the system can only work on a particular device such as a smartphone, everyone has to have access to such a device, these devices need to be designed so that they can ‘talk’ to each other, and problems arise if the device is lost, stolen or malfunctions. The only way millions of people can access their data from multiple different devices is if their data is stored (safely) in the cloud. - Some ways forward, such as Open Banking, envisage individuals giving permission for their data to be ported from one service provider to another without it being deposited in the individual’s Personal data store. This, proponents claim, cuts out the unnecessary extra step of having a data ‘middleman’. The approach works fine for just one or two transactions. But it creates complexity and cost catastrophes as volumes rise.
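The cost catastrophe described above is simple combinatorics: point-to-point data sharing grows with the square of the number of participants, while routing through a hub grows linearly. A rough illustration (hypothetical Python, not from the original post):

```python
# Point-to-point: every pair of parties builds and maintains its own connection.
def direct_links(n):
    return n * (n - 1) // 2

# Hub model (telephone-exchange style): each party keeps a single
# connection to the exchange layer, e.g. a Personal data store.
def hub_links(n):
    return n

# With 100 participating services:
print(direct_links(100))  # 4950 pairwise connections
print(hub_links(100))     # 100 connections via the hub
```

The gap widens rapidly as volumes rise, which is the point being made about Open Banking-style direct porting.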
It’s why (for example) telephone exchanges were invented rather than every telephone line trying to create its own unique connection with every other line. Independent scrutiny and certification It’s very easy for start-ups to make grand claims about what their technology can do or what their beliefs are. Selling ‘brochureware’ and ‘vaporware’ is a time-honoured practice in software: Step 1) Sell what you intend to make. Step 2) Use the money made from these sales to actually make what you promised. But an operation that works day in, day out, at scale cannot be fed by ‘vision’ and the apparent confidence of the salesman. What’s needed is independent scrutiny and certification. That’s why Mydex is independently certified for data management security under ISO 27001 and with Fair Data, and why it has met the requirements to be listed on UK Government procurement frameworks like G-Cloud and to gain an Open Banking licence. Built-in regulatory compliance For any system to scale efficiently it has to make it easier, not harder, for service providers to comply with data protection regulations. This requires dedicated tools and infrastructure that a) ‘designs in’ compliance with key principles such as data minimisation under GDPR and b) enables both citizens and service providers to manage related processes simply, quickly and where possible, automatically. Leaving compliance with data protection regulations to a ‘different department’ creates a gap and disconnect between ‘legal’ and operations, and is not something that can work efficiently and effectively at scale. Summary We know, because we’ve been there. Bringing new ideas to life in a lab environment is a positive, necessary thing to do. But making sure they can be implemented at scale, robustly, reliably and resiliently involves another — very different — set of considerations.
This blog sums up our experience of what these considerations are.",https://medium.com/mydex/deploying-personal-data-stores-at-scale-ad35fb205e73,,Post,,Explainer,,,,,,,,2021-12-30,,,,,,,,,,,,,
MyDex,MyDex,,Medium,,,,,,,Design Principles for the Personal Data Economy,"A key part of this is continuity and longevity: a Personal data store is for life, so the institutions providing Personal data stores should be designed for decades (centuries, even). Whatever particular corporate form they take, legal safeguards relating to continuity and longevity of purpose need to be built into how they operate.","Design Principles for the Personal Data Economy Last month we were asked to give a talk in Korea on the design principles for a new Personal data economy. We’ve turned this talk into a White Paper, published here. It’s important because, in our view, most current debate about Personal data is based on a philosophical category error that inevitably leads to a dead end. If you want to solve a problem you need to know what’s causing it. If you’ve got a heart problem, no matter how many pills you take for gastric wind you won’t solve it. When it comes to Personal data, most discussion assumes that our problems are caused by how organisations handle the data they collect and use. They are not. They are caused by the design of the system these organisations are operating within. ‘System design’? That sounds rather abstract and airy-fairy. Not for practical people. But it’s not abstract or airy-fairy at all. Birds are living systems designed to fly in the air. Fish are designed to swim in water. If you ask a fish to fly you won’t get very far because that’s not what its system is designed to do. Today’s Personal data system is designed to help organisations collect and use data to further their particular purposes. It is not designed to help citizens get the benefits of their data, to use data to address social ills, or even to help economy-wide innovation and growth (which may involve the creation of new and different organisations). 
In fact, with the way our current system works citizens have little or no control over their own data and are not benefiting as they should from it. Service providers often lack access to the data they need and face high costs in accessing and using it. And the system as a whole experiences high costs along with low levels of trust and restricted innovation. But these problems are not created by the behaviours of individual organisations in isolation. They are created by the way the system itself works. Its design: the fact that it is organised solely around organisations collecting and using data for their own particular purposes. We simply won’t solve the problems we now face by asking organisations to behave differently; by asking fish to fly. We can only solve them by introducing a different system design. Fitness for purpose We need a system for Personal data that unleashes its full Personal, social and economic potential; that goes beyond only helping organisations achieve their purposes. To achieve this we need to design a new system that’s fit for a broader range of purposes; that’s designed to fly rather than swim. These design principles are not random, plucked from the air because we think they might be nice. It’s no accident that all fish are sleek and that birds have wings. These designs fit what they need to do. So what fit-for-purpose design principles do we need to adopt? For a start, we need to build on the fact that unlike other commodities, when data gets used it doesn’t get used up. It can be used again and again, for many different purposes. We expand on this theme here. This being the case, it makes no sense to restrict access to data to just one organisation with one set of purposes. We need to break out of the confines of organisation-centricity, to enable data sharing. However, if you try to create a system where all organisations try to share data with all other organisations, you quickly create a complexity catastrophe. 
To solve this problem you need fit-for-purpose design. If data about each individual is deposited in the individual’s Personal data store, then these Personal data stores can act as hubs for the safe, efficient sharing of Personal data. This is effectively a new layer of data infrastructure that transforms the design of the system. It turns multiple separate, independent, exclusive and excluding silos of data collection and use (today’s organisations) into a networked data sharing ecosystem with three core functional layers: - Service providers collecting and generating Personal data (acting as ‘data factories’ if you like) - A new exchange layer made up of Personal data stores where data is stored and distributed under the individual’s control (data warehouses holding what the factories have produced) - The same and other service providers accessing and using this data to produce specific services. This means that our system’s current operating principle — that ‘the organisations that use data always store the data in question in their own systems’ — increasingly makes way for a different operational principle: the separation of data storage and use, where Personal data stores store individuals’ data and specialist users access this data when they need it. (Like going to a shop when you need bread rather than baking it yourself.) A key element of this new system design is that it is now largely citizen- or person-centric. Personal data stores a) give each individual control over the collection, sharing and use of their own data, and b) act as the natural point where all information about them is aggregated — information that was previously dispersed across hundreds of different, separate organisations collecting data about them.
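The three layers just described — providers depositing data, a Personal data store holding it under the individual’s control, and other services accessing it only with consent — can be sketched in a few lines. This is a minimal, hypothetical Python model; the class and method names are illustrative assumptions, not any real PDS API:

```python
# Minimal sketch of the three-layer model (all names hypothetical).
class PersonalDataStore:
    def __init__(self):
        self._attributes = {}   # data deposited by service providers
        self._consents = set()  # (requester, attribute) pairs the individual approved

    def deposit(self, provider, name, value):
        # Layer 1 -> 2: a service provider deposits a verified attribute.
        self._attributes[name] = (provider, value)

    def grant(self, requester, name):
        # Only the individual can approve sharing.
        self._consents.add((requester, name))

    def fetch(self, requester, name):
        # Layer 2 -> 3: data flows out only with the individual's consent.
        if (requester, name) not in self._consents:
            raise PermissionError('no consent granted')
        return self._attributes[name][1]

pds = PersonalDataStore()
pds.deposit('council', 'address', '1 High Street')
pds.grant('bank', 'address')
print(pds.fetch('bank', 'address'))  # the bank receives the verified attribute
```

The key design point is that the store, not the depositing organisation, is the point of integration: any number of providers can deposit, and any number of consumers can fetch, but every flow passes through the individual’s consent.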
This new structure not only empowers citizens with their own data (thereby shifting balances of power and reward within the data economy), it also creates new person-centric data assets that are impossible to create under the old structure and which are natural sources of future innovation and growth. The economics of all this is another question which we will return to shortly. Right now, our focus is the design principles themselves, which fall into four main areas: system architecture, infrastructure, governance, and economic/business model. The following tables summarise their key points. The White Paper itself explains their logic in further detail. Architecture Today’s data economy is organised exclusively around organisations, disperses citizens’ data across many different organisations, excludes citizens from its workings (with organisations collecting and using data about them), operates via separate data silos that act like moated castles of data control, and integrates the collection and use of data behind these castle walls. The new data economy will enable citizens to collect and use their own data (as well as organisations), make individuals the point at which data about themselves is integrated, and place this new data asset directly under the citizen’s control. It will connect today’s data silos via a new ‘exchange layer’ of Personal data stores that enables citizen-controlled data sharing and increasingly separates the storage of data from use. Infrastructure Today’s organisation-centric approach to Personal data use creates multiple separate data silos where data is concentrated in a small number of centralised databases. Citizens’ control over their data is restricted to (often only formal) rights to consent to data collection and use. By the very way it operates it is privacy-invading: organisations get to know a lot about the individuals they hold data on.
The new infrastructure is distributed, with multiple nodes (Personal data stores) where individuals can exert real, direct control over their data. Because each Personal data store is individually encrypted, with only the individual holding the key, it operates on a zero-knowledge (rather than privacy-invading) basis. It has a technology-agnostic, cloud-based approach where PDS providers ensure key functions such as interoperability. Governance The critical governance challenge is that trustworthiness cannot be reduced to an organisation’s policy or promise: a policy that may change at any time or a promise that may be broken at any time. Commitments to ensure data security, privacy, etc. need to be permanent and legally enforceable, embedded in the constitutions of the institutions providing the data infrastructure. (This relates to the ‘built to last’ principle — see below). In today’s data ecosystem, many organisations seek to gather and use Personal data for the purposes of competitive advantage. To achieve this, they restrict who has access to the data. For infrastructure designed to empower and include citizens and to enable data sharing for wider, improved data use, while competition between different service providers can be as intense as ever, the data infrastructure needs to be neutral: it must be designed to enable all legitimate users, without favour. Economic logic and business models To sustain themselves in a way that maintains and fulfils their function, Personal data store providers need to be able to cover their costs. This ensures their independence from external parties who may wish to use control over the purse strings to exercise control over purposes. Because the purpose of this infrastructure is to enable others’ actions rather than to make and sell a specific ‘product’, its prime economic logic and benefit is ‘cost out’ rather than ‘added margin’.
All its financial and economic incentives need to be designed to ensure this sustainability, neutrality and mission alignment. A key part of this is continuity and longevity: a Personal data store is for life, so the institutions providing Personal data stores should be designed for decades (centuries, even). Whatever particular corporate form they take, legal safeguards relating to continuity and longevity of purpose need to be built into how they operate.",https://medium.com/mydex/design-principles-for-the-personal-data-economy-f63ffa93e382,,Post,,Explainer,,,Design,,,,,2021-10-25,,,,,,,,,,,,,
MyDex,MyDex,,Medium,,,,,,,Designed for Integrity,"Below are some of the design principles that underpin our infrastructure and services — principles designed to ensure that what we do truly does serve citizens, today and into the future.","DESIGNED FOR INTEGRITY In our last blog we talked about diverse, practical ways in which we help people and organisations use Personal data to get stuff done — to enrich people’s lives and improve service quality in a way that cuts friction, effort, risk and cost for both sides. The tools and capabilities we discuss in that blog are great. But to be truly valuable, they also need to be robust and ethical. They need to be based on sound design principles. Below are some of the design principles that underpin our infrastructure and services — principles designed to ensure that what we do truly does serve citizens, today and into the future. - Safe, efficient data logistics Our Personal data stores (PDSs) use APIs to enable safe, efficient data sharing between individuals and organisations. Organisations can deposit data in an individual’s PDS, and individuals can share data with these organisations. Our PDSs don’t only provide safe storage facilities. They also provide safe, efficient data sharing capabilities which keep the individual in control. We call this ‘Personal data logistics’. - Individual the point of integration Our Personal data stores enable individuals to aggregate data about themselves that is currently dispersed across many different service providers. For example, most UK citizens have over a dozen relationships with different financial service providers: one or more banks and building societies, loan providers, mortgage providers, savings providers, investment services, pensions, insurances and so on.
It’s only by aggregating data from all these service providers (and by adding additional information that only the individual knows) that it’s possible to gain a true, fully-rounded picture of an individual’s financial circumstances, and therefore to be able to give truly Personalised, relevant advice. That is why we applied for and got an Open Banking licence. Our infrastructure is designed to enable the provision of such genuinely Personalised advice across every walk of life: money, health, career, etc. - An asset for life By enabling individuals to aggregate data (including verified attributes) about themselves in their own Personal data store, we provide them with an asset for life: an asset that grows in value over time as more data is added to it. - Seek win-wins Many data relationships are adversarial, where one side seeks to extract value from another. We seek to enable mutually beneficial, productive relationships between citizens and bona fide service providers. For example, as the data in an individual’s Personal data store grows in richness and value, citizens can bring this data with them to relationships with service providers, helping both sides access and use the data they need for better services at lower cost. - Neutral In line with the above, our platform is not designed to help one organisation gain competitive advantage over another. It is designed to enable all sides to improve the way they operate, by helping everyone involved reduce friction, effort, risk and cost for example. So our charging structure doesn’t favour one organisation over another and it doesn’t incentivise us to try and make money out of individuals’ data either. (For example, if we charged a fee per data transaction, our revenues would grow with the volume of data sharing and that could incentivise us to pressurise individuals into sharing more data than they want to. So we don’t.) 
- Truly independent Our Personal data stores are truly independent and under the control of the individual. They do not sit inside any data-holding service provider’s organisational boundaries and do not depend on any service provider’s systems and technologies. Individuals don’t have to use any organisation’s identity processes to access their data. Organisations which deposit copies of an individual’s data in that individual’s Personal data store cannot tell the individual what they should or shouldn’t do with it, or who they can or can’t share it with. This is important because the more fashionable Personal data stores become, the more people there are out there pretending to offer Personal data stores where, if you look a little more closely, the organisation’s control of individuals’ data actually continues (through one or more of the above). - Safety and security Each individual’s Personal data store is separately and individually encrypted. This means our infrastructure is distributed, not centralised — so that it is designed to be the opposite of a honeypot for hackers. (If a hacker wants to access the records of millions of people in a centralised corporate database, they only need to succeed in one hack. For a hacker to access the records of a million people on our infrastructure, they would have to succeed at a million separate, different hacks, each time accessing the data of only one individual. That makes the hacker’s incentives a million times a million less attractive: a million times harder, for a millionth of the reward.) - No knowledge Each individual holds their own encryption key to their PDS, which we don’t have. We have designed our systems so that they do not look at an individual’s Personal data or who is delivering it or requesting it. We have provided the tools needed for the individual to be in control and able to see who is using their data for what purpose and approve such uses. We cannot see and don’t want to see their data.
As a Community Interest Company we want to help people, not exploit them or intrude upon them. - Interoperable There is no point in doing all the above good things if the end result is to trap people inside a system that they can’t get out of. People have to have choices; genuine alternatives. Competition over who can provide the best genuine data empowerment can only be a good thing. For this reason, we expect many Personal data store providers to emerge, and we are fully committed to enabling users to transfer their data from our systems to other systems and vice versa. We believe in interoperability. - Commercial integrity Our business model is designed to support all the above principles. We are a Community Interest Company. Yes, we are commercial — to fulfil our mission we have to cover our costs. But we are not in this to maximise the profits we can make, but to maximise the benefits citizens can get from their data. So we don’t charge individuals for having a PDS. Instead, organisations pay us for enabling safe, efficient data sharing relationships with customers. Summary As the concept of Personal data stores grows more fashionable, we’ve got no doubt that clever people will invent many exciting new tools that do wonderful things. That’s great. It’s how it should be. But for such creativity to really deliver value, it must be built on solid foundations. We believe design principles like the ones listed above provide the foundations our society needs to put Personal data on a new, trustworthy footing.",https://medium.com/mydex/designed-for-integrity-13a69bcda0b2,,Post,,Explainer,,,Design,,,,,2022-06-22,,,,,,,,,,,,,
|
||
MyDex,MyDex,,Medium,,,,,,,Flicking the Switch of Personal Data,"Over the last 14 years we have built the infrastructure needed to make citizen data empowerment possible — infrastructure capable of providing every individual with their own Personal data store, where they can safely and securely collect their own data, use it and share it under their own control. This infrastructure is now live and operational, officially recognised as a supplier to public services on procurement platforms in both England and Scotland and independently accredited for data management security under ISO 27001.","FLICKING THE SWITCH OF PERSONAL DATA We believe individuals should be able to access and use their own data to be able to manage their lives better. Currently, this isn’t possible because every individual’s data is dispersed across dozens (probably over a hundred) different organisations that hold it and don’t make it available. This is absurd and unfair. Over the last 14 years we have built the infrastructure needed to make citizen data empowerment possible — infrastructure capable of providing every individual with their own Personal data store, where they can safely and securely collect their own data, use it and share it under their own control. This infrastructure is now live and operational, officially recognised as a supplier to public services on procurement platforms in both England and Scotland and independently accredited for data management security under ISO 27001. Unleashing the potential But what we’ve also learned over these 14 years is that core infrastructure is not enough. A parallel: it is extraordinary and wonderful that we have water infrastructure that brings fresh, safe water to our homes and offices, and a national grid that does the same with electricity. But if we didn’t have taps and switches to turn the water and electricity on and off as and when we need them, they wouldn’t be half as valuable as they are. 
So, over the past few years, we’ve also been building the taps and switches that are needed to make our citizen empowering Personal data logistics infrastructure really useful. This blog outlines some of them. Smart directories One of the really big, time consuming, frustrating and expensive things every individual and every organisation has to contend with is what we call ‘matching and connecting’. Individuals want to find the services they need to help them with a particular task, but often they don’t know who they are or where to find them. They might not even know that such services exist. Likewise, organisations offering a particular service often struggle to find and reach the particular people who really want or need this service. A huge amount of time, money and effort is wasted by both individuals and organisations trying to solve these puzzles, often in expensive and unsatisfactory ways. With smart directories, individuals can allow selected organisations to see an anonymised profile of themselves (shorn of any data that would identify the particular individual), and the selected organisations can see whether individuals fit the criteria their service is designed for. If there is a fit, the organisation can use the platform to send a message to the individual. If the individual decides to accept the message and respond, a connection is established. Smart Directories lie at the heart of the work we are currently doing with the Office of the Chief Designer in Scotland and Connecting Scotland to radically reduce the costs of finding and working with citizens to help co-design public services. Automated form filling There is little more dispiriting and irritating than having to fill in forms, especially when you have to do it time and time again, providing the same information to different people. 
Using our platform and its ability to enable the sharing of verified attributes in particular, if an individual says ‘Yes, I would like this service’ it is possible for the necessary information (and only the necessary information) to be automatically sucked out of their PDS and sent to the service provider so that service provider doesn’t have to waste time, money and effort checking to see if it is correct. This eliminates friction, effort, risk and cost for both sides while radically speeding up the process. Using our infrastructure, what used to take weeks or months (the individual painstakingly and manually filling out a form; the organisation painstakingly checking every piece of information within the form — often having to pay external service providers for the privilege), can now take minutes. This approach is central to the Scottish Government’s planned Scottish Attribute Provider Service, which could revolutionise both citizens’ experience of accessing and using public services in Scotland and the costs these services incur. Circles In many service situations, different groups of people need to talk to each other to make arrangements, coordinate diaries, deal with changes, share results of meetings, organise follow ups, and so on. Often this is done by phone or email where the information gets lost or becomes difficult to find and where separate processes are then needed to share associated data. To address this need, we have created what we call ‘Circle’ capabilities by which an individual can create a specific circle of contacts for a particular need (a bit like a WhatsApp group) but where a) all data generated by the conversation is automatically recorded into the individual’s PDS and b) where if related information needs to be shared it can be, automatically, again via the individual’s PDS. Individuals having to manage their cancer journeys provide a good example. 
Each cancer journey requires a lot of organisation and coordination, with different groups of people for different purposes. For example, the patient will need to arrange and attend appointments and share clinical data with medics; to coordinate arrangements with carers (both paid and unpaid); to manage related domestic arrangements with friends and family (can someone walk the dog while I’m recovering from chemo?); and to connect with specialist service providers. In our work with Macmillan My Data Store, we have created specific Circles (e.g. for friends and family or small service providers) where all of these tasks can be undertaken safely and efficiently, thereby lifting an energy and emotional burden off cancer patients while helping service providers work more productively. The same core technologies are now also being used in the Revolutionising Healthy Ageing project, where multiple different service providers need to come together to create an integrated, joined-up service for citizens that deals safely and respectfully with multiple aspects of their lives. Web apps For services like these to work, people and the services they engage with need an interface. Traditionally, this interface has been provided by a single, isolated service provider via their website or app. But this requires individuals to sign in to each particular service and doesn’t enable dots to be joined between different providers. And if it’s an app, it creates a new dependency on Silicon Valley monopolists like Google and Apple with their app stores, while requiring that the individual has a smart device. What’s more, the interface acts as a data suction device — sucking the user’s data into the systems of the app provider. In other words, the current approach is not only not inclusive, inefficient and restrictive, it is also privacy invading. Our answer: our web apps. 
Web apps can be used on any device connected to the Internet, independently of the Silicon Valley monopolists’ app stores. Our architecture for web apps goes further, placing data that is generated in the individuals’ own Personal data store. And there is something else. With our Web Apps front line service providers (who know each step of a process inside out) can map each of these steps out, identifying exactly what information needs to be shared when, to create a seamless journey. We have developed the technology by which the resulting interface — the App itself — is generated without the front line service providers having to know anything about software or code. This means the people who really know what each service needs can quickly and easily generate interfaces with service users, as and when they need them. With each Web App directly linked to the individual’s PDS, any information that needs to be shared (for example, a form needing to be completed) can be automatically sucked out of the PDS. This makes access to service modules like Smart Directories and Automated Form Filling instant and easy. Citizen Co-design For any service to be really useful, the most important input is that of the user — because users are the only people who really know what it feels like to use a service. Every service module we develop — including all of the above — has been developed working with citizens who actually use the service. We have developed skills and processes to facilitate this and we are continuing to research how to best engage with citizens to really make their contributions easy, fulfilling, informative and actionable. This is part of the work we are doing on co-design of public services discussed above. Our Inclued platform Many service providers don’t need just one of the above. They need them all — and more. They need channels to communicate and engage with citizens, to send and receive messages and to integrate that ability to act on these messages. 
Working with Glasgow City Council we have built a platform, called Inclued, which does all these things. Using Inclued, the Council can use the above capabilities to present citizens with incentives and offers, create communities, and seek feedback or direct engagement to gain insights and improve the targeting of their services. Inclued also eradicates form filling through secure, informed data sharing. Inclued is now being used as part of our work with the Scottish Government and Connecting Scotland and our work with Blackwood Homes and Care as part of the three year programme with three local communities working on Healthy Ageing. Consent Management Dashboards For all of the above to work — for citizens to remain in control of their data and to have continued trust and confidence in what’s happening — they need quick, easy, simple ways to see which service provider has had access to what information, for what purposes. They need tools which enable them to make changes — such as withdrawing consent — when and if they want to; and to exercise their rights under data protection legislation if needed. Currently this is practically impossible. Most of us have data sharing relationships with over 100 organisations (across health, financial services, public administration, education skills and employment, travel, retail, leisure, media and so on). Currently, to do anything relating to data with them, we have to jump through hoops, signing in to each different organisation’s website or app, navigating our way through to MyAccount, scrolling through Settings and so on. The costs and hassle of doing this in a consistent way across 100 or more different relationships are so prohibitively high that hardly anyone bothers. What we have today is an entire data ecosystem and economy built on learned citizen helplessness — the realisation that exercising genuine control over your own data is such a vast, time consuming, hasslesome task that it’s not worth even trying. 
We are changing that by building consent management dashboards that enable citizens to see and manage all their data relationships simply, quickly and easily from one place within their PDS. This includes the ability to create general permission and consent settings. What if, for example, instead of being confronted with a separate, different pop-up each time you visit a web site to set permissions for cookies, your PDS could instantly inform the web site that (for example) ‘I am happy with cookies for performance analytics but not for third party advertising’? Our consent management dashboard is still work in progress. There are still many bits of functionality that need to be built. But the core is there, working to support all of the above service modules. Summary Giving individuals the means to exercise control over their data — to be able to access and use their own data for their own purposes — is easy to say but hard to do. Doing it in a way that also helps service providers benefit from citizen empowerment adds another level of challenge. We’re not finished with this task by any means. But we have made a good start. Many core capabilities and functions are built and already adding value for both citizens and bona fide service providers. We’ve got many more in the pipeline. The opportunity to unleash the full Personal, social and economic potential of Personal data is immense. And we are now on our way. However, for all these services to be truly valuable they have to demonstrate built-in integrity. That’s the subject of our next blog.",https://medium.com/mydex/flicking-the-switch-of-personal-data-4c5d0d368a31,,Post,,Explainer,,,,,,,ISO 27001,2021-10-22,,,,,,,,,,,,,
|
||
MyDex,MyDex,,Medium, Alan Mitchell,,,,,,Getting Data Security Right,"data security is about system-wide design, where many different elements need to fit together to create a working whole.","Getting Data Security Right Over the last few weeks we’ve been asked about data security from places as far afield as Lithuania and Korea. As one journalist from Asia Business Daily asked us: “Security issues are paramount in MyData. How does Mydex manage customer privacy?” There is no single ‘magic bullet’ that can guarantee data security or customer privacy. Indeed, the belief that there is such a magic bullet (usually some techno-fix such as blockchain) is one of the biggest dangers. That’s because data security is about system-wide design, where many different elements need to fit together to create a working whole. Mydex’s approach to data security is therefore multi-faceted and multi-levelled. It includes: - Encryption: all data handled by Mydex is encrypted in motion and at rest. - Architecture: big, centralised databases holding records about millions of citizens attract hackers. With our infrastructure, each individual’s Personal data store is separately encrypted. This means that to get a million records, hackers would have to conduct a million separate, successful hacks. - Operating procedures: Each individual holds their own private key to their Personal data store. Mydex itself does not know or hold this key, so Mydex employees cannot see the data held by citizens in their Personal data stores. - Business processes: We only work with known, reputable organisations that themselves work to the highest standards (e.g. government departments). To connect to our platform they have to agree to Terms and Conditions and Information Sharing Agreements with citizens designed to protect citizens’ privacy and data. - Citizen control: Citizens can easily see what data they are sharing with which organisations for what purposes, via their own Consent Dashboard. 
They can use this Dashboard to view ‘consent receipts’ that confirm their agreements with each organisation, and can change or revoke these permissions if they wish to. - External audit and accreditation: All our systems and processes are independently audited to international standards. We have held ISO 27001 accreditation for Information Security and Management for the last nine years. We don’t believe it’s possible to ensure data security without thinking through how the system as a whole works, looking at it from every angle: structure, incentives, governance, processes and, yes, technology. With the way our current data economy works however, many necessary elements are either missing or badly designed — generating incentives that undermine rather than enhance data security for example. That’s why our current system is so insecure, and why there is so little trust in it. There isn’t a magic bullet for ensuring data security and customer privacy. But there is a way of tackling the challenge so that robust, reliable ways forward are found.",https://medium.com/mydex/getting-data-security-right-36d291cac156,,Post,,Explainer,,,Security,,,,,2021-10-22,,,,,,,,,,,,,
|
||
MyDex,MyDex,,Medium,,,,,,,Getting Identity Right. At Last.,"By recognising the pivotal importance of verified attributes and the potential role of Personal data stores in enabling the sharing of these attributes, it is opening the door to actually solving the problem of identity. At last.","Getting Identity Right. At Last. A seismic shift is under way in the huge, ongoing international project called ‘digital identity’. It hasn’t triggered any spectacular earthquakes yet. It’s more tectonic than that. But it’s seismic nevertheless. Ten years ago, we (Mydex CIC) were one of five companies (along with the likes of the Post Office and the credit reference agency Experian) chosen by the UK Government to pioneer its Verify digital identity programme. At the time the Government had a vision for identity which went something like this. An ‘identity’ — that confirms that a person is who they claim to be — would be a sort of digital product produced by specialist producers called ‘identity service providers’ (ISPs). They would compete in a market for digital identities made up of competing ISPs. Organisations would buy these identities from the ISPs to help them reduce the costs and risks they incur in checking to see if individuals are who they say they are. None of this vision is likely to survive as it gets replaced by a different, more efficient and more person-centric perspective. The first shift towards this new perspective happened a few years ago when the Government decided to launch its Identity and Attributes Trust Framework. Adding the word ‘attributes’ isn’t just a small semantic change. It signifies something very important (see below). The second shift, confirmed by the publication of the Government’s Beta version of this Trust Framework, is the explicit recognition that ‘identities’, attributes, or both, may be shared by a range of different parties including citizens using Personal data stores. 
The Trust Framework paper gives the example of Carmen, a doctor moving to work at a new hospital. Before starting work at the hospital, she must prove who she is and that she has the relevant qualifications. She gets a digital version of her registration certificate that confirms her licence to be a doctor in the UK. It is added to her Personal data store. The information from this registration certificate can be checked against an authoritative source. She can share it when needed e.g. when applying for a post at her new hospital. “Attributes can be created, collected and checked by an attribute service provider. An attribute service provider could be an organisation or a piece of software, like a Personal data store or digital wallet.” It’s all obvious, sensible stuff. But what it points to is a new vision of identity that has got nothing to do with the one outlined by Verify above; that combines trust-building with citizen agency with the reduction of friction, effort, risk and cost. Here are some of the main operational differences between the two visions. A new vision of identity First, an ‘identity’ is not a fixed ‘thing’. It is a byproduct of a process for the sharing of verified attributes (that is, details about an individual that have been generated or checked by a recognised, responsible body such as a bank or government department). The particular bits of information that may go towards confirming that an individual is who they say they are may vary greatly from situation to situation. It doesn’t really matter what they are, as long as the process for making them available is reliable, safe and efficient. Second, the use cases of ‘identity’ vary widely. Most people think of identity as relating to one specific scenario such as opening a bank account when, as an individual, you have to prove to the bank that you are who you say you are and not some sort of fraudster. That’s important. 
But it’s actually just one step in an entire sequence of operations where it is necessary to know the identity and attributes of a person in order to provide them with a service. Other use cases include being recognised when you return to a service, checking eligibility for the provision of a service (such as for a loan or a benefit), configuring the details of that service so that they fit the circumstances of the individual concerned, planning the delivery of this service, and implementing its delivery. At each stage, different bits of information may be needed. If a person is applying for a job where they will be driving a lot, they will need to present evidence of having a valid driving licence. If they are applying for a loan or benefit, the driving licence may be irrelevant but details about their financial circumstances become central. At each point, it’s the ability to access and use verified attributes that matters — and it is the bundles of these attributes that make up the individual’s identity in that context. Third, this means there can never be a fixed, separate product called ‘an identity’, because the detailed bits of information in play at any one time will be changing. What really adds value is the ability to configure multiple different data points to fit the task at hand. Fourth, this ability to access, share and configure verified attributes requires the existence of enabling data logistics infrastructure — infrastructure that enables these processes in a safe, efficient, privacy-protecting way. This mental and operational shift from ‘product’ to enabling infrastructure is vital. It is what our recent White Paper on Design Principles for the Personal Data Economy is about. Fifth, this means there is no need for a special class of producers called ‘Identity Service Providers’, because their role is being fulfilled by this infrastructure. 
Sixth, it also means that identity provision will never, ever become a ‘market’ because there isn’t a ‘product’ to sell. At one time, many companies hoped to make a fortune selling identities for fabulous profits. The opposite is true. Like all infrastructure such as roads, railways, electricity and the internet, the greatest benefits accrue all round when the costs of using attribute sharing infrastructure are brought to as close to zero as possible. The economic logic driver here is ‘cost out’, not ‘margin plus’. Seventh, and most important of all, ‘identity’ is not just a service to organisations. It is first and foremost inclusive: a service for citizens as the example of Carmen shows. It is about empowering citizens with agency; with the information they need to make their way efficiently and effectively within a complex world of service provision. Finally, the citizen-empowering data logistics infrastructure that’s needed for this new realistic vision of identity is already built — by Mydex CIC. As just noted, we have just published a White Paper examining the design principles of the ecosystem it needs to work within. Conclusion For decades now, identity practitioners (Governments, big businesses, tech companies) have been chasing a vision of identity that was as real as a pot of gold at the end of a rainbow. That is why they have spent decades — huge amounts of wasted time, money and effort — getting nowhere. But now there is growing Governmental recognition of the need for a different approach that empowers citizens as agents able to share verified attributes about themselves. This idea lies at the heart of Digital Identity Scotland’s Scottish Attribute Provider Service, the Korean Government’s increasingly ambitious MyData initiative, and the EU’s Data Governance Act (especially its provisions for ‘data intermediaries’). 
The Government’s proposed Identity and Attributes Trust Framework doesn’t get us all the way to the coherent, alternative vision that is needed. But it has made some decisive steps in that direction. By recognising the pivotal importance of verified attributes and the potential role of Personal data stores in enabling the sharing of these attributes, it is opening the door to actually solving the problem of identity. At last.",https://medium.com/mydex/getting-identity-right-at-last-8512abadcfbc,,Post,,Explainer,,,,,,"Personal Data Stores,VC",,2022-05-23,,,,,,,,,,,,,
|
||
MyDex,MyDex,,Medium, Alan Mitchell,,,,,,The Perils of Pre-Copernican Data Strategy,"Today, ‘everyone’ including powerful actors and decision-makers like the UK Government ‘just know’ that organisations are the centre of the Personal data universe, and that everyone else including citizens revolves around these organisations.","The Perils of Pre-Copernican Data Strategy It’s an oft-told story but it has a new relevance. Back in the middle ages, people believed that the earth was the centre of the universe so that everything else, including the sun, circled round it. When they tried to track the movement of the planets through the sky they were presented with a puzzle. The planets’ motions didn’t represent a simple orbit. At certain times, they seemed to go into reverse, creating a complex jungle of ‘epicycles’ that astronomers struggled to explain. Their mappings of the movement of the planets, as seen from the earth, are shown on the left hand side of Figure 1. An extremely complex picture that is very hard to fathom. Then came Nicolaus Copernicus. He said the sun was the centre of our universe, and that the earth orbited the sun. A simple switch of perspective — from earth-centric to sun-centric — created the enormously simplified picture of the planet’s movements shown in the right hand side of the figure. Today, something similar is happening with data sharing. Back in the days before Copernicus, ‘everyone’ including the great, the good, the clever and the powerful, ‘just knew’ that the earth was the centre of the universe and that everything else, including the sun, revolved around it. Today, ‘everyone’ including powerful actors and decision-makers like the UK Government ‘just know’ that organisations are the centre of the Personal data universe, and that everyone else including citizens revolves around these organisations. That’s why the UK Government is pressing ahead with organisation-centric plans for the future of data sharing. 
Plans that, if implemented, will create a picture far more complicated than that shown on the left hand side of the illustration; a complexity catastrophe on multiple fronts: costs, data security, interoperability, governance and trust. The complexity catastrophe Let’s take a look at how this data sharing complexity catastrophe will unfold. There are basically two ways to share Personal data. The first is an ‘organisation-centric’ one: direct from organisation to organisation, with the data never being handled by the citizen the data relates to. The second is a person centric one, where organisations deposit copies of details they hold about people in their Personal data store, so that the citizen can forward share this data as and when needed. What is the difference between the two? If you look at Figure 2, which shows how the two models work, you may not think there is much difference. On the left hand side, if there are three organisations involved in data sharing, three connections between them are required: one each to the other two. Simple! The person-centric approach to data sharing shown on the right hand of the diagram also requires three connections. But why bother adding a completely new entity — the citizen’s Personal data store — into the equation? Doesn’t that just add cost and complexity? Now take a look at Figure 3, which shows what happens if eight organisations are now sharing an individual’s data. On the organisation-centric left hand side, the number of connections has grown to 28, whereas if the data is shared via the individual’s Personal data store, the number of connections has grown to just eight. Instead of each organisation having to connect with every other organisation involved in data sharing (who they have never done business with before), the person-centric approach only requires one connection per organisation — with the citizen’s Personal data store. 
This connection is with somebody they already have a relationship with: the citizen. Figure 3 shows that as the organisation-to-organisation approach to data sharing scales, its complexity grows quadratically. And data sharing between eight organisations is just beginning. How many Government services currently collect and use Personal data? Well, you can get to a list of eight very quickly. How about DVLA (driving licence data), the Passport Authority, the Money and Pensions Service, the Disclosure and Barring Service, the Department of Work and Pensions, HMRC (tax data), the Ministry of Justice (for services such as Lasting Powers of Attorney), the Home Office (residency and citizen status). That’s eight. Without drawing a breath. And without even thinking about all the different parts of the National Health Service, care services, local authorities, and education authorities that also collect and use citizens’ data. And completely ignoring the third and private sectors. If we take the number of organisations involved in sharing data about individuals to fifty, the number of connections now needed by the organisation-centric approach rises to 1225. (See Figure 4) Whereas, if the data is shared via the individual’s Personal data store, it rises to 50 — still tracking the number of organisations involved. Which sort of system do we want to create? The organisation-centric approach to data sharing which creates a picture of pre-Copernican complexity or the person-centric approach which creates a picture of post-Copernican simplicity? Perhaps pre-Copernican complexity would be justifiable if it brought tremendous benefits while the person-centric approach brought tremendous risks and harms. But the opposite is true. Where the catastrophes lie The pre-Copernican organisation-centric approach to data sharing doesn’t create just one complexity catastrophe. It creates many, involving costs, security, interoperability, governance and trust. 
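The connection arithmetic behind these figures can be checked in a few lines of Python (an illustrative sketch added editorially, not code from the original post): organisation-to-organisation sharing needs one link for every pair of organisations, n(n-1)/2 in total, while person-centric sharing needs just one link per organisation.

```python
# Editorial sketch: connection counts for the two data-sharing models.

def org_to_org_connections(n):
    # Every organisation must link directly to every other one:
    # n * (n - 1) / 2 unique pairs, growing quadratically with n.
    return n * (n - 1) // 2

def person_centric_connections(n):
    # Each organisation needs a single link, to the citizen's
    # Personal data store, so the total grows linearly with n.
    return n

for n in (3, 8, 50):
    print(n, org_to_org_connections(n), person_centric_connections(n))
# 3 organisations -> 3 vs 3; 8 -> 28 vs 8; 50 -> 1225 vs 50,
# matching the counts quoted in the text.
```

At fifty organisations the ratio is 1225/50 = 24.5, which is where the ‘around 25 times higher’ system-cost estimate below comes from.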
Costs Let’s assume for the sake of argument that the costs of an organisation sharing data with another organisation and with an individual’s Personal data store are the same. (As we show below, this is a bad assumption to make, because the costs of sharing data between organisations are actually much higher.) But keeping to that assumption for a moment, if 50 organisations are sharing data under the pre-Copernican approach, with the quadratic rise in connections involved, total costs across the system are around 25 times higher than with the person-centric approach. That’s because each organisation has to manage dealings with 49 other organisations instead of having to deal with just one other organisation — the Personal data store provider. Security The databases of organisations like the Department for Work and Pensions and Her Majesty’s Revenue and Customs (taxes) were designed to operate like moated and walled data castles, working on the principle of perimeter-based protection. They were designed to keep outsiders out. Rightly so. They were designed to protect the data of the people inside: citizens. But with data sharing, organisations have to open their systems up — and the greater the number of connections they create, the greater the security risks. Once they are sharing data with 50 or more other organisations, their carefully built castle walls begin to look like a piece of Swiss cheese: full of holes. But when sharing data with a Personal data store, they only have to create one carefully managed and scrutinised API connection with the PDS platform. Interoperability Because of the way our data systems have evolved, every organisation has its own, different software systems, formats, languages, standards and so on. For one organisation to become adept at sharing data with 49 other organisations it needs to become an expert not only in its own data systems but in 49 other organisations’ systems too. It’s not going to happen.
And if it does, it’s going to be extremely costly and time-consuming. That’s because this interoperability problem doesn’t have to be solved just once. Each of the 50 organisations involved in data sharing has to solve it again, for themselves, separately and independently — reinventing the same wheel 50 times over and once again multiplying total system costs many times over. Whereas, when sharing data with a Personal data store, the organisation hardly has to think about software systems, formats, languages and standards at all. That’s because it simply shares the data using the systems it already has. It is then up to the Personal data store provider to manage these interoperability challenges. The big benefit here is that the PDS provider only has to solve these problems once for the solution to work with all 50 organisations. Governance Who decides what data should be shared with whom, for what purposes? Under the Government’s current proposals this is left to ‘senior leaders’ (e.g. civil servants) within Government departments. Will these ‘senior leaders’ ask citizens if they want their data to be shared? If not, what right do they have to make the decision? With PDS-based data sharing, most requests for data to be shared come directly from individuals seeking to access or use a service. The governance, consents and permissions challenges of data sharing are addressed almost automatically as a by-product of the process. Trust For all of these reasons, the pre-Copernican approach to the sharing of Personal data is highly likely to generate its own knock-on trust catastrophe. But that’s just the beginning. Under the current system, when individuals share data with an organisation, they have at least some idea of what data they are sharing with whom, for what purposes (unless something underhanded, devious and illegal is going on).
With the pre-Copernican organisation-to-organisation approach to data sharing, it becomes practically impossible for citizens to keep track of who has access to their data, for what purposes. The ideal of citizens being able to exercise control over their data goes out of the window. Whereas, with a Personal data store, citizens are provided with their own consents and permissions dashboard, which enables them to see what data they have shared with whom, for what purposes, and to exercise control easily (changing these consents and permissions) from within this dashboard. Less than half the picture The above analysis of pre- and post-Copernican data sharing only addresses operational issues. These are less than half the picture. The organisation-centric approach focuses only on what organisations need to do with Personal data they hold. It ignores the challenges facing citizens when applying for, accessing and using services — challenges which Personal data stores address by making citizens’ data available to citizens themselves. It also ignores the immense potential benefits of making citizens the point of integration of data about themselves. What would happen if 50 or more different organisations each deposited data they hold about the citizen in that citizen’s Personal data store? A totally new-to-the-world Personal data asset would be created: one which starts to generate a complete picture of that individual. These new Personal data assets could become the engines of most if not all data-driven innovation of the future — a driver of economic growth. But if Government persists in its organisation-centric approach to data sharing, this opportunity will never arise, because citizens’ data will remain dispersed across multiple service providers.
Conclusion In machine learning, deep assumptions such as ‘the sun orbits the earth’ or ‘organisations are the only entities that collect and use Personal data’ are commonly called ‘priors’. A prior is an initial set of beliefs that people bring with them to the experiences and challenges that they face. Priors are the lenses through which we see the world. Pre-Copernican astronomers studied the workings of the heavens as closely as those who came after them. But because it never entered their heads that the earth might orbit the sun, every conclusion they drew was wrong. Today’s data strategies and policies are being made by a Government with the wrong, pre-Copernican prior belief that organisations are the centre of the Personal data universe. As a result its policies are way off the mark. Organisations are not the centre of the Personal data universe. Citizens are. If Government put citizens at the centre of its data policies, the simplicity and power of the post-Copernican perspective would shine through everything it did.",https://medium.com/mydex/the-perils-of-pre-copernican-data-strategy-974827845585,,Post,,Explainer,,,,,,,,2022-05-31,,,,,,,,,,,,,
MyDex,MyDex,,Medium,,,,,Data: A New Direction,,Why is data valuable?,"prompted by the UK Government’s proposed reforms of data protection law contained in its consultation paper Data: A New Direction. […] Under the banner of tackling ‘consent fatigue’, abolish citizens’ right to consent to the collection and use of their data: achieved by expanding the definition of organisations’ ‘legitimate interests’ (where they don’t have to seek consent) to cover almost every activity","Why is data valuable? This series of blogs is prompted by the UK Government’s proposed reforms of data protection law contained in its consultation paper Data: A New Direction. The stated intention of these proposals is to unleash a new ‘golden age’ of innovation and growth. But the real effect of these reforms would be to: - Under the banner of tackling ‘consent fatigue’, abolish citizens’ right to consent to the collection and use of their data: achieved by expanding the definition of organisations’ ‘legitimate interests’ (where they don’t have to seek consent) to cover almost every activity they would wish to undertake, thereby rendering consent irrelevant. - Under the banner of helping organisations access more data for the purposes of research and innovation, abolish the core principle upon which data protection regulations have been built: that Personal data should only be collected for specified, explicit and legitimate purposes and not further processed in a manner that is incompatible with those purposes. The Government’s proposals would do this by making ‘further processing’ of citizens’ data legal in so many additional circumstances (‘legitimate interests’, use in ‘research’, for AI, to improve ‘data sharing’, etc) that the core principle is rendered irrelevant. These proposals represent the biggest attack on citizen rights seen in this country for generations. 
They would make new Cambridge Analyticas legal, and would put UK citizens’ data up for sale in global markets — the stated goal being to “secure the UK’s status as a global hub for the free and responsible flow of Personal data” (where ‘responsible’ is never defined in over 50,000 words of argumentation, but where the door is opened to ‘irresponsible’ uses of data many times over). This blog series explores the many mistaken assumptions that lie behind this disastrous initiative. The innovation and growth that Government promises its reforms will bring are a fantasy. They will not unleash a new ‘golden age’; they are based on deep misunderstandings of what makes data valuable in the first place and how to unleash this value. This first blog in the series looks at why data is valuable. Where data’s value comes from Given all the fuss that’s made about data, you’d think everyone would have an immediate answer to this question. But when you ask it, strange things happen. Some people immediately go off on a tangent, talking about how much money is being made out of data. But this begs the question: why are people prepared to pay so much for it? Others repeat catchphrases such as ‘data is the new oil’. Very soon people are embroiled in arguments about whether data is, or is not, ‘like oil’ — while the original question gets forgotten. But the answer to this question matters, because it will inform all the policies and strategies that follow. If you get it wrong (as the UK Government is currently doing with its proposals to ‘reform’ data protection regulations) you risk wasting huge amounts of time, money and effort chasing rainbows, while missing the opportunities that really matter. Reliability and Surprise So why is data valuable? Because of two things: Reliability and Surprise. Reliability If we know a fact to be true, we can act on it without incurring the risks (and costs) of making a mistake.
If the data you are working with is unreliable — if you don’t know if it is true or not — any decisions or actions you take on the basis of it risk being wrong. One of three things then happens. - You don’t dare make the decision or take the action, in which case lots of things that could happen don’t. - You make a wrong decision or take a wrong action and then have to incur the costs of cleaning up the mess. - You invest significant amounts of time, money and effort trying to improve the reliability of your data (often paying external agencies exorbitant fees to confirm basic facts), so that you avoid the costs and lost opportunities of the first two. Across the economy today, individuals’ and service providers’ inability to access and use the right, reliable data when and where they need it is a major cause of unnecessary cost and waste, fraud and missed opportunities. This lack of access to the right reliable data has deep structural causes. Usually somebody, somewhere has generated or checked that a piece of information is correct — that it is indeed reliable. But because our data economy is currently organisation-centric (that is, organised around hundreds of different, isolated, separate data silos run by different, separate organisations), this data remains inaccessible. What’s needed is a new Personal data logistics infrastructure which enables individuals to get hold of certified data about themselves (verified attributes) and to share this data, under their control, as and when they need to. If such infrastructure were put in place, everyone would have access to reliable data when they need it, and everyone would be able to eliminate the friction, effort, risk and cost that pile up when it’s not available. We estimate that over 95% of all data processing activities undertaken by British service providers in the private, public and third sectors (e.g.
banking, insurance, retail, utilities, media and communication, local and central Government public administration, education, health, transportation, leisure, charities, etc) relate to the collection and use of data for the purposes of service provision: primarily administrative activities that depend on the availability of reliable data. Yet nowhere in the 50,000-plus words of the UK Government’s Data: A New Direction consultation does it even come close to recognising this fact. Which means its entire ‘data strategy’ is already missing the most fundamental point. Surprise The second reason why data can be valuable is the opposite of the first: it can surprise you by telling you something you didn’t know. It’s data as ‘surprise’ that helps people and organisations keep abreast of a fast changing world. Data as ‘surprise’ is what fuels research and those innovations that are driven by insight into people’s behaviours. It’s a great thing. But we have to keep it in perspective. By definition, innovations are exceptions to the rule — the rule being the administration of existing services. And once an innovative product or service has been created, to realise its value it needs to be delivered to people, which requires the reliable data that drives these processes. In this situation, data’s two aspects of reliability and surprise work together, with ‘reliability’ still ensuring the actual delivery of value. For the ‘surprise’ side of data to generate value, the ‘reliability’ aspect is essential. That’s why we say that at least 95% of data’s value comes from reliability. But unfortunately, as we’ve noted above, the UK Government’s proposed data protection reforms focus entirely on research and innovation: the 5%. A classic case of the tail wagging the dog. Mixtures of reliability and surprise Even though they are like chalk and cheese, often the biggest value is generated by mixing reliability and surprise.
For surprises to be valuable they need to be reliable: fake news can be a very dangerous thing. Some things can be surprisingly reliable, as we’ll see when we discuss artificial intelligence in a later blog. Other things are ‘reliably surprising’: whereby we are confronted by new information that changes what we do, but where the sorts of new information and the sorts of changes that result are part of a routine process. Take your daily commute to work. You know exactly where you are going, what the route is, how long it should take, and so on. And yet, every journey you make is full of little surprises — little bits of new information that you need to respond to. A traffic light turns red. A child looks like they might run out into the road. A car suddenly turns right in front of you. In each case, you know how to deal with the situation. It’s happened many times before. But on each occasion, you still have to deal with the ‘surprise’. In modern economies, a huge range of important value-adding activities fall into this category of managing ‘reliable surprises’. Take the health service, which is largely organised around the management of ‘reliable surprises’: common ailments with well-known symptoms and treatments but where, on each occasion, a diagnosis is still needed and a resulting set of actions planned and implemented. Likewise financial services when deciding whether or not to give someone a loan or assessing an insurance risk, or public administrators deciding whether an individual is entitled to a benefit or service. And so on. A huge amount of the data that’s collected and used in our society today is devoted to the efficient management of such ‘reliable surprises’. Implications Talking about reliability and surprise as we’ve done here may seem simple and obvious. In a sense it is. But its implications are profound.
Practically speaking, for the full potential of data to be realised, individuals and service providers need to be able to access and use the right reliable data at the right time, on a mass scale. For this, we need new infrastructure. Just as the industrial age needed a national electricity grid to unleash electricity’s potential, so we need a new data ‘grid’ — a data logistics infrastructure that gets the right reliable data to (and from) the right people and places at the right times. That’s what we at Mydex are working on — Personal data logistics infrastructure that puts citizens at the heart of the process, including them in the workings of society and economy. The Scottish Government has begun to realise this with its decision to create its Scottish Attribute Provider Service. SAPS enables organisations to share verified (reliable) data about their citizens with the citizen, so that citizens, in turn, can share this data when and where they need it to gain access to other services, eradicating things like form filling and delays, and improving access, inclusion and outcomes. Some of the work the UK Government is doing around Personal data — its Identity and Attribute Trust Framework for example — has truly positive potential. But this Consultation and these proposals? Oh dear! They misunderstand where the real value of data lies and rely instead on a mythological narrative promoted mainly by Silicon Valley lobbyists. The next post in this series turns to this narrative. Other blogs in this series are: - Five Myths about Data, Innovation and Growth Explains why the Government’s claims that its ‘reforms’ will promote innovation and growth are without foundation. - AI: The Emperor’s New Clothes?
Shows how the Government’s claims that these ‘reforms’ are needed to promote Artificial Intelligence are without foundation, and based on deep misunderstandings of AI itself.",https://medium.com/mydex/why-is-data-valuable-59bd63e1a09f,,Post,,Explainer,Public,,,,,,,2021-12-18,,,,,,,,,,,,,
MyDex,MyDex,,Medium,,,,,National Data Strategy,,"Why we need new, nationwide Personal data infrastructure","The central plank of Mydex CIC’s consultation response is that the UK needs to build a new layer of data logistics infrastructure that:- Includes citizens in the workings of the data economy, empowering them with the ability to collect, store, use and share data about themselves independently of any data controllers they may have data relationships with.<br>- To achieve this, the Government needs to ensure that every citizen is provided with their own Personal data store, which enables citizens to collect, store, share and use their own data, under their own control, for their own purposes, independently of any organisation that may have collected data about them.<br>- These Personal data stores should be designed to act as neutral, enabling nodes in a vibrant data sharing network, whereby citizens can obtain copies of their data held by organisations and can forward relevant elements of this data (such as Verified Attributes) to other data users under their control, as and when beneficial and necessary.<br>","Why we need new, nationwide Personal data infrastructure This is the second in a series of blogs providing edited extracts of key points made by Mydex CIC in its response to the UK Government consultation around a new National Data Strategy. The first one addressed the question “how to unleash the full potential of data?” Others look at Achieving Change at Scale, Common Misconceptions that Derail Progress, and the sheer size of the social and economic opportunity. To catch up on progress on our Macmillan My Data Store Pilot click here. 
The central plank of Mydex CIC’s consultation response is that the UK needs to build a new layer of data logistics infrastructure that: - Includes citizens in the workings of the data economy, empowering them with the ability to collect, store, use and share data about themselves independently of any data controllers they may have data relationships with. - To achieve this, the Government needs to ensure that every citizen is provided with their own Personal data store, which enables citizens to collect, store, share and use their own data, under their own control, for their own purposes, independently of any organisation that may have collected data about them. - These Personal data stores should be designed to act as neutral, enabling nodes in a vibrant data sharing network, whereby citizens can obtain copies of their data held by organisations and can forward relevant elements of this data (such as Verified Attributes) to other data users under their control, as and when beneficial and necessary. Such a citizen empowering data logistics infrastructure is key to enabling a MUMU data economy to grow: one where the right Personal data can flow to and from the right parties at the right times, in ways that protects citizens’ privacy and make them active participants in service provision. Benefits of the new Personal data infrastructure Ensuring every citizen is provided with their own Personal data store would kill many birds with one stone. 
Specifically, it would: - Enable order of magnitude reductions in friction, effort, risk and cost for both bona fide service providers and citizens - Ensure built-in privacy and data protection for all processes involving the collection and use of Personal data - Ensure built-in fair pre-distribution of power and rewards relating to the collection and use of Personal data, resulting in inclusive economic growth, enabling social inclusion and helping to tackle the digital divide and ‘poverty premium’ - Act as a platform for innovation, enhancing capabilities, capacity and flexibility Cutting costs The most compelling immediate reason to introduce Personal data stores is their cost-cutting potential. By enabling citizens to easily, safely and securely obtain electronic copies of data held about them by organisations (as per the data portability provisions of GDPR/Data Protection Act) a Personal data store-based data logistics infrastructure would enable data ecosystems to move from ‘make afresh every time’ to ‘make once use many times’ modes of operation. Under the current organisation-centric system, if an organisation needs to know something about an individual it has to obtain the necessary information for itself, even though another organisation might have already obtained the same information. Very often, organisations don’t need the actual information itself: they just need confirmation that a particular piece of information is valid, up-to-date and correct (e.g. that an individual is over 18, lives at a particular address, has a valid driving licence, has passed certain exams, is entitled to certain benefits, etc). Responsible organisations that have already checked information can easily and cheaply generate secure electronic tokens verifying these bits of information. These are called ‘Verified Attributes’.
Enabling this information to be ported to a Personal data store and forwarded to other service providers, would eliminate vast amounts of duplicated effort for both service providers and service users, significantly reducing the time, money and effort they invest in collating and confirming the data they need. The Scottish Government has already successfully conducted tests to confirm the technical viability of such a process in its Verified Attribute prototype. The prototype “enables public sector organisations to provide individuals with verified attributes, for individuals to store these attributes in their own Attribute Store, and for these individuals to be able to share these attributes with other public sector service providers as and when needed.” Because of the way these verified attributes work (using secure API links) other parties can rely on the accuracy and provenance of this data without having to undertake their own checking processes or asking citizens to fill out yet another form. (Some people say that such a process cannot work, asking who takes liability if a piece of shared data happens to be wrong. In fact, relying on data provided by other people happens in multiple processes all the time without any liability model. Whether to accept the information is down to the user’s own assessments of related risk.) By enabling the sharing of Verified Attributes and other data, Personal data stores render many previously essential data processing tasks unnecessary (thereby freeing up time and resources) while enabling service providers to access more, better quality data, more quickly. The potential power of this approach is demonstrated by the Scottish Government’s work on ‘smart entitlements’ which examines just how big the opportunity is and how to realise it. The sharing of verified attributes also cracks the problem of online identity: an ‘identity’ is simply an accepted collection of verified attributes. 
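As an illustrative sketch only (the article does not specify Mydex’s or SAPS’s actual token format, and all names below are ours), a Verified Attribute can be modelled as a claim signed by the organisation that checked it, so a relying party can verify provenance without re-checking the underlying fact. A real deployment would use public-key signatures so any relying party can verify without sharing a secret; HMAC is used here only to keep the sketch self-contained:

```python
import hashlib
import hmac
import json

def issue_attribute(issuer_key: bytes, subject: str, claim: dict) -> dict:
    # The issuing organisation signs a claim it has already verified.
    payload = json.dumps({'subject': subject, 'claim': claim}, sort_keys=True)
    signature = hmac.new(issuer_key, payload.encode(), hashlib.sha256).hexdigest()
    return {'payload': payload, 'signature': signature}

def verify_attribute(issuer_key: bytes, token: dict) -> bool:
    # A relying party checks the signature instead of re-verifying the fact.
    expected = hmac.new(issuer_key, token['payload'].encode(), hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, token['signature'])

# The citizen stores this token in their Personal data store and forwards it
# to any service provider that needs proof they are over 18.
token = issue_attribute(b'issuer-secret', 'citizen-123', {'over_18': True})
print(verify_attribute(b'issuer-secret', token))
```

Any tampering with the payload, or verification against a different issuer key, makes the check fail, which is what lets relying parties trust the data’s provenance without repeating the original checks.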
Built-in privacy and data protection It is now universally accepted that citizens should be able to assert more control over how their data is collected and used. Compliance with new data protection regulations remains a major challenge. By creating mechanisms that ensure that Personal data is shared and used by the citizens whose data it is, a new Personal data store based data logistics infrastructure would ensure privacy and data protection by default. A failure to build such citizen empowering Personal data infrastructure can only result in ‘more of the same’ problems. Rebalancing the workings of the data economy It is now widely recognised that status-quo approaches to the collection and use of Personal data have resulted in huge competition- and growth-restricting, unfair and socially divisive imbalances of power and reward. There is now ongoing debate as to how to ‘fix big tech’, with the UK’s Competition and Markets Authority recommending the creation of a new Digital Markets Unit to curb the power of big tech data monopolies. While regulatory intervention is part of the answer, regulation alone cannot do the job. We also need to embed new, fairer ways of working into day-to-day data management practices. Providing every citizen with a Personal data store achieves this, ensuring fair pre-distribution of power over Personal data and resulting rewards. The Financial Times recently observed that much effort is now going into ‘fixing big tech’. But: “The real game-changer would be if Europe could pioneer the creation of a radically different and more decentralised data economy.” A Personal data store-based data logistics infrastructure would achieve this decentralisation.
The Future of Citizen Data Systems report by the Government Office for Science, notes research that using such an approach may be closest to economically optimal, citing an economic modelling study that: “investigated how different models of data control could affect how far data is shared when consumers prioritise both their privacy and gains from use of their data. A scenario where consumers control their use of data was closest to optimal, with maximised benefits from data sharing and privacy. Consumers kept some data private, but shared other data with many more organisations and companies, compared to a scenario where firms controlled data.” One important aspect of this rebalancing is tackling the poverty premium. Currently, data exclusion is a major cause of social exclusion. Those with the greatest needs for access to services are also those who have to spend the greatest amounts of time, effort and money filling in application forms, trying to find data and prove credentials. By enabling individuals to build up banks of pre-verified facts about themselves — facts that they can reuse when applying for services — the new data infrastructure would go a long way to enabling social inclusion. A platform for innovation Organisations use data for many different purposes e.g. for the purposes of measurement, analysis, decision-making, operations planning, coordination and implementation, and administration to manage their operations better. Enabling individuals to do the same with their own data would open up a tidal wave of innovation. Individuals should be able to collect, store and use their own data for the same purposes (of measurement, analysis, decision-making, operations planning, coordination and implementation, and administration) in order to manage their lives better. 
This applies across all ‘life departments’, whether it is managing a home and dealing with suppliers, making big financial decisions such as planning for retirement, coping with long-term illnesses such as diabetes or cancer, or advancing their education and careers. Personal data stores enable service innovation in such areas by enabling citizens to safely and efficiently combine data from many sources, and to add extra information about themselves (e.g. plans, preferences, goals) to build rich new-to-the-world person-centric data assets that weren’t possible before (because previously the data was dispersed across many different organisations). The Government has already recognised some of this innovation potential with its Open Banking initiative. However, with Open Banking, data is ported from one organisation to another and never back to the individual whose data it is. This is a very restricted form of data portability which requires costly and heavy regulatory oversight and risks creating a complexity catastrophe as it tries to scale. It also actually reduces individuals’ control over their data. As individuals’ data gets spread across an increasing number of different organisations, it gets more and more difficult for individuals to keep tabs on their data. In contrast, Personal data stores allow a person’s data to be aggregated around that person, making it possible for service providers to gain permissioned access to enhanced, enriched data sets and provide enhanced, enriched, Personalised services. Given that this information is critical to service provision across every major sector of the economy including financial services, health, public administration, transport, education/skills development, housing and leisure, this has the potential to become a significant driver of innovation and economic growth.
Once every citizen has been provided with their own Personal data store, the Government will have created Personal data infrastructure with near infinite flexibility: the ability to handle any data for any purpose. With the infrastructure in place for example, when the Government introduces a new policy requiring access to data (as happened with the Covid pandemic) it can use the infrastructure to enable the right information to get to and from the right people at the right times. Given that companies like Mydex CIC have already built the infrastructure that is needed to achieve all the above, there is now no good reason why the UK Government cannot implement provision of this new infrastructure at speed.",https://medium.com/mydex/why-we-need-new-nationwide-personal-data-infrastructure-56513fb6daf4,,Post,,Explainer,Public,,,,,,,2021-04-12,,,,,,,,,,,,,
MyDex,MyDex,,Medium,,,,,,,Revolutionising healthy ageing,"Mydex’s role will be to provide the data sharing infrastructure to enable individuals and service providers to safely and efficiently share the right data at the right times, in ways that protects individuals’ privacy and puts them in control of their data at all times and enable two way engagement and feedback throughout the project.","Revolutionising healthy ageing Mydex CIC is pleased to announce its involvement in a new £12.5m project designed to ‘revolutionise’ healthy ageing. The Peoplehood project — originally called Blackwood Neighbourhoods for Independent Living — will help people to stay well and physically active as they age and explore new products and services to support them. Supported by £6m UK Research and Innovation funding as part of its Healthy Ageing Challenge’ and led by Blackwood Group, the project will work with residents and partners in three neighbourhoods to enable people to live independently, including new homes, a design guide to improve upgrading accessibility and adaptations of existing homes as well as future home design. It will include accessible outdoor spaces so that people can sustain physical activity, supported by digital connectivity and infrastructure that helps security and ethical data sharing. Sustainable energy and transport will aim to reduce community carbon footprint and reduce transport costs. Individual coaching and support will help people maintain their health and wellbeing. The long term goal is to improve peoples’ lives as they age and reducing costs of care provision. Key role of Personal data Mydex’s role will be to provide the data sharing infrastructure to enable individuals and service providers to safely and efficiently share the right data at the right times, in ways that protects individuals’ privacy and puts them in control of their data at all times and enable two way engagement and feedback throughout the project. 
Through every aspect of the project, all Personal data relating to each individual will be delivered to and accessed from the individual’s Personal data store. All parties collecting or using any Personal data will send it to the individual’s Personal data store via a secure API, and will have a data sharing agreement designed to achieve the highest standards of data protection, transparency and control for the citizen. Connecting to Blackwood’s CleverCogs digital system, participating residents will be able to organise their services, care and medical appointments, stay in touch with family and friends via video calls, and listen to music and entertainment. For customers living in Blackwood Home, the system can also be used to control everything from lighting and heat to opening doors and blinds. The three neighbourhoods chosen to take part are located in Dundee, Glasgow, and Moray. Other partner organisations, besides the lead Blackwood, are: - Canon Medical Research Europe - Carebuilder UK - CENSIS - Cisco International Ltd - Enterprise Rent-a-Car UK - Lewis & Hickey Architects - The DataLab - The University of Edinburgh",https://medium.com/mydex/revolutionising-healthy-ageing-200a7edd1016,,Post,,Meta,,,,,,,,2021-11-18,,,,,,,,,,,,,
|
||
MyDex,MyDex,,Medium,,,,,,,Can I trust you?,This is the second of two blogs on our new White Paper: [Achieving Transformation At Scale](https://MyDex.org/resources/papers/AchievingTransformationatScaleMydexCIC-2021-04-14.pdf). The [first blog](https://Medium.com/MyDex/our-new-white-paper-achieving-transformation-at-scale-f97320f8447e) focused on the infrastructure challenge. This blog focuses on the parallel need for institutional innovation.,"Can I trust you? This is the second of two blogs on our new White Paper: Achieving Transformation At Scale. The first blog focused on the infrastructure challenge. This blog focuses on the parallel need for institutional innovation. Sometimes, when a society confronts a new challenge, the institutions it previously relied on to provide solutions cannot cope. New, different institutions are needed. We think this is the case with Personal data. Traditionally, our society has looked to two main types of institution to unleash social and economic opportunities: private sector firms focused on maximising their profits and state-owned enterprises. But as this blog explains, these types of institution cannot rise to the particular challenges posed by Personal data. A different type of institution is needed, and thankfully we have one to hand: the Community Interest Company (CIC). Many people are still not familiar with CICs, which often come across as a rather strange hybrid beast. CICs are: - asset locked. This means any assets a CIC develops cannot be sold to another entity that is not also asset locked and equally committed to pursuing its community purpose. Mydex is not seeking a trade sale or ‘exit’: it is committed to continuing the operation and extension of its platform as permanent infrastructure to benefit its community (citizens). - dividend capped. Only 35% of distributable profits can be returned to shareholders. The remaining 65% must be reinvested in furthering the community benefits for which the CIC was established. 
Why has Mydex chosen this unfamiliar CIC status? Mission logic One simple explanation is that when Mydex was established back in 2007, its founders didn’t just want to sell a product in order to make money. They wanted to produce a service that brings benefits to people and communities and recognised they needed to make money in order to fund this service provision. Making money isn’t the purpose of the business. Benefiting the community by achieving its mission is the purpose, and making money is a means to achieving that goal. A second reason is that we recognised that Personal data throws up huge issues and challenges relating to trust. We reasoned as follows: If there is a lack of trust surrounding the collection and use of Personal data, its full Personal, social and economic potential will be hampered by mutual suspicion, power struggles and conflict and therefore never realised. A new type of institution that builds trustworthiness into what it does, and how, is needed for this challenge. How CIC status helps us rise to this challenge is not immediately obvious but the logic is powerful. It’s worth exploring. Economic logic The unique and particular challenges (and opportunities) of Personal data lie in the fact that unlike physical products and most services, data is a resource that can be used by many different people and organisations for many different purposes without ever getting ‘used up’. Because of this, it breaks the boundaries imposed by current notions of ‘private property’. Institutions organised around the notion of ownership of private property and profit maximisation are struggling to come to terms with a new world where value is created by data sharing. This takes us to the first unique challenge for Mydex: the question “What makes an enterprise economically viable and successful?” It’s commonly assumed that the acid test of an enterprise’s economic success is how much money it makes. 
But that relates to its financial success, not its economic success. If you stop to think about it, organisations’ economic results always lie outside their traditional legal and accounting boundaries — in the value their products or services generate in being used by other parties (while how much money they make is an internal measure). So, for example, the economic (and social) value of electricity isn’t measured by the money electricity suppliers happen to make. It lies in all the many uses our society uses electricity for. This is true of all enterprises. The job of a car or phone maker is to provide people with cars or phones — that provide them with mobility and help them communicate: things they want to do in their lives. The job of a hospital is to treat the sick; the job of an orchestra to delight audiences with its music; the job of a local shop to make accessing the necessaries of life easy. For most enterprises, this external economic impact is implicit. It’s not something they worry about too much, because they are focused on the internal measures that matter to them. But in the case of Mydex it needs to be made explicit, because the whole purpose of the organisation is the external value it helps to create: making data available to people (especially citizens) outside its traditional organisational boundaries, so that they can use this data for their own purposes. Adopting CIC status makes this external purpose explicit and focuses attention (of people both inside and outside the organisation) on this external purpose. Financial logic If it’s true that financial success is not the same as economic success, then how much money an organisation happens to make has got nothing to do with its external economic impact. If it’s a charity or public service, it could deliver huge economic benefits but not ‘make’ any money at all. If it’s a mafia syndicate, it could make huge amounts of money while its external economic impacts are 100% negative. 
But to survive, an organisation needs to cover its costs, and how it does so matters. If it’s a charity, it needs income from donations. If it’s a public service, it needs to be paid for by taxes. If it sells products or services, it needs customers willing to pay for them. If it needs external investment, it needs investors willing to invest. Each of these approaches to funding has its advantages and disadvantages. At Mydex, we chose not to be a charity for two reasons. First, because with a focus like ours on Personal data we feared that we would end up spending so much time and effort seeking benefactors and donations that this quest could end up diverting our attention away from actually delivering our mission. Second, this constant need to attract benefactors might place us under pressure to bend our mission to these benefactors’ whims. Likewise, we don’t think an organisation with a mission like ours, relating to Personal data, should rely directly on taxpayer funding for two reasons. First, there are immense risk and trust issues involved in the state having comprehensive citizen databases under its control. Second, taxpayer funded services often find themselves at the mercy of shifting political winds. That is why we chose to be a company that can cover its costs from what it sells: so that it isn’t dependent on external funders and remains free to make its own decisions. For us, this strategic independence is extremely important. But does this mean we simply have to bend to the will of a different set of external parties, namely customers (e.g. the organisations who pay Mydex fees to connect to its platform) and investors? It could, in theory. But we have designed our business model carefully to avoid this risk. 
We have designed our revenue streams so that we are financially incentivised to be a neutral enabler, not a ‘data monetiser’: we only sell connections to our platform to enable safe, easy, low cost data sharing; we don’t sell or rent any data itself. And the dividend cap for shareholders means the community always benefits from the lions’ share of any profits we happen to make. Getting this balance right is crucial because of the extraordinary potential of the Mydex business model: the more it grows the more profitable it gets. Exponentially. As a platform business, Mydex’s core operating costs do not rise rapidly with volume use. But the more organisations that connect to this platform, the more revenues the platform generates. In other words, as Mydex grows its revenue streams grow much faster than its costs — which means that Mydex has the potential to become extremely profitable. By creating a legally-enforceable dividend cap that requires it to reinvest two thirds of any profits it makes in its community mission, CIC status ensures that Mydex’s external, economically beneficial community purpose always remains paramount. (This has an important knock-on internal cultural impact. It means that in everything we do and how we do, we focus all our efforts on doing those things that will continually improve the external value we generate — our community contribution — on ‘making’ rather than ‘taking’. It creates a discipline and a yardstick for decision-making.) Strategic logic But this external value focus creates a potential problem. Why should investors bother investing in a company that only returns a third of the profits it makes to them when they could in theory get all the profits? The simple answer to this question is that a third of a very large sum is much bigger than all of a tiny sum. There is a paradox to this answer which goes to the heart of Mydex’s CIC status. It relates to the two separate elements: the ‘very large sum’ and the ‘tiny sum’. 
Let’s start with the very large sum. With its data sharing infrastructure Mydex is creating the equivalent of a mass production assembly line for the entire data economy. Henry Ford’s assembly lines reduced the costs of making motor cars by over 90%. They made a transformational product affordable to ordinary people, unleashed a tidal wave of product innovation and transformed the way societies and economies worked. Mydex is doing the same with Personal data: slashing the costs of accessing and using it, making its benefits available to ordinary people, unleashing innovation and transforming the way our society and economy uses our data in the process. The potential scale of this business is enormous and global. It could generate very large sums of money. What about the tiny sum? A year or two ago, the UK Treasury published a paper on the Economic Value of Data. It contained a crucial insight. It noted that “innovative uses of data can generate powerful positive externalities (i.e. benefits) that may not always accrue to the data creator or controller. This can result in data being under-exploited or under-shared.” In other words, the Treasury was accepting that private companies focused only on profit maximisation have no reason to pursue the transformational external economic benefits we have been talking about: they have no reason to build the data sharing infrastructure that Mydex is building. This means that without a new type of institution that prioritises the external benefits identified by Treasury they won’t ever happen. If we stick with existing institutions with their existing priorities, the very large sum becomes a tiny sum. Forever. Investors who insist on having access to all the profits will get close to none instead. The opportunity will be stillborn. This goes deep into Mydex’s role. The UK Treasury was highlighting a strategic collective action problem. 
It would greatly benefit all parties if investment was made into the infrastructure that makes the safe, efficient, privacy protecting sharing of Personal data possible, so that all its uses and potential can be realised. But it’s not in the immediate interests (or role) of any current data controllers (e.g. organisations collecting and using Personal data) to take on the cost, risk or responsibility of making this investment. Somebody else needs to take on this role. But this somebody has to be different. They cannot be focused on grabbing all the benefits for themselves. They have to be focused on creating external community value. And these priorities need to be baked into how they work. That’s what CIC status does: bake these purposes into the organisation’s legal structure. As a CIC Mydex is legally required to share financial rewards equitably. And, as a CIC, we are legally required to keep to the above promises in perpetuity (unlike venture capitalist funded outfits that are incentivised to make all the promises in the world and to break these promises once they have achieved a profitable ‘exit’). To put it simply, to break the Treasury’s collective action problem and to fully unleash Personal data’s ‘positive powerful externalities’ we need a new type of institution that every stakeholder can trust. Only a new, neutral, independent, non-threatening, non-competing, trustworthy body can break the collective action logjam and Mydex’s CIC status formally confirms and signals to everyone concerned that this is its positive, permanent, enabling role. The final piece of jigsaw There is one more piece of this jigsaw that finally locks all the others into place. There are two main types of Community Interest Company: companies limited by guarantee and companies limited by shares. Companies limited by guarantee are not-for-profit enterprises, while companies limited by shares can make a profit and distribute (some) dividends to shareholders. 
Mydex has chosen to be a company limited by shares. Why? To some purists, CICs limited by shares are not ‘real’ community interest companies. The purists’ assumption is that if any shareholder stands to make any money out of investing in the company, then by definition they are extracting value from it and therefore exploiting the community that the company is supposed to be serving. We don’t see it that way. Mydex is building nationwide infrastructure that will last for decades and building such infrastructure takes time (decades) and money. Which means it needs investment. Most financial investors today adopt the role of cherry pickers. They search around for ripe fruit ready for the picking and invest in the cherry picking operation: big, quick, low-risk returns. Mydex is doing the opposite. We are planting and tending a cherry orchard from seedling to maturity — without which there will be no cherries to pick (the ‘positive externalities’ the Treasury was talking about). But this requires a different type of investor: one who ‘gets’ the mission and is prepared to be patient (because reaching the point where those increasing returns kick in will take many years). Paying dividends (limited to one third of any profits made) to such investors is not exploitation of the community. It is paying for a service that makes a community benefit possible, just as paying staff to develop the software is not staff exploiting the community but making the community benefit possible. Once again, CIC status helps us to turn potential conflict into alignment to achieve a positive way forward. Locust or bee? Mydex’s mission — to empower every individual with their own data — is potentially global, long term and paradigm-changing: if successful it would put the modern data economy on a new and different footing. In nature, we find many strategies for survival. There are predators and prey, parasites and hosts, foragers, scavengers, hunters. 
Locusts flourish by devouring the value other organisms have produced. Bees flourish by helping plants flourish — to flower and fruit. Today’s Personal data economy is dominated by locusts, intent on extracting as much profit as they can from individuals’ data. In choosing to be a Community Interest Company, Mydex brings together economic, financial and strategic logic to make it possible for the company to flourish as a bee, and therefore for the orchards to flourish and fruit too. We flourish by helping others to flourish. That, we believe, is why in the long term we will be successful. And that is why we are a Community Interest Company.",https://medium.com/mydex/can-i-trust-you-6771a6ca0e35,,Whitepaper,,Meta,,,,,,,,2022-01-04,,,,,,,,,,,,,
|
||
MyDex,MyDex,,Medium,,,,,,,Our New White Paper: Achieving Transformation At Scale,"This [White paper](https://MyDex.org/resources/papers/AchievingTransformationatScaleMydexCIC-2021-04-14.pdf) explains the scale of the potential benefits, how they can be achieved, and how they can be achieved at scale.○ Mydex CIC is starting where the need is greatest and resistance is lowest - in public and third sector services needing to cooperate with each other to deliver efficient, effective services to individuals.<br>○ It is using Personal data store-enabled solutions to specific problems to demonstrate the superiority of the Personal data store approach - starting with reductions in friction, effort, risk and cost for both bona fide service providers and citizens.<br>","Our New White Paper: Achieving Transformation At Scale This is the first of two blogs on our new White Paper: Achieving Transformation At Scale. The first blog focuses on the infrastructure challenge; the next on the need for supporting, enabling institutions. It’s now widely understood that the way our society collects and uses Personal data generates wide scale invasions of privacy, endemic risks of data breaches and fraudulent access to data, plus eye-watering imbalances of power and reward which, together, have led to a pervasive erosion of trust. What’s less widely understood is that all these problems and issues are an inevitable byproduct of how the system itself is organised: its organisation-centric structure — the fact that the only entities really capable of collecting and using Personal data at scale are large organisations. No matter how well-meaning new rules, policies and regulations might be, as long as this structure remains, the problems will remain. New rules, policies and regulations may help to clear up the mess created by the systemic leaking bucket, but in themselves they can never fix the leak itself. For that we need structural reform. 
An infrastructure challenge Our society has faced similar challenges many times before. When the industrial revolution hit us, it dawned on people that if this new system was to work effectively, the entire population would need to be able to read and write. We needed an education system. The rapid growth of cities created immense sanitation and public health problems: everyone needed running water and sewers. When electricity came along, letting every Tom, Dick and Harry create their own generating stations using different frequencies and distribution methods clogged the system with nightmarishly high costs and complexity. We needed a national grid. In each case, it was recognised that for the good of the society, its economy and its citizens we need infrastructure that made things universally available. The same goes for Personal data. The toxicities created by our current systems can never be addressed, and the full potential of data can never be unleashed, unless every citizen is able to collect and use their own data for their own purposes, just as every citizen was previously empowered to read and write and access water and electricity for their own purposes. Doing so makes the whole system work better. The scale of the challenge To achieve such a change however, we need to get over some mountainous obstacles. At Mydex CIC, we believe we have found such a route through. Our new White Paper Achieving Transformation At Scale explains how. To make any real headway we first have to overcome some formidable mental roadblocks. The biggest of these is market myopia — the assumption that the only way forward is to ‘create markets for data’. This is nonsense on two counts. First, none of the economic benefits of what Mydex CIC does around Personal data come from trading data for money. They come from helping people and service providers strip out the huge amounts of unnecessary waste and cost they both experience when trying to handle data. 
Taking cost out of how the system works is not the same as creating a ‘market’ where data is traded or sold. Closely related to market myopia is accounting myopia: the belief that the best or only way to measure the economic contribution of an enterprise is to see how big its accounting profits are. This, too, is nonsense. The social and economic value of electricity is measured by all the uses we put electricity to, not the profits electricity suppliers happen to make. (Try imagining your life without electricity.) Only when we look past market and accounting myopia, to look at all the different ways Personal data can be used to improve peoples’ lives (and the economic operations that make this possible), can we see just how big the prize of new ways of collecting and using Personal data could be. Practically and operationally speaking, the biggest challenge is to create a system where new ways of collecting and using Personal data can operate at scale. Here, the Mydex roadmap is simple and straightforward. First, focus efforts on where the resistance is lowest and immediate benefits are highest: what we call ‘clusters’. These are situations where a number of different public and third sector organisations are all helping the same individuals in their own specialist ways, but where to create the best possible ‘joined-up’ outcomes, lots of data needs to be shared. Mydex’s Personal data store infrastructure makes it easy, cheap and safe to do this. Second, in a world where despite immense amounts of bravado and rhetoric anything truly innovative is instinctively seen as ‘very risky and therefore to be avoided’, seeing is believing. So we are using these clusters to demonstrate proof points. To show, without doubt, that what we are talking about actually works. Third, in serving these clusters, we have developed solutions that can be quickly and easily applied time and time again across multiple different situations — so that they can spread. 
Examples include being able to use previously checked and certified information to be shared safely and efficiently, or to use improved data sharing to help multiple different service providers coordinate and integrate activities. These needs arise in countless different situations across all industries and many aspects of individuals’ lives and can now be met quickly, easily and at very low cost. So what we have created is something that proves in practice that it’s possible to strip out enormous amounts of friction, effort, risk and cost while opening up new opportunities; that has near universal applicability; and can be easily adopted by multiple different users. Conclusion It’s not enough to identify what’s wrong with how things work today. What’s needed is practical ways forward that deliver real benefits now; that once created can be reapplied and spread so that they operate at scale — thereby changing how the system itself works. It took us over a decade to find a path through the mountain range of obstacles. Our new White Paper maps this pathway in more detail. But the right sort of infrastructure alone is not enough. For this infrastructure to work and spread as it should, it needs the right institutions to support it: institutions that focus on broader social and community benefit and not just corporate benefit, and that are capable of building trust between multiple different stakeholders. That’s why Mydex has chosen to be a Community Interest Company. That’s the subject of our next blog.",https://medium.com/mydex/our-new-white-paper-achieving-transformation-at-scale-f97320f8447e,,Whitepaper,,Meta,,,,,,,,2021-05-14,,,,,,,,,,,,,
|
||
MyDex,MyDex,,Medium,,,,,,,Helping Data Trusts Manage Personal Data,"Mydex CIC has just published a blog for Cambridge University’s Data Trust Initiative on ‘Helping Data Trusts Manage Personal Data’. In it, we address the challenges that arise as the Data Trust movement begins to scale.","Helping Data Trusts Manage Personal Data Mydex CIC has just published a blog for Cambridge University’s Data Trust Initiative on ‘Helping Data Trusts Manage Personal Data’. In it, we address the challenges that arise as the Data Trust movement begins to scale. In a world where many different Data Trusts want to access individuals’ data for a range of different purposes and services, two questions arise: - How can many different Data Trusts collect/access the data they need from the same individuals without creating far-reaching duplication of cost and effort? - How can individuals keep track of, and assert control over, the data they are sharing with many different Data Trusts? One answer, we suggest, is to use individuals’ Personal data stores as infrastructure for Data Trusts. Individuals can use their PDSs to feed their data to the Trusts they want to support and to exercise appropriate controls over this data. The blog goes into more detail as to how this can work.",https://medium.com/mydex/helping-data-trusts-manage-personal-data-4215faaee5f2,,Post,,Product,,,,,,,,2022-05-03,,,,,,,,,,,,,
|
||
SecureKey,Avast,SecureKey,,Greg Wolfond,"DHS, DIF","Canada, Ontario, Toronto",Canada,,,SecureKey Technologies,"SecureKey is a leading identity and authentication provider that simplifies consumer access to online services and applications. SecureKey’s next generation privacy-enhancing services enable consumers to conveniently and privately assert identity information using trusted providers, such as banks, telcos and governments, helping them connect to critical online services with a digital credential they already have and trust, while ensuring that information is only ever shared with explicit user consent. SecureKey is a champion of the ecosystem approach to identity, revolutionizing the way consumers and organizations approach identity and attribute sharing in the digital age.",,https://securekey.com/,,Company,,Company,Enterprise,ID,"SSI, Supply Chain",,,,DID,2008,https://twitter.com/SecureKey,https://www.youtube.com/user/SecureKeyTech,,https://www.crunchbase.com/organization/securekey-technologies,https://www.linkedin.com/company/securekey/,,,,,,,,
|
||
Spherity,,Spherity,,Carsten Stoecker,Sovrin Steward,"European Union, Germany, Berlin, Berlin",Europe,,,Spherity,"Spherity is building decentralized identity management solutions to power the 4th industrial revolution, bringing secure identities (“Digital Twins”) to machines, algorithms, and other non-human entities.<br><br>Spherity’s Digital Twins enable innovative customer journeys across mobility, supply chain transparency, risk assessment, audit trails for data analytics, and many more use cases.<br><br>Our developers and systems designers combine years of deep research in the emerging decentralized identity space with a wide range of cross-industry experience. They have built and refined complex, bespoke information systems for supply chain management, data-rich manufacturing, and next-generation data control.<br><br>We participate in key standards processes and community conferences to ensure full compliance and interoperability in the complex technological landscapes of decentralization and self-sovereign identity","Credentialing the world for a new internet age with digital trust Enable digital trust in your ecosystems by implementing decentral identities and verifiable credentials. Leverage the trust to streamline your business processes. Start now and use our solutions to easily integrate with your existing IT landscape. OUR ECOSYSTEM AND PARTNERS Products The Spherity Product Suite Two products. Same mission. CARO Credentialing Service for US DSCSA compliance. Spherity’s compact app to authenticate direct and indirect pharmaceutical Authorized Trading Partners in real-time.Learn more Digital Product Passport Boost your compliance with regulatory requirements introduced by the New EU Battery Regulation with Spherity’s Digital Product Passport.Learn more Services Supporting you in Strengthening Trust through Digital Identity. Set-up your trust-ecosystem in your specific industry.Learn more Stay sphered, join our newsletter! 
Receive product updates and the latest tech trends across industries. We care about the protection of your data. Read our Privacy Policy. Resources Read and watch in-depth articles on case studies, solutions, technical implementations, and more! How issuers can manage credential revocation? Spherity has developed an Ethereum-based credential revocation mechanism for use in the US pharmaceutical supply chain. In brief, a credential issuer examines real-world evidence, such as a trading license,... COP27: Digital Trust Technology Supports International Climate Action The Government of British Columbia (B.C.) and Spherity, both members of the Global Battery Alliance (GBA), are cooperating to facilitate the secure exchange of sustainability information using digital trust technology. Product Passport Pioneers - #6 with Mario Malzacher, Circular Fashion In this episode, we speak to Mario Malzacher, CO-Founder of CircularFashion. Mario is driving the circular economy in the textile industry. He heads and participates in research projects of the BMWK...",https://spherity.com,,Company,,Company,Enterprise,,"ID,AI,IOT",,,,"ISO 27001,DID,Verifiable Credentials",2017,,https://twitter.com/spherityproject,https://www.youtube.com/channel/UCJd30vQ46EYCq0KFysJtRMg,https://medium.com/@spherityy,https://medium.com/@spherityy,,https://www.crunchbase.com/organization/spherity,https://de.linkedin.com/company/spherity; ,,,,,
|
||
Spherity,Spherity,,Medium,,EBSI; EIDAS; W3C,,,European Data Infrastructure,,"Spherity connects the dots between SSI, AI, and European Data Infrastructure","Juan Caballero attended the stakeholder meeting for the European Blockchain Services Infrastructure project in Brussels, where architects and legal counsel presented their requirements and reports for the next round of development in partnership with industry leaders and contractors. [...] The most interesting development [...] the report from Nacho Alamilla, a key legal advisor for EBSI, on the functional limits of the current eIDAS (cross-border electronic signature mutual recognition) system in Europe and possible revisions or refinements of it being discussed in the EU.[...]<br>[Carsten Stöcker](https://Medium.com/u/2d7ca4c61292) and [Ricky Thiermann](https://Medium.com/u/16518b469d1e) were in Bonn attending the High-Tech Partnering Conference [#HTPC20](https://www.htpc-htgf.de/en-gb/home) organized by our lead investor [High-Tech Gründerfonds](https://high-tech-gruenderfonds.de/en/the-decentralized-identity-and-digital-twin-pioneer-spherity-receives-seed-financing-from-htgf/) (HTGF). Carsten had a keynote about “How to unlock the untapped business potential of IOT devices with digital identity”. Further, we were able to exchange with the other start-ups of High-Tech Gründerfonds’ portfolio and to establish relations to HTGF’s industry and corporate partners.<br>[...]<br>At the end of January, [Juan Caballero](https://Medium.com/u/7da78f634e80) and [Carsten Stöcker](https://Medium.com/u/2d7ca4c61292) were in Amsterdam, attending the specification-writing face-to-face meeting of the Worldwide Web Consortium’s Decentralized Identifier Working Group (W3C DID-WG). [...] 
The main event at this meeting was the renegotiation of the limits and interoperability of [DID Documents](https://Medium.com/spherity/ssi101-what-exactly-gets-written-to-a-blockchain-69ef1a88fa3c), which has become a sticking point in recent months due to the complexity of ongoing development based on different encodings (traditional JSON, JSON-LinkedData, CBOR, and even ASN.1 and PDF).<br>[...]<br>On 31st January [Marius Goebel](https://Medium.com/u/3a23dedbeb33) attended the steering committee of the “Standardization Roadmap Artificial Intelligence” for the German Federal Ministry of Economics and Energy ([BMWi](https://www.bmwi.de/Navigation/EN/Home/home.html)) hosted by [DIN](https://www.din.de/en) [German Institute for Standardization] and [DKE](https://www.dke.de/en) [German Commission for Electrical, Electronic & Information Technologies].<br>[...]<br>[Spherity](http://www.spherity.com/) is contributing to the working groups around the fields of “IT security in artificial intelligence (AI) systems” and “Certification and quality of AI systems” delivering its expertise in the fields of digital identities, in particular auditability, authenticity, traceability and identifiability of data and artificial intelligences (AIs).","Spherity connects the dots between SSI, AI, and European Data Infrastructure Recap of the first month of the new year Spherity started the year off with a busy travel itinerary, participating in standards work and startup communities. We met with the stakeholders of the European Blockchain Services Infrastructure, shared the business potential of the Internet of Things, made headway on the industry-wide groundwork for more robustly interoperable Decentralized Identifiers, and pushed forward the Identity capabilities of Germany’s Artificial Intelligence standards body. 
European Blockchain Services Infrastructure, Brussels Juan Caballero attended the stakeholder meeting for the European Blockchain Services Infrastructure project in Brussels, where architects and legal counsel presented their requirements and reports for the next round of development in partnership with industry leaders and contractors. We have built relationships with the key architects of the new system, and will be following closely the tenders and calls for industry input of this epochal project for European integration. The most interesting development, perhaps, was not the EBSI project itself (or its self-sovereign research and development framework, eSSIF), but the report from Nacho Alamilla, a key legal advisor for EBSI, on the functional limits of the current eIDAS (cross-border electronic signature mutual recognition) system in Europe and possible revisions or refinements of it being discussed in the EU. High-Tech Partnering Conference, Bonn Carsten Stöcker and Ricky Thiermann were in Bonn attending the High-Tech Partnering Conference #HTPC20 organized by our lead investor High-Tech Gründerfonds (HTGF). Carsten gave a keynote about “How to unlock the untapped business potential of IOT devices with digital identity”. Furthermore, we were able to exchange ideas with the other start-ups of High-Tech Gründerfonds’ portfolio and to establish relations with HTGF’s industry and corporate partners. Worldwide Web Consortium’s (W3C), Amsterdam At the end of January, Juan Caballero and Carsten Stöcker were in Amsterdam, attending the specification-writing face-to-face meeting of the Worldwide Web Consortium’s Decentralized Identifier Working Group (W3C DID-WG). As observers consulting on specific use-cases for the supplemental use-case document, Spherity met with stakeholders and designers hammering out a rescoping and process change for the specifications underlying interoperable “DIDs”. 
We will be representing our clients’ IoT requirements and use cases in the ongoing industry inputs to the standards process. The main event at this meeting was the renegotiation of the limits and interoperability of DID Documents, which has become a sticking point in recent months due to the complexity of ongoing development based on different encodings (traditional JSON, JSON-LinkedData, CBOR, and even ASN.1 and PDF). To make the security and the translation of DID-Documents more manageable across these divergent encodings, the working group decided to define a finite list of valid DID Document properties and contents, establishing a threshold for method interoperability and a standard registry for maintenance of the standard. More complex extensibility mechanisms, which might be difficult for all other standards-compliant methods to support fully, have been relegated to a separate layer linked via the @Context to allow simpler systems to remain fully-compliant. Other extension mechanisms around “metadata”, matrix parameters (which work like query strings in URIs), and the incorporation of a broader base of use-cases were also discussed. For a more detailed guide to the online documentation, see Juan Caballero’s report here. Standardization Roadmap Artificial Intelligence, Berlin On 31st January Marius Goebel attended the steering committee of the “Standardization Roadmap Artificial Intelligence” for the German Federal Ministry of Economics and Energy (BMWi) hosted by DIN [German Institute for Standardization] and DKE [German Commission for Electrical, Electronic & Information Technologies]. Experts from business and society are working jointly to develop a roadmap for norms and standards in the field of artificial intelligence. The aim is to develop a framework for standardization at an early stage. 
The standardization roadmap will include an overview of existing norms and standards on AI aspects and, in particular, make recommendations with regard to future activities that are still necessary. It will be developed by the respective stakeholders from industry, science, public authorities and society. The roadmap will make a significant contribution to introducing and enforcing the national position at European and international level. Spherity is contributing to the working groups around the fields of “IT security in artificial intelligence (AI) systems” and “Certification and quality of AI systems” delivering its expertise in the fields of digital identities, in particular auditability, authenticity, traceability and identifiability of data and artificial intelligences (AIs). “For us as a society as well as for us as Spherity, it is immensely important, in my view, that we necessarily deal with the topic of standardization of artificial intelligence — not least because this technology has the potential to completely transform our concept of what it means to be human.”, Marius Goebel says. With its activity within the steering committee of the “Standardization Roadmap Artificial Intelligence” Spherity is contributing to the AI strategy of the German Federal Government. The aim of the steering committee is to hand over a first draft by November 2020.",https://medium.com/spherity/spherity-connects-the-dots-between-ssi-ai-and-european-data-infrastructure-1f626e77ba7,,Post,,Ecosystem,Public,,,,,,,2020-02-06,,,,,,,,,,,,,
|
||
Spherity,Spherity,,Medium,,Legisym,,,,,Spherity is Partnering with Legisym Offering Joint Compliance Product for the U.S. Life Sciences Market,"“Legisym is thrilled to be working alongside Spherity to bring the first production-level ATP Credentialing solution to the industry,” said Legisym President & Co-Owner David Kessler. “With the successful completion of the ATP Credentialing Pilot in 2020 and the joint founding of the Open Credentialing Initiative in early 2021, the Spherity-Legisym partnership is already proving successful in collaboration and forward thinking.”","Spherity is Partnering with Legisym Offering Joint Compliance Product for the U.S. Life Sciences Market. Legisym, LLC is a trusted expert in the U.S. Life Sciences Market, providing services to pharmaceutical companies around the world since 2009. Legisym and Spherity have worked closely together to bring to maturity a joint offering that meets the security requirements of the U.S. Life Sciences Market. As part of the joint development, both companies have collaborated with SAP and Novartis, which have already subjected the product to extensive quality testing and functional validation. Spherity and Legisym are pleased to officially announce their partnership as of today. In November 2013, the U.S. Congress enacted the Drug Supply Chain Security Act (DSCSA) in order to protect patients’ health. To ensure that only legitimate actors are part of the supply chain, the regulation requires U.S. pharmaceutical trading partners to ensure that they only interact with other trading partners that are authorized. A trading partner is authorized if it holds a valid state-issued license or a current registration with the Food and Drug Administration (FDA). Today in 2022, U.S. pharmaceutical supply chain actors have no interoperable, electronic mechanism to validate each other’s authorized status. With more than 60,000 interacting trading partners involved in the U.S. 
Life Sciences Industry and an FDA recommendation to respond to data requests in under one minute, a solution that provides compliance with the regulations by 2023 is in high demand. Legisym and Spherity have decided to cooperate and offer an interoperable, highly secure service to enable pharmaceutical supply chain actors to become an Authorized Trading Partner (ATP) according to the U.S. DSCSA. Legisym, as a trusted identity and license verification service provider, perfectly complements Spherity’s digital wallet technology for managing verifiable credentials. The verifiable credential technology is used to represent the authorized status of interacting trading partners in a highly efficient, secure and DSCSA-compliant way. To use credentialing for Authorized Trading Partner (ATP) requirements under DSCSA, trading partners need to go through a one-time due diligence onboarding process with Legisym. Once the verifiable credentials are issued, they are stored in a secure digital wallet which comes embedded with the Credentialing Service provided by Spherity. Using this technology enables U.S. pharmaceutical supply chain actors to interact with digital trust, as they now can digitally verify their ATP status in every interaction. Georg Jürgens, Manager Industry Solutions at Spherity, says, “Together with our partner Legisym we focused on making the adoption of credentialing for trading partners as simple as possible. Manufacturers, wholesalers and dispensers can all acquire a digital wallet and ATP credentials within minutes without integration effort and use this innovative solution for DSCSA-regulated interactions.” “Legisym is thrilled to be working alongside Spherity to bring the first production-level ATP Credentialing solution to the industry,” said Legisym President & Co-Owner David Kessler. 
“With the successful completion of the ATP Credentialing Pilot in 2020 and the joint founding of the Open Credentialing Initiative in early 2021, the Spherity-Legisym partnership is already proving successful in collaboration and forward thinking.” Legisym and Spherity founded along with other adopters, the Open Credentialing Initiative (OCI). This newly formed organization incubates and standardizes the architecture using Digital Wallets and Verifiable Credentials for DSCSA compliance for Authorized Trading Partner requirements. Besides U.S pharmaceutical manufacturers, wholesalers, and dispensers, the OCI is open for solution providers integrating the ATP solution. For press relations, contact communication@spherity.com. Stay sphered by joining Spherity’s Newsletter list and following us on LinkedIn. About Legisym, LLC For over a decade, Legisym, LLC has successfully provided the pharmaceutical industry with affordable and effective regulatory compliance technologies. In early 2020, driven by the 2023 authorized trading partner (ATP) requirements, Legisym began leveraging their existing Controlled Substance Ordering System (CSOS) and license verification technologies and experience, to engage as a credential issuer. By performing thorough credential issuer due diligence processes, first to establish a root of trust, Legisym promotes confidence in the trading partner’s digital identity prior to the issuance of all ATP credentials. About Spherity Spherity is a German software provider bringing secure and decentralized identity management solutions to enterprises, machines, products, data and even algorithms. Spherity provides the enabling technology to digitalize and automate compliance processes in highly regulated technical sectors. Spherity’s products empower cyber security, efficiency and data interoperability among digital value chains. 
Spherity is certified according to the information security standard ISO 27001.",https://medium.com/spherity/spherity-is-partnering-with-legisym-offering-joint-compliance-product-for-the-u-s-cbf9fd5a217,,Post,,Ecosystem,Public,,,,,,,2022-01-20,,,,,,,,,,,,,
|
||
Spherity,Spherity,,Medium,,,,,,,#SSI101: An Introductory Course on Self-Sovereign Identity,"Outside of a few philosophers, social scientists, and a tiny minority of specialized technologists, however, most people feel uncomfortable making any definitive or authoritative statements about identity.","#SSI101: An Introductory Course on Self-Sovereign Identity The Spherity Way Most of the time when someone first hears about “self-sovereign identity,” “decentralized identity,” or “blockchain identity,” they naturally assume the terms refer to some esoteric topic far enough away from their domain of experience and expertise that they can safely leave it to the experts. “Identity,” after all, is an important, hotly debated, and nearly undefinable core concept of human life. Outside of a few philosophers, social scientists, and a tiny minority of specialized technologists, however, most people feel uncomfortable making any definitive or authoritative statements about identity. Who would volunteer to express opinions about something that can so easily offend, and which we rarely think about when it is working well for us? As for the adjectives “self-sovereign,” “decentralized,” and “blockchain,” these are no less controversial, no less stable, and no less likely to intimidate, to offend, or to confuse. I do not believe, however, that most people can safely leave it to the experts, even though I am one of those experts. On the contrary, I believe “SSI” is worth learning about, worth getting familiar with, and worth getting excited about. For this reason, I have tried to outline a quick tour through the basic “building blocks” needed to understand what SSI is, how SSI is different from other “regimes” or systems of organizing identity, and what Spherity does with SSI. 
Half as a fun way to structure these essays, and half out of habit, I will refer to this series of writing as a “curriculum,” and I will use North-American-style course numbers of the kind that were standard in my former life as a college professor. Here, then, is an overview of the topics that will be covered in the coming weeks in our “SSI 101” series: - Identities & Identifiers - An Overview of Non-Human Identities - Self-Sovereignty and Autonomy - Attest, Identify, Authenticate, and Verify - What Exactly Gets Written on a Blockchain? - Verifiable Credentials & Data Portability - Encryption & Correlation - How Open Standards Get Made To facilitate your and our sharing of these links and cross-linking back to them from other writings, I will structure each “glossary entry” listed above as a distinct Medium post with a permanent URL. Full disclosure, they might get more detailed (or illustrated) at some point in the future. They can be read in any order, although they are easiest understood by the true beginner in the linear sequence conveyed in the “previous”/“next” links at the top of each entry. For the reader who is already comfortable with the 101 topics, members from across the Spherity team will be collaboratively writing articles for the rest of 2019 that walk you through the specific data needs of various industries we have studied closely with partners and clients, and even in our past lives pre-Spherity. These longer articles comprise the “special topics” in the 200-level sequence of our SSI curriculum. Having read two or three of those, hopefully at least one of the 300-level will be of interest and accessible: there, we will cover speculative economics, machine futures, the data needs of an increasingly circular economy, data marketplaces, and other “advanced topics” in SSI. At this level, things get a little academic and we stand on the shoulders of many giants, mostly deep thinkers within the SSI community or the software sector generally. 
So let’s start at the beginning, then, with identities and identifiers, the smallest indivisible unit of SSI.",https://medium.com/spherity/ssi101-an-introductory-course-on-self-sovereign-identity-the-spherity-way-19e0d1de3603,,Post,,Explainer,,,,,,,,2020-09-07,,,,,,,,,,,,,
|
||
Spherity,KuppingerCole,,,,,,,,,Dr. Carsten Stöcker - Decentralizing Provenance in an Industry 4.0 World,"In this episode, Raj Hegde sits down with Dr. Carsten Stöcker, Founder & CEO of Spherity to understand how #decentralized identity is transforming the end-to-end supply chain lifecycle.","Decentralizing Provenance in an Industry 4.0 World | Frontier Talk #3 - Dr. Carsten Stöcker In this episode, Raj Hegde sits down with Dr. Carsten Stöcker, Founder & CEO of Spherity to understand how #decentralized identity is transforming the end-to-end supply chain lifecycle. Tune in to this episode to explore the increasingly important role of provenance in helping build a better world and learn about the intersection of exciting concepts such as non-fungible tokens (NFTs) and decentralized identifiers (DIDs). He pursued a PhD in physics from RWTH Aachen to understand how the world works and is leveraging the power of technology for the greater good of society. He's a highly respected figure in the blockchain space and acts as an advisor to the World Economic Forum as part of its Global Future Council. He is here to share his take on how provenance as a fundamental technology can act as a force multiplier to bring about positive change in society: Dr. Carsten Stöcker, the founder and CEO of Spherity. Yeah. Hi, thanks for having me on your Frontier Talk and I'm glad to be here today. Welcome to the podcast. Speaking to a physicist has, to be honest, always been up there on my bucket list. So I'm glad that I could finally scratch it off my list. Um, so let's get started. Um, you've had an interesting career to date, spanning across research, consulting, a stint at the WEF and now entrepreneurship. So I'm curious to know, how were you introduced to blockchain technology? How I got introduced to blockchain technology was basically a big coincidence. I worked for the utility RWE at this time, and later it was kind of spun off into innogy. 
I worked at the innovation hub, and then there was a Dutch board member, and the Dutch board member basically wanted to invent Uber for energy. So at this time, everyone wanted to invent an Uber for something they had, to come up with a new digital proposition. And basically the Dutch board member asked one of his friends, a freelancer in his network. And the freelancer said, yes, I can start working on this and invent Uber for energy. And what the freelancer did was that he basically wrote a LinkedIn message to his network and asked his network: hey, anyone in my network who can help me to invent Uber for energy? And then the very early Ethereum developer ecosystem responded, because at this time there was the Go Ethereum team in Amsterdam, and then I think the C++ Ethereum team in Berlin, and because it was the Netherlands, the people from the Netherlands ecosystem said: hey, yes, we have this fancy new technology, Ethereum, with all the smart contracts. Why shouldn't we try to invent not Uber for energy, but Uber for energy without Uber in between, i.e. a disintermediation of Uber? And that's how I got in touch with the Ethereum ecosystem, and then we developed, as early as 2015, a peer-to-peer energy trading prototype based on smart contracts on Ethereum. So households that produce renewable energy could do a direct peer-to-peer energy transaction with other households that would like to consume energy, without a utility in between. And that is how I got in touch with blockchain technology. Okay. Right. Brilliant. That's so cool. 
Um, there are so many interesting applications of blockchain today, be it, um, decentralized finance or defy, or for computing, I'm curious to know what, uh, got hooked onto decentralized identity and more specifically, why did you choose to specifically tackle the challenge of automating identity verification in end-to-end value chains? It's basically, um, the S mentioned, we did some the Saudis in 2015, some deceptive digitization business model designs and all of them. That was the problem identity. And I think when you do digitization, I think every digitalization should start with kind of proper identity solution. And, um, that's, that's also another thing from Nobel prize winner, you basically said, okay, if we'd like to solve the identity problem, then we need to solve it end to end. So, which means, so then, um, let's say as a company, right, and I got a company identifier, there must be someone who says, okay, the companies, the companies that I can start testing the company, and this could be, for example, in Germany would be And that's basically what we call an ends to enter the entire supply chain. That's still not solved in the internet and on the other. So we truly believe today's internet is, uh, internet with data and Um, so what are the industries that you primarily target and why is there a core need for your technology specifically? Yeah. Now I would like to, uh, to, to mention two things as a continuum, from my perspective, when you think about identity decent last sentence, the internet, on the one hand side, we have the cypherpunk manifesto. So there's, everything's encrypted and fully autonomous and anonymous transaction systems for humans. It's on one side of the continuum, a lot of privacy to protect my data and to make sure it's kind of self-sovereign data and no data will be leaking about myself. 
And on the other side of the continuum, it's basically, um, yeah, I would say surveillance of an object where I would like to have the full back to burst testability back to birth lifecycle history of an object. So are completely two different, um, the poets of the continuum. So then we had 30 would like, of course, to address both of the poles, but I think the human poll is much more difficult. GDPR does sends inertia to kind of, to convince people, to use specific technologies, specific wandered, very, very tough questions. And we think that the other side of the continuum where I would live to have this full traceability of an object for compliance reasons, for what reasons, because I would like to protect patient heads. I would like to provide, let's say an auditor for circular economy and for the school to kind of, to protect the safety of food, because I have the provenance, I know where it's coming from, and this is the support. So where we address our technology, because we think from a doctrine perspective, it's kind of more realistic to push it short term into production. And that's what we actually do in the pharma pharma supply chain area. And that's the reason why I put more focus on enterprise and specifically object identity. And when we do an API is legacy systems, manufacturing, execution systems, global AtWork databases, ERP systems, then API integration is kind of reasonable to be done, but then you can significantly scale technology because then the scale is number of objects as the number of files to good product that's being produced by, by the company, right? Um, some pertinent points that you raised. I think it's a great time to now, um, segue into the role of enterprise identity in supply chains today. Um, supply chains, as you might know, are often seen as this complex value chain of sorts, comprising of a wide range of parties. 
Uh, there's no clear understanding as to who actually is part of this entire value chain, you know, be it vendors, wholesalers, regulators; there are a whole bunch of parties, so to say. So could you perhaps start off this discussion by highlighting: what are the typical identities in a traditional end-to-end lifecycle in a supply chain? I think it's maybe pretty difficult, because we all need to go to GS1, the global standardization body for identifiers, which has identifiers like the Global Trade Item Number (GTIN). And then they have another identifier called the SGTIN. So a GTIN is basically an identifier for a specific product, but the serialized GTIN (SGTIN) is basically breaking it down to batch level and even to serial number level; that's one part of the identifiers. Another part, for example, is the so-called GLN, Global Location Number, or Party GLN (PGLN), which represent a legal entity or company. And these are the kinds of identifiers that are in today's supply chain, but we cannot verify anything. So we have no instruments or tools to find out whether something is really coming from a given company. Right. So you're pretty much forced to believe, um, what's in front of you essentially. And is that the fundamental issue with supply chains today? Um, is there a missing trust layer? Yeah, I think today there are a couple of tools in place; one is called EPCIS, where you have specific messages as a tier-two supplier, a tier-one supplier, the customer, and the messages are being created and exchanged among each other, but still there are no tools to verify authenticity. At first, it starts with the authenticity of the product. So how can I prove the authenticity of the product? From our perspective, there are two different, let's say, key technologies in place. So one is to put an identifier on the product, where the identifier is a randomized serial number. 
So, which means if I'm a malicious actor, for example, the pharma industry and I'm producing fake pharmaceutical products. Yeah. Then it's almost impossible for me to guess a valid serial number because the manufacturer use this randomized serial numbers for, to get packages, which means I, as a verifier, let's say I have a patient, or I have the policy. If I scan the identifier, I can send the request to the manufacturer and send the manufacturers, basically looking at So all these players are also trying to go into supply chain for supply chain integrity for product authenticity use cases. And that's still a big problem, but it doesn't end with product authenticity. It's also about very simple things such as an ear leaflet and electronic leaflets. Yeah, because if I'm, as a hacker can put a fake leaflet, electronic or digital leaflet can attach it to foster good product, then the patient might get the wrong instructions, how to consume the pharmaceutical product is this could have significant impact on patient health. But also if we think about machines, if I can sneak in a fake, a leaflet for machine, and then people that are maintaining on starting the machines, they basically, this can be kind of a very bad impact on the health and safety, because if there's a high voltage power line and it doesn't explain how to disable the high, high voltage power line, and that touched the high voltage power line, then it can be very impactful. So it's about authenticity. It's about leaflets. It's about product recalls because how do I know as a manufacturer who is, uh, owning the product who's using it? Yeah, because I have a product recall, I must need to contact the end consumer. So that that's another as I use case product recall, but also back to birth lifecycle for circular goods. For example, what plastics ingredients are in the casing is this bio soft plastics, is this biodegradable plastics? How do I recycle it is very important information. 
And if I, if, if the antisystem is not secure, so then I'm kind of screwing up my, my circular economy, um, or plastic recycling systems and all of this, uh, very, very important use cases up to customs. So we also have a project with the U S the power performance, security, customs, and border protections. So we even thinks this is technology civilization due to training is also last line of defense for circular economy, because let's, let's assume we are here in Germany and getting products from Asia. And the only in the EU, we only would like to let let in products specifically circular, renewable, sustainable history. And, um, let's say basic usable energy was used to produce them. Bio source plastics was first used kind of to produce them. So if I would like to do this, then the customs transition must have the tools and the hand to check the serial number, to go to digital twin, to go back to the auditor and find ours is proper circular object, fixed circular object, and to fake the clock objects that we locked out from, for our perspective, it's the enabling technology for the circular economy and all these kinds of let's say features are very fascinating. And we're still only talking about objects, identity objects ends up plenty, plenty of excellent and comprehensive from use case. Right. Um, you raise an interesting point there about the circle economy. Um, I'm curious to know, um, what is the role of identity in a circular future, according to you? Um, so fast, it's all about provenance and provenance starts with the company will provide components. So who basically is a company that manufactured components that are then being assembled to produce a product, for example. So I need to know the identity of the companies that are producing the components, and this is where it all stops are can I trust it? The company is a fake company. Is this company who would use the components? 
Is it coming from, uh, from a country, was export compliance issues as coming from a pop-up proper country. Do they do the work in a Crohn's to environment, health and safety standards? Do they do the work and the Crohn's to labor rights, anti-bribery tried labor, and then I can check this. I can stop trusting the company, and then I can start testing the origin of the data of a product. And then I think as I think maybe for, for the audience, for the, particularly as that component of being sample to product, in terms of transformed and circular object, as an end customer, I would like to have the full, the full back to burst traceability, because if some of the players is kind of cheating, then I cannot trust the end product. And that's still a very, very big problem when I, for digital twin in the end product to how can I trust the data so that describing the end product and how kind of tasks or the supply chain actors that's part, part of the circular economy just sounds like a very big complex problem. And I think the, uh, critical success factor is to start very, very simple. It's very simple use cases that are doable and not to try to solve the entire circular economy problem from the start, but to start with very simple. Okay. Um, and to add to that, um, in our first episode, on this podcast with, um, Dr. Harry barons, he mentioned that in a B2B setting, um, the trust authority almost always goes to a route, particularly in regulated industries. So how do you ensure that the credentialing authority is who they actually claim to be? Yeah, I think this is a very important context. Concept is basically similar. What we know from public key infrastructure. We have past hierarchies in the five or nine words and in the internet is a whoop whoop certificate authorities. 
And what these decentralized identity solutions and verifiable credentials are basically about is exactly the same concept as X.509; it's just more standardized, more extensible, more flexible, and it allows more or different so-called attestations. But in a given context, I need to know the authorities that are issuing certificates, and on the internet today we have a couple of tools for that. One is so-called well-known, because I always need to be able to verify the identity of the root authority, and a root authority could be a government, or it could be the Global Legal Entity Identifier Foundation, which is establishing a global governance and infrastructure for verifying enterprise identity. Well-known basically says: there is an identifier for a company, and there are public keys representing the company, or being used by the authority to sign certificates, for example enterprise certification certificates. Then I need to check: does this identifier really belong to the authority? For example, does the identifier belong to the German government, and do the published public keys belong to that identifier? If I can check this, then I can establish a trust hierarchy, and when I get a verifiable credential or signed data payload, I can cryptographically verify it. I can also check that the keys belong to the entity, that the entity was verified, for example, by the German government, and that the keys of the verifiable credential really belong to that hierarchy: I have to walk the entire trust chain. That is, by the way, still challenging, but these trust hierarchies must be solved, and that's where a lot of momentum is being established right now to really get somewhere. Brilliant.
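The trust-chain walk described here, first checking that an issuer's key is endorsed by a known root authority and then cryptographically verifying the credential that issuer signed, can be sketched as follows. This is a minimal toy illustration in Python: HMAC stands in for real X.509 or DID signatures, and all keys, identifiers and claims are hypothetical, not Spherity's actual implementation.

```python
import hmac, hashlib, json

def sign(key: bytes, payload: dict) -> str:
    # Toy stand-in for a real digital signature (e.g. Ed25519 or X.509/RSA).
    msg = json.dumps(payload, sort_keys=True).encode()
    return hmac.new(key, msg, hashlib.sha256).hexdigest()

def verify(key: bytes, payload: dict, sig: str) -> bool:
    return hmac.compare_digest(sign(key, payload), sig)

# Hypothetical root authority (e.g. a government registry) endorses an issuer.
root_key = b'root-authority-secret'
issuer_key = b'issuer-secret'

endorsement = {'issuer': 'did:example:acme', 'issuer_key': issuer_key.hex()}
endorsement_sig = sign(root_key, endorsement)

# The endorsed issuer signs a verifiable credential about a company.
credential = {'subject': 'did:example:supplier-42',
              'claim': 'authorized trading partner'}
credential_sig = sign(issuer_key, credential)

def verify_chain() -> bool:
    # Walk the chain: root endorses issuer, issuer signed the credential.
    if not verify(root_key, endorsement, endorsement_sig):
        return False  # issuer is not endorsed by the root authority
    key = bytes.fromhex(endorsement['issuer_key'])
    return verify(key, credential, credential_sig)

print(verify_chain())  # True
```

In a real deployment the root would publish an asymmetric public key (for example via a well-known location), so verifiers would hold no secret material; HMAC is used here only to keep the sketch self-contained.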
The term decentralized identity is almost always associated with blockchain technology. Why is this so, and could you perhaps double-click on this relationship between blockchain and decentralized identity? Basically, in decentralized identity it all starts with me as an identity subject, or an identity controller: I would like an identifier, and I create it on my own. That's the big difference: I don't go to a centralized platform that creates the identifier and the public-private key pair on my behalf. The problem with all the existing centralized systems today is that an administrator can try to cheat; there are a lot of attack vectors, because an administrator can manipulate the keys, can see the private key, can manipulate the identifiers. This is the problem with all centralized platforms, and decentralized identity is different. I, as an identity subject with my identity controller, create a seed out of randomness, out of noise. Then I create a public-private key pair that I fully control; there is no one else controlling it. From the public key I also create an identifier. Now what do I need to do? I want to make sure my counterpart knows the decentralized identifier, plus has the tools to look up my public signing keys for this identifier, so I need to broadcast them or inform them. And blockchain is a very handy tool here, because I can establish a smart contract, and in the smart contract I anchor my identifier plus the signing keys for that identifier, and the smart contract makes sure that only I, as the identity controller, can change this. This is what cryptographers call tapping the entropy. The blockchain provides the instrument so that other people can look up my identifier.
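The flow described above, creating a seed from local randomness, deriving a key pair and an identifier from it, and anchoring them in a registry that only the controller can update, could be sketched roughly like this. It is a toy sketch under stated assumptions: the hash-based key derivation stands in for a real signature scheme such as Ed25519, and the in-memory registry stands in for the smart contract.

```python
import secrets, hashlib

def derive_public(private_key: bytes) -> bytes:
    # Toy one-way derivation; a real system would use e.g. Ed25519 key pairs.
    return hashlib.sha256(b'pub:' + private_key).digest()

class DidRegistry:
    # Toy stand-in for the smart-contract registry described above: it maps
    # an identifier to its current public key, and only the holder of the
    # matching private key may rotate the entry.
    def __init__(self):
        self.entries = {}

    def register(self, did: str, public_key: bytes) -> None:
        if did in self.entries:
            raise ValueError('identifier already taken')
        self.entries[did] = public_key

    def rotate(self, did: str, old_private_key: bytes, new_public_key: bytes) -> None:
        # Authorization check: prove control of the currently anchored key.
        if self.entries.get(did) != derive_public(old_private_key):
            raise PermissionError('only the identity controller may update')
        self.entries[did] = new_public_key

# 1. Create a seed purely from local randomness; no central platform involved.
seed = secrets.token_bytes(32)
private_key = hashlib.sha256(b'priv:' + seed).digest()
public_key = derive_public(private_key)

# 2. Derive the identifier from the public key.
did = 'did:example:' + hashlib.sha256(public_key).hexdigest()[:16]

# 3. Anchor it so others can look up the signing key for this identifier.
registry = DidRegistry()
registry.register(did, public_key)
assert registry.entries[did] == public_key
```

The point of the authorization check in `rotate` is the one made in the interview: no administrator or third party can alter the mapping, because changing it requires proving control of the current key.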
So others can look up which keys are being used for signing data on my behalf. And as a blockchain is immutable and publicly accessible, it's a perfect tool to communicate the keys for my identifier. Then there's the second use case: if I have an identifier, I also want to enable people to look up a service endpoint, a URL of a service that belongs to my identifier. There's an analogy with the DNS: the DNS today is basically a lookup, a mapping between an IP address and a domain name, and that's exactly what we're doing here. So another perspective on decentralized identity is a decentralization of the DNS, because you have an identifier, which is, let's say, like a domain name, and then I can look up the identifier's service endpoint, go to the service endpoint, and interact in a neutral way on the internet with me as the identity subject. Those are the two very important features of decentralized identity: communicating signing keys and communicating service endpoints. And if I put this on an immutable ledger over which only I have control of the data, then I can be sure that everyone can read it, find all the keys, and interact with my service endpoint, and I have a tool to communicate that to everyone else, a tool fully controlled by me. There's no third party involved that can try to lock me out or manipulate my identity, and that's what is avoided by the use of blockchain. Right. So now that we've discussed the current state of play in supply chains, I think it's a good time to deep dive into the future of Industry 4.0. According to you, what are some of the biggest inefficiencies you see when it comes to supply chain network design?
And, more importantly, is there any room for ecosystem innovation? It's basically self-controlled: I can request some credentials, and I can go through a root of trust, a trust chain, to find out if the credentials are correct; that establishes completely new means. When I can verify where the data is coming from, from which company, which process, which object, I can also put what we call risk scoring in place, and then I can give the data a trust score. Especially in a typical cyber-physical value chain, there is so much data being blended, processed and merged to establish, for example, the digital twin of a manufactured product. I can either naively trust all the data in the digital twin, or establish some tools for trust scoring on the data, and then I can make my own decisions about whether I trust the data or not. This is very important, especially when machine learning comes into play: when I feed machine learning algorithms and get labels out of them, it is even more important to find out whether I can trust the machine learning algorithm. Is there a benchmark? What was the training data, and was it biased? And even if I trust the world's best machine learning algorithm, what about the input data: was it fake cars, or was it real BMWs? If I can check the provenance of the input data, then when I get the output labels I can establish risk scores, trust them, and use them for decisions: for autonomous driving, for driver assistance systems, for risk propositions, for traffic control systems, for mapping systems. I need to trust the data and the provenance of the data being processed, and that, from our perspective, is the big opportunity with this specific technology. Right.
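The risk-scoring idea sketched here, assigning each provenance source a trust level and combining them into one score for the derived data, could look roughly like this. The weighting scheme and the numbers are entirely illustrative assumptions, not Spherity's actual method.

```python
# Toy risk-scoring sketch: combine per-source trust levels along a
# provenance chain into one score for the derived data point.
def chain_trust(scores):
    # Heuristic: a chain is only as trustworthy as its weakest link,
    # softened by the average of all links (weights are made up).
    weakest = min(scores)
    average = sum(scores) / len(scores)
    return 0.7 * weakest + 0.3 * average

# Hypothetical provenance chain for one digital-twin data point.
provenance = {
    'component supplier': 0.9,   # verifiable credential checked against a root
    'assembly plant':     0.8,   # credential checked, but older audit
    'logistics provider': 0.4,   # self-asserted data, no credential at all
}

score = chain_trust(list(provenance.values()))
print(round(score, 2))  # 0.49 -- dragged down by the unverified link
```

The design choice mirrors the interview's point that trust is not binary: one unverifiable actor in the chain lowers the score for the whole derived data point rather than flipping it to untrusted outright.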
I think you raise a very important point there with regard to the verification of trust, and that is a concept we can definitely touch upon later in this podcast. For now, to shift the focus back onto Industry 4.0: you've seen the industry evolve for some time now, and in your opinion, what are the four or five ingredients that organizations need to consider to successfully bring any new technology to a domain? From my perspective, first it starts with simplicity: finding really simple use cases, with few supply chain actors, few data sets, and few systems that I can connect and still get proper business value. That means I would like to have precise integration of a few systems, and not a very complex boil-the-ocean approach in IT. So it's simplicity, and it's also education: as a startup, we prefer to engage with people and businesses that are already educated on the topic. That's a very important ingredient, because otherwise we have to do one or two years of additional education first and bring nothing to the market. So it's simplicity, it's education, and it's the business case. If there's no business case, it's difficult. What we see, because this is an ecosystem technology, is that the business case can be diluted; it's more a systemic business case where everyone benefits somehow, but then: how do I benefit, what's my business case, what's in it for me? That's often unclear. But if there are regulatory requirements for provenance, for auditability, for back-to-birth traceability, then these compliance requirements are the business case, because today there is a lot of paperwork, a lot of manual work, a lot of quality inspectors, and I can digitize all of this, or avoid fines and penalties for not being compliant.
So from our perspective, compliance process cost reduction is the business case for short-term implementation. I mentioned simplicity, education and compliance; I would like to mention two more. One is the ecosystem, the full ecosystem, because this technology doesn't make sense if just one company introduces it. It's very important to have a consistent ecosystem where everyone shares the same common goal, for example to reduce compliance costs. And last but not least, very important, because we mentioned these trust domains: someone needs to do the enterprise identity verification. That's where trust starts, and this is normally not in place. It's still a big question: who says that Porsche is Porsche, and Siemens is Siemens? There is a solution for implementing that: bootstrapping the trust domain. The US pharma use case is fantastic for us because we don't need to verify anyone ourselves; we can do a combination of DIDs and X.509 signing certificates, and then we can establish digital identities and so-called authorized trading partner credentials. This is how we can bring it to production; otherwise you need to solve the problem of enterprise identity verification, which in principle is solvable, but in practice is not solved, because there is no infrastructure in place and no consistent credentials. This is fantastic, and that's the reason why we love the authorized trading partner use case in the US so much. And there is a sixth ingredient I would like to mention: when you have all these decentralized wallets, you need to go to every supply chain actor and give them a wallet. You have to sell it to them, you have to integrate it with all the pharma companies and wholesalers, and then you have to validate and test it.
So it's a lot of work; even from a commercial perspective, it is almost an unbelievable amount of work. But in this US use case there is a so-called man in the middle. The authorized trading partner use case is regulated by the FDA under the US Drug Supply Chain Security Act, and every supply chain participant wants to reduce compliance cost; they don't have a more principled discussion about whether the wallet is a custodial wallet or a non-custodial wallet. It doesn't matter: what matters is to reduce compliance cost and to establish a secure system. And the man in the middle, which is very interesting for us, are the so-called verification routing service providers, who provide a lookup service. There are a few of them, and if we integrate our solution with these verification routing service providers, the men in the middle, it's a fantastic route to market, because they are connected to all supply chain actors. We only need to go to the very few VRS providers, for example SAP, rfxcel and TraceLink, just to name three of them, because the pharma companies and wholesalers have outsourced this work to them. By integrating with the VRS providers, we can basically switch the entire industry over to our technology. That's fantastic, because otherwise we would have to go to every pharma company and do very manual work to somehow integrate our wallet; by integrating our APIs with a few men in the middle, we get a fantastic go-to-market. Right. Great insights there, Carsten; I think that's a great playbook you've put out for anyone looking to move to supply chain 2.0 or upgrade their supply chain systems. One thing that's clear from our conversation so far is that we are living in a narrative-driven society.
I mean, if you look at all the institutions around us, the media, the politicians, and whoever, we are almost always told to believe in a certain kind of truth without being able to verify it. And then all of a sudden comes this incredible piece of technology called blockchain that gives us the ability to not just verify the truth, but also be sure that what's in front of you is actually what it claims to be, be it in the case of provenance of raw materials, or in the case of NFTs, where we can now track and verify the authenticity of non-fungible assets that are scarce and unique, be it a piece of physical art or digital art, you name it. I'm curious to know how you see this move playing out in society; and by this move I mean the intersection of decentralized identifiers, DIDs, and NFTs. Mm. I think the verification of truth is about trustworthiness, and I think it all comes back again to risk scoring, because truth is not something binary; it is not simply true or not true, it is a convolution, because I have so many different data. For example, in the pandemic: what is the truth? A data set that says more infections or fewer infections? What is the truth when I make a specific policy, and what is the impact on infections? What is the risk for me personally, for my age? There are just a lot of possibilities. And in supply chains, in art, in e-commerce, there is always a lot of data, and data are blended together. So I also think truth is not binary. In the end, and this is still unexplored, there must be tools to assess the risks of using the data, of trusting the data.
And that's something that's still unexplored, because even in the DID and verifiable credential domain, people think a bit binarily: either I have a driver's license or I don't, or I have a COVID vaccination or test certificate or I don't. But what can go wrong? I go to a test center; the test center can get my name wrong, or mix up my sample. The test center could have very poor quality management, or the lab equipment might not be maintained correctly, so the outcome itself comes with some probability. And then they have to give me a certificate about the test: how do they ensure they give it to the right person, and was that person properly authenticated? What I would like to say is that it's all about a risk matrix, and about understanding it and having proper scoring; I think that is truth. But you mentioned NFTs and DIDs. A lot of people are putting art on chain as an NFT, a non-fungible token. First of all, what do I know about the artist? What do I know about the art? Is it one existing piece, or is it a series of a million pieces of art with only tiny differences? If it's one of a million, it's worth much less compared to the scarcity of art that is just a single piece. So again, I need to know the provenance. And even from a legal perspective: if someone puts an NFT on Ethereum, how can I make sure the same person is not selling the NFT on Bitcoin, Polkadot, Cardano or any other chain? How do we ensure that? It sits at the physical-digital intersection, with some legal perspectives; how do I do this? But anyway, I think that's the broader perspective here in terms of truth.
And you mentioned NFTs and DIDs: from a technical perspective there are a lot of similarities, because an NFT is controlled by an owner, and I can do ownership transactions. I can give it to a new owner, and I can also establish fractional ownership. For example, there is an expensive bottle of wine: I can give the NFT representing the bottle to one person, who has to show the NFT before they get the bottle, or there can be multiple owners. A DID doesn't have a concept of fractional ownership. In addition, if ownership changes, in the NFT I might be able to see the chain of custody, and for some supply chain use cases, for luxury goods, that can be connected to authenticity. I can do similar things with DIDs: I also control a DID, and I can even change the ownership by giving you control of the DID. Then I have the service endpoint, so I can bind the two together: it can be digital art, and I can describe the provenance of the art and the heritage of the artist with verifiable credentials. So we see a lot of intersection between DIDs and NFTs, especially for the question of the provenance of NFTs, and this is pretty much unexplored yet; I think it's uncharted territory. My prediction is that there will be a lot of work going on in combining NFTs, DIDs and verifiable credentials in the not-so-far future. Right. And finally, I now want to explore the cultural revolution of sorts that is unfolding in front of our eyes. You recently wrote an article on the principle of duality in science and art that is changing the course of tech and marketing.
I personally think it's a fascinating read, and for our listeners I'll post the link to the article in the description box below. I would personally like to add onto it and call it a trifecta of sorts, because we're now seeing athletes jump on the bandwagon and get associated with emerging technologies: you're seeing the newly crowned quarterback of the Jags, Trevor Lawrence, signing with BlockFi, and you also have Zilliqa now partnering with nine top football stars to endorse their product. So my question to you is: why is this a recipe for success when it comes to marketing in today's world? From a technology perspective, there's a concept called crossing the chasm. You can develop the best technology, but you need to have early adopters, and even that is not enough: you have crossed the chasm only when there is an early majority. If you don't reach an early majority, then no one really uses the technology outside the lab or outside a few tests. As a technologist and entrepreneur, you're only successful when you can reach the early majority. And when you think about athletes and artists, they have quite some reach, and that can help you bootstrap the crossing of the chasm. If they're interested in the technology, they can leverage this for their own benefit or for the greater good of society, and that can help to establish ecosystems and to transport the message, the narrative, to the early majority. I think this is the fascinating duality: combining technology with the reach of athletes and artists. That's what's pretty fascinating. Brilliant. I think now it's time for the best part of the podcast: it's time for Frontier Fire, where we pose a series of rapid-fire questions to our guests on the pod. So Carsten, are you ready for the challenge?
Yes. Brilliant, let's get started. I'm curious to know: what's the best application of physics in everyday life? From my perspective, I like statistical physics very much, because you can apply statistical physics to machine learning and to predictions for everyday life, for social, economic and technical questions. From my perspective, that's the best application: to forecast the future, to predict the future, to make better decisions, for now and for all of us. And what is the best business advice you've ever received? The best piece of advice, from my perspective, is humility, because sometimes people think they can control something, that they own something, that they have found the big insight and can now change the world, and usually it's not so easy. I think humility is very important, and it's quite a beautiful mindset. There's a movie about Alan Turing, and it's so fascinating what kind of personal fights he had within himself: an inner being trying to open up, to innovate, to bring new mathematics, to bring new encryption, and yet struggling in everyday life. This is fascinating because it's a microcosm: people have the tools and the capabilities to do something great, yet struggle with the tiny things. And if you transform that to the macro level, to planet Earth, it's still the same for us humans: I think we have all the knowledge and all the tools and science to understand what is happening to the planet, but we are struggling and cannot really change our course. That's the connection I like to make between the science and stories of people and the greater climate change problems we're in today. And speaking about struggles, what is the one thing people don't know about entrepreneurship?
From my perspective: as a consultant, it was super easy to sell to enterprises; incredibly easy to sell a consulting project, a system integration project, a strategy project, and then to sell the client even more, doing transformation and implementation. As an entrepreneur, especially with a B2B focus, it's just the opposite: it is super difficult to sell emerging technology. The business case is not clear, so it's hard to sell to companies, because especially in Europe, in Germany, people would like to have a crystal-clear business case before they start investing; they don't invest in a hypothesis, and they would need to build the capabilities to work with the technology. And I think the second big thing is that decision-making and implementation are fundamentally changing, because a lot of these technologies are ecosystem technologies, and this requires a different approach: you cannot just sell to one company and be successful. You have to sell to an ecosystem, with dependencies, because it's a fusion of technologies with dependencies between them. Those are tough challenges, but if you have that ecosystem innovation approach in mind, that's probably the prerequisite. Finally, what's your advice to anyone listening to this podcast? My advice is basically: never give up, and be very flexible, because when you are pushing forward decentralized identity, you have to be very flexible. You cannot focus on just one domain and one value proposition, even if that's what venture capitalists want to see, because it's unclear where to start.
And I think being able to pivot across different domains and propositions, that flexibility, is very important, and also the ability to execute, to learn fast, and then to go into another business domain; that's very important. Carsten, it was an absolute pleasure speaking with you today. Thank you so much for shedding light on the increasingly important role of provenance in the world of tomorrow. I hope to have you again on this podcast, and I wish you and your team at Spherity the very best of luck going forward. Thank you so much, and thank you for having me; it was a fantastic experience being on your Frontier show. That was Dr. Carsten Stöcker. Carsten will be speaking at the European Identity and Cloud Conference, EIC, and you can get your tickets to the event via the link in the description box below. I hope you enjoyed this conversation that dabbled around NFTs, provenance and the cultural revolution. If you think anyone would benefit from this information, please go ahead and share it with them. Until next time, this is me, Raj Hegde, and I hope to see you all again on this incredible journey to redefine the ‘I’ in identity. Stay safe, stay happy.",https://www.kuppingercole.com/watch/frontier-talk-podcast-3-decentralized-provenance,,Post,,Explainer,,Supply Chain,,,,,,2021-05-12,,,,,,,,,,,,,
|
||
Spherity,Spherity,,,,,,,,,Authorized Trading Partners,"DSCSA-Compliant Verification of Authorized Trading Partners - Spherity is working together with global pharmaceutical manufacturers, wholesalers, distributors, the Healthcare Distribution Alliance and other solution providers to develop a production-grade solution for the Authorized Trading Partner legislation by Autumn 2020. Spherity’s Cloud Identity Wallet enables the exchange and verification of electronic state licenses.",,https://web.archive.org/web/20220401000000*/https://spherity.com/pharma-authorized-trading-partners/,,Post,,Meta,,,,,Cloud Identity Wallet,,,2020-11-21,,,,,,,,,,,,,
|
||
Spherity,Spherity,,Medium,,Ontology,,,,,Ontology Partners with Spherity to Advance Digital Identity Solutions,Partnership will involve integration of Ontology blockchain into Spherity’s Cloud Identity Wallet,"Spherity Partners with Ontology to Advance Digital Identity Solutions Partnership will involve integration of Ontology blockchain into Spherity’s Cloud Identity Wallet Spherity the German decentralized digital identity specialist, announces a partnership with Singapore based high-performance enterprise blockchain and distributed collaboration platform Ontology. This strategic partnership will see the integration of the Ontology blockchain into Spherity’s Cloud Identity Wallet, and will allow Ontology to harness Spherity’s blockchain-agnostic Decentralized Digital Identity solutions across public and permissioned blockchains. The remit of the partnership will also extend to the creation of Proof-of-Concept (PoC) pilots around supply chain, mobility, and pharmaceutical verticals, along with joint research and marketing initiatives. The primary objective of the partnership is to deliver enterprise solutions for cross-border supply chain resilience and transparency. Andy Ji, Co-founder of Ontology, said, “Partnering with Spherity represents a further expression of intent towards accelerating the development of digital identity protocols, and underlines our commitment to addressing prevalent issues associated with identity security and data integrity. This partnership provides scope for the exploration of enterprise and product identity, as well as cross-border supply chain identity use cases including provenance, transparency, and authenticity.” Spherity builds enterprise cloud wallets and other decentralized digital identity management solutions that offer more secure and versatile cyber-physical bindings and supply chain tracking solutions. 
Based in Dortmund, Germany, the Spherity team is developing decentralized ‘digital twin’ applications — self-sovereign, unique digital representations of enterprises, products, machines or algorithms services — which are immutably anchored on a decentralized system, overcoming today’s trust and interoperability issues. Dr. Carsten Stöcker, CEO of Spherity, said, “The ambition and vision of the Ontology team directly correlates with Spherity’s own roadmap, both in terms of expediting the deployment of blockchain solutions globally, while also firmly imbuing the principles of trust and security in digital identity solutions. Ontology will provide invaluable access into key Asian markets for our bespoke cloud technology, and we are delighted to boost Ontology’s European presence, given our rich history of operations in Germany, Switzerland, and Austria.” The partnership focuses on connecting Sino-European ecosystems while establishing trust among supply chain actors. Together, Spherity and Ontology are uniquely positioned to establish interoperability and data portability across European and Asian Blockchain infrastructures. The joint goal is to deliver secure ‘collaborative data sharing’ solutions for ‘enterprise master data’ and ‘product serialisation’ with back-to-birth auditability. Ontology delivers a fast, user-friendly platform with a unique infrastructure that supports robust cross-chain collaboration, providing businesses with the tools to design their own blockchain solutions securely. Powered by the Ontology Token (ONT), the distributed collaboration platform allows businesses to enjoy the benefits of smart contracts and tokenization while retaining control of their sensitive data. “The strength and depth of the Spherity network can help us achieve our strategic objectives, particularly those pertaining to extending our European reach and advancing our B2B business development. 
We look forward to supporting the deployment of Spherity’s cloud technology in Asia while identifying mutually beneficial opportunities for collaboration moving forward,” concluded Ji. About Ontology Ontology is a high-performance public blockchain and distributed collaboration platform. Ontology’s unique infrastructure supports robust cross-chain collaboration and Layer 2 scalability, offering businesses the flexibility to design a blockchain that suits their needs. With a suite of decentralized identity and data sharing protocols to enhance speed, security, and trust, Ontology’s features include ONT ID, a mobile digital ID application and DID used throughout the ecosystem, and DDXF, a decentralized data exchange, and collaboration framework. About Spherity Spherity is building decentralized digital identity management solutions to power the fourth industrial revolution, bringing secure identities to machines, algorithms, and other non-human entities. Spherity’s decentralized identity cloud-edge wallet empowers cyber security, efficiency and data interoperability among digital value chains. The customer focus is primarily on highly-regulated technical sectors like supply chain management, mobility, and pharmaceuticals.",https://medium.com/spherity/ontology-partners-with-spherity-to-advance-digital-identity-solutions-4d2c95b288,,Post,,Meta,,,,,Cloud Identity Wallet,,,2021-03-26,,,,,,,,,,,,,
|
||
Spherity,Spherity,,Medium,,,,,,,Spherity Achieves ISO 27001 Information Security Standard Certification,"“To guarantee business continuity and protect data, we have built up an Information Security Management System (ISMS) in accordance with ISO/IEC 27001. For us as a company that deals directly with one of the most valuable assets that civilization has — identity — it was the logical pathway to give information security an appropriate degree of importance. Furthermore, we see it as our duty to our customers and employees to enter into this self-imposed obligation and to guarantee the highest possible level of information security — also as an investment in the deep mutual trust and ongoing cooperation with our clients.”","Spherity Achieves ISO 27001 Information Security Standard Certification Spherity, a company building digital identity management solutions, has achieved ISO/IEC 27001:2013 certification Issued by TÜV Rheinland, the certification confirms that the company’s data security systems, including the secure development process, meet the industry’s best practices at the highest level. ISO/IEC 27001 is the most widely used information security standard prepared and published by the International Organization for Standardization (ISO), the world’s largest developer of voluntary international standards. It includes requirements on how to implement, monitor, maintain, and continually improve an Information Security Management System (ISMS) within the context of the organization and its business needs. Conformity with this internationally recognized standard lies at the core of Spherity, since we consider information management essential to all of Spherity’s business operations. These best practices ensure we will continue to protect the interests of our customers, investors and employees, providing the highest level of security assurance. 
Information security is the practice of ensuring the Confidentiality, Integrity and Availability of information and data according to the “CIA principle,” and thereby defending information and data from unauthorized access, use, disclosure, disruption, modification, perusal, inspection, recording or destruction. Additionally, this principle maintains business operations and protects business continuity while minimizing risks. “To guarantee business continuity and protect data, we have built up an Information Security Management System (ISMS) in accordance with ISO/IEC 27001. For us as a company that deals directly with one of the most valuable assets that civilization has — identity — it was the logical pathway to give information security an appropriate degree of importance. Furthermore, we see it as our duty to our customers and employees to enter into this self-imposed obligation and to guarantee the highest possible level of information security — also as an investment in the deep mutual trust and ongoing cooperation with our clients.” – says Dr. Michael Rüther, COO/CFO, Spherity. The certification was validated following an assessment done by the independent certification body, TÜV Rheinland. It covered the organization’s IT systems, cloud services, applications and all related assets, as well as all information and data stored and transacted. The scope also included the company’s office, located in Dortmund, Germany. “Digitization is one of the trends of our time, bringing with it both significant opportunities and major risks: for example, new types of attack vectors are being created or the complexity of systems is increasing, which increases the risk of failure. Certification means that customers can expect systematic and continuous management of information security risks and business continuity. 
It is noteworthy that Spherity, as a start-up company, has committed itself to follow the internationally recognized standard for information security management.” – says Klaus Schneider, Managing Director of IMS-SCHNEIDER and Lead Auditor for TÜV Rheinland Cert GmbH. The certification is publicly available in the TÜV Rheinland Certificate Directory and also on the Spherity website. The organization’s ISO/IEC 27001 certification represents an important step forward on our journey to prove that we are committed to the highest standards of security and service. About Spherity Spherity is building decentralized digital identity management solutions to power the fourth industrial revolution, bringing secure identities to machines, algorithms, and other non-human entities. Spherity’s decentralized cloud identity wallet empowers cyber security, efficiency and data interoperability among digital value chains. The customer focus is primarily on highly-regulated technical sectors like pharmaceuticals, mobility and logistics. Sign up for our newsletter or follow us on LinkedIn to stay up to date. Press Inquiries Please direct press inquiries to: Marius Goebel communication@spherity.com",https://medium.com/spherity/spherity-achieves-iso-27001-certification-f687ee42c40e,,Post,,Meta,,,,Business,,,ISO/IEC 27001,2021-03-26,,,,,,,,,,,,,
|
||
Spherity,Spherity,,Medium,,Sovrin Steward,,,,,Spherity becomes a Sovrin Steward,"Spherity has assumed the role of Steward in the Sovrin Network, a not-for-profit worldwide alliance of companies that operate nodes supporting distributed ledger operations so that the network can provide identity for all.","Spherity becomes a Sovrin Steward Spherity to support Sovrin’s “public utility” network for identity Spherity has assumed the role of Steward in the Sovrin Network, a not-for-profit worldwide alliance of companies that operate nodes supporting distributed ledger operations so that the network can provide identity for all. The product of a nearly decade-old open-source community, this network is the largest and oldest global blockchain to support decentralized identity systems, including many governmental initiatives like Unlock in the Netherlands and the province-wide projects for corporate identities in Canada like British Columbia’s OrgBook. In Germany the Main Incubator in Frankfurt has recently launched its own Sovrin-based network, called LISSI, following Finland’s Findy and a handful of smaller efforts. Indy’s complex governance structures have reportedly been a substantial influence on the design of the European Commission’s eSSIF framework. Spherity co-founder Carsten Stöcker says: “Sovrin represents a unique approach to decentralized identity, which draws its strength from technology leadership, thoughtful governance and shared infrastructure. The Indy blockchain and the open-source Aries codebase are driven forward by a wide-ranging coalition of enterprises and communities. 
We are excited to become Stewards, which will enable us to support German and Europe-wide efforts to build trust frameworks and identity infrastructure from the ground up without starting from scratch.” Spherity has been exploring the Aries framework to build up an Indy wallet that can interoperate with the production-grade Indy-based products of project partner SwissCom (which has been a Steward since November 2018). Spherity plans to continue expanding its Indy libraries and wallets, particularly those supporting interoperability between Aries-compliant Indy wallets and Ethereum wallets. More information on this project can be found in this recent article. About the Sovrin Foundation The Sovrin Foundation is a nonprofit organization established to administer the Governance Framework governing the Sovrin Network, a decentralized global public network enabling self-sovereign identity on the internet. The Sovrin Foundation is an independent organization that is responsible for ensuring the Sovrin identity system is public and globally accessible. About Spherity Spherity is building decentralized identity management solutions to power Industry 4.0, bringing secure identities to enterprises, machines/IoT-devices, data and algorithms. Our client focus is primarily on technical industries like pharmaceuticals, supply chain and mobility. In particular, Spherity is supporting the introduction of identity solutions in the Industrial Internet of Things (IIoT) market, which is expected to have a value of over €400 billion by 2030 and 75 billion connected devices by 2025. Stay sphered by joining Spherity’s Newsletter list and following us on Linkedin and Twitter.",https://medium.com/spherity/spherity-becomes-a-sovrin-steward-b813cff2999b,,Post,,Meta,,,,,,,,2020-05-08,,,,,,,,,,,,,
|
||
Spherity,Spherity,,Medium,,Secure Digital Identity Association,,,,,Spherity joins VSDI (Secure Digital Identity Association),"Membership in the association allows [Spherity](http://www.spherity.com/) to participate in consultative processes around future tenders and initiatives, keeping us up to date on the needs of government digitization and the trends moving through state and federal bodies on these topics. It also fosters communication with companies working in specialized fields like military information technology, public services computing, finance, and cross-border issues. The VSDI sees itself as a practice-oriented competence network for politics, administration and business. The association communicates the bundled expertise of its members and advocates through its initiatives to enable secure, user-friendly and data protection-compliant digital identities.","Spherity joins VSDI (Secure Digital Identity Association) 28 January 2020 Spherity GmbH has joined the Berlin-based trade association, Verband Sichere Digitale Identität (VSDI) (Engl.: Secure Digital Identity Association), attending the annual meeting hosted by the state-owned certificate, mint, and security conglomerate Bundesdruckerei GmbH, a founding member and part-owner of many others. Having changed its name earlier in the year (from Verband Sichere Identität Berlin Brandenburg), the association focuses more than ever on cybersecurity topics, data standards, usability, and other aspects of digital identity. In practical terms, the association coordinates between Bundesdruckerei, government contractors, IT companies, and other stakeholders, as well as administrators throughout the German federal and state governments. Membership in the association allows Spherity to participate in consultative processes around future tenders and initiatives, keeping us up to date on the needs of government digitization and the trends moving through state and federal bodies on these topics. 
It also fosters communication with companies working in specialized fields like military information technology, public services computing, finance, and cross-border issues. 2020 is poised to be a breakthrough year for digital identity, with the federal and state governments embracing (and, just as importantly, funding) initiatives to make digital services more available, more usable, and more secure. For example, the Bundesministerium für Wirtschaft und Energie is running an ongoing series of rapid “Showcase” (Schaufenster) projects to promote this activity and Germany’s contributions to the European sector. About VSDI Our thesis Without secure digital identities for people, organisations and things, there can be no reliable digitalisation. The Secure Digital Identity Association (VSDI) is the nationwide network for companies, universities and research institutions that promotes the transformation from analogue to digital identities. The VSDI sees itself as a practice-oriented competence network for politics, administration and business. The association communicates the bundled expertise of its members and advocates through its initiatives to enable secure, user-friendly and data protection-compliant digital identities. Our members from the business community offer software and hardware, consulting and services to secure the digital world technologically. Our members from the research and science community research and test how secure digital identities can be improved. The member companies and institutions employ around 9,000 people and have an annual turnover of around 750 million euros. About Spherity Spherity is building decentralized identity management solutions to power the 4th Industrial Revolution, bringing secure identities to machines, algorithms, and other non-human entities. The client focus is primarily on technical industries like mobility, supply chain, and pharmaceuticals. 
In particular, Spherity is supporting the introduction of identity solutions in the Industrial Internet of Things (IIoT) market, which is expected to have a value of over €400 billion by 2030 and 75 billion connected devices by 2025. Stay sphered by joining Spherity’s Newsletter list and following us on Linkedin. Media Contacts VSDI VSDI Press Department, info@vsdi.de Spherity Spherity Press Department, communication@spherity.com",https://medium.com/spherity/spherity-joins-vsdi-secure-digital-identity-association-101d160d267f,,Post,,Meta,,,,,,,,2020-05-06,,,,,,,,,,,,,
|
||
Spherity,Spherity,,Medium,,IDUnion,,,,,Spherity partners IDunion Trusted Identity Ecosystem,Spherity announces that it has become a partner of the IDunion project. The project is funded within the innovation framework “Showcase secure digital identities” of the German government (Federal Ministry for Economic Affairs and Energy). Spherity is entrusted with the application of cloud identity technology in the healthcare industry.,"Spherity partners IDunion Trusted Identity Ecosystem Spherity becomes Partner in German Government’s Secure Digital Identity Project Spherity announces that it has become a partner of the IDunion project. The project is funded within the innovation framework “Showcase secure digital identities” of the German government (Federal Ministry for Economic Affairs and Energy). Spherity is entrusted with the application of cloud identity technology in the healthcare industry. IDunion aims to provide identity solutions for business, government, and citizens that are user-friendly, trustworthy, and economical. The consortium and its partners will establish a decentralized identity ecosystem for individuals, companies and machines. Within the IDunion project Spherity is entrusted with the application of cloud identity technology in the healthcare industry. Secure and portable identities are necessary prerequisites for seamless e-health applications. In this context, secure digital identities are equally necessary for patients, healthcare professionals and healthcare institutions. By applying a decentralized digital identity (SSI*) approach to the management of e.g. patient data, legal requirements for data protection can be fulfilled, patient rights can be strengthened and the efficiency of digital system solutions can be significantly improved. 
In the project, these decentralized technologies will be linked with central systems such as Gematik infrastructure and health insurance systems for issuing and storing certificates, so that a corresponding hybrid architecture is created. Demonstrators and field tests will be created in the areas of telemedicine, e-prescriptions, electronic certificates of incapacity to work, and digital vaccination cards. Spherity will further operate a node of the decentralized, heterogeneously distributed IDunion test network, which will be compliant with the European legal framework (GDPR and eIDAS). About IDunion IDunion develops a basic infrastructure for the verification of identity data. For this purpose, a distributed database will be jointly operated, which will be managed by a European cooperative. The network will be set up and managed by various actors consisting of private companies, associations, cooperatives, government institutions, educational institutions and other legal entities. About Spherity Spherity is building decentralized digital identity management solutions to power the fourth industrial revolution, bringing secure identities to enterprises, machines, products, data and even algorithms. We provide the enabling technology to digitize and automate compliance processes primarily in highly-regulated technical sectors like pharmaceuticals, automotive and logistics. Spherity’s decentralized cloud identity wallet empowers cyber security, efficiency and data interoperability among digital value chains. Spherity is certified according to the information security standard ISO 27001. Stay sphered by signing up for our newsletter, follow us on LinkedIn or Twitter. Press Inquiries Please direct press inquiries to: Marius Goebel communication@spherity.com",https://medium.com/spherity/spherity-joins-idunion-trusted-identity-ecosystem-e89d093be35a,,Post,,Meta,,,,,,,,2021-03-12,,,,,,,,,,,,,
|
||
Spherity,Spherity,,Medium,,Swisscom Blockchain,,,,,Swisscom Blockchain & Spherity are Co-developing Cloud Identity Wallets,"Swisscom Blockchain and Spherity have both built interoperable Cloud Identity Wallet solutions that handle this kind of world-class, highly trustworthy data, allowing it to transcend silos, proprietary systems, and blockchains. Binding data to digital identities at a low level is the key to bringing more value to all stakeholders in any industry, and making data trustworthy and verifiable anywhere. At its heart, this is what an identity wallet does, and we are proud to have developed two industry-leading, enterprise-grade identity wallets tailored to the documentation needs of the pharmaceutical industry.","Swisscom Blockchain & Spherity are Co-developing Cloud Identity Wallets Real interoperability is built together Swisscom Blockchain and Spherity have both built interoperable Cloud Identity Wallet solutions that handle this kind of world-class, highly trustworthy data, allowing it to transcend silos, proprietary systems, and blockchains. It seems like every day the market for data grows larger, but not all data is of equal value: on the contrary, data’s value increases exponentially the more trustworthy it is. There is a rising demand for trustworthy data, particularly for verifiable data that can travel far and wide without risk or privacy complications. Binding data to digital identities at a low level is the key to bringing more value to all stakeholders in any industry, and making data trustworthy and verifiable anywhere. At its heart, this is what an identity wallet does, and we are proud to have developed two industry-leading, enterprise-grade identity wallets tailored to the documentation needs of the pharmaceutical industry. 
“We both offer similar solutions and we decided to work together while designing our products because we both wanted to create solutions that are, since day 1, interoperable,” says Luigi Riva, Senior Technical Product Manager, Swisscom Blockchain AG Goals and Accomplishments Swisscom Blockchain and Spherity both have seasoned teams of developers and architects with decades of experience in decentralized and traditional Identity and Access Management, as well as cybersecurity and cryptography topics. For this reason, we see the value that will be unlocked by the emerging World Wide Web Consortium standards for identity-linked data, but also the challenges that come with such innovation. These include the technological limitations of the current generation of platforms and products, such as the customizations we’ve had to make to the open-source Indy libraries to accommodate our clients’ high security requirements. Because of these high requirements, we also value cryptographic agility and work hard to future-proof our security model as well as our codebase. Another technological upgrade we made to the underlying codebase entailed carefully linking verified “attachments” to create a hybrid solution (both machine-readable and human-readable) beyond the size limitations of a traditional Indy-style verified credential. But the most important challenge is balancing these kinds of customizations against the promise of interoperability and freedom from “vendor lock-in”. To fulfil this promise, we chose not just to work together closely as “coop-etitors” but to go one step further, “co-developing” two parallel solutions cooperatively and making sure both of our customizations stayed interoperable with the underlying platform and other wallet providers. Comparing designs and testing interoperability throughout the process. 
We based our two solutions not just on existing Hyperledger Indy standards, but worked together on more future-proof prototyping of a solution inspired by the ongoing Hyperledger Aries specification process (technical readers can find more detail here). Security features fit for a king The current drafts of the Aries cloud-agent specifications are light on details about security, yet our clients on this project were enterprises with high standards in that regard. Given our shared commitment to security-by-design principles, we were able to prototype mechanisms for exchanging verifiable credentials between “cloud fortresses” while still building top-of-the-line enterprise security features into both our wallets: • Data Loss Prevention Mechanisms: Redundancy mechanisms ensure that sensitive data is not lost to system failures, misused, or accessed by unauthorized users. • Multi-Tenant Design: Each customer shares the software application and also shares a single database. Each tenant’s data is isolated and remains invisible to other tenants. • Custodial Approach to Key Management: Wallets secure, store, and share sensitive documents, firewalled from access management and key storage systems • Auditable Wallet Metadata: In highly-regulated sectors, privacy often has to be balanced against auditability. For this reason, we added the capability to give an appropriately-permissioned user such as an auditor verification access to wallet-transaction metadata. First Joint Project Our first joint project was in the pharmaceutical industry, where we provided our respective cloud wallet solutions to different actors in a pharmaceutical supply chain. Our interoperable, co-developed data exchange system was successfully subjected to a stress test in a proof-of-concept trial. The use case being validated was the onboarding of suppliers within a pharmaceutical supply chain and the maintenance of those credentials. 
Since brands are responsible for ensuring upstream compliance, they need a high level of certainty that this crucial paperwork is accurate and up-to-date at all times. Incorporating sophisticated cryptography not only increases the level of assurance, but also makes the process considerably more efficient and agile. Switching from manual processes to ones based on digital exchange of credentials between secure cloud wallets drops a supplier’s onboarding time from 30 days to as little as 3. How we solved Third-Party Risk Management in complex Supply Chains can be read here in detail. A clear path to more resilient data systems Decentralized digital identities are a powerful organizing principle for data systems which provide high levels of privacy, security, and verifiability at the same time. Few industries have requirements for all three criteria as high as those of the pharmaceutical industry, but we have also proven analogous and adjacent use cases and business cases in other fields, such as manufacturing supply chains and mobility systems. These technologies will make cloud-based software more trustworthy and verifiable, while making enterprise business processes more efficient, agile, and resilient. Binding data to digital identities at a low level and gradually moving important data exchange from proprietary platforms and email to identity wallets is the key to bringing more value and more trust to all stakeholders in any industry. As the key management solutions currently reaching maturity and hardness become standard practice outside of the software industry, these technologies will be well-positioned to vouchsafe the security models of tomorrow. Outlook The W3C standards for decentralized identity enable powerful new ways to build digital-first interactions between corporations, based on trust infrastructure and cryptographic assurances. 
Much of the press coverage of these standards focuses, understandably, on the most tangible and obvious use cases: managing the identity credentials of individual citizens and users of the web, with all the legal complexity that entails. But what enterprises call Identity & Access Management (IAM) and corporate identity use cases are far closer to production, powering business cases that are marching towards production today. “Companies have to comply with identity regulations that are getting more demanding over time, such as GDPR and national laws about privacy and consent for data sharing. It is easier to adapt to these changes over time with a sophisticated identity layer in your IT infrastructure.” says Dr. Carsten Stöcker, CEO, Spherity GmbH The core concepts of business strategy, like reputation and risk, will be transformed by these new infrastructures, which reduce the role of intermediaries and informational asymmetries by safely drawing on shared records. The future digital economy includes cooperative business models, among independent organizations, machines or algorithms, demanding a more agile, resilient business culture that preserves the privacy and independence of actors big and small. In this future, identity wallets will be so central to everyday business at so many levels, that no one will remember how recently they were invented, or who pioneered their design. If you have any questions about how our cloud wallet could power your enterprise credentialing use case, feel free to reach out with any question or book a demo directly. You can also follow us here, or on LinkedIn, or sign up for our newsletter. Press Inquiries Please direct press inquiries to: Marius Goebel communication@spherity.com",https://medium.com/spherity/swisscom-blockchain-spherity-are-co-developing-cloud-identity-wallets-632babc50a6c,,Post,,Meta,,,,,Cloud Identity Wallet,,,2021-03-26,,,,,,,,,,,,,
|
||
Spherity,Spherity,,Medium,,,,,,,New Product to Support Pharmaceutical Supply Chain Compliance,The product establishes trust in digital interactions between trading partners in pharmaceutical supply chains and ensures compliance with the U.S. Drug Supply Chain Security Act (DSCSA).,"Spherity launches New Product to Support Pharmaceutical Supply Chain Compliance Already integrated by SAP and rfxcel, the Spherity Credentialing Service is now ready to be shipped to the market Spherity announces the launch of its new product: The Spherity Credentialing Service, which sets the benchmark for compliance solutions in the field of trading partner verification and is available from today. The product establishes trust in digital interactions between trading partners in pharmaceutical supply chains and ensures compliance with the U.S. Drug Supply Chain Security Act (DSCSA). We are proud that Novartis, as an innovation leader, is looking to adopt the Spherity Credentialing Service. David Mason, Regional Serialization Lead at Novartis says that “Using credentialing is the first proven digital solution for our industry that addresses the ATP compliance gap of knowing if the counterparty is an Authorized Trading Partner. This is a foundation to meet DSCSA requirements by 2023.” SAP and rfxcel have integrated the Spherity Credentialing Service within their verification routing service solutions to be able to share and verify the Authorized Trading Partner (ATP) status in product verifications. Herb Wong, Vice President of Marketing & Strategic Initiatives at rfxcel says “The Credentialing Service is the most comprehensive effort to address the upcoming Authorized Trading Partner requirement for DSCSA. rfxcel was impressed to see how seamlessly it integrated with our solution.” Dr. Oliver Nuernberg, Chief Product Owner at SAP says “For SAP, one of the key requirements was to ensure that the existing returns verification process is not impacted by adding credentialing. 
By making the credentialing optional, we further ensure that our customers can add this capability over time without disrupting existing processes.” The Spherity Credentialing Service enables supply chain actors to verify in real time that they are only exchanging information with Authorized Trading Partners (ATP), as per DSCSA requirements, even when they do not have a direct business relationship yet. The Spherity Credentialing Service integrates Legisym as a credential issuer and is based on the ATP architecture that was tested in an industry-wide pilot. Beyond DSCSA compliance, Spherity leverages process efficiencies of exchanging data with indirect business partners by avoiding manual and time-consuming due diligence processes, saving significant time and money for all participants in the ecosystem. To drive the utilization of decentralized digital identity technologies across the industry, Spherity participates in the newly founded Open Credentialing Initiative (OCI). As an industry consortium, this initiative incubates the ATP architecture and governs further standardization efforts. “Using ATP credentials for product verification interactions is just the tip of the iceberg. The established enterprise identities and associated verifiable credentials will leverage efficiency to exchange data in regulated environments”, says Georg Jürgens, Manager Industry Solutions at Spherity. About Spherity Spherity is building decentralized digital identity management solutions to power the fourth industrial revolution, bringing secure identities to enterprises, machines, products, data and even algorithms. We provide the enabling technology to digitize and automate compliance processes primarily in highly-regulated technical sectors like pharmaceuticals, automotive and logistics. Spherity’s decentralized cloud identity wallet empowers cyber security, efficiency and data interoperability among digital value chains. 
Spherity is certified according to the information security standard ISO 27001. Stay sphered by signing up for our newsletter, follow us on LinkedIn or Twitter. Press Inquiries For press relations contact: Marius Goebel communication@spherity.com",https://medium.com/spherity/spherity-launches-new-product-to-support-pharmaceutical-supply-chain-compliance-28e5592b2dee,,Post,,Product,,,,,,,,2021-04-01,,,,,,,,,,,,,
|
||
Spherity,Spherity,,,,,,,,,One-Button Trusted Release,"Medical products with verifiable credentials - Imagine a digitalized Trusted Release process on a batch with instant access to all the relevant information in verifiable form, rather than dealing with multiple distributed paper documents. Digitized and pre-validated data allows the Qualified Person to dig deeper or double-check any input with a minimum friction of effort.",,https://web.archive.org/web/20220517020902/https://spherity.com/pharma-one-button-trusted-release/,,Post,,Product,,Healthcare,,,,,Verifiable Credentials,2020-11-21,,,,,,,,,,,,,
|
||
Spherity,Spherity,,,,,,,,,Pharma Third Party Risk Management,"How can we shorten the supplier onboarding effort in Third Party Risk Management from the thirty days that is typical today, to only three days?",,https://web.archive.org/web/20210119061430/https://spherity.com/pharma-one-button-trusted-release/,,Post,,Product,,,,Risk Management,,,,2020-11-21,,,,,,,,,,,,,
|
||
Spruce,,Spruce,,Gregory Rocco; Jacob Blish; Wayne Chang,,"USA, New York, New York",USA,,,Spruce Systems,"Spruce is building a future where users own their identity and data across all digital interactions. Our open-source credentialing infrastructure is standards-compliant, production-ready, and extensible into typical enterprise and government IT systems","We're building the open-source stack to leave control of identity and data where it should be: with users. This begins with SSX. Spruce is building a future where users control their identity and data across all digital interactions. We believe in endowing individuals with control over privacy through open-source software that makes user-controlled interactions possible. Today, identity providers, such as Google, Facebook, or Apple manage the entire login experience, but so much more can be unlocked by unbundling the login: user control, data sharing, and faster innovation. We use these libraries as the building blocks for our own products and, in the spirit of collaborative innovation, we make them accessible as open-source libraries for other builders to use. SpruceID is an ecosystem of open source libraries to enable user-controlled identity anywhere. Kepler is a decentralized storage system that uses smart contracts to define where your data live and who has access. Enable reusable identity verifications across social media, DNS and more. Enable users to control their digital identity with their Ethereum account using Sign-In with Ethereum. 
Our libraries are the building blocks for our products, and in the spirit of collaborative innovation, they are modular and open-source for any builders in the community at large.",https://www.spruceid.com/,,Company,,Company,Enterprise,ID,,,,,,2020-05-13,,,,https://medium.com/@sprucesystems,https://medium.com/@sprucesystems,,https://www.crunchbase.com/organization/spruce-systems,https://www.linkedin.com/company/sprucesystemsinc/,,,,,
|
||
Spruce,TechCrunch,,,,Ethereum Foundation; Ethereum Name Service,,,,,Decentralized Identity Startup Spruce Wants to Help Users Control their Sign-In Data,The company [won an RFP](https://www.coindesk.com/policy/2021/10/08/sign-in-with-ethereum-is-coming/) from the Ethereum Foundation and Ethereum Name Service (ENS) to develop a standardized “sign-in with Ethereum” feature that could be interoperable with web2 identity systems [...] to let [users] control what information a platform receives about them when they sign in rather than automatically surrendering the data to the platform.,"Signing into websites using your Google or Facebook account has become so commonplace that lots of people don’t think twice before doing it. Keeping control over one’s own identity on the internet often requires a substantial sacrifice of convenience, so plenty of users have accepted the status quo of social media platforms being able to access and share their data freely, sometimes even in nefarious ways. Spruce, a decentralized identity startup, thinks the blockchain can fix this. The company won an RFP from the Ethereum Foundation and Ethereum Name Service (ENS) to develop a standardized “sign-in with Ethereum” feature that could be interoperable with web2 identity systems. The goal of allowing users to log in using a cryptographic identifier such as their Ethereum wallet address is to let them control what information a platform receives about them when they sign in rather than automatically surrendering the data to the platform. Wayne Chang, co-founder and CEO of Spruce, told TechCrunch that web2 platforms that offer sign-in capabilities have been able to access this data in the past because they offer trust and verification to users of the network. He and his co-founder, Gregory Rocco, both worked at blockchain infrastructure provider ConsenSys before starting Spruce. 
The company has been holding weekly calls to solicit input from the broader community on the “sign-in with Ethereum” project as it develops the feature, Chang told TechCrunch. Chang used the example of Uber to illustrate why centralized platforms have been viewed as valuable in the past and how a decentralized network could take its place. “If there’s an intermediary like Uber collecting 25%, they have to be doing something for the system. But what does it look like if those [intermediaries] became networks, and they were more like public utilities than a private company that’s trying to collect rent?” Chang said. That’s the question Spruce is trying to answer by building a public utility of sorts for internet users, but doing so requires individual users to build trust with one another by voluntarily sharing data through the network when they can’t rely on a centralized intermediary to make assurances. “If we imagine a smart contract-based ridesharing system, there’s a lot of concerns about that, because you don’t want to just send a transaction to a smart contract, and then step into the next car that pulls up. Instead, it’d be nice if the driver could present that they are a licensed driver, haven’t had too many accidents and that the network has [validated] their good reputation,” Chang said. In turn, the driver might want to know something about the rider’s reputation, akin to their star rating. Data on the internet could move in a similar way if it was decentralized and permissionless, allowing individuals to control what information they share with platforms, Chang continued. “A different way to phrase it is that there are transaction costs associated with booking a rideshare, and there’s a trust portion of those transaction costs. If you’re not able to mitigate [distrust] enough, then those transactions just won’t happen, so if we can move data in a decentralized and authentic way, then maybe a lot more is possible,” Chang said. 
Spruce offers two main products — SpruceID, a decentralized identity toolkit, and Kepler, a self-sovereign storage product. These products support use cases in service of Spruce’s broader goal, such as secure sharing of trusted data, DAO-based credentialing and reputation, and permissioned access to DeFi through a partnership the company has with the Verite protocol. Its customers are mostly Web3 projects looking to integrate the “sign-in with Ethereum” feature, Chang said. The feature has enjoyed a positive reception on social media from projects such as TIME Magazine’s TIMEPieces NFT project and decentralized autonomous organization (DAO) voting tool provider Tally. Spruce just announced it has closed a $34 million Series A round led by Andreessen Horowitz. Other participants in the round include Ethereal Ventures, Electric Capital, Y Combinator, Okta Ventures, SCB 10X, Robot Ventures and OrangeDAO, the company says. The startup announced it had raised a $7.5 million seed round in November last year. It plans to use the new funds to double its team of 15 employees by the end of the year, Chang said. He doesn’t want to scale the team past 30-35 employees in the near term, though, so the company can stay nimble and focused on solving specific problems such as key management for users who forget their passwords, he added. Chang sees “sign-in with Ethereum” and Kepler as the two main products Spruce plans to develop using the new funding, he said. “Sign-in with Ethereum,” in particular, is likely to be a catalyst for Spruce’s growth, he added. “We’re really hoping sign-in with Ethereum will become the standard choice whenever [users] want to sign in with something,” Chang said. “A really important note about that is it has to be an open, community-owned standard. We see [the feature] as the entry point for a lot more than just Spruce. 
I think that a lot of folks are looking at sign-in with Ethereum to be this way that users can use their existing wallet and interact with a new decentralized identity ecosystem.”",https://techcrunch.com/2022/04/20/decentralized-identity-startup-spruce-wants-to-help-users-control-their-sign-in-data/,,Post,,Meta,,,,,,Sign in with Ethereum,,2022-04-20,,,,,,,,,,,,,
|
||
Spruce,MarketScreener,,,,Okta; Wayne Chang,,,,,Founders in Focus: Wayne Chang of Spruce,Each month we highlight one of the founders of Okta Ventures' portfolio companies. You'll get to know more about them and learn how they work with Okta.,"Each month we highlight one of the founders of Okta Ventures' portfolio companies. You'll get to know more about them and learn how they work with Okta. This month we're speaking with Wayne Chang of Spruce. What is Spruce and what is your mission? Spruce is an open-source software company with the mission to let users control their data across the web, starting with Web3. What were you doing prior to Spruce that led you to this moment? Before Spruce, I was part of the leadership team for decentralized identity initiatives at ConsenSys, which incubated uPort, one of the first self-sovereign identity projects ever. It was at ConsenSys that we realized the power of the core technology. It gives individuals control, while phasing out rent-seeking intermediaries, and empowers end-users. Essentially, it diminishes the power of platforms that rely on keeping users locked in. The combination of self-sovereign identity and Web3 enables a model where being locked into a specific platform is erased, and control returns to the user-a victory for consumer choice. That's what we're trying to continue and bolster at Spruce. What is Spruce's solution? What challenges does it solve? We believe the world is moving away from today's centralized model, where users log in to platforms and may or may not be granted access based on various factors, to a decentralized model, where platforms access a user's personal data vault, and the user is empowered to adjust permissions for anyone, at any time. To get there, we must move towards open authentication systems based on public-key cryptography, such as Sign-In with Ethereum. 
Ethereum has tens of millions of monthly active users, and the ones we have spoken to are excited to take back their digital control. As these systems are developing, we're seeing a new class of compatible technologies, such as personal data vaults like Kepler. This software allows individuals, companies, and decentralized autonomous organizations to host and protect their data wherever they want, whether it's with a company they trust or a server in their basement-all without interruption of service. There will also be a shift away from proprietary databases and shadow profiles, and toward open standards that allow for digital credentials, exportable social media graphs, and data-all fully controlled by the user. We combine many of these open standards into two open-source products under the Apache 2.0 license: the decentralized identity toolkit DIDKit, and the white label-ready credential wallet Credible. Why did Spruce want to work with Okta? We wanted to work with Okta because companies that choose Okta tend to take security and data ownership pretty seriously. It's the top vendor recommended when companies are standardizing their company single sign-on strategy in pursuit of better security, digital accountability, or security compliance standards like SOC 2, ISO 27001, or FedRAMP. We're customers as well as Okta partners. These companies also tend to care about data sovereignty, zero-trust architectures, digital credentialing, and user-centric data workflows such as those found in Web3. Spruce solves many of these problem categories, and we're grateful to have the opportunity to collaborate with these companies in a way that works seamlessly with existing Okta installations. For example, our product allows any Okta or Auth0 customer to securely interact with blockchain accounts simply by installing a marketplace plugin. How is Spruce working with Okta? What support do you look for in a corporate partner? We are working with Okta in several ways. 
First, we are happy to announce the release of our Sign-In with Ethereum integration to the Auth0 marketplace, which allows any Auth0 customer to implement the Sign-In with Ethereum workflow with the click of a button to resolve data from the blockchain. In the near future, we hope to package our decentralized identity libraries to allow any Auth0 and Okta customer to enable data interoperability with W3C Verifiable Credentials and W3C Decentralized Identifiers. This means that Okta customers can share trusted data with each other, including professional certifications, cross-organizational approvals, budgets, financial statements, and much more, all while tightly controlling access criteria to the satisfaction of the CISOs. When working with a corporate partner, we look for scale and aligned incentives. It's apparent that Okta has the scale, with hundreds of millions of users on the service. What was especially impressive to us was how aligned the incentives were for Spruce, Okta, and even Okta's customers. Okta's leadership firmly believes in innovation, that the world is non-zero-sum, and there will be huge sectors opening up as we continue our transition into the digital age. Also, because Okta offers a straightforward service that doesn't monetize customer data, we find it to be well-aligned with our vision of data sovereignty. What trends do you expect to see in the Decentralized Identity industry? The following trends are combining into the perfect storm for the adoption of decentralized identity. - The proliferation of Web3. Web3 is proving to be the biggest movement of users taking back digital control that we've seen, it is also the most successful Public Key Infrastructure adoption event ever. For all decentralized identity projects, the widespread use of public-key cryptography is critical for successful rollouts. 
We think this thoroughly answers the question, ""why now?"" - Antitrust rulings, data privacy regulations, and growing user distrust of ""big tech"". It's no surprise that the FTC and the general public are upset about what's seen as large tech companies lacking accountability and hoarding data and power. People are growing wiser as to what's happening to their information behind the scenes, and they don't like it. Given this climate, many data privacy officers may actually prefer that user data be stored directly with their customers, and accessed only when necessary. They understand that much of the data their organizations store today may become illegal to hold without additional consent processes in place. As organizations are mandated by government regulations to allow users to export all their data in a useful way, we think personal data vaults will emerge as a popular way for users to take back control, while also mitigating privacy risks for corporations. - The transition to Zero Trust architecture. The White House has released a federal strategy toward Zero Trust, and this is a massive shift in the security industry. This change will favor systems built on public-key cryptography with next-generation authentication/authorization systems. The kinds of authentication and authorization we're working on are in exact alignment. - The emergence of the data supply chain. We think the world is growing smaller. In order to compete, companies will need to share more information with their collaborators than they ever have before. Data will be tracked and traced like assets along a physical supply chain, but instead of paper bills of lading, there will be digital certificates of origin, user consent packages, and certifications of data anonymization. This is all enabled using the tools from decentralized identity, in which not just people, but anything, can have an identifier-even an Excel file. Interested in joining Okta Ventures? 
Check out our FAQ here and feel free to reach out to our team or submit your business for review.",https://www.marketscreener.com/quote/stock/okta-inc-34515216/news/founders-in-focus-wayne-chang-of-spruce-40101309/,,Post,,Meta,,,,,,,,2022-04-20,,,,,,,,,,,,,
|
||
Spruce,Spruce,,,,,San Francisco,,,Graph Hack 2022,Spruce At Graph Hack 2022,"Earlier this month, The Graph hosted Graph Hack at the Palace of Fine Arts in San Francisco. Graph Hack was a three-day hackathon, bringing together developers to kickstart the latest dapps and ideas featuring a mix of on and off-chain data via a variety of impactful use-cases.","Spruce At Graph Hack 2022 Earlier this month, The Graph hosted Graph Hack at the Palace of Fine Arts in San Francisco. Graph Hack was a three-day hackathon, bringing together developers to kickstart the latest dapps and ideas featuring a mix of on and off-chain data via a variety of impactful use-cases. Earlier this month, The Graph hosted Graph Hack at the Palace of Fine Arts in San Francisco. Graph Hack was a three-day hackathon, bringing together developers in Web3 to kickstart their latest ideas featuring a mix of on and off-chain data. We're happy to have been a sponsor for the event, working with developers on the ground and fleshing out the next generation of applications that effectively leverage on-chain data and Sign-In with Ethereum. As always, our main focus is to help teams get started building with Sign-In with Ethereum and ushering forward new ways in which users can own their own identifier and identity. As part of the event, we had prizes for the best use cases that used Sign-In with Ethereum for a meaningful workflow in an application, and additional prizes for use cases that incorporate the core authentication flow. We're happy to highlight some of the projects that used Sign-In with Ethereum during the hackathon: Project Highlights using Sign-In with Ethereum Borantia is an on-chain volunteer platform that enables DAOs to create bounties for users to claim for related tokens and commemorative badge NFTs. 
In addition to the bounty system, the platform also offers a way for users to view their earned badges and tokens on their own profile pages, and it also has an in-app leaderboard to encourage continued volunteering. BlockParty is a social video gallery that enables users to capture videos and store them as NFTs organized by time and place. This process is to remember moments from live events, gatherings, and more, and share them with friends. Dynamic Carbon Offset NFTs is a project that features a way to offset your carbon emissions for a particular period of time by purchasing carbon credits as NFTs. The NFT represents a plant, and slowly dies over time as it gets closer to its expiration period. Proceeds from the sale of the NFTs go toward carbon offset programs. We would like to thank the team at The Graph for putting on a great event. We look forward to continuing to work with developers on the ground at various hackathons around the globe. We hope to see you at the next one! Spruce lets users control their data across the web. If you're curious about integrating Spruce's technology into your project, come chat with us in our Discord:",https://blog.spruceid.com/spruce-at-graph-hack/,,Post,,Meta,,,,,,,,2022-06-17,,,,,,,,,,,,,
|
||
Spruce,Spruce,,,,,,,,,Spruce Raises $34M to Unbundle the Login for a User-Controlled Web,"With the new funding, Spruce will spearhead research in cutting-edge privacy and usability technology for identity, grow its product teams, and continue to execute on partnerships across the ecosystem.","Spruce Raises $34M to Unbundle the Login for a User-Controlled Web We're excited to announce that we have raised $34 million in a Series A round led by Andreessen Horowitz. Spruce enables users to control their data across the web as the world becomes increasingly dependent on cryptography, networks, and digital economies. We're excited to announce that we have raised $34 million in a Series A round led by Andreessen Horowitz. Spruce enables users to control their data across the web as the world becomes increasingly dependent on cryptography, networks, and digital economies. Our product suite powers the necessary authentication, credentialing, and storage needed for portable reputation for users, providing decentralized access control to data, and interoperability between Web2 APIs and Web3. With the new funding, Spruce will spearhead research in cutting-edge privacy and usability technology for identity, grow its product teams, and continue to execute on partnerships across the ecosystem. Our Supporters We're thrilled to welcome Okta Ventures, SCB 10X, Robot Ventures, and OrangeDAO, and are grateful for continued participation from Ethereal Ventures, Electric Capital, Y Combinator, A.Capital Ventures, Third Kind Venture Capital, Protocol Labs, SV Angel, and Gemini Frontier Fund. They are joined by Alex Pruden, Anthony Sassano, Benjamin Jack, Dev Ojha, Ejaaz Ahamadeen, Jeromy Johnson, Juan Benet, Matias Woloski, Matt Condon, Matt Luongo, Ryan Li, Scott Belsky, Sunny Aggarwal, Teck Chia, Viktor Bunin, William Allen, Will Villanueva, and many more. 
We look forward to continuing our work with key supporters who share our vision of a user-controlled world, and to welcoming new ones. - If you’re interested in working on the future of user-controlled identity and data, we’re hiring! Check out our openings: To get started building with Spruce, check out: Spruce enables users to control their data across the web. If you're curious about integrating Spruce's technology into your project, come chat with us in our Discord:",https://blog.spruceid.com/spruce-raises-34m-to-unbundle-the-login-for-a-user-controlled-web/,,Post,,Meta,,,,,,Sign in with Ethereum,,2022-04-20,,,,,,,,,,,,,
|
||
Spruce,VentureBeat,,,,,,,,,"Spruce sets the bar for sovereign identity storage options, secures $7.5M","Spruce, a service that allows users to control their data across the web, has raised $7.5 million. The company builds open source, open standard developer tools helping users collect and control their data across the web. It helps prevent NFT frauds and defines access rules for decentralized","Spruce, a service that allows users to control their data across the web, has raised $7.5 million. The company builds open source, open standard developer tools helping users collect and control their data across the web. It helps prevent NFT frauds and defines access rules for decentralized finance pools and decentralized autonomous organizations (DAOs). The company has earned recognition for becoming the project lead for sign-in with Ethereum, a new form of authentication helping users control their digital identity with their Ethereum account and ENS profile rather than a traditional intermediary. Your Keys, Your Data Spruce’s tagline, “Your Keys, Your Data,” encapsulates the services it provides. Working seamlessly across multiple blockchains like Ethereum, Polygon, Tezos, Solana, Ceramic, and Celo, Spruce’s portfolio offers two flagship products: Spruce ID and Kepler. While SpruceID is an ecosystem of open source tools enabling user-controlled identity anywhere, Kepler is the decentralized storage that leverages smart contracts to determine the location of user data and its access. SpruceID is a collection of four service categories: DIDKit, Rebase, Keylink, and Credible. DIDKit serves as a cross-platform decentralized identity framework, while Rebase facilitates reusable identity verifications across social media, DNS, etc. 
Keylink links existing system accounts to cryptographic tokens. Credible is a white label-ready credential wallet. Kepler, on the other hand, is Spruce’s self-sovereign storage. With Kepler, a user can share their digital credentials, private files, and sensitive media to blockchain accounts. All they need to carry out this sharing process is a Web3 wallet. Kepler also helps serve exclusive content to chosen NFT holders. To refine access, it uses NFT attributes and other blockchain data. With Kepler’s permission-centric storage facilities, users can allow DAO-curated registry members to access sensitive content. Self-sovereign identity The benefit of Spruce’s sovereign storage facility is that individual users own their storage, and no one other than them can govern their personal data. An individual’s keys control the smart contracts that define their Kepler service contours. Users can manage and access their Kepler services through their Web3 wallet, without having to go for any additional downloads or installs. Additionally, the ‘programmable permissioning’ feature allows users to define their own rules. Users can set data access guidelines by determining the norms of who can do what. There is also the benefit of upgrading rules with ownership or identity verifiable modules. All these reasons are what motivated investors toward Spruce. According to Joe Lubin, cofounder of Ethereal Ventures, “combining identity and storage elegantly,” Spruce is “building user-centric, Web3-style tools for the decentralized future.” Along similar lines, Ken Deeter, an investment partner at Electric Capital, believes that “Spruce is redefining how applications collect and share our data with others.” Although Ethereal Ventures and Electric Capital led the round, Spruce won the support of a range of leading blockchain investors, including Bitkraft, Coinbase Ventures, Alameda Research, A. 
Capital Ventures, Protocol Labs, and the Gemini Frontier Fund.",https://venturebeat.com/2021/11/02/spruce-sets-the-bar-for-sovereign-identity-and-storage-options-secures-7-5m/,,Post,,Meta,,,,,,,,2021-11-02,,,,,,,,,,,,,
|
||
Spruce,Spruce,,Medium,,,,,,,Credible,"Spruce’s native credential wallet for the consumption, storage, and presentation of Verifiable Credentials on Android and iOS.",,https://medium.com/@sprucesystems/spruce-developer-update-2-484368f87ee9,,Post,,Product,,,,,,,,2020-10-07,,,,,,,,,,,,,
|
||
Spruce,Spruce,,,,,,,,,Spruce Developer Update #19,"- Sign-In with Ethereum offers a new self-custodial option for users who wish to assume more control<br>- Kepler is a decentralized storage network organized around data overlays called Orbits. Kepler allows users to securely share their digital credentials, private files, and sensitive media to blockchain accounts, all using a Web3 wallet","Spruce Developer Update #19 At Spruce, we’re letting users control their identity and data across the web. Here’s the latest from our development efforts across Sign-In with Ethereum, Kepler, and SpruceID. At Spruce, we’re letting users control their identity and data across the web. Here’s the latest from our development efforts: Sign-In with Ethereum Sign-In with Ethereum is a new form of authentication that enables users to control their digital identity with their Ethereum account and ENS profile instead of relying on a traditional intermediary. We recently posted a Sign-In with Ethereum-specific April recap that can be found here: Kepler Kepler is a decentralized storage network organized around data overlays called Orbits. Kepler allows users to securely share their digital credentials, private files, and sensitive media to blockchain accounts, all using a Web3 wallet. - Cryptoscript: added support for JSON templated REST queries as script input, including documentation, test methods, and verbose errors (cryptoscript#2) - The capability subsystem is implemented, providing a registry of delegations, invocations, and revocations (kepler#99). - Simplified Kepler SDK PR merged (kepler-sdk#35), and added an example dapp to show how to use it (kepler-sdk#37). - Simplified Kepler HTTP API to a single endpoint for invocation (of any action) and a single endpoint for delegation (of any action) (kepler-sdk#38, kepler#107). - Remote backend storage is now supported using S3 and DynamoDB (kepler#96, kepler#106). 
- Prometheus metrics were added (kepler#110), and work is underway to implement tracing and further metrics. - Work is underway to unify authentication in Kepler by supporting a single capability representation and adding translation functionality to the SDK. SpruceID SpruceID is a decentralized identity toolkit that provides everything you need for signing, sharing, and verifying trusted information. DIDKit - Added StatusList2021 implementation (ssi#278). - Auto-generate PRs to update bundled context files (ssi#421). - Add zCap context files; update other context files (ssi#419). - Allow RSA key lengths greater than 2048 with JsonWebSignature2020 (ssi#423). - Improve did:onion configuration (didkit#292). Spruce lets users control their data across the web. Through SpruceID and Kepler, Spruce provides an ecosystem of open source tools for developers that let users collect their data in one place that they control, and show their cards however they want. If you're curious about integrating Spruce's technology into your project, come chat with us in our Discord:",https://blog.spruceid.com/spruce-developer-update-19/,,Post,,Product,,,,,,Ethereum,,2022-05-04,,,,,,,,,,,,,
|
||
Spruce,Spruce,,,,,,,,,Spruce Developer Update #21,"We're currently working on a new, ergonomic way to use Sign-In with Ethereum and session management for applications, and are currently in the process of setting up beta testing. If you're interested in trying this out, please get in touch.","Spruce Developer Update #21 At Spruce, we’re letting users control their identity and data across the web. Here’s some of the latest from our development efforts. At Spruce, we’re letting users control their identity and data across the web. Here’s some of the latest from our development efforts: Sign-In with Ethereum Sign-In with Ethereum is a new form of authentication that enables users to control their digital identity with their Ethereum account and ENS profile instead of relying on a traditional intermediary. - We're currently in the process of designing and implementing a method of delegating object capabilities to a session key using a SIWE message. For more on session keys, check out: - We are currently working on various fixes across our libraries such as an update for our Discourse plugin, updating some of our examples, and are in the process of releasing v2.0.4 of our SIWE core library. - As mentioned in our previous update, our community-run identity server via the ENS DAO has had a witnessed deployment and is currently set up on Cloudflare along with relevant access for witnesses. Additionally, the Sign-In with Ethereum documentation has been updated to point to this new server. - We are currently finishing work with a major wallet on a direct Sign-In with Ethereum integration, and are currently working on how Sign-In with Ethereum can support non-standard verification methods. - We're currently working on a new, ergonomic way to use Sign-In with Ethereum and session management for applications, and are currently in the process of setting up beta testing. If you're interested in trying this out, please get in touch. 
Kepler Kepler is a decentralized storage network organized around data overlays called Orbits. Kepler allows users to securely share their digital credentials, private files, and sensitive media to blockchain accounts, all using your Web3 wallet. - We've rewritten the core SDK functionality in Rust, refactored out some core definitions from kepler to kepler-lib, and added support for CACAO-ZCAPs (kepler #116) - We've implemented better bundling of the Wasm SDK dependency to improve developer experience, removing the need for specific configuration downstream. (kepler-sdk #40) SpruceID SpruceID is a decentralized identity toolkit that provides everything you need for signing, sharing, and verifying trusted information. DIDKit - We've added a basic UCAN implementation that takes advantage of ssi's JWT/JWS and DID tools. (ssi#447). - Various minor improvements and fixes. Rebase - We have deployed the first demo example of a Rebase frontend that allows users to go through various workflows that result in the user receiving a valid Verifiable Credential. - The demo features credential workflows for Twitter accounts, GitHub accounts, DNS ownership, and demonstrating ownership over two Ethereum wallets. - We've fully documented the architecture, and have added guides on implementing new signers, witness flows, and schemas. This information will also be added to the core SpruceID documentation. - Our next step here is to contribute this codebase to the Rebase community initiative, allowing any organization to issue Rebase credentials. Standards and Community - We congratulate the Decentralized Identity community on the DID-core specification moving forward to become a W3C recommendation. Spruce lets users control their data across the web. Through SpruceID and Kepler, Spruce provides an ecosystem of open source tools for developers that let users collect their data in one place that they control, and show their cards however they want. 
If you're curious about integrating Spruce's technology into your project, come chat with us in our Discord:",https://blog.spruceid.com/spruce-developer-update-21/,,Post,,Product,,,,,,Ethereum,,2022-07-07,,,,,,,,,,,,,
|
||
Spruce,Spruce,,,,,,,,,Spruce Developer Update #23,"Updates on Sign in with Ethereum, Kepler, DIDKit, Rebase","Spruce Developer Update #23 At Spruce, we’re letting users control their identity and data across the web. Every month, we release a developer update detailing our progress on our open source libraries and beyond. At Spruce, we’re letting users control their identity and data across the web. Every month, we release a developer update detailing our progress on our open source libraries and beyond. Check out our previous update here: Here’s the latest from our development efforts: Sign-In with Ethereum Sign-In with Ethereum is a new form of authentication that enables users to control their digital identity with their Ethereum account and ENS profile instead of relying on a traditional intermediary. - As mentioned in a previous update, most of our efforts are currently focused on a product that will make working with Sign-In with Ethereum even easier for developers. Additionally, this initiative will also include enhancements to existing applications and additional information on user interactions. Interested in testing this out early? Get in touch! - CapGrok, an extension to EIP-4361 which provides concise wallet-signable messages with capability delegations, has been submitted to the EIP repository for consideration to become a draft EIP (4362). We will soon have an accompanying blog post breaking down CapGrok, and what it means for the future of Sign-In with Ethereum. Check it out here: - We're working on various improvements to our Sign-In with Ethereum TypeScript library, which will be reflected next month in a v2.1 release. Kepler Kepler is a decentralized storage network organized around data overlays called Orbits. Kepler allows users to securely share their digital credentials, private files, and sensitive media to blockchain accounts, all using your Web3 wallet. 
- Established performance baseline with load tests, and refined API errors (kepler#118) - Kepler SDK: Abstract over the kepler-sdk-wasm interface, to be able to swap in any module that satisfies that interface (kepler-sdk#48) SpruceID SpruceID is a decentralized identity toolkit that provides everything you need for signing, sharing, and verifying trusted information. Documentation for our core identity tooling can be found here: DIDKit / ssi - As mentioned in a previous update, we're currently restructuring our ssi library to make it even easier for developers to import and use. The ssi crate has now been restructured into a variety of more feature-specific crates, which are brought together in the top-level ssi crate (ssi#457). - Update Python examples and fix CI (didkit#308) Rebase We've introduced several new flows and functionality to Rebase including: - A flow linking an active Reddit account to a selected identifier (Rebase #29). - A flow linking a SoundCloud account to a selected identifier (Rebase #30). - Ongoing: the ability to use a Solana account as an identifier, obtain credentials, and link two public keys together [Ethereum to Ethereum, Solana to Solana, or Ethereum to Solana] (Rebase #32). TreeLDR - We've released our first implementation of TreeLDR: an open-source developer tool with a DSL that makes managing data schemas as easy as defining data structures in your favorite (sane) statically-typed language. To read more about TreeLDR and how we're using it internally (or to try it out), check out the following: Spruce lets users control their data across the web. Spruce provides an ecosystem of open source tools for developers that let users collect their data in one place they control, and show their cards however they want. 
If you're curious about integrating Spruce's technology into your project, come chat with us in our Discord:",https://blog.spruceid.com/spruce-developer-update-23/,,Post,,Product,,,,,,"Ethereum,Kepler,DIDKit,Rebase",,2022-09-06,,,,,,,,,,,,,
|
||
Spruce,Spruce,,Medium,,,,,,,Spruce Developer Update #8,- “We are currently working on a project that will enable creator authenticity for digital assets including NFTs.”<br>- “focused on advancing did-tezos as the first formally verified DID Method.”<br>- DIDKit Updates<br>- Credible Updates,"Spruce Developer Update #8 At Spruce, we’re building the most secure and convenient way for developers to share authentic data. Here’s the latest from our open source development efforts: Work-in-Progress: Creator Authenticity We are currently working on a project that will enable creator authenticity for digital assets including NFTs. The initial smart contracts are written, as well as a CLI/library to interact with web applications. We plan on alpha testing the application this week. Formally Verifying the Tezos DID Method The Tezos DID method is a DID method that optimizes for privacy, enables formal verification, and scales to billions of identifiers by using “off-chain updates,” which allow private networks to extend and update on-chain data. A lot of our current work is focused on advancing did-tezos as the first formally verified DID Method. We’ve continued work on improving the DID method’s core smart contract for on-chain updates. A first version of the formal proof has also been written, and a CI pipeline has been established. DIDKit Updates DIDKit is a cross-platform toolkit for working with W3C Decentralized Identifiers (DIDs) and Verifiable Credentials (VCs). - Added a Python package. - Added a Django example app. - Added a Flask example app. - Added a JavaServer Pages (JSP) example app. - Added a Svelte example CHAPI wallet. - We’ve enabled DID Methods to use HTTP(S) on WASM and Android. - Conducted a test with the VC HTTP API v0.0.2 test suite. Test report. - Worked on support for Relative DID URLs. - Improved DID URL dereferencing to support more DID documents. - Support publicKeyBase58 for Ed25519. - Implement did:onion. 
- (WIP) Implement did:pkh — a DID method for deterministic handling of public key hashes by curve. - Released ssi v0.2.0. - Published to crates.io: ssi, ssi-contexts, did-web, did-method-key, did-tz, did-sol, did-pkh, did-ethr, did-onion. - General bug fixes. Credible Updates Credible is a credential wallet for the verification, storage, and presentation of Verifiable Credentials using Decentralized Identifiers. In addition to our native mobile editions, we’ve since written a browser extension version of Credible along with an SDK to enhance any web application with decentralized identity. If you would like to discuss how we would deploy the architecture described above for a specific use case, please take 30 seconds to leave us a message, and we will respond within 24 hours.",https://sprucesystems.medium.com/spruce-developer-update-8-70f04e95a5d4,,Post,,Product,,,,,,,,2021-04-06,,,,,,,,,,,,,
|
||
Spruce,Spruce,,,,,,,,,Sign in with Ethereum,"Already used throughout Web3, this is an effort to standardize the method with best practices and to make it easier for web2 services to adopt it.",,https://login.xyz/,,Project,,Product,,,,,,"Ethereum,Sign in with Ethereum",,2021-10-01,,,,,,,,,,,,,
|
||
Spruce,Spruce,,Medium,,,,,,,DIDKit,a cross-platform toolkit for working with W3C Decentralized Identifiers (DIDs) and Verifiable Credentials (VCs).,"Introducing DIDKit In order to better work with decentralized identifiers and verifiable credentials, we’re working on DIDKit. DIDKit is a cross-platform toolkit for working with W3C Decentralized Identifiers (DIDs) and Verifiable Credentials (VCs). It allows you to resolve and manage DID documents, and also manage the entire lifecycle of Verifiable Credentials including their issuance, presentation, and verification. Notably, it reuses the same codebase across command-line tooling, RESTful HTTP servers, and platform-specific SDKs to facilitate code-level interoperability and a low total cost of ownership. When building ecosystems using decentralized identity to enable verifiable information, many actors must share the same underlying data formats and processing algorithms across different roles. We wrote DIDKit in Rust due to its memory safety, expressive type system, and suitability across a variety of systems and environments. For example, the Rust ecosystem has already explored WASM compilation targets in support of single-page apps running in browsers, and we wanted to be able to support those and also browser extensions with DID and VC operations. The same codebase can also run nimbly on embedded systems with moderately constrained resources (memory in the megabytes and CPU in the megahertz). Finally, Rust programs are able to directly interface with production-ready cryptographic libraries, as seen with Hyperledger Ursa’s use of openssl, libsodium, and libsecp256k1. Currently, we have a working suite of command-line tools for credential issuance, presentation, and verification. We are creating an HTTP server conforming to VC HTTP API, and we have native iOS and Android libraries that are used in our Credible wallet. 
If you would like to discuss how we would deploy the architecture described above for a specific use case, please take 30 seconds to leave us a message, and we will be more than happy to show our progress to date or show a demo.",https://sprucesystems.medium.com/introducing-didkit-an-identity-toolkit-e0dfa292f53d,,Code,,Resources,,,,,,,,2020-11-13,,,,,,,,,,,,,
|
||
Spruce,Spruce,,,,,,,,,Keylink,"Keylink is an in-development tool that links system accounts to keys. Accounts are authenticated using combinations of widely adopted protocols such as OpenID Connect and FIDO2. Keys can range from cryptographic keys to API credentials. Keylink can gradually bootstrap PKI within existing complex IT systems. It supports a centralized PKI operating mode that can evolve into decentralized PKI, and further coordinates with existing PKI and KMS installations.",,https://github.com/spruceid/keylink,,Code,,Resources,,,,,,DPKI,"OpenID Connect,FIDO2",2023-01-01,,,,,,,,,,,,,
|
||
Spruce,Spruce,,,,,,,,,Introducing TreeLDR: A Canopy Across Your Data Schema Dreams,TreeLDR is an open-source developer tool with a DSL that makes managing data schemas as easy as defining data structures in your favorite (sane) statically-typed language.,"Introducing TreeLDR: A Canopy Across Your Data Schema Dreams TreeLDR is an open-source developer tool with a DSL that makes managing data schemas as easy as defining data structures in your favorite (sane) statically-typed language. As we discover new ways to let users control their data across the web, we face plenty of hard problems to solve on the way there. We keep encountering the challenge of managing data schemas, especially when you add digital signing to them as in the case of W3C Verifiable Credentials. - How can you have a handle on your data when you don’t know how to describe them? - How do you go from machine-readable JSON to human-friendly understanding? - Which fields are required, and what do they mean? Is that bankId referring to a financial institution or a river bank? Fortunately, a crop of solutions has emerged for the problem of JSON data schema management over the years, including Semantic Web technologies (JSON-LD and SPARQL), JSON Schema, CouchDB views, and IPLD. The downside is that there are many categories of ways to manage data, primarily semantic meaning and validation, and combining them into a complete data schema management system is full of pitfalls and unpaved paths. For example, - JSON-LD will add semantic meaning to what a “LeaseAgreement” is in a specific context, but has no straightforward way to enforce that the “startDate” is an ISO 8601 datetime like “2022-08-16”. - JSON Schema can be used to require that “age” is greater than or equal to 21, but cannot explain who or what is being described by the age field in a way understandable by both humans and machines. 
- There is no agreed-upon way to perform wholesale migrations from one schema to the next one, or to rollback changes. There are some low-level protocols such as JSON patches that can serve as building blocks, but how would one automatically transform an OpenBadges V2 credential into an OpenBadges V3 one by configuring a managed migration instead of writing custom software that needs its own deployment pipeline? - How would you describe a JSON credential schema that must have been issued (digitally signed) by specific Ethereum or Solana accounts? What if this list of issuers needs to change, or networks need to be added based on different cryptography? We like emergence, and therefore oppose solutions that assume a single entity can efficiently propose, define, and evolve data schemas for all conceivable use cases across disparate verticals. We believe this to be technically infeasible, politically difficult, and also against the tenets of decentralization. Instead, we much prefer approaches where developers are empowered to self-serve, leveraging their specific domain knowledge to create data schemas that suit their use cases well–and when they need to, easily collaborate with other developers to reach a rough consensus on what would work for even more implementers. Introducing TreeLDR (Tree Linked Data Representation) That’s why we’re happy to introduce TreeLDR, which is an open-source developer tool with a DSL that makes managing data schemas as easy as defining data structures in your favorite (sane) statically-typed language. TreeLDR provides a single language to define common concepts (types) and shared data representations (layouts) that can then be compiled into a concert of data schema artifacts. It can be used to produce JSON Schemas, JSON-LD contexts, migration strategies, and eventually entire SDKs (with credential issuance and verification) in various target programming languages. 
In TreeLDR, not only can you import other TreeLDR definitions but also existing schemas such as JSON-LD contexts or XML XSDs. This way, developers can define data layouts in a familiar way and focus purely on the application they wish to build. Today, it just supports printing out JSON Schema and JSON-LD contexts, but it’s already usable, and more features are on the way. We felt it most important to release and quickly iterate against feedback from implementers as soon as possible, so here it is. We already use it to represent credential schemas using W3C Verifiable Credentials for the Rebase project. Here's an example of a TreeLDR file being compiled into both JSON-LD Context and JSON Schema: // Sets the base IRI of the document. base <https://example.com/>; // Defines an `xs` prefix for the XML schema datatypes. use <http://www.w3.org/2001/XMLSchema#> as xs; // A person. type Person { /// Full name. name: required xs:string, /// Parents. parent: multiple Person, /// Age. age: xs:nonNegativeInteger } After defining the TreeLDR file, we can run the following to define JSON-LD Context: tldrc -i example/xsd.tldr -i example/person.tldr json-ld context https://example.com/Person and the following is the output result: { ""name"": ""https://example.com/Person/name"", ""parent"": ""https://schema.org/Person/parent"", ""age"": ""https://schema.org/Person/age"" } We can also run the following to generate JSON Schema: tldrc -i example/xsd.tldr -i example/person.tldr json-schema https://example.com/Person and the following is the output result: { ""$schema"": ""https://json-schema.org/draft/2020-12/schema"", ""$id"": ""https://example.com/person.schema.json"", ""description"": ""Person"", ""type"": ""object"", ""properties"": { ""name"": { ""description"": ""Full name"", ""type"": ""string"" }, ""parent"": { ""description"": ""Parents"", ""type"": ""array"", ""items"": { ""$ref"": ""https://example.com/person.schema.json"" } }, ""age"": { ""description"": ""Age"", ""type"":
""integer"", ""minimum"": 0 } }, ""required"": [ ""name"" ] } What's Next - Enable merged representation of JSON-LD contexts and JSON Schema into a single file. - Define platform-agnostic and language-agnostic migration formats to upgrade/downgrade data schema versions automatically. - Begin integration with cryptoscript and DIDKit to produce developer SDKs that speak W3C Verifiable Credentials, including signing, verifying, and enforcing simple trust frameworks. - Investigate support for the IPLD ecosystem, including IPLD Data Schemas and Advanced Data Layouts (ADLs). TreeLDR is one of many open source tools released by Spruce to help developers tame the complexity of decentralized identity. We are sharing it publicly before an official release so we can get your feedback, so please send any thoughts on the roadmap, feature requests, and general reactions to hello@spruceid.com or in our Discord. Today, we will dogfood it across our projects before an official release, so if this is the kind of thing you'd like to work on, please check out our careers page. To give TreeLDR a spin yourself, check out the Quickstart in our documentation: Spruce lets users control their data across the web. Spruce provides an ecosystem of open source tools and products for developers that let users collect their data in one place they control, and show their cards however they want. If you're curious about integrating Spruce's technology into your project, come chat with us in our Discord:",https://blog.spruceid.com/introducing-treeldr-a-canopy-across-your-data-schemas/,,Post,,Resources,,,,,TreeLDR,,,2022-08-26,,,,,,,,,,,,,
|
||
Spruce,Spruce,,,,,,,,,Spruce Systems introduces DIDKit,"DIDKit is a cross-platform toolkit for working with W3C Decentralized Identifiers (DIDs) and Verifiable Credentials (VCs). It allows you to resolve and manage DID documents, and also manage the entire lifecycle of Verifiable Credentials including their issuance, presentation, and verification.",,https://sprucesystems.medium.com/introducing-didkit-an-identity-toolkit-e0dfa292f53d,,Post,,Resources,,,,,,DIDKit,"DID,Verifiable Credentials",2020-11-13,,,,,,,,,,,,,
|
||
Spruce,Spruce,,,,,,,,,Upgradeable Decentralized Identity - DID Method Traits,DID method traits are testable properties about DID methods that can help implementers tame complexity and choose the right DID method(s) for their use case.,"Upgradeable Decentralized Identity - DID Method Traits DID method traits are testable properties about DID methods that can help implementers tame complexity and choose the right DID method(s) for their use case. Just yesterday, W3C Decentralized Identifiers were approved to be released as an official W3C Recommendation. As a W3C member organization, we are thrilled by this excellent outcome, and will celebrate it by sharing our favorite ideas about the next evolutions of DIDs that will make them more secure, composable, and friendly to implementers. A decentralized identifier (DID) is a URI that resolves to a JSON object called a DID document, which describes how to authenticate as the DID’s controller for different purposes. When a service knows that it’s talking to the controller, it can use this fact as the consistent anchor point to construct a decentralized identity, enriching the session with related data referring to the DID such as verifiable credentials (VCs) or any associated information found on a public blockchain. Where do DID documents come from? Each DID specifies a “DID method” that describes an exact resolution procedure (among other actions) to interpret the DID’s “method specific identifier” and ultimately produce a DID document. DID methods can retrieve data from a variety of sources: TLS-protected websites, public blockchains, or solely from the method-specific identifier itself. Over the past several years, dozens of different DID methods have emerged in practice, with proponents enthusiastic at how powerful and flexible DIDs can be to bridge disparate trust systems (e.g., Ethereum, GPG, and X.509), and detractors declaring an impending interoperability nightmare, with plenty of headaches for implementers. 
However, there is a way we can prevent this impending interop nightmare! Enter DID method traits: testable properties about DID methods that can help implementers tame complexity and choose the right DID method(s) for their use case. They can be used as requirements revealing which DID methods could satisfy the relevant constraints presented across different use cases. For example, requirements to support certain operations for the NIST curve P-256, the NIST-proposed curve Ed25519, and the Bitcoin curve secp256k1 could all be expressed as different DID method traits complete with test suites. Also expressible as DID traits is the guarantee that a DID method is “purely generative,” requiring no storage lookups as in the case of did-key and did-pkh, as opposed to those actively querying a network such as did-web, did-ens, and did-ion. Finally, there may exist a DID method trait that ensures composability across different DID methods: that one DID may serve as the authentication method for another DID, such as did-pkh for did-ens or did-ion. This means that a user can start with an Ethereum account represented as did-pkh, then “upgrade” to a DID method that supports key rotation such as did-ens or did-ion. This helps create a great user experience when using DIDs, as with this approach, users do not need to set up a new decentralized public key infrastructure just to get started. Instead, they can start with whatever key-based accounts they have, leverage the corresponding DID methods, and graft their existing identifier to a more featureful DID method supporting this kind of composability whenever needed. Previous work has been done on the DID method rubric, which evaluates criteria as wide-ranging as underlying network decentralization, adoption metrics, and regulatory compliance. DID method traits may exist as a subset of possible criteria in the DID method rubric, or as a parallel spec used in conjunction. 
I will be writing a paper on DID method traits as my submission to the forthcoming Rebooting the Web of Trust (RWOT) conference in The Hague. If you’d like to collaborate on this, please reach out! Spruce lets users control their data across the web. If you're curious about integrating Spruce's technology into your project, come chat with us in our Discord:",https://blog.spruceid.com/upgradeable-decentralized-identity/,,Post,,Standards,,,,,,,DID,2022-07-01,,,,,,,,,,,,,
|
||
Transmute,,Transmute,,Eric Olszewski; Karyl Fowler; Orie Steele,DIF; DHS; Microsoft; Consensys; Oracle,"USA, Texas, Austin",USA,,,Transmute,"Build a network of trust with Transmute<br><br>Transmute secures critical supplier, product, and shipment data to give customers a competitive edge in the increasingly dynamic global marketplace.",,https://www.transmute.industries/,,Company,,Company,Enterprise,Supply Chain,,,VCI,Ethereum,"DID,Verifiable Credentials,OAuth,OIDC",2017-05-01,https://github.com/transmute-industries,https://twitter.com/transmutenews,https://www.youtube.com/channel/UCUtBzCKziRpFleZcsnVpUkw,https://medium.com/transmute-techtalk/,https://medium.com/@Transmute,,https://www.crunchbase.com/organization/transmute-industries,https://www.linkedin.com/company/transmute-industries/,,,,,
|
||
Transmute,NextLevelSupplyChain,,,,,,,,,Visibility 2.0: Creating Digital Consistency in an International Supply Chain,"how can something as complicated as the international supply chain take fundamental trade practices and marry them with innovation so we can move at the speed of digitization? Join us for a mind-blowing discussion with Karyl Fowler, CEO at Transmute","Aug 10, 2022 Innovation tends to move more quickly than we can update our processes and infrastructure. So how can something as complicated as the international supply chain take fundamental trade practices and marry them with innovation so we can move at the speed of digitization? Join us for a mind-blowing discussion with Karyl Fowler, CEO at Transmute, and hear about the work being done to digitize trade documentation in a way that is cryptographically verifiable and traceable across the entire logistics ecosystem.",https://nextlevelsupplychainpodwithgs1us.libsyn.com/visibility-20-creating-digital-consistency-in-an-international-supply-chain,,Episode,,Explainer,,Supply Chain,,,,,,2022-08-10,,,,,,,,,,,,,
|
||
Transmute,Transmute,,Medium,,Conexus Indiana,,,,,Blockchain-secured Documents for Global Trade,"On May 14th, 2020 Conexus Indiana and Transmute hosted an interactive webinar titled “Blockchain-secured Documents for Global Trade” as part of the Emerging Technology Showcase series.","Blockchain-secured Documents for Global Trade Recap of Conexus Indiana & Transmute Emergent Technology Showcase On May 14th, 2020 Conexus Indiana and Transmute hosted an interactive webinar titled “Blockchain-secured Documents for Global Trade” as part of the Emerging Technology Showcase series. The full recording is available above, or can be viewed on YouTube. Presenters (By order of appearance) - Mitch Landess: Vice President of Innovation and Digital Transformation at Conexus Indiana - Karyl Fowler: CEO & Co-founder at Transmute - Vincent Annunziato: Director, Business Transformation & Innovation Division, Trade Transformation Office, Office of Trade, Customs & Border Protection - Margo Johnson: Head of Product at Transmute - Anil John: Technical Director, Department of Homeland Security, Science & Technology Silicon Valley Innovation Program Showcase Overview Transmute is an Austin, TX based technology company that secures supplier, product, and shipment data for global supply chains. Together with our partners at the Department of Homeland Security, Science & Technology Silicon Valley Innovation Program and the US Customs and Border Protection Office of Trade, Transmute shared tangible examples of how technology, including blockchain, secure data storage, decentralized identifiers and verifiable credentials are being leveraged to digitize critical business documents for global trade. Transmute demoed what this technology looks like for manufacturers and importers with international supply chains. 
Participants were encouraged to join the session if they found value in: - Looking for ways to increase efficiencies and decrease costs associated with import and export of goods - Exploring counterfeit-reduction and chain of custody for critical goods ranging from steel to medical test kits - Curious about how targeted use of blockchain technology can bring tangible advantages to their business About Conexus Conexus Indiana’s Emerging Technology Showcase is a series that highlights specific technology solutions that offer a value proposition to Advanced Manufacturing and Logistics (AML) organizations. Featured technologies are qualified by Conexus to be commercially launched while still early-stage. The intent is to give Indiana-based companies exposure and direct access to credible technology solutions that can offer a competitive advantage prior to the technology’s widespread availability. If you represent an Indiana-based AML company or technology solution and would like to suggest a candidate technology to be featured in a future Emerging Technology Showcase please reach out to Conexus directly.",https://medium.com/transmute-techtalk/blockchain-secured-documents-for-global-trade-3369d4cfab1f,,Webinar,,Explainer,,Supply Chain,,,,,,2020-05-20,,,,,,,,,,,,,
|
||
Transmute,Transmute,,Medium,,,,,,,Identity Terms Provide Value along the Supply Chain: How We Know When to Buy the Farm,"Jessica Tacka: Supply chain credentialing in the form of bills of lading, certificates of origin, or letters of credit is used to protect honest parties and their merchandise from being confused with dishonest parties or entities that are engaged in unethical practices, such as environmental destruction or forced labor.",,https://medium.com/@transmute/identity-terms-provide-value-along-the-supply-chain-how-we-know-when-to-buy-the-farm-738701967e3d,,Post,,Explainer,,Supply Chain,,,,,,2022-06-09,,,,,,,,,,,,,
|
||
Transmute,Transmute,,,,,,,,,Takeaways from the Suez Canal Crisis,"Karyl Fowler: An Appeal for Supply Chain Agility — Powered by Verifiable Credentials. The Suez Canal debacle had a massive impact on global supply chains — estimated at >$9B in financial hits each day the Ever Given was stuck, totaling at nearly $54B in losses in stalled cargo shipments alone.","Takeaways from the Suez Canal Crisis An Appeal for Supply Chain Agility — Powered by Verifiable Credentials The Suez Canal debacle had a massive impact on global supply chains — estimated at >$9B in financial hits each day the Ever Given was stuck, totaling at nearly $54B in losses in stalled cargo shipments alone. And it’s no secret that the canal, which sees >12% of global trade move through it annually, dealt an especially brutal blow to the oil and gas industry while blocked (given it represents the primary shipping channel for nearly 10% of gas and 8% of natural gas). While the Ever Given itself was a container ship, likely loaded with finished goods versus raw materials or commodities, the situation has already — and will continue to — have a massive negative impact on totally unrelated industries…for months to come. Here’s an example of the resulting impact on steel and aluminum prices; this had related impact again to oil and gas (steel pipes flow oil) as well as infrastructure and…finished goods (like cars). And the costs continue to climb as the drama unfolds with port authorities and insurers battling over what’s owed to who. Transmute is a software company — a verifiable credentials as a service company to be exact — and we’ve been focused specifically on the credentials involved in moving steel assets around the globe alongside our customers at DHS SVIP and CBP for the last couple years now. 
Now, there’s no “silver bullet” for mitigating the fiscal impact of the Ever Given on global trade, and ships who arrived the day it got stuck or shortly after certainly faced a tough decision — sail around the Cape of Africa for up to ~$800K [fuel costs alone] + ~26 days to trip or wait it out at an up to $30K per day demurrage expense [without knowing it’d only be stuck for 6 days or ~$180,000]. So what if you’re a shipping manager and you can make this decision faster? Or, make the call before your ship arrives at the canal? [Some did make this decision, by the way]. What if your goods are stuck on the Ever Given — do you wait it out? Switching suppliers is costly, and you’ve likely got existing contracts in place for much of the cargo. Even if you could fulfill existing contracts and demand on time with a new supplier, what do you do with the delayed cargo expense? What if you’re unsure whether you can sell the duplicate and delayed goods when they reach their originally intended destination? Well, verifiable credentials — a special kind of digital document that’s cryptographically provable, timestamped and anchored to an immutable ledger at the very moment in time it’s created — can give companies the kind of data needed to make these sorts of decisions. With use over time for trade data, verifiable credentials build a natural reputation for all the things the trade documents are about: suppliers, products, contracts, ports, regulations, tariffs, time between supply chain handoff points, etc. This type of structured data is of such high integrity that supply chain operators can rely on it and feel empowered to make decisions based on it. What I’m hoping comes from this global trade disaster is a change in the way supply chain operators make critical decisions. 
Supply chains of the future will be powered by verifiable credentials, which seamlessly bridge all the data silos that exist today — whether software-created silos or even the paper-based manual, offline silos. Today, it’s possible to move from a static, critical chain style of management where we often find ourselves in a reactive position to supply chains that look more like an octopus. High integrity data about suppliers and products enables proactive, dynamic decision making in anticipation of and in real time response to shifts in the market — ultimately capturing more revenue opportunities and mitigating risk at the same time.",https://medium.com/transmute-techtalk/takeaways-from-the-suez-canal-crisis-971f7404b058,,Post,,Explainer,,Supply Chain,,,,,,2021-04-20,,,,,,,,,,,,,
|
||
Transmute,Transmute,,Medium,,,,,,,The Business Case for Interoperability,"For Transmute, the foundations required to technically interoperate are considered pre-competitive, and our ability to interoperate widely is a strategic feature. This feature powers a competitive advantage that ensures Transmute customers’ critical trade documents are verifiable at every step in the supply chain, regardless of where they’re stored and what blockchain they’re anchored to. Transmute customers realize maximum confidence about data integrity and access far richer insights about the health of their supply chains as a result.","The Business Case for Interoperability Transmute believes that cross-vendor interoperability is critical for commercialization of decentralized identifier (DID) and verifiable credential (VC) technology. To interoperate means that a computer system or software has the ability to exchange and make use of data from a different system or software “via a common set of exchange formats.” But technical interoperability is difficult to achieve, and it’s definitely not the status quo when it comes to the systems our customers are using today. In fact, most of them are [frustratingly] locked into a single vendor when it comes to transaction management software — with no easy way to share data with parties in their ecosystem. I’m often asked why we, as a start-up, would build so much out in the open. We aren’t afraid to put out the first reference implementation — although never without tests — because we’re keen to have a baseline to measure against and iterate on, especially when it comes to the technical foundations required for achieving interoperability. The more interesting question is why we work so hard to ensure interoperability with “competing” or ancillary DID-based products. The answer is that we view achieving interoperability as a requirement for market creation. We are solving a business problem; achieving interoperability is a barrier to adoption. 
As the world becomes increasingly hyperconnected, “connective tissue” products and services [like ride-shares, Slack, etc.] are in greater demand in order to bridge the last miles and information silos created between the multitude of disparate internet-enabled products and services we now rely on. For this product category, interoperability is required to create a market compelling enough to go after. For instance, if you are loyal to a single ride-share brand today, your user experience suffers. It takes longer to find rides if you’re loyal, and transit time is a key metric in mobility. Additionally, a frustrating user experience limits adoption, and throws a major kink in customer retention. If riders have multiple apps and are willing to ride whatever brand is most convenient to them, their user experience massively improves, adoption accelerates, and the market expands for everyone. This example demonstrates market expansion due to interchangeability versus technical interoperability, but it highlights the same impact to customer adoption and retention given customers’ current attitude toward avoiding vendor lock-in. Similarly, if my Texas issued digital driver license can’t be verified by my California-resident digital wallet, then I have not sufficiently solved the inefficiencies and traceability problems of physical licenses. In this scenario, a new, worse inconvenience is created since adoption of a solution that lacks interoperability means I’ll have to carry my physical license for interstate transit and my digital one. Furthermore, convincing state DMVs to offer a digital identity credential is a tremendous feat (it took Texas 3 legislative sessions to agree to a single pilot); imagine when you have to sell them on implementing 50 different versions. After all, my physical form Texas drivers license is already an acceptable identification credential across all 50 states. 
One criticism of supply chain-focused solutions using blockchain is that in order to realize value, you have to convince all of the ecosystem players to adopt the same product — or at least the same technology stack. Not only is that a tough sell in terms of architecture investment (nobody wants to “rip-and-replace,” and everybody hates vendor lock-in), but it does not sufficiently address the data sharing challenges that logistics professionals face today. Efficiency gains are necessary, but not sufficient; customers want to share critical data in a provable form regardless of the underlying system. If a brand uses software to create digital product credentials, but said credentials could not be effectively handed off to subsequent players in their supply chain, then the problem isn’t solved. For Transmute, the foundations required to technically interoperate are considered pre-competitive, and our ability to interoperate widely is a strategic feature. This feature powers a competitive advantage that ensures Transmute customers’ critical trade documents are verifiable at every step in the supply chain, regardless of where they’re stored and what blockchain they’re anchored to. Transmute customers realize maximum confidence about data integrity and access far richer insights about the health of their supply chains as a result.",https://medium.com/transmute-techtalk/the-business-case-for-interoperability-a1a2b884297d,,Post,,Explainer,,,,,,,,2020-05-21,,,,,,,,,,,,,
|
||
Transmute,ViennaDigitalID,,,,GS1,,,,Vienna Digital Identity #30,Identity in the Supply Chain,Vienna Digital Identity #30 GS1 is the global association for supply chain identifiers with members across all industry sectors and interacting (unbeknownst) with general consumers on a daily basis. Transmute is a foundational member of the DID/VC community and a participant in the US DHS Silicon Valley Innovation Program’s cross-border shipping use case.,"In this edition of the Vienna Digital Identity Meetup we open our 4th year with a couple presentations and a discussion on how Decentralized Identifiers (DIDs) and Verifiable Credentials (VCs) are starting to make inroads and impact in the global supply chain. GS1 is the global association for supply chain identifiers with members across all industry sectors and interacting (unbeknownst) with general consumers on a daily basis. Transmute is a foundational member of the DID/VC community and a participant in the US DHS Silicon Valley Innovation Program’s cross-border shipping use case. Slide Decks: - Transmute Deck: thedinglegroup.com/s/Transmute_Vienna-Digital-ID-Forum-Jan-2022.pdf - GS1 Deck: thedinglegroup.com/s/2022-01-24_ViennaDigitalIdentityPhilA.pdf Time marks: 00:00:00 - Introduction and Opening Remarks 00:05:46 - Karyl Fowler, CEO Transmute 00:28:33 - Health Tots video 00:36:36 - Phil Archer, Web Solution Director, GS1 Global 00:52:28 - Round Table, Karyl Fowler, Phil Archer, Michael Shea 01:23:26 - Upcoming Events",https://vimeo.com/669713750,,Video,,Explainer,,,,,,,,2022-01-25,,,,,,,,,,,,,
|
||
Transmute,Transmute,,Medium,,,,,,,Back to the Basics,"Several mentors have encouraged me to publicly share a more detailed account of Transmute’s choice to shift focus solely to commercializing Transmute ID, the decentralized identity component of our original product, the Transmute Platform — a Heroku-like rapid dApp builder that seamlessly bridged centralized and decentralized tech, specifically for existing enterprises.<br><br>[...]there was zero near-term enterprise demand in the token-powered functionality of the platform. We searched high and low, interviewing most of the major enterprise storage solutions out there and couldn’t even find one willing to admit that investing in decentralized storage tech was on their 5-year innovation roadmap.[...]<br><br> we did uncover a demand for user-centric identity tech (e.g. increased security, privacy, portability, infinite federation/scalable, etc.) and an enormous demand for the resulting efficiency gains and untapped revenue potential of implementing a decentralized identity solution. Because these directly address problems enterprises are facing today, and they directly connect to the cost/profit levers that enterprise stakeholders care most about.","Back to the Basics Course Correcting from Tokens to Equity Several mentors have encouraged me to publicly share a more detailed account of Transmute’s choice to shift focus solely to commercializing Transmute ID, the decentralized identity component of our original product, the Transmute Platform — a Heroku-like rapid dApp builder that seamlessly bridged centralized and decentralized tech, specifically for existing enterprises. [Shoutout to our friends at Mainframe, Wireline and Golem who are keeping the dApp platform dream alive, successfully executing on it for the fully decentralized community!]
So here it goes: Now that we’re an [emotionally] safe distance away from the crypto-crash of 2018 and the crypto-goldrush that preceded it, I think we can all agree: some projects don’t technically require a token. This is the primary pushback any founder looking to drive adoption of a product with a crypto-token dependency faces in every pitch. And answering this in an accessible way that demonstrates technical chops and business acumen is no small feat. The second biggest concern for potential partners and investors is the legality of the token as an investment. There was [and still is] so much unknown and undecided about the validity and treatment of crypto-token assets here in the U.S. And relocating beyond the U.S. as an American start-up offers an onslaught of complex formation, tax and liability variables such that there is no “easy” option. The third area of question is where the above two intersect: the token economics. Are incentives aligned at network launch? What is the phased approach to reach equilibrium? How will we keep incentives aligned? How will we handle price volatility for enterprise customers? What will we do if, if, if. These are the concerns I witnessed fellow founders spend the most time, energy and resources addressing — myself included. And it makes sense; addressing these concerns is about de-risking the business opportunity. Since we were aware of these primary areas of concern heading into our initial raise, we came to the table with a de-risking plan as part of our first year’s roadmap. Upon closing our first million in pre-seed capital, we immediately began executing on our plan. We built our core team, and we rightfully spent a lot [in dollars and manpower] on finding the right solutions and answers to each of these questions for Transmute. We ultimately concluded that a crypto-token is technically required to ensure specific functionality of the decentralized side of the platform (e.g. decentralized store and compute).
This resulted in fine-tuning our token economics through extensive modeling, and a decision to pursue a Reg A+ structure to run the token sale under. Since this meant we needed to functionally look like a publicly traded company [while realistically at the seed stage], we began the daunting task of reorganizing our operations to suit Reg A+ requirements (e.g. legal forms galore, GAAP audited books, etc.). At least, all of these signs pointed to a clear path forward to build the Transmute Platform. [This regulatory path was further validated by Blockstack’s recent SEC approval to proceed with their token offering under the same RegA+ structure; enormous congrats to this team!] Fortunately, we had one more chapter in our de-risking plan that we’d run in parallel to the aforementioned efforts: determine early product-market-fit. This initial effort was conducted over a 4–6 month period and involved hundreds of interviews with potential platform users (developers) and customers (enterprises). And when the data was in, the results were clear: there was zero near-term enterprise demand in the token-powered functionality of the platform. We searched high and low, interviewing most of the major enterprise storage solutions out there and couldn’t even find one willing to admit that investing in decentralized storage tech was on their 5-year innovation roadmap. However, we did uncover a demand for user-centric identity tech (e.g. increased security, privacy, portability, infinite federation/scalable, etc.) and an enormous demand for the resulting efficiency gains and untapped revenue potential of implementing a decentralized identity solution. Because these directly address problems enterprises are facing today, and they directly connect to the cost/profit levers that enterprise stakeholders care most about. This was not the data we wanted; it was the data we needed.
After taking time to reflect [and scrutinize our path to this point], the decision was clear: we needed to productize the decentralized identity component of our platform we found demand for as a standalone product and go to market sans dApp platform, and as CEO, I needed to re-align incentives across all our stakeholders. First, I immediately ceased legal work towards a token sale, redirecting efforts to assess a path forward as a venture-backed company with equity as the primary asset value can accrue to. I settled on offering investors who invested via SAFTs (“simple agreement for future tokens”) the opportunity to convert to a SAFE (“simple agreement for future equity”) instead. This kept most terms consistent, and it helped me avoid pricing ourselves at the pre-seed stage. Next, I reframed our mission with my team. A ton of work went into the early platform prototypes, and I wanted to ensure they understood how crucial their efforts were regardless of our new direction. Since the component that would become today’s Transmute ID was already the “core” of the platform, it was simple to refocus all of engineering on this single piece. Lastly, I brought the data and the updated company plan to our investors. We’d achieved everything we set out to accomplish in year one, but this story wasn’t one I anticipated telling. As such, I was most nervous about this part as we have some highly esteemed crypto investors in our pre-seed round, and I know they [like us] strongly believe in a more decentralized future. But I was armed with tangible evidence that the old approach was the wrong one. In fact, it was so wrong that it would have been overtly irresponsible for us to continue to pursue it. Minting and distributing a token to investors for an enterprise product with no demand for it would kill any company eventually. Furthermore, selling Transmute ID as a standalone product meant value would accrue away from our early token holders to the equity pool. 
Not only did our investors understand and appreciate the thoroughness of our de-risking process, but they fully supported our decision and accepted the conversion offers. Solving for identity was always a key part of commercializing decentralized technologies, but now we understand that it is THE key to adoption. When DID tech is pervasively integrated throughout enterprise infrastructures, we will finally have the tools we need to optimally own and control our individual data [and privacy], our intellectual property, our consent and our access. This is how we will reach the more equitable future this community is collectively aiming for. Today, we don’t view this decision as a true pivot for Transmute; we view it as a distillation of our strategy, more impetus to focus aggressively on the things we know will make the biggest impact…and build a profitable business…because they’re actually being used and adopted in the enterprise today. Our larger mission hasn’t changed: we are bridging the centralized world to the decentralized. And we’ve doubled down on our pragmatic approach to integration of decentralized identity with legacy tech and clouds [which enterprises like Microsoft are now voicing is the right approach]. I can’t turn back time, and I am immensely grateful for the learnings and resulting shift in focus for Transmute. However, in my next rodeo, I will remember this lesson and go back to the basics, aggressively seeking product-market-fit first and foremost.",https://medium.com/transmute-techtalk/back-to-the-basics-9158f47f4eb6,,Post,,Meta,,Supply Chain,,,,,,2019-07-16,,,,,,,,,,,,,
|
||
Transmute,Transmute,,Medium,,,,,,,did:(customer),"Transmute builds solutions that solve real business problems. For this reason, we support a number of different decentralized identifier (DID) methods. While we are committed to providing optionality to our customers, it’s equally important to communicate the selection criteria behind these options so that customers can consider the tradeoffs of underlying DID-methods alongside the problem set they’re solving for. Essentially, we help them pick the right tool for the job.","did:(customer) Transmute’s evolving criteria for matching DID methods to business requirements. Transmute builds solutions that solve real business problems. For this reason, we support a number of different decentralized identifier (DID) methods. While we are committed to providing optionality to our customers, it’s equally important to communicate the selection criteria behind these options so that customers can consider the tradeoffs of underlying DID-methods alongside the problem set they’re solving for. Essentially, we help them pick the right tool for the job. In the spirit of sharing and improving as an industry, here are the work-in-progress criteria we use to help customers assess what DID method is best for their use case: Interoperability This DID method meets the interoperability requirements of my business, for example: - Other parties can verify my DID method. - I can switch out this DID method in the future if my business needs change. 
Security This DID method meets the security requirements of my business, such as: - Approved cryptography for jurisdiction/industry - Ledger/anchoring preferences - Key rotation/revocation Privacy This DID method meets privacy requirements relevant to my use case, for example: - Identifiers of individuals (data privacy and consent priorities) - Identifiers for companies (organization identity and legal protection priorities) - Identifiers for things (scaling, linking, and selective sharing priorities) Scalability This DID method meets the scalability needs of my business use case, for example: - Speed - Cost - Stability/maturity Root(s) of Trust This DID method appropriately leverages existing roots of trust that have value for my business or network (or it is truly decentralized). For example: - Trusted domain - Existing identifiers/ identity systems - Existing credentials We are currently using and improving these criteria as we co-design and implement solutions with customers. For example, our commercial importer customers care a lot about ensuring that their ecosystem can efficiently use the credentials they issue (interoperability) without disclosing sensitive trade information (privacy). Government entities emphasize interoperability and accepted cryptography. Use cases that include individual consumers focus more on data privacy regulation and control/consent. In some instances where other standardized identifiers already exist, DIDs may not make sense as primary identifiers at all. Examples of DID methods Transmute helps customers choose from today include: Sidetree Element (did:elem, Ethereum anchoring), Sidetree Ion (did:ion, Bitcoin anchoring), Sidetree Photon (did:photon, Amazon QLDB anchoring), did:web (ties to trusted domains), did:key (testing and hardware-backed keys), and more. How do you think about selecting the right DID method for the job? 
Let’s improve this framework together.",https://medium.com/transmute-techtalk/did-customer-4ca8b7957112,,Post,,Meta,,,,,,,DID,2020-10-30,,,,,,,,,,,,,
|
||
Transmute,Transmute,,Medium,,GS1; SVIP,,,,,Transmute Closes $2M Seed Round,"Closing our seed round coincides with another exciting announcement: our recent launch of Phase II work with the U.S. Department of Homeland Security, Science & Technology’s Silicon Valley Innovation Program (SVIP) to enhance “transparency, automation and security in processing the importation of raw materials” like steel.","Transmute Closes $2M Seed Round We’re thrilled to announce the close of Transmute’s $2 million series seed round led by Moonshots Capital, and joined by TMV, Kerr Tech Investments and several strategic angels. Transmute has gained momentum on our mission to be the trusted data exchange platform for global trade. As a byproduct of the pandemic, the world is collectively facing persistent supply chain disruption and unpredictability. This coupled with increasing traceability regulations is driving an urgency for importers to fortify their supply chains. COVID-19 especially has highlighted the need for preventing counterfeit goods and having certainty about your suppliers (and their suppliers). Transmute’s software is upgrading trade documentation today to give importers a competitive edge in an increasingly dynamic, global marketplace. Leveraging decentralized identifier (DID) and verifiable credential (VC) tech with existing cloud-based systems, Transmute is able to offer digital product and supplier credentials that are traceable across an entire logistics ecosystem. From point of origin to end customer, we are unlocking unprecedented visibility into customers’ supplier networks. Disrupting a highly regulated and old-fashioned industry is complex, and an intentional first step in our go-to-market strategy has been balancing both the needs of regulators and commercial customers. This is why we’re incredibly proud to join forces with our lead investors at Moonshots Capital, a VC firm focused on investing in extraordinary leaders. 
We look forward to growing alongside Kelly Perdew (our newest Board of Directors member) and his founding partner Craig Cummings. They’re a team of military veterans and serial entrepreneurs with extensive success selling into government agencies and enterprises. We are equally proud to be joined by Marina Hadjipateras and the team at TMV, a New York-based firm focused on funding pioneering, early-stage founders. Between their commitment to diverse teams, building sustainable futures and their deep expertise in global shipping and logistics, we feel more than ready to take on global trade with this firm. The support of Kerr Tech Investments, led by Josh and Michael Kerr, further validates our company’s innovative approach to data exchange. Josh is a seasoned entrepreneur, an e-signature expert and has been advising us since Transmute’s inception. Closing our seed round coincides with another exciting announcement: our recent launch of Phase II work with the U.S. Department of Homeland Security, Science & Technology’s Silicon Valley Innovation Program (SVIP) to enhance “transparency, automation and security in processing the importation of raw materials” like steel. Our vision is broader than just improving how trade gets done, and steel imports are just the beginning. We’re inserting revolutionary changes into the fabric of how enterprises manage product and supplier identity, effectively building a bridge — or a fulcrum, rather — towards new revenue streams and business models across industries. Last — but absolutely not least — I want to give a personal shoutout to my core teammates; startups are a team sport, and our team is stacked! Tremendous congratulations as these backers will accelerate our progress in a huge way. And finally, thanks also to our stellar team of advisors who commit significant time coaching us through blind spots as we bring Transmute’s product to market. Also, we’re hiring!
Expanding our capacity to meet customer demand is our top near-term priority. We’re adding a few engineering and product roles to our core team in Austin, TX, so please apply or spread the word!",https://medium.com/transmute-techtalk/transmute-closes-2m-seed-round-a0a2e6c90467,,Post,,Meta,,,,,,,,2020-10-21,,,,,,,,,,,,,
|
||
Transmute,Transmute,,Medium,,,,,,Mobility X hackathon at Capital Factory,Transmute IoT,"The Transmute team joined forces with other Austin hackers to participate in [the first Mobility X hackathon at Capital Factory sponsored by car2go](https://www.eventbrite.com/e/mobilityx-hackathon-presented-by-car2go-tickets-33718213083#) a few weekends ago where hackers were challenged to address how to handle rider demand fluctuations or ensure consistent vehicle connectivity. <br><br> Maintaining network connectivity felt like the most urgent problem to solve given an effective solution would mean more accurate data on the car2go fleets, resulting in an expanded capacity to address rider demand changes. Mesh networks have been explored as a natural solution for maintaining network connectivity among distributed assets that are moving around geographically. …","Transmute IoT The Transmute team joined forces with other Austin hackers to participate in the first Mobility X hackathon at Capital Factory sponsored by car2go a few weekends ago where hackers were challenged to address how to handle rider demand fluctuations or ensure consistent vehicle connectivity. Maintaining network connectivity felt like the most urgent problem to solve given an effective solution would mean more accurate data on the car2go fleets, resulting in an expanded capacity to address rider demand changes. Mesh networks have been explored as a natural solution for maintaining network connectivity among distributed assets that are moving around geographically. The problem has historically been when an asset (or car) moves into a “dead zone” where a network node doesn’t exist, so connection drops. Our hardware hacker counterparts mapped out incentive schemes for mesh node operators in areas with poor connectivity and tracking integration for location specific smart contract behavior. 
Meanwhile, we chose to apply the Transmute framework to build a simple smart contract-based interface to mesh network devices. Significant progress was made on the Transmute framework, which our team used to power the final pitch prototype. We added dynamic event types to our EventStore contract, allowing us to easily interact with json databases and better leverage redux on the client. We consider this stack the “IoT Smart Contracts” or more generally, the interface between hardware devices and blockchain technology. Its applications are varied and include warehousing and logistics systems, inventory management, and firmware and sensor data interfaces. Although we did not win this hackathon, we feel the problem space is worth considering as the number of automated assets that require constant connectivity to function effectively (e.g. self-driving cars) rapidly increases. On a final note, hackathons are a hobby for the Transmute team. We learn as much or more than we build sometimes. Typically, we arrive at the hackathon with a fully formed team and idea, and this has historically worked out well for us. This time, we regretfully neglected this strategy. Teams that remain loosely defined or don’t have prior experience with teammates’ skill sets will have significantly more trouble communicating effectively or organizing generally. This often results in a lack of cohesive messaging which is confusing to the judges and severely stifles technical progress. This may seem obvious, but it’s easy to get stuck in the code and forget how important the pitch is. The pitch is everything. We had a great time working on the Transmute framework and thinking about the blockchain connected hardware space. Lastly, shoutout to The 21 Marketplace, an awesome tool we found along the way.",https://medium.com/transmute-techtalk/transmute-iot-2d00fdcf53e9,,Post,,Meta,,,,,,,,2017-06-10,,,,,,,,,,,,,
|
||
Transmute,DHS,,,,,,,,,News Release: DHS Awards $198K for Raw Material Import Tracking Using Blockchain,"WASHINGTON – The Department of Homeland Security (DHS) Science and Technology Directorate (S&T) has awarded $198,642 to Transmute Industries, Inc. based in Austin, TX to develop a proof-of-concept application for Customs and Border Protection (CBP) to support increased transparency, automation and security in processing the importation of raw materials such as steel, timber and diamonds entering the United States.<br>","FOR IMMEDIATE RELEASE S&T Public Affairs, 202-254-2385 WASHINGTON – The Department of Homeland Security (DHS) Science and Technology Directorate (S&T) has awarded $198,642 to Transmute Industries, Inc. based in Austin, TX to develop a proof-of-concept application for Customs and Border Protection (CBP) to support increased transparency, automation and security in processing the importation of raw materials such as steel, timber and diamonds entering the United States. S&T is exploring the application of blockchain to issue credentials digitally to enhance security, ensure interoperability and prevent forgery and counterfeiting. Transmute builds identity management solutions that use blockchain technology to streamline and enforce identity authorization. Its Phase 1 award project “Verifiable Provenance, Traceability, and Regulatory Compliance for Raw Material Imports” will adapt Transmute ID, its core technology product that leverages centralized and decentralized identity infrastructures to secure individual agency identities and verifiable credentials to ensure that CBP has visibility into the provenance, traceability and regulatory compliance of raw material imports. “The ability to construct a secure, digital, chain-of-custody mechanism for raw material imports is a critical aspect of enabling legitimate trade.” said Anil John, SVIP Technical Director.
“Transmute’s combined centralized and decentralized approaches address this challenge and support global interoperability by utilizing emerging World Wide Web Consortium global standards.” The Phase 1 award was made under S&T’s Silicon Valley Innovation Program (SVIP) Other Transaction Solicitation Preventing Forgery & Counterfeiting of Certificates and Licenses seeking blockchain and distributed ledger technology (DLT) solutions to fulfill a common need across DHS missions. SVIP is one of S&T’s programs and tools to fund innovation and work with private sector partners to advance homeland security solutions. Companies participating in SVIP are eligible for up to $800,000 of non-dilutive funding over four phases to develop and adapt commercial technologies for homeland security use cases. For more information on current and future SVIP solicitations, visit https://www.DHS.gov/science-and-technology/svip or contact DHS-silicon-valley@hq.DHS.gov. For more information about S&T’s innovation programs and tools, visit https://www.DHS.gov/science-and-technology/business-opportunities. #",https://www.dhs.gov/science-and-technology/news/2019/11/08/news-release-dhs-awards-198k-raw-material-import-tracking,,Press,,Meta,,,,,,,,2019-11-08,,,,,,,,,,,,,
|
||
Transmute,PRWeb,,,,Moonshots; TMV; KerrTech,,,,,"Transmute Closes $2M Seed Round From Moonshots Capital, TMV, Kerr Tech Investments",,"“When it comes to commercial importers, new trade regulations combined with antiquated processes are making compliance a huge burden. At the exact same moment, the market is demanding increasing evidence that products are what they say they are,” said Karyl Fowler, Co-Founder and CEO of Transmute. AUSTIN, Texas (PRWeb) October 21, 2020 Transmute, the trusted data exchange platform for global trade, today announced the close of a $2 million series seed round led by Moonshots Capital, with participation from TMV and Kerr Tech Investments. With a clear opportunity to grow rapidly in the enterprise market, Transmute plans to deploy the new capital to expand its Austin, Texas-based team to service increasing customer demand. The company secures critical supplier, product, and shipment data to give customers a competitive edge in an increasingly dynamic, global marketplace. “With backgrounds that span microelectronics manufacturing to cybersecurity, Transmute’s founding team is uniquely qualified to solve vulnerabilities in trade compliance,” said Kelly Perdew, General Partner at Moonshots Capital, a VC firm focused on investing in extraordinary leaders. “We’re enormously proud to back Karyl and Orie as they lead the way in modernizing and securing critical data for international trade.” Perdew will join the company’s Board of Directors as part of the funding. Transmute digitizes trade documentation in a way that is cryptographically verifiable and traceable across an entire logistics ecosystem. While eliminating the hassle of paper trails, Transmute provides unprecedented visibility into customers’ supplier networks. 
The company’s unique approach combines decentralized identifier (DID), verifiable credential (VC) and blockchain technology with existing cloud-based systems to effectively memorialize trade data at every step in a products’ journey. “When it comes to commercial importers, new trade regulations combined with antiquated processes are making compliance a huge burden. At the exact same moment, the market is demanding increasing evidence that products are what they say they are and are created how they said they were — whether ethically, sustainably or otherwise,” said Karyl Fowler, Co-Founder and CEO of Transmute. “Our seed round investors are purposefully rich in operational expertise spanning government and enterprise logistics. We are thrilled to be joining forces to accelerate growth.” The series seed round of funding closely follows Transmute’s recent launch of Phase II work with the U.S. Department of Homeland Security, Science & Technology’s Silicon Valley Innovation Program (SVIP) to enhance “transparency, automation and security in processing the importation of raw materials” like steel. “The ability to construct a secure, digital, chain-of-custody mechanism for raw material imports is a critical aspect of enabling legitimate trade.” said Anil John, SVIP Technical Director. “Transmute’s combined centralized and decentralized approaches address this challenge and support global interoperability by utilizing emerging World Wide Web Consortium global standards.” The company’s founders are established thought leaders within the emerging decentralized identity industry, each holding leadership positions in industry standards organizations from the W3C and the DIF. After taking the company through Techstars and incubating the underlying tech with early customers, the team found significant enterprise demand for digital identifiers that could persist and traverse across different contexts. 
""Karyl is disrupting an archaic industry that requires some hand-holding in addition to a groundbreaking service. Transmute has assembled a team of leading technologists creating software attuned to the needs of regulators and commercial customers alike,” says Marina Hadjipateras, co-founder and General Partner at TMV, a New York-based firm focused on funding pioneering early-stage founders. About Transmute Transmute secures critical supplier, product, and shipment data to give customers a competitive edge in an increasingly dynamic, global marketplace. The company was born of a hackathon where founders, Karyl Fowler and Orie Steele, took the prize with an innovative application that demonstrated the value of decentralized identifier infrastructure. About Moonshots Capital Moonshots Capital, with offices in Los Angeles and Austin, was founded by a team of military veterans in 2017. They have collectively founded and operated 15 companies, and have personally invested in over 85 ventures. Beyond capital, they deploy their military and entrepreneurial experience and network to help world-changing companies grow.",https://www.prweb.com/releases/transmute_closes_2m_seed_round_from_moonshots_capital_tmv_kerr_tech_investments/prweb17487962.htm,,Press,,Meta,,,,,,,,2020-10-21,,,,,,,,,,,,,
|
||
Transmute,Transmute,,Medium,,DigitalBazaar,,,,,Encrypted Data Vaults,"EDVs allow users and companies to store data with their favorite cloud storage providers without fear of vendor lock-in, while also ensuring that the storage provider has no access to their data whatsoever. With an EDV, the client does their own encryption and decryption using keys associated with decentralized identifiers they manage, and as such, acts as the true controller of their data.<br><br>Its Phase 1 award project “Verifiable Provenance, Traceability, and Regulatory Compliance for Raw Material Imports” will adapt Transmute ID, its core technology product that leverages centralized and decentralized identity infrastructures to secure individual agency identities and verifiable credentials to ensure that CBP has visibility into the provenance, traceability and regulatory compliance of raw material imports.","Encrypted Data Vaults for Trusted Data Access Introduction Data protection is an imminent challenge for modern society, as evidenced by the slew of data privacy regulations being introduced in most nations. However, data privacy means much more than audits or reports to demonstrate regulatory compliance. Threats to data security are continuously evolving to meet economic and political aims, and as such, data privacy approaches must be even more rigorous to ensure success. Secure data storage is one critical component of data privacy. While significant work is underway to develop storage technologies that both preserve personal privacy AND are accessible for the general public to use, there is an equally crucial race among government and commercial entities to deploy storage solutions that better protect IP while enabling efficient and automated compliance. In this post we share an emergent storage solution called “Encrypted Data Vaults” that helps meaningfully preserve data privacy and ensure trusted data access. 
We are proponents of doing rather than telling, so we then walk you through how to generate keys and encrypt your own data using our demo implementation. Finally, we share the next steps for interoperability and expansion of this technology. What are Encrypted Data Vaults? Encrypted Data Vaults (EDVs) are secure storage mechanisms that allow entities to interoperate across disparate systems and processes without IP exposure or added liability for data that is not relevant to their business or the transaction at hand. EDVs allow users and companies to store data with their favorite cloud storage providers without fear of vendor lock-in, while also ensuring that the storage provider has no access to their data whatsoever. With an EDV, the client does their own encryption and decryption using keys associated with decentralized identifiers they manage, and as such, acts as the true controller of their data. According to the emergent specification, EDVs are “often useful when an individual or organization wants to protect data in a way that the storage provider cannot view, analyze, aggregate, or resell the data. This approach also ensures that application data is portable and protected from storage provider data breaches.” This idea was validated in 2018 by work Digital Bazaar pioneered when they deployed the first working implementation of an encrypted data vault [formerly referenced as a “Trade Evidence Server”] in a POC for the Department of Homeland Security and Customs and Border Protection. Transmute’s EDV implementation is heavily inspired by these concepts, and we are grateful to the Digital Bazaar team for taking on the task of early market education which has paved the way for companies like ours. According to Manu Sporny, Founder and CEO of Digital Bazaar, “solving the problem of secure data sharing across blockchains and entities is one feat, but driving adoption of the technology requires further iteration and standardization. 
We are excited to see Transmute put forth a second functioning EDV implementation which will support interoperability and drive adoption.” Interoperability is a top priority for all of Transmute’s customer work, including our recent project with the Department of Homeland Security Silicon Valley Innovation Program and the US Customs and Border Protection Office of Trade. EDV Application: Supply Chain Data Access Let’s make EDVs more tangible through a quick example before showing the technology in action. Our team at Transmute is currently incorporating EDVs for the creation and sharing of verifiable trade credentials between manufacturers and federal authorities inspecting imported goods. A manufacturer can have confidence that the certificate they create for a shipment of raw materials can only be decrypted by themselves or explicitly delegated parties, such as United States Customs and Border Protection. This assurance helps the manufacturer feel more confident sharing proprietary or sensitive information. Even as their shipment is transferred through a global chain-of-custody, the associated data can move with it efficiently without threat of modification or capture by competitors. EDV in Action: Transmute Demo We’ve put together an EDV demo to move this conversation from concept to concrete example. Take 5 minutes to follow these instructions and see for yourself. You can also watch the video below to see our CTO, Orie Steele, walk you through the demo and provide more technical details. Step 1: Get Keys (Decentralized Identity) In order to encrypt and decrypt data you need keys that you control. That means you need a wallet file that you can import into the system. Our EDV demo currently supports 3 DID methods: Element (did:elem), GitHub (did:github), and the test tool Did Key (did:key). We recommend creating an Element DID to start: - Go to https://element-did.com/. - Click “Create Wallet.” - Create a password. - Download the wallet file. 
- Import the wallet file, and unlock it with your password. - Click on “My DID.” - Click “Create DID”; this may take a few minutes. Step 2: Set up your Encrypted Data Vault Now that you have keys, you can import them into the Transmute Demo EDV. - Go to https://did-edv.web.app/ - Click “Create” in the upper right. - Click “Import,” and open your downloaded wallet file. - Unlock the wallet file (click the three dot drop down in the upper right of the component). Step 3: Create and Update Encrypted Documents Your EDV is now ready for creating and modifying documents signed by your keys! - Click “Explore” to open the “Documents” section. - Play with modifying and saving the demo document (“AuthenticateMe CREATE” — see minute 6:20 in the demo video for examples of how to modify the JSON). - When your document is saved, you will see it under the documents tab. - If you click on that document, update it (“AuthenticateMe UPDATE”) and save, you will see that the sequence has gone from 0 to 1, representing the update. - Click through the configuration tab for further security detail about this EDV. Browse our EDV Swagger API here. Conclusion: Next Steps for EDVs This demo illustrates the key technical components for using an EDV to encrypt and decrypt documents with keys managed by the user. This means a business or individual can have confidence that their data is under their control, and only they can see the decrypted form. So where do we take this work? Transmute is currently implementing the next step in this process — encrypted, asynchronous sharing between multiple parties. In this scenario both parties have access to associated key material, and the shared document needs an object capability to only allow access to specific keys. A helpful interoperability feature here is the ability to connect to other EDV providers using the same DID, enabling communication and even transfer. 
This combination of EDVs and decentralized identifiers ensures total user control over data access and modification. The result is a network-expanding approach to trusted data exchange across businesses — a critical advantage for the future of trade and data protection. If your company needs a secure way to share data with disparate players in your ecosystem (e.g. vendors, customers and employees), or if your business is looking for ways to reduce risk with better data protections, we want to help! Contact the Transmute Team here.",https://medium.com/transmute-techtalk/encrypted-data-vaults-c794055b170e,,Post,,Product,,,,,,EDV,,2020-01-15,,,,,,,,,,,,,
|
||
Transmute,Transmute,,Medium,,Okta,,,,,Federated Enterprise Agents with Transmute and Okta,"The Okta Identity Cloud provides secure identity management with Single Sign-On, Multi-factor Authentication, Lifecycle Management (Provisioning), and more. Transmute provides a configurable Enterprise Agent enabling Decentralized Identity and Verifiable Credential capabilities for OAuth / OIDC users. Read on to learn about some of the ways Transmute and Okta enable enterprises to rapidly unlock the security benefits of decentralized identities.","Federated Enterprise Agents with Transmute and Okta Transmute’s products bridge the gap between established identity providers (IDPs) and decentralized identity technology. In this first of a series of posts we share details of how we work with IDPs like Okta. The Okta Identity Cloud provides secure identity management with Single Sign-On, Multi-factor Authentication, Lifecycle Management (Provisioning), and more. Transmute provides a configurable Enterprise Agent enabling Decentralized Identity and Verifiable Credential capabilities for OAuth / OIDC users. Read on to learn about some of the ways Transmute and Okta enable enterprises to rapidly unlock the security benefits of decentralized identities. Configuring SSO You can read more about the basics of configuring Single Sign On (SSO) with Okta here. Once the Transmute API has been configured to support SSO with Okta, users can leverage their existing directory accounts to sign in to Transmute. A Decentralized Identity along with a set of managed keys is automatically created for users. These identities and keys are what enable interoperability and audit-ability with the decentralized identity and verifiable credentials ecosystems. Creating a Verifiable Credential Transmute makes creating verifiable credentials and business to business workflows built on these credentials easy. 
After the user has completed the SSO process, they can use the Transmute Workflow engine (part of our paid product offerings) to create or participate in workflows. At each step of a workflow, the Okta-provided id_token is leveraged to protect the use of signing keys linked to the DID. For example, when a user uploads a document, the workflow activity is signed by their DID. This enables external systems which would like to verify the credential to do so without knowing any details of the Okta directory user, helping to protect against the mingling of personal identifying information with credential and authorization material. Anchoring a VC to a Ledger Transmute enables workflows to be anchored to a ledger such that any Okta user can verify the workflow has not been tampered with since the anchor event. We do this by leveraging the same DID infrastructure we use for managing decentralized identities. The process of anchoring a VC to a ledger can be automatic or at the discretion of an authorized Okta user. Conclusion Identity providers like Okta enable SSO within enterprises and help secure products and applications widely in use today. Transmute integrates with IDPs like Okta to provide a seamless interface for existing enterprise users to unlock the security and traceability benefits of decentralized identifiers, verifiable credentials, and distributed ledgers.",https://medium.com/transmute-techtalk/federated-enterprise-agents-with-transmute-and-okta-2f1855dd3944,,Post,,Product,,,,,,,,2020-04-17,,,,,,,,,,,,,
|
||
Transmute,Transmute,,Medium,,,,,,,Release 0: Sol,"This release was focused on building support for decentralized identities into centralized directory technology that is already widely adopted by enterprises. This work involved adding UI to our React dashboard and updating our API to support registration and group management. We have also done some exploratory work regarding the DID spec and LDAP integration [...] <br><br> This release we focused on the centralized side of group membership. This use case relies on the integrity of the directory and discretionary access control. In other words, the directory admins can move users between groups, but users cannot control what groups they are assigned to.","Detailing our 1st release since finishing Techstars Everything Starts with Identity There are many companies tackling the blockchain identity problem, and for good reason: all applications begin with identity. And we want to help. The Transmute Platform will combine the best of centralized and decentralized services. In order for users, clusters, and services to communicate securely we need to define how identities are expressed. Historically, decentralized identity systems have been difficult to achieve. Systems like GPG rely on key servers and the web of trust to establish reputation for public keys. Services like Keybase attempt to bootstrap key reputation from social media profiles. Each of these approaches has advantages and challenges, but the common denominator is public key cryptography. Directories If you have worked on authentication and identity development in the past, you will be familiar with the concept of a directory. LDAP and Active Directory have become backbone technologies for enterprise IT and key servers for decentralized identity. This release was focused on building support for decentralized identities into centralized directory technology that is already widely adopted by enterprises. 
This work involved adding UI to our React dashboard and updating our API to support registration and group management. We have also done some exploratory work regarding the DID spec and LDAP integration — which we are still cleaning up and hope to share soon. Groups Directories like LDAP and Active Directory make use of groups, and it’s fairly common to use group membership for authorization. For example, all nurses can read patient profile data for their hospital group, but not others. This release we focused on the centralized side of group membership. This use case relies on the integrity of the directory and discretionary access control. In other words, the directory admins can move users between groups, but users cannot control what groups they are assigned to. This is valuable and familiar to enterprise system administrators, but not very compatible with decentralized identity. This structure is specifically helpful to Transmute because it means we can segregate our users on the centralized side, allowing selective access to new features based on billing information or reputation. Registration However, there are issues that need to be addressed when considering what it means to add a new member to a directory. These include proofing of communication channels (verify your email), proofing of public keys (verify you can sign), and linking of public keys in cases where you need different keys to support different protocols (for example: Ed25519, secp256k1 for use with SSH, PGP, etc.). For this release, we focused on the basics of registration. We came up with a flow that works, but it still has some limitations we will be addressing in upcoming releases. Our current registration flow involves submitting public key pairs to our centralized API. We then extract the email from the keys, verify that they have signed each other, and create a new directory entry in a deactivated state, which will become activated once the user verifies their email. 
Obviously, this flow requires the users to have email and protect access to it… which might not be a good idea. It also does not leverage the DID spec fully; PII (email) is linked to the keys, and the separation of PII and keys is one of the major features of the DID spec. We’ll be working to address this in the next release as well. Recovery Any identity system that does not discuss recovery from theft or failure up front, in clear language, should be avoided. We consider the concept of recovery and continuity of identity in both the centralized and decentralized senses. Centralized account recovery is something many readers may be familiar with. Forgot your password? You can reset it by showing you still control your email… But what happens when you don’t control your email? In most cases recovery is handled by establishing a protocol for recovery, and then protecting that protocol from attack. In the case of forgot password, you are protecting your email account. In the case of recovery keys, you are protecting a set of keys or a unique identifier that you will be required to provide in the case of email theft or compromise. The challenging part of recovery is that the attacker gets to play too. Whatever protocol you pick, the attacker can attempt to impersonate the user, and use the recovery mechanism to gain control of the account. We’ve seen this with SMS in particular: once an attacker has the ability to receive SMS codes, they can use this ability to lock you out of your account. Centralized recovery involves backup keys and proof of control over communication channels, whereas decentralized recovery involves backup keys and revocation certificates. We have explored and implemented some forms of simple recovery this release around compromised keys. It’s also worth stressing that when a private key is stolen or published, the identity can now be forged. This is why it’s important to use revocation certificates. 
They are required to tell users of public key crypto-systems that a key is no longer to be trusted. The support for revocation certificates in key servers, and the concept of continuity of identity in general, is tough. Continuity of identity is not always desirable. Sometimes, you don’t want to link the reputation of your old account to your new one. Sometimes you do. The method of support for this is built on digital signatures, and can rely on social systems as well. For example, more people will trust that you have changed keys if all your friends sign a message saying you have. There are other problems with recovery, such as Sybil attacks and privacy, which require careful consideration. Summary We focused on the basics of identity in both centralized and decentralized systems for this release. We evaluated centralized directory technologies, including LDAP, and we implemented centralized authentication flows, which provide short-lived tokens that represent identities in a centralized directory (JWT Bearer flows). We developed support for extending GPG to support Ethereum and centralized directories with a specific eye towards wider DID compatibility. We implemented simple registration flows for these systems and some forms of recovery for the case of private key compromise. Next Release For the next release, the Transmute team will tighten up our directory and identity work to ensure DID compliance. We’ll also extend our identity system to support better forms of decentralized groups, document management, and messaging.",https://medium.com/transmute-techtalk/release-0-sol-9d8fd06d2f4f,,Post,,Product,,,,,,,,2018-06-04,,,,,,,,,,,,,
|
||
Transmute,Transmute,,Medium,,,,,,,Release: The Transmute Framework Alpha,"The Transmute Framework helps developers build on decentralized technologies like Ethereum and IPFS via a familiar javascript interface. We support rapid prototyping by enabling developers with Redux experience to quickly build out decentralized applications. <br><br> This release focuses on support and documentation for the EventStore. Using the EventStore and the Transmute Framework, developers can save events to Ethereum and IPFS, track and query the event-sourced data models, and leverage the immutability properties of IPFS and Ethereum.","Release: The Transmute Framework Alpha The Transmute Framework has graduated to Alpha. Before you dive in, there are a couple of caveats to keep in mind. Hosted IPFS Support Coming Soon We currently provide test infrastructure to support the Transmute Framework; alternatively, you can use your own IPFS API. We’re working on tooling and support to make this easier in the future, but in the meantime, you’ll need to allow mixed content for the test IPFS server. We will fix this shortly. Testnet Performance Varies The smart contracts used in the Transmute Framework Alpha are deployed to the Ropsten Testnet. This means you will need Ropsten Ether to use the demo; feel free to contact us for testnet ether if you need some. Announcing the Transmute Framework Alpha Check out the source code + a live demo here! The Transmute Framework helps developers build on decentralized technologies like Ethereum and IPFS via a familiar javascript interface. We support rapid prototyping by enabling developers with Redux experience to quickly build out decentralized applications. This release focuses on support and documentation for the EventStore. Using the EventStore and the Transmute Framework, developers can save events to Ethereum and IPFS, track and query the event-sourced data models, and leverage the immutability properties of IPFS and Ethereum. 
A Deeper Dive on the Transmute Framework Architecture One challenge developers face when building their first dapp is: how to manage state? Since Redux developers are accustomed to managing state with Redux, we chose to provide a Redux-like API for managing dapp state. Events can be used to model many applications and systems; the EventStore Solidity smart contracts store events. Storing information on Ethereum is expensive, so we leverage the decentralized file system IPFS to store larger javascript objects, and then we store the IPFS identifier in the Ethereum smart contract. Together, Ethereum and IPFS are used to construct a Redux-like API for managing application state. IPFS’ content addressing strategy lets us store references to large slices of data easily. Another challenge dapp developers face is: syncing smart contract state with external databases. This is important for querying, caching, analytics and external integrations. We use the term ReadModel to describe the state of an entity built up from events. Imagine a power plant with many switches and many possible states. Each time a switch is changed, an event is logged, describing the time and state of the switch. By reviewing this event log, we can see what the current state of the power plant is by looking at how it has changed over time. Event sourcing is a powerful tool that allows data management to evolve as it grows — always with the ability to rebuild state from events. ReadModels process the events from Ethereum and IPFS, and use a reducer to generate a state object that can be saved to caches or databases for querying.",https://medium.com/transmute-techtalk/release-the-transmute-framework-alpha-ad45acd42bdc,,Post,,Product,,,,,,,,2018-03-08,,,,,,,,,,,,,
|
||
Transmute,Transmute,,Medium,,Orbit DB,,,,,Transmute ID Alpha,"One thing we learned from our Orbit DB PoC is that the DID spec offers a lot of valuable flexibility. Most DID systems achieve censor resistant decentralized storage and cryptographic decentralized identity protocols through a single identifier (hash of or full public key). We found it is possible to create a valid DID implementation that is anchored by 2 public keys, one for updating the filesystem, the other for managing the integrity of the documents, identities and claims.","Transmute ID Alpha Release 5: Wolf 359 Transmute’s sixth monthly release During November we focused our product and engineering efforts on shipping a public alpha of Transmute ID, our hybrid centralized and decentralized identity solution for enterprises. Transmute ID Features and Business Case Transmute ID supports the creation of discoverable identities that can securely send and receive requests and verified credentials, and have record of critical transactions written to a public or private ledger. We engineered our data model and architectural options to integrate easily with business workflows and pain points, and to reflect entity, individual, and asset identities. These business-related identities and their associated claims are built on decentralized identity standards, meaning they can be compatible with other consumer-focused self-sovereign solutions such as Uport and Sovrin. This means we can simultaneously support growth of the decentralized identity ecosystem while also addressing key enterprise pain points including identity proofing, portability, interoperability, consent, and regulatory compliance. DID Javascript Library While building Transmute ID, we’ve made significant progress towards a javascript library for developing DID systems and working with verifiable claims. 
Separating Decentralized Storage from Decentralized Identity We also added utility to our Transmute-did library, and developed a novel DID implementation built on top of OrbitDB with some really interesting features. One thing we learned from our Orbit DB PoC is that the DID spec offers a lot of valuable flexibility. Most DID systems achieve censor-resistant decentralized storage and cryptographic decentralized identity protocols through a single identifier (hash of or full public key). We found it is possible to create a valid DID implementation that is anchored by 2 public keys, one for updating the filesystem, the other for managing the integrity of the documents, identities and claims. This dual public key system can reduce the likelihood of an attacker compromising a full identity. It also creates more flexibility around storage and identity stewardship. Read more about how Transmute supports this scenario here. DID Selector Improvements We also proposed an updated approach to DID selectors to improve usability and longevity. This improvement suggests additional standards including JSON Pointer, JSON Path, JSON Path Expressions, URI Template, and Fragment Identifier. IPFS Updated Helm Charts While working with IPFS to support our Orbit DID PoC, we updated our helm charts for IPFS to the latest version. Check them out here. IPFS + Oracle Kubernetes Integration We added an easy mode setup for running IPFS on Oracle Kubernetes Engine with SSL, see here: https://github.com/transmute-industries/transmute-charts/tree/master/tutorials/providers/oracle https://github.com/transmute-industries/transmute-charts/tree/master/tutorials/easymode/ipfs We look forward to sharing our progress with Transmute ID in future releases. If you are interested in learning more about our tools applied to your specific use case, please contact us at product@transmute.industries.",https://medium.com/transmute-techtalk/transmute-id-alpha-ba66cdc112fe,,Post,,Product,,,,,,,,2018-06-04,,,,,,,,,,,,,
|
||
Transmute,Transmute,,,,,,,,,Transmute U.S. CBP Steel Tech Demo,"The story focuses on critical trade verifiable credentials being issued, presented, and verified by trade, CBP, and PGAs.",,https://www.youtube.com/watch?v=03l_j7fvmhq,,Video,,Product,,,,,,,,2022-09-07,,,,,,,,,,,,,
|
||
Transmute,Transmute,,Medium,,,,,,,Element Block Explorer,"We’ve made some serious upgrades to the Element-lib which is the javascript library we use to implement the Element DID. As we mentioned in our last post here, Element is a Sidetree Protocol based DID Method that relies on Ethereum and IPFS. Our implementation is unique in that we provide a JavaScript library that runs in both the browser and node.js, in addition to providing a server-based REST API.<br><br>Our first implementation of Element enabled users to anchor their DID directly via a MetaMask-powered DApp thanks to Infura, and also use our “Full Node” to submit operations. Supporting both modes introduced a lot of complexity, and highlighted some scalability issues which we’ve recently fixed.","Element Block Explorer Transmute is pleased to introduce a block explorer for Element and Sidetree Full Nodes…with some extra bells and whistles. Check it out here: https://element-did.com/explorer. We’ve made some serious upgrades to the Element-lib which is the javascript library we use to implement the Element DID. As we mentioned in our last post here, Element is a Sidetree Protocol based DID Method that relies on Ethereum and IPFS. Our implementation is unique in that we provide a JavaScript library that runs in both the browser and node.js, in addition to providing a server-based REST API. Our first implementation of Element enabled users to anchor their DID directly via a MetaMask-powered DApp thanks to Infura, and also use our “Full Node” to submit operations. Supporting both modes introduced a lot of complexity, and highlighted some scalability issues which we’ve recently fixed. Using Element.Sidetree Interfaces might change, but we’ve added a class called “Sidetree” which abstracts a lot of the common functions and interfaces we used in the first version of Element. Here’s how it’s initialized: Blockchain and Storage interfaces have not changed (we still support Ethereum and IPFS only). 
We have added support for a message bus and database for caching data retrieved from the storage and blockchain interfaces. These 2 new services have opened the door towards some really exciting design patterns like CQRS, and syncing the database from full nodes. DB: Offline Mode, Caching and Syncing We’ve added an adapter pattern and a database to Sidetree, and we’re supporting PouchDB / CouchDB and Google Cloud Firestore out of the box. PouchDB, the JavaScript Database that Syncs! It enables applications to store data locally while offline, then synchronize it with CouchDB and compatible servers…pouchdb.com PouchDB is great because it provides a consistent API for both web and node.js and integrates seamlessly with CouchDB for enterprise-scale NoSQL. It’s easy to set up your own CouchDB instance, or host one on a major cloud provider. We also added support for Google Cloud’s Firestore, because we use Firebase to host most of Element today: Cloud Firestore | Firebase Use our flexible, scalable NoSQL cloud database to store and sync data for client- and server-side development.firebase.google.com Both of these databases have support for offline mode and syncing; right now, we’re not leveraging the sync features, but we do use the IndexedDB interface provided by PouchDB to avoid making network requests for Ethereum and IPFS. In the future, we think this offline support will be very useful for anyone building a DID application in an Internet-denied environment, like rural areas or combat zones. ServiceBus: CQRS and Event Sourcing We’re huge fans of event sourcing and CQRS. 
If you are not familiar, check this out: Command and Query Responsibility Segregation (CQRS) pattern — Cloud Design Patterns Segregate operations that read data from operations that update data by using separate interfaces.docs.Microsoft.com Now that Element has a message bus built in, there are lots of potential integrations with other event-oriented systems, like OrbitDB, IPFS PubSub, Kafka, and more. Resolve & Sync The first version of Element only had a blocking sync method, which downloaded every transaction, anchorFile and batchFile, and then processed all the operations in the correct order. We noticed that was starting to take a while even for our small test data. In the new version we added a fully event-sourced sync method, which uses the message bus / service bus to handle importing operations. In the future, this method can easily be parallelized and distributed, maybe even in a P2P form with something like OrbitDB. Peer-to-Peer Databases for the Decentralized Web. Contribute to orbitdb/orbit-db development by creating an account on…GitHub.com When we resolve a DID Document now, we use the db cache and save a lot of network requests. Because sometimes you don’t want your document with eventual consistency (you want it right away), we still support an optimized blocking resolve method, which tries its best to be fast and accurate. Mnemonic Key System We’ve added a key system which is particularly useful for testing operations when you want to generate a large number of test actors with various keys for different purposes: Using mks.getKeyForPurpose(‘primary’, 0) we can easily get a new keypair for a specific purpose, such as recovery. Use-testing complex multi-actor scenarios is much easier this way. Redux APIs! We have not pulled them out into a package / module yet, but we have a consistent Redux API for Element whether you are running a Full Node or doing everything in the browser with MetaMask. 
We use these Redux APIs to power the Block Explorer, and we are still maintaining support for both full and light nodes. JSON Schemas! Validation is a precursor to security. We’ve added some JSON Schemas for validating Transactions, Anchor Files, Batch Files and Operations. There’s a big opportunity to do more, especially when thinking about JSON Schema validation for operations. The Explorer We put all these new features together to create 2 block explorers. Our Light Node Block Explorer runs entirely in the browser thanks to MetaMask & Infura for Ethereum and IPFS, and PouchDB. Our Full Node Block Explorer uses the same JavaScript library, also relies on Infura for IPFS and Ethereum, but uses Firebase Cloud Firestore and Firebase Cloud Functions for database and REST API support. This means you can see what’s going on in the Element network without having to set up MetaMask. Finally, we also implemented a large number of minor and major fixes — including some attack tests — for various scenarios. The most interesting attack test we have is for the Late Publish Attack, which is the main reason that Sidetree DIDs are not transferable. We’ve got a bunch of work planned including better integration with the TypeScript codebase currently powering ION. The Transmute team is excited to keep pushing Element forward!",https://medium.com/transmute-techtalk/element-block-explorer-bb6d2c712664,,Post,,Resources,,,,,,Element,,2019-07-10,,,,,,,,,,,,,
|
||
Transmute,Transmute,,Medium,,,,,,,Verifiable Actions for signing and verifying VCs with DIDs,This weekend I worked on making a GitHub action that can sign and verify verifiable credentials with decentralized identifiers.,,https://medium.com/@transmute/verifiable-actions-for-signing-and-verifying-vcs-with-dids-a4176fb5ba3f,,Post,,Resources,,,,,,,,2022-03-21,,,,,,,,,,,,,
|
||
Transmute,Transmute,,GitHub,,,,,,,eXtended Merkle Signature Scheme,"We've been working on generating test vectors for:<br>https://datatracker.ietf.org/doc/html/rfc8391<br><br>That we could use to register the `kty` and `alg` for XMSS such that it could be used by JOSE and COSE.<br><br>[https://GitHub.com/Transmute-industries/xmss](https://GitHub.com/Transmute-industries/xmss)<br><br>I've reached the limits of my ability to move this ball forward, and am here to ask for help.<br><br>I'm not very good with GoLang, and the original xmss source I am basing this on is difficult for me to extend.",,https://github.com/transmute-industries/xmss,,Code,,Standards,,,,,,,,2022-04-15,,,,,,,,,,,,,
|
||
Transmute,FederalBlockchainNews,,,,SVIP,,,,,"Anil John and Melissa Oh, of the Silicon Valley Innovation Program (SVIP)","There was a significant push by large platform players and others, to set up a platform model […] sit in the middle and extract value from that platform. As a government, we are rather familiar with being walked into a corner and told that there is only one product that you will buy because it will solve the problem.",,https://podcasts.apple.com/us/podcast/federal-blockchain-news/id1533524719,,Episode,,Standards,Public,Supply Chain,,,,,,2020-09-25,,,,,,,,,,,,,
|
||
Transmute,Transmute,,,,,,,,,GitHub DID,"Decentralized Identifiers (DIDs) are a new type of identifier for verifiable, ""self-sovereign"" digital identity. DIDs are fully under the control of the DID subject, independent from any centralized registry, identity provider, or certificate authority. DIDs are URLs that relate a DID subject to means for trustable interactions with that subject. DIDs resolve to DID Documents — simple documents that describe how to use that specific DID. Each DID Document contains at least three things: cryptographic material, authentication suites, and service endpoints. Cryptographic material combined with authentication suites provide a set of mechanisms to authenticate as the DID subject (e.g., public keys, pseudonymous biometric protocols, etc.). Service endpoints enable trusted interactions with the DID subject.<br>",,https://github.com/decentralized-identity/github-did,,Page,,Standards,,,,,,,,2020-05-08,https://www.npmjs.com/package/@Transmute/GitHub-did,,,,,,,,,,,,
|
||
Transmute,Transmute,,,,,,,,,DID Key Workbench,"did:key is a DID Method which is offline friendly, cryptographically self certifying, requires no trust of certificate authoritites or blockchain and is ideal for ephemeral use.",,http://did.key.transmute.industries/,,Page,,Standards,,,,,,,,2020-11-27,,,,,,,,,,,,,
|
||
Transmute,Transmute,,Medium,,Microsoft; Consensys; DIF,,,,,Introducing Element,"Transmute is excited to announce Element, an implementation of the Sidetree Protocol on top of Ethereum and IPFS. This work was done in collaboration with Microsoft and Consensys under the Decentralized Identity Foundation (DIF)","Introducing: Element The Sidetree Protocol Implemented on Ethereum Transmute is excited to announce Element, an implementation of the Sidetree Protocol on top of Ethereum and IPFS. This work was done in collaboration with Microsoft and Consensys under the Decentralized Identity Foundation (DIF). See also: ion, sidetree-core, sidetree-ethereum, sidetree-ipfs Why another DID Method? We love `did-ethr,’ but not every use case can support a single Ethereum transaction per document update, and there are lots of cases where we would like to use decentralized identities (DIDs) for ephemeral or high volume use cases, such as IoT and supply chain integrations which make this approach impractical. We’re excited to use the serviceEndpoints defined in Element DID documents to track external integrations, in a privacy preserving manner, including identity hubs, credential stores, and more. Furthermore, the scalability that Sidetree brings to DIDs is unprecedented. Our product, Transmute ID — enterprise-grade decentralized identity — must support scale that we could not achieve otherwise, so we are proud to offer Element as an alternative DID method for Transmute ID customer deployments. This is the business reason we invested so heavily in this open source implementation. There are already detailed posts about what Sidetree is, so we wanted to focus on Element, and what is special about it in the following: About the Code Transmute followed the general structure of sidetree-core and sidetree-bitcoin, but chosen to implement all the protocol logic in a single library, so we can demonstrate both server and browser based sidetree clients that run off the same codebase. 
We also chose to use Lerna, the mono repo tool for JavaScript projects. This lets us test that the newest versions of element-lib work with both element-app and element-api. Additionally, we implemented a very simple paper wallet system for testing working with DIDs, where a user can: 1. Create a wallet. 2. Add a password. 3. Export it as a QR code. Later, the user can import the wallet into the browser and use it to sign Sidetree operations for either the light node or the full node. This makes testing create and update super easy. Unlike Sidetree core, which is a bit more object-oriented and written in TypeScript (which we love!), Element extends functionality by using boring old JavaScript. We hope this stokes more open source contribution given JavaScript’s wide acceptance. Light Node First! First and foremost, we think that users with sufficient funds should always be able to anchor their own DID updates with nothing more than a connection to IPFS and Ethereum. For some users, this will mean running a full Ethereum node and IPFS locally, and others will use Infura. We provide a MetaMask powered light node demo where a user can pay to anchor their own DID (note: MetaMask uses Infura). Full Node as Cloud Functions! We also have a full node which is an express-based Node.js web server with Swagger docs for its API. In this mode, we foot the bill for anchoring to the ledger; it’s currently free, but we plan to introduce Captcha [at a minimum] and more anti-spam defenses in the future.",https://medium.com/transmute-techtalk/introducing-element-328b4260e757,,Post,deprecated,Standards,,,,,,Ethereum,"DID,IPFS,Sidetree,Element",2019-05-10,https://GitHub.com/decentralized-identity/element,,,,,,,,,,,,
|
||
Transmute,Transmute,,Medium,,,,,,,NFC + DIDs,"Supply chains are complicated. While over 80% of logistics companies are investing in digitization to bring their supply chains into the 21st century, at the end of the day, not every step along the way can be web-enabled.<br><br>Transmute has been working on a solution: tying DIDs to Tangem NFC Cards, which carry a passport-grade secure chip, which implements public key cryptography. Near Field Communication (NFC) is the technology that enables things like contactless payments [...] increasingly used in supply chains for things like inventory and warehouse management","NFC + DIDs Transmute solves for offline traceability with Tangem. Supply chains are complicated. While over 80% of logistics companies are investing in digitization to bring their supply chains into the 21st century, at the end of the day, not every step along the way can be web-enabled. Provable identification is equally important in these offline gaps, but far harder to achieve; part of the problem is ensuring seamless traceability across events like shipment handoffs [and associated documentation exchange] that happen offline versus the ones that occur online. Transmute has been working on a solution: tying DIDs to Tangem NFC Cards, which carry a passport-grade secure chip, which implements public key cryptography. Near Field Communication (NFC) is the technology that enables things like contactless payments (e.g. Apple or Android pay), but it’s also increasingly being used in supply chains for things like inventory and warehouse management [via contactless counting of products on shelves]. Much like DIDs, they provide unique identification for the objects they represent (like your cell phone), but they’re notoriously limited when it comes to data storage and as such have long been outpaced by other web-enabled identification technologies like the QR code. 
However, in our implementation of did:key on Tangem’s NFC cards, we’ve demonstrated that linking DIDs to NFC Smart Cards solves the usability problem. Now, users have a single device that solves offline authentication, while unlocking secure access to secure storage [ad infinitum/in an infinite amount] — for instance the same user can use their single NFC-based DID to authorize a shipment release at a port as well as to access full shipment reports stored in their cloud-based EDV — both actions that would be captured in an immutable audit log. This also ensures that actions conducted offline are still accounted for since the DID is a URL-based unique identifier. In order to try and explain better how Smart Cards, DIDs and VCs can be used, we’ve developed a set of small user stories. These are in no way exhaustive, but hopefully they help paint a picture of both the B2B and B2C opportunities in this space. “As an inspector, I scan a QR Code on a crate with my phone, loading product information into a credential representing inventory review, and then tap my NFC card to create an inventory reviewed credential, which will automatically be persisted to my encrypted data vault and anchored on a blockchain when I regain internet access.” “As a COVID-19 testing facility operator, I verify a permanent resident card associated with a patient, after challenging them to authenticate with their NFC Card. I then complete their rapid response test, and issue a test results credential to their NFC Card. When they travel, they can use their NFC card to present their test results credential to transport authorization personnel.” “As an undercover operative, I register my DID with my handler, and they watch a public blockchain for transactions associated with my NFC Card. 
When they see a transaction, they check IPFS for the encrypted message, and decrypt it to see the update regarding the ongoing investigation.” “As a quality assurance officer for steel inc, I use my NFC card, issued by my employer to create digital certificates for my mill test reports. I use my NFC card to encrypt those certificates and submit them to an encrypted data vault. I use my NFC card to anchor those certificates to a blockchain. I use my NFC card to authorize Customs Brokers / Import Specialists to access my certified mill test report.” Bar staff is required to perform age verification before serving alcoholic drinks. Today, they need to see a government issued ID and inherently, involuntarily, unnecessarily view all the PII displayed on that ID. With an NFC card, they will only be privy to the enabled credentials, limited to the public data therein. They will be able to trust the age verification process, without the need to see user data. This is accomplished by leveraging the smart card’s ability to provide a cryptographically secured response to a specific question (is over 21). The rest of the credential details need not be presented. Check out our demo here: https://nfc.did.ai/tangem Want to stay informed on relevant standards and working groups? 
Here are some related links: decentralized-identity/secure-data-store Create one or more specifications to establish a foundational layer for secure data storage (including Personal data)… GitHub.com Transmute-industries/universal-wallet Dismiss GitHub is home to over 50 million developers working together to host and review code, manage projects, and… GitHub.com Verifiable Credentials Data Model 1.0 Credentials are a part of our daily lives; driver's licenses are used to assert that we are capable of operating a… www.w3.org Decentralized Identifiers (DIDs) v1.0 Decentralized identifiers (DIDs) are a new type of identifier that enables verifiable, decentralized digital identity… www.w3.org Web NFC Near Field Communication (NFC) enables wireless communication between two devices at close proximity, usually less than… w3c.GitHub.io",https://medium.com/transmute-techtalk/nfc-dids-6d56fda45831,,Post,,Standards,,Supply Chain,,,,NFC,,2020-07-09,,,,,,,,,,,,,
|
||
Transmute,Transmute,,ssimeetup,,,,,,,"The Element DID Method: Sidetree, Ethereum & IPFS – Orie Steele","Supply chain logistics companies are particularly interesting with how they manage their extended business networks as they compete for new business. This includes faster and safer on-boarding of customers and third-party vendors, and new ways to manage the lifecycle and associated data of those relationships.","The Element DID Method: Sidetree, Ethereum & IPFS – Orie Steele – Webinar 31 Orie Steele is Cofounder and CTO of Transmute, a company developing IAM and Verifiable Credential solutions that integrate Decentralized Identity for Enterprises. He has a BS in Cyber Security and MS in Computer Science from Stevens Institute of Technology where he studied social network malware and botnets between 2007-2012. He was an early engineer at Patient IO, a Techstars backed startup acquired by Athena Health in 2016, where he helped develop and secure a care coordination platform that connected nurses and patients. In this talk, Orie will discuss the history of the Element DID Method, how it leverages the same Sidetree Protocol that is used by ION on the Bitcoin Network. He’ll introduce the motivation for Element and ION, and then walk through the core components of developing a working DID System, including topics such as wallets, signing, DID resolution, key revocation, and decentralization. Video recording: Slideshare presentation: How can you use these slides and knowledge? This content is shared with a Creative Commons by Share Alike License. This allows you to reuse the powerpoint slides we are sharing here to build your own SSI communities around the globe. You only need to credit SSIMeetup and the invited guest of the day and share whatever you produce with the same license. Please read the license for full details. Download the full presentation “The Element DID Method: Sidetree, Ethereum & IPFS – Orie Steele” from Google Slides. 
Interested in collaborating or sharing? Please get in touch via the contact form or one of the social media channels and we will find something interesting to do together or support you.",https://ssimeetup.org/element-did-method-sidetree-ethereum-ipfs-orie-steele-webinar-31/,,Post,,Standards,,,,,,,DID:Element,2019-07-04,,,,,,,,,,,,,
|
||
Transmute,Transmute,,Medium,,Okta,,,,,Verifiable Credentials with Transmute and Okta,Okta provides a mechanism for adding custom claims to id_tokens and access_tokens: Hooks and Custom Authorization Servers. These components can enable automated integrations with emergent technology including decentralized identifiers and verifiable credentials.,,https://medium.com/transmute-techtalk/verifiable-credentials-with-transmute-and-okta-574edaec887b,,Post,,Standards,,,,,,,,2020-04-17,,,,,,,,,,,,,
|
||
Transmute,Transmute,,,,,,,,,"Verifiable Presentation Personas: Certifiers, Consolidators, & Submitters","The arrow for “Issue Credentials” is exactly the same as “Send Presentation,” leading us to believe these activities are similar, but how are they similar? We can’t adequately answer these questions by looking at the above picture and the specification doesn’t provide a ton of help either…",,https://medium.com/@transmute/verifiable-presentation-personas-certifiers-consolidators-submitters-b38a281eb92f,,Post,,Standards,,,,,,,,2022-04-21,,,,,,,,,,,,,
|
||
Transmute,Xaralite,,,,Transmute; Consensys; uPort; IBM; Blockstack; Danube Tech; Trinsic; Spherity; Microsoft,,,,,"Decentralized Identifiers Market May See a Big Move: Major Giants- Consensys, Blockstack, Danube Tech","provides valuable market size data for historical (Volume & Value) from 2016 to 2020 which is estimated and forecasted till 2026*. Some of the key & emerging players that are part of the coverage and have been profiled are Transmute (United States), Consensys (United States), uPort (United States), IBM (United States), Tykn Tech (Netherlands), Blockstack (United States), Danube Tech (Austria), Trinsic (United States), Spherity (Germany), Microsoft (United States).",,https://xaralite.com/1746487/news/decentralized-identifiers-market-may-see-a-big-move-major-giants-consensys-blockstack-danube-tech/,,Report,,Meta,,,,,,,,,,,,,,,,,,,,,
|
||
Trinsic,Streetcred,Trinsic,,Riley Hughes; Michael Boyd,DIF; Sovrin Foundation; Verifiable Organizations Network; Covid Credentials; TOIP; Hyperledger Foundation; W3C,"USA, New York, New York",USA,,,Trinsic,"We make it easy to implement Self-Sovereign Identity based on Decentralized Identifiers (DIDs) and Verifiable Credentials (VCs), a new digital identity standard. Our software is based on the open-source Hyperledger Aries project, to which we are a primary contributor.<br><br>Together with proper governance, SSI enables, for the first time, the Trust over IP (ToIP) stack. Once trust can effectively be conveyed over IP, a tremendous number of opportunities arise in every industry vertical imaginable. We build the tools to help you capitalize on this opportunity.",,https://trinsic.id/,,Company,,Company,Enterprise,ID,Software,,,,"Verifiable Credentials,DID",1998,,https://twitter.com/trinsic_id,https://www.youtube.com/channel/UCkPIelMjjBfT-0bHLVdmZZA,https://trinsic.id/blog/,https://trinsic.id/feed/,https://join.slack.com/t/trinsiccommunity/shared_invite/zt-liwrvejk-dXC3uwYL6CCP~~RNIzc7sg,https://www.crunchbase.com/organization/trinsic-inc,https://www.linkedin.com/company/trinsic-id/,https://docs.trinsic.id/docs,https://trinsic.id/trinsic-studio/,https://trinsic.id/trinsic-wallet/,,
|
||
Trinsic,NorthernBlock,,,,,,,,,Building Digital Trust Ecosystems with Riley Hughes from Trinsic,"The reason I love that quote is that digital credentials and verifiable data can not only impact the use cases that everybody tends to think about when they think about SSI, but they could permeate our whole lives and streamline everything we do.","Listen to this Episode about Building Digital Trust Ecosystems on Spotify The Virality of Self-Sovereign Identity Mathieu: I wanted to start with a quote, Riley, from the movie Inception: from Leonardo DiCaprio’s character, Cobb. I’ve seen this quote in an article that you’ve written before, and the quote goes like this: “What is the most resilient parasite? Is it bacteria? A virus? An intestinal worm? An idea. Resilient, highly contagious. Once an idea has taken hold of the brain, it’s almost impossible to eradicate.” You had used this quote to refer to Self-sovereign Identity, or when you started thinking about ‘digital trust’ for the first time some years ago. Would you mind giving a background of how this has taken over your brain, and your life, and effectively, the mission behind Trinsic today? Riley: Yes, that’s an awesome quote — I’m glad that you brought that back up. The first time I was exposed to Self-sovereign Identity and this concept of decentralized identity was when I interviewed for a job at the Sovrin Foundation when it was looking to hire its first employee. My interview was with somebody named Steve Fulling, as well as Phil Windley. Phil Windley is the founder of the Internet Identity Workshop, and he was the chair of the board of Sovrin. He’s quite a ‘guru,’ and he was very good at conveying the vision and what we were trying to accomplish. I went in for a regular job interview, and it went well, but after I left the interview, I looked everywhere around me, and all I could see were digital credentials or the lack thereof. It was as if, once I saw the way the world could operate, 
I’ve never been able to go about life normally again. Even small daily things; one time, UPS was shipping me a package, and I wanted to pick it up at the distribution center, but I couldn’t prove that it was actually me who bought the package. Or another example: I used to work in the solar industry, collecting people’s electricity data so that we could build them a solar panel estimate. Things like that: everyday activities that you don’t normally hear about, such as KYC (Know Your Customer), or health credentials, or the prime SSI use cases. The reason I love that quote is that digital credentials and verifiable data can not only impact the use cases that everybody tends to think about when they think about SSI, but they could permeate our whole lives and streamline everything we do. I think that was evidenced throughout my time in this space. It seems as if people who get into Self-sovereign Identity, can’t get out. Even people that I worked with at the Sovrin Foundation; while we were there, we never ever had any attrition of employees, even though the Sovrin Foundation was a hard place to be because of funding challenges. Ultimately, those funding challenges resulted in people losing their jobs at one point, but even throughout all that, nobody ever left. We’ve seen the same thing at Trinsic: nobody’s ever left, unless they’re an intern, or someone moving on to their next project. Mathieu, maybe you could speak to this, but I remember when Northern Block came into this space. You weren’t focused solely on Self-sovereign Identity, but it seems like you’ve leaned into this subject. I think that’s characteristic of what I’ve seen over and over and over; once this concept infects your brain, you really can’t help but go all in. I think that’s a characteristic of really promising movements: Once you see it, you can’t unsee it. In my opinion, it’s inevitable that the world is going to go in this direction. Mathieu: Yes, it’s a similar mindset. 
The feelings that you had towards SSI or digital trust, or Trust over IP, however, we call it; I had a similar feeling some years before when I got into crypto, and blockchain, and decentralized technology as a whole. Having worked on many software projects that utilize some type of decentralized or distributed computing technology, the property of verifiability that comes with these verifiable credentials is something that, for me at least, was missing in the decentralized stack that we were playing with. It was cool that we were building distributed or decentralized protocols; we touched it all for you-name-the-use-case. But, it was missing the core property of verifiability. Many people were trying to use technologies like distributed ledgers or blockchains to solve every problem in the world. Still, when it came down to lack of trust in transactions, caused by issues with the verifiability of data, it was as if a light went off there. As you started working at the Sovrin Foundation, I’m assuming you got more and more excited about that. What are the types of things that you did at the Sovrin Foundation in the early days? Early Days with the Sovrin Foundation Riley: Yes, definitely; Sovrin was a very cool place to be. First of all, from a Personal development standpoint; as the first employee, there was a lot to do, and I wore a lot of hats when I first joined. I think the other cool thing about it was that, at the time, it was very early; I don’t even remember whether the Decentralized Identity Foundation was established yet; it may have been. The Trust over IP Foundation was definitely not established, and so Sovrin was the most established place to go. If you were interested in this space, it was the destination. My first major undertaking at the Sovrin Foundation was to establish the Sovrin Steward program, which is the program that the Sovrin Foundation runs to get organizations to operate nodes on the Sovrin network. 
Part of the governance of Sovrin is that the nodes are permissioned, but the read-and-write access was thought to be, ideally and eventually, totally public. Although the consensus protocol on the blockchain was done in a permissioned fashion, everything else would be public. Part of establishing the governance over the permissioning of the nodes that operated consensus on the network, was essentially establishing a standardized process through open governance. This would allow new stewards to join the network, add a node, and participate in consensus. That whole initial process was my first major project. The first step in that process is usually to have a call with somebody who works at the Sovrin Foundation, or somebody on the committee would call the steward qualification committee. Originally, all of that was done by me. Eventually, it became much more decentralized, with different volunteers from around the world who are doing that, and it’s still being done today. But initially, it was all me. Following on to what I was saying about Sovrin being the first destination for organizations who wanted to get involved with decentralized identity, I had so many calls. My whole day would be filled with conversations with people who were interested in applying this new technology to some business problem that they had. Because of that, I got to hear, from the very beginning, about all sorts of business problems, and all sorts of industries to which people were trying to apply Self-sovereign Identity. Essentially, I was able to absorb all those learnings, in a way that I think was unique. I don’t know if I would have been able to have that opportunity anywhere else in the world at that time. Mathieu: We can see this happening over and over again, where you see areas that foster technology startups, like Silicon Valley. They all leverage each other. 
In these really condensed environments, with people working in the same direction, you seem to see an ecosystem take shape. Different companies and different people can leverage each other’s skillsets for something specific. Many of our companies today have a more global footprint, and we’re able to work with people throughout the world. Still, it’s quite crazy to look at Utah (where Trinsic is based) and to review the Sovrin Foundation coming out of Utah, all the SSI companies coming up there, all the Fintech companies coming out of there; it’s absolutely crazy. It feels as if there’s entrepreneurship in the mountain water that everyone’s drinking down there. Riley: Yes, that’s absolutely true. I think that part of that is that, as I mentioned, once people see it, they can’t unsee it; people stay in the Self-sovereign Identity space, even if they leave their roles. If you look at all the people who were laid off from the Sovrin Foundation, I’m trying to think on the spot here, if anyone did something like leaving the space entirely: Nathan George went to Kiva. Ken Ebert and others based here in Utah started Indicio.Tech, along with Heather Dahl, who’s based out of DC. Evernym has a presence here in Utah, and people who’ve left Evernym have gone on to start things or integrate SSI into their own applications. Mike Lotter started a company called TrustFrame that’s based here in Utah. Of course, I came from that ecosystem and started Trinsic. So, I’m telling you that people can’t leave once they see this — it’s such an opportunity with such an open ocean. It makes sense that a lot of the initial ecosystem was built here, and that ecosystem continues to grow, and it’s very cool to see. Mathieu: I agree; I honestly can’t imagine doing anything else once you’ve experienced this, and you understand how verifiable credentials could really work. 
I think it’s interesting because there are so many opportunities to use verifiable credentials to turn paper-based processes into better processes, and also, to turn digitized processes into digitally valuable processes, as well. It’s one thing to say that I’m digitizing a paper document, but being able to actually have verifiability and authenticity behind it, and being able to leverage the data further on, in whatever customer journey or whatever you’re trying to do, is very exciting. So, yes, I’d have to echo your comments earlier in the discussion. Once you get lost in the topic: in your day-to-day activity, you’re looking at things, and you’re thinking, “Oh my god, there are so many opportunities to use this stuff to create a better world for ourselves.” What is the Business Value of SSI? Riley: Yes. You spoke to a lot of business value that can be unlocked through decentralized identity; I’m sure we’ll get more into that because that’s something that I’ve been obsessed with for the last four years. Before we leave the topic, and to add to your comment there; I think one of the main reasons that people stay in the space is not only for the big business opportunity. It’s also the fact that I truly believe (and I think that most everyone else in this space truly believes) that Self-sovereign Identity is going to create a better world for people. It’s going to make basic services, that people need but they can’t currently access, accessible, and it’s going to change the power balance in the world to favour people, more than it currently is. I can’t see the future, but I do know that giving individual people more privacy, and more control and transparency into the way their data is used and managed are a big part of why people stay in this space. We can have all sorts of philosophical discussions about Self-sovereign Identity. 
I know that for me, I would not be nearly as passionate about Self-sovereign Identity, if it was just about automating business processes, and making paper-based workflows more efficient. Although, those are very good objectives: that’s where a lot of the money is to be made today. I think that in the long run, what we’re really trying to do, is we’re trying to create a digital economy that works for people and for the world. We’re trying to create trust over the internet protocol — Trust over IP. We’re trying to create digital tools and digital ways for people to establish trust and get access to the things that they need and the things that they’re entitled to. I certainly wouldn’t be nearly as passionate and work nearly as hard for this movement, and I don’t think nearly as many people would stay in the space if it were only about the business value of what SSI can unlock. Mathieu: I’ve gotten emails from you at six in the morning Eastern time, and I know you’re a couple of hours behind. So, I definitely know that you’re grinding away at this stuff because you truly believe in it. It’s quite similar to the vision that many people have — generally those in Web 3.0 or who are focused on decentralized technology. They have a vision for a digital economy with greater balance, more transparency, more traceability, and more value for people who are actually injecting value into the system. To close on the Sovrin topic: I think Sovrin is amazing. I spend a fair amount of time in the Trust over IP Foundation, more particularly today in the utility foundry working group. It’s incredible how much of a step forward Sovrin has made for everyone, because everyone trying to set up a public utility today is looking to Sovrin as a model: Sovrin is effectively shaping how the space is growing today. You left Sovrin to found a startup called Street Cred at that time; would you mind talking through the early days, and the transition out of there? Was this your first startup? 
SSI Toolkits for Developers Riley: It was my first real startup. I’ve always been entrepreneurial since I was a little kid, and in fact, I started a business when I was 14. At that time, I manufactured aftermarket freestyle Razor scooter components and sold them to e-commerce stores and physical locations in the USA, and also in Australia because that’s a big sport down there. I did that throughout high school, which paid for college, so that was fun. This is my first tech startup. I would say it’s my ‘real’ startup, in the sense that it’s a startup whose goal is not to sustain my own living, but it’s a startup whose aim is to make a meaningful impact on the world, and create an organization that is significant and enduring. I’m happy to talk about the transition there. For me, Mathieu, and this might sound dumb or cliché or whatever, but it really didn’t feel like a transition; it felt as if I was doing the same thing. While I was at Sovrin, I was totally obsessed with the idea of building adoption of Self-sovereign Identity. I could see the future; as I said, I was walking around, and I would walk into a restaurant, and I would walk into my university (because I was a student at the time). Everywhere I went, I saw what the world could be, and I couldn’t handle living in this world, when I knew it could be so much better, but not doing anything about it. While I was at Sovrin, I was the staff lead over governance, for example. Drummond Reed from Evernym deserves so much credit for all of the work he’s done on governance over the years, as well as at Sovrin. He was an employee of Evernym, and I was the Sovrin staff employee who was working right alongside him and others on governance. The whole point there was, “How do we make SSI more adoptable?” I did a lot of work on the Sovrin Tokenomics — that was my other big project at Sovrin; Token was a big conversation at the time, and I was effectively the lead of the Tokenomics work at Sovrin. 
That work was focused on, “How do we potentially use a cryptocurrency or a utility token to facilitate adoption of verifiable credentials?” And, “How can we provide incentive mechanisms to drive that adoption?” This went all the way to starting the SSI incubator, which was an idea that I had from the beginning. The SSI incubator idea revolved around helping startups who were going to move fast, and build use cases for different industries, get funding and support that they needed to deploy this stuff. Everything I was doing at Sovrin was about trying to crack the code of adoption. The other piece is the business of SSI working group, which I started and led for some time. Some good work was done there, as well, and so all of my work at Sovrin was around getting adoption. The story of how Trinsic got started (or, at the time, it was StreetCred) was that it was another continuation of what I was already doing. I was talking to people, and saying, “Why aren’t you going to production with your SSI solutions? What’s holding you back, and what do we need to do, to make it so that nothing’s holding you back?” I heard over, and over, and over, that the tools were not there, that there were no products that could do what needed to be done; the technology wasn’t mature enough, the technology wasn’t quite there, the technology was too hard to use, or hard to integrate into existing applications. For a long time, my answer was, “Okay, well, the private sector or whatever, the marketplace will solve that problem. 
That’s not Sovrin’s role.” That’s true, and so I really eventually thought, “Okay, I’ve done all my stuff I need to do here at the Sovrin Foundation, and now I need to try to make it easy for people.” I needed to work on this other piece, which involves productizing SSI in a way, specifically, that developers could easily integrate into their applications; whether they be existing applications that are serving existing markets, or building new verticals and new use cases on top of Self-sovereign Identity, for whatever business problem that they’re solving. So, I got together with Tomislav Markovski, who wrote the first Aries framework for mobile, and then also Michael Boyd, who wrote the first Indy agent in Python, along with the rest of the working group at the time. These were two of the founding engineers, or code producers, in the Hyperledger Aries project. I said, “Guys, let’s get together, and let’s make something that makes it really easy for developers to integrate SSI into any application. Let’s try to make it as simple as Stripe made payments, or as simple as Twilio made communications, or as simple as name-your-API-based, developer-focused company made whatever thing easier.” That’s what we did. When I transitioned into doing StreetCred, it felt more like it was my next step in making SSI easier to adopt. Last Monday, we announced Trinsic Ecosystems, which is our new product. It’s a productized way to build trust ecosystems. This is the same next step in that same journey to make SSI adoptable and easy to use, and to successfully get digital wallets in the hands of people that have credentials, who are using them for useful purposes. That’s the summary. Mathieu: Yes. I was going to hop forward, because, in the meantime, you guys have been the go-to for developer tools; that’s how we met each other. We had developers looking at the stuff, and the contributions were coming from your team. 
You pushed the bar forward there, and made it easy for a lot of people to start implementing Self-sovereign Identity. Now, we’ve developed these tools for developers to do basic functions, such as making connections, exchanging credentials, doing verifications, and so forth. That’s the space that we are looking at, as a company. There needs to be a provider, or someone that’s going to create an ecosystem; create the rules, and create the governance. We could talk about community-driven initiatives, such as when you mentioned Tokenomics before; there’s a discussion there to be had. If we take a step back, and we look at how to enable organizations or companies to successfully offer the benefits of Self-sovereign Identity to their customers, and to their employees: how do you reap the benefits of less friction and more privacy? The list goes on and on; you need to have someone kick-starting the ecosystem. Was that the same realization for you guys? And, what did you feel was necessary to make this easy for an ecosystem provider to really get going? Focusing on Self-Sovereign Identity (SSI) Ecosystems Riley: Yes, we had that same journey as well. However, I think that for us, it was more around the fact that we have this developer platform, and we have this API product that’s being used by hundreds of organizations. At its core, it was an exercise of looking at the current state of our product and its growth, and the different segments. Some people are using verifiable credentials for authentication, others are using them for sharing and building these ecosystems, others are using credentials in contained and closed ecosystems, and still others in open ecosystems. There are various different use cases for credentials, of course. Looking at our own data that we’d collected and thinking, “Where are credentials getting adoption?” That’s been the number one thing since day one, and it still is. 
It’s going to be the main priority all the way until we’ve got a network effect established for verifiable credentials that is so strong, that nothing’s ever going to topple it. That’s going to be the thing that we’re constantly thinking about: adoption. Looking at the data that we had, we could see where credentials were getting the most adoption. The best way to get credentials into the world with real adoption has been in these ecosystems; it’s when people come to our platform, pick up the product, and use it to build an ecosystem — that’s where the best adoption has been. So, I would say that the Trinsic Ecosystems product was about talking to our customers who’d already created ecosystems, building things for them that help to solve their problems and help make it easier for them to build ecosystems, and then trying to generalize that. If we say, “Okay. These five companies need a trust registry,” for example: at a high level, it’s a list of issuers who are authorized to issue certain types of credentials. If these five customers need this, is this something that can be generalized to more use cases? Something I think that we’ve been very intentional about at Trinsic, is talking to the people who are actually implementing and deploying this stuff in production. Basically, Trinsic Ecosystems is our way of productizing all of those things that everybody needs if they want to deploy a credential ecosystem. At least, the general pieces that people need, so that they can configure an ecosystem and then deploy it out to the network that they’re trying to affect. Mathieu: I love the vision. There are a lot of similarities for me, coming from a crypto/blockchain space, where people were trying to reinvent the world in so many different ways. That’s a separate conversation. Ecosystems exist today all over the place. 
I think that many people, when they approach something new with a new technology, are trying to create a new way of doing something, when things aren’t necessarily done that way. That’s not the easiest way to get people bought into doing something new. If I have a business case for credentials — there are a hundred of them — it could be to reduce costs, or reduce fraud, or lower my compliance costs, or reduce friction. Whatever it is, you need an easy way for an organization or an entity that’s already trying to make their ecosystem a credentials-based ecosystem, to really get it going and get its existing governance, rule sets, and relationships into this new model. I think what you guys are doing there is good, and it makes it a lot easier for people to start using the stuff without throwing out what they’ve already done. Riley: I think that, as you said, there are already ecosystems out there. Taking a step back again to the meta-topic of adoption; the other thing to keep in mind is that I’ve never talked to anybody in my life who has questioned the value of SSI, if we assume that there is widespread adoption already. The question is always, “How do you get to that point? How do you bootstrap the network effect?” It’s not always a great user experience to ask your customer to download a wallet, but if they already have a wallet, and already have credentials in it, then it’s an awesome user experience for them to streamline the access process through SSI. That’s just one example out of many that I could give. I think that the ecosystem approach is a very pragmatic way for organizations to do something using verifiable credentials, that is uniquely valuable today. It’s something that they can do today, that’ll add value in a way that nothing else on Planet Earth can match. Of course, there are some caveats or dependencies there, perhaps, if you want to argue it. 
However, I think that what verifiable credentials are fundamentally good at, is portable data and verifiable data. When you have an ecosystem, sharing data without verifiable credentials can be very costly and cumbersome, and not scalable. So, I think it’s an excellent way to give organizations something that they can do today, that will add value, reduce costs, and increase revenues and whatever else they want to accomplish. At the same time, they’re also keeping the door open for that eventual future, where verifiable credentials will be ubiquitous and where we will get to the more decentralized interactions that we all hope to see in the future. Mathieu: Do you have concerns with the term ‘Self-sovereign Identity’ when talking to a prospect? When people are so used to doing something today, maybe self-sovereign sounds too out there, or people may have mixed feelings about it. The term Self-sovereign Identity addresses the ‘self-sovereign’ aspect, but identity is maybe not everything that it encompasses, either. I know you have written a very good blog post, called: “SSI has an identity problem.” Did you front-load Self-sovereign Identity in conversations? How do you look at that? Riley: No, I don’t front-load it; it depends on who I’m talking to. You always want to gauge your audience, so when I’m talking to you, Mathieu, I use the term Self-sovereign Identity or SSI. I’m assuming many of the listeners here are familiar with the concept, because of your other podcasts and things like that, but you’ve got to gauge your audience. One of the things that I talked about, either in that blog post or in another one I did about SSI adoption and the current state of adoption, is: “don’t talk past your prospect or your buyer or your whatever.” Make sure you’re using language that’s shared. One of the challenges with any new technology is there’s no shared vocabulary. 
Having a shared vocabulary is the easiest way to communicate something; since we both know what a newspaper is, I can simply say the word “newspaper,” and there’s a great deal of context that we both have of shared knowledge. We can use that one word, instead of describing what it is in a bunch of words and sentences. I could try to do that here, but I’m not going to take the time to do that. I think you can see the point with SSI, right? When you already have all the context, I can say that acronym and you get a bunch of knowledge. Whereas, if you don’t already have that context, you’ve got to use a bunch of words to try to fill in that context. One thing that Timothy Ruff and I have talked about for years; we’ve been looking for a way to convey all of that context in as simple a way as possible, and, if possible, in only a few words, or one sentence. We’ve gotten close in various ways, but we haven’t gotten all the way there. So, generally speaking, when talking to a prospect, it is all about the use case and the business value, and it is not at all about the way to get there or the implementation path. For example, suppose somebody wants to do COVID test results, and all they want is the ability to know that the COVID test was actually done by a legitimate lab. That is what they want to know; that’s their business problem. I think the way to do that successfully is if we discuss it in terms of the business problem and with the context and words that they’re familiar with. As you said, I wrote a blog about all of the different names that people have tried to use for SSI over the years. There’s a new proposal that’s not in that blog post. Maybe I’ll make an edit or update to that blog post about the term ‘authentic data,’ which I like, but again every name has its own trade-offs. I would say, generally, do I have problems with the term SSI? Yes. Do I think SSI is the most widely used, and is still the best term to convey all of that context? Yes. 
However, do I use it frequently with people who are unfamiliar, or with potential prospects? No. Mathieu: It’s all about solving a business problem today for organizations. KYC is a good example. For regulatory reasons, you need to collect and verify certain information about a prospective customer before offering them financial services or whatever. My feeling is that the world of technology solution providers that are selling into these companies today is very transactional. Of course, in the financial world, every time you’re making a payment or conducting a transaction, there’s monitoring, there’s stuff that goes on for various policies such as anti-money laundering. However, much of the business value that people seem to be looking to get out of credentials, is that level of verifiability. I’m curious as to how you look at this, too. Is it more interesting to be talking about user-centricity, and consent-driven decisions, and privacy and better security? Again, it depends on who you’re talking to. If you’re talking to someone that understands Self-sovereign Identity, then you could talk broader than that. But, talking about these core values or aspects of Self-sovereign Identity might make it easier to get in the door, get credentials used within their respective ecosystem, and then slowly start talking about more exploration. In other words, “Now that we’ve done this thing; by the way, there’s a lot more upside that can be achieved through the reusability and portability of these things.” Is my thought process similar to how you engage with prospects? Riley: Yes. The other thing that I’ll say is: what you and I, Mathieu, are selling (for lack of a better term) is SSI. We’re selling decentralized identity; we’re selling the ability for companies to implement this stuff. But, what all of our customers are selling, is some specific use case or business problem. 
So, I think that the things I say in a conversation with a prospect will be different than the things that I think our customer will say. Let’s take ‘Farmer Connect,’ for example: Farmer Connect has built a very awesome traceability solution, where you can scan a QR code on your coffee beans, and it resolves all the way back through the supply chain to an individual farmer who has a Self-sovereign Identity. That farmer can choose what data or information from all their verifiable credentials they want to share with the end consumer of the coffee beans that they grew. Then, the consumer can sort of ‘tip’ the farmer that grew the coffee beans, and it’s a very cool use case that implements SSI. Farmer Connect is not selling SSI to their ecosystem, right? They’re selling transparency, and they’re selling sustainability, and they’re selling a specific thing. Whereas, somebody like Trinsic: we don’t build specific products, we work with companies who are building specific vertically-focused products. We try to be the best in the world at building those tools and the low-level stuff. So, the conversation that I might have with the prospect would be different than how I would advise a customer of ours, or an implementer of SSI solutions directly. I would think that Farmer Connect is going to talk about farmers sharing their data, in a way that’s privacy-respecting and that those farmers can use, to get access to credit and other things. They’re going to talk about the farmers; they’re not necessarily going to talk about Self-sovereign Identity. Or, at least, I wouldn’t typically advise them to do so. Whereas when I have a conversation with someone, it is going to be about the tools, and the technology, and the approach. As an example, Amazon AWS is going to be talking about the cloud and the benefits of moving to the cloud; Pinterest is going to be talking about pictures, and Uber is going to be talking about rides, but they’re both built on AWS. 
For the end consumer, for the sake of the use case, I don’t think you need to talk about the technology that’s powering it. It totally depends on what you’re using the technology for, and it’s better to be talking about the business problem instead. Mathieu: Fair enough. You wrote another great piece: “Four Keys for SSI Adoption,” and one of the findings was simply, “Don’t sell the tech.” I think you alluded to that again here, in selling the value of SSI instead. I think that your fourth key for SSI adoption, was that chicken-and-egg problem. Has your thinking advanced since you’ve written this? And, perhaps you can also talk about the survey you did, and how you approached this thing? I think it’s pretty cool that you went out there, and came up with these findings. Riley: For the survey that I did, I was thinking that there are so many companies out there who are deploying SSI-based solutions. Almost all of them are using some vendor to help them, so the survey revolved around reaching out to other vendors, like us, and asking whether those vendors have any insight on their customers and their customers’ adoption. It turns out, they do; so it was more about collaborating with other vendors. There are still conversations around confidentiality there, and so that’s why all of the raw data is not shared, but at a high level, I did share what I felt like I could. I talked to all the SSI vendors that you’ve likely heard of, and got every one of them, with maybe an exception or two, depending on who you want to count in that arena as a vendor. I would say it’s perhaps not exhaustive, but it was many of the more prominent players and talking to them: “Where are your customers at? Have they gone to production? What’s essentially the scale of that deployment? What’s the approach?” Ultimately, it comes down to that last component that you’re talking about, around the chicken and egg problem. 
Of course, if a bank is willing to accept a driver’s license and you already have a digital driver’s license on your device, and you can simply tap an NFC (Near Field Communications) reader as you walk into the bank — you can imagine really great user experiences once this stuff is widespread. But until that point, there’s a challenge with identifying where the specific value is, that verifiable credentials can add. My answer there was that there are really two ways to get around that: one is to take an ecosystem-type approach, which I think we’ve seen some successes in doing; and the other is to take a single-sided approach. That refers to the two- or three-sided marketplace that is SSI at scale. If we take one side of that, it would involve working with the Issuer, and trying to solve a problem for the Issuer. Or, to work with the Verifier, and try to solve a problem for the Verifier. In other words, you are starting with only one side of the Trust Triangle. I asked 30-odd people in this survey I did, and this was an hour-long interview with each of these people: I dug deep into the background behind the answers. Fifty percent of them said that an ecosystem-type approach was the right way to go, and fifty percent said a single-sided approach was the right way to go. So, really, the answer is that nobody knows what the right way is. Maybe the answer is that there is no right way; it’s whatever works for a given use case. But somehow, in order to reach the level of value that we all see in SSI, we have to get some kind of network of verifiable credential Issuers, Verifiers, and Holders to exist. Once those networks can cross-pollinate, and you can get a network of networks, or a network of ecosystems is maybe another way to think about it: that’s when the value is really tremendous. That’s when the promises, that we all want to see happen, will be unlocked. In order to get there, you’ve got to start somewhere, right? 
You’ve got to start with some immediate use case that can add value. The fax machine took nearly 100 years to achieve a widespread network effect. When the fax machine started out, its first use cases were point-to-point sending within US military applications, because it was very secure. So, sending confidential documentation from point-to-point within the military, that was the first use case for the fax machine. That’s a single-sided use case — it’s selling a new technology to one party, and having them implement it internally or for their own processes. Beyond that, it eventually became a way for other companies to get data from the US military and from the US postal service. That’s where the network effect was built from; it started with single-sided adoption by one party, and then other people started building on top of that network or that initial node, and building other nodes relative to the first one, thereby creating a network effect. The question I ask in that blog is, “What’s the best way to make that happen within SSI?” Is it to go after a single party, such as the Issuer or Verifier, or is it to try to tackle a whole ecosystem, and get both sides of the market there from the start? An example of that, would be something like COVID credentials, where the lab issues the credential, and some employer, or airline, or venue, verifies the credential. In those cases, there are effectively two sides; two separate parties that are interacting. I can’t speak to what’s best. What I can say, is that in our worldview, or with the people that we work with, the ecosystems have been the deployments that have seen strong initial adoption. That doesn’t mean that the other approach won’t work also; I think those models are not mutually exclusive. 
We created the Trinsic Ecosystems product to lean into that insight that we had on ecosystems, and to make it easier for people to bring Issuers and Verifiers and Holders into a single trust ecosystem, without a lot of custom development work and implementation. Faxes are unfortunately still used today for stuff — every time I see that, I can’t get over it. But, well, it goes to show how powerful a network effect can be. Think about the fact that faxing is still so prominent; we may not use it in our everyday life quite as much, but when you look at global volumes, it’s still a very prominent communication vehicle around the world for secure document sharing. The fact that that is true is a testament to how powerful a network effect can be. If we can create a network effect with something that empowers individuals, and gives them more privacy and autonomy and things like that, then that’s something I absolutely want to be a part of. Mathieu: Riley, thanks for doing this. I really appreciate it. Riley: Thanks for having me; this has been an awesome conversation.",https://northernblock.io/building-digital-trust-ecosystems/,,Post,,Explainer,,,,,,,,2021-05-05,,,,,,,,,,,,,
Trinsic,Trinsic,,Medium,,,,,,,Call to Action: Verifiable Credentials & COVID-19,"Gates suggests that a digital certificate is needed because it will enable people to share trustworthy information with others. In other words, it is a tool at our disposal to help us reduce uncertainty around the virus. As uncertainty is reduced, additional information enables risk decisions⁴ to be made to ensure our economy doesn’t slip into a total depression. More people can go back to work, faster. Bill Gates’ comment on Reddit caught attention among my colleagues, customers, and partners because there is already a W3C technology standard to accomplish this called verifiable credentials (VC). Hundreds of organizations, including the largest companies in the world, are using VCs for all sorts of things. Verifiable credentials are like digital certificates but with special superpowers that give people privacy, control, and convenience.","Call to Action: Verifiable Credentials & COVID-19 Shortly after Bill Gates posted on Reddit a few weeks ago, my notifications started blowing up. In response to a question about what businesses should stay open during this COVID-19 pandemic, he said: Worldwide Pandemic COVID-19 is an unprecedented pandemic that’s turned the global economy upside down in a matter of weeks. But the true cause of cities shutting down, sports leagues cancelling, and nationwide layoffs is not the novel Coronavirus per se. The true cause is uncertainty¹. Because the virus reportedly has an incubation period of up to 14 days, it’s impossible to know who has the virus. That means everyone you come in contact with is a potential threat to you and your family, and given that, society’s rational response has been to reduce the number of people we come in contact with through social distancing, quarantine, and other measures. This effort is saving millions of lives, but costing millions of jobs. 
It’s reducing the burden on our medical system, but increasing the economic burden on people everywhere². The degree to which uncertainty exists is the degree to which the economy must remain on lockdown. The degree to which we reduce uncertainty is the degree to which people can go back to work. The longer the economy is on lockdown, the more harm is done to the most vulnerable groups of people³ and crucially important small businesses. The question logically follows: How do we reduce uncertainty and pick up our economy? I believe we need two things. - We need to know whether we have the virus or not. That means we need lots of affordable tests. I won’t spend time on this point — the medical community is working at lightspeed to make this happen, and recent/upcoming FDA approvals look promising. - We need to know whether others have the virus or not. We need to be able to share our status and verify the status of others. That means we need a scalable, privacy-respecting infrastructure for sharing trustworthy information. This is exactly the point Bill Gates was alluding to. Trusted data Gates suggests that a digital certificate is needed because it will enable people to share trustworthy information with others. In other words, it is a tool at our disposal to help us reduce uncertainty around the virus. As uncertainty is reduced, additional information enables risk decisions⁴ to be made to ensure our economy doesn’t slip into a total depression. More people can go back to work, faster. Bill Gates’ comment on Reddit caught attention among my colleagues, customers, and partners because there is already a W3C technology standard to accomplish this called verifiable credentials (VC). Hundreds of organizations, including the largest companies in the world, are using VCs for all sorts of things. Verifiable credentials are like digital certificates but with special superpowers that give people privacy, control, and convenience. 
Community Weeks ago, various partners and customers of ours began reaching out and discussing the possibility of using VCs to respond to the COVID-19 situation. We’ve been collaborating with the self-sovereign identity and greater verifiable credentials communities to bring standardized credentials and governance to market. This effort is ongoing. The community has a Slack channel and a volunteer initiative between 60+ companies in which anyone is welcome to participate. Trinsic In line with our vision to make verifiable credentials more accessible to the world, Trinsic will provide our world-class tools and support to anyone working on something related to COVID-19 free of charge for a period of time. We believe verifiable credentials can play an important technical role in our global response to COVID-19. With over 150 developers and companies using our full-stack SSI tools, Trinsic is an ideal platform to rapidly integrate verifiable credentials into any application or system. Over 15 companies are already using Trinsic to respond to the virus in different ways, and we welcome as many as are eager to get solutions to market. There are countless ways verifiable credentials can be applied to the crisis. Below are a few examples: - Senior care facilities need a way to verify the employees and visitors who enter aren’t infected - Doctors who want to practice via video call (Telehealth) can prove they’re actually doctors - Digitizing physical test results into a verifiable credential - Insurance and entitlement fraud prevention solutions - A more privacy-respecting contact tracing solution Some of the applications of verifiable credentials to this pandemic are obvious while others are subtle. Some are niche and others are widely applicable. If you’re interested in getting involved in the initiative or building a solution, reach out to us at hello@Trinsic.id. ¹ My argument here is more easily made by looking at the extreme case. 
If everyone on earth had an embedded screen on their hand that turned red when they were infected with COVID-19, then we would take efforts to quarantine those people and the rest of society could proceed as usual. ² This sentence shouldn’t be misinterpreted — I am in favor of the quarantine measures and strongly support the guidance of medical professionals. Economic cost is worth incurring to save countless lives. My point is simply that economic cost is very unfortunate for many people, especially the most vulnerable. ³ Mental illness, drug addiction, and “tremendous suicides” are some examples, as President Trump puts it in this video. ⁴ By risk decisions, I mean that everyone can autonomously decide how much risk they’re willing to take on. For example, an airline could specify passengers will only be allowed to board if they can present a negative result from a test that was taken in the last 2 hours. But the employee loading the bags into the plane maybe only needs to have taken a test in the last 48 hours. The policy can be set depending on the risk level of the activity.",https://medium.com/trinsic/call-to-action-verifiable-credentials-covid-19-a180155a157c,,Post,,Explainer,,,COVID,,,,Verifiable Credentials,2020-06-22,,,,,,,,,,,,,
Trinsic,Trinsic,,,,,,,,,Decreased Unemployment Among African Youth Using Verifiable Credentials,"In Africa, these difficulties are magnified by the pre-existing high unemployment rate among African youth. Yoma is a platform that uses verifiable credentials to help African youth build up their digital CV and find employment that matches their skills. Although Yoma and its benefits were relevant and needed before the pandemic, the economic impacts of COVID-19 have only increased the platform’s effectiveness for African youth.
Below is the interview we had with Lohan Spies, the individual responsible for integrating verifiable credentials into the Yoma platform using the Trinsic platform.",,https://trinsic.id/decreasing-unemployment-verifiable-credentials/,,Interview,,Explainer,,,,,,,,2020-08-04,,,,,,,,,,,,,
Trinsic,Trinsic,,,,,,,,,SSI Has an Identity Problem,"there is a new category of technology and business solutions that seeks to solve the proverbial “dog on the internet” identity problem for individuals, organizations, and connected devices. Most often called ‘SSI’ (for ‘self-sovereign identity’) or ‘decentralized identity’, these terms are often used in the same way ‘elephant’ is used—with a wealth of meaning and nuance not apparent to beginners. A review of the different terms used to reference SSI provides a helpful introduction.",,https://trinsic.id/ssi-has-an-identity-problem/,https://trinsic.id/wp-content/uploads/2020/11/elephant.png,Post,,Explainer,,,,,,,,2020-11-24,,,,,,,,,,,,,
Trinsic,Trinsic,,Medium,,,,,,,The Story of Decentralized Identity,"Most of the time we don’t realize how much our private data is exposed and shared. Often we don’t even question how much information about us we should share to get something. Do you really need all those sensitive details about me to go through even a simple process as a rental application for a tiny apartment? Why do you need to see my bank history to verify I have sufficient income, or see my name and address on my ID to verify I’m over 21? Why do we still rely on physical documents to prove something about us in this age of technological advancement?","The Story of Decentralized Identity Three years ago I was moving from my Jersey City apartment to a new apartment in midtown Manhattan. The real estate agent seeing me through the process explained all the necessary documentation I will need to present to the landlord to prove I’m eligible and worthy of renting a place in the middle of the island. This was not an ordinary rental application — I had to provide proof of employment, rental…",https://medium.com/trinsic/the-case-for-decentralized-identity-820b48527cba,,Post,,Explainer,,,,,,,,2018-08-19,,,,,,,,,,,,,
Trinsic,Trinsic,,,,,,,,,Trinsic Basics: The Three Models of Digital Identity,"Digital identity has advanced over time, most recently culminating in self-sovereign identity (SSI). In this Trinsic Basics post, we are going to briefly cover the different models of digital identity and how SSI is the next step in the digital identity evolution. The content in this post is inspired by a blog post",,https://trinsic.id/the-three-models-of-digital-identity/,,Post,,Explainer,,,,,,,,2020-09-25,,,,,,,,,,,,,
Trinsic,Trinsic,,,,,,,,,Trinsic Basics: What Are Decentralized Identifiers (DIDs)?,"Most identifiers are given to us by centralized registration authorities like governments, telephone companies, and email providers. But that puts an organization in between us and our ability to access basic services, compromising privacy and putting individuals in a position of powerlessness. The answer to this problem is a W3C standard called Decentralized Identifiers (DIDs).",,https://trinsic.id/trinsic-basics-what-are-decentralized-identifiers-dids/,,Post,,Explainer,,,,,,,DID,2020-09-03,,,,,,,,,,,,,
Trinsic,Trinsic,,,,,,,,,Trinsic Basics: What Are SSI Digital Wallets?,"A digital wallet, in the context of self-sovereign identity, is a software application and encrypted database that stores credentials, keys, and other secrets necessary for self-sovereign identity.³",,https://trinsic.id/what-are-ssi-digital-wallets/,,Post,,Explainer,,,,,,Wallet,,2020-08-20,,,,,,,,,,,,,
Trinsic,Trinsic,,,,,,,,,Verifiable Credentials and Smart Contracts for COVID-19 Data Management,"The app is called “State Surveillance System for Covid19 Testing and Vaccine Distribution Management”. It is a prototype app developed using DAML (Digital Assets Modeling Language) and W3C’s verifiable credentials. The app showcases a prototype solution that provides a digital, secure experience for citizens, health clinic providers, and state agencies to share COVID-19 test results, “proof of vaccine” administration, and other “immunity proofs” using a centralized ledger.",,https://trinsic.id/verifiable-credentials-and-smart-contracts-for-covid19-data-management/,,Post,,Explainer,,,,,,,,2020-09-10,,,,,,,,,,,,,
Trinsic,Personal,,,Damien Bowden,,,,,,Creating Verifiable credentials in ASP.NET Core for decentralized identities using Trinsic,This article shows how verifiable credentials can be created in ASP.NET Core for decentralized identities using the Trinsic platform which is a Self-sovereign identity implementation with APIs to integrate.,"This article shows how verifiable credentials can be created in ASP.NET Core for decentralized identities using the Trinsic platform, which is a self-sovereign identity implementation with APIs to integrate. The verifiable credentials can be downloaded to your digital wallet if you have access and can be used in a separate application which understands the Trinsic APIs. Code: https://github.com/swiss-ssi-group/TrinsicAspNetCore Blogs in this series - Getting started with Self Sovereign Identity SSI - Creating Verifiable credentials in ASP.NET Core for decentralized identities using Trinsic - Verifying Verifiable Credentials in ASP.NET Core for Decentralized Identities using Trinsic - Challenges to Self Sovereign Identity Setup We want to implement the flow shown in the following figure. The National Driving License application is responsible for issuing driver licenses and administering licenses for users who have authenticated correctly. The user can see his or her driver license and a verifiable credential displayed as a QR code which can be used to add the credential to a digital wallet. When the application generates the credential, it adds the credential DID to the blockchain ledger with the cryptographic proof of the issuer and the document. When you scan the QR code, the DID will get validated and will be added to the wallet along with the requested claims. The digital wallet must be able to find the DID and the schema on the correct network, and needs to search the correct blockchain for the ledger. A good wallet should take care of this for you. The schema is required so that the data in the DID document can be understood. 
Trinsic Setup Trinsic is used to connect to the blockchain and create the DIDs and credentials in this example. Trinsic provides good getting started docs. In Trinsic, you need to create an organisation for the Issuer application. Click on the details of the organisation to get the API key. This is required for the application. This API key cannot be replaced or updated, so if you make a mistake, lose it, or commit it in code, you would have to create a new organisation. It is also important to note the network. This is where you can find the DID to get the credentials produced by this issuer. To issue credentials, you need to create a template or schema with the claims which are issued in the credential using the template. The issuer application provides values for the claims. Implementing the ASP.NET Core Issuer The verifiable credentials issuer is implemented in an ASP.NET Core application using Razor Pages and Identity. This application needs to authenticate the users before issuing a verifiable credential for the user. FIDO2 with the correct authentication flow would be a good choice as this would protect against phishing. You could use credentials as well, if the users of the applications had a trusted ID. You would still have to protect against phishing. The quality of the credentials issued depends on the security of the issuing application. If the application has weak user authentication, then the credentials cannot be trusted. For a bank, government IDs, or driving licenses, a high level of security is required. OpenID Connect FAPI with FIDO2 would make a good solution to authenticate the user. Or a user with a trusted government-issued credential together with FIDO2 would also be good. The ASP.NET Core application initializes the services and adds the Trinsic client using the API key from the organisation which issues the credentials. The Trinsic.ServiceClients NuGet package is used for the Trinsic integration. 
ASP.NET Core Identity is used to add and remove users and to add driving licenses for the users in the administration part of the application. MFA should be set up, but as this is a demo, I have not forced this. When the application is started, you can register and create a new license in the license administration. Add licences as required. The credentials will not be created here, only when you try to get a driver license as a user. The QR code of the license is displayed, which can be scanned and added to your Trinsic digital wallet. Notes This works fairly well but has a number of problems. The digital wallets are vendor specific, and the QR code and credential links are dependent on the product used to create them. The wallet implementations and the URLs created for the credentials are all specific and rely on the good will of the different implementations of the different vendors. This requires an RFC specification or something like it, if SSI is to become easy to use and mainstream. Without this, users would require n wallets for all the different applications and would also have problems using credentials between different systems. Another problem is the organisations' API keys used to represent the issuer or the verifier applications. If these API keys get leaked, which they will, they are hard to replace. Using the wallet, the user also needs to know which network to use to load the credentials, or to log in to your product. A default user will not know where to find the required DID. If signing in using the wallet credentials, the application does not protect against phishing. This is not good enough for high-security authentication. FIDO2 and WebAuthn should be used when handling such sensitive data, as they are designed for this. Self-sovereign identity is in its very early stages but holds lots of potential. A lot will depend on how easy it is to use and how easy it is to implement and share credentials between systems. 
The quality of the credential will depend on the quality of the application issuing it. In a follow-up blog to this one, Matteo will use the verifiable credentials added to the digital wallet and verify them in a second application. Links https://studio.trinsic.id/ https://www.youtube.com/watch?v=mvF5gfMG9ps https://github.com/Trinsic-id/verifier-reference-app https://docs.trinsic.id/docs/tutorial https://techcommunity.microsoft.com/t5/identity-standards-blog/ion-we-have-liftoff/ba-p/1441555",https://damienbod.com/2021/04/05/creating-verifiable-credentials-in-asp-net-core-for-decentralized-identities-using-trinsic/,,Post,,HowTo,,,,,,ASPNET,Verifiable Credentials,2021-04-05,,,,,,,,,,,,,
Trinsic,IDCommons,,IIW,Riley Hughes,,,,,,Build an SSI proof of concept in <30 minutes,"The session began with a short introduction to SSI, an introduction to Trinsic, and an overview of how to get started. Then, everybody present started building an SSI proof of concept, creating issuers, verifiers, and schemas to learn first-hand how it all works. A step-by-step guide on how to replicate this session can be found at the following link: [https://www.notion.so/Trinsic/Build-an-SSI-Proof-of-Concept-dae9d6e565eb4770be41b61d55e090cb](https://www.notion.so/Trinsic/Build-an-SSI-Proof-of-Concept-dae9d6e565eb4770be41b61d55e090cb)","21G/ Build an SSI proof of concept in 30 minutes Build an SSI Proof of Concept in <30 min Thursday 21G Convener: Riley Hughes Notes-taker(s): Discussion notes, key understandings, outstanding questions, observations, and, if appropriate to this discussion: action items, next steps: The session began with a short introduction to SSI, an introduction to Trinsic, and an overview of how to get started. Then, everybody present started building an SSI proof of concept, creating issuers, verifiers, and schemas to learn first-hand how it all works. A step-by-step guide on how to replicate this session can be found at the following link: https://www.notion.so/Trinsic/Build-an-SSI-Proof-of-Concept-dae9d6e565eb4770be41b61d55e090cb",https://iiw.idcommons.net/21g/_build_an_ssi_proof_of_concept_in_30_minutes,,Session,,HowTo,,,,Ecosystem,,,,2021-05-06,,,,,,,,,,,,,
Trinsic,Trinsic,,,,TOIP; TOIP Founder,,,,,Joined Trust over IP Foundation as Founding Member,"As technology was developed to enable voice to travel over the Internet Protocol, a technology that powers Zoom, call centers, and more, it was coined Voice over IP. Other examples abound, including PC over IP, AV over IP, etc. Trust over IP (ToIP) is exactly what it sounds like. For the first time, the internet can add an element of human trust that would not have been possible before.",,https://trinsic.id/streetcred-id-joins-trust-over-ip-foundation-as-founding-member/,,Post,,Meta,,,,,,,,2020-05-05,,,,,,,,,,,,,
Trinsic,Trinsic,,,,,,,,,Trinsic just raised $8.5M 🎉 and we want to celebrate with you!,"developers face a dizzying number of standards to be compatible with—“SoulBound Tokens” and “Web5” being the latest additions to the litany of W3C, ISO, DIF, ToIP, and other existing specs. Trinsic offers teams a single API that acts as an abstraction layer that bridges ecosystems, strips complexity away from the development process, and ensures products are future-proof.",,https://trinsic.id/trinsic-raises-8-5m-for-decentralized-identity-platform/,,Post,,Meta,,,,,,,,2022-06-28,,,,,,,,,,,,,
Trinsic,Trinsic,,,,,,,,,Trinsic Leads SSI Digital Wallet Portability,"“Portable” is one of the 10 principles of self-sovereign identity (SSI). In order to achieve portability or self-sovereignty, an individual must be able to control where their identity information and credentials are stored. They must be able to leave their current provider and move to a new provider and never be trapped in vendor lock-in.
Wallet portability for individuals has always been an aspiration of wallet providers, but until today, has never been successful. We’re proud to announce that Trinsic has achieved interoperable wallet portability with two other SSI wallet vendors—Lissi and esatus AG. For the first time, an individual can “fire their wallet”¹ and use a new one.",,https://trinsic.id/ssi-digital-wallet-portability/,,Post,,Meta,,,,Portability,,Wallet,,2020-08-18,,,,,,,,,,,,,
Trinsic,Trinsic,,,,,,,,,Trinsic raises pre-seed funding and rebrands from Streetcred ID to Trinsic,"Salt Lake City, UT / June 10, 2020 / — Streetcred ID, a SaaS platform for decentralized identity, announced today that it rebranded to the name Trinsic and closed a pre-seed funding round with institutional investors. Kickstart Seed Fund (Kickstart), a seed-stage venture capital firm in the Mountain West, led the round. Trinsic is Kickstart’s first investment of its recently-closed, oversubscribed $110 million fund.",,https://trinsic.id/streetcred-id-rebrands-to-trinsic-raises-pre-seed-funding/,,Post,,Meta,,,,,,,,2020-06-10,,,,,,,,,,,,,
Trinsic,Trinsic,,,,,,,,IIW #28,Trinsic Cements its Commitment to Interoperability Ahead of Internet Identity Workshop XXXI,Interoperability has always been of paramount importance to Trinsic. That story begins at an IIW #28 demo,,https://trinsic.id/trinsic-commitment-to-interoperability-ahead-of-iiw/,,Post,,Meta,,,,,,,,2020-10-20,,,,,,,,,,,,,
Trinsic,Trinsic,,,,,,,,,Company Culture & Trinsineers,"Trinsineers are people who’ve agreed to take the journey to make the world more accessible to people everywhere. We’re a team of people who happen to be working together inside a legal entity called Trinsic. This journey is not a casual stroll, but an expedition. As Trinsineers, we’re developing a culture that is not only helping us accomplish our goals but bringing fulfillment and enjoyment along the way.",,https://trinsic.id/on-company-culture-trinsineers/,,Post,,Meta,,,,,,,,2021-02-09,,,,,,,,,,,,,
Trinsic,Trinsic,,,,Verity,,,,,Trinsic APIs,Developers are thrilled when they discover Trinsic’s APIs because they are the simplest way to integrate self-sovereign identity into any application.,The backend to the most innovative SSI applications Developers are thrilled when they discover Trinsic’s APIs because they are the simplest way to integrate self-sovereign identity into any application. Used by savvy developers around the world.,https://trinsic.id/powerful-apis/,,Code,,Product,,,,,,,,2020-04-06,,,,,,,,,,,,,
Trinsic,Trinsic,,,,MedCreds,,,,,MedCreds: Reducing the Risk of Returning to Work,"Looking to do its part in the fight against the COVID-19 pandemic, Trinsic announced three months ago that it would waive all fees for anyone working on projects related to the pandemic. Since then, we have seen a myriad of use cases ranging from using verifiable credentials for HIPAA certifications, to privacy-first contact tracing, to credentialing doctors for telemedicine.<br><br>The use case with the strongest traction has been creating verifiable COVID-19 test results in digital form. Currently, the process of receiving and using a paper-based COVID-19 test result is fraud-prone and clunky. Verifiable credentials makes this process more secure and streamlined.<br><br>One of our partners, MedCreds, is on the leading edge of providing secure, privacy-respecting, and regulatory-compliant solutions and has recently taken their COVID-19 verifiable test-result product to market",,https://trinsic.id/medcreds/,,Page,,Product,,,,,,,,2020-07-17,,,,,,,,,,,,,
Trinsic,Trinsic,,Medium,,,,,,,AgentFramework for .NET joins Hyperledger Aries,"We’re excited to announce that AgentFramework for .NET — a library for building interoperable SSI agents for the .NET Core runtime, joined the Hyperledger Aries family of frameworks. Aries provides a shared, reusable, interoperable tool kit designed for initiatives and solutions focused on creating, transmitting, and storing verifiable digital credentials.","AgentFramework for .NET joins Hyperledger Aries We’re excited to announce that AgentFramework for .NET — a library for building interoperable SSI agents for the .NET Core runtime, joined the Hyperledger Aries family of frameworks. Aries provides a shared, reusable, interoperable tool kit designed for initiatives and solutions focused on creating, transmitting, and storing verifiable digital credentials.",https://medium.com/trinsic/agentframework-for-net-joins-hyperledger-aries-14aba357da41,,Post,,Product,,,,,,"AgentFramework,.NET",,2019-08-24,,,,,,,,,,,,,
Trinsic,Trinsic,,,,,,,,,How to Create Connections in Trinsic Studio,"In this blog post, we will explain how to create connections in Trinsic Studio. Throughout the steps below, we will be referring to a fictitious person, Alice Smith, who is a recent graduate of Faber College and is applying for a job. Alice has already received her digital diploma in the form of a verifiable credential and is wanting to apply to work for a company called ACME Corp. In order to start the online job application, Alice must first make a connection with ACME Corp.",,https://trinsic.id/how-to-create-connections-in-trinsic-studio/,,Post,,Product,,,,,Trinsic studio,,,2020-12-02,,,,,,,,,,,,,
Trinsic,Trinsic,,,,Yoma,,,,,How Yoma Uses Trinsic to Help African Youth Build Digital CVs,"Verifiable credentials is a beautiful set of technology that allows people and organizations to get the data in a verifiable form that still respects agency.” Lohan Spies, Technical Lead, Yoma",,https://trinsic.id/customer-story-yoma/,,Post,,Product,,,,,,,,2023-05-09,,,,,,,,,,,,,
Trinsic,Trinsic,,,,,,,,,Partnered with Zapier to Bring SSI to 2000+ Applications,"In our eternal quest to make SSI easier to adopt, Trinsic partnered with leading workflow automation platform Zapier to enable Trinsic’s developer community to integrate self-sovereign identity with 2000+ common applications without coding! While Trinsic specializes in building the world’s best developer toolkit for decentralized identity, we recognize that plenty of non-technical people want to build SSI integrations. Zapier is the best tool we found to connect the APIs of various different services behind the scenes, making SSI more accessible than ever before.",,https://trinsic.id/trinsic-and-zapier-partner/,,Post,,Product,,,,,,,,2020-10-07,,,,,,,,,,,,,
Trinsic,Trinsic,,,,,,,,,Simplifying SSI-Based Solutions to Focus on Adoption,"After the COVID-19 pandemic hit the state of Oregon and we shuttered shops and public places, here in my little piece of heaven—the city of Sisters—I went to some of my friends at Economic Development for Central Oregon",,https://trinsic.id/simplifying-ssi-based-solutions-to-focus-on-adoption/,,Post,,Product,,,COVID,,,,,2020-09-01,,,,,,,,,,,,,
Trinsic,Trinsic,,,,ESSIFLab,,,,,Trinsic Builds Open Source Trust Registry Sponsored by eSSIF-Lab,"Driven by our motivation to make SSI more adoptable, we built the world’s first turn-key, open source trust registry solution. This work was sponsored by the European Self-Sovereign Identity Framework Lab, which is an EU consortium that provides funding for projects that build SSI open source tools. Any ecosystem provider can use the trust registry implementation to enable governance in their verifiable data ecosystem.",,https://web.archive.org/web/20220810154637/https://trinsic.id/trinsic-builds-open-source-trust-registry-sponsored-by-essif-lab/,,Post,archived,Product,,,,,,Trust Registry,,2022-08-10,,,,,,,,,,,,,
Trinsic,Trinsic,,,,,,,,,Trinsic Introduces Interactive Connections in Trinsic Wallet & Platform,"Current digital wallet implementations fall short of the vision of self-sovereign identity (SSI) because they only allow wallet holders to respond to, not initiate, interactions with institutions. This reduces wallet holders to a passive role, which at best delivers suboptimal utility to the holder, and at worst can reinforce the unhealthy power asymmetries between institutions and people that exist today. Interactive connections solve this problem by creating a two-way street between a wallet holder and an institution. Instead of a passive responder, the wallet holder is a peer who can initiate actions of their own. In addition, wallet holders can interact not only with institutions, but also with other wallet holders, to communicate securely and share verified information.",,https://trinsic.id/interactive-connections/,,Post,,Product,,,,Wallets,,,,2020-11-03,,,,,,,,,,,,,
Trinsic,Trinsic,,,,,,,,,Announcing Trinsic’s Largest Platform Update Ever,"The next version of the Trinsic platform is 10x as accessible, 100x more performant, and 1,000x more scalable. And it is available now.",,https://trinsic.id/announcing-trinsics-largest-platform-update-ever/,,Post,,Product,,,,,,,,2021-07-08,,,,,,,,,,,,,
Trinsic,Trinsic,,,,,,,,,Introducing Trinsic Ecosystems,"Once an ecosystem is configured, providers need to onboard participants like issuers and verifiers. Trinsic Ecosystems comes with an API that’s extremely easy for any issuer or verifier to integrate and can be white-labeled with the name of the provider. In addition to the API, ecosystem participants can use the Trinsic Studio, a white-labeled web dashboard.",,https://trinsic.id/introducing-trinsic-ecosystems/,,Post,,Product,,,,,,,,2021-04-19,,,,,,,,,,,,,
Trinsic,Trinsic,,,,,,,,,Combining Verifiable Credentials and Smart Contracts for COVID-19 Data Management,"The app is called “State Surveillance System for Covid19 Testing and Vaccine Distribution Management”. It is a prototype app developed using DAML (Digital Assets Modeling Language) and W3C’s verifiable credential. The app showcases a prototype solution that provides a digital, secure experience for citizens, health clinic providers, and state agencies to share COVID-19 test results, “proof of vaccine” administration, and other “immunity proofs” using a centralized ledger.",,https://trinsic.id/verifiable-credentials-and-smart-contracts-for-covid19-data-management/,,Post,,Product,,,COVID,,,Digital Assets Modeling Language,Verifiable Credentials,2020-09-10,,,,,,,,,,,,,
Trinsic,Trinsic,,,,Verity,,,,,Trinsic Wallets,"In Trinsic’s platform, identity wallets are secure, partitioned data stores scoped to a single holder, capable of storing and sharing credentials and proofs. Endless configurations of wallets exist (custodial, non-custodial, etc.) each with different trade-offs; Trinsic has designed a hybrid-cloud wallet system intended to strike the ideal balance between security and usability:","Trinsic Wallet: It's like your physical wallet, but digital. One place for all things you You have been collecting paper and plastic representations of your identity, achievements, certifications, and experiences since you were a child. But until now, there has been no standard way to do this digitally. Trinsic allows you to simplify your digital life by obtaining digital versions of all these credentials so that they’re there when you need them—easily, securely, and privately. Trinsic Studio + Trinsic Wallet: The perfect marriage The Trinsic Wallet works seamlessly with the Trinsic Studio, the fastest way to issue credentials to a digital wallet. Use the API for more advanced integrations. Get started for free or check out our additional plans. A wallet for every requirement Mobile Wallet SDK Integrate an embedded digital wallet into any application. White Label Wallet Skip the development effort by white labeling Trinsic’s popular mobile wallet.",https://docs.trinsic.id/learn/concepts/wallets/,,Documentation,,Product,,,,,,,,2020-04-06,,,,,,,,,,,,,
Trinsic,Trinsic,,,,,,,,,Webinar Recap: Making Money with SSI,"In our recent expert-panel webinar, “Making Money with SSI,” we dive in to the details of creating a business out of SSI from experts who’ve done it. Whether you missed the webinar or just want to refer back to the best parts, we’ve got you covered with a full recording. Scroll below the recording to view a highlight reel!",,https://trinsic.id/webinar-recap-making-money-with-ssi/,,Post,,Recap,,,,Business,,,,2020-10-30,,,,,,,,,,,,,
Trinsic,Trinsic,,,,,,,,,Trinsic has released some tools to issue verifiable credentials,"All verifiable credentials come from credential templates. These templates specify what information should be included in a credential. Faber College would most likely want the credential template of its digital diplomas to include a graduate’s first and last name, what they got their degree in, what year they graduated, and their GPA. Let’s begin!",,https://trinsic.id/how-to-issue-credentials-in-trinsic-studio/,,Post,,Resources,,,,,,,Verifiable Credentials,2020-10-19,,,,,,,,,,,,,
Ubisecure,,Ubisecure,,,,,,,,Ubisecure,,,,,Company,,Company,,,,,,,,,,,,,,,,,,,,,
Ubisecure,Ubisecure,,LTADI,,Truprofile.io,,United Kingdom,,,Digital identity in the UK in 2021 with TrueProfile.io’s René Seifert,"“I think it’s interesting if we overlay this utopia of a self-sovereign identity that sounds maybe like science fiction today, and where these UK digital initiatives are geared, and my best guess is we can and will land somewhere in the middle.”","with René Seifert, Co-Founder & Co-Head at TrueProfile.io. In episode 37, René Seifert talks about the current status of identity in the UK; the government’s recent call for evidence and DIU (digital identity unit); the resultant six guiding principles – including privacy and inclusivity; the potential of self-sovereign identity to solve some of these issues; TrueProfile.io and the importance of verified credentials in an HR context; plus the ethical, political and technical challenges of ‘immunity passports’. [Scroll down for transcript] “I think it’s interesting if we overlay this utopia of a self-sovereign identity that sounds maybe like science fiction today, and where these UK digital initiatives are geared, and my best guess is we can and will land somewhere in the middle.” René Seifert is a serial entrepreneur and co-head of TrueProfile.io, a credential verification solution provider. Powered by the DataFlow Group, TrueProfile.io provides these services in a modern environment via the adoption of Ethereum blockchain. Prior to this, René was the co-founder and co-CEO of Venturate AG, a crowdfunding platform allowing regular people to invest side-by- side with experienced business angels. In addition, he has been involved in founding several internet, tech and media companies, among the Holtzbrinck eLab. René, half German and half Croatian, began his career hosting radio shows and running an advertising agency parallel to his studies. He was head of marketing and presenter at the radio station Bayern 3. During the “new economy” he headed the entertainment department at Lycos Europe. 
Find René on Twitter @reneseifert and on LinkedIn. We’ll be continuing this conversation on LinkedIn and Twitter using #LTADI – join us @Ubisecure! Go to our YouTube to watch the video transcript for this episode. Or subscribe with your favorite app by using the address below Podcast transcript Let’s Talk About Digital Identity, the podcast connecting identity and business. I am your host, Oscar Santolalla. Oscar Santolalla: Hello and thank you for joining today, an episode in this New Year 2021 and we are going to discuss, especially now, the digital identity in the UK for this New Year 2021. I have a super special guest today who is René Seifert. He is a serial entrepreneur and co-head of TrueProfile.io, the industry leader in document verification. Powered by the DataFlow Group, TrueProfile.io provides these services in a modern environment via the adoption of Ethereum blockchain. Prior to this, René was the co-founder and co-CEO of Venturate AG, a crowdfunding platform allowing regular people to invest side-by-side with experienced business angels. In addition, he has been involved in founding several internet, tech and media companies among the Holtzbrinck eLab. René, half German and half Croatian, began his career hosting radio shows and running an advertising agency parallel to his studies. He was head of marketing and presenter at the radio station Bayern 3. During the “new economy” he headed the entertainment department at Lycos Europe. Hello René. Welcome. René Seifert: Hi, Oscar. And Happy New Year! My pleasure for this podcast. Oscar: It’s great talking with you. Thank you. Hope you are having a great start of the New Year 2021. First, we would like to hear more about you particularly, how you have been doing in media and other very interesting things about technology, how your life ended in this world of digital identity? René: If I knew that myself… I think it’s a quite unlikely scenario that panned out. 
And maybe you also heard that famous commencement speech from Steve Jobs at Stanford, that you only can connect the dots in hindsight, you can’t connect them living your life forward. And let me maybe try to connect these dots. And you mentioned already a couple of things, how they evolved in my life. Indeed, in my first life, as I tend to say, I was sitting on the other side of our conversation, I was a radio presenter, I was a journalist, I’d say my highlight there was war correspondent for German public radio in Macedonia and Albania during the Kosovo war in 1999. I did this kind of on the side of my university education for economics and management. And then post-graduation I was more focusing on the media management side of things. And as you rightly said, I was head of marketing of Bayern Drei, Bayern 3, one of the top 10 German radio stations after that. The new economy came, if you might recall that time, the boom and the bust, where I was director of entertainment at the then famous and later infamous search engine with a variety of other services – Lycos Europe. And I really saw a lot of this bust of this new economy which made me take a year off in 2002 and spend a year travelling the world in a sabbatical, doing all sorts of things I always wanted to do, from doing a pilot license, motorbike license, motorboat license, learning languages like Russian, Spanish, doing a bit of Muay Thai. And then I came back to Munich and well, this was my first emigration, then going to Bangalore, India, and that’s where in fact I started my entrepreneurial journey in businesses like e-commerce selling jewellery on eBay, then moving into an outsourcing consultancy for what Bangalore has been very and still is very famous for. And then starting indeed to angel invest into Indian companies with this angel network called Mumbai Angels. 
So, in parallel with lots of back and forth, and you mentioned that also, I helped build an incubator in Munich owned by Holtzbrinck publisher called Holtzbrinck eLab where we created some 13 companies in the span of four years. Then I really truly moved back to Munich for a short period of time where I co-founded a social media agency and that crowdfunding platform Venturate AG, which we then sold a year later to a publicly listed company. Then came my second emigration, this time with my family to Bangkok, and then a kind of opportunity presented itself, I’d say very typical through network connections, to join the DataFlow Group, at that time it was headquartered in Hong Kong and meanwhile headquartered in Dubai. And what DataFlow is doing is PSV, Primary Source Verification, since 2006. And I was given a quite broad mandate to look at how can this thing be more digitised? And let’s face it, I’d say verification is not the most sexy topic on the face of this earth. But maybe it’s also the reason why nobody has really looked at that, so I tried to do exactly that, so I kind of became a co-founder of TrueProfile.io where we really put the individual and their empowerment into the centre. So we launched the first version of something called DataFlowPlus.org some four years ago and that morphed subsequently into TrueProfile.io which I’m now running in a shared responsibility with my esteemed co-head Alejandro Coca from Spain who’s focusing on the, say, commercial business part, while I focus on the product and tech part. So, in hindsight, maybe it is possible to connect the dots, I’d say my personal motivation here, the common thread, has been – never stop being curious and never stop learning and never stop being willing to make a move into a new uncharted territory. Oscar: Yeah, I can see. I can see. Many changes in geography and also in the business. 
René: And so, may I just add, I think it’s really interesting because I listened to some 80% of your podcast and I learned a hell of a lot. And I think what you’re doing, you’re doing a great service of building this industry of digital identity, from identity for people, identity for technical systems like APIs, then also what are policy implications or what is the identity of a legal entity. So, I found these areas always very separate, and you are kind of bringing them under one umbrella where they belong. And I hope that maybe today I can contribute another facet around verified credentials which are useful in particular in an HR context into your realm of digital identity. Oscar: Yes, exactly, exactly. I’m sure we are going to have a very different facet from what you are doing there in TrueProfile. In these last years, you have been building this company and making it really global. So, let us focus on starting this year and a bit more into the UK, where I know your company is also operating and having a lot of business there. How could secure digital identities for citizens be introduced in the UK? What’s your take on that? René: Maybe if you allow me. Let’s do a quick game, Oscar and René, where we are today with identity in general, and let’s see how that could play out if we are doing things right. So, I know you are from Peru, right? Oscar: Yes. René: So, if you remember maybe your first ID card that you ever got, what did you have to present as a document to get that ID? Oscar: Well, in Peru, first you go to the military. It’s not mandatory military service but you have to go when you are I think 16 or 17. You get your first ID, it’s a military ID. And then when you become 18, you bring that and you go to the electoral body and you get the ID that is going to be for the rest of your life. You go in person, yes. 
René: But for your kind of induction into the military, into a military ID, you’re somewhere in a register, probably they have your birth certificate to know that you are there. So, what I want to get at is that there is a central government authority that issues a birth certificate, based on that birth certificate they know you are there. Then they call you to school, they call you to the military, then you get your first ID. If you want a passport, you’ll typically present your ID, same with your driving license. If you want a bank account or credit card, you have to present your ID. But these are all tied to your identity that is issued or bestowed by the government, a central authority. So, how about thinking of an alternate universe where we have something like a self-sovereign identity. Imagine this alternate universe without government and maybe even without a central gatekeeper like Facebook who are holding large parts of your identity – what would change? Your biological existence as Oscar wouldn’t change, you’re still there. So, imagine in that alternate universe, you could create your own identity, a self-sovereign identity on the blockchain, and enhance it with all sorts of objects and attributes as you move through your life. A couple of examples: say, credentials from your education, universities, licenses – driving license, pilot license – courses that you might have completed from something like Udemy. And again, maybe the first reflex is to say credit card, but in an ideal, decentralised scenario it might be something rather like your crypto wallet, where you can receive payments, make payments, maybe even small micropayments, and where you choose yourself if you would like to disclose during payment your identity, parts of it or none. At some point, this identity, self-sovereign identity, could serve you to log in to all sorts of services without using any passwords. 
So, a couple of properties that are important to understand: this would be universally accessible. It would be secure. It would be easy to use. It’s fully under your control and widely accepted. And the best part, it’s decentralised, it’s robust against censorship and it’s purely based on blockchain. So, that’s almost Utopia. Oscar: Today, it’s not like that. René: Yeah, exactly. It’s not like that. That’s exactly my point. So let’s take a quick break from Utopia, and you were asking me specifically where does the UK government stand on this? Before we kind of get back to Utopia and try to bring those worlds maybe somehow into connection. So, there is this initiative that we are all quite aware of, which is GOV.UK Verify, that cost some 200 million pounds, and the general consensus is that this identity system is ill-equipped to serve truly as a digital identity that would be able to do all these things that people want, in particular when it comes to doing online transactions. So, in July, two years back, 2019, the government embarked on another initiative that was indeed truly about digital identity, where they made a call for evidence and received 148 submissions. Many were focused on issues around privacy, trust and the role of government obviously in enabling a private sector market for digital identities. And based on that, they created a DIU, Digital Identity Unit, which brings together a variety of stakeholders mainly from governmental bodies. And where do they stand right now? So, the current status is that – and I find it quite interesting – they developed six principles of what this digital identity strategy should encompass. Those six are privacy, transparency, inclusivity, interoperability, proportionality and good governance. If we just kind of touch upon a couple of them in a bit more detail, I guess privacy is certainly key. 
So, when personal data is accessed, will people really have confidence that there are measures in place to ensure that this is really private? So for instance, if somebody goes to a supermarket and needs to prove how old they are when buying liquor, that should work, but it should be still under the control of the individual. Likewise, transparency: it should be very clear, for example in the purchase of some liquor in a store, that only this data point is transmitted and not all the other parts of your digital identity, from name to your credentials, et cetera, what have you. Inclusivity, I find that interesting because even the government says, a bit in contrast to what I said before on the evolution of all the different pieces of what forms your identity today based on government, that people who want or need a digital identity should be able to obtain one without the necessity of documenting that they have a passport or a driving license. So they really want to make this almost an entry point into an identity, which I find good. Then we have another aspect, interoperability, which I think is self-explanatory, that this would be a standard that works across several applications and platforms and can be sent across. Number five and six, I’ll make that really short, is a bit of proportionality and good governance. Well, I think the government tries to portray itself as the very good guy that will never engage in any sort of overreach. I guess that’s something that is nice to say but I’d be always sceptical on that part. If you allow me now to get back to these principles on UK digital identity – and maybe just to finish with where this stands, it really stands now with these principles. And now, the consideration is going forward towards some sort of feasibility analysis and, as we can all imagine, such huge governmental projects take years to complete. I guess we should be monitoring where this is going but this is where it stands today. 
I think it’s now interesting if we overlay this utopia of an entirely self-sovereign identity, that sounds maybe like science fiction today, and where these UK digital identity initiatives are geared. So, my best guess is we can and will land somewhere in the middle. So, the government is not going away for the foreseeable future, but we have, thanks to blockchain, a solid chance to reclaim some of our liberty. So, what I think is there will be some sort of hybrid approach where you might use the launch of a government issued digital ID to make it part of a self-sovereign identity and move from there into the future. Oscar: OK. I was not aware there is a new initiative by the UK government for the digital national ID. What is your estimate, when do you think there will be something to try? René: Oh, I’d say five to 10 years. You can’t blame the government for everything. If you are a government, you have to consider a lot of stakeholders, a lot of discussion, which is all fine. And then these things take longer than if you are just kind of putting out something like uPort, we can speak about that maybe later, which is a totally self-sovereign wallet, and allow people to just pick it up; if it doesn’t work, nobody is really responsible, again, it’s self-sovereign. So we are between those two polar opposites, so the government needs to take an extremely robust and secure approach to it, and my best guess is some five to 10 years. Oscar: OK. So, time will tell. But it’s super interesting that now it’s already being cooked, and with these six principles that you just explained, excellent. So now, moving to another interesting topic: the qualifications, something that is part of what your company is doing. What are the challenges that fake or fraudulent qualifications present for a successful roll out of digital IDs? René: I think it’s a good question, and in fact, if you roll a bit back, there’s really not so much specific to the digital ID part as it is already now. 
It might make matters maybe even worse. So, generally speaking, fake qualifications have always been a problem and cost the industry worldwide billions of dollars on multiple levels. Look at the loss of productivity: you hire a candidate, you realise it’s not the right one, you need to rehire. Then you might get into lawsuits, you have to compensate for damages, and reputational damage in particular in some high risk contexts. So, that’s just bad. You don’t want that. So, here again, the key concept is trust – and blockchain here is no exception to the old saying about pretty much any system or application, which is garbage in, garbage out. The same applies to blockchain: garbage in, garbage out. So I guess the best answer we can come up with today is to work with one or, even better, multiple, say, intermediaries who can play a vital role in verifying the veracity of a statement. And I listened to one of your podcasts, you had a guest from DigiCert who are doing exactly that for the ownership of websites. And something like TrueProfile or the underlying DataFlow Group can do the same for professional credentials. And once you have completed this stage, then you could put the result on the blockchain and have all the benefits that the blockchain offers with a, let’s say, reasonable level of trust. Oscar: Correct. So, the qualifications are not widely connected with the digital IDs. For instance, in Peru, my home country, you declare whether you have university studies or not. You declare that and that’s it. So the government doesn’t verify you, you declare that. René: Yeah, absolutely. And everybody can declare anything, and LinkedIn is full of statements all over the place that are not true. So again, as good as it gets, and I guess that’s the specialty that DataFlow and TrueProfile have built up over the last 15 years, is that we do something called primary source verification. 
So Oscar, if you were to do it with us, you would give us your legal consent, that’s something we absolutely need, and a copy of your document with a few other details. Then we would really reach out to your university and ask them a simple question: is this diploma that we got from Oscar true? Has that been issued by you – by you, as a university? And it takes a few days and then yeah, we get an answer, in your case of course, it would be everything perfect and then we would be able to verify that. If you allow me maybe to comment on the other side question that you asked initially, which is why is that not all digitally connected? I think that’s a world where we might be moving to, but the reality is, if you maybe recall your own university, I don’t know how it looks. We deal with all sorts of – let’s stay with universities – those that are super digitised to those that are not digitised at all. So, there’s no point trying to digitise those that are not digitised at all. We rather try to take the world as it is and try to make the best out of that. So, we engage with a lot of universities that are totally analogue and we write emails, we call them up, at times we go there in person to ask the question and get a result. So, it’s really as good as it gets, moving then jointly hopefully into an era that is more digitised altogether. Oscar: Yeah, exactly, exactly. So now that you are talking about the qualifications, let’s move to talk about TrueProfile.io. So tell me, what is this company, TrueProfile.io? René: Well, I guess a lot is in the name. If you look at True Profile, then it’s the true profile of you, of any individual. I’ve heard a couple of times a description that I didn’t come up with, but when I explain it to people they say, “Ah, OK, you’re sort of a better LinkedIn if you do away with all the social connectivity. 
But with kind of the profile that you have on LinkedIn.” So, if you look at the problem that we’re trying to solve, then there is a certain lack of trust, otherwise you wouldn’t need all these background checks – and as you said in Peru, well, you can say whatever you like and there’s nobody to verify it. So, because there is this lack of trust, there’s a whole industry of background checks with verifications and, even worse, as a sort of second-order problem, when people change their jobs nowadays up to 10, 15 times in a lifetime – I’m exaggerating now a bit but allow me for the sake of the example – they might have to get verified every time again, and it’s a lot of friction in the system because you have three stakeholders who need to participate. It’s one, the individual, who has to submit documents and details and legal consent. Then you have the employer, who has to do and pay for something that probably has been done already before. And then you have, in our example, the university that has to respond, and respond a second, third, fifth time to the same question. And so, that’s what TrueProfile is really about. We’re trying to bridge this trust gap by empowering people, by facilitating fast and verified connections through a technology-driven platform. And we are bringing together the right professionals in an ecosystem of trust, as we like to call it. So, I think it’s important to understand that there are really two major sides, if we take the university quickly out of the equation. It’s on the one hand the individuals, where we really want to empower them to achieve career success – that’s really in the centre, empower individuals by enabling them to share their professional authenticity through our platform. So, just as an example and a data point, half a year ago approximately, we celebrated our 500,000 mark of registered members on TrueProfile.io. 
Then we have the other side of the employers, where we say OK, we want to help businesses around the world to save time and mitigate risk. We’re providing them a platform and then they can make trusted business decisions about the people they want to work with. So, let me maybe also describe a bit the context where this becomes relevant, and I don’t want to sound like the guy who only has a hammer so everything for him looks like a nail. So, a DJ at a party doesn’t need to be verified. It will be a bad evening, people do not dance, but that’s it. Likewise, let’s say a software developer, a bit more critical but typically also not that critical. Their team lead will quickly realise this person is no good and exchange them before major damage is done. But where verification really becomes relevant is, what we’ve learned, at the crosshairs of two dimensions. One is risk. The second is global professional migration. And if you look at how we define a risk industry, it’s typically, I think, what you might also have top of mind: it’s health, healthcare, a new one is telemedicine, quite interesting by the way, things like engineering, say a person who is calculating the statics of a bridge, they had better know their job, likewise an aviation pilot. So that’s sort of a risk industry. I guess that’s understandable. The second might not be that intuitive, but it’s really particularly relevant in a context of global professional migration. So what we also learned, say, I hail from Germany, well, there are also doctors there but they don’t really need to be verified because there is somehow a possibility to get an understanding of whether that person is real or not. And moreover, you go to jail for several years if you falsely claim to be a doctor. So, there’s a significantly high deterrent. But in a context of, say, a Peruvian doctor applying to a hospital in Germany or in Dubai, we wouldn’t know at all. Is he/she legit? And that’s where verification really kicks in and becomes important. 
Oscar: Yeah, exactly. And today we are in a more digitalised world and yes, people are relocating more often than before – one of your reasons as well. Yes. René: Absolutely. And in the context of that relocation, I think for us, it’s also important – again, we also and especially want to empower the individual – but it’s also balancing somehow the prerogative of the receiving country to make a decision: do we want that person in our country or not, based on their credentials? And that’s what we are kind of discussing all the time, to find this exact right balance. Oscar: So your main customers, if I understand, your main customers are the companies who are hiring, and also can be the government, let’s say the migration bodies of the countries, in the case of relocation? René: Yeah. So, if I may answer that at the end – if I may guide you through a couple of, I think, important blocks of how our service works, then I think the answer becomes almost self-explanatory at the end, whom we are serving. So again, everybody, and that’s the best part, can buy today their verifications on TrueProfile.io and own them forever and do with them whatever they like. So, what we have established is a sort of standard for document verification which we call a true proof. So that true proof is a single positively verified document, and that could be something along the lines of education, a university diploma, a professional reference letter, and something medicine-specific like a health license. Interestingly, we’ve also made a true proof around identity, your favourite topic, where we are working with a third-party provider from Germany. And again, in the context of the UK, it’s interesting: we are using, or they are using, the same standard that is required by law in the UK if you want to open a bank account online, to fulfil all the KYC requirements. 
So, again it ties in a bit to what we discussed before: your identity, really based on your, say, passport from a classic centralised government authority, now becomes, in the shape of this true proof, persistent, trusted and fully sharable under your control, and you can share your true proof either as a URL or a PDF. And speaking of blockchain, you can also share it into your self-sovereign wallet by uPort, where we have a cooperation and an integration. And we ourselves, again blockchain, add each fingerprint of a true proof onto the Ethereum blockchain into a smart contract that we have specifically developed. Now, I think a key question is always: why are we doing this? Not because we love to play around with the blockchain. I think again, it’s empowering the individual: God forbid, if one day TrueProfile went out of existence, any third party could still verify against the blockchain the veracity of your true proof that you have bought with us and see that this is legit. So, maybe just to go to the next level before we answer the customer part: you then have all these true proofs of yours that you bring together into your My TrueProfile, which again you can share – I guess that’s as close as it gets to the better LinkedIn profile, you can share that. We are also transitioning to include certain elements of a CV, an online CV, where we have a clear separation between statements that are verified and those that are not verified. And we even see that people are using that as a sort of document repository that they then decide to very selectively verify. Maybe also, to close the loop to LinkedIn, my last point here is that we are able to connect a specific statement on LinkedIn with a respective true proof through a URL. So, somebody sees a statement on LinkedIn, they would click twice and then end up on the true proof and be able to compare: is what has been stated on LinkedIn consistent with what is written in the true proof? So really trying to build this ecosystem out. 
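The fingerprint-anchoring idea René describes here can be sketched in a few lines. This is not TrueProfile’s actual implementation – the registry below is a plain in-memory set standing in for their Ethereum smart contract, and the names `fingerprint`, `anchor` and `verify` are hypothetical – but it shows why a third party could still check a document even if the issuing service disappeared: only a hash is anchored, and recomputing it needs nothing from the issuer.

```python
import hashlib

# Hypothetical stand-in for the on-chain registry; a real deployment
# would write fingerprints into a smart contract instead of a set.
anchored_fingerprints = set()

def fingerprint(document: bytes) -> str:
    # The fingerprint is just a cryptographic hash of the document,
    # so no personal data ever needs to be stored on-chain.
    return hashlib.sha256(document).hexdigest()

def anchor(document: bytes) -> None:
    # Done once by the verification service after the document has
    # been positively verified at the primary source.
    anchored_fingerprints.add(fingerprint(document))

def verify(document: bytes) -> bool:
    # Any third party holding a copy of the document can recompute
    # the hash and check it against the registry, issuer-free.
    return fingerprint(document) in anchored_fingerprints

diploma = b'Example diploma, University X, 2005'
anchor(diploma)
print(verify(diploma))             # True: the copy matches its anchor
print(verify(diploma + b'edit'))   # False: any change alters the hash
```

The design choice this illustrates is that tamper-evidence comes from the hash, not from trusting whoever stores it: a single changed byte produces a completely different fingerprint.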
Now, indeed, who are our customers? So we have a service which makes it really easy – verification on-demand for employers of all sorts, and I think you had already the right suspicion: it is recruiters, it is healthcare like Médecins Sans Frontières, Doctors Without Borders, one of our major clients, who are using this service when they just have a lot of candidates where they are not sure about a few of them or all of them and want them to undergo verification with us. And then the client would get the result of the verification. And, as I said in the problem statement, for us, it’s really important that we do away with this over and over repetitive verification problem. So, although the employer gets to see the result, the individual, the candidate, is also able and allowed to keep their true proofs on their My TrueProfile and carry that forward. So, it means that the next time they apply somewhere, they don’t have to do that over and over again because it’s a trusted standard that can be reused – if you kind of zoom out and see this on a systemic level, you are doing away with that friction. As DataFlow has been for 15 years now mainly doing verifications in the healthcare space, it was a very logical evolution for us to invite those candidates onto TrueProfile – and it’s important to get their legal consent – and to make them part of a recruitment database where healthcare recruiters can connect to them and solicit them for an employment. And especially in the UK context, we’re all aware that there are like 40,000 to 50,000 healthcare workers missing right now and we could contribute largely to help ameliorate that health crisis, let’s call it what it is, that has even compounded now during COVID-19. And there we’re working with leading recruiters that are using our database to get international talent and then hopefully bring them over to the United Kingdom. Oscar: Yeah, I can see many different types of organisations are your customers. 
Just one clarification: the end-user, so I can go and create my profile in TrueProfile? René: Yeah. Oscar: So I will need to pay for that, correct or not, as an individual? René: Correct. There are all sorts of variations that we have incorporated, so you can go there today and buy your true proofs and use them for however and how long you want. We also have the other side, that the business is paying for it but you’re then still able and allowed to keep it for free, so we can approach this from both sides. Oscar: Yeah, yeah I understand. OK. It sounds definitely very good. The individual who wants to take the initiative to be ready can pay; otherwise, when the occasion comes, the recruiting company will pay for that and you get this benefit for the future. Yeah, definitely it’s a super interesting service and is definitely filling many gaps, many needs that have been appearing in the last years about verification. One last concept I would like to ask you about is not completely related to this, well, not the recruiting, but still very topical today: it’s about the concept of immunity passports or health passports. What would you say about that? What are the challenges – the ethical, political, technical challenges, you can say? Some ideas about that? René: I’d like to start with that this should be part of a broader concept. It’s not really one silver bullet that will solve it all. I think we’ve seen a really miserable failure of our early-warning systems, and we have to put those in place first, and where we might come to the point of being willing to impose restrictions swiftly before we have to deal with what we are dealing with right now and have been dealing with for the last year. 
So without even going into the depth of the most important part, which is avoiding loss of human lives, I guess the economic damage in a full lockdown once the virus has become pandemic is by orders of magnitude bigger than, say, initial instances of closure of travel routes, masks and some precautionary social distancing, without then having to close schools or offices, shops and what have you, which is really the worst case scenario, which we also have experienced. So I think it’s important to understand how do we acquire immunity? So, one is that somebody has gone through COVID-19 successfully and got healthy again and has kind of acquired antibodies whose prevalence can be confirmed. There are some things we have also learned about this, that it’s not entirely clear how long these antibodies truly can prevent new infection, as we also know there have been cases of lapses and unfortunately also re-infection of the same person. So it’s really more about the second scenario of immunity, which is by vaccine. And we’ve just had the first vaccine, from Pfizer, emergency approved by the FDA, then followed by Moderna, and as this will become more broadly available, my best guess is that a health pass will really focus more on the vaccination part than on that of a naturally acquired immunity. So, how could that materialise? So, I remember when I was a boy in Munich, and I still have it today, this yellow book for vaccinations from the WHO, I think in the UK it’s a red book. And Oscar, you have something like that as well? Oscar: Hmm, there must be. I don’t remember. René: Ah, you don’t remember. OK. Oscar: I don’t know which colour it is. René: You know, OK. 
So, the point is now, with especially COVID-19, we have seen lots of instances – vaccination not yet being on the horizon – of at least antibody tests or even negative COVID-19 tests being issued for a few bucks, which is catastrophic, and we can all foresee that exactly the same is now about to happen with confirmations for vaccinations. So, I’d say this book doesn’t really do the job for a credible confirmation and verification of a vaccination that has occurred. So we need to become better than that. We need to move that somehow into the digital realm. And I guess the use cases are already showing up. Qantas, the Australian airline, wants a vaccination proof for boarding their planes. IATA, the airlines association, is now considering the same. We can also think of other use cases like a stadium event or something like a trade show, especially indoors, where the organisers want to see that a person has received the vaccination before they are allowed in. So, what we propose is that you should be able to expose your health status but nothing else. I think that is in the centre. It’s not about sort of other properties of your digital identity. At the same time, we should be aware that we as a society are then accepting a double treatment. I’m just speaking of what’s happening for people who have a vaccine, who might have been lucky to be ahead of the queue, and who now enjoy preferential treatment over others who have not. We have to have a debate how that works out and if we are willing to accept that. And in spite of always being a friend of technical solutions to problems that are very practical and all the bravado, I clearly also see the other side, that this pandemic has been an extremely unlikely, rare occurrence, almost something like a black swan. 
And if you hold in general a pro-freedom world view, then there are challenges, namely that all these things like centralisation, surveillance, deplatforming really get on steroids, where a real health crisis no doubt is used as a pretext for governmental overreach which we have close to never seen and which typically doesn’t roll back after the root cause has ceased to exist. And it’s like these emergency powers for the state that eventually never get rolled back. So, I guess the key question we should ask ourselves is how do we balance this kind of need to return to some sort of new health normalcy with our liberties and our freedoms? And I think we should have an open debate where all voices should be heard, without those who don’t just parrot mainstream media opinions being silenced upfront. And then I’m optimistic still that we’ll move to the best possible equilibrium. Oscar: Yeah, definitely. Thanks for sharing that. Yes, for instance, when you were explaining that I was imagining going to a trade show with my immunity passport, which would probably be an app or something digital, and the app should just reveal: did I get the vaccination, yes or no? That’s it. René: Exactly. And make sure that people know that this was really you. I think those are the two things to check on the spot. And once that question is answered, that should be all fine. And ideally, it should also be deleted and just be used for the very point of entry. Oscar: Well, hopefully we’ll have these solutions in the short term, we definitely will need those. René, finally, I would like to ask you a final question. Imagine for all business leaders that are listening to us right now, what is the one actionable idea that they should write on their agenda today? René: You’ve been often asking the question around protecting the digital identity and I would like to start there and then move to another point. 
It’s really critical for business leaders themselves, but also for their teams and their organisations, that they take digital identity more seriously than ever. On a personal level, it really means: imagine your personal worst-case scenario, somewhere on holiday with your computer and your mobile phone in a car, and it gets stolen. Think of that, and I’ve done that by the way myself, and ask how would I get back into my services, from iCloud to, I don’t know, Dropbox, to all sorts of services. Try to work through that scenario, spend half a day, write it down and then work backwards to how you set up your system. The same applies at company level, to company continuity, which then extends to: how do I prevent getting hacked? Or, even easier nowadays, because mostly the human is the weakest link, through some social engineering attacks. Make your organisation really robust; get security on every level to the highest possible level. My second point is that I think we are now living in an era of transition, and cryptocurrencies based on blockchain have been the most fundamental invention since we’ve had the internet, starting somewhere in the ‘90s. I would recommend to every business leader: acquaint yourself with these concepts. They are not that difficult to understand with some fundamental interest in maths, technology and economics, in particular microeconomics. How does the blockchain work? Why does it work as it works? How could Bitcoin and Ethereum play out? What is Ethereum, which is more than just a cryptocurrency, a whole ecosystem? And there are a whole host of other applications that are now coming up on the blockchain, from DNS services to funding services, to social networks. Not all of them will work out, but the reason I’m saying this is important is that I think it gives us an unprecedented chance to reclaim parts of our liberty and thus parts of our autonomy. And that’s something I personally hold extremely dear. 
Oscar: Yeah, thanks. Thanks a lot for this. Yeah, definitely, when you mentioned having business continuity as an individual, well, yes, you are right about that. OK, and blockchain is a thing that every business leader should write on their agenda for this new year. Excellent. Thanks a lot, René. It was really very interesting to hear what TrueProfile is doing and how the same concepts also apply to all the other challenges that we have, including the health issues. So please, let us know how people can get in touch with you or follow you, what are the best ways for that? René: I’m on Twitter, just my first name and last name together, @RenéSeifert. You’ll find me on LinkedIn, and if you have anything where I can help, feel free to DM me, happy to reply and help out. Oscar: OK. Excellent. Again, René, it was a pleasure talking with you and all the best and Happy New Year! René: Yeah, the pleasure was entirely on my side. So thank you very much Oscar and all the best for 2021! Let’s hope we get over this health crisis and get to some sort of new normalcy again. Oscar: Thank you. René: Thank you. Thanks for listening to this episode of Let’s Talk About Digital Identity produced by Ubisecure. Stay up to date with episodes at Ubisecure.com/podcast or join us on Twitter @Ubisecure and use the #LTADI. Until next time. [End of transcript]",https://www.ubisecure.com/podcast/digital-identity-uk-2021-ssi-trueprofile-rene-seifert/,,Episode,,Ecosystem,,,,,,,,2021-01-20,,,,,,,,,,,,,
|
||
Ubisecure,Ubisecure,,LTADI,,,,,,,"Enhancing the Privacy of Mobile Credentials, with John Wunderlich","what are the challenges and solutions surrounding mobile credentials, what is IAM’s role in this and how systems need to be developed around trust.","with John Wunderlich, Information Privacy and Security Expert. Join Oscar and John Wunderlich in this week’s podcast episode, 71, as they discuss mobile credentials – what are the challenges and solutions surrounding mobile credentials, what is IAM’s role in this and how systems need to be developed around trust. [Transcript below] “So, you have different levels of assurance in the physical world, just as you do in the digital world. So, anybody can issue a credential, the question is what level of authority you give to the credential.” John Wunderlich is an information privacy & security expert with extensive experience in information privacy, identity management and data security. He has designed, built, operated and assessed systems for operations and compliance in the private and public sectors for over 25 years. This included working or consulting for Fortune 500 corporations, government ministries, small companies, volunteer organisations, regulators and health system organisations of all sizes. Connect with John on LinkedIn and Twitter or email him at [email protected]. This is the report on Mobile Driving License Privacy: kantarainitiative.org/download/pimdl-v1-final/ We’ll be continuing this conversation on Twitter using #LTADI – join us @Ubisecure! Go to our YouTube to watch the video transcript for this episode. Or subscribe with your favorite app by using the address below Podcast transcript Let’s Talk About Digital Identity, the podcast connecting identity and business. I am your host, Oscar Santolalla. Oscar Santolalla: In recent years, there have been organisations across the world creating, for instance, mobile credentials, and specifically mobile driving licenses. 
So, we’re going to discuss this topic, and also the privacy side of this super interesting system that has been around. And for that, we have an expert who is joining us today. My guest today is John Wunderlich. He is an information privacy and security expert with extensive experience in information privacy, identity management, and data security. He has designed, built, operated, and assessed systems for operations and compliance in the private and public sectors for over 25 years. This included working or consulting for Fortune 500 corporations, government ministries, small companies, volunteer organisations, regulators, and health system organisations of all sizes. Hello, John. John Wunderlich: Hi, Oscar, how are you doing? Oscar: Very good. It’s a pleasure talking with you. John: Likewise. Oscar: Fantastic. That’s a super interesting topic we’re going to discuss today, mobile credentials, so yeah, let’s talk about digital identity. But first, of course, we want to hear something a bit more about you as a guest. So please tell us your journey to the world of digital identity. John: Long story short, I used to be a Corporate Systems Administrator, Network Administrator and Operations Manager, and when the Federal Privacy Law in Canada was introduced, I took that on as a project at my company, and it turned into a career. When I moved from the corporate side to working for a regulator, I first met Kim Cameron, a name that most of your listeners will know, working with the Privacy Commissioner of Ontario, shortly after he introduced the Seven Laws of Identity. And around the same time, my former boss introduced the idea of Privacy by Design. So, for me, going back 15, 16 years, privacy and identity have been in lockstep. There’s a very large Venn diagram overlap between the two. And I’ve been consulting and working on standards and volunteer areas in that joint area since then. Oscar: Excellent. 
Yes, just a few years ago, maybe almost two, a bit more than two years ago, we met in Kantara Initiative, in one of the working groups, and you are super involved there. And I know that recently, you and other authors have released a document called Privacy and Identity Protection in Mobile Driving License Ecosystem. So first of all, kudos for that very good report. John: Gracias! Oscar: I’ve read at least part of that and definitely I want to hear more about it today. So going into those specific mobile credentials, mobile driving licenses, I would like to start by hearing a simple definition. So, what are mobile credentials? John: Well, at the very highest level, a mobile credential is a bundle of attributes about an entity that sits on a mobile phone, or a mobile device. That’s at a very high level. And it then sort of branches down from there. You look at wallets, or you can look at flash passes, or you can look at mobile driving licenses, which have a specific ISO standard. You can talk about the W3C verifiable credentials data model. You can talk about decentralised IDs, the W3C DIDs. So, it branches out into a whole grouping of things, but it ends up being bundles of attributes on a mobile phone. Oscar: In practice, that replaces holding a driving license, the physical driving license. John: I think in any substantive way, it doesn’t replace it yet. I mean, we’re talking about in-person presentation on the phone. In the workgroup that I lead that’s working on a recommendation report at Kantara, we’ve kind of matched the scope of ISO 18013-5, sorry for the ISO number reference, which is mobile driving licenses on the phone. It’s not the online presentation, which helps reduce the problem space significantly. But I think it is the case that most credentials like that issued on the phone still have physical analogues. Oscar: Yes. 
John: So that people have a choice of presenting. For example, in Canada, health is a provincial matter, so depending on the province, you would be issued a QR code for your COVID vaccination status, and you could print that out on paper, or there were apps where you could present it on the phone, so people did both. So, I think in the space of the mobile credential, mass adoption is still sort of dual track, not replacing but supplementing. Oscar: Exactly. So yeah, whichever is available or convenient for the individual. And tell me, because I know in some countries, you mentioned Canada, these types of mobile credentials, more specifically the mobile driving licenses, are already being issued, though not everywhere in the world. Normally, who are these mobile credentials issued by and how do they work in practice? John: Well, anybody can issue a credential. The analogy is weak, but I still think of a bundle of credentials, a bunch of attributes, as the digital equivalent of a business card. I can print my own business card. But the level of assurance and level of authority is pretty weak on a business card, as opposed to a laminated plastic with a hologram issued by a government for a driving license or a proof of citizenship or a medical card, or something in between that might be issued by a university or a company. So, you have different levels of assurance in the physical world, just as you do in the digital world. So, anybody can issue a credential, the question is what level of authority you give to the credential. Oscar: And we can say that right now the mobile credentials with the highest level of assurance are issued by governments. John: Yeah, I don’t think that’s going to change. 
The government, possibly financial institutions, possibly health institutions, but they are issued by institutions that have the capability of doing identity verification on initial issuance to provide a very high level of assurance that this credential is associated with this entity. Oscar: Sure. Tell me a bit about the practicalities, because I have never used a mobile driving license, at least. How does it work in practice when people who drive want to present the credential? John: Well, let’s talk about two use cases for a mobile driving license. One is proof of authority, that you’re, in fact, a licensed driver. So, you’re pulled over by the police and asked to present your driver’s license. Right now, you present a plastic laminate card; they look at it, look at the picture, may go back to their squad car, and do a check to make sure that there are no problems with that license, that there are no warrants issued against your name, all that good stuff. We’re presuming here that this is a legitimate traffic stop. I have a lead foot, they pull me over for going 80 in a 60, or whatever. And all of that is authorised by law. So, they get a full set of the credentials that are on the card and are able to access their backend system to get further information about me. So that’s fairly intrusive. But that’s what you would expect when you are a potential violator of the law. As opposed to the more common use case for a driving license, which is age verification. So, it’s long past the age where I get checked when I go into an establishment where I could buy alcohol or cannabis. But that’s the joke, driving licenses get used much more for drinking than they do for driving, because people need them to buy alcohol. 
So right now, if I’m a just barely legal age young woman, because this is the edge case that causes real difficulties, and I’m trying to enter a club, I have to show the doorman or the bouncer at the front of the club my entire driving license. Now, what he should be doing is looking at the driving license, looking at the picture to make sure that I’m actually the person I purport to be, and then looking at the date of birth and making sure that I’m old enough in that jurisdiction to enter the club. That’s it. That’s all, and no retention of information. The problem, the dual-edged nature of the digital driving license, is this. If you do it right, here is all that shows up on the device that the doorman is using to verify your digital license. You present a QR code, or there’s NFC, where you wave your phone near their receiver the same way you do a tap to pay, or Bluetooth, all in the MDL specification. All of those are allowable; the Bluetooth and the NFC are preferable. And what shows up on the doorman’s verifying device is a picture of the person whose license it is, so that they can do the proof of presence – yeah, this is indeed the person I’m presented with – and a green checkmark that says this person is old enough. So, as opposed to the analogue physical driving license, he has no opportunity, should he be amorously struck by the person, to grab her address or follow her or grab her phone number or do any of that stuff. And the system operated by the bar also cannot record anything other than that there was an age verification for a person, or they can–. On the other hand, depending on how the presentation goes, the backend system for John’s Bar and Grill could collect all the credentials that are on that driving license, even if it only shows the doorman the checkmark, and send me marketing email and all that other kind of stuff. 
So, you can do it in a really privacy protective manner, or you can take the bad edge case of presenting a card and automate that for negative consequences. So obviously, the Kantara workgroup that I lead for reports and recommendations on privacy enhancing mobile credentials is trying to come up with a set of recommendations for providers of verifier devices, issuers, providers of wallets and designers of mobile credentials on how to protect the privacy of the individual above and beyond the transaction. Oscar: Yeah, definitely. In this example of going to a bar, you illustrated how it works, but also the privacy implications. John: Personally, I’d prefer some of the European countries that don’t have age distinctions. If you want to send your six-year-old to the bar to pick up some wine, then… But I think there are cultural norms in those countries where if you’re publicly intoxicated, you’re subject to social ridicule. There’s no social benefit to being inebriated. Anyway, that’s a side-line. Oscar: Yeah. And definitely you started explaining there some of the challenges about privacy. What would you say are the main ones, if you can just summarise the main challenges about privacy? John: Well, the business culture, surveillance capitalism, if you will, I know that’s a fraught phrase. When I was running networking systems as a systems administrator in the ’90s, I was issuing X.509 certificates and I had all my user data, and I had access to it. Which kind of made sense in mid to late ’90s systems. And we were a B2B company that I was working for, but we were processing the personal information of our customers’ clients. So, we also had a proprietary interest in that. So, coming into 2000, there’s this sort of culture that data about people is an asset owned by the companies and can be used to those companies’ advantage. Because systems were centralised and few, privacy issues didn’t really raise their head. 
There was good confidentiality in most companies, but not privacy. And then, with the explosion of systems, and the introduction of a very interconnected backend, and especially with the introduction of monetising data through behavioural advertising and tracking, the entire thing got out of control, and we need to wrest back control of our own information. The challenge is cultural as much as anything, like the business culture around personal data. And the business models, which are, I think, not sustainable. Oscar: Yeah, definitely. And could you now tell us about the solutions? What are the solutions to those challenges you just mentioned? John: Well, I think the solutions are two or three-fold. Legislation like the GDPR in Europe, or, a day or two ago – I don’t know when this is going out, but just today I heard the news that in the US, there was an agreement that might lead towards a Federal Privacy Law in the US. But all of those laws have the same flaw, which is that they’re built on the idea of Fair Information Practices, which came out of a 1970s vision of the way computer networks and systems run. Which is what we know now to be a completely flawed notice-and-consent model, where the organisation is a trusted organisation. It provides you notice of what they’re going to do with your data, they get your consent, and then they take possession of your data and act as a trusted custodian of your data. That was fine when there were a couple of hundred mainframes scattered around the world and IBM, [0:13:31] [unsure] on most of them; it doesn’t make any sense now. So, there’s a new type of regulation that’s needed. But there’s a new type of standard that’s needed as well. So, the new type of regulation is what I’ve started taking to calling a “digital building code”: just as you shouldn’t build a house with electrical connections without making sure that your electrical infrastructure meets the building code for your jurisdiction, the same should apply to digital systems. 
I think that there need to be digital building codes so you can’t build a system that’s going to process personal information without meeting a certain minimum set of standards for protecting it, so that people don’t have to try and read privacy policies to give notice and consent. That should be matched by standards and business culture to meet those floor standards. So, it’s a tripod, if you will: the complaint mechanisms in the current regulations, safety regulations to make sure everybody is operating off the same floor, and standards to enable developers and companies to meet those. And I’m working on the standards side. Oscar: OK. You mentioned that companies tend to use the data of individuals as property. When we were discussing before this interview, you mentioned the role of IAM in this, what is your view on that? John: Well, I remember talking at IIW a few years ago, it wasn’t about SSI, as this was pre-SSI, but it was about a product in the system that enabled companies to give control over some portion of their customers’ data to the customer. So, financial institutions are highly regulated, and oftentimes have a requirement to send paper to their customer’s home on a yearly or quarterly basis. If you have the wrong address, it’s very expensive; you have this address in your database for the customer, and if you’ve got it wrong, you eat the cost of reprinting, resending, bringing back the data that you sent to the wrong address. The customer neglected to update you on their address, and so forth and so on. It costs millions of dollars to some of these companies every year, just in data around customer addresses. 
So, this was a system where, instead of a customer requesting an address change, the company gave control of the address information to the customer, so that if the customer didn’t update the data (so with power comes responsibility), then the customer would be charged for the delivery to the wrong place, the reprinting and redelivery charge. So that de-risks that company on that point. In that particular system, I had also enabled the customer to update their address with multiple companies at once if they were participating in the protocol. So, you can de-risk your data, especially against data rot, by actually giving control back to the person who knows it best. Now, you’re not going to do that for everything. I worked in payroll and HR, so theoretically, customers could also update their financial information, like, “Oh, I changed account, let me put in a new account number.” But anybody who’s worked in payroll and HR knows that you want to verify that, so you have to balance it. But giving control of customers’ data can save you money and improve the quality of your data. And I think that was one of the arguments that SSI people don’t make often enough: just the simple, reduced cost and business risk of putting people in control of aspects of their own data. Oscar: Yeah, exactly. Regarding the report that you built at Kantara, tell me more about that. How did the work go, and can you tell us the main findings? John: The way Kantara works, there are two kinds of groups: discussion groups, which produce reports, and then workgroups, which produce recommendations that have requirements for conformance that can be tested. So, your audience may well know the identity assurance trust mark that Kantara can issue if you want to make claims about your level of identity assurance for your firm. 
So, the discussion group produced a report on protecting privacy and identity in the mobile driving license ecosystem, and that report is available on the kantarainitiative.org website for download, so you can take a look at it. The workgroup is trying to do something a little bit more ambitious. It’s trying to address how you meet the reasonable expectations of privacy in the issuer, holder, verifier triangle where Alice is the holder. How can she trust that her reasonable expectations of privacy are maintained in the transactions after she’s been issued a mobile credential, that the credential won’t be abused in some way? Because most of the standards are transactional, which is to say that if you use the ISO MDL standard, for example, it talks about the interfaces. It has good requirements around data minimisation, and notice and consent, but it’s all around the transaction. It doesn’t speak to, and it’s not supposed to, it’s scoped out of it – it doesn’t speak about, if John’s Bar and Grill is the verifier and they’re using a system issued by ‘insert your mobile payment system verifier here’, how both that system and John’s Bar and Grill, once they’ve gotten the identity attributes from the transaction, use those in a privacy protective manner. If you’re building a wallet or building a mobile credential for a phone, how do you do that so that Alice can trust the wallet not to share the information for advertising on the backend, and so forth, and so on? So, building human trust between entities is how I like to summarise it, rather than technical trust between endpoints, which is a lot of what most of us work with. Oscar: That’s a very interesting distinction you make, right? Trust is something technical for most of us. John: Well, sure. But if you think about it, zk-SNARKs or zero trust solutions make perfect sense, right? If I went back to being a Network Administrator, I’d build a zero-trust network. 
I would assume that there was an APT inside my network and there would be zero trust between the endpoints, and everything would be cryptographically signed and verified and yadda yadda yadda. But in the real world, and this is – I steal this example from Bruce Schneier – if there was zero trust, nobody would ever cruise through a green light. If you were driving today, how many times did you sail through a green light without looking to see if some idiot was ignoring the red light and ploughing through? I mean, it happens every once in a while, for a variety of reasons. But by and large, if you’ve got a green light, you trust that all the other drivers on the road are following the rules and you just go through. That’s trust. That’s human trust. It’s built on systems that have all kinds of safeguards to make sure that you don’t have a green light going four ways. But once the systems are working, people can trust each other using those systems. And we do not have that on the internet. There is essentially zero people-trust, and sadly, that’s been earned by the behaviour of a number of ecosystems that handle personal data. Oscar: Indeed. So, you have seen this difference in the understanding of trust between, let’s say, the identity professionals who are the ones building the systems and, let’s say, the majority of people. Have you noticed that this makes things complicated for developing the systems? John: Oh, yeah. Yeah. I mean, the story from surveillance capitalism, companies that depend on advertising, is: we never sell your information, or we never share your information. Which was true in a sense, in that if you went to site X, or site F, or site G – insert whatever company – and you went to that site, then behind the scenes, real-time bidding occurred; they didn’t share your information with any of the advertisers. The advertisers said, I want to put my ad on profiles that meet these parameters. 
And there would be real-time bidding. So, at one level, that held until you clicked on the ad, at which point your positive action of clicking on the ad created a relationship between you and the advertiser, and it’s out of system X or system F or system G’s hands; you’ve now done something, and that advertiser or the publisher has access to your information, because you clicked on their ad. So, in a very narrow, untrustworthy sense, yeah, they weren’t selling your data. Although a lot has come out now about the way real-time bidding works, and how much information is shared with the 100 bidders who get to see some bit of your data to bid on that page, and only one of them, the one that actually wins the bid, actually gets the data. It’s a snake pit behind the scenes. Oscar: Indeed. Well, thank you, very interesting, what you are sharing with us about mobile credentials, privacy and how things are progressing. Could you tell us now, for all business leaders that are listening to us, what is the one actionable idea that they should write on their agendas today? John: The actionable item is: do you know who you share your customers’ data with? And why you shared it? And would you be comfortable sharing that information with your customer? So, the answer to all three of those questions should be yes. Yes, I know with whom I shared it. Yes, I know what I shared. And yes, I’m comfortable letting the customer know that I’ve shared it. If any one of those answers is no, then there’s going to be a reckoning at some point with your customers, or with the regulator, or with your business model. Oscar: Yup. Three questions that, as you said, should all be yes, absolutely. John: Those three yeses de-risk and future-proof your organisation. Oscar: Exactly, exactly. Well, thanks a lot for putting it in a very, very clear way for everybody who is listening to this. Well, thanks a lot for sharing all this super interesting information about mobile driving licenses. 
And of course, I recommend it, so we’ll put in the link to the Kantara Initiative report that John has co-authored, super interesting. John: And a link to the workgroup, hopefully, because I invite anybody who is interested in this to join the workgroup, because you can start to help shape the requirements for the standard to come. Oscar: So, what is the name of the workgroup? What is the exact name? John: The Privacy Enhancing Mobile Credentials workgroup. Oscar: OK, perfect. We will add the link as well. Fantastic. So, John, it was a pleasure talking with you. Is there something else that you would like to tell, or how people can get in touch with you? John: Probably the simplest way is I’m @PrivacyCDN on both Twitter and LinkedIn, P-R-I-V-A-C-Y CDN. That’s a play on words because in Canada, CDN is sometimes used for Canadian. But in the rest of the world, it means Content Delivery Network, so I thought that was an interesting pun. Anyway, @PrivacyCDN on LinkedIn or Twitter. Oscar: OK, I didn’t know that CDN, so it’s good to know, thank you. Again, it was a pleasure talking with you, John, and all the best. John: Take care, Oscar. Thanks for listening to this episode of Let’s Talk About Digital Identity, produced by Ubisecure. Stay up to date with episodes at Ubisecure.com/podcast or join us on Twitter @Ubisecure and use the #LTADI. Until next time. [End of transcript]",https://www.ubisecure.com/podcast/mobile-credentials-john-wunderlich/,,Episode,,Ecosystem,,,,,,,,2022-06-22,,,,,,,,,,,,,
|
||
Ubisecure,Ubisecure,,LTADI,,Verimi,,Germany,,,Germany’s digital identity landscape with Verimi’s Roland Adrian,"In episode 40, Roland fills us in on how Verimi works and its privacy-by-design cornerstones, including data minimisation. Oscar and Roland also discuss the digital identity landscape in Germany","with Roland Adrian, Managing Director at Verimi. In episode 40, Roland fills us in on how Verimi works and its privacy-by-design cornerstones, including data minimisation. Oscar and Roland also discuss the digital identity landscape in Germany and how it’s been affected by the pandemic, plus the future of identity in Germany and what needs to happen next. [Scroll down for transcript] “Customer experience is king at digital identity. And really, technology, security, privacy, whatever it is – it’s important, but in a sense it’s a commodity.” Roland Adrian has been Managing Director and Spokesman of the Management Board at Verimi since January 2019. Previously, he was Managing Director and Spokesman of the Executive Board at Lufthansa Miles & More GmbH for four years. The business degree holder started his career in 1996 at Roland Berger Strategy Consultants in Munich. After holding leading positions in the KarstadtQuelle Group, he built up the HappyDigits bonus programme from 2002 as a joint venture between Arcandor AG and Deutsche Telekom AG. In 2009, he moved to PAYBACK in Munich and from 2010 focused on the launch of the programme in India. As Vice President, he led PAYBACK’s expansion into various markets worldwide. Find Roland on LinkedIn or email him at [email protected]. Verimi is the European cross-industry identity and trusted platform. Verimi combines a convenient central login (Single Sign On), the highest data security and protection standards in line with European law and the self-determination of users regarding the use of their personal data. Verimi was founded in the spring of 2017. 
The identity and trusted platform is supported by a network of thirteen international corporations. The shareholder network includes Allianz, Axel Springer, Bundesdruckerei, Core, Daimler, Deutsche Bahn, Deutsche Bank and Postbank, Deutsche Telekom, Giesecke+Devrient, Here Technologies, Lufthansa, Samsung and Volkswagen. Verimi is a Ubisecure partner. Read more about the partnership in the press release: https://www.Ubisecure.com/news-events/verimi-partnership/ We’ll be continuing this conversation on LinkedIn and Twitter using #LTADI – join us @Ubisecure! Go to our YouTube to watch the video transcript for this episode. Or subscribe with your favorite app by using the address below Podcast transcript Let’s Talk About Digital Identity, the podcast connecting identity and business. I am your host, Oscar Santolalla. Oscar Santolalla: Hi, and thanks for joining. Today, we are going to hear about the digital identity landscape in Germany. And for that, our special guest is Roland Adrian. He has been Managing Director and Spokesman of the Management Board at Verimi since January 2019. Previously, he was Managing Director and Spokesman of the Executive Board at Lufthansa Miles and More for four years. He built up the HappyDigits bonus programme from 2002 as a joint venture with Deutsche Telekom. In 2009, he moved to PAYBACK. And as Vice President, he led PAYBACK’s expansion into various markets worldwide. Hi, Roland. Roland Adrian: Hi Oscar. Oscar: Nice talking with you, Roland, and really happy to hear what is going on in Germany in terms of digital identity and everything related to that. And happy to know more about Verimi. I’ve been hearing about Verimi for the last few years and definitely need to hear more details. What are the products you are building and offering today? So please, tell us a bit about your journey, how you became the Managing Director at Verimi. Roland: Yes. Thank you, Oscar. And many thanks for the invitation. 
Glad to be here and talk to you a little bit about the market in Germany. So yeah, what was my journey becoming Managing Director of Verimi. Actually, my journey, professional journey, started 25 years ago when I started my career in consulting. Then some stations at Karstadt which is a department store group. And then I founded a multi partner loyalty scheme together with Deutsche Telekom. And from there, I moved to PAYBACK which actually is Germany’s leading multi partner loyalty scheme. They are in quite some markets worldwide, in India, Mexico, Italy, US. And from all the travel I then got introduced to Lufthansa, of course, and became the CEO of Lufthansa Miles & More. And during that time, actually, I realised that the future is more and more about seamless customer experience. Because if you look at Lufthansa, in many cases, the real loyalty benefits that you can get there, they actually translate into a real seamless customer experience that you get. If you look at all the fast tracks for security and immigration, priority boarding, preferred seating in the plane, it’s actually – the customers tend to reward benefits in their experience much more than any loyalty currency. And so at the moment, where then Lufthansa invested into the Verimi idea, for me, it was very clear that this would be an exciting next step for me personally. So I decided to switch over to Verimi to be the CEO of Verimi, and to push forward digital identity to provide seamless customer experience for the users. Oscar: And I can induce that Lufthansa is one of the founder organisations behind Verimi, but tell us more please for the ones who are not familiar with Verimi, please tell us what Verimi does. Roland: Yeah, in fact, Lufthansa is actually one of the investors and we have altogether 13 very large companies in Germany that invested into the Verimi idea. 
And the large companies are renowned brand names such as Allianz, Deutsche Bahn, Deutsche Bank, Deutsche Lufthansa, Deutsche Telekom, Daimler, Samsung, Volkswagen. So, all very large companies that invested into Verimi to establish a wallet of digital identities. So that was their driving force. And I think when we will talk about the market a little later on, we will see that it was a very good moment to invest into such platform because the market urgently requires the platform and there’s pretty much an empty space currently in Germany. And what we provide as Verimi is this one click digital experience for verified identification within our partners’ use cases. And at the core of it all, is an identity platform, of course, that matches all the regulatory requirements for anti-money laundering or eIDAS substantial. And this comes along with the solution for strong customer authentication. Because the critical part of such a platform is not the identification of customers itself but actually, the critical part is the reuse, and that means the re-access to the digital identity. And then there are services that actually build on the digital identities such as electronic signatures, or secure payments, and actually Verimi offers products in all these four fields, that means for identity, so just identifying customers for strong customer authentication, for electronic signatures, and for payment solutions. And this is the platform that we have, so how do we bring customers on that platform? Actually, invite the customers to store their verified identity data with Verimi. And verified identity data, what can that be? We all know that’s ID card, it’s passport, it’s driver’s license, and it’s many attributes that are not such in governmental ID, but that are just verified IDs, like verified email, verified phone number or even in Germany, the tax number. And customers can actually come to us on Verimi.com and store all their data and everything they wish to store there. 
But our experience is that customers don’t come without use case. So from my personal loyalty background, I oftentimes compare it to situations you wouldn’t register for a loyalty scheme, if you’re not planning to shop or to travel with the partner where you can actually go and can collect the miles or the points. And very similarly, nobody would come to Verimi if he doesn’t plan to actually use his identity data somewhere. So from our experience, we learned that it’s crucial to be integrated into the use cases of our partners. So once you actually need to identify for the use case, we offer the customers to store their data with Verimi. So either they need to identify and haven’t stored with Verimi before, so then they can do it. And if they come, and they already stored their identity before, then they can just benefit from the one click identity that we offer. So we actually provide a much better experience for the user, because instead of having video IDENT or photo IDENT, or whatever ID procedure is there, it’s a one click, seamless experience. And for our partners, we can enable them that they offer much better services to their customers. They minimise barriers to entry that come from any kind of identification process. And maybe even more important, we minimise transaction costs for identification, because reuse of a digital identity, of course, is much more efficient than any kind of first time identification. Oscar: Yeah, definitely. Sounds like an excellent offer of identity for anybody who is at the moment in these brands, in these companies that have invested but I’m sure it is planned to be anywhere in – for now, in Germany, I guess. Some of these big names you mentioned, some of these big names, big company that are behind Verimi, some of them already have launched services in which the users can use Verimi? Roland: Yeah, absolutely. I mean, we launched the Verimi platform some two years ago, actually. 
And we just launched it with a single sign-on feature so that you can use a classical single sign-on at the various partners. That was our MVP to really prove that the platform is there, that it’s stable, that it’s operational. The actual product that we are all aiming at, meaning the identity product we just launched in spring last year. So we are live for a little less than a year currently. And we already have the first partner integrations and kind of minimum viable product setup. So there is an MVP in the market, where we now showcase all the various use cases that we can deliver. And the use cases just technically they split up into that we can do it in the web, that we can do it in the app, that we have an SDK, that we can do it as an in-line registration, what I just meant that you can register within your process or that you can have the one click. And for the registration within the process, we’re also applying various identity methods. So that means we offer a bank ID where you identify with your current account with the bank, we offer a photo IDENT solution where you take a picture of your ID documents, we also have video IDENT solution. And of course, we also have an e-ID solution. This is the governmental solution. And this we showcase throughout the various use cases with a little focus on our shareholders, but we also have partners from outside of our shareholding group, where we now have a nice set of use cases and this year is about the scaling of all these use cases with more and more partners. Oscar: So they are available. And for the end user, if someone wants to go to Verimi and create their identity, so that can be done at any time. Roland: Anytime, you just go to Verimi.com, you create your account and you can do that in the web or on the app. Then you can choose whatever you want to put in your wallet. So you can verify your passport, you can verify your bank credentials. You can verify your email. 
So it’s a little that you can play around, just verify your data. But as I said, without having a use case, it’s more maybe a gamification that you can play around and try it out. But we see that the most customers come in if they really have a use case. Oscar: Completely understandable. Yes. So we often talk in this podcast about the concept of privacy by design in the identity industry. And particularly in Germany, given its reputation, this country’s reputation for a strong focus on data privacy, how do these tie into what Verimi is aiming to achieve? Roland: Yeah, and that’s a very good question. And absolutely, Verimi is what you can say, privacy by design. And the focus in Germany, the Germans tend to be very sensitive about data privacy, but it’s also the reputation of our shareholders, as you can imagine. I mean, they are not investing into a joint venture with all these large companies to then create any kind of sense that there is conflict with data privacy. So this means for Verimi privacy is done by design. And there are three cornerstones when we say privacy by design at Verimi. So three cornerstones, and the one is a clear policy that there’s no advertising policy with Verimi. So we don’t offer any services or infrastructure for any kind of customer profiling or commercialisation of customer data. No, it’s just for the identity wallet that the data are there, nothing around. The second cornerstone is secure storage. So, data is encrypted by private keys. And these private keys are securely stored locally on the smartphone of the users. And the entire architecture that we chose is as secure as possible and at the same time, as convenient as possible so that it’s a solution that people want to use. And all this, needs to meet the current regulatory context, which is quite unique in Germany, I believe. It’s unique and also complex. And we are meeting all this context as convenient and as secure as possible. 
And the third cornerstone is the data sovereignty for the user. So, it’s very important for us that we are always fully transparent to the user. We tell him exactly what data he has stored in the wallet, what data he actually shares with the partner that he chooses. And we show very transparently a transaction record so that the customer is informed what happened. And there’s nothing hidden, nothing aside, no commercial benefit aside that we have transferred the identity data to the partner as the customer wanted. And of course, the second key word is principle of data minimisation. It is data minimisation, also by design, because it is the wallet of the user. So whatever is in the wallet, actually, the user put in the wallet. The user decides what to put in. We are not collecting anything in the background, where we then need to say, “Oh, we just can collect minimal dataset.” The user decides what to put in. And the user also decides case by case, what to share with whom. And that is very transparent. And then the principle of data minimisation goes more to our partners, because they can only request a minimal dataset. And if they request more in our processes, they ask the customer if, you know, for any use they could also get this or that data, but the customer can say “No, I don’t want to give you that data”. Oscar: Oh, yeah, definitely. I can see that there’s really good designs in terms of privacy and data minimisation in Verimi’s service. Bringing back the attention to the business landscape, could you tell us how is the wider digital identity landscape that Germany has today and if it has been affected, or how it has been affected, by the pandemic? Roland: Yeah, sure. I would say digital landscape in Germany or digital identity landscape in Germany, overall digital identity has not really arrived in Germany. Where does it come from? I mean, if we look at the market, we see that we have a governmental e-ID scheme. 
And that scheme is there for more than 10 years now. And according to the latest studies, less than 6% of citizens actually use it. So that’s the one from the governmental side. And then compared to other European countries, if you look at the banking industry, the banking industry has not set up something like a bank ID that we see in many other countries. And the reasons for that are very specific if you look at the German banking landscape, how the structure is and how many players are there and how these players work with each other. And then as the third pillar in Germany, or the third aspect to look at, we have several Identity Service Providers that actually use this gap that there is not a governmental e-ID scheme or bank ID, Identity Service Providers came up. And they actually verify the physical ID, so the ID card, or the passport or the driver’s license, and confirm the data for the digital process. And these processes look like a very – a video IDENT, for example, where you have a call with a video agent, and then you show your ID card and everything and the video agent says “OK, yeah, looks perfect.” Or you have a photo identification, where you take pictures of your ID cards. Or you have a very, I think German phenomena, where you go to the postal office and show your ID card in the postal office for one time process and then the postal agent says, “OK, I saw it, I can confirm you have it”. And all this is for one-time specific use case only. So there’s nobody really in the market that stores data for digital ID. Oscar: So it needs to be done, again, for instance, if I need to do another identification, I need to go to the post office again. Roland: Exactly. So if you have like two or three transactions, if you happen to have two or three transactions the same week, you can go to the post office each and every day. And the procedure will start from scratch. 
I mean, you can also collect them and show the three transactions at your one visit but each and every transaction will be captured individually. So there is no synergy. And then there is no storage. So, if the week after you have the fourth process, you need to go to the post office again and start from scratch. Yeah, and this shows as a result, the user has to identify himself over and over. So for every use case, and that’s not convenient. And that also creates very high barriers to entry for the merchants in the market. Because every user says, “Oh, that’s a hassle, I don’t want to do it.” And at the end of the day, this is pretty much inefficient for the digital transformation of Germany. So we urgently need a solution for this in Germany. And this is also one of the reasons why our shareholders came together and said, “OK, we need to do something.” And that we need to do something you asked for the pandemic impacts, I would say, as probably in all countries, I mean, pandemic has put very much pressure on the system. If we now reflect what we have in Germany, if we look at the public administration, public administration is not yet open to accept any kind of privately operated IDs. So public administration is very much focused on their governmental e-ID scheme, and as I said, only 6% of citizens use it. So within the pandemic, when all the offices were closed, in the first place, there are only very few processes that are digital at the moment at all in public administration in Germany. But for those that are digital already, the access to these digital processes has been a key bottleneck. So basically, if you want to enable your public e-ID on your ID card, that requires – and required until end of last year – a personal visit to the office, and that was closed. So they were fully stuck. And the solution is now that they want to make e-ID even more attractive and we will all see where it leads us this year. 
But the solution is not that they now accept privately operated ID schemes, which is pretty sad. So if we look at the private sector, I introduced the various methods that we have there so that you can go to the postal office to identify yourself. Well, there was limited access to the postal offices too during these days. And the next one is that you can actually have a video call with a service agent in service centres. But also there, I mean, if you imagine the setup of a service centre and the call that everybody works from home, I mean that was a lot of pressure to keep the service centres up and running in the first place. And then they had a massive increase in volume and in requests from customers. So they were pretty challenged. So that challenge is we are now in this situation for about a year. I think everybody has kind of learned how to cope with the challenges. But the good thing is like probably in every country, now there’s a big openness for change in the industry. So everybody is aware, it’s not the best idea to just continue what we have, but we need to change things. That is very good. That is very good for Verimi because everybody is talking about, “Ah, can’t we have a digital ID where people just have a fully digital process?” That’s a very good discussion. On the other hand, we all know if you want to integrate identification solution in a regulated context, in a large company, that takes time. So now we have the good situation that people want to change and want to have innovations. We have the challenging situation that you can’t do it from one day to another. And we have the challenging situation, of course, that free budgets for those things are hard to find. So it will take time that we transition. Oscar: Yeah, of course, I can imagine not only from people, people want change, better services, better data services, and also the organisation, companies and government are already into this. 
And so seeing now towards the future, again in Germany, what do you see as the future of digital identity in Germany? And if you can compare how are things happening other countries, for instance, other – yeah, other geographies? Roland: Sure, yeah, looking at Germany, I would say Verimi is among those that actually pioneer the development of digital identity in the German market. So as I said, for the introduction, certain shareholders have founded Verimi, they have an objective what Verimi shall create for them. And they put strong financing into Verimi so that we are really enabled to create a change in the market. So we have two main tasks in this pioneering role. The first task is that we need to set up an infrastructure that is able to deliver solutions in the given context. And that means now and today, our solution needs to work. So it’s not about the vision that may be in five years, or in ten years, it would be great to have any kind of solution that would work. But our task is to have an infrastructure ready that delivers solutions now. And this infrastructure is ready, we have the MVP, we are rolling it out. So that objective is done. And the other objective is that, of course, we need to push forward and shape the landscape that we deliver innovative solutions and innovative solutions that can translate into kind of local storage. And I said that we are storing the private keys locally on the smartphone of the user, in a trusted execution environment at the app or even using the secure elements in the smartphone. This is something we introduced last year. So it was not planned in the initial architecture, but we saw that technology was able to deliver such a security service, and we innovated on this so that’s in the market now for about six months. And we are working on new technologies where we have self-sovereign identity solutions based on blockchain technologies. And all this we want to deliver in the upcoming future. 
But for us, it’s really important that the solutions work in the current context. And if today you talk about self-sovereign identity in a regulated context in Germany, that is a great vision to have but there are many questions to answer before you can put it into practice. So this is the first thing we want to deliver in practice, and also to innovate. I think that Germany as a whole is just starting in terms of digital identity, and there is still a very long way to go. First thing is that the given regulatory framework in Germany is not really in favour of digital identities. There’s a great lack of interoperability between the various frameworks for regulation, for example, anti-money laundering and telecommunications, the eIDAS framework for the public sector. It’s all isolated silos. And even within the silos, if you look into any anti-money laundering or telecommunication, these regulations are not really done for a digital space. So there’s still a lot of work to do that it fits the digital context. The second one, why are we just starting in Germany, is that in the German context, there’s only little appetite for public-private co-operation. And I believe that the public sector has a key role to really initiate digital identity and the acceptance of digital identity among all the citizens. And if we look into the recent initiatives in the public administration and e-health sector in Germany, we don’t see that they really foster an open ecosystem of digital identities. It is very much about security and technology, and innovation and how it shall be and could be. But for the current situation where we urgently need a solution, from day to day, there is not the real inception. So it’s rather on a visionary stage where we say, OK, if we all reach there in five years, then we all have achieved something, which is great, that we have a common goal. But for kind of the context this year, we are lacking some dynamics on that side, I would say. 
And the third part of why we are just at the beginning of digital identity in Germany is that we need to overcome all this legacy of intermediate solutions. As I said, we have the postal IDENT services, we have the video IDENT services, all these services are integrated at most of the companies and processes also tailored to these solutions. So it’s quite hard to overcome this legacy and to say, “Oh now, we need a new solution and a new process and new investment and new resources to realise it” in a situation where the process as a basic process works. So I would say these are the three points why Germany is just starting. So it’s the regulatory framework. It’s public-private co-operation. And it’s the legacy of intermediate solutions that we have. But in this situation, I’m pretty confident that we will actually manage to set up digital solution and the real digital identity solution that is not only covering the ID card, but also driver’s license and any other personal IDs or certificates that you can put in your digital identity. And I’m confident that the joint efforts of the private sector will actually bring up solutions, and, of course, Verimi, we feel at the forefront of this development that we bring up solutions that fit the requirements at least of the private sector, and then, at any time, also invite the public sector to take part. And there is so much room for development in the market, because we need digital processes, even for the existing use cases for efficiency and compliance reasons. I mean, it’s reason enough to digitise what we have. And there are so many new use cases coming in FinTechs, in shared mobility, that all need highly efficient workflows. And in some cases, these use cases just arise from the fact that you have this efficient workflow because there’s no barrier to entry anymore. And there’s no upfront investment into customer acquisition and customer identification anymore. It’s just one click away. 
I mean, this development will come, it will come in Germany. We are fostering this development. I think in an overall European context, it is the question how we can drive these dynamics, because at the end of the day, we all know that the big tech players are providing these digital identity services as well. So the question will be, will the big tech players like the smartphone industry or software industry from US or from Asia, will they sooner or later provide these services to German and to European citizens? Or whether we manage to have our own national sovereign solutions in Germany? And this is what we are aiming at. We want to have our sovereign, own verified identity solution for Germany and also for Europe. And well taken, at the moment, this digital infrastructure is missing in Germany, in particular – other European countries are far more ahead in that context. But I think we all need to push forward so that we keep our sovereign verified identity solution. Oscar: Yeah, I couldn’t agree more with that. Definitely. And I’m sure that’s going to be the case to Germany, and also the other European countries, deliver their – the right way to deliver identity. And yeah, there are several challenges, there’s no, no doubt. So I really wish a lot of success to Verimi and the companies that are behind that. So… Roland: Thank you very much. Oscar: I’m sure if you deliver this business model that people need for the needs that we have today, and for the ones who are coming as you mentioned, or some mobility soon they are coming and be more, more important. Yes. So, that is going to be a success. So Roland, I would like to ask you one final question. So for all business leaders who are listening to this conversation, what is the one actionable idea that they should write on their agendas today? Roland: That’s a good question. The one actionable idea… and I think the audience will be pretty international. 
So in my understanding, it very much depends on the specific context that you’re in, especially for digital identity. There’s only one thing I think overall, which is clear to all of us but I would like to underline. I mean, at the end of the day, the customer will choose the solution that best fits the expectations. The customer doesn’t think about technical solutions, and what technology is applied and how much privacy is in it. The customer just wants to have a solution for his use case. As I said, big tech players are targeting digital identity. And if there is somebody in the market that tends to have the best customer experience, then it comes from big tech players. So the one actionable idea, what is it? I would say customer experience is king at digital identity. And really technology, security, data, privacy, whatever it is, it’s important but in a sense, it’s a commodity. And even though this sounds very basic, in reality, I see that a lot of discussions it is just the other way around. In a lot of discussions, it’s about technology, security, data privacy. But I believe this is not putting the European players in a leading position. Customer experience is king, even in digital identity. And this, I believe, is something that we shall all have very present in whatever we do. Oscar: Yeah, I couldn’t agree more with that – customer experience and making it super easy for the users to try something that is, is designed like Verimi with privacy by design, data minimisation, doing right, something that has to be simple to use customer experience are the top of the mind, and that’s going to be a winning solution that customer will use. Roland: Yeah, absolutely. Oscar: Well, thanks a lot, Roland for enlightening us about the situation of digital identity in Germany and also the kind of work that you’re doing in Verimi. So please let us know how people can get in touch with you, find more about you and Verimi, what are the best ways for that? 
Roland: Just send me an email [email protected] and I will certainly reply to you. Oscar: Excellent. Again, Roland, it was a pleasure talking with you and all the best. Roland: My pleasure. Thanks for the invitation and looking forward to keeping in touch. Thank you, Oscar. Thanks for listening to this episode of Let’s Talk About Digital Identity produced by Ubisecure. Stay up to date with episodes at Ubisecure.com/podcast or join us on Twitter @Ubisecure and use the #LTADI. Until next time. [End of transcript]",https://www.ubisecure.com/podcast/verimi-roland-adrian-identity-germany/,,Episode,,Ecosystem,,,,,,,,2021-03-03,,,,,,,,,,,,,
Ubisecure,Ubisecure,,LTADI,,WomenInID,,,,,"Inclusive identity: are we doing enough? With Tricerion, Women in Identity and FinClusive",One of the clearest areas of digital identity where we see the impact of not doing enough to include vulnerable people is authentication – the point where a user must verify their identity in order to gain access to a service.,"with Schehrezade Davidson, CEO of Tricerion, Sarah Walton, Code of Conduct Programme Manager at Women in Identity, and Amit Sharma, Founder and CEO at FinClusive. Episode 67 explores inclusive identity. Making identity solutions inclusive for everyone wanting (or needing) to use them is a topic that’s coming more and more to the forefront of the identity industry. From logging into apps, to accessing essential services; to how barriers to organisation identity are impacting individuals – in this episode, we speak to three guests from the identity industry on what they’re doing to help solve these issues. [Transcript below] Schehrezade Davidson is the CEO of Tricerion Limited, a company that owns novel patented mutual authentication software using image passwords. Find Schehrezade on LinkedIn. Find out more about Tricerion at tricerion.com. Schehrezade has appeared on the podcast twice before, talking about: neurographic passwords (episode 26) and immunity passports (episode 41). “If the onus is on the individual to authenticate themselves, those in the industry need to make it truly inclusive with alternative ways, depending on a customer’s needs.” Dr Sarah Walton is a digital consultant, author, coach and public speaker. She founded Counterpoint in 2003 to support organisations become digital, innovate and grow. Most recently she led the UK Open Finance programme and is Women in Identity’s ID Code of Conduct Programme Manager, as well as being commissioned by the Open Identity Exchange to author ID Inclusion reports. Find Sarah on Twitter @sarahlwalton and on LinkedIn. 
Find out more about Women in Identity at www.womeninidentity.org. “This is very much something that is very commercially important but it’s also extremely important to people’s lives and livelihoods on an individual basis.” Amit Sharma has engaged in a myriad of roles that intersect financial markets, risk management, regulatory compliance, and international development. He is the Founder and CEO of FinClusive, a hybrid FinTech and RegTech company dedicated to financial inclusion. Connect with Amit on Twitter @ASharma_VT and on LinkedIn. Find out more about FinClusive at finclusive.com. Amit has featured on the podcast before, discussing the role of identity in financial inclusion (episode 51). “From a macroeconomic perspective, it’s important to note that identity challenges are often seen as just at the individual level, but these at the institutional or entity level are equally important.” We’ll be continuing this conversation on Twitter using #LTADI – join us @Ubisecure! Go to our YouTube to watch the video transcript for this episode. Or subscribe with your favorite app by using the address below Podcast transcript Welcome to the Let’s Talk About Digital Identity podcast. I’m Francesca Hobson and I’ll be guest hosting this episode of the podcast all around inclusive identity. Francesca: When I say I work in Identity, my friends and family usually don’t know what I’m talking about. To explain, I’ll often give the example of signing up to an app and logging in – which really doesn’t begin to cover the myriad of use cases that identity enables (some of which we’ve explored on this podcast), but it’s such a common experience that it’s often the easiest for them to relate to. They’re touching our industry several times a day, many of them without really thinking of Identity as the key to so many processes. Of course, that’s not the case for everyone. 
Some people, often the more vulnerable in society, are only too aware of how important identity is to accessing and using services. Security is clearly high priority for service providers when it comes to identity, as is regulatory compliance. But when these aspects aren’t correctly balanced with user experience – or when users with varying abilities, technical proficiency, or access to resources are not fully catered for – there is a very real risk that the intended users will be excluded from, or have trouble, using the service, bringing all kinds of implications from ethical to economic. Making identity solutions inclusive for everyone wanting (or needing) to use them is a topic that’s coming more and more to the forefront of our industry. From that almost-universal use case of logging into apps; to what happens when people are prevented from accessing essential services; to how barriers to organisation identity are impacting individuals – in this episode, we speak to three guests from the identity industry on what they’re doing to help solve these issues. One of the clearest areas of digital identity where we see the impact of not doing enough to include vulnerable people is authentication – the point where a user must verify their identity in order to gain access to a service. Many services still require authentication via a password – though many others are now opening up options for authentication via biometrics, identity providers, real-time document proofing, and so on. To explore this topic further, our regular podcast host – Oscar Santolalla – spoke to Schehrezade Davidson – the CEO of Tricerion, which provides strong authentication with picture-based passwords. Oscar: How are current authentication methods, especially but not exclusively username/password, impacting inclusion? Schehrezade: Inclusion is a very wide term as we know, Oscar, and one of the key issues post-pandemic is an increase of use of digital across the board. 
That’s obvious to a lot of people. But what isn’t so obvious is this whole concept of the digital divide and how that impacts inclusion. So if you are not sophisticated in using online services – maybe because you’re too young or too old – that excludes you from accessing services. In addition, it is true that there’s a whole cohort of people who find simple alphanumeric passwords very challenging – maybe they have learning difficulties, or maybe they don’t but they have something like dyslexia. That’s not a learning difficulty. That’s just difficulty typing in numbers and letters. But if that’s the way the corporate entity forces you to authenticate then there’s an issue. Similarly, I think sometimes biometrics and fingerprints are things that people don’t necessarily want to use either, in terms of: where is my data stored, who gets access to it. So I think corporate entities need to think wider, because what looks simple might be complex for some people, and that’s why I think it’s important to bring in experts on user interface – and on actual usage by consumers of online services – to get an idea of the wider impacts of this race to digital. There needs to be a wider review by corporates and governments on how people of all abilities manage to access digital services. And it is true that one size doesn’t fit all, and corporates need to be quite flexible in the alternative ways that they offer authentication. And I think we talked about this – where if the onus is on the individual to authenticate themselves, those in the industry need to make it truly inclusive with alternative ways, depending on a customer’s needs. Oscar: Exactly. And what are these alternatives? What are the available alternatives for those providers who have decided to be more inclusive? And you can tell us what Tricerion’s role is in this. Schehrezade: Yeah.
So our solution at Tricerion is an image-based password. It’s very simple, and we like to say you can use it whether you are age 2 or age 100, because it’s based on images. And your brain is wired to remember a series of images or pictures much better than a long string of letters and numbers. So you have the keypad you recognise that’s delivered to you at each authentication occurrence. The images jumble around on the keypad but your password stays the same. So simple, easy, visual. And of course, we will acknowledge that not everybody can use a visual password, but I think this is where it comes back to the first point, which is giving individuals choices on how they authenticate and allowing them to choose something that works for them. And that’s a different way of thinking for corporates actually, because they do want to force all of their customers to use the same methodology because it’s cheaper and easier for them. But I think it is really incumbent on all of us who work in the industry to push hard for alternative authentication methods, because then digital truly becomes an inclusive secure environment where people aren’t nervous about going online. And that has got to be a good thing all around, right? Oscar: Yeah, absolutely. Francesca: Thank you to Schehrezade for filling us in on the impact of exclusion in the online service authentication process. So what about national-level IDs, such as physical documents, which are now often used for both virtual and in-person identity verification? What happens when people are prevented from accessing essential services, such as banking, because of a perceived lack of sufficient identity documentation? One organisation carrying out important research in this area is Women in Identity – a not-for-profit organisation making practical steps towards an identity industry that is “built for all, by all”.
To fill us in on its latest research on this topic, Oscar spoke to Sarah Walton, who’s managing Women in Identity’s innovative Code of Conduct programme. Oscar: Hi, Sarah. Could you tell us what the impact of identity exclusion is? Sarah: Yes, certainly. There’s a number of really quite detrimental impacts, both to end-users and to organisations who are involved in the identity ecosystem – so that would include relying parties and it would also include identity providers. But in terms of the impact to end-users, it can mean sometimes that they can’t find work, or they can’t find somewhere to live because they can’t prove their identity – and so they can’t support their application for a bank account, for example, which would then enable them to find somewhere to rent. So end-users sometimes end up having to compromise their integrity and lie in order to find ways around quite a rigid system, and the system doesn’t have to be that rigid. So we interviewed a number of end-users in our Human Impact of ID Exclusion Report. And we found that there were similarities both in more developed economies and emerging markets. We took Ghana and the UK as representative of one of each of those types of economy. And we found that there were definitely similarities in the quite horrendous experiences that end-users have and how it limits their engagement in society. But also, very importantly, this is something that makes people outsiders within society – that they don’t belong – and so there’s an emotional aspect and impact to that for real people in their lives too. And very quickly I’ll just add that that’s the human impact, the emotional impact, the impact on individual people’s lives and livelihoods – and children suffer, because if parents can’t make money then children sometimes can’t get fed. So it’s really very serious.
But from a global perspective, from a national perspective, from an economic and commercial perspective, a recent McKinsey report has suggested that gross domestic product – the income to a country – could be increased by 3-13%: probably 3% for more developed economies and more towards 13% for emerging economies. So in terms of income into a country, which is beneficial for everyone involved – every type of organisation and individuals – if we were more inclusive in our identity service creation then we would all be richer, essentially. So this is very much something that is very commercially important, but it’s also extremely important to people’s lives and livelihoods on an individual basis. Oscar: Yeah, absolutely. I see that everybody benefits from more inclusion. Sarah: Absolutely, yes. Oscar: So I think one of the products of the project that you just mentioned is Women in Identity’s Code of Conduct. So could you tell us more about this and how it aims to support identity inclusion? Sarah: Absolutely. There are a number of phases to this piece of work. It’s an international code of conduct for the identity industry, and I would stress that it’s international – we have very much a fragmented approach at the moment in different regions, but also within those regions, irrespective of whether it’s a government-owned or a government-assured or private identity system. So the ID Code of Conduct work is aiming to create a set of guiding principles that will ensure that all users of identity systems have a consistent and high-quality end-user experience, because what we found in our first phase – the human impact of ID exclusion that I just mentioned – is that users, irrespective of what type of economy they come from, have a lack of knowledge of how the process works.
So if we could at least make this consistent, then people will have a certain set of expectations – they can learn one approach and know that approach will apply wherever they are, whatever products or services they are accessing. And also, that there are alternative ways for those people to access those services. From the work that we’ve done so far, it has emerged that there are 5 key principles that will be foundational to the creation of an ID Code of Conduct. So the first one is pretty obvious to any product designer and design team, but very important: that the user is at the centre of the ID ecosystem – and there are many ID ecosystems. So, we always need to come back to this user-centric approach. Secondly, that social norms are changing. So we need to acknowledge that one size doesn’t fit all and we need to move towards proportionality – so vouching, tiered KYC, e-KYC and drawing on other forms of government data – other datasets may help reduce the burden of identity for the user. And it’s essential to build diversity into ID for the reasons that I mentioned earlier, because it has a commercial and a human impact benefit. And also, identification may be individual, but we live in networks of people who already know us, so we need to account better for delegated authorities and intermediaries so we can leverage those networks effectively. So what we will be aiming to do is take these key principles and then develop probably about 5-10 principles that will be useful for ID creation teams, ID development teams. We also have another piece of work called the Implementation Framework, and that will be a kind of how-to guide looking at how these Code of Conduct principles can be applied in practice, in a practical setting, via an ID team.
And we will create a fictional ID product or service to show, at each phase of the design and development process, in an agile process of development, how these principles would be applied coherently and in a practical way within the ID development team. And I’ll just mention our sponsors, with a big thank you: GBG Group, Mastercard, RBC, and Omidyar Network sponsored the human impact work; some of them will be coming with us on the journey, and new sponsors are joining all the time. We will also be using some of our sponsors’ products and services to demonstrate how they deliver this ID Code of Conduct set of principles in practice, in order to create an easy-to-use, accessible ID service as an end product for the user. And I would just also like to say, Oscar, if I can, that we are looking for new sponsors as well. More sponsors are coming on board all the time, but please do contact us, because we are in the process of raising the funds for those last two elements that I mentioned – the ID Code of Conduct principles and the implementation framework – and we’d very much like some new sponsors to come and join us. Not just in terms of funding but also with the knowledge from their ID teams, because the more widely we go with this, and the more diversity we have in our development process for the ID Code of Conduct, the more relevant and useful it’s going to be for organisations and standards bodies and government entities around the world. All of whom are extremely interested and, if we can get adoption, see this as a very useful alternative to the creation of legislation – because legislation is expensive, and some organisations have been very keen to look at this piece of work in more of a standards-type frame, as a potential alternative to legislation in some countries. So it’s being taken very seriously by lots of different types of organisations, but we are looking for more sponsorship – please do get in contact.
Oscar: Absolutely. Thank you very much, Dr Sarah Walton, for sharing your insights. Sarah: You’re very welcome. It’s great to be with you. Thank you. Francesca: Great insights there from Sarah on the impact of identity exclusion, and how service providers can mitigate these critical issues. The issue of individuals being excluded from identity systems is key to the future of many industries, such as financial services. But what about organisations? Organisation identity is a less-discussed aspect of identity inclusion, yet how organisations identify other organisations (for example, in transactions between financial institutions and maintaining trade relations) is also a critical factor in whether access, or a specific process, is authorised – thereby impacting the individuals who rely on successful B2B identity verification. To investigate this area, I spoke to Amit Sharma, Founder and CEO of FinClusive – a hybrid FinTech and RegTech company dedicated to financial inclusion. Thanks, Amit, for joining us, for the second time, on the Let’s Talk About Digital Identity podcast. So let’s dive right in. How can inadequate organisation identity systems exclude organisations – and individuals – unintentionally? Amit: So in the financial services domain, there are obligations on financial services providers to do the requisite Know Your Customer or Know Your Business due diligence. And so when an organisation is run through that due diligence, by rule, one has to not only test the validity and verify the authenticity of the organisation, but also provide what’s called beneficial ownership coverage. So, individuals who may have an equity or ownership stake, or other ‘control persons’, as we call them, who control, say, the fiduciary or economic activities of that organisation. And if there are not adequate identity systems to verify and validate either the organisation or their beneficial owners, they get rejected by financial services organisations.
And this can happen in a number of ways. Corporate enterprises may be large and very easy to assess from a legitimacy perspective, but because of how large they can be, and because the complexity of their ownership and org structures can vary globally, it’s hard to actually do the beneficial ownership and control person verification – those org structures are pretty complex. But small businesses are also very challenged, because often there’s very little information on the backgrounds of those small businesses, which are often start-ups, etc. They are not available in mainstream databases and corporate registries. And so, ensuring that the organisation’s identity can be verified and validated can be very difficult if they’re a small start-up with a registration in a local town but don’t have a state, province, municipality, or federal registration. And then non-profits are a great example too – critical for global humanitarian and development needs, but many are seen by financial services operators as operating without adequate documentation. And these all come back to organisational identity systems failing many of these organisations’ ability to access basic financial services. So, it’s quite a complex challenge. It’s very solvable, but these are some of the issues that we see. Francesca: This is really a less talked-about aspect of identity and how it impacts inclusion. And as an industry, we do need to be more proactive on solving the kinds of things that you’ve spoken about. So, how can we do that? How is FinClusive supporting organisation identity in order to enable better financial inclusion? Amit: Well, we have the tools now to be able to assign digital credentials or verifiable credentials to individuals and entities – and entities of all types, from small businesses and micro businesses to non-profits to global corporates, and any other kind of legal entity out there.
And so what we do at FinClusive, in partnership with organisations like the Global Legal Entity Identifier Foundation, is run the associated compliance checks – the background due diligence on the validity of the organisation itself and the beneficial owners associated with it. And with those full compliance checks, we are able to issue a credential so that other business entities and other financial services organisations can be sure that the organisation is not only a legitimate registered organisation with a legitimate set of activities, but also that the individuals associated with that organisation have been equally checked. And by providing that, you create a veil of legitimacy associated with that organisation’s set of activities. So its business counterparts and its financial services providers can all understand, verify and validate, in an efficient and cheap way, the validity of that organisation. And that’s hugely important. Now, from a macroeconomic perspective, it’s important to note that identity challenges are often seen as just at the individual level, but these challenges at the institutional or entity level are equally important. And they are important, I’ll say, by many measures, but one really stands out: small businesses drive global job creation. So we have to be able to ensure the validity of small businesses worldwide so that they can access capital to start and grow their businesses, and so they can interact with other organisations financially and economically. So this is a huge challenge. And when 90-plus percent of job creation and business growth comes from small businesses, this makes it very important. Francesca: Thanks to Amit for that summary of the many issues and implications around organisation identity and financial inclusion. So there we have it – three perspectives on the state of identity inclusion as it is today, and how our guests are personally taking strides towards making identity solutions more inclusive.
Thank you so much to Schehrezade Davidson, Sarah Walton and Amit Sharma for sharing your expertise with us. To find out more about this episode’s guests, and links to their work, take a look at the show notes. This topic isn’t going away any time soon. The challenges facing the identity industry, and their solutions, aren’t simple. The more we raise awareness of these issues, and the more initiatives and innovation we work towards and share as a community, the faster we can build more inclusive identity systems – ones that work for everyone. We hope to provide one of those awareness platforms on the Let’s Talk About Digital Identity podcast – so make sure you subscribe. Until next time. [End of transcript]",https://www.ubisecure.com/podcast/inclusive-identity-tricerion-women-in-identity-finclusive/,,Episode,,Ecosystem,,,,,,,,2022-04-20,,,,,,,,,,,,,
Ubisecure,Ubisecure,,LTADI,,Global Assured Identity Network,,,,,Launching the Global Assured Identity Network (GAIN) with Elizabeth Garber,"Elizabeth Garber fills us in on what the GAIN project is, explaining how it’s different from other trust networks and why GAIN is good for financial institutions. She also discusses the role of the Global Legal Entity Identifier Foundation (GLEIF) in the project, and what’s next for GAIN.","with Elizabeth Garber, Editor of GAIN. In episode 52, Elizabeth explores the recently announced Global Assured Identity Network (GAIN) initiative. She fills us in on what the GAIN project is, explaining how it’s different from other trust networks and why GAIN is good for financial institutions. She also discusses the role of the Global Legal Entity Identifier Foundation (GLEIF) in the project, and what’s next for GAIN. “This is really going to unleash creativity and expand access to individuals and communities and sellers all around the world.” Elizabeth Garber is a customer and product strategist who started her career in telecommunications and honed her craft in six different industries before joining one of the world’s largest retail banks. She is an expert in designing experiences and delivering transformational change based on a deep understanding of people. This interest has underpinned her graduate studies of the psychology of cross-functional teams, as well as how customers define value in relation to services they use. In 2015, she was named one of the top 3 marketers under 30 by the UK Marketing Society and was recognised by Energy UK and EY for her work building trust across the UK energy industry. In 2017 she won the Financial Times/30% Club ‘Women in Leadership’ award. Find Elizabeth on LinkedIn. Elizabeth recently played a leading role editing the paper published by more than 150 identity experts – GAIN: How Financial Institutions are taking a leadership role in the Digital Economy by establishing a Global Assured Identity Network. 
It was announced at the European Identity and Cloud Conference on 13 September by Nat Sakimura, chairman of the OpenID Foundation, and Gottfried Leibbrandt, former CEO of Swift, and then published by, among others, the Institute of International Finance. To get involved, email [email protected] or join the LinkedIn group. We’ll be continuing this conversation on Twitter using #LTADI – join us @Ubisecure! Go to our YouTube to watch the video transcript for this episode. Or subscribe with your favorite app by using the address below Podcast transcript Let’s Talk About Digital Identity, the podcast connecting identity and business. I am your host, Oscar Santolalla. Oscar Santolalla: Hello, and thanks for joining. Our guest today played a leading role editing a paper published by more than 150 identity experts. The paper is called GAIN: How Financial Institutions are taking a leadership role in the Digital Economy by establishing a Global Assured Identity Network. It was announced at the European Identity and Cloud Conference last 13th of September by Nat Sakimura, who is the Chairman of the OpenID Foundation, and Gottfried Leibbrandt, former CEO of Swift, and then was published by, among others, the Institute of International Finance. Our guest today is Elizabeth Garber. She is a customer and product strategist who started her career in telecommunications and honed her craft in six different industries before joining one of the world’s largest retail banks. She is an expert in designing experiences and delivering transformational change based on a deep understanding of people. This interest has underpinned her graduate studies of the psychology of cross functional teams, as well as how customers define value in relation to the services they use. In 2015, she was named one of the top three marketers under 30 by the UK Marketing Society, and was recognised by Energy UK and EY for her work building trust across the UK energy industry. 
In 2017, she won the Financial Times/30% Club Women in Leadership Award. Hello, Elizabeth. Elizabeth Garber: Hello, thanks for having me. Oscar: It’s a pleasure. Welcome to our show. And let’s talk about digital identity. And certainly, we always like to start by hearing a little bit more about our guest, especially your journey into this world of digital identity. Please tell us a bit about yourself. Elizabeth: Sure. So my name is Elizabeth Garber. As you said, I’m a customer strategist, a product owner and service innovator. Now, I’m also a digital identity evangelist, I suppose. My passion is for understanding people: what drives their behaviour, how any kind of service really adds value in their lives. Then I help organisations to build services and communicate those benefits to real people. So these skills, I think, are critical in the digital identity space, because the solutions that we design really need to reflect the people who use them and how they move through their worlds, globally. And not only that, but really great customer experience designers and product owners will get to know what’s really on people’s minds that might prevent adoption of any one solution – and how different solutions can destroy value, sometimes in insidious ways. For example, a really convenient identity solution could come with some real trade-offs that are invisible to users at first. Maybe their data isn’t secure, maybe it’s sold off, maybe they’re giving it willingly but they’re blind to some of the applications that will follow. As someone who hasn’t been involved in identity for all that long, I can really relate to the users who don’t understand the difference between the offers that are currently on the market, and what might be there in future. So I’m here to get underneath how our identity solutions are going to create – and potentially undermine – benefits for real people. And then I want to promote the adoption of good ethical solutions. Oscar: Excellent. 
And what was your exact role in this project, in this paper, GAIN? Elizabeth: In the paper? Oscar: Mm-hmm. Elizabeth: Yeah, as it turned out, a few other people thought that I could be useful. So one of the early co-authors, Rod Boothby, who chairs the Open Digital Trust Initiative at the Institute of International Finance, reached out to me about his work. I was really floored when he took me through it and I knew that I wanted to get involved. It was really exciting. So, after I read through some of the documentation, I sent him some thoughts about the value to users, to banks, to relying parties. And then within a few days, I was facilitating a virtual whiteboard session with a lot of the other co-authors all over the world, trying to help them kind of coalesce around who the paper was for, who the audience is, what the messages are that are going to resonate with that audience, and then the structure of the paper. At that point, I wondered if someone might realise that I didn’t have a huge background in identity and maybe kick me out, but they are an inclusive group of people, and they really valued the approach. So ultimately, I played a pretty big role in pulling together the paper that we published. I should say who else was involved. It was really an experiment in radical democracy, as Don Thibeau of the OpenID Foundation likes to say. We brought more than 150 people together, experts in the space. I might mention some of their names in this interview, but it was a pro bono, no logos collaboration. But we did have the support of some major organisations in the space. The ones that participated and are now publishing and promoting the paper include the OpenID Foundation, the Institute of International Finance, the Open Identity Exchange, the Global Legal Entity Identifier Foundation, and the Cloud Signature Consortium. Oscar: Excellent. Tell us a bit more about the… Elizabeth: The paper itself? Oscar: Yes, please. Elizabeth: OK. Yeah, sure. 
So the paper invites financial institutions. It’s a call to action for financial institutions to join us in solving the biggest problem on the internet – a lack of trust, specifically a lack of trust due to the absence of verified identities. How do you know the person you’re dealing with is real? How do you know that they are who they claim to be? How do you know that your money will end up in the right hands? Fraud and cybercrime increase every year. Some estimates say it’s 5% of GDP, trillions of dollars. Criminals are thriving in anonymous digital spaces. And at the same time, you and I and our friends are pervasively tracked. We enter our details into countless sites, service providers follow us around, they trade our information. Even our biometrics are seeping into more and more places – our faces, our fingertips – and more and more people or parties have access to really private, really personal information that could be used in any number of ways – to steal our identities, for example. And that needs to stop. The paper argues for the internet’s missing trust layer. That’s a phrase coined by one of the co-authors, Kim Cameron, who led identity at Microsoft for many years and wrote the Laws of Identity. Then this concept was really beautifully explained by Nat Sakimura, Chairman of the OpenID Foundation, at the European Identity and Cloud Conference in September. In this new paradigm, highly trustworthy identity information is passed from a regulated or otherwise highly trusted institution to an organisation that needs it, with end-user knowledge and consent every time. So, if I’m buying another bottle of Malbec, I shouldn’t have to prove my age by uploading a driver’s license to another site or sharing a photo of my face. You don’t need my name, my face, my address. No one else needs a copy of my credentials or biometric information. You just need to know that I am the person I say I am and that I’m old enough. 
So my online wine retailer can send a message to my trusted identity information provider. I would choose my bank, and they will use their app to authenticate me and say, “Hey, the International House of Malbec wants to know if you’re over 21, should I tell them?” And of course, it’s a yes, because I need to celebrate recording this podcast. Oscar: Tell us a bit about the solution itself, because there are some solutions which, one way or another, address the problem you described? Elizabeth: Yeah, so none of this is really new. People have called for this for years. One of my co-authors, Douwe Lycklama from INNOPAY, shared a YouTube video arguing for the same thing, and it was dated 2007. Of course, it does exist in some places already – Norway, Sweden, Belgium, Canada, Finland – in one form or another, lots of jurisdictions around the world have a solution like this. And as another GAIN co-author, Dave Birch, who also wrote Identity is the New Money, pointed out in Forbes last week – we’re starting to see evidence that these networks really facilitated the rollout of aid during the pandemic, and mitigated the risk of fraud. In particular, another co-author, from Vipps in Norway, helped us to compare some early information coming out against the US and UK data. So no, none of it’s new, but we do argue in the paper that now is the time to think about global interoperability of these networks. Oscar: Yeah, and I think the model for solving these problems follows the countries you have mentioned: most of them have created a system based on banks, others combined with mobile operators – those are the main types. And of course, some states have also provided that. So these are the three types of identity provider who provision these identification systems – but notably, commercial big tech is not among the group of entities who provide these systems.
And of course, you mentioned different countries. We have talked about a few countries that have sufficient solutions. But I think the next question would be: how do we make this available for the rest of the world, how do we ensure some type of global interoperability? Elizabeth: Absolutely, absolutely. So we think global interoperability is really important for a lot of different reasons. Three in particular spring to mind. Number one, end users want to live globally – and maybe more importantly, most online companies are global or need to be global. They have suppliers and customers across borders. So the benefits increase dramatically when a trust network is global, rather than local – fewer contracts, fewer integrations, and the benefits extend throughout their supply chains all the way to the end users, who benefit from simpler, more convenient services all across the internet. My second point is an extension of that first one: those benefits actually extend to global society. A global network allows a specialist artisan in one part of the world to reach a global audience without relying on an intermediary. So Oscar, let’s say you have something to celebrate, and you’ve decided to buy yourself a handmade quilt from India, a Kantha. You will be able to find sellers that you can trust, and they’ll be able to sell to you, charge you the price that someone in Finland expects to pay, and they will keep a greater percentage of it for themselves, potentially. So this is really going to unleash creativity and expand access to individuals and communities and sellers all around the world. Finally, there’s a practical benefit from promoting this vision of interoperability. And I believe it’s the key factor that will move the needle towards actually setting up this trust layer in places where it does not exist today. 
Major relying parties – and I’m talking now about big companies that operate worldwide with the heft to influence a global movement – buy into this vision only when it has a global reach. They don’t want to integrate with a different provider in every jurisdiction on the planet. So with that in mind, financial service institutions around the world are far more likely to collaborate and catalyse this movement if they see that the vision is expansive enough to meet the demand for global reach. Oscar: Okay, so interoperability is definitely something we aim for – not only those who, like me in Finland, would like the same for everyone else, but also those who don’t have a sufficient identification system. But then something caught my attention in the white paper. It says GAIN Digital Trust: How financial institutions are taking a leadership role, et cetera, et cetera. So that means that the 150 people who have been involved in this are targeting financial institutions. So why is it targeting financial institutions specifically? Elizabeth: Great question. So the main body of the paper is directed at the world’s large financial institutions. It’s a call to action for them to catalyse the creation of a globally interoperable network. To be really, really clear with everybody though, we don’t think the paper or the global assured identity network is only for banks or financial institutions. The network itself must be inclusive, and in some parts of the world energy companies, telecommunications providers etc. may be better placed than banks to be trusted identity information providers. However, we argue that financial institutions are really well positioned to spark this change, to bring it to life, and to benefit from it. And so that’s what the paper really gets into. It runs through the reasons why financial institutions are positioned to do it – a point we repeat a few times – because we’ve seen them do it before. 
They built the rails for global payments, for cards, for securities, etc. They built Swift, Mastercard, Visa. There are three main strengths that they have that make them the perfect catalyst. Number one is trust. It’s a bank’s core offer. I know some people might laugh at that, because you don’t always trust a bank to give you the best rates, or brilliant customer service every time, but you do trust them. You trust them to keep your money and your data safe. That’s why they exist. Some companies monetise your data. Banks, financial institutions, are in the business of monetising your security and privacy. And they have been since the very first bank – the Medici bank in 1397. Banks are trusted. Number two, the second point, builds off the first. They are regulated. A lot of that trust is underpinned by regulation. They need to be worthy of our trust and meet certain standards that other businesses do not – and there are governance frameworks to build upon as these new services are created. The third point that makes them a great catalyst, the third strength, is how well they know their customers. Because of those first two things, banks invest significant amounts of money in making sure that they know their customers. You might hear me say Know Your Customer, or KYC – it’s the process, and it’s regulated, that they go through to validate your identity when you open an account. And they also build the technical infrastructure to know that it’s you each and every time someone tries to get into your account. They have built best-in-class tools to identify and authenticate you, while keeping your account secure and your information private. And all that’s why banks are best placed to vouch for your identity and to use their authentication methods to confirm on behalf of others that it’s really you. Oscar: What would make such interoperability possible, in more practical, more technical terms? Elizabeth: Good question. 
So we’re going to try to apply interoperability standards that many different types of systems can plug into. There will be direct participants, banks or other identity information providers who use the network to verify information for the companies who need to consume it. And, of course, there will need to be intermediaries and translation layers to ensure that a BankID Sweden customer can verify their identity with a seller in Mexico, for example. That BankID customer will continue to verify using BankID. But we will have created a way for BankID to communicate to relying parties all over the world, including Mexico. We can also create servers that tap into self-sovereign identity networks. It’s important to know that we’re designing this system so that identity information passes from the identity information provider directly to the relying party without passing through any centralised GAIN entity. That’s really important, at least as we understand it. That’s really important when we start to talk about interoperability from a legal standpoint. Technically speaking, though, these direct connections are enabled by common API specifications. They’re based on OpenID standards, such as FAPI. With all that said, this field is evolving and technical proof-of-concepts are underway. So I definitely need to reserve the right to build upon my answer later on. Oscar: Yeah, definitely. Your three points are very convincing. And I can feel that, absolutely. And the banks have already proved, in some of the countries that we have mentioned, that they are leaders in this identification. Elizabeth: Absolutely. Yep. Oscar: And now my guess is that among the 150 authors, there are some people who are directly involved in banks. Elizabeth: Mm-hmm. Oscar: OK, excellent. Elizabeth: Yes. Like I said, it was a no-logo collaboration so I won’t be dropping any specific names. Oscar: Yes. 
Elizabeth: But we did have the partnership of the Institute of International Finance, who is pulling together a proof of concept with the Open Digital Trust Initiative. And we do have banks involved in that. Oscar: So why is it really good for banks – from the bank’s point of view, why is it good to join GAIN? Elizabeth: Yeah, so importantly, it’s not just that they can do this, it’s also that they will benefit from doing it. The first benefit that’s easiest to explain is simple, it’s revenue. Identity verification services will have, as they do today, a small price attached. Transactions will result in a small amount of money flowing into the ecosystem. And a percentage will go to the identity information provider, in this case, the bank. That turns the assets that I was just talking about – what they’ve spent to build up KYC, authentication, security, all those infrastructures – from a major cost centre into a profit centre. Second, there will be massive efficiencies for banks – password resets, document signing, mortgage processing, etc. There will be more efficiency inside a bank. They will also see less fraud. In Norway, BankID saw fraud reduced from something like 1% to 0.00042% of transactions. But the biggest benefits are strategic. New competitors are coming in between banks and their customers, and are diminishing the role that banks play in providing access to capital markets. Providing identity information services to their customers, under their brand name, cements a really critical role for banks. They will provide their customers with access to the digital economy. And they will keep their customers safer, and more secure in their privacy, than they are today. And that’s a real value add for their customers. So strategically, banks really must do this, or they risk getting cut out of a lot of transactions. Oscar: Indeed, banks have to hear it. 
When I started reading more about this paper – and you also mentioned at the beginning that the Global LEI Foundation is involved, it’s one of the main supporters of this initiative. So tell me how they relate to this. They are not banks. Of course, they do some business with the banks. But please tell us, what is the connection? Elizabeth: Yeah, I listened to the podcast that you did with GLEIF, the Global Legal Entity Identifier Foundation, while I was preparing to talk to you. And yeah, I think the connections are really strong. So they were a critical partner in pulling the paper together. They’re doing incredibly important work, and will absolutely be part of the next steps as we figure out how to realise a global assured identity network. There are three types of identity questions that we all really have as we transact online. Who is this person? Can I trust them? Who is this company? Are they trustworthy? And then connecting the two. Is this person related to a company? What’s their role? Are they entitled to be doing this thing, signing this document, logging into my account on the company’s behalf? If we start getting key information about the individual, that’s great, that adds a ton of value. GLEIF is answering that second question: who is the company? And once we can get to the third point of connecting the dots, that’s going to be really powerful. So yeah, GLEIF is a critical partner and will remain so throughout the rest of the journey. Oscar: Yeah, yeah, the way you have explained it is pretty good. So we verify the identity of a person, of a company or organisation, and then link these two. And that’s pretty critical. That’s something that has not been explored enough, I would say. So I’m really intrigued to hear how GAIN is addressing this, because it’s essential these days. So, Elizabeth, I’d like to hear more about what comes next. 
So just a couple of weeks ago was the launch of the paper and I also saw some interest in the media. That’s excellent to hear. But what comes next now for GAIN? Elizabeth: Well, we already have planning underway for the GAIN technical proof of concept. And we’re looking for more people to get involved, more companies to participate. So we have big companies who need to de-duplicate their customers, companies that need to collect verified signatures – the number of use cases is seemingly endless. We’re also looking for partners – aggregators or service providers who can help bring these relying parties on board, or technical service providers who can help us to envision and build services on top. There are so many ways to be involved. If you’re thinking about getting involved – you’re listening to this podcast after all, so you’re probably interested in identity – that probably means that you’re only one or two steps away from one of the co-authors, so it’d be really easy for you to reach out to them. You can also reach out to me or the Global Assured Identity Network LinkedIn group that we have. And if you’re really interested in the POC, we’ve got an email address for you, it would be [email protected]. And even if none of that is true for you, there’s something you can do. You could call up your bank and tell them to offer this service. Oscar: Exactly. Even as a user, as a customer of the bank, you can ask the bank: have you heard of GAIN? Yeah, absolutely. And I assume that you already have a big list of the most promising use cases, so that some of these customers, companies and service providers that you are now inviting could fit into those use cases. Excellent. Elizabeth, finally, tell us – for all the business leaders that are listening to us in this interview – what would you say is the one actionable idea that they should write on their agendas today? 
Elizabeth: So these ideas that we’ve been talking about are going to mark a step change in the digital economy. A third wave in identity, as my colleague Rod Boothby pointed out on LinkedIn last week. First, it was all about companies providing IDs and passwords to employees to access work systems; then businesses gave customers IDs and passwords to access services on the internet. This third wave is all about us bringing our own trusted digital identity wherever we go. With a really inclusive approach and active global collaboration, this will open up our digital economy. It’s going to expand access, and make life so much simpler and safer online. So for those who are listening, I would urge business leaders to figure out what role your business can and will play in a globally interoperable assured identity network. Are you an identity information provider? Can you be? Will you be a relying party who consumes these services so that you can verify that the signature on a contract is valid? Can you help us onboard relying parties or integrate these services? On your agenda, I’d say write down: figure it out – figure out how you’re going to get involved and then get in touch. Again, join our Global Assured Identity Network group on LinkedIn, or email [email protected] if you know that you want to join the POC. Oscar: Yeah, indeed. This might still be a relatively new concept for companies who are not so exposed to this type of identity services. So yeah, it’s a good idea, as you said, to think about what is going to be the new role of your company from the many choices you have mentioned. Thanks a lot, Elizabeth. It was super interesting to hear about this extraordinary effort by the GAIN project, very recently launched. And as you said, the proof of concept is now getting going. Please let us know how people can get in touch with you if they’d like to follow up on this conversation. 
Elizabeth: Yeah, so you can find me on LinkedIn really easily. My name is Elizabeth Garber. Yeah, I think that’s the easiest way. Oscar: OK. LinkedIn is the easiest way. Thanks a lot Elizabeth. It was a pleasure talking with you and all the best. Elizabeth: Thank you. Thanks for having me. Thanks for listening to this episode of Let’s Talk About Digital Identity produced by Ubisecure. Stay up to date with episodes at Ubisecure.com/podcast or join us on Twitter @Ubisecure and use the #LTADI. Until next time. [End of transcript]",https://www.ubisecure.com/podcast/global-assured-identity-network-gain-elizabeth-garber/,,Episode,,Ecosystem,,,,,,,,2021-10-06,,,,,,,,,,,,,
Ubisecure,Ubisecure,,LTADI,,Me2BA,,,,,Lisa LeVasseur on the ethical behaviour of technology and the Me2B Alliance LTADI,"the Me2B Alliance and how it aims to make technology better for humans, plus the businesses (B-s) which are shining a light on privacy issues and giving the Me-s more control. “We used to call ourselves something like the ‘organic food label’. But that’s actually not right. We’re more like independent automobile crash testing.”","with Lisa LeVasseur, Executive Director at Me2B Alliance. In episode 38, Lisa and Oscar discuss the Me2B Alliance and how it aims to make technology better for humans, plus the businesses (B-s) which are shining a light on privacy issues and giving the Me-s more control. “We used to call ourselves something like the ‘organic food label’. But that’s actually not right. We’re more like independent automobile crash testing.” Lisa LeVasseur is Executive Director at Me2B Alliance, a non-profit organisation that is setting the standard for respectful technology. An MBA technologist with a background in Computer Science and Philosophy, Lisa began strategic work in cellular telecom industry standards in the late ‘90s while at Motorola. Since then, she has participated in 3GPP, 3GPP2, MEIF, WAP Forum, IETF, W3C, IEEE and Kantara Initiative. Find out more about Me2B Alliance at me2ba.org. Join as a ‘Me’ or a ‘B’ at me2ba.org/membership. We’ll be continuing this conversation on LinkedIn and Twitter using #LTADI – join us @Ubisecure! Go to our YouTube to watch the video transcript for this episode. Or subscribe with your favorite app by using the address below. Podcast transcript: Let’s Talk About Digital Identity, the podcast connecting identity and business. I am your host, Oscar Santolalla. Oscar Santolalla: Hello and thanks for joining today. We are going to discuss today something pretty different, about the ethical aspects of technology – a lot of technology we are already using. 
We are using a lot of technologies brought by big tech, by many organisations around the world and we are going to hear what could be a better vision for how the technology treats people in a more respectful way. For that, I have a very special guest who is Lisa LeVasseur. She is the Executive Director at Me2B Alliance, a non-profit organisation that is setting the standard for respectful technology. An MBA technologist with a background in Computer Science and Philosophy, Lisa began strategic work in cellular telecom industry standards in the late ‘90s while working at Motorola. Since then, she has participated in several other standards organisations such as 3GPP, 3GPP2, MEIF, WAP Forum, IETF, W3C, IEEE and Kantara Initiative. Hi Lisa. Lisa LeVasseur: Morning. Or evening! Oscar: Yes, exactly. We’re in the opposite. Quite early for you. The night is falling here in Helsinki. So it’s a pleasure talking with you Lisa. Welcome and let’s talk about digital identity and this very interesting concept and project you are embarking on, Me2B. But I would like to hear more about your beginnings and how things led to the world of digital identity and this latest project you have. Lisa: Sure. Thanks Oscar. Thanks for having me. I’m really honoured to be here talking with you. So how I got involved in this world was back in 2009, I started working on a product that was designed to put families really in control of their information and the services that they use, whether those services were in the brick-and-mortar world or online services. And it was through research in that project where I really became aware of – I think it was initially Doc Searls and I maybe became aware of some trust framework stuff and then I sort of unlocked the door to this whole world of people working on identity management and identity standards and realised that there was a whole world of people sort of on the leading edge of this work. That’s how I kind of stumbled in. 
It was probably around 2012 or so. Oscar: At that time you were the product manager, building software, building the product? That was your role at the time? Lisa: That’s right. Oscar: And how did that evolve to today, Me2B, which is relatively new, right? Lisa: Yeah. Well, interestingly enough, having this sort of long experience in industry standards and being one of four people on the planet who actually like industry standards work – as far back as 2009, I actually had this idea when I started to define this product, because I had the ecosystem in mind and I was like, “Well, what we really need is we need a new kind of standard.” We need a standard that measures beyond just bits and bytes and protocols but into the softer aspects, like the user interface and the usability and eventually the ethics of it. So I had this idea – and it was really kind of quaint and naïve – that all we needed to do in order to really stimulate this market for more ethical technology was build a standard. Build a standard. Build a specification and start certifying and voila, we will have what we need. It has been much more difficult than that. As I mentioned, I started thinking about this in 2009 literally. I have slide decks with a precursor to this idea and then it had just been percolating, percolating, percolating while I was working on the product. Fast forward to 2018, I went to IIW – you know, the Internet Identity Workshop – in October and I had at that point a fully fleshed-out idea of the organisation. I pitched it there because it’s an unconference and I ran a couple of sessions and got a lot of support and then went back to the foundation where I work and said, “So, I’ve done this thing. I’ve created this organisation. I’m going to do it no matter what. So I just wanted you to know.” Luckily they got behind it and they’ve been supporting us kind of as seed investors into the Me2B Alliance and we started in earnest in 2019. Oscar: OK. 
So the idea was in your mind and of course you were just iterating with the work you were doing through the years until this unconference, right? At the Internet Identity Workshop. So you showed it and you got a lot of support and I guess some people who had similar ideas were able to also tell you and support you. So what is it today – how would you define Me2B now? Lisa: Yeah. So let me kind of separate the semantics or syntax of this a little bit. So Me2B is really a qualifier. It’s really a shorthand that encompasses an entire ethos and sort of ethical point of view, and it’s not really different from other ethical points of view like VRM – you know, Doc Searls’ VRM – and MyData, and we’ve listed all of the organisations that are working on better technology in some form or fashion and there are hundreds of them. Hundreds of organisations around the world, and we all sort of have the spiritual alignment that we want to make technology better for humans. But the real heart of the ethos that we’ve come up with in the Me2B Alliance – one of the first things we had to do was really figure out an ethical foundation. So in 2019 at the IIW in April, we ran a bunch of sessions there. We ran a session on sort of a new social contract for the internet. We had some really great minds in that session talking about the possible ethical frameworks. We looked at moral foundation theory. We looked at a lot of different things and after kind of a lot of synthesis and just kind of sitting with all this information, where we’ve ended up is that – you can’t see me but I’m going to pick up my phone now and say this is not a tool. This is not a tool. These things that we use, they’re not tools. We think they’re tools. We think like they’re tools. But they’re not. They’re relationships. They’re becoming more humanesque. We’re talking to them. We’re gesturing to them. They’re responsive – personalised responsivity. It’s more like a two-way human relationship. 
So if we hold technology to that level and say, “Ah, this is a relationship,” well, at a minimum, it should treat me like a polite stranger. At a maximum, it should be a trusted good friend. And if we go even a little bit further down this path and dig into human psychology, interpersonal psychology, we see that there are norms. There are actual scientific and sociocultural norms for how to create a healthy human relationship and what its attributes are. So what we did is we took those attributes and we sort of distilled them out to a list of what we call the rules of engagement and we are quite literally measuring those attributes in connected technology. Right now we are measuring mobile apps and websites. But ultimately anything that can be connected and that has a user interface, a user experience, we can test. Oscar: And a relationship between a person and a company or a business, that’s the type of relationship? Lisa: Yeah. Well, that’s the tricky part and that’s the tricky part of the acronym. If you go on our website, you will see some educational tools. We’re really starting to hone in on a very clear vocabulary because I will say this. We do not have language to adequately describe what’s happening to us in the digital world. And we have had to create the language so that we can actually measure things, because you can’t measure when the words are too broad and not nuanced and specific enough. So Me2B – you have to sort of suspend disbelief a little bit. The Me2B relationship is actually a collection of relationships and some are with the B. So the legal side of it is with the B, the business, right? You sign a contract with the business behind the product. You have an experiential relationship with the product itself. We call that Me2P, me to product. There are technology enablers that are necessarily along for the ride and we call those Me2T, and then for all of those Me2P and Me2T relationships, there are integrations. 
We no longer write pure software, right? We integrate. It’s an integration activity almost more than anything else, right? I’ve been in software since punch cards. So for me, it’s a substantial evolution of how things have changed. So in those Me2P and Me2T layers, there are also B2B relationships that are the integrations. Those are the invisible sort of – I call them strange bedfellows. When I say a Me2B relationship, there’s really a lot. It’s like an onion with a lot of layers. It’s Me2B in the sense that the B is responsible for the behaviour of the product. It’s responsible for the legal terms of using the product. It’s responsible for those integrations, in a data controller sense perhaps. But it is that whole network. It’s a whole network of relationships. So it’s a little fuzzy, which is why we had to get deeper and more specific. Oscar: Right, right. Yeah, yeah. And many relationships, as you said, and of different types. OK. Tell us a bit about the organisation as well. What are you doing? Lisa: Yeah. So because I come out of standards and because I love standards, we are a kind of standards development organisation. We are organised like a standards development organisation. We just opened up Me and B membership. Last year we were in soft launch with membership. One of the critical points of the ethos that I didn’t mention is the underlying belief that these healthy Me2B relationships – all of them, in all the layers – are better not just for Mes but also for Bs. So that’s a critical part of our ethos really, and so we feel it’s really crucial that, you know, we’re a finger on the scale on the side of Mes because of the power asymmetry right now with surveillance capitalism. You know, that whole dynamic. But we can’t solve this without Bs. So we really feel like this is a sort of yin and yang kind of relationship. We need both Mes and Bs. We need the users of technology, the people using technology, as well as the makers of technology. 
So a standards organisation that is really bringing in both, and perhaps with a little more focus on the Me side. So we’ve got four working groups. We’ve got one working group working on the certification criteria. We’ve got a Mes working group that is working on educational support for Mes. We’ve got a Bs working group working on educational support for Bs and then we’ve got a policy and legal working group, which is working on sort of educational materials and policy work, mostly in the US, because, as I think you well know, we are somewhat behind in the US in terms of our policy and this year looks to be a very big year. We have some very strong ideas. We would like to see regulation or legislation that comes up. We really want to start educating people to kind of look at it through our lens. We think our lens is really powerful. It has really held up. We’ve started testing products and it’s really holding up. I think a big problem with this space is that a lot of us viscerally know that technology isn’t treating us right, but translating that into a practical ethic has been very, very hard. So this framework that we’ve got is really holding up well, and that’s how the organisation is structured. In the US, we’re a non-profit, a 501(c)(3) organisation. Oscar: OK. Excellent. Already very active with several working groups. Also, to understand this vision, let’s try to see it in products that already exist. I’m not sure if you have already certified products, but let’s say products that are like tools or products that exist today and that people could use. Could you mention a few pieces of software or technology that somehow embrace the Me2B vision today? Lisa: Yeah. We haven’t officially certified or published any testing results, so this is all untested – just my sort of sensibilities about what tools I like to use, what relationships I like to build out. 
So really it’s the people who are building at least pretty privacy-aware tools and technologies. The browser is a very important animal in the ecosystem, right? It has access to a lot of information about us and, frankly, the browser is one of the most intimate relationships we have, whether we know it or not. So I think browsers are really important to scrutinise and make sure that they’re treating you like a polite stranger or a good and trusted friend. So the two browsers – actually, there are three that I feel OK about: the Brave browser, Mozilla’s Firefox browser and also – I don’t use Apple products but I know Apple Safari is doing a lot of great things and really proactively setting privacy policies. They’re doing respectful defaults, I think, in a good way. And Apple in general is on the right path with their privacy nutrition label in their app store. They’re on the right path with that. You know, so it’s the companies that are really marketing around and positioning on privacy. There are other categories too. Like there’s Digi.me and Meeco, which are ecosystem enablers, so that people can be in charge of how their data is getting shared across other apps. There are other things like browser extensions. There are a lot of really good browser extensions. I think a lot of people are pretty aware of those, with the likes of Ghostery and the EFF’s Privacy Badger. Then Terms of Service; Didn’t Read, that’s another good extension to help us stay more aware of what’s going on. I will say that one of the primary objectives of the certification mark itself that we’re developing, and the testing that we’re developing, is really about shining a spotlight on things. Like shining a light into the dark corners and letting people know what’s really happening. So for a lot of these tools, there are two facets. They’re either shining a light on the dark corner, like Terms of Service; Didn’t Read. 
They’re shining a light on the terms of service, or they’re giving us more control over our destiny, like the Apple default settings, which are more respectful. And eventually – I do want to say this point – by creating a standard, you know, 5, 10 years from now, it’s my aspiration that this respectful technology is the norm and not the exception. Oscar: Exactly. Today it’s the exception, as you said. Yeah. Thanks for sharing these. So that helps illustrate what technology or companies can do – doing technology the right way – and which ones you can use. I have to try a couple of those I have not tried, so yeah, I will definitely try them. What kind of software that we really need to fulfil the vision is nobody, absolutely nobody, offering today? Lisa: Well, the thing that I work a lot on in IEEE – it’s P7012 – that standard is machine-readable personal privacy terms. So when we think about signing up for a new relationship, right? The first thing that happens – we call this the “Me2B marriage”. The marriage is creating credentials. Here’s the tie-in to identity, by the way. So the Me2B marriage is when you say, “You know what? I want to be remembered, recognised and responded to by this service, by this product.” Those are Joe Andrieu’s three functional characteristics of identity. I want to create the credentials. I want to get married, and then the marriage certificate is the terms of service, right? And those come from the vendor and they’re designed to really keep the vendor safe, right? It’s a legal instrument that’s designed to keep the vendor safe. So in P7012, we’re defining the flipside of that, right? What if individuals could offer their own set of permissions? It’s like: that’s great. That’s great that you want all that. Here’s what I’m allowing. Here’s what I’m granting. So you can think of this almost like a reverse EULA or a reverse terms of service. 
But it’s a way, it’s a mechanism to actually assert your own preferences and permissions – legally binding preferences and permissions. So what we need then is a tool that can actually do that – like a software agent, my Privacy Pal or whatever, my little agent, my trusted agent that has a duty of loyalty to me. That’s important. Most companies don’t have a duty of loyalty to the end user, the individual. They have a duty of loyalty to the company. So we do need a new kind of software agent that could, for example – you know, we think about decentralised identity or bring-your-own-identity credentials – bring my own identity credentials and bring my own terms. That’s really crucial. That’s maybe the most crucial thing for really having an impact on the power asymmetry between Mes and Bs. Oscar: So that would be a service, an agent – so someone would be running that. Lisa: Yeah. Somebody, something, some kind of entity that can have a duty of loyalty – and not just a duty of care but a duty of loyalty, meaning like a real estate agent or a financial agent. I’m not familiar with all the global regulation around that, but at least in the US those entities usually carry a duty of loyalty. So we really need something like that. We really need something like that. The JLINC protocol and what they’re doing, and even Digi.me – those ecosystem types of things – and probably Solid, where you can at least be in control of your information, it’s kind of like that. So Digi.me has their own overarching developer rules, right? So if you’re going to develop a Digi.me app in that ecosystem, you’ve got to abide by the general, human-respecting privacy terms to even build an app in that space. So those are kind of like that. The thing is, you have to opt into an ecosystem, and I’m a fierce independent in a lot of ways. So I say, “Why should I have to join something to be treated right?” – and those are great products. 
I want to also say that they’re great and they’re absolutely necessary. But I – again looking further down the road, I would love to just always be treated right and not have to join something. So I would like to be able to assert my own permissions wherever I go. Not just within like a specific ecosystem. Oscar: Yeah, exactly, because there’s a lot of tools that a lot of people use and you are either in or out of that in many cases. Take it or leave it, and mostly who decides that is the big companies, big tech. So what will happen with big tech in this vision? How will we make big tech join this vision we are talking about today? Lisa: Well, I think they will join because we will, fingers crossed, we will succeed and we will shed a lot of light on this and we will get a lot of education and awareness with people. I have this in my mind, and also on our website we have this diagram of like, you know, you start with a certification which is really awareness. It’s an awareness-building tool and people then start to become aware that oh, these things aren’t really treating me right, but some are. So then they start to demand. You know, they start to choose and that hopefully drives more demand and more choice until eventually we have a lot more choice and I think once that awareness and once that dissatisfaction gets to be very wide scale, there will be a lack of tolerance with certain behaviours, as there should be. In some sense, I feel like technology is hiding behind the inherent opacity of it. We don’t understand it. We mere mortals don’t understand what’s happening under the hood of technology. But once we can start to understand that, which hopefully our educational tools and certification and testing will illuminate, people will get more vocal and demand better. Oscar: Yeah, it’s true what you said. Technology goes so fast, especially the ones who are leading at the forefront and the ones who have the most popular services and applications, etc.
So it’s difficult for everybody else to catch up to really understand what is behind these tools, this technology. Lisa: Yeah. I want to add one more thought to this. We’re just on time with this. I think about the arc of technologies. A new technology’s introduced; then we understand the potential harms; then we regulate and then we get mature, right? So we standardise, and the order may shift a little bit. But one of the ways we describe our work, and I should have mentioned this earlier when I was describing the Me2B Alliance, we used to call ourselves something like the organic food label. But that’s actually not right. We’re more like independent automobile crash testing. There are many of these organisations, right? And they didn’t happen until the automobile was out and manufactured and had some sense of scale, I think. I’m not entirely sure but I think that’s how the arc of time went. That’s very much what we’re doing – our testing is looking for the risks and potential harms to people, and we won’t be the only one. It’s too big. There are lots of different kinds of risk. We’re in a whole new territory, right? Look at what’s happening with freedom of speech in the US right now and the confusion around this and the responsibility of the technology platform. There are certain things we are not testing, like deep fakes. There are organisations that are becoming expert in that and we’re hoping that there will be lots of other organisations testing other things. Like we’re down in the plumbing right now. You think about the content and the harms of the content potentially and that’s way up the stack for us. So yeah, we think there will be lots of organisations doing this kind of technology crash testing. Oscar: So independent crash testing for technology. Yeah. I like the simile. So it’s definitely pretty good. Lisa: It didn’t come to me until last year. I have been looking at this for years and years and yeah, it was really hazy.
It has been a refinement exercise. You know, initially we were like, “Oh, Me2B, we’re going to validate the B,” and then I was like, “No, that’s not – not interested in that. It doesn’t scale well either. So I really don’t want to do that.” But yeah, it has been a continuous refinement and then yeah, so crash testing feels good. Oscar: So Me2B Alliance is doing crash testing. And what about the Me2B Alliance – how can people, both individuals and organisations, join forces? So for people who are listening to this and say, “OK, this is a good cause” – how can they join forces or contribute one way or another? Lisa: Yeah. You can join the alliance. We made it very sort of approachable for both Mes and Bs to do so. We’re also – we’re in a soft launch. We are having a little bit of technical difficulties. If you will have some patience, you will see some things, is all I will say. But you can join through our website – www.me2ba.org/membership I believe is the full link. You can join as a Me or a B, or you can just join us. We’re just publishing a public calendar. We’re doing a little bit of refinement on the website. Every other month we have a public call that is kind of just an update on what’s happening. So we’re due to have that the first Monday – I think that looks like February 1st will be our next one – at 8:00 AM Pacific Time, whatever that maps to in your location. We also have a lot of educational materials. There’s a tutorial or two on the website. In our library, we have a lot of the tutorial presentations in the presentations field. The most recent presentation that has been recorded is the one from the W3C credentials working group. So that’s the latest and greatest. Oscar: Well, excellent. A final question is for all business leaders that are listening to us now. What is the one actionable idea that they should write on their agendas today, especially as it’s the beginning of the new year?
Lisa: I think the deep question is to have an honest, clear-eyed look at the things that you make, the things that you build, whether it’s your website or a product, and say with honesty, “Is it treating people respectfully?” and the sort of sister question to that is, “Do I really know all of the integrations and all of the things happening in my software that I’m building?” Again, whether it’s the website or a standalone product of some sort or service. What we’re finding in our testing – we’re in early certification, we’re hoping to launch later this year – but in the testing that we’ve done so far, we’re seeing that our greatest value is that the makers of technology don’t really know what’s happening in some cases with the integrations and what’s happening on websites and apps. Oscar: Yeah, definitely a very good reflection. That’s true because, as you had mentioned earlier, a long time ago you knew exactly who was making the product, the software, you knew exactly what the pieces were, because the same company would do all the pieces. But today it’s just the opposite. We are taking pieces from different open source projects or providers to compose our own software. So a lot of integration, APIs, et cetera. So you don’t know exactly what each company is doing. Lisa: One of the critical things that we’re unearthing in our testing right now – and this relates to identity too, and this was something I raised with the W3C credentials working group – there are two parallel universes when we use technology. There’s the one that we as the user of technology, you know, as the individual side, the Me side, there’s what we experience viscerally. Like oh, you’re asking me to share location. You’re asking me to create an account. I see that. I see what I’m sharing. It’s the known sort of visible world. But the more potentially harmful and risky parallel universe is the invisible world. Largely this is through a lot of the MarTech and AdTech technology.
But there’s this whole invisible layer that’s there that is actually identifying you with specificity globally, and so when we think about identification, there’s this sort of identity and access management at the visible layer and then there’s this invisible layer that is happening unbeknownst to us and sometimes unbeknownst to the companies that have built the software. So it’s that layer that just really needs more understanding and more visibility. Oscar: Oh, thanks a lot Lisa for this conversation. It was super interesting hearing about this Me2B project that you are leading today, doing this fabulous work, and I really hope that everything crystallises as you said. In five to ten years, we will have this level of respectful technology that we should have today. Please tell us how people can find you or the organisation. What are the best ways to find you on the net? Lisa: Yeah. So the website URL is www.me2ba.org and that’s the best way to get connected with us. Oscar: Excellent. Thank you Lisa again and all the best. Lisa: Oscar, thank you so much. It was wonderful catching up with you. Thanks for listening to this episode of Let’s Talk About Digital Identity produced by Ubisecure. Stay up to date with episodes at Ubisecure.com/podcast or join us on Twitter @Ubisecure and use the hashtag #LTADI. Until next time. [End of transcript]",https://www.ubisecure.com/podcast/lisa-levasseur-me2b-alliance/,,Episode,,Ecosystem,,,,,,,,2021-02-03,,,,,,,,,,,,,
|
||
Ubisecure,Ubisecure,,LTADI,,Spherical Cow,,,,,"Making Identity Easy for Everyone - Heather Flanagan, Spherical Cow Consulting","how to explain digital identity to people outside of the identity industry, why is it important for everyone to understand, and what the industry can do to improve the understanding of identity for everyone.","with Heather Flanagan, Principal at Spherical Cow Consulting. In episode 74, Heather Flanagan discusses making identity easy for everyone – how to explain digital identity to people outside of the identity industry, why is it important for everyone to understand, and what the industry can do to improve the understanding of identity for everyone. [Transcript below] “If you talk to any identity professional, they will agree that passwords are one of the biggest, possibly the biggest challenge facing the industry. So how are we solving it?” Heather Flanagan, Principal at Spherical Cow Consulting and choreographer for Identity Flash Mob, comes from a position that the Internet is led by people, powered by words, and inspired by technology. She has been involved in leadership roles with some of the most technical, volunteer-driven organisations on the Internet, including IDPro as Principal Editor, the IETF, the IAB, and the IRTF as RFC Series Editor, ICANN as Technical Writer, and REFEDS as Coordinator, just to name a few. If there is work going on to develop new Internet standards, or discussions around the future of digital identity, she is interested in engaging in that work. Connect with Heather on LinkedIn. We’ll be continuing this conversation on Twitter using #LTADI – join us @Ubisecure! Go to our YouTube to watch the video transcript for this episode. Or subscribe with your favorite app by using the address below Podcast transcript Let’s Talk About Digital Identity, the podcast connecting identity and business. I am your host, Oscar Santolalla. Oscar Santolalla: Hello and thank you for joining us. 
Today, we are going to hear from an expert in identity about – how from the perspective of, let’s say regular people, most of the people, who are not involved in the identity industry, how much they understand the identity, the methods, the technology and everything that we in this industry are building. So, we’re going to talk about how we can make identity easy for everyone. For that, our guest is Heather Flanagan. She is Principal at Spherical Cow Consulting, and Choreographer for Identity Flash Mob. She comes from a position that the Internet is led by people powered by words and inspired by technology. She has been involved in leadership roles with some of the most technical, volunteer-driven organisations on the internet, including IDPro as Principal Editor, the IETF, the IAB as RFC Series Editor, ICANN as Technical Writer just to name a few. Hello, Heather. Heather Flanagan: Hello, Oscar. Oscar: Nice having you. Heather: Thank you. It’s great to be here. Oscar: Excellent. This is going to be super fun talking about how to make identity easy for everyone. Let’s see how our conversation goes. So yeah, let’s get started, let’s talk about digital identity. First, I would like to hear a bit more about yourself, please tell us your journey to this world of identity. Heather: Oh, you know, very few people actually decide that “You know, digital identity, that’s going to be my career.” In my case, I have a liberal arts degree as a history major, and a library science degree for my master’s degree. I mean, I was supposed to be a librarian when I grew up. But as is often the case, once the person falls into tech, everything ends up touching on digital identity. So immediately after university, I ended up working for the public research division of a newspaper that was just starting up an ISP. So, this was the mid ’90s, there weren’t a lot of experienced tech people to hire. And that ISP started hiring people who, you know, are you smart? Are you logical? 
Can you learn from a book? And there, as a sysadmin, I had to worry about creating user accounts and making sure that those users were able to access what they were allowed to on a system and only what they were allowed to on a system. When I left the ISP, I went to work for a large software company where again, the fundamental reason for even having an infrastructure IT team was to make sure that people could access what they needed to across all the different computers. And this kind of pattern of you come in, you’re working on the infrastructure for an organisation, and it all boils down to do people have access to what they need? Is their identity set up properly so that it can do what it has to do online? That pattern just kept repeating for as long as I had any job that included operational responsibilities. Oscar: So, you were supposed to be a librarian, you said. And then you are– in the mid ’90s, the beginning of the commercial internet, helping these ISPs, right? Internet Service Providers, so actually it’s a term that not many people use these days, you know, it’s like ISP is taken for granted, right? It used to be a big thing at that time. Heather: It was. It was. The ’90s was a heck of a time. There were so few people who actually had computer science degrees. I think in the team I worked with, there was me as the librarian. I had Political Science majors, French majors, English majors, Math majors, I think we might have had one Computer Science person and he left early on because he could. And yet the diversity of that team was outstanding. Oscar: Wow! Sounds very nice. And it looked like you just continued that path, making more steps, and you never came back. Yeah, fantastic, fantastic, the projects you’re working on nowadays, and I know that at least in Identity Flash Mob you work a lot, yeah, trying to make identity and other tech concepts understandable, easy to understand for everyone.
So, I’d like to jump in and ask you how do you explain what digital identity is for a young person, someone who just entered college, let’s say. Heather: So, I actually wrote a blog post on that topic on the Identity Flash Mob website called What is digital identity (and why should you care)? It’s interesting if you think about explaining this to a freshman in college, versus say how you would explain it to your mom. Because one of them is born digital, right? They’ve never lived in a world that didn’t have an aspect of them online. And yet, I don’t think I would explain it too differently to either of them if they actually asked. I mean, on the one hand, digital identity is the representation of you online. It’s your online access to your banks and your credit cards. It’s your account to your email provider. It’s your presence on social media. It’s your electronic information with your government. It’s your browsing history. It’s the details of your smartphone. It’s where your computing devices are in physical relation to others. Just like different things make up your personal human identity, brown hair, blue eyes, a lot of different things make up your digital identity. And not everyone has the same characteristics. You may not have a smartphone, or you may not have a social media account, but you almost certainly have some other things that come together to make up what a digital identity is. So, it’s, I think, that kind of explanation, hopefully, bridges all different kinds of demographics, in terms of how techy someone is or isn’t, as the case may be. Oscar: And what have been the faces after you give that answer? Heather: Usually, minds start to explode a little bit because it’s a lot. There’s a field of psychology to try and understand the human mind and that field has been around for a while and it’s really complicated, and people get doctorates in it. Digital identity is actually also really complicated.
As soon as you start pulling it apart to try and understand it, you realise that there’s a lot there to unpack. Unfortunately, we’re still in early days, and unlike the field of psychology, we don’t have quite the mass of knowledge and understanding and research that has gone into it yet to make it something that you can study. There’s very few schools out there that actually have courses on understanding digital identity or managing digital identity or being careful with digital identity. Oscar: Yeah, definitely. And yeah, you know what, it doesn’t happen often to me, at least, that someone, someone who is not in the industry asks me, “OK, what is digital identity?” It’s not that common. But the ones who, for instance, when I give an answer, you have given this answer, it was a really good answer, simple answer, that’s pretty good. So, some people get an idea of that. But beyond that, it’s good to understand what are the consequences, right, of having this reasonable understanding about digital identity for an– yeah, regular person. So, what could be the worst consequences for regular people of not really understanding identity well enough? Heather: Oh, that’s a good question. And I’m glad you qualified it with understanding enough because no one’s going to understand everything, you just have to understand enough. Because computers are absolutely everywhere in governments, in banks, in schools, in businesses, everybody has a digital identity. And if you don’t understand that, you don’t even know that you have to protect it. And if you don’t protect it, then people can impersonate you, to do all sorts of things. That’s something that we call identity theft. There’s a big insurance company in the US called Allstate Insurance. They did a study a few years back and found it took over six months to recover from identity theft. Experian, which is a great big credit agency, said that it can even take years, you know, six months might be optimistic.
So, if you don’t understand that you have a digital identity, that just because you may not have a smartphone or you don’t have a Facebook account, you still do have a presence that you need to take into account and need to think about how to protect. For me, that’s like the biggest single consequence. It’s the lack of awareness that it’s something that you need to protect just like you need to protect your birth certificate or your passport, or the keys to your house. Oscar: Yes, exactly. Yeah. As you put it in concrete and visual examples, yeah, like the key to your house, or your passport – something that you would have in your pocket, very close to you. And if you would lose that, you definitely would feel like “Oh, I’m in big trouble, or I will be in big trouble if that happens.” So, the same way everybody should feel if there is a chance of losing the identity, well, not losing but someone takes it over, there is identity theft. Heather: Yeah, it’s hard to get back. Oscar: Yeah, exactly – six months in some cases, or more as you say, that’s a lot. What can you do in six months without a digital identity, at least for the most relevant identities that you have, such as the government identity, for instance. Also going into the worst scenario, the worst consequences of not understanding well digital identity, but now from the perspective of people who are directly and indirectly building digital services, and this can be people, for instance, working in start-ups, in the government, in healthcare. They are not necessarily the developers, the technical people, but they are working, they are part of the extended team that are building these services. So, what can be the worst consequences for, again, this extended group of people who are working on these services if they don’t understand it well? Heather: So, we just said that people suffer when their digital identity is stolen. But you know, businesses do, too.
I was moderating a webinar just yesterday, where the speaker, Tim Cappalli from Microsoft, said, “Attackers don’t break in, they log in.” That means, like, the majority of attacks against businesses come through attackers just logging in, using stolen account information. If a business doesn’t have a handle on their identity and access management services, and the best practices for the security in that space, they’re going to lose so much money to fraud, and even more money to cyber-attacks. I mean, businesses often think of their IT infrastructure as being this cost centre, because it’s not what’s directly making the money. But wow, messing up that infrastructure can put a company right out of business, or at least put a serious dent into their net worth. Oscar: Yeah, definitely, as you said, data breaches or just fixing things, it can have a huge impact not only financially, but also on, well, reputation, et cetera, et cetera. And how do you see it – if, again, I ask you about the extended group of people who work building these services – how much do you feel that an average project manager, an average designer is aware of digital identity? Heather: Usually not very. I mean, this comes back to it’s not like digital identity is taught as a separate thing, right? In a way, it’s almost when you’re working with computers, it’s almost like air, because it’s just there. Of course, you have to log in, of course, you have to have an account. You hear about having to have strong passwords and all these other things. And so actually understanding that it is more complicated than that, and that there are good reasons for it to be, it’s not as common as it should be. Certainly, otherwise, we wouldn’t be losing all this money to fraud and identity theft. Oscar: Yeah, indeed, indeed. So, there’s a lot of work for us, the ones who are working in the industry, to keep educating, right? Making it easy also to understand, that’s super important, and I know you work very hard on that.
Heather: Oh, the identity industry isn’t making it easy, necessarily. So, there is always that to consider too. Oscar: If– so where do you see the identity industry failing to make itself understood? Heather: Oh, I have a list. So, let’s start with like one of the questions you asked earlier about, OK, how would you define digital identity? I gave you my definition. But if you get 10 identity professionals in a room and ask them, you’re probably going to get at least 15 different answers. I mean, that’s among professionals already in this space. So now, imagine how hard it is for people not in the space to get a handle on what their digital identity is and how to manage it. The terminology used in the industry is imprecise. The standards that we’re developing are incomplete. They’re inconsistently used. And just how we communicate with the rest of the world is a big problem. I’ll give you another example. Passwords. Let’s just talk about passwords. If you talk to any identity professional, they will agree that passwords are one of the biggest, possibly the biggest challenge facing the industry. So how are we solving it? There’s password complexity rules, password reuse rules, passphrase suggestions, password manager guidance, passwordless guidance, multifactor authentication using authenticator apps, multifactor authentication using biometric data, multifactor authentication using SMS codes, all of these options, all this different guidance, how’s the user supposed to understand all that? When they’re going from one site to the next to the next and they’re just being mentally pummelled with all these different options? How are they supposed to know what the best practice is? What are they supposed to be doing to protect their identity if they’re getting such an inconsistent story, just on the most basic aspect of logging in? Every site is implementing things differently.
A regular user doesn’t know if that’s the current best practice or not. And to be fair, practitioners can barely keep up with it themselves because the identity industry at this point, it’s not just sending mixed messages. We’re just sending white noise and static to our users, because it’s just all over the place. That’s a huge failing on our part. Oscar: Ah, yeah, yeah. That will ring in many people’s ears, I believe. Yes, it’s true. So, the way you say it, so if 15 different identity professionals answer the same questions, all of them understand very well the concept, the problems, but yeah, the answer will be different. So that already tells a lot. Yeah, I would like to hear if there are, there must be, there must be good examples of projects or initiatives that have – well, intentionally or unintentionally – been making identity much more understandable for everyone. So, if you can share some success stories about how projects or initiatives have educated people about digital identity. Heather: So yeah, I definitely have some favourites here. Though I want to preface that educating people about digital identity is hard, because often people don’t want it to be hard, so they don’t want to know about how complicated it is. And that, that makes it a bit challenging to offer education. But that said, I love what I see coming out of the Internet Safety Labs, which was previously known as the Me2B Alliance – they’ve turned into an independent software product testing organisation. And that focuses on online safety standards that companies can actually quantitatively measure their software against to see if they’re really offering a safe online experience. I’m also a huge fan of the FIDO Alliance. And the work they’re doing to finally move the world away from passwords in a way that’s very easy for the end user to understand and respond to.
But that said, I see those as success stories that influence the business side of things, on behalf of users, but still, it’s very business oriented, which is great. But I also want to highlight those groups that are actually– they’re actually getting out there on Instagram and TikTok to get useful information about digital identity in front of regular people where they hang out every day. So, I mean, that’s something I’m trying to do through Identity Flash Mob, which is a passion project for me. But another one that I’d point to is the Cyber Security Hub. I find their posts on Instagram hilarious and absolutely spot on. And they’re reaching over 343,000 people on Instagram with those posts. Oscar: OK, so definitely have to check it out. Was it Cyber Security Hub? Heather: Yeah. Oscar: And of course, your project Identity Flash Mob. So, you’re already on TikTok? Heather: I tried to do TikTok, I’m not very good at it. I’m much better at Instagram. Oscar: All right. Each one of course has their own strengths. Yeah, yeah. I’m not on TikTok either. I have to definitely keep following the work you are doing. But tell me a bit more what you’re doing in Identity Flash Mob now that we are at this point? Heather: So what I’m trying to do there is, as you say, make the information about identity and security and standards, basically how technology works, a bit more consumable to people like my mother, or my friend, Laura, Laura Paglione, who’s partnering with me on this, you know, make it something that her daughter would find interesting enough to actually stop and read. And that means getting the information into accurate and consumable, almost sound bites, you know, very, very short pieces, and in front of them where they go. We do write blog posts, because blog posts help me organise my thoughts as to how I want to present an idea to the users. And a lot of the posts offer what can you do in five minutes, in 15 minutes, in 30 minutes to learn more about the space?
And also, the regular posting on Instagram just to get ideas and thoughts out there to people. Oscar: Yeah, excellent. Yeah, I checked your website and definitely it’s super interesting. I read a few of your articles, and excellent job that you are doing there. And yeah, I’ll keep following. Yeah, that’s super interesting for me. Every time I have the responsibility to give a talk or, for instance, write blog posts, I also try, to the best of my ability, to make it very easy to understand and to connect with something that people are familiar with. So, it is super important, as we are seeing from how this conversation is going – really, there’s a lot to do, yeah, on that aspect. And now, I’d like to hear more about your work in those standards organisations. And you have been working as a technical writer, editor, et cetera, also leading projects there. Yeah. Tell me about this, this work you’ve been doing writing the standards. Heather: So, standards development is one of my favourite things. So, a bit about standards, right, is that standards exist to make it so things can actually interoperate. The fact that you can send an email from your mail app, and have it received across the world by someone in India, with a completely different mail app, is actually a miracle of standards, you know, that that functions all the way through. I can’t emphasise enough how important it is to get people involved in those kinds of efforts. And it needs all different kinds of people. I don’t write standards. I’m not that kind of technical. But I’m really good at reading them and offering feedback or helping with the process or helping facilitate the calls. Whatever is needed to help the engineers write what they need to write is how I participate. So, there’s room for lots of different kinds of smart, a lot of different kinds of technical, it’s not just systems engineers and what they can do.
So, I was the publisher – the executive oversight for publishing internet standards – for eight years through the Internet Engineering Task Force, and I got to watch a lot of that work being done and help the groups actually get their words out and published to the world. It was very exciting. And it’s also very hard, because one of the biggest challenges that the whole standards development process has, is that to actually achieve interoperable standards, you have to get people from different backgrounds to help write, review, and test the proposed standards to see if they meet the needs that they are written to address. I mean, as it stands today, even though there’s companies that sort of donate their people’s time, nobody actually has a full-time job saying, I am a standards writer, right? They work for a company, they have other things they have to do, their “day job”. So, it’s always the best effort really, in terms of are the right people in the room? Or is there enough diversity in the room? And so far, there are just huge diversity gaps there. I think most standards organisations have some kind of diversity effort on the books. But it’s hard to attract new people, when you already have a strong culture in place that itself isn’t attractive to diverse participants. It can be done, but it’s pretty slow going. I think I’ve had it easier, in a way, because I embrace the fact that I’m really good at organising people and organising processes. And that was a space that the specification writers themselves didn’t want to do. And so, it was easy to just slot in and say I can help you with this stuff that you don’t want to do. And it made for very good give and take to get things out the door. But I think other folks may find it a little bit more challenging, because the thing I hear constantly is I’m not technical enough to do that. You use a computer, you can turn it on – you can offer: what’s your experience? What did you find easy? What did you find hard?
That right there is useful information for spec writers to know. Oscar: Yeah, I understand the challenge: as you said, most of the people who are writing these standards are squeezing their time to work on those projects. It’s one among other tasks of their day job for most of these people. That is correct. And it’s already challenging. So, you have mentioned the challenge in finding diversity. Can you explain a bit more: what about geographically, and which aspects of diversity are not so well represented right now, let’s say? Heather: All of them. There’s the geographic diversity. I mean, if you look at the demographics of who’s attending something like an IETF meeting, it’s mostly US and Europe, and then China and India, but it quickly drops off, right? They try and address this by moving the meeting around. So, you’ll have North America, then you’ll have Europe, then you’ll have Asia, right, to try and get it at least proximate to the geography of who will actually attend. But that’s just one thing. There’s also gender; tech is notoriously a male-dominated field. There’s also age: one of the bigger concerns is the fact that people who write standards are usually more at the tail end of their career, as opposed to the beginning of their career. And so, it’s a group that’s aging out. And not a lot of young folks come into it at all. Often, their company says, “Well, you’re too junior to participate in that.” which is unfortunate. So really, pretty much every dimension has a lack when it comes to diversity. Oscar: So, you would encourage, as you just said now, for instance, a person who is relatively young, and the boss or someone in the company says, “You don’t have enough experience to do that.” So, we need to challenge that, right? Heather: We do. I think of it this way. 
I mean, the argument I would make is, by the time someone is at the end of their career, coming in with everything that they know, they have very strong opinions, and that’s great, but we’re leaving the junior staff to relearn all our mistakes. Because they’re going to go through this learning process, and they won’t understand the history of why something developed the way it did. And because they don’t understand it, they’re going to reinvent the same problems. And maybe they’ll solve them slightly differently. But wouldn’t it be better if we could jumpstart them and say, “OK, this is what we did. This is why we did it. This is, you know, how we did it.” But condense that so that earlier on in their career they can say, “OK, I understand what was tried before and why it didn’t work. Now let’s really think new things.” That would be a wonderful thing to see happen. Oscar: Indeed. Yeah, I hope people who are listening to this episode have some curiosity, so that someone who is right now listening to this podcast says, “OK, maybe I can try it.” So, whom should they contact in order to try participating in one of these organisations? What are the best ways to get started? Heather: So, the first thing to do, I guess, is to narrow down at least a little bit into, well, what kind of thing are you most interested in? Right? And you could actually just post on Twitter or post on LinkedIn or something like that and just do an outreach saying, “I want to help with the standards process, how do I do it?” And see what kind of answer you get, see what’s open. For organisations like the IETF, they have… oh, goodness, I don’t know how many working groups, we’ll easily say at least 100, possibly more. And you can just look through them and say, “OK, this sounds interesting.” And it’s public knowledge, well, who’s… you know, what kind of documents is that group working on? Read them, and you’ll know who their authors are. 
You’ll have their contact information. You’ll have the contact information of the working group Chairs, and you can reach out to them and say, “I’m interested in helping, what can I do?” And I’m sure that they will be so happy. So happy to hear that there’s interest in that space. Oscar: That’s great, definitely. Once, for a couple of years, I participated in the Kantara Initiative; a colleague suggested I join, and it was super nice, definitely. So yeah, actually, anyone who has not participated in any of these standards organisations, please follow Heather’s advice. Just find something that’s interesting for you and contact them. Because it’s very important for all of us, not only shipping the standards, but that process of making everything more and more understandable. So, a super interesting conversation with you, Heather. Now, a final question for all business leaders who are listening to this conversation: what is the one actionable idea that they should write on their agendas today? Heather: OK. So, I knew you were going to ask this, and I gave it some thought. And what I would love to see happen, and I think would actually be fun: you need to strengthen your identity and access management, you need to strengthen that function in your organisation to the point that your marketing team, your finance team, your management team, your dev teams, all understand what digital identity means in your company, and why it’s so important to protect. And I’m talking about getting creative about it. Not like having an online training course that says, “Make sure that your passwords are 16 characters long and contain the following.” No, no, that’s not what I mean. I want you to encourage study groups. I want you to create a game day, where you have teams that include representatives from every department acting out a response to an incident involving an identity-related security issue. 
So, the marketing person would be tagged to figure out what is the communication message that you have to send. The finance person would have to figure out what the cost of this would be. The development person would have to figure out how to research and close whatever vulnerability was used in the system. And that’s not just going to help your company. It’ll actually help your people protect themselves every day online. And I think it could kind of be fun to do. Oscar: I like it. It sounds definitely fun. So yeah, digital identity can be fun. So, it’s a great idea that you are suggesting to organise a game. So, well, thanks a lot for that. It was super interesting. And again, I commend you for all the work you are doing in Identity Flash Mob and all the standards organisations. So, for people who would like to continue this conversation with you, Heather, or find more about the work you are doing, what are the best ways to find you? Heather: So, you can always find me on LinkedIn, I’m fairly active there. You can do me a huge favour and follow Identity Flash Mob on Instagram, which is always great. Or, you can go to the identityflashmob.com website and just read about what we’re doing and it will give you a few ways to contact us there too. Oscar: Excellent. Again, thanks a lot, Heather. And all the best. Heather: Thank you very much. Thanks for listening to this episode of Let’s Talk About Digital Identity produced by Ubisecure. Stay up to date with episodes at Ubisecure.com/podcast or join us on Twitter @Ubisecure and use the #LTADI. Until next time. [End of transcript]",https://www.ubisecure.com/podcast/making-identity-easy-heather-flanagan/,,Episode,,Ecosystem,,,,,,,,2022-08-31,,,,,,,,,,,,,
Ubisecure,Ubisecure,,LTADI,,Kantara,,,,,"Meet Kantara’s new Executive Director, Kay Chopard",Let’s Talk About Digital Identity: Kay explores why identity is so critical in so many applications; her hope for more promotion of Kantara’s great work and to advance opportunities for collaboration; Kantara’s new mobile drivers licenses (mDLs) work group; Women in Identity and the problem of lack of diversity in standards working groups; and why access and inclusion is one of the biggest challenges facing identity today.,"with Kay Chopard, Executive Director at Kantara Initiative. In this first episode of series 3, we put your burning questions to Kantara’s newly appointed Executive Director, Kay Chopard. Kay explores why identity is so critical in so many applications; her hope for more promotion of Kantara’s great work and to advance opportunities for collaboration; Kantara’s new mobile drivers licenses (mDLs) work group; Women in Identity and the problem of lack of diversity in standards working groups; and why access and inclusion is one of the biggest challenges facing identity today. [Scroll down for transcript] “Digital identity is going to be one of the most critical issues going forward, for the world.” Kay Chopard is the newly appointed Executive Director of the Kantara Initiative, a non-profit corporation. She is the former President and CEO of Chopard Consulting based in the Washington, DC metro area and is the founder of the Women’s Leadership Institute. Kay has more than 30 years’ experience in executive leadership in government, non-profit, and business organisations, with leadership positions in several organisations including: Identity Ecosystem Steering Group (IDESG), National District Attorneys Association (NDAA), National Criminal Justice Association (NCJA) and the National Highway Traffic Safety Administration (NHTSA). She is an attorney and has served as a prosecutor and maintained a private practice. Ms. 
Chopard also serves on the Board of Directors of Women in Identity US and volunteers in the leadership of the Women in Identity UK. Find Kay on Twitter @KayChopardCohen and on LinkedIn. The Kantara Initiative is a unique global ‘commons’ that operates conformity assessment, assurance and grant of Trust Marks against de-jure standards under its Trust Framework programme, while at the same time nurturing ‘beyond-the-state-of-the-art’ ideas and developing specifications to transform the state of digital identity and Personal data agency domains. Find out more about Kantara at kantarainitiative.org. We’ll be continuing this conversation on Twitter using #LTADI – join us @Ubisecure! Go to our YouTube to watch the video transcript for this episode. Or subscribe with your favourite podcast app. Podcast transcript: Let’s Talk About Digital Identity, the podcast connecting identity and business. I am your host, Oscar Santolalla. Oscar Santolalla: Hello and thanks for joining today. We are back after our summer break in 2021. We are coming back with amazing conversations, episodes talking about digital identity from many aspects. And now we have the great pleasure to start this new third season with a person who is the leader of an organisation in the identity industry that is very close to my heart. So let’s introduce her. Mrs. Kay Chopard is the newly appointed Executive Director of the Kantara Initiative, a non-profit organisation. The Kantara Initiative is a unique global ‘commons’ that operates conformity assessment, assurance and grant of Trust Marks against de-jure standards under its Trust Framework programme, while at the same time nurturing beyond the state-of-the-art ideas and developing specifications to transform the state of digital identity and Personal data agency domains. Kay has more than 30 years’ experience in executive leadership in government, non-profit, and business organisations in the DC area. 
She has led several organisations but in identity especially, I would like to mention she was Executive Director of the Identity Ecosystem Steering Group, IDESG, a non-profit organisation developed in a public-private partnership to implement the national strategy for trusted identities in cyberspace, in partnership with the National Institute of Standards and Technology, NIST. She is an attorney and has served as a prosecutor and maintains a private practice. Kay also serves on the Board of Directors of the Women in Identity US and volunteers in the leadership of the Women in Identity UK. She lectures internationally and has authored several articles and white papers on racial and gender diversity, equity and inclusion, as well as a variety of criminal justice issues, including use of technology in the courts, and legal policy around privacy and security in the use of court technology. Hello, Kay. Kay Chopard: Hello. Oscar: Welcome. It’s really nice talking with you, very new Executive Director of Kantara Initiative. And definitely not only myself, many, many of us who are listening to this and our colleagues, many people I know, want to hear, want to know more about you. Kay: Thank you, Oscar. I really appreciate the opportunity to talk with you and to talk with all your listeners. I’ve just begun and I’m excited to be a part of Kantara. Oscar: Fantastic. The first question came from myself. Given your background in law, what attracted you to digital identity? Kay: Gosh, that’s a really good question Oscar. Although my background is in law, I’ve had a lot of experience working around national policy and international policy on a variety of topics. And what I find intriguing about digital identity, a couple of things. One, I think it is one of the most if not the most rapidly advancing technology. It’s becoming more and more clear I think that digital identity impacts every type of industry, every vertical public as well as private. 
In terms of any solution and any interaction that consumers have, it just becomes so critical. So that was intriguing as I began to see that flourish. I think that years ago, people called it something else, and over time, we realised that digital identity is really going to be one of the most critical issues going forward for the world. And then secondly, with that comes the opportunity to help form policy that makes a difference in the lives of individual people. And when I say that, it’s because digital identity requires a balancing of privacy, privacy of rights, privacy of data and security, as well as enabling business to get done, enabling people to take control of their lives, to be able to do a variety of things. And so, it just was such a critical issue and had the ability to impact everyone both positively and negatively so the opportunity for leadership in this area, I think, is very big and very important. Oscar: Excellent. And now that you started recently as Executive Director of Kantara Initiative, I would like to hear also what are your main goals having now this position, and if the mission will remain the same for Kantara? Kay: That’s also really good. And I want people to know, I believe that the mission of Kantara, which is to improve trustworthy use of identity and Personal data through innovation and standardisation and good practice, that mission is not going to change. That’s an important mission and we need to stay focused on that. What I’m hoping to do for Kantara is because the organisation does a lot of good work, I really want to elevate that. I’m not sure that it’s always clear what our message is, or we haven’t really been good about bragging about our own work. I think that the workgroups and our assurance programme are really stellar programmes. But I have heard as I’ve talked to others, in some ways, it’s kind of the best kept secret. 
So I really hope to elevate our profile to make sure that people are aware of what we have to offer, of the expertise that we have. And also, I very much believe in collaboration and working with others. And I really want to make sure that we continue to work collaboratively, as there are more and more organisations in this space. But also, because this is something that’s so critical in so many different places, in healthcare, in government, in people accessing benefits, or being able to conduct business. There are just so many different places. And I want to make sure that we’re at the table and participating and that we’re offering the expertise that we have to really bring to bear good privacy practice and good standardisation of data and doing all that we can to contribute to making the experience for users and the work of corporations and small businesses and government agencies more seamless, and yet very secure and very robust. So the mission will definitely stay the same. Oscar: Yeah, I agree with you doing, it’s interesting you tell of course, making more known the big work, the mission work that Kantara is doing. And you mentioned bragging, yes, so brag more about what we do, what we’ve been doing. Of course, that’s good, because how we’ll reach more people, more people will be able to, as you said, collaborate and collaborate with other organisations and people who are not, for instance, I’ve never been working with any standard organisation until, because of working at Ubisecure, I was suggested to join Kantara, join this working groups. And yeah, it’s not long ago. So before that, I didn’t know. I haven’t heard too much of such organisations. So it’s – I agree with you that it’s important to keep the message being shared and amplified. Kay: Yes. And I think your experience is common. And it’s one of those once you get involved, you can really appreciate the good work and why this is a really helpful organisation. 
So thank you, I appreciate you sharing your experience. Oscar: It’s my pleasure. You mentioned, of course, collaboration with other standards organisations; the standards are a big component in Kantara. I personally have worked in the Consent and Information Sharing work group, now the interoperability group. Particularly with Consent Receipt, I worked with that. Another big standard is User-Managed Access, UMA, and you also have the Trust Mark programme. Those have been, depending who you ask, some of the main big projects on standards going on. So what is next? Is there anything that is coming from Kantara? Kay: Yeah, I think, well, one of the things that I think is coming, we issued a report about mobile drivers licenses and privacy, and really tried to make some recommendations about best practices. And so we’re standing up another workgroup that is going to focus specifically on mobile drivers licensing and working towards standards for that, because it has the ability to do good things, but it could also be abused. So it’s important to make sure that while the security is there for that kind of technology, we also make sure to protect privacy rights and hopefully use it in a way that’s beneficial both to the person whose mobile drivers license it is and to those who want to be able to use it for identity. So that’s our next big thing, I think. Oscar: OK. Yeah, interesting. I have not heard about that, so it’s interesting, so mobile drivers licenses, OK. So your position, or Kantara’s position, on mobile drivers licenses is of course positive, otherwise it would not start a working group. What else could you say about mobile drivers licenses? Kay: I think that one of the areas, obviously, that Kantara is focused on is privacy rights, and protecting privacy, and being able to make sure that digital identities are trustworthy, that what’s happening with Personal data is used appropriately. So I think that that will be one of the focuses of this workgroup. 
And as we’re trying to look at going forward, as we see more and more use of this type of digital identity, what are the standards we can put in place to assure that we are using it correctly and protecting all those privacy rights as well. So that said, I do not claim to be an expert on this topic, but we have folks who are coming together who really have a lot of expertise. And I’m looking forward to seeing what kinds of things come out of their discussions and their work. And I’m hopeful that it will be very beneficial to the field as a whole. Oscar: Sounds super interesting this new working group project, mobile drivers license. I also was curious to hear now that we are, depends how we count it, but around one and a half year into this pandemic, so do you feel that pandemic has somehow changed Kantara’s plans? Kay: I think the pandemic has affected everyone. It has in some ways changed how we had to do business. Just in terms of conferences are now virtual much more often and many of the meetings that we would have with folks have to be virtual. So you automatically have a little bit different way of doing business. But that said, if anything, I think that the pandemic highlighted how the mission of Kantara is even more critical. It really made it more pronounced about the importance of identity and digital identity. But also, I think it also illustrated how there are certain populations around the world, most of them perhaps disadvantaged populations, marginalised populations, where digital identity during the pandemic was important for them to be able to conduct business, get services, whatever it might be, and yet it was very problematic often. So it really highlighted perhaps some of the places where there were weaknesses, where the systems needed to be strengthened. I think it highlighted the importance of protecting privacy. 
So it sort of took all of the issues that already existed, and further highlighted and enhanced them so that we’re all in this industry looking at it and realising, “Wow, we knew this was an issue. Now we see even more why it’s so important for us to address that.” So in that sense, yeah, business was maybe done a little bit differently. But I think really what the pandemic did was help us get more focus and insight into things that we recognised were probably issues but hadn’t fully delved into. And I don’t think it really changes our plans on the long run except that it helps us to be a little bit more fine-tuned in what our focus is. Oscar: You are one of the leaders of another very superb organisation called Women in Identity. You are part of the leadership team. So tell us a bit about the work you do. Do you think that there is enough diverse representation in standards working groups? Kay: Those are kind of two different topics. But yes, well, number one, I don’t think there’s enough diversity in standards working groups. But before I talk about that, let me just tell you a little bit about my role at Women in Identity. I think you mentioned in the beginning, I’m based in Washington, DC, and the Women in Identity really started out of the UK in many ways. And many of the leaders are still there and based in European countries. But in the US, I became the US ambassador to try to encourage members of the identity industry to participate. And I think one of the most important things to know is that the Women in Identity organisation is very focused on diversity across the board, not just gender. And I think one of the things that I have learned even more than, and I kind of have this sense but I have realised over time, that it is really critical that we recognise we’re sort of all in this together, right? So the members of Women in Identity are also very diverse. 
So that means that there are women and men, there are people of colour, there are people who are disabled, there are people who are from the LGBTQ+ community. So there are all different kinds of folks. And I really think that’s what makes the organisation stronger. They sort of walk the walk, and talk the talk; they don’t just talk, they really have tried to implement that in how they run the organisation. Of course, it’s fully volunteer, so almost everything, I mean, what I do for them, and everyone else there, are all volunteers who give up their time. And it’s really because we believe in the importance of diversity in the digital identity space. And I think that one of the areas that is weaker is diverse representation in the standards work groups. And that’s one of the things actually that is intriguing about Kantara, because our work groups obviously are working on standards; they’re really working on things that they then hand off, that have the ability to go to standards organisations, and that’s where it’s really important to have diversity. That said, that’s not an easy thing I think to make happen. And I think there are a lot of reasons for that. I mean, I think that’s the other nice thing about Women in Identity, and why I’m passionate about diversity and inclusion: it is truly not about blaming anyone or saying this didn’t work, or that didn’t work. You saw in my bio, I’ve been the CEO of a variety of organisations, primarily non-profits, and I had my own company. But many times I tried, as the CEO, to make sure that I was hiring diverse groups, and you know, all of these kinds of things. And I tried, and some things didn’t work, they crashed and burned. It seemed like a good idea. It did not have the effect I wanted, which is that I wanted to have a diverse workforce. 
So I think one of the good things that I’ve learned and that I think the organisation pushes is that this is not a checklist and this isn’t just that you do a couple things and it’ll all get itself fixed. It’s not. And some of that is the culture that affects everyone – men, women, the whole nine yards. And I think that the standards working groups, in some ways, have a little bit of their own culture. So I’m really hoping that we can work together to find ways to have that diverse representation. Because what happens is, the products that get developed using those standards, they have to be available for a diversity of users, right? They have to work for everyone. And so a lot of times, if we could start on the frontend with the standards that we put together, and we make sure that we’re developing them in a diverse and inclusive way, then it’s more likely that what happens at the end, when these standards are deployed to technology, to digital identity, there’ll be fewer problems, right? So I think one of the things that Kantara does well is really trying to look at the future state, right? Really trying to focus on what’s coming down the road, and thinking about those things, and how to begin to address them, and getting ahead of the curve, if you will, which is not always an easy thing. And I find – it’s fascinating to me that Kantara is – that’s their focus and that has been their focus since inception. And it’s important to have people looking at that. And I’m hopeful that as we begin to embrace some of these other things, in terms of diversity, we’re going to see a difference in the standards that are developed but then what we see in the products that are available, right, for users. That was maybe a convoluted answer, sorry. But I think it’s a good question to ask, I think you’re right on the money. I’m kind of excited because this is sort of right in Kantara’s sweet spot. 
So I’m really looking forward to working with the members of the organisation to see you know, what kinds of new and even better things we can be doing. Oscar: Yeah, I’m sure absolutely that particularly in Kantara, being you the leader and believing so much in diversity, there’s going to be results. Definitely not in, as you say, not in the short term, but because it’s about the culture, things have to change gradually. But yeah, it’s – I’m sure, we will get the result there with your leadership. And of course, you know, in other organisations as well, this is very important that is standard organisation also in companies. What would you say is today the biggest challenge that digital identity is facing? Kay: I knew you were going to ask me something like this. And I think there are a couple of big challenges, which I’ve kind of talked about, in some ways already. But I think privacy and protecting privacy, and what that looks like in digital identity continues to be a big challenge. I think the pandemic only highlighted that more. And what I’m seeing is there are a lot more organisations now, including lots of other non-profits who are working in the digital identity space, which is good. And some of them are focused on specific industry verticals, which is also good. But I think that there are different ideas, thoughts, approaches to how do we protect the privacy of individuals and their digital identity. Not only how do we protect, but how do we give them some control over the data that they’re actually sharing. And to me that also goes to privacy in many ways. So I think that that is going to continue, and probably even more so going forward. The other piece that I think is another challenge, and I’m assuming people are recognising this. But I think as I mentioned a little bit earlier, the pandemic has highlighted how disadvantaged or marginalised populations really struggle with having digital identity. 
And in some countries, we’re seeing where some of those disadvantaged populations, which tend to be more diverse, don’t have access to being able to have a digital identity and that cuts them out of a variety of things that they just can’t access and can’t use because they have to have that digital identity. So in the US, we see this where folks who most need to access services, to access the courts, to be able to conduct business have the most struggle with it. And I think it’s really highlighted that digital divide, if you will, that separation. And it’s just really emphasised those below a certain socio-economic level, for example, it’s an uphill battle for them to be able to use digital identity and yet the pandemic forced all kinds of industries, all kinds of government entities, in all kinds of ways that happened quickly because of the pandemic. And consequently, many of those from lower socio-economic status, or whatever it might be, it really, in some ways made life harder for them. And I hope that we recognise that as an important challenge, right? It shouldn’t be that way. We should find ways to make that better. And I think for those in the industry, in some ways that’s sort of an ethical obligation we have. I mean, we can talk about privacy and security. We can talk about data privacy, all these different things. But I think we have to recognise that the work we do impacts real people. And because it has a disparate impact, it’s on us to find a way to make this right, in my opinion. I think that’s a big challenge. I don’t think there are easy answers. I think we’re going to grapple with this for a while. And I think people have found ways to do better. But I think we have to keep striving for that. It’s just important. You know, it can’t be that digital identity is for the haves and not the have nots. We have to find a way to do it better. And I just think that’s very challenging, because there aren’t any easy answers. 
But as I said, I think all the things that we’ve talked about are still big challenges, right? And I think the whole privacy, and the trustworthiness of data, standardisation and good practices, those are all still important challenges. I don’t want to take away from those. But I think that recent experiences just highlighted some things that probably were not nearly as evident until we’ve had the recent experiences. And who would have thought this could ever happen? I mean, I didn’t think it would. So it’s sort of amazing. I think what we have to look at this as it’s an opportunity. It’s an opportunity for us to do better. It’s an opportunity for us to do more good things. One of the things I like about the digital identity industry space is, and this probably sounds very altruistic, but we have the ability to change the world. We have the ability to make it a better place. And how exciting to be part of an industry that really can make a difference in every person’s life around the globe. That’s amazing. That’s phenomenal. So I’m excited to be a part of that. Oscar: Of course, of course. I like how you say this the altruistic way. I think this helps in that everyone who is one way or another in the digital industry has to embrace that. And how to solve these problems, the two that you mentioned, the privacy, and especially the second one that is giving digital identity to everyone, which is even I think has been more challenging. Thank you a lot for that, Kay. One final question that I have in this Q&A special episode of Let’s Talk About Digital Identity, we ask for one final idea for people who are listening to this, especially for business leaders, business minded people who are listening to us, what is the one actionable idea that they should write on their agendas today? 
Kay: Well, in my opinion, many of the things that we’ve talked about, as I’ve mentioned some of these things are cultural, some of these things are things that we don’t even recognise, and it sort of permeates all of our work. So the one thing that I would say is for every business leader, and for anyone who listens to your podcast, start with yourself. I think you start by learning about what is all this diversity and inclusion. It’s a lot of talk, you see it everywhere. People are trying a whole bunch of different things. I think in some ways, what you do is you start with yourself. There are a lot of things to be learned. You know, read a short article, there are books on this, you can read books. I’m happy to give you my reading list. There are lots of things where you take the time to read something for yourself, where you learn something. And then you start to look in your own work life, right? So you start to look at when you’re participating on a product team, let’s say, pay attention to some of the dynamics. Who is part of that team? And who is clearly not there? When you have your meetings, who does all the talking? Who gets interrupted as you’re talking? Who seems to have their ideas co-opted? How are members of the group treated and how are their ideas treated? And some of that is just making observations, paying attention in ways that maybe you hadn’t before. And I think the more that you can educate yourself, the more that you can learn, the more that you take a critical look at how business gets done in your organisation. That’s how we begin to effect change. And I think it makes us sensitive to some of the big challenges and issues that I mentioned before. But I think it starts with you. So that would be my challenge for the business leaders listening that they put that on their agenda. 
It doesn’t require a big lift, but it requires being intentional, and deliberately educating yourself, deliberately paying attention to some things that, because our work lives are so incredibly busy, sometimes we overlook. So I encourage people to take a step back and pay attention and learn from what they see to maybe make changes in their own lives and their own organisations. Because as I said, I think we have the ability to change the world. I really do. And it starts with you. So I hope that folks will do things like that. Oscar: It’s a really good one. Thanks for that. Start with yourself and learn more and then put all this into action to make changes, to impact the industry, to impact everybody who needs our industry. Thanks a lot, Kay. It was a fantastic conversation with you. Thank you for answering all these questions that we compiled from the audience and other people. And please let us know, for people who would like to learn more about you or get in touch with you, what are the best ways? Kay: Thank you so much for asking that. So the Kantara website has a lot of information, much more detailed than what I gave you. It talks about all the workgroups, like you talked about your participation, Oscar, that’s an excellent group, and there are some others as well, we would love to have more people involved. The website is kantarainitiative.org. We also have a LinkedIn page, we often post things about our work, some of the things that we’re doing. We’re on Twitter. So I hope you’ll follow us. As well, I have a LinkedIn page and a Twitter account. And as you can imagine, there are not a lot of Kay Chopards, so I think if you look for that, you’ll probably find me. And certainly, if people want to reach out to me directly, you’re welcome to do that. 
My email is Kay, [email protected] But I would really encourage people to take a look at the website, because I think, as I kind of mentioned early on, in some ways we’re the best kept secret, there’s a lot of really good things happening. And of course, our assurance programme is, I think, very impressive, it’s really well done. And I think it really provides an amazing service to the industry, to companies who want to be able to do certain types of work. And we have really good people who work in that programme. It’s really kind of the crown jewel of the organisation, in my opinion. And there’s information about that and what we have to offer. And I should also point out that the programme takes a variety of steps to ensure confidentiality, so companies don’t need to worry that, in going through the assurance process, they are having to reveal any type of proprietary information beyond the very narrow circle of the assurance programme: the assessors, the review board, all of these folks. We take that very seriously. But it really is an amazing service. I hope people take advantage. So take a look at the website. And I hope that you’ll reach out and follow us and like us. I look forward to meeting the folks and thank you for including me in this. I really hope this was helpful to your audience. And I was a little nervous when you put out on social media, “Ask Kay anything.” But these are great questions and I really appreciate the field asking some of this. I’m sure there were other questions that we didn’t have time for, but this was a great conversation, Oscar, and I really appreciate your experience and all you had to offer as well. Oscar: Thanks a lot, Kay, for this fantastic interview and all the best. Kay: Thank you, all the best to you, too. Thanks for listening to this episode of Let’s Talk About Digital Identity produced by Ubisecure. Stay up to date with episodes at Ubisecure.com/podcast or join us on Twitter @Ubisecure and use the #LTADI. 
Until next time.",https://www.ubisecure.com/podcast/kay-chopard-kantara/,,Episode,,Ecosystem,,,,,,,,2021-08-25,,,,,,,,,,,,,
|
||
Ubisecure,Ubisecure,,,,,,,,,How LEIs streamline KYC,"Ubisecure brought innovation to the LEI market by automating LEI issuance, revolutionising how quickly and effectively an LEI can be registered, while improving data accuracy along the way by connecting directly to business registries globally. This innovation has helped RapidLEI become the #1 LEI Issuer globally, issuing about 1 in 4 new LEIs monthly, in just 3 short years.",,https://www.ubisecure.com/legal-entity-identifier-lei/lei-in-KYC/,,Post,,Explainer,,,,KYC,,,,2022-06-15,,,,,,,,,,,,,
|
||
Ubisecure,Ubisecure,,,,,,,,,Verifiable Credentials – how does it work? Understanding key VC principles,"The Verifiable Credentials specification by W3C provides a way to express credentials on the web. In this article I’m giving an overview of components and terminology related to VCs, and also some information about other technologies and specifications that are needed when implementing Verifiable Credentials.","As part of our series on understanding more about the vLEI, the new Verifiable Credential to identify organisations and organisation representatives, this article is a follow-up to a previous post Introduction to Verifiable Credentials. This time, we will look in more detail at how Verifiable Credentials work under the W3C specification. The Verifiable Credentials (VC) specification by W3C provides a way to express credentials on the web. In this article I’m giving an overview of components and terminology related to VCs, and also some information about other technologies and specifications that are needed when implementing verifiable credentials. Claims Before I go into the components, I’ll run through a quick explanation of claims as a critical concept to understand for verifiable credentials. A claim is a name-value statement about a subject. The subject is typically a person, but it could also be a thing, such as a hardware device. Examples of claims are: - Person’s first and last name, date of birth etc. - Organisation name - Personal ID - … and many others The concept of claims is familiar to those who have been working with federation protocols, such as OpenID Connect and SAML. Example “credentialSubject” : { “id”: “did:example:7564cb9c-165c-4857-a887-bfc2460af867”, “birth_date”: “1970-01-01” } Components of Verifiable Credentials A Verifiable Credential (VC) is a collection of claims made by an issuer (recap issuers in Introduction to Verifiable Credentials). VC metadata describes properties such as type, expiration and issuer of a credential. 
VC proof is used to verify the integrity of a credential. A proof is typically expressed as a digital signature, made with the private key of the issuer. Example { ""@context"": [], ""id"": ""e9ea3429-b32f-44ad-b481-b9929370bb90"", ""type"": [ ""VerifiableCredential"", ""ExampleCredential"" ], ""issuer"": { ""id"": ""did:example:2d28bb79-87a9-4224-8c63-d28b29716b67"" }, ""issuanceDate"": ""2022-01-01T00:00:00Z"", ""credentialSubject"": { ""id"": ""did:example:7564cb9c-165c-4857-a887-bfc2460af867"", ""birth_date"": ""1970-01-01"" }, ""expirationDate"": ""2023-01-01T00:00:00Z"", ""proof"": {} } Verifiable Presentation Verifiable Presentation (VP) is a collection of Verifiable Credentials (VC). Typically, a VP contains a single VC, but in more complex scenarios (such as selective presentation or delegation) there could be many VCs within a single VP. VP proof is used to verify the integrity of presentation. Proof is the holder’s counter signature of a collection of VCs where each VC has been individually signed by its issuer. As with verifiable credentials, proof is typically expressed as a digital signature made with the private key of the holder. Usually, the VP holder and subject of VCs within the verifiable presentation is the same. In more complex scenarios, the holder and subject could be different. In such cases some information or rules need to exist that allow correlating VCs. VP metadata describes properties such as expiration and nonce. The nonce is a random value generated by the relying party when requesting a VP, letting the relying party prevent re-play of VP tokens. In effect, VP is a classic challenge response authentication protocol. 
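As a rough illustration of how a VC and a VP fit together, here is a minimal Python sketch. The helper names are invented, and the keyed hash is only a stand-in for the real proof, which would be a digital signature made with the issuer's or holder's private key:

```python
import hashlib
import json

def make_proof(payload, key):
    # Stand-in 'proof': a keyed hash over the canonicalised payload.
    # Real VCs and VPs use digital signatures (e.g. Ed25519) instead.
    data = json.dumps(payload, sort_keys=True) + key
    return {'type': 'ExampleProof', 'hash': hashlib.sha256(data.encode()).hexdigest()}

def issue_vc(issuer_did, subject_claims, issuer_key):
    # The issuer signs the collection of claims about the subject.
    vc = {'type': ['VerifiableCredential', 'ExampleCredential'],
          'issuer': {'id': issuer_did},
          'credentialSubject': subject_claims}
    vc['proof'] = make_proof(vc, issuer_key)
    return vc

def present(vcs, holder_did, holder_key, nonce):
    # The holder counter-signs the issuer-signed VCs, binding in the
    # relying party's nonce to prevent replay (challenge-response).
    vp = {'type': ['VerifiablePresentation'],
          'verifiableCredential': vcs,
          'holder': holder_did,
          'nonce': nonce}
    vp['proof'] = make_proof(vp, holder_key)
    return vp

subject = 'did:example:7564cb9c-165c-4857-a887-bfc2460af867'
vc = issue_vc('did:example:2d28bb79', {'id': subject, 'birth_date': '1970-01-01'}, 'issuer-key')
vp = present([vc], subject, 'holder-key', 'nonce-from-relying-party')
print(vp['holder'] == vp['verifiableCredential'][0]['credentialSubject']['id'])  # True
```

As in the typical case described above, the VP holder and the subject of the embedded VC are the same DID here; each VC carries the issuer's proof, and the VP carries the holder's proof over the whole collection.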
Example { ""@context"": [], ""type"": [ ""VerifiablePresentation"" ], ""verifiableCredential"": [ { ""@context"": [], ""id"": ""e9ea3429-b32f-44ad-b481-b9929370bb90"", ""type"": [ ""VerifiableCredential"", ""ExampleCredential"" ], ""issuer"": { ""id"": ""did:example:2d28bb79-87a9-4224-8c63-d28b29716b67"" }, ""issuanceDate"": ""2022-01-01T00:00:00Z"", ""credentialSubject"": { ""id"": ""did:example:7564cb9c-165c-4857-a887-bfc2460af867"", ""birth_date"": ""1970-01-01"" }, ""expirationDate"": ""2023-01-01T00:00:00Z"", ""proof"": {} } ], ""holder"": ""did:example:7564cb9c-165c-4857-a887-bfc2460af867"", ""proof"": {} } Decentralised Identifier A Decentralised Identifier (DID) is an identifier that is used to identify the subject, issuer and holder of verifiable credentials and presentations. A DID is formatted as a URI starting with prefix “did:”, followed by the scheme identifier and a scheme specific part (did:scheme:address). Numerous DID schemes have been defined, yet not all schemes have the same capabilities. Some schemes operate on distributed ledgers, such as blockchains. Others take more traditional approaches, such as simply referencing http resources, or even X.509 certificates. Basically, the DID is an indirect reference to public keys, usually controlled by the entity the DID represents. This way, the keys of an entity can, for example, be rotated without the DID value having to change. Example did:example:e762e3b0-1cf9-4899-925f-9a6ae50a8ad6 DID document The DID scheme defines how to resolve a DID value into a DID document. The DID document is a way to present metadata and public keys of the entity. The public keys are needed to verify VCs and VPs. 
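The did:scheme:address format described above is straightforward to take apart. This small illustrative parser (not part of any library) splits a DID into its method and method-specific part:

```python
def parse_did(did):
    # A DID URI is 'did', a method name (scheme), and a
    # method-specific identifier, separated by colons.
    parts = did.split(':', 2)
    if len(parts) != 3 or parts[0] != 'did':
        raise ValueError('not a DID: ' + did)
    return {'method': parts[1], 'id': parts[2]}

print(parse_did('did:example:e762e3b0-1cf9-4899-925f-9a6ae50a8ad6'))
# {'method': 'example', 'id': 'e762e3b0-1cf9-4899-925f-9a6ae50a8ad6'}
```

How the method-specific part resolves to a DID document is defined per method: a ledger lookup for blockchain-based schemes, an HTTPS fetch for web-based ones, and so on.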
Example { ""@context"": [], ""id"": ""did:example:7564cb9c-165c-4857-a887-bfc2460af867"", ""verificationMethod"": [ { ""id"": ""did:example:7564cb9c-165c-4857-a887-bfc2460af867#key-1"", ""type"": ""JsonWebKey2020"", ""publicKeyJwk"": {} } ], ""authentication"": [ ""#key-1"" ], ""assertionMethod"": [ ""#key-1"" ], ""controller"": [ ""did:example:7564cb9c-165c-4857-a887-bfc2460af867"" ] } Trust Model For claims to be verifiable, the relying party must establish trust with the issuer of the Verifiable Credential (VC) with the claims. One possible model for a relying party is to use out of band methods to establish trust, where the relying party simply knows the DID value of the issuer being the trust anchor. More complex models allow implementing chains of trust where a VC issued by the trust root could delegate trust to other intermediary VC issuers. Self-attestation is a model where the VC is issued by the subject itself. In this case, the claims are not verifiable but could still be useful depending on the use case. Digital Wallet The digital wallet application holds Verifiable Credentials (VC) from issuers. The wallet creates Verifiable Presentations (VP) for relying parties with user’s consent. This application can exist on a user’s device, it can be hosted on a server, or it could be a combination of both. Simon Wood looks at Identity Wallets used in the vLEI specification in more detail in his blog vLEI & Identity Wallets. Self-Sovereign Identity The wallet plays a critical role in implementation of the self-sovereign concept of Verifiable Credentials. The wallet must not release information to relying parties without the user’s consent. Also, it must not let the VC issuer know when and where the VCs have been used. The latter property is one of the most significant differences compared to traditional OIDC and SAML based federated systems. Selective disclosure Selective disclosure lets the wallet app compose VPs from selected properties of VCs. 
This enables a wallet to “pick and choose” claims from VCs without disclosing full details and thus improving privacy. A typical example is a claim such as “age over 18”. This lets a relying party know the subject’s age is over 18 without disclosing the exact date of birth. Issuing Credentials Users receive Verifiable Credentials (VC) from credential issuers and store the VCs in their wallets. To create VCs, the issuer needs to verify the identity of the user so that it can issue credentials with the correct claims. Because the VCs are bound to a subject DID, the issuer must also verify the user is in control of the subject DID. One method is for the issuer to request the user to create a digital signature that is verifiable with the public key referenced by the DID. OpenID Connect for Verifiable Credential Issuance Federated Identity Providers (IdP) already implement user identity verification and they have repositories with claims and other statements about users. It is expected that many IdPs will be extended to become issuers of VCs. The OpenID Connect for Verifiable Credential Issuance (VCI) is a set of extensions to standard OIDC flows that enable interoperability across digital wallets and identity providers. The VCI flow takes care of both DID and identity verification requirements. Relying Party A Relying Party or Verifier is the entity that receives Verifiable Credentials (VC) embedded within a Verifiable Presentation (VP) from a user’s wallet. To get a VP from a wallet the Relying Party sends a Presentation Request to the user’s wallet. Properties of the request indicate things like what claims the Relying Party expects to receive. After receiving a VP, the task of the verifier is to verify the integrity of both the VCs embedded within a VP and the VP itself. To do so, the verifier must look up public keys of both the subject and the issuer by resolving DIDs. 
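The selective disclosure idea described earlier, releasing an 'age over 18' statement instead of the exact birth date, can be sketched as plain data filtering. The function and claim names below are illustrative only:

```python
from datetime import date

def age_over(birth_date_iso, years, today=None):
    # Derive a boolean claim without revealing the exact birth date.
    y, m, d = (int(x) for x in birth_date_iso.split('-'))
    today = today or date.today()
    age = today.year - y - ((today.month, today.day) < (m, d))
    return age >= years

def disclose(credential_subject, requested):
    # Build a minimal claim set for the relying party: only what was
    # asked for, with birth_date replaced by a derived age_over_18 claim.
    disclosed = {'id': credential_subject['id']}
    for name in requested:
        if name == 'age_over_18':
            disclosed[name] = age_over(credential_subject['birth_date'], 18)
        elif name in credential_subject and name != 'birth_date':
            disclosed[name] = credential_subject[name]
    return disclosed

subject = {'id': 'did:example:7564', 'birth_date': '1970-01-01', 'name': 'Alice'}
print(disclose(subject, ['age_over_18']))
# {'id': 'did:example:7564', 'age_over_18': True}
```

Real selective disclosure schemes (for example BBS+ signatures or SD-JWT) keep such subset or derived claims cryptographically verifiable against the issuer's original signature; this sketch shows only the data-minimisation step.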
OpenID Connect for Self-Issued OP To request and receive VPs from a wallet, a set of extensions to standard OIDC flows have been designed. The OpenID Connect for Self-Issued OP (SIOP) flow allows a relying party to request VPs from a wallet using a protocol that resembles the OIDC implicit flow. Presentation Exchange Presentation Exchange lets a relying party request particular types of credentials and claims. This specification defines filters and conditions that the wallet evaluates when it is constructing a VP from the set of VCs it has stored. Example { ""id"": ""8cbe12c7-5d6b-4bfa-81c9-57855daebc77"", ""input_descriptors"": [ { ""id"": ""ExampleCredential"", ""constraints"": { ""fields"": [ { ""path"": [ ""$.type"" ], ""filter"": { ""type"": ""array"", ""const"": [ ""VerifiableCredential"", ""ExampleCredential"" ] } } ] } } ] } Token types and formats The Verifiable Credential spec itself being very generic does not define any token format. The examples captured in this article are from JSON-LD formatted tokens. In addition to JSON-LD there exist many other token formats such as VC-JWT, Mobile Driver’s License (mDL), AnonCreds and KERI ACDC. Each token format has its own characteristics, such as text or binary formatting, different cryptographic proof types, support for selective disclosure etc. Conclusions The privacy improvements by the Self-Sovereign aspects of Verifiable Credentials are a significant improvement compared to traditional federated systems. The current W3C VC specification is highly generic. This presents a challenge to implementations wanting to achieve interoperability. Work on the next version of the W3C VC has already started, with the goal of addressing, amongst others, interoperability problems. OpenID for Verifiable Credentials work simplifies some implementation challenges by reusing concepts and terminology from the very well-known OpenID Connect and OAuth 2 group of specifications. 
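To make the Presentation Exchange descriptor shown above concrete, here is a toy evaluator a wallet might run over its stored VCs. It handles only top-level '$.field' paths and 'const' filters, a small fraction of what the specification defines:

```python
def matches(credential, descriptor):
    # A VC satisfies the descriptor when every field constraint is met
    # by at least one of its paths ('$.type' maps to credential['type']
    # in this drastically simplified JSONPath handling).
    for field in descriptor['constraints']['fields']:
        hit = False
        for path in field['path']:
            value = credential.get(path.lstrip('$.'))
            if value == field['filter'].get('const'):
                hit = True
        if not hit:
            return False
    return True

descriptor = {'id': 'ExampleCredential',
              'constraints': {'fields': [{
                  'path': ['$.type'],
                  'filter': {'type': 'array',
                             'const': ['VerifiableCredential', 'ExampleCredential']}}]}}

wallet = [{'type': ['VerifiableCredential', 'ExampleCredential']},
          {'type': ['VerifiableCredential', 'OtherCredential']}]
selected = [vc for vc in wallet if matches(vc, descriptor)]
print(len(selected))  # 1
```

Only the first credential matches the requested type, so it alone would be embedded in the Verifiable Presentation returned to the relying party.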
To learn more about the vLEI, and the Global LEI Foundation’s approach to using Verifiable Credentials, learn more with our vLEI 101 article. If you have any questions on Verifiable Credentials, please comment below and I’ll get back to you or contact us. You can also sign up to our monthly identity newsletter to stay informed of future posts around Verifiable Credentials and other key identity topics. References - https://www.w3.org/TR/VC-data-model/ - https://www.w3.org/TR/did-core/ - https://openid.net/specs/openid-connect-self-issued-v2-1_0.html - https://openid.net/specs/openid-connect-4-verifiable-presentations-1_0.html - https://openid.net/specs/openid-connect-4-verifiable-credential-issuance-1_0.html - https://identity.foundation/presentation-exchange/",https://www.ubisecure.com/identity-management/verifiable-credentials-understanding-key-principles/,,Post,,Explainer,,,,,,,,2022-08-18,,,,,,,,,,,,,
|
||
Ubisecure,Ubisecure,,,,,,,,,vLEI 101 – Issuance and Wallets,"In my last blog, vLEI 101 – the Verifiable Legal Entity Identifier, I spoke about the potential of the vLEI. In the brief outline of the wider vLEI eco-system we saw that there were a number of types of Verifiable Credential:
|
||
|
||
LE-vLEI: a credential providing LEI data for the organisation
|
||
LE-OOR: a credential providing information about an individual holding a specific formal role within the organisation
|
||
LE-ECR: a credential providing information about an individual who has a ‘user defined’ role in relation to an organisation.
|
||
The credentials are issued by a Qualified vLEI Issuer, a QVI, and stored in a wallet. Let’s look at that in a bit more detail to understand what is going on.",,https://www.ubisecure.com/legal-entity-identifier-lei/vlei-101-issuance-and-wallets/,,Post,,Explainer,,,,,,,,2022-08-03,,,,,,,,,,,,,
|
||
Ubisecure,Ubisecure,,,,,,,,,vLEI 101 – the Verifiable Legal Entity Identifier,"We’ve been involved in some really cool work over the last few weeks focusing on the issuance of vLEIs and associated role credentials. Specifically, Ubisecure was the credential issuer for the GLEIF’s vLEI proof of concept project and issued the world’s first vLEI to the GLEIF, which was then used to sign the GLEIF’s 2021 annual report.","Understanding the potential of the vLEI – the Organisation verifiable credential We’ve been involved in some really cool work over the last few weeks focusing on the issuance of vLEIs and associated role credentials. Specifically, Ubisecure was the credential issuer for the GLEIF’s vLEI proof of concept project and issued the world’s first vLEI to the GLEIF, which was then used to sign the GLEIF’s 2021 annual report. The system works really well, but stepping back from the entire process there is a lot of technology and complexity involved behind the vLEI, as there is with most identity and/or cryptographic platforms. Like many complex systems we can break it down and take a more understandable view on what a vLEI actually is, what they do, and how we expect them to be used in the future. What is a vLEI? We are going to make a few assumptions here, the main one is that you already know about LEI (short for Legal Entity Identifiers). If you want to know more about the LEI itself we have some great material in our LEI Knowledge Base. The v in vLEI stands for “verifiable”, but what does that mean? The term verifiable in this case comes from the term “Verifiable Credential”. A verifiable credential is just a collection of information with a mechanism that allows a computer to verify that the information has not been modified and that the information was originally stated to be correct by some third party (maybe a bank, or the driving license authority). Often (almost always really) the information will include a link to the entity the information is about. 
With those three things the verifiable credential can be used to provide information to others in a way that allows the receiver to be very confident about the claims made by the information. Let’s take a simple, parallel example based around a driving license. Fred has his driving license as a plastic card in his wallet as issued to him by his national driving authority. He loses his card, but shortly gets a call from the local police station saying it has been handed in and he should come and claim it. When Fred gets to the police station the desk sergeant spends a long time looking at the photo (which is quite out of date now, time has not been kind to Fred!), asking Fred questions about his address, date of birth etc. Eventually the sergeant feels that Fred has answered enough correctly and hands over the license. Alice also has a driving license but her license is on her mobile phone. Unfortunately, Alice loses her phone, but again shortly gets a call from the police to say it has been handed in. When Alice gets to the station, she can prove it is her phone by using her fingerprint to unlock it. The desk sergeant does not need to use his judgement, Alice has proved control over the phone and so it must be hers. Verifiable Credentials work in the same kind of way, there is the ability to prove ownership of the credential. This process is understood by computer systems and so all the checks can be performed electronically online and in turn that allows automation and significant cost saving. Back to the vLEI, at the basic level the vLEI is simply an LEI code, a unique organisation identifier, stored as part of the information set in a verifiable credential. A standard mechanism exists to prove ‘control’ over any given vLEI and so it is possible to determine, automatically, if the entity presenting the vLEI is entitled to do so. This capability now allows organisations to participate in trusted automatic transactions. 
Wait, there’s more to vLEI than just the organisation The vLEI standards define more than just a verifiable credential for the legal entity. Two further verifiable credentials are defined that allow information on people associated with the organisation. The first of those two credentials is the “Official Organisation Role” credential (OOR). The OOR links an individual with an organisation in a well-known role. The roles are limited to an official set of ‘official’ roles as defined by an ISO standard (ISO 5009_2022). This list includes roles such as ‘Director’, ‘Chief Executive Officer’, ‘Chief Financial Officer’. With an OOR credential an individual is able to present themselves as holding an official role for a given organisation, and all the claims presented can be electronically verified in real time. The second of these two credentials is the “Engagement Context Role” credential (ECR). The ECR is very similar to the OOR except that the role is custom, the legal entity can define any role they wish and place that in the ECR. For example, “customer of”, “supplier to”, “contractor for”. In the below example we see the GLEIF annual report signed using vLEI, OOR and ECRs. The browser-based document viewer displays the signers, their roles and their organisation association: What can we do with a vLEI There are many reasons why vLEIs will see a rising prominence in the coming months and years: - Document signing: document signing solutions exist, however they currently use ‘standard’ signatures. These do not have the ability to link between signatures. Credentials from the vLEI ecosystem can provide the same level of security of signature and also provide linkage information between the various signing parties. - KYC/KYB: It is currently quite challenging to determine that an individual is empowered to act in a certain capacity for an organisation. The vLEI credentials solve this challenge by design, massively reducing onboarding and ongoing verification costs. 
- Delegation: The ability to understand the linkage between an individual and an organisation allows for electronic delegation of rights and responsibilities. Delegation capabilities bring significant cost savings to organisations where complex inter-company relationships need to be electronically enabled and enforced; see Ubisecure’s Finnish Government Katso case study for an example. However, whereas Katso was a closed community operating just within one jurisdiction, the vLEI offers the same potential using a trusted, standardised, globally verifiable credential. - Representation Governance: Parameterising delegation (think individual expense sign-off levels as an example) moves delegation into the role of representation governance. This capability then allows all the business processes that have been manual to be automated, with the obvious cost savings. Whilst this is already possible in some regions, the vLEI ecosystem enables this capability on a global basis. It’s all about trust (frameworks) There are more credentials in the broader framework than have been covered here, and that broader framework is quite important. We have already understood that a credential wraps information and that systems can automatically verify the information to determine the original issuer of the information and that it is unaltered. This allows someone receiving the credential to know the Legal Entity in question and the nature of the relationship between that Legal Entity and the individual representing them. But how do we know that the person didn’t manage to get a fake identity through the system? That assurance comes from the vLEI ecosystem having a defined Trust Framework, and an audit system that validates compliance with the Trust Framework. When the credential is originally issued the issuer (known as a ‘Qualified vLEI Issuer’ or QVI) is required to perform identity checks to a globally defined standard. 
This is the same as you having to present your passport when you open a bank account or create a mobile phone account. As more national IDs become available, they will be able to provide the individual’s information, meaning that a manual check will not be needed. In fact, eIDAS 2 will itself be based on verifiable credentials. What happens now The GLEIF, along with stakeholders like Ubisecure, have been working incredibly hard to progress the vLEI project. Now the proof of concept has been released, the ecosystem will continue to develop the issuance technology and policy frameworks to make the vLEI a commercial reality. We are excited to continue to contribute to this exciting organisation identity advancement. If you would like to know more about LEIs, vLEIs or any of the underlying Identity and Access Management work that sits underneath them, please get in contact with us and we’d be delighted to help.",https://www.ubisecure.com/legal-entity-identifier-lei/vlei-101/,,Post,,Explainer,,,,,,,,2022-06-30,,,,,,,,,,,,,
|
||
Ubisecure,PSAToday,,anchor.fm,,Ubisecure,,,,,PSA Today: Kaliya & Seth talk LEIs,"with Simon Wood, CEO of Ubisecure (the #1 issuer of Legal Entity Identifiers), about the evolution of LEIs since the financial crisis of 2008, the difference between high assurance and low assurance, and the relationship between rights and ownership as it relates to identity management of entities.","PSA Today By Kaliya & Seth PSA Today = Privacy, Surveillance, Anonymity. Join Kaliya Young and Seth Goldstein for a spirited conversation at the intersection of the three themes driving modern identity: privacy, surveillance and anonymity. We wrestle each week with some of the most contentious issues facing our world as we try to find opportunities for agency and self-sovereignty within shared communities, both online and off. PSA Today #34: Kaliya & Seth talk LEIs (Legal Entity Identifiers) with Simon Wood, CEO of Ubisecure PSA Today • By Kaliya & Seth • Feb 17, 2021",https://anchor.fm/psatoday/episodes/psa-today-34-kaliya--seth-talk-leis-legal-entity-identifiers-with-simon-wood--ceo-of-ubisecure-eqia74,,Episode,,Meta,,,,,,,,2021-03-29,,,,,,,,,,,,,
|
||
Ubisecure,GLEIF,,,,,,,,,Ecosystem Governance Framework vLEI Credential Governance Framework Legal Entity Official Organizational Role,,,https://www.gleif.org/vlei/introducing-the-vlei-ecosystem-governance-framework/2022-02-07_legal-entity-vlei-credential-gf-draft-publication_v0.9-draft.pdf,,Framework,,Meta,,,,Governance,,,,2022-08-07,,,,,,,,,,,,,
|
||
Ubisecure,Ubisecure,,,,,,,,,How to get a vLEI Credential,Simon Wood: The first step in issuance is for a representative to enter a contractual agreement with a QVI to provide the issuance service. The individual from the legal entity who undertakes this contractual signup is known as the Designated Authorised Representative (DAR).,,https://www.ubisecure.com/legal-entity-identifier-lei/how-to-get-a-vlei-credential/,,Post,,Meta,,,,,,,,2020-01-01,,,,,,,,,,,,,
|
||
ValidatedID,,ValidatedID,,Fernando Pino; Santi Casas,ESSIFLab; DIF,"European Union, Spain, Catalonia, Barcelona,",Europe,,,ValidatedID,"Validated ID brings real identities to the digital world by helping businesses send and sign documents online and identify users and clients with maximum efficiency, security, trust and legal compliance.<br><br>With ViDSigner we offer a SaaS multichannel electronic signature platform that combines the security of cryptographic technology, biometrics and easy use from email certification to website and mobile integrations, smartcard and handwritten in-person signing.<br><br>With ViDChain we provide a Blockchain based one click digital identity, implementable in the optimization of user and online customer onboarding and digital procedures involving identity verification; thus saving time and money while increasing efficiency.","Partners In a global organization like United VARs, the contract signing procedure is often one of the most complicated. Since we implemented Validated ID, signatures are collected immediately and with 100% legal certainty. Detlef Mehlmann Managing Director, United VARs",https://www.validatedid.com/,,Company,,Company,Enterprise,ID,,,,,,2012,,https://Twitter.com/ValidatedID,https://www.YouTube.com/channel/UCUjvPN9zO-qVoVAF16OIluw,https://www.ValidatedID.com/blog,,,https://www.crunchbase.com/organization/validated-id,https://www.linkedin.com/company/validated-id/,,,,,
|
||
ValidatedID,ValidatedID,,,,,,,,,"Digital signatures, a fast track to digital transformation in the real estate sector",The latest real estate trend reports show how the pandemic has accelerated the use of technology and the implementation of trends such as teleworking and digitisation of processes. Find out how digital signatures are revolutionising the industry.,"This article is also available in French, German and Spanish. In the age of Google Meet or Zoom meetings, it is no surprise that the growing digital transformation is among the top trends in real estate and will have the most significant long-term impact in the coming years. The report Emerging Trends in Real Estate in Europe 2021, prepared by PwC and Urban Land Institute (ULI), presents a sector in full transformation. It also shows how the pandemic has accelerated the use of technology and the implementation of trends such as remote working and the digitisation of processes. Digital transformation in the real estate sector, then and now The incursion of new technologies has been one of the determining factors in recent years for the digital transformation in the real estate sector, but it was understood in terms of the use of digital platforms to promote properties in order to gain more visibility. On the other hand, in the last year, real estate technology has experienced a significant acceleration and specialists maintain that it will emerge stronger after the coronavirus crisis. Customer perception has evolved and the demand for the use of tools such as 360º virtual tours and digital signatures are here to stay. It is logical, considering that with the traditional methods of signing on paper or visiting a property, your clients had to physically travel to sign the required documentation or view the property. There is no doubt that these processes slow down the closing of deals and make it very difficult to follow up on ongoing transactions. 
Therefore, among the new trends in the real estate sector we find digital solutions that improve productivity and minimise travel. Save time and close sales on the spot without leaving your CRM The digital signature for real estate is a mere evolution of the paper signature, which allows the signing of all documents that accompany the real estate agent's work. By using our electronic signature service, real estate agents have much more operational flexibility, as they are freed from coordinating buyers and sellers with different schedules and geographic locations to close a sale. Your clients and estate agents will be able to sign contracts for rent, sale, lease or deposit, and accept offers from any mobile device, tablet or PC, quickly, safely and with maximum legal security. With VIDsigner e-signatures you can keep up with the speed of business and sign all your digital documentation remotely and without leaving your CRM. Real estate companies that manage many contracts related to their real estate portfolio on a daily basis can benefit from the integration of VIDsigner electronic signatures with SAP by getting their documents signed within the SAP Real Estate Management solution. 4 reasons to implement the electronic signature service in your real estate agency 1. Closing sales immediately Real estate agencies using VIDsigner get 99% of their contracts digitally signed. 2. Substantial reduction of face-to-face processes With electronic signatures your customers can sign documents from any device: mobile, tablet or PC and anywhere. No need to install any software. 3. Improved customer satisfaction Our service allows real estate agents to communicate a rate change or receive a signed contract without the need for the two parties to meet in the same physical space. 4. Possibility of integration within a CRM VIDsigner e-signatures are easily integrated into third-party solutions, allowing for greater productivity and coordination. 
As you can see, it is a solution that understands the needs of your customers and your company. Main use cases of e-signatures in real estate companies Our different digital signature mechanisms cover any scenario and adapt perfectly to the needs of your real estate agency or your clients: whether it is a handwritten signature on a tablet (BIO), remotely with a smartphone (REMOTE), by means of a qualified Personal certificate (CENTRALISED) or reliable communications by email (E-Delivery). The most common uses of signing documents that you can carry out as a real estate agency can be divided into 5 main sections: rentals, sales and purchases, administration, construction companies and developers. Rentals - Contract renewals - Termination notices to tenants - Assignment of powers of attorney - Subscriptions, cancellations and changes of ownership of utility contracts - Tenant's home insurance - Authorisation to request tenant references - Notification of non-payment - Payment authorisations (bank transfers) Buying and selling - Deposits - Purchase orders - Contract for publication of advertisements on real estate portals - Property showing reports - Inventory of properties - Home insurance - Terms and conditions - Agreements with third parties - Acceptance of offers and counter-offers Administration - Appointments and visits - Maintenance contracts - Conditions of use - Damage report - Repair orders - Change requests - Approvals - Data protection Construction companies - Contract with subcontractors - Contract with company - Purchase orders - RFQ - Documentation relating to certificates and inspections - Human Resources - GDPR - PPE management - Purchasing cycle Developers - GDPR in showflats in developments - Contract with real estate agencies: general agreements - Contracts with construction company - Contracts for sale of real estate assets - Purchase orders - RFQ - Documentation relating to certificates and inspections - Human resources GRUPPISO success story 
Find out how other real estate agencies have improved customer satisfaction thanks to the immediacy in closing sales and rentals, and the reduction of face-to-face processes. GRUPPISO is a real estate agency with more than 20 years' experience in the sale and rental of all types of properties (flats, commercial properties, parking spaces, etc.). Since its creation, its objective has been to offer 100% satisfaction to all its clients and to grow its range of services, with the possibility of making remote presentations of properties directly on the buyer's smartphone and offering other services such as legal and tax advice or advice on wills, inheritances, signatures in notaries' offices or negotiations with banks. The GRUPPISO real estate agency manages the signing of around 600 contracts a year, including purchase and sale contracts, deposit contracts, insurance policies and other related documents. Thanks to VIDsigner's electronic signatures and its integration with RICsigner, they keep their business up to date and have seen significant savings in time and cost, according to CEO Pedro García. The implementation of the electronic signature was a major step forward in management. 99% of our documents are digitally signed and we have eliminated many face-to-face meetings. Pedro García Gruppiso Real Estate Management",https://www.validatedid.com/post-en/digital-signatures-a-fast-track-to-digital-transformation-in-the-real-estate-sector,,Post,,Ecosystem,,,,,,,,2021-05-03,,,,,,,,,,,,,
|
||
ValidatedID,ValidatedID,,,,,,,,,Electronic signatures for hospitality,"Looking at the many developments that have happened lately, digitization has become the center of attention for all kinds of industries, and yet many of the typical processes within the hospitality industry remain paper-based.","This article is also available in French, German and Spanish. The world of hospitality faces the large challenge of recovering from the economic impact of the COVID-19 crisis while preserving safety and customer experience. Looking at the many developments that have happened lately, digitization has become the center of attention for all kinds of industries, and yet many of the typical processes within the hospitality industry remain paper-based. Take a look at guest check-in and check-out forms, employee records, operating sheets, supplier invoices, sales reports, and cash receipts, for example. Today that approach has changed completely: hotels are going digital and contact-less in every aspect, and fortunately, electronic signatures can play a key role in offering the best experience to customers. Just as easily as clients checked in before, new technologies such as electronic signatures enable more secure scenarios, like declaring your details, passport, and vaccination credentials on a digital form and signing them when arriving at the hotel. Then, just like your clients checked in using an electronic biometric signature on a Tablet, they can check out using their smartphone with a remote signature and preserve social distance, and ask the staff to mail them the invoice instead of getting a hard copy on paper. Next time tourists book their stay at a hotel they may see pioneering experiences that are spreading globally like the one developed by H10 Hotels, which has successfully completed the digitalization of its check-in processes, and its integration with its corporate information systems based on Navision, thanks to its technology partner Costaisa. 
H10 Hotels has nearly 40 years of history and more than 60 hotels in 18 destinations around the world, and has installed up to 73 tablet devices for its customers in the Caribbean and Europe, using VIDsigner eSignature’s technology. The influence of the pandemic on the Spanish hospitality industry has been explored in depth. Moreover, the response and recovery strategies of the largest Spanish hotel chains to guarantee a COVID-19-free stay in their facilities and to recover the accommodation activity have been discussed on many European levels. In Spain, as published by law (BOE) this year and to reinforce security against COVID-19, the government updated hotel check-in to allow digital means. Check-in forms can now be completed in electronic format, allowing for more efficient management of log books. The future of technology for hospitality More and more hotels are relying on technology to position their services in the market. As consumers value this in every aspect of our lives, it is vital to offer services that are tailored to their needs and preferences. Using a check in or a check out with biometric signature on Tablet or remote signature with a smartphone means not only offering extra value to the brand and cutting inefficient costs, but also facilitating the registration of guests, allowing an automatic process that captures the information and streams the communication with the users and the authorities. Safety compliance As the vaccination process rolls out for part of the world, and tourism starts to reopen, the relationship between the front-desk employees and the tourist will become more and more digital. In addition, many destinations will require proof of vaccination to allow some travellers to skip COVID-19 tests and lengthy quarantines. And this proof for some countries may come in the form of a vaccine passport, as it is the case for the European Union. 
In terms of safety and legal security, an electronic signature becomes a smooth tool for employees and management who are genuinely concerned about safety, and it improves how organizations maintain compliance with COVID-19 safety rules and procedures to protect workers from infection and stop possible transmission during service encounters.",https://www.validatedid.com/post-en/electronic-signatures-for-hospitality,,Post,,Ecosystem,,,,,,,,2021-06-29,,,,,,,,,,,,,
|
||
ValidatedID,ValidatedID,,,,,Ecosystem,,,,SportChain: a Decentralized Trust and Reputation Service for the Sports Industry,Do you know the story of Carlos Kaiser? He was a professional Brazilian football player [that never played a single match](https://www.theguardian.com/football/blog/2017/apr/26/the-forgotten-story-of-carlos-kaiser-footballs-greatest-conman) but managed to still have a professional football career. He wanted the lifestyle without having to do the work.,"This article is also available in French, German and Spanish. Do you know the story of Carlos Kaiser? He was a professional Brazilian football player that never played a single match but managed to still have a professional football career. He wanted the lifestyle without having to do the work. Although this is a somewhat extreme example, the issue of fake sports data is prevalent. The sports industry is undergoing a digital transformation. The world of sports has largely moved onto the web and mobile. But, despite the digital transition, challenges remain. Although the internet has created new opportunities for organizations and athletes, it has also given rise to a plethora of problems. One such problem is fake sports data. The sports industry is a multi-billion-dollar industry with a lot of different stakeholders. The stakeholders include the players, clubs, federations, sponsors, journalists, fans, media, etc. Currently, there is no central point where all parties have access to the same trustworthy information. As a result, there is a need for verifiable and trustworthy sports data that both organizations and players can rely on. Currently, identity data are often stored in so-called identity silos where users do not have control over their own data. In addition, this raises privacy and security concerns. Therefore, we are excited to present our new project, SportChain. SportChain is a blockchain-based service that aims to bring transparency, security, and efficiency to the sports industry. 
SportChain aims to elevate the trust in sports data and make these data verifiable, authentic, and analyzable. SportChain will achieve this by taking advantage of our decentralized identity service, VIDchain. VIDchain is a service based on Blockchain to give back control to people over their online identity and facilitate secure user access to online services. With VIDchain, only the user has full control of their information securely stored on their own Personal identity wallet, VIDwallet. In this way, you can have all your Personal data (such as your driving license, passport, vaccination credential, etc.), on your phone. In a similar way that you use your physical IDs to identify yourself in the real world, you can add Verifiable Credentials to your VIDwallet to authenticate yourself online. SportChain is the first step towards creating a decentralized ecosystem for the sports industry. By using an immutable sports ledger, all parties can write, read, and verify all information being stored on it. This will allow us to track all data regarding sports events — from player movements and scores to injury reports —with full transparency. A Blockchain-based database for player data management: This decentralized database will allow players, clubs, and agents to store any Personal information related to their careers securely. Additionally, it will enable them to grant different levels of access to different trusted parties within the sports ecosystem, such as club managers, trainers, and scouts. We envision an extension of the VIDchain services by a notarization and reputation service. Using this notarization service, we can notarize sports data using distributed ledger technology which is tamper-proof and secure. With this notarization, the data becomes trustworthy and verifiable (immutable) by anyone at any time. In addition, the reputation service can be used to perform some big data analyses taking the sports data as input. 
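The notarization idea described above — anchoring a digest of the sports data so anyone can later check that it has not been tampered with — can be sketched minimally. This is an illustrative assumption, not the actual VIDchain notarization service: a plain dict stands in for the distributed ledger, and all names are made up for the example.

```python
import hashlib
import json

# Minimal sketch of ledger-based notarization: only a digest of the
# data is anchored, never the data itself, so the record stays
# tamper-evident without exposing personal information.
# A plain dict stands in for the distributed ledger.

LEDGER = {}  # record id -> anchored digest


def notarize(record_id, data):
    # Anchor a SHA-256 digest of the canonicalized data on the ledger.
    digest = hashlib.sha256(json.dumps(data, sort_keys=True).encode()).hexdigest()
    LEDGER[record_id] = digest
    return digest


def verify(record_id, data):
    # Recompute the digest and compare it with the anchored one;
    # any change to the data makes the comparison fail.
    digest = hashlib.sha256(json.dumps(data, sort_keys=True).encode()).hexdigest()
    return LEDGER.get(record_id) == digest
```

In this sketch, notarizing a match record and then altering a single field (say, the number of goals) makes verification fail, which is exactly the tamper-evidence property the text describes.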
It offers huge potential for various types of applications. This project has received funding from the European Union’s Horizon 2020 research and innovation programme under grant agreement No. 957228.",https://www.validatedid.com/post-en/sportchain-a-decentralized-trust-and-reputation-service-for-the-sports-industry,,Post,,Ecosystem,,,,,,,,2022-03-09,,,,,,,,,,,,,
|
||
ValidatedID,ValidatedID,,,,,,,,,The digital transformation of the education sector,"For schools and universities, adopting the electronic signature as a tool not only implies an improvement in the experience for students and employees, but it also means a great improvement in administrative processes.","This article is also available in French, German and Spanish. The education sector has undergone a major transformation in the last year. The health crisis has forced many schools and universities to provide entirely remote teaching or to adopt hybrid models in some cases. As in any other sector, education is also immersed in the transformation processes driven by technology in an increasingly digital world. Many specialists in this sector believe that this transformation has permanently altered the landscape for students and teachers, which will remain mainly digital even in a post-pandemic world. For this, an electronic signature is an indispensable tool in the digitization of educational processes. For schools and universities, adopting the electronic signature as a tool not only implies an improvement in the experience for students and employees, but it also means a great improvement in administrative processes. This is especially important now that centers have to process thousands of documents per day, and the volume is expected to continue to increase each year. On the other hand, digital processes significantly help attract more students in circumstances that do not allow for open houses or other methods of student recruitment. The electronic signature is a useful tool not only in the area of registrations but also, among others, in human resources, purchases, finances. In addition, today's students are digital natives: most only know life with smartphones in hand. For a student, printing, scanning, faxing, and mailing documents such as the registration form or a scholarship application is an unnecessary and annoying effort. 
Meanwhile, teachers are improving their digital literacy while increasing their favorable attitude towards using new technologies in the classroom and outside. For teachers, reducing paper processes (such as signing their employment contracts, even student permits signed by parents) translates into more time that can be devoted to teaching and improved communication with parents and tutors. Traditionally, the management of contracting processes has been the gateway to electronic signatures, integrated by the school's technological partner. We have examples managed by some of our partners such as RIC.DOC, Avacco, Toshiba, Educaria, Peakway, Clickedu, DocTech, DocuWare or Despapeliza Chile deployed for La Salle Catalunya, Cor de María, the UOC, Escolapios or Blanquerna, the University of Vic, Finis Terrae University, Comillas Pontifical University, Alfonso X El Sabio University, and Strathallan School, among others. In recent years, however, there has been an increasing tendency to extend signings to other areas of school management, such as enrolment management, cooperation agreements, grants, and permits, for example. Advantages of the electronic signature in the education sector - Legal guarantee of documents. - Security when collecting consents from any device. - Lower costs and risks compared to documents signed on paper. - More agile processes adapted to digital transformation. Here are some examples of use in the field of education. The wave of digitization is now fully affecting the education sector, and companies and institutions will have to adapt to the realities of the digital world. Now more than ever is the time to implement electronic signatures in schools and universities.",https://www.validatedid.com/post-en/the-digital-transformation-of-the-education-sector,,Post,,Ecosystem,,,,,,,,2021-08-27,,,,,,,,,,,,,
|
||
ValidatedID,ValidatedID,,,,,,,,,The time for the eIDAS Bridge,"The main goal of this new program was to provide an implementation of eIDAS bridge and to prove the interoperability between different provider implementations. Validated ID was selected to participate in part of the Call 1 of infrastructure. The results of this project are available as open source. If you are interested in digging into the code, you can find it all in the following repositories: [our open source version implementation](https://gitlab.grnet.gr/eSSIF-lab/infrastructure/validated-id/seb) and the [SSI eIDAS Bridge interoperability](https://gitlab.grnet.gr/eSSIF-lab/interoperability/ssi-eidas-bridge) performed with SICPA.","Public Key infrastructure (PKI) has been, and still is, a very valid technology that we use every day without even noticing. It brings us security when we navigate on the internet since it provides a way to know the site you are connecting to really is owned by who the site claims to be. In other words, if you are buying on Amazon, you need to be sure you are about to purchase from Amazon and not from a fake site. In that sense, we could say that identifying legal entities on the internet is kind of a solved matter. Nonetheless, if you wonder how many of us, users, can really take advantage of PKI for identifying ourselves on the internet, the answer is quite disappointing. This mature technology has been available for decades but has never become mainstream among society for identifying end users. The reason is obvious: the user experience is very poor. It is not trivial to use your certificate to authenticate yourself to a third party. It is much easier to delegate this to a third party like Google or Facebook at the expense of telling them what you do. This is where Self Sovereign Identity (SSI) comes in. This new paradigm aims to bring control to end users by means of using Verifiable Credentials (VC). 
These credentials are issued by an issuer and consist of a set of attributes that define certain claims about the holder. Then, the holder can independently use this VC to create a Verifiable Presentation (VP) and deliver it to a verifier. The key point is that the holder can present this information to identify himself/herself to the requester without letting anyone else know with whom he/she is interacting. The holder of this credential is sovereign over the use of the credentials he/she owns. Validated ID has been working on this new paradigm for the last three years by means of developing VIDchain and contributing to relevant projects and initiatives such as the European Blockchain Service Infrastructure (EBSI) of the European Commission, Sovrin and Alastria, to make this model become a reality. Although there are many credential wallets under development and several companies like us are looking forward to this prominent paradigm, the reality is that the legal framework is still not fully mature. Currently we have the eIDAS regulation, mostly focused on traditional PKIs and Certificates. In June 2021, the EC approved a new draft of this regulation that states that the new identities of the European citizens will be based on the SSI principles and backed by identity wallets. However, this regulation still needs to be formally approved and developed. In a nutshell, there is still not a clear trust framework. Therefore, the eIDAS bridge has arisen as an in-between step. The eIDAS bridge project is an initiative by the European Commission (EC) within the ISA2 program where Validated ID participated as a subject-matter expert in PKI and SSI. The EC developed eIDAS bridge to promote eIDAS as a trust framework for the SSI ecosystem. In a nutshell, this project intends to provide a solution to one of the most urgent existing challenges SSI faces: having a trust framework on which to rely. The result of this project, i.e. 
the technical specifications, integration guidelines and the legal reports produced, can be found here. Sometime later, eSSIF Lab, another EU-funded project that aims to provide an ecosystem of parties that work together to make existing SSI technology into a scalable and interoperable infrastructure, opened a program to evolve eIDAS bridge. The main goal of this new program was to provide an implementation of eIDAS bridge and to prove the interoperability between different provider implementations. Validated ID was selected to participate in part of the Call 1 of infrastructure. The results of this project are available as open source. If you are interested in digging into the code, you can find it all in the following repositories: - our open source version implementation - our VIDchain Enterprise supported version in Integration of eIDAS Bridge | VIDchain documentation What is eIDAS Bridge and how does it work? The eIDAS bridge consists of an API that allows you to sign and validate credentials using Qualified Electronic Certificates (QEC). As you can see, this is the reason why this tool is called a bridge since it is “bridging” the world of certificates with SSI credentials. For an end user, it should be really simple to use since the API mainly exposes three endpoints for three steps: certificate storage for DID association, signature with a QEC (CAdES) and QEC signature validation. The issuer sends the certificate and associates it to the DID that will be used as Verifiable Credential (VC) issuer. The API stores the certificate in Confidential Storage. The issuer requests to sign a VC using his/her previously stored certificate and the API provides a VC containing a CAdES signature. The verifier sends a VC with CAdES signature to be validated and the API provides the validation result. 
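The three steps above can be sketched as a toy flow. This is an illustrative assumption, not the real VIDchain API: function names and payload shapes are made up, a dict stands in for Confidential Storage, and an HMAC secret stands in for a QEC and its CAdES signature (a real bridge would validate with the certificate's public key, not a shared secret).

```python
import hashlib
import hmac
import json

# Toy sketch of the three bridge steps. All names are illustrative,
# and an HMAC stands in for a Qualified Electronic Certificate and
# its CAdES signature.

CONFIDENTIAL_STORAGE = {}  # DID -> stored certificate/key material


def store_certificate(did, cert_secret):
    # Step 1: the issuer associates a certificate with the DID it
    # will use as Verifiable Credential issuer.
    CONFIDENTIAL_STORAGE[did] = cert_secret


def sign_credential(did, credential):
    # Step 2: the issuer asks the bridge to sign a VC with the
    # previously stored certificate; a real bridge would attach a
    # CAdES signature instead of this HMAC.
    payload = json.dumps(credential, sort_keys=True).encode()
    proof = hmac.new(CONFIDENTIAL_STORAGE[did], payload, hashlib.sha256).hexdigest()
    return {'vc': credential, 'proof': proof}


def validate_credential(did, signed_vc):
    # Step 3: the verifier submits the signed VC and receives the
    # validation result.
    payload = json.dumps(signed_vc['vc'], sort_keys=True).encode()
    expected = hmac.new(CONFIDENTIAL_STORAGE[did], payload, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, signed_vc['proof'])
```

In this sketch, a credential signed after storing the certificate validates successfully, while changing any claim in the VC invalidates the proof — mirroring how a CAdES signature binds the issuer's certificate to the credential contents.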
These three steps above are the main functionalities developed for the eIDAS Bridge project, and the interoperability of VCs signed with our implementation and those of other providers was proven successfully. For that reason, we have taken this code and included it in our VIDchain API to give our users the ability to use their certificates to issue VCs and to validate VPs containing QEC signatures. Current status Since the project finished at the end of June 2021, VIDchain API has taken the open-source implementation and evolved it into an improved, professionally supported eIDAS Bridge as a service with more security for end users. Validated ID offers eIDAS Bridge as a service to help issuers and verifiers use this innovative solution that fills the gap between certificates and SSI while the latter emerges. VIDchain API offers the endpoints shown above as an entity-authenticated service and is currently working on providing new features such as support for HSMs and the integration with external Certification Authorities (CAs). eIDAS Bridge and Business Process Value As pioneers and defenders of the SSI paradigm, we are the first ones who wish to create the necessary trust environment so that verifiable credentials can be created with a Level of Assurance & Credibility that allows public and private organizations to start accepting them as elements well supported by the trust models already covered by eIDAS (v1) while the new eIDASv2 gets formally approved. This would imply that we can rely on formal processes for the issuance of verifiable credentials, and that the credentials incorporate components recognizable by consumers of Trust Services and the solutions used to recognize eIDAS electronic identities. 
For the first case, the credential issuance process could already incorporate the sealing of the credential based on a qualified certificate from the issuer, endorsing with its own credibility the originality of the issued credential, and clearly differentiating it from self-issued credentials. Or it can be sealed later, contributing, depending on the case, its branding to the issued credential. In fact, in this sense, a Certification Authority could participate in the trust chain, issuing a credential from its Registration Authority, after a formal verification of the holder, or based on the authentic source that is considered appropriate to the case. It would imply that the signing process is carried out, similarly to the qualified signature, in the HSM of the TSP. In this way, we would be able to extend the trust context of classic electronic identities to the decentralized identities that concern us now. And this would cover not only the geographical context of eIDAS but would be perfectly applicable to any country with advanced electronic signature laws, to explore the use of verifiable credentials backed by their national PKIs. Regarding the capacity of recognition by the counterparts who want to verify the presentation of a credential, the stamping component itself is verifiable through the eIDAS Bridge, facilitating the recognition of the source entity that has signed the credential, in the same sense given to the act of sealing in PKI. But it also allows later taking advantage of the programmatic nature of verifiable credentials for process automation. If you are interested in eIDAS Bridge and want to try out our ready-to-use and supported implementation, feel free to reach out to us at vidchain@ValidatedID.com.",http://validatedid.com/post-en/the-time-for-the-eidas-bridge,,Post,,Ecosystem,,,,,,,,2022-02-18,,,,,,,,,,,,,
|
||
ValidatedID,ValidatedID,,,,,,,,,"A pilot project for interoperable decentralised identity between Aigües de Barcelona, CaixaBank and Validated ID","the solution has allowed CaixaBank, at the request of a fictitious user, to issue an account ownership credential and, subsequently, this credential has been used to proceed with the direct debit of the invoices of Aigües de Barcelona","This article is also available in French, German and Spanish. CaixaBank and Aigües de Barcelona test the interoperability between different blockchain digital identity networks with a solution coming from the startup Validated ID · Validated ID has created an innovative solution called VIDchain, with the aim of helping companies and users take control of their digital identity and exchange digital credentials in a simple, secure and private way. · The use cases tested during the pilot phase have made it possible to validate the use of digital identity to optimize the direct debit of bills in Aigües de Barcelona and to improve the risk scoring of a potential new CaixaBank customer. CaixaBank and Aigües de Barcelona have developed a proof of concept with the Catalan startup Validated ID, which has developed a decentralized digital identity solution based on blockchain technology. The objective of the project has been to demonstrate the feasibility of using digital identity in a decentralized manner to provide people with control of their identity and facilitate secure user access to online services. The pilot phase has helped build an identity solution that enables the identification of a customer and the exchange of secure information between two companies. Interoperability between different blockchain networks and sovereign digital identity standards has been successfully tested. 
During the pilot phase, carried out in a test environment without real users, the solution has allowed CaixaBank, at the request of a fictitious user, to issue an account ownership credential and, subsequently, this credential has been used to proceed with the direct debit of the invoices of Aigües de Barcelona. On the other hand, Aigües de Barcelona, also at the request of a fictitious user, has been able to issue a payment approval credential that allows the holder to obtain a better score when requesting a loan from CaixaBank. These are the two use cases tested during the pilot project. Control of digital identity for users Validated ID has created a sovereign identity platform called VIDchain, designed to help users take control of their digital identity and exchange digital credentials in a simple, secure and private way. Thanks to blockchain technology, users can manage their personal identity securely and can collect and store digital credentials, as well as choose which data they want to share with third parties at any time. One of the advantages of the solution created by Validated ID is that the startup works with national and international partners, and with different standards and blockchain networks. VIDchain has been the first wallet to pass the conformity test of the European Commission's EBSI network, and is compatible with other networks such as Alastria and Ethereum. This creates an open ecosystem of decentralized identity accessible to everyone. For this proof of concept, CaixaBank has issued verifiable credentials through the Dalion ecosystem (a sovereign digital identity collaborative project based on blockchain technology) on the Alastria network (national blockchain consortium). For its part, Aigües de Barcelona has used Ethereum to issue credentials. With this, interoperability across different networks has been tested in a way that is transparent to the end user. 
Collaboration with startups CaixaBank, together with other entities, has promoted Dalion, a collaborative project based on blockchain technology that aims to give people control over their personal data to make a single digital identity a reality, individually controlled and self-managed by each holder within a safe and reliable environment. With the aim of improving its ability and efficiency in innovation, CaixaBank is committed to collaborating with startups and fintech companies through initiatives such as zone2boost, an international innovation program promoted by the entity together with Visa, Global Payments and Worldline. CaixaBank also promotes open innovation to support startups through the DayOne Open Innovation Program. With this initiative, the purpose is to bring the entrepreneurial ecosystem closer to the entity's different business areas and jointly develop innovative projects that respond to the new needs of society and current challenges. Thanks to these initiatives, CaixaBank can speed up the time from when an idea arises until the new product or service is marketed, and more easily identify talent. Simultaneously, for startups, collaborating with financial institutions gives them the opportunity to have a multitude of very valuable resources, such as approaching an important portfolio of clients, having a large distribution channel, improving their brand positioning or gaining visibility. On the other side, Aigües de Barcelona is promoting its own open innovation laboratory in order to identify innovative technological solutions that generate a positive impact on society and respond to the challenges linked to the climate emergency. The company acts as a lever and link between the administration, startups and local entities of the Barcelona metropolitan area, in order to contribute to the Sustainable Development Goals. 
The lines of action of the laboratory are framed in six blocks: resilient water resources; the impact of global change; efficient infrastructure management; the environment and health; water and energy; and water demand management. The projects developed contribute to the achievement of one or more Sustainable Development Goals, putting the citizen at the center of digital transformation and advocating a perspective of technological humanism. About Validated ID Validated ID is a Spanish technology startup founded in 2012 in Barcelona, which offers secure trust services with VIDsigner eSignature and VIDchain self-sovereign identity. VIDsigner is our secure signature SaaS service for face-to-face and remote scenarios. It combines simplicity of use with the security of cryptography and biometrics. Its different types of signatures are combined with each other to adapt to the needs of different teams and people. With VIDchain we offer a new decentralized sovereign digital identity solution based on blockchain technology for digital identity verification processes such as KYC and user onboarding, in compliance with GDPR and AML. About CaixaBank CaixaBank is the leading financial group in Spain. With the incorporation of Bankia, it reaches a volume of 689,217 million euros in assets, which makes it the largest bank in the domestic market, with a relevant position at European level. In addition, CaixaBank has a strong presence in Portugal, where it controls 100% of BPI. The group, chaired by José Ignacio Goirigolzarri and directed by Gonzalo Gortázar, has 20.4 million customers and the largest network of branches and ATMs in Spain and Portugal, and is positioned as a leader in digital banking, with a base of 11 million digital customers. 
About Aigües de Barcelona Aigües de Barcelona manages the complete water cycle: from collection to purification, transport and distribution, as well as sanitation, purification and regeneration of wastewater, either for its return to the natural environment or for its reuse. Committed to people and the planet, the company provides services to nearly 3 million citizens of municipalities in the Barcelona metropolitan area and works with the clear purpose of improving people's quality of life and making cities a better place to live. To do this, Aigües de Barcelona integrates social and climate action in a transversal way, and its way of working is oriented towards proximity and dialogue with customers; excellence and responsibility in the provision of services; the commitment to innovation, digitization and internal talent; and collaboration with other companies, entities and administrations. In this way, the company promotes sustainable and prosperous cities, capable of confronting the biggest challenges humankind faces now and in the future.",https://www.validatedid.com/post-en/open-innovation-project-for-the-collaboration-between-large-companies-and-emerging-companies,,Post,,Meta,,,,,,,,2022-08-09,,,,,,,,,,,,,
ValidatedID,ValidatedID,,,,,,,,,Validated ID raises € 2M in financing round,"The new financing is led by Randstad Innovation Fund, Caixa Capital Risc, and Cuatrecasas Ventures","- The company – specialized in electronic signatures, electronic invoices, and digital identities – has secured new funding to consolidate its growth and expansion in key international markets. - The new € 2M investment will also anchor its positioning in the emerging blockchain-based digital identity (SSI) market. Validated ID, the Barcelona-based tech company which specializes in digital signatures, electronic invoicing, and digital identity, has raised a funding round of € 2M. The new financing is led by Randstad Innovation Fund, the corporate venture arm of Randstad – a global leader in the HR services industry helping over 670,900 candidates all over the world – with the participation of co-investor Caixa Capital Risc, CriteriaCaixa’s venture capital management company, which manages over € 215M in innovative companies. This round is also backed by founders, previous partners, and the leading law firm Cuatrecasas, via its investment vehicle for startups, Cuatrecasas Ventures. In the past few years Validated ID’s digital signature solution, VIDsigner, has established a leading portfolio of secure and easy-to-use electronic signature services with the highest legal and security standards. It is being used by over 100 partners and over 1,000 clients. This reach, along with the development of its new Self-Sovereign Identity service, VIDchain, has allowed the company to maintain steady three-digit growth. 
With the new resources and the support of leading companies in its sectors, Validated ID enters a new stage of expansion with the aim of intensifying the internationalization of its VIDsigner signature service, and accelerating the technological product development and market consolidation of VIDchain, its decentralized Self-Sovereign Identity service based on blockchain for digital identity verification processes. This solution is already getting massive attention in healthcare, public administration, and education, due to its expected impact on data breach-related costs, IT costs, operations costs, support function costs, and personnel costs. VIDchain ensures GDPR compliance and improves procedures such as customer onboarding, AML, and KYC, and is therefore expected to be especially significant for fraud reduction and counterfeit products. “We’re living through an intense stage of growth at Validated ID. The time is right, our technology and our services are aligned with the market, and we have a great team that day after day is able to ride this wave. In this sense, the entry of Randstad Innovation Fund, Caixa Capital Risc, and Cuatrecasas Ventures is proof that we are on the right track. It means we will be extending our digital signature service, VIDsigner, in a more ambitious way across European and LATAM countries and accelerating the evolution of VIDchain as a service aligned with the new universal identity models.” - Santi Casas, CEO of Validated ID Paul Jacquin, Managing Partner at Randstad Innovation Fund, added: “We are excited about the ongoing expansion of Validated ID’s e-signature solution in Spain and its potential in selected markets in Europe. On top of that, we are especially strong sponsors of their involvement in Self Sovereign Identity (SSI) and the work they do with leading institutions. It is a natural development of the company’s capabilities that will play out as the SSI market matures.” 
Xavier Álvarez, ICT Director at Caixa Capital Risc, also noted: “This operation is one more example of our commitment to companies with a promising future and innovative solutions based on technology. Validated ID is a project that perfectly matches the spirit we’re looking for in companies when it comes to investing. It has a solid project and a very attractive product, which makes it a company with high growth potential.” Validated ID has three main lines of business. First, VIDsigner provides an electronic signature service designed for all cases of face-to-face and remote use, with maximum levels of security, legal compliance, and usability. Second, with SP4i, it offers an electronic invoice service that meets national and European regulatory requirements. Finally, with VIDchain it offers a new sovereign digital identity service (Self-Sovereign Identity or SSI) based on blockchain technology and market standards, which makes it a universal model, capable of solving identity validation problems (security failures, privacy, GDPR, PSD2 and AML compliance, high costs, portability, and usability) in KYC and onboarding contexts. About Validated ID ─ www.ValidatedID.com Validated ID offers security trust services for electronic signature and digital identity verification processes. VIDsigner is our SaaS electronic signature service for face-to-face and remote scenarios. It combines simplicity of use with the security of cryptography and biometrics. Its different types of signature combine with each other to fit the needs of teams and organizations. About Randstad Innovation Fund ─ www.randstadinnovationfund.com As a strategic corporate venture fund, Randstad Innovation Fund is building a portfolio of early-stage HR technology investments, which are at the forefront of technology or transforming the talent acquisition and workforce management industry, hence boosting external innovation within the Randstad Group. 
The goal is to combine Randstad’s expertise and reach with an entrepreneurial spirit and technological excellence. Randstad Innovation Fund has invested in companies such as Montage, Goodwall, Allyo, HackerRank, and Pymetrics since its launch in March 2014. Randstad is the global leader in the HR services industry. The Group supports people and organizations in realizing their true potential by combining the power of today’s technology with our passion for people. In 2018, Randstad helped more than 2.5 million candidates find a meaningful job with our almost 250,000 clients. About Caixa Capital Risc ─ www.caixacapitalrisc.es Caixa Capital Risc, CriteriaCaixa’s venture capital management company, is an investor that provides equity and participating loans for innovative companies with high growth potential. With more than 10 years of experience, Caixa Capital Risc invests in start-up companies. Its investments are concentrated in three sectors: ICTs, health tech and industrial. Currently, Caixa Capital Risc manages, through eight investment vehicles, a volume of over 215 million euros in innovative companies in Spain and Portugal, and has a portfolio of 180 companies. Through its ICT area, Caixa Capital Risc invests in the initial phases of innovative companies that have projects for B2B markets with a strong technological component, led by committed entrepreneurs, with scalable value propositions and a vocation to build a global business. Currently, Caixa Capital Risc manages a volume of 73 million euros in the ICT segment, with a portfolio of 40 active companies. About Cuatrecasas Ventures ─ www.acelera.cuatrecasas.com Cuatrecasas Ventures is the investment vehicle for startups specialized in the LegalTech field, promoted by the law firm Cuatrecasas. Cuatrecasas is one of the leading law firms in Europe with a strong presence in Spain, Portugal, and Latin America. 
With a multidisciplinary and diverse team of more than 1,600 professionals and 24 nationalities, it covers all business law practices, applying knowledge and experience from a sectoral perspective focused on each type of business. It has 28 offices in 13 countries and also maintains close collaboration with other leading firms to offer a team adapted to the needs of each client and situation.",https://www.validatedid.com/post-en/validated-id-raises-eu-2m-in-financing-round-from-randstad-innovation-fund-caixa-capital-risc-and-cuatrecasas-ventures,,Post,,Meta,,,,,,,,2019-11-05,,,,,,,,,,,,,
ValidatedID,ValidatedID,,,,,,,,,Validated ID turns 10 years old! The best is yet to come,"Today, we want to celebrate our ten years, reflecting on what we have accomplished and anticipating the future, because we are sure that there will be many more to come. As a result, we want to reaffirm our commitment to our objectives and mission. Moreover, we strive to improve our operations to ensure a prosperous future for our customers and partners.","This article is also available in French, German and Spanish. “I find it incredible that we are celebrating 10 years since the company was founded,” commented Santi Casas, CEO and one of the founders of Validated ID. This project was born with great enthusiasm, but its founders were not sure what the result would be. The four founders – Santi Casas, Iván Basart, Jaume Fuentes and Fernando Pino – today remember that they have gone through difficult and happy times, but they are very proud of what they have achieved so far and, above all, of the people who have made this company successful. We cannot talk about the history of this decade without referring to the achievements and lessons learned, since they have been a fundamental part of our growth. At Validated ID we have gone from 5 employees at its foundation to more than 50 today, located not only in Spain but also in the United Kingdom, France, Germany, Ecuador, Israel and Turkey. On the other hand, thanks to the more than 200 partners that offer our digital signature, we currently serve clients in more than 30 countries in Europe, Africa, Asia and Latin America. As a result of this effort and collaboration, we have provided more than 20 million signatures worldwide since our humble beginnings, with constant growth and permanent innovation in our trusted services. Our commitment at Validated ID since it was founded is to provide the best digital identity and signature solutions with the aim of maximizing our clients' productivity and reducing environmental impact. 
Likewise, we work hard to establish ourselves as leaders both nationally and internationally through VIDsigner, VIDchain and SP4i. We continue to work with the same passion as always in order to achieve new goals that drive us to the top. Our core values guide us in our work: autonomy, as we help individuals and organizations to empower themselves and achieve better results; teamwork, since we know that achievements are always the result of people; the personalization of our services, because no two people or two companies are the same; simplicity, since the key to our success has always been to create services that are easy to use; and, finally, security, since as service providers that handle sensitive information, we understand that security is key to offering peace of mind to both our customers and suppliers. With VIDsigner, our first service, we want to offer the security that the traditional electronic signature gives us together with the new possibilities offered by the latest generation of touchscreen devices. VIDsigner, together with our electronic invoicing service SP4i, is our commitment to achieving “zero paper” in organizations. As Confucius said, “Study the past to design the future.” And with VIDchain, our mission is to foster a digital world where trust, privacy and security prevail to make our daily lives easier. The commemoration of this birthday is a source of pride for all of us and coincides with a difficult moment worldwide, to which we have been able to adapt, seeing opportunities where there were obstacles. At the same time, we remain firm in our commitment to digitization as a tool to create a sustainable planet. Today, we want to celebrate our ten years, reflecting on what we have accomplished and anticipating the future, because we are sure that there will be many more to come. As a result, we want to reaffirm our commitment to our objectives and mission. 
Moreover, we strive to improve our operations to ensure a prosperous future for our customers and partners. Because the past is written in stone, we wish to say farewell with what we have in our hands: the future. As the title of this post says, 'the best is yet to come', and it will come from all of you, our team at Validated ID as well as VIDsigner, VIDchain and SP4i. Many thanks to the entire team and partners for your dedication, performance and passion. Without you, we would not have been able to celebrate this anniversary.",https://www.validatedid.com/post-en/validated-id-turns-10-years-old-the-best-is-yet-to-come,,Post,,Meta,,,,,,,,2022-03-01,,,,,,,,,,,,,
ValidatedID,ValidatedID,,,,,,,,,Validated ID's journey to becoming EBSI compliant,"[Wallet Conformance Tests] are designed to demonstrate that the wallet provider can onboard users safely, receive verifiable credentials from a trusted issuer, and present verifiable credentials to a verifier. All of this, of course, using EBSI infrastructure.","This article is also available in French, German and Spanish. We at Validated ID have been betting on EBSI since the beginning. We have been working to become a conformant wallet provider since the very first version of the Wallet Conformance Tests (WCT) was published. The process of preparing our solution to become conformant has allowed us to appreciate how remarkable EBSI's work has been. In this article, we provide insights into what these tests consist of, and we share with you our experience performing these tests as a wallet provider. In essence, the WCT are designed to demonstrate that the wallet provider can onboard users safely, receive verifiable credentials from a trusted issuer, and present verifiable credentials to a verifier. All of this, of course, using EBSI infrastructure. Each scenario is clearly separated in the tests, and the wallet provider shows this by including an identifier (header) in requests sent to the EBSI APIs. The EBSI support office is therefore later able to analyze whether the flow performed by the wallet is correct. Let's take a closer look at what we had to demonstrate. Onboarding users is the first scenario. Users’ identifiers created within a wallet need to be correctly registered in the DID registry. DIDs are stored in this registry, along with their associated public keys, on the blockchain. In simple terms, registering your identifier is the first step to interacting with other members of the network. 
It requires several steps to verify that there is a person behind the process, and several cryptographic challenges that follow protocols ensuring the keys are controlled by the person they are associated with. Although this might seem a bit intimidating, rest assured that the wallet handles everything behind the scenes. In our case, VIDwallet users only need to scan a QR code. This is enough to notify EBSI to register the DID created in VIDwallet. For the second scenario, EBSI has developed a “mock issuer” service by means of an API, i.e., the Conformance API, which allows a wallet to request a credential from this mock issuer; the service then sends the credential to the wallet. In the third scenario, the Conformance API is used to demonstrate that the wallet can create a valid verifiable presentation. In other words, the API acts as a “mock verifier”, so the wallet provides a presentation that the API will verify. Once the presentation is shared with the “mock verifier”, the result of the validation is returned. Whenever new technologies and ecosystems are introduced, there will be many challenging and unpredictable changes that must be addressed. We are dealing with Self-Sovereign Identity (SSI), a promising technology but still in its infancy, and, because we started with the very first version of the WCT, we had to adapt to several changes along the way. However, to achieve a goal you've never reached, you'll have to take steps you've never taken. Then again, let's face it, it has not been easy, since we're dealing with new technology and a variety of stakeholders must agree on how to handle a process. A number of challenges were overcome, but as a result, we now have a proper WCT suite that can be used by everyone. EBSI's extraordinary work and the agile way they worked with all wallet providers allowed us to provide feedback and learn during the process. 
The outcome is irrefutable: a refined WCT suite is now precisely defined, and any wallet provider can submit their integration much faster. As of the time of writing this article, five wallet providers have already been able to pass at least one conformance scenario. We are proud to say that we were the first to pass the conformance test, and with the most use cases covered. EBSI offers a clear guide on how to consume their services to become conformant, and is now fully equipped to evaluate newcomers. Therefore, we encourage other wallet providers to become conformant and start collaborating in cross-border and cross-provider scenarios. Users must be free to choose any conformant wallet and interoperate with issuers and verifiers freely and securely. In our view, this is where the real value of SSI lies. Let’s get the ball rolling!",https://www.validatedid.com/post-en/validated-ids-journey-to-becoming-ebsi-compliant,,Post,,Meta,,,,,,,,2022-03-10,,,,,,,,,,,,,