
---
published: false
---

Data Governance

  • Marissa Mayer wants to fix your address book CNBC

    At launch, Mayer's start-up is rolling out Sunshine Contacts, an address book app that relies on artificial intelligence to find and merge duplicate contacts, fill out incomplete information and continually keep that data up to date. The app integrates with the iOS Contacts app as well as Gmail and will be free to all iOS users with an invitation.

  • You are not your Data but Your Data is still You.

    In the digital age, individual privacy in the broadest sense is about control over protecting one's personally identifiable information (PII), such as information about health, credit, shopping, or communication. But the types of information deemed personally identifiable and the amount of control one has over them vary around the world.

  • Ada Lovelace Institute (ALI) Shares Highlights and References

    discussing different approaches to data stewardship and potential principles individuals and organisations can follow

  • Local-first software

    a set of principles for software that enables both collaboration and ownership for users. Local-first ideals include the ability to work offline and collaborate across multiple devices, while also improving the security, privacy, long-term preservation, and user control of data.

  • Personal Data Warehouses Simon Willison

    If you're like me, and you love building side-projects but you don't like paying $5/month for them for the rest of your life, this is perfect.

  • Why framing “data” as an asset or liability is dangerous MyDigital Footprint

    If there is one thing that can change finance's power and dominance as a decision-making tool, it is the rest of the data. According to Google (2020), 3% of company data is finance data when considered as part of an entire company's data lake. McKinsey reports that 90% of company decisions are based on finance data alone, that same 3% of data.

If you are in the shoes of accounting, audit or finance, how would you play the game to retain control when something more powerful comes on the scene?

[...] because data from their healthcare provider, acquired into the server, can be used to authenticate and assert that fact without the need to give any identity information. By using PDAs, apps that rely on sensitive data will be able to access this and stay “identity blind”. One effect of the Covid-19 pandemic will be the increased use of PDAs.
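To make the “identity blind” pattern concrete, here is a minimal sketch in Python. It is purely illustrative and assumes a hypothetical PDA interface (`PersonalDataAccount`, `assert_fact`): the app receives only a yes/no assertion, never the underlying record or any identifier.

```python
# Minimal illustrative sketch (hypothetical API, not a real PDA SDK): an app asks a
# personal data account (PDA) to assert a fact without receiving any identity data.
from dataclasses import dataclass

@dataclass
class HealthRecord:
    holder_id: str      # known only inside the PDA, never returned to the app
    vaccinated: bool

class PersonalDataAccount:
    """Holds records acquired from, e.g., a healthcare provider."""
    def __init__(self, records):
        self._records = records

    def assert_fact(self, predicate) -> bool:
        # Only the boolean answer leaves the PDA; the records themselves do not.
        return any(predicate(r) for r in self._records)

# The app stays "identity blind": it learns the answer, not who the holder is.
pda = PersonalDataAccount([HealthRecord(holder_id="opaque-123", vaccinated=True)])
print(pda.assert_fact(lambda r: r.vaccinated))  # True
```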

Google has all my e-mail. (And I don't. They merely let me access it with a browser.)

Facebook has the list of all of my friends and what I said to them. (And I don't.)

LinkedIn has all of my business contacts. (Repeat after me: and I don't.)

Instagram has all my photos. Well, the Instagram department of Facebook does. (Chorus now: and I don't.)

Amazon has the list of all my purchases, and knows what products I was interested in but didn't buy after all. (AND I DON'T.)

I re-read Zhamak Dehghani's original and follow-on posts. Zhamak is the creator of the data mesh. In her second post she identifies four data mesh principles (a rough code sketch follows the list):

  1. Domain-oriented decentralized data ownership and architecture
  2. Data as a product
  3. Self-serve data infrastructure as a platform
  4. Federated computational governance
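As a way of grounding these principles, here is a hedged sketch in Python. None of these class or method names come from Dehghani's posts; they are assumptions chosen to show how domain ownership, data products, a self-serve platform, and federated computational governance might fit together.

```python
# Illustrative sketch only; all names (DataProduct, SelfServePlatform, ...) are assumptions.
from dataclasses import dataclass

@dataclass
class GovernancePolicy:            # 4. federated computational governance
    name: str
    check: callable                # an executable rule applied to every data product

@dataclass
class DataProduct:                 # 2. data as a product
    domain: str                    # 1. domain-oriented decentralized ownership
    name: str
    output_port: str               # where consumers read the data
    owner_team: str

class SelfServePlatform:           # 3. self-serve data infrastructure as a platform
    def __init__(self, policies):
        self.policies = policies
        self.catalog = []

    def register(self, product: DataProduct):
        # Governance is computational: policies run automatically at registration.
        for policy in self.policies:
            if not policy.check(product):
                raise ValueError(f"{product.name} fails policy {policy.name}")
        self.catalog.append(product)

platform = SelfServePlatform([GovernancePolicy("has-owner", lambda d: bool(d.owner_team))])
platform.register(DataProduct("payments", "daily-settlements",
                              "s3://payments/settlements", "payments-data-team"))
```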
  • Attitudes To Personal Data Management

    In recent years, personal data has been an increasingly popular topic of conversation for marketers, data analysts, regulators, and privacy warriors. Individuals have learnt that recent regulatory updates have given them more rights over how that data is used. Are these two forces aligned?

    We distributed a survey and received over 400 responses from both individuals and organisations answering questions about the management of personal data. How aligned are the two points of view? This infographic shows a summary of key questions and responses.

  • What Does It Actually Mean When a Company Says, “We Do Not Sell Your Data?” John Philipin

Probably because the alternatives produce even more income.

Given his breadth of experience and alignment with a number of strategic sectors where aNewGovernance is currently developing ecosystems, I am sure he will make an incredible contribution.

The compelling data and research suggest that my original question now needs to be reframed. People most certainly do care about their data privacy. The question now is: how are organisations going to bridge this data privacy chasm?

So that there is no uncertainty or doubt, however, Duball reports that, while consumer privacy is a chief concern for the commission, it is not the primary concern to the exclusion of other concerns. The commission is also worried about algorithmic bias and “dark patterns” practices.

The scope of the Self-Sovereign Identity Personal Data Usage Licensing (SSI-PDUL) Model is personal digital identifiers and any associated identity data presented by Alice to the App. It does not include the permissioning of data internal to the App (although the natural extension of the solution to internal data is an obvious one).

How Alice User, an App User and Identity Owner, and Bob Developer, an App Developer and App Controller, might negotiate the use of Alice's personal digital identifiers and any associated personal identity data by Bob's app, based on Self-Sovereign Identity Model Usage Principles.
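The SSI-PDUL work itself does not prescribe code, but a small sketch can show the shape of such a negotiated licence. Everything below (the `UsageLicense` fields, the `negotiate` helper, the example DIDs) is an assumption for illustration, not the model's actual data format.

```python
# Hypothetical illustration of a negotiated usage licence between Alice and Bob's app.
from dataclasses import dataclass
from datetime import datetime, timedelta

@dataclass
class UsageLicense:
    identifier: str               # Alice's personal digital identifier (e.g. a DID)
    licensee: str                 # Bob's app, the controller requesting usage
    permitted_purposes: list      # what the app may do with associated identity data
    expires: datetime

def negotiate(requested_purposes, alice_accepts):
    """Alice grants only the purposes she accepts; everything else is dropped."""
    granted = [p for p in requested_purposes if p in alice_accepts]
    return UsageLicense(
        identifier="did:example:alice",
        licensee="did:example:bobs-app",
        permitted_purposes=granted,
        expires=datetime.utcnow() + timedelta(days=365),
    )

licence = negotiate(["login", "marketing"], alice_accepts={"login"})
print(licence.permitted_purposes)  # ['login'], the marketing purpose was refused
```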

This multidimensional value—authenticity, compliance, integrity, and resilience—coupled with being easy to integrate is what separates the Indicio approach from the rest. Our growth in 16 months, with global enterprise customers and a global decentralized blockchain network supported by 23 companies on five continents, is a sign that fundamental change is coming in the way we share information.

Mydex CIC has just published a blog for Cambridge University's Data Trust Initiative on Helping Data Trusts Manage Personal Data. In it, we address the challenges that arise as the Data Trust movement begins to scale.

we were presenting at the Open Data Institute's event on Data Sharing and the Rise of Data Institutions — a crucially important subject for the years ahead. (You can see the slides of our presentation here.)

Ontology is partnering with 4EVERLAND, a Web 3.0 cloud computing platform enabling global acceleration, privacy protection, distributed storage and other technical features to accelerate the move towards Web 3.0.

Individual rights are historically hard to come by. Strong people make them possible. The first requirement for their existence is thus strong people.

The main reason why vital information is not getting where it needs to be is that our data economy has evolved to be an organisation-centric One User One Use (OUOU) system — whereas, thanks to the inner logic of data itself, it needs to operate as a Many Users, Many Uses (MUMU) data ecosystem.

Data Portability

  • ALIAS automating GDPR portability for application developers.
  • Checkpipe Charlie tool for describing and validating data.
  • DIP Vaccination & Immunization Management using Verifiable Credentials.
  • Domi SSI-based digital passport to facilitate data portability in the housing rental sector.
  • DPella Data analyses with privacy in mind.
  • IDADEV-P2P Blockchain-based data portability system.
  • OpenPKG decentralised data provenance system for improved governance and portability of personal data.
  • OpenXPort Open export of data across different systems and providers.
  • ORATORIO Energy data exchange platform.
  • Prov4ITData Provenance-aware querying and generation for interoperable and transparent data transfer.
  • UI-Transfer complete solution for “user initiated inter-controller and continuous data transfer”.
  • Data as competitive advantage & control mechanism in platform economy

    Presenters: Sangeet Paul Choudary, Molly Schwartz. Session host: Riikka Kämppi. Molly Schwartz chats with Sangeet Paul Choudary, best-selling author of Platform Revolution and Platform Scale and founder of Platformation Labs, to unpack the ethics and economics of data.

  • Decentralized machine learning to respond to the health crisis

    The current health crisis has shown how essential it is to have data in order to make political decisions. [...]

    We present here one of these solutions, which allows training an AI with the data of many individuals without ever disclosing it to third parties, thanks to a decentralized protocol (a generic sketch of the idea follows this list).

  • Kaliya & Seth talk Synthetic Data with Harry Keen, CEO of Hazy.com PSA Today #33

    Originally a UCL AI spin-out, London-based Hazy was initially incubated by Post Urban Ventures and the CyLon cybersecurity accelerator. Our startup began by trying to fix the flaws of traditional data redaction and then data anonymisation. We soon discovered anonymised data will always pose a risk of re-identification.
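The decentralized training idea mentioned above is, in general form, federated learning: participants compute updates on data that never leaves them, and only those updates are aggregated. The sketch below is a deliberately simplified, assumption-laden illustration of that general pattern (plain federated averaging on a toy one-parameter model), not the specific protocol the article describes.

```python
# Toy federated-averaging sketch: raw data stays local; only updates are shared.
import random

def local_update(weights, local_data, lr=0.1):
    """One gradient step computed on data that never leaves the participant."""
    grad = sum(x - weights for x in local_data) / len(local_data)
    return weights + lr * grad

def federated_round(global_weights, participants):
    updates = [local_update(global_weights, data) for data in participants]
    return sum(updates) / len(updates)   # the coordinator only ever sees updates

participants = [[random.gauss(5, 1) for _ in range(20)] for _ in range(3)]
w = 0.0
for _ in range(50):
    w = federated_round(w, participants)
print(round(w, 2))  # converges toward the participants' mean, roughly 5
```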

Dynamic Data Economy: Digital Identity, Authentic Data Flows, Data Mesh and other dragons by Robert Mitwicki

DDE, HCF, Data Mesh, KERI, OCA

The session was held by Human Colossus Foundation folks, who described the vision for DDE, which is being developed within the Colossi network around the foundations.

Dynamic Data Economy is a roadmap towards a fair, decentralized and authentic data economy. People often refer to blockchain technology as a revolution within the digital space, but what they actually mean is something more profound: the promise of decentralisation brought by blockchain. A Dynamic Data Economy brings decentralization outside the technology realm into digital solutions for any economic actor. It does so by decentralizing all layers of the ecosystem:

  • Decentralized Governance - no administrative entity fully controls and sets the rules; individuals, organisations and governments are sovereign over their data governance.
  • Decentralized Architecture - physical decentralization of the resources running the network. Economic actors keep control of their data storage solutions according to the level of security required.
  • Decentralized Logic (Data) - if you cut the system in half it keeps working and the data is not damaged in any way, e.g. there is no need for total ordering.

No blockchain fulfills all of those requirements, and some fulfill none at all. That is a problem for sensitive areas such as identity or data portability. The agreements on sets of principles, protocols and rules needed to fulfill those requirements then become “add-ons”: they are not in the system by design, which weakens the overall solution. The Human Colossus Foundation (HCF) is therefore seeking to open up discussion and lead standardization efforts to ensure that existing decentralized technologies bring a Dynamic Data Economy to life for everyone, with and without blockchain.
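To illustrate the “no total ordering” point, here is a minimal, assumption-laden sketch (not HCF's design): each actor keeps its own hash-chained event log, which anyone can verify on its own, so no global blockchain ordering across actors is required for data integrity.

```python
# Per-actor hash-chained logs: each verifies independently, no global ordering needed.
import hashlib, json

def digest(event: dict) -> str:
    return hashlib.sha256(json.dumps(event, sort_keys=True).encode()).hexdigest()

def append(log: list, payload: dict) -> list:
    prior = digest(log[-1]) if log else None   # link each event to the previous one
    log.append({"seq": len(log), "prior": prior, "payload": payload})
    return log

def verify(log: list) -> bool:
    # Integrity holds if every event correctly references the digest of its predecessor.
    return all(e["prior"] == digest(log[i - 1]) for i, e in enumerate(log) if i > 0)

alice_log = append(append([], {"msg": "rotate-key"}), {"msg": "share-credential"})
bob_log = append([], {"msg": "issue-credential"})
# "Cutting the system in half" breaks nothing: each half still verifies its own logs.
print(verify(alice_log), verify(bob_log))  # True True
```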

This was a session to discuss the topics I brought up in my article on the authentic data economy:

With each successive wave of computerization the new innovations built on the last, each one taking more human-scale processes, shrinking them down, and putting them into computers and eventually online. The authentic data economy isn't any different. It leverages advances in data collection, networking and personal computing. It makes our data ours and authentic. It builds on all of the previous work done by countless engineers and inventors and dreamers. However, by being the last big problem it represents the final piece that brings together everything that came before it. The scope of the authentic data economy is literally everything in the human sphere. There is nothing that this won't change. Trust will go everywhere and into everything. But most importantly, so will privacy.

I talked about how the break from the W3C DID specs and other key innovations in cryptography have enabled me and Mike Lodder to design a solution for identity and all data provenance that is (1) privacy preserving and (2) scalable globally, and how that creates an opportunity for authentic data to become the primary way data is used in the world.

Presentation: https://docs.google.com/presentation/d/1WOXgHhgAwG0Im45pZkTAhsadpd8xbck0xjlnsuVGGhI/edit?ts=60803bc8

The goal of this session is to present the idea and get community feedback regarding this.

Credential Marketplace is quite high up the SSI stack but we want to start this discussion.

  1. What is a Credential Marketplace?
  2. We have a Trust Triangle of Issuer-Holder-Verifier. This does not need any centralized entity except schema hosting.
  3. However, we want to solve the problem of discovery of Issuers and Verifiers.
  4. Example: I'm traveling to a new country. I need to find out which healthcare VCs are needed to go there, in an automated way.
  5. How can we solve this without relying on a centralized registry of Verifier requirements and Issuer capabilities?
  6. How it works
  7. In order to discover issuers / VC types, there should be a registration step where issuers/verifiers actively or passively provide metadata about their capabilities (see the sketch after this list).
  8. Credential Data — can contain some filters or constraints on the data from within the VC. E.g. as a Verifier, I only accept passport VCs from certain governments: only German nationals.
  9. VC Metadata
  10. Issuer Metadata
  11. Reputation mechanism for credential issuers
  12. Marketplace can also implement value transfer: paying for issuance by the verifier, for example, even if the participants are part of different SSI ecosystems. This is optional but can help incentivize different participants.
Brent Shambaugh. Integrate computing, processing, storage. Virtual machine to integrate databases. Query language, like Linked Data, over a lot of different systems. SeeQL(?) - translate. Categorical databases (Ryan and David's work): to not really rely on a single source of truth, but more rely on transformations between things. Use category theory to have an exact/provable way of doing that. Cat theory has to follow certain rules. Cat theory is kind of abstract but provides a framework for unrelated disparate things. Ryan could say how to algebraically describe things, which would branch off into… Josh at Uber: come up with schemas, get down to the data/logical layer. Different places to go. Way to translate in from out. Might have multiple different ones, want to map from each one, but if you have a vague intermediary in a centralized model, loosely defined, that both map to, then you have a mapping between the two things. Linked Data problem.

  • [...]

Linking data together is about machine readability. Humans involved… need to understand. Do it through language. Humans like OCA because they can understand data in different languages; it makes sense for people. Human element. In that capture space. Want to refine OCA, take out some of the rules parts, masking overlay, conditional overlays, and get it away from OCA as architecture - it convolutes things. OCA is only meant for making things human-readable.
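As a purely illustrative aside on the mapping-through-an-intermediary idea in the session notes above (not anything the presenters showed): two source schemas can each map into a loosely defined intermediary vocabulary, and a mapping between them is then obtained by composing through that intermediary.

```python
# Hypothetical example: compose record-level mappings through a shared intermediary schema.
def compose(f, g):
    """Apply mapping f, then mapping g."""
    return lambda record: g(f(record))

def source_a_to_intermediary(r):
    return {"person_name": r["fullName"], "person_email": r["email"]}

def intermediary_to_source_b(r):
    return {"name": r["person_name"], "contact": r["person_email"]}

a_to_b = compose(source_a_to_intermediary, intermediary_to_source_b)
print(a_to_b({"fullName": "Ada Lovelace", "email": "ada@example.org"}))
# {'name': 'Ada Lovelace', 'contact': 'ada@example.org'}
```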

Discussion moved to this Miro board:

17:10:05 From Bruce Conrad to Everyone : Another dystopia prediction from 1909: "The Machine Stops" is a science fiction short story (12,300 words) by E. M. Forster. [quoted from wikipedia]

17:18:50 From Jemima Gibbons to Everyone : https://join.slack.com/t/oneteamgovernment/shared_invite/zt-2tsf24lc-zhqjU6GIjWiDem_APXc0BQ

17:22:31 From Jemima Gibbons to Everyone : https://sfadigital.blog.gov.uk/2017/03/24/dont-bring-policy-and-delivery-closer-together-make-them-the-same-thing/

17:52:46 From Orie Steele to Everyone : Shameless plug for our work with GS1 on VCs

The data privacy/control issue isn't new, but the attitude shift is. People care more, demand more, and the scale of change that has occurred due to the Covid-19 pandemic is major. As we live through times exposing such injustice and inequality, it's becoming evident that this personal data ecosystem needs to undergo a major revamp.

Point 5. User control over data—data governance arrangements should ensure users' privacy and control over data:

  • Today: Users trust intermediaries to keep data safe, but they do not have sufficient control over their data
  • Crypto: Transactions are public on the blockchain—which will not work with “real names”
  • Tomorrow: New data architectures can give users privacy and control over their data