
published: false

User Experience

That means 95% of US consumers are saying no way to cross-app tracking with Apple's new App Tracking Transparency (ATT) feature.

Rather than having one huge, expensive, and probably illegal data hub, every customer becomes a data hub in their own right. They provide the data needed, just-in-time, under their control.

The YouGov poll of consumers in France and Germany we mentioned earlier says it's the behind-the-scenes or back-door nature of personalization that gives people the creeps.

Harms and User Risks. High Level UX by David Schmudde

Focused on communicating risks/harms to the user. Focus on the high-level user experience.

Discussion notes, key understandings, outstanding questions, observations, and, if appropriate to this discussion: action items, next steps:

  • Steve Venema suggested the Privacy Co-op
  • Make an individual's policy decisions disappear into their workflow. Whenever the application needed a resource, we knew the answer from the action they took in the UX.
  • Trust based on the context of the other people I know.
  • Web of trust: have my friends shopped here?
  • Reputation: what is the ranking of this place?
  • Revocation
  • Information is given, cannot be revoked (photo of a driver's license)
  • Permission is given, can be revoked (allow a 3rd party to say I have a driver's license)
  • Trust based on browsing history
  • TOFU: Trust On First Use - trusted it once, will trust it again (a minimal sketch follows this list)
  • Kantara Initiative: agreed to terms once. Will stay agreed unless they change.
  • The opposite of "who do you trust?" is "how are you making yourself vulnerable?"
  • Kantara Initiative
  • Obligations, and consequences for violating them
  • "identity trust workgroup" - Adopt the Personal Data Categories from Enterprise Privacy for the Consent Receipt V 1.1
  • Consent for each purpose. People give consent at the purpose-level.
  • What is the right balance between ease of use and identity, and how do we strike it in real life?
  • Idea of how we even get our identity back from where it's stored
  • Current tradeoff for privacy is solutions that are barely usable (DuckDuckGo, SSB)
  • Real versus online world
  • Focused on the idea of context for each thing
  • Lively debate about the nature of reality versus virtuality
  • Discussion of whether corporate ownership of data is “data assault” and that the term data theft might be too mild.
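
The TOFU pattern noted above can be made concrete. As a minimal sketch (the fingerprint format and the in-memory store are illustrative assumptions, not from the session notes), a client records a peer's key fingerprint the first time it connects and only flags the peer if that fingerprint later changes:

```typescript
import { createHash } from "node:crypto";

// Illustrative TOFU (Trust On First Use) check: trust the key seen on first
// contact, and flag any later change. The in-memory Map stands in for a
// persistent store (file, database, keychain).
const knownFingerprints = new Map<string, string>();

function fingerprint(publicKeyPem: string): string {
  return createHash("sha256").update(publicKeyPem).digest("hex");
}

type TofuResult = "trusted-first-use" | "trusted-known" | "fingerprint-changed";

function checkTofu(peerId: string, publicKeyPem: string): TofuResult {
  const fp = fingerprint(publicKeyPem);
  const known = knownFingerprints.get(peerId);
  if (known === undefined) {
    knownFingerprints.set(peerId, fp); // first use: remember and trust
    return "trusted-first-use";
  }
  return known === fp ? "trusted-known" : "fingerprint-changed";
}

// Example: the second call with the same key is trusted without re-prompting.
console.log(checkTofu("example.org", "-----BEGIN PUBLIC KEY-----..."));
console.log(checkTofu("example.org", "-----BEGIN PUBLIC KEY-----..."));
```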

Links from Chat:

Catherine Nabbala, 10:56:43 AM

For offline discussions, pls email: win@finema.co

Takashi Minamii, 11:29:45 AM

FYI: Hitachi's Solution (PBI) https://www.hitachi.com/rd/sc/story/pbi/index.html

Brief but rich conversation about what technologies may be available, practicable, or in development to use with kids and their online presence.

Use case: Wonderland Stage & Screen is interested in developing a platform to support youth creating media to share, comment on, and discuss their work that meets COPPA guidelines, allows freedom of participants, and provides mechanisms for privacy.

  • Create an onboarding process that models a physical process
  • Collect information
  • Issue a credential
  • Offer wallet options for use
  • What kinds of credentials could we use? (a sketch follows this list)
  • View only
  • Interactive
  • Comment enabled
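
As a hedged sketch of how the credential options above might be modeled (the type names, claim fields, and issuance function are illustrative assumptions, not something decided in the Wonderland discussion), a platform-issued credential could carry a participation level that is checked before enabling viewing, interaction, or commenting:

```typescript
// Illustrative only: a platform-issued credential whose key claim is the
// participation level discussed above (view only, interactive, comment enabled).
type ParticipationLevel = "view-only" | "interactive" | "comment-enabled";

interface PlatformCredential {
  subjectId: string;          // pseudonymous ID created during onboarding
  level: ParticipationLevel;  // what the holder may do on the platform
  guardianConsent: boolean;   // recorded during the COPPA-style onboarding step
  issuedAt: string;           // ISO 8601 timestamp
}

function issueCredential(
  subjectId: string,
  level: ParticipationLevel,
  guardianConsent: boolean
): PlatformCredential {
  return { subjectId, level, guardianConsent, issuedAt: new Date().toISOString() };
}

function mayComment(credential: PlatformCredential): boolean {
  return credential.guardianConsent && credential.level === "comment-enabled";
}

// Example: a credential issued after onboarding, checked before enabling comments.
const cred = issueCredential("user-123", "comment-enabled", true);
console.log(mayComment(cred)); // true
```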

The earliest discussion of the phrase I could find is a blog post from August 4th, 2011 by the “Chief Lizard Wrangler” herself, Mitchell Baker, the CEO of Mozilla. In it she prophetically describes user sovereignty as the consequence of new “engines” that are “…open, open-source, interoperable, public-benefit, standards-based, platforms…” She also makes the critical link between the philosophy of openness and standards-based interoperability with that of identity management and personal data dominion.

  • EPS for SSI (Self-Sovereign Identity)

    you might be interested to hear that the core of EPS is designed to convert images to high-entropy codes, which work as very long passwords and also as the seeds of symmetric or asymmetric cryptographic keys. (A key-derivation sketch follows the list below.)

  • Testing self-sovereign identity with the Lissi demo

    We are convinced this demonstrated user flow can help to better understand the interactions in a digital identity ecosystem such as IDunion. [...] The Lissi team is in discussion with trust service providers, authorities, municipalities, agencies, associations and other relevant stakeholders to meet all the necessary requirements and provide you with the best user experience.

  • Self-Sovereign Identity for Social Impact & Importance of UX Jimmy J.P. Snoek, Tykn

    We saw pretty early that the puristic view of SSI, in terms of having everything stored on edge wallets — when you go to somewhere in Sub-Saharan Africa, that's going to be pretty difficult, when there's maybe one phone in a village and it's not even necessarily a smartphone. It's very easy to say, “Oh yeah, but within SSI, everything has to be stored on the edge wallet.” What we saw was that if you make that this hard requirement, and keep working from that, then all these population groups are just going to be left behind more and more.

  • Sexism in Facial Recognition Technology Berkman Klein Center

The use of facial recognition by law enforcement agencies has become common practice, despite increasing reports of false arrests and jail time. While there are various downsides to facial recognition technology being used at all, including fears of mass surveillance and invasion of privacy, there are flaws within facial recognition technologies themselves that lead to inaccurate results. One such major challenge for this still-burgeoning technology is gender-based inaccuracies.
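
The EPS quote above describes turning an image-derived, high-entropy code into key material. A minimal sketch of that last step, assuming the high-entropy code is already available as bytes (the salt, info label, and HKDF parameters are illustrative assumptions, not EPS specifics):

```typescript
import { hkdfSync, randomBytes, createCipheriv } from "node:crypto";

// Illustrative only: derive a 256-bit symmetric key from a high-entropy code
// (such as one produced from an image), then use it for AES-256-GCM encryption.
function deriveSymmetricKey(highEntropyCode: Buffer, salt: Buffer): Buffer {
  return Buffer.from(hkdfSync("sha256", highEntropyCode, salt, "eps-demo-key", 32));
}

const code = randomBytes(64);            // stand-in for the image-derived code
const salt = randomBytes(16);
const key = deriveSymmetricKey(code, salt);

const iv = randomBytes(12);
const cipher = createCipheriv("aes-256-gcm", key, iv);
const ciphertext = Buffer.concat([cipher.update("hello"), cipher.final()]);
console.log(ciphertext.toString("hex"));
```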

Questions of control over personal data were a cross-cutting theme throughout a Research Sprint co-hosted by the Berkman Klein Center for Internet & Society and Digital Asia Hub. The Sprint also examined other important dimensions of self-determination in the digitally networked world, for instance, self-expression and participation in civic life and the digital economy, or relationship-building and well-being, to name just a few application areas.

We should be able to “tap and prove” any important facts and figures about ourselves as easily as we tap and pay with a mobile phone at any one of 100s of millions of terminals globally.

Raj Hegde sits down with identity veteran Nat Sakimura, Chairman of the OpenID Foundation, to understand how user-centric learnings from existing authentication protocols can be applied to future identity initiatives.

The Spotlight Report, “Consumer Sensitivity to Location Tracking by Websites and Mobile Apps”, was developed to validate the Location Commitment scoring criteria in the Me2B Alliance Safe & Respectful Technology Specification. The specification, produced by the Me2B Alliance's Respectful Tech Spec Working Group, is designed to provide a standard for measuring safe and ethical behavior in connected technology.

The Me2B Alliance (“Me2BA”) is a nonprofit creating a safe and just digital world through standards development and independent technology testing. At the core of our work is our Respectful Technology Specification, currently in development, which provides an objective standard for measuring safe and ethical technology behavior.

  • Consumers are aware that legal policies exist on connected technologies and that they should read them, but they continue to choose to largely ignore them.
  • 55% of survey participants did not understand that a TOS/TOU agreement is a legal contract. This has significant implications because a key requirement for legally binding contracts is mutual assent, which means that both parties have a “meeting of the minds” and must understand they're entering into a contract.
  • None of the interview participants were aware of tools that explain or rate privacy policies and TOS/TOU documents, and half said that a score would not change their behavior.
  • 66% of survey respondents believe that privacy policies protect the business, while 50% say they protect the consumer. It's questionable whether privacy policies protect either the individual or the business, as they are primarily legal notices, disclosures of how data is used by the technology and the companies behind it. Moreover, 39% of respondents erroneously thought that the privacy policy was a contract [between them and the company].

Using Backchannel as a model example, we propose four design principles for trusted digital relationships. Then we used Backchannel to design and build three sample apps: chat, location sharing, and document preview. We also tested these designs with journalists, researchers, and designers. Based on this testing, we outline common user experience challenges and recommended solutions.

There's a saying in security: "Don't roll your own crypto." I think we need a corollary in identity: "Don't roll your own interface." But how do we do that? And what should the interface be? One answer is to adopt the user experience people already understand from the physical world: connections and credentials.

Passwords were a major point of contention in that regard, with a strong majority (68 percent) of consumers indicating that it is difficult to remember and key in a large number of passwords. Nearly half (44 percent) believe that biometric authenticators are easier to use, while 34 percent would prefer to use them as their primary means of identity.

This session will share the results and learnings of the creation and development of an ethical “yardstick” for respectful technology, including its application to websites and mobile apps. The speakers will also explore learnings from everyday people in the validation research around the certification mark as well as share recommendations for tech makers.

Accessibility

  • Disability-inclusive ID Systems

    Creating an inclusive ID system requires a comprehensive, whole-of-system approach to overcome barriers to ID enrollment and use for persons with disabilities.

Customer Commons \ Intention Economy

This was a small meeting primarily meant to tee up Hadrian Zbarcea's demo of Customer Commons' new Intention Byway model for better signaling between demand and supply in markets of all kinds.

Through the FIDO Alliance, a set of open, scalable, and interoperable specifications has been developed to replace passwords as a secure authentication method for online services. The alliance has also worked with companies such as Microsoft, Google, and Apple to integrate and adopt FIDO standards across their operating systems.
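
For context on what replacing passwords looks like in practice, here is a minimal browser-side sketch using the WebAuthn API that FIDO2 standardizes; the relying-party name, user details, and challenge handling are illustrative assumptions, and a real deployment would obtain the challenge from, and return the resulting credential to, the relying party's server:

```typescript
// Illustrative WebAuthn registration call (FIDO2). Runs in a browser context.
// In a real flow the challenge and user ID come from the relying party's server.
async function registerPasskey(): Promise<Credential | null> {
  const options: CredentialCreationOptions = {
    publicKey: {
      challenge: crypto.getRandomValues(new Uint8Array(32)), // server-issued in practice
      rp: { name: "Example Service" },
      user: {
        id: crypto.getRandomValues(new Uint8Array(16)),      // stable server-side user handle in practice
        name: "user@example.com",
        displayName: "Example User",
      },
      pubKeyCredParams: [{ type: "public-key", alg: -7 }],   // ES256
      authenticatorSelection: { userVerification: "preferred" },
      timeout: 60_000,
    },
  };
  // The browser prompts for a platform or roaming authenticator instead of a password.
  return navigator.credentials.create(options);
}
```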

The complex ecosystem where manifold transactions can be automatically enabled by smart contracts contributes, at least in principle, to establish greater transparency about data use towards the many parties involved. However, the mere fact of building such a verifiable and traceable architecture does not automatically translate into understandable communications, easily applicable instructions and smooth transactions for human beings.

What do Russian protesters have in common with Twitter users freaked out about Elon Musk reading their DMs and people worried about the criminalization of abortion? It would serve them all to be protected by a more robust set of design practices from companies developing technologies.

The first international agreement on how refugees could handle the issue of missing or incomplete identity documents resulted from the Arrangement of 5 July, 1922, which was a meeting of the League of Nations. Among other things, the conference established a uniform “identity certificate” for Russian refugees, between one and two million of whom had been displaced by various conflicts over the previous decade.

A key part of this is continuity and longevity: a personal data store is for life, so the institutions providing personal data stores should be designed for decades (centuries, even). Whatever particular corporate form they take, legal safeguards relating to continuity and longevity of purpose need to be built into how they operate.

Human Rights

Good topic for CCG discussion and reading on the implications of a lot of the tech we are working on:

The Ford Foundation paper attached provides the references. However, this thread should not be about governance philosophy but rather a focus on human rights as a design principle as we all work on protocols that will drive adoption of W3C VCs and DIDs at Internet scale.