- User research
A new digital identity service should be designed with real users in mind, drawing on insight from people in the relevant demographics who will use the service now and in the future.
New services should be inclusive, so that everyone can access them regardless of limits on their knowledge or technical ability.
The user research phase is where most of the upfront design work is done. The more effort and input at this stage of the design process, the less need for remediation at a later date. Such in-depth user research gathers insight to inform design decisions.
Nothing should be assumed or guessed, which is why real-life research with users and potential users is critical to service design.
Users should be involved throughout the development process, with their regular and repeated insights supporting an iterative design process. Even once the service is live, users should continue to review and test it so that the service can evolve as user needs and preferences change.
- Privacy and security by design
Users and their data must be kept safe through both privacy and security, without compromising the ability to meet user needs or impeding access to services.
Where personal data is collected and processed, it’s critical that privacy and security are woven into the design from the very start. They should never be an afterthought; this is why we are proponents of ‘privacy and security by design’.
One method of ensuring privacy and security by design is to follow the principle of data minimisation. Organisations should carefully plan data capture and, where possible, seek to leverage existing data sets, avoiding duplicated data capture and the continued emergence of data silos. Moreover, data should be governed by security mechanisms that ensure it can only be accessed by those to whom it relates or for whom it is intended.
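As an illustration only, data minimisation at the point of disclosure might be sketched as follows. The attribute names and disclosure policy below are entirely hypothetical; the point is that each relying-party purpose receives only the attributes it needs, such as a derived over-18 flag rather than a full date of birth.

```python
# Hypothetical identity record held by the provider.
FULL_RECORD = {
    "name": "A. Example",
    "date_of_birth": "1990-01-01",
    "address": "1 Example Street",
    "is_over_18": True,
}

# Each purpose is permitted only the attributes it genuinely needs.
DISCLOSURE_POLICY = {
    "age_check": {"is_over_18"},       # derived attribute, not the full DOB
    "delivery": {"name", "address"},
}

def minimise(record: dict, purpose: str) -> dict:
    """Return only the attributes permitted for the stated purpose."""
    allowed = DISCLOSURE_POLICY.get(purpose, set())
    return {k: v for k, v in record.items() if k in allowed}

print(minimise(FULL_RECORD, "age_check"))
```

An unknown purpose yields an empty disclosure, so the default is to share nothing rather than everything.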
It is also critical to accurately identify risks to vulnerable populations within the user base, and to put measures in place to mitigate those risks. By doing so in advance, the provider can ensure not only a secure service, but an ethical one that safeguards its users.
- Transparency
One of the surest ways of encouraging buy-in to identity services is to give users an easy way to exercise their right to control their own data. With user-centred visibility and control, users can manage and selectively disclose their identity data and associated attributes.
When visibility and transparency of personal data usage are woven in, users are able to understand how, why and when their data is used. This gives them the opportunity to fully appreciate how digital identity ecosystems can and will work for them.
As such, digital identity service providers should provide an easy-to-use mechanism for users not only to manage their data but also to audit where and when it is used. Such a mechanism gives users visibility of what data is held about them and how long it has been, or will be, held.
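A minimal sketch of such an audit mechanism, under the assumption of a simple append-only event store, might look like the following. The class and field names are hypothetical; a production system would persist events and protect them from tampering.

```python
import datetime

class UsageLog:
    """Append-only record of data-access events, queryable per user."""

    def __init__(self):
        self._events = []

    def record(self, user_id: str, relying_party: str, attributes: list):
        # Each disclosure is logged with who received what, and when.
        self._events.append({
            "user": user_id,
            "relying_party": relying_party,
            "attributes": list(attributes),
            "at": datetime.datetime.now(datetime.timezone.utc).isoformat(),
        })

    def for_user(self, user_id: str) -> list:
        # A user sees only the events that concern their own data.
        return [e for e in self._events if e["user"] == user_id]

log = UsageLog()
log.record("user-1", "acme-bank", ["name", "date_of_birth"])
log.record("user-2", "shop", ["address"])
print(log.for_user("user-1"))
```

Restricting the query to the user's own events reflects the earlier point that data should be accessible only to those to whom it relates.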
To achieve transparency, and with it trust, providers should clearly explain to the user how the system works and why it makes the decisions it does. This is particularly important where data is transacted on a business-to-business (B2B) basis, and for automated decision-making, where users may lack some of the visibility that is critical to their understanding. By giving users a clear, informed view of the risks and opportunities presented by the solution, providers can build trust through transparency.
- User control and consent
We are experiencing a shift in perspectives on identity service delivery, with technology innovations enabling users to manage and control more of their data. As such, any service should seek to empower users in how they interact with, and use, their data.
To ensure users are in control of their data, at a minimum, organisations should build in features that ensure information isn’t shared with third parties without full and transparent usage policies and, where relevant, user consent.
Users should be able to choose how, when and where their identity and its related attributes are used. Keeping users informed is not only a regulatory requirement in many contexts, but also ethical, helping to ensure service buy-in and adoption.
During the design process, organisations must also recognise the need for delegated authority in many scenarios. They should provide appropriate solutions which recognise individuals’ relationships, for example between parents, carers and
children – allowing people to complete trusted interactions on behalf of others.
Such built-in consent processes help to ensure users are genuinely in control of their data, and how it is used.
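The combination of recorded consent and delegated authority described above can be sketched in a few lines. This is an illustrative model only; the consent store, the delegation register and all identifiers are hypothetical.

```python
# Hypothetical consent store: (data subject, relying party, purpose) -> granted.
CONSENTS = {
    ("child-1", "school-portal", "enrolment"): True,
}

# Hypothetical delegation register: (actor, subject) pairs where the actor
# (e.g. a parent or carer) may act on the subject's behalf.
DELEGATIONS = {("parent-1", "child-1")}

def may_share(actor: str, subject: str, relying_party: str, purpose: str) -> bool:
    """Allow sharing only when the actor has authority over the subject's
    data (themselves, or a registered delegate) AND consent is recorded
    for this relying party and purpose."""
    if actor != subject and (actor, subject) not in DELEGATIONS:
        return False  # no authority over this subject's data
    return CONSENTS.get((subject, relying_party, purpose), False)
```

Note that the default is refusal: absent a recorded consent for the exact relying party and purpose, nothing is shared.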
- Accessibility and inclusion
Digital identity services must reflect the needs of the user base. They must be accessible to everyone. To be inclusive, organisations will need to design solutions that work for all demographic groups and minimise the potential for discrimination.
In doing so, organisations should actively consult a diverse range of users covering as many of the likely demographic groups as possible. As the consultation progresses, potential users can be invited to participate in testing, to fully assess the user experience, look for bugs and anomalies, and check that the service meets (or exceeds) accessibility standards.
Throughout the development journey, it’s good practice to regularly audit algorithms for bias, often using independent parties for the assessment. The published results should include detailed plans to address any issues discovered.
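One simple check such an audit might include is a demographic-parity comparison of outcome rates across groups. The sketch below is illustrative only; the group labels and sample outcomes are invented, and a real audit would use appropriate fairness metrics and statistical tests.

```python
from collections import defaultdict

def approval_rates(outcomes):
    """Approval rate per demographic group from (group, approved) pairs."""
    totals, approved = defaultdict(int), defaultdict(int)
    for group, ok in outcomes:
        totals[group] += 1
        if ok:
            approved[group] += 1
    return {g: approved[g] / totals[g] for g in totals}

def parity_gap(rates: dict) -> float:
    """Gap between the highest and lowest group approval rates;
    a large gap flags the system for closer investigation."""
    return max(rates.values()) - min(rates.values())

# Invented sample of identity-verification outcomes for two groups.
outcomes = [("A", True), ("A", True), ("A", False),
            ("B", True), ("B", False), ("B", False)]
rates = approval_rates(outcomes)
print(rates, parity_gap(rates))
```

A gap above a pre-agreed threshold would trigger the detailed remediation planning described above.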
A common finding is a solution that isn’t suitable for users with a minimal digital footprint. Digital identity services must serve those who are digitally excluded by ensuring that offline channels are maintained alongside any digital-only channels.
The audit and improvement process should be continuous. It’s imperative not to create new barriers to accessibility, whilst also actively working to reduce any current barriers to access.
- Accountability
Digital identity service providers and their partners must be held accountable for the impacts that such services have on individuals, communities and wider society. They must also put in place mechanisms to address any negative impact.
Supporting those who may have been negatively impacted by the service in any way (whether as a result of fraud, data loss or discrimination) is an intrinsic part of the service. Support could include advice and remediation services, through which users can easily raise grievances or make complaints.
Robust governance isn’t sustainable unless the same standards are upheld across each organisation’s partner ecosystem. Standards for accountability should be adopted by all parties involved in delivering digital identity services.