<?xml version="1.0" encoding="UTF-8"?><rss xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:atom="http://www.w3.org/2005/Atom" version="2.0" xmlns:cc="http://cyber.law.harvard.edu/rss/creativeCommonsRssModule.html">
    <channel>
        <title><![CDATA[Finema - Medium]]></title>
        <description><![CDATA[Self-sovereign identity expert. Provider of modern identity platforms, infrastructure, and tools for the digital world. - Medium]]></description>
        <link>https://medium.com/finema?source=rss----55e22a76bbfc---4</link>
        <image>
            <url>https://cdn-images-1.medium.com/proxy/1*TGH72Nnw24QL3iV9IOm4VA.png</url>
            <title>Finema - Medium</title>
            <link>https://medium.com/finema?source=rss----55e22a76bbfc---4</link>
        </image>
        <generator>Medium</generator>
        <lastBuildDate>Mon, 30 Mar 2026 07:54:38 GMT</lastBuildDate>
        <atom:link href="https://medium.com/feed/finema" rel="self" type="application/rss+xml"/>
        <webMaster><![CDATA[yourfriends@medium.com]]></webMaster>
        <atom:link href="http://medium.superfeedr.com" rel="hub"/>
        <item>
            <title><![CDATA[This Month in Digital Identity — December Edition]]></title>
            <link>https://medium.com/finema/this-month-in-digital-identity-december-edition-d7c4fd503c87?source=rss----55e22a76bbfc---4</link>
            <guid isPermaLink="false">https://medium.com/p/d7c4fd503c87</guid>
            <category><![CDATA[digital-identity]]></category>
            <category><![CDATA[decentralized]]></category>
            <category><![CDATA[december]]></category>
            <category><![CDATA[newsletter]]></category>
            <dc:creator><![CDATA[Wint Hmone Thant]]></dc:creator>
            <pubDate>Mon, 02 Dec 2024 08:34:40 GMT</pubDate>
            <atom:updated>2024-12-02T08:34:40.624Z</atom:updated>
            <content:encoded><![CDATA[<h3>This Month in Digital Identity — December Edition</h3><p>Welcome to the December edition of our monthly digital identity series! This month, we explore how digital identity technology is safeguarding academic integrity, ensuring the credibility of research in an era of rising identity fraud. Dive into the GAO’s call for robust civil rights protections as federal agencies increasingly rely on AI and facial recognition. Discover the transformative potential of Digital Travel Credentials and decentralized identity in revolutionizing international travel. Lastly, we’ll examine the urgent need for unified frameworks to align emerging technologies with civil liberties.</p><p>Here’s a closer look at what you’ll find in this month’s insights:</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/1*rbGPFC4umsKxLf7QLcicuQ.png" /></figure><h3>Protecting Academic Integrity with Digital Identity Technology</h3><p>The scholarly publishing industry is grappling with an alarming rise in identity fraud and unethical practices like paper mills and author impersonation, undermining trust in academic research. To counteract these challenges, STM Solutions has unveiled a comprehensive report titled <em>“Trusted Identity in Academic Publishing: The Central Role of Digital Identity in Research Integrity.”</em> Developed in collaboration with a Researcher Identity Task and Finish Group, this initiative aims to provide practical guidelines for the effective use of digital identity technology.</p><p>The report calls for stronger identity verification mechanisms while respecting inclusivity and privacy. It emphasizes how digital identity solutions can authenticate researchers while maintaining their autonomy, thus preserving the credibility of academic contributions. 
A crucial aspect of this framework is its scalability, which accommodates the diverse range of research disciplines and global contexts without being intrusive.</p><p>This strategic document also advocates an industry-wide approach, urging publishers, institutions, and researchers to unite for seamless integration of identity verification systems. By doing so, the academic community can address fraud more effectively, sustain open collaboration, and enhance trust in published findings.</p><h3>GAO Highlights Need for Civil Rights Protections Amid Federal Data Use</h3><p>As federal agencies increasingly rely on advanced technologies like artificial intelligence (AI) and facial recognition, the U.S. Government Accountability Office (GAO) has raised concerns about their impact on civil rights. Despite the growing adoption of these tools, a recent GAO report reveals that existing federal guidelines fall short in protecting against biases, inequities, and potential abuses of personal data.</p><p>While some agencies, such as Homeland Security, have begun creating frameworks to address ethical concerns, the overall response remains fragmented. For instance, the Privacy Act of 1974 governs data handling but does not encompass safeguards for emerging technologies. This leaves significant gaps in accountability, allowing for inadvertent discrimination and systemic inequities.</p><p>GAO recommends that Congress establish robust, government-wide civil rights guidelines tailored to emerging technologies. These should include privacy safeguards, risk assessments, and ethical oversight to prevent misuse. 
Furthermore, the report stresses the importance of bridging workforce skills gaps by investing in specialized training, enabling public agencies to navigate the complexities of these evolving tools.</p><p>This call to action underscores a need for comprehensive policy reform, ensuring that technological advancements align with societal values and protect citizens’ fundamental rights.</p><h3>Revolutionizing International Travel: Digital Identity Innovation</h3><p>Imagine arriving at your tropical destination just 30 minutes after stepping off the plane. This seamless travel experience is now a reality, thanks to a groundbreaking trial conducted by SITA, Indicio, and Delta Air Lines, in collaboration with Aruba’s government. Using Digital Travel Credentials (DTC) and IATA’s OneID, the initiative has redefined how travelers journey from Atlanta to Aruba.</p><p>The process begins with the DTC, a cryptographically secure digital version of a passport, allowing travelers to preauthorize their journey. At Hartsfield-Jackson Airport in Atlanta, IATA’s OneID streamlines check-ins, baggage handling, and boarding processes. Upon landing in Aruba, passengers cross the border within seconds, holding the data required for verification securely on their mobile devices.</p><p>The initiative demonstrates the potential of decentralized digital identity technology, merging efficiency, security, and privacy. By combining the DTC and OneID workflows, travelers experienced reduced wait times and an overall enhanced travel experience. This innovation highlights a significant leap toward a unified, interoperable digital travel system.</p><p>The success of this trial is more than a technological milestone—it is a glimpse into the future of international travel. 
During the IATA World Passenger Symposium in Bangkok, SITA’s Michael Zureik will present the findings, signaling a major shift toward global adoption of decentralized identity solutions.</p><h3>Emerging Technologies and the Call for Unified Civil Liberties Frameworks</h3><p>The U.S. Government Accountability Office (GAO) has identified alarming gaps in the way federal agencies handle civil rights issues associated with emerging technologies. AI and facial recognition tools, while enabling efficiencies, have also amplified biases and raised concerns about discriminatory practices. Existing laws, such as the Privacy Act of 1974, focus on privacy but fail to address the civil liberties challenges posed by modern tools.</p><p>GAO’s analysis of 24 federal agencies revealed inconsistent approaches to mitigating these risks. While some, such as the Department of Homeland Security, have developed initial risk assessment tools, these efforts lack uniformity and coordination across agencies. Moreover, workforce shortages and outdated regulatory frameworks hinder the effective adoption of ethical practices.</p><p>To close these gaps, GAO urges Congress to create a government-wide framework addressing ethical data use, transparency, and accountability. The report also highlights the need for investments in workforce training and infrastructure, enabling agencies to better respond to technological advancements while protecting civil liberties.</p><p>This recommendation seeks to balance the promise of innovation with the need to uphold social justice, ensuring that government operations remain fair, inclusive, and trustworthy.</p><p>We look forward to bringing you more insightful updates as we continue to explore the latest trends and innovations in the field of digital identity. 
Together, we can contribute to a more secure and inclusive digital future.</p><hr><p><a href="https://medium.com/finema/this-month-in-digital-identity-december-edition-d7c4fd503c87">This Month in Digital Identity — December Edition</a> was originally published in <a href="https://medium.com/finema">Finema</a> on Medium, where people are continuing the conversation by highlighting and responding to this story.</p>]]></content:encoded>
        </item>
        <item>
            <title><![CDATA[This Month in Digital Identity — November Edition]]></title>
            <link>https://medium.com/finema/this-month-in-digital-identity-november-edition-59572dfbe9ca?source=rss----55e22a76bbfc---4</link>
            <guid isPermaLink="false">https://medium.com/p/59572dfbe9ca</guid>
            <category><![CDATA[newsletter]]></category>
            <category><![CDATA[digital-identity]]></category>
            <category><![CDATA[finema]]></category>
            <dc:creator><![CDATA[Wint Hmone Thant]]></dc:creator>
            <pubDate>Fri, 01 Nov 2024 07:22:04 GMT</pubDate>
            <atom:updated>2024-11-01T07:22:04.242Z</atom:updated>
            <content:encoded><![CDATA[<h3>This Month in Digital Identity — November Edition</h3><p>Welcome to the November edition of our monthly digital identity series! This month, we’re diving into essential advancements shaping digital identity and the future of secure verification. Discover key updates on the European Digital Identity Wallet, the latest approaches to mobile driver’s license verification, and how deepfake detection is evolving to tackle growing threats. Plus, we’ll explore Jumio’s innovative biometric liveness detection and its role in combating identity fraud.</p><p>Here’s a closer look at what you’ll find in this month’s insights:</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/1*CgHoDdBnJk9ohtz9K4j3_g.png" /></figure><h3>The EUDI Wallet</h3><p>The European Digital Identity (EUDI) Wallet is a transformative initiative designed to provide EU citizens with a secure, self-sovereign digital identity solution. This wallet enables individuals to manage their identity information independently, ensuring privacy and security in online interactions. By facilitating access to essential services such as banking, healthcare, and governmental applications, the EUDI Wallet promises to streamline daily life and foster a more inclusive digital economy.</p><p>However, its implementation faces several significant challenges. Firstly, robust cybersecurity measures are crucial to prevent identity theft and data breaches, which could undermine user trust. Secondly, achieving regulatory harmonization across diverse EU member states is essential, as different countries have unique legal frameworks and privacy regulations. Without a unified approach, the wallet’s effectiveness could be compromised, leading to confusion among users and service providers alike.</p><p>Collaboration among various stakeholders—including government agencies, technology providers, and civil society—will be pivotal in overcoming these hurdles. 
By establishing clear standards for security, interoperability, and user experience, the EUDI Wallet can become a reliable tool that empowers citizens while safeguarding their data. Ultimately, its success will hinge on building public trust and ensuring that the system is user-friendly, accessible, and compliant with the highest privacy standards.</p><h3>mDL Verification</h3><p>The emergence of mobile driver’s licenses (mDLs) signifies a significant shift in identity verification, presenting both opportunities and challenges for users and authorities. mDLs offer a modern alternative to traditional physical licenses, allowing users to carry their identification securely on their smartphones. This technological advancement aims to streamline the verification process for a wide range of services, from travel to online transactions.</p><p>However, the implementation of mDLs is fraught with challenges. One primary concern is the need for secure and intuitive verification processes that maintain user confidence while preventing identity fraud. Additionally, the lack of uniformity in regulations across different states poses a significant barrier to widespread adoption. Each state has its own legal standards and technical requirements, complicating interoperability and making it difficult for users to rely on their mDLs outside their home jurisdictions.</p><p>Privacy is another critical issue, as users must be assured that their personal information will remain secure and confidential. Striking a balance between robust security measures and a seamless user experience is essential for gaining public acceptance.</p><p>To facilitate the successful rollout of mDLs, continuous collaboration among stakeholders — including government agencies, technology developers, and consumers — is vital. 
By establishing best practices and regulatory frameworks, stakeholders can ensure that mDLs become a trusted and widely accepted form of identification, paving the way for a more secure and efficient digital identity landscape.</p><h3>Deepfake Detection</h3><p>The rise of deepfake technology presents significant challenges to the authenticity of digital content, raising concerns about misinformation and trust in media. Deepfakes utilize advanced artificial intelligence to create highly realistic but fabricated videos and audio, making it increasingly difficult for viewers to discern truth from deception. As this technology becomes more sophisticated, the need for effective detection methods becomes paramount.</p><p>Various techniques are being developed to identify deepfakes, focusing on detecting subtle inconsistencies that can indicate manipulation. These methods include analyzing facial movements, lighting discrepancies, and unnatural expressions, which may signal that a video has been altered. As detection technologies evolve, they must continuously adapt to keep pace with advances in deepfake creation.</p><p>A multi-pronged approach is essential to mitigate the risks associated with deepfakes. This involves not only developing robust technological solutions but also enhancing public awareness about the existence and implications of deepfakes. Educating consumers on how to recognize manipulated content is critical in fostering a more discerning audience that can critically evaluate the media they consume.</p><p>Regulatory measures also play a crucial role in addressing the challenges posed by deepfakes. Policymakers must consider ethical guidelines and legal frameworks that govern the creation and dissemination of synthetic media. 
By promoting transparency and accountability in digital content creation, society can better safeguard against the potential harms of deepfakes while preserving the integrity of visual communication.</p><h3>Jumio’s Biometric Liveness Detection</h3><p>Jumio’s development of in-house biometric liveness detection technology represents a significant leap forward in identity verification. As identity fraud becomes more prevalent, this innovative solution uses sophisticated artificial intelligence to accurately differentiate between genuine biometric data—such as facial recognition—and spoofing attempts, including photographs or masks. This capability is crucial for enhancing security in online transactions and customer onboarding processes.</p><p>The liveness detection technology analyzes various data points in real time, assessing factors such as facial movements and eye interactions to determine whether the biometric input is from a live person. By integrating this technology into their identity verification offerings, Jumio aims to provide organizations with a more reliable means of preventing identity theft and fraud.</p><p>Moreover, as the digital landscape evolves, the demand for effective biometric verification solutions is increasing. Organizations must navigate the dual challenge of enhancing security while ensuring a smooth user experience. Jumio’s biometric liveness detection addresses this need, positioning the company as a leader in the identity verification market.</p><p>In an environment where digital interactions are ubiquitous, establishing trust in online transactions is paramount. By offering advanced biometric solutions, Jumio is helping to build confidence among consumers and organizations alike, making it easier to engage in secure digital commerce. 
As identity verification technologies continue to evolve, Jumio’s innovations will play a vital role in shaping the future of secure online interactions.</p><p>We look forward to bringing you more insightful updates as we continue to explore the latest trends and innovations in the field of digital identity. Together, we can contribute to a more secure and inclusive digital future.</p><hr><p><a href="https://medium.com/finema/this-month-in-digital-identity-november-edition-59572dfbe9ca">This Month in Digital Identity — November Edition</a> was originally published in <a href="https://medium.com/finema">Finema</a> on Medium, where people are continuing the conversation by highlighting and responding to this story.</p>]]></content:encoded>
        </item>
        <item>
            <title><![CDATA[This Month in Digital Identity — October Edition]]></title>
            <link>https://medium.com/finema/this-month-in-digital-identity-october-edition-f7776d9de98d?source=rss----55e22a76bbfc---4</link>
            <guid isPermaLink="false">https://medium.com/p/f7776d9de98d</guid>
            <category><![CDATA[october]]></category>
            <category><![CDATA[digital-identity]]></category>
            <category><![CDATA[newsletter]]></category>
            <category><![CDATA[finema]]></category>
            <dc:creator><![CDATA[Wint Hmone Thant]]></dc:creator>
            <pubDate>Wed, 02 Oct 2024 07:22:56 GMT</pubDate>
            <atom:updated>2024-10-02T07:22:56.614Z</atom:updated>
            <content:encoded><![CDATA[<h3>This Month in Digital Identity — October Edition</h3><p>Welcome to the October edition of our monthly digital identity series! This month, we’re exploring the critical developments and innovative strategies that are redefining the landscape of digital identity. We’ll delve into significant advancements in decentralized identity, the balance between regulation and privacy, the role of biometric technology in hiring compliance, and the establishment of security standards for digital ID wallets in the EU.</p><p>Here’s a closer look at the essential topics we’ll be covering:</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/1*d7jTcUQqfnlZc-gG4dAILQ.png" /></figure><h3><strong>Advancing Decentralized Identity with the SLAP Framework</strong></h3><p>Velocity Network has dedicated the past five years to developing the Internet of Careers, focusing on essential business needs through the SLAP framework. This innovative approach emphasizes four critical components:</p><ul><li><strong>Survivable Credentials:</strong> These credentials are designed to remain valid and accessible over time, ensuring that users can reliably present their identities without facing barriers.</li><li><strong>Legal Risk Mitigation:</strong> By addressing potential legal challenges associated with identity verification, organizations can significantly reduce their exposure to regulatory pitfalls, fostering a more secure environment for both users and businesses.</li><li><strong>Accreditation for Issuers and Relying Parties:</strong> Establishing trusted standards for all participants in the identity ecosystem helps to enhance credibility and build trust among users.</li><li><strong>Practical Privacy:</strong> Prioritizing user privacy ensures that individuals maintain control over their personal information, which is essential in today’s digital landscape.</li></ul><p>Velocity Network’s collaborative efforts invite stakeholders from 
various sectors to contribute to effective decentralized identity solutions. By working together, we can empower individuals with greater control over their identities and foster a more inclusive digital ecosystem.</p><h3><strong>Navigating the Tension Between Decentralized Identity and Regulation</strong></h3><p>In the ever-evolving digital landscape, the interplay between decentralized identity and regulatory frameworks has become increasingly critical. High-profile cases such as Silk Road and Tornado Cash highlight the challenges of balancing innovation with compliance.</p><p>To address these challenges, it is essential to adopt a balanced approach that fosters the development of decentralized reputation systems. Such systems can empower self-regulation while ensuring both privacy and accountability. By leveraging anonymous identities, we can create a framework where individuals have control over their digital presence while participating responsibly in digital platforms.</p><p>This approach not only enhances user empowerment but also helps build trust within communities. Learning from past experiences with regulatory challenges can inform the design of more resilient and adaptable decentralized identity systems. By understanding the nuances of this complex relationship, we can pave the way for innovative solutions that respect both freedom and the need for regulation.</p><h3><strong>Enhancing Hiring Compliance in the UK with Yoti Biometrics</strong></h3><p>Yoti is making significant strides in the UK hiring landscape by integrating biometric technology with Sterling’s background checks. This partnership aims to streamline compliance and enhance the security and accuracy of identity verification during the hiring process.</p><p>By utilizing Yoti’s biometric solutions, employers can simplify the compliance process, ensuring that they meet regulatory requirements efficiently. 
This integration not only reduces the risk of non-compliance but also enhances security, making it more difficult for fraudulent activities to occur.</p><p>Candidates benefit from this system as well, enjoying a smoother onboarding experience. The biometric verification process is designed to be quick and user-friendly, allowing job seekers to complete identity checks seamlessly. This innovative approach not only improves the overall efficiency of the hiring process but also instills greater confidence among employers and candidates alike.</p><p>As organizations increasingly recognize the value of biometric technology, Yoti’s integration with Sterling’s background checks stands as a promising development for the future of hiring compliance in the UK.</p><h3><strong>ENISA to Launch Cybersecurity Certification Scheme for EU Digital ID Wallets</strong></h3><p>In a significant move to bolster security in the digital identity landscape, the European Union Agency for Cybersecurity (ENISA) is set to establish a cybersecurity certification scheme for the EU’s digital ID wallets. This initiative aims to ensure that digital identity solutions meet high standards of security and trustworthiness, thereby promoting consumer confidence in these technologies.</p><p>The certification scheme will provide a robust framework for assessing and validating the security measures implemented in digital ID wallets. By aligning with EU regulations and standards, this initiative supports the broader strategy of creating a secure and interoperable digital identity ecosystem within the EU.</p><p>ENISA emphasizes the importance of collaboration with various stakeholders, including industry leaders and governmental bodies, to develop a comprehensive certification process. 
This collaborative approach is crucial for addressing the diverse needs and challenges in the digital identity landscape.</p><p>By fostering trust in digital identity solutions, this initiative paves the way for increased adoption and reliance on secure digital services across the EU.</p><p>We look forward to bringing you more insightful updates as we continue to explore the latest trends and innovations in the field of digital identity. Together, we can contribute to a more secure and inclusive digital future.</p><hr><p><a href="https://medium.com/finema/this-month-in-digital-identity-october-edition-f7776d9de98d">This Month in Digital Identity — October Edition</a> was originally published in <a href="https://medium.com/finema">Finema</a> on Medium, where people are continuing the conversation by highlighting and responding to this story.</p>]]></content:encoded>
        </item>
        <item>
            <title><![CDATA[This Month in Digital Identity — September Edition]]></title>
            <link>https://medium.com/finema/this-month-in-digital-identity-september-edition-1b00d330e9f4?source=rss----55e22a76bbfc---4</link>
            <guid isPermaLink="false">https://medium.com/p/1b00d330e9f4</guid>
            <category><![CDATA[digital-identity]]></category>
            <category><![CDATA[september]]></category>
            <category><![CDATA[finema]]></category>
            <category><![CDATA[newsletter]]></category>
            <dc:creator><![CDATA[Wint Hmone Thant]]></dc:creator>
            <pubDate>Mon, 02 Sep 2024 09:16:53 GMT</pubDate>
            <atom:updated>2024-09-02T09:16:53.197Z</atom:updated>
            <content:encoded><![CDATA[<h3>This Month in Digital Identity — September Edition</h3><p>Welcome to the September edition of our monthly digital identity series! This month, we’re exploring the critical developments and innovative strategies that are redefining the landscape of digital identity. Here’s a closer look at the essential topics we’ll be covering:</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/1*mXmtyT8k2n2UDsUbVqhUQQ.png" /></figure><h3><strong>AI Enhancing Healthcare Fraud Prevention</strong></h3><p>Artificial Intelligence (AI) is becoming a crucial tool in combating healthcare fraud by analyzing vast datasets in real time to detect fraudulent activities, particularly through voice biometrics that verify patient identities and prevent unauthorized access to healthcare services. Additionally, there is a growing focus on enhancing patient experiences through digital trust technologies, such as secure digital signatures and messaging platforms, which protect patient data and streamline healthcare processes. Innovations like chip-based ID cards are also being adopted, as seen in Vietnam, to secure patient information and simplify access to healthcare services, reducing the risk of identity theft and fraud. These technological advancements collectively aim to strengthen the integrity of healthcare systems, safeguard patient data, and improve operational efficiency, ultimately enhancing the overall patient experience.</p><h3><strong>Somalia’s Financial Inclusion Drive</strong></h3><p>Somalia is advancing its digital transformation with a new Memorandum of Understanding (MoU) between the National Identification and Registration Authority (NIRA) and the Somali Banks Association (SBA) to drive financial inclusion through the national ID program. Launched a year ago, this program aims to provide the country’s 18 million residents with a unified identity, facilitating access to banking services and aligning with global standards. 
The partnership seeks to enhance financial security, reduce fraud, and streamline banking processes by using the National Identification Number (NIN) for customer verification. This initiative is part of a broader effort to bolster the country’s economy, ensure compliance with international regulations, and increase public trust in financial institutions. The collaboration has been praised by key government figures and international partners, who see it as crucial for Somalia’s development. Ongoing consultations with stakeholders aim to further strengthen the national ID system, making it more impactful in supporting economic growth and modernizing financial services.</p><h3><strong>Spain’s New Age Verification System</strong></h3><p>Spain has introduced technical specifications for a new online age verification system aimed at controlling minors’ access to adult content, using W3C Verifiable Credentials (VCs) as the core technology. This approach addresses growing concerns over the negative impact of unrestricted access to adult content on the mental health and social skills of children and teenagers. By implementing W3C VCs, Spain ensures that age verification is conducted securely and privately, without disclosing personal information, thus aligning with GDPR principles. W3C VCs offer unmatched security through advanced cryptographic methods, enhanced privacy by allowing users to share only necessary information, and portability by integrating seamlessly with digital wallets. 
The system also follows the OpenID For Verifiable Presentations (OpenID4VP) specification, ensuring secure and private verification, and includes a trust management framework to ensure only authorized entities can issue or verify credentials, making it an ideal solution for protecting minors online.</p><h3><strong>The Digital Travel Credential (DTC)</strong></h3><p>In the realm of digital identity, numerous digital credentials are vying to replace physical documents, with the European Union’s eIDAS 2.0 and digital driver’s licenses being notable examples. However, none match the Digital Travel Credential (DTC) standard for digital trust, developed by the International Civil Aviation Organization (ICAO), which sets the universal standards for passports. The DTC, designed as the digital equivalent of a passport, offers two types: one created by a user from their physical passport and another issued directly by passport authorities. Indicio and SITA pioneered the implementation of the Type 1 DTC, which is now being adopted by countries and airlines for seamless travel. The DTC’s strength lies in its use of cryptographic verification, ensuring that passport data is securely held on a user’s device without needing to be stored in centralized databases, mitigating risks of data breaches. By scanning their passport, users can verify the authenticity of their data, bind it to their device through biometric checks, and ensure that their digital credentials are trustworthy and tamper-proof. This system provides airlines, airports, and border control with the confidence to streamline travel processes, knowing that the data in the DTC is authenticated, portable, and instantly verifiable.</p><p>We look forward to bringing you more insightful updates as we continue to explore the latest trends and innovations in the field of digital identity. 
Stay tuned for future editions of our monthly segment!</p><img src="https://medium.com/_/stat?event=post.clientViewed&referrerSource=full_rss&postId=1b00d330e9f4" width="1" height="1" alt=""><hr><p><a href="https://medium.com/finema/this-month-in-digital-identity-september-edition-1b00d330e9f4">This Month in Digital Identity — September Edition</a> was originally published in <a href="https://medium.com/finema">Finema</a> on Medium, where people are continuing the conversation by highlighting and responding to this story.</p>]]></content:encoded>
        </item>
        <item>
            <title><![CDATA[This Month in Digital Identity — August Edition]]></title>
            <link>https://medium.com/finema/this-month-in-digital-identity-august-edition-545ca5b8e2ea?source=rss----55e22a76bbfc---4</link>
            <guid isPermaLink="false">https://medium.com/p/545ca5b8e2ea</guid>
            <category><![CDATA[finema]]></category>
            <category><![CDATA[monthly-newsletter]]></category>
            <category><![CDATA[august]]></category>
            <dc:creator><![CDATA[Wint Hmone Thant]]></dc:creator>
            <pubDate>Tue, 13 Aug 2024 03:52:17 GMT</pubDate>
            <atom:updated>2024-09-02T09:17:59.115Z</atom:updated>
            <content:encoded><![CDATA[<h3><strong>This Month in Digital Identity — August Edition</strong></h3><p>Welcome to the August edition of our monthly digital identity segment! This month, we’re diving deep into pivotal advancements and strategies that are shaping the future of digital identity. Here’s an in-depth look at the key topics we’re covering:</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/1*BqKLgqC2hoNmrA5UsnMwyA.png" /></figure><h3><strong>Enhancing Digital Identity Adoption</strong></h3><p>🌍 Our first article focuses on the EBSI-CAN project meeting, a landmark event in advancing digital identity adoption and cross-border interoperability between the EU and Canada. This meeting was crucial in addressing the complex challenges faced by international digital identity systems. It explored various technical barriers that currently impede seamless integration of digital identity systems across borders, such as differing standards and protocols. Regulatory alignment was another key focus, with discussions centered on harmonizing regulations to facilitate smoother interactions and exchanges of digital identity information between regions. Collaborative frameworks were also highlighted as essential for fostering international partnerships and creating a unified approach to digital identity. By tackling these issues, the EBSI-CAN project aims to build a more cohesive and efficient digital identity ecosystem that supports global digital transactions and interactions. This initiative represents a significant step toward overcoming the fragmentation in digital identity systems and achieving a more integrated global digital landscape.</p><h3><strong>Advancing Decentralized Identity</strong></h3><p>🔒 Our second feature delves into the exciting progress being made in the decentralized identity sphere, particularly the integration of OpenID’s verifiable credential protocols with DIDComm. 
This development marks a significant leap forward in enhancing digital identity management. OpenID’s verifiable credentials provide a robust framework for issuing and verifying digital identity information, while DIDComm enables secure, direct communication between trusted parties. The integration of these technologies facilitates a more secure and efficient exchange of identity information, supporting self-sovereign identity systems where users have greater control over their personal data. This advancement not only improves the reliability of digital identity exchanges but also enhances privacy by ensuring that personal information is only shared with trusted entities under secure conditions. The combination of OpenID and DIDComm represents a major stride toward a more user-centric and resilient digital identity infrastructure, paving the way for more secure and flexible identity management solutions.</p><h3><strong>Balancing Privacy, Security, and Convenience</strong></h3><p>🔐 In our third article, we explore the ongoing evolution of digital identity with a focus on balancing privacy, security, and convenience. As digital identity systems become more advanced, decentralized solutions are emerging as a promising way to enhance user control over personal data. These systems offer significant advantages over traditional centralized models by providing greater transparency and control to users. Our article examines how these decentralized systems address common concerns related to privacy and security while still delivering a high level of convenience. It discusses the technological innovations that are reshaping personal data management, including new methods for protecting user data and ensuring secure interactions with digital services. 
By exploring these advancements, the article provides insights into how future digital identity solutions might evolve to meet both user expectations and regulatory requirements, ultimately leading to a more balanced and user-friendly digital identity landscape.</p><h3><strong>The Strategic Advantage of Open Working Practices</strong></h3><p>💼 Our final feature in this edition discusses the strategic benefits of adopting open working practices. Open working practices, characterized by transparency, inclusivity, adaptability, collaboration, and community, offer organizations a powerful approach to enhancing their operations. The article explores how these principles can lead to greater organizational agility by breaking down traditional barriers and fostering a culture of open communication and collective problem-solving. It highlights how open working practices can drive innovation by encouraging diverse perspectives and ideas, leading to more creative and effective solutions. Additionally, the article examines how these practices can improve employee engagement and satisfaction by creating a more inclusive and supportive work environment. By embracing open working principles, organizations can achieve sustainable success and strengthen their performance in a rapidly changing business landscape.</p><p>We look forward to bringing you more insightful updates as we continue to explore the latest trends and innovations in the field of digital identity. 
Stay tuned for future editions of our monthly segment!</p><img src="https://medium.com/_/stat?event=post.clientViewed&referrerSource=full_rss&postId=545ca5b8e2ea" width="1" height="1" alt=""><hr><p><a href="https://medium.com/finema/this-month-in-digital-identity-august-edition-545ca5b8e2ea">This Month in Digital Identity — August Edition</a> was originally published in <a href="https://medium.com/finema">Finema</a> on Medium, where people are continuing the conversation by highlighting and responding to this story.</p>]]></content:encoded>
        </item>
        <item>
            <title><![CDATA[4 Criteria for Great vLEI Use Cases]]></title>
            <link>https://medium.com/finema/4-criteria-for-great-vlei-use-cases-c5fa0e04a6f5?source=rss----55e22a76bbfc---4</link>
            <guid isPermaLink="false">https://medium.com/p/c5fa0e04a6f5</guid>
            <category><![CDATA[decentralized-identity]]></category>
            <category><![CDATA[keris]]></category>
            <category><![CDATA[vlei]]></category>
            <category><![CDATA[digital-identity]]></category>
            <dc:creator><![CDATA[Yanisa Sunanchaiyakarn]]></dc:creator>
            <pubDate>Sat, 22 Jun 2024 20:41:54 GMT</pubDate>
            <atom:updated>2024-07-02T04:45:50.367Z</atom:updated>
<content:encoded><![CDATA[<p><strong><em>Authors: </em></strong><em>Yanisa Sunanchaiyakarn &amp; Nuttawut Kongsuwan, Finema Co. Ltd.</em></p><p>As we enter the era of generative AI, the risk of identity theft for both individuals and organizations is escalating. A <a href="https://edition.cnn.com/2024/02/04/asia/deepfake-cfo-scam-hong-kong-intl-hnk/index.html">recent case</a> in Hong Kong highlights this threat, where an office was deceived into transferring $25 million after a video call with a deepfake CFO and colleagues. As advances in AI erode trust, it becomes apparent that a robust digital identity solution for organizations is urgently needed.</p><p>In the past few years, the <a href="https://www.gleif.org/en/vlei/introducing-the-verifiable-lei-vlei">verifiable Legal Entity Identifier (vLEI)</a> framework has emerged as one of the most promising solutions to this global crisis. The vLEI framework offers one of the most secure and trustworthy digital identity management frameworks for organizations to date and could revolutionize digital business transactions worldwide.</p><p>Since vLEI is still in its infancy, the community faces a significant challenge in overcoming its adoption hurdle. In this article, we propose <strong>4 pragmatic criteria </strong>for identifying <strong>fair, good, and great use cases for vLEI </strong>from the <strong>adoptability perspective</strong>. 
These criteria are by no means rigid rules set in stone but serve as a useful mental model for pioneers and early adopters exploring vLEI use cases.</p><h3>The 4 Criteria</h3><p>We have identified <strong>4 criteria</strong> for good and great vLEI use cases:</p><ol><li>Use cases that involve <strong>organization-to-organization</strong> transactions</li><li>Use cases that involve <strong>cross-border</strong> transactions</li><li>Use cases that are <strong>highly regulated</strong></li><li>Use cases that have <strong>open ecosystems</strong></li></ol><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/1*BHsxSULjXID28DmBP5wX4w.png" /></figure><h4>1. Org-to-Org</h4><p>The first criterion for viable vLEI use cases is that they involve <strong>organization-to-organization (Org-to-Org) </strong>transactions. This includes business-to-business (B2B), business-to-government (B2G), and government-to-government (G2G) use cases. This is because org-to-org transactions often involve significant monetary value, which far outweighs the initial friction associated with adopting vLEI.</p><h4>2. Cross-Border</h4><p>Cross-border transactions often face challenges in identifying and verifying clients, suppliers, or partners across different countries, leading to significant perceived risks. The vLEI framework is ideal for addressing these challenges as it is specifically designed for international use. It can potentially streamline the cross-border identification and verification processes, mitigating risks and improving efficiency.</p><h4>3. Highly Regulated</h4><p>Highly regulated use cases are subject to strict requirements and standards, where non-compliance can result in severe penalties or legal consequences. 
As a result, these use cases often require extensive due diligence of business partners, including Know Your Customer (KYC), Anti-Money Laundering (AML), and Countering the Financing of Terrorism (CFT) checks.</p><p>The process of obtaining vLEI for organizations and their representatives involves robust identity verification that is more stringent than the checks often conducted in the financial sector. Consequently, vLEI, which is built on the global LEI system, can help streamline the due diligence process, significantly reducing compliance costs.</p><h4>4. Open Ecosystems</h4><p>The final criterion for viable vLEI use cases is that they operate within open ecosystems, allowing the addition of organizations whose identities are not known in advance. Onboarding a new organization to the ecosystem often incurs significant cost, time, and effort. vLEI allows these onboarding costs to be offloaded to a third party, specifically a qualified vLEI issuer (QVI), which ensures that the organization and its representatives have undergone strict identity verification.</p><h3>Fair, Good, and Great Use Cases</h3><h4>Great Use Cases</h4><p>Great use cases are those that satisfy all 4 criteria, making them ideal candidates that could benefit from integrating vLEI into their workflows.</p><p>A prime example of a great use case is <strong>trade finance</strong>. It typically involves organizations (criterion 1), which are often located in different countries (criterion 2). Trade finance is also highly regulated, with financiers subject to AML and CFT requirements (criterion 3). The trade finance ecosystem is open, allowing any company worldwide to initiate trades and apply for, e.g., a letter of credit (criterion 4).</p><h4>Good Use Cases</h4><p>Good use cases are those that satisfy 3 out of the 4 criteria. 
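</p><p>The three tiers can be summarized in a few lines of code. This is an illustrative sketch of the scoring rule described in this article, not an official assessment tool:</p>

```javascript
// Illustrative sketch: classify a vLEI use case by how many of the
// 4 adoptability criteria it satisfies (all 4 = great, 3 = good, else fair).
function classifyUseCase({ orgToOrg, crossBorder, highlyRegulated, openEcosystem }) {
  const score = [orgToOrg, crossBorder, highlyRegulated, openEcosystem]
    .filter(Boolean).length;
  if (score === 4) return 'great';
  if (score === 3) return 'good';
  return 'fair';
}

// Trade finance satisfies all four criteria.
console.log(classifyUseCase({
  orgToOrg: true, crossBorder: true, highlyRegulated: true, openEcosystem: true,
})); // great

// EBA financial reporting operates in a closed ecosystem, so only three hold.
console.log(classifyUseCase({
  orgToOrg: true, crossBorder: true, highlyRegulated: true, openEcosystem: false,
})); // good
```

<p>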
We consider these use cases worth pursuing.</p><p>An example of a good use case is the financial reporting of banking institutions in Europe to the European <a href="https://www.eba.europa.eu/publications-and-media/press-releases/eba-publishes-discussion-paper-centralisation-eea-banks">Banking Authority (EBA)</a>. This exemplifies a B2G scenario (criterion 1), encompassing banks across multiple European countries (criterion 2) that operate within a highly regulated environment (criterion 3). What makes this use case simply good rather than great in our criteria is that European banks are already known entities within the closed ecosystem supervised by the EBA.</p><h4>Fair Use Cases</h4><p>Fair use cases are those that satisfy two or fewer criteria. While they may be feasible use cases for vLEI, we consider them low-potential candidates. We argue that fair use cases will become viable once vLEI has achieved widespread adoption. For instance, organizations that already possess vLEI for stronger use cases might contemplate applying it to these fair use cases.</p><p>An example of a use case we consider “fair” for vLEI is an HR platform. Such a platform is used internally between employers and employees (not org-to-org) and is also not highly regulated. Typically, employees and employers have alternative means to verify each other, diminishing the necessity for vLEI in this context.</p><h3>Conclusion</h3><p>While vLEI holds significant potential to transform global business operations, it currently faces a challenge known as the “cold start problem,” where there are not enough holders and use cases to foster exponential growth in the ecosystem. Pioneers and early adopters are encouraged to prioritize exploring high-potential use cases.</p><p>Do you agree with our criteria? Are there additional criteria we should consider adding to the list? 
We welcome your feedback and invite you to contact us at contact@enauthn.id.</p><img src="https://medium.com/_/stat?event=post.clientViewed&referrerSource=full_rss&postId=c5fa0e04a6f5" width="1" height="1" alt=""><hr><p><a href="https://medium.com/finema/4-criteria-for-great-vlei-use-cases-c5fa0e04a6f5">4 Criteria for Great vLEI Use Cases</a> was originally published in <a href="https://medium.com/finema">Finema</a> on Medium, where people are continuing the conversation by highlighting and responding to this story.</p>]]></content:encoded>
        </item>
        <item>
            <title><![CDATA[KERI Tutorial: Sign and Verify with Signify & Keria]]></title>
            <link>https://medium.com/finema/keri-tutorial-sign-and-verify-with-signify-keria-833dabfd356b?source=rss----55e22a76bbfc---4</link>
            <guid isPermaLink="false">https://medium.com/p/833dabfd356b</guid>
            <category><![CDATA[decentralized-identity]]></category>
            <category><![CDATA[cryptography]]></category>
            <category><![CDATA[keris]]></category>
            <category><![CDATA[self-sovereign-identity]]></category>
            <dc:creator><![CDATA[Kriskanin Hengniran]]></dc:creator>
            <pubDate>Mon, 06 May 2024 06:19:40 GMT</pubDate>
            <atom:updated>2024-05-06T06:23:41.453Z</atom:updated>
<content:encoded><![CDATA[<p><strong>Authors</strong>: Kriskanin Hengniran &amp; Nuttawut Kongsuwan, Finema Co., Ltd.</p><blockquote><strong>Note</strong>: This tutorial is heavily inspired by “KERI Tutorial Series — KLI: Sign and Verify with Heartnet” by Kent Bull</blockquote><blockquote><a href="https://kentbull.com/2023/01/27/keri-tutorial-series-kli-sign-and-verify-with-heartnet/">https://kentbull.com/2023/01/27/keri-tutorial-series-kli-sign-and-verify-with-heartnet/</a></blockquote><p>This blog presents an introductory guide for implementing the Key Event Receipt Infrastructure (KERI) protocol using the Signify and KERIA agents. The guide starts with installing and running the agents in Docker containers. The guide then provides a script to showcase the use of Signify and KERIA agents, including procedures for creating autonomic identifiers (AIDs), signing messages, and verifying signatures.</p><h3>Signify &amp; KERIA</h3><p>Signify &amp; KERIA are open-source projects developed for building client-side identity-wallet applications using the KERI protocol. A Signify-KERIA identity wallet utilizes the <strong>hybrid edge-cloud wallet architecture</strong> where Signify provides a lightweight edge wallet component whereas KERIA provides a heavier cloud wallet component. Signify and KERIA were designed based on the principle of “<strong>key at the edge (KATE)</strong>”. 
That is, essential cryptographic operations are performed at edge devices.</p><p>Some resources for Signify &amp; KERIA can be found here:</p><ul><li>The Signify-KERIA protocol by Philip Feairheller: <a href="https://github.com/WebOfTrust/keria/blob/main/docs/protocol.md">https://github.com/WebOfTrust/keria/blob/main/docs/protocol.md</a></li><li>KERI API (KAPI): <a href="https://github.com/WebOfTrust/kapi/blob/main/kapi.md">https://github.com/WebOfTrust/kapi/blob/main/kapi.md</a></li></ul><h4>Signify Edge Agent</h4><p>Signify provides an edge agent for a KERI identity wallet and is used primarily for essential cryptographic operations including key pair generation and digital signature creation. Signify utilizes the hierarchical deterministic (HD) key algorithm. For example, a Signify application could safeguard a single master seed, which is used to generate and manage any number of AIDs. Signify is designed to be <strong>lightweight</strong> so as to support devices with limited capabilities.</p><p>Signify is currently available in TypeScript and Python:</p><ul><li>SignifyPy (Python) <a href="https://github.com/WebOfTrust/signifypy">https://github.com/WebOfTrust/signifypy</a></li><li>Signify-TS (TypeScript) <a href="https://github.com/WebOfTrust/signify-ts">https://github.com/WebOfTrust/signify-ts</a></li></ul><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/0*jhO0Yb-A24uBjXTC" /></figure><h4>KERIA Cloud Agent</h4><p>KERI Agent (KERIA) provides a cloud agent for a KERI identity wallet and is used for, e.g., data storage, agent-to-agent communications, and verification of KERI key event logs (KELs). KERIA is engineered to handle the heavy lifting for users, allowing their edge devices to stay lightweight while maintaining high security by performing all essential cryptographic operations at the edge.</p><p>A KERIA agent is cryptographically delegated by a Signify agent using the KERI delegation protocol. 
All instructions from a user are signed at the edge by a Signify agent and subsequently verified by a KERIA cloud agent. KERIA is currently available in Python: <a href="https://github.com/WebOfTrust/keria">https://github.com/WebOfTrust/keria</a></p><h3>Installation Guide for Node and NVM</h3><p>In this guide, we will be using Node.js to run Signify-TS (TypeScript) with a KERIA server running in a Docker container.</p><h4>Install Node.js</h4><p>Visit the <a href="https://nodejs.org/en">Node.js</a> website to download Node.js.</p><h4>Install NVM on Linux or macOS</h4><p>To easily switch between different versions of Node.js, you could use Node Version Manager (NVM). NVM on Linux and macOS may be installed using curl:</p><pre>curl -o- https://raw.githubusercontent.com/creationix/nvm/v0.34.0/install.sh | bash</pre><p>Alternatively, we could use wget:</p><pre>wget -qO- https://raw.githubusercontent.com/creationix/nvm/v0.34.0/install.sh | bash</pre><h4>Install NVM on Windows</h4><p>On Windows, NVM may be installed from the <a href="https://github.com/coreybutler/nvm-windows/releases">GitHub releases page</a>.</p><p>Note: After installing NVM, do not forget to restart your terminal. Alternatively, you can refresh the available commands in your system path by executing source ~/.nvm/nvm.sh</p><h4>Using NVM</h4><p>To ensure that NVM is correctly installed, you can list the installed Node.js versions with:</p><pre>nvm ls</pre><p>To install Node.js version 18.18.2, run:</p><pre>nvm install v18.18.2</pre><p>Here, we use v18.18.2 in this guide. 
This is not a requirement, and other versions of Node.js should work with Signify-TS.</p><h3>Install Dependencies and Run a KERIA Server</h3><p>To install dependencies for the tutorial, you could clone this repository <a href="https://github.com/enauthn/tutorial-signify-keria">https://github.com/enauthn/tutorial-signify-keria</a> and run the following script:</p><pre>git clone https://github.com/enauthn/tutorial-signify-keria<br>cd tutorial-signify-keria<br>npm install<br>docker-compose up -d</pre><p>where docker-compose up -d runs a KERIA server in a Docker container. Alternatively, you could get a Docker image for running a KERIA server from Docker Hub <a href="https://hub.docker.com/r/weboftrust/keria">https://hub.docker.com/r/weboftrust/keria</a>.</p><h3>Running Signify-TS Scripts</h3><p>Here, we demonstrate Signify-TS scripts for signing and verifying an arbitrary message. A signer, called Allie, signs a message “Hello Brett” and sends it to a verifier, called Brett. Brett subsequently obtains Allie’s key event log (KEL) and uses it to verify the signature on the message.</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/0*WCNNY5qpScY6RmBM" /></figure><h4>Connecting Signify clients to the KERIA server</h4><p>First, Allie and Brett must boot and then connect to a KERIA server with their Signify (edge) clients. Each boot command creates a separate agent and a database in the KERIA server. See <a href="https://github.com/WebOfTrust/keria">https://github.com/WebOfTrust/keria</a> for more details.</p><p>Typically, Allie’s and Brett’s Signify scripts should run on separate devices, but here we put them in the same TypeScript file for simplicity. Allie’s and Brett’s Signify clients may also connect to different KERIA servers at different service endpoints. 
In this tutorial, both clients connect to a KERIA server running on the local host for simplicity.</p><pre>await signify.ready();<br><br>const url = &#39;http://127.0.0.1:3901&#39;;<br>const bootUrl = &#39;http://127.0.0.1:3903&#39;;<br>const bran1 = signify.randomPasscode();<br>const bran2 = signify.randomPasscode();<br><br>const allieClient = new signify.SignifyClient(<br>    url,<br>    bran1,<br>    signify.Tier.low,<br>    bootUrl<br>);<br>await allieClient.boot();<br>await allieClient.connect();<br><br>const brettClient = new signify.SignifyClient(<br>    url,<br>    bran2,<br>    signify.Tier.low,<br>    bootUrl<br>);<br>await brettClient.boot();<br>await brettClient.connect();</pre><p>To explain the above script:</p><ul><li>signify.randomPasscode() generates a random string with 126-bit entropy using libsodium</li><li>new signify.SignifyClient(…) creates a new Signify instance, initialized with the newly generated passcode</li><li>allieClient.boot() creates a KERIA agent and a corresponding database at the KERIA server via port 3903</li><li>allieClient.connect() connects a Signify agent to the corresponding KERIA agent that has been booted.</li></ul><p>Signify uses a passcode to generate cryptographic keys using a <a href="https://en.wikipedia.org/wiki/Key_derivation_function">key derivation function (KDF)</a> where signify.Tier specifies how much the passcode is stretched.</p><h4>Allie creates an AID</h4><p>Before Allie can sign a message with the KERI protocol, she must first create an <a href="https://medium.com/finema/keri-jargon-in-a-nutshell-part-1-fb554d58f9d0#7344">autonomic identifier (AID)</a> with a key inception event in a <a href="https://medium.com/finema/keri-jargon-in-a-nutshell-part-1-fb554d58f9d0#7c43">Key Event Log (KEL)</a>.</p><pre>const icpResult1 = await allieClient<br>    .identifiers()<br>    .create(&#39;aid1&#39;, {});<br>await waitOperation(allieClient, await icpResult1.op());<br><br>const rpyResult1 = await allieClient<br>    
.identifiers()<br>    .addEndRole(&#39;aid1&#39;, &#39;agent&#39;, allieClient!.agent!.pre);<br>await waitOperation(allieClient, await rpyResult1.op());</pre><p>To explain the above script:</p><ul><li>allieClient.identifiers().create(&#39;aid1&#39;, {}) creates an AID where the Signify agent signs the inception event and sends the event along with its signature to the KERIA agent. Here, the AID is given an alias &#39;aid1&#39;.</li><li>allieClient.identifiers().addEndRole(...) cryptographically authorizes the KERIA agent to operate on behalf of the AID’s controller.</li><li>waitOperation(...) waits for the KERIA agent to complete its operation.</li></ul><p>addEndRole stands for “adding endpoint role authorization”. This is a mechanism in the KERI protocol where the controller of an AID cryptographically authorizes—by signing with a private key associated with the AID—a service endpoint of a KERIA agent to operate on the AID’s behalf. Another agent that needs to communicate with the authorized KERIA agent can then verify the authorization signature.</p><h4>Brett resolves Allie’s AID</h4><p>Allie and Brett may exchange their KELs using the <a href="https://medium.com/finema/keri-jargon-in-a-nutshell-part-3-oobi-and-ipex-2e6b222f4b87">Out-Of-Band Introduction (OOBI)</a> protocol. Allie may generate her OOBI URL that points to her KERIA agent’s service endpoint (which has been authorized in the previous step) as follows:</p><pre>const oobi1 = await allieClient.oobis().get(&#39;aid1&#39;, &#39;agent&#39;);</pre><p>The generated OOBI URL could be sent to Brett via an out-of-band channel such as email, messaging apps, or a QR code scan. 
Subsequently, Brett may ask his KERIA agent to resolve Allie’s OOBI to obtain the AID’s KEL from Allie’s KERIA agent:</p><pre>const oobiOp = await brettClient.oobis().resolve(oobi1.oobis[0], &#39;aid1&#39;);<br>await waitOperation(brettClient, oobiOp);</pre><h4>Allie signs the message</h4><p>To sign a message, Allie may use the Signify <a href="https://github.com/WebOfTrust/signify-ts/blob/main/src/keri/core/keeping.ts">KeyManager class</a> as follows:</p><pre>const aid1 = await allieClient.identifiers().get(&#39;aid1&#39;);<br>const keeper1 = await allieClient.manager!.get(aid1);<br>const message = &quot;Test message&quot;;<br>const signature = await keeper1.sign(signify.b(message))[0];<br>console.log(&#39;signature&#39;, signature);</pre><p>which generates the following <a href="https://trustoverip.github.io/tswg-cesr-specification/">CESR-encoded</a> signature:</p><pre>signature AAAnBe-VPfBU9-3eb7aM5GNwr_NBuoJzA8vm9AFPmgj3I4LIv1mup2bwPDlbIQ6gAgtaEZg5rwE1_fTVVTmPo0oI</pre><p>To explain the above script:</p><ul><li>allieClient.identifiers().get(&#39;aid1&#39;) retrieves the information about the AID with alias &#39;aid1&#39; from the KERIA agent</li><li>allieClient.manager!.get(aid1) creates an instance from the KeyManager class called a keeper that signs an arbitrary byte string with keeper.sign()</li><li>signify.b() turns a text string into a byte string.</li></ul><p>When a Signify agent creates an AID, it uses a salt together with its passcode to generate cryptographic keys associated with the AID. The salt is then encrypted and sent to the KERIA agent. 
allieClient.identifiers().get(&#39;aid1&#39;) also retrieves and decrypts the salt, which the KeyManager uses to regenerate the key for signing the message.</p><h4>Brett verifies Allie’s signature</h4><p>After Allie sends the message to Brett, Brett wants to make sure the message is really from Allie by verifying the signature on the message. Brett may retrieve Allie’s KEL and the corresponding <a href="https://medium.com/finema/keri-jargon-in-a-nutshell-part-1-fb554d58f9d0#7344">key state</a> of her AID to verify the signature as follows:</p><pre>const aid1StateBybrettClient = await brettClient.keyStates().get(aid1.prefix);<br>const siger = new signify.Siger({qb64: signature});<br>const verfer = new signify.Verfer({<br>    qb64: aid1StateBybrettClient[0].k[0]<br>});<br>const verificationResult = verfer.verify(siger.raw, signify.b(message));<br>console.log(&#39;verificationResult&#39;, verificationResult);</pre><p>which gives the following output:</p><pre>verificationResult true</pre><p>To explain the above script:</p><ul><li>brettClient.keyStates().get(aid1.prefix) retrieves the key state of Allie’s AID</li><li>signify.Siger({qb64: signature}) is a signature-wrapper instance of the <a href="https://github.com/WebOfTrust/signify-ts/blob/main/src/keri/core/siger.ts">Siger class</a>, initialized with Allie’s signature on the message.</li><li>signify.Verfer(...) is a verifier-wrapper instance of the <a href="https://github.com/WebOfTrust/signify-ts/blob/main/src/keri/core/verfer.ts">Verfer class</a>, initialized with Allie’s public key</li><li>verfer.verify(...) then verifies the message and its signature using Allie’s public key</li></ul><p>The verification of the signature against the message indicates that the signature is valid. 
This ensures the authenticity and integrity of the message that Brett received from Allie.</p><h3>Conclusion</h3><p>This tutorial gives a brief introduction to using the KERI protocol with the Signify and KERIA agents, which provide a footing for building client-side KERI-based applications. Signify provides libraries for building KERI edge agents whereas KERIA provides libraries for building companion cloud agents. These agents follow the principle of “key at the edge (KATE)” where essential cryptographic operations are performed at edge devices.</p><p>Unfortunately, these are still early days for the two projects, and not many educational materials are available as of May 2024. To dive deeper into these two projects, I recommend studying their integration scripts at <a href="https://github.com/WebOfTrust/signify-ts/tree/main/examples/integration-scripts">https://github.com/WebOfTrust/signify-ts/tree/main/examples/integration-scripts</a>.</p><img src="https://medium.com/_/stat?event=post.clientViewed&referrerSource=full_rss&postId=833dabfd356b" width="1" height="1" alt=""><hr><p><a href="https://medium.com/finema/keri-tutorial-sign-and-verify-with-signify-keria-833dabfd356b">KERI Tutorial: Sign and Verify with Signify &amp; Keria</a> was originally published in <a href="https://medium.com/finema">Finema</a> on Medium, where people are continuing the conversation by highlighting and responding to this story.</p>]]></content:encoded>
        </item>
        <item>
            <title><![CDATA[vLEI Demystified Part 3: QVI Qualification Program]]></title>
            <link>https://medium.com/finema/vlei-demystified-part-3-qvi-qualification-program-59b4eba308f0?source=rss----55e22a76bbfc---4</link>
            <guid isPermaLink="false">https://medium.com/p/59b4eba308f0</guid>
            <category><![CDATA[decentralized-identity]]></category>
            <category><![CDATA[keris]]></category>
            <category><![CDATA[self-sovereign-identity]]></category>
            <category><![CDATA[cryptography]]></category>
            <dc:creator><![CDATA[Yanisa Sunanchaiyakarn]]></dc:creator>
            <pubDate>Fri, 19 Apr 2024 23:22:44 GMT</pubDate>
            <atom:updated>2024-04-29T02:22:25.992Z</atom:updated>
            <content:encoded><![CDATA[<p><strong><em>Authors: </em></strong><em>Yanisa Sunanchaiyakarn &amp; Nuttawut Kongsuwan, Finema Co. Ltd.</em></p><p>vLEI Demystified Series:</p><ul><li><a href="https://medium.com/finema/vlei-demystified-part-1-comprehensive-overview-212349c09643">Part 1: Comprehensive Overview</a></li><li><a href="https://medium.com/finema/vlei-demystified-part-2-identity-verification-519102614b8e">Part 2: Identity Verification</a></li><li><a href="https://medium.com/finema/vlei-demystified-part-3-qvi-qualification-program-59b4eba308f0"><strong>Part 3: QVI Qualification Program</strong></a></li></ul><p>This blog is the third part of the vLEI Demystified series. The previous two parts, <a href="https://www.linkedin.com/posts/finema-official_vlei-demystified-part-1-comprehensive-overview-activity-7149006651126075392-mRZ_">vLEI Demystified Part 1: Comprehensive Overview</a> and <a href="https://medium.com/finema/vlei-demystified-part-2-identity-verification-519102614b8e">vLEI Demystified Part 2: Identity Verification</a>, explained the foundation of the pioneering verifiable Legal Entity Identifier (vLEI) ecosystem as well as its identity verification procedures. In this part, we share our journey through the qualification of vLEI issuers, known as <strong>Qualified vLEI Issuers (QVIs)</strong>, including the requirements and obligations that QVIs must fulfill once they are authorized by GLEIF to perform their roles in the ecosystem.</p><p>The Qualification of vLEI Issuers is the evaluation process conducted by the Global Legal Entity Identifier Foundation (GLEIF) to assess the suitability of organizations aspiring to serve as Qualified vLEI Issuers within the vLEI ecosystem. GLEIF has established the Qualification Program for all interested organizations, which can be either current LEI Issuers (Local Operating Units: LOUs) or new business partners who wish to explore the emerging vLEI ecosystem.
The organizations that complete the Qualification Program under the vLEI Ecosystem Governance Framework (vLEI EGF) are authorized to perform verification, issuance, and revocation of vLEI credentials for legal entities seeking the credentials, as well as for their representatives.</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/0*ERzbZDmmIP0Ob84c" /><figcaption>Photo by <a href="https://unsplash.com/@nguyendhn?utm_source=medium&amp;utm_medium=referral">Nguyen Dang Hoang Nhu</a> on <a href="https://unsplash.com?utm_source=medium&amp;utm_medium=referral">Unsplash</a></figcaption></figure><h3>Step 1: Start the Qualification Program</h3><p>To kick-start the qualification process, organizations interested in becoming QVIs must first review <a href="https://www.gleif.org/en/vlei/the-lifecycle-of-a-vlei-issuer/gleif-qualification-of-vlei-issuers/required-documents">Appendix 2: vLEI Issuer Qualification Program Manual</a>, which provides an overview of the Qualification Program, as well as the vLEI Ecosystem Governance Framework (vLEI EGF), to make sure that they understand how to incorporate the framework’s requirements into their operations. Once they have decided to proceed, interested organizations may initiate the Qualification Program by sending an email to <a href="mailto:qualificationrequest@gleif.org">qualificationrequest@gleif.org</a> along with a <strong>Non-Disclosure Agreement (NDA)</strong> signed by an authorized signatory of the organization.</p><p>Unless GLEIF has a means to verify the signatory’s authority itself, the interested organization may be required to submit proof of that authority.
If the NDA signer’s authority is delegated, a power of attorney may also be required.</p><p>After GLEIF reviews the qualification request, they will countersign the NDA and organize an introductory meeting with the interested organization, now called a <strong>candidate vLEI issuer</strong>, to discuss the next steps of the Qualification Program.</p><h3>Step 2: Implement the Qualification Program Requirements</h3><p>To evaluate whether a candidate vLEI issuer has both the financial and technical capabilities to perform the QVI role, the candidate vLEI issuer is required to implement the Qualification Program Requirements, which consist of <strong>business</strong> and <strong>technical</strong> qualifications. Throughout this process, the candidate vLEI issuer may schedule <strong>up to two meetings</strong> with GLEIF to clarify program issues and requirements.</p><h4><strong>Complete the Qualification Program Checklist</strong></h4><p>A candidate vLEI issuer is required to complete <a href="https://www.gleif.org/en/vlei/the-lifecycle-of-a-vlei-issuer/gleif-qualification-of-vlei-issuers/required-documents">Appendix 3: vLEI Issuer Qualification Program Checklist</a> to demonstrate that they are capable of actively participating in the vLEI ecosystem and are in good financial standing. The checklist and supporting documents can be submitted via online portals provided by GLEIF.</p><p>The Qualification Program Checklist is divided into 12 sections, Section A to Section L.
The <strong>first five</strong> sections (Section A to Section E) focus mainly on the business aspects, while the <strong>last seven</strong> sections (Section F to Section L) cover the technical specifications and relevant policies for operating the <strong>vLEI Issuer Services</strong>.</p><blockquote>Note: <strong>vLEI Issuer Services</strong> are all of the services related to the issuance, management, and revocation of vLEI credentials provided by the QVI.</blockquote><p><strong>Section A: Contact Details</strong></p><p>This section requires submission of the candidate vLEI issuer’s general information as well as contact details of the key persons involved in the vLEI operation project, namely: (1) Internal Project Manager, (2) Designated Authorized Representative (DAR), (3) Key Contact Operations, and (4) Key Contact Finance.</p><p><strong>Section B: Entity Structure</strong></p><p>This section requires submission of the candidate vLEI issuer’s organization structure, regulatory, internal, and external audit reports, operational frameworks, and any third-party consultants that the candidate vLEI issuer has engaged regarding their business and technological evaluation.</p><p><strong>Section C: Organization Structure</strong></p><p>This section requires submission of the current organization chart for all vLEI operations and a complete list of all relevant third-party service providers that support the vLEI operations.</p><p><strong>Section D: Financial Data, Audits &amp; General Governance</strong></p><p>This section requires submission of the financial and operational conditions of the candidate vLEI issuer’s business, including:</p><ul><li>Audited financial statements for the prior year</li><li>Financial auditor reports</li><li>Formal vLEI Issuer Operation Budget</li></ul><p><strong>Section E: Pricing Model</strong></p><p>In this section, the candidate vLEI issuer outlines their strategy to generate revenue from the vLEI operations and demonstrates that they are
committed to managing the funding and monetization of the services they plan to offer. This includes the pricing model and business plan for the vLEI Issuer Services.</p><p><strong>Section F: vLEI Issuer Services</strong></p><p>In this section, the candidate vLEI issuer shall outline their detailed plans and processes related to the issuance and revocation of vLEI credentials, including:</p><ul><li>Processes for receiving payments from legal entity (LE) clients</li><li>Processes for identity verification in accordance with the vLEI EGF</li><li>Processes for validating the legal identity of official organization role (OOR) persons as well as using the GLEIF API to choose the correct OOR code</li><li>Processes for calling the vLEI Reporting API for each issuance of LE and OOR vLEI credentials</li><li>Processes for verifying the statuses of legal entity clients’ LEIs. The clients must be notified 30 days before their LEI expires.</li><li>Processes for revoking all vLEIs issued to a legal entity client whose LEI has lapsed</li><li>Processes for monitoring compliance with the Service Level Agreement (Appendix 5)</li><li>Processes for monitoring witnesses for erroneous or malicious activities</li></ul><p><strong>Section G: Records Management</strong></p><p>In this section, the candidate vLEI issuer provides their internal <strong>Records Management Policy</strong>, which defines personnel responsibilities for records retention and ensures that records management processes are documented, communicated, and supervised.</p><p><strong>Section H: Website Requirements</strong></p><p>In this section, the candidate QVI’s website is required to display the following items:</p><ul><li>QVI Trustmark</li><li>Applications, contracts, and required documents for legal entities to apply for vLEI credentials</li></ul><p><strong>Section I: Software</strong></p><p>In this section, the candidate vLEI issuer provides their internal policy for the <strong>Service
Management Process</strong>, including:</p><ul><li>Processes for installing, testing, and approving new software</li><li>Processes for identifying, tracking, and correcting software errors/bugs</li><li>Processes for managing cryptographic keys</li><li>Processes for recovering from compromise</li></ul><p>The candidate vLEI issuer must also specify their policies and operations related to the management of private keys and KERI witnesses as follows:</p><ul><li>Processes and policies for managing a thresholded multi-signature scheme, where at least 2 out of 3 qualified vLEI issuer authorized representatives (QARs) are required to approve issuance or revocation of vLEI credentials</li><li>Processes for operating KERI witnesses, where at least 5 witnesses are required for the vLEI Issuer Services</li></ul><p><strong>Section J: Networks and Key Event Receipt Infrastructure (KERI)</strong></p><p>In this section, the candidate vLEI issuer describes their <strong>network architecture</strong>, including KERI witnesses and the details of third-party cloud-based services, as well as a process for monitoring the vLEI Issuer-related IT infrastructure.
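As a rough illustration of how the witness requirement above (at least 5 witnesses) surfaces in practice, an inception configuration along the lines of the KERIpy demo scripts might look like the fragment below. This is a sketch only: the witness AIDs are placeholders, and the accountability threshold (`toad`) of 4-of-5 is an assumed value, not a GLEIF requirement.

```json
{
  "transferable": true,
  "wits": [
    "<witness-1-AID>",
    "<witness-2-AID>",
    "<witness-3-AID>",
    "<witness-4-AID>",
    "<witness-5-AID>"
  ],
  "toad": 4,
  "icount": 1,
  "ncount": 1,
  "isith": "1",
  "nsith": "1"
}
```

Note that an actual QVI identifier is a multi-signature group AID controlled jointly by the QARs, whose setup involves more than this single-signature fragment.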
The candidate vLEI issuer must also provide the following internal policies:</p><ul><li>Disaster Recovery and/or Business Continuity Plan</li><li>Backup Policies and Practices</li><li>The vetting process for evaluating the reliability of third-party service providers</li></ul><p><strong>Section K: Information Security</strong></p><p>In this section, the candidate vLEI issuer provides their internal <strong>Information Security Policy</strong>, which covers, for example, formal governance, revision management, personnel training, physical access policies, incident reports, and remediation from security breaches.</p><p><strong>Section L: Compliance</strong></p><p>QVI candidates must declare that they will abide by the general and legal requirements of a vLEI Issuer by:</p><ul><li>Executing a vLEI Issuer Qualification Agreement with GLEIF</li><li>Executing a formal contract, whose template follows the Agreement’s requirements, with a Legal Entity before the issuance of a vLEI credential</li><li>Complying with the requirements for Qualification, the vLEI Ecosystem Governance Framework, and any other applicable legal requirements</li></ul><h4>Respond to Remediation (if any)</h4><p>After the candidate vLEI issuer has submitted the qualification program checklist and supporting documents through the online portals, GLEIF will review the submission and provide the <strong>review results</strong> and <strong>remediation requirements</strong>, if any. Subsequently, the candidate vLEI issuer must respond to the remediation requirements with corresponding updates to their qualification program checklist and supporting documents.</p><h4>Undergo Technical Qualification</h4><p>After the qualification program checklist has been submitted, reviewed, and remediated, the candidate vLEI issuer then proceeds to the technical part of the qualification program.
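As a concrete illustration of the 2-out-of-3 QAR rule from Section I, the validity condition boils down to a simple threshold check. The sketch below uses hypothetical QAR names; a real implementation verifies cryptographic signatures against the group AID rather than comparing name sets.

```python
# Minimal sketch of a 2-of-3 approval threshold (names are hypothetical).
THRESHOLD = 2  # at least 2 of the 3 QARs must approve

def approved(signers: set[str], qars: set[str], threshold: int = THRESHOLD) -> bool:
    """Return True if enough distinct, authorized QARs have signed."""
    valid = signers & qars  # ignore signatures from unknown parties
    return len(valid) >= threshold

qars = {"qar-alice", "qar-bob", "qar-carol"}
assert not approved({"qar-alice"}, qars)           # 1 of 3: rejected
assert approved({"qar-alice", "qar-carol"}, qars)  # 2 of 3: approved
```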
GLEIF and the candidate vLEI issuer then organize a dry run to test that the candidate vLEI issuer is capable of:</p><ul><li>Performing OOBI sessions and authentication</li><li>Generating and managing a multi-signature group AID</li><li>Issuing, verifying, and revoking vLEI credentials</li></ul><p>The purpose of the dry run is to make sure that the candidate vLEI issuer has the technical capability to operate as a QVI, as well as to identify and fix any technical issues that may arise. A dry run may take multiple meeting sessions if required.</p><p>After the candidate vLEI issuer completes the dry run, they may proceed to the official technical qualification, which repeats the process from the dry run. vLEI credentials issued during the official session are official and may be used in the vLEI ecosystem.</p><h3>Step 3: Sign the Qualification Agreement</h3><p>Once the candidate vLEI issuer has completed all of the business and technical qualification processes, GLEIF will notify the organization of the result of the Qualification Program. Approval of the qualification application results in the candidate vLEI issuer signing the <a href="https://www.gleif.org/en/vlei/the-lifecycle-of-a-vlei-issuer/gleif-qualification-of-vlei-issuers/required-documents"><strong>vLEI Issuer Qualification Agreement</strong></a> with GLEIF. The candidate vLEI issuer then officially becomes a QVI.</p><h4>Beyond the Qualification Program</h4><p>Once officially qualified, the QVI must ensure strict compliance with the vLEI EGF and the requirements that they completed in the Qualification Program Checklist. For example, their day-to-day operations must comply with <a href="https://www.gleif.org/en/vlei/the-lifecycle-of-a-vlei-issuer/gleif-qualification-of-vlei-issuers/required-documents">Appendix 5: Qualified vLEI Issuer Service Level Agreement (SLA)</a> as well as with their internal policies such as the Records Management Policy and Information Security Policy.
They must also continuously monitor their services and IT infrastructure, including the witnesses.</p><h4>Annual vLEI Issuer Qualification</h4><p>The QVI is also subject to the <strong>Annual vLEI Issuer Qualification</strong> by GLEIF to ensure that they continue to meet the requirements of the vLEI Ecosystem Governance Framework. If the QVI has made significant changes to their vLEI issuer services, IT infrastructure, or internal policies, the QVI must document the details of the changes and update the corresponding supporting documentation. GLEIF will then review the changes and request remediation actions, if any.</p><h3>Conclusion</h3><p>The QVI Qualification Program is deliberately rigorous to ensure the trustworthiness of the vLEI ecosystem, as QVIs play a vital role in maintaining trust and integrity among the downstream vLEI stakeholders. We at Finema are committed to promoting the vLEI ecosystem, and would be delighted to assist should you be interested in embarking on your journey to participate in the ecosystem.</p><hr><p><a href="https://medium.com/finema/vlei-demystified-part-3-qvi-qualification-program-59b4eba308f0">vLEI Demystified Part 3: QVI Qualification Program</a> was originally published in <a href="https://medium.com/finema">Finema</a> on Medium, where people are continuing the conversation by highlighting and responding to this story.</p>]]></content:encoded>
        </item>
        <item>
            <title><![CDATA[The Hitchhiker’s Guide to KERI. Part 3: How do you use KERI?]]></title>
            <link>https://medium.com/finema/the-hitchhikers-guide-to-keri-part-3-how-do-you-use-keri-2d1724afa432?source=rss----55e22a76bbfc---4</link>
            <guid isPermaLink="false">https://medium.com/p/2d1724afa432</guid>
            <category><![CDATA[self-sovereign-identity]]></category>
            <category><![CDATA[decentralized-identity]]></category>
            <category><![CDATA[keris]]></category>
            <category><![CDATA[cryptography]]></category>
            <dc:creator><![CDATA[Nuttawut Kongsuwan]]></dc:creator>
            <pubDate>Sat, 13 Apr 2024 13:20:21 GMT</pubDate>
            <atom:updated>2024-05-03T02:56:32.835Z</atom:updated>
            <content:encoded><![CDATA[<p>This blog is the third part of a three-part series, <strong>the</strong> <strong>Hitchhiker’s Guide to KERI</strong>:</p><ul><li><a href="https://medium.com/finema/the-hitchhikers-guide-to-keri-part-1-51371f655bba">Part 1: Why should you adopt KERI?</a></li><li><a href="https://medium.com/finema/the-hitchhikers-guide-to-keri-part-2-what-exactly-is-keri-e46a649ac54c">Part 2: What exactly is KERI?</a></li><li><strong>Part 3: How do you use KERI?</strong></li></ul><p>Now that you understand the rationale for adopting KERI and have a foundational grasp of its principles, this part of the series covers the first steps for getting started with KERI and building applications on top of it.</p><p>The resources below, presented <strong>in no particular order</strong>, supplement your exploration of KERI. This blog is also intended as <strong>an implementer’s guide</strong> to deepen your understanding and proficiency in using KERI.</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/0*c9go54fF7ajIog3d" /><figcaption>Photo by <a href="https://unsplash.com/@ilyapavlov?utm_source=medium&amp;utm_medium=referral">Ilya Pavlov</a> on <a href="https://unsplash.com?utm_source=medium&amp;utm_medium=referral">Unsplash</a></figcaption></figure><h3>Read the Whitepaper</h3><p>The Key Event Receipt Infrastructure (KERI) protocol was first introduced in the <a href="https://github.com/SmithSamuelM/Papers/blob/master/whitepapers/KERI_WP_2.x.web.pdf">KERI whitepaper</a> by <strong>Dr. Samuel M. Smith</strong> in 2019. The whitepaper kickstarted the development of the entire ecosystem.</p><p>While the KERI whitepaper undoubtedly offers invaluable insights into the intricate workings and underlying rationale of the protocol, I would caution against starting your KERI journey with it.
At more than 140 pages, it may pose a significant challenge for all but a few cybersecurity experts. It is advisable to revisit the whitepaper once you have firmly grasped the foundational concepts of KERI. Nevertheless, should you be inclined towards a more rigorous learning approach, you are certainly encouraged to undertake the endeavor.</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/1*G73ySATub46R2mHWPZFs5g.png" /><figcaption>The KERI Whitepaper, first published July 2019.</figcaption></figure><p>I also recommend <a href="https://github.com/SmithSamuelM/Papers/tree/master/whitepapers">related whitepapers</a> by Dr. Samuel M. Smith as follows:</p><ul><li><a href="https://github.com/SmithSamuelM/Papers/blob/master/whitepapers/IdentifierTheory_web.pdf">Universal Identifier Theory</a>: a unifying framework for combining autonomic identifiers (AID) with human-meaningful identifiers.</li><li><a href="https://github.com/SmithSamuelM/Papers/blob/master/whitepapers/SPAC_Message.pdf">Secure Privacy, Authenticity, and Confidentiality (SPAC)</a>: the whitepaper that laid the foundation for the <a href="https://www.trustoverip.org/blog/2023/01/05/the-toip-trust-spanning-protocol/">ToIP trust-spanning protocol</a>.</li><li><a href="https://github.com/SmithSamuelM/Papers/blob/master/whitepapers/SustainablePrivacy.pdf">Sustainable Privacy</a>: a privacy-protection approach in the KERI ecosystem.</li></ul><h3>Read Introductory Contents</h3><p>Before delving into the whitepaper and related specifications, I recommend the following introductory materials, which helped me personally:</p><ul><li><a href="https://www.youtube.com/watch?v=izNZ20XSXR0">KERI Presentation</a> at SSI Meetup Webinar, given by the originator of KERI, Dr. Samuel M. Smith, himself</li><li><a href="https://docs.google.com/presentation/d/1lpzYcPrIox9V4hERtn4Kcf7uq01OVU9u3PuVm1aYzR0/edit#slide=id.ga411be7e84_0_0">KERI for Muggles</a>, by Samuel M. Smith and Drummond Reed.
This was a presentation given at the Internet Identity Workshop #33.</li></ul><blockquote>Note: the author of this blog was first exposed to KERI by this presentation.</blockquote><ul><li><a href="https://www.manning.com/books/self-sovereign-identity">Section 10.8</a> of “Self-Sovereign Identity” by Alex Preukschat &amp; Drummond Reed, Manning Publications (2021). This section was also written by Dr. Samuel M. Smith.</li><li><a href="https://www.windley.com/archives/2020/09/the_architecture_of_identity_systems.shtml">The Architecture of Identity Systems</a>, by Phil Windley. Written by one of the most prominent writers in the SSI ecosystem, this article compares <strong>administrative</strong>, <strong>algorithmic</strong>, and <strong>autonomic</strong> identity systems.</li><li><a href="https://weboftrust.github.io/WOT-terms/?level=3">KERISSE</a>, by Henk van Cann and Kor Dwarshuis: an educational platform as well as a search engine for the KERI ecosystem.</li></ul><p>More resources can also be found at <a href="https://keri.one/keri-resources/">https://keri.one/keri-resources/</a>. Of course, this <strong>Hitchhiker’s Guide to KERI series</strong> is itself intended as one such introductory resource.</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/1*HLFiP2r4nI4K7RKf8hagEw.png" /><figcaption>“Self-Sovereign Identity” by Alex Preukschat &amp; Drummond Reed</figcaption></figure><h3>Read the KERI and Related Specifications</h3><p>As of 2024, the specifications for KERI and related protocols are being developed by the <a href="https://wiki.trustoverip.org/display/HOME/ACDC+%28Authentic+Chained+Data+Container%29+Task+Force">ACDC (Authentic Chained Data Container) Task Force</a> under the <a href="https://www.trustoverip.org/">Trust over IP (ToIP)</a> Foundation.
Currently, there are four specifications:</p><ul><li><a href="https://trustoverip.github.io/tswg-keri-specification/"><strong>Key Event Receipt Infrastructure (KERI)</strong></a>: the specification for the KERI protocol itself.</li><li><a href="https://trustoverip.github.io/tswg-acdc-specification/#go.draft-ssmith-acdc.html"><strong>Authentic Chained Data Containers (ACDC)</strong></a>: the specification for the variant of Verifiable Credentials (VCs) used within the KERI ecosystem.</li><li><a href="https://trustoverip.github.io/tswg-cesr-specification/"><strong>Composable Event Streaming Representation (CESR)</strong></a>: the specification for a dual text-binary encoding format used for messages exchanged within the KERI protocol.</li><li><a href="https://trustoverip.github.io/tswg-did-method-webs-specification/index.html"><strong>DID Webs Method Specification</strong></a>: the specification for the did:webs method, which improves the security properties of did:web using the KERI protocol.</li></ul><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/1*NgbXADm3792XoK7I8A7kTg.png" /><figcaption><a href="https://trustoverip.github.io/tswg-keri-specification/">KERI Specification v1.0 Draft</a></figcaption></figure><p>There are also two related protocols, which do not have their own dedicated specifications:</p><ul><li><a href="https://trustoverip.github.io/tswg-cesr-specification/#self-addressing-identifier-said"><strong>Self-Addressing Identifier (SAID)</strong></a>: a protocol for generating identifiers used in the KERI protocol.
Almost all identifiers in KERI are SAIDs, including AIDs, ACDCs’ identifiers, and schemas’ identifiers.</li><li><a href="https://trustoverip.github.io/tswg-keri-specification/#out-of-band-introduction-oobi"><strong>Out-Of-Band-Introduction (OOBI)</strong></a>: a discovery mechanism for AIDs and SAIDs using URLs.</li></ul><p>To learn about these specifications, I also recommend my <a href="https://medium.com/finema/keri-jargon-in-a-nutshell-part-1-fb554d58f9d0">KERI jargon in a nutshell</a> blog series.</p><blockquote>Note: The KERI community intends to eventually publish the KERI specifications through ISO. However, this goal may take several years to achieve.</blockquote><h3>Check out the KERI Open-Source Projects</h3><p>The open-source projects related to the KERI protocols and their implementations are hosted on the <a href="https://github.com/WebOfTrust/"><strong>WebOfTrust GitHub</strong></a>, all licensed under <a href="https://www.apache.org/licenses/LICENSE-2.0"><strong>Apache Version 2.0</strong></a>.</p><blockquote>Note: <strong>Apache License Version 2.0</strong> is a permissive open-source software license that allows users to freely use, modify, and distribute software under certain conditions. It permits users to use the software for any purpose, including commercial purposes, and grants patent rights to users. Additionally, it requires users to include a copy of the license and any necessary copyright notices when redistributing the software.</blockquote><p>Here are some of the important projects being actively developed by the KERI community:</p><h4><strong>Reference Implementation: KERIpy</strong></h4><p>The core libraries and the reference implementation for the KERI protocol are written in Python and called <strong>KERIpy</strong>.
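Before going further, the self-addressing mechanism behind SAIDs is worth a small illustration: the identifier is a digest of the very data that contains it, so any verifier can recompute and compare it. Real SAIDs use Blake3-256 digests with CESR encoding; the sketch below substitutes SHA-256 and plain base64url purely for illustration.

```python
import base64
import hashlib
import json

def saidify(data: dict, label: str = "d") -> dict:
    """Fill `label` with a digest of the serialized data (self-addressing).

    The field is first padded to the digest's final length (43 chars of
    base64url for SHA-256) so the serialized size is stable, then replaced
    with the computed digest.
    """
    padded = dict(data, **{label: "#" * 43})
    raw = json.dumps(padded, separators=(",", ":")).encode()
    digest = hashlib.sha256(raw).digest()
    said = base64.urlsafe_b64encode(digest).decode().rstrip("=")
    return dict(data, **{label: said})

doc = saidify({"d": "", "name": "example-schema"})
# A saidified document is a fixed point of saidify: recomputing the
# digest over the (re-padded) document yields the same identifier.
assert saidify(doc) == doc
```

Because the identifier commits to the content, tampering with any field invalidates the SAID without needing an external registry.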
KERIpy is by far the most important project; all other KERI projects are based on it.</p><ul><li>KERIpy (Python): <a href="https://github.com/WebOfTrust/keripy">https://github.com/WebOfTrust/keripy</a></li></ul><p>KERIpy is also available on Docker Hub and PyPI:</p><ul><li>Docker Hub: <a href="https://hub.docker.com/r/weboftrust/keri">https://hub.docker.com/r/weboftrust/keri</a></li><li>PyPI: <a href="https://pypi.org/project/keri/">https://pypi.org/project/keri/</a></li></ul><h4><strong>Edge Agent: Signify</strong></h4><p>The KERI ecosystem follows the principle of “<strong>key at the edge (KATE)</strong>,” that is, all essential cryptographic operations are performed at edge devices. The <strong>Signify</strong> projects have been developed to provide lightweight KERI functionalities on edge devices. Currently, Signify is available in Python and TypeScript.</p><ul><li>SignifyPy (Python): <a href="https://github.com/WebOfTrust/signifypy">https://github.com/WebOfTrust/signifypy</a></li><li>Signify-TS (TypeScript): <a href="https://github.com/WebOfTrust/signify-ts">https://github.com/WebOfTrust/signify-ts</a></li></ul><p>Signify is also available on PyPI and NPM:</p><ul><li>PyPI: <a href="https://pypi.org/project/signifypy/">https://pypi.org/project/signifypy/</a></li><li>NPM: <a href="https://www.npmjs.com/package/signify-ts">https://www.npmjs.com/package/signify-ts</a></li></ul><h4><strong>Cloud Agent: KERIA</strong></h4><p>Signify is designed to be lightweight and relies on a KERI cloud agent called <strong>KERIA</strong>. KERIA helps with data storage and facilitates communication with external parties. As mentioned above, all essential cryptographic operations are performed at the edge by Signify rather than by KERIA.
Private and sensitive data are also encrypted at the edge before being stored in a KERIA server.</p><ul><li>KERIA (Python): <a href="https://github.com/WebOfTrust/keria">https://github.com/WebOfTrust/keria</a></li></ul><p>KERIA is also available on Docker Hub:</p><ul><li>Docker Hub: <a href="https://hub.docker.com/r/weboftrust/keria">https://hub.docker.com/r/weboftrust/keria</a></li></ul><h4>Browser Extension: Polaris</h4><p>The Signify browser extension is based on Signify-TS and runs in browser environments. There is also a companion repository, Polaris Web, for building frontend applications that are compatible with the Signify browser extension.</p><ul><li>Signify Browser Extension: <a href="https://github.com/WebOfTrust/signify-browser-extension">https://github.com/WebOfTrust/signify-browser-extension</a></li><li>Polaris Web: <a href="https://github.com/WebOfTrust/polaris-web">https://github.com/WebOfTrust/polaris-web</a></li></ul><blockquote>Note: The Signify browser extension project was funded by <a href="https://provenant.net/">Provenant Inc.</a> and developed by <a href="https://www.rootsid.com/">RootsID</a>. The project has been donated to the WebOfTrust GitHub project under Apache License Version 2.0.</blockquote><h3>Study the KERI Command Line Interface (KLI)</h3><p>Once you grasp the basic concepts of KERI, one of the best ways to start learning about the KERI protocol is to work with the <strong>KERI command line interface (KLI)</strong>, which uses simple bash scripts to provide an interactive experience.</p><p>I recommend the following tutorials on KLI:</p><ul><li><a href="https://www.youtube.com/watch?v=GqjsRuu0V5A&amp;list=PL6wYtOHmJEBNqoQ7OURLPz2eXa8J_Mcjt&amp;index=1"><strong>KERI &amp; OOBI CLI Demo</strong></a>, by Phillip Feairheller &amp; Henk van Cann.</li><li><a href="https://kentbull.com/"><strong>KERI KLI Tutorial Series</strong></a>, by Kent Bull.
Currently, two tutorials are available: (1) <a href="https://kentbull.com/2023/01/27/keri-tutorial-series-kli-sign-and-verify-with-heartnet/">Sign &amp; Verify</a> with KERI and (2) <a href="https://kentbull.com/2023/03/09/keri-tutorial-series-treasure-hunting-in-abydos-issuing-and-verifying-a-credential-acdc/">Issuing ACDC</a> with KERI.</li></ul><p>Many more examples of KLI scripts can be found in the KERIpy repository, at:</p><ul><li><strong>KLI demo scripts</strong>: <a href="https://github.com/WebOfTrust/keripy/tree/main/scripts/demo">WebOfTrust/keripy/scripts/demo</a>.</li></ul><p>While KLI is a good introductory program for learning the KERI protocol, it is crucial to note that <strong>KLI is not suitable for developing end-user (client-side) applications in a production environment.</strong></p><blockquote>Note: KLI can be used in production for server-side applications.</blockquote><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/1*1q9JGLjS-RA3m14SofQIUw.png" /><figcaption><a href="https://kentbull.com/2023/01/27/keri-tutorial-series-kli-sign-and-verify-with-heartnet/">KERI KLI Series: Sign and Verify</a> by Kent Bull</figcaption></figure><h3>Build an App with Signify and KERIA</h3><p>For building a KERI-based application in production environments, the KERI community recommends using <strong>Signify</strong> for edge agents and <strong>KERIA</strong> for cloud agents. These projects were specifically designed to complement each other, enabling the implementation of “<strong>key at the edge (KATE)</strong>”.
That is, essential cryptographic operations are performed at edge devices, including key pair generation and signing, while private and sensitive data are encrypted before being stored in an instance of a KERIA cloud agent.</p><p>The Signify-KERIA protocol by Phillip Feairheller can be found here:</p><ul><li><strong>Signify/KERIA Request Authentication Protocol (SKRAP)</strong>: <a href="https://github.com/WebOfTrust/keria/blob/main/docs/protocol.md">https://github.com/WebOfTrust/keria/blob/main/docs/protocol.md</a></li></ul><p>The API between a Signify client and a KERIA server can be found here:</p><ul><li><strong>KERI API (KAPI)</strong>: <a href="https://github.com/WebOfTrust/kapi/blob/main/kapi.md">https://github.com/WebOfTrust/kapi/blob/main/kapi.md</a></li></ul><p>Example Signify scripts for interacting with a KERIA server can also be found here:</p><ul><li><strong>Example scripts</strong>: <a href="https://github.com/WebOfTrust/signify-ts/tree/main/examples/integration-scripts">https://github.com/WebOfTrust/signify-ts/tree/main/examples/integration-scripts</a></li></ul><h3>Join the KERI Community!</h3><p>To embark on your KERI journey, I recommend joining the KERI community. As of <strong>April 2024</strong>, there are three primary ways to engage:</p><h4>Join the WebOfTrust Discord Channel</h4><p>The WebOfTrust Discord channel is used for casual discussions and reminders for community meetings. You can join with the link below:</p><ul><li><a href="https://discord.gg/YEyTH5TfuB">https://discord.gg/YEyTH5TfuB</a></li></ul><h4>Join the ToIP ACDC Task Force</h4><p>The ACDC Task Force under the ToIP Foundation focuses on the <strong>development of the KERI and related specifications</strong>.
It also includes reports on the news and activities of the community’s members as well as in-depth discussions of related technologies.</p><p>The ACDC Task Force’s homepage can be found here:</p><ul><li><a href="https://wiki.trustoverip.org/display/HOME/ACDC+(Authentic+Chained+Data+Container)+Task+Force">https://wiki.trustoverip.org/display/HOME/ACDC+(Authentic+Chained+Data+Container)+Task+Force</a></li></ul><p>Currently, the task force holds a weekly meeting on <strong>Tuesdays</strong>:</p><ul><li><strong>NA/EU: 10:00–11:00 EDT / 14:00–15:00 UTC</strong></li><li><strong>Zoom Link</strong>: <a href="https://zoom.us/j/92692239100?pwd=UmtSQzd6bXg1RHRQYnk4UUEyZkFVUT09">https://zoom.us/j/92692239100?pwd=UmtSQzd6bXg1RHRQYnk4UUEyZkFVUT09</a></li></ul><p>For all authoritative meeting logistics and Zoom links, please see the <a href="https://wiki.trustoverip.org/display/HOME/Calendar+of+ToIP+Meetings">ToIP Calendar</a>.</p><blockquote>Note: While anyone is welcome to join meetings of ToIP as an observer, only members are allowed to contribute. You can join ToIP for free <a href="https://www.trustoverip.org/get-involved/membership/">here</a>.</blockquote><h4>Join the KERI Implementer Call</h4><p>Another weekly meeting is organized every <strong>Thursday:</strong></p><ul><li><strong>NA/EU: 10:00–11:00 EDT / 14:00–15:00 UTC</strong></li><li><strong>Zoom link</strong>: <a href="https://us06web.zoom.us/j/81679782107?pwd=cTFxbEtKQVVXSzNGTjNiUG9xVWdSdz09">https://us06web.zoom.us/j/81679782107?pwd=cTFxbEtKQVVXSzNGTjNiUG9xVWdSdz09</a></li></ul><p>In contrast to the ToIP ACDC Task Force’s meeting, the implementer call focuses on the development and maintenance of the open-source projects in the <a href="https://github.com/WebOfTrust/">WebOfTrust</a> GitHub organization. As a result, the weekly Thursday meetings tend to delve deeper into technical details.</p><blockquote>Note: There is also a weekly meeting on the DID WebS Method every Friday. 
See the ToIP DID WebS Method Task Force’s homepage here: <a href="https://wiki.trustoverip.org/display/HOME/DID+WebS+Method+Task+Force">https://wiki.trustoverip.org/display/HOME/DID+WebS+Method+Task+Force</a>.</blockquote><img src="https://medium.com/_/stat?event=post.clientViewed&referrerSource=full_rss&postId=2d1724afa432" width="1" height="1" alt=""><hr><p><a href="https://medium.com/finema/the-hitchhikers-guide-to-keri-part-3-how-do-you-use-keri-2d1724afa432">The Hitchhiker’s Guide to KERI. Part 3: How do you use KERI?</a> was originally published in <a href="https://medium.com/finema">Finema</a> on Medium, where people are continuing the conversation by highlighting and responding to this story.</p>]]></content:encoded>
        </item>
        <item>
            <title><![CDATA[vLEI Demystified Part 2: Identity Verification]]></title>
            <link>https://medium.com/finema/vlei-demystified-part-2-identity-verification-519102614b8e?source=rss----55e22a76bbfc---4</link>
            <guid isPermaLink="false">https://medium.com/p/519102614b8e</guid>
            <category><![CDATA[keris]]></category>
            <category><![CDATA[self-sovereign-identity]]></category>
            <category><![CDATA[cryptography]]></category>
            <dc:creator><![CDATA[Yanisa Sunanchaiyakarn]]></dc:creator>
            <pubDate>Fri, 29 Mar 2024 10:47:55 GMT</pubDate>
            <atom:updated>2024-04-29T02:21:42.516Z</atom:updated>
            <content:encoded><![CDATA[<p><strong><em>Authors: </em></strong><em>Yanisa Sunanchaiyakarn &amp; Nuttawut Kongsuwan, Finema Co. Ltd.</em></p><p>vLEI Demystified Series:</p><ul><li><a href="https://medium.com/finema/vlei-demystified-part-1-comprehensive-overview-212349c09643">Part1: Comprehensive Overview</a></li><li><a href="https://medium.com/finema/vlei-demystified-part-2-identity-verification-519102614b8e"><strong>Part 2: Identity Verification</strong></a></li><li><a href="https://medium.com/finema/vlei-demystified-part-3-qvi-qualification-program-59b4eba308f0">Part 3: QVI Qualification Program</a></li></ul><p>This blog is the second part of the vLEI Demystified series. <a href="https://medium.com/finema/vlei-demystified-part-1-comprehensive-overview-212349c09643">Part 1</a> of the series outlines different stakeholders, their roles, and six types of credentials that are involved in the trust chain and the foundational structures of the vLEI ecosystem. This part delves deeper into the qualifications and verification procedures that persons representing these organizations have to go through prior to the issuance of vLEI credentials.</p><h3>Overview of vLEI Identity Verification</h3><p>Before participating in the vLEI ecosystem, including obtaining and issuing vLEI credentials, the <strong>representatives </strong>of all organization stakeholders must undergo rigorous identity verification processes to confirm their <strong>legal identity</strong>.</p><blockquote><strong>Note</strong>: <strong>Legal identity</strong> is defined as the basic characteristics of an individual’s identity. e.g. name, sex, place, and date of birth conferred through registration and the issuance of a certificate by an authorized civil registration authority following the occurrence of birth. 
[Ref: <a href="https://unstats.un.org/legal-identity-agenda/">https://unstats.un.org/legal-identity-agenda/</a>]</blockquote><p>With some exceptions, <strong>authorized representatives</strong> of each organization stakeholder are responsible for performing identity verification on representatives of organizations downstream within the vLEI trust chain. That is, <strong>GLEIF verifies qualified vLEI issuers (QVIs), QVIs verify legal entities (LEs), and LEs verify role representatives</strong>, as shown below.</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/0*pF6J4kmKJT9MpZjy" /><figcaption>The vLEI trust chain</figcaption></figure><blockquote><strong>Note:</strong> The authorized representatives are the persons designated by an organization (either GLEIF, a QVI, or a legal entity) to officially represent the organization.</blockquote><blockquote><strong>Note:</strong> There are two types of role representatives: official organization role (OOR) and engagement context role (ECR).</blockquote><h4>GLEIF Authorized Representatives (GARs)</h4><p>GARs are controllers of the GLEIF Root AID, GLEIF Internal Delegated AID (GIDA), and GLEIF External Delegated AID (GEDA).</p><p>As the root of trust of the vLEI ecosystem, GLEIF established an internal process to verify its GARs, as outlined in the <a href="https://www.gleif.org/vlei/introducing-the-vlei-ecosystem-governance-framework/2022-12-16_verifiable-lei-_vlei_-ecosystem-governance-framework-gleif-identifier-governance-framework_v1.0_final.pdf">GLEIF Identifier Governance Framework</a>. 
This includes:</p><ul><li>The policies and processes for the genesis events of the GLEIF Root AID, GIDA, and GEDA</li><li>A detailed identity verification process for all GARs, in which they <strong>mutually authenticate each other</strong></li><li>Contingency plans such as a designated survivor policy as well as restrictions on joint travel and in-person attendance of meetings.</li></ul><h4>Designated Authorized Representatives (DARs)</h4><p>DARs are representatives authorized to act on behalf of a Qualified vLEI Issuer (QVI) or an LE.</p><ul><li>Identity verification on a QVI’s DAR is performed <strong>by an external GAR</strong>.</li><li>Identity verification on an LE’s DAR is performed <strong>by a QAR</strong>.</li></ul><h4>Qualified vLEI Issuer Authorized Representatives (QARs)</h4><p>A QAR is a representative designated by a QVI’s DAR to carry out vLEI operations with GLEIF and LEs.</p><ul><li>Identity verification on a QAR is performed <strong>by an external GAR</strong>.</li></ul><p>This process is detailed in <a href="https://www.gleif.org/vlei/introducing-the-vlei-ecosystem-governance-framework/2023-12-15_vlei-egf-v2.0-qualified-vlei-issuer-identifier-governance-framework-and-vlei-credential-framework_v1.3_final.pdf">Qualified vLEI Issuer Identifier Governance Framework and vLEI Credential Framework</a>.</p><h4>Legal Entity Authorized Representatives (LARs)</h4><p>An LAR is a representative designated by an LE’s DAR to request the issuance and revocation of LE vLEI credentials and Role vLEI credentials.</p><ul><li>Identity verification on an LAR is performed <strong>by a QAR</strong>.</li></ul><p>This process is detailed in <a href="https://www.gleif.org/vlei/introducing-the-vlei-ecosystem-governance-framework/2023-12-15_vlei-egf-v2.0-legal-entity-vlei-credential-framework_v1.2_final.pdf">Legal Entity vLEI Credential Framework</a>.</p><h4>Role Representatives (OOR and ECR Persons)</h4><p>A role representative, either an OOR or ECR person, is designated by an LAR 
to represent an LE in an official organization role or an engagement context role, respectively. The identity verification process for a role representative depends on whether an authorization vLEI credential is used; see <a href="https://medium.com/finema/vlei-demystified-part-1-comprehensive-overview-212349c09643">Part 1</a> of the series for more detail.</p><ul><li>In the case where an authorization vLEI credential is used, identity verification on a role representative is performed <strong>by both a QAR and an LAR</strong>.</li><li>In the case where an LE issues a role vLEI credential directly without using an ECR authorization vLEI credential, identity verification of an ECR person needs to be performed <strong>by only an LAR</strong>.</li></ul><p>This process is detailed in the following documents:</p><ul><li><a href="https://www.gleif.org/vlei/introducing-the-vlei-ecosystem-governance-framework/2023-12-15_vlei-egf-v2.0-qualified-vlei-issuer-authorization-vlei-credential-framework_v1.2_final.pdf">Qualified vLEI Issuer Authorization vLEI Credential Framework</a></li><li><a href="https://www.gleif.org/vlei/introducing-the-vlei-ecosystem-governance-framework/2023-12-15_vlei-egf-v2.0-legal-entity-official-organizational-role-vlei-credential-framework_v1.2_final.pdf">Legal Entity Official Organizational Role vLEI Credential Framework</a></li><li><a href="https://www.gleif.org/vlei/introducing-the-vlei-ecosystem-governance-framework/2023-12-15_vlei-egf-v2.0-legal-entity-engagement-context-role-vlei-credential-framework_v1.2_final.pdf">Legal Entity Engagement Context Role vLEI Credential Framework</a></li></ul><h3>Identity Verification Processes</h3><p>The identity verification process of all representatives in the vLEI ecosystem includes two subprocesses, namely:</p><ul><li><strong>Identity Assurance Process</strong>, which verifies the veracity and existence of a legal identity and binds the legal identity to a representative</li><li><strong>Identity Authentication 
Process</strong>, which binds the representative to an autonomic identifier (AID)</li></ul><p>Once the identity verification process is completed, a vLEI credential may be subsequently issued to the AID that has been bound to the representative.</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/992/0*_pigqtHW_QChByEQ" /><figcaption>Illustration of the identity verification process</figcaption></figure><h3>Identity Assurance</h3><p>The first stage of identity verification for the vLEI ecosystem is called identity assurance. This step involves an identity proofing process to verify the legal identities of all individuals prior to obtaining vLEI credentials. The vLEI Ecosystem Governance Framework (EGF) requires that identity assurance be performed according to <strong>Identity Assurance Level 2 (IAL2)</strong> as defined in <a href="https://pages.nist.gov/800-63-3/sp800-63a.html">NIST SP 800–63A</a>.</p><p>The National Institute of Standards and Technology (NIST) standardized the identity proofing process in their Special Publication (SP) 800–63A. Although originating in the United States, SP 800–63A is one of the most influential standards for identity proofing and is widely referenced by various industries and governments worldwide.</p><h4>Identity Assurance Level</h4><p>NIST SP 800–63A has categorized the degrees of assurance in one’s identity into three levels:</p><ul><li>Identity Assurance Level 1 (IAL1): The service provider is not required to validate or link the applicant’s self-asserted attributes to their real-life identity.</li><li><strong>Identity Assurance Level 2 (IAL2): Either remote or physical identity proofing is required at this level. The applicant’s submitted evidence supports their real-world identity and verifies that the applicant is accurately linked to this identity.</strong></li><li>Identity Assurance Level 3 (IAL3): Physical presence is required for the identity proofing process at this level. 
Identifying attributes must be verified by an authorized and trained service provider representative.</li></ul><p>Only IAL2 is relevant in the context of the vLEI EGF.</p><h4>Identity Resolution, Validation, and Verification</h4><p>Identity proofing in NIST SP 800–63A consists of three main components, namely:</p><ul><li><strong>Identity Resolution</strong>: a process for uniquely distinguishing an individual within a given context.</li><li><strong>Identity Validation</strong>: a process for determining the authenticity, validity, and accuracy of the identity evidence.</li><li><strong>Identity Verification</strong>: a process for establishing a linkage between the claimed identity and the person presenting the identity evidence.</li></ul><p>For example, an applicant for a vLEI credential could present a verifier with a set of required identity evidence. The verifier must <strong>resolve</strong> the applicant’s legal identity and <strong>validate</strong> that the presented information on the collected evidence is legitimate. Validation may involve confirming the information with an authoritative source and determining that there is no alteration to the images and data of the presented evidence. Subsequently, the verifier may <strong>verify</strong> the applicant by comparing the applicant’s live image with the one displayed on the provided identity evidence.</p><h4>Identity Evidence Collection</h4><p>During identity resolution and validation, the collection of “<strong>identity evidence</strong>” is required to establish the uniqueness of the individual’s identity.</p><blockquote><strong>Note</strong>: Identity evidence is defined as information or documentation provided by the applicant to support the claimed identity. Identity evidence may be physical (e.g. 
a driver’s license) or digital.</blockquote><p>To comply with IAL2, one of the following sets of identity evidence must be collected:</p><ul><li>a piece of STRONG or SUPERIOR evidence, if the evidence’s issuing source confirmed the claimed identity by collecting at least two forms of SUPERIOR or STRONG evidence and the service provider validates the evidence directly with that source; <strong>OR</strong></li><li>two pieces of STRONG evidence; <strong>OR</strong></li><li>one piece of STRONG evidence plus two pieces of FAIR evidence</li></ul><p>NIST SP 800–63A defines five tiers of identity evidence strength: UNACCEPTABLE, WEAK, FAIR, STRONG, and SUPERIOR. While the strength of specific identity evidence, e.g., a driver’s license, may vary across jurisdictions, NIST provides <a href="https://pages.nist.gov/800-63-3-Implementation-Resources/63A/resolution/">examples of common evidence</a> and their estimated strength, based on their general quality characteristics, for instance:</p><ul><li>SUPERIOR: passports and permanent resident cards</li><li>STRONG: driver’s licenses and U.S. military ID cards</li><li>FAIR: school ID cards and credit/debit cards</li></ul><p>Further details on the strengths of identity evidence can be found in <em>Section 5.2.1</em> of NIST SP 800–63A.</p><h3>Identity Authentication</h3><p>After completing identity assurance, an organization representative who applies for a vLEI credential may proceed to identity authentication, which establishes a connection between the representative — whose legal identity has been assured to meet IAL2 — and an <a href="https://medium.com/finema/keri-jargon-in-a-nutshell-part-1-fb554d58f9d0#7344">autonomic identifier (AID)</a>.</p><p>Once such a connection has been established, a vLEI credential can be issued to the AID with confidence that the representative is the sole controller of the AID. 
Subsequently, the representative may cryptographically prove their control over the AID and the issued vLEI credential.</p><blockquote><strong>Note</strong>: An autonomic identifier (AID) is a persistent self-certifying identifier (SCID) that is derived and managed by cryptographic means without reliance on any centralized entity or distributed ledger technology.</blockquote><h4>Credential Wallet Setup</h4><p>Before identity authentication can begin, a credential wallet must be set up for the organization representative. The primary roles of a credential wallet include:</p><ul><li>Creation, storage, and management of key pairs</li><li>Creation, storage, and management of AIDs</li><li>Digital signature creation and verification</li></ul><p>The specification for a credential wallet is detailed in <a href="https://www.gleif.org/vlei/introducing-the-vlei-ecosystem-governance-framework/2023-12-15_vlei-egf-v2.0-technical-requirements-part-1-keri-infrastructure_v1.2_final.pdf">Technical Requirements Part 1: KERI Infrastructure</a>. The credential wallet must also be used during the live <strong>Out-of-band-Introduction (OOBI) session</strong> to complete the identity authentication process.</p><blockquote><strong>Note</strong>: A credential wallet for vLEI credentials is essentially <strong>a KERI-compatible identity wallet</strong>. 
It must be compliant with three specifications currently being developed under the Trust Over IP (ToIP) Foundation, namely, the <a href="https://trustoverip.github.io/tswg-keri-specification/">KERI</a>, <a href="https://trustoverip.github.io/tswg-acdc-specification/#go.draft-ssmith-acdc.html">ACDC</a>, and <a href="https://trustoverip.github.io/tswg-cesr-specification/">CESR</a> specifications.</blockquote><h4>Out-of-band-Introduction (OOBI) Protocol</h4><p>The identity authentication process is implemented using the <strong>Out-of-band-Introduction (OOBI) Protocol</strong>, which is defined in the <a href="https://trustoverip.github.io/tswg-keri-specification/#out-of-band-introduction-oobi">ToIP Key Event Receipt Infrastructure (KERI) specification</a>. The OOBI protocol provides a discovery mechanism for verifiable information related to an AID — including its key event log (KEL) and its service endpoint — by associating the AID with a URL.</p><p>For example, an AID EaU6JR2nmwyZ-i0d8JZAoTNZH3ULvYAfSVPzhzS6b5CM may provide a service endpoint at <a href="http://www.example.com">www.example.com</a> with an <strong>OOBI URL</strong> of</p><pre>http://www.example.com/oobi/EaU6JR2nmwyZ-i0d8JZAoTNZH3ULvYAfSVPzhzS6b5CM</pre><p>The OOBI protocol is “out-of-band” in that it enables ordinary internet and web-search infrastructure to discover information that is then verified using the “in-band” KERI protocol. 
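The AID-to-URL association shown above is simple enough to express in a few lines of Python. This is an illustrative sketch only; the helper names are my own and are not taken from any KERI library:

```python
from urllib.parse import urlparse

def oobi_url(endpoint: str, aid: str) -> str:
    """Associate an AID with a service endpoint as an OOBI URL."""
    return f"{endpoint.rstrip('/')}/oobi/{aid}"

def aid_from_oobi(url: str) -> str:
    """Recover the AID component from an OOBI URL."""
    parts = urlparse(url).path.strip("/").split("/")
    if len(parts) < 2 or parts[0] != "oobi":
        raise ValueError("not an OOBI URL")
    return parts[1]

aid = "EaU6JR2nmwyZ-i0d8JZAoTNZH3ULvYAfSVPzhzS6b5CM"
url = oobi_url("http://www.example.com", aid)
assert url == "http://www.example.com/oobi/EaU6JR2nmwyZ-i0d8JZAoTNZH3ULvYAfSVPzhzS6b5CM"
assert aid_from_oobi(url) == aid
```

The URL itself proves nothing: whatever is fetched from it must still be verified against the AID's key event log.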
The OOBI protocol leverages the existing IP and DNS infrastructure for discovery so that a dedicated discovery network is not needed.</p><blockquote><strong>Note</strong>: The OOBI by itself is insecure, and the information discovered by the OOBI must be verified using the KERI protocol.</blockquote><p>This OOBI URL may be used to discover the AID’s KEL as well as to send messages to the AID, including sending a challenge message in a challenge-response protocol and sending a vLEI credential.</p><h4>Challenge-Response Protocol</h4><p>To establish a connection between a representative and an AID, the <a href="https://en.wikipedia.org/wiki/Challenge%E2%80%93response_authentication"><strong>challenge-response protocol</strong></a> is implemented to ensure that the representative holds the private key that controls the AID.</p><p>The verifier generates a random number as a <strong>challenge message</strong> and, using the OOBI URL as a service endpoint, delivers it to the AID controller. The AID controller then uses the private key associated with the AID to produce a digital signature over the challenge. This signature is the <strong>response</strong> to the challenge message and is returned to the verifier. Finally, the verifier verifies the response using the public key of the AID.</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/0*sjhLpCJCE-k9IGjR" /><figcaption>Illustration of a challenge-response session</figcaption></figure><h4>Man-In-the-Middle Attack</h4><p>However, there is a risk that an attacker could intercept the communication between the representative and the verifier in a man-in-the-middle (MITM) attack. 
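The challenge-response exchange described above can be sketched in a few lines of Python. For brevity, this sketch substitutes an HMAC over a shared secret for the Ed25519 signatures that real AIDs use, so the key handling is deliberately not faithful to KERI, and every name here is hypothetical:

```python
import hashlib
import hmac
import secrets

# Stand-in for the AID controller's key material. Real AIDs use an Ed25519
# key pair, with the public key obtainable from the AID's key event log (KEL).
controller_key = secrets.token_bytes(32)

def sign_response(key: bytes, challenge: bytes) -> bytes:
    """Controller side: answer the verifier's challenge."""
    return hmac.new(key, challenge, hashlib.sha256).digest()

def verify_response(key: bytes, challenge: bytes, response: bytes) -> bool:
    """Verifier side: check that the response matches the challenge."""
    expected = hmac.new(key, challenge, hashlib.sha256).digest()
    return hmac.compare_digest(expected, response)

challenge = secrets.token_bytes(16)                  # verifier's random nonce
response = sign_response(controller_key, challenge)  # controller's answer
assert verify_response(controller_key, challenge, response)
```

A fresh random challenge per session is what prevents an attacker from replaying an earlier response.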
Here, the attacker obtains the authentic OOBI URL, which contains the representative’s AID, and sends a false OOBI URL, which instead contains the attacker’s AID, to the verifier.</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/0*FVokOx_cJ8kc8gte" /><figcaption>Illustration of a challenge-response session that is intercepted by a man-in-the-middle (MITM) attack.</figcaption></figure><h4>Real-time OOBI Session</h4><p>To mitigate the risk of an MITM attack, the vLEI EGF specifies an authentication process called a <strong>real-time OOBI session</strong>, which a representative and their verifier must complete before the issuance of a vLEI credential.</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/0*Vl19fv3M_JywlKKp" /><figcaption>An illustration of an OOBI session</figcaption></figure><p>During a real-time OOBI session, the representative and the verifier must organize a real-time <strong>in-person</strong> or a <strong>virtual face-to-face</strong> meeting, e.g., using a Zoom call. For a virtual face-to-face meeting, there are extra requirements as follows:</p><ul><li>The meeting must be <strong>continuous</strong> and <strong>uninterrupted</strong> throughout the entire OOBI session.</li><li>Both <strong>audio</strong> and <strong>video feeds</strong> of all participants must be active throughout the entire OOBI session.</li></ul><p>The OOBI session consists of the following steps:</p><p>1) The identity verifier performs <strong>manual</strong> verification of the representative’s legal identity, which was previously verified during the identity assurance process. 
For example, if the representative had provided a passport as their identity evidence during identity assurance, they may present the passport to the verifier once again during their live session.</p><p>2) After the verifier confirms that the evidence is accurately associated with the representative present in the meeting, they must exchange their AIDs through an out-of-band channel. For example, OOBI URLs can be shared in the live chat of a Zoom call or via QR codes shown in the video feeds.</p><p>3) The verifier sends a unique challenge message to cryptographically authenticate the representative’s AID.</p><p>4) The representative uses the private key associated with the AID to sign and respond to the challenge.</p><p>5) The verifier verifies the response using the public key obtained from the AID’s key event log (KEL).</p><p>6) The challenge-response protocol is repeated where the representative is now the challenger and the verifier the responder.</p><h4>Group Real-time OOBI Session for QARs and LARs</h4><p>For issuance of QVI and LE vLEI credentials, all QARs and all LARs of the candidate QVI and LE, respectively, must be present in the real-time OOBI session.</p><ul><li>For the issuance of a QVI vLEI credential, <strong>2 External GARs</strong> and <strong>at least 3 QARs</strong> must be present during the real-time OOBI session.</li><li>For the issuance of an LE vLEI credential, <strong>1 QAR</strong> and <strong>at least 3 LARs</strong> must be present during the real-time OOBI session.</li></ul><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/0*vjKfEVpf0CzOk2x2" /><figcaption>An example authentication process of LARs by QARs</figcaption></figure><p>Once the authentication steps are completed, the identity verifier can sign the vLEI credential for issuance to the representatives of the candidate organizations. However, the vLEI credential issuance process cannot be completed by a single representative. 
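Joint approval of this kind can be modeled as a weighted signing threshold, in the spirit of KERI's fractional multi-signature thresholds. The sketch below is illustrative only; the names and weights are hypothetical and not taken from the vLEI EGF:

```python
from fractions import Fraction

def meets_threshold(weights: dict, signed: set, threshold=Fraction(1)) -> bool:
    """Check whether the combined weight of the signers who have signed
    reaches the required threshold."""
    total = sum((w for name, w in weights.items() if name in signed), Fraction(0))
    return total >= threshold

# Hypothetical QVI with three QARs, each weighted 1/2: any two of the
# three together reach a threshold of 1, but no single QAR can.
qar_weights = {"QAR-1": Fraction(1, 2), "QAR-2": Fraction(1, 2), "QAR-3": Fraction(1, 2)}
assert not meets_threshold(qar_weights, {"QAR-1"})
assert meets_threshold(qar_weights, {"QAR-1", "QAR-3"})
```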
To meet the required weight threshold of the multi-signature scheme stated in the vLEI EGF, another representative in control of the issuer’s AID must add their authority and approve the vLEI issuance. For instance, a QAR may perform the required verification processes on all LARs of an LE and initiate the issuance of an LE vLEI credential. At least one other QAR must review and approve the issuance.</p><h3>Conclusion</h3><p>While the verification processes that applicants must complete before obtaining vLEI credentials might appear rather involved, these thorough steps to verify the identity of individuals representing organizations are crucial to safeguarding against identity theft, impersonation, and other fraudulent activities across various industries. The <strong>identity assurance</strong> process validates the legal identities of representatives, and the <strong>identity authentication</strong> process cryptographically associates those representatives with AIDs. Once identity verification is complete, vLEI credentials may be issued with confidence, maintaining the integrity and reliability of the vLEI ecosystem.</p><img src="https://medium.com/_/stat?event=post.clientViewed&referrerSource=full_rss&postId=519102614b8e" width="1" height="1" alt=""><hr><p><a href="https://medium.com/finema/vlei-demystified-part-2-identity-verification-519102614b8e">vLEI Demystified Part 2: Identity Verification</a> was originally published in <a href="https://medium.com/finema">Finema</a> on Medium, where people are continuing the conversation by highlighting and responding to this story.</p>]]></content:encoded>
        </item>
    </channel>
</rss>