<?xml version="1.0" encoding="UTF-8"?><rss xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:atom="http://www.w3.org/2005/Atom" version="2.0" xmlns:cc="http://cyber.law.harvard.edu/rss/creativeCommonsRssModule.html">
    <channel>
        <title><![CDATA[Caribou Digital - Medium]]></title>
        <description><![CDATA[Caribou Digital: building ethical inclusive digital economies - Medium]]></description>
        <link>https://medium.com/caribou-digital?source=rss----be144764b2ed---4</link>
        <image>
            <url>https://cdn-images-1.medium.com/proxy/1*TGH72Nnw24QL3iV9IOm4VA.png</url>
            <title>Caribou Digital - Medium</title>
            <link>https://medium.com/caribou-digital?source=rss----be144764b2ed---4</link>
        </image>
        <generator>Medium</generator>
        <lastBuildDate>Wed, 01 Apr 2026 06:43:18 GMT</lastBuildDate>
        <atom:link href="https://medium.com/feed/caribou-digital" rel="self" type="application/rss+xml"/>
        <webMaster><![CDATA[yourfriends@medium.com]]></webMaster>
        <atom:link href="http://medium.superfeedr.com" rel="hub"/>
        <item>
            <title><![CDATA[The State of Digital ID — DICE 2025 (Zurich)]]></title>
            <link>https://medium.com/caribou-digital/the-state-of-digital-id-dice-2025-zurich-0f35491a2e5d?source=rss----be144764b2ed---4</link>
            <guid isPermaLink="false">https://medium.com/p/0f35491a2e5d</guid>
            <category><![CDATA[digital-wallet]]></category>
            <category><![CDATA[migration]]></category>
            <category><![CDATA[digital-identity]]></category>
            <category><![CDATA[european-commission]]></category>
            <dc:creator><![CDATA[Dr. Emrys Schoemaker]]></dc:creator>
            <pubDate>Fri, 07 Mar 2025 14:18:42 GMT</pubDate>
            <atom:updated>2025-03-07T14:18:42.025Z</atom:updated>
            <content:encoded><![CDATA[<h3>The State of Digital ID — DICE 2025 (Zurich)</h3><figure><img alt="" src="https://cdn-images-1.medium.com/max/766/1*Io6SfWQC15ujrjuYUFJiqg.png" /></figure><p>Digital wallets remain the cutting edge of innovation in the field of digital identity — and Europe is leading the way in terms of policy development, technical innovation, and ecosystem building. On Tuesday the European Commission released the latest draft of the <a href="https://eu-digital-identity-wallet.github.io/eudi-doc-architecture-and-reference-framework/latest/architecture-and-reference-framework-main/">Architecture Reference Framework (ARF)</a>, the technical guidance for the standards and protocols that European digital wallets (EUDI) are expected to follow. But the ARF is <em>not</em> legislation, which creates complications — and opportunities — for policymakers, developers, and issue advocates.</p><p>At the <a href="https://lu.ma/DICEecosystems?coupon=RWX2NO">Digital unConference Europe</a> (DICE), the conversations and sessions were dominated by the EU wallet and its implications. The conference’s focus on building “ecosystems” is an important reminder that <a href="https://medium.com/caribou-digital/the-difference-between-digital-identity-identification-and-id-41580bbb7563">digital ID is not a thing or a consumable product</a>, but a tool people use to achieve transaction goals. The key to the value of a digital ID scheme is what you can <em>do</em> with it. And key to what you can do with a digital wallet is who will trust its credentials and rely on it as proof of the claim that you make (that you are who you say you are and can do what you say you can do). 
This tripartite structure of issuers, holders, and relying parties is the core of an identification ecosystem.</p><p><strong>Privacy and security: </strong>The EU’s digital wallet is one of the most cutting-edge efforts to implement societal-scale digital identity systems — and importantly, one that aims to put users in control of their data. Yet the EUDI is far from perfect, and the tension between political imperatives and technical design remains. The reliance on government-issued personal identification data as the EUDI’s foundational credential has raised concerns about the potential for surveillance through linking usage of the credentials, and thus revealing the identity and/or behaviors of the holder. For example, usage of the credential to prove age might reveal a taste for alcohol, or specific bar or alcohol preferences. As DICE participants noted, <a href="https://www.linkedin.com/posts/viky-manaila-%F0%9F%92%AF-0690aa1_arf-eidas-digitalwallet-activity-7302800180255350784-4egH?utm_source=share&amp;utm_medium=member_desktop&amp;rcm=ACoAAACNrMoBVJwvORPlthLv_lXtpRTGBvHXfK8">the ARF 1.6 has measures on privacy</a>, but has not yet resolved the issue of unlinkability and thus traceability.</p><p>The exclusion of cryptographic technologies such as Zero Knowledge Proofs (which are not (yet) included in the list of approved electronic signature technologies) also means that the current specification for the wallet excludes the use of technologies that could mitigate these concerns — an exclusion that persists in the latest ARF. These specifications are critical in determining how user-centric and protective the EUDI wallet will be. They will also be significant for the wallet’s wider adoption and interoperability — the Swiss specification requires the possibility of unlinkability, for example.</p><p><strong>Governance vs. 
technology:</strong> One significant theme of the conference was the tension between relying on governance and regulation and relying on technological solutions, particularly in regard to privacy and protection. The current ARF limits the use of advanced cryptographic technologies, and so relies on regulation to prevent issuers from encoding trackers into credentials. This is a weak form of protection. What happens if those rules are no longer followed? Also, what happens when the wallet is adopted in places where the rule of law is weak or authorities surveil users?</p><p><strong>Cross-border ecosystems: </strong>The question of ecosystem and standards is also significant when thinking about the EUDI in the context of cross-border usage and as part of a global ecosystem of digital identification. The EUDI wallet is currently designed for European citizens and legal residents. While a core use case is intra-European travel, the wallet’s potential is far broader than that. There has already been discussion about its potential to <a href="https://ecdpm.org/work/tech-sovereignty-and-new-eu-foreign-economic-policy">complement European foreign policy</a> and for it to exert <a href="https://www.globalpolicyjournal.com/blog/19/09/2024/digital-public-infrastructure-sovereignty-what-european-approach-dpi-might-look">an infrastructural “Brussels effect.”</a> This is not just idle speculation — <a href="https://www.biometricupdate.com/202502/georgia-seeks-candidates-to-build-digital-id-wallet">Georgia is already developing its own digital identity wallet using the European framework</a>. 
But making the European wallet function as part of a global ecosystem will require thinking through how to answer cross-border and migration questions such as:</p><ul><li>How would a European hospital trust a digital credential claiming medical expertise when hiring a doctor from outside the EU?</li><li>How would a university trust digital educational credentials issued by a non-European academic institution?</li></ul><p>Currently, these verifications are often manual processes — a phone call is made to the issuing medical institution or university, or the individual has to pass national tests to requalify. Digitalizing this broader ecosystem could enable easier migration and access to employment and education.</p><p>Digital identity is an ecosystem, one in which issuers, holders, and verifiers are the core actors. Building out these trust frameworks is key to realizing the potential — and mitigating the risks — of innovation in digital identification. If the EUDI wallet is to be an enabler of cross-border movement and a supporter of the migration and expertise that Europe needs, these challenges must be addressed.</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/0*HskOv1hoWhCu1xG5" /></figure><img src="https://medium.com/_/stat?event=post.clientViewed&referrerSource=full_rss&postId=0f35491a2e5d" width="1" height="1" alt=""><hr><p><a href="https://medium.com/caribou-digital/the-state-of-digital-id-dice-2025-zurich-0f35491a2e5d">The State of Digital ID — DICE 2025 (Zurich)</a> was originally published in <a href="https://medium.com/caribou-digital">Caribou Digital</a> on Medium, where people are continuing the conversation by highlighting and responding to this story.</p>]]></content:encoded>
        </item>
        <item>
            <title><![CDATA[Digitalization ‘for all’?]]></title>
            <link>https://medium.com/caribou-digital/digitalization-for-all-fb86e03ad679?source=rss----be144764b2ed---4</link>
            <guid isPermaLink="false">https://medium.com/p/fb86e03ad679</guid>
            <dc:creator><![CDATA[Thea Kirsch]]></dc:creator>
            <pubDate>Wed, 04 Dec 2024 11:59:20 GMT</pubDate>
            <atom:updated>2024-12-04T12:14:29.108Z</atom:updated>
            <content:encoded><![CDATA[<h3>Digitalization ‘for all’? An Update on Germany’s Public Sector from the 2024 Smart Country Convention</h3><p><em>As governments around the world step up to regulate artificial intelligence and biometrics, a different class of technology aims to disrupt digital identity: digital wallets. Caribou is leading a research project that aims to advance migrants’ identification needs in digital identity wallet policy and technology design. </em><a href="https://www.cariboudigital.net/identity-and-migration/"><em>Explore the project</em></a><em>.</em></p><p>Since 2018, the Smart Country Convention has set out to bring together Germany’s digital economy and political decision-makers, as well as other stakeholders in the ever-growing field of public sector digital transformation.</p><p>The event is hosted by Messe Berlin, the Berlin Expo Center, and bitkom e.V., one of Germany’s largest industry associations for the digital economy. According to <a href="https://www.smartcountry.berlin/en/concept/facts-and-figures/">its website</a>, the Smart Country Convention — or SCCON for short — has become “the biggest event for smart cities, smart regions and e-government” in Germany. 
Indeed, with more than 18,000 attendees, over 17,000 square meters of exhibition space and a conference program featuring more than 650 speakers on 7 stages, the event is a large-scale, upbeat demonstration of what the future of the German public sector might look like — or at least what technology vendors, policymakers and other professionals in this field imagine it will.</p><p>In between ping-pong matches, live podcast recordings, and free frozen yoghurt, keynote speakers and panelists discussed the current state of digital transformation in the German public sector. They shed light on challenges, best practices and how to work towards Germany’s policy priorities for digital transformation, as outlined in the <a href="https://digitalstrategie-deutschland.de/static/eb25ff71f36b8cf2d01418ded8ae3dc2/Digitalstrategie_EN.pdf">2022 Digital Strategy</a> of the now defunct <a href="https://www.bbc.com/news/articles/c7v3r046pzzo">Traffic Light Coalition</a>. These insights also revealed the stark discrepancy between the image projected by SCCON and where the country actually stands. Take the <a href="https://www.youtube.com/watch?v=fRtjBQ5YSaQ">opening keynote by Kai Wegner</a>, the governing mayor of Berlin, on the second day of SCCON. 
He announced that Berliners can now register/change their address online — if they have downloaded the AusweisApp, opened a BundID account, have an NFC-enabled smartphone or special card reader, and either the German national ID card, the eID card for citizens of the European Union and the European Economic Area, or an electronic residence card with an active eID function (and remember the PIN code they need for all of this to work).</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/1*ZBUwL3-sn1oM2yjukmg32w.png" /><figcaption>October 16, 2024, Day 2 Opening Keynote.</figcaption></figure><p>The excitement and sense of digital awakening evoked by Wegner’s keynote, set against a backdrop of flashing lights and cinema-size screens, illustrates the immense undertaking Germany has ahead of it. So far, the country <a href="https://www.bundesrechnungshof.de/SharedDocs/Downloads/DE/Berichte/2023/onlinezugangsgesetz-volltext.pdf?__blob=publicationFile&amp;v=2">has fallen far short of its stated goal of digitizing access to nearly 600 public services by 2022.</a> On the day of the deadline, only 19% of these services were available online. <a href="https://insm.de/aktuelles/publikationen/digitale-transformation-deutschland-scheitert-beim-e-government">By the beginning of 2024, only 153 services had gone online nationwide.</a> In terms of the overall target, this represents an implementation rate of just 26.6% — a full year after the original deadline. 
Germany’s sluggish pace makes the city-state of Hamburg,<a href="https://dashboard.digitale-verwaltung.de/"> where 284 public services are available online</a>, the (literal) North Star of this ongoing restructuring process.</p><p><strong>At the heart of this public administration overhaul is the issue of digital identity management and how governments, whether at the federal, regional or local level, enable access to services.</strong> It’s against this backdrop that another name has emerged in the competition for the most innovative city: Wiesbaden. <a href="https://www.youtube.com/watch?v=YSLpFpihvSg&amp;list=PLudpuI-TLPa9sr2tZEOY0My7i62E7Wo82&amp;index=24">Maral Koohestanian, Deputy Mayor of Wiesbaden, was one of four panelists who discussed the state of digital identity in Germany at this year’s SCCON, moderated by Clemens Schleupner of bitkom.</a> Koohestanian, <a href="https://voltdeutschland.org/neuigkeiten/maral-koohestanian-fuehrt-volt-in-die-bundestagswahl-2025">who is now leading the Bundestag election campaign for Volt</a>, emphasized Wiesbaden’s pragmatic approach. <a href="https://webid-solutions.com/de/ressourcen/presse/egovernment-wiesbadens-politik-beschliesst-videoidentifikation-stadtweit/">Working with WebID and ekom, the city has adopted VideoID to provide access to public services.</a></p><p>According to Koohestanian, Wiesbaden’s approach is unique in Germany. Many people have already been introduced to VideoID, for example when opening a bank account. So the process is known and accepted. In comparison, the identity verification approach taken by Berlin and other cities is based on the national eID infrastructure, as mentioned above. While Germany’s national eID infrastructure covers three types of documents (i.e. the national identity card [Personalausweis], the eID card for EU/EEA citizens, the electronic residence permit), VideoID can be used to verify around 200 international identity documents. 
VideoID is therefore open to more users, not least because it requires less specialized equipment.</p><p>During the panel discussion, Koohestanian underlined the advantages VideoID offers in terms of inclusiveness and overall user acceptance. <a href="https://www.youtube.com/watch?v=YSLpFpihvSg&amp;list=PLudpuI-TLPa9sr2tZEOY0My7i62E7Wo82&amp;index=24">The numbers she presented</a> seem to speak for themselves: When registering a wedding date at the Wiesbaden registry office, 98% opted to do so online, compared to 2% of people who preferred to register in person (in all cases, the ceremony itself takes place in person). Of the online users, 92.5% chose VideoID, compared to 7.5% who preferred the eID option. If people want to register/change their address online, <a href="https://www.wiesbaden.de/medien/rathausnachrichten/PM_Zielseite.php?showpm=true&amp;pmurl=https://www.wiesbaden.de/guiapplications/newsdesk/publications/Landeshauptstadt_Wiesbaden/141010100000474138.php">in Wiesbaden this service is for now carried out exclusively via VideoID.</a></p><p>Koohestanian’s emphasis foreshadowed the theme of the next day’s opening keynote by Federal Minister Lisa Paus (Family Affairs, Senior Citizens, Women and Youth). “Some people today are afraid that the digitalization train could leave without them,” <a href="https://www.youtube.com/watch?v=WU6Prs2rOdo&amp;list=PLudpuI-TLPa-wn8ynMFrNZCEFGBL60Pga&amp;index=57">Paus said in her speech</a>, stressing the need to strengthen inclusion, participation and accessibility of the digital transformation in Germany. While focusing on the inclusion of the elderly, Paus raised the question of how to ensure that digitalization is for all. 
As important as this question is, however, it first requires a definition of what ‘for all’ means.</p><p><strong>Looking at the emerging digital identity ecosystem in Germany, ‘all’ currently seems to include only those who have access to the country’s existing eID infrastructure</strong>, i.e. people with German citizenship, EU citizens living in Germany, and non-EU citizens who have successfully applied for a residence permit. This may sound like Germany has all its citizens and residents covered. But the situation is more complicated than it might seem at first glance. Take, for example, non-EU citizens who come to Germany to take up a job offer. <a href="https://www.bamf.de/EN/Themen/MigrationAufenthalt/ZuwandererDrittstaaten/Migrathek/Einreisebestimmungen/einreisebestimmungen-node.html">If they enter on a work visa, this permit is usually valid for up to 12 months.</a> During this time, they will have to deal with the German bureaucracy, especially the immigration authorities at their place of residence to apply for and receive their electronic residence permit.</p><p>At present, this interaction can be a largely paper-based, face-to-face process, depending on the local authority, <a href="https://www.youtube.com/watch?v=cIE-jsSUZ3g&amp;list=PLudpuI-TLPa9sr2tZEOY0My7i62E7Wo82&amp;index=54">a topic that was the subject of another panel discussion at SCCON.</a> During that discussion, Engelhard Mazanke, head of the Berlin Landesamt für Einwanderung, the city-state’s immigration authority, explained that practices vary widely across Germany. Not every immigration authority accepts documents as PDFs. Often they must be submitted in the original, perhaps even with an apostille. Mazanke criticized the fact that migration-related procedures are always dealt with on the assumption of potential misuse, even when processing applications for what he described as the ‘Rolls-Royce’ category of all residence permits, the EU Blue Card. 
Opportunities for digitizing processes and simplifying bureaucracy are limited if this is the default attitude towards non-Germans, he explained.</p><p><a href="https://netzpolitik.org/2024/wiesbaden-und-koeln-bedenkliches-videoident-verfahren-soll-gang-aufs-amt-sparen/">Concerns about misuse are also a reason for criticism of the use of VideoID in Wiesbaden.</a> The German eID infrastructure, on the other hand, is regarded as highly secure. In the context of eIDAS 2.0, it provides the only means of identification suitable for verifying the identity of Germany-based EUDI wallet users in accordance with LoA high requirements. At the SCCON panel on digital identity, the representative of the German Federal Ministry of the Interior raised the issue of the reliability of identity systems outside Europe when asked how people in Germany who are outside the national eID infrastructure could be integrated into the country’s emerging wallet ecosystem. About an hour later, on another stage, <a href="https://www.youtube.com/watch?v=cIE-jsSUZ3g&amp;list=PLudpuI-TLPa9sr2tZEOY0My7i62E7Wo82&amp;index=54">Tobias Lindner, Minister of State at the Federal Foreign Office, emphasized the need for attractive migration processes and for Germany to present itself as welcoming and service-oriented to new residents and citizens.</a></p><p>As such, SCCON brought to the fore another challenge for Germany’s digital transformation efforts. Not only is there a tension between inclusion and security concerns. The messages from government officials and policymakers tell different stories, illustrating the fragmentation of political decision-making and implementation processes. 
<strong>Given the centrality of digital identity to Germany’s vision of modern statehood, it would certainly be a good topic for the country to start speaking with one voice.</strong></p><img src="https://medium.com/_/stat?event=post.clientViewed&referrerSource=full_rss&postId=fb86e03ad679" width="1" height="1" alt=""><hr><p><a href="https://medium.com/caribou-digital/digitalization-for-all-fb86e03ad679">Digitalization ‘for all’?</a> was originally published in <a href="https://medium.com/caribou-digital">Caribou Digital</a> on Medium, where people are continuing the conversation by highlighting and responding to this story.</p>]]></content:encoded>
        </item>
        <item>
            <title><![CDATA[Meeting the target and missing the point: Putting society at the center of digital public…]]></title>
            <link>https://medium.com/caribou-digital/meeting-the-target-and-missing-the-point-putting-society-at-the-center-of-digital-public-ab0552e5a5f2?source=rss----be144764b2ed---4</link>
            <guid isPermaLink="false">https://medium.com/p/ab0552e5a5f2</guid>
            <category><![CDATA[digital-development]]></category>
            <category><![CDATA[measurement]]></category>
            <category><![CDATA[digital-transformation]]></category>
            <category><![CDATA[impact]]></category>
            <category><![CDATA[idp]]></category>
            <dc:creator><![CDATA[Dr. Emrys Schoemaker]]></dc:creator>
            <pubDate>Wed, 06 Nov 2024 11:27:51 GMT</pubDate>
            <atom:updated>2024-11-06T15:13:00.600Z</atom:updated>
<content:encoded><![CDATA[<h3>Meeting the target and missing the point: Putting society at the center of digital public infrastructure</h3><p><em>Written by </em><a href="https://www.linkedin.com/in/jessosborn/"><em>Jessica Osborn</em></a><em> — CEO, </em><a href="https://www.linkedin.com/in/emrysschoemaker/"><em>Emrys Schoemaker</em></a><em> — Senior Director of Advisory &amp; Policy, and </em><a href="https://www.linkedin.com/in/niamh-barry-mel/"><em>Niamh Barry</em></a><em> — Senior Director of Measurement &amp; Impact, all at Caribou Digital.</em></p><p>Alongside this year’s World Bank and IMF Annual Meetings, and following the insightful <a href="https://www.codevelop.fund/global-dpi-summit">Co-Develop DPI Summit in Cairo</a> earlier in the month, Caribou Digital participated in several conversations on the social and economic impact of digital public infrastructure (DPI).</p><p>Together, these events demonstrated a welcome shift in the conversation toward the importance of putting people at the center of DPI’s <strong>design and implementation</strong> in order to increase adoption and use. Yet, this people-centric approach seems more nascent in discussions of DPI’s <strong>measurement and impact</strong>, which still often center on institutional efficiency and access. While these are important goals (and often the initial impetus for DPI implementation), by omitting nuanced consideration of people-level impact we risk — at best — missing an opportunity for DPI to drive more meaningful development outcomes and — at worst — DPI causing people harm. Digital transformation affects the lived experiences of citizens in very real ways, and by bringing into view goals on inclusion, agency, and empowerment, we uncover a whole range of metrics that must be considered to ensure that the impact on people’s lives is positive. 
The need to build an efficiency-based investment case for DPI should not trump the need to build the human impact case.</p><h3>DPI’s outcome problem: A “shared means to many ends”</h3><p>That people are underrepresented in the conversation on DPI measurement is symptomatic of the fact that, while there is growing consensus around the <a href="https://www.undp.org/publications/accelerating-sdgs-through-digital-public-infrastructure-compendium-potential-digital-public-infrastructure">“whole of society” approach</a> to DPI implementation, this is still nascent when it comes to measuring DPI’s impact. DPI is an emergent system that is deeply interconnected, and as such it requires a systems-level theory of change and measurement approach.</p><p>The description of DPI as “<a href="https://www.codevelop.fund/insights-1/what-is-digital-public-infrastructure">a shared means to many ends</a>” highlights the numerous possibilities of use and, therefore, the numerous potential outcomes for different actors within a given system — government, civil society, private sector, businesses, households, and individuals. These are connected actors; thus, impact and information flows are also multidirectional.</p><p>As a DPI community, we have many reasonable hypotheses (see Caribou’s illustrative examples below) but not a coherent narrative on the multitude of outcomes that DPI — in its diverse forms — could enable. A shared understanding of DPI’s potential outcomes for different system actors could unlock multi-stakeholder collaboration on the “right measures” and mitigate the risk of misalignment and diminished effectiveness. Investing time in defining outcomes is crucial, ensuring they reflect the voices and needs of all stakeholders. 
Only then can metrics that genuinely serve these outcomes be defined.</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/1*hjyzidzN9a5Zkj38LYGtlw.jpeg" /></figure><p><strong>Caribou’s illustrative examples of DPI outcomes</strong></p><figure><img alt="" src="https://cdn-images-1.medium.com/max/964/1*Kgjkw6afyTYbtTPs8r5z9Q.png" /></figure><h3>Metrics align intention and value</h3><blockquote>“When a measure becomes a target, it ceases to be a good measure.”</blockquote><blockquote><a href="https://builtin.com/data-science/goodharts-law">Goodhart’s Law</a></blockquote><p>Metrics are useful ways of measuring outcomes — but only when they are aligned with a broader understanding of potential impact. Fundamentally, outcomes are expressions of what is valued; they reflect intention and galvanize collective action around what gets measured. A focus on misaligned outcomes can have lasting and challenging real-world effects; the financial inclusion sector’s fixation on account access, exacerbated by global measurement tools like <a href="https://www.worldbank.org/en/publication/globalfindex">Findex</a>, is a case in point. The onus is on us as a development community to ensure an inclusive and “whole of society” approach to defining and measuring the changes that can result from DPI to drive genuinely inclusive and meaningful impact.</p><p>Who gets to define these outcomes is also a critical question involving power dynamics that influence whose voices and needs are prioritized. DPI is necessarily a state-driven initiative, but it implies a rearticulation of at least a triad of relationships in the social contract: between the state and individuals, between individuals and the market, and between the state and the market. 
There are power dynamics and deeply held (and sometimes contested) values underpinning all three relationships, pointing to the complexity and necessity of involving all stakeholders in finding common ground and defining outcomes that matter.</p><h3>A moment in time for DPI measurement</h3><p><a href="https://www.linkedin.com/in/cvmadhukar/">C. V. Madhukar</a> has said that we are at a <a href="https://www.rockefellerfoundation.org/insights/grantee-impact-story/co-develop-spearheads-a-global-opportunity-for-dpi-adoption/">unique moment in time</a> in digital transformation. This unique moment offers an opportunity for the development community to align key stakeholders on a common set of DPI outcomes and the right metrics to measure those outcomes. These metrics could: 1) provide a clearer understanding of the benefits DPI delivers to different groups; 2) reveal the risks of DPI, so that products and services can course-correct; and 3) enable comparisons between approaches that could help define <a href="https://dial.global/good-dpi/">“Good DPI”</a>, akin to the influential efforts to mobilize<a href="https://omidyar.com/wp-content/uploads/2020/09/All-ID-Must-be-Good-ID_Final_3.7.19.pdf"> consensus around “Good ID</a>.”</p><p>This clarity could guide funding decisions and channel resources toward solutions with the greatest potential for impact. Defining such a measurement framework requires a systems-focused theory of change that incorporates individuals, businesses, civil society actors, and governments, and that is underpinned by a critical synthesis of the existing evidence (in this regard, DIAL and Co-Develop’s forthcoming DPI Evidence Compendium is an excellent first step).</p><h3>Digital development measurement practices can show the way</h3><p>While the multifaceted nature of DPI presents a measurement challenge, we are not starting from scratch. 
As a digital development community, we have learned a great deal from measuring digital initiatives, and these form a valuable knowledge base from which to start. Some key learnings:</p><ul><li><strong>Prioritize outcomes over adoption metrics.</strong> Measurement systems reflect values and intentions, and we must prioritize outcomes tracking alongside — easily and digitally obtained — adoption tracking to ensure that decision-making extends beyond access and use. Funders and implementers should measure the change they want to see in order to drive inclusive impact.</li><li>Based on their extensive experience supporting DPI implementation, Public Digital’s call to <a href="https://public.digital/pd-insights/blog/2024/10/the-value-of-metrics-in-digital-public-infrastructure">measure value from the perspective of service users</a> is an important reminder to focus on outcomes. Building on this, we could also draw on<a href="https://en.wikipedia.org/wiki/Capability_approach"> Amartya Sen’s influential “human capabilities” approach</a>, as well as C. V. Madhukar’s emphasis on <a href="https://www.codevelop.fund/insights-1/from-government-solutions-to-societal-capabilities">societal capabilities</a> to consider outcomes from a multi-stakeholder perspective. 
To make a compelling case for DPI, it must be clear that DPI makes a real difference in the public’s lives and that there must be a swift response to any harm — something that matters to politicians, policymakers, planners, implementers, and, most importantly, people.</li><li><strong>Adopt a systems-focused, complexity-aware theory of change.</strong> DPI warrants a systems-led, complexity-aware theory of change and measurement framework informed through <a href="https://medium.com/disruptive-design/tools-for-systems-thinkers-systems-mapping-2db5cf30ab3a">system mapping</a>, <a href="https://medium.com/caribou-digital/shifting-through-the-noise-in-the-age-of-information-overload-the-power-of-evidence-synthesis-6157e5bc472a">evidence synthesis</a>, and deep and wide stakeholder consultation. As DPI is both ever-dynamic and advancing rapidly, theories of change must also evolve continuously. This approach should consider both opportunities and risks for various actors engaging with DPI. Without identifying all sides, we risk a one-sided view of impact, potentially overlooking significant risks to different stakeholders. Developing a nuanced and adaptive theory of change can support DPI to be responsive, equitable, and impactful for all involved.</li><li><strong>Embed iterative measurement within tech systems. </strong>Data on metrics can often be captured in real time using digital solutions themselves, enabling feedback loops that drive continuous improvement. 
Such <a href="https://medium.com/caribou-digital/traditional-evaluation-looks-backward-innovation-looks-forward-5bb094190e1a">cost-efficient embedded measurement</a> and <a href="https://www.betterevaluation.org/sites/default/files/2021-11/MandE_for_Adaptive_Management_WP2_What_Is_AM_%26_how_does_it_work_202009.pdf">adaptive management</a> approaches can ensure that DPI initiatives focus on delivering <a href="https://www.ucl.ac.uk/bartlett/public-purpose/publications/2024/mar/digital-public-infrastructure-and-public-value-what-public-about-dpi">public value</a> beyond deployment and adoption.</li><li><strong>Utilize a multi-method approach.</strong> Iterative measurement (above) may need to be triangulated with other instruments in order to capture all required data. <a href="https://globalfindex.worldbank.org">Findex-type</a> survey data may be required for some data points. Additionally, measurement can act as a “purpose navigator,” ensuring that deployments deliver tangible public benefit.</li></ul><h3>DPI impact at a societal scale requires collective action</h3><p>By building consensus on the outcomes that matter and metrics that measure those outcomes — particularly as they reflect the lived experiences of those impacted — DPI can support inclusive growth, empower individuals, and deliver societal-scale transformation.</p><p>The knowledge, tools, and momentum to make a real difference exist, but impact requires collective action and a shared vision.</p><p>Please reach out to Jess (<a href="mailto:jess@cariboudigital.net">jess@cariboudigital.net</a>), Emrys (<a href="mailto:emrys@cariboudigital.net">emrys@cariboudigital.net</a>), or Niamh (<a href="mailto:niamh@cariboudigital.net">niamh@cariboudigital.net</a>) if you would like to discuss our thinking further.</p><img src="https://medium.com/_/stat?event=post.clientViewed&referrerSource=full_rss&postId=ab0552e5a5f2" width="1" height="1" alt=""><hr><p><a 
href="https://medium.com/caribou-digital/meeting-the-target-and-missing-the-point-putting-society-at-the-center-of-digital-public-ab0552e5a5f2">Meeting the target and missing the point: Putting society at the center of digital public…</a> was originally published in <a href="https://medium.com/caribou-digital">Caribou Digital</a> on Medium, where people are continuing the conversation by highlighting and responding to this story.</p>]]></content:encoded>
        </item>
        <item>
            <title><![CDATA[To AI or Not to AI? Insights from the Biometrics Institute Congress, 2024]]></title>
            <link>https://medium.com/caribou-digital/to-ai-or-not-to-ai-insights-from-the-biometrics-institute-congress-2024-0e2668418bd3?source=rss----be144764b2ed---4</link>
            <guid isPermaLink="false">https://medium.com/p/0e2668418bd3</guid>
            <dc:creator><![CDATA[Keren Weitzberg]]></dc:creator>
            <pubDate>Fri, 01 Nov 2024 17:01:11 GMT</pubDate>
            <atom:updated>2024-11-01T17:01:11.230Z</atom:updated>
<content:encoded><![CDATA[<h4>Keren Weitzberg &amp; Aaron Martin</h4><p>AI is on everyone’s lips. So perhaps it’s not surprising that <a href="https://www.biometricsinstitute.org/event/biometrics-institute-congress-and-side-events-2024/">this year’s Biometrics Institute Congress</a> was filled with much hand-wringing about AI. A self-described “non-profit,” the Biometrics Institute is probably best understood as part industry lobbying group, part think tank, part research institute. Each year, it holds a conference in London, which brings together policymakers, vendors, regulators, privacy rights groups, and the occasional not-so-undercover academic (like ourselves).</p><p>Here are some of our key insights from this year’s congress:</p><h3><strong>Biometric vendors are currently debating how to define themselves vis-à-vis AI and how to engage with the current AI hype cycle</strong></h3><p>“AI” is a notoriously slippery term, so much so that scholars, civil society groups, and policymakers continue to argue over its <a href="https://carnegieendowment.org/posts/2022/10/one-of-the-biggest-problems-in-regulating-ai-is-agreeing-on-a-definition?lang=en">very definition</a>. Such debates are not purely semantic; rather, they shape how an industry is regulated, how it approaches funders and clients, how it is publicly understood, and how it understands itself. In its annual ‘<a href="https://www.biometricsinstitute.org/state-of-biometrics-report-2024/">state of the industry’ report</a>, the Biometrics Institute tackles this question, asking <a href="https://x.com/BiometricsInst/status/1852365480574730340">“To AI or not to AI”</a>? (We would attempt to summarize the key points for blog readers, but alas, it is proprietary knowledge available only to paid members…) At the Congress, panelists and keynote speakers raised what seemed almost existential and ontological questions. 
One session, for example, was entitled “What is the relationship between AI and biometrics?”</p><p>From one perspective, the relationship between AI and biometrics may seem obvious: AI is expected to enhance the functionality of identity checks. Thanks to machine learning and artificial neural networks, biometric systems have become <a href="https://www.linkedin.com/pulse/artificial-intelligence-biometrics-irreversible-couple-veridas-yfhtf/">far more accurate and precise </a>in recent years, improving their technical performance and increasing their spread and market share. More recently, generative AI is posing new risks and vulnerabilities for the sector, including sophisticated forms of synthetic identity fraud, such as <a href="https://pmc.ncbi.nlm.nih.gov/articles/PMC5362102/">face morphing</a>, which has the <a href="https://www.biometricupdate.com/202409/fraunhofer-develops-a-system-to-address-face-morphing-attacks-in-border-control">potential</a> to disrupt the security of the travel sector. OpenAI’s release of its real-time voice API, for example, has renewed alarms about fraudsters <a href="https://x.com/BiometricUpdate/status/1851739442547224825">circumventing voice recognition software</a>.</p><p>But AI is also part of a powerful tech and regulatory imaginary. At the Congress, AI seemed less a technology (or set of technologies) to be adopted than a loaded, polyvalent term to be contended with. 
Generating a kind of “hyperreality,” AI seems to be everything and nothing at once, invoking both dystopian and utopian futures, producing vociferous proponents, equally vocal detractors, and a growing (if sometimes quieter and more measured) group of skeptics.</p><p>The biometrics industry has long been anxious about its public reputation, particularly in the wake of a string of controversies over <a href="https://amnesty.ca/features/racial-bias-in-facial-recognition-algorithms/#:~:text=Facial%20recognition%20is%20less%20accurate,positive%20matches%20in%20image%20databases.">encoded racism</a> within facial recognition algorithms and in light of mounting resistance to the use of biometric systems for policing, migration control, and <a href="https://privacyinternational.org/long-read/4528/biometrics-collection-under-pretext-counter-terrorism">state repression</a>. A <a href="https://www.biometricsinstitute.org/congress-to-tackle-risks-of-ai-and-facial-recognition/">recent industry survey</a> by the Biometrics Institute revealed concerns that public mistrust of AI would spill over into the sector, further inflaming public opinion: “A significant 80% of respondents believe public opinion on AI will directly impact their views on biometrics. This highlights the need to address public concerns about AI to build trust in biometric applications.” This is one of many reasons why vendors and clients may seek to distance themselves from the AI moniker, or at least carefully navigate how they relate to the capacious and ambiguous term.</p><h3><strong>Regulators and industry are not necessarily at odds with one another</strong></h3><p>The Biometrics Institute is a space of engagement — one where regulators and industry can speak productively and where regulators can make a case for compliance. 
This contrasts with common understandings of the relationship between regulators and business, in which regulators are assumed to be adversarial and mistrustful of industry operators.</p><p>Several representatives of governmental regulatory bodies were in attendance at this year’s Congress, including the UK’s Information Commissioner’s Office (ICO). Other key public stakeholders in attendance were the EU’s DG CNECT, which oversees the EU’s AI Act, and the Office of Privacy and Civil Liberties at the US Department of Justice. The tone struck in their presentations was one of cooperation and assistance. As John Edwards, UK Information Commissioner, told the audience, “we are on the same side.” Emphasizing that the UK is a space where biometric technologies can flourish, he argued that regulators can help industry: “We want you to be able to use biometric data that add value to society and protect people’s privacy.”</p><p>This is not necessarily a story of regulatory capture, but it does speak to the way that industry and regulatory bodies are actively shaping one another. It certainly reflects an increasingly accepted mode of regulation that emphasizes cooperation and highlights the benefits of technological innovation, potentially at the expense of fundamental rights protection.</p><h3><strong>Rather than restrict biometric use cases, AI regulations may, in the long run, facilitate (and legitimate) their spread</strong></h3><p>Rather than necessarily inhibiting the biometrics industry, AI regulation can help the sector identify and manage risk, benefiting corporate players. Take, for example, the <a href="https://www.europarl.europa.eu/topics/en/article/20230601STO93804/eu-ai-act-first-regulation-on-artificial-intelligence">EU AI Act</a>, which has recently begun to come into force. Several presentations at the Congress were devoted to the new Act and its implications for industry. 
Irina Orssich, Head of Sector AI Policy at DG CNECT, explained that the Act takes a risk-based approach to biometrics. It divides use cases into a taxonomy of risk — from unacceptable, to high, to limited, to low/minimal risk. Compliance with the EU AI Act and adoption of this taxonomy can be seen as a form of risk mitigation — a means for companies to limit their liability and exposure in ways that will keep regulators at bay. Importantly, however, it also legitimates those applications that are deemed less risky according to the rules.</p><figure><img alt="Biometrics Institute Congress, panelists discuss EU AI Act" src="https://cdn-images-1.medium.com/max/1024/0*_rvrE9qpi10Vdr2q" /></figure><p>“Besides minimising risks,” <a href="https://doi.org/10.1080/17579961.2021.1898300">notes legal scholar Nathalie Smuha</a>, “regulation could facilitate AI’s uptake, boost legal certainty, and hence also contribute to advancing countries’ position in the…‘race to AI.’” The same could be said about biometrics and how emerging regulations will facilitate their further adoption and acceptance in different contexts, including consumer applications and more security-oriented spaces like borders. It is therefore incumbent upon critical voices to assess how regulators, and the rules they are mandated to enforce, further entrench biometrics in our everyday lives (whether or not they are ultimately understood to be AI) and the implications of this legitimization for our societies and polities.</p><img src="https://medium.com/_/stat?event=post.clientViewed&referrerSource=full_rss&postId=0e2668418bd3" width="1" height="1" alt=""><hr><p><a href="https://medium.com/caribou-digital/to-ai-or-not-to-ai-insights-from-the-biometrics-institute-congress-2024-0e2668418bd3">To AI or Not to AI? 
Insights from the Biometrics Institute Congress, 2024</a> was originally published in <a href="https://medium.com/caribou-digital">Caribou Digital</a> on Medium, where people are continuing the conversation by highlighting and responding to this story.</p>]]></content:encoded>
        </item>
        <item>
            <title><![CDATA[Breaking down power imbalances through co-creation]]></title>
            <link>https://medium.com/caribou-digital/breaking-down-power-imbalances-through-co-creation-0e49bc084ff5?source=rss----be144764b2ed---4</link>
            <guid isPermaLink="false">https://medium.com/p/0e49bc084ff5</guid>
            <category><![CDATA[collaboration]]></category>
            <category><![CDATA[youth]]></category>
            <category><![CDATA[cocreation]]></category>
            <category><![CDATA[monitoring-and-evaluation]]></category>
            <category><![CDATA[impact]]></category>
            <dc:creator><![CDATA[Chelsea Horváth]]></dc:creator>
            <pubDate>Mon, 16 Sep 2024 14:18:14 GMT</pubDate>
            <atom:updated>2024-09-16T14:28:05.738Z</atom:updated>
            <content:encoded><![CDATA[<p><em>Written by </em><a href="https://www.linkedin.com/in/chelseamhorvath/"><em>Chelsea Horváth</em></a><em>, Measurement &amp; Impact Manager, and </em><a href="https://www.linkedin.com/in/gnatabaalo/"><em>Grace Natabaalo</em></a><em>, Research &amp; Insights Manager, both at Caribou Digital.</em></p><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/1*WE2WvdsAowJU45AiVnVJfw.jpeg" /></figure><p><a href="https://usaidlearninglab.org/innovations-partnering/co-creation"><strong>Co-creation</strong></a><strong> </strong><a href="https://academic.oup.com/rev/article/32/2/286/7204117"><strong>has become</strong></a><strong> an increasingly </strong><a href="https://www.duckofminerva.com/2023/06/bridging-the-gap-between-research-and-policy-lessons-from-co-creation-in-the-aid-sector.html"><strong>important topic</strong></a><strong> and practice within the </strong><a href="https://www.unicef.org/innocenti/stories/3-outcomes-research-co-creation"><strong>research</strong></a><strong>, evaluation, and </strong><a href="https://plan-international.org/un/blog/2021/04/01/youth-participation-co-creation-generation-equality/"><strong>development</strong></a><strong> communities.</strong></p><p>Like many others in our community of practice, at Caribou Digital, we’re reflecting on co-creation in our work. At first glance, co-creation seems simple enough — create something with others.</p><p>But when the rubber hits the road, sticky questions arise. Who needs to be involved? What information is shared and how? How much time and resources are required to co-create? How is consensus reached? Who makes the final decision? 
Through trial and error and learning from others in the field, we’d like to share our experience and lessons on co-creation within research.</p><h3>Caribou Digital’s approach to co-creation</h3><p>At Caribou Digital, we understand co-creation to be an “<a href="https://www.usaid.gov/sites/default/files/2023-01/co-creation_toolkit_interactive_guide_-_march_2022%20%283%29.pdf">approach</a> that brings people together to collectively produce a mutually valued outcome and that involves a participatory process assuming some degree of shared power and decision-making.”</p><p>At conferences and in requests for proposals, we often see that co-creation is confused with collaboration (see the table below created by the authors).</p><p>The key differences between the two can be found in the definition above: breaking down power structures and decision-making. Without time and resources dedicated to those aspects, attempts at co-creation become more like collaboration.</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/1012/0*xlZre0n42tCdFE25" /><figcaption>A table outlining the differences between consultation, collaboration, and co-creation.</figcaption></figure><h3>Using co-creation to center young people as experts in their own digital futures</h3><p>In partnership with the Mastercard Foundation, Caribou Digital researched young people’s experiences with digital technologies in Africa, selecting 20 young people from across seven countries to co-create with. They included young people whose stories are not often seen or heard, such as women, people living with disabilities, refugees, and those living in rural areas.</p><p>The research team recognized that, despite good intentions, power imbalances would exist among the young people, the Mastercard Foundation, and Caribou Digital. 
Left unaddressed, these imbalances would hinder important insights that could lead to more strategic and relevant recommendations.</p><p>From the outset, we created an environment to alleviate these power imbalances. The co-creation process involved treating the young people as experts whose stories shaped the report, emphasizing collaboration and flexibility. This approach was outlined in the Terms of Reference, which each young person signed at the beginning of the project. At the first video conferencing session, expectations were aligned and rules of engagement were set. The young people reviewed and provided feedback on the research coding framework, shaping the language and direction of the project. Video conferencing sessions to share experiences were made inclusive and accessible, with flexible post-session reflection assignments to accommodate all needs. During the report-writing phase, panelists reviewed drafts, edited their quotes, and provided feedback, culminating in a discussion on how best to present the final report.</p><p>In reflecting on our co-creation process, three core learnings emerged.</p><h3>Lesson #1: Storytelling and reflection assignments yield richer data in a non-extractive way.</h3><p>Rather than extract young people’s experiences through various data collection methods, we used storytelling and reflection assignments to co-create this research. From the beginning, Caribou Digital emphasized that the young people were the experts. Their stories were the foundation of the report; our role was to facilitate and listen. The online video conference format allowed the young people to build on one another’s experiences, feel validated, and connect in a non-extractive process. Post-session reflection assignments (for example, asking the young people to reflect on how digital technologies have impacted their choice and agency) allowed them to reflect on their own and in a convenient mode (audio message or email). 
Providing feedback on the research process, one young person shared, <em>“The room was always accommodating of all of us who wanted to speak, and the moderators were tolerant of our views. I felt [at] home to speak/write from the reality of my experience.”</em></p><h3>Lesson #2: Double the time and resources needed for co-creation.</h3><p>Co-creation required more time, planning, and resources than initially thought. Every video conference session required thoughtful preparation to ensure a welcoming and inclusive environment — from the slide deck to the video captions. Reflection assignments and video recordings were analyzed carefully to ensure they accurately represented the young people’s experiences. Extra time was needed for the young people to review report drafts, edit quotes, and expand on their experiences. A safe estimate for others looking to use this co-creation approach would be to double the time and human resources needed.</p><h3>Lesson #3: Accountability, transparency, and flexibility are key co-creation ingredients.</h3><p>It was important for Caribou Digital to develop a trusted working relationship with the young people to keep them engaged throughout the research process. We were accountable when things weren’t working well and shared how the young people’s feedback was incorporated into the report. We were transparent with expectations for the research and when honorarium payments were delayed. We were flexible when the young people couldn’t provide feedback on time or attend a video conference session due to busy schedules. These practices kept the young people engaged throughout the research process. When asked to provide anonymous feedback on the research process, one participant shared, <em>“[Caribou] was always in touch both in the Zoom session and WhatsApp to guide in case anything wasn’t right. 
[…] We also had timely reminders for the meetings, and at no point was I caught offside or unaware of a meeting.”</em></p><h3>Catalyzing research with co-creation</h3><p>When done well, co-creation is an incredibly powerful practice that can elevate and amplify marginalized voices and improve the quality of research products. Our co-creation journey with these 20 young people was enriching and insightful, underscoring the value of trust and transparency.</p><p>By prioritizing youth voices and experiences, the 20 young people, Caribou Digital, and the Mastercard Foundation crafted a <a href="https://www.cariboudigital.net/wp-content/uploads/2024/05/Youth-Voices-Report.pdf?utm_source=webpage&amp;utm_medium=website&amp;utm_campaign=YVR">powerful report</a> that reflects young people’s perspectives and experiences on digital technologies in Africa. One young person shared, <em>“I feel like [co-creation] is a good approach because it lends to the authenticity of the report since these are our lived experiences […] It also makes the report relatable to fellow youth especially.”</em></p><p>Caribou Digital is committed to continuing this approach and conducting more co-created research. If you’re interested in participating in such initiatives or have ideas for collaboration, we invite you to connect with us at <a href="mailto:chelsea@cariboudigital.net">chelsea@cariboudigital.net</a>.</p><img src="https://medium.com/_/stat?event=post.clientViewed&referrerSource=full_rss&postId=0e49bc084ff5" width="1" height="1" alt=""><hr><p><a href="https://medium.com/caribou-digital/breaking-down-power-imbalances-through-co-creation-0e49bc084ff5">Breaking down power imbalances through co-creation</a> was originally published in <a href="https://medium.com/caribou-digital">Caribou Digital</a> on Medium, where people are continuing the conversation by highlighting and responding to this story.</p>]]></content:encoded>
        </item>
        <item>
            <title><![CDATA[Conjuring innovation: Tech pilots as products]]></title>
            <link>https://medium.com/caribou-digital/conjuring-innovation-tech-pilots-as-products-ed11be47e95f?source=rss----be144764b2ed---4</link>
            <guid isPermaLink="false">https://medium.com/p/ed11be47e95f</guid>
            <category><![CDATA[humanitarian-aid]]></category>
            <category><![CDATA[pilot]]></category>
            <category><![CDATA[blockchain]]></category>
            <category><![CDATA[cash-transfers]]></category>
            <category><![CDATA[innovation]]></category>
            <dc:creator><![CDATA[Dr Margie Cheesman]]></dc:creator>
            <pubDate>Wed, 28 Aug 2024 09:20:40 GMT</pubDate>
            <atom:updated>2024-08-29T08:42:16.559Z</atom:updated>
            <content:encoded><![CDATA[<p>A recent Forbes <a href="https://www.forbes.com/sites/matthewkeller/2024/07/08/blockchain-makes-cash-based-humanitarian-aid-secure-fast-and-transparent/">article</a> claimed ‘Blockchain makes cash-based humanitarian aid secure, fast and transparent’.</p><p><strong>But how do aid professionals actually experience it?</strong> <br><strong>Are these claims truly being fulfilled? <br>What impact does blockchain innovation have for organisations in practice?</strong></p><p>My <a href="https://www.tandfonline.com/doi/full/10.1080/14650045.2024.2389284?scroll=top&amp;needAccess=true#abstract">latest research article (<em>Conjuring a Blockchain Pilot: Ignorance and Innovation in Humanitarian Aid</em>)</a><em> </em>lifts the bonnet on humanitarian innovation. Based on ethnographic research in Jordan, I explore what is at stake when an aid organisation experimentally applies a blockchain pilot project in refugee camps.</p><blockquote><strong><em>This innovation, I suggest, comes with a mix of genuine promise, authentic expertise, but also blind faith and strategic ignorance.</em></strong></blockquote><p>Tech pilots aren’t just designed to help people: regardless of what they achieve, they are valuable products for aid industry actors to promote.</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/1*rFCnN0cEY47-UUUDLZvpoQ.jpeg" /></figure><h4><strong>The Blockchain Pilot</strong></h4><p>The Blockchain Pilot was introduced to replace the traditional cash-in-hand system with a blockchain-based <em>digital wallet</em>, integrated with biometric iris recognition. This system aimed to improve the security, speed, and transparency of aid payments while significantly reducing costs by bypassing conventional financial intermediaries. It also promised to empower Syrian refugee women by providing them with independently held digital wallets. 
However, a key appeal of the pilot was its potential to attract funding and boost the organisation’s reputation among donors.</p><h4><strong>How conjuring works: Ignorance in innovation</strong></h4><p>In the paper I argue that The Blockchain Pilot was ‘conjured’ as a product to be promoted to a competitive marketplace of aid donors. In social studies of capitalist markets, ‘conjurings’ are the spectacles and magical appearances that draw an audience of investors. I suggest that conjurings are not just about appearance and show. They involve key forms of ignorance: (i) confusion, (ii) illusion, (iii) disappearance, and (iv) misdirection.</p><p><strong>i.</strong> <strong>Confusion</strong><br>Aid professionals involved in the pilot expressed confusion about blockchain. Despite being expected to represent and defend the pilot, most staff had little understanding of how blockchain operated. This confusion was not unique to this organisation. The universal mystification surrounding blockchain made promotional claims about it difficult to evaluate or refute.</p><p><strong>ii.</strong> <strong>Illusion</strong><br>Blockchain was often treated as a magic technological object capable of achieving a range of desirable effects without clear explanation. Aid professionals conflated blockchain with other features of automation or digitalisation which did not actually require blockchain. ‘Digital wallet’ was a misnomer: refugees could not access the balance and transaction record on a personal device; they could not credit money, only withdraw it; and they did not have custody of the wallet: the aid organisation did.</p><p><strong>iii.</strong> <strong>Disappearance</strong><br>The hierarchical design of the system meant that aid workers did not have access to the blockchain ledger. This design reinforced existing power asymmetries within the organisation and disconnected them from valuable information. 
Aid workers disappeared from the aid delivery process, replaced by private companies and biometric cameras.</p><p><strong>iv.</strong> <strong>Misdirection</strong><br>Promoting The Blockchain Pilot often involved diverting attention away from its negative impacts on people. Aid organisations focused on quantitative metrics like cost-effectiveness and transaction speed, while downplaying the social and practical challenges faced by the refugees and aid workers.</p><p>Ignorance is not an insulting term denoting simply the absence of knowledge. It is actively produced, it can be both strategic and inadvertent, and it is shaped by hierarchical power relations and neoliberal business models in aid. The politics of ignorance is therefore something we need to take seriously when we analyse organisations and technological change.</p><p>This study is not just a cautionary tale for practitioners in aid. Beyond refugee camps and beyond blockchain, the conjuring of innovation products can take precedence over delivering meaningful value to the people they enrol.</p><img src="https://medium.com/_/stat?event=post.clientViewed&referrerSource=full_rss&postId=ed11be47e95f" width="1" height="1" alt=""><hr><p><a href="https://medium.com/caribou-digital/conjuring-innovation-tech-pilots-as-products-ed11be47e95f">Conjuring innovation: Tech pilots as products</a> was originally published in <a href="https://medium.com/caribou-digital">Caribou Digital</a> on Medium, where people are continuing the conversation by highlighting and responding to this story.</p>]]></content:encoded>
        </item>
        <item>
            <title><![CDATA[Rethinking innovation funding in the age of AI]]></title>
            <link>https://medium.com/caribou-digital/rethinking-innovation-funding-in-the-age-of-ai-ef51a81a57db?source=rss----be144764b2ed---4</link>
            <guid isPermaLink="false">https://medium.com/p/ef51a81a57db</guid>
            <category><![CDATA[llm]]></category>
            <category><![CDATA[funding]]></category>
            <category><![CDATA[startup]]></category>
            <category><![CDATA[ai]]></category>
            <category><![CDATA[grant]]></category>
            <dc:creator><![CDATA[Rosie Afia-Ford]]></dc:creator>
            <pubDate>Thu, 22 Aug 2024 15:51:09 GMT</pubDate>
            <atom:updated>2024-08-22T15:52:04.109Z</atom:updated>
            <content:encoded><![CDATA[<p>Applicants can now use generative AI to craft powerful funding proposals.</p><p><strong>What does it mean for organizations running competitive grants and innovation funds?</strong></p><p>A significant shift is underway in the ever-evolving landscape of impact investing and competitive grant-making programs. In recent years, artificial intelligence (AI) has become a buzzword in many domains, including in donor funding landscapes. It is pushing funding organizations to rethink how they approach innovation funding and how to ensure the “do no harm” principle applies when delivering innovation for social, environmental, and economic impact.</p><p>At Caribou Digital, we’re keenly focused on how generative AI can impact, modulate, and drive an inclusive and ethical digital world. Large language models (LLMs), like ChatGPT, have particularly piqued our interest in our fund management work and are causing us to reflect on our approaches and practices. This blog post highlights some of these reflections.</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/1*uA8jK_sxYaYV4S6lRDL43w.jpeg" /></figure><h3>Embracing LLMs in grant-writing: A double-edged sword</h3><p>ChatGPT’s emergence has brought about three critical lessons for consideration:</p><h4><strong>1) Generative AI can break down barriers to applying for grants (like time and skill gaps)</strong></h4><p>ChatGPT and other LLMs are impressively proficient in writing grant applications. There are even some LLMs focused specifically on grant writing, like Grantable and others. The “traditional” grant application process has been a grueling task. It can be complex, time-consuming, and disempowering for applicants. It often takes senior staff away from their day-to-day duties and regularly offers no reward for their efforts. Applicants are commonly unsuccessful because they fail to clearly and effectively convey their idea, innovation, or project plan. 
However, new tools — from <a href="https://app.grammarly.com/">Grammarly</a> to grant-writing LLMs — have the potential to save applicants time and money in this process. They can make grant-writing more accessible and less intimidating, as well as reduce language barriers or address accessibility issues for applicants with disabilities.</p><h4><strong>2) Generative AI makes it easier to communicate compelling ideas clearly</strong></h4><p>Encouraging AI in grant proposals can democratize idea sharing, allowing a broader range of applicants to present their visions compellingly and coherently. AI could level the playing field for small organizations with limited or no access to experienced grant writers. Or, fund managers may see that grant applicants with disabilities and those who are neurodiverse are better able to write applications without worrying about how their dyslexia (for example) might limit their chances of funding success. Caribou Digital’s working theory, then, is that a more diverse pool of applicants can now complete grant applications quickly and unlock critical funding.</p><h4><strong>3) Generative AI, if used effectively by fund managers, can encourage “unusual suspects” to apply to their grant programs</strong></h4><p>By lowering the traditional barriers to entry for grants, like time and language costs, LLMs open doors for a more diverse pool of innovators.</p><p><strong>Here’s a case study to demonstrate how LLMs could reach “unusual suspect” innovators.</strong></p><ul><li>As a fund manager, Caribou Digital usually requests grant applications in a single language: English. 
This is mainly because we manage grants in English, so all our policies, templates, and tools for tracking require input in English.</li><li>We understand this immediately creates a bias against non-native English speakers, who have to convey complex, often technical ideas in their second or third language.</li><li>If innovators could apply for community-based projects in more relevant languages (e.g., Swahili, Luganda, Arabic, Bengali, etc.), would more people apply with truly exciting and/or community-based ideas?</li><li>Today, even basic LLM translation services can enable small, community-based organizations to quickly submit quality applications. Hypothetically, these tools would allow us to receive applications in local dialects and engage throughout the grant period in some of those languages, even if our team doesn’t have fluency in the selected language.</li><li>But we also need to be highly conscious that these bold changes to processes could also introduce new biases, as LLMs are well known to perform poorly at generating high-quality content in non-English languages. (See, for example, this article on AI <a href="https://restofworld.org/2024/hugging-face-ai-boom/">language equity issues</a> from Rest of World.)</li></ul><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/0*BIXpY4Eu1OfDltVD" /><figcaption>Photo by <a href="https://unsplash.com/@omilaev?utm_source=medium&amp;utm_medium=referral">Igor Omilaev</a> on <a href="https://unsplash.com?utm_source=medium&amp;utm_medium=referral">Unsplash</a></figcaption></figure><h3>How can we identify authentic talent? Why we are rethinking our practices</h3><p>In the context of generative AI and grant-making, fund managers need to be acutely aware of how biases could get built into project design. Even without the widespread use of LLMs, there is almost always bias in the selection of grants.
It is therefore logical to assume <strong>LLMs can exacerbate existing (or even create new) bias in grant-awarding processes.*</strong> This selective bias makes it incredibly challenging to engage with grant-making tools. It is our responsibility as fund managers to actively ensure no conscious or unconscious bias is introduced into the process.</p><p>If, for example, fund managers allow AI tool use by grant applicants, we must also invest in a rigorous evaluation of bias, perhaps even involving undercover critical colleagues as independent teams to reduce bias in application processes. By doing so, we ensure that using AI in grant-making processes does not inadvertently perpetuate existing inequities.</p><p>While AI can polish and perfect an application, it’s essential to develop mechanisms that enable fund managers to capture the authentic talent behind “artificial intelligence.” It’s time to rethink how we structure our submission practices and interfaces. We must find ways for applicants to demonstrate their authentic selves beyond the more polished face that LLMs and other AI tools can provide. This requires a fundamental shift in our approach: embracing AI where it enhances equity and inclusion while remaining vigilant against its potential to introduce new forms of bias.</p><p>At Caribou Digital, we’re committed to exploring innovative methods that allow for a more genuine representation of applicants’ potential. By doing so, we can ensure that the best ideas, no matter where they come from, have a fair chance to shine.
We’re currently thinking about ways we can support genuineness in applications, such as:</p><ul><li><strong>Allowing applicants to provide a video application</strong> (rather than solely text-based applications).</li><li><strong>Reducing or removing the need for computer access </strong>by running an application process on WhatsApp or mobile phone, for example.</li><li><strong>Plugging into existing platforms that allow applications</strong> to be submitted from an existing profile or organizational presence (e.g., <a href="https://www.f6s.com/">f6s</a> or <a href="https://www.linkedin.com/">LinkedIn</a>).</li><li><strong>Working with community-based organizations </strong>who can make initial recommendations or referrals on behalf of potential grantees and omitting lengthy written applications.</li></ul><p>We know that none of these ideas will eliminate bias in grant applications and assessments (some might even exacerbate it). However, AI tools in grant-writing have highlighted the need to innovate in how we assess authenticity and potential, and it’s time to test some new approaches.</p><p>Please <a href="mailto:rosie@cariboudigital.net">reach out</a> if you’d like to discuss this further.</p><p>*The perception of bias varies widely; what seems unbiased to one person may be seen differently by someone with a different background or political belief.
One excellent showcase of this is the <a href="https://restofworld.org/series/the-rise-of-ai/">Rest of World AI series</a>.</p><img src="https://medium.com/_/stat?event=post.clientViewed&referrerSource=full_rss&postId=ef51a81a57db" width="1" height="1" alt=""><hr><p><a href="https://medium.com/caribou-digital/rethinking-innovation-funding-in-the-age-of-ai-ef51a81a57db">Rethinking innovation funding in the age of AI</a> was originally published in <a href="https://medium.com/caribou-digital">Caribou Digital</a> on Medium, where people are continuing the conversation by highlighting and responding to this story.</p>]]></content:encoded>
        </item>
        <item>
            <title><![CDATA[Identity and Migration — research update #2: Digital ID in Kenya]]></title>
            <link>https://medium.com/caribou-digital/identity-and-migration-research-update-2-digital-id-in-kenya-340243df365d?source=rss----be144764b2ed---4</link>
            <guid isPermaLink="false">https://medium.com/p/340243df365d</guid>
            <category><![CDATA[digital-id]]></category>
            <category><![CDATA[kenya]]></category>
            <category><![CDATA[proxy]]></category>
            <category><![CDATA[fintech]]></category>
            <category><![CDATA[shariah-compliant]]></category>
            <dc:creator><![CDATA[Dr. Emrys Schoemaker]]></dc:creator>
            <pubDate>Wed, 31 Jul 2024 14:02:16 GMT</pubDate>
            <atom:updated>2024-07-31T14:02:16.413Z</atom:updated>
            <content:encoded><![CDATA[<h3>Identity and Migration — research update #2: Digital ID in Kenya</h3><p>(Authors: Keren Weitzberg, Nora Naji, and Emrys Schoemaker)</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/650/0*lnE3Y_U_9L_PAEOq.png" /><figcaption>Source: <a href="https://www.craftsilicon.com/">Craft Silicon website</a></figcaption></figure><p>In this post, we share an update from our ongoing research project in Kenya, outlining emerging digital identity innovation around proxy verification and Sharia-compliant identification.</p><p><strong>Agency banking and guarantors</strong></p><p>As part of our research, we are exploring how migrants (including under- and undocumented people) access financial services. Irregular migrants, refugees, and asylum seekers (and even some regular migrants) often struggle to access the same kinds of banking and financial services as citizens. A common barrier is the lack of a foundational credential that global regulatory regimes, such as the Financial Action Task Force, require in order for service providers to enable access to things like SIM cards and bank accounts.</p><p>We are meeting people, however, who are trying to develop fintech for underserved populations and regions, and for people who lack these foundational credentials. One such organization is <a href="https://www.craftsilicon.com">Craft Silicon</a>, headquartered in Kenya. We spoke to their Head of Islamic Banking, who told us about a Sharia-compliant product, currently being rolled out in Yemen in partnership with a local bank, which will enable people to open up low-balance bank accounts in order to receive remittances. This product, which will soon include a mobile wallet, will be accessible to undocumented people, who will be able to onboard by registering with an agent using a guarantor.
The guarantor will have to attest to the client’s identity and provide their biometrics during the registration process.</p><p>This is not the first product that uses a guarantor system, but it is arguably unique in its claim to be Sharia compliant. The use of guarantors can offer undocumented people, including migrants, basic banking services. However, such products also have limited utility (for example, low balance caps) due to the need to comply with a strict financial regulatory environment.</p><p>Providing fintech to the unbanked may not be a panacea for poverty (despite some of the more quixotic claims behind the financial inclusion narrative), but such services are nevertheless desperately needed. Often, these products are piloted successfully but then abandoned or never scaled. We hope to see that change in the future.</p><img src="https://medium.com/_/stat?event=post.clientViewed&referrerSource=full_rss&postId=340243df365d" width="1" height="1" alt=""><hr><p><a href="https://medium.com/caribou-digital/identity-and-migration-research-update-2-digital-id-in-kenya-340243df365d">Identity and Migration — research update #2: Digital ID in Kenya</a> was originally published in <a href="https://medium.com/caribou-digital">Caribou Digital</a> on Medium, where people are continuing the conversation by highlighting and responding to this story.</p>]]></content:encoded>
        </item>
        <item>
            <title><![CDATA[Identity and Migration — Digital ID in Kenya]]></title>
            <link>https://medium.com/caribou-digital/identity-and-migration-digital-id-in-kenya-ac2821f15086?source=rss----be144764b2ed---4</link>
            <guid isPermaLink="false">https://medium.com/p/ac2821f15086</guid>
            <category><![CDATA[digital-id]]></category>
            <category><![CDATA[migration]]></category>
            <category><![CDATA[digital-transformation]]></category>
            <category><![CDATA[kenya]]></category>
            <category><![CDATA[protest]]></category>
            <dc:creator><![CDATA[Dr. Emrys Schoemaker]]></dc:creator>
            <pubDate>Wed, 31 Jul 2024 13:28:31 GMT</pubDate>
            <atom:updated>2024-07-31T13:56:14.735Z</atom:updated>
            <content:encoded><![CDATA[<h3><strong>Identity and Migration — research update #1: Digital ID in Kenya</strong></h3><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/0*GrfaC9SD_iYUw8U3" /><figcaption>Source: <a href="https://x.com/JuliusKBitok/status/1803703841017213159">Julius Bitok, Kenya’s Principal Secretary, State Department for Immigration &amp; Citizen Services</a></figcaption></figure><p>(Authors: Keren Weitzberg, Nora Naji, and Emrys Schoemaker)</p><p>In this post, we share an update from our ongoing research project in Kenya, outlining emerging digital identity initiatives and the implications of recent protests.</p><p><strong>The Shirika Plan</strong></p><p>Last year, the Kenyan government announced the “Shirika Plan” — an initiative promising to transform the country’s refugee camps, Dadaab and Kakuma, into integrated settlements, in line with the UN’s <a href="https://globalcompactrefugees.org/about-digital-platform/global-compact-refugees">Global Compact on Refugees</a> (Shirika means ‘coming together’ in Swahili). This plan seeks to better integrate Kenya’s approximately 600,000 refugees and asylum seekers into the economy and various national systems. On the surface, it represents a positive policy shift for a government that has often been hostile to refugees. In the past, the Kenyan state has threatened to close the Dadaab Refugee Complex, periodically suspended refugee registration, and even <a href="https://www.amnesty.org/en/latest/news/2014/04/kenya-somalis-placed-catch-amid-crackdown-refugees/">deported Somali refugees</a> in contravention of international law.
Yet, as refugee rights advocates like Victor Nyamori of Amnesty International (interviewed <a href="https://www.thenewhumanitarian.org/podcasts/2024/03/07/whats-unsaid-kenya-new-integration-plan-for-refugees">here</a> for <em>The New Humanitarian</em>) have suggested, this new plan may be hype, aimed at courting international donors, rather than meaningful policy change.</p><p>Why does this matter for those who follow digital identity developments?</p><p>Integrating migrants and refugees into national registration systems can have far-reaching effects — from ‘normalizing’ often discriminated groups to <a href="https://www.codastory.com/authoritarian-tech/kenya-biometrics-double-registration/">problematizing legal status</a>. As Immigration and Citizenship Services Principal Secretary Julius Bitok explained during a roundtable at <a href="https://medium.com/@kweitzberg/the-digital-identity-wallet-from-cape-town-to-amsterdam-57e61718f967">this year’s ID4Africa AGM in Cape Town</a>, the Shirika Plan will also entail the incorporation of refugees into Kenya’s newly launched (and controversial) digital identity project known as <a href="https://www.biometricupdate.com/tag/maisha-namba">Maisha Namba</a> (Life Number). According to Bitok, refugees will receive Maisha cards and Maisha Nambas (unique identity numbers) alongside Kenyan citizens and residents. But what additional rights and services, if any, this will afford them is less than clear.</p><p>Time will tell how meaningful such developments are for the hundreds of thousands of refugees and asylum seekers living in Kenya, many of whom have been in the country for well over two decades.
We will be following this issue closely as part of our empirical research in the country.</p><p><strong>Gen Z Protests and IDs</strong></p><p>Another key development is the <a href="https://nation.africa/kenya/news/gen-z-protests-a-revolution-aided-by-technology--4669846">Gen Z protests in Kenya</a>. In June 2024, after the Kenyan Parliament passed a new and <a href="https://www.businessdailyafrica.com/bd/economy/finance-bill-mps-drop-punitive-taxes-as-kenyans-protest-4661698">controversial finance bill</a>, Kenya erupted in nationwide anti-government protests driven primarily by Gen Z (those currently between the ages of 12 and 27). They protested against unemployment, economic inequality, and corruption, and were soon joined by larger segments of the population. Kenya’s government responded to the protests with brutal measures, including abductions and extrajudicial killings.</p><p>Despite the waning of protests in recent weeks, pressure on the Kenyan government remains high. In July, activists published an Action Plan to monitor the Ruto government’s efforts to reform key areas affecting young people, which was widely circulated on <a href="https://x.com/Alvinmwangi254/status/1809223365590179931">social media</a>.</p><p>Notably, one of the action points in the document concerns the growing costs of identification. The Action Plan calls on the government to “ban the request of government issued documents for job seekers except for a national identification card; drop the replacement ID fee from KES. 1,000 to KES.
200; lower the drivers’ license renewal fee by 25% and make all licenses renewable every three years”.</p><p>In Kenya, where youth <a href="https://www.theeastafrican.co.ke/tea/business/number-of-jobless-kenyans-rises-to-2-97m-4184722">unemployment is at a record high</a>, young job seekers are increasingly frustrated with needing to provide extraneous documents, such as certificates of good conduct from the police or tax compliance certificates from the Kenya Revenue Authority, each with a requisite fee. Such demands are especially problematic for certain ethnic and religious groups in Kenya, particularly Muslims, who have faced historic <a href="https://www.justiceinitiative.org/voices/out-cold-vetting-nationality-kenya">discrimination</a> in access to IDs and legal documents. As <a href="https://www.standardmedia.co.ke/entertainment/news/article/2001278496/why-certificate-of-good-conduct-is-a-ticket-to-jobless-corner">this article explains,</a> these various “certificates will cost a jobless Kenyan upwards of Sh5,000.” Add to that the new charges for national IDs. Last year, the government announced that ID cards, which were previously free, would cost new applicants 1,000 Kenyan shillings (roughly $6; £5) while the cost of replacing an ID would be increased 20-fold to 2,000 shillings, sparking widespread protests online. The price hikes were <a href="https://www.bbc.co.uk/news/world-africa-67365053">eventually blocked </a>by the Kenya High Court. The government <a href="https://www.citizen.digital/news/govt-makes-u-turn-on-plan-to-increase-id-charges-to-ksh2000-n331308">u-turned</a>, lowering the costs for new IDs to KSh 300 and the fee for a replacement to KSh 1,000–still a marked rise over previous years.</p><p>The government may be trying to extract money from Kenya’s population and pay for its costly new digital identity project (Maisha Namba) by hiking fees. 
But amidst a cost-of-living crisis, these increased costs are not going down well with Kenya’s youth.</p><p>This controversy also reveals a key source of exclusion in Kenya and elsewhere: prohibitively high fees for mandatory government ID documents. As one Kenyan X user <a href="https://x.com/kenyamoja_/status/1814901690706854221?s=46&amp;t=Cujs2RrGJi7FtwEOS7uudA">commented</a>: “This fee undermines the constitutional right to identification, a fundamental necessity for all Kenyans.”</p><img src="https://medium.com/_/stat?event=post.clientViewed&referrerSource=full_rss&postId=ac2821f15086" width="1" height="1" alt=""><hr><p><a href="https://medium.com/caribou-digital/identity-and-migration-digital-id-in-kenya-ac2821f15086">Identity and Migration — Digital ID in Kenya</a> was originally published in <a href="https://medium.com/caribou-digital">Caribou Digital</a> on Medium, where people are continuing the conversation by highlighting and responding to this story.</p>]]></content:encoded>
        </item>
        <item>
            <title><![CDATA[DPI or Digital Transformation? Identifying DPI-specific risks in a Caribou Digital — UNDP convening]]></title>
            <link>https://medium.com/caribou-digital/dpi-or-digital-transformation-identifying-dpi-specific-risks-in-a-caribou-digital-undp-convening-ba3ccd9a5adc?source=rss----be144764b2ed---4</link>
            <guid isPermaLink="false">https://medium.com/p/ba3ccd9a5adc</guid>
            <category><![CDATA[digital-risk]]></category>
            <category><![CDATA[idp]]></category>
            <category><![CDATA[safeguarding]]></category>
            <category><![CDATA[undp]]></category>
            <category><![CDATA[digital-transformation]]></category>
            <dc:creator><![CDATA[Dr. Emrys Schoemaker]]></dc:creator>
            <pubDate>Mon, 22 Jul 2024 13:57:15 GMT</pubDate>
            <atom:updated>2024-07-22T13:57:15.496Z</atom:updated>
            <content:encoded><![CDATA[<h3>DPI or Digital Transformation? Three takeaways from a Caribou Digital — UNDP convening to identify DPI-specific risks</h3><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/0*kFVbzSHRUqU6Tdz_" /><figcaption>Credit: AI generated</figcaption></figure><p>Caribou Digital and UNDP recently held a convening of practitioners and scholars to explore the distinction between digital public infrastructure (DPI) and mainstream digital transformation, and the specific risks and safeguarding requirements of DPI. The event was a ‘Chatham House’-style convening of practitioners, policymakers, and academics, so this blog presents three key takeaways from the discussion without attribution.</p><p>Digital public infrastructure (DPI) is, as defined by the <a href="https://www.mea.gov.in/Images/CPV/G20-New-Delhi-Leaders-Declaration.pdf">G20 New Delhi Leaders Declaration</a>, made up of ‘secure and interoperable digital systems that enable the delivery of public services, together with an enabling governance environment and public value goals.’ The UN Secretary General’s Office of the Special Envoy on Technology (OSET) and the UN Development Programme (UNDP) convened the <a href="https://www.dpi-safeguards.org/">DPI Safeguarding Initiative</a> to support the <a href="https://www.undp.org/news/interim-report-digital-public-infrastructure-safeguards-open-public-comment">development of a safe, inclusive, and rights-protecting DPI framework</a>. The initiative includes the <a href="https://www.dpi-safeguards.org/wg-members">DPI Safeguarding Initiative Working Groups</a>, convened by OSET and UNDP to support the development of the framework.</p><p>This convening brought together leading stakeholders from the digital development academic and practitioner communities to consider four scenarios that present DPI and ‘generic’ approaches to common aspects of digital transformation.
Participants’ discussion was structured around two specific questions:</p><ul><li>What distinguishes a ‘digital public infrastructure approach’ from a ‘generic’ approach to digital transformation?</li><li>What are the specific risk profiles of a ‘digital public infrastructure’ approach to digital transformation?</li></ul><p><strong>Takeaway 1: Countries don’t think of DPI; instead, they think of systems. So the distinction between DPI and ‘generic’ digital transformation is more of a ‘how’ difference rather than a ‘what’ difference.</strong></p><p>One of the recurring themes throughout the conversation was that, from a purely technology perspective, there is not much difference between a DPI and generic approach to digital transformation. The various components or attributes of DPI are not in themselves new. It’s a policy perspective and implementation approach — the ‘how’ — where DPI’s distinction lies.</p><p>One of the main characteristics of a DPI perspective is the idea of ‘thinking horizontally, rather than vertically’ so that systems work across ministries or sectors, rather than vertically siloed in one — for example, enabling a single registration to verify identity and eligibility for multiple systems and services. One of the main attributes that characterize this is the interoperability necessary to enable the horizontal flow of data. Interoperability is of course not new, and there is a wealth of literature and knowledge around mitigating the risks of interoperability that the DPI Safeguarding Initiative Working Groups and wider DPI community can draw from.</p><p>Another dimension that participants flagged was that ‘how’ questions are also often normative questions; that is, they introduce considerations of rights and inclusion. 
Examples include the policy dimensions of identification systems and the rights and entitlements that identification brings: in other words, what legal rights does being identified grant holders of that identity?</p><p><strong>Takeaway 2: The distinction of DPI is the implications for development pathways and choice.</strong></p><p>The second takeaway was DPI’s significance for development pathways and choice, in two dimensions. First, some participants discussed how DPI could strengthen states’ ability to make sovereign choices about their digital transformation path. By reducing digital transformation, from a technical architecture perspective, to minimal building blocks and their core functionality, a DPI approach opens a conversation about control of those blocks, rather than about a pre-selected system and its features. Participants felt this was particularly significant in the context of countries in the Global South that often depend on external financing for their development trajectory and are often forced to adopt the path of their funders. This is particularly the case in the context of donors who provide financing or investment for specific sectors or silos such as health, education, or welfare. Focusing on a DPI approach instead can put power back in the hands of the government.</p><p>The second aspect of sovereignty and choice was focused on individuals. Some participants highlighted how in some cases making systems mandatory — such as identification systems — can force users, whether they want to or not, to adopt systems that may serve state interests before individual interests.
This consideration introduced another way of thinking about DPI risk: the risk of doing it too well, which could enable authoritarianism and autocracy, or not well enough, which could lead to service failure and a loss of trust and confidence in the state.</p><p><strong>Takeaway 3: DPI’s complex business models present risks to both owners and systems.</strong></p><p>Another takeaway from the discussion was the significance of DPI’s business models. One of the challenges that participants flagged was around the ownership of systems. For example, if DPI is government-led, this can have implications for the business case of systems — for example around the necessary internal capacity to build or buy systems, and manage technical development and procurement. Another challenge that participants flagged was around the scaling of business models. Whether a system can scale competitively, and thus justify public investment, is as important to consider as the potential for scale that the system introduces.</p><p>The characteristics of DPI also lead to other dimensions of business model risk. For example, interoperability introduces challenges for commercial suppliers of systems, as interoperability should make DPI elements (identity, payments, data sharing) commodity services. This is significant because, historically, companies have preferred to make profits rather than compete on price in a commodity service market.</p><p>Another risk of DPI that participants flagged is its emphasis on open source. A number of participants flagged the challenge of open source and ‘abandonware’ — systems and technologies that are developed by a community (or lone developer) and then abandoned. This is a key risk for critical infrastructure and requires governments to develop the relevant capacity to mitigate the risk of infrastructure failure.</p><p><strong>Reflection: Evaluation is key.
How do we know if an intervention is a DPI approach, and what difference does DPI make?</strong></p><p>A more general reflection on this discussion is the importance of monitoring and learning: establishing the basis for evaluating impact and whether a particular initiative is actually DPI in nature. What are the distinctive aspects of a deployment that can tell us whether it is upholding the principles and values of DPI? If DPI is an approach and a policy shift, how can we measure whether an approach and policy agenda is DPI in nature?</p><p>At Caribou Digital, we have a particular focus on monitoring, evaluation, and learning. We’re often asked to assess the impact of interventions. So we find it striking that in the conversation around DPI there has been very little attention paid to the difference a DPI approach makes, compared to a generic approach to digital transformation. Being clear about that difference is important on its own terms, but also to justify investments and to support the case for a DPI approach to policymakers and decision-makers, especially elected representatives.</p><p>The work to unpack distinctions between DPI and generic digital transformation is critical to developing appropriate safeguards, and to the broader effort to advance the case for and understanding of DPI. The work of developing instructive scenarios to break down the differences will continue, and there is an open invitation to provide feedback that helps refine the existing scenarios, and to suggest and contribute new ones.</p><img src="https://medium.com/_/stat?event=post.clientViewed&referrerSource=full_rss&postId=ba3ccd9a5adc" width="1" height="1" alt=""><hr><p><a href="https://medium.com/caribou-digital/dpi-or-digital-transformation-identifying-dpi-specific-risks-in-a-caribou-digital-undp-convening-ba3ccd9a5adc">DPI or Digital Transformation?
Identifying DPI-specific risks in a Caribou Digital — UNDP convening</a> was originally published in <a href="https://medium.com/caribou-digital">Caribou Digital</a> on Medium, where people are continuing the conversation by highlighting and responding to this story.</p>]]></content:encoded>
        </item>
    </channel>
</rss>