<?xml version="1.0" encoding="UTF-8"?><rss xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:atom="http://www.w3.org/2005/Atom" version="2.0" xmlns:cc="http://cyber.law.harvard.edu/rss/creativeCommonsRssModule.html">
    <channel>
        <title><![CDATA[51nodes - Medium]]></title>
        <description><![CDATA[51nodes based in Stuttgart is a provider of crypto economy solutions. 51nodes supports companies and other organizations in realizing their Blockchain projects. We offer technical consulting and implementation with a focus on smart contracts, DApps and tokenization of assets - Medium]]></description>
        <link>https://medium.com/51nodes?source=rss----89b961a921c---4</link>
        <image>
            <url>https://cdn-images-1.medium.com/proxy/1*TGH72Nnw24QL3iV9IOm4VA.png</url>
            <title>51nodes - Medium</title>
            <link>https://medium.com/51nodes?source=rss----89b961a921c---4</link>
        </image>
        <generator>Medium</generator>
        <lastBuildDate>Mon, 16 Feb 2026 16:04:20 GMT</lastBuildDate>
        <atom:link href="https://medium.com/feed/51nodes" rel="self" type="application/rss+xml"/>
        <webMaster><![CDATA[yourfriends@medium.com]]></webMaster>
        <atom:link href="http://medium.superfeedr.com" rel="hub"/>
        <item>
            <title><![CDATA[Why Upgrading OpenZeppelin Smart Contracts from Version 4 to Version 5 is Unsafe]]></title>
            <link>https://medium.com/51nodes/why-upgrading-openzeppelin-smart-contracts-from-version-4-to-version-5-is-unsafe-e08be30efd8a?source=rss----89b961a921c---4</link>
            <guid isPermaLink="false">https://medium.com/p/e08be30efd8a</guid>
            <category><![CDATA[blockchain]]></category>
            <category><![CDATA[solidity]]></category>
            <category><![CDATA[openzeppelin]]></category>
            <category><![CDATA[ethereum]]></category>
            <category><![CDATA[smart-contracts]]></category>
            <dc:creator><![CDATA[Majd]]></dc:creator>
            <pubDate>Fri, 22 Dec 2023 14:15:51 GMT</pubDate>
            <atom:updated>2023-12-23T19:20:06.338Z</atom:updated>
<content:encoded><![CDATA[<figure><img alt="" src="https://cdn-images-1.medium.com/max/673/1*5WjQaHVQt16XB5Lr_2TeeQ.png" /><figcaption><a href="https://forum.openzeppelin.com/t/contract-upgrade-failed-because-variables-are-deleted-in-openzepplin-5-0/38341">unsafe upgradability note</a></figcaption></figure><p>OpenZeppelin is a well-known and trusted company in the blockchain world. It provides secure open-source smart contracts and libraries, which accelerate blockchain development worldwide. For more than two years, developers from all over the world have been using OpenZeppelin’s <em>v4</em> contracts to build applications. The <em>v4</em> contracts were maintained, well-documented, and audited over a long period. However, on October 5, 2023, a new journey began with the release of <em>v5</em> of the OpenZeppelin contracts. In this article, we will explain - with an example - why it is not safe to upgrade existing deployments from OpenZeppelin Contracts <em>v4</em> to <em>v5</em>.</p><h4>TLDR</h4><p>The <em>v5</em> contracts include significant changes and improvements. Some contracts and libraries have been restructured, while others have been removed. However, the most crucial reason for the incompatibility is that the <em>v5</em> contracts of OpenZeppelin introduce a significant change to the storage layer. The contracts in <em>v5</em> adopt <strong>the namespaced storage layout</strong>, which is defined in <a href="https://eips.ethereum.org/EIPS/eip-7201"><strong>ERC-7201</strong></a>. This change implies that the new logic of <em>v5</em> contracts will query existing data and write new data in <strong>incorrect</strong> storage slots, making an upgrade from <em>v4</em> to <em>v5</em> unsafe. Developers have several options to update an existing contract, such as reinitializing the contract, deploying it as a new proxy, or choosing to continue with updated v4 contracts.
Each option has its own unique challenges and requirements.</p><h4>Prerequisites</h4><p>We consider knowledge of upgradeable smart contracts and rudimentary knowledge of the Ethereum Virtual Machine (EVM) storage a requirement for understanding this article. The following tutorials give a good overview of the upgradeability of smart contracts:</p><ul><li><a href="https://www.youtube.com/watch?v=JgSj7IiE4jA">https://www.youtube.com/watch?v=JgSj7IiE4jA</a></li><li><a href="https://www.youtube.com/watch?v=kWUDTZhxKZI">https://www.youtube.com/watch?v=kWUDTZhxKZI</a></li><li><a href="https://www.youtube.com/watch?v=YJZV9uiDbJI">https://www.youtube.com/watch?v=YJZV9uiDbJI</a></li></ul><p>For the EVM storage layer, we recommend the following articles:</p><ul><li><a href="https://steveng.medium.com/ethereum-virtual-machine-storage-layout-beb9a72a07e9">https://steveng.medium.com/ethereum-virtual-machine-storage-layout-beb9a72a07e9</a></li><li><a href="https://www.adrianhetman.com/unboxing-evm-storage/">https://www.adrianhetman.com/unboxing-evm-storage/</a></li><li><a href="https://medium.com/@Knownsec_Blockchain_Lab/knowsec-blockchain-lab-depth-understanding-of-evm-storage-mechanism-and-security-issues-50509acea373">https://medium.com/@Knownsec_Blockchain_Lab/knowsec-blockchain-lab-depth-understanding-of-evm-storage-mechanism-and-security-issues-50509acea373</a></li></ul><h4>Comparing how state variables are stored in OpenZeppelin upgradeable contracts in v4 versus v5</h4><p>The Ethereum Virtual Machine (EVM) stores state variables in a key-value store with 256-bit keys and 256-bit values. These state variables are stored in a compact, continuous manner. Therefore, <strong>the order</strong> of the state variables in a smart contract determines their position in storage.</p><p>OpenZeppelin <em>v4</em> upgradeable contracts primarily use the default approach for storage management.
The storage slots for state variables are determined by their sequential order in the contract. Therefore, base contracts often include a ‘storage buffer’ — defined as <strong><em>uint256[49] __gap;</em></strong> — to reserve storage slots. This allows future versions of that contract to use those slots without affecting the storage layout of child contracts. This approach to storage handling complicates contract management and upgradeability by increasing the risk of mistakes and storage conflicts, where new upgrades may unintentionally overwrite existing storage slots.</p><p>To simplify upgradeability and minimize storage conflicts, OpenZeppelin <em>v5</em> adopted the <strong>namespaced storage layout</strong>, which was officially proposed as <strong>ERC-7201</strong> on June 20, 2023. ERC-7201 recommends using pseudorandom locations derived from a namespace as the basis for new storage trees in base contracts. The root slot for storage is calculated using a <a href="https://eips.ethereum.org/EIPS/eip-7201#formula">specific formula</a>. This approach does not require changes on the EVM level and can be implemented by grouping state variables within a struct and using assembly language to access specific storage slots directly, as shown in the example:</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/602/1*ZoQ8W_gLnUAdFo2yBFizbw.png" /><figcaption>Example of namespace-based storage locations in a smart contract following <a href="https://eips.ethereum.org/EIPS/eip-7201">ERC-7201</a></figcaption></figure><p>In the example smart contract, <strong><em>x</em></strong> and <strong><em>y</em></strong> are state variables. However, they are not stored in the first and second storage slots by default.
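</p><p>A minimal sketch of the ERC-7201 root-slot derivation follows. Note that Python’s standard library does not include Keccak-256 (Ethereum’s hash function), so <em>sha3_256</em> is used below purely as a stand-in to illustrate the structure of the formula; the resulting value will not match real on-chain roots.</p>

```python
import hashlib

def erc7201_location(namespace: str) -> bytes:
    # ERC-7201 formula:
    #   keccak256(abi.encode(uint256(keccak256(id)) - 1)) & ~bytes32(uint256(0xff))
    # NOTE: sha3_256 is a stand-in for Ethereum's Keccak-256, so this value
    # will NOT match real on-chain storage roots.
    inner = int.from_bytes(hashlib.sha3_256(namespace.encode()).digest(), "big")
    encoded = (inner - 1).to_bytes(32, "big")  # abi.encode of a single uint256
    location = bytearray(hashlib.sha3_256(encoded).digest())
    location[-1] = 0x00  # & ~0xff: clear the lowest byte, as the formula requires
    return bytes(location)

slot = erc7201_location("example.main")
print("0x" + slot.hex())
```

<p>The pseudorandom root keeps the namespaced storage tree away from the sequentially allocated slots that start at 0.</p><p>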
They will be stored in the storage tree under the given root <em>MAIN_STORAGE_LOCATION</em> = <em>0x183a…b500</em>, which is calculated using the formula suggested in ERC-7201.</p><p>When a <em>v4</em> upgradeable contract is deployed on the network, the state variables of the contract are initialized, stored, and managed using the default storage pattern based on their order in the contracts. However, if the implementation contract is upgraded using <em>v5</em> contracts, a critical issue occurs. The new <em>v5</em> contracts are designed to interact with storage in a different way, using the namespaced storage layout introduced in ERC-7201. As a result, these updated contracts will not recognize or access the original storage slots used by the old contract. This mismatch in storage access leads to dangerous and incorrect states where variables may not reflect the intended values and functions may behave unpredictably, potentially resulting in loss of funds or compromised security.</p><h4>Unsafe Upgradability Example</h4><p>The following <strong><em>ExampleV4</em></strong> contract uses OpenZeppelin <em>v4</em> and imports the <strong><em>Initializable</em></strong>, <strong><em>OwnableUpgradeable</em></strong>, and <strong><em>UUPSUpgradeable</em></strong> contracts.
It sets the value of <em>exampleStateVariable</em>, the address of the owner, the address of the implementation contract, and the status of the contract to initialized during the initialization process when it is deployed as a proxy.</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/904/1*NEgIUdRNixj945TWaZH-vA.png" /><figcaption>example upgradeable contract using the v4 version</figcaption></figure><p>To check the storage keys of the state variables, we can deploy it as a <a href="https://remix-ide.readthedocs.io/en/latest/run_proxy_contracts.html">proxy using Remix IDE</a> and then debug it using the <a href="https://remix-ide.readthedocs.io/en/latest/debugger.html">Debugger</a>.</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/700/1*maeTLLFwIIC7Osk7qFxpjQ.png" /><figcaption>the storage of the contract</figcaption></figure><p>As shown in the picture, slot <em>0x000…000</em> (slot 0) has the value <em>0x01</em>, marking the contract as initialized. At slot <em>0x000…033</em> (slot 51), the address of the owner of the contract is stored. At <em>0x000…0c9</em> (slot 201), the value <strong>7</strong> of <em>exampleStateVariable</em> is stored (<strong>Note</strong>: the variables are not stored at slots 0, 1, and 2 because of the buffer gaps between the contracts). At slot <em>0x36089…82bbc</em>, the address of the implementation contract is stored (<strong>Note</strong>: the slot of the implementation contract address was already hardcoded in v4 in a similar way to the namespaced storage layout to prevent conflicts).</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/1*B4ubieJKckFVLtVfmvePVg.png" /><figcaption>mapping current slots to values in v4</figcaption></figure><p>For demonstration purposes, we will update the implementation contract to <strong>ExampleV5</strong> using OpenZeppelin <em>v5</em> contracts, without introducing new logic.
To execute the upgrade, we must import the new <em>v5</em> contracts, make minor modifications to the original contract to align with the new structure, and then deploy it. Then, we invoke the upgradeTo function on the proxy contract, providing the address of the new implementation contract as an argument.</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/904/1*I1PW5hI4syYiX4c5CfnYMQ.png" /><figcaption>Code of the upgradeable smart contract in OpenZeppelin version 5</figcaption></figure><p>After upgrading the contract, we can read the two state variables through the proxy contract to check their values.</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/904/1*5IxkINGc3L8WWWkA9XJvTA.png" /><figcaption>the values of the state variables after the upgrade</figcaption></figure><p>The value of <em>exampleStateVariable</em> is now <strong>1</strong> and not <strong>7</strong> because the upgraded contract reads the value from the first storage slot, which previously held the initialization status. Furthermore, the owner is the zero address <em>0x000…000</em> because the new implementation contract, following <em>v5</em> of OpenZeppelin, looks for the address of the owner at another storage slot. Inspecting the <em>v5</em> contract, we find that the owner value is read from slot <em>0x9016…9300</em>, which is empty.</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/904/1*GB6smNDzOj7aUVi5DT881w.png" /><figcaption>the <a href="https://github.com/OpenZeppelin/openzeppelin-contracts-upgradeable/blob/v5.0.1/contracts/access/OwnableUpgradeable.sol">OwnableUpgradeable</a> v5 contract</figcaption></figure><p>The same issue applies to the <strong><em>Initialization</em></strong> status of the contract.
It is read from slot <em>0xf0c5…6a00</em> and not from <em>0x000…000</em> as it was in <em>v4</em> contracts.</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/658/1*JUVL1OAgmGzZrif4gBf2Bg.png" /><figcaption>the <a href="https://github.com/OpenZeppelin/openzeppelin-contracts-upgradeable/blob/v5.0.1/contracts/proxy/utils/Initializable.sol">Initializable</a> v5 contract</figcaption></figure><p>This means our very simple updated contract has incorrect values, is not initialized, and has the zero address as owner.</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/1*Bc1wVdNiPRpEsYdGTZjhXA.png" /><figcaption>Comparing slots to values before and after the upgrade shows the misplacement of state variables between the old and the new v5 contract</figcaption></figure><p>At this point, anyone would be able to initialize it again, setting the owner address and taking control of the contract. Even if the original owner managed to initialize the contract before anyone else, the contract would remain in an unsafe state. To explain this, we re-initialize the upgraded contract and debug its storage using Remix.</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/904/1*_Vka33pQg0Q8EubBqlq7jg.png" /><figcaption>The storage keys and values after the upgrade and re-initialization of the contract</figcaption></figure><p>As shown in the storage snapshot above, the original (v4) storage slots at <em>0x000…0c9</em> and <em>0x000…033</em> are not empty. Therefore, new state variables added to the contract could unintentionally use these slots, creating invalid values.</p><h4>Going Forward with Existing Contracts</h4><p>For developers, finding the best way to update existing contracts is crucial. There are several options for updating deployed <em>v4</em> upgradeable contracts, including:</p><ul><li><strong>Reinitializing the Contract</strong>: This process involves clearing the storage and migrating the data within the contract.
While this approach might be feasible for simple contracts, for complex contracts with dynamic data types such as arrays, strings, and mappings, the storage becomes overly complicated, making migration within the contract impractical, if not impossible.</li><li><strong>Deploying as a Proxy</strong>: Similar to the <a href="https://sergiomartinrubio.com/articles/how-to-release-new-versions-of-smart-contracts/#social-migration">Social Migration</a> approach for non-upgradeable contracts, this method involves deploying the new contract as a new proxy with a new address, followed by migrating the data and users to this new contract. Depending on the contract type and existing data, this process can become complicated. Additionally, it requires user involvement.</li><li><strong>Updating using v4 Contracts</strong>: Likely the most advisable approach is to continue using v4 contracts. This method eliminates the need for data migration or storage restructuring. Developers need to maintain their existing OpenZeppelin v4 contracts, extending or updating only the necessary components for their specific needs when required.</li></ul><h4>Summary</h4><p>In Ethereum, upgradeable contracts are complicated due to potential storage conflicts. The introduction of a new storage layout aims to make the upgrade process easier and cleaner. However, old upgradeable <em>v4</em> contracts will face issues with the new layout, as their state variables could be dislocated, leading to an invalid contract state.
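</p><p>The dislocation described above can be sketched with a toy key-value store standing in for EVM contract storage. The slot numbers follow the ExampleV4 walkthrough; the namespaced root is a shortened placeholder, not the real 32-byte constant.</p>

```python
# Toy EVM storage: slot -> value; unset slots read as zero, as in the EVM.
storage = {}

# v4 logic writes via sequential slots (with __gap buffers in between):
storage[0] = 1          # Initializable: "initialized" flag at slot 0
storage[51] = 0xA11CE   # OwnableUpgradeable: owner address (after a __gap)
storage[201] = 7        # exampleStateVariable

# v5 logic reads via ERC-7201 namespaced roots (placeholder value below,
# NOT the real namespaced slot):
OWNABLE_ROOT = 0x9016_0000_0000

value_after_upgrade = storage.get(0, 0)             # v5 reads the first variable at slot 0
owner_after_upgrade = storage.get(OWNABLE_ROOT, 0)  # namespaced slot was never written

print(value_after_upgrade)  # 1 -- the old "initialized" flag, not 7
print(owner_after_upgrade)  # 0 -- the zero address
```

<p>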
Therefore, a native upgrade from <em>v4</em> contracts to <em>v5</em> is considered unsafe and is not recommended; strategies for updating <em>v4</em> upgradeable contracts must be selected carefully.</p><img src="https://medium.com/_/stat?event=post.clientViewed&referrerSource=full_rss&postId=e08be30efd8a" width="1" height="1" alt=""><hr><p><a href="https://medium.com/51nodes/why-upgrading-openzeppelin-smart-contracts-from-version-4-to-version-5-is-unsafe-e08be30efd8a">Why Upgrading OpenZeppelin Smart Contracts from Version 4 to Version 5 is Unsafe</a> was originally published in <a href="https://medium.com/51nodes">51nodes</a> on Medium, where people are continuing the conversation by highlighting and responding to this story.</p>]]></content:encoded>
        </item>
        <item>
            <title><![CDATA[Indexing and Querying Revoked Verifiable Credentials]]></title>
            <link>https://medium.com/51nodes/indexing-and-querying-revoked-verifiable-credentials-e229dc2781d4?source=rss----89b961a921c---4</link>
            <guid isPermaLink="false">https://medium.com/p/e229dc2781d4</guid>
            <category><![CDATA[blockchain]]></category>
            <category><![CDATA[self-sovereign-identity]]></category>
            <category><![CDATA[web3]]></category>
            <category><![CDATA[verifiable-credentials]]></category>
            <category><![CDATA[the-graph]]></category>
            <dc:creator><![CDATA[Majd]]></dc:creator>
            <pubDate>Fri, 01 Jul 2022 08:36:03 GMT</pubDate>
            <atom:updated>2022-07-01T08:35:50.188Z</atom:updated>
<content:encoded><![CDATA[<figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/1*cf_nHmZ_YMKWONZYpkYo5w.jpeg" /></figure><p>Due to its immutability and censorship resistance, the Blockchain seems to be a suitable place to manage identities and credentials. However, querying the data on the blockchain in a trusted and efficient way is one of the most challenging issues every Web3 developer has to face while developing a DApp. Therefore, this article describes a simple approach to revoking verifiable credentials and a decentralized and efficient way to index and query those revoked credentials using the <a href="https://thegraph.com/en/">Graph protocol</a>.</p><p>We consider knowledge of Self-Sovereign Identity (SSI) and rudimentary knowledge of the <a href="https://github.com/decentralized-identity/ethr-did-resolver/blob/master/doc/did-method-spec.md">Ethr DID method</a> a requirement for understanding this article.</p><h4>Ethr DID and Credentials</h4><p>The Ethr DID method uses Ethereum addresses as identifiers. By default, the DID of the issuer/subject does not require on-chain registration; it is constructed from the Ethereum address and controlled by the corresponding private key. This feature minimizes the effort of integrating identities into many on-chain use cases.</p><p>When an off-chain verifiable credential is created, the issuer entity must set the “<em>issuer</em>” property to its Ethr DID and the “<em>credentialSubject.id</em>” property to the Ethr DID of the subject.
Example:</p><pre>&quot;verifiableCredential&quot;: {<br>    &quot;@context&quot;: [..],<br>    &quot;issuer&quot;: {<br>      &quot;id&quot;: &quot;<strong><em>did:ethr:0xf1232f840f3ad7d23fcdaa84d6c66dac24efb198</em></strong>&quot;<br>    },<br>    &quot;issuanceDate&quot;: &quot;2022-07-02T12:00:00.000Z&quot;,<br>    &quot;credentialSubject&quot;: {<br>      &quot;id&quot;: &quot;<strong>did:ethr:0x435df3eda57154cf8cf7926079881f2912f54db4</strong>&quot;,<br>      &quot;degree&quot;: &quot;Bachelor Of Arts&quot;<br>    },<br>    &quot;proof&quot;: { ... }<br>  }</pre><p>As usual, the issued credential will then be used by the subject to generate a verifiable presentation. The verifier will verify the presentation and accept or reject it depending on its business logic.</p><h4>Credential Revocation</h4><p>Verifiable Credentials are similar to certificates: some of them have an expiration date and some don’t. However, in both cases, the <em>issuer</em> and the <em>subject</em> must be able to revoke them. To enable this critical requirement, we need a tamper-proof revocation data registry that is always available and cannot be censored by any of the mentioned parties. This can be achieved using the Blockchain. Such a registry can be implemented as a simple smart contract that stores a reference to the credential and a reference to the entity that revoked it.</p><p>The revocation process:</p><ul><li>Get the hash value of the issued verifiable credential using the <em>keccak256</em> hash function after converting the credential to bytes.
This value will be used as the on-chain reference to the credential</li><li>Send a transaction to the revocation smart contract to call the revoke method revoke(bytes32 digest), giving the extracted hash value as input</li><li>The entity that revoked the credential is the address of the invoker, and it can be matched with the issuer or subject of the credential based on their DIDs</li></ul><p>Another important point that will help us index those revocations: when a credential is revoked, the revocation smart contract emits an event that includes the <em>keccak256</em> hash of that credential and the address of the entity that revoked it.</p><p>In the GitHub <a href="https://github.com/51nodes/vc-revocation-graph">repository</a>, we prepared a simple project that:</p><ul><li>Deploys a revocation smart contract that stores references to the credential and the entity that revoked it.</li><li>Uses the <em>did-jwt-vc</em> and <em>ethr-did</em> libraries to issue three off-chain verifiable credentials from three different issuers and revoke them on-chain</li></ul><p>Please check the <em>./ssi-contracts</em> folder and follow the instructions in the <a href="https://github.com/51nodes/vc-revocation-graph/tree/main/ssi-contracts"><em>README</em></a> file. After following the steps, you should have the address of the revocation registry smart contract deployed on the <em>Goerli</em> network. The three revocation transactions can be found on <a href="https://goerli.etherscan.io/">Etherscan</a> using the address of the deployed contract.</p><p><strong>Why do we need to index the revoked Credentials?</strong></p><p>In a verification scenario, the verifier gets a verifiable presentation that includes <em>1 to n</em> verifiable credentials from the subject. In addition to the normal validation process of the credentials, the verifier must also be able to check the on-chain status of each credential at any time, easily and efficiently.
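</p><p>The off-chain reference from the revocation steps above can be sketched in Python. The credential below is a shortened, hypothetical example, and <em>sha3_256</em> from the standard library is used as a stand-in for Ethereum’s <em>keccak256</em>, so the digest will not match an on-chain value.</p>

```python
import hashlib
import json

# Hypothetical credential, shaped like the example earlier in the article.
credential = {
    "issuer": {"id": "did:ethr:0xf1232f840f3ad7d23fcdaa84d6c66dac24efb198"},
    "credentialSubject": {
        "id": "did:ethr:0x435df3eda57154cf8cf7926079881f2912f54db4",
        "degree": "Bachelor Of Arts",
    },
}

# Serialize deterministically, then hash the bytes. NOTE: sha3_256 is a
# stand-in for keccak256 here; real tooling must use Keccak-256.
credential_bytes = json.dumps(credential, sort_keys=True, separators=(",", ":")).encode()
digest = hashlib.sha3_256(credential_bytes).digest()

# `digest` is the bytes32 value that would be passed to revoke(bytes32 digest).
print("0x" + digest.hex())
```

<p>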
To do that, the verifier could request the status of each credential by calling the method revoked(address issuer, bytes32 digest) on the revocation registry contract. However, this approach of getting the information is limited, costly, and inefficient for many reasons, some of which are:</p><ul><li>It requires knowledge of Blockchain and Web3 technology</li><li>It is slow and limited because of latency, request limits, and node capabilities</li><li>It requires running a blockchain node or relying on a node provider</li><li>Duplicate requests for the same credentials are sent over and over again</li><li>It requires the verifier to be online all the time</li></ul><p>This is where <strong>indexing</strong> comes into play, providing an easy, fast, and simple way to iterate over and query data. Indexing is especially needed in the Blockchain area, where the underlying design makes it difficult to search for relevant data.</p><p><strong>But how can we index the Blockchain?</strong></p><p>One approach is iterating over all transactions and checking whether each transaction interacts with our revocation contract. This method is very slow and resource-intensive. The other approach, used by most developers, is to subscribe to events that are triggered by the target contract. This method is much easier, is supported by tools and libraries like web3.js, and consumes fewer resources. Now we come to the second important question: <strong>who</strong> should do the indexing? There are some possible answers:</p><p>1. <strong>The verifier itself. </strong>This is not a good solution because each verifier needs to build up and maintain its own logic and infrastructure. Furthermore, the verifier’s server that is responsible for the indexing is a single point of failure</p><p>2. <strong>An external ingestion service </strong>that indexes the data on behalf of the verifier. Such a service could be very useful.
However, it creates one of the most critical issues, which is the lack of <strong><em>TRUST</em></strong>. Verifiers need to trust the ingestion service, and relying on it eliminates the advantage of using the Blockchain</p><p>3. <strong>A decentralized indexing protocol</strong> like <strong>The Graph protocol</strong>, which enables decentralized applications to query complex data from the blockchain without having to develop and operate proprietary indexing servers or use a centralized service provider. This is the preferred solution in most use cases because:</p><ul><li>It is distributed and decentralized, removing both the single point of failure and the trust issue</li><li>It does not require advanced Web3 knowledge. Developers use only <strong><em>GraphQL</em></strong> requests to query the data from already implemented and indexed Subgraphs</li><li>The technology behind it has been designed specifically for the Blockchain. Therefore, it is quick and easy to query and filter</li><li>It is cost-efficient because indexers compete with each other to provide the best service for the lowest price</li><li>It can be used by many verifiers around the world with minimal latency because the nodes are distributed and public</li></ul><p>The creators of the Graph protocol recognized the indexing and querying problem at the end of 2017 and started working to solve it by defining and implementing a decentralized indexing protocol. The <a href="https://thegraph.com/docs/en/about/introduction/">introduction page</a> of the Graph protocol offers detailed information about the protocol, and you can also join the free and very useful courses provided by <a href="https://thegraph.academy/courses/"><em>The Graph Academy</em></a>.</p><p><strong>Develop and Deploy a Revocation Subgraph</strong></p><p>A Subgraph defines which data should be indexed from a blockchain and how it should be stored.
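</p><p>To illustrate what querying such a revocation subgraph looks like, the following sketch builds a GraphQL request payload. The entity and field names are illustrative assumptions, not the schema from the repository, and the address is the example issuer DID used earlier.</p>

```python
import json

# Illustrative GraphQL query; the entity and field names depend on the
# subgraph's schema definition and are assumptions here.
query = """
{
  revocations(where: { revokedBy: "0xf1232f840f3ad7d23fcdaa84d6c66dac24efb198" }) {
    digest
    revokedBy
    blockNumber
    timestamp
  }
}
"""

# Graph endpoints accept an HTTP POST with a JSON body containing "query".
payload = json.dumps({"query": query})
print(payload)
```

<p>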
In our simple example, the core data the verifier requires is the hash of the credential and who revoked it. To make things easier for verifiers, we could also add the timestamp and block number.</p><p>To keep the article simple, we will deploy the subgraph on a <a href="https://thegraph.com/docs/en/hosted-service/what-is-hosted-service/">hosted service</a>. The hosted service is a good and easy place to start developing Subgraphs because it is free to use and reduces deployment and testing complexity. Developers can migrate their Subgraphs to the decentralized network at any time.</p><p>The following steps to develop and deploy the Subgraph are described in detail, with all the related files, in this <a href="https://github.com/51nodes/vc-revocation-graph/tree/main/credential-revocation-subgraph-files">repository</a>:</p><ul><li>First, initialize a new sample project using the <a href="https://github.com/graphprotocol/graph-cli"><em>graph cli</em></a></li><li>Then modify some of the configurations, such as adding a startBlock and checking the address of the contract</li><li>After that comes the main part of the development, which is to define the entities and implement the <em>handle</em> method that will be executed when an event is emitted by the smart contract</li><li>In the end, build the project and deploy it on the <em>hosted service</em> using the <em>graph cli</em> and an access token</li></ul><p>When the Subgraph is deployed and synchronized, we can query the indexed data using simple <strong><em>GraphQL</em></strong> requests.</p><p>Example: open https://api.thegraph.com/subgraphs/name/<strong>&lt;github-name&gt;</strong>/credential-revocation-graph in a browser and run the following query to get the hashes of all credentials that are revoked by Issuer-A from ./ssi-contracts</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/1*7NSnQbUcFT_RW0c3sngmaw.png" /><figcaption>Query revoked credentials by
an issuer</figcaption></figure><p>The results should contain only a single revoked credential unless you modified the scripts inside ./ssi-contracts or used an already existing contract for this query. You can also compare the results with the logged data of the executed script in ./ssi-contracts.</p><h4>Summary</h4><p>In this article, we described one of many approaches to revoking issued credentials. The outlined approach benefits from the characteristics of the Ethr DID method and relies on the Blockchain as an immutable registry accessible to all parties. We also discussed arguably the best way to index and query revoked credentials without relying on centralized services. Finally, we showed how to develop a simple Subgraph that could run on the Graph protocol’s decentralized indexing <a href="https://thegraph.com/explorer/">network</a> and index the required data.</p><p><a href="https://www.51nodes.io/">51nodes GmbH</a> is a provider of crypto-economy solutions based in Stuttgart, Germany.</p><p>51nodes supports companies and other organizations in realizing their Blockchain projects. 51nodes offers technical consulting and implementation with a focus on smart contracts, decentralized apps (DApps), integration of blockchain with industry applications, and tokenization of assets.</p><img src="https://medium.com/_/stat?event=post.clientViewed&referrerSource=full_rss&postId=e229dc2781d4" width="1" height="1" alt=""><hr><p><a href="https://medium.com/51nodes/indexing-and-querying-revoked-verifiable-credentials-e229dc2781d4">Indexing and Querying Revoked Verifiable Credentials</a> was originally published in <a href="https://medium.com/51nodes">51nodes</a> on Medium, where people are continuing the conversation by highlighting and responding to this story.</p>]]></content:encoded>
        </item>
        <item>
            <title><![CDATA[Improving Scalability and Privacy of Blockchains: 2022 Update on Zero-Knowledge Proofs]]></title>
            <link>https://medium.com/51nodes/improving-scalability-and-privacy-of-blockchains-2022-update-on-zero-knowledge-proofs-2d90615f0dd?source=rss----89b961a921c---4</link>
            <guid isPermaLink="false">https://medium.com/p/2d90615f0dd</guid>
            <category><![CDATA[scalability]]></category>
            <category><![CDATA[web3]]></category>
            <category><![CDATA[blockchain]]></category>
            <category><![CDATA[zero-knowledge-proofs]]></category>
            <category><![CDATA[privacy]]></category>
            <dc:creator><![CDATA[Julian Voelkel]]></dc:creator>
            <pubDate>Tue, 17 May 2022 11:34:08 GMT</pubDate>
            <atom:updated>2022-05-17T11:34:08.178Z</atom:updated>
<content:encoded><![CDATA[<figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/1*qRkIBiGxEcgEpM46BzBLQA.png" /></figure><p>This article serves as an overview of current <strong>Zero-Knowledge Proof</strong> (<strong>ZKP</strong>) implementations in the crypto space and discusses what to expect from ZKPs as an exciting cryptographic method in the upcoming months and years. Specifically, we provide a snapshot of some of the most interesting projects and how ZKPs improve important properties of today’s blockchains’ <strong>infrastructure</strong>, <strong>tooling</strong>, and <strong>applications.</strong> In an <a href="https://medium.com/51nodes/selectively-disclosed-verifiable-credentials-79a236b81ee2">earlier article</a> on ZKPs written at the end of 2020, we took a closer look at the core principles of Zero-Knowledge Proofs, their usage in Verifiable Credentials, and the state of available implementations back then. Now, in 2022, ZKPs are increasingly used as a solution for some of the blockchains’ scalability issues. We have also started to see improvements in their usage for privacy enhancements, though the topic of privacy seems to be at an earlier stage than scalability.</p><h3>Primer on ZKPs</h3><p>Essentially, a ZKP is all about trust between multiple parties. In simpler terms, this means one party is able to prove to a second party that it holds a certain piece of information without disclosing the relevant information to the receiving party. For instance, proving to the cashier that you are older than 21 without revealing your actual age. To get a better idea of ZKPs’ core principles, please refer to <a href="https://www.youtube.com/watch?v=fOGdb1CTu5c">this ELI5 video</a>.</p><h3><strong>ZKP Infrastructure and Tooling</strong></h3><p>ZKP solutions discussed in this section focus on improving whole blockchain networks or on providing specialized tools for those.
For blockchain networks, ZKPs can help with keeping the ledger size consistent or with providing an overall more efficient approach to <strong>Distributed Ledger Technology</strong> (<strong>DLT</strong>). SDKs and tools can in return be used for writing ZKP logic and for compiling this logic into <a href="https://crypto.stackexchange.com/questions/66037/what-is-the-role-of-a-circuit-in-zk-snarks">ZKP circuits</a>.</p><h4>Mina Protocol</h4><p><a href="https://minaprotocol.com/">Mina Protocol</a> is a layer 1 protocol that takes blockchain infrastructure to a new level by basing blockchain interactions on ZKPs. While other scaling solutions aim to decrease transaction size and cost, Mina has a more holistic approach. <a href="https://docs.minaprotocol.com/en">Mina is a succinct blockchain, with a constant size of about</a> 22KB.</p><p>With Mina, ZKPs are directly integrated into smart contracts and so-called zkApps that can be built using those ZKP-enabled smart contracts. zkApps manage their state off-chain (mostly synchronously) and then store a proof of their state on-chain once the computation is finished. Thus, Mina and zkApps allow the building of highly efficient applications. Fortunately, zkApps are built using Typescript not requiring learning a special-purpose programming language like Solidity with Ethereum.</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/848/0*J_ax3VY8KOa_QQMB.jpg" /><figcaption><a href="https://docs.minaprotocol.com/en/zkapps">zkApp Architecture [2]</a></figcaption></figure><h4><strong>StarkNet/StarkEx</strong></h4><p><a href="https://starkware.co/starknet/">StarkNet</a> is a layer 2 network for Ethereum using the “StarkEx Protocol” for providing faster and more cost-effective transactions and increased privacy. 
Instead of directly sending transactions to layer 1 Ethereum, StarkEx uses <a href="https://ethereum.org/en/developers/docs/scaling/zk-rollups/">zkRollups</a> for creating proofs for the transaction on layer 2 and then stores those proofs in batches on layer 1.</p><p>The StarkEx Protocol is defined in five different components, i.e., <a href="https://docs.starkware.co/starkex-v4/overview"><strong>Application</strong>, <strong>StarkEx</strong> <strong>Service, SHARP, Stark Verifier, </strong>and<strong> StarkEx Contracts</strong></a><strong> </strong>of which some are acting on-chain and others off-chain.</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/0*4kYrQj2BWBGzOfri" /><figcaption><a href="https://docs.starkware.co/starkex-v4/overview">StarkEx (high-level)</a> architecture</figcaption></figure><p>The high-level architecture above shows how users will use an Ethereum-based application that sends transactions to the StarkEx Service. This service uses StarkWare’s <a href="https://www.cairo-lang.org/docs/">Cairo language for creating zero-knowledge programs</a> that prove the application-relevant state. After a proof has been computed off-chain it is stored on-chain using StarkEx Contracts. The Stark Verifier can then be used to verify the proofs by either checking the state of the StarkEx Contract or by using the SHARP service in case one needs to prove the states of multiple different applications at once.</p><h4><a href="https://dusk.network/">DuskNet</a></h4><p><a href="https://dusk.network/">DuskNet</a> is a business-oriented ZKP-based blockchain with fast transactions for building <a href="https://research.binance.com/en/projects/dusk-network">privacy-preserving smart contracts and confidential tokens for the financial sector</a> that respect confidentiality agreements. 
Driving factors for DuskNet are mostly privacy concerns, compliance with GDPR rules, and companies’ essential need to keep information secure.</p><p>To provide these properties in a blockchain network, <a href="https://dusk.network/news/zero-knowledge-plonk-demo">DuskNet uses PLONKs</a>, which are in general faster than <a href="https://crypto.stanford.edu/bulletproofs/">bulletproofs</a> but require a trusted setup. For more information on PLONKs, I can recommend <a href="https://vitalik.ca/general/2019/09/22/plonk.html">this blog post by Vitalik</a>.</p><p>One interesting use case of DuskNet is the <a href="https://dusk.network/cases/security-tokens">XSC Security Token Standard</a>, which provides permission management for an asset&#39;s lifecycle. The ledger records all transactions, but a token holder&#39;s access rights are not lost if a transaction fails or the access keys are lost. This property is an important requirement of securities law.</p><h4>Nightfall 3</h4><p><a href="https://github.com/EYBlockchain/nightfall_3">EY’s Nightfall 3</a> provides a secure and privacy-preserving solution for transacting ERC-20 tokens as well as ERC-721 tokens at low cost. <a href="https://www.ey.com/en_gl/news/2021/07/ey-contributes-a-zero-knowledge-proof-layer-2-protocol-into-the-public-domain-to-help-address-increasing-transaction-costs-on-ethereum-blockchain">The successor of EY’s original Nightfall, Nightfall 3 aims to improve the performance of such transactions even further while simplifying the developer experience</a>. In addition, Nightfall 3 provides the ability to transfer ERC-1155 tokens.</p><p>Nightfall’s performance improvements are achieved by combining the existing ZK solution with <a href="https://ethereum.org/en/developers/docs/scaling/optimistic-rollups/">optimistic rollups</a>, creating a ZK-optimistic rollup hybrid. 
In this scheme, ZK transactions are grouped and then sent to the ledger as an optimistic rollup.</p><h4>Aleo</h4><p><a href="https://www.aleo.org/post/the-future-of-zero-knowledge-with-aleo">Aleo</a> promises to be the <a href="https://www.aleo.org/post/the-future-of-zero-knowledge-with-aleo#:~:text=Aleo%20is%20the%20first%20decentralized,%2C%20consumer%2C%20and%20enterprise%20needs.">“first decentralized, open source platform to enable both private and programmable applications”</a>. To illustrate the benefits, Aleo uses the example of DEXs like Uniswap. A DEX on Aleo would keep the number of tokens you own undisclosed and hide where you got those tokens in the first place. Furthermore, all of this privacy is enabled without removing the ability to integrate with data from public blockchains.</p><p>Aleo comes with a variety of tools that help new developers build applications using Aleo.</p><p><strong>Leo</strong> — Aleo’s programming language, inspired by JavaScript, Rust, and Scala, for writing ZKP applications.</p><p><strong>Aleo Studio</strong> — Aleo’s IDE for writing applications with Leo.</p><p><strong>Aleo Package Manager</strong> — For publishing the packages and applications written with Aleo Studio.</p><p><strong>snarkOS</strong> — A decentralized OS for running Aleo. snarkOS contains important logical components for writing ZKP applications and for proving states publicly.</p><h3>ZKP Applications</h3><p>Once you have decided on a certain infrastructure, be it a ZKP-based blockchain network or a non-ZKP network, you can build your own application. 
This is where ZKPs get interesting for the common user as the benefits are becoming more and more obvious at this level.</p><h4>Hyperledger Aries</h4><p><a href="https://www.hyperledger.org/use/aries">Aries</a> describes itself as <a href="https://www.hyperledger.org/use/aries#:~:text=Hyperledger%20Aries%20provides%20a%20shared,peer%2Dto%2Dpeer%20interactions">“a shared, reusable, interoperable tool kit designed for initiatives and solutions focused on creating, transmitting, and storing verifiable digital credentials.”</a> We discussed Hyperledger Aries and their use of Hyperledger Indy’s “Anoncreds” in our <a href="https://medium.com/51nodes/selectively-disclosed-verifiable-credentials-79a236b81ee2">earlier article</a>. Interesting to note at this point is that we can see the first integrations of <a href="https://w3c-ccg.github.io/ldp-bbs2020/">BBS+ signatures</a> (signature-based ZKPs) that enable selective disclosure.</p><p>Combined with the <a href="https://www.w3.org/TR/vc-data-model/#json-ld">W3Cs JSON-LD credentials</a>, Hyperledger Aries could be one of the first movers in providing the ability to use ZKP-enabled W3C credentials in messaging, resulting in a more private and secure exchange of information between users or even devices.</p><h4>Iden3</h4><p><a href="https://docs.iden3.io/">Iden3</a> is an open-source project aiming to provide a new and decentralized solution for digital identities. Based on ZKPs, Iden3 powers really neat use cases like <a href="https://iden3-docs.readthedocs.io/en/latest/_downloads/9cdef79316906f09753cbca69c965e92/iden3_ethcc_presentation.pdf"><strong>anonymous logins</strong> and <strong>reputation proofs</strong></a> while not requiring users to disclose their actual identity. 
Although it is in the early stages of development, the use cases of an open-source and community-driven digital identity solution for end users seem manifold.</p><p>Iden3 developed its own Circom language and the <a href="https://github.com/iden3/circom">Circom 2.0 compiler</a>, which allow creating <a href="https://vitalik.ca/general/2021/01/26/snarks.html">ZK-SNARKs</a> (another type of ZKP) at a more abstract, less mathematical level. This way, Circom allows for easy creation and integration of ZKPs. The figure below shows how <a href="https://docs.circom.io/">Circom</a> and SnarkJS can be used in combination.</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/0*qEDfIwBa8xBWzcK_" /><figcaption><a href="https://docs.circom.io/">Circom &amp; SnarkJS</a></figcaption></figure><h4><strong>Loopring</strong></h4><p><a href="https://loopring.org/#/about">Loopring is an open-source decentralized exchange using an Automated Market Maker (AMM) and zkRollups to provide a fast and cheap method for token exchanges and payments</a>. Loopring and similar applications could very well introduce a significant improvement for multiple fields in the crypto industry like Decentralized Finance (DeFi) and NFT trading.</p><p>One of Loopring’s <a href="https://loopring.org/#/post/gamestop-nft-marketplace-powered-by-loopring-l2">latest announcements includes the hosting of GameStop’s upcoming NFT marketplace</a>. This might introduce the advantages of ZKPs to a broader audience.</p><h4>dydx</h4><p><a href="https://dydx.exchange/">dydx</a> is another <a href="https://cryptobriefing.com/beginners-guide-perpetual-trading-dydx/">decentralized exchange, mostly focusing on perpetual trading</a>. While dydx was initially launched on Ethereum’s layer 1, it <a href="https://dydx.exchange/blog/public">introduced trading as a layer 2 solution for Ethereum in mid-2021</a> through zkRollups. The solution used for enabling zkRollups is StarkWare’s StarkEx (described above). 
Using the layer 2 solution instead of the layer 1 solution provides various improvements to trading, <a href="https://integral.dydx.exchange/scaling-with-starkware/">like instant off-chain settlement</a>. For a more detailed comparison of the benefits of the layer 2 approach, please see <a href="https://academy.shrimpy.io/post/what-is-dydx-explaining-the-popular-crypto-derivatives-dex">this blog</a>.</p><h4>Sorare</h4><p><a href="https://sorare.com/">Sorare</a> uses ZKPs in an app for building your own soccer teams and trading your player cards. Like dydx, Sorare introduced its scaling solution of choice in the middle of 2021 by integrating the StarkEx Protocol. While there&#39;s nothing too special about Sorare, I think it is a great example of where the crypto (consumer) space is heading.</p><p><a href="https://medium.com/sorare/scaling-sorare-on-ethereum-with-starkware-ccb1b0338ad3">In a blog entry, Pierre Duperrin explained the reasoning behind Sorare’s decision to use Ethereum layer 2 scaling solutions that use ZKPs instead of other scaling solutions</a>. The key argument for ZKPs seems to be the <a href="https://www.gemini.com/cryptopedia/blockchain-trilemma-decentralization-scalability-definition">scalability trilemma</a>. While there are other scaling solutions out there, ZKP-based solutions provide the advantage of not compromising on security or decentralization while increasing scalability.</p><h3>Conclusion</h3><p>ZKPs help to improve the scalability and privacy of a lot of existing solutions in the crypto space — for instance, decentralized exchanges. Furthermore, ZKPs can be a key enabler for Web 3.0 by improving privacy — not just on a transaction level but also on a network level. 
Users will be able to interact almost anonymously while still being able to provide the necessary information to participate in networks and applications.</p><p>Tools like Hyperledger Aries and networks like DuskNet will further enable businesses to utilize ZKP technology in privacy-preserving solutions. I can also think of scenarios where businesses will be able to integrate private solutions with public solutions. For example, businesses could use ZKP-enabled digital identities to participate in open markets privately. Personally, I am confident that crypto’s public sector (especially DeFi) will experience a lot of growth in 2022 and early 2023 through ZKP-based scalability improvements. This can already be seen in examples like Sorare or Loopring but will only accelerate once more applications integrate ZKPs.</p><p><a href="https://www.51nodes.io/">51nodes GmbH</a> is a provider of crypto-economy solutions based in Stuttgart, Germany.</p><p>51nodes supports companies and other organizations in realizing their Blockchain projects. 51nodes offers technical consulting and implementation with a focus on smart contracts, decentralized apps (DApps), integration of blockchain with industry applications, and tokenization of assets.</p><hr><p><a href="https://medium.com/51nodes/improving-scalability-and-privacy-of-blockchains-2022-update-on-zero-knowledge-proofs-2d90615f0dd">Improving Scalability and Privacy of Blockchains: 2022 Update on Zero-Knowledge Proofs</a> was originally published in <a href="https://medium.com/51nodes">51nodes</a> on Medium, where people are continuing the conversation by highlighting and responding to this story.</p>]]></content:encoded>
        </item>
        <item>
            <title><![CDATA[Getting Started with Blockchain Development: Pros and Cons of Tatum vs. Moralis]]></title>
            <link>https://medium.com/51nodes/getting-started-with-blockchain-development-pros-and-cons-of-tatum-vs-moralis-91dd5acea624?source=rss----89b961a921c---4</link>
            <guid isPermaLink="false">https://medium.com/p/91dd5acea624</guid>
            <category><![CDATA[dapps]]></category>
            <category><![CDATA[blockchain-development]]></category>
            <category><![CDATA[nft]]></category>
            <category><![CDATA[api]]></category>
            <category><![CDATA[sdk]]></category>
            <dc:creator><![CDATA[Daniel Niemczyk]]></dc:creator>
            <pubDate>Fri, 25 Feb 2022 07:41:11 GMT</pubDate>
            <atom:updated>2022-02-25T07:41:09.556Z</atom:updated>
            <content:encoded><![CDATA[<p>Blockchain development and the development of decentralized apps (Dapps) have changed over the past 5 years. When I started with crypto in 2017, Ethereum was still young, and Solidity programming became the new thing a lot of people wanted to learn. The only problem back then was that little information was available to get started, whereas today we can build on thousands of YouTube videos, courses like <a href="https://cryptozombies.io/">Cryptozombies</a>, or <a href="https://openzeppelin.com/contracts/">Openzeppelin</a> standards.</p><p>Today, one doesn’t even have to know anything about Solidity programming or blockchains at all to create a product or service that is built on top of a blockchain. Nowadays there are services like Tatum and Moralis, providing almost everything you need to instantly start developing your product or service instead of managing all the “behind the scenes” things on your own.</p><p>This post explains how to get started with blockchain development using the platforms of Tatum and Moralis. I discuss the pros and cons and what you can build with them today.</p><h3>Introduction</h3><p>The two main features of Moralis and Tatum are a <strong>Web API</strong> and an <strong>SDK</strong> for application development based on a public blockchain. With the Web API we can retrieve all the necessary data from a blockchain, and with the SDK we can easily use code to interact with the blockchain to make a transaction or call a function of a smart contract. The blockchain data retrieved by the Web API can be accessed with the SDK as well. Without such a service, a developer must run their own full node to know the blockchain’s state in order to make a correct transaction based on the current state.</p><p>Moralis and Tatum work on multiple public blockchains like Ethereum, Binance Smart Chain, and Polygon. Thus, one can retrieve data from all these blockchains using a single Web API. 
While <a href="https://docs.moralis.io/moralis-server/web3-sdk/intro#supported-chains">Moralis</a> “only” covers 6 blockchains, the <a href="https://docs.tatum.io/supported-blockchains">Tatum API</a> covers around 40. Both APIs can be used to access the main network and a test network of a blockchain like for example Rinkeby or Ropsten for Ethereum.</p><p>Behind the scenes, services like Tatum and Moralis run a full node of every blockchain they support. Having access to all blockchains’ data, they are able to aggregate it so that one can access all users’ balances, token balances or NFTs in one place. The API can then be used within the respective SDK to help developers creating their product or service. The following picture shows the general architecture of Moralis.</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/1*lJDeNv9fB64B_tNv7d7bRg.png" /><figcaption><a href="https://moralis.io/">The architecture of Moralis</a></figcaption></figure><h3>Web API comparison of Moralis and Tatum</h3><p>Using a Web API to get all necessary blockchain data saves developers a lot of time and money. Retrieving blockchain data via Tatum and Moralis is free of charge. Both APIs are able to:</p><ul><li>Get all Transactions an address has made.</li><li>Get the current block of any chain.</li><li>Get the native balance or balance of any ERC-20 token.</li><li>Get all NFT information of a user or a contract.</li><li>Get and write data from and to <a href="https://ipfs.io/">IPFS</a>.</li></ul><p>Both, Moralis and Tatum cover all “basic” API requests. 
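</p><p>As an illustration, here is roughly what such a “basic” request could look like from Node.js. The endpoint path and the <em>X-API-Key</em> header follow Moralis’ v2 Web API documentation at the time of writing, but treat them as assumptions and double-check the current docs before relying on them.</p>

```javascript
// Sketch: fetching an address's native balance via the Moralis Web API.
// The URL scheme and "X-API-Key" header are assumptions based on the v2 docs.
function buildBalanceRequest(address, chain, apiKey) {
  return {
    url: `https://deep-index.moralis.io/api/v2/${address}/balance?chain=${chain}`,
    options: {
      method: "GET",
      headers: { "X-API-Key": apiKey, accept: "application/json" },
    },
  };
}

const req = buildBalanceRequest(
  "0xd8dA6BF26964aF9D7eEd9e03E53415D37aA96045", // any address works
  "eth",
  "YOUR_API_KEY"
);
console.log(req.url);

// To actually send it (Node 18+ ships a global fetch):
// const res = await fetch(req.url, req.options);
// const { balance } = await res.json(); // native balance, in wei
```

<p>Tatum’s endpoints are called in the same way conceptually: one HTTPS request with an API key header per query.</p><p>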
Differences are listed in the following.</p><h4><strong>Requests specific to Moralis’ API</strong></h4><ul><li>Get all events emitted by a specific smart contract.</li><li>Get liquidity pair reserves for a Uniswap-based exchange (DeFi).</li><li>Resolve an <a href="https://unstoppabledomains.com/learn">unstoppable domain</a> and return the address.</li></ul><p>Good to know: the Moralis Web API differentiates between all Ethereum Virtual Machine (EVM) blockchains like Ethereum, Binance Smart Chain, Avalanche, and Polygon, and the newly created Solana Web API for Solana. For Solana, one can “only” retrieve users’ current balances, transactions, NFTs, and tokens at the moment.</p><h4><strong>Requests specific to Tatum’s API</strong></h4><ul><li>Generate a wallet and get the private key, the mnemonics, and the address.</li><li>Deploy a standard ERC-20, ERC-721, and ERC-1155 contract.</li><li>Interact with virtual accounts and the off-chain ledger — I will cover this later in more detail.</li></ul><h4>How to use the APIs</h4><p>After you register for Moralis, you can see the Web3 API button on the left side of the screen. Once clicked, you can use the whole Web API within the browser (see figure below). The <em>x-api-key</em> and the whole <em>curl</em> request are shown when you execute a request. Using the API from outside the browser, for example with Postman or curl, is also possible.</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/1*mxfiRxsoH-JZOMauAUvJew.png" /><figcaption><a href="https://admin.moralis.io/web3Api">The Moralis Web3 API web interface</a></figcaption></figure><p>Using Tatum’s API works a bit differently. After registration, you will be redirected to your dashboard. Here you can find your API keys — one for all main networks and one for all test networks.</p><p>To see the whole API, you have to go back to the homepage to find the <a href="https://tatum.io/apidoc.php">documentation of the API</a>. 
Sadly, it is not possible to use the API within the browser. However, Tatum provides request code examples in 13 programming languages (see below).</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/1*1X0UYGycnirNtVvZaAEaRQ.png" /><figcaption><a href="https://tatum.io/apidoc.php#tag/Blockchain-Polygon-(Matic)">Tatum API example to create a new wallet on Polygon</a></figcaption></figure><h4>Pricing</h4><p>Both APIs have a very good free version that can be used immediately after registration. The following figures show the number of requests possible with the APIs, depending on the plan you choose.</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/1*qi7T-vUJVWh7sSyeAYNgXg.jpeg" /><figcaption><a href="https://moralis.io/pricing/">Moralis pricing for number of requests</a></figcaption></figure><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/1*iiZkoFqoxe4uGSA6yFjieQ.png" /><figcaption><a href="https://tatum.io/pricing.html">Tatum pricing for number of requests</a></figcaption></figure><h4>Web API conclusion</h4><p>Most importantly: both APIs, from Moralis and from Tatum, work great and as expected, and I can recommend both to you. Moralis supports fewer blockchains and features than Tatum but, on the other hand, looks more modern. With both APIs you have access to all “basic” requests, and if you really need something advanced, you should look at both APIs and compare whether one of them already deals with your problem. If so, just choose it and try it out.</p><h3><strong>How to use Moralis’ blockchain SDK</strong></h3><p><a href="https://moralis.io/">Moralis</a> describes itself as “the ultimate Web3 Development Platform” and is often said to be “the Firebase of crypto”. For help with blockchain software development, a JavaScript SDK is available. 
You can either use it as a plain npm module named “moralis” or use the “react-moralis” module to use the SDK within your React project. All the following paragraphs will focus on EVM blockchains, but as already mentioned, Moralis also supports Solana.</p><h4>Moralis server</h4><p>Starting a new project with Moralis requires you to create a Moralis Server within your Dashboard.</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/1*6FtObUxoZlBdoNgiyeAemg.png" /><figcaption><a href="https://admin.moralis.io/servers">Add a new Moralis Server within your Dashboard</a></figcaption></figure><p>Creating a Moralis server only requires selecting a name, your region, main network or test network, and which blockchains should be included and listened to. In the free version of Moralis, you can run 3 of these servers at the same time, but you must revoke them manually after 7 days.</p><p>After you have created your Moralis server, you can open either its details or its dashboard. Opening the details will show you your <strong>Server URL</strong> and your <strong>Application ID</strong>. These two keys have to be imported into your SDK to use it correctly. In the dashboard you can see all the additional databases Moralis is creating for you.</p><p>In the following paragraphs I will show use cases of these additional databases and cover some of the more advanced features of your Moralis server that are helpful for Dapp development.</p><h4><strong>Authenticating users</strong></h4><p>The Moralis SDK provides a simple way to track all users of a Dapp. Users can either register themselves with a username and password or do it the web3 way by using Metamask. 
The following code example shows how to connect your Dapp with your Moralis server and how to authenticate users the web3 way.</p><pre>const Moralis = require(&#39;moralis&#39;);</pre><pre>const serverUrl = &quot;YOUR_SERVER_URL&quot;;<br>const appId = &quot;YOUR_APP_ID&quot;;</pre><pre>Moralis.start({ serverUrl, appId });</pre><pre>Moralis.authenticate();</pre><p>The authenticate function will open the user’s Metamask window and request a signature from the user via a click on the “Sign” button. Note that there is no gas fee involved in signing because it is not a transaction.</p><p>Now, here’s the cool part. Once a new user logs in, you can observe a new row added to your “User” table in the Moralis dashboard with the address of the user. Furthermore, the Moralis server creates additional tables like, for example, “EthTransactions”, where you can see all transactions made by all users on the blockchain networks you selected when creating the server.</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/1*dDBgoaWK5FEHIESfe-QOGg.png" /><figcaption>Moralis Server Dashboard — Get all transactions made by all users</figcaption></figure><p>Because your tables can get quite big fast, your whole database is capable of managing queries and filters in order to only get and show the desired data.</p><h4>Listening to events</h4><p>One feature only Moralis provides is listening to events of a specific contract. To do this, open your server details and go to the “sync” tab. Now choose to “follow” either a specific address or a specific event. If you choose to follow an event, you only have to paste its name with parameters, its ABI as JSON, and the contract address. After creation, the data will synchronize automatically, and the result can be found within a newly generated table in your server. 
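</p><p>For example, to follow the standard ERC-20 <em>Transfer</em> event, the values to paste would look roughly like the snippet below. The ABI fragment follows the standard Solidity ABI format; the exact field labels in the Moralis dashboard may differ, so treat this as an illustration.</p>

```javascript
// Standard ERC-20 Transfer event, as defined by the Solidity ABI spec.
const transferAbi = {
  anonymous: false,
  inputs: [
    { indexed: true,  name: "from",  type: "address" },
    { indexed: true,  name: "to",    type: "address" },
    { indexed: false, name: "value", type: "uint256" },
  ],
  name: "Transfer",
  type: "event",
};

// The "name with parameters" is the event signature without parameter names:
const topic = `${transferAbi.name}(${transferAbi.inputs.map((i) => i.type).join(",")})`;
console.log(topic); // "Transfer(address,address,uint256)"
```

<p>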
A more detailed tutorial can be found <a href="https://moralis.io/sync-and-index-smart-contract-events-full-guide/">here</a>.</p><h4>Cloud-based queries</h4><p>Complex database queries can be run on the Moralis server in the cloud. For example, if you have a table with movie ratings with 1–5 stars and you would like to know the average film rating, you normally would first get all ratings and then calculate the average. To directly get the answer, just deploy and call the query as a JavaScript function within your Moralis server. For more information about the correct syntax check out Moralis’ documentation <a href="https://docs.moralis.io/moralis-server/cloud-code">here</a>.</p><h4>Getting data with Moralis’ SDK</h4><p>Let’s take a closer look at the actual JavaScript SDK and how to retrieve blockchain data. Moralis differentiates between their “normal” API for all EVM blockchains and their Solana API. The following paragraphs refer to the EVM blockchain API.</p><p>To get some data, there are 4 entry points:</p><ul><li>With <em>Moralis.Web3API.native.functionName(options) </em>we can access the Web API data within our SDK. Click <a href="https://docs.moralis.io/moralis-server/web3-sdk">here </a>for more information.</li><li>With the <em>useMoralisQuery(‘tablename’)</em> function we can get access to all the data within a specific table of our Moralis server. With the React SDK it should also be possible to enable live updates whenever your table you reference to updates. However, this didn’t work for me while testing.</li><li>With the <em>Moralis.executeFunction(‘options’)</em> function we can interact with a smart contract directly in order to get the state of a public variable or the result of a read only function. 
Click <a href="https://docs.moralis.io/moralis-server/web3/web3">here </a>for more information.</li><li>With the <em>Moralis.Cloud.run(‘function_name’)</em> function we can execute our cloud functions and get the result back.</li></ul><p>Using these methods, you can retrieve all blockchain data required for building your Dapp. Of course, all these possibilities can be combined within your Dapp. For example, many requests like getting the specific ERC-20 balance of an address can be resolved using any of these four possibilities.</p><h4>Creating transactions with Moralis’ SDK</h4><p>Knowing how to get blockchain data, let’s look at how to make a transaction with the Moralis SDK. The general approach of doing a transaction with the Moralis SDK is by conducting the actual transaction via Metamask. This way, the user and the developer don’t have to deal with any private keys, because Metamask handles them. Metamask provides the SDK with the information, which network the user is currently connected to, so we don’t have to deal with any blockchain network code. The SDK creates a correct transaction with the currently connected network on its own. Thus, the user only has to click on a button within the Dapp and Metamask to accept the transaction.</p><p>Making a transaction with the Moralis SDK is a one liner of code. Only the parameters of a transaction, e.g., the transaction amount, the receiver and the contract address have to be passed as JSON parameter to the <em>Moralis.transfer(options)</em> function. Note the different parameter settings in the code examples below to create a transaction for respectively sending a native coin, an ERC-20 token or an NFT. 
For more information click <a href="https://docs.moralis.io/moralis-server/sending-assets">here</a>.</p><pre>1) Transfer native coin options<br>const options = <br>{type: &quot;native&quot;, <br> amount: Moralis.Units.ETH(&quot;0.5&quot;), <br> receiver: &quot;0x..&quot;<br>}</pre><pre>2) Transfer ERC20 Token options<br>const options = <br> {type: &quot;erc20&quot;,<br> amount: Moralis.Units.Token(&quot;0.5&quot;, &quot;18&quot;),<br> receiver: &quot;0x..&quot;,<br> contractAddress: &quot;0x..&quot;<br>}</pre><pre>3) Transfer ERC721 Token options<br>const options = <br>{type: &quot;erc721&quot;,<br> receiver: &quot;0x..&quot;,<br> contractAddress: &quot;0x..&quot;,<br> tokenId: 1<br>}</pre><pre>// Call the Moralis.transfer function with options as parameter<br>const transaction = await Moralis.transfer(options);</pre><pre>// You can wait until its confirmation and do sth. with the result<br>const result = await transaction.wait();</pre><h4>Interacting with smart contracts</h4><p>For interacting with a function of a smart contract, you have to create an options variable as JSON string that includes the contract address, the function name, the <a href="https://www.quicknode.com/guides/solidity/what-is-an-abi">ABI </a>of the function, and the parameters. Now call the <em>Moralis.execute(options)</em> function with the options string as parameter. That’s everything it takes to create a transaction with the Moralis SDK as shown in the following <a href="https://docs.moralis.io/moralis-server/web3/web3">code example</a>.</p><pre>let options = {<br> contractAddress: &#39;0x...&#39;,<br> functionName: &#39;FUNCTION_NAME&#39;,<br> abi: [ FUNCTION_ABI_AS_JSON ],<br> params: {<br>     parameter1: value1,<br>    }<br>};</pre><pre>const tx = await Moralis.executeFunction(options);<br>const result = await tx.wait();</pre><h4>Moralis conclusion</h4><p>Moralis appears modern and helps building a Dapp. 
The Web API works great, and using the SDK, handling user authentication, getting data from any blockchain, or creating valid transactions becomes a one-liner. Also, the documentation is very well written and detailed. They also have a very active <a href="https://www.youtube.com/c/MoralisWeb3">YouTube channel</a>.</p><p>A limitation of Moralis is that it only deals with reading from and writing to a blockchain. Creating your own wallet or deploying a new smart contract is not possible. Therefore, scripting can get complicated, because to send a transaction you have to deal with Metamask in the first place. But if you are only looking for an SDK that helps you build the JavaScript/backend part of your Dapp, then Moralis is a good choice.</p><h3><strong>How to use Tatum’s blockchain SDK</strong></h3><p><a href="https://tatum.io">Tatum</a> describes itself as “The ultimate blockchain development platform”, offering a unified framework for 40+ blockchain protocols to reduce the complexity of blockchain development. While Moralis specializes in building Dapps, Tatum is a more general tool with a lot of the methods needed to deal with a blockchain. Methods for creating and managing your own wallets and private keys, deploying smart contracts, and interacting with them are included in the Tatum JavaScript SDK and its API.</p><p>Maybe because of Tatum’s broad scope, it feels a little bit unfinished. They feature a lot of services and endpoints instead of specializing in one specific area. Their Web API documentation is so big that it takes more than 2–3 seconds to load and render the whole page. Also, some things are missing completely, and others are not yet properly explained in the documentation. However, Tatum told me that they are already working on it. 
As in the Moralis section of this article, I will focus in the following paragraphs on EVM blockchains only, since the Tatum API is most developed for this kind of blockchain.</p><p>In the following paragraphs, I will show you the most important things you can already put to use today with Tatum.</p><h4><strong>Initializing Tatum’s SDK</strong></h4><p>Tatum’s SDK is likewise written for JavaScript and can be downloaded as an npm module with the name ‘@tatumio/tatum’. Once installed, you can import it within your project and use one of the hundreds of functions Tatum provides. Calling any of these functions is a one-liner. But compared to Moralis, you don’t pass all the options as a single JSON string; rather, each parameter is passed individually.</p><h4>Generating a wallet</h4><p>By creating a wallet from scratch, we can script any transaction we want because we know the private key and can sign transactions on our own. To create a wallet with the Tatum SDK you need to import the <em>generateWallet </em>function and the <em>Currency </em>Enum. Once you call the function with the desired cryptocurrency, such as Bitcoin or Ethereum, you will get the <em>24 mnemonic words</em> and the <em>xpub </em>for your newly generated <a href="https://www.investopedia.com/terms/h/hd-wallet-hierarchical-deterministic-wallet.asp">HD-wallet</a>.</p><p>With these two variables, Tatum can create new addresses with the corresponding private keys for your new wallet using the <em>generateAddressFromXPub </em>and <em>generatePrivateKeyFromMnemonic </em>functions, which you have to import as well.
<a href="https://docs.tatum.io/your-first-app">The following code</a> demonstrates the whole process.</p><pre>1) const {generateWallet, generateAddressFromXPub, generatePrivateKeyFromMnemonic, Currency} = require(&quot;@tatumio/tatum&quot;);</pre><pre>2) const btcWallet = await generateWallet(Currency.BTC, false);</pre><pre>3) const btcAddress = generateAddressFromXPub(Currency.BTC, false, &quot;YOUR_XPUB&quot;, &quot;HD_WALLET_INDEX&quot;);</pre><pre>4) const btcPrivateKey = await generatePrivateKeyFromMnemonic(Currency.BTC, false, &quot;YOUR_MNEMONICS&quot;, &quot;HD_WALLET_INDEX&quot;)</pre><p>In the first line of code, we import the SDK and all required functions. In the second line, we generate a Bitcoin wallet and get the <em>mnemonic</em> and the <em>xpub</em>. In the third line, we create a new address from the <em>xpub</em>, and in the fourth line, we create the private key for it. Because we are dealing with an HD-wallet, you must also pass its index as a parameter.</p><h4>Tatum’s Key Management System (KMS)</h4><p>To create a transaction, our newly created private key needs to be passed as a parameter to each function. Because it is never a good idea to send your private key across the internet, Tatum provides two approaches: (1) you can sign your transaction locally, in which case you must have your private key stored on your computer, or (2) you can use the Tatum Key Management System (KMS). KMS is a cloud solution by Tatum that allows you to import your private keys, encrypted with a password of your choice, to let Tatum sign transactions for you within their service. This way, the private key doesn’t have to be sent across the internet, because Tatum signs your transaction and only sends the signature of the transaction.
For more information, click <a href="https://docs.tatum.io/tutorials/how-to-securely-store-private-keys">here</a>.</p><h4><strong>Accessing blockchain data</strong></h4><p>Currently, the Tatum SDK does not yet support retrieving all the blockchain data that is available via the API. So, to access blockchain data via Tatum, one should consult the <a href="https://tatum.io/apidoc.php">Web API docs</a> and make an HTTP request. This method can be used to build your own Dapp. I recommend creating a separate function for each request, in which you pass all necessary parameters and do the actual HTTP request. This way, you create your own little framework for one-liner requests to get any necessary data from the blockchain.</p><h4>Interacting with a smart contract</h4><p>Interacting with a smart contract doesn’t work with the SDK yet at the time of writing. Instead, you must use the API as described <a href="https://tatum.io/apidoc.php#operation/EthBlockchainSmartContractInvocation">here</a>.</p><h4><strong>Transferring coins and tokens</strong></h4><p>At least the transfer functions for native coins, ERC-20 tokens, NFTs, and ERC-1155 tokens already exist in Tatum’s SDK and are documented. The <a href="https://docs.tatum.io/guides/blockchain/how-to-send-ethereum-transaction">following code example</a> shows how to create an ERC-20 token transaction.
For more information, check out the <a href="https://docs.tatum.io/guides/blockchain">documentation</a>.</p><pre>import {sendEthOrErc20Transaction, ethGetTransaction, Currency} from &#39;@tatumio/tatum&#39;;</pre><pre>const transaction = await sendEthOrErc20Transaction({<br>chain: Currency.CHAIN_TO_USE, <br>fromPrivateKey: &#39;YOUR_PRIVATE_KEY&#39;,<br>contractAddress: &#39;CONTRACT_ADDRESS&#39;,<br>digits: DIGITS_OF_TOKEN,<br>amount: &quot;AMOUNT_TO_SEND--0.1 for example&quot;,<br>to: &quot;TO_ADDRESS&quot;<br>});</pre><pre>// Fetch the transaction details under a new name to avoid redeclaring &#39;transaction&#39;<br>const txData = await ethGetTransaction(transaction.txId);</pre><h4>Deploying a standard smart contract</h4><p>Tatum’s SDK allows deploying standard smart contracts for EVM-compatible blockchains. Thus, one can deploy a smart contract with custom parameters for the token standards listed below. Detailed information on the parameters for each token standard is linked in the list:</p><ul><li><a href="https://docs.tatum.io/guides/blockchain/how-to-create-an-erc-20-token">ERC-20</a></li><li><a href="https://docs.tatum.io/guides/blockchain/how-to-create-nft-token">ERC-721 (NFT)</a></li><li><a href="https://docs.tatum.io/guides/blockchain/how-to-create-erc-1155-multi-tokens">ERC-1155</a></li></ul><h4>Virtual accounts</h4><p>Virtual accounts are more or less the equivalent of a Moralis server. A virtual account is a separate private ledger maintained by Tatum that enables, e.g., setting up webhook notifications when a new transaction is observed at a specific address. While in Moralis you must paste the address to synchronize and the webhook URL endpoint into your server’s dashboard, in Tatum you must set this up using the SDK or API. Moralis’ dashboard has great usability, but Moralis provides no API for specifying webhooks. Tatum, on the other hand, has no dashboard but provides an API and SDK for the setup.</p><p>Unfortunately, Tatum’s SDK did not work properly during my experiments, and I had to use the API again to test the webhooks.
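</p><p>When falling back to the raw Web API, the per-request wrapper pattern recommended earlier can be sketched as follows. The endpoint path and the x-api-key header name are assumptions for illustration; check Tatum’s API docs for the authoritative values.</p>

```javascript
// Sketch of the one-function-per-request pattern for Tatum's Web API.
// Building the request descriptor is separated from sending it, so the
// composition logic can be tested without network access.
const BASE_URL = "https://api.tatum.io";

function buildRequest(path, apiKey, params = {}) {
  const query = new URLSearchParams(params).toString();
  return {
    url: BASE_URL + path + (query ? "?" + query : ""),
    // Header name is an assumption for illustration.
    headers: { "x-api-key": apiKey },
  };
}

// One tiny wrapper per endpoint keeps call sites one-liners.
// The path below is a hypothetical example, not a verified endpoint.
function getBalanceRequest(chain, address, apiKey) {
  return buildRequest(`/v3/${chain}/account/balance/${address}`, apiKey);
}

const req = getBalanceRequest("ethereum", "0xabc", "MY_API_KEY");
console.log(req.url); // "https://api.tatum.io/v3/ethereum/account/balance/0xabc"
```

<p>Sending it is then a one-liner such as fetch(req.url, { headers: req.headers }), with JSON parsing and error handling centralized in one place.</p><p>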
The steps for setting up a virtual account in Tatum and subscribing to blockchain events are:</p><ul><li>Create a new wallet and save your ‘xpub’</li><li>Create a new virtual account with the ‘xpub’ as one of the parameters. Only a private key corresponding to this ‘xpub’ is allowed to make a transaction within this virtual account.</li><li>Connect the address to sync with your virtual account</li><li>Initialize a subscription that listens to incoming events and sends the data to the endpoint of your choice.</li></ul><h4>Tatum conclusion</h4><p>Tatum’s SDK is not a complete product at the moment, given its claims. Tatum aims at offering many more features than Moralis for interacting with a blockchain, especially when starting from scratch. Tatum is also more flexible by supporting more blockchains, and it already provides more usable features that can be helpful for scripting: generating a wallet, managing keys, or deploying a new smart contract works great. The documentation is quite okay, but I would like to see Tatum improve on it in the future. For functions not yet present in the SDK, you can use Tatum’s API instead, which works fine. To conclude, Tatum already provides a decent service, and assuming that they will improve their documentation and SDK, I can recommend using Tatum’s SDK and API for developing a Dapp.</p><h3>Other blockchain SDKs and APIs worth considering</h3><p>Besides Moralis and Tatum, several other services exist that provide a Web API and an SDK for Dapp development. But none of them are as advanced as Moralis and Tatum, and each can probably be used for only some parts of your Dapp.</p><ul><li><a href="https://idexo.com/">Idexo</a>: Another service offering an SDK for blockchain development, which feels like a much smaller sibling of Tatum. Idexo provides only an SDK and is very limited regarding the number of blockchains and features. When trying to evaluate it, I had problems with registration and login.
What Idexo is best known for is its Telegram bot that allows users to mint an NFT within Telegram.</li><li><a href="https://thegraph.com/en/">The Graph</a>: A Web API focusing on complex and automated queries on Ethereum and IPFS.</li><li><a href="https://nownodes.io/nodes">NOWNodes</a>: They offer a Web API that is comparable to Tatum. The service also supports 40+ blockchains, with many endpoints per blockchain.</li><li><a href="https://infura.io/">Infura</a>: Infura provides some tools and endpoints to interact with the Ethereum blockchain and IPFS.</li><li><a href="https://v1.cosmos.network/">Cosmos</a>: If you want to go as far as building a custom blockchain, the Cosmos SDK might be worthwhile. With it, creating your own blockchain becomes almost as simple as deploying a new token.</li></ul><h3>Conclusion</h3><p>It’s astonishing to experience how easy blockchain development has become thanks to tools and services like Moralis and Tatum. I recommend Moralis for developing a Dapp and Tatum for close interaction with different blockchains. I expect more to come, as these services are evolving quickly. On top of that, many other services are worth keeping an eye on.</p><p>51nodes GmbH, based in Stuttgart, is a provider of crypto economy solutions. 51nodes supports companies and other organizations in realizing their Blockchain projects.
51nodes offers technical consulting and implementation with a focus on smart contracts, decentralized apps (Dapps), integration of blockchain with industry applications, and tokenization of assets.</p><p>Thanks to <a href="https://medium.com/@draklein">Achim Klein</a> and <a href="https://medium.com/@jpbu">Jan-Paul Buchwald</a> for their help in the course of developing this article.</p><img src="https://medium.com/_/stat?event=post.clientViewed&referrerSource=full_rss&postId=91dd5acea624" width="1" height="1" alt=""><hr><p><a href="https://medium.com/51nodes/getting-started-with-blockchain-development-pros-and-cons-of-tatum-vs-moralis-91dd5acea624">Getting Started with Blockchain Development: Pros and Cons of Tatum vs. Moralis</a> was originally published in <a href="https://medium.com/51nodes">51nodes</a> on Medium, where people are continuing the conversation by highlighting and responding to this story.</p>]]></content:encoded>
        </item>
        <item>
            <title><![CDATA[Das Potenzial von Selbstsouveränen Identitäten am Beispiel der Energiewirtschaft]]></title>
            <link>https://medium.com/51nodes/das-potenzial-von-selbstsouver%C3%A4nen-identit%C3%A4ten-am-beispiel-der-energiewirtschaft-22224d30de53?source=rss----89b961a921c---4</link>
            <guid isPermaLink="false">https://medium.com/p/22224d30de53</guid>
            <category><![CDATA[ssi-use-cases]]></category>
            <category><![CDATA[ssi]]></category>
            <category><![CDATA[energy]]></category>
            <category><![CDATA[decentralization]]></category>
            <dc:creator><![CDATA[Dr. Achim Klein]]></dc:creator>
            <pubDate>Mon, 22 Nov 2021 11:44:53 GMT</pubDate>
            <atom:updated>2021-11-22T11:47:21.525Z</atom:updated>
<content:encoded><![CDATA[<p>Existing solutions for managing identities are characterized by data silos that each service provider operates itself at great expense, or by global identity providers that relieve service providers of this effort. The problem with isolated identity management is that the average internet user has over 20 different identities, which must be created and maintained separately, with corresponding effort and, as a result, inconsistencies between the data silos. Global identity managers such as Google help overcome these problems through single sign-on using a Google ID. For the user, this is convenient and simple. However, control over personal data and its commercial exploitation is given away.</p><p>Against this background, self-sovereign identities (SSI) aim to combine the user-friendliness of a Google ID with a high degree of user control and privacy.</p><h4><strong>What exactly is SSI?</strong></h4><p>SSI is the latest stage in the evolution of digital identities. An <em>identity </em>means all attributes of a person, organization, or object that define this subject. <em>Attributes </em>describe the characteristics of a subject. A <em>digital identity</em> here means the digital representation of these attributes. Depending on the context of use, different <em>partial identities </em>may be required that comprise only a subset of the attributes.</p><p>A <em>credential </em>is a tamper-proof and verifiable proof of a subject’s attributes. Credentials are held in a personal digital wallet. The subject is the <em>holder </em>of the credentials after obtaining them from an issuer. An <em>issuer </em>issues credentials according to a schema that allows cross-context use and a universally valid interpretation. A <em>verifier </em>requests proof of an entitlement (a credential) from the accessing subject, e.g., for access to a service. For this purpose, <em>identifiers </em>are exchanged via a <em>verifiable data registry</em>; these identifiers can refer to subjects, their credentials, and their issuers. The verifiable data registry contains no subject-related data.</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/592/1*CIvFM4GiXkwo6momcebtww.png" /><figcaption>Roles and information flows in identity management with SSI (Source: <a href="https://www.w3.org/TR/vc-data-model/#roles)">https://www.w3.org/TR/vc-data-model/#roles)</a></figcaption></figure><h4><strong>What value does SSI add?</strong></h4><p>SSI overcomes the inconsistent identity management in isolated data silos, which is costly for users as well as service providers, and at the same time the dependence on today’s single-sign-on identity managers such as Google and Twitter, which control, analyze, and monetize users’ login processes. With SSI, subjects such as persons or machines have a single identity administration (at least for a certain namespace). The credentials assigned to an identity need to be obtained only once and can then be used, precisely tailored, for any number of login and verification processes. Analogous to a plastic ID card, the credentials can be verified without involving the issuer.</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/1*bY_mRUBfQqEVxx8G_zULyw.png" /><figcaption>Today’s identity management in silos vs. self-sovereign identity management (adapted from: <a href="https://digitaleweltmagazin.de/2019/08/12/chancen-der-self-sovereign-identities-ssi-aus-sicht-von-unternehmen-fuer-das-identity-access-management-iam/">https://digitaleweltmagazin.de/2019/08/12/chancen-der-self-sovereign-identities-ssi-aus-sicht-von-unternehmen-fuer-das-identity-access-management-iam/</a>)</figcaption></figure><p>For service providers, onboarding processes, so-called know-your-customer checks, and compliance with the associated regulations become simpler. In addition, the identity information provided is up to date. Because onboarding processes are simplified, users are also more likely than before to complete them.</p><p>Users of services (arbitrary subjects, including machines) can use their identity in plug-and-play fashion and easily port it between different services. Managing and keeping up to date dozens of user accounts and remembering the associated passwords becomes unnecessary, since only access to the personal wallet is required. Consequently, IT security problems that arise from using the same password for different user accounts are also reduced.</p><h4>What challenges are there?</h4><p>As a caveat, the benefits and usability of a self-sovereign identity initially extend only to those service providers that are present on a verifiable data registry, similar to mobile applications on the app stores. By analogy, an SSI network is the more valuable for its users the more services can be used through it. Ultimately, a process of growth and consolidation can be expected here, which will decide which offerings prevail.</p><p>A critical thought is that the verifiable data registry could create a dependence of identity subjects similar to the one that exists today toward the single-sign-on providers from social media. This thought is misleading, however, because the verifiable data registry can be operated with distributed ledger technology in a blockchain-based network. Several decentralized computing nodes, which can be operated by different actors, would form the network. Non-profit organizations such as <a href="https://sovrin.org/">Sovrin </a>or the <a href="https://energyweb.org/reports/the-energy-web-chain/">Energy Web Foundation</a> are often active here as well. A clear contrast to today’s single-sign-on providers.</p><p>A decisive factor for the success of SSI will be whether the issuers of credentials can establish the required trust. Furthermore, as so often, the user experience and the resulting acceptance will be crucial. Solution providers such as Spherity use a cloud to offer a <a href="https://medium.com/spherity/multi-party-computation-b4d2a7c3042">user-friendly identity wallet</a> that also stores private keys and protects against the unintended loss of the identity.</p><h4><strong>Who is driving SSI forward?</strong></h4><p>The German federal government and the European Union are among the drivers of SSI. They fund the development of concepts, technology, standards, and regulations, up to pilot projects in a wide range of industries and areas.</p><p>The Federal Ministry for Economic Affairs and Energy funds, for example, the <a href="https://digitale-identitaeten.de/">„Schaufenster sichere digitale Identitäten“</a> (showcase for secure digital identities). One of the showcase projects is IDunion. <a href="https://idunion.org/">IDunion </a>works on creating an open ecosystem for decentralized, self-sovereign identities. To this end, a distributed identity network is being built with blockchain technology. The project is driven by 47 renowned research and industry partners. 51nodes is a partner of IDunion.</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/1*pR1OISxpI1Wx9liL52bUPA.png" /><figcaption>Digital identity showcase projects (Source: <a href="https://digitale-identitaeten.de/schaufensterprojekte/">https://digitale-identitaeten.de/schaufensterprojekte/</a>)</figcaption></figure><p>The European Union funds, for example, the <a href="https://essif-lab.eu/">“European Self-Sovereign Identity Framework Lab”</a>. 20 subprojects work on creating and extending an open-source software framework for SSI. Another 42 subprojects develop commercial components and services using the SSI framework.</p><h4>Use case example: How can SSI help the energy industry?</h4><p>Against the background of the energy transition, the operators of transmission and distribution grids for electricity face major challenges and new legal obligations. Over one million (private) decentralized power producers must be integrated efficiently and effectively into existing energy industry processes (e.g., <a href="https://www.next-kraftwerke.de/wissen/regelenergie">balancing power</a> and <a href="https://www.transnetbw.de/de/strommarkt/systemdienstleistungen/redispatch-2-0">congestion management in Redispatch 2.0</a>).</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/665/1*_PHuJJDjX7FOPcBF8WVgvQ.png" /><figcaption>Services and processes of the energy industry (Source: <a href="https://www.next-kraftwerke.de/wissen/systemdienstleistungen">https://www.next-kraftwerke.de/wissen/systemdienstleistungen</a>)</figcaption></figure><p>Currently, energy industry processes are characterized by product-specific and sometimes laborious registration processes at various places. Examples include the <a href="https://www.marktstammdatenregister.de/MaStR">Marktstammdatenregister </a>(core energy market data register) of the Federal Network Agency, the <a href="https://www.regelleistung.net/">balancing power platform</a> of the transmission system operators, and the platform <a href="https://netz-connectplus.de/home/projekt/">connect+</a> for cooperation and data exchange among grid operators. Added to this are the low motivation of operators of small power generation plants and the difficulty of keeping the data at the different registration places up to date.</p><p>The question of required proofs concerns, e.g., the type of power generation (e.g., an EEG plant under the German Renewable Energy Sources Act) and for which energy industry product the plant may and can be used. The identifiers (Energy Identification Codes) for plants are issued centrally by the <a href="https://www.energiecodes-services.de/bdew-codes.html">BDEW</a>. Furthermore, the regulator imposes requirements on market communication, which is carried out with the help of a certificate infrastructure for signing and encryption. That is, the certificates must be maintained and kept up to date at all of the places mentioned. In an “Internet of Energy”, when millions of end customers with their decentralized flexibility assets (e.g., electric mobility, home storage, heat pumps, etc.) push onto the energy market, these systems reach their limits in handling the mass processes for registration and change requests at the necessary speed.</p><p>In contrast, the SSI approach promises the vision of integrating millions of decentralized plants into the processes and IT systems of the grid industry in an effortless plug-and-play fashion, without multiple registrations and parallel consistency maintenance in redundant, isolated identity management systems being required any longer. This should lower the hurdles for plant operators, and the processes of the grid industry can be set up more leanly and with less friction. Ultimately, an important contribution would be made to the success of the energy transition.</p><p>51nodes is currently working together with the transmission system operator <a href="https://www.transnetbw.de/de">TransnetBW</a> on exploring SSI in the context of the energy industry. So far, more than ten potential use cases for SSI have been identified.</p><h4><strong>Interested in SSI?</strong></h4><p><a href="https://www.51nodes.io/">51nodes</a> is a provider of crypto economy solutions. 51nodes supports companies and other organizations in realizing their SSI and blockchain projects. Do you have questions about digital identities or starting points for using SSI? Feel free to reach out to us!</p><h4>Further information on SSI</h4><p>(1) <a href="https://tykn.tech/self-sovereign-identity/">Self-Sovereign Identity: The Ultimate Beginners Guide</a> (website) <br>(2) <a href="https://www.fim-rc.de/wp-content/uploads/2021/06/Fraunhofer-FIT_SSI_Whitepaper.pdf">Self-Sovereign Identity: Grundlagen, Anwendungen und Potenziale portabler digitaler Identitäten</a> (whitepaper by Fraunhofer FIT, in German) <br>(3) <a href="https://digitaleweltmagazin.de/2019/08/12/chancen-der-self-sovereign-identities-ssi-aus-sicht-von-unternehmen-fuer-das-identity-access-management-iam/">Chancen der Self-Sovereign Identities aus Sicht von Unternehmen für das Identity &amp; Access Management</a> (website, in German) <br>(4) An <a href="https://www.hindawi.com/journals/scn/2021/8873429/">overview of the SSI ecosystem</a> (scientific article) <br>(5) A technical implementation platform for SSI in the energy industry: <a href="https://energy-web-foundation.gitbook.io/energy-web/foundational-concepts/self-sovereign-identity">Energy Web Decentralized Operating System</a> (technical concept documentation)</p><img src="https://medium.com/_/stat?event=post.clientViewed&referrerSource=full_rss&postId=22224d30de53" width="1" height="1" alt=""><hr><p><a href="https://medium.com/51nodes/das-potenzial-von-selbstsouver%C3%A4nen-identit%C3%A4ten-am-beispiel-der-energiewirtschaft-22224d30de53">Das Potenzial von Selbstsouveränen Identitäten am Beispiel der Energiewirtschaft</a> was originally published in <a href="https://medium.com/51nodes">51nodes</a> on Medium, where people are continuing the conversation by highlighting and responding to this story.</p>]]></content:encoded>
        </item>
        <item>
            <title><![CDATA[Exploring IOTA 2.0 Smart Contracts in a Private Network: Developing a Prediction Market]]></title>
            <link>https://medium.com/51nodes/exploring-iota-2-0-smart-contracts-in-a-private-network-developing-a-prediction-market-c2d81988f75e?source=rss----89b961a921c---4</link>
            <guid isPermaLink="false">https://medium.com/p/c2d81988f75e</guid>
            <category><![CDATA[ethereum]]></category>
            <category><![CDATA[iota]]></category>
            <category><![CDATA[smart-contracts]]></category>
            <category><![CDATA[prediction-markets]]></category>
            <category><![CDATA[rust]]></category>
            <dc:creator><![CDATA[Dr. Achim Klein]]></dc:creator>
            <pubDate>Thu, 16 Sep 2021 20:53:58 GMT</pubDate>
            <atom:updated>2021-09-16T20:53:58.736Z</atom:updated>
<content:encoded><![CDATA[<p><em>Abstract </em>— Smart contracts are being introduced to IOTA, which offers an interesting take on distributed ledgers with its Tangle. The Tangle promises to provide a more scalable transaction infrastructure with lower fees, potentially overcoming limitations of Ethereum and other blockchain-based ledgers. Due to the current alpha state of IOTA smart contracts, the official documentation lags behind, and it is a bit of a challenge to get a working setup. To help interested developers and the IOTA community try out smart contracts more quickly, this article describes how to set up a private IOTA network with the latest state of the software and demonstrates a working prediction market smart contract developed in Rust.</p><h3>Introduction</h3><p>Smart contracts are codified contracts using rules and algorithms that can automatically trigger and incur electronic transactions of cryptocurrencies like Ether. The expected economic impact of smart contracts is very large, as they allow for improving the efficiency of many existing business models and also enable completely new ones. For instance, payouts of insurance contracts can be automated using smart contracts, or autonomous cars could pay tolls and parking fees automatically.</p><p>While smart contracts are already well known from blockchain approaches like Ethereum, these approaches suffer from shortcomings like low throughput and high transaction costs. For these reasons, the IOTA Foundation first set out to propose a new and more scalable transaction system based on the so-called “Tangle”. Secondly, IOTA version 2.0 is currently under development, and it includes an approach to smart contracts.
Smart contracts under IOTA 2.0 are promised to be built on an infrastructure that both scales well and incurs low transaction costs.</p><p>Against the background of this promising new approach to smart contracts, I started to explore how to set up a private IOTA network for development and how to develop and deploy a smart contract on it. As the IOTA developer documentation is lagging, it was quite a challenge to get a working solution. Thus, the <strong>main contributions</strong> of this article are (1) a proper description of how to set up the network and how to enable smart contracts, and (2) a fully working demonstration smart contract, which implements a simple prediction market in which multiple network participants can predict and bet on a certain outcome of an event.</p><p>The <strong>outline</strong> of this article is as follows: I give (1) an overview of IOTA and the network’s smart contract integration, (2) a description of how to set up an environment for developing and testing IOTA smart contracts, (3) an implementation of a simple prediction market as a smart contract, (4) some notes and insights on IOTA and smart contracts, and (5) a conclusion.</p><h3>The IOTA Network and Smart Contracts</h3><p>IOTA is a new kind of public and permissionless distributed ledger for exchanging value and data. The IOTA network has been designed to overcome the main bottlenecks of blockchain-based distributed ledgers. Due to the organization of transactions (of value) in a chain, there is just one end at which new ones can be appended, which makes it slow. Thus, in contrast to blockchain-based approaches, the distributed ledger of IOTA is organized differently. The blockchain, which is central to Bitcoin or Ethereum, for instance, is replaced by the Tangle. The Tangle connects transactions via edges in a directed (and acyclic) graph (see next figure).
In contrast to the blockchain, there are multiple nodes (representing transactions) to which new edges to new nodes, i.e., transactions, can be appended.</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/740/1*wkujYx46q_Wb4V-Rj7iMRA.png" /><figcaption>The Tangle data structure of IOTA’s distributed ledger. Green: validated transactions, White: not yet validated, Grey: new transactions. <a href="https://iota-einsteiger-guide.de/was-ist-iota.html">Source</a> (2021–09–10)</figcaption></figure><p>Besides the data structure, there are further key differences to classic blockchain-based approaches. The consensus mechanism requires no miners because all users help in validating transactions. Therefore, transactions can be conducted with essentially zero fees. The current IOTA network still requires a central coordinator, defining trusted transaction milestones, which other transactions need to reference in order to be trusted as well. To achieve true decentralization, the planned update to IOTA 2.0 should overcome this limitation.</p><p>The key properties and promises of <a href="https://www.iota.org/get-started/what-is-iota">IOTA</a> are:</p><ul><li>high scalability thanks to a new data structure allowing for parallel transactions</li><li>requiring few resources, making it suitable for devices and sensors</li><li>zero-fee transactions</li><li>fast transactions</li><li>final approval of messages within seconds</li><li>a distributed network that is robust against attacks</li></ul><p>These properties would enable a layer of trust and a very scalable and efficient messaging system for the <a href="https://www.iota.org/get-started/what-is-iota">“machine economy”</a> with a very large number of devices connected to the Internet. The current version of IOTA on the Mainnet is <a href="https://en.wikipedia.org/wiki/IOTA_(technology)">1.5</a>, which went live in April 2021.
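</p><p>The DAG structure described above can be illustrated with a toy model (plain JavaScript, not IOTA code): each transaction lists the earlier transactions it approves, and the “tips” are the transactions that nobody has approved yet.</p>

```javascript
// Toy model of the Tangle: each transaction lists the earlier transactions
// it approves. Unlike a chain, several transactions can attach in parallel.
const approves = new Map([
  ["genesis", []],
  ["a", ["genesis"]],
  ["b", ["genesis"]], // a and b attach in parallel -- impossible in a chain
  ["c", ["a", "b"]],
  ["d", ["a"]],
]);

// Tips: transactions that no other transaction approves yet. New arrivals
// would pick tips to approve, extending the graph at many points at once.
function tips(dag) {
  const approved = new Set([...dag.values()].flat());
  return [...dag.keys()].filter((id) => !approved.has(id));
}

// Everything a transaction directly or indirectly validates.
function validatedBy(dag, id, seen = new Set()) {
  for (const parent of dag.get(id) ?? []) {
    if (!seen.has(parent)) {
      seen.add(parent);
      validatedBy(dag, parent, seen);
    }
  }
  return seen;
}

console.log(tips(approves));                  // [ 'c', 'd' ]
console.log([...validatedBy(approves, "c")]); // [ 'a', 'genesis', 'b' ]
```

<p>Because c and d are both unapproved tips, the next transaction could attach at either one, which is exactly the “multiple attachment points” property the text contrasts with a single chain tip.</p><p>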
Version 1.5 is seen as an intermediate step in maturing the IOTA technology and proving its usability and practical value. It forms the basis for smart contracts that use the messaging capabilities of the IOTA network. Smart contracts are supposed to be introduced in version 2.0 at the latest and might <a href="https://twitter.com/iota/status/1428704497992413185">even be backported to 1.5</a>. At the time of writing, smart contracts are still under development.</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/742/1*1qGJric4onj1wqWmj07Neg.png" /><figcaption>IOTA network layers. <a href="https://github.com/iotaledger/goshimmer/blob/develop/documentation/static/img/protocol_specification/layers.png">Source</a> (2021–09–10)</figcaption></figure><p>The design for IOTA 2.0 foresees different layers (see figure above) that separate the general underlying messaging, transacting, and validating infrastructure from the application layer, which also comprises smart contracts; smart contracts would then use layer 1 capabilities to realize their features. A main reason for separating smart contracts into another layer is not to compromise the messaging functionality. Smart contracts run on Wasp nodes on layer 2, connected to Goshimmer on layer 1, which is responsible for conducting transactions and messaging (see next figure).</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/403/1*hD7FpXTIZw0VPvmaYSiVxg.png" /><figcaption>GoShimmer for messaging and Wasp nodes for smart contracts in an IOTA 2.0 network. <a href="https://github.com/iotaledger/wasp/blob/master/documentation/ISCP%20architecture%20description%20v3.pdf">Source</a> (2021–09–10)</figcaption></figure><p>Due to the separation of Wasp and Goshimmer, the two need a technical means to connect and interact with each other. Therefore, Goshimmer nodes contain a “txstream” plugin, which needs to be activated to connect Wasp nodes. Smart contracts are developed in the Rust programming language. 
<a href="https://www.rust-lang.org/">Rust</a> is known to generate very performant and memory-efficient executables while protecting the developer from many classes of errors through a helpful compiler and a well-designed language. To test smart contracts and run unit tests against smart contract functionality, the <a href="https://pkg.go.dev/github.com/iotaledger/wasp/packages/solo">“solo” environment</a> written in Go can be used.</p><h3>Setup</h3><p>The described setup defines a private IOTA 2.0 network you can run on your laptop. The network hosts a layer 1 Goshimmer node for messaging and a layer 2 Wasp node for running smart contracts. On this basis, the next chapter shows how to develop and deploy a smart contract to this network.</p><h4>Environment</h4><p>I used Ubuntu 20.04 LTS with the latest updates and upgrades and</p><ul><li>installed <strong>git</strong></li><li>installed <a href="https://www.rust-lang.org/"><strong>rust</strong></a> 1.53.0 for developing native IOTA smart contracts by<br>curl --proto &#39;=https&#39; --tlsv1.2 -sSf https://sh.rustup.rs | sh</li><li>installed <strong>wasm-pack</strong> 0.10.0 for compiling smart contracts into “WebAssembly” binaries by<br>curl https://rustwasm.github.io/wasm-pack/installer/init.sh -sSf | sh</li></ul><p>Installed dependencies for <a href="https://github.com/facebook/rocksdb/blob/master/INSTALL.md">rocksdb</a> required by Goshimmer as its underlying database engine:</p><ul><li>sudo apt-get install libgflags-dev</li><li>sudo apt-get install libsnappy-dev</li><li>sudo apt-get install zlib1g-dev</li><li>sudo apt-get install libbz2-dev</li><li>sudo apt-get install liblz4-dev</li><li>sudo apt-get install libzstd-dev</li></ul><p>Cloned and built Goshimmer <strong>0.7.5</strong> from the <strong>develop</strong> branch (in the version of 2021–08–16) according to the following description:</p><ul><li>git clone -b develop <a 
href="https://github.com/iotaledger/goshimmer.git">https://github.com/iotaledger/goshimmer.git</a></li><li>cd goshimmer</li><li>git checkout tags/v0.7.5</li><li>go build -tags rocksdb</li></ul><p>Prepared Goshimmer for transaction handling<br>Save the <a href="https://github.com/51nodes/prediction-market-smart-contract/blob/main/goshimmer/config.json"><em>config.json</em> file</a> to your Goshimmer directory to</p><ul><li>enable the <strong>txstream</strong> plugin, which allows Goshimmer to communicate with Wasp nodes</li><li>disable the <strong>portcheck</strong> plugin</li></ul><p>To check whether Goshimmer synchronizes its time and whether messaging works, open the dashboard on your local machine at <a href="http://127.0.0.1:8081/dashboard">http://127.0.0.1:8081/dashboard</a>. It should display</p><ul><li>TangleTime Synced: Yes</li><li>Message: DBVBaNbSEsq8D1SuNd7ULLeSPTXLwQBwfW1agWKnFX23 (as an example) — that is, the message must <em>not</em> read 1111111111111111111111111 .. (containing only ones)</li></ul><p>Before running Goshimmer, delete the message database of previous (erroneous) attempts (<em>if any</em>) by running rm -rf mainnetdb in the Goshimmer directory, which removes the mainnetdb subdirectory. 
The database will be automatically generated again with a fresh start of Goshimmer.</p><p>Also, in case tangle time does not synchronize (see the dashboard on your local machine at <a href="http://127.0.0.1:8081/dashboard">http://127.0.0.1:8081/dashboard</a> or the Goshimmer log: “can’t issue payload: tangle not synced”), shut down Goshimmer, delete the message database, and restart Goshimmer.</p><p>Run Goshimmer in its directory as follows — after having created the ./assets/snapshotTest.bin file as described subsequently:</p><p>./goshimmer --autopeering.seed=base58:8q491c3YWjbPwLmF2WD95YmCgh61j2kenCKHfGfByoWi --node.enablePlugins=bootstrap,prometheus,spammer,&quot;webapi tools endpoint&quot;,activity,snapshot,txstream --messageLayer.startSynced=true --autopeering.entryNodes= --node.disablePlugins=clock --messageLayer.snapshot.file=./assets/snapshotTest.bin --messageLayer.snapshot.genesisNode= --metrics.manaResearch=false --mana.enableResearchVectors=false --mana.snapshotResetTime=true --statement.writeStatement=true --statement.writeManaThreshold=1.0 --config=./config.json</p><h4>Create a cli wallet</h4><p>The cli wallet can send IOTA funds to a Wasp wallet, where they are used to deploy smart contracts. We generate an initial transaction with funds for our cli wallet.</p><p>Install the cli-wallet in a new directory:</p><ul><li>download cli-wallet-0.7.5_Linux_x86_64.tar.gz from https://github.com/iotaledger/goshimmer/releases/tag/v0.7.5</li><li>tar -xf cli-wallet-0.7.5_Linux_x86_64.tar.gz</li></ul><p>Set reuse_addresses=true in the config.json of the cli-wallet.</p><p>To create a new wallet, run ./cli-wallet init , which returns</p><blockquote><em>IOTA 2.0 DevNet CLI-Wallet 0.2 GENERATING NEW WALLET … [DONE] ================================================================ !!! PLEASE CREATE A BACKUP OF YOUR SEED !!! !!! !!! !!! E7owJWtDBGSUAZUWQkn1kHG5zUy2PLQf6eEr3RoMCJs7 !!! !!! !!! !!! PLEASE CREATE A BACKUP OF YOUR SEED !!! 
================================================================ CREATING WALLET STATE FILE (wallet.dat) … [DONE]</em></blockquote><p>Note <em>your</em> SEED for allocating funds to this wallet.</p><p>We generate a custom genesis snapshot with the transaction that allocates the funds.</p><p>Go to the Goshimmer installation directory and then into its subdirectory ./tools/genesis-snapshot</p><p>Paste the seed of the previously generated cli wallet into the following command</p><p>go run main.go --token-amount 3500000 --seed E7owJWtDBGSUAZUWQkn1kHG5zUy2PLQf6eEr3RoMCJs7 --snapshot-file snapshotTest.bin</p><p>Now,</p><ul><li>go to your Goshimmer directory and inside of it run</li><li>mkdir assets</li><li>cp ./tools/genesis-snapshot/snapshotTest.bin ./assets/snapshotTest.bin to provide the generated snapshotTest.bin file to Goshimmer.</li></ul><h4>Setting up a Wasp node for smart contracts</h4><p>I installed Wasp from the <strong>master</strong> branch in the state of 2021–08–03.</p><ul><li>git clone <a href="https://github.com/iotaledger/wasp.git">https://github.com/iotaledger/wasp.git</a></li><li>check out a workable state of the code from the repository. 
I used this <a href="https://github.com/iotaledger/wasp/commit/05516ca29edd9e93b17ed9a0f788ddb51c407d48">commit</a>.</li><li>go build -tags rocksdb</li><li>go build -tags rocksdb ./tools/wasp-cli</li></ul><p>Before we can transfer funds to the Wasp wallet, we create the wallet in the first place by ./wasp-cli init</p><p>We get the address of the wallet by ./wasp-cli balance , returning something like</p><blockquote>Address index 0 Address: 1Ah4cqMPdrDGx6Htapk7NZUxxcYHsP1C3oAugEYHVmACj</blockquote><p>To send funds to this wallet, paste <em>your</em> address into this command and run it in the cli-wallet’s directory:</p><p>./cli-wallet send-funds -amount 40000 -dest-addr 1Ah4cqMPdrDGx6Htapk7NZUxxcYHsP1C3oAugEYHVmACj</p><p>Now, ./wasp-cli balance returns a balance of 40,000 IOTA.</p><p>Finally, configure wasp-cli to be able to connect to the local Goshimmer node and to form a committee of one local Wasp node by saving the <a href="https://github.com/51nodes/prediction-market-smart-contract/blob/main/wasp/wasp-cli.json"><em>wasp-cli.json</em> file</a> to the directory of your wasp-cli.</p><h4>Deploying a chain</h4><p>Smart contracts are deployed on a chain, which needs to be deployed first:</p><p>./wasp-cli chain deploy --committee=0 --quorum=1 --chain=predmarketchain --description=&quot;Prediction Market&quot;</p><p>where</p><ul><li><em>committee=0</em> specifies that only one Wasp node (the node with index 0) handles the smart contracts.</li><li><em>quorum=1</em> says that a quorum of one Wasp node is enough here, for development and testing.</li></ul><p>Now we have to provide funds to the chain by</p><p>./wasp-cli chain deposit IOTA:1000 --chain=predmarketchain ,</p><p>reducing the wasp wallet’s balance by 1,000 IOTA.</p><h3>A Prediction Market Smart Contract</h3><p>The private IOTA network is now used to develop and deploy a smart contract. 
I describe the design and implementation of a simple prediction market in Rust, how to build and deploy it, and finally how to use it.</p><h4>Design</h4><p>A prediction market is a classic example of a smart contract application. A prediction market is a virtual electronic market that allows predicting outcomes of future events by placing a monetary bet on a certain outcome. Such events could be sports events, political events, future prices of stocks, or other events with uncertain outcomes. For instance, the outcome of a political election could be subject to predictions on a prediction market. A simple binary question to be answered by prediction market participants could be “Will candidate/team A win?” — with possible outcomes being “yes” or “no”.</p><p>Our design of a prediction market for demonstration purposes is simple. We omit a bookmaker and a pricing mechanism. Formally, we do not pose a question with predefined possible outcomes. Instead, market participants can bet on basically any outcome of an event with an arbitrary amount of tokens until the time for predictions is over. Afterwards, the winning outcome is determined, and winning bets placed on the correct outcome receive their share of the overall amount of tokens placed in bets. Assume 700 tokens in total were bet on “no”, 300 tokens in total were bet on “yes”, and “yes” is the actual outcome. A single bet on “yes” with 100 tokens receives (100/300)*(700+300) = 333 tokens, making a win of 233 tokens.</p><p>Realizing this design as a smart contract in the IOTA network allows deploying one contract per question to be answered. The account deploying the contract is in control and has to specify the time until which bets can be placed on outcomes. The actual question to be answered has to be communicated via third-party channels. Any network participant can then look up the contract and call a function to place a bet on an outcome by sending some IOTA from their wallet. 
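</p><p>The payout rule from the worked example above can be sketched as a small function. This is an illustrative sketch only — the function name and signature are mine, not taken from the contract code — using integer division on whole token amounts:</p><pre>
```rust
/// Payout for a single winning bet: its share of the pool bet on the
/// winning outcome, applied to the total pool of all bets.
fn payout(bet_amount: u64, total_on_winning: u64, total_pool: u64) -> u64 {
    bet_amount * total_pool / total_on_winning
}

fn main() {
    // The worked example: 300 tokens on "yes", 700 on "no", "yes" wins.
    // A 100-token bet on "yes" receives (100/300)*(700+300) = 333 tokens.
    println!("{}", payout(100, 300, 700 + 300)); // prints 333
}
```
</pre><p>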
Finally, after the time for predictions and bets has passed, the deploying account has to call a function to close the prediction market and to provide the actual outcome of the event, i.e., the correct answer to the question. This triggers the evaluation of all bets with regard to the correct answer. Accounts that bet on the correct answer receive their winnings in IOTA in a transaction.</p><p>Note that in a real-world, more production-like scenario, one might consider using an <a href="https://blog.iota.org/introducing-iota-oracles/">oracle</a> to provide the outcome of an event. Oracles can stream off-chain data (about events) into the Tangle, so smart contracts can use this data in their evaluations.</p><h4>Implementation</h4><p>Smart contracts for the IOTA network can be implemented in <a href="https://www.rust-lang.org/">Rust</a> and then compiled to a <a href="https://www.rust-lang.org/what/wasm">WebAssembly</a> file.</p><p>Our demonstration smart contract implemented in Rust can be viewed and cloned from this <a href="https://github.com/51nodes/prediction-market-smart-contract">repository</a>. The smart contract exposes three functions for (1) initializing a prediction market, (2) placing a bet on an outcome, and (3) closing the prediction market and determining winners. When the contract is loaded, the mentioned functions’ implementations are made publicly available under the first string’s name, e.g., “initmarket”.</p><iframe src="" width="0" height="0" frameborder="0" scrolling="no"><a href="https://medium.com/media/5344e81dcddb48c33507805f2c69a911/href">https://medium.com/media/5344e81dcddb48c33507805f2c69a911/href</a></iframe><p>The first function should be called by the account deploying the contract for <strong>initialization</strong>. Optionally, the function can set an end time for betting using the parameter BETENDUTC, which is a date-and-time string in ISO format, assuming time in UTC. 
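</p><p>The effect of an optional end time can be sketched with plain timestamps. This is a hedged illustration with hypothetical names — the actual contract compares date-times parsed from the BETENDUTC string via chrono:</p><pre>
```rust
/// Whether a bet may still be placed. `bet_end_utc_secs` is None when
/// no BETENDUTC parameter was given, in which case bets stay open
/// until the market is closed explicitly.
fn bets_allowed(now_utc_secs: u64, bet_end_utc_secs: Option<u64>) -> bool {
    match bet_end_utc_secs {
        Some(end) => now_utc_secs < end,
        None => true,
    }
}

fn main() {
    assert!(bets_allowed(100, Some(200)));  // before the end time
    assert!(!bets_allowed(300, Some(200))); // end time has passed
    assert!(bets_allowed(300, None));       // no end time configured
    println!("ok");
}
```
</pre><p>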
In case the parameter is omitted, bets can be placed at any time until the closemarket function is called (see below).</p><iframe src="" width="0" height="0" frameborder="0" scrolling="no"><a href="https://medium.com/media/d46be7c079448590d97dec4737f438fb/href">https://medium.com/media/d46be7c079448590d97dec4737f438fb/href</a></iframe><p>The second function allows <strong>placing a bet</strong> on a certain outcome value of an event, provided as parameter BETVALUE, e.g., “yes”. The amount to bet is the amount of IOTA sent with the function call. Bets must be placed before the <em>betenddatetime</em> has passed, which was set on initialization of the market. To save incoming bets for later evaluation when determining winners, I use two structs, defined at the beginning: a hash map mapping a betting account’s id to a <em>Bet</em> struct, which holds the amount of tokens and the outcome value of the bet. These custom structs are used instead of the built-in map offered by the context object of the function because only a proper hash map allows iterating over all keys and elements stored. As custom objects are not accommodated by the state stored in the context of the function, we need to serialize them to a JSON string, which can then be stored in the state.</p><iframe src="" width="0" height="0" frameborder="0" scrolling="no"><a href="https://medium.com/media/af893e01ef9fe5dace30c0eeca30aa3a/href">https://medium.com/media/af893e01ef9fe5dace30c0eeca30aa3a/href</a></iframe><p>The third function closes the prediction market and is to be called by the contract owner. Calls by other accounts will fail. The function can only be run after the specified end time for predictions has passed, and the function to close the market can be called successfully only once. The function requires a BETVALUE parameter, specifying the winning outcome, e.g., “yes”. 
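</p><p>The bet bookkeeping described above can be sketched with standard library types. The struct and function names here are illustrative (and the actual contract additionally serializes the map to a JSON string via serde before storing it in the contract state):</p><pre>
```rust
use std::collections::HashMap;

/// A single bet: the staked token amount and the predicted outcome value.
struct Bet {
    amount: u64,
    value: String,
}

/// Sum all bets on a given outcome. Iterating over all stored entries
/// like this is what motivated using a proper hash map in the contract.
fn total_on(bets: &HashMap<String, Bet>, outcome: &str) -> u64 {
    bets.values()
        .filter(|b| b.value == outcome)
        .map(|b| b.amount)
        .sum()
}

fn main() {
    let mut bets: HashMap<String, Bet> = HashMap::new();
    bets.insert("account1".to_string(), Bet { amount: 10, value: "no".to_string() });
    bets.insert("account2".to_string(), Bet { amount: 100, value: "yes".to_string() });
    println!("{}", total_on(&bets, "no")); // prints 10
}
```
</pre><p>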
The function runs through the stored bets, determines the winning bets and the amount of IOTA coins they receive, and sends the IOTA to the wallets of the winners.</p><iframe src="" width="0" height="0" frameborder="0" scrolling="no"><a href="https://medium.com/media/77f8fb7fbb1ddf71c8932d0b0c861cef/href">https://medium.com/media/77f8fb7fbb1ddf71c8932d0b0c861cef/href</a></iframe><h4>Build</h4><p>To build the smart contract, one can pull the accompanying <a href="https://github.com/51nodes/prediction-market-smart-contract">repository</a> on GitHub with the following structure:</p><pre>* Cargo.toml <br>* src/lib.rs <br>* pkg/ <br>* target/</pre><p>The full Rust code of the smart contract is contained in a file called <em>lib.rs</em>. The structure and the naming of files follow standard conventions for Rust. To define the dependencies of the smart contract code, the <a href="https://github.com/51nodes/prediction-market-smart-contract/blob/main/Cargo.toml"><em>Cargo.toml</em> file</a> reads as follows:</p><iframe src="" width="0" height="0" frameborder="0" scrolling="no"><a href="https://medium.com/media/396e59bd1247b54f3142f12d97149c63/href">https://medium.com/media/396e59bd1247b54f3142f12d97149c63/href</a></iframe><p>The most important dependency is <em>wasmlib</em>, which is IOTA’s smart contract library, allowing smart contracts to be produced as compiled WebAssembly files. Furthermore, <em>serde</em> is used for serializing custom state objects to JSON strings for storage in the smart contract’s state. <em>chrono</em> brings some date- and time-related functionality required to control the end of a prediction market.</p><p>To build the smart contract, run wasm-pack build in the directory where <em>Cargo.toml</em> resides. The other directories like <em>pkg</em> and <em>target</em> are created automatically in the build process. 
The compiled WebAssembly file is located in the <em>pkg</em> directory and named <em>predictionmarket_bg.wasm</em>.</p><h4>Preparation</h4><p>We proceed by deploying our simple smart contract compiled as a WebAssembly <em>wasm</em> file.</p><p>Note: please adapt the path to the wasm file if required.</p><p>./wasp-cli chain deploy-contract wasmtime predictionmarket &quot;Prediction Market SC&quot; ./prediction-market-smart-contract/pkg/predictionmarket_bg.wasm --chain=predmarketchain --upload-quorum=1 -d --address-index=0</p><p>outputting:</p><pre>uploaded blob to chain -- hash: 6wVabTkRUUGrQzEj8s4yuPC8dfaHGLsoHLXsqvveSw4h<br>Posted on-ledger transaction BChQcWEmMptqRM1z4C9ZffTQXPebg1MMiEYENuAnV7KV containing 1 request:<br>- Request 4RkBSF6BAHgfamyJnXFy4r1YoVU9wvZ9b6uvXUjt2VWAfF5<br>Waiting for tx requests to be processed...<br>Posted off-ledger request 2JdtBzxjP4Bc6Tj2SnZPSctoYDHxuCAvPWJwn6ufMShJ6Lw</pre><p>Now, functions can be called on the contract. First, the same wasp-cli that deployed the contract needs to call the <em>initmarket</em> function. There are two possibilities:</p><p>a) do not specify an end date for the prediction market, to simplify testing and development, by this call: ./wasp-cli chain post-request predictionmarket initmarket --chain=predmarketchain</p><p>b) specify an end date and time for the prediction market. The ISO format is used and UTC is assumed. In this way, all bets must be placed before this time and the market can only be closed after this time. 
Run ./wasp-cli chain post-request predictionmarket initmarket string BETENDUTC string &quot;2021-09-08 23:00&quot; --chain=predmarketchain</p><p>Before placing a bet, we check the wallet’s balance with ./wasp-cli balance</p><p>returning</p><pre>Address index 0<br>Address: 1BAgmaSN1RYk5rbbxMK21CZo8t2zQ3EFeMwDPnMhQdQbs<br>Balance:<br>IOTA: 28653<br>------<br>Total: 28653</pre><p>For the deployed prediction market, we assume two possible outcomes “yes” and “no” on which bets can be submitted. To place a bet with 10 IOTA on “no”, we run</p><p>./wasp-cli chain post-request predictionmarket bet string BETVALUE string no --chain=predmarketchain -t IOTA:10</p><p>and afterwards, by ./wasp-cli balance we see the wallet&#39;s balance reduced by 10 IOTA.</p><pre>Address index 0<br>Address: 1BAgmaSN1RYk5rbbxMK21CZo8t2zQ3EFeMwDPnMhQdQbs<br>Balance:<br>IOTA: 28643<br>------<br>Total: 28643</pre><p>We now prepare the setup for four more prediction market participants. Each participant requires another wasp wallet. Thus, we create four new subdirectories <em>waspwallet2</em>, <em>waspwallet3</em>, <em>waspwallet4</em>, <em>waspwallet5</em>, copy <em>wasp-cli</em> and <em>wasp-cli.json</em> into each of those directories, and run ./wasp-cli init in each directory.</p><p>The last command initializes a new wallet.</p><p><strong>In case the contract’s code is changed, it needs to be re-deployed</strong> using the first and main wasp-cli. You first also need to re-deploy the chain on which the contract is deployed. Due to the redeployment of the contract, the copied versions of <em>wasp-cli.json</em> (in the <em>waspwallet2</em> to <em>waspwallet5</em> directories) lack the new address of the deployed <em>predmarketchain</em> chain. In this case, we need to add it to their configuration to be able to place bets on the same prediction market from another account. 
Either you noted the address of the chain when it was created, such as</p><pre>activating chain nZBwoJi5q7KGk8D2cgm16PWrdM6aL2qTdCY27HHjZgrK.. OK.</pre><p>or you run cat wasp-cli.json in the directory of your first wasp wallet, which gives you, among other information, something like this:</p><pre>{<br>  &quot;chains&quot;: {<br>    &quot;predmarketchain&quot;: &quot;nZBwoJi5q7KGk8D2cgm16PWrdM6aL2qTdCY27HHjZgrK&quot;,<br>  },</pre><p>Now you need to provide this address of the <em>predmarketchain</em> to the new wasp wallets’ configuration files. To do this, you can run the following command for the new wallets in the respective directories (e.g., in <em>waspwallet2</em> through <em>waspwallet5</em> in our example).</p><p>./wasp-cli set chains.predmarketchain nZBwoJi5q7KGk8D2cgm16PWrdM6aL2qTdCY27HHjZgrK</p><p>Please replace the chain address <em>nZBwoJi5q7KGk8D2cgm16PWrdM6aL2qTdCY27HHjZgrK</em> with your actual one.</p><p>Finally, all four new wasp wallets need to be funded. First, find out their addresses by running (for each new waspwallet subdirectory)</p><p>waspwallet2/wasp-cli address</p><p>Then, provide the funds using the cli-wallet (in the directory where it resides on your computer) by running</p><p>./cli-wallet send-funds -amount 40000 -dest-addr 1Ah4cqMPdrDGx6Htapk7NZUxxcYHsP1C3oAugEYHVmACj</p><p>and replacing the address <em>1Ah4cqMPdrDGx6Htapk7NZUxxcYHsP1C3oAugEYHVmACj</em> with the actual address found by running ./wasp-cli balance before in the respective subdirectories (of <em>waspwallet2</em> to <em>waspwallet5</em>).</p><h4>Simulation of a Prediction Market</h4><p>Now we are ready to place bets in the deployed contract’s prediction market on behalf of these four new participants. 
So, in the respective subdirectories, we run</p><ul><li>cd waspwallet2</li><li>./wasp-cli chain post-request predictionmarket bet string BETVALUE string yes --chain=predmarketchain -t IOTA:100</li><li>cd ../waspwallet3</li><li>./wasp-cli chain post-request predictionmarket bet string BETVALUE string no --chain=predmarketchain -t IOTA:50</li><li>cd ../waspwallet4</li><li>./wasp-cli chain post-request predictionmarket bet string BETVALUE string yes --chain=predmarketchain -t IOTA:200</li><li>cd ../waspwallet5</li><li>./wasp-cli chain post-request predictionmarket bet string BETVALUE string yes --chain=predmarketchain -t IOTA:500</li></ul><p>Finally, the contract owner (with the first wallet) can close the prediction market by running, in the directory of the first wallet,</p><p>./wasp-cli chain post-request predictionmarket closemarket string BETVALUE string no --chain=predmarketchain</p><p>In this example, the actual outcome is specified to be “no”. When running this command from a different wasp wallet’s directory, we obtain a log output on the Wasp node</p><pre>You are not authorised to close the prediction market - only contract creator is allowed to close the market.</pre><p>When successfully closing the prediction market, the Wasp node’s log outputs</p><pre>CLOSEMARKET is executed:<br>the winning value is: &quot;no&quot;<br>total amount of bets placed on &quot;no&quot; is 60 IOTA<br>total amount of bets placed on &quot;yes&quot; is 800 IOTA<br>total amount of bets over all values: 860 IOTA<br>1FbCCHv9if3xbnRg3wJ7SY1kBFSdhNFR6Ax6haw6PhYDL placed a bet on &quot;no&quot;, which is a WIN<br>bet amount: 10 IOTA; won amount: 143 IOTA; of total amount placed a bet on 860; where total amount per winning value: 60<br>transferring won amount of IOTA to: 1FbCCHv9if3xbnRg3wJ7SY1kBFSdhNFR6Ax6haw6PhYDL<br>1FZtVTCi2GDuQ1oMGZqpT38akLpcMiMv6a8MVKNJYYdsr placed a bet on &quot;yes&quot;, which is not a win<br>1F81pGLKLhPb5ANFSGWQ7UPMSnPdahNZaZkgrcyaFXvpu placed a bet on 
&quot;no&quot;, which is a WIN<br>bet amount: 50 IOTA; won amount: 716 IOTA; of total amount placed a bet on 860; where total amount per winning value: 60<br>transferring won amount of IOTA to: 1F81pGLKLhPb5ANFSGWQ7UPMSnPdahNZaZkgrcyaFXvpu<br>17jdFbAhWwF79fBEia6A8AYMTmMYncipaFhDTjsqUbMfp placed a bet on &quot;yes&quot;, which is not a win<br>1BAgmaSN1RYk5rbbxMK21CZo8t2zQ3EFeMwDPnMhQdQbs placed a bet on &quot;yes&quot;, which is not a win<br>consensus/action.go:338 postTransaction: POSTED TRANSACTION: 4Aw6PzQGkk6MFzPVAXZkz8RGiPocYxuxhA9o7qgeDN7h, number of inputs: 2, outputs: 3<br>EVENT: state was synced to block index #11, approving output: [0]4Aw6PzQGkk6MFzPVAXZkz8RGiPocYxuxhA9o7qgeDN7h<br>STATE TRANSITION TO #11. requests: 1, chain output: [0]4Aw6PzQGkk6MFzPVAXZkz8RGiPocYxuxhA9o7qgeDN7h</pre><p>Revisiting the bets, we had placed the following ones:</p><iframe src="" width="0" height="0" frameborder="0" scrolling="no"><a href="https://medium.com/media/0b27b63cd0d3c74c7d815b7070566647/href">https://medium.com/media/0b27b63cd0d3c74c7d815b7070566647/href</a></iframe><p>Bets number 1 and 3 were placed on “no”, which is the actual outcome. The total amount of IOTA was 860; the bets on “no” were only 60 IOTA in total. The share of bet number 1 is 10/60 and the share of bet number 3 is 50/60. So, wasp wallet 1 receives 1/6 of 860 IOTA, i.e., 143 IOTA. And wasp wallet 3 receives 5/6 of 860 IOTA, i.e., 716 IOTA. 
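</p><p>These figures can be reproduced with integer arithmetic. A quick sketch (the function name is mine, not from the contract code):</p><pre>
```rust
/// A winning bet receives its share of the pool bet on the winning
/// outcome, applied to the total pool (integer division on whole IOTA).
fn won_amount(bet: u64, total_on_winning: u64, total_pool: u64) -> u64 {
    bet * total_pool / total_on_winning
}

fn main() {
    // 860 IOTA in total, of which 60 IOTA were placed on the winning "no"
    println!("{}", won_amount(10, 60, 860)); // prints 143, as logged
    println!("{}", won_amount(50, 60, 860)); // prints 716, as logged
}
```
</pre><p>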
Note that when transferring funds, a minimum transaction fee of 1 IOTA is deducted from the amount to be transferred.</p><p>Note that running the <em>closemarket</em> function of the smart contract a second time leads to an error message in the Wasp node’s log:</p><pre>the prediction market was already closed</pre><h4>Limitations</h4><p>There are some limitations of the presented prediction market:</p><ul><li>Only one contract per chain can be deployed because the bets are not stored per contract identification in the chain’s state</li><li>All bets are stored on-chain, so they are public</li><li>Each account (given by a wasp wallet) can place only one bet per deployed prediction market contract</li><li>The actual question asked by the prediction market and the possible outcomes have to be conveyed informally</li><li>Bets are against other market participants — there is no market maker</li></ul><h3>Insights</h3><p>While producing this article and experimenting with IOTA smart contracts, I gained many small insights. I share them in the following, in the hope that they will be helpful.</p><h4>Transaction Time</h4><p>Running transactions in the described setup on a laptop can take several seconds when waiting for the wasp-cli commands to finish. However, they can be run asynchronously when using the wasp-cli, i.e., commands can return immediately after being issued. Of course, transactions can run in parallel. When running scripted wasp-cli requests, one has to take care not to create too much load on the Wasp node because otherwise requests will fail with “time out” errors. To evaluate transaction times in a production setup on the Devnet or Mainnet, further investigations are required.</p><h4>Transaction Fees</h4><p>In principle, fees for transactions and deployments are configurable. 
However, there are some minimum fees that apply, such as for deploying a chain (100 IOTA), deploying a smart contract, and posting state-changing requests to a smart contract (1 IOTA). With default minimum fees in mind, transaction costs in the IOTA network should be very low compared to transaction fees in the Ethereum network. Note that chain owners could also increase transaction fees.</p><h4>Production Readiness</h4><p>At first, nothing really worked when trying to set up a private IOTA network based on public documentation. That is, one has to find workable versions of the software, find a proper configuration and parameters, and find working ways of funding wallets and deploying contracts. The IOTA 2.0 software is under development, and special care has to be taken regarding which versions to run and which versions of the different pieces (Wasp, Goshimmer, cli) to combine. The documentation in <a href="https://github.com/iotaledger/">IOTA’s repositories</a> is lagging behind the development process. However, the IOTA community is very helpful and can be reached on <a href="https://discord.iota.org/">Discord</a>. So, getting the network up and running depended on digging into material in the <a href="https://github.com/iotaledger/">GitHub</a> repositories, third-party information (e.g., on <a href="https://www.youtube.com/playlist?list=PLyzQwFp8SFUWHhim4PBhV75W3nLxBWZI6">YouTube</a>), own experiments, and asking questions on IOTA’s <a href="https://discord.iota.org/">Discord</a> server. Once the IOTA private network and the smart contract are up and running, you are ready for testing and experiments. 
Further steps taken by IOTA to bring the smart contract implementation and IOTA 2.0 to production readiness can be looked up in their <a href="https://roadmap.iota.org/">roadmap</a>.</p><h4>Developing Smart Contracts</h4><p>Using Rust as a language for developing smart contracts might help produce error-free and stable smart contracts because Rust is well-known for its strict and sophisticated compiler, which, together with the strong typing and the language design, eliminates whole classes of errors common in other languages. Also, the compiler provides helpful error messages. Therefore, using <a href="https://www.innoq.com/de/articles/2020/08/smart-contracts-in-rust/">Rust can provide some advantages</a> over the Solidity language, which is usually used for developing Ethereum smart contracts. However, Rust might be unfamiliar to smart contract developers used to Solidity.</p><h3>Conclusion</h3><p>This article discussed smart contracts as a new and upcoming feature in IOTA 2.0. Because IOTA conducts and stores transactions in a more parallel way and with fewer resources than Ethereum, higher scalability and lower fees are promises that come with this approach. Our experimental setup of a private IOTA network and a simple prediction market smart contract shows that it is already possible to use IOTA smart contracts for development and testing. As the process of setting up the IOTA network and running smart contracts was a bit challenging, I believe this article and the example code can provide useful input and support for other developers and the community interested in IOTA smart contracts.</p><p>51nodes GmbH, based in Stuttgart, is a provider of crypto economy solutions. 51nodes supports companies and other organizations in realizing their Blockchain projects. 
51nodes offers technical consulting and implementation with a focus on smart contracts, decentralized apps (DApps), integration of blockchain with industry applications, and tokenization of assets.</p><p>Thanks to <a href="https://medium.com/@turfa">Majd Turfa</a> and <a href="https://medium.com/@jpbu">Jan-Paul Buchwald</a> for their help in the course of developing this article, experiments, and the setup.</p><hr><p><a href="https://medium.com/51nodes/exploring-iota-2-0-smart-contracts-in-a-private-network-developing-a-prediction-market-c2d81988f75e">Exploring IOTA 2.0 Smart Contracts in a Private Network: Developing a Prediction Market</a> was originally published in <a href="https://medium.com/51nodes">51nodes</a> on Medium, where people are continuing the conversation by highlighting and responding to this story.</p>]]></content:encoded>
        </item>
        <item>
            <title><![CDATA[Kusama Validator Node Setup]]></title>
            <link>https://medium.com/51nodes/kusama-validator-node-setup-5ddcd5e39f99?source=rss----89b961a921c---4</link>
            <guid isPermaLink="false">https://medium.com/p/5ddcd5e39f99</guid>
            <category><![CDATA[kusama]]></category>
            <category><![CDATA[polkadot]]></category>
            <category><![CDATA[validator]]></category>
            <dc:creator><![CDATA[Julian Voelkel]]></dc:creator>
            <pubDate>Tue, 11 May 2021 15:09:30 GMT</pubDate>
            <atom:updated>2021-05-11T15:09:30.008Z</atom:updated>
            <content:encoded><![CDATA[<p>How to set up a secure Kusama Validator in 2021.</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/1*MAYvQkafAsbLpBeWA2wVwg.jpeg" /></figure><p>A short time ago, we at <a href="https://www.51nodes.io/">51nodes</a> decided to host our own Kusama Validator. This article will help you understand how to set up your own Validator on Polkadot’s Canary Network. At the very least you will walk away with a basic understanding of how staking works in Polkadot and what responsibilities the different roles have in the ecosystem. Let&#39;s start with the two most important concepts: Validators and Nominators.</p><h4>Validators and Nominators</h4><p>In Polkadot’s ecosystem, a node participating in the BABE consensus is called a <strong>Validator</strong>. Such an entity has three key responsibilities:</p><ul><li>Produce blocks if asked to do so</li><li>Vote on the blocks (more precisely the “chain of blocks”) that still have to be finalized</li><li>Forward messages between Parachains</li></ul><p>To allow your Validator to become part of the active set that is currently validating the chain, you will need to stake a minimum of ~<a href="https://polkaview.network/ksm">4000 KSM</a>. Obviously, holding or acquiring such a vast amount of KSM is only possible if you’re quite wealthy, as 4000 KSM is worth about 1.28 million dollars as of this writing.</p><p>A <strong>Nominator</strong> is an entity holding a vast amount of KSM, most likely one that has been involved in Polkadot for a long time. The Nominator selects different Validators to stake their KSM with. <em>In short, this role describes someone who does not run their own node. Instead, they stake their KSM as profitably as possible by assigning their stake to a Validator</em>. Please understand that staking is risky for the Nominator: if the Validator — for whatever reason — stops running, they both risk getting slashed. 
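The Validator/Nominator economics described above can be sketched with some toy arithmetic. All figures below — the era reward, commission, stakes, and slash percentage — are hypothetical and only illustrate the mechanism; real payouts are computed on-chain per era.

```shell
# Toy numbers for the validator/nominator relationship (all hypothetical).
ERA_REWARD=1000000        # era payout for this validator, in micro-KSM
COMMISSION_PCT=3          # commission the validator charges
OWN_STAKE=400             # validator self-stake, in KSM
NOMINATED=3600            # stake assigned by nominators, in KSM

# The validator takes its commission off the top; the remainder is
# shared pro rata by stake, so nominators earn on their share:
CUT=$((ERA_REWARD * COMMISSION_PCT / 100))
NOM_REWARD=$(( (ERA_REWARD - CUT) * NOMINATED / (OWN_STAKE + NOMINATED) ))

# A slash reduces the bonded stake proportionally, so it hits the
# nominated stake as well as the validator's own:
SLASH_PCT=1
NOM_SLASH=$((NOMINATED * SLASH_PCT / 100))

echo "commission=$CUT nominator_reward=$NOM_REWARD nominator_slash=${NOM_SLASH}KSM"
# commission=30000 nominator_reward=873000 nominator_slash=36KSM
```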
The slash would ultimately affect the Nominator&#39;s stake.</p><blockquote>This mechanism incentivizes Nominators to choose Validators that charge a minimal commission fee and have a good reputation for running their nodes. Validators, in turn, are incentivized to run their nodes as stably as possible and can use low commission fees to attract Nominators.</blockquote><h4>Controller and Stash</h4><p>Now that we know the two most important roles in the consensus process, we can talk about the accounts you have to set up to run your Validator. Two account types are of importance at this stage — the <strong>Controller</strong> and the <strong>Stash</strong>.</p><p>Controller and Stash accounts are quickly explained. While the Stash is the entity holding the majority of your funds, the Controller is the entity that has control over the Stash and carries out the actions you take — like starting or stopping your Validator.</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/370/1*4uoJC8iq6O7Rx3Plu0X_Iw.png" /><figcaption>Stash and Controller [1]</figcaption></figure><h4>Setting up a Secure Validator</h4><p>To set up your Validator you can choose either the <a href="https://guide.kusama.network/docs/en/mirror-maintain-guides-how-to-validate-kusama">manual set-up</a> or the <a href="https://github.com/w3f/polkadot-validator-setup">Secure Validator Setup</a>. As you can already guess from the subtitle, we have chosen the second option.</p><p>Within the Secure Validator Setup, you have two options for setting up the node. The first is to set up your server and the Polkadot application in one run using Terraform. We decided on the second option, which is to set up your server yourself and then use Ansible to securely set up the Polkadot application.</p><p>For the setup you will need the following:</p><ol><li>At least one Debian-based machine (preferably Ubuntu 18.04, as used in Polkadot’s documentation) with the following specs (min. 
300GB storage, 2–8GB memory, 1–2 CPUs) hosted with the provider of your choice.</li><li><a href="https://docs.ansible.com/ansible/latest/installation_guide/intro_installation.html">Ansible</a> (v2.8+) installed on your machine</li><li>A minimum of 3 KSM, depending on the amount you want to stake yourself. 0.3 KSM will be needed for requesting an on-chain identity, and 1–2 KSM should be held for upcoming transaction fees. The remainder can then be staked with your own Validator. A higher amount of self-stake raises trust in your Validator, as you are projecting more trust on it yourself.</li></ol><h4>Preparing Your Node</h4><p>Before deploying Polkadot to your remote machine, you will need to prepare an additional user and SSH access for this user. Start by <a href="https://www.cyberciti.biz/faq/add-new-user-account-with-admin-access-on-linux/">creating a new admin user</a> and then <a href="https://www.digitalocean.com/community/tutorials/how-to-set-up-ssh-keys-2">creating new SSH keys</a>. Preferably, you should also disable the root login as described in the SSH key instructions.</p><h4>Running the Secure Validator Setup</h4><p>Once you can successfully open a shell to your remote machine, you can adjust the Ansible scripts to your needs. First, clone the repository:</p><pre>git clone <a href="https://github.com/w3f/polkadot-validator-setup.git">https://github.com/w3f/polkadot-validator-setup.git</a></pre><p>Now start adjusting the <em>ansible/inventory.sample</em> file according to the instructions given in the <a href="https://github.com/w3f/polkadot-validator-setup/blob/master/GUIDE_ANSIBLE.md">setup guide</a>. 
Below you can find an already edited example inventory file using placeholder values:</p><iframe src="" width="0" height="0" frameborder="0" scrolling="no"><a href="https://medium.com/media/edf231d182350f63a4a47798ea850e57/href">https://medium.com/media/edf231d182350f63a4a47798ea850e57/href</a></iframe><p>Once you are done adjusting the inventory file to your needs, you should be able to execute the playbook via the provided script:</p><pre>chmod +x setup.sh<br>./setup.sh my_inventory.yml</pre><p>The Ansible playbook will now be executed and deploy the Polkadot service to your remote machine.</p><p>If everything worked well, your node should show up on telemetry. It will now take some time until the node is in sync with the network. For us, this took nearly 7 days. Once finished, you only need to <a href="https://github.com/w3f/polkadot-validator-setup/blob/master/GUIDE_ANSIBLE.md">bond your KSM</a> and <a href="https://github.com/w3f/polkadot-validator-setup/blob/master/GUIDE_ANSIBLE.md">set your session keys</a> to enter your Validator into the “waiting” list. From there on, it is a game of building trust in your Validator so that it receives enough outside stake to participate in the active validator pool. The more trust you generate in yourself and your node, the steadier your rewards will become in the long term. To build more trust in your node you can also have a look at <a href="https://wiki.polkadot.network/docs/en/learn-identity">on-chain identities</a>. For about 0.3 KSM you can request a <a href="https://wiki.polkadot.network/docs/en/learn-identity#registrars">Registrar</a> to verify your personal information, which Nominators can then use to find out who is actually running the node.</p><h4><strong>Monitoring Your Validator</strong></h4><p>As keeping your Validator up and running is key to operating it successfully, good monitoring is a must-have. 
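As a sketch of what the scrape side of such a monitoring stack can look like, a minimal Prometheus configuration for a validator host might be (the hostname is hypothetical; 9615 is the Substrate/Polkadot client's default Prometheus metrics port, and 9100 is the node exporter's default port):

```yaml
scrape_configs:
  # Metrics exposed by the Polkadot/Kusama client itself
  - job_name: kusama-validator
    static_configs:
      - targets: ['validator-host:9615']
  # OS-level metrics from the node exporter
  - job_name: node-exporter
    static_configs:
      - targets: ['validator-host:9100']
```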
For an initial layer of monitoring, you can use a <a href="https://t.me/Kusama_bot">Telegram Bot</a>. For the more sophisticated part of the monitoring, most people will use <a href="https://prometheus.io/">Prometheus</a> and <a href="https://grafana.com/">Grafana</a>. You can set these up following the <a href="https://wiki.polkadot.network/docs/en/maintain-guides-how-to-monitor-your-node">instructions given in Polkadot&#39;s documentation</a>, but beware that some metrics might not show up in the provided Grafana dashboard, as not all metrics are delivered; this is what happened to us. For that reason we enabled the node exporter in the Secure Validator Setup, which provides additional data and can be displayed in Grafana with dashboards like <a href="https://grafana.com/grafana/dashboards/11207/reviews">this one</a>.</p><h4>Validator Stats and Nomination</h4><p>In the PolkadotJS UI under <a href="https://polkadot.js.org/apps/#/staking/query">Validator Stats</a> you can browse Validators and inspect stats like rewards or previous slashes. For example, search for <em>ESgN4sdziBpAubx2pGJ1CapGxetE2E1zUbTQCaRFbxxrW2Y</em> and you will find our Validator. Using the displayed information, a Nominator can assess the risk of backing a Validator and nominate it as <a href="https://wiki.polkadot.network/docs/en/maintain-guides-how-to-nominate-polkadot">described in the documentation</a>.</p><p><a href="https://www.51nodes.io/">51nodes GmbH</a> based in Stuttgart is a provider of crypto-economy solutions.</p><p>51nodes supports companies and other organizations in realizing their Blockchain projects. 
51nodes offers technical consulting and implementation with a focus on smart contracts, decentralized apps (DApps), integration of Blockchain with industry applications, and tokenization of assets.</p><p>[1] <a href="https://wiki.polkadot.network/docs/en/learn-staking">https://wiki.polkadot.network/docs/en/learn-staking</a></p><img src="https://medium.com/_/stat?event=post.clientViewed&referrerSource=full_rss&postId=5ddcd5e39f99" width="1" height="1" alt=""><hr><p><a href="https://medium.com/51nodes/kusama-validator-node-setup-5ddcd5e39f99">Kusama Validator Node Setup</a> was originally published in <a href="https://medium.com/51nodes">51nodes</a> on Medium, where people are continuing the conversation by highlighting and responding to this story.</p>]]></content:encoded>
        </item>
        <item>
            <title><![CDATA[Epirus Chain Explorer for Quorum]]></title>
            <link>https://medium.com/51nodes/epirus-chain-explorer-for-quorum-2695ef417dbc?source=rss----89b961a921c---4</link>
            <guid isPermaLink="false">https://medium.com/p/2695ef417dbc</guid>
            <category><![CDATA[quorum]]></category>
            <category><![CDATA[blockchain]]></category>
            <category><![CDATA[epirus]]></category>
            <dc:creator><![CDATA[Julian Voelkel]]></dc:creator>
            <pubDate>Fri, 16 Apr 2021 08:45:54 GMT</pubDate>
            <atom:updated>2021-04-16T08:45:54.696Z</atom:updated>
            <content:encoded><![CDATA[<p>Using Helm Charts to set up the Epirus Chain Explorer and connect it to a running Quorum network.</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/1*N0nqPERfWD1AMd6tE9sGvA.jpeg" /></figure><p>Some weeks ago we published an <a href="https://medium.com/51nodes/quorum-kubernetes-templates-6cda506a6887">article</a> referring to one of our <a href="https://github.com/51nodes/quorum-raft-helm-template/tree/main">GitHub projects</a> that you can use to deploy a Quorum network with a dynamic number of nodes to your Kubernetes cluster. In this article, I will describe how to set up the newest addition to the Helm Chart — the Epirus block explorer.</p><h4>Why we use Epirus</h4><p>If you face the task of choosing a block explorer in 2021, you will most likely have 2–3 viable solutions, some of which are licensed and some free to use. With most Ethereum clients, people will likely favor <a href="https://blockscout.com/poa/sokol/">BlockScout</a>, an open-source block explorer with a wide variety of features. But when it comes to Quorum-based networks, we discovered some quirks that do not align well with BlockScout. One example is BlockScout having <a href="https://github.com/blockscout/blockscout/issues/3294">problems parsing Raft’s block timestamps</a>, which are in nanoseconds instead of seconds. We discovered that those kinds of problems often arise with open-source explorers in combination with Quorum-based networks. 
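To illustrate the timestamp quirk: recovering a conventional Unix timestamp from a Raft-style nanosecond value is just an integer division by 10^9 (the timestamp below is made up for the example; the `-d @` flag is GNU date):

```shell
# Raft block timestamps are in nanoseconds; most explorer tooling
# expects seconds, so divide by 1e9 before interpreting the value.
RAFT_TS_NS=1618562754000000000        # hypothetical raft block timestamp
TS_S=$((RAFT_TS_NS / 1000000000))     # 1618562754
date -u -d "@$TS_S" +"%Y-%m-%d %H:%M:%S"
# prints 2021-04-16 08:45:54
```

Fed the raw nanosecond value instead, a tool assuming seconds would compute a date tens of billions of years in the future, which is roughly the failure mode reported in the BlockScout issue linked above.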
This could be due to the enterprise nature of Quorum, which could ultimately lead to fewer open-source contributions.</p><blockquote>Based on those discoveries we decided to stick with the explorer that covers most of Quorum’s peculiarities and is also referred to in the official Quorum documentation — <a href="https://www.web3labs.com/epirus-explorer">Epirus</a>.</blockquote><h4>How to deploy the Explorer</h4><p>To deploy the Epirus Explorer to your cluster, adjust the <em>epirus</em> values in the <a href="https://github.com/51nodes/quorum-raft-helm-template/blob/main/quorum/values.yaml"><em>values.yaml</em> file</a>. Enable the <em>ingress</em> value to make it accessible locally, or adjust the <em>node</em> value to change which node Epirus connects to.</p><iframe src="" width="0" height="0" frameborder="0" scrolling="no"><a href="https://medium.com/media/e48c006b5a050ce74dbc8974ef490b65/href">https://medium.com/media/e48c006b5a050ce74dbc8974ef490b65/href</a></iframe><p>After adjusting these values, simply run <em>helm install</em> if you don&#39;t have a running deployment yet, or <em>helm upgrade</em> if you have already deployed the charts to your cluster:</p><pre>helm install nnodes quorum -n quorum-network<br>helm upgrade nnodes quorum -n quorum-network</pre><h4>Accessing the UI</h4><p>After a successful deployment of the explorer, we can access the UI using the cluster’s IP. Run the following command to get the IP and go to <em>&lt;cluster-ip&gt;/dashboard</em>:</p><pre>kubectl cluster-info</pre><p>If everything deployed and started successfully, you should be welcomed by the Epirus Dashboard. 
Navigate to the different tabs on the left to get information on <em>contracts</em>, <em>transactions</em>, <em>blocks</em>, and the <em>network</em> itself.</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/1*5Nf89ql0g93563lfGoTqzg.png" /></figure><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/1*5KojCSk4DZbq0_8eHVhkVw.png" /></figure><p><a href="https://www.51nodes.io/">51nodes GmbH</a> based in Stuttgart is a provider of crypto-economy solutions.</p><p>51nodes supports companies and other organizations in realizing their Blockchain projects. 51nodes offers technical consulting and implementation with a focus on smart contracts, decentralized apps (DApps), integration of Blockchain with industry applications, and tokenization of assets.</p><img src="https://medium.com/_/stat?event=post.clientViewed&referrerSource=full_rss&postId=2695ef417dbc" width="1" height="1" alt=""><hr><p><a href="https://medium.com/51nodes/epirus-chain-explorer-for-quorum-2695ef417dbc">Epirus Chain Explorer for Quorum</a> was originally published in <a href="https://medium.com/51nodes">51nodes</a> on Medium, where people are continuing the conversation by highlighting and responding to this story.</p>]]></content:encoded>
        </item>
        <item>
            <title><![CDATA[Kusama & Polkadot: Build an application specific Blockchain and launch a Parachain!]]></title>
            <link>https://medium.com/51nodes/kusama-polkadot-build-an-application-specific-blockchain-and-launch-a-parachain-524ff4560acf?source=rss----89b961a921c---4</link>
            <guid isPermaLink="false">https://medium.com/p/524ff4560acf</guid>
            <category><![CDATA[substrate]]></category>
            <category><![CDATA[kusama]]></category>
            <category><![CDATA[parachain]]></category>
            <category><![CDATA[polkadot]]></category>
            <category><![CDATA[rococo]]></category>
            <dc:creator><![CDATA[Marco Walz]]></dc:creator>
            <pubDate>Thu, 18 Feb 2021 11:15:52 GMT</pubDate>
            <atom:updated>2021-02-18T15:31:58.266Z</atom:updated>
            <content:encoded><![CDATA[<figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/0*n61GTJQmkSx8ir-a.png" /><figcaption>Source: <a href="https://polkadot.network/">https://polkadot.network/</a></figcaption></figure><blockquote><strong>Note:</strong> This article provides a general and slightly technical introduction to Parachains and shows how to get started building a Substrate-based blockchain that can be launched as a Parachain in the Kusama &amp; Polkadot ecosystem. It explains some parts of Polkadot, but it won’t go too deep into specific details like staking, slashing or other important mechanics in the ecosystem. Check out the <a href="https://wiki.polkadot.network/">official wiki</a> to learn more about Polkadot.</blockquote><h3>Status quo</h3><p>First of all, it is important to mention that the <a href="https://wiki.polkadot.network/docs/en/learn-parachains"><strong>Parachain</strong></a> functionality is not live yet. This functionality is the last missing piece of the Polkadot <a href="https://github.com/polkadot-io/polkadotpaper/raw/master/PolkaDotPaper.pdf">whitepaper</a>, and there is a public <a href="https://polkadot.network/launch-parachains/"><strong>rollout plan</strong></a> available that provides an overview of the progress. However, <a href="https://medium.com/polkadot-network/introducing-rococo-polkadots-parachain-testnet-e3e67fc40b56">since August 2020</a> the Parachain functionality has been tested in the official Parachain testnet <strong>Rococo</strong>, which in V0 only ran prototype code. 
In December 2020 <a href="https://medium.com/plasm-network/announcing-plasm-network-parachain-on-rococo-v1-bf49b3f3ca53">Rococo V1 was launched</a>, and recently <strong>Plasm Network</strong> <a href="https://medium.com/plasm-network/announcing-plasm-network-parachain-on-rococo-v1-bf49b3f3ca53">announced that it is the first project to join the Rococo V1 testnet</a>, which surely is a big milestone, as the <em>codebase of Rococo V1 will also be used in </em><strong><em>Kusama</em></strong><em> and </em><strong><em>Polkadot</em></strong>. Meanwhile, other possible future Parachains like <strong>KILT</strong> <a href="https://kilt-protocol.medium.com/kilts-path-to-parachains-ad3be644db8a">followed and announced their own roadmap</a>.</p><h3>Kusama vs. Polkadot</h3><p>Although Kusama can be seen as the pre-production environment for Polkadot, you should be aware that Kusama might, and probably will, evolve independently from Polkadot. If your project is in an early stage, or if you want to experiment with Parachains in a real-world environment before moving into production, you will probably prefer Kusama over Polkadot at first. You might also be interested in testing specific features in Kusama that aren’t yet available on Polkadot.</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/0*qiuv8QDipVkh2f6L.png" /><figcaption>Source: <a href="https://polkadot.network/">https://polkadot.network/</a></figcaption></figure><h3>Relay Chain &amp; Parachains</h3><p>The Relay Chain is the central chain of Polkadot. All validators of Polkadot are staked in DOT on the Relay Chain and validate for the Relay Chain. It deliberately has minimal functionality; its main responsibility is to coordinate the system as a whole, including the Parachains. Other specific work is delegated to the Parachains, which serve different purposes. All Parachains connected to the Relay Chain <strong>share</strong> the same <strong>security</strong>. 
Polkadot also has a <strong>shared state</strong> between the Relay Chain and all connected Parachains to <em>ensure the validity of the entire system</em>. Smart Contracts are not supported on the Relay Chain but <strong><em>can be</em></strong> <strong><em>supported</em></strong> by Parachains.</p><blockquote>It’s expected that each Parachain will be as light and application-specific as possible and serve a specific use case.</blockquote><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/0*RaNttLQ9gOHCEHSX.png" /><figcaption>Source: <a href="https://medium.com/polkadot-network/parathreads-pay-as-you-go-parachains-7440d23dde06">https://medium.com/polkadot-network/parathreads-pay-as-you-go-parachains-7440d23dde06</a></figcaption></figure><p><a href="https://medium.com/polkadot-network/the-path-of-a-parachain-block-47d05765d7a"><em>The Path of a Parachain Block</em></a> describes in detail how a new Parachain block is produced.</p><p>You may notice the term <strong>Parathread</strong> in the picture above. The current (optimistic) assumption is that the Relay Chain supports <strong>up to 100 Parachains</strong>, and some of these Parachain slots will be permanently reserved to serve the network on the system level (e.g. Governance). Thus there probably won’t be enough slots available for all projects, which is why slots need to be obtained in a so-called auction (more on that below). Whereas Parachains get guaranteed high throughput by bonding DOT tokens, Parathreads follow a <a href="https://medium.com/polkadot-network/parathreads-pay-as-you-go-parachains-7440d23dde06">pay-as-you-go model</a> in which a fee in DOT is paid to validators for creating blocks. In some cases this even makes sense, namely if few or infrequent state updates are expected (e.g. 
DNS, Oracles).</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/0*2KsYeqB_0a04LztN.png" /><figcaption>Source: <a href="https://medium.com/polkadot-network/parathreads-pay-as-you-go-parachains-7440d23dde06">https://medium.com/polkadot-network/parathreads-pay-as-you-go-parachains-7440d23dde06</a></figcaption></figure><p>Parachains can become Parathreads and vice versa, as they share the same technical base.</p><h3>Parachain Slot Auction &amp; Parachain Crowdloans</h3><p>In order to become a Parachain in Kusama or Polkadot, a slot must be obtained in a <a href="https://wiki.polkadot.network/docs/en/learn-auction">Parachain Slot Auction</a>. Each slot duration is capped at 2 years and divided into 6-month lease periods. Parachains may lease more than one slot over time in order to extend their lease past the 2-year slot duration. Depending on the auctions, a single slot might be owned by 4 different Parachains over its 2-year duration.</p><blockquote>Parachains don’t always need to inhabit the same slot. As long as a Parachain inhabits any slot, it can continue as a Parachain.</blockquote><p>Parachain candidates can place bids in unpermissioned <a href="https://en.wikipedia.org/wiki/Candle_auction"><strong><em>candle auctions</em></strong></a> that rely on <a href="https://en.wikipedia.org/wiki/Verifiable_random_function">verifiable random functions</a> (VRFs), where the original mechanism has been slightly modified to be secure on a blockchain. The auction can be divided into two phases:</p><ul><li>Opening Phase</li><li>Closing Phase</li></ul><p>No matter what phase the auction is in, the <strong><em>bids are public</em></strong>. 
A winner is determined at a block chosen randomly between the first and the last block of the Closing Phase, and <em>nobody knows which block determines the winner until the Closing Phase is finished</em>.</p><iframe src="https://cdn.embedly.com/widgets/media.html?src=https%3A%2F%2Fwww.youtube.com%2Fembed%2FfYc1yolanoE%3Ffeature%3Doembed&amp;display_name=YouTube&amp;url=https%3A%2F%2Fwww.youtube.com%2Fwatch%3Fv%3DfYc1yolanoE&amp;image=https%3A%2F%2Fi.ytimg.com%2Fvi%2FfYc1yolanoE%2Fhqdefault.jpg&amp;key=a19fcc184b9711e1b4764040d3dc5c07&amp;type=text%2Fhtml&amp;schema=youtube" width="854" height="480" frameborder="0" scrolling="no"><a href="https://medium.com/media/74f01e0b9f03cb00c1fe3dec94af1a77/href">https://medium.com/media/74f01e0b9f03cb00c1fe3dec94af1a77/href</a></iframe><p>Projects will also be able to launch a <a href="https://wiki.polkadot.network/docs/en/learn-crowdloans"><strong>Crowdloan Campaign</strong></a> with specific parameters, which allows them to borrow DOT from the community to obtain a Parachain slot without having to bond all the required DOT on their own.</p><p>In the video above, core developer Shawn Tabrizi explains both variants very well.</p><h3>Substrate</h3><p>After this overview of the Polkadot ecosystem, it’s time to introduce <strong>Substrate</strong>. 
It is a modular framework that enables you to create purpose-built blockchains by composing custom and/or pre-built components.</p><p>Substrate is the foundation of all blockchains in the Polkadot ecosystem and requires knowledge of <a href="https://www.rust-lang.org/"><strong>Rust</strong></a>.</p><p>In the Substrate Developer Hub (link below) you can learn everything that is required to get started developing your own Substrate-based chain.</p><p><a href="https://substrate.dev/">Official Substrate Documentation for Blockchain Developers · Substrate Developer Hub</a></p><p>A few important things to know about Substrate:</p><ul><li>The <strong>runtime</strong> of Substrate is referred to as the “state transition function”, which contains the business logic that defines the behavior of the blockchain. You as a developer can define <strong>storage items</strong> that represent the state of the blockchain and <strong>functions</strong> that allow users to make changes to the state.</li><li>Substrate ships with <strong>FRAME</strong> (Framework for Runtime Aggregation of Modularized Entities), which is a set of modules (called <strong>Pallets</strong>) and support libraries that simplify runtime development. Pallets are individual modules within FRAME that host domain-specific logic.</li><li>To enable forkless <strong>runtime upgrade capabilities</strong>, Substrate uses runtimes that are built as <strong>WebAssembly (WASM)</strong> bytecode.</li></ul><p>To learn more about how to perform a forkless runtime upgrade, you can take a look at the <a href="https://substrate.dev/docs/en/tutorials/upgrade-a-chain/">official tutorial</a>.</p><h3>Pallets vs. Smart Contracts</h3><p>For people heading over from other chains and ecosystems like Ethereum, it might seem obvious that application-specific logic is always written in Smart Contracts, which in turn are deployed on the blockchain. 
But as you already learned, the idea of the Polkadot ecosystem is to have different Parachains that are as light and as application-specific as possible.</p><p>There are basically 3 possibilities to add custom logic to a Substrate-based blockchain:</p><ol><li>Write <strong>custom Pallets</strong> and include them in the runtime<br>- <a href="https://substrate.dev/docs/en/tutorials/create-a-pallet/">https://substrate.dev/docs/en/tutorials/create-a-pallet/</a><br>- <a href="https://substrate.dev/recipes/pallets-intro.html">https://substrate.dev/recipes/pallets-intro.html</a></li><li>Include the <a href="https://substrate.dev/docs/en/knowledgebase/smart-contracts/contracts-pallet"><strong>Contracts Pallet</strong></a> in the runtime and write contracts in <strong>ink!</strong> (a Rust-based eDSL)<br>- <a href="https://paritytech.github.io/ink-docs/">https://paritytech.github.io/ink-docs</a><br>- <a href="https://substrate.dev/substrate-contracts-workshop/#/">https://substrate.dev/substrate-contracts-workshop</a></li><li>Include <strong>Frontier</strong> (Pallets that serve as an Ethereum compatibility layer) in the runtime and write contracts in <strong>Solidity<br></strong>- <a href="https://github.com/paritytech/frontier">https://github.com/paritytech/frontier</a><br>- <a href="https://substrate.dev/frontier-workshop/#/">https://substrate.dev/frontier-workshop</a></li></ol><blockquote><strong>Custom Pallets</strong> are the best choice if you are building a greenfield project that serves a specific purpose and needs to run its own network.</blockquote><p>Substrate already provides a lot of Pallets, which you can combine as needed together with your custom Pallet(s) to build a lightweight and unique runtime for your application-specific blockchain:</p><ul><li><a href="https://github.com/paritytech/substrate/tree/v3.0.0/frame">Default Pallets of Substrate</a> (Note: The link points to the v3.0.0 version of Substrate, which was released a few days ago. 
Most tutorials are still based on v2.0.0)</li><li><a href="https://github.com/danforbes/pallet-nft">NFT Pallet</a> (There is also a WIP reference implementation that showcases CryptoKitties on Substrate)</li></ul><p>I also discovered a <a href="https://marketplace-staging.substrate.dev/"><strong>Substrate Marketplace</strong></a> where you can find different Pallets (at least the default ones) that you can include in your runtime.</p><p>If you aim to write contracts in <strong>ink!</strong> or port a <strong>Solidity</strong>-based application from Ethereum over to a Substrate-based Parachain, you don’t necessarily need to launch your own Parachain, as there already exist projects that plan to launch a Parachain and include the required Pallets in their runtime:</p><ul><li><a href="https://edgewa.re/"><strong>Edgeware</strong></a> (ink!-based contracts)</li><li><a href="https://moonbeam.network/"><strong>Moonbeam</strong></a> (Solidity contracts)</li></ul><blockquote><strong>Note:</strong> To check out how Moonbeam works you can clone the following repository: <a href="https://github.com/51nodes/moonbeam-playground">https://github.com/51nodes/moonbeam-playground</a></blockquote><h3>Polkadot compatibility with Cumulus</h3><p>When you have written (and hopefully tested ;-)) your custom Pallets, you need to make sure that you build a Polkadot-compatible runtime that exposes an interface for validating its state transitions and provides interfaces to send messages to and receive messages from other Parachains.</p><p><a href="https://wiki.polkadot.network/docs/en/build-cumulus"><strong>Cumulus</strong></a> is an extension to Substrate that makes it easy to transform any Substrate-built runtime into a Polkadot-compatible Parachain. It is still in development, but the idea is that it should be simple to take a Substrate chain and add the Parachain code by importing the crates and adding a single line of code. 
You can also dive deeper into that topic and learn more about the mechanics of Polkadot by reading the <a href="https://w3f.github.io/parachain-implementers-guide/index.html"><strong>Parachain Implementers Guide</strong></a>.</p><p>If you have built your own Parachain and want to join the official Rococo testnet, you have to follow <a href="https://wiki.polkadot.network/docs/en/build-parachains-rococo#rococo-v1-parachain-requirements">this guide</a>. <em>The auction mechanism mentioned above is not active yet, and Rococo is currently controlled by the development leads at Parity Technologies</em>. To join the network, you have to run at least 1 Collator for your Parachain and 1 Validator node for Rococo.</p><blockquote><strong>Note:</strong> Learn how to set up a local Polkadot network based on Rococo by following the Readme of our repository: <a href="https://github.com/51nodes/polkadot-local-rococo">https://github.com/51nodes/polkadot-local-rococo</a></blockquote><h3>Final Thoughts</h3><p>Substrate has a clean design and makes it very easy for anyone familiar with Rust to build an application-specific blockchain. If you decide to run your own Parachain (or Parathread), you should also think about how to incentivise other players to run a Collator node in your network — for example by introducing your own network token.</p><p>Although Parachains are not live on Kusama and Polkadot yet, we can already see many projects that aim to become a Parachain as soon as possible. It will be interesting to watch those projects battle to secure their exclusive Parachain slot on Kusama and Polkadot. 
Personally, I am very interested to see how the auction mechanism works and which projects will secure their Parachain slot via a Crowdloan Campaign.</p><p>I am also looking forward to playing around with the interoperability features that Polkadot provides once the implementation is mature enough.</p><p><a href="https://www.51nodes.io/">51nodes GmbH</a>, based in Stuttgart, is a provider of Crypto Economy solutions.</p><p>51nodes is a member of the <a href="https://www.parity.io/announcing-substrate-delivery-partners/">Substrate Delivery Partners</a> program and supports companies and other organizations in realizing their blockchain projects. 51nodes offers technical consulting and implementation with a focus on smart contracts, decentralized apps (dApps), integration of blockchain with industry applications, and tokenization of assets.</p><img src="https://medium.com/_/stat?event=post.clientViewed&referrerSource=full_rss&postId=524ff4560acf" width="1" height="1" alt=""><hr><p><a href="https://medium.com/51nodes/kusama-polkadot-build-an-application-specific-blockchain-and-launch-a-parachain-524ff4560acf">Kusama &amp; Polkadot: Build an application specific Blockchain and launch a Parachain!</a> was originally published in <a href="https://medium.com/51nodes">51nodes</a> on Medium, where people are continuing the conversation by highlighting and responding to this story.</p>]]></content:encoded>
        </item>
        <item>
            <title><![CDATA[Quorum Kubernetes Templates]]></title>
            <link>https://medium.com/51nodes/quorum-kubernetes-templates-6cda506a6887?source=rss----89b961a921c---4</link>
            <guid isPermaLink="false">https://medium.com/p/6cda506a6887</guid>
            <category><![CDATA[kubernetes]]></category>
            <category><![CDATA[51nodes]]></category>
            <category><![CDATA[helm]]></category>
            <category><![CDATA[consensys]]></category>
            <category><![CDATA[quorum]]></category>
            <dc:creator><![CDATA[Julian Voelkel]]></dc:creator>
            <pubDate>Mon, 01 Feb 2021 08:45:44 GMT</pubDate>
            <atom:updated>2021-02-01T17:04:30.924Z</atom:updated>
<content:encoded><![CDATA[<h4>Setting up a dynamic Raft-based Quorum network for development and testing using Helm.</h4><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/1*YECeOxlko9KoOJNw8RNm3A.jpeg" /></figure><h3>Motivation</h3><p>Following <a href="https://medium.com/51nodes/build-a-minimal-quorum-network-90c59ec22226">this</a> informative article by my colleague <a href="https://medium.com/@turfa">Majd</a>, in which you can learn the basics of Quorum and how to set up a minimal Quorum network using Docker containers, we decided to take things a bit further and use our knowledge to provide a <strong>dynamic Quorum setup deployable to Kubernetes</strong>. In our <a href="https://github.com/51nodes/quorum-raft-helm-template">GitHub repository</a>, we provide a Helm chart and some scripts which aim to make developing and testing on Quorum a lot faster and easier. This article provides some insights into the setup and the functionality of the project.</p><h3>Tooling &amp; Prerequisites</h3><p>For our setup, we need a running Kubernetes cluster and Helm. Helm enables us to deploy a preconfigured network to a running cluster using the concept of charts. This, in combination with some scripts, gives us the <strong>ability to dynamically add and remove nodes to and from the network</strong>.</p><h4>Minikube (Kubernetes)</h4><p><a href="https://minikube.sigs.k8s.io/docs/">Minikube</a> is one of several tools you can use to spin up a Kubernetes cluster locally. Other viable options are <a href="https://k3s.io/">k3s</a> or <a href="https://kind.sigs.k8s.io/docs/user/quick-start/">kind</a>.
Those tools are widely used to develop and test applications on local infrastructure before deploying them to the target infrastructure.</p><h4>Helm</h4><p><a href="https://www.bmc.com/blogs/kubernetes-helm-charts/">Here</a> is a pretty good explanation of what Helm does and how it is connected with Kubernetes:</p><blockquote>In simple terms, Helm is a package manager for Kubernetes. Helm is the K8s equivalent of yum or apt. Helm deploys charts, which you can think of as a packaged application. It is a collection of all your versioned, pre-configured application resources which can be deployed as one unit. You can then deploy another version of the chart with a different set of configuration.</blockquote><h3><strong>Dynamic Nodes</strong></h3><p>Now that we know the technical requirements, we can take a closer look at the actual <a href="https://github.com/51nodes/quorum-raft-helm-template">repository</a>. The intention behind this setup was to use Quorum on Kubernetes. As we did not find any solutions for this besides <a href="https://github.com/ConsenSys/qubernetes">Qubernetes</a>, the officially supported way to deploy Quorum to Kubernetes, we decided to create our own deployments using Helm. This has the particular benefit of being more flexible than the official approach, where Kubernetes deployments are generated once and have to be regenerated and redeployed every time you need a different setup. <strong>Using our dynamic approach, you will be able to keep the network running while adding or removing nodes.</strong></p><h4>Quorum Configuration</h4><p>The following code shows the <em>values.yaml</em> file in which most of the action will take place. <strong>Those values are used to dynamically fill a set of templates which you can then reuse for the number of nodes you want in your network</strong>.
The setup also allows you to modify some additional configuration regarding the deployment, such as input parameters for Quorum and Geth.</p><iframe src="" width="0" height="0" frameborder="0" scrolling="no"><a href="https://medium.com/media/28685ca3fa4d5515184195009eb23acd/href">https://medium.com/media/28685ca3fa4d5515184195009eb23acd/href</a></iframe><p>The <em>values</em> file additionally takes input for the nodes which are going to be deployed to the cluster. Some of those values are needed for adding a node to a Raft-based Quorum cluster, and others provide some additional functionality like enabling or disabling endpoints.</p><iframe src="" width="0" height="0" frameborder="0" scrolling="no"><a href="https://medium.com/media/40118c656098562e968ff3fba7fc1e32/href">https://medium.com/media/40118c656098562e968ff3fba7fc1e32/href</a></iframe><p>Under <em>endpoints</em>, both <em>RPC</em> and <em>WebSocket</em> endpoints can be turned on and off. These are used for communication inside the cluster. Additionally, if needed, you can enable the Ingress controller to easily access nodes from outside the cluster. If you have Ingress enabled, you can access the nodes at http://&lt;cluster-ip&gt;/quorum-node&lt;n&gt;-rpc or http://&lt;cluster-ip&gt;/quorum-node&lt;n&gt;-ws. <em>Nodekey</em> and <em>enode</em> represent the key material for a Raft node; they can be generated using the bootnode command provided by Geth. The <em>key</em> value is a Geth keystore file which holds the credentials for a Geth account. <strong>Only with the combination of bootnode credentials and a Geth account will the node be able to function properly.</strong> Note that the example credentials provided in the repository should never be reused in any production environment.</p><h4>Reusing Templates</h4><p>The above values will then be used to fill in the provided templates.
Those templates usually take values for exactly one node/deployment.</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/1*mIqWP75lccsu7c0MtmmOwQ.png" /><figcaption>templates needed to deploy a single node</figcaption></figure><p>By looping through the nodes object of the <em>values</em> file at the top of each template, we can reuse them for every new node added to the <em>values</em> file. Here&#39;s an example for a <em>PersistentVolumeClaim</em>:</p><iframe src="" width="0" height="0" frameborder="0" scrolling="no"><a href="https://medium.com/media/b1add2bfe04b1b1b9c69dac915befe9c/href">https://medium.com/media/b1add2bfe04b1b1b9c69dac915befe9c/href</a></iframe><h4>Deploying and Updating the Chart</h4><p>Now, let&#39;s start by deploying the network to a running Kubernetes infrastructure.</p><iframe src="" width="0" height="0" frameborder="0" scrolling="no"><a href="https://medium.com/media/2b8cc879723f57b21bbed558a7aad6c6/href">https://medium.com/media/2b8cc879723f57b21bbed558a7aad6c6/href</a></iframe><p>After changing the configuration of your network, upgrades can be installed using:</p><iframe src="" width="0" height="0" frameborder="0" scrolling="no"><a href="https://medium.com/media/0bc773808d10a906df051a305ff62249/href">https://medium.com/media/0bc773808d10a906df051a305ff62249/href</a></iframe><h4>Adding New Nodes</h4><p>Besides adding and removing nodes by modifying the values file manually, we also implemented a more convenient way to do this.
For this, we provide multiple scripts that allow us to <strong>add and remove single as well as multiple specific nodes</strong>.</p><p>In the following, I will use the <em>addNodes.sh</em> script to dynamically add new nodes to a running cluster.</p><iframe src="" width="0" height="0" frameborder="0" scrolling="no"><a href="https://medium.com/media/9573d68026fbbfaf4bc71308a32b51d3/href">https://medium.com/media/9573d68026fbbfaf4bc71308a32b51d3/href</a></iframe><iframe src="" width="0" height="0" frameborder="0" scrolling="no"><a href="https://medium.com/media/b4ce605c55e304a23bd7e5032e725c65/href">https://medium.com/media/b4ce605c55e304a23bd7e5032e725c65/href</a></iframe><p>The <em>addNodes.sh</em> script lets me decide how many nodes I want to add to the cluster. I choose to add two additional nodes, and the script automatically generates the corresponding credentials, adds them to the values file, and upgrades the deployed Helm chart.</p><iframe src="" width="0" height="0" frameborder="0" scrolling="no"><a href="https://medium.com/media/d93667ba9d881eeb5318c74005730759/href">https://medium.com/media/d93667ba9d881eeb5318c74005730759/href</a></iframe><p>Note the additionally generated value <em>raftId</em>. This value is required for every additional node, as it is passed via the flag “--raftjoinexisting &lt;<em>raftId</em>&gt;”, which is needed to properly add a new node to the initial 3-node cluster.</p><iframe src="" width="0" height="0" frameborder="0" scrolling="no"><a href="https://medium.com/media/7be29862f8ba38d781f197e8c98c58a4/href">https://medium.com/media/7be29862f8ba38d781f197e8c98c58a4/href</a></iframe><p>After updating the chart, you can see that the new nodes have been added to the cluster. To confirm that the cluster is running and properly synchronized, you can run the following command, which will open a shell to Geth running inside the container.
You can then use this shell to inspect the state of Raft (see below) or execute other Geth commands. If the resulting <em>nodeActive</em> value is <em>true</em>, the node is properly synced with the cluster and ready to go.</p><iframe src="" width="0" height="0" frameborder="0" scrolling="no"><a href="https://medium.com/media/ad9ceabc84cf69f335d274251a0308dc/href">https://medium.com/media/ad9ceabc84cf69f335d274251a0308dc/href</a></iframe><h3>Conclusion</h3><p>All in all, this project should help people improve the testing and development of Quorum networks running on Kubernetes infrastructure. Besides that, it is also a simple and convenient way to gain some experience with Quorum and experiment with various settings, especially if you have problems figuring out the right configuration for your use case. So if you want to take a look at the <a href="https://github.com/51nodes/quorum-raft-helm-template">repository</a>, feel free to do so. If you have any suggestions or improvements, you might want to create a PR, which we will happily review.</p><p><a href="https://www.51nodes.io/">51nodes GmbH</a>, based in Stuttgart, is a provider of crypto-economy solutions.</p><p>51nodes supports companies and other organizations in realizing their Blockchain projects. 51nodes offers technical consulting and implementation with a focus on smart contracts, decentralized apps (DApps), integration of Blockchain with industry applications, and tokenization of assets.</p><img src="https://medium.com/_/stat?event=post.clientViewed&referrerSource=full_rss&postId=6cda506a6887" width="1" height="1" alt=""><hr><p><a href="https://medium.com/51nodes/quorum-kubernetes-templates-6cda506a6887">Quorum Kubernetes Templates</a> was originally published in <a href="https://medium.com/51nodes">51nodes</a> on Medium, where people are continuing the conversation by highlighting and responding to this story.</p>]]></content:encoded>
        </item>
    </channel>
</rss>