<h1>Develop cloud native applications using .NET Aspire</h1>
<p><em>Jürgen Gutsch, 2024-02-19, <a href="https://asp.net-hacker.rocks/2024/02/19/cloud-native-development-net-aspire.html">ASP.NET Hacker</a></em></p>
<p>At the .NET Conf 2023, Microsoft announced a toolset to build cloud-native applications. That announcement was somewhat <a href="https://youtu.be/z1M-7Bms1Jg?si=JCSi-dqZYptF_ZEf">hidden in a talk</a> by <a href="https://twitter.com/condrong">Glenn Condron</a> and <a href="https://twitter.com/davidfowl">David Fowler</a> about building cloud-native applications using .NET 8, which was also announced at that conference. That talk was actually about .NET Aspire, which I will quickly introduce in this post.</p>
<p>Let's start first by answering a question.</p>
<p>When I did a talk about .NET Aspire recently <a href="https://www.meetup.com/basel-net-user-group/events/297394460/">at the .NET user group in Basel (CH)</a>, one individual in the audience asked me the following question:</p>
<h2>What is a cloud-native application?</h2>
<p>Let's ask the internet to find the right answer:</p>
<blockquote>
<p><strong>Amazon:</strong>
"Cloud native is the software approach of building, deploying, and managing modern applications in cloud computing environments. Modern companies want to build highly scalable, flexible, and resilient applications that they can update quickly to meet customer demands. To do so, they use modern tools and techniques that inherently support application development on cloud infrastructure. These cloud-native technologies support fast and frequent changes to applications without impacting service delivery, providing adopters with an innovative, competitive advantage."
(https://aws.amazon.com/what-is/cloud-native/)</p>
</blockquote>
<blockquote>
<p><strong>Google:</strong>
"A cloud-native application is specifically designed from the ground up to take advantage of the elasticity and distributed nature of the cloud. "
(https://cloud.google.com/learn/what-is-cloud-native)</p>
</blockquote>
<blockquote>
<p><strong>RedHat:</strong>
"Cloud-native applications are a collection of small, independent, and loosely coupled services."
(https://www.redhat.com/en/topics/cloud-native-apps)</p>
</blockquote>
<blockquote>
<p><strong>Oracle:</strong>
"The term cloud native refers to the concept of building and running applications to take advantage of the distributed computing offered by the cloud delivery model. Cloud-native apps are designed and built to exploit the scale, elasticity, resiliency, and flexibility the cloud provides."
(https://www.oracle.com/cloud/cloud-native/what-is-cloud-native/)</p>
</blockquote>
<blockquote>
<p><strong>Microsoft:</strong>
"<em>Cloud-native architecture and technologies are an approach to designing, constructing, and operating workloads that are built in the cloud and take full advantage of the cloud computing model.</em>"
(https://learn.microsoft.com/en-us/dotnet/architecture/cloud-native/definition)</p>
</blockquote>
<blockquote>
<p><strong>Cloud Native Computing Foundation (CNCF):</strong>
"Cloud native technologies empower organizations to build and run scalable applications in modern, dynamic environments such as public, private, and hybrid clouds. Containers, service meshes, microservices, immutable infrastructure, and declarative APIs exemplify this approach.</p>
<p>These techniques enable loosely coupled systems that are resilient, manageable, and observable. Combined with robust automation, they allow engineers to make high-impact changes frequently and predictably with minimal toil."
(https://github.com/cncf/toc/blob/main/DEFINITION.md)</p>
</blockquote>
<p>Every answer is a little different, but they all boil down to the same thing: a cloud-native application is built for the cloud and uses the services the cloud provides to be scalable and resilient.</p>
<h2>What is .NET Aspire doing?</h2>
<p>.NET Aspire provides <strong>tooling</strong> in Visual Studio and the CLI to create and interact with .NET Aspire apps, including project templates for new .NET Aspire apps. It helps with <strong>orchestration</strong>, meaning running and connecting multi-project applications and their dependencies. It also provides <strong>components</strong> that connect to cloud dependencies like queues, caches, databases, or even prebuilt containers. All those components can be orchestrated and connected to your applications using C#. .NET Aspire creates a deployment-ready development environment. Using the Azure Developer CLI (azd), you can easily deploy your cloud-native application to Azure.</p>
<p>.NET Aspire is made for local development, and it is made for Microsoft Azure. Development for and deployment to other clouds might become possible in the future with the support of the developer community. In the first stage, Microsoft will not support other cloud providers, which makes sense since Azure is Microsoft's number one platform.</p>
<p>.NET Aspire uses Docker Desktop to run your cloud-native application. When you press F5 in VS, your apps will be deployed to containers and run on Docker Desktop locally. When you deploy your cloud-native application, a Bicep script will be created and your apps will be deployed to a new Azure resource group inside Azure Container Apps. App Service containers are not supported yet. AKS is only supported via the community tool Aspirate.</p>
<p>Currently, .NET Aspire is in Preview 3, which means some features might not work or are not implemented yet.</p>
<p>But those limitations are absolutely fine for the moment.</p>
<h2>Why is .NET Aspire needed?</h2>
<p>Actually, it is not needed. There are good tools out there to set up a local development environment in which you can develop cloud-native applications. There are also tools that set up your development environment inside the cloud, so you develop in the same environment your application will live in. This is great and super helpful. Unfortunately, these options are sometimes hard to set up, and some teams can't use them for various reasons. For me as a developer on Windows using .NET, the easiest ways to set up an environment locally were to use Docker Compose to run or emulate the services I needed, or to stay connected to the cloud environment all the time and use the cloud services directly. Neither option is perfect.</p>
<p>So, you see, .NET Aspire is not strictly needed. But it is super helpful for me as a C# developer.</p>
<h2>Let's have a quick look at .NET Aspire in action</h2>
<p>To try it out, I created a frontend app using the new Blazor Web App template and a backend that provides the data via a Web API endpoint. Both apps are just the default templates with the weather data demos. I made one small modification: instead of generating the weather data in the frontend, it now loads it from the API.</p>
<p>When you right-click one of the projects and select "Add", you will see two new entries in the context menu:</p>
<ul>
<li>".NET Aspire Component..."</li>
<li>".NET Aspire Orchestration Support..."</li>
</ul>
<p><img src="https://asp.net-hacker.rocks/img/net-aspire/image-20240222214405599.png" alt="image-20240222214405599" /></p>
<p>Selecting ".NET Aspire Orchestration Support...", it creates two new projects in your solution:</p>
<p><img src="https://asp.net-hacker.rocks/img/net-aspire/image-20240222214727408.png" alt="image-20240222214727408" /></p>
<p>The AppHost is the project where you do the actual composition; we will have a more detailed look at it later. The ServiceDefaults project contains a single code file with extension methods that configure default services and middleware the actual projects need to use, mainly telemetry and health checks. These service defaults are wired into the actual projects when you add the Aspire orchestration support. The following code shows the usage of the defaults in lines 5 and 17:</p>
<p><img src="https://asp.net-hacker.rocks/img/net-aspire/image-20240222215145900.png" alt="image-20240222215145900" /></p>
<p>As you can see, I also configured an HttpClient that connects to the backend API.</p>
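<p>Since the code above is only a screenshot, here is a rough sketch of what the frontend's <code>Program.cs</code> looks like after adding the orchestration support. The Blazor registration lines are assumptions based on the default template; <code>AddServiceDefaults()</code> and <code>MapDefaultEndpoints()</code> are the extension methods coming from the ServiceDefaults project:</p>
<pre><code class="language-csharp">var builder = WebApplication.CreateBuilder(args);

// From the ServiceDefaults project: telemetry, health checks, etc.
builder.AddServiceDefaults();

builder.Services.AddRazorComponents()
    .AddInteractiveServerComponents();

// HttpClient that connects to the backend API
builder.Services.AddScoped(sp => new HttpClient
{
    BaseAddress = new Uri(builder.Configuration.GetValue<string>("services:backend:0"))
});

var app = builder.Build();

// From the ServiceDefaults project: maps the health check endpoints
app.MapDefaultEndpoints();

app.MapRazorComponents<App>()
    .AddInteractiveServerRenderMode();

app.Run();
</code></pre>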
<p>I also added the Aspire orchestration support to the backend API and the service defaults are added to that project as well. In this project, I configured a distributed Redis cache in line 14:</p>
<p><img src="https://asp.net-hacker.rocks/img/net-aspire/image-20240222220527042.png" alt="image-20240222220527042" /></p>
<p>This application contains three components: a frontend, which is a Blazor Web App; a backend, which is a minimal API; and a Redis cache. These three components need to be orchestrated to run and debug the application locally. The problem is that I don't have a local instance of Redis yet.</p>
<p>This is where Aspire can help us. Let's have a look into the <code>Program.cs</code> of the AppHost Project:</p>
<pre><code class="language-csharp">var builder = DistributedApplication.CreateBuilder(args);
var cache = builder.AddRedis("cache");
var backend = builder.AddProject<Projects.WeatherApp_Backend>("backend")
.WithReference(cache)
.WithReplicas(2);
builder.AddProject<Projects.WeatherApp_Frontend>("frontend")
.WithReference(backend);
builder.Build().Run();
</code></pre>
<p>This looks pretty similar to a regular minimal API without any ASP.NET Core stuff. The first line defines a DistributedApplicationBuilder which is the orchestrator.</p>
<p>Line 3 adds Redis to the orchestration with the name "cache". Remember that we configured the distributed cache with the exact same name in the backend project.</p>
<p>Line 5 adds a project reference to the orchestration with the name "backend". It references the cache and starts two replicas of the backend.</p>
<p>Line 9 adds a project reference for the frontend, which references the backend.</p>
<p>How does the frontend know the backend address when the apps are running in the orchestration? I have the same problem when I use docker-compose to orchestrate apps. In this case, I just need to read the endpoint URL from the environment variables:</p>
<pre><code class="language-csharp">IIbuilder.Services.AddScoped(sp => new HttpClient
{
BaseAddress = new Uri(builder.Configuration.GetValue<string>("services:backend:0"))
});
</code></pre>
<p>You will see why this is working a little later.</p>
<p>Let's start the application, but ensure Docker Desktop is running first. Since it is all in preview at the moment, you may need to start the application twice. Once the app is started, you'll see the URL in the console window that pops up. In case no browser opens automatically, copy the URL and open it in a browser:</p>
<p><img src="https://asp.net-hacker.rocks/img/net-aspire/image-20240223094108726.png" alt="image-20240223094108726" /></p>
<p>You will see the really cool Aspire portal in the browser that shows you all the running apps:</p>
<p><img src="https://asp.net-hacker.rocks/img/net-aspire/image-20240223094441698.png" alt="image-20240223094441698" /></p>
<p>This portal is built with the new ASP.NET Core Blazor Web App.</p>
<p>On the start screen, you see all the running services: two instances of the backend app and one instance of the frontend app. You will also recognize the instance of the Redis cache. This comes from a Docker image that got pulled by Aspire and is now running as a container. You will also see that the backends have two endpoint URLs: one is the same for both instances, and the other is the individual URL of that specific container. The shared one is routed through a kind of proxy.</p>
<p>This portal doesn't only show you the running services. Because of the service defaults that got injected into the apps, it can read the health states, the logs, and the telemetry information of your apps. This will help you debug your locally running apps. Just click through the portal to see the logs, the traces, and the metrics.</p>
<p>When you click on the details link of a specific running service, you can also see the environment variables that got passed to the service. In the next screenshot, you can see that the URL of the backend app will be passed as an environment variable to the frontend. This is the environment variable we used in the frontend to connect to the backend:</p>
<p><img src="https://asp.net-hacker.rocks/img/net-aspire/image-20240223094932947.png" alt="image-20240223094932947" /></p>
<p>This is how the orchestration makes the services know each other. The backend gets the connection string to Redis via an environment variable. This is why the services can interact. So there is almost no magic here: just C# to orchestrate and environment variables to connect the services to each other.</p>
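<p>You can see this in the backend, too. The Redis connection string that the AppHost passes in surfaces as a regular connection string in the configuration, so you could even read it manually; the name <code>"cache"</code> matches my demo:</p>
<pre><code class="language-csharp">// ConnectionStrings__cache is set by the AppHost as an environment
// variable and shows up as a normal connection string here
var redisConnectionString = builder.Configuration.GetConnectionString("cache");
</code></pre>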
<h2>Deployment</h2>
<p>As mentioned, your cloud-native app is orchestrated to be cloud-ready, and you can easily deploy it to your Azure subscription. The tool that helps you with that is the Azure Developer CLI (<code>azd</code>). This CLI is super easy to use: it prepares your app for you and does the deployment. After installing <code>azd</code>, you can just use it.</p>
<p>With the console of your choice, cd to your solution folder and type <code>azd login</code>. This will open up a browser that you can use to log in with your Azure account.</p>
<p>The following command will prepare your application to be ready for deployment:</p>
<pre><code class="language-shell">azd init
</code></pre>
<p>It creates some configuration files and a <code>Bicep</code> script to set up your environment on Azure. Take a look at it to learn about <code>Bicep</code>.</p>
<p>The next command does the deployment:</p>
<pre><code class="language-shell">azd up
</code></pre>
<p>If you own more than one subscription, you will be asked which one to use. The CLI is super awesome: it is interactive and guides you through the entire deployment. Just follow the instructions.</p>
<p>Once the deployment is done, your app is up and running on Azure. It is really that easy.</p>
<p>It sets everything up on Azure: a Redis instance is up and running, your apps are running in Azure Container Apps, and if you had a SQL Server configured in .NET Aspire, it would also set up an Azure SQL database for you.</p>
<p>Just don't use preview versions of .NET. Those won't run on Azure, and it took me some time to figure out why my cloud-native app was not running there. The easiest way to avoid that issue is to create a <code>global.json</code> and pin your solution to a .NET SDK version that is supported on Azure.</p>
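<p>A minimal <code>global.json</code> for that could look like this; the version number is just an example, pick the latest stable SDK that Azure supports:</p>
<pre><code class="language-json">{
  "sdk": {
    "version": "8.0.101",
    "rollForward": "latestPatch"
  }
}
</code></pre>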
<h2>Conclusion</h2>
<p>This is just an introduction post about .NET Aspire. I hope it gives you a good overview of it.</p>
<p>I will definitely follow the releases of .NET Aspire and I'm really looking forward to using the final release for the development of real applications that will go into production.</p>
<p>I really like it and will - for sure - write more deep dive about it. I also did a <a href="https://www.meetup.com/basel-net-user-group/events/297394460/">talk at the .NET user group Basel</a> and would also do it at your user group, if you like. I'm also open to conference talks.</p>
<p>One thing I would really like to have is the Aspire portal deployed as well. I think it would be super helpful for monitoring applications in production. As far as I know, there are no plans yet to offer this portal as a tool for production. On the other hand, if you don't properly secure this portal, it could be a serious security risk, and all the information the portal provides is also available in the Azure portal. So there isn't a real need for it.</p>
<p>Do you want to learn more about .NET Aspire? Follow the docs, which are super complete and also contain super helpful tutorials about all the built-in components: https://learn.microsoft.com/de-de/dotnet/aspire/</p>
<h1>Application Security at YOO</h1>
<p><em>Jürgen Gutsch, 2023-04-03, <a href="https://asp.net-hacker.rocks/2023/04/03/appsec-role-at-yoo.html">ASP.NET Hacker</a></em></p>
<p>For about a year, I was working on a pretty exciting project: I defined and created a new role for our company that is responsible for application security. Actually, application security was never a missing aspect of our development process. All my colleagues are great developers and highly motivated to create secure applications.</p>
<p>The actual problem was a missing standard that sets up every project in the same secure way, helps the QA to also test security aspects, uses the same tools to improve software quality and security, and keeps awareness of security alive during the entire development process.</p>
<h2>Defining the new role</h2>
<p>Defining and creating this new role also means that I will take it over and be responsible for ensuring and maintaining the security standards throughout the entire company. I will also be responsible for keeping awareness of security throughout the company and for training the developers, the person responsible for DevOps, as well as the QA. I am not the only person responsible for application security in general. My job will be to make all colleagues feel responsible for creating secure applications: that they all have security in mind and that all features are analyzed from the security perspective as well.</p>
<p>And this is why the process of secure software development doesn't start with development or DevOps.</p>
<h2>Adding security to the company</h2>
<p>Actually, the secure software development process starts in the sales phase. The salesperson needs to know what type of customer he is selling our services to, what type of data the potential customer will be handling in the new project, and what the possible risks are. The salesperson needs to know what level of security he needs to sell. (Exactly: we need to sell security. See the next section for why.)</p>
<p>The process continues with requirements engineering; even the UX and UI specialists need to take care of security. DevOps follows by setting up the secure software development infrastructure and secure deployments per project. DevOps also supports the developers with the right tools that check for software quality and possible vulnerabilities and code flaws while building and delivering the software. In the development phase, the required security aspects need to be implemented, and QA needs to know how to test them. Maybe tooling will support the QA in running automated security tests.</p>
<h2>Selling security</h2>
<p>True, we need to sell levels of security, because ensuring application security takes effort: the more security, the more effort during development and afterward, while ensuring and testing the application security. A potential customer in a sensitive or risky environment should know that security is not free. A bank, a power plant, and other big and risky industries pay for security personnel that keeps unauthorized people out of their restricted areas. Such companies should be aware of, and willing to pay for, higher security mechanisms that keep unauthorized people out of their digitally restricted areas as well.</p>
<p>Actually, application security needs to be ensured in every project and the basic level of security won't be charged separately.</p>
<h2>What are the standards we use?</h2>
<p>It is the <a href="https://owasp.org">OWASP</a> foundation that helps me dive into new topics. We are going to implement the OWAS Application Security Verification Standard (<a href="https://owasp.org/www-project-application-security-verification-standard/">ASVS</a>) and the mobile version of it (MASVS). ASVS already is divided into three levels of security. Level 1 is the basic level of security that all projects need to implement. Actually, Level 1 is quite basic and despite just a few topics, this is all stuff we as developers almost already knew, used, and implemented in the past. If we have thought about it and if the project pressure wasn't that high. Levels 2 and 3 are adding more security mechanisms to the projects to handle sensitive and critical data and infrastructure.</p>
<p>This standard is helping us like a blueprint for all our projects to keep the levels of security and it helps our QA to know what to test from the security point of view.</p>
<p>Actually, since ASVS is adopting and covering many other standards as well, we will be safe with future security audits, no matter what standard will be used during a possible audit.</p>
<h2>Will YOO be a secure software company?</h2>
<p>The company already creates secure software. Since it was more or less a side aspect in the past, we'll now focus on security by following the standards and the process we have implemented.</p>
<p>So, yes, we can now call ourselves a secure software company. But we are not certified in any way. Neither OWASP nor ASVS provides badges or certificates that we could proudly display on our website. But we can proudly mention that we follow a standard that was created by well-known and independent security experts.</p>
<h2>My main role is still a software engineer :-)</h2>
<p>The application security role is only an additional role to my position as a software engineer. In a midsized company like YOO, ensuring secure software is not such a big effort that a dedicated position like an application security engineer is needed. Therefore, it is just a new additional role for me.</p>
<p>Despite that, my job title will change a little bit to <strong>Software &amp; AppSec Engineer</strong>.</p>
<h2>Learning about application security</h2>
<p>Actually, application security as a global topic was kind of new to me, and I never expected that the entire company would need to be involved. But that also was the fun part: talking to other disciplines and to people that are not really involved in my day-to-day business. Implementing application security in a software company is not a purely technical task.</p>
<p>As mentioned, the website of the <a href="https://owasp.org">OWASP</a> Foundation points me to various learning resources. OWASP is full of projects to learn about application security. You might know the OWASP Top 10 list of security risks. I already mentioned the ASVS. But there is a ton more.</p>
<p>Another great learning resource is the Twitter feed of <a href="https://twitter.com/shehackspurple">Tanya Janca</a>. Her talks about application security are amazing, and her book is a great read, even on vacation at the beaches of Greece while the kids are playing:</p>
<p><img src="https://asp.net-hacker.rocks/img/appsec/appsec.jpg" alt="Learning about AppSec at the beach of Kos. The Sea and playing kids in the background" /></p>
<p>If you want to learn about application security, <a href="https://twitter.com/shehackspurple">follow her on Twitter</a>, <a href="https://shehackspurple.ca">read her blog</a>, <a href="https://shehackspurple.ca/books/">read her book</a>, and <a href="https://shehackspurple.ca/talks/">watch her talks</a>. You also need to dive into the <a href="https://owasp.org">OWASP website</a> and the various projects of the foundation.</p>
<h2>Furthermore</h2>
<p>And maybe you, as a software-producing company, would like to adopt the same standards and processes. If you would like to know how we created and implemented the secure software process, feel free to ask. If you need help making your software development process more secure, we would be happy to help as well.</p>
<p>Learn more: <a href="https://www.yoo.digital/applicationsecurity">https://www.yoo.digital/applicationsecurity</a></p>
<h2>Conclusion</h2>
<p>I'm happy to officially start working in my new role this week, and I'm happy to bring YOO a step further in creating and delivering high-quality and secure software. I'm also pretty excited about how it will go and grow over time. The implementation of a secure software process will never be complete and needs to be adjusted whenever necessary. It is a living process that needs regular reviews and adjustments.</p>
<h1>Play with Playwright</h1>
<p><em>Jürgen Gutsch, 2023-03-08, <a href="https://asp.net-hacker.rocks/2023/03/08/play-with-playwright.html">ASP.NET Hacker</a></em></p>
<h2>What is Playwright?</h2>
<p><a href="https://playwright.dev/">Playwright</a> is a Web UI testing framework that supports different languages and is maintained by Microsoft. Playwright can be used with JavaScript/TypeScript, Python, Java and for sure C#. It comes with windowless browser support with various browsers. It has to be used with unit testing frameworks and because of this, you can just run it within your CI/CD pipeline. The syntax is pretty intuitive and I actually love it. Besides that the documentation is really good and helps a lot to easily start working with it.</p>
<p>In this blog post, I don't want to introduce Playwright; the website and the documentation are a much better resource for learning about it. Instead, I would like to play around with it and use it differently: instead of testing a pre-hosted web application, I'd like to test a web application that is self-hosted in the test project using the <code>WebApplicationFactory</code>. This way you have really isolated UI tests that don't rely on other infrastructure and won't fail because of network problems.</p>
<p>Does it work?</p>
<p>Let's try it:</p>
<h2>Setting up the solution</h2>
<p>The following lines create an ASP.NET Core MVC project and an NUnit test project. After that, a solution file will be created and the projects will be added to the solution. The last command adds the NUnit implementation of Playwright to the test project:</p>
<pre><code class="language-shell">dotnet new mvc -n PlayWithPlaywright
dotnet new nunit -n PlayWithPlaywright.Tests
dotnet new sln -n PlayWithPlaywright
dotnet sln add .\PlayWithPlaywright\
dotnet sln add .\PlayWithPlaywright.Tests\
dotnet add .\PlayWithPlaywright.Tests\ package Microsoft.Playwright.NUnit
</code></pre>
<p>Run those commands and build the solution:</p>
<pre><code class="language-shell">dotnet build
</code></pre>
<p>The build is needed to copy a PowerShell script to the output directory of the test project. This PowerShell script is the command line interface to control Playwright.</p>
<p>Next, we need to install the required browsers to execute the tests via that PowerShell script:</p>
<pre><code class="language-shell">.\PlayWithPlaywright.Tests\bin\Debug\net7.0\playwright.ps1 install
</code></pre>
<h2>Generating test code</h2>
<p>Using the <code>codegen</code> command helps you to autogenerate test code that can be copied to the test project:</p>
<pre><code class="language-shell">.\PlayWithPlaywright.Tests\bin\Debug\net7.0\playwright.ps1 codegen https://asp.net-hacker.rocks/
</code></pre>
<p>This command opens the Playwright Inspector where you can record your test case. While you click through your application, the test code is generated on the right-hand side:</p>
<p><img src="https://asp.net-hacker.rocks/img/playwright/playwright01.png" alt="plaiwright codegen" /></p>
<p>Instead of testing an external website like I did, you can also call <code>codegen</code> with a locally running application.</p>
<p>Just copy the generated code into the NUnit test project and fix the namespace and class name to match the namespace of your project.</p>
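<p>For reference, the generated code looks roughly like the following. This is a hypothetical example; the actual URLs, locators, and assertions depend on what you record:</p>
<pre><code class="language-csharp">[Parallelizable(ParallelScope.Self)]
[TestFixture]
public class RecordedTests : PageTest
{
    [Test]
    public async Task HomepageHasExpectedTitle()
    {
        await Page.GotoAsync("https://asp.net-hacker.rocks/");

        // Expect() comes from the PageTest base class of Microsoft.Playwright.NUnit
        await Expect(Page).ToHaveTitleAsync(new Regex("ASP.NET Hacker"));
    }
}
</code></pre>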
<p>Using the generated code as an example, you will be able to write more tests manually.</p>
<p>If this is done, just run <code>dotnet test</code> to execute the generated test and verify that Playwright is working.</p>
<h2>Start playing</h2>
<p>Usually, Playwright tests applications that are running somewhere on a server. This has one simple problem: if the test cannot connect to the running application because of network issues, the test will fail. A test should only have one single reason to fail: because the expected behavior didn't occur.</p>
<p>The solution would be to test a web application that is hosted on the same infrastructure and within the same process as the actual test.</p>
<p>Microsoft already provided the possibility to <a href="https://learn.microsoft.com/en-us/aspnet/core/test/integration-tests?view=aspnetcore-7.0">write integration tests</a> against a web application using the <a href="https://learn.microsoft.com/en-us/dotnet/api/microsoft.aspnetcore.mvc.testing.webapplicationfactory-1?view=aspnetcore-7.0">WebApplicationFactory</a>. My Idea was to use this <code>WebApplicationFactory</code> to host an application that can be tested with Playwright.</p>
<p>Since the WebApplicationFactory also provides an HttpClient, I would expect to have a URL to connect to. That HttpClient would have a BaseAddress that I can pass to Playwright.</p>
<p>Would this really work?</p>
<h2>WebApplicationFactory and Playwright</h2>
<p>Actually, we can't combine them by default because the <code>WebApplicationFactory</code> doesn't really host a web application over HTTP. That means it doesn't use Kestrel to expose an endpoint over HTTP. The <code>WebApplicationFactory</code> creates a test server that hosts the application in memory and just simulates an actual HTTP server.</p>
<p>We need to find a way to start an HTTP server, like Kestrel, to host the application. Actually, we could start a <code>WebApplicationBuilder</code>, but the idea was to reuse the configuration of the Program.cs of the application we want to test, like the <code>WebApplicationFactory</code> does.</p>
<p><a href="https://twitter.com/Donbavand/">Daniel Donbavand</a> actually found a solution <a href="https://danieldonbavand.com/2022/06/13/using-playwright-with-the-webapplicationfactory-to-test-a-blazor-application/">how to override the WebApplicationFactory to actually host the application</a> over HTTP and to get an endpoint that can be used with Playwright. I used Daniels solution but made it a little more Generic.</p>
<p>Let's see how this works together with Playwright.</p>
<p>First, add a project reference to the web project within the Playwright test project and add a package reference to <a href="https://nuget.org">Microsoft.AspNetCore.Mvc.Testing</a>.</p>
<pre><code class="language-shell">dotnet add .\PlayWithPlaywright.Tests\ reference .\PlayWithPlaywright\
dotnet add .\PlayWithPlaywright.Tests\ package Microsoft.AspNetCore.Mvc.Testing
</code></pre>
<p>The first one is needed to use the <code>Program.cs</code> with the <code>WebApplicationFactory</code>. The second one adds the <code>WebApplicationFactory</code> and the test server to the test project.</p>
<p>To use the <code>Program</code> class that is defined in a <code>Program.cs</code> that uses the minimal API you can simply add an empty partial Program class to the <code>Program.cs</code>.</p>
<p>I just put the following line at the end of the <code>Program.cs</code>:</p>
<pre><code class="language-csharp">public partial class Program { }
</code></pre>
<p>To make the Playwright tests as generic as possible I created an abstract <code>SelfHostedPageTest</code> class that inherits the <code>PageTest</code> class that comes with Playwright and use the <code>CustomWebApplicationFactory</code> there and just expose the server address to the test class that inherits the <code>SelfHostedPageTest</code>:</p>
<pre><code class="language-csharp">public abstract class SelfHostedPageTest<TEntryPoint> : PageTest where TEntryPoint : class
{
    private readonly CustomWebApplicationFactory<TEntryPoint> _webApplicationFactory;

    public SelfHostedPageTest(Action<IServiceCollection> configureServices)
    {
        _webApplicationFactory = new CustomWebApplicationFactory<TEntryPoint>(configureServices);
    }

    protected string GetServerAddress() => _webApplicationFactory.ServerAddress;
}
</code></pre>
<p>The actual Playwright test just inherits the <code>SelfHostedPageTest</code> as follows instead of the <code>PageTest</code>:</p>
<pre><code class="language-csharp">public class PlayWithPlaywrightHomeTests : SelfHostedPageTest<Program>
{
    public PlayWithPlaywrightHomeTests() :
        base(services =>
        {
            // configure needed services, like mocked db access, fake mail service, etc.
        }) { }

    // ...
}
</code></pre>
<p>As you can see, I pass in the <code>Program</code> type as a generic argument to the <code>SelfHostedPageTest</code>. The <code>CustomWebApplicationFactory</code> that is used inside is almost the same implementation as Daniel's. I just added the generic argument for the <code>Program</code> class and the possibility to pass the service configuration via the constructor:</p>
<pre><code class="language-csharp">internal class CustomWebApplicationFactory<TEntryPoint> :
    WebApplicationFactory<TEntryPoint> where TEntryPoint : class
{
    private readonly Action<IServiceCollection> _configureServices;
    private readonly string _environment;

    public CustomWebApplicationFactory(
        Action<IServiceCollection> configureServices,
        string environment = "Development")
    {
        _configureServices = configureServices;
        _environment = environment;
    }

    protected override void ConfigureWebHost(IWebHostBuilder builder)
    {
        builder.UseEnvironment(_environment);
        base.ConfigureWebHost(builder);

        // Add mock/test services to the builder here
        if (_configureServices is not null)
        {
            builder.ConfigureServices(_configureServices);
        }
    }

    // ...
}
</code></pre>
<p>Now we can use <code>GetServerAddress()</code> to get the server address and to pass it to the <code>Page.GotoAsync()</code> method:</p>
<pre><code class="language-csharp">[Test]
public async Task TestWithWebApplicationFactory()
{
    var serverAddress = GetServerAddress();

    await Page.GotoAsync(serverAddress);
    await Expect(Page).ToHaveTitleAsync(new Regex("Home Page - PlayWithPlaywright"));

    Assert.Pass();
}
</code></pre>
<p>That's it.</p>
<p>To try it out, just call <code>dotnet test</code> on the command line or in PowerShell, or run the relevant tests in a test explorer.</p>
<h2>Conclusion</h2>
<p>The result with my test project looks like the following while running all the tests when I was offline:</p>
<p><img src="https://asp.net-hacker.rocks/img/playwright/playwright02.png" alt="test result" /></p>
<p>One failing test is the recorded test session of my blog on <a href="https://asp.net-hacker.rocks/">https://asp.net-hacker.rocks/</a> and the other one is the demo test I found on <a href="https://playwright.dev">https://playwright.dev</a>. The passed test is the one that uses the <code>CustomWebApplicationFactory</code>.</p>
<p>This is exactly the result I expected.</p>
<p>You'll find the example on my GitHub repository.</p>https://asp.net-hacker.rocks/2023/02/14/circuit-breaker-healthcheck.htmlCreating a circuit breaker health check using Polly CircuitBreaker2023-02-14T00:00:00+00:002023-02-14T00:00:00+00:00Tue, 14 Feb 2023 00:00:00 +0000Jürgen Gutschhttps://asp.net-hacker.rocks/<p>Finally! After months of not writing a blog post, here it is:</p>
<p>A GitHub issue on the ASP.NET Core docs pointed me to <a href="https://github.com/App-vNext/Polly/wiki/Circuit-Breaker">Polly CircuitBreaker</a>, which is really great. Before that, I didn't even know that circuit breaker is a term in the software industry. Actually, I implemented mechanisms that work like that but never called them circuit breakers. Maybe that's the curse of never having visited a university :-D</p>
<p><a href="http://www.thepollyproject.org/">http://www.thepollyproject.org/</a></p>
<h2>What is a circuit breaker?</h2>
<p>Let's assume you have a connection to an external resource that breaks from time to time, which doesn't break your application but degrades its health. If you check that broken connection, your application will be in a degraded state from time to time. What if those connection issues increase? When does it become a broken state? One broken connection out of one thousand might be okay. One out of ten might look quite unhealthy, right? If so, it makes sense to count the number of issues within a period of time and throw an exception if the number of exceptions exceeds the allowed number. Exactly this is what a circuit breaker does.</p>
<p>Please excuse the amateurish explanation, I'm sure <a href="https://martinfowler.com/bliki/CircuitBreaker.html">Martin Fowler can do it much better</a>.</p>
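<p>Before looking at Polly, the counting idea can be sketched in a few lines of C#. This is a deliberately minimal, hypothetical implementation — no thread safety, consecutive-failure counting only — and not how Polly does it internally:</p>
<pre><code class="language-csharp">using System;

// Minimal circuit breaker sketch: count consecutive failures and
// fail fast for a break period once a threshold is reached.
public class SimpleCircuitBreaker
{
    private readonly int _maxFailures;
    private readonly TimeSpan _breakDuration;
    private int _failureCount;
    private DateTime _openUntil = DateTime.MinValue;

    public SimpleCircuitBreaker(int maxFailures, TimeSpan breakDuration)
    {
        _maxFailures = maxFailures;
        _breakDuration = breakDuration;
    }

    public T Execute<T>(Func<T> action)
    {
        // While the circuit is open, refuse calls without touching the resource
        if (DateTime.UtcNow < _openUntil)
        {
            throw new InvalidOperationException("Circuit is open.");
        }
        try
        {
            var result = action();
            _failureCount = 0; // a success resets the counter
            return result;
        }
        catch
        {
            // Count the failure and open the circuit once the threshold is reached
            if (++_failureCount >= _maxFailures)
            {
                _openUntil = DateTime.UtcNow.Add(_breakDuration);
                _failureCount = 0;
            }
            throw;
        }
    }
}
</code></pre>
<p>After two failing calls in a row, a third call throws immediately without even invoking the action — which is exactly the fail-fast behavior that protects a struggling resource.</p>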
<p>With Polly's circuit breaker, you can define how many specific exceptions are allowed to happen within a specific time period before throwing an exception.</p>
<p>The following snippet shows the usage of Polly's circuit breaker:</p>
<pre><code class="language-csharp">var policy = Policy.Handle<HttpRequestException>()
    .CircuitBreakerAsync(
        exceptionsAllowedBeforeBreaking: 2,
        durationOfBreak: TimeSpan.FromMinutes(1));

await policy.ExecuteAsync(async () =>
{
    var client = new HttpClient();
    var response = await client.GetAsync("http://localhost:5259/api/dummy");
    if (!response.IsSuccessStatusCode)
    {
        throw new HttpRequestException();
    }
});
</code></pre>
<p>This creates an <code>AsyncCircuitBreakerPolicy</code> that breaks after two consecutive exceptions and then keeps the circuit open for one minute.</p>
<p>Actually, I wanted to see what this would look like in an ASP.NET Core health check. The health check I'm going to show here isn't perfect but demonstrates the concept:</p>
<h2>Creating a circuit breaker health check</h2>
<p>Adding a circuit breaker to a health check or in general into a web application requires you to persist the state of that circuit breaker over multiple scopes or requests. This means we need to store instances of the <code>AsyncCircuitBreakerPolicy</code> as singletons in the service collection. <a href="https://github.com/App-vNext/Polly/wiki/Circuit-Breaker#scoping-circuitbreaker-instances">See here</a>.</p>
<h3>Preparing the test application</h3>
<p>To test the implementation, I created a minimal endpoint within a new web application that fails randomly:</p>
<pre><code class="language-csharp">app.MapGet("/api/dummy", () =>
{
    var rnd = Random.Shared.NextInt64(0, 1000);
    if ((rnd % 5) == 0)
    {
        throw new Exception("new exception");
    }
    return rnd;
});
</code></pre>
<p>This endpoint returns a random number and fails in case the random number is divisible by five. The exception is meaningless, but the endpoint is good for testing the health check we will implement.</p>
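<p>A quick sanity check on the failure rate: <code>Random.Shared.NextInt64(0, 1000)</code> returns values from 0 to 999, and 200 of those 1000 values are divisible by five, so the endpoint throws on roughly one in five calls:</p>
<pre><code class="language-csharp">using System;
using System.Linq;

// Count how many of the 1000 possible values (0..999) make the dummy endpoint throw
var failing = Enumerable.Range(0, 1000).Count(rnd => rnd % 5 == 0);

Console.WriteLine(failing);          // 200
Console.WriteLine(failing / 1000.0); // 0.2, i.e. roughly one failure per five calls
</code></pre>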
<p>We also need to create a health check endpoint that we will call to see the current health state. This endpoint executes the health check every time we call it. When called, the health check will call the dummy API and may get a randomly generated error.</p>
<pre><code class="language-csharp">app.UseHealthChecks("/health");
</code></pre>
<h3>Implementing the health check</h3>
<p>Next, we are going to write a health check that gets an <code>AsyncCircuitBreakerPolicy</code> via the service provider and executes a web request against the dummy breaking endpoint:</p>
<pre><code class="language-csharp">using Microsoft.Extensions.Diagnostics.HealthChecks;

namespace CircuitBreakerChecks;

public class ApiCircuitBreakerHealthCheck<TPolicy> : IHealthCheck where TPolicy : ApiCircuitBreakerContainer
{
    private readonly ApiCircuitBreakerHealthCheckConfig _config;
    private readonly IServiceProvider _services;

    public ApiCircuitBreakerHealthCheck(
        ApiCircuitBreakerHealthCheckConfig config,
        IServiceProvider services)
    {
        _config = config;
        _services = services;
    }

    public async Task<HealthCheckResult> CheckHealthAsync(
        HealthCheckContext context,
        CancellationToken cancellationToken = default)
    {
        var policy = _services.GetService<TPolicy>()?.Policy;
        try
        {
            if (policy is not null)
            {
                await policy.ExecuteAsync(async () =>
                {
                    // Consider reusing a shared HttpClient or an IHttpClientFactory
                    // instead of creating a new client per check
                    var client = new HttpClient();
                    var response = await client.GetAsync(_config.Url);
                    if (!response.IsSuccessStatusCode)
                    {
                        throw new HttpRequestException();
                    }
                });
            }
        }
        catch (Exception)
        {
            return HealthCheckResult.Unhealthy("Unhealthy");
        }

        return HealthCheckResult.Healthy("Healthy");
    }
}
</code></pre>
<p>This health check is generic and receives a container for the <code>AsyncCircuitBreakerPolicy</code> as a generic type argument. We'll see later why.</p>
<p>In the <code>CheckHealthAsync</code> method, we take the specific container from the service provider to get the actual Polly <code>AsyncCircuitBreakerPolicy</code>, and we use it as shown in the first snippet. That part is quite common.</p>
<p>The container is a really simple object that just stores the Policy:</p>
<pre><code class="language-csharp">using Polly.CircuitBreaker;

namespace CircuitBreakerChecks;

public class ApiCircuitBreakerContainer
{
    private readonly AsyncCircuitBreakerPolicy _policy;

    public ApiCircuitBreakerContainer(AsyncCircuitBreakerPolicy policy)
    {
        _policy = policy;
    }

    public AsyncCircuitBreakerPolicy Policy => _policy;
}
</code></pre>
<p>This container gets registered as a singleton to persist the policy for a longer period of time.</p>
<p>The health check also uses a configuration class that passes the configuration arguments to the health check. Currently, it just contains the URL of the API to test; the name of the health check registration is passed separately:</p>
<pre><code class="language-csharp">namespace CircuitBreakerChecks;

public class ApiCircuitBreakerHealthCheckConfig
{
    public string Url { get; set; }
}
</code></pre>
<p>This configuration class gets registered as transient.</p>
<p>Now let's puzzle all that together to get it running. We could do all of that in the <code>Program.cs</code>, but that would mess up the file.</p>
<p>Instead of messing up the <code>Program.cs</code>, I would like to have a configuration like this:</p>
<pre><code class="language-csharp">builder.Services.AddApiCircuitBreakerHealthCheck(
    "http://localhost:5259/api/dummy",     // URL to check
    "AddApiCircuitBreakerHealthCheck",     // Name of the health check registration
    Policy.Handle<HttpRequestException>()  // Polly CircuitBreaker async policy
        .CircuitBreakerAsync(
            exceptionsAllowedBeforeBreaking: 2,
            durationOfBreak: TimeSpan.FromMinutes(1)
        ));
</code></pre>
<p>In your project, you might need to change the URL to match your local port.</p>
<p>The call in this snippet will register and configure the <code>ApiCircuitBreakerHealthCheck</code>. This means we will create an extension method on the <code>IServiceCollection</code> to stick that all together:</p>
<pre><code class="language-csharp">using Polly.CircuitBreaker;

namespace CircuitBreakerChecks;

public static class IServiceCollectionExtensions
{
    public static IServiceCollection AddApiCircuitBreakerHealthCheck(
        this IServiceCollection services,
        string url,
        string name,
        AsyncCircuitBreakerPolicy policy)
    {
        services.AddTransient(_ => new ApiCircuitBreakerHealthCheckConfig { Url = url });
        services.AddSingleton(new ApiCircuitBreakerContainer(policy));
        services.AddHealthChecks()
            .AddCheck<ApiCircuitBreakerHealthCheck<ApiCircuitBreakerContainer>>(name);

        return services;
    }
}
</code></pre>
<p>That's it.</p>
<h2>Trying it out</h2>
<p>To try it out, run the application and call the health check endpoint in the browser.</p>
<p>The first two calls should display a healthy state for sure. Then it stays healthy until it gets at least two errors within a period of one minute. After that, it stays unhealthy until it gets less than two errors within that period.</p>
<p>Play around with it. Debug the minimal endpoint or debug the health check. It is kind of fun.</p>
<p>I published the demo code to GitHub: <a href="https://github.com/JuergenGutsch/aspnetcore-circuitbreaker-healthcheck">https://github.com/JuergenGutsch/aspnetcore-circuitbreaker-healthcheck</a></p>
<h2>One issue left</h2>
<p>With this implementation, we can just use one registration of this health check. Creating a generic health check that way didn't really make sense. The reason is the singleton instance of the Polly circuit breaker policy. This instance would be shared over multiple health check registrations. To solve this, we need to find a way to register a unique singleton of a policy per health check registration. But this is a different story.</p>https://asp.net-hacker.rocks/2022/11/15/terminal-winget-posh.htmlWindows Terminal, PowerShell, oh-my-posh, and Winget 2022-11-15T00:00:00+00:002022-11-15T00:00:00+00:00Tue, 15 Nov 2022 00:00:00 +0000Jürgen Gutschhttps://asp.net-hacker.rocks/<p>I'm thinking about changing the console setup I use for some development tasks on Windows. The readers of this blog already know that I'm a console guy. I'm using git and docker in the console only. I'm navigating my folders using the console. I even used the console to install, update, or uninstall tools using Chocolatey (https://chocolatey.org/).</p>
<p>This post is not a tutorial on how to install and use the tools I'm going to mention here. It is just a small portrait of what I'm going to use. Follow the links to learn more about the tools.</p>
<h2>PowerShell and oh-my-posh</h2>
<p>Actually, working in the console doesn't work for me with the regular <code>cmd.exe</code>, and I completely understand why developers on Windows still prefer using Windows-based tools for git, docker, and so on. Because of that, I was using <code>cmder</code> (https://cmder.app/), a great terminal with useful Linux commands and great support for git. The git support not only integrates the git CLI, it also shows the current branch in the prompt:</p>
<p><img src="https://asp.net-hacker.rocks/img/terminal/cmder.png" alt="Cmder in action" /></p>
<p>The latter is a great help when working with git; I missed that in the other terminals. Cmder also supports adding different shells like git bash, WSL, or PowerShell, but I used the cmd shell, which has been enriched with a lot more useful commands. This worked great for me.</p>
<p>For a couple of weeks, I've been playing around with the Windows Terminal a little more. The reason why I looked into the Windows Terminal is that I like its more lightweight settings.</p>
<p>The Windows Terminal (<a href="https://apps.microsoft.com/store/detail/windows-terminal/9N0DX20HK701">download it from the Windows Store</a>) and <code>oh-my-posh</code> (https://ohmyposh.dev/) have been out for a while, and I followed <a href="https://www.hanselman.com/blog/a-nightscout-segment-for-ohmyposh-shows-my-realtime-blood-sugar-readings-in-my-git-prompt">Scott Hanselman's blog posts about them</a> for a long time but wasn't able to get it running on my machine. Two weeks ago I <a href="https://github.com/JanDeDobbeleer/oh-my-posh/discussions/2994">got some help</a> from <a href="https://twitter.com/jandedobbeleer">Jan De Dobbeleer</a> to get it running. It just turned out that I had too many posh versions installed on my machine, and the path environment variable was messed up. After cleaning my system and reinstalling oh-my-posh by <a href="https://ohmyposh.dev/docs/installation/windows">following the installation guide</a>, it is working quite well:</p>
<p><img src="https://asp.net-hacker.rocks/img/terminal/terminal.png" alt="Terminal and posh in action" /></p>
<p>I still need to configure the prompt a little bit to match my needs 100%, but the current theme is great for now and does more than <code>cmder</code> did. I'd like to display the latest tag of the current git repository and the currently used dotnet SDK version, but this will be another story.</p>
<h2>Windows Terminal</h2>
<p>In the Windows Terminal, I configured oh-my-posh for both Windows PowerShell 5 and the new PowerShell 7 and set PowerShell 7 as my default console. I also added configurations to use PowerShell 5, WSL (both Ubuntu 18 and Ubuntu 20), git bash, and the Azure Cloud Shell. I did almost the same with <code>cmder</code>, but I like the way it gets configured in Windows Terminal better.</p>
<h2>Winget</h2>
<p>Winget is basically an apt-get for Windows, and I like it.</p>
<p>As mentioned, Chocolatey is the tool I used to install the tools I need, like git, cmder, etc. I tried winget for a while after it was mentioned on Twitter (unfortunately, I forgot the link). Actually, it is much better than Chocolatey because it uses the application registry used by Windows, which means it can update and uninstall programs that have been installed without using winget.</p>
<p>Winget is the console version of installing and managing installed programs on Windows and it is natively installed on Windows 10 and Windows 11.</p>
<h2>Conclusion</h2>
<p>So I'm going to change my setup from this ...</p>
<ul>
<li>cmder
<ul>
<li>cmd</li>
<li>chocolatey</li>
</ul>
</li>
</ul>
<p>... to that ...</p>
<ul>
<li>Windows Terminal
<ul>
<li>Powershell7</li>
<li>oh-my-posh</li>
<li>Winget</li>
</ul>
</li>
</ul>
<p>... and it seems to work great for me.</p>
<p>Any other tools that I should have a look at? Just drop me a comment :-)</p>https://asp.net-hacker.rocks/2022/10/25/aspnetcore-globalization.htmlASP.NET Core Globalization and a custom RequestCultureProvider [Updated]2022-10-25T00:00:00+00:002022-10-25T00:00:00+00:00Tue, 25 Oct 2022 00:00:00 +0000Jürgen Gutschhttps://asp.net-hacker.rocks/<p>In this post, I'm going to write about how to enable and use Globalization in ASP.NET Core. Since you can't change the culture depending on route values by default, I'll show you how to create and register a custom <code>RequestCultureProvider</code> that does this job.</p>
<blockquote>
<p><strong>UPDATE:</strong></p>
<p><a href="https://twitter.com/hishambinateya/">Hisham Bin Ateya</a> pointed me via Twitter to the fact that there already is a <code>RequestCultureProvider</code> that can change the culture depending on route values in ASP.NET Core. Because of that, please see the last section in this blog post just as an example of how to create a custom <code>RequestCultureProvider</code>.</p>
<p>I also restructured the post a little bit to separate general information about Globalization from the <code>RequestCultureProvider</code>. If you are familiar with Globalization, just skip the first sections and jump to the second-to-last section.</p>
</blockquote>
<h2>About Globalization</h2>
<h3>Resources Files</h3>
<p>Like in the old days of the .NET Framework, the resources (strings, images, icons, etc.) for different languages are stored in so-called resource files that end with <code>.resx</code>, stored in a folder called <code>Resources</code> by default.</p>
<p>Unlike in the good old days of the .NET Framework, the right resource files are fetched automatically by the implementation of the specific <code>Localizer</code> if you follow some naming conventions.</p>
<ul>
<li>If you inject the Localizer into a controller, the resource file should be named like <code>Controllers.ControllerClassName.[Culture].resx</code> or put into a subfolder called <code>Controllers</code> and named like <code>ControllerClassName.[Culture].resx</code>.</li>
<li>If you are injecting the Localizer into a view, it is almost the same as for the controllers. The difference is just to have a view name in the resource path instead of a controller name: <code>Views.ControllerName.ViewName.[culture].resx</code> or <code>Views/ControllerName/ViewName.[culture].resx</code>.</li>
</ul>
<p>It is up to you to decide how you like to structure your resource files. Personally, I prefer the folder option. Also, an autogenerated code file as you might know from the past is no longer needed since you need to use a localizer to access the resources.</p>
<p>Unfortunately, there is no way yet to add a resource file via the .NET CLI. Maybe there will be a template in the future. I created the resource file with Visual Studio 2022 and copied it to create the other files needed.</p>
<h3>Localizers</h3>
<p>You no longer need to use the resource manager to read the actual localized strings from the resource files. You can now use an <code>IStringLocalizer</code> or an <code>IHtmlLocalizer</code>. The latter doesn't HTML-encode the strings that are stored in the resource files and can be used to localize strings that contain HTML code if needed:</p>
<pre><code class="language-csharp">using Microsoft.AspNetCore.Mvc;
using Microsoft.Extensions.Localization;

namespace Internationalization.Controllers;

public class HomeController : Controller
{
    private readonly IStringLocalizer<HomeController> _localizer;

    public HomeController(IStringLocalizer<HomeController> localizer)
    {
        _localizer = localizer;
    }

    public IActionResult Index()
    {
        return View(new { Title = _localizer["About Title"] });
    }
}
</code></pre>
<p>The resource key named "About Title" doesn't need to exist, and even the resource file itself doesn't need to exist. If the localizer doesn't find the key, the key itself is returned as a string. You can use any kind of string as a key. This can help you develop the application without having the resource files in place.</p>
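<p>This fallback behavior can be sketched in a few lines. The following class is a simplified, hypothetical stand-in for <code>IStringLocalizer</code>, not the framework implementation: a lookup returns the translated value if the key exists, and the key itself otherwise.</p>
<pre><code class="language-csharp">using System.Collections.Generic;

// Simplified stand-in for IStringLocalizer's key fallback behavior
public class FallbackLocalizer
{
    private readonly Dictionary<string, string> _resources;

    public FallbackLocalizer(Dictionary<string, string> resources)
        => _resources = resources;

    // Returns the localized value if the key exists, otherwise the key itself
    public string this[string key]
        => _resources.TryGetValue(key, out var value) ? value : key;
}
</code></pre>
<p>With only <code>"Title"</code> translated, <code>localizer["Title"]</code> returns the translation while <code>localizer["About Title"]</code> falls back to the key — so a missing resource never breaks the page, it just shows the untranslated key.</p>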
<p>You can even inject a localizer in the Razor View like this:</p>
<pre><code class="language-html">@using Microsoft.AspNetCore.Mvc.Localization
@inject IViewLocalizer Localizer
@model HomeIndexViewModel
@{
    ViewData["Title"] = Localizer["Title"];
}
<h1>@ViewData["Title"]</h1>
</code></pre>
<p>In this case, it is an <code>HtmlLocalizer</code>, so the key can also contain HTML that doesn't get encoded when it is written out to the view. Even if it's not recommended to store HTML in resource files, it might be needed in some cases. Generally, you shouldn't do that because HTML should be part of the frontend templates like Razor, Blazor, etc.</p>
<p>Instead of using the <code>ViewLocalizer</code> in the Razor templates, you can also localize the entire view. Therefore, you need to suffix the view name with the needed culture or put the view in a subfolder named after the culture. How localized views are handled needs to be configured when enabling Localization and Globalization.</p>
<h3>Enabling Globalization in ASP.NET Core</h3>
<p>As usual, you need to add the required services to enable localization:</p>
<pre><code class="language-csharp">builder.Services.AddLocalization(options =>
{
    options.ResourcesPath = "Resources";
});

builder.Services.AddControllersWithViews()
    .AddViewLocalization(LanguageViewLocationExpanderFormat.Suffix)
    .AddDataAnnotationsLocalization();
</code></pre>
<p>The first line adds general localization to be used in C# code, like controllers, etc. Setting the <code>ResourcesPath</code> in the options is optional and just added to the snippet to show that you can change the path where the resources are stored.</p>
<p>After that, the <code>ViewLocalization</code> as well as the <code>DataAnnotationsLocalization</code> are added to the service collection. The <code>LanguageViewLocationExpanderFormat</code> tells the view localizer that, in the case of localized views, the culture is added as a suffix to the filename instead of being part of the folder structure.</p>
<p>After adding the needed services to the service collection the required middleware needs to be added as well:</p>
<pre><code class="language-csharp">app.UseRequestLocalization(options =>
{
    var culture = new List<CultureInfo> {
        new CultureInfo("en-en"),
        new CultureInfo("fr-fr"),
        new CultureInfo("de-de")
    };
    options.DefaultRequestCulture = new RequestCulture("en-en");
    options.SupportedCultures = culture;
    options.SupportedUICultures = culture;
});
</code></pre>
<p>This middleware uses pre-configured <code>RequestCultureProviders</code> to set the <code>Culture</code> and the <code>UICulture</code> for the current process. With this culture set, the localizers can select the right resource files or the right localized views.</p>
<p>That's it for enabling Localization and Globalization. With this information, you should already be able to create multilanguage applications.</p>
<h3>Culture vs. UI Culture</h3>
<p>Setting the culture sets the application to a specific language and, optionally, a region. If you also set the <code>UI Culture</code>, you make a distinction between translating texts and how numbers, dates, and currencies are displayed. The <code>UI culture</code> is used to load resources from the corresponding resource file, and the <code>Culture</code> is used to change the way numbers, dates, and currencies are formatted and displayed.</p>
<p>In some cases, it makes sense to handle that separately. If you only want to translate your page without changing number and date formats, you should only change the <code>UI Culture</code>.</p>
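<p>The formatting side of this distinction is easy to see with plain <code>CultureInfo</code> (this snippet only shows formatting; loading translated resources would additionally need resx files):</p>
<pre><code class="language-csharp">using System;
using System.Globalization;

var value = 1234.56;
var date = new DateTime(2022, 10, 25);

// The (non-UI) culture controls how numbers and dates are formatted
var en = CultureInfo.GetCultureInfo("en-US");
var de = CultureInfo.GetCultureInfo("de-DE");

Console.WriteLine(value.ToString("N2", en)); // 1,234.56
Console.WriteLine(value.ToString("N2", de)); // 1.234,56
Console.WriteLine(date.ToString("d", en));   // 10/25/2022
Console.WriteLine(date.ToString("d", de));   // 25.10.2022
</code></pre>
<p>Translating texts only, while keeping one of these fixed formats for numbers and dates, is exactly the "change only the UI culture" scenario.</p>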
<h3>Localize ViewModels</h3>
<p>While enabling view localization, we also enabled <code>DataAnnotationsLocalization</code>. This helps you translate labels for form fields in case you use the <code>@Html.LabelFor()</code> method. You no longer need to specify the <code>ResourceType</code>; since there is no longer an autogenerated C# file, there is also no <code>ResourceType</code> to specify. Inside the <code>ViewModel</code>, you just need to specify the <code>DisplayAttribute</code>:</p>
<pre><code class="language-csharp">public class EmployeeViewModel
{
    [Display(Name = "Number")]
    public int Number { get; set; }

    [Display(Name = "Firstname")]
    public string? Firstname { get; set; }

    [Display(Name = "Lastname")]
    public string? Lastname { get; set; }

    [Display(Name = "Department")]
    public string? Department { get; set; }

    [Display(Name = "Phone")]
    public string? Phone { get; set; }

    [Display(Name = "Email")]
    public string? Email { get; set; }

    [Display(Name = "Date of birth")]
    public DateTime DateOfBirth { get; set; }

    [Display(Name = "Size")]
    public decimal Size { get; set; }

    [Display(Name = "Salary")]
    public decimal Salary { get; set; }
}
</code></pre>
<p>The <code>DataAnnotationsLocalizer</code> will automatically use the string that is set in the <code>Name</code> property as a key to search for the relevant resource. This also works for the <code>Description</code> and the <code>ShortName</code> properties.</p>
<p>The resource file that is used to translate the display names has to be placed inside subfolders called <code>ViewModels/ControllerName</code>. Example: <code>/Resources/ViewModels/Home/EmployeeModel.de-DE.resx</code></p>
<h2>Creating a custom RequestCultureProvider</h2>
<h3>RequestCultureProviders</h3>
<p>As mentioned, <code>RequestCultureProviders</code> retrieve the culture from somewhere and prepare it for the process to work with. A <code>RequestCultureProvider</code> returns a <code>ProviderCultureResult</code> that has the properties <code>Culture</code> and <code>UICulture</code> set. Both cultures can differ if needed; in most cases, they will be the same.</p>
<p>There are three preconfigured <code>RequestCultureProviders</code>:</p>
<ul>
<li>
<p><code>QueryStringRequestCultureProvider</code>
This provider extracts the <code>Culture</code> and <code>UICulture</code> from query string values if there are any. This means you can switch the language by just setting the query strings: <code>?culture=de-DE&ui-culture=de-DE</code></p>
</li>
<li>
<p><code>CookieRequestCultureProvider</code>
This provider extracts the culture information from a specific cookie. The cookie-name is <code>.AspNetCore.Culture</code> and the value of the cookie might look like this: <code>c=es-MX|uic=es-MX</code> (<code>c</code> is the culture and <code>uic</code> is the ui-culture)</p>
</li>
<li>
<p><code>AcceptLanguageHeaderRequestCultureProvider</code>
That provider extracts the language information from the <code>Accept-Language</code> header that browsers send. Every browser has preferred languages configured and sends those languages to the server. With this information, you can localize your application specifically to the user's language.</p>
</li>
</ul>
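<p>The cookie value format used by the <code>CookieRequestCultureProvider</code> can be illustrated with a small parser sketch. This is a hypothetical helper for illustration, not the framework's code (ASP.NET Core itself provides <code>CookieRequestCultureProvider</code> to produce and consume this format):</p>
<pre><code class="language-csharp">using System;

// Parses a culture cookie value like "c=es-MX|uic=es-MX" into (culture, uiCulture).
// Illustrative helper only, not the framework implementation.
public static class CultureCookieValue
{
    public static (string? Culture, string? UICulture) Parse(string value)
    {
        string? culture = null;
        string? uiCulture = null;

        foreach (var part in value.Split('|'))
        {
            if (part.StartsWith("c=", StringComparison.Ordinal))
                culture = part.Substring(2);
            else if (part.StartsWith("uic=", StringComparison.Ordinal))
                uiCulture = part.Substring(4);
        }

        return (culture, uiCulture);
    }
}
</code></pre>
<p>Parsing <code>"c=es-MX|uic=es-MX"</code> yields <code>es-MX</code> for both the culture and the UI culture.</p>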
<p>As you have seen in the previous section, not every language sent by the accept-language header, cookie, or query string gets accepted by your application. You need to define a list of supported cultures and a default request culture that is used if the language sent by the client isn't supported by your application.</p>
<h3>The custom RequestCultureProvider</h3>
<blockquote>
<p><strong>UPDATE:</strong></p>
<p>Actually, there is an existing <code>RequestCultureProvider</code> in ASP.NET Core (the <code>RouteDataRequestCultureProvider</code>) that can change the culture depending on route values. Since it isn't in the default list of registered <code>RequestCultureProviders</code>, I expected that there is none. That was wrong.</p>
<p>Since there is one already, just see the following section as an example about how to create a custom RequestCultureProvider.</p>
</blockquote>
<p>What I was missing in the list of <code>RequestCultureProviders</code> is a <code>RouteValueRequestCultureProvider</code>: a provider that gets the culture information from a route value in case it is part of the route, like this: <code>/en-US/Home/Index/</code></p>
<p>Let's assume we have a route configured like this:</p>
<pre><code class="language-csharp">app.MapControllerRoute(
    name: "default",
    pattern: "{culture=en-us}/{controller=Home}/{action=Index}/{id?}");
</code></pre>
<p>This adds the culture as part of the route.</p>
<p>Actually, I built a <code>RouteValueRequestCultureProvider</code> that handles the route values:</p>
<pre><code class="language-csharp">using Microsoft.AspNetCore.Localization;

namespace Internationalization.Providers;

/// <summary>
/// Determines the culture information for a request via values in the route values.
/// </summary>
public class RouteValueRequestCultureProvider : RequestCultureProvider
{
    /// <summary>
    /// The key that contains the culture name.
    /// Defaults to "culture".
    /// </summary>
    public string RouteValueKey { get; set; } = "culture";

    /// <summary>
    /// The key that contains the UI culture name. If not specified or no value is found,
    /// <see cref="RouteValueKey"/> will be used.
    /// Defaults to "ui-culture".
    /// </summary>
    public string UIRouteValueKey { get; set; } = "ui-culture";

    public override Task<ProviderCultureResult?> DetermineProviderCultureResult(HttpContext httpContext)
    {
        if (httpContext == null)
        {
            throw new ArgumentNullException(nameof(httpContext));
        }

        var request = httpContext.Request;
        if (!request.RouteValues.Any())
        {
            return NullProviderCultureResult;
        }

        string? queryCulture = null;
        string? queryUICulture = null;

        if (!string.IsNullOrWhiteSpace(RouteValueKey))
        {
            queryCulture = request.RouteValues[RouteValueKey]?.ToString();
        }

        if (!string.IsNullOrWhiteSpace(UIRouteValueKey))
        {
            queryUICulture = request.RouteValues[UIRouteValueKey]?.ToString();
        }

        if (queryCulture == null && queryUICulture == null)
        {
            // No values specified
            return NullProviderCultureResult;
        }

        if (queryCulture != null && queryUICulture == null)
        {
            // Value for culture but not for UI culture, so default to the culture value for both
            queryUICulture = queryCulture;
        }
        else if (queryCulture == null && queryUICulture != null)
        {
            // Value for UI culture but not for culture, so default to the UI culture value for both
            queryCulture = queryUICulture;
        }

        var providerResultCulture = new ProviderCultureResult(queryCulture, queryUICulture);

        return Task.FromResult<ProviderCultureResult?>(providerResultCulture);
    }
}
</code></pre>
<p>This <code>RouteValueRequestCultureProvider</code> reads the culture and the ui-culture out of the route values and returns a <code>ProviderCultureResult</code> that will be used by the Localizers.</p>
<p>The route engine handles the generation of the route URLs for us if we use the MVC mechanisms to create links and tags. We'll now have the selected language and region everywhere in the routes.</p>
<p>To create a language changer, we just need to change the culture in the route values like this:</p>
<pre><code class="language-html"><ul class="navbar-nav flex-grow-1 justify-content-end">
    <li class="nav-item">
        <a class="nav-link text-dark" asp-area=""
           asp-controller="@Context.GetRouteValue("Controller")"
           asp-action="@Context.GetRouteValue("Action")"
           asp-route-culture="en-US">EN</a>
    </li>
    <li class="nav-item">
        <a class="nav-link text-dark" asp-area=""
           asp-controller="@Context.GetRouteValue("Controller")"
           asp-action="@Context.GetRouteValue("Action")"
           asp-route-culture="de-DE">DE</a>
    </li>
    <li class="nav-item">
        <a class="nav-link text-dark" asp-area=""
           asp-controller="@Context.GetRouteValue("Controller")"
           asp-action="@Context.GetRouteValue("Action")"
           asp-route-culture="fr-FR">FR</a>
    </li>
</ul>
</code></pre>
<p>Changing the culture and the UI culture also changes the way dates, numbers, and currencies are displayed. This means the language changer also changes the region and will, for example, display the currency in Euro if you switch to a region that uses the Euro as its local currency. Keep this in mind when working with financial data, because just changing the currency symbol doesn't make sense if you don't convert the actual amounts to the local currency as well. If you don't want to change the currency, you should hard-code the currency formatting. On the other hand, pinning the culture to a fixed region while only changing the UI culture would also pin the number and date formats, which is not what we want either.</p>
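<p>The effect is easy to see in a small console sketch. The two helpers below are made up for illustration and are not part of the sample project: the first formats a price fully in the given culture, the second clones the culture and pins the currency symbol to the Euro, so only the number formatting follows the user's culture:</p>
<pre><code class="language-csharp">using System;
using System.Globalization;

// Format an amount entirely in the given culture:
// symbol, separators, and decimal digits all follow the culture.
string FormatPrice(decimal amount, string cultureName)
{
    return amount.ToString("C", CultureInfo.GetCultureInfo(cultureName));
}

// Keep the culture's number format but hard-code the Euro symbol.
string FormatPriceInEuro(decimal amount, string cultureName)
{
    var culture = (CultureInfo)CultureInfo.GetCultureInfo(cultureName).Clone();
    culture.NumberFormat.CurrencySymbol = "€";
    return amount.ToString("C", culture);
}

Console.WriteLine(FormatPrice(1234.56m, "en-US"));       // $1,234.56
Console.WriteLine(FormatPriceInEuro(1234.56m, "en-US")); // €1,234.56
</code></pre>
<p>Note that pinning the symbol only changes the display; it does not convert the amount between currencies.</p>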
<p>This is the start page of the sample project in French.</p>
<p><img src="https://asp.net-hacker.rocks/img/globalization/francais.png" alt="French localized UI" /></p>
<p>(I apologize for any wrong translations. Unfortunately, it has been more than 25 years since I learned French in school.)</p>
<h2>Sample application and Conclusion</h2>
<p>This actually works, and I created a small application to demonstrate it. The sample includes all the topics of this post. You will find the <a href="https://github.com/JuergenGutsch/aspnetcore-globalization">sample project on GitHub</a>.</p>
<p>Microsoft reduced the complexity a lot. On the other hand, if you were used to working with more complex resource handling in the past, you will stumble upon small things you won't expect, like I did. However, adding globalization and localization in .NET 7 is easy, and I like the way it works.</p>https://asp.net-hacker.rocks/2022/10/03/aspnetcore7-updates.htmlASP.NET Core 7 updates2022-10-03T00:00:00+00:002022-10-03T00:00:00+00:00Mon, 03 Oct 2022 00:00:00 +0000Jürgen Gutschhttps://asp.net-hacker.rocks/<p>Release candidate 1 of ASP.NET Core 7 has been out for around two weeks and the release date isn't far off. The beginning of November is usually when Microsoft releases the new version of .NET. Please find the announcement post here: <a href="https://devblogs.microsoft.com/dotnet/asp-net-core-updates-in-dotnet-7-rc-1/">ASP.NET Core updates in .NET 7 Release Candidate 1</a>. I will not repeat that post but pick some personal highlights to write about.</p>
<h3>ASP.NET Core Roadmap for .NET 7</h3>
<p>First of all, a <a href="https://github.com/dotnet/aspnetcore/issues/39504">look at the ASP.NET Core roadmap for .NET 7</a> shows us that there are only a few issues still open and planned for the upcoming release. That means the release is almost complete and mostly bugfixes will be pushed into it. Many other open issues are already struck through and probably assigned to a later release. I guess we'll have a published roadmap for ASP.NET Core on .NET 8 soon, at the latest at the beginning of November.</p>
<p>What are the updates of this RC 1?</p>
<h3>A lot of Blazor</h3>
<p>This release, too, is full of Blazor improvements. Those who work a lot with Blazor will be happy about the improved JavaScript interop, the debugging improvements, the handling of location-changing events, and the dynamic authentication requests coming with this release.</p>
<p>However, there are some quite interesting improvements within this release that might be great for almost every ASP.NET Core developer:</p>
<h3>Faster HTTP/2 uploads and HTTP/3 performance improvements</h3>
<p>The team increased the default upload connection window size of HTTP/2, resulting in much faster upload times. Stream handling is always tricky and needs a lot of fine-tuning to find the right balance; improving the upload speed by more than five times is awesome and really helpful when uploading bigger files. HTTP/3 performance was also improved by reducing allocations. In addition, HTTP/3 moves toward feature parity with HTTP/1 and HTTP/2, including support for Server Name Indication (SNI) when configuring connection certificates.</p>
<h3>Rate limiting middleware improvements</h3>
<p>The rate-limiting middleware got some small improvements that make it easier and more flexible to configure. You can now add attributes to controller actions to enable or disable rate limiting on specific endpoints. To do the same on Minimal API endpoints and endpoint groups, you can use methods. This way you can enable rate limiting for an endpoint group but disable it for a specific endpoint inside that group.</p>
<p>You can specify the rate-limiting policy via the attributes as well as via the endpoint and endpoint-group methods. Unlike the attributes, which support only named policies, the Minimal API methods can also take an instance of a policy.</p>
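<p>As a sketch of how this fits together (the policy name "api" and the endpoints are made up for illustration; the API names are the ones that shipped with .NET 7, so details may differ in the RC bits):</p>
<pre><code class="language-csharp">using System.Threading.RateLimiting;
using Microsoft.AspNetCore.RateLimiting;

var builder = WebApplication.CreateBuilder(args);

// "api" is a made-up policy name: allow 5 requests per 10-second window.
builder.Services.AddRateLimiter(limiter =>
    limiter.AddFixedWindowLimiter("api", options =>
    {
        options.Window = TimeSpan.FromSeconds(10);
        options.PermitLimit = 5;
    }));

var app = builder.Build();
app.UseRateLimiter();

// Enable the named policy for a whole endpoint group ...
var api = app.MapGroup("/api").RequireRateLimiting("api");
api.MapGet("/data", () => "rate limited");

// ... but disable rate limiting again for one endpoint inside the group.
api.MapGet("/health", () => "OK").DisableRateLimiting();

app.Run();
</code></pre>
<p>On controllers, the equivalent would be the <code>[EnableRateLimiting("api")]</code> and <code>[DisableRateLimiting]</code> attributes.</p>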
<h3>Experimental stuff added to this release</h3>
<p>WebTransport is a new draft specification for HTTP/3 that works similarly to WebSockets but supports multiple streams per connection. Support for WebTransport has now been added to RC1 as an experimental feature.</p>
<p>One of the new features in .NET 7 is gRPC JSON transcoding, which turns gRPC APIs into RESTful APIs. Any RESTful API should have OpenAPI documentation, and so should gRPC JSON transcoding. This release contains experimental support for adding Swashbuckle Swagger to gRPC to render OpenAPI documentation.</p>
<h3>Conclusion</h3>
<p>ASP.NET Core on .NET 7 seems to be complete now and I'm really looking forward to the .NET Conf 2022 at the beginning of November, which will be the launch event for .NET 7.</p>
<p>And exactly this reminds me to start thinking about the next edition of my book "Customizing ASP.NET Core" which needs to be updated to .NET 8 and enhanced by probably three more chapters next year.</p>https://asp.net-hacker.rocks/2022/07/26/aspnetcore7-outputcaching.htmlASP.NET Core on .NET 7.0 - Output caching2022-07-26T00:00:00+00:002022-07-26T00:00:00+00:00Tue, 26 Jul 2022 00:00:00 +0000Jürgen Gutschhttps://asp.net-hacker.rocks/<p>Finally, Microsoft added output caching to the ASP.NET Core 7.0 preview 6.</p>
<p>Output caching is a middleware that caches the entire output of an endpoint instead of executing the endpoint every time it gets requested. This will make your endpoints a lot faster.</p>
<p>This kind of caching is useful for APIs that provide data that doesn't change a lot or that gets accessed pretty frequently. It is also useful for more or less static pages, e.g. CMS output. Different caching options help you fine-tune your output cache or vary the cache based on headers or query parameters.</p>
<p>For more dynamic pages or APIs that serve data that change a lot, it would make sense to cache more specifically on the data level instead of the entire output.</p>
<h2>Trying output caching</h2>
<p>To try output caching I created a new empty web app using the .NET CLI:</p>
<pre><code class="language-shell">dotnet new web -n OutputCaching -o OutputCaching
cd OutputCaching
code .
</code></pre>
<p>This creates the new project and opens it in VSCode.</p>
<p>In the <code>Program.cs</code> you now need to add output caching to the <code>ServiceCollection</code> as well as using the middleware on the <code>app</code>:</p>
<pre><code class="language-csharp">var builder = WebApplication.CreateBuilder(args);
builder.Services.AddOutputCache();
var app = builder.Build();
app.UseOutputCache();
app.MapGet("/", () => "Hello World!");
app.Run();
</code></pre>
<p>This enables output caching in your application.</p>
<p>Let's use output caching with the classic example that displays the current date and time.</p>
<pre><code class="language-csharp">app.MapGet("/time", () => DateTime.Now.ToString());
</code></pre>
<p>This creates a new endpoint that displays the current date and time. Every time you refresh it in the browser, you get a new time displayed. No magic here. Now we are going to add some caching magic to another endpoint:</p>
<pre><code class="language-csharp">app.MapGet("/time_cached", () => DateTime.Now.ToString())
.CacheOutput();
</code></pre>
<p>If you access this endpoint and refresh it in the browser, the time will not change. The initial output got cached and you'll receive the cached output every time you refresh the browser.</p>
<p>This is good for more or less static output that doesn't change a lot. But what if you have a frequently used API that just needs a short cache to reduce the calculation effort or the number of database accesses? You can reduce the caching time to, let's say, 10 seconds:</p>
<pre><code class="language-csharp"> builder.Services.AddOutputCache(options =>
{
options.DefaultExpirationTimeSpan = TimeSpan.FromSeconds(10);
});
</code></pre>
<p>This reduces the default cache expiration timespan to 10 seconds.</p>
<p>If you now keep refreshing the endpoint we created previously, you'll get a new time every 10 seconds. This means the cache gets released every 10 seconds. Using the options you can also define the size of the cached body or the overall cache size.</p>
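<p>A sketch of those options (property names as in the released .NET 7 bits; the sizes are arbitrary examples, not recommendations):</p>
<pre><code class="language-csharp">builder.Services.AddOutputCache(options =>
{
    options.DefaultExpirationTimeSpan = TimeSpan.FromSeconds(10);
    options.MaximumBodySize = 32 * 1024 * 1024; // don't cache single responses bigger than 32 MB
    options.SizeLimit = 100 * 1024 * 1024;      // overall size limit of the output cache
});
</code></pre>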
<p>If you provide a more dynamic API that receives parameters via the query string, you can vary the cache by the query string:</p>
<pre><code class="language-csharp">app.MapGet("/time_refreshable", () => DateTime.Now.ToString())
.CacheOutput(p => p.VaryByQuery("time"));
</code></pre>
<p>This adds another endpoint that varies the cache by the query string argument called "time". This means the query string <code>?time=now</code> caches a different result than the query string <code>?time=later</code> or <code>?time=before</code>.</p>
<p>The <code>VaryByQuery</code> function allows you to add more than one query string:</p>
<pre><code class="language-csharp">app.MapGet("/time_refreshable", () => DateTime.Now.ToString())
.CacheOutput(p => p.VaryByQuery("time", "culture", "format"));
</code></pre>
<p>In case you like to vary the cache by HTTP headers you can do this the same way using the <code>VaryByHeader</code> function:</p>
<pre><code class="language-csharp">app.MapGet("/time_cached", () => DateTime.Now.ToString())
.CacheOutput(p => p.VaryByHeader("content-type"));
</code></pre>
<h2>Further reading</h2>
<p>If you like to explore more complex examples of output caching, it would make sense to have a look into the samples project:</p>
<p><a href="https://github.com/dotnet/aspnetcore/blob/main/src/Middleware/OutputCaching/samples/OutputCachingSample/Startup.cs">https://github.com/dotnet/aspnetcore/blob/main/src/Middleware/OutputCaching/samples/OutputCachingSample/Startup.cs</a></p>https://asp.net-hacker.rocks/2022/04/01/aspnetcore7-files-and-streams-in-minimalapi.htmlASP.NET Core on .NET 7.0 - File upload and streams using Minimal API2022-04-01T00:00:00+00:002022-04-01T00:00:00+00:00Fri, 01 Apr 2022 00:00:00 +0000Jürgen Gutschhttps://asp.net-hacker.rocks/<p>It seems the Minimal API that got introduced in ASP.NET Core 6.0 will now be finished in 7.0. One feature that was heavily missed in 6.0 was file upload, as well as the possibility to read the request body as a stream. Let's have a look at how this works.</p>
<h2>The Minimal API</h2>
<p>Creating endpoints using the Minimal API is great for beginners, for small endpoints in microservice applications, or for endpoints that need to be super fast, without the overhead of binding routes to controllers and actions. Either way, endpoints created with the Minimal API are quite useful.</p>
<p>By adding the mentioned features they become even more useful, and many more Minimal API improvements will come in ASP.NET Core 7.0.</p>
<p>To try this, I created a new empty web app using the .NET CLI:</p>
<pre><code class="language-shell">dotnet new web -n MinimalApi -o MinimalApi
cd MinimalApi
code .
</code></pre>
<p>This creates the new project and opens it in VSCode.</p>
<p>Inside VSCode, open the <code>Program.cs</code>, which should look like this:</p>
<pre><code class="language-Csharp">var builder = WebApplication.CreateBuilder(args);
var app = builder.Build();
app.MapGet("/", () => "Hello World!");
app.Run();
</code></pre>
<p>Here we see a simple endpoint that sends a "Hello World!" on a GET request.</p>
<h2>Uploading files using <code>IFormFile</code> and <code>IFormFileCollection</code></h2>
<p>To upload files, we should map an endpoint that listens to POST.</p>
<p>Inside the <code>Program.cs</code>, let's create two endpoints: one that receives an <code>IFormFile</code> and another one that receives an <code>IFormFileCollection</code>:</p>
<pre><code class="language-csharp">app.MapPost("/upload", async(IFormFile file) =>
{
string tempfile = CreateTempfilePath();
using var stream = File.OpenWrite(tempfile);
await file.CopyToAsync(stream);
// dom more fancy stuff with the IFormFile
});
app.MapPost("/uploadmany", async (IFormFileCollection myFiles) =>
{
foreach (var file in files)
{
string tempfile = CreateTempfilePath();
using var stream = File.OpenWrite(tempfile);
await file.CopyToAsync(stream);
// dom more fancy stuff with the IFormFile
}
});
</code></pre>
<p>The <code>IFormFile</code> is the regular interface <code>Microsoft.AspNetCore.Http.IFormFile</code> that contains all the useful information about the uploaded file, like <code>FileName</code>, <code>ContentType</code>, <code>Length</code>, etc.</p>
<p>The <code>CreateTempfilePath</code> method used here is a small helper I wrote to generate a temp file name and a path to it. It also creates the folder in case it doesn't exist:</p>
<pre><code class="language-csharp">static string CreateTempfilePath()
{
var filename = $"{Guid.NewGuid()}.tmp";
var directoryPath = Path.Combine("temp", "uploads");
if (!Directory.Exists(directoryPath)) Directory.CreateDirectory(directoryPath);
return Path.Combine(directoryPath, filename);
}
</code></pre>
<p>Creating a temporary file name like this is needed because, for security reasons, the actual file name and extension should not be used on the filesystem.</p>
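<p>If you do need to keep something derived from the client-supplied name, never trust it as-is. A minimal sketch (<code>SafeFileName</code> is an illustrative helper, not part of the sample project):</p>
<pre><code class="language-csharp">using System;
using System.IO;

// Strip any directory part from a client-supplied file name to
// defeat path traversal attempts like "../../evil.txt".
string SafeFileName(string untrustedName) => Path.GetFileName(untrustedName);

Console.WriteLine(SafeFileName("../../evil.txt")); // evil.txt
Console.WriteLine(SafeFileName("report.pdf"));     // report.pdf
</code></pre>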
<p>Once the file is saved, you can do whatever you need to do with it.</p>
<blockquote>
<p><strong>Important note:</strong>
Currently, the file upload doesn't work if there is a cookie header in the POST request or if authentication is enabled. This will be fixed in one of the next preview versions. For now, you should delete the cookies before sending the request.</p>
</blockquote>
<p><img src="https://asp.net-hacker.rocks/img/aspnetcore7/iformfile.png" alt="iformfile" /></p>
<h2>Read the request body as a stream</h2>
<p>This is cool: you can now read the body of a request as a stream and do whatever you like with it. To try it out, I created another endpoint in the <code>Program.cs</code>:</p>
<pre><code class="language-csharp">app.MapPost("v2/stream", async (Stream body) =>
{
string tempfile = CreateTempfilePath();
using var stream = File.OpenWrite(tempfile);
await body.CopyToAsync(stream);
});
</code></pre>
<p>I'm going to use this endpoint to store a binary in the file system. BTW: this stream is read-only and not buffered, which means it can only be read once:</p>
<p><img src="https://asp.net-hacker.rocks/img/aspnetcore7/stream.png" alt="request body as stream" /></p>
<p>It works the same way by using a <code>PipeReader</code> instead of a <code>Stream</code>:</p>
<pre><code class="language-csharp">app.MapPost("v3/stream", async (PipeReader body) =>
{
string tempfile = CreateTempfilePath();
using var stream = File.OpenWrite(tempfile);
await body.CopyToAsync(stream);
});
</code></pre>
<h2>Conclusion</h2>
<p>This feature makes the Minimal API much more useful. What do you think? Please drop a comment with your opinion.</p>
<p>These aren't the only new features coming in ASP.NET Core 7.0; many more will follow. I'm really looking forward to the route grouping that is announced in the roadmap.</p>https://asp.net-hacker.rocks/2022/02/21/aspnetcore7.htmlASP.NET Core on .NET 7.0 - Roadmap, preview 1 and file upload in minimal APIs2022-02-21T00:00:00+00:002022-02-21T00:00:00+00:00Mon, 21 Feb 2022 00:00:00 +0000Jürgen Gutschhttps://asp.net-hacker.rocks/<p>I really like the transparent development of .NET and ASP.NET Core. It is all openly discussed and publicly announced on GitHub and <a href="https://devblogs.microsoft.com/dotnet/">developer blogs</a>.</p>
<p>Same with the first preview version of .NET 7.0, which was released just a couple of days ago, three months after .NET 6.0 came out. This is the chance to have a first glimpse at ASP.NET Core 7.0, which will be released at the beginning of November this year.</p>
<h2>Roadmap for ASP.NET Core 7.0</h2>
<p>Did you know that there is already a roadmap for ASP.NET Core 7.0? There actually is, and it is full of improvements:</p>
<p><a href="https://github.com/dotnet/aspnetcore/issues/39504">ASP.NET Core Roadmap for .NET 7</a></p>
<p>Even in version 7.0, Microsoft is planning to improve the runtime performance, and the ASP.NET Core web frameworks will be improved as well. Minimal API, SignalR, and Orleans are the main topics here, but rate limiting is a topic, too. There are also a lot of issues about the web UI technologies: MAUI, Blazor, MVC, and the Razor compiler are the main topics there.</p>
<p>The roadmap refers to the specific GitHub issues that contain a lot of exciting discussions. I would propose having a detailed look at some of them.</p>
<h2>ASP.NET Core 7.0 Preview 1</h2>
<p>Just a couple of days ago Microsoft released the first preview version of .NET 7.0 and Daniel Roth published a detailed explanation about what was done in ASP.NET Core with this release.</p>
<p><a href="https://devblogs.microsoft.com/dotnet/asp-net-core-updates-in-net-7-preview-1/">ASP.NET Core updates in .NET 7 Preview 1</a></p>
<p>Again this year, I will go through the previews and write about interesting upcoming features that will be in the final release, like this one:</p>
<h2>IFormFile and IFormFileCollection support in minimal APIs</h2>
<p>This is an improvement that has been requested since the Minimal API was first announced. You can now handle uploaded files in minimal APIs using <code>IFormFile</code> and <code>IFormFileCollection</code>:</p>
<pre><code class="language-csharp">app.MapPost("/upload", async(IFormFile file) =>
{
using var stream = System.IO.File.OpenWrite("upload.txt");
await file.CopyToAsync(stream);
});
app.MapPost("/upload-many", async (IFormFileCollection myFiles) => { ... });
</code></pre>
<p>(This snippet was copied from the blog post mentioned above.)</p>
<p>I'm sure this makes the minimal APIs more useful than before, even if there are some limitations that will be addressed in later preview releases of .NET 7.0.</p>
<h2>What's next?</h2>
<p>As mentioned, I'll pick interesting features from the roadmap and the announcement posts, have a deeper look at them, and write about them.</p>https://asp.net-hacker.rocks/2022/02/14/20-years-of-dotnet.html20 years of .NET 2022-02-14T00:00:00+00:002022-02-14T00:00:00+00:00Mon, 14 Feb 2022 00:00:00 +0000Jürgen Gutschhttps://asp.net-hacker.rocks/<p>.NET turns 20 years old today and it is just kind of ... wow!</p>
<p><img src="https://asp.net-hacker.rocks/img/dotnet20.png" alt=".NET 20 Years" /></p>
<p>Yes, the 20-year celebration has been announced for a couple of weeks now, but I didn't really care until I started to think about it.</p>
<blockquote>
<p>I didn't really plan to write about it but the more I think about the last 20 years... you know... And you are completely free to read it. :-D</p>
</blockquote>
<p>Because 20 years of .NET also means to me writing software and getting paid for it for more than 20 years, it also means to me spending almost half of that time as a Software Engineer at the <a href="https://yoo.digital">YOO</a>. This is amazing, surprising, and a little bit scary as well. I already spent almost half of my life turning coffee into code.</p>
<p>I started coding using ASP written in VBScript, which also used server-side ActiveX libraries written in VB6. At that time I approached the Microsoft developer community for the first time, asking questions about how to solve my coding problems.</p>
<p>In the second half of 2001, there was already a lot of news about ASP+, .NET Fx, and other weird stuff. I started to play around and created the first ASP-like applications that actually compiled and executed awesomely fast. I was impressed by .NET 1.0 and wrote my first ASP.NET code using VB.NET.</p>
<blockquote>
<p>I took some time to convince my boss to write the very first project using ASP.NET 1.0 but finally I got the opportunity to rewrite a news module for an existing CMS using the new technology.</p>
</blockquote>
<p>At that time, a pretty cool blog was the primary source to learn about new things in .NET and ASP.NET: <a href="https://twitter.com/scottgu">Scott Guthrie's</a> blog. The cool thing: that blog <a href="https://weblogs.asp.net/scottgu">is still online</a>. You'll find posts from 2003. Awesome! Scott was the person who invented ASP.NET. Just yesterday, he <a href="https://twitter.com/scottgu/status/1493275595035136004">posted a tweet</a> that shows the notebook containing the first specs for ASP.NET.</p>
<blockquote>
<p>Now, I'm wondering what this book would look like if it would have been written using <a href="https://asp.net-hacker.rocks/2020/07/22/dotnet-notebooks.html">interactive notebooks</a> :-D</p>
</blockquote>
<p>The release of .NET 1.1 wasn't that good: project types changed, a lot more breaking changes happened, and we had doubts about using .NET at all. Luckily, Microsoft released a patch that fixed the issues we and a lot of other developers had at that time.</p>
<p>My personal 20 years of .NET were a journey with some lows but a lot more heights during the whole time. Not all the stuff I worked with was good. I remember some ugly technology that was put on top of ASP.NET Webforms. Do you remember the ASP.NET Web Parts? I guess this was a bad try to get SharePoint-like behavior in every web application. This feature got even worse together with ASP.NET AJAX. Moving HTML and ViewStates around using JavaScript remote scripting (Ajax) isn't really what Ajax is meant for.</p>
<p>Some years before that, I already worked on a project for Siemens that loads XML-based data via client scripting from the server and displays the data on the client. Also, user input got sent to the server that way. The term Ajax wasn't introduced at that time. We created a single-page application years before the SPAs were a thing at all. Maybe this is the reason why ASP.NET AJAX felt completely wrong to us.</p>
<p>It was .NET 3.5 that changed ASP.NET and .NET a lot. ASP.NET MVC was awesome. It felt a lot more like a real web framework and didn't use a ViewState to hack around the stateless nature of the web. Using ASP.NET MVC felt more natural, as we knew it from the days when we worked with classic ASP. I started to like .NET again. LINQ also got introduced and made the handling of data a lot more productive.</p>
<p>Also around that time, I started blogging. I wrote my first blog post in 2007. I was already kind of involved in the German-speaking .NET developer community, contributing what I learned to people who started working with the technology. The feedback was amazing and pushed me to continue with the community work. Because of my blog, I got asked to write for technical magazines, and because of this, I got asked to talk at conferences, and so on...</p>
<p>I am mainly a more pragmatic developer. I'm using the tools that are best for me and the current project, and .NET isn't always the best tool. I also worked with NodeJS and was impressed by how fast it is to get started and to get your job done. I also worked with Python for around a year and had the same experience. Since I started my career scripting solutions using classic ASP, it wasn't a big deal to use JavaScript or Python. Actually, I miss the scripting experience in .NET as well as the flexibility of the language. Maybe I should have a more detailed look into F#. On the other hand, being a pragmatic software developer also means getting your work done, making the customer happy, and getting paid for your work. In some cases, Python helped me to get things done as well as NodeJS did, but in most cases it was .NET that helped me to make the customers happy, as it has since 1.0.</p>
<p>Getting things done using a well-known framework means you can start hacking around issues, <a href="https://asp.net-hacker.rocks/2022/01/04/my-book-2.html">customizing</a> stuff to remove blocking things. Actually, I once worked on a pretty fast ASP.NET Webforms application that didn't use any Webforms UI technology at all: no Webforms components, no ViewState. It generated XML that got transformed on the server using XSL templates, and wrote the result directly to the output stream.</p>
<p>My personal 20-year journey is like a marriage with highs and lows. And some unexpected happenings can change a marriage a lot. .NET Core happened and I fell in love with .NET again. It was completely rewritten, lightweight, and <a href="https://asp.net-hacker.rocks/2022/01/04/my-book-2.html">customizable</a>. You all know that already. The best thing from my perspective is the lightweight, console-first approach. It almost feels like NodeJS and works similarly to all the other web tools and frameworks that already existed at that time. .NET wasn't only a framework to build stuff and earn money with anymore. With the new .NET, it started to be fun again.</p>
<p>I am a web developer, even if I did some Windows Forms, WPF, Windows Mobile, and Windows Phone development. I always feel more comfortable working with applications that have an HTML user interface and make use of CSS and JavaScript. So the new .NET and ASP.NET Core feel even more natural than ASP.NET MVC ever did. This is absolutely great.</p>
<p>What's coming next? We'll see.</p>
<blockquote>
<p>Writing a book was eating my time to write blog posts. I'm now starting to have a look at the next version of .NET and ASP.NET Core and I will write about that.</p>
<p>There will be no .NET 7.0 update for my book since I decided to write a new edition for every LTS version only. The next LTS will be .NET 8.0 which should be released around November 2023. So I'll have enough time to write blog posts.</p>
</blockquote>
<p>🎉 Happy 20th Anniversary .NET! 🎉</p>https://asp.net-hacker.rocks/2022/01/04/my-book-2.htmlCustomizing ASP.NET Core 6.0 - The second edition2022-01-04T00:00:00+00:002022-01-04T00:00:00+00:00Tue, 04 Jan 2022 00:00:00 +0000Jürgen Gutschhttps://asp.net-hacker.rocks/<p>Just a couple of days ago, the second edition of my book <a href="https://packt.link/EBATu">Customizing ASP.NET Core</a> got released by <a href="https://packt.link/EBATu">Packt</a></p>
<p><img src="https://asp.net-hacker.rocks/img/book/book2e.png" alt="image-20220103222013400" /></p>
<p>The second edition is updated to .NET 6 and includes three new chapters. I also put the chapters into a more logical order :-D</p>
<p><strong>This is the new table of contents:</strong></p>
<ol>
<li>Customizing Logging</li>
<li>Customizing App Configuration</li>
<li>Customizing Dependency Injection</li>
<li>Configuring and Customizing HTTPS</li>
<li>Configuring WebHostBuilder</li>
<li>Using different Hosting models</li>
<li>Using IHostedService and BackgroundService</li>
<li>Writing Custom Middleware</li>
<li>Working with Endpoint Routing</li>
<li>Customizing ASP.NET Core Identity <strong>[NEW]</strong></li>
<li>Configuring Identity Management <strong>[NEW]</strong></li>
<li>Content Negotiation Using a Custom OutputFormatter</li>
<li>Managing Inputs with Custom ModelBinders</li>
<li>Creating custom ActionFilter</li>
<li>Working with Caches <strong>[NEW]</strong></li>
<li>Creating custom TagHelpers</li>
</ol>
<h2>Working with Packt</h2>
<p>I'd like to thank the Packt team for its motivation and accuracy in creating the best possible result. Actually, writing isn't that hard, but getting it completely right, nice, and readable afterward is the hard part. Packt did a great job and I really like the result.</p>
<p>Maybe the next project is in the making. ;-)</p>https://asp.net-hacker.rocks/2022/01/01/new-blog-sponsor.htmlNew blog sponsor - YOO inc.2022-01-01T00:00:00+00:002022-01-01T00:00:00+00:00Sat, 01 Jan 2022 00:00:00 +0000Jürgen Gutschhttps://asp.net-hacker.rocks/<p>I warmly welcome the <a href="https://yoo.digital">YOO Inc.</a> as a new sponsor of my blog:</p>
<p><img src="https://asp.net-hacker.rocks/img/yoo-hire-juergen.gif" alt="" /></p>
<p>YOO Inc. is located in Basel, Switzerland, serves national as well as international clients, and specializes in creating custom digital solutions for distinct business needs.</p>
<p>Actually, YOO Inc. is the first official sponsor of my blog and has been sponsoring it for a few months already, but I got that cool banner just a few weeks ago.</p>
<p>Anyway, Thank YOO!</p>
<p>I got rid of Google AdSense, which didn't make a lot of sense on my blog anyway. The fewer dependencies the better.</p>https://asp.net-hacker.rocks/2021/09/22/github-advisory-database.htmlDo you know the GitHub Advisory Database?2021-09-22T00:00:00+00:002021-09-22T00:00:00+00:00Wed, 22 Sep 2021 00:00:00 +0000Jürgen Gutschhttps://asp.net-hacker.rocks/<p>For a while now, I have been trying to get into the topic of application security. Application security is a really huge area and covers a lot of topics. It contains user authentication as well as CORS, various kinds of injections, and many other mistakes you can make during development. One of the huge topics is possible vulnerabilities in dependencies.</p>
<p>Almost every developer is using third-party libraries and components directly, via NuGet, NPM, pip, or whatever package manager. I did and still do as well.</p>
<p>While adding a dependency to your application, can you ensure that this dependency doesn't contain any vulnerabilities, or even ensure that the risk of adding vulnerabilities to your application can be reduced? Every code can contain an error that results in a critical issue, so third-party dependencies can do as well. Adding a dependency to your application can also mean adding a security problem to your application.</p>
<p>NPM already has a dependency audit tool integrated: <code>npm audit</code> checks the package-lock.json against a vulnerability database and tells you about flaws and which version should be safe. In Python, you can install a checker package globally to scan the requirements.txt against an open-source vulnerability database.</p>
<p>In the .NET CLI you can use <code>dotnet list package --vulnerable</code> to check package dependencies of your project. The .NET CLI <a href="https://devblogs.microsoft.com/nuget/how-to-scan-nuget-packages-for-security-vulnerabilities/">is using the GitHub Advisory Database</a> to check for vulnerabilities: <a href="https://github.com/advisories">https://github.com/advisories</a></p>
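<p>For example, from the project directory (the <code>--include-transitive</code> flag also scans packages that are pulled in indirectly):</p>
<pre><code class="language-shell"># restore first so the resolved package graph is known
dotnet restore
# list vulnerable direct and transitive packages
dotnet list package --vulnerable --include-transitive
</code></pre>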
<h2>GitHub Advisory Database</h2>
<p>The GitHub Advisory Database contains "the latest security vulnerabilities from the world of open-source software" as GitHub writes here <a href="https://github.com/advisories">https://github.com/advisories</a>.</p>
<p>More about the GitHub Advisory Database:
<a href="https://docs.github.com/en/code-security/security-advisories/about-github-security-advisories">https://docs.github.com/en/code-security/security-advisories/about-github-security-advisories</a></p>
<p>Actually, I see a problem with the GitHub Advisory Database in the .NET universe. There are more than 270 thousand unique packages registered on NuGet.org and more than 3.5 million package versions, but only 140 reviewed advisories in the Advisory Database.</p>
<p>It is great to have the possibility to check packages for vulnerabilities, but it doesn't make sense to check against a database that contains only 140 entries for that number of packages.</p>
<p>There might be some reasons for that:</p>
<h3>1st) .NET package authors don't know about the Advisory Database</h3>
<p>I'm sure many NuGet package authors don't know about the Advisory Database. Like me: I learned about the Advisory Database just a couple of weeks ago.</p>
<h3>2nd) It is not common in the .NET universe to report vulnerabilities</h3>
<p>Compared to other stacks, the .NET universe is pretty new in the open-source world. Sure, there are some pretty cool projects that are almost 20 years old, but those projects are the exceptions.</p>
<h3>3rd) There are fewer vulnerabilities in .NET packages</h3>
<p>Since .NET packages are based on a good and almost complete framework, there might be fewer vulnerabilities than on other stacks. From my perspective, this might be possible, but it is pretty much dependent on the kind of package. The more a package is related to a frontend, the more vulnerabilities can occur in such a library.</p>
<h2>Why should I use such an advisory database?</h2>
<p>There are important reasons why you should use the advisory database or even report to an advisory database:</p>
<h3>Don't completely trust your own code</h3>
<p>You as a package author are not really safe with your own code. Vulnerabilities can occur in any code. Many good developers focus on business logic and don't really think about side aspects like application security. It can always happen that you create a critical bug that results in a more or less critical vulnerability.</p>
<p>Every time you fix such a case in your code, you should report it to the GitHub Advisory Database to tell your users about possible security issues in older versions of your package. This way you protect your package users and show them that you feel responsible for them. It doesn't tell your users that you are a bad developer. The opposite is the case.</p>
<p>It is not your fault if you accidentally produce a package with a vulnerability, but it would be your fault if you didn't do anything about it.</p>
<h3>Don't completely trust the NuGet packages you use</h3>
<p>Execute a vulnerability check on the packages you use whenever possible, and update the packages you use to the latest version whenever possible.</p>
<p>Even Microsoft packages contain vulnerabilities as you can see here:
<a href="https://github.com/advisories/GHSA-q7cg-43mg-qp69">https://github.com/advisories/GHSA-q7cg-43mg-qp69</a></p>
<p>GitHub runs such checks on the code level for you using <a href="https://github.com/dependabot">Dependabot</a>. In case you don't host your code on GitHub, you should use different tools like the already mentioned CLI tools, or commercial tools like <a href="https://www.sonarsource.com/">SonarSource</a>, Snyk, or similar.</p>
<p>You can execute such checks on the build server immediately before you actually build your code. You could continue building in case the vulnerabilities are of a low or moderate level, and stop the build in case there are high or critical vulnerabilities.</p>
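<p>As a sketch, such a build-server gate could simply filter the CLI report for severities. The function below is a hypothetical example; only the <code>dotnet</code> commands in the comment are real CLI calls:</p>
<pre><code class="language-shell"># Hypothetical build gate: stop the build only on High/Critical findings.
fail_on_severity() {
    # Exit code 1 means the build should stop.
    if grep -Eq 'High|Critical'; then
        return 1
    fi
    return 0
}

# On the build server (assuming the .NET SDK is installed) you would
# feed the real report into the function:
#   dotnet restore
#   dotnet list package --vulnerable --include-transitive > vulns.txt
#   fail_on_severity < vulns.txt || exit 1
</code></pre>
<p>A real pipeline would probably parse the report more carefully, but the idea is the same: let low and moderate findings pass and fail the build on anything worse.</p>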
<h3>Keep your users safe</h3>
<ul>
<li>Check your code</li>
<li>Check your dependencies</li>
<li>Accept reported vulnerabilities of your package and fix them</li>
<li>Report vulnerabilities found in your package after you have fixed them.</li>
<li>Don't report vulnerabilities for your packages that actually occur in used packages.
<ul>
<li>This should be done by the other package authors</li>
</ul>
</li>
<li>Report vulnerabilities of used packages in their repositories or to their maintainers
<ul>
<li>To give them a chance to fix it</li>
</ul>
</li>
</ul>
<h2>How to report a vulnerability</h2>
<p>If you own a repository on GitHub, you can easily draft and propose a new security advisory to the GitHub database. In your repository on GitHub, there is a "Security" tab. If you click on that tab, you'll find the "Security advisories" page in the left-hand menu. Here you see your already drafted advisories as well as a button to create a new one:</p>
<p><img src="https://asp.net-hacker.rocks/img/gsa/gsa01.png" alt="image-20210913215021624" /></p>
<p>If you don't own that repository, you will see the same page but without the button to draft an advisory.</p>
<p>If you click that button, you'll see a nice form to draft the advisory.</p>
<p><img src="https://asp.net-hacker.rocks/img/gsa/gsa02.png" alt="image-20210913220736613" /></p>
<p><img src="https://asp.net-hacker.rocks/img/gsa/gsa03.png" alt="image-20210913220644670" /></p>
<p>Once it is drafted, you can request a CVE identifier or just publish it. The GitHub team will then review it and add it to the advisory database:</p>
<p><img src="https://asp.net-hacker.rocks/img/gsa/gsa04.png" alt="image-20210913221748858" /></p>
<p><img src="https://asp.net-hacker.rocks/img/gsa/gsa05.png" alt="image-20210913221830226" /></p>
<p>That's it.</p>
<p>If you find a critical problem in a repository that you don't own, you should create an issue on that repository and describe what you found. The repository owner should then fix the problem and add an advisory to the database.</p>
<h2>Conclusion</h2>
<p>You definitely should take care of your dependencies and check them for vulnerabilities. And you definitely should have a look at the GitHub Advisory Database and report your advisories there.</p>
<p>This would help to keep the applications secure.</p>https://asp.net-hacker.rocks/2021/09/02/aspnetcore6-async-stream.htmlASP.NET Core in .NET 6 - Async streaming2021-09-02T00:00:00+00:002021-09-02T00:00:00+00:00Thu, 02 Sep 2021 00:00:00 +0000Jürgen Gutschhttps://asp.net-hacker.rocks/<p>This is the next part of the <a href="/2021/02/22/aspnetcore6-01.html">ASP.NET Core on .NET 6 series</a>. In this post, I'd like to have a look into async streaming.</p>
<p>Async streaming basically means the usage of <code>IAsyncEnumerable<></code>.</p>
<h2>IAsyncEnumerable<></h2>
<p>Async streaming is now supported from the controller action down to the response formatter, as well as on the hosting level. This topic is basically about <code>IAsyncEnumerable<T></code>. Async enumerables are now handled asynchronously all the way down to the response stream. They don't get buffered anymore, which improves performance and reduces memory usage a lot. Huge lists of data now get smoothly streamed to the client.</p>
<p>In the past, we handled large data by sending it in small chunks to the output stream because of the buffering. That way, we needed to find the right balance for the chunk size: smaller chunks increase the CPU load and bigger chunks increase the memory consumption.</p>
<p>This is no longer needed. The <code>IAsyncEnumerable<T></code> does this for you with much better performance.</p>
<p>Even EF Core supports the <code>IAsyncEnumerable<T></code> to query the data. Because of that, working with EF Core is improved as well. Data you fetch from the database using EF Core can now be directly streamed to the output.</p>
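<p>As a sketch of what that looks like in an action method (assuming the <code>WeatherForecastController</code> from this post; <code>AsAsyncEnumerable()</code> is the EF Core call that defers the enumeration):</p>
<pre><code class="language-csharp">// Returning IAsyncEnumerable<T> lets ASP.NET Core stream each record
// to the response while EF Core reads it, instead of buffering the
// whole list in memory first.
[HttpGet("stream")]
public IAsyncEnumerable<WeatherForecast> GetStream()
{
    return _context.WeatherForecasts.AsAsyncEnumerable();
}
</code></pre>
<p>The route name "stream" is made up for this example.</p>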
<p>This is more or less what Microsoft wrote about async streaming, but I'd really like to try it myself. 😃</p>
<h2>Trying to stream large data</h2>
<p>I'd like to try streaming a lot of data from the database to the client. So I create a new web API project using the .NET CLI:</p>
<pre><code class="language-shell">dotnet new webapi -n AsyncStreams -o AsyncStreams
cd AsyncStreams\
code .
</code></pre>
<blockquote>
<p>Microsoft changed most of the .NET CLI project templates to use the minimal API approach.</p>
</blockquote>
<p>This creates a web API project and opens it in Visual Studio Code. We need to add some EF Core packages to work with SQLite and to create EF migrations:</p>
<pre><code class="language-shell">dotnet add package Microsoft.EntityFrameworkCore.Sqlite --version 6.0.0-preview.7.21378.4
dotnet add package Microsoft.EntityFrameworkCore.Design --version 6.0.0-preview.7.21378.4
</code></pre>
<p>To generate that load of data, I also need to add my favorite package GenFu:</p>
<pre><code class="language-shell">dotnet add package GenFu
</code></pre>
<p>This package is pretty useful to create test and mock data.</p>
<p>If you never installed the <code>dotnet-ef</code> global tool, you should install it using the following command. The version should be the same as for the Microsoft.EntityFrameworkCore.Design package. I'm currently using preview 7:</p>
<pre><code class="language-shell">dotnet tool install --global dotnet-ef --version 6.0.0-preview.7.21378.4
</code></pre>
<p>Now let's write some code.</p>
<p>At first, I add an <code>AppDbContext</code> and an <code>AppDbContextFactory</code> to the project:</p>
<pre><code class="language-csharp">using Microsoft.EntityFrameworkCore;
using Microsoft.EntityFrameworkCore.Design;

namespace AsyncStreams
{
    public class AppDbContext : DbContext
    {
        public DbSet<WeatherForecast> WeatherForecasts => Set<WeatherForecast>();

        public AppDbContext(DbContextOptions<AppDbContext> options) : base(options)
        { }
    }

    public class AppDbContextFactory : IDesignTimeDbContextFactory<AppDbContext>
    {
        public AppDbContext CreateDbContext(string[] args)
        {
            var options = new DbContextOptionsBuilder<AppDbContext>();
            options.UseSqlite("Data Source=app.db");

            return new AppDbContext(options.Options);
        }
    }
}
</code></pre>
<p>The factory will be used by the EF tool to work with the migrations.</p>
<p>Next, I need to register the <code>DbContext</code> with the Dependency Injection Container. In the <code>Program.cs</code>, I add the following snippet right after the registration of Swagger:</p>
<pre><code class="language-csharp">builder.Services.AddDbContext<AppDbContext>(options =>
{
    options.UseSqlite("Data Source=app.db");
});
</code></pre>
<p>Next, I'd like to seed a bigger amount of data. To do that, I'm using GenFu in a method called <code>SeedDatabase</code> that I placed in the <code>Program.cs</code> to generate 100,000 records of <code>WeatherForecast</code>:</p>
<pre><code class="language-csharp">// ...more usings
using GenFu;

// ...

app.MapControllers();

SeedDatabase(); // Call the seeding

app.Run();

void SeedDatabase()
{
    using var context = app.Services.CreateScope().ServiceProvider.GetService<AppDbContext>();
    if (context != null && !context.WeatherForecasts.Any())
    {
        var i = 1;
        A.Configure<WeatherForecast>()
            .Fill(c => c.Id, () => { return i++; });
        var weatherForecasts = A.ListOf<WeatherForecast>(100000);
        context.WeatherForecasts.AddRange(weatherForecasts);
        context.SaveChanges();
    }
}
</code></pre>
<p>I need to create a scope to get the <code>AppDbContext</code> out of the <code>ServiceProvider</code>. Then we check whether the database already contains any data. We also need to configure GenFu to not create random IDs. Otherwise, we would get problems when we save the data into the database. Then the list of 100,000 <code>WeatherForecast</code> records gets created and stored in the database, in case there is none yet.</p>
<blockquote>
<p>I would have used the <code>HasData</code> method in the <code>OnModelCreating</code> method of the <code>AppDbContext</code> to seed the data. But seeding large data this way doesn't really work, since EF Migrations creates an insert statement per record in the migration file. This means the size of the migration file grows a lot, and applying the migration took hours on my machine before I stopped it. The .NET host needed almost all the RAM and the CPU load was at 50%. I tried to seed one million records and 100,000 records with no success, and lost three hours this way.</p>
<p>This is why I did the seeding manually before the application starts as proposed in this documentation:
<a href="https://docs.microsoft.com/en-us/ef/core/modeling/data-seeding">https://docs.microsoft.com/en-us/ef/core/modeling/data-seeding</a></p>
<p>I also tried to get one million records loaded with the client, but I got an <strong>Error: Maximum response size reached</strong> message in Postman, so I left it at 100,000. Actually, that points me to the question of where the streaming aspect is... Maybe this is a Postman problem 🤔</p>
</blockquote>
<p>One more thing to do is to change the <code>WeatherForecastController</code> to use the AppDbContext and to return the weather forecasts:</p>
<pre><code class="language-csharp">using Microsoft.AspNetCore.Mvc;

namespace AsyncStreams.Controllers;

[ApiController]
[Route("[controller]")]
public class WeatherForecastController : ControllerBase
{
    private readonly ILogger<WeatherForecastController> _logger;
    private readonly AppDbContext _context;

    public WeatherForecastController(
        ILogger<WeatherForecastController> logger,
        AppDbContext context)
    {
        _logger = logger;
        _context = context;
    }

    [HttpGet]
    public IActionResult Get()
    {
        return Ok(_context.WeatherForecasts);
    }
}
</code></pre>
<p>At last, I need to create the EF migration and to update the database using the global tool:</p>
<pre><code class="language-shell">dotnet ef migrations add InitialCreate
dotnet ef database update
</code></pre>
<p>Since I don't seed the data with the migrations, it will be fast.</p>
<p>That's it. I start the application using dotnet run and call the endpoint in Postman:</p>
<pre><code class="language-curl">GET https://localhost:5001/WeatherForecast/
</code></pre>
<p>It is fascinating. The CPU load of the AsyncStreams application is quite low, but the memory consumption is pretty much the same as with an action method that buffers the data:</p>
<pre><code class="language-csharp">[HttpGet]
public async Task<IActionResult> Get()
{
    return Ok(await _context.WeatherForecasts.ToListAsync());
}
</code></pre>
<p>I guess, I need to do some more tests to get a better comparison of the memory consumption.</p>
<h2>What's next?</h2>
<p>In the next part, I'm going to have a look at the <code>HTTP logging middleware</code> in ASP.NET Core 6.0.</p>https://asp.net-hacker.rocks/2021/08/17/aspnetcore6-minimal-apis.htmlASP.NET Core in .NET 6 - Introducing minimal APIs2021-08-17T00:00:00+00:002021-08-17T00:00:00+00:00Tue, 17 Aug 2021 00:00:00 +0000Jürgen Gutschhttps://asp.net-hacker.rocks/<p>This is the next part of the <a href="/2021/02/22/aspnetcore6-01.html">ASP.NET Core on .NET 6 series</a>. In this post, I'd like to have a look into minimal APIs.</p>
<p>With the preview 4, Microsoft simplified the simplest project template to an absolute minimum. Microsoft created this template to make it easier for new developers to start creating small microservices and HTTP APIs.</p>
<p>When I saw the minimal APIs for the first time some months ago, it reminded me of this:</p>
<pre><code class="language-javascript">var express = require("express");
var app = express();

app.listen(3000, () => {
    console.log("Server running on port 3000");
});

app.get("/url", (req, res, next) => {
    res.json(["Tony", "Lisa", "Michael", "Ginger", "Food"]);
});
</code></pre>
<p>Yes, that is NodeJS using ExpressJS to boot up an HTTP server that provides a minimal API. Actually, the ASP.NET Core minimal APIs look as easy as NodeJS and ExpressJS. You don't believe me? Just have a look.</p>
<h2>Minimal APIs</h2>
<p>To create a minimal API project, you can simply write it on your own or just use the dotnet CLI as usual:</p>
<pre><code class="language-shell">dotnet new web -n MiniApi -o MiniApi
</code></pre>
<p>This command creates a project file, app settings files, and a <code>Program.cs</code> that looks like this:</p>
<pre><code class="language-csharp">using System;
using Microsoft.AspNetCore.Builder;
using Microsoft.Extensions.Hosting;

var builder = WebApplication.CreateBuilder(args);
var app = builder.Build();

if (app.Environment.IsDevelopment())
{
    app.UseDeveloperExceptionPage();
}

app.MapGet("/", () => "Hello World!");

app.Run();
</code></pre>
<blockquote>
<p>Microsoft changed the empty web project in the dotnet CLI to use minimal APIs. You will not find a <code>Startup.cs</code> anymore. It is all in the <code>Program.cs</code>. This should help new developers to get into ASP.NET Core easier.</p>
</blockquote>
<p>If you already know ASP.NET Core, you will recognize some of the things that are used here. The <code>WebApplicationBuilder</code> will be created with the default settings to create the hosting environment, the same way as the <code>WebHostBuilder</code>. After <code>Build()</code> is called, you can use the <code>WebApplication</code> object to map endpoints and to add middlewares like the <code>DeveloperExceptionPage</code>.</p>
<p><code>app.Run()</code> starts the application to serve the endpoints.</p>
<p>You can start the project like any other ASP.NET Core project by running <code>dotnet run</code> or by pressing F5 in your IDE.</p>
<p>Actually, it is all working as any other ASP.NET Core project, but most of the stuff is encapsulated and preconfigured in the <code>WebApplicationBuilder</code> and can be accessed via properties. If you like to register some additional services, you need to access the Services property of the <code>WebApplicationBuilder</code>:</p>
<pre><code class="language-csharp">builder.Services.AddScoped<IMyService, MyService>();
builder.Services.AddTransient<IMyService, MyService>();
builder.Services.AddSingleton<IMyService, MyService>();
builder.Services.AddAuthentication();
builder.Services.AddAuthorization();
builder.Services.AddControllersWithViews();
</code></pre>
<p>Here you can also add the known services like authentication, authorization, and even MVC Controllers with views.</p>
<p>To configure the <code>Configuration</code>, <code>Logging</code>, <code>Host</code>, etc. you also need to access the relevant properties.</p>
<p>On the <code>WebApplication</code> instance, it works the same way as configuring your application inside the <code>Configure</code> method of a <code>Startup</code> class. On the <code>app</code> variable, you can register all the middlewares and routes you like. In the sample above, it is a simple GET response on the default route. You could also register MVC, authentication, authorization, HSTS, etc. as you can do in a common ASP.NET Core project.</p>
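<p>For example, a few more routes can be mapped the same way (a small sketch; the routes are made up for this post):</p>
<pre><code class="language-csharp">// Route parameters are bound directly to the lambda's arguments.
app.MapGet("/hello/{name}", (string name) => $"Hello {name}!");

// The Results helpers produce typed responses, e.g. JSON with a status code.
app.MapGet("/time", () => Results.Ok(new { Utc = DateTime.UtcNow }));
</code></pre>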
<p>The only difference is that it is all in one file.</p>
<p>Even if it doesn't make sense to configure an MVC application using minimal APIs, the following sample demonstrates that minimal APIs are just regular ASP.NET Core under the hood:</p>
<pre><code class="language-csharp">using System;
using System.Net;
using Microsoft.AspNetCore.Builder;
using Microsoft.Extensions.Hosting;
using Microsoft.Extensions.Configuration;
using Microsoft.AspNetCore.Hosting;
using Microsoft.Extensions.DependencyInjection;
using Microsoft.AspNetCore.Identity;

var builder = WebApplication.CreateBuilder(args);

builder.Services.AddAuthentication();
builder.Services.AddAuthorization();
builder.Services.AddControllersWithViews();

var app = builder.Build();

if (app.Environment.IsDevelopment())
{
    app.UseDeveloperExceptionPage();
}
else
{
    app.UseExceptionHandler("/Home/Error");
    app.UseHsts();
}

app.UseHttpsRedirection();
app.UseStaticFiles();
app.UseRouting();

app.UseAuthentication();
app.UseAuthorization();

app.UseEndpoints(endpoints =>
{
    endpoints.MapControllerRoute(
        name: "default",
        pattern: "{controller=Home}/{action=Index}/{id?}");
});

app.Run();
</code></pre>
<p>I really like this approach for simple APIs in a microservice context.</p>
<p>What do you think? Just drop me a comment.</p>
<h2>What's next?</h2>
<p>In the next part, I'm going to look into the support for <a href="/2021/09/02/aspnetcore6-async-stream.html">Async streaming</a> in ASP.NET Core.</p>https://asp.net-hacker.rocks/2021/07/19/aspnetcore6-shaddow-copy-iis.htmlASP.NET Core in .NET 6 - Shadow-copying in IIS2021-07-19T00:00:00+00:002021-07-19T00:00:00+00:00Mon, 19 Jul 2021 00:00:00 +0000Jürgen Gutschhttps://asp.net-hacker.rocks/<p>This is the next part of the <a href="/2021/02/22/aspnetcore6-01.html">ASP.NET Core on .NET 6 series</a>. In this post, I'd like to explore shadow-copying in IIS.</p>
<p>Since .NET locks the assemblies that are run by a process, it is impossible to replace them during an update scenario, especially in scenarios where you self-host an IIS server or where you need to update a running application via FTP.</p>
<p>To solve this, Microsoft added a new feature to the ASP.NET Core module for IIS to shadow copy the application assemblies to a specific folder.</p>
<h2>Exploring Shadow-copying in IIS</h2>
<p>To enable shadow-copying, you need to install the latest preview version of the ASP.NET Core module.</p>
<blockquote>
<p>On a self-hosted IIS server, this requires a new version of the hosting bundle. On Azure App Services, you will be required to install a new ASP.NET Core runtime site extension
(https://devblogs.microsoft.com/aspnet/asp-net-core-updates-in-net-6-preview-3/#shadow-copying-in-iis)</p>
</blockquote>
<p>If you have the requirements ready, you should add a <code>web.config</code> to your project or edit the <code>web.config</code> that is created during the publish process (dotnet publish). Since most of us are using continuous integration and can't touch the <code>web.config</code> after it gets created automatically, you should add it to the project. Just copy the one that got created using dotnet publish. Continuous integration will not override an existing <code>web.config</code>.</p>
<p>To enable it you will need to add some new <code>handlerSettings</code> to the <code>web.config</code>:</p>
<pre><code class="language-xml"><aspNetCore processPath="%LAUNCHER_PATH%" arguments="%LAUNCHER_ARGS%" stdoutLogEnabled="false" stdoutLogFile=".\logs\stdout">
    <handlerSettings>
        <handlerSetting name="experimentalEnableShadowCopy" value="true" />
        <handlerSetting name="shadowCopyDirectory" value="../ShadowCopyDirectory/" />
    </handlerSettings>
</aspNetCore>
</code></pre>
<p>This enables shadow-copying and specifies the shadow copy directory.</p>
<p>After the changes are deployed, you should be able to update the assemblies of a running application.</p>
<h2>What's next?</h2>
<p>In the next part, I'm going to look into <a href="/2021/08/17/aspnetcore6-minimal-apis.html">minimal APIs</a> in ASP.NET Core.</p>https://asp.net-hacker.rocks/2021/07/12/aspnetcore6-hot-reload.htmlASP.NET Core in .NET 6 - Hot Reload2021-07-12T00:00:00+00:002021-07-12T00:00:00+00:00Mon, 12 Jul 2021 00:00:00 +0000Jürgen Gutschhttps://asp.net-hacker.rocks/<p>This is the next part of the <a href="/2021/02/22/aspnetcore6-01.html">ASP.NET Core on .NET 6 series</a>. In this post, I'd like to have a look at the .NET 6 support for Hot Reload.</p>
<p>In preview 3, Microsoft started to add support for Hot Reload, which is automatically enabled when you run <code>dotnet watch</code>. Preview 4 also includes support for Hot Reload in Visual Studio. Currently, I'm using preview 5 to try Hot Reload.</p>
<h2>Playing around with Hot Reload</h2>
<p>To play around and to see how it works, I also create a new MVC project using the following commands:</p>
<pre><code class="language-shell">dotnet new mvc -o HotReload -n HotReload
cd HotReload
code .
</code></pre>
<p>These commands create an MVC app, change into the project folder, and open VSCode.</p>
<p><code>dotnet run</code> will not start the application with Hot Reload enabled, but <code>dotnet watch</code> does.</p>
<p>Run the command <code>dotnet watch</code> and see what happens if you change some C#, HTML, or CSS files. It immediately updates the browser and shows you the results. You can see what's happening in the console as well.</p>
<p><img src="https://asp.net-hacker.rocks/img/aspnetcore6/hotreload.png" alt="Hot Reload in action" /></p>
<p>As mentioned initially, Hot Reload is enabled by default, if you use <code>dotnet watch</code>. If you don't want to use Hot Reload, you need to add the option <code>--no-hot-reload</code> to the command:</p>
<pre><code class="language-shell">dotnet watch --no-hot-reload
</code></pre>
<p>Hot Reload should also work with WPF and Windows Forms projects, as well as with .NET MAUI projects. I had a quick try with WPF and it didn't really work with XAML files. Sometimes it also ended up in an infinite build loop.</p>
<p>More about Hot Reload in this blog post: <a href="https://devblogs.microsoft.com/dotnet/introducing-net-hot-reload/">https://devblogs.microsoft.com/dotnet/introducing-net-hot-reload/</a></p>
<h2>What's next?</h2>
<p>In the next part, I'm going to look into the support for <a href="/2021/07/19/aspnetcore6-shaddow-copy-iis.html">Shadow-copying in IIS</a>.</p>https://asp.net-hacker.rocks/2021/07/05/aspnetcore6-http3-tls.htmlASP.NET Core in .NET 6 - HTTP/3 endpoint TLS configuration2021-07-05T00:00:00+00:002021-07-05T00:00:00+00:00Mon, 05 Jul 2021 00:00:00 +0000Jürgen Gutschhttps://asp.net-hacker.rocks/<p>This is the next part of the <a href="/2021/02/22/aspnetcore6-01.html">ASP.NET Core on .NET 6 series</a>. In this post, I'd like to have a look into HTTP/3 endpoint TLS configuration.</p>
<p>In preview 3, Microsoft started to add support for HTTP/3, which brings a lot of improvements to the web. HTTP/3 brings a faster connection setup as well as improved performance on low-quality networks.</p>
<p>Microsoft now adds support for HTTP/3 as well as the support to configure TLS (https) for HTTP/3.</p>
<p><a href="https://en.wikipedia.org/wiki/http/3">More about HTTP/3</a></p>
<h2>HTTP/3 endpoint TLS configuration</h2>
<p>Let's see how you can configure HTTP/3 in a small MVC app using the following commands:</p>
<pre><code class="language-shell">dotnet new mvc -o Http3Tls -n Http3Tls
cd Http3Tls
code .
</code></pre>
<p>These commands create an MVC app, change into the project folder, and open VSCode.</p>
<p>In the <code>Program.cs</code> we need to configure HTTP/3 as shown in Microsoft's blog post:</p>
<pre><code class="language-csharp">public class Program
{
    public static void Main(string[] args)
    {
        CreateHostBuilder(args).Build().Run();
    }

    public static IHostBuilder CreateHostBuilder(string[] args) =>
        Host.CreateDefaultBuilder(args)
            .ConfigureWebHostDefaults(webBuilder =>
            {
                webBuilder
                    .ConfigureKestrel((context, options) =>
                    {
                        options.EnableAltSvc = true;
                        options.Listen(IPAddress.Any, 5001, listenOptions =>
                        {
                            // Enables HTTP/3
                            listenOptions.Protocols = HttpProtocols.Http3;
                            // Adds a TLS certificate to the endpoint
                            listenOptions.UseHttps(httpsOptions =>
                            {
                                httpsOptions.ServerCertificate = LoadCertificate();
                            });
                        });
                    })
                    .UseStartup<Startup>();
            });
}
</code></pre>
<p>The flag <code>EnableAltSvc</code> adds an Alt-Svc header to responses to tell browsers that there are alternative services to the existing HTTP/1 or HTTP/2 endpoints. This is needed to tell the browsers that the alternative services, HTTP/3 in this case, should be treated like the existing ones. This needs an HTTPS connection to be secure and trusted.</p>
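<p>The advertised header then looks roughly like this (illustrative values; the exact protocol token depends on the HTTP/3 draft version in use):</p>
<pre><code class="language-http">Alt-Svc: h3=":5001"; ma=86400
</code></pre>
<p>The <code>ma</code> value is the time in seconds the browser may cache this hint.</p>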
<h2>What's next?</h2>
<p>In the next part, I'm going to look into <code>.NET Hot Reload</code> support in ASP.NET Core.</p>https://asp.net-hacker.rocks/2021/06/15/aspnetcore6-10-blazor-preserve-prerendered-state.htmlASP.NET Core in .NET 6 - Preserve prerendered state in Blazor apps2021-06-15T00:00:00+00:002021-06-15T00:00:00+00:00Tue, 15 Jun 2021 00:00:00 +0000Jürgen Gutschhttps://asp.net-hacker.rocks/<p>This is the next part of the <a href="/2021/02/22/aspnetcore6-01.html">ASP.NET Core on .NET 6 series</a>. In this post, I'd like to have a look into preserving prerendered state in Blazor apps.</p>
<p>Blazor apps can be prerendered on the server to optimize the load time. The app gets rendered immediately in the browser and is available to the user. Unfortunately, the state that is used while prerendering on the server is lost on the client and needs to be recreated after the page is fully loaded. The UI may flicker when the state is recreated and the prerendered HTML is replaced by the HTML that is rendered again on the client.</p>
<p>To solve that, Microsoft added support to persist the state into the prerendered page using the <code><persist-component-state /></code> tag helper. This helps to set up a state that is identical on the server and on the client.</p>
<blockquote>
<p>Actually, I have no idea why this isn't implemented as the default behavior in case the app gets prerendered. It could be done easily and wouldn't break anything, I guess.</p>
</blockquote>
<h2>Try to preserve prerendered states</h2>
<p>I tried it with a new Blazor app and it worked quite well on the <code>FetchData</code> page. The important part is to add the <code>persist-component-state</code> tag helper after all used components in the <code>_Host.cshtml</code>. I placed it right before the script reference to the <code>blazor.server.js</code>:</p>
<pre><code class="language-html"><body>
    <component type="typeof(App)" render-mode="ServerPrerendered" />

    <div id="blazor-error-ui">
        <environment include="Staging,Production">
            An error has occurred. This application may no longer respond until reloaded.
        </environment>
        <environment include="Development">
            An unhandled exception has occurred. See browser dev tools for details.
        </environment>
        <a href="" class="reload">Reload</a>
        <a class="dismiss">🗙</a>
    </div>

    <persist-component-state /> <!-- <== relevant tag helper -->

    <script src="_framework/blazor.server.js"></script>
</body>
</code></pre>
<p>The next snippet is more or less the same as in Microsoft's blog post, except that the <code>forecasts</code> variable is missing there and <code>System.Text.Json</code> should be in the <code>usings</code> as well:</p>
<pre><code class="language-csharp">@page "/fetchdata"
@implements IDisposable

@using PrerenderedState.Data
@using System.Text.Json
@inject WeatherForecastService ForecastService
@inject ComponentApplicationState ApplicationState

...

@code {
    private WeatherForecast[] forecasts;

    protected override async Task OnInitializedAsync()
    {
        ApplicationState.OnPersisting += PersistForecasts;

        if (!ApplicationState.TryTakePersistedState("fetchdata", out var data))
        {
            forecasts = await ForecastService.GetForecastAsync(DateTime.Now);
        }
        else
        {
            var options = new JsonSerializerOptions
            {
                PropertyNamingPolicy = JsonNamingPolicy.CamelCase,
                PropertyNameCaseInsensitive = true,
            };
            forecasts = JsonSerializer.Deserialize<WeatherForecast[]>(data, options);
        }
    }

    private Task PersistForecasts()
    {
        ApplicationState.PersistAsJson("fetchdata", forecasts);
        return Task.CompletedTask;
    }

    void IDisposable.Dispose()
    {
        ApplicationState.OnPersisting -= PersistForecasts;
    }
}
</code></pre>
<p>What is the tag helper doing?</p>
<p>It renders an HTML comment to the page that contains the state in an encoded format:</p>
<pre><code class="language-HTML"><!--Blazor-Component-State:CfDJ8IZEzFk/KP1DoDRucCE
6nSjBxhfV8XW7LAhH9nkG90KnWp6A83ylBVm+Fkac8gozf2hBP
DSQHeh/jejDrmtDEesKaoyjBNs9G9EDDyyOe1o1zuLnN507mK0
Bjkbyr82Mw83mIVl21n8mxherLqhyuDH3QoHscgIL7rQKBhejP
qGqQLj0WvVYdvYNc6I+FuW4v960+1xiF5XZuEDhKJpFODIZIE7
tIDHJh8NEBWAY5AnenqtydH7382TaVbn+1e0oLFrrSWrNWVRbJ
QcRUR5xpa+yWOZ7U52iudA27ZZr5Z8+LrU9/QVre3ehO+WSW7D
Z/kSnvSkpSnGRMjFDUSgWJp3WE/y9ZKIqzmnOymihJARThmUUM
ewmU2oKkb6alKJ9SabJ0Dbj/ZLwJiDpIt1je5RpZGQvEp7SWJy
VMGieHgGL9lp2UIKwCX2HMiVB+b7UpYSby5+EjLW6FB8Yh5yY3
7IK90KVzl/45UDIJWWXpltHMhJqX2eiFxT7QS3p7tbG08jeBBf
6d74Bb7q6yxfgfRuPigERZhM1MEpqYvkHsugj7TC/z1mN2RF2l
yqjbF3VG/bpATkQyVkcZq4ll/zg+98PcXS18waisz7gntG3iwM
u/sf8ugqaFWQ1hS8CU3+JtvINC7bRDfg4g4joJjlutmmlMcttQ
GCCkt+hkGKxeAyMzHbnRkv8pVyPr4ckCjLdW02H5QhgebOWGGZ
etGlFih1Dtr5cidHT0ra72pgWNoSb7jqk4wVE+E5gmEOiuX0N2
/avvuwAnAifY9Sha1cY27ZxcNJQ5ZOejTXwquuitAdotatdk89
id3WDiTt6T0LvUywvMoga8qWIPqeZw+0VmBKJjFOwQRqx1dy9E
qq4zpTBOECcinKTsbnSb5KkRLQkrCQi4MJCkh/JzvKXP+/bksd
8B3ife7ad1aFgYwX/jvAtO8amzGiMaQvgYQyHsOQwqfrYUSFZm
9hGsdXUmWlE/g8VejWlSUiforHpVjPJojsfYfmeLOjRoSPBTQZ
Q0LL4ie/QFmKXY/TI7GjJCs5UuPM=-->
</code></pre>
<p>(I added some line breaks here)</p>
<p>This reminds me of the ViewState we had in ASP.NET WebForms. Does this make Blazor Server the successor of ASP.NET WebForms? Just kidding.</p>
<p>Actually, it is not really the ViewState, because it does not get sent back to the server. It just helps the client restore the state that was initially created on the server while it was prerendered.</p>
<h2>What's next?</h2>
<p>In the next part, I'm going to look into the support for <a href="/2021/07/05/aspnetcore6-http3-tls.html">HTTP/3 endpoint TLS configuration</a> in ASP.NET Core.</p>