<?xml version="1.0" encoding="utf-8" standalone="yes"?><rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom"><channel><title>Home on Codewrecks</title><link>https://www.codewrecks.com/</link><description>Recent content in Home on Codewrecks</description><generator>Hugo</generator><language>en</language><lastBuildDate>Mon, 02 Feb 2026 06:00:00 +0000</lastBuildDate><atom:link href="https://www.codewrecks.com/index.xml" rel="self" type="application/rss+xml"/><item><title>AI Browser Agents and Security: Isolation Levels to Protect Your Digital Life</title><link>https://www.codewrecks.com/post/ai/ai-browser-isolation-levels/</link><pubDate>Mon, 02 Feb 2026 06:00:00 +0000</pubDate><guid>https://www.codewrecks.com/post/ai/ai-browser-isolation-levels/</guid><description>&lt;p>AI browser agents are becoming increasingly powerful. Tools like &lt;strong>Anthropic&amp;rsquo;s Claude, OpenAI&amp;rsquo;s Operator&lt;/strong>, and similar products can &lt;strong>navigate the web on your behalf&lt;/strong>, clicking buttons, filling forms, and interacting with services. This is incredibly useful, but it introduces a security problem that many people overlook.&lt;/p>
&lt;blockquote>
&lt;p>If the AI controls a browser where you are logged into all your primary accounts, you are giving it the keys to your digital life.&lt;/p>&lt;/blockquote></description></item><item><title>Why VS Code for C# Developers</title><link>https://www.codewrecks.com/post/github/why-vs-code-as-csharp-developer/</link><pubDate>Tue, 06 Jan 2026 06:00:00 +0000</pubDate><guid>https://www.codewrecks.com/post/github/why-vs-code-as-csharp-developer/</guid><description>&lt;p>If you&amp;rsquo;re a C# developer who has relied on the full Visual Studio IDE for years, Visual Studio Code (VS Code) might look like too small a tool for serious work. In reality &lt;strong>with big solutions you will find full Visual Studio more productive&lt;/strong>, but VS Code now has some specific features that make it more useful, especially for small projects.&lt;/p>
&lt;h2 id="c-developer-toolkit">C# Developer Toolkit&lt;/h2>
&lt;p>For productive C# development in VS Code you only need a few well-chosen pieces: the .NET SDK and &lt;code>dotnet&lt;/code> CLI, the official &lt;code>C#&lt;/code> extension (OmniSharp) for IntelliSense and debugging, a Test Explorer adapter for running unit tests, and a couple of linters/analysers (EditorConfig, Roslyn analyzers or StyleCop) to keep code consistent. These core tools give you a fast edit-build-test loop while keeping your environment simple and reproducible.&lt;/p></description></item><item><title>GitHub Copilot Plan-Then-Execute: Leveraging Background Agents and Git Worktrees</title><link>https://www.codewrecks.com/post/ai/agent-plan-then-execute/</link><pubDate>Sat, 03 Jan 2026 10:00:00 +0000</pubDate><guid>https://www.codewrecks.com/post/ai/agent-plan-then-execute/</guid><description>&lt;h1 id="github-copilot-plan-then-execute-in-background-with-git-worktree">GitHub Copilot: Plan Then Execute in Background with Git Worktree&lt;/h1>
&lt;p>GitHub Copilot&amp;rsquo;s recent evolution has introduced a revolutionary capability that fundamentally changes how developers can delegate work to AI: the ability to &lt;strong>plan first and then execute in background&lt;/strong> using &lt;strong>git worktrees&lt;/strong>. This feature represents a real evolution of your everyday workflow with AI coding assistants.&lt;/p>
&lt;h2 id="the-problem-with-synchronous-ai-development">The Problem with Synchronous AI Development&lt;/h2>
&lt;p>When working with traditional AI assistants in the IDE, we often find ourselves in a situation of &lt;strong>tight coupling&lt;/strong> between us and the AI: the process is synchronous, like asking a colleague to write some code, then waiting for them to finish so you can review. &lt;strong>The problem with an AI assistant is that I do not want to waste my time waiting for it to finish&lt;/strong>: I want to be able to delegate the work and continue with other tasks while the AI works in the background and finally presents me with the results.&lt;/p></description></item><item><title>First experiments with Claude Code - Writing specs</title><link>https://www.codewrecks.com/post/ai/claude-agents-first-experiments/</link><pubDate>Sat, 02 Aug 2025 06:00:00 +0000</pubDate><guid>https://www.codewrecks.com/post/ai/claude-agents-first-experiments/</guid><description>&lt;p>When vibe coding, detailed specifications are highly advisable. The &lt;strong>more freedom&lt;/strong> given to the model, the &lt;strong>more the result can deviate&lt;/strong> from your needs.&lt;/p>
&lt;blockquote>
&lt;p>Using a model to help write detailed specifications is an easy way to accomplish this.&lt;/p>&lt;/blockquote>
&lt;p>This allows you to quickly generate good, detailed specs that you can manually refine. I often use various models with Open Web UI for this.&lt;/p>
&lt;p>Now that Claude Code offers agents, and creating an agent is simple, I immediately &lt;strong>tried creating agents to automate this process.&lt;/strong> The results are promising, but not perfect. &lt;strong>Claude Code seems focused on writing code and tends to over-deliver.&lt;/strong> Starting with a simple idea and asking Claude Code to create an agent for brainstorming detailed specifications yields good results, but how good are they?&lt;/p></description></item><item><title>Call o3-pro in Azure OpenAI using C# SDK</title><link>https://www.codewrecks.com/post/ai/call-o3-pro-with-csharp-sdk/</link><pubDate>Wed, 16 Jul 2025 06:00:00 +0000</pubDate><guid>https://www.codewrecks.com/post/ai/call-o3-pro-with-csharp-sdk/</guid><description>&lt;p>Azure OpenAI now supports the o3-pro model. This model is different from the others; here is a brief summary of its advantages: &lt;em>The o3-pro model excels at advanced reasoning, delivers high accuracy, supports long-context tasks, and integrates powerful tools like web search and code execution. It’s cost-effective, reliable, and ideal for complex workflows in research, business, and creative fields.&lt;/em>&lt;/p>
&lt;p>You can find tons of examples of how to call Azure OpenAI models with the C# NuGet package, but &lt;strong>o3-pro is a slightly different beast&lt;/strong>. The usual problem is that you reuse existing code and get this error.&lt;/p></description></item><item><title>Pin GitHub action SHA to avoid security risk</title><link>https://www.codewrecks.com/post/github/github-sha-pinning/</link><pubDate>Sun, 16 Mar 2025 08:00:00 +0200</pubDate><guid>https://www.codewrecks.com/post/github/github-sha-pinning/</guid><description>&lt;h1 id="the-problem">The problem&lt;/h1>
&lt;p>When you author GitHub action pipelines, you usually use third-party actions, which can be easily referenced in your workflow with a simple syntax.&lt;/p>
&lt;p>Usually you reference a GitHub action in your workflow with the following syntax:&lt;/p>
&lt;div class="highlight">&lt;div style="color:#f8f8f2;background-color:#272822;-moz-tab-size:4;-o-tab-size:4;tab-size:4;">
&lt;table style="border-spacing:0;padding:0;margin:0;border:0;">&lt;tr>&lt;td style="vertical-align:top;padding:0;margin:0;border:0;">
&lt;pre tabindex="0" style="color:#f8f8f2;background-color:#272822;-moz-tab-size:4;-o-tab-size:4;tab-size:4;">&lt;code>&lt;span style="white-space:pre;-webkit-user-select:none;user-select:none;margin-right:0.4em;padding:0 0.4em 0 0.4em;color:#7f7f7f">1
&lt;/span>&lt;span style="white-space:pre;-webkit-user-select:none;user-select:none;margin-right:0.4em;padding:0 0.4em 0 0.4em;color:#7f7f7f">2
&lt;/span>&lt;span style="white-space:pre;-webkit-user-select:none;user-select:none;margin-right:0.4em;padding:0 0.4em 0 0.4em;color:#7f7f7f">3
&lt;/span>&lt;span style="white-space:pre;-webkit-user-select:none;user-select:none;margin-right:0.4em;padding:0 0.4em 0 0.4em;color:#7f7f7f">4
&lt;/span>&lt;span style="white-space:pre;-webkit-user-select:none;user-select:none;margin-right:0.4em;padding:0 0.4em 0 0.4em;color:#7f7f7f">5
&lt;/span>&lt;/code>&lt;/pre>&lt;/td>
&lt;td style="vertical-align:top;padding:0;margin:0;border:0;;width:100%">
&lt;pre tabindex="0" style="color:#f8f8f2;background-color:#272822;-moz-tab-size:4;-o-tab-size:4;tab-size:4;">&lt;code class="language-yml" data-lang="yml">&lt;span style="display:flex;">&lt;span>- &lt;span style="color:#f92672">name&lt;/span>: &lt;span style="color:#ae81ff">Setup Hugo&lt;/span>
&lt;/span>&lt;/span>&lt;span style="display:flex;">&lt;span> &lt;span style="color:#f92672">uses&lt;/span>: &lt;span style="color:#ae81ff">peaceiris/actions-hugo@v2&lt;/span>
&lt;/span>&lt;/span>&lt;span style="display:flex;">&lt;span> &lt;span style="color:#f92672">with&lt;/span>:
&lt;/span>&lt;/span>&lt;span style="display:flex;">&lt;span> &lt;span style="color:#f92672">hugo-version&lt;/span>: &lt;span style="color:#e6db74">&amp;#39;0.128.0&amp;#39;&lt;/span>
&lt;/span>&lt;/span>&lt;span style="display:flex;">&lt;span> &lt;span style="color:#f92672">extended&lt;/span>: &lt;span style="color:#66d9ef">true&lt;/span>&lt;/span>&lt;/span>&lt;/code>&lt;/pre>&lt;/td>&lt;/tr>&lt;/table>
&lt;/div>
&lt;/div>
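&lt;p>To pin the action, replace the mutable tag with a full commit SHA. A pinned version of the same step could look like the sketch below (the SHA is illustrative, not a real commit of &lt;code>peaceiris/actions-hugo&lt;/code>; look up the commit of the release you want to pin):&lt;/p>

```yml
- name: Setup Hugo
  # Pinned to an immutable commit SHA; the trailing comment keeps the
  # human-readable version next to it (SHA below is illustrative)
  uses: peaceiris/actions-hugo@75d2e84710de30f6ff7268e08f310b60ef14033f # v2
  with:
    hugo-version: '0.128.0'
    extended: true
```

&lt;p>Unlike a tag, a commit SHA cannot be moved to point at malicious code, which is the whole point of pinning.&lt;/p>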
&lt;p>This is the &lt;strong>standard way to use a third party action inside your workflow&lt;/strong>: you specify the name of the repository in the &lt;code>uses&lt;/code> part of the step, and then specify parameters.&lt;/p></description></item><item><title>SonarCloud analysis in GitHub Actions</title><link>https://www.codewrecks.com/post/github/github-sonarcloud-docker/</link><pubDate>Fri, 28 Feb 2025 06:00:00 +0200</pubDate><guid>https://www.codewrecks.com/post/github/github-sonarcloud-docker/</guid><description>&lt;p>Running SonarCloud analysis on your open source project is usually a good thing: it is free, it &lt;strong>gives you tons of useful information&lt;/strong>, and you can automate everything for free with &lt;strong>GitHub Actions&lt;/strong>.&lt;/p>
&lt;p>I&amp;rsquo;ve dealt with this kind of functionality before on my blog; today I just want to show how to create a GH action that works on Linux, because if you &lt;strong>take the original action from the SonarCloud site it will use a Windows machine for building&lt;/strong>.&lt;/p></description></item><item><title>Understanding Azure DevOps Pipeline Statistics</title><link>https://www.codewrecks.com/post/azdo/pills/pipeline-statistics-dashboard/</link><pubDate>Tue, 11 Feb 2025 08:00:00 +0200</pubDate><guid>https://www.codewrecks.com/post/azdo/pills/pipeline-statistics-dashboard/</guid><description>&lt;p>Pipeline statistics in Azure DevOps provide a wealth of information that can help you understand the &lt;strong>performance and efficiency of your pipelines&lt;/strong>. Even with really basic data you can extract interesting insights. Let&amp;rsquo;s examine.&lt;/p>
&lt;p>&lt;a target="_blank" href="../images/pipeline-statistics-dashboard.png"> &lt;img src="../images/pipeline-statistics-dashboard.png" alt="Pipeline statistics dashboard in Azure DevOps" />&lt;/a>&lt;/p>
&lt;p>&lt;em>&lt;strong>Figure 1&lt;/strong>&lt;/em>: &lt;em>Pipeline statistics dashboard in Azure DevOps&lt;/em>&lt;/p>
&lt;p>My favorite piece of information (1) is the 80th percentile of pipeline duration; this tells me &lt;strong>if for some reason a pipeline is becoming slower&lt;/strong>. In &lt;strong>Figure 1&lt;/strong> I can see what seems to be an increase in pipeline duration. &lt;strong>With (2) you can immediately understand which step dominates the execution time&lt;/strong>. In this example it is no mystery: the IN-PROCESS step is actually running a full suite of integration tests that use MongoDB and Elasticsearch and exercise a complex piece of software end-to-end without the UI.&lt;/p></description></item><item><title>Pill: Problems in Azure DevOps Pipelines due to Shallow Fetch</title><link>https://www.codewrecks.com/post/azdo/pills/shallow-fetch-pipeline/</link><pubDate>Sun, 02 Feb 2025 06:00:00 +0200</pubDate><guid>https://www.codewrecks.com/post/azdo/pills/shallow-fetch-pipeline/</guid><description>&lt;p>In Azure DevOps, pipelines are a fundamental component for automating the build and release process. One of the key optimizations in these pipelines is the use of &lt;strong>shallow fetch when cloning repositories&lt;/strong>. Unlike a full clone, which downloads the entire history of the repository, a &lt;strong>shallow fetch retrieves only the specific commit needed for the build&lt;/strong>. This is a really welcome feature, because repositories can contain years of history, or they can have &lt;strong>some big file committed by mistake in some older commit&lt;/strong>.&lt;/p></description></item><item><title>Azure DevOps Pills: Hide not used features from Team Projects</title><link>https://www.codewrecks.com/post/azdo/pills/hide-not-used-feature/</link><pubDate>Tue, 21 Jan 2025 08:12:42 +0200</pubDate><guid>https://www.codewrecks.com/post/azdo/pills/hide-not-used-feature/</guid><description>&lt;p>Azure DevOps offers a really complete set of functionalities to manage your Development Team and more.
As you can see from Figure 1, it has five main macro sets of features that you can use. All of them are visible in the &lt;strong>five icons in the lower right part of each Team Project&amp;rsquo;s card&lt;/strong>.&lt;/p>
&lt;p>&lt;a target="_blank" href="../images/azdo-blocks.png"> &lt;img src="../images/azdo-blocks.png" alt="AzDo five main features blocks" />&lt;/a>&lt;/p>
&lt;p>&lt;em>&lt;strong>Figure 1:&lt;/strong>&lt;/em> &lt;em>AzDo five main features blocks&lt;/em>&lt;/p>
&lt;p>The very same five macro feature sets are visible in the left menu when you work with the detail of a Team Project. These parts are, left to right:&lt;/p></description></item><item><title>Azure DevOps Pills: Differences between old and new release pipeline</title><link>https://www.codewrecks.com/post/azdo/pills/release-new-and-old/</link><pubDate>Sun, 05 Jan 2025 08:12:42 +0200</pubDate><guid>https://www.codewrecks.com/post/azdo/pills/release-new-and-old/</guid><description>&lt;p>Happy New Year to everyone. Today I&amp;rsquo;ll deal with a common question I get from customers regarding Azure DevOps release pipelines. The problem arises because &lt;strong>we first had a GUI based pipeline and then a fully YAML pipeline&lt;/strong>, so people are somewhat puzzled about which one to use in their scenario.&lt;/p>
&lt;blockquote>
&lt;p>When you have two ways to do the same thing, you are often confused about which one to use to reach your goal.&lt;/p>&lt;/blockquote></description></item><item><title>Azure DevOps Pills: Cleanup on premise pipeline agents</title><link>https://www.codewrecks.com/post/azdo/pills/cleanup-build-agent/</link><pubDate>Mon, 02 Dec 2024 08:00:42 +0200</pubDate><guid>https://www.codewrecks.com/post/azdo/pills/cleanup-build-agent/</guid><description>&lt;p>Managing pipeline/build agents is something that &lt;strong>you should avoid if possible, preferring &lt;a href="https://www.codewrecks.com/post/old/2019/12/azure-devops-agent-with-docker-compose/">docker based agents&lt;/a> or Microsoft hosted agents&lt;/strong>. Sometimes this is not a viable option, especially if you have lots of integration tests that run on MongoDB/Elasticsearch/etc. While it &lt;strong>is quite simple to create a pipeline that uses Docker to run these prerequisites&lt;/strong>, speed is sometimes a problem that makes this solution not so feasible.&lt;/p>
&lt;p>Azure DevOps pipeline pricing is based on concurrent executions, so it is quite &lt;strong>important that pipelines run fast, both to consume fewer licenses and, more importantly, to give quick feedback to the team&lt;/strong>. For this reason, we use physical machines with fast NVMe disks, plenty of RAM, and MongoDB / Elasticsearch / SQL installed on bare metal for maximum integration-test speed.&lt;/p></description></item><item><title>Pills: Accessing your Git Repositories in Azure DevOps in Linux</title><link>https://www.codewrecks.com/post/azdo/pills/accessing-git-from-linux/</link><pubDate>Mon, 18 Nov 2024 08:10:42 +0200</pubDate><guid>https://www.codewrecks.com/post/azdo/pills/accessing-git-from-linux/</guid><description>&lt;p>When you need to access your Git repositories hosted on Azure DevOps from Linux, you basically have two distinct options.&lt;/p>
&lt;p>The first one is the classic &lt;strong>SSH protocol, which is well known to everyone working with Linux systems&lt;/strong>.&lt;/p>
&lt;p>&lt;a target="_blank" href="../images/ssh-clone.png"> &lt;img src="../images/ssh-clone.png" alt="Choosing SSH as protocol to clone from Azure DevOps" />&lt;/a>&lt;/p>
&lt;p>&lt;em>&lt;strong>Figure 1&lt;/strong>&lt;/em>: &lt;em>Choosing SSH as protocol to clone from Azure DevOps&lt;/em>&lt;/p>
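&lt;p>Since Azure DevOps only accepts standard key types, a plain RSA key pair is the safe choice. A minimal sketch (the file path and comment string are assumptions, adjust to taste):&lt;/p>

```shell
# Create the .ssh folder if it is missing, then generate a 4096-bit RSA key pair
mkdir -p "$HOME/.ssh"
ssh-keygen -t rsa -b 4096 -C "azure-devops" -f "$HOME/.ssh/azdo_rsa" -N ""
# Print the public key, ready to paste into Azure DevOps user settings
cat "$HOME/.ssh/azdo_rsa.pub"
```

&lt;p>Paste the printed public key into the SSH public keys page of your Azure DevOps user settings.&lt;/p>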
&lt;p>This is the preferred way to access &lt;strong>from a Linux system&lt;/strong> but sadly, Azure DevOps still does not support hardware-key-based SSH keys, so you are &lt;strong>limited to standard RSA keys&lt;/strong> as you can see in &lt;strong>Figure 2&lt;/strong>. This is quite a limitation for those used to YubiKeys or Google Titan keys.&lt;/p></description></item><item><title>Using Castle Windsor in .NET 8</title><link>https://www.codewrecks.com/post/general/using-castle-windsor-in-dotnet-8/</link><pubDate>Thu, 14 Nov 2024 08:00:00 +0200</pubDate><guid>https://www.codewrecks.com/post/general/using-castle-windsor-in-dotnet-8/</guid><description>&lt;p>Microsoft introduced Dependency Injection in the base framework with .NET Core, but until then, with the classic framework, we used external libraries. I&amp;rsquo;ve used Castle Windsor for years without any problem, but &lt;strong>when .NET 8 came out, we started having tons of problems.&lt;/strong>&lt;/p>
&lt;p>The main problem is that with .NET 8 Microsoft &lt;strong>introduced a concrete and working implementation of named dependencies (keyed services)&lt;/strong>; you can &lt;a href="https://weblogs.asp.net/ricardoperes/net-8-dependency-injection-changes-keyed-services">find some details here&lt;/a>. The problem is that the Castle Windsor adapter does not work with keyed services, so together with my friend &lt;a href="https://github.com/AGiorgetti">Alessandro&lt;/a> I wrote a working version, which is still in a PR on the official adapter.&lt;/p></description></item><item><title>Pill: Unable to change Work Item type in Azure DevOps</title><link>https://www.codewrecks.com/post/azdo/pills/unable-change-type/</link><pubDate>Wed, 13 Nov 2024 08:10:42 +0200</pubDate><guid>https://www.codewrecks.com/post/azdo/pills/unable-change-type/</guid><description>&lt;p>Today I got a strange error in Azure DevOps. I created a new Product Backlog Item, and while writing it I realized it would be better as a Bug. My natural reaction was to save and then use the Change Type command, but I got this error.&lt;/p>
&lt;p>&lt;a target="_blank" href="../images/error-change-type.png"> &lt;img src="../images/error-change-type.png" alt="Error Changing a Work Item Type" />&lt;/a>&lt;/p>
&lt;p>&lt;em>&lt;strong>Figure 1&lt;/strong>&lt;/em>: &lt;em>Error Changing a Work Item Type&lt;/em>&lt;/p>
&lt;blockquote>
&lt;p>Work item type(s) cannot be moved because it is disabled, hidden or not supported.&lt;/p>&lt;/blockquote></description></item><item><title>Pills: Connect Azdo to external software</title><link>https://www.codewrecks.com/post/azdo/pills/connect-azdo-to-external-software/</link><pubDate>Mon, 21 Oct 2024 06:10:42 +0200</pubDate><guid>https://www.codewrecks.com/post/azdo/pills/connect-azdo-to-external-software/</guid><description>&lt;p>In our team, everything regarding development is kept in Azure DevOps, but other information is stored inside a custom application, so we &lt;strong>often need to jump between one system and the other&lt;/strong>. The actual connection is: one element in our software is bound to one or more Work Items in Azure DevOps.&lt;/p>
&lt;blockquote>
&lt;p>The desired result is the ability to easily create a connection between the two, reducing the need to jump between the two systems to see key information.&lt;/p>&lt;/blockquote></description></item><item><title>Azure DevOps: Cleanup Docker images for your Pull Requests</title><link>https://www.codewrecks.com/post/azdo/pipeline/clean-docker-images-for-your-pull-requests/</link><pubDate>Tue, 09 Jul 2024 07:00:42 +0000</pubDate><guid>https://www.codewrecks.com/post/azdo/pipeline/clean-docker-images-for-your-pull-requests/</guid><description>&lt;p>This article is a continuation of &lt;a href="https://www.codewrecks.com/post/azdo/pipeline/build-and-create-docker-for-your-pr/">the previous one on creating Docker Images for your Pull Requests&lt;/a> and deals with the cleanup of your Docker registry.&lt;/p>
&lt;h2 id="authentication-to-azure">Authentication to Azure&lt;/h2>
&lt;p>In Azure DevOps you can use service connections to connect to Azure accounts or external services, but since I&amp;rsquo;m mainly using PowerShell scripts inside my repository, &lt;strong>I often prefer using a Service Principal&lt;/strong>. This is good practice because you can limit the service principal&amp;rsquo;s access to only the resources it needs, and you can revoke that access at any time.&lt;/p></description></item><item><title>Azure DevOps: Create Docker images for a Pull Request</title><link>https://www.codewrecks.com/post/azdo/pipeline/build-and-create-docker-for-your-pr/</link><pubDate>Sun, 07 Jul 2024 07:00:42 +0000</pubDate><guid>https://www.codewrecks.com/post/azdo/pipeline/build-and-create-docker-for-your-pr/</guid><description>&lt;p>The whole Pull Request mechanism has a single purpose: &lt;strong>improving the quality of the code that reaches develop or, generally speaking, the main branch&lt;/strong>. The ability to share code and get feedback from other members of the team is invaluable, but is it enough?&lt;/p>
&lt;p>The basic concept is: develop is a branch that should be &lt;strong>considered production&lt;/strong>, and it is not uncommon for teams to deploy the develop branch automatically to internal production servers, a procedure called &lt;strong>dogfooding&lt;/strong>. Here in the Nebula Team we automatically deploy the develop branch to our internal production servers, and sometimes you intercept bugs before they hit master and the production environments of all customers.&lt;/p></description></item><item><title>GitHub Copilot Workspace: first impression</title><link>https://www.codewrecks.com/post/ai/gh-copilot-workspace-first-impression/</link><pubDate>Thu, 13 Jun 2024 06:00:00 +0000</pubDate><guid>https://www.codewrecks.com/post/ai/gh-copilot-workspace-first-impression/</guid><description>&lt;p>I am lucky enough to have the technical preview of &lt;a href="https://github.blog/2024-04-29-github-copilot-workspace/">Github Copilot Workspace&lt;/a> enabled. Like every good developer, I immediately jumped at it without reading anything (spoiler: there is a good &lt;a href="https://github.com/githubnext/copilot-workspace-user-manual">user manual&lt;/a>) that can be used to &lt;strong>take your first steps into this marvelous world&lt;/strong>. Not reading the documentation is for me a good way to start using a tool and verify &lt;strong>how intuitive it is and what it can do on a real project&lt;/strong>.&lt;/p></description></item><item><title>Azure DevOps: Package source mapping in pipeline</title><link>https://www.codewrecks.com/post/azdo/pipeline/package-sources-mapping-and-pipeline/</link><pubDate>Tue, 21 May 2024 07:00:42 +0000</pubDate><guid>https://www.codewrecks.com/post/azdo/pipeline/package-sources-mapping-and-pipeline/</guid><description>&lt;p>If you use more than one NuGet feed in your solution, and especially if you are using central package versioning, you probably got a warning telling you to use &lt;strong>Package Source Mapping&lt;/strong>.
The process is straightforward: it consists of modifying your nuget.config file &lt;strong>to specify, for each package, the source feed where NuGet can find it&lt;/strong>.&lt;/p>
&lt;p>Here is an example from a solution I&amp;rsquo;m working on:&lt;/p>
&lt;div class="highlight">&lt;div style="color:#f8f8f2;background-color:#272822;-moz-tab-size:4;-o-tab-size:4;tab-size:4;">
&lt;table style="border-spacing:0;padding:0;margin:0;border:0;">&lt;tr>&lt;td style="vertical-align:top;padding:0;margin:0;border:0;">
&lt;pre tabindex="0" style="color:#f8f8f2;background-color:#272822;-moz-tab-size:4;-o-tab-size:4;tab-size:4;">&lt;code>&lt;span style="white-space:pre;-webkit-user-select:none;user-select:none;margin-right:0.4em;padding:0 0.4em 0 0.4em;color:#7f7f7f"> 1
&lt;/span>&lt;span style="white-space:pre;-webkit-user-select:none;user-select:none;margin-right:0.4em;padding:0 0.4em 0 0.4em;color:#7f7f7f"> 2
&lt;/span>&lt;span style="white-space:pre;-webkit-user-select:none;user-select:none;margin-right:0.4em;padding:0 0.4em 0 0.4em;color:#7f7f7f"> 3
&lt;/span>&lt;span style="white-space:pre;-webkit-user-select:none;user-select:none;margin-right:0.4em;padding:0 0.4em 0 0.4em;color:#7f7f7f"> 4
&lt;/span>&lt;span style="white-space:pre;-webkit-user-select:none;user-select:none;margin-right:0.4em;padding:0 0.4em 0 0.4em;color:#7f7f7f"> 5
&lt;/span>&lt;span style="white-space:pre;-webkit-user-select:none;user-select:none;margin-right:0.4em;padding:0 0.4em 0 0.4em;color:#7f7f7f"> 6
&lt;/span>&lt;span style="white-space:pre;-webkit-user-select:none;user-select:none;margin-right:0.4em;padding:0 0.4em 0 0.4em;color:#7f7f7f"> 7
&lt;/span>&lt;span style="white-space:pre;-webkit-user-select:none;user-select:none;margin-right:0.4em;padding:0 0.4em 0 0.4em;color:#7f7f7f"> 8
&lt;/span>&lt;span style="white-space:pre;-webkit-user-select:none;user-select:none;margin-right:0.4em;padding:0 0.4em 0 0.4em;color:#7f7f7f"> 9
&lt;/span>&lt;span style="white-space:pre;-webkit-user-select:none;user-select:none;margin-right:0.4em;padding:0 0.4em 0 0.4em;color:#7f7f7f">10
&lt;/span>&lt;span style="white-space:pre;-webkit-user-select:none;user-select:none;margin-right:0.4em;padding:0 0.4em 0 0.4em;color:#7f7f7f">11
&lt;/span>&lt;span style="white-space:pre;-webkit-user-select:none;user-select:none;margin-right:0.4em;padding:0 0.4em 0 0.4em;color:#7f7f7f">12
&lt;/span>&lt;span style="white-space:pre;-webkit-user-select:none;user-select:none;margin-right:0.4em;padding:0 0.4em 0 0.4em;color:#7f7f7f">13
&lt;/span>&lt;span style="white-space:pre;-webkit-user-select:none;user-select:none;margin-right:0.4em;padding:0 0.4em 0 0.4em;color:#7f7f7f">14
&lt;/span>&lt;span style="white-space:pre;-webkit-user-select:none;user-select:none;margin-right:0.4em;padding:0 0.4em 0 0.4em;color:#7f7f7f">15
&lt;/span>&lt;span style="white-space:pre;-webkit-user-select:none;user-select:none;margin-right:0.4em;padding:0 0.4em 0 0.4em;color:#7f7f7f">16
&lt;/span>&lt;span style="white-space:pre;-webkit-user-select:none;user-select:none;margin-right:0.4em;padding:0 0.4em 0 0.4em;color:#7f7f7f">17
&lt;/span>&lt;span style="white-space:pre;-webkit-user-select:none;user-select:none;margin-right:0.4em;padding:0 0.4em 0 0.4em;color:#7f7f7f">18
&lt;/span>&lt;span style="white-space:pre;-webkit-user-select:none;user-select:none;margin-right:0.4em;padding:0 0.4em 0 0.4em;color:#7f7f7f">19
&lt;/span>&lt;span style="white-space:pre;-webkit-user-select:none;user-select:none;margin-right:0.4em;padding:0 0.4em 0 0.4em;color:#7f7f7f">20
&lt;/span>&lt;span style="white-space:pre;-webkit-user-select:none;user-select:none;margin-right:0.4em;padding:0 0.4em 0 0.4em;color:#7f7f7f">21
&lt;/span>&lt;span style="white-space:pre;-webkit-user-select:none;user-select:none;margin-right:0.4em;padding:0 0.4em 0 0.4em;color:#7f7f7f">22
&lt;/span>&lt;span style="white-space:pre;-webkit-user-select:none;user-select:none;margin-right:0.4em;padding:0 0.4em 0 0.4em;color:#7f7f7f">23
&lt;/span>&lt;span style="white-space:pre;-webkit-user-select:none;user-select:none;margin-right:0.4em;padding:0 0.4em 0 0.4em;color:#7f7f7f">24
&lt;/span>&lt;span style="white-space:pre;-webkit-user-select:none;user-select:none;margin-right:0.4em;padding:0 0.4em 0 0.4em;color:#7f7f7f">25
&lt;/span>&lt;/code>&lt;/pre>&lt;/td>
&lt;td style="vertical-align:top;padding:0;margin:0;border:0;;width:100%">
&lt;pre tabindex="0" style="color:#f8f8f2;background-color:#272822;-moz-tab-size:4;-o-tab-size:4;tab-size:4;">&lt;code class="language-powershell" data-lang="powershell">&lt;span style="display:flex;">&lt;span>&amp;lt;&lt;span style="color:#66d9ef">?&lt;/span>xml version=&lt;span style="color:#e6db74">&amp;#34;1.0&amp;#34;&lt;/span> encoding=&lt;span style="color:#e6db74">&amp;#34;utf-8&amp;#34;&lt;/span>?&amp;gt;
&lt;/span>&lt;/span>&lt;span style="display:flex;">&lt;span>&amp;lt;configuration&amp;gt;
&lt;/span>&lt;/span>&lt;span style="display:flex;">&lt;span> &amp;lt;packageRestore&amp;gt;
&lt;/span>&lt;/span>&lt;span style="display:flex;">&lt;span> &amp;lt;add key=&lt;span style="color:#e6db74">&amp;#34;enabled&amp;#34;&lt;/span> value=&lt;span style="color:#e6db74">&amp;#34;True&amp;#34;&lt;/span> /&amp;gt;
&lt;/span>&lt;/span>&lt;span style="display:flex;">&lt;span> &amp;lt;add key=&lt;span style="color:#e6db74">&amp;#34;automatic&amp;#34;&lt;/span> value=&lt;span style="color:#e6db74">&amp;#34;True&amp;#34;&lt;/span> /&amp;gt;
&lt;/span>&lt;/span>&lt;span style="display:flex;">&lt;span> &amp;lt;/packageRestore&amp;gt;
&lt;/span>&lt;/span>&lt;span style="display:flex;">&lt;span> &amp;lt;activePackageSource&amp;gt;
&lt;/span>&lt;/span>&lt;span style="display:flex;">&lt;span> &amp;lt;add key=&lt;span style="color:#e6db74">&amp;#34;nuget.org&amp;#34;&lt;/span> value=&lt;span style="color:#e6db74">&amp;#34;https://api.nuget.org/v3/index.json&amp;#34;&lt;/span> /&amp;gt;
&lt;/span>&lt;/span>&lt;span style="display:flex;">&lt;span> &amp;lt;add key=&lt;span style="color:#e6db74">&amp;#34;ProximoAzDo&amp;#34;&lt;/span> value=&lt;span style="color:#e6db74">&amp;#34;https://pkgs.dev.azure.com/xxx/_packaging/yyy@Local/nuget/v3/index.json&amp;#34;&lt;/span> /&amp;gt;
&lt;/span>&lt;/span>&lt;span style="display:flex;">&lt;span> &amp;lt;/activePackageSource&amp;gt;
&lt;/span>&lt;/span>&lt;span style="display:flex;">&lt;span> &amp;lt;packageSources&amp;gt;
&lt;/span>&lt;/span>&lt;span style="display:flex;">&lt;span> &amp;lt;add key=&lt;span style="color:#e6db74">&amp;#34;nuget.org&amp;#34;&lt;/span> value=&lt;span style="color:#e6db74">&amp;#34;https://api.nuget.org/v3/index.json&amp;#34;&lt;/span> /&amp;gt;
&lt;/span>&lt;/span>&lt;span style="display:flex;">&lt;span> &amp;lt;add key=&lt;span style="color:#e6db74">&amp;#34;ProximoAzDo&amp;#34;&lt;/span> value=&lt;span style="color:#e6db74">&amp;#34;https://pkgs.dev.azure.com/xxx/_packaging/yyy@Local/nuget/v3/index.json&amp;#34;&lt;/span> /&amp;gt;
&lt;/span>&lt;/span>&lt;span style="display:flex;">&lt;span> &amp;lt;/packageSources&amp;gt;
&lt;/span>&lt;/span>&lt;span style="display:flex;">&lt;span>
&lt;/span>&lt;/span>&lt;span style="display:flex;">&lt;span> &amp;lt;packageSourceMapping&amp;gt;
&lt;/span>&lt;/span>&lt;span style="display:flex;">&lt;span> &amp;lt;packageSource key=&lt;span style="color:#e6db74">&amp;#34;nuget.org&amp;#34;&lt;/span>&amp;gt;
&lt;/span>&lt;/span>&lt;span style="display:flex;">&lt;span> &amp;lt;package pattern=&lt;span style="color:#e6db74">&amp;#34;*&amp;#34;&lt;/span> /&amp;gt;
&lt;/span>&lt;/span>&lt;span style="display:flex;">&lt;span> &amp;lt;/packageSource&amp;gt;
&lt;/span>&lt;/span>&lt;span style="display:flex;">&lt;span> &amp;lt;packageSource key=&lt;span style="color:#e6db74">&amp;#34;ProximoAzDo&amp;#34;&lt;/span>&amp;gt;
&lt;/span>&lt;/span>&lt;span style="display:flex;">&lt;span> &amp;lt;package pattern=&lt;span style="color:#e6db74">&amp;#34;Jarvis*&amp;#34;&lt;/span> /&amp;gt;
&lt;/span>&lt;/span>&lt;span style="display:flex;">&lt;span> &amp;lt;package pattern=&lt;span style="color:#e6db74">&amp;#34;Proximo*&amp;#34;&lt;/span> /&amp;gt;
&lt;/span>&lt;/span>&lt;span style="display:flex;">&lt;span> &amp;lt;/packageSource&amp;gt;
&lt;/span>&lt;/span>&lt;span style="display:flex;">&lt;span> &amp;lt;/packageSourceMapping&amp;gt;
&lt;/span>&lt;/span>&lt;span style="display:flex;">&lt;span>&amp;lt;/configuration&amp;gt;&lt;/span>&lt;/span>&lt;/code>&lt;/pre>&lt;/td>&lt;/tr>&lt;/table>
&lt;/div>
&lt;/div>
&lt;p>As you can see, I have &lt;strong>two different feeds: one is nuget.org and the other is a private feed hosted in Azure DevOps&lt;/strong>. For the ProximoAzDo feed I can simply specify with wildcards the names of the packages that are to be taken from that specific feed, while nuget.org has the generic * pattern and is used for anything else.&lt;/p></description></item><item><title>GitHub action with ElasticSearch integration tests and SonarCloud</title><link>https://www.codewrecks.com/post/github/action-with-elastic-and-sonarcloud/</link><pubDate>Tue, 23 Apr 2024 08:00:00 +0200</pubDate><guid>https://www.codewrecks.com/post/github/action-with-elastic-and-sonarcloud/</guid><description>&lt;p>I blogged in the past on how you can analyze your code with SonarCloud &lt;a href="https://www.codewrecks.com/post/github/github-sonarcloud-codecoverage/">in a GitHub action&lt;/a>. Things changed a little in the last year, but in this post I want to examine a different aspect: &lt;strong>running tests that rely on an external service, like ElasticSearch&lt;/strong>. Running integration tests in a GH action is a little more complex than running unit tests, because you need to &lt;strong>set up the environment, but thanks to Docker this can usually be done with little effort&lt;/strong>.&lt;/p></description></item><item><title>Pill: Create an environment in an AzDo pipeline</title><link>https://www.codewrecks.com/post/azdo/pills/create-environment-on-pipeline/</link><pubDate>Tue, 19 Mar 2024 08:10:42 +0200</pubDate><guid>https://www.codewrecks.com/post/azdo/pills/create-environment-on-pipeline/</guid><description>&lt;p>Scenario: We have to create a new environment for a new customer, and an environment consists of some resources on Azure, plus &lt;strong>an environment in Azure DevOps to use with the deploy pipeline&lt;/strong>. Since we are deploying with an Azure DevOps pipeline, it makes sense to create everything for the new customer environment with another pipeline.&lt;/p>
&lt;div class="highlight">&lt;div style="color:#f8f8f2;background-color:#272822;-moz-tab-size:4;-o-tab-size:4;tab-size:4;display:grid;">
&lt;table style="border-spacing:0;padding:0;margin:0;border:0;">&lt;tr>&lt;td style="vertical-align:top;padding:0;margin:0;border:0;">
&lt;pre tabindex="0" style="color:#f8f8f2;background-color:#272822;-moz-tab-size:4;-o-tab-size:4;tab-size:4;display:grid;">&lt;code>&lt;span style="white-space:pre;-webkit-user-select:none;user-select:none;margin-right:0.4em;padding:0 0.4em 0 0.4em;color:#7f7f7f"> 1
&lt;/span>&lt;span style="white-space:pre;-webkit-user-select:none;user-select:none;margin-right:0.4em;padding:0 0.4em 0 0.4em;color:#7f7f7f"> 2
&lt;/span>&lt;span style="white-space:pre;-webkit-user-select:none;user-select:none;margin-right:0.4em;padding:0 0.4em 0 0.4em;color:#7f7f7f"> 3
&lt;/span>&lt;span style="white-space:pre;-webkit-user-select:none;user-select:none;margin-right:0.4em;padding:0 0.4em 0 0.4em;color:#7f7f7f"> 4
&lt;/span>&lt;span style="white-space:pre;-webkit-user-select:none;user-select:none;margin-right:0.4em;padding:0 0.4em 0 0.4em;color:#7f7f7f"> 5
&lt;/span>&lt;span style="white-space:pre;-webkit-user-select:none;user-select:none;margin-right:0.4em;padding:0 0.4em 0 0.4em;color:#7f7f7f"> 6
&lt;/span>&lt;span style="white-space:pre;-webkit-user-select:none;user-select:none;margin-right:0.4em;padding:0 0.4em 0 0.4em;color:#7f7f7f"> 7
&lt;/span>&lt;span style="white-space:pre;-webkit-user-select:none;user-select:none;margin-right:0.4em;padding:0 0.4em 0 0.4em;color:#7f7f7f"> 8
&lt;/span>&lt;span style="white-space:pre;-webkit-user-select:none;user-select:none;margin-right:0.4em;padding:0 0.4em 0 0.4em;color:#7f7f7f"> 9
&lt;/span>&lt;span style="white-space:pre;-webkit-user-select:none;user-select:none;margin-right:0.4em;padding:0 0.4em 0 0.4em;color:#7f7f7f">10
&lt;/span>&lt;span style="white-space:pre;-webkit-user-select:none;user-select:none;margin-right:0.4em;padding:0 0.4em 0 0.4em;color:#7f7f7f">11
&lt;/span>&lt;span style="white-space:pre;-webkit-user-select:none;user-select:none;margin-right:0.4em;padding:0 0.4em 0 0.4em;color:#7f7f7f">12
&lt;/span>&lt;span style="white-space:pre;-webkit-user-select:none;user-select:none;margin-right:0.4em;padding:0 0.4em 0 0.4em;color:#7f7f7f">13
&lt;/span>&lt;span style="white-space:pre;-webkit-user-select:none;user-select:none;margin-right:0.4em;padding:0 0.4em 0 0.4em;color:#7f7f7f">14
&lt;/span>&lt;span style="white-space:pre;-webkit-user-select:none;user-select:none;margin-right:0.4em;padding:0 0.4em 0 0.4em;color:#7f7f7f">15
&lt;/span>&lt;span style="white-space:pre;-webkit-user-select:none;user-select:none;margin-right:0.4em;padding:0 0.4em 0 0.4em;color:#7f7f7f">16
&lt;/span>&lt;span style="white-space:pre;-webkit-user-select:none;user-select:none;margin-right:0.4em;padding:0 0.4em 0 0.4em;color:#7f7f7f">17
&lt;/span>&lt;span style="white-space:pre;-webkit-user-select:none;user-select:none;margin-right:0.4em;padding:0 0.4em 0 0.4em;color:#7f7f7f">18
&lt;/span>&lt;span style="white-space:pre;-webkit-user-select:none;user-select:none;margin-right:0.4em;padding:0 0.4em 0 0.4em;color:#7f7f7f">19
&lt;/span>&lt;span style="white-space:pre;-webkit-user-select:none;user-select:none;margin-right:0.4em;padding:0 0.4em 0 0.4em;color:#7f7f7f">20
&lt;/span>&lt;span style="background-color:#3c3d38">&lt;span style="white-space:pre;-webkit-user-select:none;user-select:none;margin-right:0.4em;padding:0 0.4em 0 0.4em;color:#7f7f7f">21
&lt;/span>&lt;/span>&lt;span style="background-color:#3c3d38">&lt;span style="white-space:pre;-webkit-user-select:none;user-select:none;margin-right:0.4em;padding:0 0.4em 0 0.4em;color:#7f7f7f">22
&lt;/span>&lt;/span>&lt;span style="background-color:#3c3d38">&lt;span style="white-space:pre;-webkit-user-select:none;user-select:none;margin-right:0.4em;padding:0 0.4em 0 0.4em;color:#7f7f7f">23
&lt;/span>&lt;/span>&lt;span style="background-color:#3c3d38">&lt;span style="white-space:pre;-webkit-user-select:none;user-select:none;margin-right:0.4em;padding:0 0.4em 0 0.4em;color:#7f7f7f">24
&lt;/span>&lt;/span>&lt;span style="background-color:#3c3d38">&lt;span style="white-space:pre;-webkit-user-select:none;user-select:none;margin-right:0.4em;padding:0 0.4em 0 0.4em;color:#7f7f7f">25
&lt;/span>&lt;/span>&lt;span style="background-color:#3c3d38">&lt;span style="white-space:pre;-webkit-user-select:none;user-select:none;margin-right:0.4em;padding:0 0.4em 0 0.4em;color:#7f7f7f">26
&lt;/span>&lt;/span>&lt;span style="background-color:#3c3d38">&lt;span style="white-space:pre;-webkit-user-select:none;user-select:none;margin-right:0.4em;padding:0 0.4em 0 0.4em;color:#7f7f7f">27
&lt;/span>&lt;/span>&lt;span style="background-color:#3c3d38">&lt;span style="white-space:pre;-webkit-user-select:none;user-select:none;margin-right:0.4em;padding:0 0.4em 0 0.4em;color:#7f7f7f">28
&lt;/span>&lt;/span>&lt;span style="white-space:pre;-webkit-user-select:none;user-select:none;margin-right:0.4em;padding:0 0.4em 0 0.4em;color:#7f7f7f">29
&lt;/span>&lt;span style="white-space:pre;-webkit-user-select:none;user-select:none;margin-right:0.4em;padding:0 0.4em 0 0.4em;color:#7f7f7f">30
&lt;/span>&lt;span style="white-space:pre;-webkit-user-select:none;user-select:none;margin-right:0.4em;padding:0 0.4em 0 0.4em;color:#7f7f7f">31
&lt;/span>&lt;span style="white-space:pre;-webkit-user-select:none;user-select:none;margin-right:0.4em;padding:0 0.4em 0 0.4em;color:#7f7f7f">32
&lt;/span>&lt;span style="white-space:pre;-webkit-user-select:none;user-select:none;margin-right:0.4em;padding:0 0.4em 0 0.4em;color:#7f7f7f">33
&lt;/span>&lt;span style="white-space:pre;-webkit-user-select:none;user-select:none;margin-right:0.4em;padding:0 0.4em 0 0.4em;color:#7f7f7f">34
&lt;/span>&lt;/code>&lt;/pre>&lt;/td>
&lt;td style="vertical-align:top;padding:0;margin:0;border:0;;width:100%">
&lt;pre tabindex="0" style="color:#f8f8f2;background-color:#272822;-moz-tab-size:4;-o-tab-size:4;tab-size:4;display:grid;">&lt;code class="language-yaml" data-lang="yaml">&lt;span style="display:flex;">&lt;span>&lt;span style="color:#f92672">stages&lt;/span>: 
&lt;/span>&lt;/span>&lt;span style="display:flex;">&lt;span> 
&lt;/span>&lt;/span>&lt;span style="display:flex;">&lt;span> - &lt;span style="color:#f92672">stage&lt;/span>: &lt;span style="color:#ae81ff">create_environment&lt;/span>
&lt;/span>&lt;/span>&lt;span style="display:flex;">&lt;span> &lt;span style="color:#f92672">jobs&lt;/span>:
&lt;/span>&lt;/span>&lt;span style="display:flex;">&lt;span> - &lt;span style="color:#f92672">job&lt;/span>: &lt;span style="color:#ae81ff">create_environment&lt;/span>
&lt;/span>&lt;/span>&lt;span style="display:flex;">&lt;span> &lt;span style="color:#f92672">displayName&lt;/span>: &lt;span style="color:#e6db74">&amp;#34;Create environment if not present&amp;#34;&lt;/span>
&lt;/span>&lt;/span>&lt;span style="display:flex;">&lt;span> &lt;span style="color:#f92672">pool&lt;/span>:
&lt;/span>&lt;/span>&lt;span style="display:flex;">&lt;span> &lt;span style="color:#f92672">vmImage&lt;/span>: &lt;span style="color:#ae81ff">windows-latest&lt;/span>
&lt;/span>&lt;/span>&lt;span style="display:flex;">&lt;span> &lt;span style="color:#f92672">steps&lt;/span>:
&lt;/span>&lt;/span>&lt;span style="display:flex;">&lt;span> - &lt;span style="color:#f92672">powershell&lt;/span>: |&lt;span style="color:#e6db74">
&lt;/span>&lt;/span>&lt;/span>&lt;span style="display:flex;">&lt;span>&lt;span style="color:#e6db74"> write-Host &amp;#34;We are about to create the environment with api if not present&amp;#34;
&lt;/span>&lt;/span>&lt;/span>&lt;span style="display:flex;">&lt;span>&lt;span style="color:#e6db74"> 
&lt;/span>&lt;/span>&lt;/span>&lt;span style="display:flex;">&lt;span>&lt;span style="color:#e6db74"> # Need to create the token in basic auth
&lt;/span>&lt;/span>&lt;/span>&lt;span style="display:flex;">&lt;span>&lt;span style="color:#e6db74"> $AuthHeaders = @{
&lt;/span>&lt;/span>&lt;/span>&lt;span style="display:flex;">&lt;span>&lt;span style="color:#e6db74"> &amp;#34;Authorization&amp;#34; = &amp;#39;Basic &amp;#39; + [Convert]::ToBase64String([Text.Encoding]::ASCII.GetBytes(&amp;#34;:$(AccessToken)&amp;#34;)) 
&lt;/span>&lt;/span>&lt;/span>&lt;span style="display:flex;">&lt;span>&lt;span style="color:#e6db74"> &amp;#34;Content-Type&amp;#34; = &amp;#34;application/json&amp;#34;
&lt;/span>&lt;/span>&lt;/span>&lt;span style="display:flex;">&lt;span>&lt;span style="color:#e6db74"> }
&lt;/span>&lt;/span>&lt;/span>&lt;span style="display:flex;">&lt;span>&lt;span style="color:#e6db74"> $listEnvironment = Invoke-RestMethod -uri &amp;#34;https://dev.azure.com/org/teamproject/_apis/distributedtask/environments?api-version=7.1-preview.1&amp;#34; -Headers $AuthHeaders
&lt;/span>&lt;/span>&lt;/span>&lt;span style="display:flex;">&lt;span>&lt;span style="color:#e6db74">
&lt;/span>&lt;/span>&lt;/span>&lt;span style="display:flex;">&lt;span>&lt;span style="color:#e6db74">          # Now check if the environment is already present in the list.
&lt;/span>&lt;/span>&lt;/span>&lt;span style="display:flex; background-color:#3c3d38">&lt;span>&lt;span style="color:#e6db74"> $envExisting = $listEnvironment.value | Where-Object { $_.name -eq &amp;#34;$(customer_fullname)&amp;#34; }
&lt;/span>&lt;/span>&lt;/span>&lt;span style="display:flex; background-color:#3c3d38">&lt;span>&lt;span style="color:#e6db74">
&lt;/span>&lt;/span>&lt;/span>&lt;span style="display:flex; background-color:#3c3d38">&lt;span>&lt;span style="color:#e6db74"> if ($envExisting) {
&lt;/span>&lt;/span>&lt;/span>&lt;span style="display:flex; background-color:#3c3d38">&lt;span>&lt;span style="color:#e6db74"> Write-Host &amp;#34;The environment already exists&amp;#34;
&lt;/span>&lt;/span>&lt;/span>&lt;span style="display:flex; background-color:#3c3d38">&lt;span>&lt;span style="color:#e6db74"> exit 0
&lt;/span>&lt;/span>&lt;/span>&lt;span style="display:flex; background-color:#3c3d38">&lt;span>&lt;span style="color:#e6db74"> } 
&lt;/span>&lt;/span>&lt;/span>&lt;span style="display:flex; background-color:#3c3d38">&lt;span>&lt;span style="color:#e6db74"> 
&lt;/span>&lt;/span>&lt;/span>&lt;span style="display:flex; background-color:#3c3d38">&lt;span>&lt;span style="color:#e6db74">          # Create an environment because it does not exist
&lt;/span>&lt;/span>&lt;/span>&lt;span style="display:flex;">&lt;span>&lt;span style="color:#e6db74"> $createEnvPayload = @{
&lt;/span>&lt;/span>&lt;/span>&lt;span style="display:flex;">&lt;span>&lt;span style="color:#e6db74"> name = &amp;#34;$(customer_fullname)&amp;#34;
&lt;/span>&lt;/span>&lt;/span>&lt;span style="display:flex;">&lt;span>&lt;span style="color:#e6db74"> description = &amp;#34;Created by pipeline&amp;#34;
&lt;/span>&lt;/span>&lt;/span>&lt;span style="display:flex;">&lt;span>&lt;span style="color:#e6db74"> } | ConvertTo-Json
&lt;/span>&lt;/span>&lt;/span>&lt;span style="display:flex;">&lt;span>&lt;span style="color:#e6db74"> $createUri = &amp;#34;https://dev.azure.com/org/teamproject/_apis/distributedtask/environments?api-version=7.1-preview.1&amp;#34;;
&lt;/span>&lt;/span>&lt;/span>&lt;span style="display:flex;">&lt;span>&lt;span style="color:#e6db74"> Invoke-RestMethod -uri $createUri -method POST -Headers $AuthHeaders -Body $createEnvPayload&lt;/span>&lt;/span>&lt;/span>&lt;/code>&lt;/pre>&lt;/td>&lt;/tr>&lt;/table>
&lt;/div>
&lt;/div>
&lt;p>As you can see, the stage contains a job that is used to create an environment. This is based on a secret contained in the pipeline called &lt;code>AccessToken&lt;/code> that is &lt;strong>used to authenticate to the Azure DevOps REST API&lt;/strong>. The script is quite simple: it first retrieves the list of environments and then checks if the environment is already present. If it is not, it creates a new environment with a simple POST request to the Azure DevOps REST API.&lt;/p></description></item><item><title>Pills: Enhancing Azure DevOps WorkItems with Hyperlinking to External Documentation</title><link>https://www.codewrecks.com/post/azdo/pills/use-hyperlink-link-type/</link><pubDate>Tue, 12 Mar 2024 08:12:42 +0200</pubDate><guid>https://www.codewrecks.com/post/azdo/pills/use-hyperlink-link-type/</guid><description>&lt;p>A frequently overlooked feature that can significantly enhance functionality in Azure DevOps is the &lt;strong>ability to attach links to a WorkItem&lt;/strong>. A common question among users is: &amp;ldquo;Can I manage documentation in tools like SharePoint and then easily link it to my project in Azure DevOps?&amp;rdquo; This question arises because there&amp;rsquo;s often a limitation on how much text can be written directly into a WorkItem, and it&amp;rsquo;s convenient to attach documentation.&lt;/p></description></item><item><title>Pills: What to do when dotnet restore fails with 401 against an internal feed</title><link>https://www.codewrecks.com/post/azdo/pills/problem-using-internal-nuget-feed-by-powershell/</link><pubDate>Fri, 23 Feb 2024 08:10:42 +0200</pubDate><guid>https://www.codewrecks.com/post/azdo/pills/problem-using-internal-nuget-feed-by-powershell/</guid><description>&lt;p>This is a topic I&amp;rsquo;ve already discussed in the past in &lt;a href="https://www.codewrecks.com/post/azdo/pipeline/nuget-feed-authenticate/">a post about NuGet authentication&lt;/a>. 
A couple of days ago, in a project I&amp;rsquo;m working on, the service started to return a 401 even with the technique described in the aforementioned post.&lt;/p>
&lt;p>The symptom is this error in the script that executes the &lt;code>dotnet restore&lt;/code> command.&lt;/p>
&lt;div class="highlight">&lt;pre tabindex="0" style="color:#f8f8f2;background-color:#272822;-moz-tab-size:4;-o-tab-size:4;tab-size:4;">&lt;code class="language-plaintext" data-lang="plaintext">&lt;span style="display:flex;">&lt;span> Unable to load the service index for source https://pkgs.dev.azure.com/organizaion/_packaging/FeedName@Local/nuget/v3/index.json.
&lt;/span>&lt;/span>&lt;span style="display:flex;">&lt;span> Response status code does not indicate success: 401
&lt;/span>&lt;/span>&lt;/code>&lt;/pre>&lt;/div>&lt;p>In such a situation, here is what you can do to try to solve the problem.&lt;/p></description></item><item><title>Azure Devops Api - Update list of allowed values for Custom Fields</title><link>https://www.codewrecks.com/post/azdo/api/manage-custom-field-with-api/</link><pubDate>Thu, 22 Feb 2024 07:12:42 +0200</pubDate><guid>https://www.codewrecks.com/post/azdo/api/manage-custom-field-with-api/</guid><description>&lt;p>The ability to &lt;strong>customize the process of Azure DevOps&lt;/strong> is one of the most powerful features of the platform. Usually you &lt;strong>add custom fields to work items&lt;/strong> to track information related to your own process and organization. One of the most common questions I get is:&lt;/p>
&lt;blockquote>
&lt;p>How can I create a field that allows for a series of values taken from a database of mine?&lt;/p></description></item><item><title>Pills: Exploring Agent Options in Azure DevOps Pipelines: Managed vs. Self-Hosted</title><link>https://www.codewrecks.com/post/azdo/pills/do-i-need-to-deploy-my-agents/</link><pubDate>Thu, 01 Feb 2024 08:10:42 +0200</pubDate><guid>https://www.codewrecks.com/post/azdo/pills/do-i-need-to-deploy-my-agents/</guid><description>&lt;p>When configuring Azure DevOps pipelines, developers have a choice to make regarding the execution environment for their pipelines: they can either leverage &lt;strong>Microsoft-managed agents provided in Azure or opt to self-host agents on their own infrastructure&lt;/strong>, whether that be on-premises virtual machines or cloud-based instances. One of the first questions that arises is: which one should I use for my organization? Let&amp;rsquo;s explore the pros and cons of each option.&lt;/p></description></item><item><title>Pills: Do not miss repository policies in Azure DevOps</title><link>https://www.codewrecks.com/post/azdo/pills/repository-policies-for-branches/</link><pubDate>Fri, 19 Jan 2024 08:10:42 +0200</pubDate><guid>https://www.codewrecks.com/post/azdo/pills/repository-policies-for-branches/</guid><description>&lt;p>If you use Azure DevOps, it&amp;rsquo;s worth &lt;strong>checking in the repository settings page all the settings related to the policies of the repository itself&lt;/strong>. This is because this type of setting is often completely ignored, and you lose the opportunity to have very important controls on the repository itself.&lt;/p>
&lt;p>As you can see in Figure 1, there are many interesting policies that can help your team keep a nice and healthy repository.&lt;/p></description></item><item><title>Pill: Include files in your publish profile for C# projects</title><link>https://www.codewrecks.com/post/azdo/pills/msbuild-copy-file/</link><pubDate>Tue, 16 Jan 2024 08:00:00 +0200</pubDate><guid>https://www.codewrecks.com/post/azdo/pills/msbuild-copy-file/</guid><description>&lt;p>When publishing an ASP.NET Core web project, it&amp;rsquo;s often necessary to include certain files &lt;strong>external to the Visual Studio solution that are a logical part of the project&lt;/strong>. A typical example is the frontend build output from Angular projects. For web projects, it&amp;rsquo;s also common to include some static resources that might be outside of the web project, like images or files.&lt;/p>
&lt;p>At this point, we want the Azure DevOps pipeline to &lt;strong>correctly include all these external files in the final artifacts.&lt;/strong>&lt;/p></description></item><item><title>Pill: Enhancing DevOps with Automated Pull Requests</title><link>https://www.codewrecks.com/post/azdo/pills/automatic-pull-request-close/</link><pubDate>Fri, 12 Jan 2024 08:10:42 +0200</pubDate><guid>https://www.codewrecks.com/post/azdo/pills/automatic-pull-request-close/</guid><description>&lt;p>Pull requests are a cornerstone of collaborative software development, particularly with distributed version control systems like Git and platforms such as GitHub or Azure DevOps. However, managing pull requests can become cumbersome, particularly for &lt;strong>branches undergoing extensive modifications and receiving frequent feedback.&lt;/strong> This complexity is evident when preparing a pull request only to find it needs additional changes due to peer review, necessitating a complete retest of the code.&lt;/p>
&lt;p>Such workflows highlight the challenges in finalizing a branch. Developers often find themselves continuously switching to the branch in question to &lt;strong>change the code based on other members&amp;rsquo; feedback&lt;/strong>, re-test everything, signal to the other members that the PR is updated, then move to another branch and repeat. This is where a &lt;strong>comprehensive suite of automated tests, including end-to-end tests, becomes invaluable.&lt;/strong> These tests should cover the entire spectrum of the development process: building the solution, packaging artifacts, deploying to a test server, and conducting basic functionality checks.&lt;/p></description></item><item><title>Resolving .NET8 SDK Resolver Failure in Azure DevOps Pipelines</title><link>https://www.codewrecks.com/post/azdo/pills/strange-error-building-net8/</link><pubDate>Fri, 05 Jan 2024 07:10:42 +0200</pubDate><guid>https://www.codewrecks.com/post/azdo/pills/strange-error-building-net8/</guid><description>&lt;p>I encountered a problem with a simple pipeline designed for building a .NET Core project, which I had recently updated to .NET8. After updating the pipeline file to use the new version of the SDK, I faced an unexpected issue: &lt;strong>all builds started failing with this error&lt;/strong>.&lt;/p>
&lt;div class="highlight">&lt;pre tabindex="0" style="color:#f8f8f2;background-color:#272822;-moz-tab-size:4;-o-tab-size:4;tab-size:4;">&lt;code class="language-text" data-lang="text">&lt;span style="display:flex;">&lt;span>##[error]src\Intranet\Jarvis.Common.Shared\Jarvis.Common.Shared.csproj(0,0): Error MSB4242: SDK Resolver Failure: &amp;#34;The SDK resolver &amp;#39;Microsoft.DotNet.MSBuildSdkResolver&amp;#39; failed while attempting to resolve the SDK &amp;#39;Microsoft.NET.Sdk&amp;#39;. Exception: &amp;#39;Microsoft.NET.Sdk.WorkloadManifestReader.WorkloadManifestCompositionException: Manifest provider Microsoft.NET.Sdk.WorkloadManifestReader.SdkDirectoryWorkloadManifestProvider returned a duplicate manifest ID &amp;#39;16.4.8968-net8-rc2&amp;#39;.
&lt;/span>&lt;/span>&lt;/code>&lt;/pre>&lt;/div>&lt;p>I was really puzzled because I had the same SDK directory on my computer, and it was working without any issues, &lt;strong>but failing on that build server&lt;/strong>.&lt;/p></description></item><item><title>GitHub Secrets Scanning and Push Prevention</title><link>https://www.codewrecks.com/post/github/secrets-scanning/</link><pubDate>Thu, 04 Jan 2024 08:00:00 +0200</pubDate><guid>https://www.codewrecks.com/post/github/secrets-scanning/</guid><description>&lt;p>The risk of inadvertently including secrets in your Git repository has significantly increased in recent years. &lt;code>GitGuardian&lt;/code>, a company providing solutions to prevent secret leakage in repositories, reports astonishing numbers regarding the quantity of secrets leaked in Git repositories: &lt;a href="https://www.gitguardian.com/state-of-secrets-sprawl-report-2023">State of Secrets Sprawl Report 2023&lt;/a>&lt;/p>
&lt;p>&lt;a target="_blank" href="../images/secrets-some-numbers.png"> &lt;img src="../images/secrets-some-numbers.png" alt="More than 10 million secrets leaked, and the number raises every year" />&lt;/a>&lt;/p>
&lt;p>&lt;em>&lt;strong>Figure 1:&lt;/strong>&lt;/em> &lt;em>More than 10 million secrets leaked, and the number rises every year&lt;/em>&lt;/p></description></item><item><title>Allow easy source debugging for Nuget Packages and GitHub</title><link>https://www.codewrecks.com/post/github/allow-source-debugging-for-nuget-packages/</link><pubDate>Sun, 31 Dec 2023 08:00:00 +0200</pubDate><guid>https://www.codewrecks.com/post/github/allow-source-debugging-for-nuget-packages/</guid><description>&lt;p>In my previous blog posts, I&amp;rsquo;ve extensively discussed how to &lt;strong>publish symbol libraries for .NET in Azure DevOps / Team Foundation Server&lt;/strong>. Azure DevOps has supported symbol server functionalities for a considerable time, making it straightforward to add steps in your build process for indexing your source code. This capability enables you to &lt;strong>publish your .NET libraries to either an internal or public NuGet feed and facilitates stepping into the original source code for debugging directly within Visual Studio.&lt;/strong>&lt;/p></description></item><item><title>Always use rebase when you pull in Git</title><link>https://www.codewrecks.com/post/github/git-pull-rebase/</link><pubDate>Sun, 31 Dec 2023 06:00:00 +0000</pubDate><guid>https://www.codewrecks.com/post/github/git-pull-rebase/</guid><description>&lt;p>I have always suggested that people use only rebase when pulling changes from the branch they are working on in Git, because using merge will make your repository history a mess.&lt;/p>
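&lt;p>As a minimal sketch of how to do this (these are standard Git commands, not taken from the original post), you can make rebase the default behavior of &lt;code>git pull&lt;/code> with a single configuration switch:&lt;/p>

```shell
# Make "git pull" rebase local commits on top of the fetched branch
# instead of creating a merge commit. --global applies to every
# repository for the current user; drop it to set it per-repository.
git config --global pull.rebase true

# Verify the setting
git config --get pull.rebase
```

&lt;p>With this setting, a plain &lt;code>git pull&lt;/code> behaves like &lt;code>git pull --rebase&lt;/code>, keeping the history linear.&lt;/p>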
&lt;p>For many years this kind of suggestion was not so common. I&amp;rsquo;ve watched teams happily use merge and then complain about how difficult it is to read the history of the repository. At a certain point, Git added a nice option that changes the default strategy used to reconcile modifications when you pull: instead of using a standard merge, you can configure it to use rebase.&lt;/p></description></item><item><title>Streamlining Cloud Deployment: Azure DevOps and AWS Integration Strategies</title><link>https://www.codewrecks.com/post/azdo/pipeline/deploy-in-s3-in-aws/</link><pubDate>Thu, 14 Dec 2023 08:50:42 +0000</pubDate><guid>https://www.codewrecks.com/post/azdo/pipeline/deploy-in-s3-in-aws/</guid><description>&lt;p>Let&amp;rsquo;s assume we need to deploy in a cloud environment and prefer &lt;strong>not to install an agent on each physical environment&lt;/strong>. For example, managing numerous agents across multiple virtual machines becomes cumbersome from an Azure DevOps standpoint. While we can surely create an environment for each distinct installation, this usually creates some burden in administering the agents.&lt;/p>
&lt;p>In such a scenario, the optimal approach is to &lt;strong>create simple PowerShell or Bash installation scripts that will be distributed along with the pipeline artifacts&lt;/strong>. These scripts will simply deploy the artifacts on the local machine without the need for an agent. The only challenge that remains is how to transfer the pipeline artifacts and scripts to the target machines.&lt;/p></description></item><item><title>Running GitVersion in Azure DevOps pipeline with dotnet tool</title><link>https://www.codewrecks.com/post/azdo/pipeline/gitversion-powershell/</link><pubDate>Mon, 11 Dec 2023 10:00:42 +0200</pubDate><guid>https://www.codewrecks.com/post/azdo/pipeline/gitversion-powershell/</guid><description>&lt;p>For me, running GitVersion as part of a pipeline is a gold standard. I barely &lt;strong>remember a pipeline that does not use GitVersion&lt;/strong> as its first task. The reason is simple: it allows me, at least, to give better names to builds. Instead of having a meaningless date-based number, I have a semantic build number &lt;strong>that immediately gives me the idea of what was built&lt;/strong>.&lt;/p>
&lt;blockquote>
&lt;p>At the very least GitVersion can give a better name to a build, so why not use it?&lt;/p></description></item><item><title>Introductory video playlist about Semantic Kernel RC</title><link>https://www.codewrecks.com/post/ai/introduction-to-semantic-kernel-rc/</link><pubDate>Mon, 11 Dec 2023 06:00:00 +0000</pubDate><guid>https://www.codewrecks.com/post/ai/introduction-to-semantic-kernel-rc/</guid><description>&lt;p>Semantic Kernel reached Release Candidate, and there are some breaking changes from the latest beta. For this reason I re-recorded a series of videos I had planned a couple of weeks ago, with all examples updated to RC3.&lt;/p>
&lt;p>Here is the list on my YouTube channel: &lt;a href="https://www.youtube.com/playlist?list=PLn9t_BnhwY0Ic-0IdTAaQNEwoIdNveYHy">Semantic Kernel Playlist&lt;/a>&lt;/p>
&lt;ul>
&lt;li>&lt;a href="https://youtu.be/ZyMnw3ryJJ0">Call python function from C#&lt;/a> - How you can call python code from C# to simplify interaction with LLM or other Models&lt;/li>
&lt;li>&lt;a href="https://youtu.be/n8In5rrrodA">Using new Handlebar templates&lt;/a> - How to interact with a LLM (GPT in this example) with classic chat mode&lt;/li>
&lt;li>&lt;a href="https://youtu.be/exu7i8qXkW0">Create plugins in C#&lt;/a> - How to Create plugin in C# and call directly through kernel object&lt;/li>
&lt;li>&lt;a href="https://youtu.be/Vc8dtwxcXbI">AutoPluginInvoker capabilities&lt;/a> - Let Semantic Kernel automatically invoke plugin if needed during your chat interaction with GPT&lt;/li>
&lt;/ul>
&lt;p>All the videos are on my YouTube channel.&lt;/p></description></item><item><title>Azure DevOps: Checkout specific branch to avoid gitversion errors in pipeline</title><link>https://www.codewrecks.com/post/azdo/pipeline/checkout-code-in-build-pipeline/</link><pubDate>Thu, 16 Nov 2023 00:00:00 +0000</pubDate><guid>https://www.codewrecks.com/post/azdo/pipeline/checkout-code-in-build-pipeline/</guid><description>&lt;p>If you create a new Azure DevOps pipeline that includes running GitVersion, sometimes you may encounter an error like the following:&lt;/p>
&lt;div class="highlight">&lt;div style="color:#f8f8f2;background-color:#272822;-moz-tab-size:4;-o-tab-size:4;tab-size:4;">
&lt;table style="border-spacing:0;padding:0;margin:0;border:0;">&lt;tr>&lt;td style="vertical-align:top;padding:0;margin:0;border:0;">
&lt;pre tabindex="0" style="color:#f8f8f2;background-color:#272822;-moz-tab-size:4;-o-tab-size:4;tab-size:4;">&lt;code>&lt;span style="white-space:pre;-webkit-user-select:none;user-select:none;margin-right:0.4em;padding:0 0.4em 0 0.4em;color:#7f7f7f"> 1
&lt;/span>&lt;span style="white-space:pre;-webkit-user-select:none;user-select:none;margin-right:0.4em;padding:0 0.4em 0 0.4em;color:#7f7f7f"> 2
&lt;/span>&lt;span style="white-space:pre;-webkit-user-select:none;user-select:none;margin-right:0.4em;padding:0 0.4em 0 0.4em;color:#7f7f7f"> 3
&lt;/span>&lt;span style="white-space:pre;-webkit-user-select:none;user-select:none;margin-right:0.4em;padding:0 0.4em 0 0.4em;color:#7f7f7f"> 4
&lt;/span>&lt;span style="white-space:pre;-webkit-user-select:none;user-select:none;margin-right:0.4em;padding:0 0.4em 0 0.4em;color:#7f7f7f"> 5
&lt;/span>&lt;span style="white-space:pre;-webkit-user-select:none;user-select:none;margin-right:0.4em;padding:0 0.4em 0 0.4em;color:#7f7f7f"> 6
&lt;/span>&lt;span style="white-space:pre;-webkit-user-select:none;user-select:none;margin-right:0.4em;padding:0 0.4em 0 0.4em;color:#7f7f7f"> 7
&lt;/span>&lt;span style="white-space:pre;-webkit-user-select:none;user-select:none;margin-right:0.4em;padding:0 0.4em 0 0.4em;color:#7f7f7f"> 8
&lt;/span>&lt;span style="white-space:pre;-webkit-user-select:none;user-select:none;margin-right:0.4em;padding:0 0.4em 0 0.4em;color:#7f7f7f"> 9
&lt;/span>&lt;span style="white-space:pre;-webkit-user-select:none;user-select:none;margin-right:0.4em;padding:0 0.4em 0 0.4em;color:#7f7f7f">10
&lt;/span>&lt;span style="white-space:pre;-webkit-user-select:none;user-select:none;margin-right:0.4em;padding:0 0.4em 0 0.4em;color:#7f7f7f">11
&lt;/span>&lt;span style="white-space:pre;-webkit-user-select:none;user-select:none;margin-right:0.4em;padding:0 0.4em 0 0.4em;color:#7f7f7f">12
&lt;/span>&lt;span style="white-space:pre;-webkit-user-select:none;user-select:none;margin-right:0.4em;padding:0 0.4em 0 0.4em;color:#7f7f7f">13
&lt;/span>&lt;span style="white-space:pre;-webkit-user-select:none;user-select:none;margin-right:0.4em;padding:0 0.4em 0 0.4em;color:#7f7f7f">14
&lt;/span>&lt;span style="white-space:pre;-webkit-user-select:none;user-select:none;margin-right:0.4em;padding:0 0.4em 0 0.4em;color:#7f7f7f">15
&lt;/span>&lt;span style="white-space:pre;-webkit-user-select:none;user-select:none;margin-right:0.4em;padding:0 0.4em 0 0.4em;color:#7f7f7f">16
&lt;/span>&lt;span style="white-space:pre;-webkit-user-select:none;user-select:none;margin-right:0.4em;padding:0 0.4em 0 0.4em;color:#7f7f7f">17
&lt;/span>&lt;span style="white-space:pre;-webkit-user-select:none;user-select:none;margin-right:0.4em;padding:0 0.4em 0 0.4em;color:#7f7f7f">18
&lt;/span>&lt;span style="white-space:pre;-webkit-user-select:none;user-select:none;margin-right:0.4em;padding:0 0.4em 0 0.4em;color:#7f7f7f">19
&lt;/span>&lt;span style="white-space:pre;-webkit-user-select:none;user-select:none;margin-right:0.4em;padding:0 0.4em 0 0.4em;color:#7f7f7f">20
&lt;/span>&lt;span style="white-space:pre;-webkit-user-select:none;user-select:none;margin-right:0.4em;padding:0 0.4em 0 0.4em;color:#7f7f7f">21
&lt;/span>&lt;span style="white-space:pre;-webkit-user-select:none;user-select:none;margin-right:0.4em;padding:0 0.4em 0 0.4em;color:#7f7f7f">22
&lt;/span>&lt;span style="white-space:pre;-webkit-user-select:none;user-select:none;margin-right:0.4em;padding:0 0.4em 0 0.4em;color:#7f7f7f">23
&lt;/span>&lt;span style="white-space:pre;-webkit-user-select:none;user-select:none;margin-right:0.4em;padding:0 0.4em 0 0.4em;color:#7f7f7f">24
&lt;/span>&lt;/code>&lt;/pre>&lt;/td>
&lt;td style="vertical-align:top;padding:0;margin:0;border:0;;width:100%">
&lt;pre tabindex="0" style="color:#f8f8f2;background-color:#272822;-moz-tab-size:4;-o-tab-size:4;tab-size:4;">&lt;code class="language-text" data-lang="text">&lt;span style="display:flex;">&lt;span>INFO [11/15/23 19:00:57:95] Begin: Calculating base versions 
&lt;/span>&lt;/span>&lt;span style="display:flex;">&lt;span>INFO [11/15/23 19:00:57:96] Begin: Attempting to inherit branch configuration from parent branch 
&lt;/span>&lt;/span>&lt;span style="display:flex;">&lt;span>INFO [11/15/23 19:00:57:97] End: Attempting to inherit branch configuration from parent branch (Took: 3.55ms) 
&lt;/span>&lt;/span>&lt;span style="display:flex;">&lt;span>INFO [11/15/23 19:00:57:97] End: Calculating base versions (Took: 12.03ms) 
&lt;/span>&lt;/span>&lt;span style="display:flex;">&lt;span>ERROR [11/15/23 19:00:57:99] An unexpected error occurred: 
&lt;/span>&lt;/span>&lt;span style="display:flex;">&lt;span>System.NullReferenceException: Object reference not set to an instance of an object. 
&lt;/span>&lt;/span>&lt;span style="display:flex;">&lt;span>at LibGit2Sharp.Core.Handles.ObjectHandle.op_Implicit(ObjectHandle handle) in 
&lt;/span>&lt;/span>&lt;span style="display:flex;">&lt;span>/_/LibGit2Sharp/Core/Handles/Objects.cs:line 509 
&lt;/span>&lt;/span>&lt;span style="display:flex;">&lt;span>at LibGit2Sharp.Core.Proxy.git_commit_author(ObjectHandle obj) in /_/LibGit2Sharp/Core/Proxy.cs:line 289 
&lt;/span>&lt;/span>&lt;span style="display:flex;">&lt;span>at LibGit2Sharp.Core.LazyGroup`1.Dependent`2.LibGit2Sharp.Core.LazyGroup&amp;lt;T&amp;gt;.IEvaluator&amp;lt;TInput&amp;gt;.Evaluate(TInput 
&lt;/span>&lt;/span>&lt;span style="display:flex;">&lt;span>input) in /_/LibGit2Sharp/Core/LazyGroup.cs:line 88 
&lt;/span>&lt;/span>&lt;span style="display:flex;">&lt;span>at LibGit2Sharp.Core.LazyGroup`1.&amp;lt;Evaluate&amp;gt;b__6_0(T input) in /_/LibGit2Sharp/Core/LazyGroup.cs:line 36 
&lt;/span>&lt;/span>&lt;span style="display:flex;">&lt;span>at LibGit2Sharp.Core.GitObjectLazyGroup.EvaluateInternal(Action`1 evaluator) in 
&lt;/span>&lt;/span>&lt;span style="display:flex;">&lt;span>/_/LibGit2Sharp/Core/GitObjectLazyGroup.cs:line 20 
&lt;/span>&lt;/span>&lt;span style="display:flex;">&lt;span>at LibGit2Sharp.Core.LazyGroup`1.Evaluate() in /_/LibGit2Sharp/Core/LazyGroup.cs:line 34 
&lt;/span>&lt;/span>&lt;span style="display:flex;">&lt;span>at LibGit2Sharp.Core.LazyGroup`1.Dependent`2.Evaluate() in /_/LibGit2Sharp/Core/LazyGroup.cs:line 80 
&lt;/span>&lt;/span>&lt;span style="display:flex;">&lt;span>at LibGit2Sharp.Core.LazyGroup`1.Dependent`2.get_Value() in /_/LibGit2Sharp/Core/LazyGroup.cs:line 73 
&lt;/span>&lt;/span>&lt;span style="display:flex;">&lt;span>at LibGit2Sharp.Commit.get_Committer() in /_/LibGit2Sharp/Commit.cs:line 87 
&lt;/span>&lt;/span>&lt;span style="display:flex;">&lt;span>at GitVersion.Commit..ctor(Commit innerCommit) in 
&lt;/span>&lt;/span>&lt;span style="display:flex;">&lt;span>D:\a\GitVersion\GitVersion\src\GitVersion.LibGit2Sharp\Git\Commit.cs:line 17 
&lt;/span>&lt;/span>&lt;span style="display:flex;">&lt;span>at GitVersion.Commit.&amp;lt;&amp;gt;c.&amp;lt;.ctor&amp;gt;b__3_0(Commit parent) in 
&lt;/span>&lt;/span>&lt;span style="display:flex;">&lt;span>D:\a\GitVersion\GitVersion\src\GitVersion.LibGit2Sharp\Git\Commit.cs:line 16 
&lt;/span>&lt;/span>&lt;span style="display:flex;">&lt;span>at System.Linq.Enumerable.SelectEnumerableIterator`2.MoveNext() 
&lt;/span>&lt;/span>&lt;span style="display:flex;">&lt;span>at System.Linq.Enumerable.Count[TSource](IEnumerable`1 source) &lt;/span>&lt;/span>&lt;/code>&lt;/pre>&lt;/td>&lt;/tr>&lt;/table>
&lt;/div>
&lt;/div>
&lt;p>When executing GitVersion within a local directory, the operation typically proceeds without issues. However, complications can arise during the Azure DevOps pipeline process.&lt;/p></description></item><item><title>Azure DevOps: Script Caching in Azure DevOps</title><link>https://www.codewrecks.com/post/azdo/pipeline/release-on-linux-cached-script/</link><pubDate>Mon, 30 Oct 2023 08:00:42 +0000</pubDate><guid>https://www.codewrecks.com/post/azdo/pipeline/release-on-linux-cached-script/</guid><description>&lt;p>I&amp;rsquo;m authoring a release pipeline in Azure DevOps on an AWS ARM Linux machine: I&amp;rsquo;ve installed the agent and created the script. The pipeline uses artifacts produced by &lt;strong>another build pipeline and depends on a git repository that contains the scripts&lt;/strong>. Here is how resources are declared in the pipeline.&lt;/p>
&lt;div class="highlight">&lt;div style="color:#f8f8f2;background-color:#272822;-moz-tab-size:4;-o-tab-size:4;tab-size:4;">
&lt;table style="border-spacing:0;padding:0;margin:0;border:0;">&lt;tr>&lt;td style="vertical-align:top;padding:0;margin:0;border:0;">
&lt;pre tabindex="0" style="color:#f8f8f2;background-color:#272822;-moz-tab-size:4;-o-tab-size:4;tab-size:4;">&lt;code>&lt;span style="white-space:pre;-webkit-user-select:none;user-select:none;margin-right:0.4em;padding:0 0.4em 0 0.4em;color:#7f7f7f"> 1
&lt;/span>&lt;span style="white-space:pre;-webkit-user-select:none;user-select:none;margin-right:0.4em;padding:0 0.4em 0 0.4em;color:#7f7f7f"> 2
&lt;/span>&lt;span style="white-space:pre;-webkit-user-select:none;user-select:none;margin-right:0.4em;padding:0 0.4em 0 0.4em;color:#7f7f7f"> 3
&lt;/span>&lt;span style="white-space:pre;-webkit-user-select:none;user-select:none;margin-right:0.4em;padding:0 0.4em 0 0.4em;color:#7f7f7f"> 4
&lt;/span>&lt;span style="white-space:pre;-webkit-user-select:none;user-select:none;margin-right:0.4em;padding:0 0.4em 0 0.4em;color:#7f7f7f"> 5
&lt;/span>&lt;span style="white-space:pre;-webkit-user-select:none;user-select:none;margin-right:0.4em;padding:0 0.4em 0 0.4em;color:#7f7f7f"> 6
&lt;/span>&lt;span style="white-space:pre;-webkit-user-select:none;user-select:none;margin-right:0.4em;padding:0 0.4em 0 0.4em;color:#7f7f7f"> 7
&lt;/span>&lt;span style="white-space:pre;-webkit-user-select:none;user-select:none;margin-right:0.4em;padding:0 0.4em 0 0.4em;color:#7f7f7f"> 8
&lt;/span>&lt;span style="white-space:pre;-webkit-user-select:none;user-select:none;margin-right:0.4em;padding:0 0.4em 0 0.4em;color:#7f7f7f"> 9
&lt;/span>&lt;span style="white-space:pre;-webkit-user-select:none;user-select:none;margin-right:0.4em;padding:0 0.4em 0 0.4em;color:#7f7f7f">10
&lt;/span>&lt;span style="white-space:pre;-webkit-user-select:none;user-select:none;margin-right:0.4em;padding:0 0.4em 0 0.4em;color:#7f7f7f">11
&lt;/span>&lt;span style="white-space:pre;-webkit-user-select:none;user-select:none;margin-right:0.4em;padding:0 0.4em 0 0.4em;color:#7f7f7f">12
&lt;/span>&lt;/code>&lt;/pre>&lt;/td>
&lt;td style="vertical-align:top;padding:0;margin:0;border:0;;width:100%">
&lt;pre tabindex="0" style="color:#f8f8f2;background-color:#272822;-moz-tab-size:4;-o-tab-size:4;tab-size:4;">&lt;code class="language-yaml" data-lang="yaml">&lt;span style="display:flex;">&lt;span>&lt;span style="color:#f92672">resources&lt;/span>: 
&lt;/span>&lt;/span>&lt;span style="display:flex;">&lt;span>
&lt;/span>&lt;/span>&lt;span style="display:flex;">&lt;span> &lt;span style="color:#f92672">pipelines&lt;/span>:
&lt;/span>&lt;/span>&lt;span style="display:flex;">&lt;span> - &lt;span style="color:#f92672">pipeline&lt;/span>: &lt;span style="color:#ae81ff">UniqueHost&lt;/span>
&lt;/span>&lt;/span>&lt;span style="display:flex;">&lt;span> &lt;span style="color:#f92672">source&lt;/span>: &lt;span style="color:#ae81ff">Publish-UniqueHost&lt;/span>
&lt;/span>&lt;/span>&lt;span style="display:flex;">&lt;span> &lt;span style="color:#f92672">branch&lt;/span>: &lt;span style="color:#ae81ff">master&lt;/span>
&lt;/span>&lt;/span>&lt;span style="display:flex;">&lt;span>
&lt;/span>&lt;/span>&lt;span style="display:flex;">&lt;span> &lt;span style="color:#f92672">repositories&lt;/span>:
&lt;/span>&lt;/span>&lt;span style="display:flex;">&lt;span> - &lt;span style="color:#f92672">repository&lt;/span>: &lt;span style="color:#ae81ff">JarvisSetupScripts&lt;/span>
&lt;/span>&lt;/span>&lt;span style="display:flex;">&lt;span> &lt;span style="color:#f92672">type&lt;/span>: &lt;span style="color:#ae81ff">git&lt;/span>
&lt;/span>&lt;/span>&lt;span style="display:flex;">&lt;span> &lt;span style="color:#f92672">ref&lt;/span>: &lt;span style="color:#ae81ff">feature/AWS&lt;/span>
&lt;/span>&lt;/span>&lt;span style="display:flex;">&lt;span> &lt;span style="color:#f92672">name&lt;/span>: &lt;span style="color:#ae81ff">JarvisSetupScripts&lt;/span>&lt;/span>&lt;/span>&lt;/code>&lt;/pre>&lt;/td>&lt;/tr>&lt;/table>
&lt;/div>
&lt;/div>
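&lt;p>The resources declared above are then consumed by the pipeline jobs. A minimal sketch of how that usually looks follows; the step layout and the &lt;code>deploy.sh&lt;/code> name are illustrative, and only the &lt;code>UniqueHost&lt;/code> and &lt;code>JarvisSetupScripts&lt;/code> aliases come from the declaration:&lt;/p>

```yaml
# Sketch: consume the resources declared above. Only the aliases are taken
# from the declaration; the deploy.sh script name is hypothetical.
steps:
  - checkout: self
  - checkout: JarvisSetupScripts    # clones the script repository next to the sources
  - download: UniqueHost            # fetches artifacts of the Publish-UniqueHost pipeline
  - script: bash ./JarvisSetupScripts/deploy.sh
    displayName: Run release script from the secondary repository
```

&lt;p>With multiple &lt;code>checkout&lt;/code> steps each repository lands in a folder named after its alias, which is why the script is invoked through the &lt;code>JarvisSetupScripts&lt;/code> path.&lt;/p>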
&lt;p>Usually the question is: &lt;strong>why do you store scripts in another repository?&lt;/strong> The classic approach is writing release scripts inside the very same repository as the source files, then including the release scripts inside the build so that the &lt;strong>release pipeline depends only on one or more build pipelines&lt;/strong>. Personally, I find that having the scripts in a different repository eases script authoring, because you can simply modify a script, push, and then immediately re-trigger the pipeline to verify that everything is working as expected.&lt;/p></description></item><item><title>Pills: Install release agent in ARM machines</title><link>https://www.codewrecks.com/post/azdo/pills/installing-release-agent-in-arm-machine/</link><pubDate>Fri, 27 Oct 2023 09:12:42 +0200</pubDate><guid>https://www.codewrecks.com/post/azdo/pills/installing-release-agent-in-arm-machine/</guid><description>&lt;p>With Azure DevOps Environments you can register Virtual Machines with a dedicated agent that is capable of releasing your software. The procedure is simple: &lt;strong>just create an environment and a VM resource, and you are greeted with a minimal UI that lets you choose the configuration&lt;/strong>.&lt;/p>
&lt;p>&lt;a target="_blank" href="../images/configure-agent.png"> &lt;img src="../images/configure-agent.png" alt="Configuring an agent for Azure DevOps environment" />&lt;/a>&lt;/p>
&lt;p>&lt;em>&lt;strong>Figure 1&lt;/strong>&lt;/em>: &lt;em>Configuring an agent for Azure DevOps environment&lt;/em>&lt;/p>
&lt;p>As you can see, it just requires you to select the operating system; then you can &lt;strong>copy to the clipboard a simple script that you can execute on every machine you want to add to an environment&lt;/strong>. You need sudo rights to execute the script, but when running it on an ARM machine I got this.&lt;/p></description></item><item><title>Pills: Identify nuget packages with vulnerabilities</title><link>https://www.codewrecks.com/post/azdo/pills/nuget-packages-with-vulnerabilities/</link><pubDate>Fri, 27 Oct 2023 08:10:42 +0200</pubDate><guid>https://www.codewrecks.com/post/azdo/pills/nuget-packages-with-vulnerabilities/</guid><description>&lt;p>Managing references is easy with NuGet. However, &lt;strong>from a security standpoint, it&amp;rsquo;s not straightforward to ensure your project&amp;rsquo;s security by upgrading vulnerable references&lt;/strong>. GitHub Dependabot does an excellent job flagging vulnerable references, and the entire GitHub ecosystem has a strong emphasis on security. This empowers developers to handle security in the packages they produce.&lt;/p>
&lt;p>Recently, &lt;strong>Visual Studio introduced a feature that immediately warns you if a package in your solution is insecure&lt;/strong>. You can also filter your installed packages to display only those with vulnerabilities.&lt;/p></description></item><item><title>Debugging Production Issues with Dump Files and Visual Studio</title><link>https://www.codewrecks.com/post/visualstudio/debugging-a-minidump/</link><pubDate>Tue, 19 Sep 2023 15:45:18 +0200</pubDate><guid>https://www.codewrecks.com/post/visualstudio/debugging-a-minidump/</guid><description>&lt;p>Sometimes, you may encounter a problem in a production environment, such as a service that suddenly starts consuming a significant amount of RAM and CPU. In the past, I&amp;rsquo;ve seen people attempt to install Visual Studio on a production server to debug the issue directly, instead of relying on logs or other techniques. However, there&amp;rsquo;s a better approach: &lt;strong>creating a dump file of the problematic process from the Task Manager&lt;/strong>. You can just right-click the process on Task Manager and request a MiniDump.&lt;/p></description></item><item><title>Azure DevOps: delete all unstable version of packages in feeds</title><link>https://www.codewrecks.com/post/azdo/misc/clean-artifacts-feed/</link><pubDate>Fri, 08 Sep 2023 08:00:00 +0200</pubDate><guid>https://www.codewrecks.com/post/azdo/misc/clean-artifacts-feed/</guid><description>&lt;p>Azure DevOps has a dedicated section for artifacts that allows you to store NuGet, NPM feeds, and more. Thanks to its integration with pipelines, very often, automatic pipelines are generated that &lt;strong>publish packages with every commit in the repository&lt;/strong>. This way, we have the opportunity to have all versions for all dev branches.&lt;/p>
&lt;p>This approach is needed because the usual flow when you develop a new feature in a package is the following:&lt;/p></description></item><item><title>Modifying Azure DevOps Pipeline Decorators with Bing Chatbot Assistance</title><link>https://www.codewrecks.com/post/azdo/pills/pipeline-decorators-on-different-os/</link><pubDate>Tue, 29 Aug 2023 16:12:42 +0200</pubDate><guid>https://www.codewrecks.com/post/azdo/pills/pipeline-decorators-on-different-os/</guid><description>&lt;p>In the past, I&amp;rsquo;ve discussed using pipeline decorators to clean up build folders. Recently, I faced a challenge where &lt;strong>I needed to modify my decorator to run only if there was a &lt;code>.git&lt;/code> folder&lt;/strong>. To save time, I used Bing Chatbot, which leverages GPT&amp;rsquo;s powerful LLM and can search the internet for the latest content, making this kind of problem-solving a breeze.&lt;/p>
&lt;p>&lt;a target="_blank" href="../images/bing-question.png"> &lt;img src="../images/bing-question.png" alt="sample prompt asking to modify a piece of an Azure Devops pipeline" />&lt;/a>&lt;/p></description></item><item><title>Remove submodule completely from your git repository</title><link>https://www.codewrecks.com/post/general/remember-to-remove-submodules/</link><pubDate>Mon, 28 Aug 2023 08:00:00 +0200</pubDate><guid>https://www.codewrecks.com/post/general/remember-to-remove-submodules/</guid><description>&lt;p>I have a project that &lt;strong>uses git submodules in the past&lt;/strong> then they are removed long time ago, no-one had problem but I&amp;rsquo;ve noticed that Visual Studio had some strange warning during the build.&lt;/p>
&lt;div class="highlight">&lt;div style="color:#f8f8f2;background-color:#272822;-moz-tab-size:4;-o-tab-size:4;tab-size:4;">
&lt;table style="border-spacing:0;padding:0;margin:0;border:0;">&lt;tr>&lt;td style="vertical-align:top;padding:0;margin:0;border:0;">
&lt;pre tabindex="0" style="color:#f8f8f2;background-color:#272822;-moz-tab-size:4;-o-tab-size:4;tab-size:4;">&lt;code>&lt;span style="white-space:pre;-webkit-user-select:none;user-select:none;margin-right:0.4em;padding:0 0.4em 0 0.4em;color:#7f7f7f">1
&lt;/span>&lt;/code>&lt;/pre>&lt;/td>
&lt;td style="vertical-align:top;padding:0;margin:0;border:0;;width:100%">
&lt;pre tabindex="0" style="color:#f8f8f2;background-color:#272822;-moz-tab-size:4;-o-tab-size:4;tab-size:4;">&lt;code class="language-bash" data-lang="bash">&lt;span style="display:flex;">&lt;span>warning : Could not find a part of the path &lt;span style="color:#e6db74">&amp;#39;C:\develop\xxx\submodules\jarvis.catalog\.git&amp;#39;&lt;/span>. The source code won&lt;span style="color:#960050;background-color:#1e0010">&amp;#39;&lt;/span>t be available via Source Link.&lt;/span>&lt;/span>&lt;/code>&lt;/pre>&lt;/td>&lt;/tr>&lt;/table>
&lt;/div>
&lt;/div>
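&lt;p>To get rid of those leftovers for good, the usual cleanup removes every place Git still remembers the submodule. Below is a hedged sketch: the &lt;code>submodules/jarvis.catalog&lt;/code> path is taken from the warning above, and the throwaway repository is created only to make the example self-contained:&lt;/p>

```shell
set -e
# Demo setup: a throwaway repo that still carries leftover submodule metadata.
# The submodules/jarvis.catalog path mirrors the warning above; the URL is fake.
tmp=$(mktemp -d)
cd "$tmp"
git init -q .
mkdir -p .git/modules/submodules/jarvis.catalog
printf '[submodule "submodules/jarvis.catalog"]\n\tpath = submodules/jarvis.catalog\n\turl = https://example.com/jarvis.catalog.git\n' > .gitmodules

# Actual cleanup: drop the .gitmodules entry, any cached config, the metadata
# Git keeps under .git/modules, and the gitlink entry in the index (if any).
git config -f .gitmodules --remove-section submodule.submodules/jarvis.catalog
git config --remove-section submodule.submodules/jarvis.catalog 2>/dev/null || true
rm -rf .git/modules/submodules/jarvis.catalog
git rm -r --cached submodules/jarvis.catalog 2>/dev/null || true
```

&lt;p>After this, committing the updated (possibly now empty) &lt;code>.gitmodules&lt;/code> should stop tools from looking for the old submodule folder.&lt;/p>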
&lt;p>It seems that the Visual Studio integration with Git &lt;strong>still found information on submodules and tried to check something in the corresponding folders&lt;/strong>. I&amp;rsquo;m not sure this warning has been present ever since we removed the submodule; it seems more likely that some setting changed. Nevertheless, we still have &lt;strong>some information in our repository pointing to old, no-longer-used submodules&lt;/strong>.&lt;/p></description></item><item><title>Azure Pipelines starts failing indexing symbols</title><link>https://www.codewrecks.com/post/azdo/pipeline/index-symbols-suddently-failing/</link><pubDate>Mon, 31 Jul 2023 08:10:42 +0200</pubDate><guid>https://www.codewrecks.com/post/azdo/pipeline/index-symbols-suddently-failing/</guid><description>&lt;p>A build in Azure DevOps recently started to fail during the index symbols task. The error wasn&amp;rsquo;t immediately clear, and the error messages are really not informative. One such message was, &amp;ldquo;Request ed7b95f6b6f439e769a3e85422b7172be872403388fae1974fb4233dfd13da66 is sealed. Only expirationDate may be modified.&amp;rdquo;&lt;/p>
&lt;p>Honestly, this error didn&amp;rsquo;t tell me much. &lt;strong>My general advice when facing such perplexing errors is to examine the entire log. Given the vast size of the log&lt;/strong>, it&amp;rsquo;s wise to start with the most common areas where useful information can be found. In the case of most .NET-related tools, I began by inspecting what appeared to be a stack trace. A stack trace in the log often indicates something has gone wrong.&lt;/p></description></item><item><title>Resolving Credential Conflicts in Git</title><link>https://www.codewrecks.com/post/github/pills/multiple-credentials-in-credential-manager/</link><pubDate>Fri, 21 Jul 2023 08:10:42 +0200</pubDate><guid>https://www.codewrecks.com/post/github/pills/multiple-credentials-in-credential-manager/</guid><description>&lt;p>Have you ever found yourself being asked to &lt;strong>select a GitHub account every time you make a push&lt;/strong>? This is often due to multiple access tokens being stored in your Windows credential manager.&lt;/p>
&lt;p>The Git Credential Manager can become confused when it doesn&amp;rsquo;t know which account to use. Its only option in these situations is to ask you which of the stored credentials it should use.&lt;/p>
&lt;p>&lt;a target="_blank" href="../images/select-account.png"> &lt;img src="../images/select-account.png" alt="Command line interface suddenly opens a window asking you to select accounts" />&lt;/a>&lt;/p></description></item><item><title>Using GitHub Command Line Tool to View Pull Request Info</title><link>https://www.codewrecks.com/post/github/pills/gh_command_line_tool/</link><pubDate>Thu, 20 Jul 2023 08:10:42 +0200</pubDate><guid>https://www.codewrecks.com/post/github/pills/gh_command_line_tool/</guid><description>&lt;p>For people like me who prefer using Git in the command line, there are times when I need to retrieve information about pull requests or other GitHub related tasks. For example, suppose &lt;strong>I need to share a link to a pull request that is under review with one of my colleagues for them to comment on&lt;/strong>. Sure, I could navigate to the GitHub website, locate the repository, navigate to the pull request page and get the link. But, since I&amp;rsquo;m already in the command line, I&amp;rsquo;d prefer a faster way.&lt;/p></description></item><item><title>Pills: Azure Devops auto agents update</title><link>https://www.codewrecks.com/post/azdo/pills/auto-update-agents/</link><pubDate>Tue, 18 Jul 2023 08:10:42 +0200</pubDate><guid>https://www.codewrecks.com/post/azdo/pills/auto-update-agents/</guid><description>&lt;p>The ability to update Azure DevOps pipelines is a compelling feature, especially if you manage numerous on-premise agents. This feature eliminates maintenance issues by &lt;strong>allowing all agents to be upgraded with just a single click&lt;/strong>.&lt;/p>
&lt;p>&lt;a target="_blank" href="../images/image1.png"> &lt;img src="../images/image1.png" alt="One click update button" />&lt;/a>
&lt;em>&lt;strong>Figure 1&lt;/strong>&lt;/em>: &lt;em>One click update button&lt;/em>&lt;/p>
&lt;p>I have an agent that runs only when needed, and it&amp;rsquo;s slightly outdated. By simply clicking a button, I can prompt the server to update all the agents, and I&amp;rsquo;m sure that &lt;strong>within seconds I&amp;rsquo;m running the latest agent version&lt;/strong>.&lt;/p></description></item><item><title>GitHub Copilot-X in action: Steps instructions in a single prompt</title><link>https://www.codewrecks.com/post/github/copilot-few-shot-part-2/</link><pubDate>Sat, 20 May 2023 07:00:00 +0000</pubDate><guid>https://www.codewrecks.com/post/github/copilot-few-shot-part-2/</guid><description>&lt;p>If you look at the &lt;a href="https://www.codewrecks.com/post/github/copilot-x-few-shot-prompt/">previous post on the subject&lt;/a>, &lt;strong>I&amp;rsquo;m experimenting with Copilot Chat to have it automate mundane, repetitive operations&lt;/strong> that can operate on complex classes. In the previous example I demonstrated how you can decompose a complex operation into multiple steps, actually guiding Copilot towards the desired result.&lt;/p>
&lt;blockquote>
&lt;p>Now the question is: Once you got it right, is it possible to use a single prompt to have desired result?&lt;/p>&lt;/blockquote>
&lt;p>Well, the answer is &lt;strong>it depends&lt;/strong>. It is not simple because the AI needs to perform an intermediate series of steps, and the result can:&lt;/p></description></item><item><title>GitHub Copilot-X in action: generation of test object with random data</title><link>https://www.codewrecks.com/post/github/copilot-x-few-shot-prompt/</link><pubDate>Sat, 20 May 2023 06:00:00 +0000</pubDate><guid>https://www.codewrecks.com/post/github/copilot-x-few-shot-prompt/</guid><description>&lt;p>In real-world software you often have complex classes; in this situation we have &lt;strong>AtomicReadmodels in a project heavily based on Event Sourcing&lt;/strong>. One challenge they present is the presence of only private setters for all properties - a necessity due to the unconventional nature of these classes, &lt;strong>which rely on parsing domain events for property population&lt;/strong>. This results in difficulties when creating test classes in memory during unit testing, as the private setters block external code from setting properties. Reflection or libraries such as &lt;strong>Fasterflect are usually the solution, but not without annoyances&lt;/strong>.&lt;/p></description></item><item><title>Pills: npm private feeds and authentication</title><link>https://www.codewrecks.com/post/azdo/pills/vsts-npm-auth-problems/</link><pubDate>Wed, 17 May 2023 08:12:42 +0200</pubDate><guid>https://www.codewrecks.com/post/azdo/pills/vsts-npm-auth-problems/</guid><description>&lt;p>Have you ever encountered an issue obtaining an authentication token for your Azure DevOps npm package feed? Sometimes I get this:&lt;/p>
&lt;div class="highlight">&lt;div style="color:#f8f8f2;background-color:#272822;-moz-tab-size:4;-o-tab-size:4;tab-size:4;">
&lt;table style="border-spacing:0;padding:0;margin:0;border:0;">&lt;tr>&lt;td style="vertical-align:top;padding:0;margin:0;border:0;">
&lt;pre tabindex="0" style="color:#f8f8f2;background-color:#272822;-moz-tab-size:4;-o-tab-size:4;tab-size:4;">&lt;code>&lt;span style="white-space:pre;-webkit-user-select:none;user-select:none;margin-right:0.4em;padding:0 0.4em 0 0.4em;color:#7f7f7f">1
&lt;/span>&lt;span style="white-space:pre;-webkit-user-select:none;user-select:none;margin-right:0.4em;padding:0 0.4em 0 0.4em;color:#7f7f7f">2
&lt;/span>&lt;span style="white-space:pre;-webkit-user-select:none;user-select:none;margin-right:0.4em;padding:0 0.4em 0 0.4em;color:#7f7f7f">3
&lt;/span>&lt;/code>&lt;/pre>&lt;/td>
&lt;td style="vertical-align:top;padding:0;margin:0;border:0;;width:100%">
&lt;pre tabindex="0" style="color:#f8f8f2;background-color:#272822;-moz-tab-size:4;-o-tab-size:4;tab-size:4;">&lt;code class="language-text" data-lang="text">&lt;span style="display:flex;">&lt;span>vsts-npm-auth v0.42.1.0
&lt;/span>&lt;/span>&lt;span style="display:flex;">&lt;span>-----------------------
&lt;/span>&lt;/span>&lt;span style="display:flex;">&lt;span>Couldn&amp;#39;t get an authentication token for https://pkgs.dev.azure.com/prxm/_packaging/JarvisNpmGood/npm/registry/.&lt;/span>&lt;/span>&lt;/code>&lt;/pre>&lt;/td>&lt;/tr>&lt;/table>
&lt;/div>
&lt;/div>
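&lt;p>For context, a hedged sketch of the usual setup: a project-level &lt;code>.npmrc&lt;/code> points at the feed (the URL below mirrors the error message above and is just an example), and &lt;code>vsts-npm-auth&lt;/code> is then run against it to refresh the user-level token:&lt;/p>

```shell
# Write a project-level .npmrc pointing at the private feed.
# The organization/feed URL mirrors the error above and is only an example.
cat > .npmrc <<'EOF'
registry=https://pkgs.dev.azure.com/prxm/_packaging/JarvisNpmGood/npm/registry/
always-auth=true
EOF

# Refresh the user-level token; the tool is Windows-only, hence the guard.
if command -v vsts-npm-auth >/dev/null 2>&1; then
  vsts-npm-auth -config .npmrc
else
  echo "vsts-npm-auth not installed; run it on Windows to obtain a token"
fi
```

&lt;p>When the token expires, re-running &lt;code>vsts-npm-auth -config .npmrc&lt;/code> is typically enough to restore access to the feed.&lt;/p>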
&lt;p>Private package feeds in Azure DevOps are incredibly useful, not just for NuGet packages, but for NPM as well. However, &lt;strong>you must follow the instructions given on the site to connect to your feed, and there will be times when you have to renew credentials.&lt;/strong> The problem is that npm feeds in Azure DevOps are usually private and require authentication/authorization. For this reason you need to install a special package to handle authentication. &lt;a href="https://www.npmjs.com/package/vsts-npm-auth">You can download the tool here&lt;/a>.&lt;/p>
&lt;p>&lt;a target="_blank" href="../images/exception-copilot-1.png"> &lt;img src="../images/exception-copilot-1.png" alt="Copilot AI Assistant in exception box" />&lt;/a>&lt;/p></description></item><item><title>Problem with Castle Windsor resolution in ASP.NET Core</title><link>https://www.codewrecks.com/post/general/castle-resolution-in-asp-net-core/</link><pubDate>Tue, 09 May 2023 09:40:00 +0200</pubDate><guid>https://www.codewrecks.com/post/general/castle-resolution-in-asp-net-core/</guid><description>&lt;p>&lt;strong>Castle Windsor is a beautiful library for implementing inversion of control&lt;/strong>, but sometimes problem arise when it is used in projects that start with Full Framework and must be converted to ASP.NET Core during their lifetime. To make this work, an interdependency library is typically used to allow ASP.NET Core infrastructure to resolve dependencies using Castle. This approach helps avoid issues when replacing Castle with other libraries, since Castle is both powerful and complex, allowing for many customizations in dependency resolution. Since it&amp;rsquo;s &lt;strong>not always easy to remove it and make room for new libraries&lt;/strong> the usual solution is to keep using Castle Winsor.&lt;/p></description></item><item><title>GitHub Copilot-X in action: simple code conversion</title><link>https://www.codewrecks.com/post/github/copilot-x-chat-rewrite/</link><pubDate>Tue, 09 May 2023 06:00:00 +0000</pubDate><guid>https://www.codewrecks.com/post/github/copilot-x-chat-rewrite/</guid><description>&lt;p>New &lt;a href="https://github.com/features/preview/copilot-x">Copilot X&lt;/a> from &lt;a href="https://github.com">GitHub&lt;/a> is the next big thing for programmers, because it &lt;strong>brings the power of copilot to the next level&lt;/strong>. Actually I&amp;rsquo;m testing the integrated chat in Visual Studio and Visual Studio Code. 
The tool is not always perfect, but we really need to understand how and where to use it to gain maximum advantage.&lt;/p>
&lt;p>We often encounter &lt;strong>conversion operations&lt;/strong> that are very mechanical, boring, and prone to errors due to their repetitiveness. When programming, we are very focused when doing something interesting, but when we perform simple operations, such as trivial conversions, we often make mistakes because our mind is elsewhere. Consider the following situation: you have an &lt;strong>ASP.NET Core controller that needs to be converted to a server-side Blazor component&lt;/strong>.&lt;/p></description></item><item><title>Pills: Maximizing the Power of Tags in Azure DevOps</title><link>https://www.codewrecks.com/post/azdo/pills/azdo-tags/</link><pubDate>Mon, 08 May 2023 08:10:42 +0200</pubDate><guid>https://www.codewrecks.com/post/azdo/pills/azdo-tags/</guid><description>&lt;p>In Azure DevOps, &lt;strong>using tags allows you to easily classify work items&lt;/strong>. Rather than using additional process fields, tags make classification easy because any team member can add a label to a work item.&lt;/p>
&lt;p>In effect, the result &lt;strong>is the ability to create a horizontal taxonomy that&amp;rsquo;s common to all work items and likely common to all team projects&lt;/strong>, enabling efficient filtering and categorization of Work Items.&lt;/p></description></item><item><title>Using GPT-4 to Create a Small Software Project from Specification</title><link>https://www.codewrecks.com/post/ai/super-ai-wizard/</link><pubDate>Sun, 16 Apr 2023 06:00:00 +0000</pubDate><guid>https://www.codewrecks.com/post/ai/super-ai-wizard/</guid><description>&lt;p>As a programmer, I often find myself &lt;strong>seeking ways to expedite the initial stages of a software development project&lt;/strong>. GPT-4 has become a valuable tool in this regard, as it can help generate the foundational code for a new project. By leveraging the power of this advanced AI language model, I can significantly reduce the time and effort required for setting up a project, thus allowing me to focus on more complex aspects of the development process.&lt;/p></description></item><item><title>Troubleshooting GitHub Codespaces PGP Signing Problems</title><link>https://www.codewrecks.com/post/github/codespaces-troubleshooting/</link><pubDate>Thu, 13 Apr 2023 06:00:00 +0000</pubDate><guid>https://www.codewrecks.com/post/github/codespaces-troubleshooting/</guid><description>&lt;p>If you use &lt;strong>GPG keys&lt;/strong> to verify your commits, you&amp;rsquo;ll be glad to know that in &lt;strong>GitHub Codespaces&lt;/strong>, signing is done automatically. All you need to do is &lt;strong>configure the settings&lt;/strong> in your account, and a key will be injected into your Codespaces. As a result, every commit you make in your Codespace will be automatically signed and verified.&lt;/p>
&lt;p>&lt;a target="_blank" href="../images/pgp-codespace.png"> &lt;img src="../images/pgp-codespace.png" alt="Configure PGP in codespace" />&lt;/a>&lt;/p>
&lt;p>&lt;em>&lt;strong>Figure 1&lt;/strong>&lt;/em>: &lt;em>Configure PGP in codespace&lt;/em>&lt;/p></description></item><item><title>Simplifying Library Debugging with Azure DevOps Symbol Server</title><link>https://www.codewrecks.com/post/azdo/pipeline/streamline-library-debugging/</link><pubDate>Tue, 21 Mar 2023 08:00:42 +0000</pubDate><guid>https://www.codewrecks.com/post/azdo/pipeline/streamline-library-debugging/</guid><description>&lt;p>When developing a code library, it is good practice to &lt;strong>publish it on a package manager like NuGet&lt;/strong>. A common objection to this approach is that using a library published as a package can make it difficult to debug the original code. However, this is not a significant issue as it &lt;strong>encourages you to write unit tests within the same project in which you develop your library&lt;/strong>, ensuring that the library is well-tested and free of regressions. Nevertheless, there are times when it is convenient to debug the source code of the library while using in a real project. This is true especially for complex libraries where it is difficult to create unit tests that covers all options.&lt;/p></description></item><item><title>More secure Azure DevOps Pipelines API connection thanks to OAuth Tokens</title><link>https://www.codewrecks.com/post/azdo/api/reschedule-pr-check-use-oauth2-tokens/</link><pubDate>Sun, 19 Mar 2023 07:12:42 +0200</pubDate><guid>https://www.codewrecks.com/post/azdo/api/reschedule-pr-check-use-oauth2-tokens/</guid><description>&lt;p>In a previous blog post, I discussed how to reschedule the check of a pull request using a simple PowerShell script within an Azure DevOps pipeline. This time, &lt;strong>I&amp;rsquo;ll explain how to avoid using Personal Access Tokens for authentication&lt;/strong> and switch to a more secure alternative.&lt;/p>
&lt;p>The issue with Personal Access Tokens is that they are bearer tokens, which means &lt;strong>if they&amp;rsquo;re lost or accidentally leaked in logs, anyone with access to the token can use it to access your services&lt;/strong>. To address this problem, it&amp;rsquo;s better to use a specific Personal Access Token with the minimum required scopes. For instance, if you only need to reschedule Pull Request checks, grant the token only pull request and build access. This limits the potential damage if the token falls into the wrong hands.&lt;/p></description></item><item><title>Azure Devops Api - Automatically Re-Queue Pull Request Checks</title><link>https://www.codewrecks.com/post/azdo/api/reschedule-pr-check-with-pipeline/</link><pubDate>Tue, 07 Mar 2023 07:12:42 +0200</pubDate><guid>https://www.codewrecks.com/post/azdo/api/reschedule-pr-check-with-pipeline/</guid><description>&lt;p>In a previous post on this subject, &lt;a href="https://www.codewrecks.com/post/azdo/api/reschedule-pr-check-with-api/">Reschedule PR Check with API&lt;/a>, I demonstrated &lt;strong>a simple PowerShell API script that can automatically re-queue all checks for open Pull Requests where the check has expired&lt;/strong>. The problem is: you need to schedule this script to run.&lt;/p>
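When the script runs inside a pipeline, it can authenticate with the job's own OAuth token instead of a stored PAT; a minimal hedged sketch (the script name and schedule are illustrative assumptions, not from the original post):

```yaml
# Illustrative sketch only: schedule the re-queue script and authenticate
# with the job's own OAuth token instead of a stored PAT.
schedules:
  - cron: "0 6 * * *"          # example schedule, adjust as needed
    branches:
      include:
        - develop
    always: true

steps:
  - pwsh: ./reschedule-checks.ps1    # hypothetical script name
    env:
      SYSTEM_ACCESSTOKEN: $(System.AccessToken)   # token the script reads for authentication
```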
&lt;p>Actually the easiest way to schedule a script that uses the API is &lt;strong>using a standard pipeline. Giving it the ability to run whenever the target branch changes is the best choice.&lt;/strong> Since develop is my usual target branch for Pull Requests, I left a CI trigger for changes in the develop branch as well as some scheduled runs (to include other, less usual target branches).&lt;/p></description></item><item><title>Azure Devops Api - Re-Queue Pull Request Checks</title><link>https://www.codewrecks.com/post/azdo/api/reschedule-pr-check-with-api/</link><pubDate>Tue, 21 Feb 2023 07:12:42 +0200</pubDate><guid>https://www.codewrecks.com/post/azdo/api/reschedule-pr-check-with-api/</guid><description>&lt;p>Pull Request checks are a perfect gate to keep your code quality high. The easiest way to &lt;strong>perform a check is to do something inside a standard pipeline and then use that pipeline in branch policies&lt;/strong>. This allows you to prevent people from pushing directly to the main development branch (main/master/develop), forcing them to open a pull request against that branch and wait for the checks to complete.&lt;/p>
&lt;blockquote>
&lt;p>Remember that checks on a Pull Request actually run on the result of the merge between the source and target branch.&lt;/p></description></item><item><title>Azure DevOps: importance of stable tests in pull requests</title><link>https://www.codewrecks.com/post/azdo/misc/importance-of-stable-test-in-pull-requests/</link><pubDate>Wed, 01 Feb 2023 07:00:42 +0000</pubDate><guid>https://www.codewrecks.com/post/azdo/misc/importance-of-stable-test-in-pull-requests/</guid><description>&lt;p>Pull Requests are the heartbeat of a project, and &lt;strong>they are probably one of the reasons to move to Git if you are still on a different source control&lt;/strong>. A Pull Request introduces these enormous advantages for a team:&lt;/p>
&lt;ul>
&lt;li>You know when a feature/bugfix branch is ready to be inspected by the team&lt;/li>
&lt;li>A single place of discussion&lt;/li>
&lt;li>Automatic Merge and tests run on merge result&lt;/li>
&lt;li>Etc.&lt;/li>
&lt;/ul>
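The last two advantages above (automatic merge plus tests on the merge result) are typically wired up as a build-validation pipeline attached to the target branch policy; a minimal sketch, assuming a .NET solution with conventionally named test projects:

```yaml
# Build-validation pipeline for a branch policy: it runs against the
# merge commit Azure DevOps creates for the PR, so tests validate the
# combined result of source and target branch.
trigger: none            # started by the branch policy, not by pushes

pool:
  vmImage: ubuntu-latest

steps:
  - task: DotNetCoreCLI@2
    displayName: Run all tests on the merge result
    inputs:
      command: test
      projects: '**/*Tests.csproj'   # assumed test-project naming convention
```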
&lt;p>There are lots of other advantages, but I want to &lt;strong>focus attention on test runs&lt;/strong>. A good and healthy project contains tests: unit, integration, UI, etc., and usually, after years, running the entire suite is time consuming. &lt;strong>This leads to an antipattern where developers do not run all the tests before committing&lt;/strong>, but usually run only a subset of tests to verify the new code.&lt;/p></description></item><item><title>Azure DevOps: pipeline permission to use an agent pool</title><link>https://www.codewrecks.com/post/azdo/pipeline/pipeline-permissions-agent-pool/</link><pubDate>Wed, 25 Jan 2023 07:00:42 +0000</pubDate><guid>https://www.codewrecks.com/post/azdo/pipeline/pipeline-permissions-agent-pool/</guid><description>&lt;p>Scenario: we created a &lt;strong>new Agent Pool&lt;/strong> in Azure DevOps called &amp;ldquo;linux&amp;rdquo;, we added some &lt;strong>Docker-based agents&lt;/strong>, and finally we added this new pool to the available pools for a couple of builds. To verify that the agents could indeed run the builds, we scheduled runs on this new pool, &lt;strong>but pipeline execution failed&lt;/strong>. The error is depicted in &lt;strong>Figure 1&lt;/strong>.&lt;/p>
&lt;p>&lt;a target="_blank" href="../images/build-failed-not-allowed-to-run-on-agent.png"> &lt;img src="../images/build-failed-not-allowed-to-run-on-agent.png" alt="Failed build details after changing pool to linux" />&lt;/a>&lt;/p></description></item><item><title>Pills: Conditional Pipeline decorators</title><link>https://www.codewrecks.com/post/azdo/pills/pipeline-decorators-conditional/</link><pubDate>Tue, 17 Jan 2023 00:12:42 +0200</pubDate><guid>https://www.codewrecks.com/post/azdo/pills/pipeline-decorators-conditional/</guid><description>&lt;p>&lt;a href="https://www.codewrecks.com/post/azdo/pills/pipeline-decorators/">Pipeline decorators&lt;/a> are a really peculiar feature of Azure DevOps, because they allow you to &lt;strong>specify a series of tasks that are run for EVERY pipeline in your organization&lt;/strong>, so they are rarely needed, but nevertheless they are a nice tool to know because there are situation when they are useful. Moreover, in latest &lt;a href="https://docs.microsoft.com/en-us/azure/devops/release-notes/2021/sprint-194-update">Sprint 194 update&lt;/a> they are expanded to support new functionalities, like running &lt;strong>before or after specific tasks&lt;/strong>.&lt;/p></description></item><item><title>Develop locally with GitHub Codespaces and Hugo</title><link>https://www.codewrecks.com/post/github/codespaces-hugo-local/</link><pubDate>Thu, 12 Jan 2023 08:00:00 +0200</pubDate><guid>https://www.codewrecks.com/post/github/codespaces-hugo-local/</guid><description>&lt;p>I really love &lt;a href="https://www.codewrecks.com/post/github/codespaces-hugo/">Using Hugo and Codespaces to write blog posts&lt;/a>, I&amp;rsquo;ve a really better blogging experience &lt;strong>than wordpress, because I have a simple blog, quick to load, no frills no fuzzes, just a simple blog&lt;/strong>. Using Codespaces is really nice experience and I really never wrote a blog post in my Windows machine until yesterday.&lt;/p>
&lt;p>Yesterday I simply opened my blog repository inside a local instance of Visual Studio Code, but I had a &lt;strong>nasty surprise when I tried to start the Hugo server&lt;/strong>.&lt;/p></description></item><item><title>Pills: Backup your Azure DevOps server</title><link>https://www.codewrecks.com/post/azdo/pills/backup-your-azure-devops-server/</link><pubDate>Sat, 31 Dec 2022 07:10:42 +0200</pubDate><guid>https://www.codewrecks.com/post/azdo/pills/backup-your-azure-devops-server/</guid><description>&lt;p>Some days ago I got a call from a friend at a customer site who was experiencing some problems with Azure DevOps Server. The symptoms were strange: &lt;strong>the server starts to become unresponsive, it is not even possible to log in with Remote Desktop, and it seems that there is some memory problem&lt;/strong>.&lt;/p>
&lt;p>Being unable to diagnose the problem by telephone, I suggested they disable the search services; sometimes the Elasticsearch service consumes too much RAM. &lt;strong>Since I did not have any other data, I suggested forcibly rebooting the machine from the virtualization system, immediately connecting with Remote Desktop, and disabling the Elasticsearch service,&lt;/strong> then trying to better diagnose the problem.&lt;/p></description></item><item><title>Azure DevOps: check typescript linting for a Pull Request</title><link>https://www.codewrecks.com/post/azdo/pipeline/pipeline-check-lint/</link><pubDate>Fri, 30 Dec 2022 07:00:42 +0000</pubDate><guid>https://www.codewrecks.com/post/azdo/pipeline/pipeline-check-lint/</guid><description>&lt;p>A Pull Request is the moment when new code undergoes formal review to verify that &lt;strong>it meets the basic quality requirements decided by the team&lt;/strong>. Most of the work can be done automatically, thanks to Azure DevOps pipelines and various tools.&lt;/p>
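Such an automated check can be as small as a single script step in the PR validation pipeline; a hedged sketch, assuming an npm project that defines a lint script (names are illustrative, not from the original post):

```yaml
# Illustrative PR check: fail the pipeline (and therefore the PR gate)
# when linting reports errors.
steps:
  - script: |
      npm ci
      npm run lint     # assumed package.json script running eslint; a non-zero exit fails the check
    displayName: Lint TypeScript sources
```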
&lt;p>Some of the checks can be fully automated by special &lt;strong>add-ins, like the integration with SonarCloud,&lt;/strong> so you basically do not need to do anything and you get some nice checks on new code during the PR. Sometimes you want to run custom code or checks, and that is really simple to do with a few PowerShell lines and pipelines.&lt;/p></description></item><item><title>Pills: Azure Devops pipeline Counters</title><link>https://www.codewrecks.com/post/azdo/pills/pipeline-counters/</link><pubDate>Wed, 21 Dec 2022 08:10:42 +0200</pubDate><guid>https://www.codewrecks.com/post/azdo/pills/pipeline-counters/</guid><description>&lt;p>Sometimes you need a unique number in your pipeline, usually to &lt;strong>generate a unique version number, for example for publishing NuGet packages and avoiding conflicts&lt;/strong>. If you use Git (and there is no reason not to use it) you can use &lt;a href="https://gitversion.net/docs/">GitVersion&lt;/a> to generate &lt;strong>a semver version number that is unique for each build&lt;/strong>. But if you are not using Git and GitVersion, or if you need to rebuild the same commit and have a &lt;strong>unique version for each run, regardless of the commit,&lt;/strong> you can use a Counter.&lt;/p></description></item><item><title>Pills: Azure Devops pipeline demands</title><link>https://www.codewrecks.com/post/azdo/pills/pipeline-demands/</link><pubDate>Mon, 19 Dec 2022 08:10:42 +0200</pubDate><guid>https://www.codewrecks.com/post/azdo/pills/pipeline-demands/</guid><description>&lt;p>Situation: you are waiting for a pipeline to run, you look at the agent pool queue and notice that some of the agents are idle; &lt;strong>they are not running any pipeline, yet you have runs waiting in the queue&lt;/strong>.&lt;/p>
&lt;p>There are basically two possible reasons: the first is that you reached the maximum number of parallel pipelines that can run on the pool; the second is that &lt;strong>the pipeline has some demands that are not satisfied by the idle agents&lt;/strong>. Troubleshooting demands can be annoying, but basically it is just a matter of verifying the tasks that are present in the pipeline and checking whether all agents have the required demands.&lt;/p></description></item><item><title>Azure DevOps Server: restart upgrade wizard</title><link>https://www.codewrecks.com/post/azdo/misc/restart-upgrade-wizard-azure-devops-server/</link><pubDate>Fri, 16 Dec 2022 06:00:00 +0200</pubDate><guid>https://www.codewrecks.com/post/azdo/misc/restart-upgrade-wizard-azure-devops-server/</guid><description>&lt;p>There are lots of reasons why you might have an &lt;strong>on-premises installation of Azure DevOps&lt;/strong>, and if you manage it, you must devote some time to keeping it upgraded to the latest version.&lt;/p>
&lt;blockquote>
&lt;p>Keep your Azure DevOps Server instance constantly up to date to avoid overly large upgrades.&lt;/p>&lt;/blockquote>
&lt;p>The upgrade procedure is really simple: you just &lt;strong>launch the setup.exe from the latest version and follow the wizard&lt;/strong>. Not everyone knows that the upgrade is basically a set of steps.&lt;/p></description></item><item><title>Azure Devops Api - Export Work Items</title><link>https://www.codewrecks.com/post/azdo/api/api-step-2-work-item-dump/</link><pubDate>Sat, 26 Nov 2022 07:12:42 +0200</pubDate><guid>https://www.codewrecks.com/post/azdo/api/api-step-2-work-item-dump/</guid><description>&lt;p>One of the most common scenarios where the Azure DevOps API shines is &lt;strong>exporting data into some other db/file&lt;/strong>. There can be lots of legitimate reasons why you want to export data: custom reporting, custom analysis, or putting everything into a file/database to &lt;strong>perform offline queries on your data outside the Azure DevOps interface&lt;/strong>.&lt;/p>
&lt;p>In previous parts we already saw how to &lt;a href="https://www.codewrecks.com/post/azdo/api/api-step-1-connection/">connect to the server&lt;/a> and how to &lt;a href="https://www.codewrecks.com/post/azdo/api/api-step-1a-connection-check/">check if credentials are ok&lt;/a>; if we want to interact with the various parts of the server, we need to get an instance of the &lt;strong>appropriate client class&lt;/strong>. In the sample code, once the connection is established, the &lt;strong>ConnectionManager class&lt;/strong> is used to obtain an instance of the WorkItemTrackingHttpClient class, using the &lt;a href="https://learn.microsoft.com/en-us/previous-versions/visualstudio/visual-studio-2013/dn228356%28v=vs.120%29">&lt;strong>GetClient&lt;T>()&lt;/strong> method of VssConnection&lt;/a>.&lt;/p></description></item><item><title>Azure Devops Api - Detect if credentials are ok</title><link>https://www.codewrecks.com/post/azdo/api/api-step-1a-connection-check/</link><pubDate>Sat, 19 Nov 2022 07:12:42 +0200</pubDate><guid>https://www.codewrecks.com/post/azdo/api/api-step-1a-connection-check/</guid><description>&lt;p>In the previous article of the series I &lt;a href="https://www.codewrecks.com/post/azdo/api/api-step-1-connection/">played with the Azure DevOps API connection&lt;/a>, showing how you can use a &lt;strong>.NET 4.8 full framework application that can log in to Azure DevOps with a token or with an interactive login&lt;/strong>.&lt;/p>
&lt;p>When you use interactive login, the Windows operating system will usually &lt;strong>retain credentials in the credential store&lt;/strong>; this avoids asking for credentials every time the application is run. The code that performs the check is really simple: it just checks whether _vssConnection.AuthorizedIdentity.DisplayName is not &amp;ldquo;anonymous&amp;rdquo;. This happens because &lt;strong>the real credential check is usually done when you perform some real query, not just when you open the connection&lt;/strong>.&lt;/p></description></item><item><title>Quickly create a test instance of KeyCloak in Azure Services</title><link>https://www.codewrecks.com/post/security/start-keycloak-test-instance-in-azure-services/</link><pubDate>Mon, 07 Nov 2022 21:00:37 +0200</pubDate><guid>https://www.codewrecks.com/post/security/start-keycloak-test-instance-in-azure-services/</guid><description>&lt;p>&lt;a href="https://www.keycloak.org/">Keycloak&lt;/a> is a leader in the landscape of Identity Providers, and if you need a quick instance &lt;strong>for dev testing, you can spin up an instance in Azure App Services in less than a minute&lt;/strong>.&lt;/p>
&lt;p>First of all, create a new Azure App Service and choose to use Docker&lt;/p>
&lt;p>&lt;a target="_blank" href="../images/keycloak-app-services.png"> &lt;img src="../images/keycloak-app-services.png" alt="Create a docker based app service" />&lt;/a>
&lt;em>&lt;strong>Figure 1&lt;/strong>&lt;/em>: &lt;em>Create a docker based app service.&lt;/em>&lt;/p>
&lt;p>Now you can simply choose &lt;strong>the image you want to run; jboss/keycloak&lt;/strong> is perfectly OK for my scenario.&lt;/p></description></item><item><title>Azure Devops Api - Connection</title><link>https://www.codewrecks.com/post/azdo/api/api-step-1-connection/</link><pubDate>Sat, 05 Nov 2022 07:12:42 +0200</pubDate><guid>https://www.codewrecks.com/post/azdo/api/api-step-1-connection/</guid><description>&lt;p>Quite often people ask me how to interact programmatically with Azure DevOps Server; the main purpose &lt;strong>is retrieving data for custom reporting&lt;/strong>, but also interacting with the Work Item store, etc. Luckily, all Azure DevOps functionality is exposed via API, so you can &lt;strong>write small programs to automate mundane tasks&lt;/strong> and obtain what you need.&lt;/p>
&lt;p>In this post I&amp;rsquo;ll show how to set up the project in .NET and how to connect to the server. The example can be found on &lt;a href="https://github.com/alkampfergit/AzureDevopsExportQuickAndDirty">GitHub&lt;/a>.&lt;/p></description></item><item><title>Configure Data Protection API in .NET Core</title><link>https://www.codewrecks.com/post/security/asp-net-core-data-protection-api/</link><pubDate>Thu, 03 Nov 2022 16:00:30 +0200</pubDate><guid>https://www.codewrecks.com/post/security/asp-net-core-data-protection-api/</guid><description>&lt;p>ASP.NET Core and .NET Core come with a nice interface to handle encryption, as &lt;a href="https://learn.microsoft.com/en-us/aspnet/core/security/data-protection/using-data-protection">documented here&lt;/a>. Now my goal is configuring the Data Protection API for multiple instances of a piece of software, so we need to &lt;strong>share keys in a shared location and at the same time keep them secret&lt;/strong>. Luckily, .NET Core already has everything we need.&lt;/p>
&lt;p>The overall solution needs two parameters for our program: &lt;strong>a folder where to store keys and a certificate thumbprint to protect the keys&lt;/strong>. In my scenario I want to use a self-signed certificate, because I&amp;rsquo;m not using TLS or other forms of server-side encryption; I only need an extra layer of protection &lt;strong>to allow reading keys only from machines that have my certificate installed&lt;/strong>. First of all I need some code to generate a self-signed certificate; to simplify installation I simply want the IT guy to use the Swagger interface to generate a self-signed certificate and then install it on all the machines he/she needs.&lt;/p></description></item><item><title>Pills: Visual Studio lost access to authenticated Nuget feed</title><link>https://www.codewrecks.com/post/azdo/pills/vs-lose-authentication-to-nuget-feed/</link><pubDate>Mon, 26 Sep 2022 08:12:42 +0200</pubDate><guid>https://www.codewrecks.com/post/azdo/pills/vs-lose-authentication-to-nuget-feed/</guid><description>&lt;p>The symptom is simple: you have a solution &lt;strong>that uses some authenticated NuGet feed&lt;/strong>, like those hosted on Azure DevOps, and suddenly Visual Studio stops authenticating and gets a 401 error, or simply does not show any versions of packages.&lt;/p>
&lt;p>This is a problem that has happened to several people recently, and today was my turn. I have a build that published &lt;strong>another version of a package to a private Azure DevOps feed&lt;/strong>, and after publishing the new version, I started searching for it in Visual Studio&amp;rsquo;s &amp;ldquo;Manage NuGet Packages&amp;rdquo; dialog, but with no results. The thing that makes you realize you have a major problem is that VS &lt;strong>does not list any version except the one installed&lt;/strong>.&lt;/p></description></item><item><title>Pills: Git command line failed to authenticate against Azure DevOps</title><link>https://www.codewrecks.com/post/azdo/pills/git-losing-credentials/</link><pubDate>Sat, 27 Aug 2022 07:10:42 +0200</pubDate><guid>https://www.codewrecks.com/post/azdo/pills/git-losing-credentials/</guid><description>&lt;p>Sometimes it just happens: you issue some Git command and you find that Azure DevOps denies authentication, and it does not prompt for new credentials, so you are &lt;strong>just stuck, not being able to access your account anymore&lt;/strong>.&lt;/p>
&lt;p>&lt;a target="_blank" href="../images/git-authentication-failed.png"> &lt;img src="../images/git-authentication-failed.png" alt="Git authentication failure" />&lt;/a>&lt;/p>
&lt;p>&lt;em>&lt;strong>Figure 1&lt;/strong>&lt;/em>: &lt;em>Git authentication failure&lt;/em>&lt;/p>
&lt;p>As you can see in Figure 1, you get an Authentication Failed error, and you are not prompted for new credentials. &lt;strong>Azure DevOps uses Personal Access Tokens to give access to Git, and this is done with an automatic procedure triggered by the command line&lt;/strong>. In Figure 2 you can see the Credential Manager window that appears the first time you connect to an Azure DevOps server.&lt;/p></description></item><item><title>Pills: Azure Devops artifacts retention policy</title><link>https://www.codewrecks.com/post/azdo/pills/artifacts-retention-policy/</link><pubDate>Mon, 08 Aug 2022 08:10:42 +0200</pubDate><guid>https://www.codewrecks.com/post/azdo/pills/artifacts-retention-policy/</guid><description>&lt;p>When you use Azure DevOps feeds extensively, you will end up with lots of small projects that automatically publish packages at each build. I have projects where &lt;strong>each commit publishes a package thanks to GitVersion, which generates a unique number for each build&lt;/strong>. This ends in a situation where thousands of packages are generated and uploaded to Azure DevOps.&lt;/p>
&lt;p>In the feed settings page you have a &lt;strong>retention policy&lt;/strong> that automatically deletes old packages but keeps recently used ones, to avoid removing a package that is still in use.&lt;/p></description></item><item><title>Accessing Office 365 with IMAP and OAuth2</title><link>https://www.codewrecks.com/post/security/accessing-office-365-imap-with-oauth2/</link><pubDate>Mon, 01 Aug 2022 10:13:30 +0200</pubDate><guid>https://www.codewrecks.com/post/security/accessing-office-365-imap-with-oauth2/</guid><description>&lt;h1 id="the-situation">The situation&lt;/h1>
&lt;p>I needed to upgrade some code that uses IMAP folders to download email, sometimes with Office 365 accounts, but Microsoft &lt;strong>will remove Basic Auth in the future, &lt;a href="https://docs.microsoft.com/en-us/exchange/clients-and-mobile-in-exchange-online/deprecation-of-basic-authentication-exchange-online">as described here&lt;/a>, in favor of OAuth2-based authentication.&lt;/strong> This is a good move, because Basic Auth is not really secure, and with modern authentication and OAuth2 you can &lt;strong>force two-factor auth and other more secure login alternatives&lt;/strong>.&lt;/p></description></item><item><title>Azure DevOps: Conditional variable value in pipeline</title><link>https://www.codewrecks.com/post/azdo/pipeline/conditional-variable-in-pipeline/</link><pubDate>Tue, 21 Jun 2022 08:00:42 +0000</pubDate><guid>https://www.codewrecks.com/post/azdo/pipeline/conditional-variable-in-pipeline/</guid><description>&lt;p>Let&amp;rsquo;s examine a simple situation: in Azure DevOps you do not have a way to change pipeline priority; thus, if &lt;strong>you need to have an agent always ready for high-priority builds, you can resort to using more agent pools&lt;/strong>. Basically you have N licenses for pipelines, so you can create N-1 agents in the default pool and &lt;strong>create another pool, let&amp;rsquo;s call it Fast, where you have an agent installed on a high-performance machine&lt;/strong>. 
When you need a pipeline run with high priority, you can schedule it to run in the Fast pool and you are done.&lt;/p></description></item><item><title>Azure DevOps: complete Pull Requests via web interface</title><link>https://www.codewrecks.com/post/azdo/misc/complete-pull-request-via-web-interface/</link><pubDate>Tue, 31 May 2022 06:00:00 +0200</pubDate><guid>https://www.codewrecks.com/post/azdo/misc/complete-pull-request-via-web-interface/</guid><description>&lt;p>Pull Requests are the heartbeat of your code: you have a request/bugfix/feature in the form of a work item, the team starts a branch and writes code, then when the code is finished &lt;strong>a Pull Request is opened to ask the team for a review before the code is merged into the stable branch and goes to production&lt;/strong>. When you regularly deploy your product in an automated way, you need to &lt;strong>minimize the risk that new code can break existing code&lt;/strong>, and having other members of the team review the code increment is an invaluable practice. Pull Requests can also include discussion about the code increment, so you can examine those discussions in the future when you wonder why a piece of code was written in that specific way.&lt;/p></description></item><item><title>Tutorial: Programming Playstation 2</title><link>https://www.codewrecks.com/post/old/memories/playstation2-tutorial/</link><pubDate>Wed, 25 May 2022 05:00:00 +0000</pubDate><guid>https://www.codewrecks.com/post/old/memories/playstation2-tutorial/</guid><description>&lt;p>Looking through old backups and old disks can be a dive into the past; you find old stuff like articles and tutorials, and yesterday I stumbled on my old tutorial on the Playstation 2. There was a time when I was really interested in computer graphics, and I had a Playstation 2 Linux kit that a dear friend of mine gave me. The kit was really interesting &lt;strong>because it allowed you to play with a nice and pretty architecture for the time&lt;/strong>. 
The kit was somewhat limited, but thanks to the community there were some kernel libraries that allowed direct access to the DMA and to all units in the PS2.&lt;/p></description></item><item><title>Case insensitive key dictionaries and MongoDb C# serializers</title><link>https://www.codewrecks.com/post/general/mongodb-dictionary-serialization-case-insensitive/</link><pubDate>Wed, 18 May 2022 08:00:00 +0200</pubDate><guid>https://www.codewrecks.com/post/general/mongodb-dictionary-serialization-case-insensitive/</guid><description>&lt;p>First of all, every C# programmer should know that the Dictionary&amp;lt;TKey, TValue&amp;gt; class (as well as other collections) has a special constructor that can be used to specify the &lt;strong>comparer used to compare keys in the dictionary&lt;/strong>. The most obvious situation is when you have a string key &lt;strong>and you want the dictionary to be case insensitive during key search&lt;/strong>.&lt;/p>
&lt;div class="highlight">&lt;div style="color:#f8f8f2;background-color:#272822;-moz-tab-size:4;-o-tab-size:4;tab-size:4;">
&lt;table style="border-spacing:0;padding:0;margin:0;border:0;">&lt;tr>&lt;td style="vertical-align:top;padding:0;margin:0;border:0;">
&lt;pre tabindex="0" style="color:#f8f8f2;background-color:#272822;-moz-tab-size:4;-o-tab-size:4;tab-size:4;">&lt;code>&lt;span style="white-space:pre;-webkit-user-select:none;user-select:none;margin-right:0.4em;padding:0 0.4em 0 0.4em;color:#7f7f7f">1
&lt;/span>&lt;span style="white-space:pre;-webkit-user-select:none;user-select:none;margin-right:0.4em;padding:0 0.4em 0 0.4em;color:#7f7f7f">2
&lt;/span>&lt;/code>&lt;/pre>&lt;/td>
&lt;td style="vertical-align:top;padding:0;margin:0;border:0;;width:100%">
&lt;pre tabindex="0" style="color:#f8f8f2;background-color:#272822;-moz-tab-size:4;-o-tab-size:4;tab-size:4;">&lt;code class="language-csharp" data-lang="csharp">&lt;span style="display:flex;">&lt;span> &lt;span style="color:#66d9ef">public&lt;/span> SortedDictionary&amp;lt;&lt;span style="color:#66d9ef">string&lt;/span>, StringProperty&amp;gt; StringProperties { &lt;span style="color:#66d9ef">get&lt;/span>; &lt;span style="color:#66d9ef">private&lt;/span> &lt;span style="color:#66d9ef">set&lt;/span>; } 
&lt;/span>&lt;/span>&lt;span style="display:flex;">&lt;span> = &lt;span style="color:#66d9ef">new&lt;/span> SortedDictionary&amp;lt;&lt;span style="color:#66d9ef">string&lt;/span>, StringProperty&amp;gt;(StringComparer.OrdinalIgnoreCase);&lt;/span>&lt;/span>&lt;/code>&lt;/pre>&lt;/td>&lt;/tr>&lt;/table>
&lt;/div>
&lt;/div>
&lt;p>The above code is inside a class where I need to keep a dictionary of the StringProperty class, using a SortedDictionary where the key &lt;strong>must be case insensitive&lt;/strong>. This allows me to write code like this&lt;/p></description></item><item><title>Developers and TLS what could possibly go wrong</title><link>https://www.codewrecks.com/post/security/developers-and-tls/</link><pubDate>Sat, 30 Apr 2022 08:13:30 +0200</pubDate><guid>https://www.codewrecks.com/post/security/developers-and-tls/</guid><description>&lt;h1 id="the-problem-of-not-using-tls-in-developer-machines">The problem of not using TLS in developer machines&lt;/h1>
&lt;p>A long time ago, when Windows Communication Foundation was a thing, there were good &lt;strong>automatic protections by Microsoft that prevented passing credentials in clear text over an unencrypted (non-TLS) channel&lt;/strong>. I was amazed by the number of solutions you can find on the internet that, to solve the problem, suggest the developer create an &lt;strong>insecure channel that does not perform this check, allowing clear-text credentials to be sent over a standard HTTP channel&lt;/strong>.&lt;/p></description></item><item><title>How to handle certificate error in dotnet WebClient object</title><link>https://www.codewrecks.com/post/security/handle-certificate-errors-in-dotnet-webclient/</link><pubDate>Fri, 18 Mar 2022 08:14:37 +0200</pubDate><guid>https://www.codewrecks.com/post/security/handle-certificate-errors-in-dotnet-webclient/</guid><description>&lt;h2 id="the-situation">The situation&lt;/h2>
&lt;p>This is a simple scenario: I use a WebClient object in .NET to perform some web requests to a target web site; everything went well except when the code runs on Xamarin Android, &lt;strong>where it throws an exception on HTTPS connections&lt;/strong>. This is a puzzling moment, because I&amp;rsquo;m simply doing an HTTP GET request for a page; everything works outside Xamarin, and all I get in response is an error telling me that the certificate is not OK.&lt;/p></description></item><item><title>Elasticsearch and weird unicode char</title><link>https://www.codewrecks.com/post/general/elasticsearch-and-weird-unicode-char/</link><pubDate>Sat, 26 Feb 2022 06:00:00 +0200</pubDate><guid>https://www.codewrecks.com/post/general/elasticsearch-and-weird-unicode-char/</guid><description>&lt;p>Scenario: you have software with a standard search made in the UI for data in a board; then you need to move the search server side (due to the increasing number of objects in the board), so &lt;strong>you switch to Elasticsearch to have nice full-text functionality&lt;/strong>. Then you get an exact request: does Elasticsearch support searching with Unicode chars? The answer is yes, but then you find that users complain about not being able to search with Unicode chars.&lt;/p></description></item><item><title>Clone a simple dashboard with API in Azure DevOps</title><link>https://www.codewrecks.com/post/azdo/api/api-clone-dashboard-azure-devops/</link><pubDate>Fri, 11 Feb 2022 07:12:42 +0200</pubDate><guid>https://www.codewrecks.com/post/azdo/api/api-clone-dashboard-azure-devops/</guid><description>&lt;p>If you work in Scrum, being able to visualize data on current and past Sprints is an invaluable way to keep track of team improvement. &lt;strong>Azure DevOps allows you to create Dashboards to visualize interesting metrics&lt;/strong>, but you do not actually have a way to create dynamic Dashboards, i.e. 
a Dashboard that allows you to specify a parametric query so you can, for example, change the iteration and view how the data changes.&lt;/p></description></item><item><title>Symbol server made easy with Azure DevOps</title><link>https://www.codewrecks.com/post/programming/aspnet/working-with-symbols/</link><pubDate>Sat, 05 Feb 2022 08:13:30 +0000</pubDate><guid>https://www.codewrecks.com/post/programming/aspnet/working-with-symbols/</guid><description>&lt;p>I&amp;rsquo;ve blogged in the past about &lt;a href="https://www.codewrecks.com/post/old/2013/07/manage-symbol-server-on-azure-or-on-premise-vm-and-tf-service/">symbol server&lt;/a>, to recap here is the scenario.&lt;/p>
&lt;p>In your organization you have some sort of &lt;strong>common .NET code that is shared between projects&lt;/strong>, but you have to face and resolve some problems.&lt;/p>
&lt;ol>
&lt;li>How to compile dlls&lt;/li>
&lt;li>How to distribute dlls&lt;/li>
&lt;li>How to make debugging easier for the user of the dll.&lt;/li>
&lt;/ol>
&lt;p>We clearly have a standard solution for each of these parts.&lt;/p>
&lt;h2 id="compiling-dll">Compiling dll&lt;/h2>
&lt;blockquote>
&lt;p>You MUST compile dlls on a build server.&lt;/p></description></item><item><title>GitHub issue templates</title><link>https://www.codewrecks.com/post/github/github-issue-templates/</link><pubDate>Sat, 22 Jan 2022 08:00:00 +0200</pubDate><guid>https://www.codewrecks.com/post/github/github-issue-templates/</guid><description>&lt;p>After Microsoft&amp;rsquo;s acquisition of GitHub there is a bit of confusion about what to use for your new projects: Azure DevOps or GitHub? The answer &lt;strong>is somewhat complex, but the most honest response is to use whichever of the two you find more adherent to how you work&lt;/strong>.&lt;/p>
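To make GitHub's flat, label-based approach concrete, here is a tiny sketch of an issue form (the YAML flavor of issue templates, stored under .github/ISSUE_TEMPLATE/). The labels and fields below are invented for illustration; the keys follow GitHub's issue-forms schema:

```yaml
# Hypothetical file: .github/ISSUE_TEMPLATE/bug_report.yml
name: Bug report
description: File a bug report
labels: ["bug", "triage"]
body:
  - type: textarea
    id: repro
    attributes:
      label: Steps to reproduce
      description: What did you do, and what happened instead?
    validations:
      required: true
  - type: input
    id: version
    attributes:
      label: Product version
```

When a user opens an issue from this template, GitHub renders the form fields and applies the labels automatically, which is as close as the flat model gets to typed work items.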
&lt;p>The part that differs most is the issue/board area, because the two products diverge widely there, with very different capabilities and very different basic concepts. While Azure DevOps enforces complex tracking with &lt;strong>explicit WorkItem types and custom fields&lt;/strong>, GitHub uses a &lt;strong>flat approach to the problem, relying only on Issues with labels and a few fields&lt;/strong>.&lt;/p></description></item><item><title>GitHub actions templates</title><link>https://www.codewrecks.com/post/github/github-actions-templates/</link><pubDate>Fri, 07 Jan 2022 08:00:00 +0200</pubDate><guid>https://www.codewrecks.com/post/github/github-actions-templates/</guid><description>&lt;p>GitHub Actions are closing the gap with Azure DevOps pipelines day by day; one of the features introduced months ago was &lt;strong>Action templates, the ability to re-use action definitions between repositories&lt;/strong>. You can read all the details in the &lt;a href="https://docs.github.com/en/actions/learn-github-actions/creating-starter-workflows-for-your-organization">Official GitHub documentation&lt;/a>.&lt;/p>
&lt;p>Templates are really useful because usually, in one organization, you tend to &lt;strong>use a restricted set of technologies with the very same set of actions to build and release the code&lt;/strong>. You already know that I&amp;rsquo;m a fan of builds in PowerShell or another scripting engine, but for simple projects you can simply rely on GitHub actions without resorting to any scripting language.&lt;/p></description></item><item><title>Azure DevOps: run test in PowerShell and publish results</title><link>https://www.codewrecks.com/post/azdo/pipeline/run-test-in-powershell/</link><pubDate>Sat, 01 Jan 2022 07:00:42 +0000</pubDate><guid>https://www.codewrecks.com/post/azdo/pipeline/run-test-in-powershell/</guid><description>&lt;p>Creating a full build in PowerShell has &lt;a href="https://www.codewrecks.com/post/azdo/pipeline/powershell-build/">lots of advantages&lt;/a> because you can simply &lt;strong>launch the script and have the build run in any environment&lt;/strong>. This tremendously simplifies debugging build scripts and moving Continuous Integration from one engine to another (e.g. from Azure DevOps to GitHub actions).&lt;/p>
&lt;p>Clearly you need to think in advance about how to integrate &lt;strong>with your current CI engine&lt;/strong>, because usually you will need to communicate information back to the engine.&lt;/p></description></item><item><title>GitHub Security enforcer action</title><link>https://www.codewrecks.com/post/github/security-enforcer/</link><pubDate>Fri, 10 Dec 2021 08:00:00 +0200</pubDate><guid>https://www.codewrecks.com/post/github/security-enforcer/</guid><description>&lt;p>GitHub takes &lt;a href="https://www.codewrecks.com/post/security/github-security-scanning/">security seriously&lt;/a> and gives you some nice capabilities to &lt;strong>improve the security of your code through its whole lifecycle&lt;/strong>. GitHub actions can be used to automatically run a security code analysis on your repositories, a task that should run for &lt;strong>all of the repositories in your organization&lt;/strong>.&lt;/p>
&lt;blockquote>
&lt;p>Security scanning should be enabled on all repositories&lt;/p>&lt;/blockquote>
&lt;p>Some days ago in a &lt;a href="https://github.blog/2021-11-22-accelerate-security-adoption-in-your-organization/">GitHub blog post&lt;/a> a new action called &lt;a href="https://github.com/marketplace/actions/advanced-security-enforcer#example-workflow">Advanced-Security-Enforcer&lt;/a> was announced, aimed &lt;strong>at automating the task of adding a GitHub Workflow to perform code analysis&lt;/strong>.&lt;/p></description></item><item><title>GitHub Actions permission settings</title><link>https://www.codewrecks.com/post/github/action-permission/</link><pubDate>Sat, 27 Nov 2021 06:00:00 +0000</pubDate><guid>https://www.codewrecks.com/post/github/action-permission/</guid><description>&lt;p>Continuous integration is absolutely vital for a healthy software project, but in many situations &lt;strong>people give little attention to security&lt;/strong>. If you are running CI workflows on your own machine where &lt;strong>you control the code that is built and every script that is run by the CI engine, you are pretty safe&lt;/strong>. In that scenario an attacker would need to take control of your code to run some malicious script during a CI run; but if you are using a third party task/extension/action, the situation is different.&lt;/p>
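One common mitigation for the third-party-action risk is restricting what the workflow token is allowed to do. A minimal sketch (the workflow, job, and script names are invented) using the workflow-level permissions key looks like this:

```yaml
# Hypothetical workflow: restrict the default GITHUB_TOKEN to read-only,
# granting write access only where a step actually needs it.
name: build
on: [push]

permissions:
  contents: read        # enough for checkout; no pushes allowed
  pull-requests: write  # e.g. to let a step comment on pull requests

jobs:
  build:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - run: ./build.ps1
```

With this in place, a compromised third-party action running in the job cannot use the token for anything beyond the scopes you explicitly granted.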
&lt;h2 id="scan-the-machine">Scan the machine&lt;/h2>
&lt;p>A standard Nmap scan reveals ssh and port 80 open, hosting a nice Joomla web site. If you do not need to be especially stealthy, you can let nmap test for vulnerabilities with its standard scripts, nothing special.&lt;/p></description></item><item><title>Pills: Pipeline decorators</title><link>https://www.codewrecks.com/post/azdo/pills/pipeline-decorators/</link><pubDate>Sat, 20 Nov 2021 08:12:42 +0200</pubDate><guid>https://www.codewrecks.com/post/azdo/pills/pipeline-decorators/</guid><description>&lt;p>Pipeline decorators are a really peculiar feature of Azure DevOps, because they allow you to &lt;strong>specify a series of tasks that are run for EVERY pipeline in your organization&lt;/strong>, so they are rarely needed, but they are nevertheless a nice tool to know because there are situations where they are useful. Moreover, in the latest &lt;a href="https://docs.microsoft.com/en-us/azure/devops/release-notes/2021/sprint-194-update">Sprint 194 update&lt;/a> they were expanded to support new functionality, like running &lt;strong>before or after specific tasks&lt;/strong>.&lt;/p>
We have a problem with new releases, because some clients &lt;strong>need to force a cache clear in the browser to see the new translated terms&lt;/strong>; basically the browser keeps using the older files instead of the new ones.&lt;/p></description></item><item><title>Azure DevOps Pills: Pull request template</title><link>https://www.codewrecks.com/post/azdo/pills/pull-request-template/</link><pubDate>Sun, 14 Nov 2021 08:12:42 +0200</pubDate><guid>https://www.codewrecks.com/post/azdo/pills/pull-request-template/</guid><description>&lt;p>Azure DevOps is a really big product and sometimes there are really useful features that are poorly publicized and go under the radar. One of these is &lt;a href="https://docs.microsoft.com/en-us/azure/devops/repos/git/pull-request-templates?view=azure-devops">Pull Request Templates&lt;/a>, a really useful feature that allows you &lt;strong>to specify a markdown template for your pull requests&lt;/strong>.&lt;/p>
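Since the feature is plain markdown, a tiny illustrative template is enough to see the value. The section names below are invented; the path is the convention described in the Azure DevOps docs (a default template lives at .azuredevops/pull_request_template.md in the repository):

```markdown
## What does this PR change?

## How was it tested?

- [ ] Unit tests added or updated
- [ ] Manually verified on a test environment
```

Every new pull request in the repository is then pre-filled with these sections, gently nudging authors toward a consistent description.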
&lt;p>I do not want to go into technical details, you can find &lt;a href="https://docs.microsoft.com/en-us/azure/devops/repos/git/pull-request-templates?view=azure-devops">all instructions in the official documentation&lt;/a>, but I&amp;rsquo;d like to point out why this feature is so useful.&lt;/p></description></item><item><title>Azure DevOps: Azure File copy troubleshooting</title><link>https://www.codewrecks.com/post/azdo/pipeline/azure-file-copy/</link><pubDate>Wed, 20 Oct 2021 07:00:42 +0000</pubDate><guid>https://www.codewrecks.com/post/azdo/pipeline/azure-file-copy/</guid><description>&lt;p>If you need to copy files to an Azure Blob or an Azure Virtual Machine within an Azure DevOps pipeline, the Azure File Copy Task is the right task to use, but sometimes you can run into problems that make it fail. In this post I&amp;rsquo;ll list some common errors I found using it and how to solve them.&lt;/p>
&lt;h4 id="wrong-number-of-arguments-please-refer-to-the-help-page-on-usage-of-this-command">Wrong number of arguments, please refer to the help page on usage of this command&lt;/h4>
&lt;p>If you specify &lt;strong>additional arguments to the task you can get this error&lt;/strong>; I was not able to fully troubleshoot the reason, but I discovered that version 4 of the task is somewhat erratic, so it is really &lt;strong>better to use version 3, which seems much more stable to me&lt;/strong>.&lt;/p></description></item><item><title>Determine version with GitVersion for a Python project</title><link>https://www.codewrecks.com/post/github/giversion-python-general/</link><pubDate>Sat, 04 Sep 2021 08:00:00 +0200</pubDate><guid>https://www.codewrecks.com/post/github/giversion-python-general/</guid><description>&lt;p>Project used for this example can be found &lt;a href="https://github.com/alkampfergit/GitGraphAutomation">in GitHub&lt;/a>.&lt;/p>
&lt;p>In GitHub actions you can use .NET based tools, both on Windows and on Linux machines, to accomplish various tasks. I&amp;rsquo;m a great fan of the GitVersion tool, used to determine a semantic version from a Git repository that follows the git-flow structure. Another nice aspect is that &lt;strong>GitHub action machines based on Linux come with PowerShell Core preinstalled, so I can use what comes from the PowerShell Gallery without any problems&lt;/strong> &amp;hellip; errr.. almost.&lt;/p></description></item><item><title>Analyze Python project with SonarCloud and GitHub</title><link>https://www.codewrecks.com/post/github/python-sonarcloud-actions/</link><pubDate>Sat, 28 Aug 2021 08:00:00 +0200</pubDate><guid>https://www.codewrecks.com/post/github/python-sonarcloud-actions/</guid><description>&lt;p>SonarCloud is free for Open Source projects, and for languages like Python, which do not need compilation, it can directly &lt;strong>analyze the repository without any intervention from the author.&lt;/strong> This feature is automatically enabled when you set up your project in SonarCloud and it detects that you have a non-compiled language.&lt;/p>
&lt;p>&lt;a target="_blank" href="../images/sonar-cloud-analysis.png"> &lt;img src="../images/sonar-cloud-analysis.png" alt="Analysis configuration shows that in this project we have CI analysis" />&lt;/a>&lt;/p>
&lt;p>&lt;em>&lt;strong>Figure 1&lt;/strong>&lt;/em>: &lt;em>Analysis configuration shows that in this project we have CI analysis&lt;/em>&lt;/p></description></item><item><title>Configure Codespaces for Python projects</title><link>https://www.codewrecks.com/post/github/codespaces-python/</link><pubDate>Thu, 26 Aug 2021 08:00:00 +0200</pubDate><guid>https://www.codewrecks.com/post/github/codespaces-python/</guid><description>&lt;p>One of the great advantages of Codespaces is the ability &lt;a href="https://www.codewrecks.com/post/github/configuring-codespaces/">to preconfigure the environment&lt;/a> so you do not need to &lt;strong>waste time installing and configuring your toolchain&lt;/strong>. Python is a perfect example of this scenario: I&amp;rsquo;ve a small project &lt;a href="https://github.com/alkampfergit/GitGraphAutomation">to generate Git Graph Representation&lt;/a> and, since I&amp;rsquo;m not a full time Python developer, I do not have it installed and perfectly configured in all of my environments. Also, I primarily work on Windows, so &lt;strong>Codespaces allows me to test everything on Linux with a single click&lt;/strong>.&lt;/p></description></item><item><title>Generate Git graph with Gitgraph.js and Python</title><link>https://www.codewrecks.com/post/general/git-graph/</link><pubDate>Tue, 24 Aug 2021 08:00:00 +0200</pubDate><guid>https://www.codewrecks.com/post/general/git-graph/</guid><description>&lt;p>&lt;a href="https://gitgraphjs.com/#0">GitGraph&lt;/a> is a nice library to create a graphic representation of a Git log, and the really nice aspect &lt;strong>is that it is widely used and produces pictures that are easily recognized as Git history&lt;/strong>. It can work in many ways, but the easiest is importing commits as json in an HTML page.&lt;/p>
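To make the json-import idea concrete, here is a minimal sketch (not the actual code of the GitGraphAutomation project) of turning git log output with a custom format string into json suitable for embedding in an HTML page; the format string and field names are assumptions:

```python
import json

# Unit separator used as field delimiter in a hypothetical format string:
#   git log --pretty=format:'%H%x1f%an%x1f%s'
SEP = "\x1f"

def parse_git_log(text):
    """Parse 'git log --pretty' output (hash, author, subject) into dicts."""
    commits = []
    for line in text.strip().splitlines():
        sha, author, subject = line.split(SEP)
        commits.append({"sha": sha, "author": author, "subject": subject})
    return commits

# Sample of what git log would emit with the format above.
sample = "abc123\x1fAlice\x1fInitial commit\ndef456\x1fBob\x1fAdd feature"
print(json.dumps(parse_git_log(sample), indent=2))
```

The resulting json array can then be injected into a page that feeds it to GitGraph.js for rendering.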
&lt;p>With very few lines of Python code I built a simple POC that uses git log to extract the log as json, &lt;strong>then creates an html page with the extracted json that renders a simple png with the full history&lt;/strong>. You can find the project in &lt;a href="https://github.com/alkampfergit/GitGraphAutomation">Github&lt;/a> and it is really simple to use (just read the readme).&lt;/p></description></item><item><title>Configure Codespaces for a real project</title><link>https://www.codewrecks.com/post/github/configuring-codespaces/</link><pubDate>Fri, 20 Aug 2021 08:00:00 +0200</pubDate><guid>https://www.codewrecks.com/post/github/configuring-codespaces/</guid><description>&lt;p>In a &lt;a href="https://www.codewrecks.com/post/github/codespaces-hugo/">previous post&lt;/a> I explored the many advantages I&amp;rsquo;ve found &lt;strong>using GitHub Codespaces to author blog posts directly in a browser&lt;/strong>. That example was surely too simple; after all, a hugo blog is just markdown, but nevertheless Codespaces lets me configure my environment with great ease.&lt;/p>
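As an illustrative sketch of such a configuration (the image tag and extension list are assumptions, not the blog's actual setup), a minimal .devcontainer/devcontainer.json could look like this:

```json
{
  "name": "python-project",
  "image": "mcr.microsoft.com/devcontainers/python:3.11",
  "customizations": {
    "vscode": {
      "extensions": ["ms-python.python"]
    }
  },
  "postCreateCommand": "pip install -r requirements.txt"
}
```

When the codespace is created, the container image, editor extensions, and dependencies are all ready without any manual setup.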
&lt;p>You can follow the guide &lt;a href="https://docs.github.com/en/codespaces/customizing-your-codespace/configuring-codespaces-for-your-project">at the official link&lt;/a>, but here is a quick summary of how I configured my codespaces for my blog. First of all, you can directly add a configuration file inside the codespace&lt;/p></description></item><item><title>Use GitHub codespaces to author blog post</title><link>https://www.codewrecks.com/post/github/codespaces-hugo/</link><pubDate>Thu, 19 Aug 2021 08:00:00 +0200</pubDate><guid>https://www.codewrecks.com/post/github/codespaces-hugo/</guid><description>&lt;h2 id="the-scenario">The scenario&lt;/h2>
&lt;p>I played &lt;a href="https://www.codewrecks.com/post/general/github-codespaces-first-impression/">a little bit with GitHub Codespaces&lt;/a> when it was in preview; now it is time to use it for real activities to &lt;strong>understand the scenarios where it can be useful&lt;/strong>.&lt;/p>
&lt;p>To have a real test you need &lt;strong>to set up a goal, verify whether the tool is capable of reaching that goal, and whether it is an advantage over existing tools&lt;/strong>. My first goal is being able to write blog posts in Hugo with GitHub Codespaces and to determine if it is more productive than running a standalone local version of Visual Studio Code.&lt;/p></description></item><item><title>Choose environment from branch in GitHub action</title><link>https://www.codewrecks.com/post/github/choose-environment-from-branch/</link><pubDate>Wed, 18 Aug 2021 08:00:00 +0200</pubDate><guid>https://www.codewrecks.com/post/github/choose-environment-from-branch/</guid><description>&lt;p>I have a friend who asked me how to &lt;strong>choose an &lt;a href="https://docs.github.com/en/actions/reference/environments">environment&lt;/a> in a GitHub action based on the branch that triggered the action&lt;/strong>. Usually Environments are used in a sort of promotion mechanism, where you start deploying on Test, then you have a manual or automatic approval to deploy on staging and finally on production. Even if this is a textbook scenario, &lt;strong>sometimes you need to create a simple sequence of steps to deploy your software to an environment and you want to choose the environment based on the branch&lt;/strong>. If you push master you deploy to production; if you push develop you deploy to test.&lt;/p></description></item><item><title>Fine control of compression level in 7zip</title><link>https://www.codewrecks.com/post/general/some-thoughts-over-7zip/</link><pubDate>Tue, 10 Aug 2021 20:00:00 +0200</pubDate><guid>https://www.codewrecks.com/post/general/some-thoughts-over-7zip/</guid><description>&lt;p>Even if 7zip is not really a DevOps topic, it is really interesting in the context of a build pipeline and &lt;strong>artifacts publishing&lt;/strong>. 
Azure DevOps Pipelines and GitHub actions have &lt;strong>dedicated actions to upload artifacts to the result of the pipeline/action&lt;/strong>, but I often do not like this approach. Even if you can download everything as a zip, I like to create different archives before uploading, for at least two reasons&lt;/p></description></item><item><title>Passing boolean parameters to PowerShell scripts in Azure DevOps Pipeline</title><link>https://www.codewrecks.com/post/azdo/pipeline/powershell-boolean/</link><pubDate>Sun, 08 Aug 2021 20:00:00 +0200</pubDate><guid>https://www.codewrecks.com/post/azdo/pipeline/powershell-boolean/</guid><description>&lt;p>Let&amp;rsquo;s start from the problem: I have an Azure DevOps pipeline that calls a PowerShell script, and the team needs to change the pipeline to allow &lt;strong>a boolean parameter to be passed to the PowerShell script when you queue the pipeline&lt;/strong>. The first attempt produces this error:&lt;/p>
&lt;div class="highlight">&lt;div style="color:#f8f8f2;background-color:#272822;-moz-tab-size:4;-o-tab-size:4;tab-size:4;">
&lt;table style="border-spacing:0;padding:0;margin:0;border:0;">&lt;tr>&lt;td style="vertical-align:top;padding:0;margin:0;border:0;">
&lt;pre tabindex="0" style="color:#f8f8f2;background-color:#272822;-moz-tab-size:4;-o-tab-size:4;tab-size:4;">&lt;code>&lt;span style="white-space:pre;-webkit-user-select:none;user-select:none;margin-right:0.4em;padding:0 0.4em 0 0.4em;color:#7f7f7f">1
&lt;/span>&lt;span style="white-space:pre;-webkit-user-select:none;user-select:none;margin-right:0.4em;padding:0 0.4em 0 0.4em;color:#7f7f7f">2
&lt;/span>&lt;span style="white-space:pre;-webkit-user-select:none;user-select:none;margin-right:0.4em;padding:0 0.4em 0 0.4em;color:#7f7f7f">3
&lt;/span>&lt;span style="white-space:pre;-webkit-user-select:none;user-select:none;margin-right:0.4em;padding:0 0.4em 0 0.4em;color:#7f7f7f">4
&lt;/span>&lt;/code>&lt;/pre>&lt;/td>
&lt;td style="vertical-align:top;padding:0;margin:0;border:0;;width:100%">
&lt;pre tabindex="0" style="color:#f8f8f2;background-color:#272822;-moz-tab-size:4;-o-tab-size:4;tab-size:4;">&lt;code class="language-powershell" data-lang="powershell">&lt;span style="display:flex;">&lt;span>C:\a\_work\&lt;span style="color:#ae81ff">56&lt;/span>\s\build.dotnet.ps1 &lt;span style="color:#960050;background-color:#1e0010">:&lt;/span> 
&lt;/span>&lt;/span>&lt;span style="display:flex;">&lt;span>Cannot &lt;span style="color:#66d9ef">process&lt;/span> argument transformation on parameter &lt;span style="color:#e6db74">&amp;#39;forceInstallPackage&amp;#39;&lt;/span>. Cannot 
&lt;/span>&lt;/span>&lt;span style="display:flex;">&lt;span>convert value &lt;span style="color:#e6db74">&amp;#34;System.String&amp;#34;&lt;/span> to type &lt;span style="color:#e6db74">&amp;#34;System.Boolean&amp;#34;&lt;/span>. 
&lt;/span>&lt;/span>&lt;span style="display:flex;">&lt;span>Boolean parameters accept only Boolean values and numbers, &lt;/span>&lt;/span>&lt;/code>&lt;/pre>&lt;/td>&lt;/tr>&lt;/table>
&lt;/div>
&lt;/div>
&lt;p>The original code of the pipeline is the following one.&lt;/p></description></item><item><title>GitHub security scan - an example</title><link>https://www.codewrecks.com/post/security/github-security-scanning/</link><pubDate>Sat, 17 Jul 2021 15:10:00 +0200</pubDate><guid>https://www.codewrecks.com/post/security/github-security-scanning/</guid><description>&lt;p>I&amp;rsquo;ve already blogged &lt;a href="https://www.codewrecks.com/post/github/code-scanning-result/">on the security scanning capability offered by GitHub&lt;/a> and in this post I want to give you another example of a possible output. In the previous example I showed a result that is quite simple: &lt;strong>the library identified a usage of ECB in AES encryption and flagged it as a wrong usage of the crypto API&lt;/strong>. It is interesting but less impressive; after all, it simply spotted the usage of an enum value related to a vulnerable CipherMode, something that is easy to spot.&lt;/p></description></item><item><title>Playing with Cryptography, Part 1</title><link>https://www.codewrecks.com/post/security/playing-with-cryptography-part1/</link><pubDate>Sat, 17 Jul 2021 09:13:30 +0200</pubDate><guid>https://www.codewrecks.com/post/security/playing-with-cryptography-part1/</guid><description>&lt;p>Cryptography is a fascinating subject, surely complex, but as a developer you probably have some &lt;strong>predefined libraries in your language/environment of choice that you can use&lt;/strong>. DotNet is no exception, so I&amp;rsquo;ve decided to create a sample repository to play a little bit with all the cryptography primitives and to show how easy it is to use them &lt;a href="https://github.com/alkampfergit/DotNetCoreCryptography">https://github.com/alkampfergit/DotNetCoreCryptography&lt;/a>.&lt;/p>
&lt;p>This is not a tutorial, it is more a repository where I played with the API to gain more confidence with the &lt;strong>.Net Core version of the API&lt;/strong>. The purpose is also to understand whether you can &lt;strong>wrap the Crypto API to make it simpler for developers to use, preventing people from using it in different ways across the same software&lt;/strong>.&lt;/p></description></item><item><title>Code coverage in SonarCloud and GitHub Actions</title><link>https://www.codewrecks.com/post/github/github-sonarcloud-codecoverage/</link><pubDate>Thu, 08 Jul 2021 19:00:00 +0200</pubDate><guid>https://www.codewrecks.com/post/github/github-sonarcloud-codecoverage/</guid><description>&lt;p>First of all I want to thank my friend &lt;a href="http://blog.casavian.eu/page/about/">Giulio Vian&lt;/a> for pointing me in the right direction and for his great work on the &lt;a href="https://github.com/tfsaggregator/aggregator-cli/blob/master/.github/workflows/build-and-deploy.yml">TfsAggregator Action&lt;/a>.&lt;/p>
&lt;p>My problem was: I used the wizard in GitHub to create a GitHub Action definition to analyze code in SonarCloud; everything runs just fine except that I was not able to get Code Coverage or unit test results in my analysis. &lt;strong>With Azure DevOps and a .NET Full Framework project there is no problem&lt;/strong>, but with GitHub and standard Actions no result seems to be uploaded.&lt;/p></description></item><item><title>How to create a list of non upgradable software for winget</title><link>https://www.codewrecks.com/post/general/winget-update-selective/</link><pubDate>Sun, 27 Jun 2021 08:00:00 +0200</pubDate><guid>https://www.codewrecks.com/post/general/winget-update-selective/</guid><description>&lt;p>Edit: I&amp;rsquo;ve stored a &lt;a href="https://gist.github.com/alkampfergit/2f662c07df0ca379c8e8e65e588c687b">gist of the script here&lt;/a>&lt;/p>
&lt;p>Winget is finally &lt;a href="https://www.codewrecks.com/post/general/winget-intro/">here&lt;/a> and we can, at last, use a package manager on Windows &lt;strong>to simplify application management&lt;/strong>. Everything is good, except that some packages are not ready to be upgraded. &lt;strong>As an example, on my system I have Python 2.7 and 3.x; winget always tries to upgrade the 3.x version at each run and messes up my installation&lt;/strong>. At the end of the Nth upgrade of Python my Visual Studio Code was no longer able to debug and run Python.&lt;/p></description></item><item><title>How to resolve hostname in a WSL instance with internal IP</title><link>https://www.codewrecks.com/post/general/wsl-hostname/</link><pubDate>Thu, 03 Jun 2021 18:40:00 +0200</pubDate><guid>https://www.codewrecks.com/post/general/wsl-hostname/</guid><description>&lt;p>Edit: I actually missed that I can append mshome.net to the host name to correctly identify the IP of the host computer, as suggested &lt;a href="https://github.com/alkampfergit/personal-blog/discussions/13">here&lt;/a>&lt;/p>
&lt;p>In a WSL2 instance you are running inside a virtual environment managed by your Windows operating system, and the greatest advantage is &lt;strong>the ability to work with a unified filesystem and to have great communication between the two systems&lt;/strong>. This is perfect, but I have a small problem: if I want to use the WSL2 instance to test some software I&amp;rsquo;m developing, I&amp;rsquo;d like to access services running on my host operating system.&lt;/p></description></item><item><title>Winget is finally a thing</title><link>https://www.codewrecks.com/post/general/winget-intro/</link><pubDate>Fri, 28 May 2021 10:00:00 +0200</pubDate><guid>https://www.codewrecks.com/post/general/winget-intro/</guid><description>&lt;p>After a really loooooong time, finally Windows has its own &lt;a href="https://devblogs.microsoft.com/commandline/windows-package-manager-1-0/">Package Manager&lt;/a> called Windows Package Manager 1.0. It was first announced at last year&amp;rsquo;s Build, but &lt;strong>at the 2021 Build conference it was announced that it is now at version 1.0&lt;/strong>. I do not want to give full coverage of all the features, but simply share why this is a big deal.&lt;/p>
&lt;p>Linux users have had package managers almost from the beginning; this implies that &lt;strong>to install all of your favorite tools, you can simply use the command line&lt;/strong>. Windows is an operating system that was not born with the command line in mind, and this reflects negatively on the &amp;ldquo;new machine configuration&amp;rdquo; experience.&lt;/p></description></item><item><title>Keep MongoDb logfile size at bay</title><link>https://www.codewrecks.com/post/general/mongo-db-logfiles/</link><pubDate>Thu, 27 May 2021 20:00:00 +0200</pubDate><guid>https://www.codewrecks.com/post/general/mongo-db-logfiles/</guid><description>&lt;p>MongoDb is a great option for NoSql, but sometimes it is installed in production forgetting some basic maintenance tasks, like managing log files. You should remember that &lt;a href="https://docs.mongodb.com/manual/tutorial/rotate-log-files/">MongoDb does not automatically rotate log files&lt;/a>, as stated in the official documentation.&lt;/p>
&lt;p>This leads to logfiles of gigabyte size, which can even become a disk space problem in your installation. If a &lt;strong>logfile has grown out of control you cannot delete it, because it is in use by the mongod process, so you need to request a manual rotate&lt;/strong>. Just connect to the mongodb instance and, from the mongo.exe command line or any other tool you use, issue this command.&lt;/p></description></item><item><title>Azure DevOps: Use specific version of java in a pipeline</title><link>https://www.codewrecks.com/post/azdo/pipeline/java-requirement-sonarcloud/</link><pubDate>Mon, 26 Apr 2021 17:00:42 +0000</pubDate><guid>https://www.codewrecks.com/post/azdo/pipeline/java-requirement-sonarcloud/</guid><description>&lt;p>I have lots of pipelines with SonarCloud analysis, and in the last months I&amp;rsquo;ve started receiving warnings about an old version of Java used in the pipeline. &lt;strong>The SonarCloud task scanner is gentle enough to warn you for months before dropping support&lt;/strong>; nevertheless there is always the possibility that you forgot to update some agents, so some pipelines start failing with the error&lt;/p>
&lt;blockquote>
&lt;p>The version of Java (1.8.xxx) you have used to run this analysis is deprecated and we stopped accepting it.&lt;/p></description></item><item><title>Return value in PowerShell, a typical error</title><link>https://www.codewrecks.com/post/general/powershell/powershell-return-value/</link><pubDate>Sat, 17 Apr 2021 08:00:00 +0200</pubDate><guid>https://www.codewrecks.com/post/general/powershell/powershell-return-value/</guid><description>&lt;p>When you create a function in PowerShell you need to remember that if you write output, this will &lt;a href="https://docs.microsoft.com/en-us/powershell/module/microsoft.powershell.core/about/about_return?view=powershell-7.1">be included in the returned value&lt;/a>. This means that &lt;strong>if you end your function with return $something you will not get only the content of the variable $something, but also every output you wrote in the function&lt;/strong>.&lt;/p>
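A minimal sketch of the pitfall (the function name is invented; the behavior shown is standard PowerShell output semantics):

```powershell
function Get-Value {
    # This output becomes part of the function's return value...
    Write-Output "doing some work"
    $something = 42
    # ...so the caller receives an array of two items, not just 42.
    return $something
}

$result = Get-Value
$result.Count   # 2: "doing some work" and 42
```

Use Write-Host or Write-Verbose for diagnostic messages you do not want to end up in the returned value.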
&lt;p>For this reason you need to be super careful not to use Write-Output, because all the output will be included in the returned value.&lt;/p></description></item><item><title>Continuous integration: PowerShell way</title><link>https://www.codewrecks.com/post/azdo/pipeline/powershell-build/</link><pubDate>Sat, 17 Apr 2021 07:12:42 +0200</pubDate><guid>https://www.codewrecks.com/post/azdo/pipeline/powershell-build/</guid><description>&lt;p>I&amp;rsquo;m a great fan of Azure DevOps pipelines, I use them extensively, but I also a fan of simple building strategies, not relying on some specific build engine.&lt;/p>
&lt;blockquote>
&lt;p>For Continuous Integration, being too much dependent on a specific technology could be limiting.&lt;/p>&lt;/blockquote>
&lt;p>I started CI &lt;strong>many years ago with CC.NET&lt;/strong> and explored various engines, from MsBuild to Nant, then Psake, Cake, etc. I&amp;rsquo;ve also used various CI tools, from TFS to AzureDevOps to TeamCity and others. My overall reaction to those tools was usually good, but &lt;strong>it always felt wrong to be bound to some specific technology&lt;/strong>. What about a customer using something I do not know, like Travis CI? Also, when you need to do CI at customer sites, it is hard to force a particular technology. It is too easy to tell a customer: just use X because it is the best, when the reality is that your knowledge of X is really good so it is your first choice.&lt;/p></description></item><item><title>Hello terraform</title><link>https://www.codewrecks.com/post/devops/terraform-hello-world/</link><pubDate>Sat, 03 Apr 2021 07:00:18 +0200</pubDate><guid>https://www.codewrecks.com/post/devops/terraform-hello-world/</guid><description>&lt;p>I&amp;rsquo;m studying the &lt;a href="https://www.oreilly.com/library/view/terraform-up/9781492046899/">Terraform Up and Running&lt;/a> book, a really good book, but all the examples are for AWS. I have nothing against AWS, but I&amp;rsquo;m familiar with Azure, so I&amp;rsquo;d like to &lt;strong>start porting some of the examples of the book to Azure&lt;/strong>. 
While I&amp;rsquo;m not sure if I&amp;rsquo;ll keep up with the conversion, if you are curious I&amp;rsquo;ve started the work &lt;a href="https://github.com/AlkampferOpenSource/terraform-up-and-running-code/tree/master/code/terraformAzure">in this repository&lt;/a>; feel free to post any corrections (remember that I&amp;rsquo;m learning Terraform, I&amp;rsquo;m not an expert :))&lt;/p></description></item><item><title>Avoid IIS to bind to every IP Address port and other amenities</title><link>https://www.codewrecks.com/post/general/iis-bind-ip/</link><pubDate>Sat, 20 Mar 2021 10:00:00 +0200</pubDate><guid>https://www.codewrecks.com/post/general/iis-bind-ip/</guid><description>&lt;p>Let&amp;rsquo;s suppose you have a Microservices based solution: you have many different processes and each process communicates with standard WebAPI. The usual developer solution is &lt;strong>using a different port for each different program&lt;/strong>, which leads to http://localhost:12345, http://localhost:12346 and so on. This is far from optimal, because in production, probably, each service could potentially be exposed with a different hostname, something like &lt;a href="https://auth.mysoftware.com">https://auth.mysoftware.com&lt;/a>, &lt;a href="https://orders.mysoftware.com">https://orders.mysoftware.com&lt;/a> and so on. 
Another problem is that, in production, everything &lt;strong>must be exposed with https&lt;/strong>, and too many times developers do not test with TLS enabled.&lt;/p></description></item><item><title>CodeQL Scanning in GitHub</title><link>https://www.codewrecks.com/post/github/code-scanning-result/</link><pubDate>Sun, 14 Mar 2021 08:00:00 +0200</pubDate><guid>https://www.codewrecks.com/post/github/code-scanning-result/</guid><description>&lt;p>As you can read directly from &lt;a href="https://github.blog/2020-09-30-code-scanning-is-now-available/">GitHub blog post&lt;/a> GitHub code scanning is now available and ready to use for your repositories.&lt;/p>
&lt;p>I&amp;rsquo;ve blogged in the past &lt;a href="https://www.codewrecks.com/post/github/code-scanning/">about code security scanning in GitHub&lt;/a> but in that post I didn&amp;rsquo;t show what happens when &lt;strong>the analysis engine finds a possible security problem in your code&lt;/strong>. When something is not ok, you can go to the Security tab of your repository on GitHub to look for alerts.&lt;/p>
&lt;p>&lt;a target="_blank" href="../images/codeql-alert-result.png"> &lt;img src="../images/codeql-alert-result.png" alt="CodeQL alert results in your repository" />&lt;/a>
&lt;em>&lt;strong>Figure 1:&lt;/strong>&lt;/em> &lt;em>CodeQL alert results in your repository&lt;/em>&lt;/p></description></item><item><title>Sonar Cloud analysis in GitHub</title><link>https://www.codewrecks.com/post/github/github-sonar-cloud/</link><pubDate>Sat, 13 Mar 2021 08:00:00 +0200</pubDate><guid>https://www.codewrecks.com/post/github/github-sonar-cloud/</guid><description>&lt;p>Well, you know me, I like to have my code analyzed by SonarCloud when possible, and since it is free for open source, I always use Azure DevOps pipeline to &lt;strong>automatically analyze code on each push&lt;/strong>. Now that GitHub actions are available, a good solution is to simply &lt;strong>use GitHub actions to analyze code, without disturbing Azure DevOps&lt;/strong>.&lt;/p>
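A minimal workflow for this kind of analysis might look like the following sketch; the action and secret names follow SonarCloud's documented example, but treat the branch name and action versions as assumptions to adapt to your repository:

```yaml
# .github/workflows/sonarcloud.yml — illustrative sketch
name: SonarCloud
on:
  push:
    branches: [ master ]
  pull_request:
jobs:
  analyze:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v2
        with:
          fetch-depth: 0           # full history improves analysis of new code
      - uses: SonarSource/sonarcloud-github-action@master
        env:
          GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
          SONAR_TOKEN: ${{ secrets.SONAR_TOKEN }}   # token created in SonarCloud
```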
&lt;blockquote>
&lt;p>Azure DevOps pipelines are, in my opinion, more mature than GitHub actions, but for small tasks, it is simpler to go with Actions.&lt;/p></description></item><item><title>Quick ftp upload of multiple files</title><link>https://www.codewrecks.com/post/general/quick-ftp-upload-for-blog/</link><pubDate>Sat, 06 Feb 2021 08:00:00 +0200</pubDate><guid>https://www.codewrecks.com/post/general/quick-ftp-upload-for-blog/</guid><description>&lt;p>After converting all of my blog posts, I have more than 1000 pages to upload, and even if I left old images in their original location &lt;strong>my publish time increased a lot, as you can see from Figure 1&lt;/strong>.&lt;/p>
&lt;p>&lt;a target="_blank" href="../images/upload-time-really-high.png"> &lt;img src="../images/upload-time-really-high.png" alt="Upload time is becoming unbearable" />&lt;/a>&lt;/p>
&lt;p>&lt;em>&lt;strong>Figure 1&lt;/strong>&lt;/em>: &lt;em>Upload time is becoming unbearable&lt;/em>&lt;/p>
&lt;p>I&amp;rsquo;ve scheduled automatic publishing with GitHub Actions, but using a standard ftp action I found in the marketplace, my build time increased &lt;strong>from 5 minutes to more than one hour&lt;/strong>. Clearly this is annoying, not because I need to wait 1 hour before a new post appears on the site, but because I &lt;strong>need to wait 1 hour to understand if something went wrong&lt;/strong>.&lt;/p></description></item><item><title>Moving old posts from WordPress to Hugo</title><link>https://www.codewrecks.com/post/general/conversion-of-wordpress/</link><pubDate>Wed, 03 Feb 2021 19:00:00 +0200</pubDate><guid>https://www.codewrecks.com/post/general/conversion-of-wordpress/</guid><description>&lt;p>It has been almost one year since my switch from WordPress to Hugo and I&amp;rsquo;m really satisfied with the result. After some months the only thing that bothers me is &lt;strong>the fact that I still need to maintain an instance of WordPress just to keep old posts&lt;/strong>. The only thing that prevented me from migrating was the loss of comments, but after looking at all of my old blog posts, I do not have such a big amount of comments that they are worth migrating; &lt;strong>the only thing that is important to me is the ability to keep my old posts without comments&lt;/strong> so I can get rid of WordPress. 
I&amp;rsquo;m sorry for all of you that spent time commenting on my blog, but migrating the comments would be a huge amount of work.&lt;/p></description></item><item><title>Execute jobs depending on changed files on commit</title><link>https://www.codewrecks.com/post/azdo/pipeline/execution-condition-file-changed/</link><pubDate>Fri, 15 Jan 2021 17:50:42 +0000</pubDate><guid>https://www.codewrecks.com/post/azdo/pipeline/execution-condition-file-changed/</guid><description>&lt;p>Configuring a build to build each commit to constantly verify quality of code is usually a good idea, but sooner or later, in big solutions, you start filling the pipeline queue. The main problem is that, &lt;strong>when the team grows, the number of commits per day of work increases and you start having problems with the build queue&lt;/strong>. If the build queue is about one hour long, it is still acceptable, but if the queue grows even more, it becomes clear that you should find a solution.&lt;/p></description></item><item><title>Security Onion 2020 - The hunt</title><link>https://www.codewrecks.com/post/security/security-onion-hunt/</link><pubDate>Sun, 03 Jan 2021 10:13:30 +0200</pubDate><guid>https://www.codewrecks.com/post/security/security-onion-hunt/</guid><description>&lt;p>I&amp;rsquo;ve done another couple of videos about &lt;a href="https://securityonionsolutions.com/">Security Onion&lt;/a> focusing on how I can use &lt;em>&lt;strong>The hunt to look for anomalies in network traffic&lt;/strong>&lt;/em>. As for the previous videos I give a disclaimer: I&amp;rsquo;m not a Security Onion expert, and those videos are meant to keep track of my progress and to &lt;strong>help others familiarize themselves with the tool&lt;/strong>.&lt;/p>
&lt;p>In the first video I start from an alert from &lt;a href="https://github.com/target/strelka">Strelka&lt;/a> and then proceed to &lt;strong>identify possibly compromised machines in the network as well as find external malicious IPs&lt;/strong>.&lt;/p></description></item><item><title>Kali Linux in Hyper-V system</title><link>https://www.codewrecks.com/post/security/kali-linux-in-hyper-v/</link><pubDate>Wed, 30 Dec 2020 10:00:37 +0200</pubDate><guid>https://www.codewrecks.com/post/security/kali-linux-in-hyper-v/</guid><description>&lt;h2 id="kali-linux-on-windows">Kali Linux on Windows&lt;/h2>
&lt;p>Most of the time a &lt;a href="https://www.codewrecks.com/post/security/kali-linux-in-wsl2/">Kali Linux instance running in WSL&lt;/a> is more than enough to have some fun in a Windows box. Using WSL is really simple but &lt;strong>I have a couple of annoying problems that make my experience uncomfortable.&lt;/strong>&lt;/p>
&lt;ol>
&lt;li>UI experience is sluggish, and annoying&lt;/li>
&lt;li>I have very little control over networking.&lt;/li>
&lt;/ol>
&lt;p>Point 2 is the major pain point in my situation: I usually buy some inexpensive &lt;a href="https://www.codewrecks.com/post/security/kali-linux-in-wsl2/">Intel i350 T2 cards&lt;/a> on eBay to allow me to have &lt;strong>at least three network cards on my workstation&lt;/strong>. If you wonder why I like three NICs, here is my typical usage pattern.&lt;/p></description></item><item><title>Authenticate to Azure DevOps private Nuget Feed</title><link>https://www.codewrecks.com/post/azdo/pipeline/nuget-feed-authenticate/</link><pubDate>Tue, 29 Dec 2020 10:00:00 +0200</pubDate><guid>https://www.codewrecks.com/post/azdo/pipeline/nuget-feed-authenticate/</guid><description>&lt;p>When you build a project that depends on an Azure DevOps hosted NuGet feed, usually if the feed &lt;strong>is in the same organization as the pipeline and you are using the NuGet task, everything regarding authentication happens automatically&lt;/strong>. A really different situation arises if you are using NuGet directly from the command line or a PowerShell script. A typical situation is: everything seems to work perfectly on your machine, but during a pipeline run you receive a 401 (unauthorized) error or the build hangs with a message like this:&lt;/p></description></item><item><title>Azure DevOps: Execute GitHub code analysis in a pipeline</title><link>https://www.codewrecks.com/post/azdo/pipeline/github-code-analysis/</link><pubDate>Mon, 28 Dec 2020 08:50:42 +0000</pubDate><guid>https://www.codewrecks.com/post/azdo/pipeline/github-code-analysis/</guid><description>&lt;p>Ok, I know that many of you are asking: why use Azure DevOps to analyze code with CodeQL? Using GitHub Actions is the preferred way to do it, so why bother with running it in another CI? The scenario is simple: a company has everything on Azure DevOps and wants to keep everything there, but it &lt;strong>wants to be able to take advantage of GitHub CodeQL analysis&lt;/strong>. 
This scenario is not so uncommon, and you &lt;a href="https://docs.github.com/en/free-pro-team@latest/github/finding-security-vulnerabilities-and-errors-in-your-code/running-codeql-code-scanning-in-your-ci-system">have a nice GitHub guide&lt;/a> on how to run CodeQL code scanning in your CI system.&lt;/p></description></item><item><title>Azure DevOps: Convert your classic pipeline in YAML</title><link>https://www.codewrecks.com/post/azdo/pipeline/convert-to-yaml/</link><pubDate>Tue, 22 Dec 2020 18:50:42 +0000</pubDate><guid>https://www.codewrecks.com/post/azdo/pipeline/convert-to-yaml/</guid><description>&lt;p>When I teach Azure DevOps pipelines to customers, I always suggest that they avoid the classic editor and &lt;strong>learn the tool directly using YAML pipelines&lt;/strong>; while we can agree that the classic GUI based editor is simpler, it also misses many of the advantages of YAML and has limited use.&lt;/p>
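To give a taste of the format, a minimal YAML pipeline might look like the sketch below (trigger branch and task content are illustrative); it lives in the repository next to the code and follows your branches:

```yaml
# azure-pipelines.yml — minimal illustrative pipeline
trigger:
  branches:
    include: [ master ]

pool:
  vmImage: ubuntu-latest

steps:
  - task: PowerShell@2
    displayName: Simple build step
    inputs:
      targetType: inline
      script: |
        Write-Host "Pipeline definition is versioned together with the code"
```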
&lt;p>YAML based pipelines have a lot of advantages: first of all they are included in the code (I really love having everything in my repository), you can simply copy and paste them into new projects, &lt;strong>templates are really powerful&lt;/strong>, and your pipeline definition follows your branches.&lt;/p></description></item><item><title>Security onion in Hyper-V</title><link>https://www.codewrecks.com/post/security/security-onion-hyper-v/</link><pubDate>Sat, 05 Dec 2020 11:14:37 +0200</pubDate><guid>https://www.codewrecks.com/post/security/security-onion-hyper-v/</guid><description>&lt;p>If you want to set up a real lab to test a Network Security Monitoring solution like &lt;a href="https://securityonionsolutions.com/">Security Onion&lt;/a>, probably you will start with some &lt;strong>virtual machines where you can install everything&lt;/strong>. While we can agree that VMware is probably the best solution (I have a test ESXi node), Hyper-V can be a viable solution too, but you need to be aware of some glitches.&lt;/p>
&lt;blockquote>
&lt;p>Most of the information I&amp;rsquo;ve found on the internet is outdated and probably not valid for Windows Server 2019, as you can see in &lt;strong>Figure 2&lt;/strong>. I hope this post can save time for others that have my same problem.&lt;/p></description></item><item><title>Azure DevOps Pills: View progress in backlog</title><link>https://www.codewrecks.com/post/azdo/pills/progress-by-item/</link><pubDate>Sun, 29 Nov 2020 08:12:42 +0200</pubDate><guid>https://www.codewrecks.com/post/azdo/pills/progress-by-item/</guid><description>&lt;p>If you start managing your backlog with Azure Boards, you will probably end up having an Epics-&amp;gt;Features-&amp;gt;User Stories breakdown, and as a manager you have a usual question to answer: &lt;strong>where are we on this epic or feature and when do you expect it to be finished&lt;/strong>.&lt;/p>
&lt;p>While this is not a simple question to answer looking only at the tool, you should know that Azure Boards can give you &lt;strong>quick help by visualizing completed work in a dedicated column&lt;/strong>.&lt;/p></description></item><item><title>How to handle errors in PowerShell script used in Azure DevOps pipeline</title><link>https://www.codewrecks.com/post/general/powershell/pipeline-and-powershell-return-code/</link><pubDate>Sun, 15 Nov 2020 08:00:00 +0200</pubDate><guid>https://www.codewrecks.com/post/general/powershell/pipeline-and-powershell-return-code/</guid><description>&lt;p>Building with PowerShell or another scripting engine is a really nice option because you can &lt;strong>reuse the script in almost any Continuous Integration engine&lt;/strong> with minimal effort, but sometimes there are tools that cause some headaches.&lt;/p>
&lt;p>I had problems with tools like yarn and npm when they are run in an Azure DevOps pipeline: &lt;strong>when the tool emits a warning, the pipeline engine considers it an error and makes the build fail&lt;/strong>. This usually happens because it is common to run PowerShell scripts in Azure DevOps pipelines with the option failOnStdErr set to true&lt;/p></description></item><item><title>Pills: Invoke-WebRequest really Slow</title><link>https://www.codewrecks.com/post/azdo/pills/powershell-download/</link><pubDate>Tue, 03 Nov 2020 22:12:42 +0200</pubDate><guid>https://www.codewrecks.com/post/azdo/pills/powershell-download/</guid><description>&lt;p>There are times when using Invoke-WebRequest in PowerShell is really slow, especially compared to a direct download in a browser. The answer is, as always, on &lt;a href="https://stackoverflow.com/questions/28682642/powershell-why-is-using-invoke-webrequest-much-slower-than-a-browser-download">StackOverflow in this post&lt;/a>, but for some reason the accepted answer is not my favorite.&lt;/p>
&lt;p>The accepted solution uses WebClient; &lt;strong>it is perfectly valid, but other answers are more correct&lt;/strong> (and have more votes). In my opinion the real solution is disabling the progress bar.&lt;/p>
&lt;div class="highlight">&lt;pre tabindex="0" style="color:#f8f8f2;background-color:#272822;-moz-tab-size:4;-o-tab-size:4;tab-size:4;">&lt;code class="language-powershell" data-lang="powershell">&lt;span style="display:flex;">&lt;span>$ProgressPreference = &lt;span style="color:#e6db74">&amp;#39;SilentlyContinue&amp;#39;&lt;/span>
&lt;/span>&lt;/span>&lt;/code>&lt;/pre>&lt;/div>&lt;p>This is usually enough to speed up Invoke-WebRequest without changing every single call to use WebClient.&lt;/p></description></item><item><title>Azure DevOps Pills: PowerShell in pipeline with Linux agents</title><link>https://www.codewrecks.com/post/azdo/pipeline/linux-powershell/</link><pubDate>Sun, 01 Nov 2020 13:12:42 +0200</pubDate><guid>https://www.codewrecks.com/post/azdo/pipeline/linux-powershell/</guid><description>&lt;p>This is a really basic fact, but it is often underestimated. PowerShell Core is now available on Linux, and this means that you can use PowerShell for your Azure DevOps pipeline even if the pipeline &lt;strong>will be executed on a Linux machine&lt;/strong>. If I have this task in a pipeline&lt;/p>
&lt;div class="highlight">&lt;div style="color:#f8f8f2;background-color:#272822;-moz-tab-size:4;-o-tab-size:4;tab-size:4;">
&lt;table style="border-spacing:0;padding:0;margin:0;border:0;">&lt;tr>&lt;td style="vertical-align:top;padding:0;margin:0;border:0;">
&lt;pre tabindex="0" style="color:#f8f8f2;background-color:#272822;-moz-tab-size:4;-o-tab-size:4;tab-size:4;">&lt;code>&lt;span style="white-space:pre;-webkit-user-select:none;user-select:none;margin-right:0.4em;padding:0 0.4em 0 0.4em;color:#7f7f7f"> 1
&lt;/span>&lt;span style="white-space:pre;-webkit-user-select:none;user-select:none;margin-right:0.4em;padding:0 0.4em 0 0.4em;color:#7f7f7f"> 2
&lt;/span>&lt;span style="white-space:pre;-webkit-user-select:none;user-select:none;margin-right:0.4em;padding:0 0.4em 0 0.4em;color:#7f7f7f"> 3
&lt;/span>&lt;span style="white-space:pre;-webkit-user-select:none;user-select:none;margin-right:0.4em;padding:0 0.4em 0 0.4em;color:#7f7f7f"> 4
&lt;/span>&lt;span style="white-space:pre;-webkit-user-select:none;user-select:none;margin-right:0.4em;padding:0 0.4em 0 0.4em;color:#7f7f7f"> 5
&lt;/span>&lt;span style="white-space:pre;-webkit-user-select:none;user-select:none;margin-right:0.4em;padding:0 0.4em 0 0.4em;color:#7f7f7f"> 6
&lt;/span>&lt;span style="white-space:pre;-webkit-user-select:none;user-select:none;margin-right:0.4em;padding:0 0.4em 0 0.4em;color:#7f7f7f"> 7
&lt;/span>&lt;span style="white-space:pre;-webkit-user-select:none;user-select:none;margin-right:0.4em;padding:0 0.4em 0 0.4em;color:#7f7f7f"> 8
&lt;/span>&lt;span style="white-space:pre;-webkit-user-select:none;user-select:none;margin-right:0.4em;padding:0 0.4em 0 0.4em;color:#7f7f7f"> 9
&lt;/span>&lt;span style="white-space:pre;-webkit-user-select:none;user-select:none;margin-right:0.4em;padding:0 0.4em 0 0.4em;color:#7f7f7f">10
&lt;/span>&lt;/code>&lt;/pre>&lt;/td>
&lt;td style="vertical-align:top;padding:0;margin:0;border:0;;width:100%">
&lt;pre tabindex="0" style="color:#f8f8f2;background-color:#272822;-moz-tab-size:4;-o-tab-size:4;tab-size:4;">&lt;code class="language-yaml" data-lang="yaml">&lt;span style="display:flex;">&lt;span>&lt;span style="color:#f92672">steps&lt;/span>:
&lt;/span>&lt;/span>&lt;span style="display:flex;">&lt;span> - &lt;span style="color:#f92672">task&lt;/span>: &lt;span style="color:#ae81ff">PowerShell@2&lt;/span>
&lt;/span>&lt;/span>&lt;span style="display:flex;">&lt;span> &lt;span style="color:#f92672">displayName&lt;/span>: &lt;span style="color:#ae81ff">Simple task&lt;/span>
&lt;/span>&lt;/span>&lt;span style="display:flex;">&lt;span> &lt;span style="color:#f92672">inputs&lt;/span>:
&lt;/span>&lt;/span>&lt;span style="display:flex;">&lt;span> &lt;span style="color:#f92672">targetType&lt;/span>: &lt;span style="color:#ae81ff">inline&lt;/span>
&lt;/span>&lt;/span>&lt;span style="display:flex;">&lt;span> &lt;span style="color:#f92672">script&lt;/span>: |&lt;span style="color:#e6db74">
&lt;/span>&lt;/span>&lt;/span>&lt;span style="display:flex;">&lt;span>&lt;span style="color:#e6db74"> Write-Host &amp;#34;Simple task for simple stage pipeline&amp;#34;
&lt;/span>&lt;/span>&lt;/span>&lt;span style="display:flex;">&lt;span>&lt;span style="color:#e6db74"> Write-Host &amp;#34;Value for variable Configuration is $(configuration) value for parameterA is ${{ parameters.ParameterA }}&amp;#34;
&lt;/span>&lt;/span>&lt;/span>&lt;span style="display:flex;">&lt;span>&lt;span style="color:#e6db74"> Write-Host &amp;#34;Change Variable value configuration to &amp;#39;debug&amp;#39;&amp;#34;
&lt;/span>&lt;/span>&lt;/span>&lt;span style="display:flex;">&lt;span>&lt;span style="color:#e6db74"> Write-Host &amp;#34;##vso[task.setvariable variable=configuration]debug&amp;#34;&lt;/span>&lt;/span>&lt;/span>&lt;/code>&lt;/pre>&lt;/td>&lt;/tr>&lt;/table>
&lt;/div>
&lt;/div>
&lt;p>I can schedule the pipeline on a Linux hosted agent, and everything runs smoothly.&lt;/p></description></item><item><title>Automatic publish PowerShell Gallery with GitHub Actions</title><link>https://www.codewrecks.com/post/general/powershell-gallery-publish/</link><pubDate>Mon, 26 Oct 2020 18:00:00 +0200</pubDate><guid>https://www.codewrecks.com/post/general/powershell-gallery-publish/</guid><description>&lt;p>Publishing &lt;a href="http://www.codewrecks.com/post/general/powershell-gallery/">PowerShell helper functions&lt;/a> to the PowerShell Gallery is a good solution to &lt;strong>maximize reuse for builds and general scripting in mundane DevOps tasks.&lt;/strong> On &lt;a href="https://github.com/AlkampferOpenSource/powershell-build-utils">this GitHub repository&lt;/a> I&amp;rsquo;ve put some simple build utilities that can &lt;strong>be published to the PowerShell Gallery&lt;/strong>.&lt;/p>
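The publish step itself can be sketched roughly as follows; paths, module name and the environment variable holding the gallery API key are hypothetical placeholders, and a real module also needs a .psd1 manifest alongside the .psm1:

```powershell
# Combine the per-function .ps1 files into a single .psm1, then publish.
$files = Get-ChildItem -Path ./src -Filter *.ps1
$combined = $files | ForEach-Object { Get-Content -Path $_.FullName -Raw }
Set-Content -Path ./out/BuildUtils/BuildUtils.psm1 -Value $combined

# Publish-Module pushes the module folder to the PowerShell Gallery;
# the API key comes from a secret exposed as an environment variable.
Publish-Module -Path ./out/BuildUtils -NuGetApiKey $env:PSGALLERY_API_KEY
```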
&lt;p>To streamline the process I&amp;rsquo;ve decided to &lt;strong>automate the publish process with GitHub Actions&lt;/strong>, because this is the typical scenario where GH Actions shine. First of all I&amp;rsquo;ve reorganized my sources to create a &lt;strong>single PowerShell file for each function; then I&amp;rsquo;ve found this &lt;a href="https://evotec.xyz/powershell-single-psm1-file-versus-multi-file-modules/">excellent post&lt;/a> that explains how to combine all files into a single file to maximize performance.&lt;/strong>&lt;/p></description></item><item><title>Azure DevOps pills: Avoid triggering pipelines continuous integration with commit message</title><link>https://www.codewrecks.com/post/azdo/pipeline/no-ci/</link><pubDate>Sat, 24 Oct 2020 10:00:42 +0200</pubDate><guid>https://www.codewrecks.com/post/azdo/pipeline/no-ci/</guid><description>&lt;p>There are situations when you need to push frequently to a Git repository; a typical example is when you are &lt;strong>authoring a YAML pipeline and experimenting with stuff&lt;/strong>; in such a situation you modify the pipeline, push, test and go on. It is quite common to push really frequently, and this usually saturates standard pipelines.&lt;/p>
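The trick here is a commit message marker: Azure DevOps skips the CI trigger for commits whose message contains [skip ci] (GitHub Actions honors the same marker). A throwaway demo, assuming git is available:

```shell
# Create a disposable repository and commit with the [skip ci] marker;
# a CI trigger watching this repository would ignore this commit.
tmp=$(mktemp -d)
cd "$tmp"
git init -q
git config user.email "demo@example.com"   # throwaway identity for the demo
git config user.name "demo"
echo "experimenting" > azure-pipelines.yml
git add azure-pipelines.yml
git commit -q -m "Tweak pipeline yaml [skip ci]"
git log -1 --pretty=%B                     # shows the marker in the message
```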
&lt;p>It is not uncommon to have a standard pipeline of build and test running for each commit and for each branch. In such a situation, &lt;strong>if you push too often you risk saturating all of your build agents&lt;/strong>.&lt;/p></description></item><item><title>Speedup WSUS in your AD (Part 2)</title><link>https://www.codewrecks.com/post/general/speed-up-wsus-part2/</link><pubDate>Sat, 17 Oct 2020 08:00:00 +0200</pubDate><guid>https://www.codewrecks.com/post/general/speed-up-wsus-part2/</guid><description>&lt;p>After I did some maintenance on my WSUS &lt;a href="http://www.codewrecks.com/post/general/speed-up-wsus/">with some tricks&lt;/a> I still had problems: after hours of cleanup, the PowerShell script stopped working and constantly gave me a timeout.&lt;/p>
&lt;p>&lt;a target="_blank" href="../images/wsusl-unable-to-clean.png"> &lt;img src="../images/wsusl-unable-to-clean.png" alt="WSUS was unable to perform cleanup" />&lt;/a>
&lt;em>&lt;strong>Figure 1:&lt;/strong>&lt;/em> &lt;em>WSUS was unable to perform cleanup&lt;/em>&lt;/p>
&lt;p>I did not have time to investigate the issue, but I had saved on my disk a small note I found somewhere that suggested how to &lt;strong>clean up manually, directly with SQL&lt;/strong>.&lt;/p></description></item><item><title>Speedup WSUS in your AD</title><link>https://www.codewrecks.com/post/general/speed-up-wsus/</link><pubDate>Sun, 11 Oct 2020 16:00:00 +0200</pubDate><guid>https://www.codewrecks.com/post/general/speed-up-wsus/</guid><description>&lt;p>I have a very old HP Proliant Microserver; it has 16 GB of RAM and an SSD, but it is really old, it is powered by a really old Turion CPU, and WSUS is starting to become unresponsive. That machine acts &lt;strong>as a test domain controller for a bunch of test virtual machines&lt;/strong>; WSUS was always a little bit slow, but it worked, until a few days ago, when it started constantly failing to load data.&lt;/p></description></item><item><title>Pip and Python in Visual Studio Code</title><link>https://www.codewrecks.com/post/general/pip-and-python-in-vscode/</link><pubDate>Sat, 10 Oct 2020 08:00:00 +0200</pubDate><guid>https://www.codewrecks.com/post/general/pip-and-python-in-vscode/</guid><description>&lt;p>I&amp;rsquo;m not a Python expert, but I use it more often these days, and I use Visual Studio Code with the Python extension to author my scripts. One of the most annoying problems is &lt;strong>receiving a no module named xxx error when you already installed that module with pip&lt;/strong>.&lt;/p>
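This error usually means pip installed the module into a different interpreter than the one running your script. A quick sanity check, assuming a python3 on the PATH, is to ask the interpreter for its own path and always install through it with -m pip:

```shell
# Print the exact interpreter that runs, then use that same interpreter's pip.
python3 -c "import sys; print(sys.executable)"
python3 -m pip --version    # the pip bound to the interpreter above
```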
&lt;p>&lt;a target="_blank" href="../images/python-no-module-error.png"> &lt;img src="../images/python-no-module-error.png" alt="No module error when running Python code in Visual Studio Code" />&lt;/a>
&lt;em>&lt;strong>Figure 1:&lt;/strong>&lt;/em> &lt;em>No module error when running Python code in Visual Studio Code&lt;/em>&lt;/p></description></item><item><title>Code scanning in GitHub</title><link>https://www.codewrecks.com/post/github/code-scanning/</link><pubDate>Sat, 03 Oct 2020 08:00:00 +0200</pubDate><guid>https://www.codewrecks.com/post/github/code-scanning/</guid><description>&lt;p>As you can read directly from &lt;a href="https://github.blog/2020-09-30-code-scanning-is-now-available/">GitHub blog post&lt;/a> GitHub code scanning is now available and ready to use for your repositories.&lt;/p>
&lt;p>To enable code scanning you can just go to the security tab of your repository &lt;strong>and choose to enable code scanning&lt;/strong>.&lt;/p>
&lt;p>&lt;a target="_blank" href="../images/enable-code-scanning-gh.png"> &lt;img src="../images/enable-code-scanning-gh.png" alt="Enable code scanning" />&lt;/a>
&lt;em>&lt;strong>Figure 1:&lt;/strong>&lt;/em> &lt;em>Enable code scanning&lt;/em>&lt;/p>
&lt;p>You are presented with a list of code scanning tools at your disposal; clearly the first is &lt;a href="https://securitylab.github.com/tools/codeql">CodeQL&lt;/a> &lt;strong>and it is automatically offered to you by GitHub&lt;/strong>, then you can find other tools available in the marketplace&lt;/p></description></item><item><title>Azure DevOps Pills: Update java in agent machines if you use SonarCloud integration</title><link>https://www.codewrecks.com/post/azdo/pills/update-java-for-sonarcloud-agents/</link><pubDate>Sat, 12 Sep 2020 12:12:42 +0200</pubDate><guid>https://www.codewrecks.com/post/azdo/pills/update-java-for-sonarcloud-agents/</guid><description>&lt;p>If you have Azure DevOps pipelines that use the SonarCloud analyzer, you should update the Java version on your agents if you are using version 8, because support is going to be dropped.&lt;/p>
&lt;p>&lt;a target="_blank" href="../images/java-out-to-datepng.png"> &lt;img src="../images/java-out-to-datepng.png" alt="Warning message for old java version installed" />&lt;/a>
&lt;em>&lt;strong>Figure 1&lt;/strong>&lt;/em>: &lt;em>Warning message for old java version installed&lt;/em>&lt;/p>
&lt;p>You do not have many days left to solve this issue before your builds &lt;strong>start failing because the SonarCloud analyzer will no longer work&lt;/strong>. The solution is simple: just download an updated version of OpenJDK on all agent machines. To check the actual Java version used, you can check the JAVA_HOME capability directly in the agent administration page.&lt;/p></description></item><item><title>Use multiple techniques to protect your data</title><link>https://www.codewrecks.com/post/security/protect-data-with-multiple-technique/</link><pubDate>Sat, 12 Sep 2020 08:00:00 +0200</pubDate><guid>https://www.codewrecks.com/post/security/protect-data-with-multiple-technique/</guid><description>&lt;h2 id="the-problem">The problem&lt;/h2>
&lt;p>Several years ago a friend called me about a problem with MongoDb; it turned out that &lt;strong>someone, from an IP geolocated in China, had accessed the instances during the night and wiped out everything&lt;/strong>.&lt;/p>
&lt;p>The problem was due to some misconfiguration or human error or whatever that:&lt;/p>
&lt;ol>
&lt;li>turned off Windows firewall and port 27017 was open to the internet&lt;/li>
&lt;li>MongoDb was installed with no password.&lt;/li>
&lt;li>MongoDb was bound to all ip addresses of the machine&lt;/li>
&lt;/ol>
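Two of the three holes in the list above can be closed directly in MongoDB's own configuration; a hedged sketch of the relevant mongod.conf fragment (port and exact settings depend on your installation):

```yaml
# mongod.conf fragment — illustrative hardening sketch
net:
  port: 27017
  bindIp: 127.0.0.1        # listen on loopback only, not on every interface
security:
  authorization: enabled   # require authenticated users, no open access
```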
&lt;p>When it is time to protect your data, you &lt;strong>should add as many layers / techniques of protection as you can&lt;/strong>, because if one of them fails, another one can still offer protection.&lt;/p></description></item><item><title>GitHub Codespaces first impression</title><link>https://www.codewrecks.com/post/general/github-codespaces-first-impression/</link><pubDate>Sun, 06 Sep 2020 08:00:00 +0200</pubDate><guid>https://www.codewrecks.com/post/general/github-codespaces-first-impression/</guid><description>&lt;h1 id="what-is-github-codespaces">What is GitHub codespaces&lt;/h1>
&lt;p>Visual Studio Code is becoming one of the most used and productive editors on all operating systems, and it is used by a huge number of developers. Over these years Visual Studio Code has introduced lots of interesting features, like the &lt;strong>ability to connect to a Linux machine or Docker container and develop inside that container&lt;/strong>. This means that your machine can run Windows / Linux / macOS or whatever, but you can connect to a Linux machine or Linux Docker instance and &lt;strong>develop inside that container&lt;/strong>&lt;/p></description></item><item><title>Docker-compose to speed up setup dev environment</title><link>https://www.codewrecks.com/post/general/docker-compose-quick-start/</link><pubDate>Sat, 22 Aug 2020 08:00:00 +0200</pubDate><guid>https://www.codewrecks.com/post/general/docker-compose-quick-start/</guid><description>&lt;p>Even if you do not plan to use Docker to distribute your application, you can use it to speed up the setup of a development environment, for new developers and for new machines. I have a project where we use MongoDb and ElasticSearch: &lt;strong>MongoDb must be authenticated and ElasticSearch needs to have some special plugins installed&lt;/strong>.&lt;/p>
&lt;blockquote>
&lt;p>The time to set up a new machine is sometimes high due to dependencies.&lt;/p>&lt;/blockquote>
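A sketch of what such a compose file could look like; image versions, credentials and ports are invented placeholders, and a real project would add the ElasticSearch plugins via a custom image:

```yaml
# docker-compose.yml — illustrative dev-environment sketch
version: "3.8"
services:
  mongo:
    image: mongo:4.4
    environment:
      MONGO_INITDB_ROOT_USERNAME: admin     # dev-only credentials
      MONGO_INITDB_ROOT_PASSWORD: s3cret
    ports:
      - "127.0.0.1:27017:27017"             # exposed on loopback only
  elasticsearch:
    image: docker.elastic.co/elasticsearch/elasticsearch:7.9.3
    environment:
      discovery.type: single-node
    ports:
      - "127.0.0.1:9200:9200"
```

With this file in the repository, a new developer only needs `docker-compose up -d` to get both dependencies running with the expected configuration.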
&lt;p>I&amp;rsquo;m aware that for experienced users setting up MongoDb and ElasticSearch is not a complex task, but nevertheless you can still run into some problems.&lt;/p></description></item><item><title>Azure DevOps Pills: Integration with SonarCloud</title><link>https://www.codewrecks.com/post/azdo/pills/sonarcloud-integration/</link><pubDate>Thu, 20 Aug 2020 08:12:42 +0200</pubDate><guid>https://www.codewrecks.com/post/azdo/pills/sonarcloud-integration/</guid><description>&lt;p>I&amp;rsquo;ve dealt in the past with &lt;a href="http://www.codewrecks.com/blog/index.php/2018/10/10/azure-devops-pipelines-and-sonar-cloud-gives-free-analysis-to-your-os-project/">how to integrate SonarCloud analysis in a TFS/AzDo pipeline&lt;/a>, but today it is time to update that post with some interesting new capabilities.&lt;/p>
&lt;p>If you look at Figure 1 you can see that &lt;strong>SonarCloud now has a direct integration with Azure DevOps pull requests&lt;/strong>; all you need to do is add a Personal Access Token with code access privilege and you are ready to go.&lt;/p></description></item><item><title>Azure DevOps Pills: Process rules for state transition</title><link>https://www.codewrecks.com/post/azdo/pills/state-rules/</link><pubDate>Wed, 19 Aug 2020 08:12:42 +0200</pubDate><guid>https://www.codewrecks.com/post/azdo/pills/state-rules/</guid><description>&lt;p>One of the most requested features for Azure DevOps is the ability to restrict state transitions for custom processes. Whenever a company starts creating its own process, Work Item states are always a big area of discussion. Which states do we need? Who can change state from X to Y? &lt;strong>Until a few weeks ago, you could restrict transitions between states only if you had Azure DevOps Server with the old XML based process model. Now this feature is available even in the cloud version.&lt;/strong>&lt;/p></description></item><item><title>How to fix 'No matching creator found' mongodb error after upgrade</title><link>https://www.codewrecks.com/post/nosql/replace-immutable-serializer-in-mongodb/</link><pubDate>Sat, 15 Aug 2020 08:00:00 +0200</pubDate><guid>https://www.codewrecks.com/post/nosql/replace-immutable-serializer-in-mongodb/</guid><description>&lt;p>Even if the release notes seem to show no breaking changes in the latest MongoDb driver upgrade, it is possible that your code could be affected by a change in the driver and you &lt;strong>start having strange exceptions with code that worked perfectly with an older version of the driver&lt;/strong>.&lt;/p>
&lt;blockquote>
&lt;p>I have a big project where, after updating the MongoDb driver from 2.7.3 to 2.11.0, I started having all sorts of weird errors that disappear when restoring version 2.7.3 of the driver.&lt;/p></description></item><item><title>Double T shaped professional</title><link>https://www.codewrecks.com/post/general/double-t-shaped-professional/</link><pubDate>Mon, 10 Aug 2020 18:40:00 +0200</pubDate><guid>https://www.codewrecks.com/post/general/double-t-shaped-professional/</guid><description>&lt;p>The term T-Shaped Professional or T-Shaped Skills is widely used to identify a person who has deep knowledge in a specific area and broad knowledge of other areas.&lt;/p>
&lt;p>This kind of professional &lt;strong>is a perfect fit for a DevOps culture, because they can collaborate better with others on the team&lt;/strong>. As an example, a front-end developer usually has the vertical leg of the T in web technologies (Angular, TypeScript, HTML, etc.) but should also have a little bit of knowledge of backend development, networking and other areas. The same happens for other roles in the team: a network engineer has deep knowledge of network infrastructure, but should have some knowledge of databases, development and web frameworks to communicate better with others.&lt;/p></description></item><item><title>Set ip of WSL2 machine in host file</title><link>https://www.codewrecks.com/post/general/powershell/wsl2-set-ip-in-hosts/</link><pubDate>Sat, 01 Aug 2020 08:00:00 +0200</pubDate><guid>https://www.codewrecks.com/post/general/powershell/wsl2-set-ip-in-hosts/</guid><description>&lt;p>I have a WSL2 Ubuntu installation with SAMBA installed, and I really need it to answer to a specific name, something like \\ubuntuwsl.&lt;/p>
&lt;blockquote>
&lt;p>In WSL2 the machine gets its IP assigned by Hyper-V, so it is dynamic and changes at each reboot&lt;/p>&lt;/blockquote>
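A minimal sketch of the end goal (the `ubuntuwsl` name is the example from the post; the address shown is purely illustrative, since it changes at every boot):

```shell
# Ask the running distribution for its current address (first field reported):
#   wsl hostname -I
# Then keep a matching line in C:\Windows\System32\drivers\etc\hosts, e.g.:
#   172.29.112.50  ubuntuwsl
```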
&lt;p>To solve this problem it is interesting to look at &lt;strong>how you can interact with your WSL2 distribution from PowerShell&lt;/strong>; this exercise will show you how powerful WSL2 is. First of all, you can execute shell commands directly from PowerShell. As an example, this is how I can start the SAMBA daemon directly from PowerShell&lt;/p></description></item><item><title>How to locate most recent MSBuild.exe using PowerShell</title><link>https://www.codewrecks.com/post/general/find-msbuild-location-in-powershell/</link><pubDate>Sun, 26 Jul 2020 08:00:00 +0200</pubDate><guid>https://www.codewrecks.com/post/general/find-msbuild-location-in-powershell/</guid><description>&lt;p>If you want to build a Full Framework based project from PowerShell, &lt;strong>you need to locate the MsBuild.exe tool to compile your project&lt;/strong>. You can indeed &amp;ldquo;open a developer command prompt&amp;rdquo; to have a command line with all the needed tools in the %PATH%, but if you want to create a generic PowerShell script that uses MsBuild, knowing its location is probably a must.&lt;/p>
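As a sketch of one well-known approach (not necessarily the one the post settles on), vswhere.exe, which ships with the Visual Studio installer, can print the path of the newest MSBuild; the install path below is the installer's standard location:

```shell
# PowerShell sketch, assuming the default Visual Studio installer location:
#   & "${env:ProgramFiles(x86)}\Microsoft Visual Studio\Installer\vswhere.exe" `
#       -latest -requires Microsoft.Component.MSBuild `
#       -find "MSBuild\**\Bin\MSBuild.exe"
```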
&lt;p>There are some solutions on the internet, but &lt;strong>I&amp;rsquo;ve found a nice module called VSSetup that helps locate MsBuild&lt;/strong>, because it gives you interesting information for every version of Visual Studio installed on the system (VS2017 and later).&lt;/p></description></item><item><title>Some fix for Word Exporter</title><link>https://www.codewrecks.com/post/azdo/misc/export-to-word-img-and-work-item-type/</link><pubDate>Wed, 22 Jul 2020 17:00:00 +0200</pubDate><guid>https://www.codewrecks.com/post/azdo/misc/export-to-word-img-and-work-item-type/</guid><description>&lt;p>This is another post in the series &amp;ldquo;&lt;strong>how to export Work Item data to Word Document&lt;/strong>&amp;rdquo;.&lt;/p>
&lt;p>The complete code for the project can be found on &lt;a href="https://github.com/alkampfergit/AzureDevopsWordPlayground">GitHub - https://github.com/alkampfergit/AzureDevopsWordPlayground&lt;/a>&lt;/p>
&lt;p>Posts in the series:&lt;/p>
&lt;ol>
&lt;li>&lt;a href="http://www.codewrecks.com/blog/index.php/2018/12/28/azure-devops-api-connection/">API Connection&lt;/a>&lt;/li>
&lt;li>&lt;a href="http://www.codewrecks.com/blog/index.php/2018/12/28/azure-devops-api-retrieve-work-items-information/">Retrieve Work Items Information&lt;/a>&lt;/li>
&lt;li>&lt;a href="http://www.codewrecks.com/blog/index.php/2018/12/31/azure-devops-api-embed-images-into-html/">Azure DevOps API, Embed images into HTML&lt;/a>&lt;/li>
&lt;li>&lt;a href="http://www.codewrecks.com/blog/index.php/2018/12/31/create-word-document-from-work-items/">Create Word Document For Work Items&lt;/a>&lt;/li>
&lt;li>&lt;a href="http://www.codewrecks.com/blog/index.php/2019/07/10/retrieve-image-in-work-item-description-with-tfs-api/">Retrieve image in Work Item Description with TFS API&lt;/a>&lt;/li>
&lt;li>&lt;a href="http://www.codewrecks.com/blog/index.php/2019/07/25/retrieve-attachment-in-azure-devops-with-rest-api/">Retrieve Attachment in Azure DevOps with REST API in C#&lt;/a>&lt;/li>
&lt;/ol>
&lt;p>The Exporter project was updated a little bit in the past months; I had no time to blog about it, but it is probably time to give further information on &lt;strong>how to export all of your Work Items to a Word file.&lt;/strong>&lt;/p></description></item><item><title>Error 0x8004212 during Bare Metal Recovery</title><link>https://www.codewrecks.com/post/general/bare-metal-recovery/</link><pubDate>Fri, 17 Jul 2020 17:56:00 +0200</pubDate><guid>https://www.codewrecks.com/post/general/bare-metal-recovery/</guid><description>&lt;p>I have an old HP ProLiant MicroServer Gen7; it has a Turion CPU, quite slow these days, but 16GB of RAM and 6 TB of Caviar Red disks. The overall performance is acceptable: it is a domain controller for a test domain, and it is used as a NAS and for Windows Update services.&lt;/p>
&lt;p>Last week the primary disk died: it started with an annoying tick-tick noise, then it was dead, so I bought a Samsung 860 500GB SSD to replace the drive. &lt;strong>I had already used Bare Metal Restore twice on that machine&lt;/strong>, because I switched to an SSD in the past and then back to a standard disk, so I was pretty sure that the backup procedure was good. But when I replaced the disk, booted Windows Server from USB and chose to restore the disk, I got this nasty error.&lt;/p></description></item><item><title>Danger of public IPs</title><link>https://www.codewrecks.com/post/security/danger-of-public-ip/</link><pubDate>Thu, 16 Jul 2020 10:13:30 +0200</pubDate><guid>https://www.codewrecks.com/post/security/danger-of-public-ip/</guid><description>&lt;p>This morning I came across &lt;a href="https://www.comparitech.com/blog/vpn-privacy/ufo-vpn-data-exposure/">this article about another data exposure&lt;/a> and I could not help noticing that it is &lt;strong>another Elasticsearch instance exposed to the public&lt;/strong>.&lt;/p>
&lt;blockquote>
&lt;p>894 GB of data was stored in an unsecured Elasticsearch cluster.&lt;/p>&lt;/blockquote>
&lt;blockquote>
&lt;p>Due to personnel changes caused by COVID-19, we’ve not found bugs in server firewall rules immediately, which will lead to the potential risk of being hacked. And now it has been fixed.&lt;/p></description></item><item><title>Access your azure VM with Azure Bastion</title><link>https://www.codewrecks.com/post/azure/azure-bastion/</link><pubDate>Sat, 11 Jul 2020 10:45:18 +0200</pubDate><guid>https://www.codewrecks.com/post/azure/azure-bastion/</guid><description>&lt;p>There are lots of reasons to use a classic VM in Azure: even if PaaS is the preferred way to approach the cloud, IaaS is still going strong, especially because not every product is ready to run on cloud providers.&lt;/p>
&lt;p>If you need to create a standard VM, either Linux or Windows, you probably want SSH or RDP access to configure and manage it, and &lt;strong>using a public address is probably the quickest, but least secure, way to do it.&lt;/strong>&lt;/p></description></item><item><title>Principle of least privilege</title><link>https://www.codewrecks.com/post/security/principle-of-least-privilege/</link><pubDate>Sat, 04 Jul 2020 08:13:30 +0200</pubDate><guid>https://www.codewrecks.com/post/security/principle-of-least-privilege/</guid><description>&lt;p>This is the sixth article in a series of posts dealing with why it is important to strictly validate user input.&lt;/p>
&lt;ol>
&lt;li>&lt;a href="http://www.codewrecks.com/blog/index.php/2020/01/28/do-not-trust-user-input-enforce-whitelists-narrow-allowable-input/">Do not trust user input part 1&lt;/a>&lt;/li>
&lt;li>&lt;a href="http://www.codewrecks.com/blog/index.php/2020/01/29/do-not-trust-user-input-part-2/">Do not trust user input part 2&lt;/a>&lt;/li>
&lt;li>&lt;a href="http://www.codewrecks.com/blog/index.php/2020/02/19/do-not-trust-user-input-part-3/">Do not trust user input part 3&lt;/a>&lt;/li>
&lt;li>&lt;a href="http://localhost:1313/post/security/validate-user-input-4/">Validate User Input part 4&lt;/a>&lt;/li>
&lt;li>&lt;a href="http://www.codewrecks.com/post/security/do-not-disclose-error-to-the-user/">Do not disclose errors to the User part 5&lt;/a>&lt;/li>
&lt;/ol>
&lt;h2 id="a-brief-recap">A brief recap&lt;/h2>
&lt;p>Let&amp;rsquo;s return to the beginning, to the very first version of the vulnerable function.&lt;/p></description></item><item><title>Do Not Disclose Errors to the User</title><link>https://www.codewrecks.com/post/security/do-not-disclose-error-to-the-user/</link><pubDate>Fri, 03 Jul 2020 22:13:30 +0200</pubDate><guid>https://www.codewrecks.com/post/security/do-not-disclose-error-to-the-user/</guid><description>&lt;p>This is the fifth article in a series of posts dealing with why it is important to strictly validate user input.&lt;/p>
&lt;ol>
&lt;li>&lt;a href="http://www.codewrecks.com/blog/index.php/2020/01/28/do-not-trust-user-input-enforce-whitelists-narrow-allowable-input/">Do not trust user input part 1&lt;/a>&lt;/li>
&lt;li>&lt;a href="http://www.codewrecks.com/blog/index.php/2020/01/29/do-not-trust-user-input-part-2/">Do not trust user input part 2&lt;/a>&lt;/li>
&lt;li>&lt;a href="http://www.codewrecks.com/blog/index.php/2020/02/19/do-not-trust-user-input-part-3/">Do not trust user input part 3&lt;/a>&lt;/li>
&lt;li>&lt;a href="http://localhost:1313/post/security/validate-user-input-4/">Validate User Input part 4&lt;/a>&lt;/li>
&lt;/ol>
&lt;p>In the last post we analyzed why it is not fully possible to limit user input in some functions, like search. The user can search for almost any character, and it is not easy to impose a maximum length. Nevertheless, &lt;strong>limiting the string to a maximum length of 50 characters seems to break SQL injection.&lt;/strong>&lt;/p></description></item><item><title>Error in mapping MongoDb classes after updating to 2.10 driver</title><link>https://www.codewrecks.com/post/general/error-in-mongodb-serializer/</link><pubDate>Thu, 02 Jul 2020 10:17:25 +0200</pubDate><guid>https://www.codewrecks.com/post/general/error-in-mongodb-serializer/</guid><description>&lt;p>After updating a big project from the MongoDb C# driver 2.7 to the latest 2.10 version, I started having lots of errors in integration tests.&lt;/p>
&lt;div class="highlight">&lt;pre tabindex="0" style="color:#f8f8f2;background-color:#272822;-moz-tab-size:4;-o-tab-size:4;tab-size:4;">&lt;code class="language-csharp" data-lang="csharp">&lt;span style="display:flex;">&lt;span>MongoDB.Bson.BsonSerializationException
&lt;/span>&lt;/span>&lt;span style="display:flex;">&lt;span> HResult=&lt;span style="color:#ae81ff">0x80131500&lt;/span>
&lt;/span>&lt;/span>&lt;span style="display:flex;">&lt;span> Message=Creator map &lt;span style="color:#66d9ef">for&lt;/span> &lt;span style="color:#66d9ef">class&lt;/span> &lt;span style="color:#a6e22e">TestMongoBug&lt;/span>.TestArray has &lt;span style="color:#ae81ff">2&lt;/span> arguments, but none are configured.
&lt;/span>&lt;/span>&lt;span style="display:flex;">&lt;span> Source=MongoDB.Bson
&lt;/span>&lt;/span>&lt;span style="display:flex;">&lt;span> StackTrace:
&lt;/span>&lt;/span>&lt;span style="display:flex;">&lt;span> at MongoDB.Bson.Serialization.BsonCreatorMap.Freeze()
&lt;/span>&lt;/span>&lt;span style="display:flex;">&lt;span> at MongoDB.Bson.Serialization.BsonClassMap.Freeze()
&lt;/span>&lt;/span>&lt;span style="display:flex;">&lt;span> at MongoDB.Bson.Serialization.BsonClassMap.LookupClassMap(Type classType)
&lt;/span>&lt;/span>&lt;span style="display:flex;">&lt;span> at MongoDB.Bson.Serialization.BsonClassMapSerializationProvider.GetSerializer(Type type, IBsonSerializerRegistry serializerRegistry)
&lt;/span>&lt;/span>&lt;span style="display:flex;">&lt;span> at MongoDB.Bson.Serialization.BsonSerializerRegistry.CreateSerializer(Type type)
&lt;/span>&lt;/span>&lt;span style="display:flex;">&lt;span> at System.Collections.Concurrent.ConcurrentDictionary&lt;span style="color:#960050;background-color:#1e0010">`&lt;/span>&lt;span style="color:#ae81ff">2.&lt;/span>GetOrAdd(TKey key, Func&lt;span style="color:#960050;background-color:#1e0010">`&lt;/span>&lt;span style="color:#ae81ff">2&lt;/span> valueFactory)&lt;/span>&lt;/span>&lt;/code>&lt;/pre>&lt;/div>
&lt;p>The stack trace is especially strange, because it &lt;strong>reveals that the error happens when the BsonClassMap is trying to create the mapping for the object&lt;/strong>. The error message is also really strange&lt;/p></description></item><item><title>Publish PowerShell functions to PowerShell Gallery</title><link>https://www.codewrecks.com/post/general/powershell-gallery/</link><pubDate>Sun, 28 Jun 2020 08:00:00 +0200</pubDate><guid>https://www.codewrecks.com/post/general/powershell-gallery/</guid><description>&lt;p>I&amp;rsquo;m a great fan of PowerShell scripts for build and release. Even if Azure DevOps, GitHub Actions, TeamCity or Jenkins have pre-made tasks for common operations (zipping, file handling, etc.), I always like using PowerShell scripts to do most of the job, and the reason is simple: &lt;strong>PowerShell scripts are easy to test, easy to understand and are not bound to a specific CI/CD engine&lt;/strong>.&lt;/p>
&lt;p>Since I&amp;rsquo;m not a real PowerShell expert, over the years I&amp;rsquo;ve written some functions I reuse across projects, but I didn&amp;rsquo;t organize them, which led to some confusion.&lt;/p></description></item><item><title>Use Kali linux in Windows Subsystem for Linux</title><link>https://www.codewrecks.com/post/security/kali-linux-in-wsl2/</link><pubDate>Fri, 26 Jun 2020 10:00:37 +0200</pubDate><guid>https://www.codewrecks.com/post/security/kali-linux-in-wsl2/</guid><description>&lt;h2 id="kali-linux-on-windows">Kali Linux on Windows&lt;/h2>
&lt;p>Thanks to the new Windows Subsystem for Linux version 2, WSL2 for short, we now have &lt;strong>a real Linux kernel running in a real VM as the core of WSL&lt;/strong>. This finally allows using Kali Linux in a WSL environment; if you tried in the first version of WSL you probably encountered some errors with network tools like Nmap. With WSL2 everything seems to run just fine, giving you a &lt;em>quick way to have Kali Linux running in your Windows system&lt;/em> while having full integration between file systems.&lt;/p></description></item><item><title>Compiling Angular app in WSL2</title><link>https://www.codewrecks.com/post/general/compile-angular-in-wsl2/</link><pubDate>Sat, 06 Jun 2020 08:00:00 +0200</pubDate><guid>https://www.codewrecks.com/post/general/compile-angular-in-wsl2/</guid><description>&lt;p>Windows Subsystem for Linux was a nice toy to play with, but it never impressed me very much; one of the reasons is its limitations. I do not work much on Linux, but I usually have Linux boxes with MongoDB and Elasticsearch and play with security-related stuff, and for those purposes &lt;strong>I have &lt;a href="http://www.codewrecks.com/post/security/play-security-in-a-secure-environment/">dedicated virtual machines&lt;/a>.&lt;/strong>&lt;/p>
&lt;p>This was the reason why I did not find any really useful usage for WSL. I tried to do a quick Nmap scan, but I got errors, since it did not run a real full Linux kernel and Nmap wants full access to the NIC to do its stuff. &lt;strong>Now that we have WSL2, things start to change.&lt;/strong>&lt;/p></description></item><item><title>Release a product composed by multiple projects and builds</title><link>https://www.codewrecks.com/post/azdo/pipeline/release-multiple-build/</link><pubDate>Sat, 30 May 2020 15:12:42 +0200</pubDate><guid>https://www.codewrecks.com/post/azdo/pipeline/release-multiple-build/</guid><description>&lt;h2 id="situation">Situation&lt;/h2>
&lt;p>We have a legacy project, born when ASP.NET WebForms was still a thing and ASP.NET MVC was not yet released. This project grew over the years, across more than one Subversion and Git repository. It was finally time to start putting some best practices into action and, to avoid complexity, we ended up with a &lt;strong>single Git repository with six subfolders and six different solutions, each one containing a part of the final product&lt;/strong>.&lt;/p></description></item><item><title>Play security in a secure environment</title><link>https://www.codewrecks.com/post/security/play-security-in-a-secure-environment/</link><pubDate>Sat, 23 May 2020 10:00:37 +0200</pubDate><guid>https://www.codewrecks.com/post/security/play-security-in-a-secure-environment/</guid><description>&lt;p>&lt;strong>Security is one of my long-standing passions&lt;/strong>. I’ve spent lots of time on C++ and assembly (both x86 and other architectures), and in that environment I started exploring buffer overflows and other vulnerabilities. Over the course of the years security remained only a passion and not my primary skill, but I constantly spent a little time on it.&lt;/p>
&lt;p>When it is time to study offensive security, it is quite common to &lt;strong>download and install deliberately vulnerable virtual machines to test some offensive strategies&lt;/strong>, and I’m quite surprised that most of the online tutorials simply tell you to use VirtualBox (sometimes VMware Workstation) in a very basic way and completely avoid exploring more advanced scenarios.&lt;/p></description></item><item><title>How to run x86 Unit Test in a .NET core application</title><link>https://www.codewrecks.com/post/visualstudio/dotnetcore-run-test-x86/</link><pubDate>Wed, 06 May 2020 21:45:18 +0200</pubDate><guid>https://www.codewrecks.com/post/visualstudio/dotnetcore-run-test-x86/</guid><description>&lt;p>We have a standard .NET Standard solution with some projects and some unit tests; &lt;strong>everything runs perfectly until we need to force compilation of one of the projects as x86&lt;/strong>. This can be done with the RuntimeIdentifier tag in the project file.&lt;/p>
&lt;div class="highlight">&lt;pre tabindex="0" style="color:#f8f8f2;background-color:#272822;-moz-tab-size:4;-o-tab-size:4;tab-size:4;">&lt;code class="language-xml" data-lang="xml">&lt;span style="display:flex;">&lt;span>&lt;span style="color:#f92672">&amp;lt;Project&lt;/span> &lt;span style="color:#a6e22e">Sdk=&lt;/span>&lt;span style="color:#e6db74">&amp;#34;Microsoft.NET.Sdk&amp;#34;&lt;/span>&lt;span style="color:#f92672">&amp;gt;&lt;/span>
&lt;/span>&lt;/span>&lt;span style="display:flex;">&lt;span>
&lt;/span>&lt;/span>&lt;span style="display:flex;">&lt;span> &lt;span style="color:#f92672">&amp;lt;PropertyGroup&amp;gt;&lt;/span>
&lt;/span>&lt;/span>&lt;span style="display:flex;">&lt;span> &lt;span style="color:#f92672">&amp;lt;TargetFramework&amp;gt;&lt;/span>netcoreapp3.1&lt;span style="color:#f92672">&amp;lt;/TargetFramework&amp;gt;&lt;/span>
&lt;/span>&lt;/span>&lt;span style="display:flex;">&lt;span> &lt;span style="color:#f92672">&amp;lt;RuntimeIdentifier&amp;gt;&lt;/span>win-x86&lt;span style="color:#f92672">&amp;lt;/RuntimeIdentifier&amp;gt;&lt;/span>
&lt;/span>&lt;/span>&lt;span style="display:flex;">&lt;span> &lt;span style="color:#f92672">&amp;lt;/PropertyGroup&amp;gt;&lt;/span>&lt;/span>&lt;/span>&lt;/code>&lt;/pre>&lt;/div>
&lt;p>After this modification &lt;strong>tests started to fail with an error&lt;/strong> that is clearly related to the change in runtime, but was highly unexpected.&lt;/p></description></item><item><title>Advantage of Hugo</title><link>https://www.codewrecks.com/post/general/advantage-of-hugo/</link><pubDate>Sat, 02 May 2020 08:00:00 +0200</pubDate><guid>https://www.codewrecks.com/post/general/advantage-of-hugo/</guid><description>&lt;p>I must admit that I was one of the people who loved blogging with Windows Live Writer and, in general, with software that lets you simply &lt;strong>edit content, paste images, press a button and your post is online&lt;/strong>.&lt;/p>
&lt;p>Sometimes you just need to step out of your comfort zone a little bit to change perspective and find what you are doing wrong. Let me give an example: &lt;strong>I used different plugins to print formatted code in my blog&lt;/strong>, and after some migrations you can see what happened to the very first posts (around 2007).&lt;/p></description></item><item><title>Group application insight logs by custom property</title><link>https://www.codewrecks.com/post/azure/application-insight-group-logs-by-custom-property/</link><pubDate>Mon, 27 Apr 2020 18:45:18 +0200</pubDate><guid>https://www.codewrecks.com/post/azure/application-insight-group-logs-by-custom-property/</guid><description>&lt;p>Today we found an excessive number of logs in an Application Insights instance: an application that usually costs a few bucks each month started to use more resources. Looking at a summary of the last 30 days, we see an &lt;strong>excessive number of custom events&lt;/strong>.&lt;/p>
&lt;p>&lt;a target="_blank" href="../images/application-insight-summary.png"> &lt;img src="../images/application-insight-summary.png" alt="Summary of Application Insight data" />&lt;/a>&lt;/p>
&lt;p>&lt;em>&lt;strong>Figure 1:&lt;/strong>&lt;/em> &lt;em>Application insight summary for a specific application&lt;/em>&lt;/p>
&lt;p>Now the problem is: how can I quickly spot why we have an excessive number of CustomEvents? &lt;strong>The logs show me clearly that the vast majority of logs are indeed custom events&lt;/strong>. To get better insight into the details of the events, we need to use custom queries; first of all, I grouped by name.&lt;/p></description></item><item><title>Validate User Input Step 4</title><link>https://www.codewrecks.com/post/security/validate-user-input-4/</link><pubDate>Sun, 26 Apr 2020 20:14:37 +0200</pubDate><guid>https://www.codewrecks.com/post/security/validate-user-input-4/</guid><description>&lt;p>This is the fourth article in a series of posts dealing with why it is important to strictly validate user input.&lt;/p>
&lt;ol>
&lt;li>&lt;a href="http://www.codewrecks.com/blog/index.php/2020/01/28/do-not-trust-user-input-enforce-whitelists-narrow-allowable-input/">Do not trust user input part 1&lt;/a>&lt;/li>
&lt;li>&lt;a href="http://www.codewrecks.com/blog/index.php/2020/01/29/do-not-trust-user-input-part-2/">Do not trust user input part 2&lt;/a>&lt;/li>
&lt;li>&lt;a href="http://www.codewrecks.com/blog/index.php/2020/02/19/do-not-trust-user-input-part-3/">Do not trust user input part 3&lt;/a>&lt;/li>
&lt;/ol>
&lt;p>In this fourth part I will examine another problematic piece of code, obviously vulnerable to SQL injection: an API to search products.&lt;/p>
&lt;div class="highlight">&lt;div style="color:#f8f8f2;background-color:#272822;-moz-tab-size:4;-o-tab-size:4;tab-size:4;display:grid;">
&lt;table style="border-spacing:0;padding:0;margin:0;border:0;">&lt;tr>&lt;td style="vertical-align:top;padding:0;margin:0;border:0;">
&lt;pre tabindex="0" style="color:#f8f8f2;background-color:#272822;-moz-tab-size:4;-o-tab-size:4;tab-size:4;display:grid;">&lt;code>&lt;span style="white-space:pre;-webkit-user-select:none;user-select:none;margin-right:0.4em;padding:0 0.4em 0 0.4em;color:#7f7f7f">1
&lt;/span>&lt;span style="white-space:pre;-webkit-user-select:none;user-select:none;margin-right:0.4em;padding:0 0.4em 0 0.4em;color:#7f7f7f">2
&lt;/span>&lt;span style="white-space:pre;-webkit-user-select:none;user-select:none;margin-right:0.4em;padding:0 0.4em 0 0.4em;color:#7f7f7f">3
&lt;/span>&lt;span style="white-space:pre;-webkit-user-select:none;user-select:none;margin-right:0.4em;padding:0 0.4em 0 0.4em;color:#7f7f7f">4
&lt;/span>&lt;span style="white-space:pre;-webkit-user-select:none;user-select:none;margin-right:0.4em;padding:0 0.4em 0 0.4em;color:#7f7f7f">5
&lt;/span>&lt;span style="background-color:#3c3d38">&lt;span style="white-space:pre;-webkit-user-select:none;user-select:none;margin-right:0.4em;padding:0 0.4em 0 0.4em;color:#7f7f7f">6
&lt;/span>&lt;/span>&lt;span style="white-space:pre;-webkit-user-select:none;user-select:none;margin-right:0.4em;padding:0 0.4em 0 0.4em;color:#7f7f7f">7
&lt;/span>&lt;span style="white-space:pre;-webkit-user-select:none;user-select:none;margin-right:0.4em;padding:0 0.4em 0 0.4em;color:#7f7f7f">8
&lt;/span>&lt;span style="white-space:pre;-webkit-user-select:none;user-select:none;margin-right:0.4em;padding:0 0.4em 0 0.4em;color:#7f7f7f">9
&lt;/span>&lt;/code>&lt;/pre>&lt;/td>
&lt;td style="vertical-align:top;padding:0;margin:0;border:0;;width:100%">
&lt;pre tabindex="0" style="color:#f8f8f2;background-color:#272822;-moz-tab-size:4;-o-tab-size:4;tab-size:4;display:grid;">&lt;code class="language-csharp" data-lang="csharp">&lt;span style="display:flex;">&lt;span>&lt;span style="color:#a6e22e">[SwaggerResponse(typeof(IEnumerable&amp;lt;Product&amp;gt;))]&lt;/span>
&lt;/span>&lt;/span>&lt;span style="display:flex;">&lt;span>&lt;span style="color:#a6e22e">[HttpGet]&lt;/span>
&lt;/span>&lt;/span>&lt;span style="display:flex;">&lt;span>&lt;span style="color:#a6e22e">[MapToApiVersion(&amp;#34;1.0&amp;#34;)]&lt;/span>
&lt;/span>&lt;/span>&lt;span style="display:flex;">&lt;span>&lt;span style="color:#66d9ef">public&lt;/span> IActionResult SearchProducts(String searchString)
&lt;/span>&lt;/span>&lt;span style="display:flex;">&lt;span>{
&lt;/span>&lt;/span>&lt;span style="display:flex; background-color:#3c3d38">&lt;span> &lt;span style="color:#66d9ef">var&lt;/span> query = DataAccess.CreateQuery(&lt;span style="color:#e6db74">$&amp;#34;Select * from dbo.Products where productName like &amp;#39;%{searchString}%&amp;#39;&amp;#34;&lt;/span>);
&lt;/span>&lt;/span>&lt;span style="display:flex;">&lt;span> &lt;span style="color:#66d9ef">var&lt;/span> products = query.ExecuteBuildEntities&amp;lt;Product&amp;gt;(Product.Builder);
&lt;/span>&lt;/span>&lt;span style="display:flex;">&lt;span> &lt;span style="color:#66d9ef">return&lt;/span> Ok(products);
&lt;/span>&lt;/span>&lt;span style="display:flex;">&lt;span>}&lt;/span>&lt;/span>&lt;/code>&lt;/pre>&lt;/td>&lt;/tr>&lt;/table>
&lt;/div>
&lt;/div>
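The standard remediation is a parameterized query, where the search string travels as a bound value rather than as SQL text. A minimal sketch in Python with sqlite3 (the post's own code uses a C# DataAccess layer, so the names here are illustrative, not the post's API):

```python
import sqlite3

# Tiny in-memory table standing in for dbo.Products
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE Products (productName TEXT)")
conn.executemany("INSERT INTO Products VALUES (?)", [("hammer",), ("nails",)])

def search_products(search_string):
    # The driver sends the value out of band, so quotes in the input
    # cannot terminate the string literal and inject SQL.
    cur = conn.execute(
        "SELECT productName FROM Products WHERE productName LIKE ?",
        (f"%{search_string}%",),
    )
    return [row[0] for row in cur]

print(search_products("ham"))          # ['hammer']
print(search_products("' OR '1'='1"))  # [] - the payload is now just text
```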
&lt;p>As you can see, letting user input flow unconstrained into your business logic, where it is used to create a SQL query with string concatenation, is a bad idea. &lt;strong>You can simply fire up sqlmap against the API and verify that it&amp;rsquo;s indeed vulnerable.&lt;/strong>&lt;/p></description></item><item><title>Test error but build green when test are re-run</title><link>https://www.codewrecks.com/post/azdo/pipeline/reruntest/</link><pubDate>Thu, 23 Apr 2020 19:12:42 +0200</pubDate><guid>https://www.codewrecks.com/post/azdo/pipeline/reruntest/</guid><description>&lt;p>Suppose an Azure DevOps pipeline gives you this strange result: you have a clear indication that the test run failed (1), but the overall build is green, both the entire build (2) and the single stage (3).&lt;/p>
&lt;p>&lt;a target="_blank" href="../images/re-run-result.png"> &lt;img src="../images/re-run-result.png" alt="Confusing result of a build" />&lt;/a>&lt;/p>
&lt;p>&lt;em>&lt;strong>Figure 1:&lt;/strong>&lt;/em> &lt;em>Confusing result of a build&lt;/em>&lt;/p>
&lt;p>In such a situation you wonder what happened: the overall build is green, but &lt;strong>the clear indication that the test run failed gives you a bad feeling that something was not really OK.&lt;/strong> The problem is that, if you click on the Test Run Failed error message, you will be redirected to a clear log stating that the test run failed.&lt;/p></description></item><item><title>Moving to Hugo</title><link>https://www.codewrecks.com/post/general/firstpost/</link><pubDate>Mon, 13 Apr 2020 17:17:25 +0200</pubDate><guid>https://www.codewrecks.com/post/general/firstpost/</guid><description>&lt;p>I started blogging in English in 2007 and clearly the choice was WordPress. I must admit that in many years of WordPress I was always quite satisfied with the result: lots of plugins, lots of resources on the internet and the ability to post from programs like Windows Live Writer, a WYSIWYG tool.&lt;/p>
&lt;p>You can also read one of the first posts, where I &lt;a href="http://www.codewrecks.com/blog/index.php/2007/05/03/the-advantage-of-word2007-blogging/">enjoyed blogging in Word&lt;/a>; a really long time has passed since that really old post.&lt;/p></description></item><item><title>Strange Error uploading artifacts in Azure DevOps pipeline</title><link>https://www.codewrecks.com/post/old/2020/04/strange-error-uploading-artifacts-in-azure-devops-pipeline-2/</link><pubDate>Sat, 11 Apr 2020 07:00:37 +0200</pubDate><guid>https://www.codewrecks.com/post/old/2020/04/strange-error-uploading-artifacts-in-azure-devops-pipeline-2/</guid><description>&lt;p>I have a library, entirely written in .NET Core, that deals with some self-signed X509 certificates used to encrypt and digitally sign some data. The software runs perfectly and is composed of a server part and a client part.&lt;/p>
&lt;p>&lt;strong>At a certain point we decided that the client should be used not only by software that runs .NET Core, but also by software on the full framework&lt;/strong>, so I’ve changed the target framework to target both netstandard 2.0 and full framework 4.6.1; everything compiles perfectly, tests run fine, everything seems to be green. The problem is that the unit test project ran tests only with .NET Core, so I was not exercising tests on the full framework.&lt;/p></description></item><item><title>Strange Error uploading artifacts in Azure DevOps pipeline</title><link>https://www.codewrecks.com/post/old/2020/04/strange-error-uploading-artifacts-in-azure-devops-pipeline/</link><pubDate>Sat, 11 Apr 2020 06:00:37 +0200</pubDate><guid>https://www.codewrecks.com/post/old/2020/04/strange-error-uploading-artifacts-in-azure-devops-pipeline/</guid><description>&lt;p>I have a pipeline that worked perfectly for years, but yesterday a &lt;strong>build failed while uploading artifacts;&lt;/strong> I queued it again and it still failed, so it does not seem to be an intermittent error (the network could be unreliable). I was really puzzled, because since the last good build we changed 4 C# files; nothing really changed that could justify the failure, and we have no network problems that could explain trouble uploading artifacts to Azure DevOps.&lt;/p></description></item><item><title>Azure DevOps Pipeline template steps and NET Core 3 local tools</title><link>https://www.codewrecks.com/post/old/2020/04/azure-devops-pipeline-template-steps-and-net-core-3-local-tools/</link><pubDate>Tue, 07 Apr 2020 16:00:37 +0200</pubDate><guid>https://www.codewrecks.com/post/old/2020/04/azure-devops-pipeline-template-steps-and-net-core-3-local-tools/</guid><description>&lt;p>I’m a strong fan of Azure DevOps templates for pipelines, because it is a really good feature to both simplify pipeline authoring and avoid the proliferation of too many ways to do the same things. 
&lt;strong>In some of my&lt;/strong> &lt;a href="http://www.codewrecks.com/blog/index.php/2020/03/29/azure-devops-pipeline-template-for-build-and-release-net-core-project/">&lt;strong>previous examples&lt;/strong>&lt;/a> &lt;strong>I’ve always used a template that contains a full multi-stage pipeline definition&lt;/strong>; this allows you to create a new pipeline with ease: reference the repository with the template, choose the right template, set the parameters and you are ready to go.&lt;/p></description></item><item><title>Continuous Integration in GitHub Actions deploy in AzureDevops</title><link>https://www.codewrecks.com/post/old/2020/04/continuous-integration-in-github-actions-deploy-in-azuredevops/</link><pubDate>Sat, 04 Apr 2020 05:00:37 +0200</pubDate><guid>https://www.codewrecks.com/post/old/2020/04/continuous-integration-in-github-actions-deploy-in-azuredevops/</guid><description>&lt;p>My dear friend Matteo just published an &lt;a href="https://mattvsts.github.io/2020/04/03/CI-on-github-actions-CD-on-Azure-Pipelines/">interesting article on the integration between GitHub Actions and Azure DevOps Pipelines here&lt;/a>. I have a different scenario, where &lt;a href="http://www.codewrecks.com/blog/index.php/2020/03/22/github-actions-plus-gitversion/">I’ve already published a GitHub release from a GitHub Action&lt;/a>, but I have nothing ready in GitHub to deploy that release to my machines.&lt;/p>
&lt;blockquote>
&lt;p>While GitHub is really fantastic for source code and is starting to have good support for CI with Actions, it still lacks a solution for the release part. Usually this is not a problem, because we have Azure DevOps or other products that can fill the gap.&lt;/p></description></item><item><title>Azure DevOps pipeline template for build and release NET core project</title><link>https://www.codewrecks.com/post/old/2020/03/azure-devops-pipeline-template-for-build-and-release-net-core-project/</link><pubDate>Sun, 29 Mar 2020 09:00:37 +0200</pubDate><guid>https://www.codewrecks.com/post/old/2020/03/azure-devops-pipeline-template-for-build-and-release-net-core-project/</guid><description>&lt;p>Some days ago I blogged about how to release projects on GitHub with Actions; now it is time to understand &lt;strong>how to do a similar thing in Azure DevOps to build / test / publish a .NET Core library with NuGet&lt;/strong>. The purpose is to create a generic template that can be reused in every project that needs to build a utility dll, run tests and publish to a NuGet feed.&lt;/p></description></item><item><title>Strange error disallow my NET core application to start</title><link>https://www.codewrecks.com/post/old/2020/03/strange-error-disallow-my-net-core-application-to-start/</link><pubDate>Mon, 23 Mar 2020 08:00:37 +0200</pubDate><guid>https://www.codewrecks.com/post/old/2020/03/strange-error-disallow-my-net-core-application-to-start/</guid><description>&lt;p>Today I cloned onto my workstation a .NET Core application that works perfectly on my laptop, but when I started it I got this error&lt;/p>
&lt;blockquote>
&lt;p>An attempt was made to access a socket in a way forbidden by its access permissions&lt;/p>&lt;/blockquote>
&lt;p>I spent almost 10 minutes trying to find out why my netsh rule was not working (I work with a user that is not an administrator of the machine) and finally, out of frustration, I opened Visual Studio as administrator, just to verify that the error was still there.&lt;/p></description></item><item><title>Release software with GitHub actions and GitVersion</title><link>https://www.codewrecks.com/post/old/2020/03/github-actions-plus-gitversion/</link><pubDate>Sun, 22 Mar 2020 10:00:37 +0200</pubDate><guid>https://www.codewrecks.com/post/old/2020/03/github-actions-plus-gitversion/</guid><description>&lt;p>One of the nice aspects of GitHub Actions is that you can automate stuff simply with command line tools. &lt;strong>If you want to do continuous release of your software and you want to use the GitVersion tool to determine a unique SemVer version&lt;/strong> from the history, here is the sequence of steps.&lt;/p>
&lt;ol>
&lt;li>Install/update the GitVersion tool with command line tools&lt;/li>
&lt;li>Run GitVersion to determine the SemVer numbers&lt;/li>
&lt;li>Issue a standard build/test using the SemVer numbers from step 2&lt;/li>
&lt;li>If tests pass, use the dotnet publish command (with the SemVer numbers) to publish the software&lt;/li>
&lt;li>Zip and upload the publish result&lt;/li>
&lt;li>If the branch is master, publish a GitHub release of your software.&lt;/li>
&lt;/ol>
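&lt;p>The steps above can be sketched as a GitHub Actions workflow; this is only a minimal illustration (job names, tool versions and flags are assumptions, not taken from the original post):&lt;/p>

```yaml
# Sketch of the release flow described above (illustrative names and versions)
name: release
on: [push]
jobs:
  release:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v2
        with:
          fetch-depth: 0                  # GitVersion needs the full history
      - name: Install/update GitVersion as a .NET global tool
        run: dotnet tool update -g GitVersion.Tool
      - name: Determine SemVer from history
        run: echo "SEMVER=$(dotnet-gitversion /showvariable SemVer)" >> $GITHUB_ENV
      - name: Build and test with the computed version
        run: dotnet test -c Release /p:Version=${{ env.SEMVER }}
      - name: Publish
        run: dotnet publish -c Release /p:Version=${{ env.SEMVER }} -o out
```

&lt;p>On master you would then add a final step that zips the output and creates the GitHub release, for example with a release action or the GitHub CLI.&lt;/p>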
&lt;blockquote>
&lt;p>Automating your CI pipeline using only command line tools makes your build independent of the engine you are using.&lt;/p></description></item><item><title>One Team Project to rule them all</title><link>https://www.codewrecks.com/post/old/2020/03/one-team-project-to-rule-them-all/</link><pubDate>Sat, 21 Mar 2020 16:00:37 +0200</pubDate><guid>https://www.codewrecks.com/post/old/2020/03/one-team-project-to-rule-them-all/</guid><description>&lt;p>A similar post was written a long time ago, but since this is always a hot topic, it is probably time to refresh it with the new UI and new concepts of Azure DevOps.&lt;/p>
&lt;blockquote>
&lt;p>The subject is: how can I apply security to backlogs if I adopt the strategy of a single Team Project subdivided by teams?&lt;/p>&lt;/blockquote>
&lt;p>&lt;strong>The approach One Team Project to rule them all is still valid today&lt;/strong>, because once you have a team project you can divide it into Teams, where each team has its own backlog (or shares a single backlog between teams), making everything more manageable.&lt;/p></description></item><item><title>Publish artifacts in GitHub actions</title><link>https://www.codewrecks.com/post/old/2020/03/publish-artifacts-in-github-actions/</link><pubDate>Sun, 15 Mar 2020 11:00:37 +0200</pubDate><guid>https://www.codewrecks.com/post/old/2020/03/publish-artifacts-in-github-actions/</guid><description>&lt;p>GitHub Actions is perfect for automating a simple build workflow and can also be used to publish “releases” of our software. While we can use actions to publish to the cloud or elsewhere, what I appreciate in a tool is: &lt;strong>let me do simple things with a simple workflow.&lt;/strong> While I appreciate being able to obtain complex results, and indeed we sometimes evaluate products on their ability to fulfill complex scenarios, we often forget about the simple things. &lt;strong>It is true that, if a product allows me to solve complex scenarios, it will surely allow me to solve simple ones, but I wonder about the complexity&lt;/strong>.&lt;/p></description></item><item><title>Home Made Zero trust Security step 2</title><link>https://www.codewrecks.com/post/old/2020/03/home-made-zero-trust-security-step-2/</link><pubDate>Sat, 14 Mar 2020 10:00:37 +0200</pubDate><guid>https://www.codewrecks.com/post/old/2020/03/home-made-zero-trust-security-step-2/</guid><description>&lt;p>If you read my old post about how to create a simple program that can manage Windows Firewall to &lt;a href="http://www.codewrecks.com/blog/index.php/2020/01/03/home-made-zero-trust-security/">open ports&lt;/a> with a simple UDP request, you surely were disappointed by the complete lack of security in the request. 
&lt;strong>That program was no more than a mere proof of concept to understand whether I could manage Windows Firewall programmatically in .NET Core.&lt;/strong> &lt;strong>The absolutely critical problem in that program is that the UDP request to open a TCP port is sent in clear text.&lt;/strong> Basically the protocol is: a client &lt;strong>C sends the server S a UDP packet on a specific port with a secret key,&lt;/strong> and the server S checks whether the secret is correct and opens the corresponding TCP port (associated with the UDP port in configuration), for the requesting IP only and for a predetermined period of time.&lt;/p></description></item><item><title>NET core configuration array with Bind</title><link>https://www.codewrecks.com/post/old/2020/03/net-core-configuration-array-with-bind/</link><pubDate>Sat, 14 Mar 2020 09:00:37 +0200</pubDate><guid>https://www.codewrecks.com/post/old/2020/03/net-core-configuration-array-with-bind/</guid><description>&lt;p>The new configuration system of .NET Core is really nice, but it does not work very well with arrays; &lt;strong>I have a configuration object that has an array of FirewallRules&lt;/strong> &lt;div class="highlight">&lt;div style="color:#f8f8f2;background-color:#272822;-moz-tab-size:4;-o-tab-size:4;tab-size:4;">
&lt;table style="border-spacing:0;padding:0;margin:0;border:0;">&lt;tr>&lt;td style="vertical-align:top;padding:0;margin:0;border:0;">
&lt;pre tabindex="0" style="color:#f8f8f2;background-color:#272822;-moz-tab-size:4;-o-tab-size:4;tab-size:4;">&lt;code>&lt;span style="white-space:pre;-webkit-user-select:none;user-select:none;margin-right:0.4em;padding:0 0.4em 0 0.4em;color:#7f7f7f">1
&lt;/span>&lt;/code>&lt;/pre>&lt;/td>
&lt;td style="vertical-align:top;padding:0;margin:0;border:0;;width:100%">
&lt;pre tabindex="0" style="color:#f8f8f2;background-color:#272822;-moz-tab-size:4;-o-tab-size:4;tab-size:4;">&lt;code class="language-csharp" data-lang="csharp">&lt;span style="display:flex;">&lt;span> &lt;span style="color:#66d9ef">public&lt;/span> FirewallRule[] Rules { &lt;span style="color:#66d9ef">get&lt;/span>; &lt;span style="color:#66d9ef">set&lt;/span>; }&lt;/span>&lt;/span>&lt;/code>&lt;/pre>&lt;/td>&lt;/tr>&lt;/table>
&lt;/div>
&lt;/div>&lt;/p>
&lt;p>This rule is composed of four simple properties.&lt;/p>
&lt;div class="highlight">&lt;div style="color:#f8f8f2;background-color:#272822;-moz-tab-size:4;-o-tab-size:4;tab-size:4;">
&lt;table style="border-spacing:0;padding:0;margin:0;border:0;">&lt;tr>&lt;td style="vertical-align:top;padding:0;margin:0;border:0;">
&lt;pre tabindex="0" style="color:#f8f8f2;background-color:#272822;-moz-tab-size:4;-o-tab-size:4;tab-size:4;">&lt;code>&lt;span style="white-space:pre;-webkit-user-select:none;user-select:none;margin-right:0.4em;padding:0 0.4em 0 0.4em;color:#7f7f7f"> 1
&lt;/span>&lt;span style="white-space:pre;-webkit-user-select:none;user-select:none;margin-right:0.4em;padding:0 0.4em 0 0.4em;color:#7f7f7f"> 2
&lt;/span>&lt;span style="white-space:pre;-webkit-user-select:none;user-select:none;margin-right:0.4em;padding:0 0.4em 0 0.4em;color:#7f7f7f"> 3
&lt;/span>&lt;span style="white-space:pre;-webkit-user-select:none;user-select:none;margin-right:0.4em;padding:0 0.4em 0 0.4em;color:#7f7f7f"> 4
&lt;/span>&lt;span style="white-space:pre;-webkit-user-select:none;user-select:none;margin-right:0.4em;padding:0 0.4em 0 0.4em;color:#7f7f7f"> 5
&lt;/span>&lt;span style="white-space:pre;-webkit-user-select:none;user-select:none;margin-right:0.4em;padding:0 0.4em 0 0.4em;color:#7f7f7f"> 6
&lt;/span>&lt;span style="white-space:pre;-webkit-user-select:none;user-select:none;margin-right:0.4em;padding:0 0.4em 0 0.4em;color:#7f7f7f"> 7
&lt;/span>&lt;span style="white-space:pre;-webkit-user-select:none;user-select:none;margin-right:0.4em;padding:0 0.4em 0 0.4em;color:#7f7f7f"> 8
&lt;/span>&lt;span style="white-space:pre;-webkit-user-select:none;user-select:none;margin-right:0.4em;padding:0 0.4em 0 0.4em;color:#7f7f7f"> 9
&lt;/span>&lt;span style="white-space:pre;-webkit-user-select:none;user-select:none;margin-right:0.4em;padding:0 0.4em 0 0.4em;color:#7f7f7f">10
&lt;/span>&lt;span style="white-space:pre;-webkit-user-select:none;user-select:none;margin-right:0.4em;padding:0 0.4em 0 0.4em;color:#7f7f7f">11
&lt;/span>&lt;span style="white-space:pre;-webkit-user-select:none;user-select:none;margin-right:0.4em;padding:0 0.4em 0 0.4em;color:#7f7f7f">12
&lt;/span>&lt;span style="white-space:pre;-webkit-user-select:none;user-select:none;margin-right:0.4em;padding:0 0.4em 0 0.4em;color:#7f7f7f">13
&lt;/span>&lt;span style="white-space:pre;-webkit-user-select:none;user-select:none;margin-right:0.4em;padding:0 0.4em 0 0.4em;color:#7f7f7f">14
&lt;/span>&lt;span style="white-space:pre;-webkit-user-select:none;user-select:none;margin-right:0.4em;padding:0 0.4em 0 0.4em;color:#7f7f7f">15
&lt;/span>&lt;span style="white-space:pre;-webkit-user-select:none;user-select:none;margin-right:0.4em;padding:0 0.4em 0 0.4em;color:#7f7f7f">16
&lt;/span>&lt;span style="white-space:pre;-webkit-user-select:none;user-select:none;margin-right:0.4em;padding:0 0.4em 0 0.4em;color:#7f7f7f">17
&lt;/span>&lt;span style="white-space:pre;-webkit-user-select:none;user-select:none;margin-right:0.4em;padding:0 0.4em 0 0.4em;color:#7f7f7f">18
&lt;/span>&lt;span style="white-space:pre;-webkit-user-select:none;user-select:none;margin-right:0.4em;padding:0 0.4em 0 0.4em;color:#7f7f7f">19
&lt;/span>&lt;span style="white-space:pre;-webkit-user-select:none;user-select:none;margin-right:0.4em;padding:0 0.4em 0 0.4em;color:#7f7f7f">20
&lt;/span>&lt;span style="white-space:pre;-webkit-user-select:none;user-select:none;margin-right:0.4em;padding:0 0.4em 0 0.4em;color:#7f7f7f">21
&lt;/span>&lt;span style="white-space:pre;-webkit-user-select:none;user-select:none;margin-right:0.4em;padding:0 0.4em 0 0.4em;color:#7f7f7f">22
&lt;/span>&lt;/code>&lt;/pre>&lt;/td>
&lt;td style="vertical-align:top;padding:0;margin:0;border:0;;width:100%">
&lt;pre tabindex="0" style="color:#f8f8f2;background-color:#272822;-moz-tab-size:4;-o-tab-size:4;tab-size:4;">&lt;code class="language-csharp" data-lang="csharp">&lt;span style="display:flex;">&lt;span>&lt;span style="color:#66d9ef">public&lt;/span> &lt;span style="color:#66d9ef">class&lt;/span> &lt;span style="color:#a6e22e">FirewallRule&lt;/span>
&lt;/span>&lt;/span>&lt;span style="display:flex;">&lt;span>{
&lt;/span>&lt;/span>&lt;span style="display:flex;">&lt;span> &lt;span style="color:#66d9ef">internal&lt;/span> FirewallRule()
&lt;/span>&lt;/span>&lt;span style="display:flex;">&lt;span> {
&lt;/span>&lt;/span>&lt;span style="display:flex;">&lt;span> }
&lt;/span>&lt;/span>&lt;span style="display:flex;">&lt;span>
&lt;/span>&lt;/span>&lt;span style="display:flex;">&lt;span> &lt;span style="color:#66d9ef">public&lt;/span> FirewallRule(&lt;span style="color:#66d9ef">string&lt;/span> name, &lt;span style="color:#66d9ef">int&lt;/span> udpPort, &lt;span style="color:#66d9ef">int&lt;/span> tcpPort, &lt;span style="color:#66d9ef">string&lt;/span> secret)
&lt;/span>&lt;/span>&lt;span style="display:flex;">&lt;span> {
&lt;/span>&lt;/span>&lt;span style="display:flex;">&lt;span> Name = name;
&lt;/span>&lt;/span>&lt;span style="display:flex;">&lt;span> UdpPort = udpPort;
&lt;/span>&lt;/span>&lt;span style="display:flex;">&lt;span> TcpPort = tcpPort;
&lt;/span>&lt;/span>&lt;span style="display:flex;">&lt;span> Secret = secret;
&lt;/span>&lt;/span>&lt;span style="display:flex;">&lt;span> }
&lt;/span>&lt;/span>&lt;span style="display:flex;">&lt;span>
&lt;/span>&lt;/span>&lt;span style="display:flex;">&lt;span> &lt;span style="color:#66d9ef">public&lt;/span> String Name { &lt;span style="color:#66d9ef">get&lt;/span>; &lt;span style="color:#66d9ef">set&lt;/span>; }
&lt;/span>&lt;/span>&lt;span style="display:flex;">&lt;span>
&lt;/span>&lt;/span>&lt;span style="display:flex;">&lt;span> &lt;span style="color:#66d9ef">public&lt;/span> Int32 UdpPort { &lt;span style="color:#66d9ef">get&lt;/span>; &lt;span style="color:#66d9ef">set&lt;/span>; }
&lt;/span>&lt;/span>&lt;span style="display:flex;">&lt;span>
&lt;/span>&lt;/span>&lt;span style="display:flex;">&lt;span> &lt;span style="color:#66d9ef">public&lt;/span> Int32 TcpPort { &lt;span style="color:#66d9ef">get&lt;/span>; &lt;span style="color:#66d9ef">set&lt;/span>; }
&lt;/span>&lt;/span>&lt;span style="display:flex;">&lt;span>
&lt;/span>&lt;/span>&lt;span style="display:flex;">&lt;span> &lt;span style="color:#66d9ef">public&lt;/span> String Secret { &lt;span style="color:#66d9ef">get&lt;/span>; &lt;span style="color:#66d9ef">set&lt;/span>; }
&lt;/span>&lt;/span>&lt;span style="display:flex;">&lt;span>}&lt;/span>&lt;/span>&lt;/code>&lt;/pre>&lt;/td>&lt;/tr>&lt;/table>
&lt;/div>
&lt;/div>
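&lt;p>Before looking at the configuration file, here is how the binding call itself typically looks; this is only a minimal sketch (the &lt;code>FirewallOptions&lt;/code> name and file name are assumptions, not values from the original post):&lt;/p>

```csharp
// Minimal binding sketch (assumed names): bind the root of appsettings.json
// to an options object that exposes a FirewallRule[] Rules property.
// Requires Microsoft.Extensions.Configuration.Json and .Binder packages.
var configuration = new ConfigurationBuilder()
    .AddJsonFile("appsettings.json")
    .Build();

var options = new FirewallOptions();   // class holding public FirewallRule[] Rules
configuration.Bind(options);           // Bind walks public properties, arrays included
```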
&lt;p>Ok, nothing complex; now &lt;strong>I expect to be able to write this JSON configuration file to configure a single rule.&lt;/strong> &lt;div class="highlight">&lt;div style="color:#f8f8f2;background-color:#272822;-moz-tab-size:4;-o-tab-size:4;tab-size:4;">
&lt;table style="border-spacing:0;padding:0;margin:0;border:0;">&lt;tr>&lt;td style="vertical-align:top;padding:0;margin:0;border:0;">
&lt;pre tabindex="0" style="color:#f8f8f2;background-color:#272822;-moz-tab-size:4;-o-tab-size:4;tab-size:4;">&lt;code>&lt;span style="white-space:pre;-webkit-user-select:none;user-select:none;margin-right:0.4em;padding:0 0.4em 0 0.4em;color:#7f7f7f"> 1
&lt;/span>&lt;span style="white-space:pre;-webkit-user-select:none;user-select:none;margin-right:0.4em;padding:0 0.4em 0 0.4em;color:#7f7f7f"> 2
&lt;/span>&lt;span style="white-space:pre;-webkit-user-select:none;user-select:none;margin-right:0.4em;padding:0 0.4em 0 0.4em;color:#7f7f7f"> 3
&lt;/span>&lt;span style="white-space:pre;-webkit-user-select:none;user-select:none;margin-right:0.4em;padding:0 0.4em 0 0.4em;color:#7f7f7f"> 4
&lt;/span>&lt;span style="white-space:pre;-webkit-user-select:none;user-select:none;margin-right:0.4em;padding:0 0.4em 0 0.4em;color:#7f7f7f"> 5
&lt;/span>&lt;span style="white-space:pre;-webkit-user-select:none;user-select:none;margin-right:0.4em;padding:0 0.4em 0 0.4em;color:#7f7f7f"> 6
&lt;/span>&lt;span style="white-space:pre;-webkit-user-select:none;user-select:none;margin-right:0.4em;padding:0 0.4em 0 0.4em;color:#7f7f7f"> 7
&lt;/span>&lt;span style="white-space:pre;-webkit-user-select:none;user-select:none;margin-right:0.4em;padding:0 0.4em 0 0.4em;color:#7f7f7f"> 8
&lt;/span>&lt;span style="white-space:pre;-webkit-user-select:none;user-select:none;margin-right:0.4em;padding:0 0.4em 0 0.4em;color:#7f7f7f"> 9
&lt;/span>&lt;span style="white-space:pre;-webkit-user-select:none;user-select:none;margin-right:0.4em;padding:0 0.4em 0 0.4em;color:#7f7f7f">10
&lt;/span>&lt;/code>&lt;/pre>&lt;/td>
&lt;td style="vertical-align:top;padding:0;margin:0;border:0;;width:100%">
&lt;pre tabindex="0" style="color:#f8f8f2;background-color:#272822;-moz-tab-size:4;-o-tab-size:4;tab-size:4;">&lt;code class="language-js" data-lang="js">&lt;span style="display:flex;">&lt;span>{
&lt;/span>&lt;/span>&lt;span style="display:flex;">&lt;span> &lt;span style="color:#e6db74">&amp;#34;Rules&amp;#34;&lt;/span> &lt;span style="color:#f92672">:&lt;/span> [
&lt;/span>&lt;/span>&lt;span style="display:flex;">&lt;span> {
&lt;/span>&lt;/span>&lt;span style="display:flex;">&lt;span> &lt;span style="color:#e6db74">&amp;#34;Name&amp;#34;&lt;/span>&lt;span style="color:#f92672">:&lt;/span> &lt;span style="color:#e6db74">&amp;#34;Rdp&amp;#34;&lt;/span>,
&lt;/span>&lt;/span>&lt;span style="display:flex;">&lt;span> &lt;span style="color:#e6db74">&amp;#34;UdpPort&amp;#34;&lt;/span>&lt;span style="color:#f92672">:&lt;/span> &lt;span style="color:#ae81ff">23456&lt;/span>,
&lt;/span>&lt;/span>&lt;span style="display:flex;">&lt;span> &lt;span style="color:#e6db74">&amp;#34;TcpPort&amp;#34;&lt;/span>&lt;span style="color:#f92672">:&lt;/span> &lt;span style="color:#ae81ff">3389&lt;/span>,
&lt;/span>&lt;/span>&lt;span style="display:flex;">&lt;span> &lt;span style="color:#e6db74">&amp;#34;Secret&amp;#34;&lt;/span>&lt;span style="color:#f92672">:&lt;/span> &lt;span style="color:#e6db74">&amp;#34;this_is_a_secret&amp;#34;&lt;/span>
&lt;/span>&lt;/span>&lt;span style="display:flex;">&lt;span> }
&lt;/span>&lt;/span>&lt;span style="display:flex;">&lt;span> ]
&lt;/span>&lt;/span>&lt;span style="display:flex;">&lt;span>}&lt;/span>&lt;/span>&lt;/code>&lt;/pre>&lt;/td>&lt;/tr>&lt;/table>
&lt;/div>
&lt;/div>&lt;/p></description></item><item><title>GitHub actions improvements</title><link>https://www.codewrecks.com/post/old/2020/03/github-actions-improvements/</link><pubDate>Thu, 12 Mar 2020 17:00:37 +0200</pubDate><guid>https://www.codewrecks.com/post/old/2020/03/github-actions-improvements/</guid><description>&lt;p>GitHub actions is really new kid on the block and even if I still prefer Azure DevOps pipelines, because they are really more production ready, GitHub actions is rapidly evolving.&lt;/p>
&lt;p>&lt;a href="https://www.codewrecks.com/blog/wp-content/uploads/2020/03/SNAGHTML3961c2.png">&lt;a target="_blank" href="https://www.codewrecks.com/blog/wp-content/uploads/2020/03/SNAGHTML3961c2_thumb.png"> &lt;img src="https://www.codewrecks.com/blog/wp-content/uploads/2020/03/SNAGHTML3961c2_thumb.png" alt="SNAGHTML3961c2" />&lt;/a>&lt;/a>&lt;/p>
&lt;p>&lt;strong>Figure 1&lt;/strong> : &lt;em>GitHub actions now has a dedicated editor for actions to quickly include actions&lt;/em>&lt;/p>
&lt;p>As you can see in &lt;strong>Figure 1&lt;/strong>, when you edit a workflow file in the GitHub online editor &lt;strong>you can simply browse all available actions&lt;/strong>. Choosing a specific action reveals the snippet of text you should enter to use it, without the need to search around.&lt;/p></description></item><item><title>Azure DevOps YAML pipeline authorization problem</title><link>https://www.codewrecks.com/post/old/2020/03/azure-devops-yaml-pipeline-authorization-problem/</link><pubDate>Tue, 10 Mar 2020 18:00:37 +0200</pubDate><guid>https://www.codewrecks.com/post/old/2020/03/azure-devops-yaml-pipeline-authorization-problem/</guid><description>&lt;p>It can sometimes happen that when you create a pipeline in Azure DevOps, at the first run you get the following error.&lt;/p>
&lt;blockquote>
&lt;p>##[error]Pipeline does not have permissions to use the referenced pool(s) Default. For authorization details, refer to &lt;a href="https://aka.ms/yamlauthz">https://aka.ms/yamlauthz&lt;/a>.&lt;/p>&lt;/blockquote>
&lt;p>There is more than one kind of this error; the most common one is a build using some external resource that requires authorization, but &lt;strong>this specific error message means the pipeline has no permission to run on the default pool&lt;/strong>.&lt;/p></description></item><item><title>Use latest OS image tag in GitHub actions</title><link>https://www.codewrecks.com/post/old/2020/03/use-latest-os-image-tag-in-github-actions/</link><pubDate>Sun, 08 Mar 2020 09:00:37 +0200</pubDate><guid>https://www.codewrecks.com/post/old/2020/03/use-latest-os-image-tag-in-github-actions/</guid><description>&lt;p>I have a nice GitHub action that runs some builds and tests on my project; I noticed that some of the latest runs have a problem.&lt;/p>
&lt;p>&lt;a href="https://www.codewrecks.com/blog/wp-content/uploads/2020/03/image.png">&lt;a target="_blank" href="https://www.codewrecks.com/blog/wp-content/uploads/2020/03/image_thumb.png"> &lt;img src="https://www.codewrecks.com/blog/wp-content/uploads/2020/03/image_thumb.png" alt="image" />&lt;/a>&lt;/a>&lt;/p>
&lt;p>&lt;em>&lt;strong>Figure 1&lt;/strong>&lt;/em>: &lt;em>My action that ran only one of the matrix combination&lt;/em>&lt;/p>
&lt;p>The action has two distinct runs because it has a matrix: I want to run it against both the Linux and Windows operating systems, but it seems that it no longer runs on Windows.&lt;/p></description></item><item><title>GitHub Actions plus Azure Docker Registry</title><link>https://www.codewrecks.com/post/old/2020/02/github-actions-plus-azure-docker-registry/</link><pubDate>Tue, 25 Feb 2020 17:00:37 +0200</pubDate><guid>https://www.codewrecks.com/post/old/2020/02/github-actions-plus-azure-docker-registry/</guid><description>&lt;p>I have some projects that need SqlServer and MongoDb or ElasticSearch to run some integration tests; this kind of requirement makes it difficult to use hosted agents for builds (in Azure DevOps) or with &lt;strong>whatever build system you are using where a provider gives you pre-configured machines to run your workflow.&lt;/strong> Usually each build engine makes it possible to run your own agent, and GitHub Actions is no different (you can read here about self-hosted action runners: &lt;a href="https://help.github.com/en/actions/hosting-your-own-runners/about-self-hosted-runners">https://help.github.com/en/actions/hosting-your-own-runners/about-self-hosted-runners&lt;/a>)&lt;/p></description></item><item><title>Do not trust user input part 3</title><link>https://www.codewrecks.com/post/old/2020/02/do-not-trust-user-input-part-3/</link><pubDate>Wed, 19 Feb 2020 18:00:37 +0200</pubDate><guid>https://www.codewrecks.com/post/old/2020/02/do-not-trust-user-input-part-3/</guid><description>&lt;p>In &lt;a href="http://www.codewrecks.com/blog/index.php/2020/01/29/do-not-trust-user-input-part-2/">part 2&lt;/a> we continued our journey to prevent malicious users from receiving dangerous data, limiting the customer id to a 5-letter string value. We have two aspects to improve, because I usually get two complaints when I show that code.&lt;/p>
&lt;p>First: the &lt;strong>Customer object has a composite id, and the serialized value is somewhat clumsy to access from client code&lt;/strong>, as you can see in &lt;strong>Figure 1&lt;/strong>. Second: if you forget to create a CustomerId from the value passed by the user, you are still a victim of SQL Injection.&lt;/p></description></item><item><title>Azure DevOps Git repository options</title><link>https://www.codewrecks.com/post/old/2020/02/azure-devops-git-repository-options/</link><pubDate>Wed, 12 Feb 2020 17:00:37 +0200</pubDate><guid>https://www.codewrecks.com/post/old/2020/02/azure-devops-git-repository-options/</guid><description>&lt;p>Azure DevOps is a big product, and users often start using it without fully exploring all the possibilities. As an example, when it is time to work with Git repositories, users just create repositories and start working without any further configuration.&lt;/p>
&lt;p>&lt;strong>If you navigate to the Repos section of the Project Settings page, you can configure lots of options for repositories.&lt;/strong> Security is probably the most important setting, because it determines who can access a specific repository and what permissions each user / group has in the context of that very repository.&lt;/p></description></item><item><title>Do not trust user input part 2</title><link>https://www.codewrecks.com/post/old/2020/01/do-not-trust-user-input-part-2/</link><pubDate>Wed, 29 Jan 2020 20:00:37 +0200</pubDate><guid>https://www.codewrecks.com/post/old/2020/01/do-not-trust-user-input-part-2/</guid><description>&lt;p>After we fixed our code in &lt;a href="http://www.codewrecks.com/blog/index.php/2020/01/28/do-not-trust-user-input-enforce-whitelists-narrow-allowable-input/">part 1&lt;/a> of this series, we continue to expand our API by adding a method to select a Customer. The Northwind database Customer table has an id of type string, so we could start with this very bad, bad, bad piece of code.&lt;/p>
&lt;p>&lt;a href="https://www.codewrecks.com/blog/wp-content/uploads/2020/01/image-11.png">&lt;a target="_blank" href="https://www.codewrecks.com/blog/wp-content/uploads/2020/01/image_thumb-11.png"> &lt;img src="https://www.codewrecks.com/blog/wp-content/uploads/2020/01/image_thumb-11.png" alt="image" />&lt;/a>&lt;/a>&lt;/p>
&lt;p>&lt;em>&lt;strong>Figure 1&lt;/strong>&lt;/em>: &lt;em>Another bad example of API vulnerable with Sql Injection&lt;/em>&lt;/p>
&lt;p>Again the question is: what is the most critical error in that piece of code? If you answered “query with string concatenation” you are probably wrong. That is indeed a huge problem, but &lt;strong>in my mind accepting a raw string from the user is still the number one problem&lt;/strong>.&lt;/p></description></item><item><title>Do not trust user input</title><link>https://www.codewrecks.com/post/old/2020/01/do-not-trust-user-input-enforce-whitelists-narrow-allowable-input/</link><pubDate>Tue, 28 Jan 2020 21:00:37 +0200</pubDate><guid>https://www.codewrecks.com/post/old/2020/01/do-not-trust-user-input-enforce-whitelists-narrow-allowable-input/</guid><description>&lt;p>It is time to start blogging a little bit about security, because injection is still high in the OWASP Top 10, which implies that &lt;strong>people still trust their users&lt;/strong>. Remember, you should not trust your users, never, never, never, because for 10,000 good users there could be 1 bad user, and he/she is enough to damage your organization.&lt;/p>
&lt;p>Here you have a really bad, bad, bad piece of code that is meant to allow product retrieval from the Northwind database Products table.&lt;/p></description></item><item><title>Windows Docker Container for Azure Devops Build agent</title><link>https://www.codewrecks.com/post/old/2020/01/windows-docker-container-for-azure-devops-build-agent/</link><pubDate>Sat, 25 Jan 2020 11:00:37 +0200</pubDate><guid>https://www.codewrecks.com/post/old/2020/01/windows-docker-container-for-azure-devops-build-agent/</guid><description>&lt;p>Thanks to Docker Compose, &lt;a href="http://www.codewrecks.com/blog/index.php/2019/12/27/azure-devops-agent-with-docker-compose/">I can spin off an agent for Azure DevOps in mere seconds&lt;/a> (once you have all the images). All I need to do is insert the address of my account and a valid token, and an agent is ready.&lt;/p>
&lt;p>&lt;strong>With .NET Core everything is simple, because we have a nice build task that automatically installs the .NET Core SDK on the agent,&lt;/strong> and the very same goes for node.js. This approach is really nice, because it does not require preinstalling too much stuff on your agent; everything is downloaded and installed on the fly when a build needs that specific tooling.&lt;/p></description></item><item><title>Why I love DevOps and hate DevSecOps</title><link>https://www.codewrecks.com/post/old/2020/01/why-i-love-devops-and-hate-devsecops/</link><pubDate>Sat, 18 Jan 2020 15:00:37 +0200</pubDate><guid>https://www.codewrecks.com/post/old/2020/01/why-i-love-devops-and-hate-devsecops/</guid><description>&lt;p>DevOps is becoming a buzzword: it generates hype and everyone wants to be part of it, even those who do not know exactly what DevOps is. &lt;strong>One of the symptoms of this is the “DevOps Engineer”, a title that does not fit in my head.&lt;/strong> We could debate for days or years on the right definition of DevOps, but essentially it is a &lt;em>cultural approach to building software, focused on building the right thing with maximum quality and satisfaction for the customer.&lt;/em>&lt;/p></description></item><item><title>Home Made zero trust security</title><link>https://www.codewrecks.com/post/old/2020/01/home-made-zero-trust-security/</link><pubDate>Fri, 03 Jan 2020 16:00:37 +0200</pubDate><guid>https://www.codewrecks.com/post/old/2020/01/home-made-zero-trust-security/</guid><description>&lt;p>I have a small office with some computers and servers and, since I’m a fan of Zero Trust security, I have the firewall enabled even on the local network. 
&lt;strong>I’m especially concerned about my primary workstation, a Windows machine where I have explicitly created firewall rules to block EVERY packet coming from other machines on the network.&lt;/strong> I have backups, I have antivirus, but that machine is important and I do not want it to be compromised; working with a rule that blocks every contact from outside is nice and makes it secure, but sometimes it is inconvenient.&lt;/p></description></item><item><title>Azure DevOps agent with Docker Compose</title><link>https://www.codewrecks.com/post/old/2019/12/azure-devops-agent-with-docker-compose/</link><pubDate>Fri, 27 Dec 2019 20:00:37 +0200</pubDate><guid>https://www.codewrecks.com/post/old/2019/12/azure-devops-agent-with-docker-compose/</guid><description>&lt;p>I’ve dealt in the past with using Docker for your Azure DevOps Linux build agent in a post called &lt;a href="http://www.codewrecks.com/blog/index.php/2017/10/14/configure-a-vsts-linux-agent-with-docker-in-minutes/">Configure a VSTS Linux agent with docker in minutes&lt;/a> and I’ve also blogged on how you can &lt;a href="http://www.codewrecks.com/blog/index.php/2019/06/10/hosted-agents-plus-docker-perfect-match-for-azure-devops-and-open-source-project/">use Docker inside a build definition to provide some prerequisites for testing&lt;/a> (like MongoDb and Sql Server); now it is time to move a little step further and leverage Docker Compose.&lt;/p>
&lt;p>&lt;strong>Using Docker commands in a pipeline definition is nice, but it has some drawbacks:&lt;/strong> first of all, this approach suffers in speed of execution, because the container must start each time you run a build (and should be stopped at the end of the build). It is indeed true that, if the Docker image is already present on the agent machine, startup time is not so high, but some images, like MsSql, are not immediately operative, so you need to wait for them to be ready for every build. The alternative is to leave them running even after the build is finished, but this could lead to resource exhaustion.&lt;/p></description></item><item><title>Check for Malware in a Azure DevOps Pipeline</title><link>https://www.codewrecks.com/post/old/2019/12/check-for-malware-in-a-azure-devops-pipeline/</link><pubDate>Sat, 14 Dec 2019 09:00:37 +0200</pubDate><guid>https://www.codewrecks.com/post/old/2019/12/check-for-malware-in-a-azure-devops-pipeline/</guid><description>&lt;p>In a previous post I showed &lt;a href="http://www.codewrecks.com/blog/index.php/2019/11/23/quick-peek-at-microsoft-security-code-analysis-credential-scanner/">Credential Scanner&lt;/a>, a special task that is part of &lt;a href="https://secdevtools.azurewebsites.net/">Microsoft Security Code Analysis&lt;/a> available in Azure; today &lt;strong>I want to have a quick peek at the&lt;/strong> &lt;a href="https://secdevtools.azurewebsites.net/helpantimalware.html">&lt;strong>Anti Malware scanner task.&lt;/strong>&lt;/a>&lt;/p>
&lt;p>First of all, a simple consideration: I’ve been asked several times whether there is any need for antivirus or antimalware tools on build machines; after all, the code being built is developed by your own developers, so there should be no need for such tools, right? In my opinion this is a false assumption; here are some quick considerations on how malware can be downloaded onto your build machine&lt;/p></description></item><item><title>Consume Azure DevOps feed in TeamCity</title><link>https://www.codewrecks.com/post/old/2019/12/consume-azure-devops-feed-in-teamcity/</link><pubDate>Wed, 04 Dec 2019 18:00:37 +0200</pubDate><guid>https://www.codewrecks.com/post/old/2019/12/consume-azure-devops-feed-in-teamcity/</guid><description>&lt;p>&lt;strong>Azure DevOps has integrated feed management you can use for NuGet, npm, etc.; the feed is private and only authorized users can download / upload packages.&lt;/strong> Today I had a little problem setting up a build in TeamCity that uses a feed in Azure DevOps, because it failed with a 401 (unauthorized) error&lt;/p>
&lt;blockquote>
&lt;p>The problem with Azure DevOps NuGet feeds is how to authenticate another toolchain or build server.&lt;/p>&lt;/blockquote>
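One way to let an external toolchain authenticate, sketched here as an assumption-laden example rather than the post’s actual solution: register the feed with explicit credentials, using an Azure DevOps Personal Access Token (PAT) as the password. The organization, feed name, and source name below are placeholders.

```shell
# Sketch: register the Azure DevOps feed on the external build agent with
# explicit credentials via nuget.exe. Any non-empty username works; the PAT
# (here assumed to be in $AZURE_DEVOPS_PAT) is what actually authenticates.
nuget sources add \
  -Name "AzureDevOpsFeed" \
  -Source "https://pkgs.dev.azure.com/ORGANIZATION/_packaging/FEEDNAME/nuget/v3/index.json" \
  -UserName "az" \
  -Password "$AZURE_DEVOPS_PAT"
```

nuget.exe stores these credentials in the user-level NuGet.config on the agent, so subsequent restores against that feed authenticate automatically.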
&lt;p>This project still has some old builds in TeamCity, but when it started consuming packages published in Azure DevOps, the TeamCity builds started failing with a 401 (unauthorized) error. The question is: how can I consume an Azure DevOps NuGet feed from agents or tools that are not related to the Azure DevOps site itself?&lt;/p></description></item><item><title>BruteForcing login with Hydra</title><link>https://www.codewrecks.com/post/old/2019/11/bruteforcing-login-with-hydra/</link><pubDate>Fri, 29 Nov 2019 18:00:37 +0200</pubDate><guid>https://www.codewrecks.com/post/old/2019/11/bruteforcing-login-with-hydra/</guid><description>&lt;p>Without any doubt, Hydra is one of the best tools to brute-force passwords. It has support for many protocols, but &lt;strong>it can be used against standard web sites as well, forcing a standard POST-based login&lt;/strong>. The syntax is a little different from a normal scan (like SSH) and is similar to this command line.&lt;/p>
&lt;p>./hydra -l username -P x:\temp\rockyou.txt hostname -s port http-post-form "/loginpage-address:user=^USER^&amp;amp;password=^PASS^:Invalid password!"&lt;/p>
&lt;p>Dissecting the parameters, you have:&lt;/p></description></item><item><title>Security in 2019 still unprotected ElasticSearch instance exists</title><link>https://www.codewrecks.com/post/old/2019/11/security-in-2019-still-unprotected-elasticsearch-instance-exists/</link><pubDate>Sun, 24 Nov 2019 13:00:37 +0200</pubDate><guid>https://www.codewrecks.com/post/old/2019/11/security-in-2019-still-unprotected-elasticsearch-instance-exists/</guid><description>&lt;p>Today I received a notification from &lt;a href="https://haveibeenpwned.com/" title="https://haveibeenpwned.com/">https://haveibeenpwned.com/&lt;/a> because one of my email addresses was present in a data breach.&lt;/p>
&lt;p>&lt;a href="https://www.codewrecks.com/blog/wp-content/uploads/2019/11/image-23.png">&lt;a target="_blank" href="https://www.codewrecks.com/blog/wp-content/uploads/2019/11/image_thumb-23.png"> &lt;img src="https://www.codewrecks.com/blog/wp-content/uploads/2019/11/image_thumb-23.png" alt="image" />&lt;/a>&lt;/a>&lt;/p>
&lt;p>OK, it happens, but two things disturbed me. The first is that I had really never heard of those guys (People Data Labs); this is because they are one of the companies that harvest public data from online sources, aggregate it and re-sell it as “data enrichment”. This means that they probably have only public data on me. &lt;strong>If you are interested you can read Troy Hunt’s article&lt;/strong> &lt;a href="https://www.troyhunt.com/data-enrichment-people-data-labs-and-another-622m-email-addresses/" title="https://www.troyhunt.com/data-enrichment-people-data-labs-and-another-622m-email-addresses/">&lt;strong>https://www.troyhunt.com/data-enrichment-people-data-labs-and-another-622m-email-addresses/&lt;/strong>&lt;/a> &lt;strong>for details about this breach.&lt;/strong> &lt;strong>But the second, and more disturbing, issue is that, in 2019, people still leave ElasticSearch instances open and unprotected in the wild.&lt;/strong> This demonstrates really low attention to security, especially when Elasticsearch runs on a server with public exposure. 
It is really sad to see that security is still a second-class citizen in software development; if it were not, such trivial errors would not be made.&lt;/p></description></item><item><title>Quick Peek at Microsoft Security Code Analysis Credential Scanner</title><link>https://www.codewrecks.com/post/old/2019/11/quick-peek-at-microsoft-security-code-analysis-credential-scanner/</link><pubDate>Sat, 23 Nov 2019 16:00:37 +0200</pubDate><guid>https://www.codewrecks.com/post/old/2019/11/quick-peek-at-microsoft-security-code-analysis-credential-scanner/</guid><description>&lt;p>&lt;a href="https://secdevtools.azurewebsites.net/">&lt;strong>Microsoft Security Code Analysis&lt;/strong>&lt;/a> &lt;strong>contains a set of tasks for Azure DevOps pipelines to automate some security checks during the build of your software.&lt;/strong> Automatic security scanning tools are in no way a substitute for human security analysis; remember: if you develop code ignoring security, no tool can save you.&lt;/p>
&lt;p>&lt;strong>Despite this fact, there are situations where static analysis can really benefit you,&lt;/strong> because it can save you from some simple and silly errors that can lead to trouble. Each task in the Microsoft Security Code Analysis package is designed to solve a particular problem and to prevent some common mistake.&lt;/p></description></item><item><title>Unable to execute NET core unit test in VS after uninstalling older sdk</title><link>https://www.codewrecks.com/post/old/2019/11/unable-to-execute-net-core-unit-test-in-vs-after-uninstalling-older-sdk/</link><pubDate>Thu, 21 Nov 2019 20:00:37 +0200</pubDate><guid>https://www.codewrecks.com/post/old/2019/11/unable-to-execute-net-core-unit-test-in-vs-after-uninstalling-older-sdk/</guid><description>&lt;p>It is not uncommon to have many versions of the .NET Core framework installed, especially after many Visual Studio updates. &lt;strong>Each installation consumes disk space, so I decided to clean up everything, leaving only the latest major versions of the framework installed on the system.&lt;/strong> Everything worked fine, except the Visual Studio Test Explorer which, upon a test run request, generated this error in the Tests output window&lt;/p>
&lt;p>&lt;font size="3">[21/11/2019 6:08:03.911] ========== Discovery aborted: 0 tests found (0:00:05,5002292) ==========&lt;br>[21/11/2019 6:08:03.934] ———- Run started ———-&lt;br>Testhost process exited with error: It was not possible to find any compatible framework version&lt;br>The framework ‘Microsoft.NETCore.App’, version ‘2.2.0’ was not found.&lt;/font>&lt;/p></description></item><item><title>Multiline PowerShell on YAML pipeline</title><link>https://www.codewrecks.com/post/old/2019/11/multiline-powershell-on-yaml-pipeline/</link><pubDate>Tue, 19 Nov 2019 18:00:37 +0200</pubDate><guid>https://www.codewrecks.com/post/old/2019/11/multiline-powershell-on-yaml-pipeline/</guid><description>&lt;p>Sometimes having a &lt;strong>few lines of PowerShell in your pipeline is the only thing you need to quickly customize a build without using a custom task or having a PowerShell file in source code&lt;/strong>. One of the typical situation is: write a file with some content that needs to be determined by a PowerShell script, in my situation I need to create a configuration file based on some build variable.&lt;/p></description></item><item><title>Azure DevOps multi stage pipeline environments</title><link>https://www.codewrecks.com/post/old/2019/11/azure-devops-multi-stage-pipeline-environments/</link><pubDate>Tue, 12 Nov 2019 18:00:37 +0200</pubDate><guid>https://www.codewrecks.com/post/old/2019/11/azure-devops-multi-stage-pipeline-environments/</guid><description>&lt;p>In a previous post on &lt;a href="http://www.codewrecks.com/blog/index.php/2019/10/21/release-app-with-azure-devops-multi-stage-pipeline/">releasing with Multi Stage Pipeline and YAML code&lt;/a> I briefly introduced the concept of environments. 
In that example I used an environment called single_env and &lt;strong>you may be surprised that, by default, an environment is automatically created when the release runs.&lt;/strong> This happens because an environment can be seen as a set of resources used as a target for deployments, but in the current preview version of Azure DevOps you can only add Kubernetes resources. The question is: &lt;strong>why did I use an environment to deploy an application to Azure if there is no connection between the environment and your Azure resources?&lt;/strong>&lt;/p>&lt;blockquote>&lt;p>At this stage of the preview, we can only connect Kubernetes to an environment; no other physical resource can be linked.&lt;/p>&lt;/blockquote></description></item><item><title>GitHub security Alerts</title><link>https://www.codewrecks.com/post/old/2019/10/github-security-alerts/</link><pubDate>Tue, 22 Oct 2019 16:00:37 +0200</pubDate><guid>https://www.codewrecks.com/post/old/2019/10/github-security-alerts/</guid><description>&lt;p>I really love everything about security and I’m really intrigued by the GitHub Security tab that is now present on your repository. It is usually disabled by default.&lt;/p>
&lt;p>&lt;a href="https://www.codewrecks.com/blog/wp-content/uploads/2019/10/image-44.png">&lt;a target="_blank" href="https://www.codewrecks.com/blog/wp-content/uploads/2019/10/image_thumb-44.png"> &lt;img src="https://www.codewrecks.com/blog/wp-content/uploads/2019/10/image_thumb-44.png" alt="image" />&lt;/a>&lt;/a>&lt;/p>
&lt;p>&lt;em>&lt;strong>Figure 1&lt;/strong>&lt;/em>: &lt;em>GitHub Security tab on your repository&lt;/em>&lt;/p>
&lt;p>&lt;strong>If you enable it, you start receiving suggestions based on the code that you check in to the repository&lt;/strong>; as an example, GitHub will scan your npm package sources to find dependencies on libraries that are insecure.&lt;/p></description></item></channel></rss>