<?xml version="1.0" encoding="utf-8" ?>


<rss version="2.0"
	 xmlns:dc="http://purl.org/dc/elements/1.1/"
	 xmlns:sy="http://purl.org/rss/1.0/modules/syndication/"
	 xmlns:admin="http://webns.net/mvcb/"
	 xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#"
	 xmlns:atom="http://www.w3.org/2005/Atom"
	 xmlns:content="http://purl.org/rss/1.0/modules/content/">
	
	<channel>

	<title><![CDATA[Blog of Andrei Marukovich]]></title>
	<atom:link href="http://lunarfrog.com/blog/feed" rel="self" type="application/rss+xml"></atom:link>
	<link>http://lunarfrog.com/blog</link>
	<description>LunarFrog blog</description>
	<dc:language>en</dc:language>
	<dc:creator>contact@lunarfrog.com</dc:creator>
	<dc:rights>Copyright 2026</dc:rights>
	<dc:date>2026-04-10T07:04:27-04:00</dc:date>
	<admin:generatorAgent rdf:resource="http://statamic.com/" />

			<item>
			<title><![CDATA[Cross-platform desktop applications with Avalonia and .NET Core]]></title>
			<link>http://lunarfrog.com/blog/cross-platform-desktop-applications-avalonia</link>
			<guid>http://lunarfrog.com/blog/cross-platform-desktop-applications-avalonia</guid>
<description><![CDATA[<p>One part of the .NET Framework which did not find its way to .NET Core is the UI stack. Neither WinForms nor WPF is part of .NET Standard or supported by .NET Core. They are tightly coupled with Windows and may need significant rework to become available on other platforms.
To fill the gap and challenge the dominance of JavaScript and Electron in cross-platform development, the .NET community started the Avalonia project – a cross-platform UI framework inspired by WPF and running on top of .NET Core. Last week the Avalonia project announced a beta release, so it is a good time to try it and see what it can do.</p>

<h2>Overview</h2>

<p>Avalonia is inspired by WPF, but it does not try to stay compatible with WPF or any other XAML stack. The project uses its own dialect of XAML, and the biggest difference is the way it handles styles. Not only does it drop the idea of resource dictionaries, it also uses a CSS-like concept of selectors for applying styles.</p>

<script src="https://gist.github.com/AndreiMarukovich/84eb16056e61772a05c42f7a772c1318.js"></script>

<p>This is an example of how a simple window and a styled control look with Avalonia. It looks familiar, but there are twists that may confuse WPF developers from time to time. Experience with CSS will definitely be helpful for picking up the new concepts.</p>

<p>This code works on top of .NET Core, allowing development of desktop applications for Windows, Linux and macOS. Depending on the execution platform, Avalonia uses different platform-specific implementations and rendering engines.</p>

<h2>Usage</h2>

<p>Getting started is extremely easy. There is a Visual Studio extension which adds Avalonia templates to VS and brings a visual designer for Avalonia XAML.</p>

<p><img src="http://lunarfrog.com/assets/img/resized/avalonia-new.png" alt="New Avalonia project" /></p>

<p>An alternative approach is to use the Avalonia templates for .NET Core and create a new application via the dotnet new command.</p>
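
<p>For example (the exact template package and template names are as I recall them and may differ between Avalonia versions, so treat these commands as a sketch):</p>

<pre><code># Install the Avalonia templates for the dotnet CLI
dotnet new --install Avalonia.Templates

# Create a new Avalonia application
dotnet new avalonia.app -o MyAvaloniaApp</code></pre>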

<p>The build generates a normal .NET Core project which can be deployed like any other .NET application. For a simple application, the development experience is surprisingly smooth. Now it is time to try Avalonia with a larger project to understand the gaps and any issues with running the app on different platforms.</p>]]></description>
			<dc:subject><![CDATA[.NET]]></dc:subject>
			<dc:date>2018-02-22T00:00:00-05:00</dc:date>
		</item>
			<item>
			<title><![CDATA[Windows Compatibility Pack for .NET Core]]></title>
			<link>http://lunarfrog.com/blog/windows-compatibility-pack-net-core</link>
			<guid>http://lunarfrog.com/blog/windows-compatibility-pack-net-core</guid>
<description><![CDATA[<p>.NET Core 2 is a great release from many angles. By implementing .NET Standard 2.0, it doubles the number of APIs which can be shared between .NET Core and .NET Standard applications. It allows referencing .NET Framework libraries. In some aspects, it is even faster than the .NET Framework. But is it enough to allow migration of real-world .NET Framework code to .NET Core?</p>

<p>The results of running a compatibility analyzer on an enterprise-grade code base will be unspectacular. These types of applications heavily use XML configuration and ConfigurationSection, which are not part of .NET Standard; they often depend on WCF; and they use WinForms or WPF for their UIs. In my experience, even for UI-less libraries, about 20% of the code is not transferable between .NET Core and .NET Standard.</p>

<h2>Windows Compatibility Pack for .NET Core aims to fill this gap.</h2>

<p>The compatibility pack is a set of packages which provide the missing APIs and allow sharing more code between the platforms. UI libraries are still missing, but back-end developers should be happier now.</p>

<p>The compatibility pack solves the problem of missing APIs by combining type-forwarding and re-implementation approaches.</p>

<p>When an API exposed by the compatibility pack is used in a .NET Framework execution environment, the type is forwarded and the existing .NET Framework implementation is used. For .NET Core, however, the pack provides a new implementation.</p>
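
<p>Using the pack comes down to referencing a single meta-package from the project file (the version below was the initial release and is shown only as an example – check NuGet for the current one):</p>

<pre><code><ItemGroup>
  <PackageReference Include="Microsoft.Windows.Compatibility" Version="2.0.0" />
</ItemGroup></code></pre>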

<p>It is important to understand that the Windows Compatibility Pack does not bring all these features to other platforms. It makes the missing APIs available, but some of them work only on Windows, because even the re-implemented code still depends on Windows.</p>

<p><a href="https://www.nuget.org/packages/Microsoft.Windows.Compatibility">Windows Compatibility Pack for .NET Core</a> is not a permanent solution which will stay in a codebase forever. The intention of the pack is to build a temporary bridge, allowing adoption of .NET Core to a greater extent. However, in the long term the goal stays the same – replace outdated APIs and features of .NET with newer, .NET Standard-compatible alternatives.</p>]]></description>
			<dc:subject><![CDATA[.NET]]></dc:subject>
			<dc:date>2018-01-11T00:00:00-05:00</dc:date>
		</item>
			<item>
			<title><![CDATA[Is there a place for DevOps in desktop software development?]]></title>
			<link>http://lunarfrog.com/blog/devops-desktop-software-development</link>
			<guid>http://lunarfrog.com/blog/devops-desktop-software-development</guid>
<description><![CDATA[<p>Most of the articles, presentations and courses which promote the DevOps approach are focused on cloud and Web projects. Most of the DevOps success stories come from the Internet industry. While there are many good reasons why this philosophy is so widely accepted by the Web community, desktop software developers may greatly benefit from the same principles.</p>

<h2>Continuous deployment</h2>

<p>The biggest difference between the traditional desktop software development and Web/cloud development processes is the way software is delivered to customers. While the need for an installer, as well as the user’s involvement in the installation process, may look like a conflict with the Ops component of DevOps, it also provides a hint at how desktop software should evolve to stay relevant.</p>

<p>Instead of using heavy monolithic installers, modern desktop applications may use modular approaches allowing partial updates of the software when such updates are available. Also, the user’s involvement may be limited to a simple acknowledgment or a restart of the application. Modern browsers, Visual Studio and some other applications provide good examples of this approach.</p>

<p>Technology-wise, implementing continuous deployment for a desktop application is a no-brainer. Frameworks and tools such as the Windows Store, Squirrel or Dropcraft make it easy.</p>

<h2>Continuous Integration</h2>

<p>Continuous integration is part of continuous deployment. Web applications have no exclusivity there – source control, build and artifact management systems are widely adopted by all types of projects. In reality, most desktop-focused companies which claim to practice DevOps do just continuous integration. They organize DevOps teams, hire DevOps specialists and ask them to maintain source control and build servers – where is the Dev here? It sounds more like a job description for an admin…</p>

<p>Renaming titles is easy but does not achieve the goal – the ultimate responsibility of the developer for a feature, from the moment of developing it, through building it, to deploying it to the customers. DevOps engineers remain outsiders to the development team, somebody who is easy to blame for failed builds or to ask to do unwanted maintenance work.</p>

<p>To follow the DevOps approach, there is no need to organize new teams or increase the budget for new roles – training, mentoring and shared responsibility for the CI pipeline will do the work just as well.</p>

<h2>Performance and usability monitoring</h2>

<p>Monitoring user activity is a controversial topic for desktop applications, while it is normal practice on the Web. Given the uncontrolled nature of the execution environment, the variety of hardware and the difficulty of receiving customer feedback, performance monitoring would be especially valuable for desktop software, yet such solutions are not common there.</p>

<p>The use of telemetry is one of the DevOps practices underutilized in desktop applications. The level of acceptance for sharing telemetry varies from user to user and from industry to industry, so applications may provide different ways of gathering and transferring telemetry data that are acceptable to their end users. Unlike Web applications, where the collection of telemetry is usually fully automated and hidden from the user, for a desktop application it is a good idea to let the user control when and what is shared. An easy way to opt out, the ability to review the data before it is transferred, a public data usage policy – all such approaches may increase user acceptance of monitoring.</p>
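
<p>As a minimal sketch of this idea (the class and member names below are illustrative, not from any real telemetry library), an application might gate collection behind an explicit consent flag and keep a reviewable buffer instead of sending data silently:</p>

<pre><code>// Hypothetical opt-in telemetry collector: nothing is recorded unless
// the user explicitly enabled sharing, and the pending events can be
// reviewed before they are transferred.
public class TelemetryCollector
{
   private readonly List<string> _pending = new List<string>();

   public bool SharingEnabled { get; set; } // off by default

   public void Track(string eventName)
   {
      if (!SharingEnabled) return; // respect the opt-out
      _pending.Add(eventName);
   }

   // Let the user inspect what would be sent
   public IReadOnlyList<string> Review() => _pending;

   public void Send(Action<IReadOnlyList<string>> transport)
   {
      if (!SharingEnabled || _pending.Count == 0) return;
      transport(_pending);
      _pending.Clear();
   }
}</code></pre>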

<h2>Is there a place for DevOps in desktop software development?</h2>

<p>Yes! DevOps is as relevant for desktop applications as it is for Web and cloud solutions. A properly tuned continuous integration pipeline leads to a stable, deployable build which can be easily pushed to the customers and monitored for user experience and performance, guiding product improvements.</p>]]></description>
			<dc:subject><![CDATA[CI, Practices]]></dc:subject>
			<dc:date>2017-10-02T00:00:00-04:00</dc:date>
		</item>
			<item>
			<title><![CDATA[Upcoming presentations on .NET Standard 2.0 and .NET Core 2.0]]></title>
			<link>http://lunarfrog.com/blog/upcoming-presentations-netstandard20-netcore20</link>
			<guid>http://lunarfrog.com/blog/upcoming-presentations-netstandard20-netcore20</guid>
<description><![CDATA[<p>I am really excited about the opportunity to talk about .NET Standard 2.0 and .NET Core 2.0 with the .NET community in Toronto and the Toronto area. While .NET Standard 2.0 is a big step ahead for streamlining the compatibility story between the different implementations of the .NET Framework, .NET Core 2.0 brings better performance and simplifications for ASP.NET Core developers.</p>

<p>If you are interested in the topic, here are the details about the upcoming meetups:</p>

<ul>
<li>September 26: <a href="https://www.meetup.com/Toronto-NET-Meetup/events/243054367/">Toronto .NET UG</a></li>
<li>September 27: <a href="https://www.meetup.com/CTTDNUG/events/243250821/">Kitchener .NET UG</a> </li>
<li>September 28: <a href="https://www.meetup.com/London-NET-Developers-Group/events/243158850/">London .NET UG</a> </li>
<li>October 19: <a href="https://www.meetup.com/NorthTorontoUG/events/243170364/">North Toronto UG</a></li>
</ul>

<p>Additional resources for the presentation:</p>

<ul>
<li><a href="https://docs.microsoft.com/en-us/dotnet/standard/net-standard">.NET Standard overview</a></li>
<li><a href="https://github.com/dotnet/standard">.NET Standard specification</a></li>
<li><a href="https://docs.microsoft.com/en-us/dotnet/articles/core/tools/csproj">Additions to the csproj format for .NET Core</a></li>
<li><a href="https://docs.microsoft.com/en-us/nuget/schema/msbuild-targets">NuGet pack and restore as MSBuild targets</a></li>
<li><a href="http://lunarfrog.com/blog/how-to-use-msbuild15-net-framework">How to use new features of MSBuild 15 with .NET Framework projects</a></li>
</ul>]]></description>
			<dc:subject><![CDATA[Talk, .NET]]></dc:subject>
			<dc:date>2017-09-24T00:00:00-04:00</dc:date>
		</item>
			<item>
			<title><![CDATA[Main is allowed to be async! ]]></title>
			<link>http://lunarfrog.com/blog/main-is-allowed-to-be-async</link>
			<guid>http://lunarfrog.com/blog/main-is-allowed-to-be-async</guid>
<description><![CDATA[<p>One of the pain points of the async/await feature from the beginning was the inability to mark a console application&#8217;s Main method as async and to await other methods without using workarounds. Not anymore!</p>
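
<p>The typical workaround was a synchronous Main delegating to an async helper (the method names here just illustrate the pattern):</p>

<pre><code>static void Main(string[] args)
{
   // Block on the async entry point (the pre-C# 7.1 workaround)
   MainAsync(args).GetAwaiter().GetResult();
}

static async Task MainAsync(string[] args)
{
   await DoWork();
}</code></pre>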

<p>With the release of C# 7.1, the Main method can be async as well. This is just syntactic sugar - the compiler will rewrite the code to apply the same workarounds as before, but at least for developers the code will look much nicer:</p>

<pre><code>static async Task Main(string[] args)
{
   ...
   await DoWork();
   ...
}</code></pre>]]></description>
			<dc:subject><![CDATA[C#, async]]></dc:subject>
			<dc:date>2017-09-02T00:00:00-04:00</dc:date>
		</item>
			<item>
			<title><![CDATA[EditorConfig support in Visual Studio 2017]]></title>
			<link>http://lunarfrog.com/blog/editorconfig-visualstudio2017</link>
			<guid>http://lunarfrog.com/blog/editorconfig-visualstudio2017</guid>
<description><![CDATA[<p>Visual Studio 2017 ships with new tools for managing code style settings. However, even knowing about them, I had never touched the tools, as it was totally unclear how to manage these settings for different projects – my work and my personal projects have different indent styles and I did not want to mix them.</p>

<p>Recently, I was very surprised to learn that VS 2017 supports <a href="http://editorconfig.org/">.editorconfig</a> out of the box, allowing control of the settings per project by committing this file to the repository. Here is an example file:</p>

<pre><code>root = true

[*]
end_of_line = crlf
insert_final_newline = true
trim_trailing_whitespace = true

# 4 space indentation
[*.cs]
indent_style = space
indent_size = 4</code></pre>

<p>These settings are standard not only for VS but also for many other editors, making it easy to share the same settings across different code editing applications.</p>

<p>Naming conventions and code style rules are stored in the same file, so they are easily separable between projects. However, these settings are not part of the standard and will be ignored by other applications.</p>
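
<p>For example, VS-specific rules like the following can live next to the standard keys (these particular keys are just a sample of what Visual Studio understands; other editors will skip them):</p>

<pre><code>[*.cs]
# Visual Studio-specific style settings
csharp_style_var_for_built_in_types = true:suggestion
csharp_new_line_before_open_brace = all</code></pre>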

<p>This feature is great, and I only wish I had learned about it earlier – it is definitely under-communicated.</p>]]></description>
			<dc:subject><![CDATA[.NET]]></dc:subject>
			<dc:date>2017-08-10T00:00:00-04:00</dc:date>
		</item>
			<item>
			<title><![CDATA[Exploring .NET Open Source ecosystem: Working with CSV files with CSVHelper]]></title>
			<link>http://lunarfrog.com/blog/net-open-source-csvhelper</link>
			<guid>http://lunarfrog.com/blog/net-open-source-csvhelper</guid>
<description><![CDATA[<p>The ClosedXML library, described in the <a href="http://lunarfrog.com/blog/net-open-source-excel-closedxml">previous post</a>, allows rich manipulation of Excel spreadsheets; however, sometimes all an application needs is to quickly import or export some data. In this case CSV, as a lighter format, may be handier.</p>

<p>CSVHelper is a simple library which allows working with CSV files in a strongly-typed manner and eliminates the need to parse the text manually.</p>

<h2>Exporting data to CSV</h2>

<pre><code>public class Animal
{
   public string Name { get; set; }
   public string Color { get; set; }
   public int Age { get; set; }

   public static void ExportToCsv(string fileName, IEnumerable<Animal> animals)
   {
      using (var textWriter = new StreamWriter(fileName))
      using (var csv = new CsvWriter(textWriter))
      {
         csv.WriteRecords(animals);
      }
   }
}
</code></pre>

<p>In addition to storing the rows as a batch, the library allows storing them one by one, which is useful for a large number of rows:</p>

<pre><code>csv.WriteRecord<Animal>(animal);
csv.NextRecord();
</code></pre>

<h2>Reading CSV</h2>

<p>Reading information from a CSV file is equally simple:</p>

<pre><code>public static IEnumerable<Animal> ImportFromCsv(string fileName)
{
   using (var textReader = new StreamReader(fileName))
   using (var csv = new CsvReader(textReader))
   {
      // GetRecords is lazy - materialize the records before
      // the reader is disposed at the end of the using block
      return csv.GetRecords<Animal>().ToList();
   }
}
</code></pre>

<p>Similar to writing, reading from a CSV file can also be performed row by row.</p>]]></description>
			<dc:subject><![CDATA[.NET, Open Source]]></dc:subject>
			<dc:date>2017-07-20T00:00:00-04:00</dc:date>
		</item>
			<item>
			<title><![CDATA[Exploring .NET Open Source ecosystem: working with Excel files with ClosedXML]]></title>
			<link>http://lunarfrog.com/blog/net-open-source-excel-closedxml</link>
			<guid>http://lunarfrog.com/blog/net-open-source-excel-closedxml</guid>
<description><![CDATA[<p>EPPlus is a stable, fully-featured library for working with Excel files. However, it is licensed under LGPL, which is a showstopper for many businesses. In such a situation, the relatively new ClosedXML library may be handy.</p>

<p>It may not provide all the features of EPPlus, but it is capable of handling the core spreadsheet manipulations.</p>

<pre><code>var workbook = new XLWorkbook();
var ws = workbook.AddWorksheet("data");
ws.Cell("A1").Value = 2;
ws.Cell("A2").Value = 2;
ws.Cell("A3").SetFormulaA1("=A1+A2");

Console.WriteLine(ws.Cell("A3").Value);
workbook.SaveAs("calculations.xlsx");
</code></pre>]]></description>
			<dc:subject><![CDATA[.NET, Open Source]]></dc:subject>
			<dc:date>2017-07-13T00:00:00-04:00</dc:date>
		</item>
			<item>
			<title><![CDATA[Exploring .NET Open Source ecosystem: handling database schema versioning with FluentMigrator]]></title>
			<link>http://lunarfrog.com/blog/net-open-source-database-scema-versioning-fluentmigrator</link>
			<guid>http://lunarfrog.com/blog/net-open-source-database-scema-versioning-fluentmigrator</guid>
<description><![CDATA[<p>One of the most common mistakes a junior database architect can make is to omit schema versioning. It is so easy to design a schema, release the corresponding application and realize later how difficult it is to maintain this schema, support compatibility between versions and migrate users to new versions.</p>

<p>However, even when the new schema includes a concept of a version, work is required to keep the schema in a healthy state, to have a migration procedure and to have tooling to automate the maintenance tasks.</p>

<p>The FluentMigrator C# library provides all the tools needed to solve those problems: a syntax for defining the versioned schema, a way to migrate databases from version to version, and tools to automate the tasks during development, during deployment and in the field.</p>

<h2>Schema</h2>

<p>The core concept of FluentMigrator is a migration. A migration is a class which has a version and two methods, Up() and Down(). Up() is responsible for migrating the target database from the previous version to the version defined by the migration. Down() is responsible for the opposite operation – downgrading the database to the previous version.</p>

<pre><code class="csharp">[Migration(10)]
public class AddNotesTable : Migration
{
      public override void Up()
      {
            Create.Table("Notes")
                  .WithIdColumn()
                  .WithColumn("Body").AsString(4000).NotNullable()
                  .WithTimeStamps()
                  .WithColumn("UserId").AsInt32();
      }

      public override void Down()
      {
            Delete.Table("Notes");
      }
}
</code></pre>

<p>Instead of SQL, migrations are defined using a fluent C# syntax. This approach makes the migrations almost independent of the concrete database, hiding the differences in SQL dialects.</p>

<p>The migration version is defined using the MigrationAttribute. The attribute accepts a number, which is used by the migration runner to sort all the defined migrations and execute them one by one.</p>

<p>In addition to the schema definition, migrations can also include data seeding.</p>

<pre><code class="csharp">[Profile("Development")]
public class CreateDevData : Migration
{
      public override void Up()
      {
            Insert.IntoTable("User").Row( new
                  {
                        Username = "devuser1",
                        DisplayName = "Dev User1"
                  });
      }

      public override void Down()
      {
            // empty, not used
      }
}
</code></pre>

<p>This example also demonstrates the idea of profiles – the ability to selectively execute some migrations in order to have, for example, a seeded database for development or testing.</p>

<h2>Execution</h2>

<p>All migrations are usually grouped in one assembly and can be executed using one of the provided tools. FluentMigrator ships CLI, NAnt, MSBuild and Rake migration runners.</p>

<pre><code>Migrate.exe /connection "Data Source=db\db.sqlite;Version=3;" /db sqlite /target migrations.dll
</code></pre>

<p>This command uses the CLI tool to execute the migrations from migrations.dll against the database defined by the connection string, using the SQLite driver. The runner automatically detects the current database version and applies only the required migrations.</p>

<p>FluentMigrator is published under Apache 2.0 license and available at <a href="https://github.com/schambers/fluentmigrator">GitHub</a> and <a href="http://nuget.org/packages/FluentMigrator">NuGet</a>.</p>]]></description>
			<dc:subject><![CDATA[Open Source, .NET]]></dc:subject>
			<dc:date>2017-07-06T00:00:00-04:00</dc:date>
		</item>
			<item>
			<title><![CDATA[Configuring TeamCity to run in Docker on Linux and build .NET Core projects]]></title>
			<link>http://lunarfrog.com/blog/configuring-teamcity-docker-linux-dotnetcore-builds</link>
			<guid>http://lunarfrog.com/blog/configuring-teamcity-docker-linux-dotnetcore-builds</guid>
<description><![CDATA[<p>Recently, I needed to set up a build pipeline for a medium-size .NET Core project and, having had a good previous experience with JetBrains TeamCity, I decided to use it in this case as well. The Professional Edition is free, and its limitations are acceptable for the project – up to three build agents and up to twenty build configurations.</p>

<p>This post provides a step-by-step guide to installing and configuring TeamCity. The starting point is a clean Ubuntu 16.04 LTS server, and the goal is to run the TeamCity server, the build agents and PostgreSQL on this system using Docker containers. Additionally, the server and the agents are configured to support building .NET Core projects. This solution can be deployed equally easily on a local system or in the cloud, such as Azure or AWS.</p>

<p>For use of TeamCity in a production environment, it is recommended to use an external database for storing the configuration data. For the described case, I use PostgreSQL, running in a Docker container as well. So the full stack includes five Docker containers: one for PostgreSQL, one for the TeamCity server and three for the build agents. The PostgreSQL database and all the data generated by TeamCity are persisted on a local drive using Docker mounted volumes.</p>

<h2>Installing Docker</h2>

<p>If you are starting from a clean system, you will need to install Docker and Docker Compose first. Detailed instructions for installing Docker on Ubuntu are available at <a href="https://docs.docker.com/engine/installation/linux/ubuntu/">https://docs.docker.com/engine/installation/linux/ubuntu/</a>, and to install Docker Compose, use apt-get:</p>

<pre><code>sudo apt-get install docker-compose
</code></pre>

<h2>Folder structure</h2>

<p>I use the <em><strong>/srv</strong></em> folder as a root folder for all the data related to TeamCity builds. Here is the full hierarchy of folders you will need to create inside <em><strong>/srv</strong></em>:</p>

<p><img src="http://lunarfrog.com/assets/img/resized/tc-folders.PNG" alt="Folders" /></p>

<ul>
<li><strong>/srv/docker</strong> is used to store docker-compose.yaml file (see below for more details)</li>
<li><strong>/srv/postgresql/data</strong> is used to persist the PostgreSQL database</li>
<li><strong>/srv/teamcity/agents/00X/conf</strong> contains the corresponding agent’s configuration</li>
<li><strong>/srv/teamcity/data</strong> is mounted to the TeamCity container to provide a persistent storage for the database drivers, plugins, etc.</li>
<li><strong>/srv/teamcity/logs</strong> contains TeamCity’s logs</li>
</ul>

<p>When the folders are created, we are ready to define the stack.</p>

<h2>Docker containers</h2>

<p>Create a <strong><em>docker-compose.yaml</em></strong> file in the <strong><em>/srv/docker</em></strong> folder and paste the following content:</p>

<script src="https://gist.github.com/AndreiMarukovich/57d70d06ef59fd67a2c4b2edbe156e04.js"></script>

<p>It configures the stack of required containers. The configuration is self-explanatory, and I’d like to highlight just a couple of things:</p>

<ul>
<li>All containers are based on the latest version of the corresponding images. That is okay for experimenting and demos, but for a production environment you may want to replace the :latest tag with a particular version of the image.</li>
<li>PostgreSQL password, user name and the database information will be needed later, to connect PostgreSQL with TeamCity.</li>
<li>The default TeamCity port is 8111; however, sometimes it is blocked by IT, and then the cloud-based server will not be reachable. The configuration demonstrates how to remap TeamCity from port 8111 to 8080 for public access.</li>
<li>While the PostgreSQL and TeamCity server instances are directly based on the official images, the TeamCity build agents are based on an image created by me (available on <a href="https://hub.docker.com/r/lunarfrog/teamcity-agent-dotnet/">Docker Hub</a>). This image is based on JetBrains’ build agent image, but also includes the .NET Core SDK (1.0.4 at the moment), allowing its use for building and executing .NET Core applications.</li>
</ul>

<p>The only required change to the file is the correct PostgreSQL password. Once it is updated, save the file, close it and start the configured stack by running</p>

<pre><code>docker-compose up -d
</code></pre>

<p>It will download all the required images and start the containers. Now we are ready to open and configure TeamCity.</p>

<h2>TeamCity first start</h2>

<p>In a browser, open the TeamCity site. There is nothing special about configuring TeamCity running in Docker compared with a conventional deployment, so these instructions are provided just for the completeness of the guide and are based on version 2017 of TeamCity.</p>

<p><img src="http://lunarfrog.com/assets/img/resized/tc-start01.PNG" alt="TeamCity First Start" /></p>

<p>On the first page, just click Proceed – the data directory is already properly configured.</p>

<p><img src="http://lunarfrog.com/assets/img/resized/tc-start02.PNG" alt="JDBC drivers needed" /></p>

<p>Now you need to connect TeamCity to the running instance of PostgreSQL. But first you need the JDBC drivers – they are not shipped with TeamCity. In a terminal, open <em><strong>/srv/teamcity/data/lib/jdbc</strong></em> and put the downloaded drivers there, for example by executing</p>

<pre><code>sudo wget https://jdbc.postgresql.org/download/postgresql-42.1.1.jar 
</code></pre>

<p>Back in the browser, click Refresh JDBC drivers – TeamCity should detect the newly installed drivers and allow you to connect to the database.</p>

<p><img src="http://lunarfrog.com/assets/img/resized/tc-start03.PNG" alt="Enter database connection information" /></p>

<p>Provide the required information (use the database name, user name and password defined in the docker-compose file) and click Proceed. If you receive a connection error, verify that the database host name is entered without ‘http’ and that the host allows access to port 5432 for PostgreSQL (it will most likely be blocked if the instance is hosted in Azure or AWS).</p>

<p>On the next page accept the agreement, create an administrative account and you are ready to use TeamCity.</p>

<h2>Using TeamCity for building .NET Core project</h2>

<p>After the start, the three build agents should be detected by TeamCity automatically, but they will be marked as Unauthorized. They need to be authorized manually.</p>

<p><img src="http://lunarfrog.com/assets/img/resized/tc-agents.PNG" alt="Agents" /></p>

<p>So far, we have managed to configure and launch TeamCity and connect the build agents. The last step before creating a new build project is to install the .NET Core plugin. This step is optional, as you can run .NET Core tasks from the command line runner, but the plugin simplifies step definition by adding a dedicated .NET Core runner.</p>

<p>The plugin can be downloaded at <a href="https://plugins.jetbrains.com/plugin/9190--net-core-support">plugins.jetbrains.com</a> and installed via the TeamCity UI – just open the <em><strong>Administration\Plugins List</strong></em> page and upload the plugin. To enable the plugin, TeamCity requires a restart and, unfortunately, there is no way to do it from the UI, so you need to use the console again: go to <em><strong>/srv/docker</strong></em> and run</p>

<pre><code>docker-compose stop
docker-compose up -d 
</code></pre>

<p>After that, the plugin is installed and the agents are able to use it (see the agent’s properties).</p>

<p><img src="http://lunarfrog.com/assets/img/resized/tc-agents-props.PNG" alt="Agent properties" /></p>

<p>That’s it – now you are ready to create a TeamCity project and configure the first build.</p>

<p><img src="http://lunarfrog.com/assets/img/resized/tc-build-step.png" alt=".NET Core build step" />
<img src="http://lunarfrog.com/assets/img/resized/tc-two-build-steps.png" alt="Configured .NET Core build steps" /></p>

<h2>Conclusion</h2>

<p>This guide demonstrated an approach to deploying a Docker-based TeamCity setup for running .NET Core builds. It is based on the free version of TeamCity and allows easy cloud deployment.</p>]]></description>
			<dc:subject><![CDATA[.NET, Web, Practices, Docker, CI]]></dc:subject>
			<dc:date>2017-06-16T00:00:00-04:00</dc:date>
		</item>
			<item>
			<title><![CDATA[Fixing LibLog for using with ILMerge]]></title>
			<link>http://lunarfrog.com/blog/fixing-liblog-for-ilmerge</link>
			<guid>http://lunarfrog.com/blog/fixing-liblog-for-ilmerge</guid>
<description><![CDATA[<p>The LibLog library, <a href="http://lunarfrog.com/blog/net-open-source-netstandard-logging-using-liblog">which I described before</a>, uses reflection magic to allow libraries to do logging without introducing a dependency on any particular logging framework. The problem with magic tricks is that they fail from time to time.</p>

<p>For me, this happened when I tried to use ILMerge for the <a href="http://lunarfrog.com/blog/introducing-dropcraft-nuget">Dropcraft CLI tool</a> and merge into one executable all the Dropcraft libraries, which use LibLog, together with Serilog. As a result, the merged executable did not produce any logs. No exceptions, no warnings – just an empty screen.</p>

<p>After a short review of the LibLog code, I found the root cause – the Type.GetType() calls. LibLog uses GetType calls to probe the availability of the different logging frameworks, and it uses assembly-qualified type names, like <code>Type.GetType("NLog.LogManager, NLog")</code>.</p>

<p>Here is the issue: in the ILMerged executable there is no NLog assembly, so LibLog is not able to detect any logging framework and silently ignores all logging calls. The solution is easy – if the GetType call for an assembly-qualified type returns null, call GetType with the type name only:</p>

<pre><code>Type.GetType("NLog.LogManager, NLog") ?? Type.GetType("NLog.LogManager");
</code></pre>
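<p>The same fallback can be wrapped in a small helper and applied to every probe. As a standalone illustration (the helper name is mine, not LibLog’s), the pattern looks like this:</p>

```csharp
using System;

public static class TypeProbe
{
    // Try the assembly-qualified name first; when the logging assembly was
    // merged away by ILMerge, fall back to probing by type name only, which
    // also searches the calling (merged) assembly.
    public static Type FindType(string assemblyQualifiedName, string typeName)
    {
        return Type.GetType(assemblyQualifiedName) ?? Type.GetType(typeName);
    }
}
```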

<p>After the change, the assembly works perfectly both with and without merging. An example of the fully modified LibLog file is available in the <a href="https://github.com/Dropcraft/Dropcraft/blob/master/src/Dropcraft.Common.NetStandard/Logging/LibLog.cs">Dropcraft repository</a>.</p>]]></description>
			<dc:subject><![CDATA[.NET, Open Source]]></dc:subject>
			<dc:date>2017-06-01T00:00:00-04:00</dc:date>
		</item>
			<item>
			<title><![CDATA[Introducing Dropcraft, a NuGet-based app deployment and composition framework]]></title>
			<link>http://lunarfrog.com/blog/introducing-dropcraft-nuget</link>
			<guid>http://lunarfrog.com/blog/introducing-dropcraft-nuget</guid>
<description><![CDATA[<p>The NuGet ecosystem has already outgrown the original idea of NuGet as a design-time package management tool. NuGet powers installers and helps to create auto-updatable applications and pluggable solutions. What the ecosystem has missed so far is a general-purpose library that abstracts the complex, not yet documented NuGet API and simplifies the development of NuGet-based solutions.</p>

<p>Welcome <a href="http://dropcraft.net/">Dropcraft</a>. Based on version 4 of NuGet, it provides a high-level package management API and enables various scenarios for using NuGet packages in applications. Going beyond package installation, update and uninstallation, Dropcraft includes a runtime API for discovering installed packages and a simple extensibility framework based on NuGet packages.</p>

<h2>The main features of Dropcraft</h2>

<ul>
<li>Package installation (update, uninstallation) from local and remote sources to any target folder.</li>
<li>Package management API and a reference implementation of a command line tool.</li>
<li>Configurable package resolution and deployment strategies.</li>
<li>Runtime API for discovering installed packages and package dependencies, and for initializing the packages.</li>
<li>Package-based application extensibility model (i.e. NuGet packages as plugins). Runtime application composition.</li>
<li>Framework extensibility via deployment and runtime hooks. Packages can expose event handlers to track all Dropcraft activities and to modify the hosting application’s behavior.</li>
</ul>

<h2>Scenarios, where Dropcraft may be useful</h2>

<ul>
<li>Deployment and management of individual NuGet packages in test environments and in production.</li>
<li>Packaging internal tools as NuGet packages and deploying them to target PCs.</li>
<li>A custom installer which builds the final product directly from NuGet packages.</li>
<li>NuGet package-based plugin infrastructure.</li>
</ul>

<h2>Get started</h2>

<p>The easiest way to try Dropcraft is to use the <a href="https://github.com/Dropcraft/Dropcraft/releases/latest">dropcraft.exe command line tool</a>. It is built using the public Dropcraft APIs and can itself serve as an example of the framework’s usage.</p>

<pre><code>dropcraft.exe install "bootstrap/3.0.0" --path "c:\DemoApp" -s "https://api.nuget.org/v3/index.json" --framework net461
</code></pre>

<p>This command instructs Dropcraft to install the bootstrap 3.0.0 package from NuGet.org to the c:\DemoApp folder. It automatically resolves all the dependencies, downloads the packages and installs them:</p>

<pre>Installing packages...
0 product package(s) found and 1 new package(s) requested
Versions are confirmed
2 package(s) are resolved
        2 package(s) to install
        0 package(s) to update
bootstrap/3.0.0 downloaded
jQuery/1.9.0 downloaded
bootstrap/3.0.0 installed
jQuery/1.9.0 installed
Installation complete.</pre>

<p>As a result, C:\DemoApp contains the Content, Scripts and fonts folders from the bootstrap and jQuery packages. Dropcraft followed the instructions and installed Bootstrap 3.0.0, which is pretty old, so the following command will update it:</p>

<pre><code>dropcraft.exe install "bootstrap" --path "c:\DemoApp" -s "https://api.nuget.org/v3/index.json" --framework net461
</code></pre>

<pre>Installing packages...
2 product package(s) found and 1 new package(s) requested
Versions are confirmed
2 package(s) are resolved
        0 package(s) to install
        2 package(s) to update
bootstrap/3.3.7 downloaded
jQuery/1.9.1 downloaded
bootstrap/3.0.0 uninstalled
jQuery/1.9.0 uninstalled
bootstrap/3.3.7 installed
jQuery/1.9.1 installed
Installation complete.</pre>

<p>Dropcraft automatically resolved the latest versions of the packages and upgraded them. Similarly, Dropcraft can install additional packages, or downgrade or uninstall existing ones.</p>

<h2>Advanced scenarios</h2>

<p>The previous example demonstrated the use of unmodified NuGet packages with Dropcraft. To enable more advanced scenarios, Dropcraft introduces an additional package manifest file, which can be included in the package by its author.</p>

<p>The package manifest notifies Dropcraft about the package’s initialization method and allows packages to intercept various Dropcraft events and participate in the application composition.</p>

<p>Dropcraft defines a lightweight application composition model based on the extensibility point concept. Any package can define one or many extensibility points, which are linked with the corresponding extensions exported by other packages. This makes it possible to create chains of extensibility points and extensions and to build an application from packages like from Lego blocks.</p>
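<p>The extensibility point concept itself is simple enough to sketch without any framework. The following self-contained illustration is not Dropcraft’s actual API – all the names are invented – but it shows the idea of a point collecting extensions exported by other packages:</p>

```csharp
using System.Collections.Generic;

// Illustrative contract: an extension exported by some package.
public interface IEditorCommand
{
    string Name { get; }
}

// Illustrative extension implementation, as a separate package could provide.
public class SaveCommand : IEditorCommand
{
    public string Name => "Save";
}

// Illustrative extensibility point: collects all matching extensions
// registered during application composition.
public class EditorCommandExtensibilityPoint
{
    private readonly List<IEditorCommand> _commands = new List<IEditorCommand>();

    public void RegisterExtension(IEditorCommand command) => _commands.Add(command);

    public IReadOnlyList<IEditorCommand> Commands => _commands;
}
```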

<p><a href="https://github.com/Dropcraft/Examples/tree/master/src/WPFAppComposition">Dropcraft WPF demo app</a> demonstrates this concept. It consists of several NuGet packages which can be installed separately or all together. The first command installs the packages with the common interfaces, an executable and the application’s main window. It uses two package sources – NuGet.org and MyGet.org:</p>

<pre><code>dropcraft.exe install "Dropcraft.WpfExample.App" "Dropcraft.WpfExample.Shell" "Dropcraft.WpfExample.Common" --path "c:\DemoWPF" -s "https://www.myget.org/F/dropcraft/api/v3/index.json" -s "https://api.nuget.org/v3/index.json" --framework net461
</code></pre>

<p><img src="http://lunarfrog.com/assets/img/resized/DropcraftDemo001.png" alt="Dropcraft Demo App" /></p>

<p>The resulting application is just an empty window. The next command adds some functionality by installing an extension – a text editor:</p>

<pre><code>dropcraft.exe install "Dropcraft.WpfExample.Editor" --path "c:\DemoWPF" -s "https://www.myget.org/F/dropcraft/api/v3/index.json" -s "https://api.nuget.org/v3/index.json" --framework net461
</code></pre>

<p><img src="http://lunarfrog.com/assets/img/resized/DropcraftDemo002.png" alt="Dropcraft Demo App" /></p>

<p>The text editor defines a new extensibility point – editor commands – and the Dropcraft.WpfExample.Commands package exports two commands. So the next step is to install it:</p>

<pre><code>dropcraft.exe install "Dropcraft.WpfExample.Commands" --path "c:\DemoWPF" -s "https://www.myget.org/F/dropcraft/api/v3/index.json" -s "https://api.nuget.org/v3/index.json" --framework net461
</code></pre>

<p><img src="http://lunarfrog.com/assets/img/resized/DropcraftDemo003.png" alt="Dropcraft Demo App" /></p>

<p>The final result is an application composed from packages, where all the packages are loosely coupled through interfaces and the composition is facilitated by Dropcraft. The framework takes care of the order of package initialization, runtime extension registration and other scenarios common in pluggable applications.</p>

<h2>Conclusion</h2>

<ul>
<li>GitHub repository: <a href="https://github.com/Dropcraft/Dropcraft">https://github.com/Dropcraft/Dropcraft</a></li>
<li>NuGet packages: <a href="https://www.nuget.org/packages?q=dropcraft">NuGet.org</a></li>
</ul>

<p>Dropcraft provides APIs which can be used by applications to incorporate NuGet functionality. It enables a wide range of scenarios, from direct manipulation of NuGet packages to package-based plugins and runtime application composition.</p>

<p>While release 0.2.1 is compiled for .NET 4.6.1, the Dropcraft libraries target .NET Standard and are going to support .NET Core in future releases. Similarly, future releases will support ASP.NET Core in addition to desktop applications.</p>]]></description>
			<dc:subject><![CDATA[Dropcraft, NuGet, Open Source, .NET]]></dc:subject>
			<dc:date>2017-05-25T00:00:00-04:00</dc:date>
		</item>
			<item>
			<title><![CDATA[How to use new features of MSBuild 15 with .NET Framework projects]]></title>
			<link>http://lunarfrog.com/blog/how-to-use-msbuild15-net-framework</link>
			<guid>http://lunarfrog.com/blog/how-to-use-msbuild15-net-framework</guid>
<description><![CDATA[<p>One of the components updated for the Visual Studio 2017 release is the MSBuild build system. With the move of the .NET Core project system from project.json to the csproj format, MSBuild was updated to support the new lightweight csproj files. It also provides better NuGet support: NuGet packages can now be referenced directly from csproj, and new <code>restore</code> and <code>pack</code> targets were introduced. These targets can be used to restore a project’s NuGet dependencies and to pack libraries with MSBuild alone, without using NuGet.</p>

<p>While these features are primarily advertised (and work out of the box) for .NET Core projects, they can be used with .NET Framework 4.x projects too, assuming you have Visual Studio 2017 installed.</p>

<h2>Restore</h2>

<p>Invoking the restore target is simple and is equivalent to the NuGet restore command:</p>

<pre><code>msbuild "path" /t:restore</code></pre>

<p>If you try to execute the command for an existing .NET Framework project with some NuGet packages referenced, it will do nothing. The exact message is <em>&#8220;Nothing to do. None of the projects specified contain packages to restore&#8221;</em>. MSBuild restore does not recognize <code>packages.config</code>; instead, it expects to see all the NuGet references inside the .csproj files.</p>

<p>This feature can be enabled via the NuGet Package Manager options. You may need to remove and re-add the referenced packages to switch the csproj to the new way of referencing packages.</p>

<p><img src="http://lunarfrog.com/assets/img/resized/MSBuild001.png" alt="NuGet options" /></p>

<p>After that, MSBuild should be able to perform the restore command successfully.</p>
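<p>After the switch, the package references live in the csproj itself. A typical entry looks like the following (the package name and version are only an example):</p>

<pre><code>&lt;ItemGroup&gt;
  &lt;PackageReference Include="Newtonsoft.Json"&gt;
    &lt;Version&gt;10.0.2&lt;/Version&gt;
  &lt;/PackageReference&gt;
&lt;/ItemGroup&gt;
</code></pre>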

<h2>Pack</h2>

<p>The MSBuild pack command is a direct replacement for NuGet pack. While MSBuild still supports .nuspec files, it can also build the package from the project properties or command line parameters, without a spec.</p>

<p>Similar to the restore command, pack does not work out of the box for .NET Framework projects. To support this command, some MSBuild tasks need to be imported into the project. This is done automatically for .NET Core projects, but requires some modifications to csproj for .NET Framework projects.</p>

<p>The first change is to import the NuGet-related build tasks:</p>

<pre><code>&lt;Import Project="$(MSBuildSDKsPath)\NuGet.Build.Tasks.Pack\build\NuGet.Build.Tasks.Pack.targets" /&gt;
</code></pre>

<p>The MSBuildSDKsPath variable points to the <em>C:\Program Files (x86)\Microsoft Visual Studio\2017\Enterprise\MSBuild\Sdks</em> folder (the edition segment depends on the installed Visual Studio edition) and is part of the new MSBuild concept – SDKs – which allows build process extensions to be delivered dynamically.</p>

<p>With this change, MSBuild is ready to pack. However, if you try, it will complain about the missing ID and Authors fields. These fields can be defined using MSBuild properties or from the command line. For example, the command line may look like:</p>

<pre><code>msbuild /t:pack /p:PackageId=MSBuildTestLib /p:Authors="Andrei Marukovich"</code></pre>

<p>And when defined in csproj:</p>

<pre><code>&lt;PropertyGroup&gt;
    &lt;PackageId&gt;MSBuildTestLib&lt;/PackageId&gt;
    &lt;Authors&gt;Andrei Marukovich&lt;/Authors&gt;
    &lt;BuildOutputTargetFolder&gt;lib\net461&lt;/BuildOutputTargetFolder&gt;
&lt;/PropertyGroup&gt;
</code></pre>

<p><code>BuildOutputTargetFolder</code> is the property to use in this case to instruct MSBuild which framework folder to use for the build output. Without this line, the project assemblies go into the root of the .nupkg \lib folder. For additional information about using MSBuild properties to control the packing process and to define the package metadata, see <a href="https://docs.microsoft.com/en-us/dotnet/articles/core/tools/csproj">Additions to the csproj format for .NET Core</a>.</p>

<p>After these final changes, the modified .NET Framework project can restore its packages and be packed as a NuGet package using MSBuild, without involving NuGet.exe, and it will continue to work in Visual Studio 2017.</p>]]></description>
			<dc:subject><![CDATA[.NET, NuGet]]></dc:subject>
			<dc:date>2017-05-18T00:00:00-04:00</dc:date>
		</item>
			<item>
			<title><![CDATA[Using the NuGet v3 libraries in your projects, part 2]]></title>
			<link>http://lunarfrog.com/blog/using-nuget-libraries-in-your-projects-part-2</link>
			<guid>http://lunarfrog.com/blog/using-nuget-libraries-in-your-projects-part-2</guid>
<description><![CDATA[<p>The <a href="http://lunarfrog.com/blog/using-nuget-libraries-in-your-projects">previous post</a> demonstrated the use of <em>RemoteDependencyWalker</em> to discover all the dependencies of the target libraries. The provided sample works as expected if the versions of all requested packages are perfectly aligned. However, this is rarely the case in real life. A user’s request may require installation of conflicting packages, and the application should be able to recognize and handle this situation.</p>

<p>With the provided code, the application will fail when any conflict appears. For the problematic packages, meta information will not be resolved, and the resolve result for the associated <em>GraphNode</em> will be null.</p>

<p>The simplest approach to resolving the conflicts is to use the <em>Analyze()</em> method of the <em>GraphNode</em> class. This method returns an analysis result containing information about the issues. There are three types of issues – cyclic dependencies, version downgrades and versioning conflicts – and all of them are detected by <em>GraphNode.Analyze()</em>.</p>
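<p>In code, the check can be a short guard right after <em>WalkAsync</em> returns. This is only a sketch based on the pre-release API described in this series – the <em>AnalyzeResult</em> property names should be verified against the exact NuGet version in use:</p>

<pre><code class="csharp">// result is the GraphNode<RemoteResolveResult> returned by WalkAsync
var analysis = result.Analyze();

// Cyclic dependencies and version conflicts are treated as fatal here
if (analysis.Cycles.Count > 0 || analysis.VersionConflicts.Count > 0)
    throw new InvalidOperationException("The dependency graph cannot be resolved");

foreach (var downgrade in analysis.Downgrades)
{
    // Decide per package: accept the lower version or abort the installation
    _logger.LogWarning($"Downgrade detected: {downgrade.DowngradedFrom.Key.Name}");
}
</code></pre>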

<p>While cyclic dependencies and versioning conflicts will most likely lead to application failure, version downgrades can be handled. A downgrade means the presence of a required package with a version lower than the version of one of the resolved packages. In this situation, the dependency resolver adds both packages and marks the package with the lower version as a downgraded package. The application can use this information to decide the next action: allow the downgrade or fail.</p>]]></description>
			<dc:subject><![CDATA[NuGet, .NET]]></dc:subject>
			<dc:date>2017-05-11T00:00:00-04:00</dc:date>
		</item>
			<item>
			<title><![CDATA[Using the NuGet v3 libraries in your projects]]></title>
			<link>http://lunarfrog.com/blog/using-nuget-libraries-in-your-projects</link>
			<guid>http://lunarfrog.com/blog/using-nuget-libraries-in-your-projects</guid>
<description><![CDATA[<p>The NuGet 3 tool, as is expected from a package manager, is itself built using packages. These packages are published on the NuGet.org gallery and can be used by any application requiring NuGet-like features. Usage scenarios include plugins packaged as .nupkg, application content, package-based installers and others. Several projects, like Chocolatey or Wyam, already integrate NuGet for different purposes; however, for really wide adoption of the NuGet libraries, better API documentation is required.</p>

<p>This post demonstrates one way of incorporating the NuGet libraries into an application. Dave Glick, the author of Wyam, has a great <a href="http://daveaglick.com/posts/exploring-the-nuget-v3-libraries-part-1">introduction to the NuGet v3 APIs</a>, and I recommend reading his posts before continuing, although it is not required. The NuGet usage approach described in this post is different from the approach reviewed in those articles. When applied, it allows you to create .NET Standard compatible libraries and to incorporate the NuGet tooling not only into .NET Framework applications but also into .NET Core solutions.</p>

<p>NuGet 3 uses a zillion libraries. Unlike NuGet 2, composed of just a few libraries, the NuGet 3 design is based on multiple small libraries; for example, this post’s sample code uses nine of them. Another note about the API – it is still in development. The post is based on version 3.5.0-rc1-final of NuGet, and some APIs may change before the release.</p>

<p>The top-level NuGet libraries used by the solution are NuGet.DependencyResolver and NuGet.Protocol.Core.v3.</p>

<h2>Workflow</h2>

<p>The logical workflow is similar to the NuGet restore command, and from the developer’s perspective it includes the following phases:</p>

<ul>
<li>Prepare package sources</li>
<li>Identify a list of packages to install. This is the list of top-level packages requested by the user or the application.</li>
<li>Request NuGet to discover dependencies for the targeted packages</li>
<li>Install the target packages and recursively install dependencies with help of NuGet</li>
</ul>

<h2>Main concepts</h2>

<ul>
<li><em>PackageSource</em> identifies a package feed. NuGet understands local (file system-based) and remote package sources.</li>
<li><em>SourceRepository</em> is a combination of a <em>PackageSource</em> and various services to retrieve specific resources (metadata, dependencies, etc.)</li>
<li>A context usually plays the role of an operation’s configuration and cache. Examples are <em>SourceCacheContext</em> and <em>RemoteWalkContext</em>.</li>
<li><em>RemoteDependencyWalker</em> discovers all dependencies for the provided package.</li>
<li><em>LibraryRange</em> uniquely identifies a library (package) with <em>Name</em>, <em>VersionRange</em> and <em>LibraryDependencyTarget</em>, which is the library type (Package, Project, etc.). The library type plays an important role in the way the dependencies are resolved in the sample code.</li>
<li><em>LibraryDependency</em> describes a dependency of a concrete library.</li>
<li><em>IProjectDependencyProvider</em> is a special library provider which allows custom libraries to be submitted to the <em>RemoteDependencyWalker</em> and makes the whole workflow possible.</li>
</ul>

<h2>Prepare package sources</h2>

<p>The following code adds the official NuGet feed as the package source and registers the sources in the RemoteDependencyWalker’s context.</p>

<pre><code class="csharp">var resourceProviders = new List<Lazy<INuGetResourceProvider>>();
resourceProviders.AddRange(Repository.Provider.GetCoreV3());
 
var repositories = new List<SourceRepository>
{
    new SourceRepository(new PackageSource("https://api.nuget.org/v3/index.json"), resourceProviders)
};
 
var cache = new SourceCacheContext();
var walkerContext = new RemoteWalkContext();
 
foreach (var sourceRepository in repositories)
{
    var provider = new SourceRepositoryDependencyProvider(sourceRepository, _logger, cache, true);
    walkerContext.RemoteLibraryProviders.Add(provider);
}
</code></pre>

<h2>Identify a list of packages to install</h2>

<p>RemoteDependencyWalker accepts only one root library for the dependency calculation. In the case of multiple root target libraries, they should be wrapped inside a fake library, and IProjectDependencyProvider makes it possible to include the fake library in the dependency resolution process.</p>

<p>IProjectDependencyProvider defines the SupportsType method, which controls the library types handled by the class, and the GetLibrary method, which is expected to return the library object.</p>

<p>The trick is to define the fake library as LibraryDependencyTarget.Project and to accept only this type of library in ProjectDependencyProvider. So, when RemoteDependencyWalker asks for the instance of the fake library, it can be constructed with the list of targeted libraries as dependencies. For example, the following code assumes that two NuGet libraries are the targets to install.</p>

<pre><code class="csharp">public Library GetLibrary(LibraryRange libraryRange, NuGetFramework targetFramework, string rootPath)
{
    var dependencies = new List<LibraryDependency>();
 
    dependencies.AddRange( new []
    {
        new LibraryDependency
        {
            LibraryRange = new LibraryRange("NuGet.Protocol.Core.v3", VersionRange.Parse("3.0.0"), LibraryDependencyTarget.Package)
        },
        new LibraryDependency
        {
            LibraryRange = new LibraryRange("NuGet.DependencyResolver", VersionRange.Parse("3.0.0"), LibraryDependencyTarget.Package)
        },
    });
 
    return new Library
    {
        LibraryRange = libraryRange,
        Identity = new LibraryIdentity
        {
            Name = libraryRange.Name,
            Version = NuGetVersion.Parse("1.0.0"),
            Type = LibraryType.Project,
        },
        Dependencies = dependencies,
        Resolved = true
    };
}
</code></pre>

<h2>Dependency discovery</h2>

<p>When all the preparations are done, RemoteDependencyWalker can start discovering the dependencies:</p>

<pre><code class="csharp">walkerContext.ProjectLibraryProviders.Add(new ProjectLibraryProvider());

var fakeLib = new LibraryRange("FakeLib", VersionRange.Parse("1.0.0"), LibraryDependencyTarget.Project);
var frameworkVersion = FrameworkConstants.CommonFrameworks.Net461;
var walker = new RemoteDependencyWalker(walkerContext);
 
GraphNode<RemoteResolveResult> result = await walker.WalkAsync(
    fakeLib,
    frameworkVersion,
    frameworkVersion.GetShortFolderName(), RuntimeGraph.Empty, true);
 
foreach (var node in result.InnerNodes)
{
    await InstallPackageDependencies(node);
}
</code></pre>

<p>The provided code does more than dependency discovery: it defines the supported .NET Framework version and iterates through the result to install the packages.</p>

<h2>Installing the packages</h2>

<p>And now the application is ready to install the discovered packages:</p>

<pre><code class="csharp">HashSet<LibraryRange> _installedPackages = new HashSet<LibraryRange>();
 
private async Task InstallPackageDependencies(GraphNode<RemoteResolveResult> node)
{
    foreach (var innerNode in node.InnerNodes)
    {
        if (!_installedPackages.Contains(innerNode.Key))
        {
            _installedPackages.Add(innerNode.Key);
            await InstallPackage(innerNode.Item.Data.Match);
        }
 
        await InstallPackageDependencies(innerNode);
    }
}
 
private async Task InstallPackage(RemoteMatch match)
{
    var packageIdentity = new PackageIdentity(match.Library.Name, match.Library.Version);
 
    var versionFolderPathContext = new VersionFolderPathContext(
        packageIdentity,
        @"D:\Temp\MyApp\",
        _logger,
        PackageSaveMode.Defaultv3,
        XmlDocFileSaveMode.None);
 
    await PackageExtractor.InstallFromSourceAsync(
        stream => match.Provider.CopyToAsync(
            match.Library,
            stream,
            CancellationToken.None),
        versionFolderPathContext,
        CancellationToken.None);
}
</code></pre>

<p>As a result of the execution, all resolved packages are de-duplicated and installed into the D:\Temp\MyApp\&#91;package-name] subfolders. Each package subfolder includes the .nupkg, the .nuspec and the libraries for all supported frameworks.</p>

<p>And that’s it – the provided code demonstrates the whole workflow. There are tons of small details hidden behind this simple demo, but it should be enough for starting your own experiments. Feel free to comment if you have any questions.</p>]]></description>
			<dc:subject><![CDATA[.NET, NuGet]]></dc:subject>
			<dc:date>2017-05-04T00:00:00-04:00</dc:date>
		</item>
			<item>
			<title><![CDATA[How to improve pattern matching and deconstruction in C#]]></title>
			<link>http://lunarfrog.com/blog/improve-pattern-matching-deconstruction-csharp</link>
			<guid>http://lunarfrog.com/blog/improve-pattern-matching-deconstruction-csharp</guid>
<description><![CDATA[<p>In the previous post I mentioned that C# pattern matching is far from complete. In particular, close coupling between matching and deconstruction is not yet present in C#. The most that can be achieved using deconstruction is the extraction of values from complex types:</p>

<pre><code>(var name, var color, _) = new Cat();</code></pre>

<p>The underscore here represents a value to discard, similar to out variables. However, what I would really like to see from pattern matching is the following:</p>

<pre><code>if (x is Cat (var name, var color, _)) { }</code></pre>

<p>to match <code>x</code> as <code>Cat</code> and deconstruct it to extract name and color. Or</p>

<pre><code>if (x is Cat (var name, var color, 2)) { } </code></pre>

<p>to match <code>x</code> as a two-year-old <code>Cat</code> and extract the rest if the match succeeds. Or maybe</p>

<pre><code>if (GetCat() is (_, var color, 3)) { }</code></pre>

<p>to match and deconstruct the <code>Cat</code> object returned by <code>GetCat()</code>.</p>

<p>All these constructs bring together pattern matching, tuples and deconstruction and allow even more condensed code to be written.</p>]]></description>
			<dc:subject><![CDATA[C#]]></dc:subject>
			<dc:date>2017-04-20T00:00:00-04:00</dc:date>
		</item>
			<item>
			<title><![CDATA[C# 7.0 bits: pattern matching]]></title>
			<link>http://lunarfrog.com/blog/csharp7-pattern-matching</link>
			<guid>http://lunarfrog.com/blog/csharp7-pattern-matching</guid>
<description><![CDATA[<p>Another interesting set of features introduced in C# 7.0 is related to pattern matching. Pattern matching is a well-known concept in functional programming languages; in a nutshell, it allows testing data against some “pattern” (for example, testing whether the data belongs to some type) and extracting values from complex data (deconstruction). In many cases, pattern matching replaces a series of if, else and assignment statements with a single expression.</p>

<p>To support pattern matching, C# 7.0 extends the use of the is and case keywords and allows constants and types to be used as patterns.
The simplest use of pattern matching is to match a value against some constant:</p>

<pre><code>public void UltimateQuestion(object x)
{
   if (x is 42)
   {
      Console.WriteLine("This is the answer");
   }
}</code></pre>

<p>A more interesting use is to test for a type and create a new variable of that type:</p>

<pre><code>public void UltimateQuestion(object x)
{
   if (x is null)
      return;

   if (!(x is int i))
      return;

   Console.WriteLine(i < 42 ? "Ask one more time" : "42 is the answer");
}</code></pre>

<p>Similarly, patterns can be matched using the switch statement:</p>

<pre><code>switch (animal)
{
   case Dog d:
      break;
   case Cat c when c.Color == "White":
      break;
   case null:
      Console.WriteLine("No animal");
      break;
   default:
      Console.WriteLine("Unknown animal");
      break;
}</code></pre>

<h2>Notes</h2>

<p>Unlike Haskell or Elixir, where pattern matching is engraved into the language, the C# implementation of pattern matching does not yet feel fully integrated with the language. It could be because of the limited number of available patterns or the loose link between pattern matching and deconstruction, but in any case it is a promising start which requires some improvements in the next releases.</p>]]></description>
			<dc:subject><![CDATA[C#]]></dc:subject>
			<dc:date>2017-04-18T00:00:00-04:00</dc:date>
		</item>
			<item>
			<title><![CDATA[C# 7.0 bits: out variables]]></title>
			<link>http://lunarfrog.com/blog/csharp7-out-variables</link>
			<guid>http://lunarfrog.com/blog/csharp7-out-variables</guid>
<description><![CDATA[<p>The release of Visual Studio 2017 not only introduced new productivity tools and improvements to the code editing experience, but also brought a new version of C#.</p>

<p>Similar to the previous release – C# 6.0 – the new release is not focused on a single flagship feature and instead introduces several smaller features. One of the most exciting of these (at least for me) is out variables. Initially this feature was planned for the 6.0 release but was cut shortly before shipping.</p>

<p>Code similar to the following snippet is extremely common in all types of C# applications:</p>

<pre><code>public string ReformatDouble(string val)
{
   double x;

   if (double.TryParse(val, out x))
   {
      return $"{x:F4}";
   }

   return $"{x}";
}</code></pre>

<p>Sometimes the out variable is not even needed, because the single purpose of the code could be to test whether the value parses:</p>

<pre><code>public bool IsDouble(string val)
{
   double x;
   return double.TryParse(val, out x);
}</code></pre>

<p>With the new “out variable” feature this code can be significantly simplified:</p>

<pre><code>public string ReformatDouble(string val)
{
   if (double.TryParse(val, out var x))
   {
      return $"{x:F4}";
   }

   return $"{x}";
}</code></pre>

<p>One interesting point here is the scope of the variable. Unlike the variable <code>i</code> defined in the loop <code>for (var i = 0; i &lt; 10; i++) {}</code>, <code>x</code> continues to exist even outside of the <code>if (…) {}</code> construct.</p>

<p>When the out variable is not needed, the code can be simplified even more – the variable can be discarded:</p>

<pre><code>public bool IsDouble(string val)
{
   return double.TryParse(val, out _);
}</code></pre>

<p>Happy parsing!</p>]]></description>
			<dc:subject><![CDATA[C#]]></dc:subject>
			<dc:date>2017-04-14T00:00:00-04:00</dc:date>
		</item>
			<item>
			<title><![CDATA[First impressions from using .NET Core for AWS Lambdas and deployment tips]]></title>
			<link>http://lunarfrog.com/blog/using-net-core-for-aws-lambda-deployment</link>
			<guid>http://lunarfrog.com/blog/using-net-core-for-aws-lambda-deployment</guid>
<description><![CDATA[<p>Earlier this year, Amazon announced the availability of C# as a first-class citizen for AWS Lambda. The solution is based on .NET Core and allows the .NET ecosystem to be used for building serverless services.</p>

<h2>Tooling</h2>

<p>Tooling and integration with Visual Studio are surprisingly good. VS2017 support is in preview, but even the preview extension is pretty stable. C#-based Lambdas can be deployed and tested directly from Visual Studio with a couple of clicks. The Lambda tools can also be used from the .NET Core CLI, which enables command line-based Continuous Integration and Continuous Deployment scenarios.</p>

<h2>Deployment</h2>

<p>C# Lambdas support two main workflows – a simple Lambda function and a serverless application hosted by Lambda. While the first case is straightforward, the serverless model is more interesting, as it allows hosting a whole ASP.NET Core Web API site in Lambda. In this case, the AWS tools, with the help of CloudFormation, deploy and configure an API Gateway endpoint and a Lambda which routes all requests to the corresponding Web API controllers and actions. At the same time, the Web API site can still be run locally for testing purposes. This model provides a huge productivity boost compared with the traditional model, where a Lambda must be deployed to AWS first and only then can be tested.</p>
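<p>The glue between API Gateway and ASP.NET Core is a small Lambda entry-point class. A minimal sketch, assuming the Amazon.Lambda.AspNetCoreServer package and an existing Startup class (both names may differ in your project), looks like this:</p>

<pre><code class="csharp">// Lambda invokes this class instead of Program.Main; APIGatewayProxyFunction
// translates API Gateway requests into the ASP.NET Core pipeline and back.
public class LambdaEntryPoint : Amazon.Lambda.AspNetCoreServer.APIGatewayProxyFunction
{
    protected override void Init(Microsoft.AspNetCore.Hosting.IWebHostBuilder builder)
    {
        builder.UseStartup<Startup>();
    }
}
</code></pre>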

<p>The AWS Developer blog provides tutorials on creating a new C# Lambda and on converting an existing ASP.NET Web API site into a serverless application. While the tutorials are well written, they miss a couple of important points:</p>

<ul>
<li>AWS Lambda does not support .NET Core 1.1 – this is a design decision to support only LTS versions of .NET Core (<strong><em>netcoreapp1.0</em></strong> at the moment).</li>
<li>Sometimes, the first publish of a C# Lambda may not wire the Lambda and API Gateway together properly. In this case, calling the Lambda from the associated gateway results in a security error. The easy way to fix it is to open the API Gateway resource associated with the Lambda and save it, even without changes; this restores the missing security settings.</li>
<li>The guide describing migration of an existing Web API site omits a couple of additional changes which are required. The default .NET Core Web application template uses <strong><em>Microsoft.NET.Sdk.Web</em></strong> as the SDK. However, this SDK references the <strong><em>libuv</em></strong> runtime component, which is missing on AWS and raises a runtime exception. This is a known issue in .NET Core; it was recently fixed in v1.1 but is not yet available in production. To avoid this issue, a Web application for Lambda should use <strong><em>Microsoft.NET.Sdk</em></strong> instead (and set <strong><em>Exe</em></strong> as the output type).</li>
</ul>
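<p>The SDK change from the last point amounts to a small edit in the .csproj file; a minimal sketch:</p>

<pre><code>&lt;!-- Plain SDK instead of Microsoft.NET.Sdk.Web, to avoid the libuv dependency --&gt;
&lt;Project Sdk="Microsoft.NET.Sdk"&gt;
  &lt;PropertyGroup&gt;
    &lt;TargetFramework&gt;netcoreapp1.0&lt;/TargetFramework&gt;
    &lt;OutputType&gt;Exe&lt;/OutputType&gt;
  &lt;/PropertyGroup&gt;
&lt;/Project&gt;
</code></pre>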

<h2>Performance</h2>

<p>Performance of the solution is yet to be profiled, but quick measurements show it is sufficient to proceed with the spikes. Latency for a cold Lambda (which is largely irrelevant, as techniques exist to keep Lambdas warm) is about 5s for a “Hello World” type of API, and the average latency for warm Lambdas is about 80ms.</p>]]></description>
			<dc:subject><![CDATA[.NET, C#, Web]]></dc:subject>
			<dc:date>2017-04-11T00:00:00-04:00</dc:date>
		</item>
			<item>
			<title><![CDATA[Xamarin experience event]]></title>
			<link>http://lunarfrog.com/blog/xamarin-experience-event</link>
			<guid>http://lunarfrog.com/blog/xamarin-experience-event</guid>
			<description><![CDATA[<p><img src="http://lunarfrog.com/assets/img/resized/image001.png" alt="" /></p>

<p>If you are interested in Xamarin technologies, the Xamarin team is coming to Toronto to speak about the new features delivered as part of the Visual Studio 2017 release, cloud connectivity for mobile apps, and the mobile DevOps story. You will also have a chance to participate in a panel discussion and ask your questions.</p>

<p>The event will take place on Thursday, April 13th at Microsoft Toronto Office (suite 1201, 222 Bay Street) from 10am to 1pm, lunch provided.</p>

<p>For registration, email <a href="&#109;&#x61;&#x69;&#108;&#116;&#x6f;&#58;&#118;&#x2d;&#99;&#97;&#x6b;e&#114;&#x40;&#x6d;&#105;&#x63;&#x72;&#111;&#115;&#x6f;&#102;&#116;&#x2e;c&#111;&#x6d;">Catherine Kerr</a> with your name and contact details. Spaces are limited.</p>

<h3>Agenda</h3>

<ul>
<li>10:00AM - App Innovation (Why Mobile? Why Now? Why Innovate?)</li>
<li>10:30AM - Delivering Great Mobile Experiences (Common challenges and approaches / Cloud-connected apps / Mobile DevOps overview)</li>
<li>11:15AM - Mobile DevOps Demo</li>
<li>12:00PM - Panel Discussion / Q&amp;A</li>
<li>12:30PM - Lunch</li>
<li>1:00PM - Close</li>
</ul>]]></description>
			<dc:subject><![CDATA[Events]]></dc:subject>
			<dc:date>2017-04-03T00:00:00-04:00</dc:date>
		</item>
	

</channel>

</rss>