<?xml version="1.0" encoding="utf-8"?>
<feed xmlns="http://www.w3.org/2005/Atom">

  <title><![CDATA[CodeBits]]></title>
  <link href="http://bjornej.github.io/atom.xml" rel="self"/>
  <link href="http://bjornej.github.io/"/>
  <updated>2017-10-02T12:33:28+00:00</updated>
  <id>http://bjornej.github.io/</id>
  <author>
    <name><![CDATA[Paolo Nicodemo]]></name>
    
  </author>
  <generator uri="http://octopress.org/">Octopress</generator>

  
  
  <entry>
    <title type="html"><![CDATA[Monitoring a Windows environment: SQL Server]]></title>
    <link href="http://bjornej.github.io/blog/2017/09/05/monitoring-a-windows-enviroment-sql-server/"/>
    <updated>2017-09-05T00:00:00+00:00</updated>
    <id>http://bjornej.github.io/blog/2017/09/05/monitoring-a-windows-enviroment-sql-server</id>
    <content type="html"><![CDATA[<p>In the previous post we saw what information is useful for monitoring a Windows application server. That data alone, however, won’t give you a complete picture of your applications’ behaviour unless you also monitor the important services, and what service is more important than your database?</p>

<p>If you are working in a Windows environment there’s a good chance you are using Microsoft’s SQL Server, and you <strong>will</strong> have to monitor and fine-tune its usage to prevent problems and identify misuse.</p>

<p>Telegraf already has a plugin for SQL Server that extracts information from your instance by executing a query. While this works well, it has two problems:</p>

<ul>
  <li>due to a bug it can only connect to SQL Server 2008 R2 SP3 (this is a minor problem; I hope your SQL Servers are at least updated to a version which has not run out of support)</li>
  <li>it extracts a <strong>lot</strong> of data most of which is useful only in limited cases.</li>
</ul>

<p>To keep things simple we can craft a configuration which includes only the most relevant and important counters. As with an application server, you will need the same base data:</p>

<ul>
  <li>use the <strong>disk</strong>, <strong>cpu</strong> and <strong>mem</strong> plugins</li>
  <li><strong>Network Interface</strong>:
    <ul>
      <li>“Bytes Received/sec”</li>
      <li>“Bytes Sent/sec”</li>
      <li>“Bytes Total/sec”</li>
    </ul>
  </li>
  <li><strong>Processor</strong>:
    <ul>
      <li>“% Idle Time”</li>
      <li>“% Interrupt Time”</li>
      <li>“% Privileged Time”</li>
      <li>“% User Time”</li>
      <li>“% Processor Time”</li>
    </ul>
  </li>
  <li><strong>LogicalDisk</strong> :
    <ul>
      <li>“Disk Bytes/sec”</li>
      <li>“Disk Read Bytes/sec”</li>
      <li>“Disk Write Bytes/sec” (all instances except _Total)</li>
    </ul>
  </li>
</ul>

<p>Now for the SQL Server-specific Performance Counters (replace <strong>INSTANCE</strong> with the name of your instance; there can be more than one on a single server):</p>

<ul>
  <li><strong>MSSQL$INSTANCE:Buffer Manager</strong> :
    <ul>
      <li>“Buffer cache hit ratio”</li>
      <li>“Database Pages”</li>
      <li>“Free list stalls/sec”</li>
      <li>“Page life expectancy”</li>
      <li>“Page lookups/sec”</li>
      <li>“Page reads/sec”</li>
      <li>“Page writes/sec”</li>
    </ul>
  </li>
  <li><strong>MSSQL$INSTANCE:Databases</strong>:
    <ul>
      <li>“Active transactions”</li>
      <li>“Transactions/sec”</li>
      <li>“Write transactions/sec”</li>
    </ul>
  </li>
  <li><strong>MSSQL$INSTANCE:General Statistics</strong>:
    <ul>
      <li>“Processes blocked”</li>
    </ul>
  </li>
  <li><strong>MSSQL$INSTANCE:Latches</strong>:
    <ul>
      <li>“Average Latch Wait Time (ms)”</li>
      <li>“Latch Waits/sec”</li>
      <li>“Total Latch Wait Time (ms)”</li>
    </ul>
  </li>
  <li><strong>MSSQL$INSTANCE:Locks</strong>:
    <ul>
      <li>“Average wait time (ms)”</li>
      <li>“Lock Requests/sec”</li>
      <li>“Timeout locks (timeout &gt; 0)/sec”</li>
      <li>“Lock Timeouts/sec”</li>
      <li>“Lock Wait Time (ms)”</li>
      <li>“Lock Waits/sec”</li>
      <li>“Number of Deadlocks/sec”</li>
    </ul>
  </li>
  <li><strong>MSSQL$INSTANCE:SQL Statistics</strong>:
    <ul>
      <li>“Batch Requests/sec”</li>
      <li>“SQL Compilations/sec”</li>
      <li>“SQL Re-Compilations/sec”</li>
    </ul>
  </li>
  <li><strong>MSSQL$INSTANCE:Transactions</strong>:
    <ul>
      <li>“Free Space in tempdb (KB)”</li>
      <li>“Transactions”</li>
      <li>“Version Cleanup rate (KB/s)”</li>
      <li>“Version Generation rate (KB/s)”</li>
      <li>“Version Store Size (KB)”</li>
      <li>“Version Store unit count”</li>
      <li>“Version Store unit creation”</li>
      <li>“Version Store unit truncation”</li>
    </ul>
  </li>
  <li><strong>MSSQL$INSTANCE:Wait Statistics</strong>:
    <ul>
      <li>“Lock waits”</li>
      <li>“Log buffer waits”</li>
      <li>“Log write waits”</li>
      <li>“Memory grant queue waits”</li>
      <li>“Network IO waits”</li>
      <li>“Non-Page latch waits”</li>
      <li>“Page IO latch waits”</li>
      <li>“Page latch waits”</li>
      <li>“Thread-safe memory objects waits”</li>
      <li>“Transaction ownership waits”</li>
      <li>“Wait for the worker”</li>
      <li>“Workspace synchronization waits”</li>
    </ul>
  </li>
</ul>
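
<p>As a sketch of how this maps to Telegraf’s <strong>win_perf_counters</strong> input, a fragment of the configuration could look like the following. The instance name, counter selection and measurement names here are illustrative; check the plugin documentation for your Telegraf version:</p>

<div class="highlighter-rouge"><pre class="highlight"><code># telegraf.conf fragment (illustrative sketch)
[[inputs.win_perf_counters]]
  [[inputs.win_perf_counters.object]]
    # replace INSTANCE with your actual instance name
    ObjectName = "MSSQL$INSTANCE:Buffer Manager"
    Counters = ["Buffer cache hit ratio", "Page life expectancy", "Page reads/sec", "Page writes/sec"]
    Instances = ["------"]   # plugin convention for counters without instances
    Measurement = "sqlserver_buffer"

  [[inputs.win_perf_counters.object]]
    ObjectName = "MSSQL$INSTANCE:SQL Statistics"
    Counters = ["Batch Requests/sec", "SQL Compilations/sec", "SQL Re-Compilations/sec"]
    Instances = ["------"]
    Measurement = "sqlserver_stats"
</code></pre>
</div>

<p>The remaining counter categories listed above follow the same pattern, one <code>object</code> section per category.</p>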

<p>These performance counters will give you a fairly complete view of your SQL Server’s behaviour, allowing you to inspect performance problems in your applications and correlate them with the state of your database.</p>
]]></content>
  </entry>
  
  
  
  <entry>
    <title type="html"><![CDATA[Monitoring a Windows environment: servers]]></title>
    <link href="http://bjornej.github.io/blog/2017/08/28/monitoring-a-windows-enviroment-servers/"/>
    <updated>2017-08-28T00:00:00+00:00</updated>
    <id>http://bjornej.github.io/blog/2017/08/28/monitoring-a-windows-enviroment-servers</id>
    <content type="html"><![CDATA[<p>Once all the programs described in the previous post are installed, we can start to familiarize ourselves with the system.</p>

<p>One question arises: what data should we capture? The answer depends on the role of the server where the collection agent is installed.</p>

<p>For an application server we can collect some basic data:</p>

<ul>
  <li>we can use the Telegraf inputs <strong><em>cpu</em></strong>, <strong><em>disk</em></strong> and <strong><em>mem</em></strong></li>
  <li>some useful Performance Counters to collect are:
    <ul>
      <li><strong><em>Processor</em></strong> : “% Idle Time”, “% Interrupt Time”, “% Privileged Time”, “% User Time”, “% Processor Time” (only _Total)</li>
      <li><strong><em>Network Interface</em></strong>: “Bytes Received/sec”, “Bytes Sent/sec”, “Bytes Total/sec” (all instances except _Total)</li>
      <li><strong><em>LogicalDisk</em></strong> : “Disk Bytes/sec”, “Disk Read Bytes/sec”, “Disk Write Bytes/sec” (all instances except _Total)</li>
    </ul>
  </li>
</ul>
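
<p>With Telegraf’s <strong>win_perf_counters</strong> input, the Processor portion of this list might be expressed like this (a sketch; the measurement name is arbitrary and the exact schema depends on your Telegraf version):</p>

<div class="highlighter-rouge"><pre class="highlight"><code>[[inputs.win_perf_counters]]
  [[inputs.win_perf_counters.object]]
    ObjectName = "Processor"
    Counters = ["% Idle Time", "% Interrupt Time", "% Privileged Time", "% User Time", "% Processor Time"]
    Instances = ["_Total"]   # only the aggregate instance
    Measurement = "win_cpu"
</code></pre>
</div>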

<p>If the server hosts web applications you can add these Performance Counters:</p>

<ul>
  <li><strong><em>ASP.NET</em></strong> : “Requests Queued”</li>
  <li><strong><em>ASP.NET Apps v4.0.30319</em></strong> : “Requests/sec”, “Sessions active”</li>
  <li><strong><em>Web Service</em></strong>: “Connection Attempts/sec”, “Current Connections”</li>
</ul>

<p>If you are using MSMQ:</p>

<ul>
  <li><strong><em>MSMQ Queue</em></strong> : “Messages in Queue” (you can read them all or specify individual queues),</li>
  <li><strong><em>MSMQ Service</em></strong> : “Total bytes in all queues” (this is important, as MSMQ by default has a 1GB limit for stored messages and will refuse new messages when that limit is reached)</li>
</ul>
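
<p>For the MSMQ counters, a corresponding <strong>win_perf_counters</strong> object could look roughly like this (the measurement name is illustrative):</p>

<div class="highlighter-rouge"><pre class="highlight"><code>  [[inputs.win_perf_counters.object]]
    ObjectName = "MSMQ Service"
    Counters = ["Total bytes in all queues"]
    Instances = ["------"]   # this counter has no instances
    Measurement = "msmq_service"
</code></pre>
</div>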

]]></content>
  </entry>
  
  
  
  <entry>
    <title type="html"><![CDATA[Monitoring a Windows environment: a solution]]></title>
    <link href="http://bjornej.github.io/blog/2017/08/25/monitoring-a-windows-enviroment-a-solution/"/>
    <updated>2017-08-25T00:00:00+00:00</updated>
    <id>http://bjornej.github.io/blog/2017/08/25/monitoring-a-windows-enviroment-a-solution</id>
    <content type="html"><![CDATA[<p>In the previous post I described some of the challenges of monitoring a Windows-only environment. Now I’ll describe the solution I’ve chosen, which has been working for almost a year without a hitch, monitoring a hundred VMs.</p>

<p>If you remember from the previous post the four areas to consider are:</p>

<ul>
  <li>Data collection</li>
  <li>Data storage</li>
  <li>Data visualization</li>
  <li>Alerting</li>
</ul>

<h3 id="data-collection">Data collection</h3>

<p>As we said, one of the most important requirements is the ability to read Windows Performance Counters and write the data to our chosen storage. Here the solution was simple: <a href="https://www.influxdata.com/time-series-platform/telegraf/">Telegraf</a>.</p>

<p>Telegraf can be easily installed as a Windows Service and most importantly:</p>

<ul>
  <li>it can write to numerous types of storage, giving you flexibility in choosing the storage layer</li>
  <li>it already has numerous plugins for scraping data from many sources (and if a plugin does not exist, it’s simple to create a new one if you know a little Go)</li>
</ul>
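
<p>For example, pointing Telegraf at an InfluxDB instance is just a matter of an output section like this (host and database names here are placeholders):</p>

<div class="highlighter-rouge"><pre class="highlight"><code>[[outputs.influxdb]]
  urls = ["http://monitoring-host:8086"]  # your InfluxDB endpoint
  database = "telegraf"                   # target database
</code></pre>
</div>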

<h3 id="data-storage">Data storage</h3>

<p>The second choice concerned the storage layer. While using Telegraf allows a great degree of flexibility, we had to choose a solution with good Windows support, low hardware requirements, and the ability to be queried by the tools we chose for data visualization and alerting.</p>

<p>The choice was simple: we chose <a href="https://www.influxdata.com/time-series-platform/influxdb/">InfluxDB</a>, which can be installed as a Windows service, has low requirements and integrates well with the other tools in the stack.</p>

<p>Notably, it also has a concept of data retention, allowing you to specify an automatic cleanup of your data after, say, 30 days.</p>
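
<p>Such a retention policy is a single InfluxQL statement; the policy and database names below are just examples:</p>

<div class="highlighter-rouge"><pre class="highlight"><code>CREATE RETENTION POLICY "thirty_days" ON "telegraf" DURATION 30d REPLICATION 1 DEFAULT
</code></pre>
</div>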

<h3 id="data-visualization">Data visualization</h3>

<p>To visualize the collected data we chose <a href="https://grafana.com/">Grafana</a>, which can plot graphs reading directly from InfluxDB and many other data sources, giving you a lot of flexibility and letting you easily navigate and correlate your data.</p>

<h3 id="alerting">Alerting</h3>

<p>Initially we tried Grafana’s new alerting support for this part, but we soon realized it was too limited to work well (no multiple levels, no different alerting rules, …), so we later switched to <a href="https://bosun.org/">Bosun</a>, a monitoring and alerting tool created by Stack Exchange. This tool can easily query InfluxDB, and allows you to define complex rules and tweak them over time.</p>

<p>It also has a dashboard with all active alerts, where you can handle them by acknowledging them or closing them when solved.</p>
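
<p>To give an idea of the rule language, a Bosun alert with separate warning and critical levels looks roughly like this. The query expression is only a sketch: the exact arguments of Bosun’s <code>influx</code> query function and the measurement and field names depend on your setup, so treat everything here as an assumption to check against the Bosun documentation:</p>

<div class="highlighter-rouge"><pre class="highlight"><code>alert high.cpu {
    template = generic
    # expression reading the CPU metric from InfluxDB; the influx()
    # arguments and series names below are assumptions for illustration
    $q = avg(influx("telegraf", '''SELECT mean("Percent_Processor_Time") FROM "win_cpu"''', "10m", "", "1m"))
    warn = $q &gt; 80
    crit = $q &gt; 95
}
</code></pre>
</div>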

<h3 id="reamrks">Remarks</h3>

<p>While this solution has proved to work well, other combinations are possible:</p>

<ul>
  <li>swapping Grafana and Bosun for Chronograf and Kapacitor (made by the same developers as InfluxDB and Telegraf)</li>
  <li>using Prometheus as data storage</li>
  <li>using Elasticsearch as storage, Kibana for visualization and Watcher for alerting</li>
</ul>
]]></content>
  </entry>
  
  
  
  <entry>
    <title type="html"><![CDATA[Monitoring a Windows environment]]></title>
    <link href="http://bjornej.github.io/blog/2017/08/23/monitoring-a-windows-enviroment/"/>
    <updated>2017-08-23T00:00:00+00:00</updated>
    <id>http://bjornej.github.io/blog/2017/08/23/monitoring-a-windows-enviroment</id>
    <content type="html"><![CDATA[<p>Developing distributed applications in a Windows-only environment, with hosted virtual machines, can present some challenges and require you to find solutions that are not so standardized, since this type of environment is not very common.</p>

<p>One of the challenges is collecting information about your VMs, some important services (SQL Server, RabbitMQ, Elasticsearch, …) and your virtualization environment, so that you can inspect them or be alerted in case of problems (or better, before they happen).</p>

<p>To set up this kind of data collection and monitoring there are four areas to consider. I will describe them and then propose a solution I’ve found works really well with minimal setup and is easy to configure.</p>

<p>The four areas to consider are:</p>

<ul>
  <li>Data collection</li>
  <li>Data storage</li>
  <li>Data visualization</li>
  <li>Alerting</li>
</ul>

<h3 id="data-collection">Data collection</h3>

<p>The first step is deciding what data you need to collect and finding a program that can read it at a pre-defined interval and store it where you want. While there are many tools of this type, we need one that can work on Windows and adapt to its peculiarities. This means having something able to read Windows Performance Counters.</p>

<p>Performance Counters are a Windows-only mechanism used to report values about the operating system, but they can also be used by applications like SQL Server. Some examples are:</p>

<ul>
  <li>number of bytes of memory used</li>
  <li>number of messages in a MSMQ queue</li>
  <li>outbound throughput for every network card</li>
  <li>number of physical reads per second (SQL Server)</li>
</ul>

<p>The full list of these counters can be seen in <strong>Performance Monitor</strong>, where they are divided into categories for easier navigation. The Performance Monitor interface is a throwback to the nineties, but all we need is the list of counters shown when adding a new counter.</p>

<p>Be aware that Performance Counter names are localized, so names may differ between your local computer and your production servers.</p>
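
<p>On the command line, the built-in <strong>typeperf</strong> tool can also list and sample counters, which is handy for checking the exact (localized) names before putting them into a collector configuration:</p>

<div class="highlighter-rouge"><pre class="highlight"><code>REM list all counters of the Processor object
typeperf -q Processor

REM sample one counter five times
typeperf "\Processor(_Total)\% Processor Time" -sc 5
</code></pre>
</div>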

<p><strong>Options</strong>: Telegraf, Collectd, scollector, Elastic Beats, …</p>

<h3 id="data-storage">Data storage</h3>

<p>The second and most important part is the storage of your collected data. The usual solution is a time-series database, a form of database specialized in saving and reading time series (metrics). Multiple options are available that run easily on Windows, like InfluxDB and Prometheus. Your choice should be based primarily on two factors:</p>

<ul>
  <li>can my data collector program write to the chosen storage?</li>
  <li>can I easily visualize data from the storage?</li>
</ul>

<p>Availability and disaster recovery may play a part in your choice but after running our chosen system for a year we treat this data as important but disposable, meaning we have no high availability and we accept the possibility of losing the data. We also keep only the last thirty days of data, a compromise allowing us to inspect trends in a metric but limiting the size of storage needed.</p>

<p>Another important factor to consider is how much data your chosen tool can ingest. If you collect 10 metrics per machine every ten seconds, what works with a hundred servers may fail with ten thousand. Fortunately, most tools can ingest massive quantities of data with limited resources, and if this becomes a problem you can use multiple databases, partitioning the resources by type or group of servers.</p>

<p><strong>Options</strong>: Prometheus, InfluxDB, Elasticsearch, Graphite, OpenTSDB</p>

<h3 id="data-visualization">Data visualization</h3>

<p>The third part is data visualization, for which you need a tool that is:</p>

<ul>
  <li>easy to use</li>
  <li>able to show the current status at a glance</li>
  <li>able to let you inspect and correlate historical data to pinpoint problems in a post-mortem</li>
  <li>easily configurable by everyone</li>
</ul>

<p>While this part seems “easy”, a good visualization tool <strong>will</strong> make the difference. It will help you correlate a spike in your response times with anomalies in your database metrics. It will help you pinpoint configuration errors in your servers. It will let you uncover stray queries that read the entire database every two hours to generate a report. In short, a good visualization tool will let you discover so much more about your systems and applications than you ever thought possible.</p>

<p>The fourth point is also important. While creating standard visualizations is really useful, anyone should be able to explore the data freely, without having to ask a central group to create a new visualization every time one is needed. This lets everyone move freely without having to wait for someone else.</p>

<p><strong>Options</strong>: Grafana, Kibana, Chronograf,…</p>

<h3 id="alerting">Alerting</h3>

<p>The last part is alerting on the collected data. Ideally you want to be able to:</p>

<ul>
  <li>define rules about possible problems, with levels of importance (at least warning and critical)</li>
  <li>quickly change the defined rules</li>
  <li>define multiple alerting destinations</li>
  <li>view at a glance the status of all your alerts and track their status over time</li>
</ul>

<p>Defining different levels of importance will let you see problems well before they matter, like a disk filling up, and give you time to act, or to ignore them during the night. The ability to tweak these rules lets you cut down on the number of alerts, which left unchecked can lead to <em>alert fatigue</em>, making you ignore an important alert because it is buried in a thousand less important notifications.</p>

<p>If you can define different alert destinations you can escalate the important things while leaving the minor ones to be handled when you have time. A dashboard that shows all active alerts, their current status and their history will also make it easier to coordinate with the rest of your team.</p>

<p><strong>Options</strong>: Grafana, Bosun, Chronograf</p>
]]></content>
  </entry>
  
  
  
  <entry>
    <title type="html"><![CDATA[Easy structured documentation]]></title>
    <link href="http://bjornej.github.io/blog/2017/03/05/easy-structured-documentation/"/>
    <updated>2017-03-05T00:00:00+00:00</updated>
    <id>http://bjornej.github.io/blog/2017/03/05/easy-structured-documentation</id>
    <content type="html"><![CDATA[<p>One of the classic problems arising when developing professionally is how to create documentation about our software in a way that is easily discoverable and updatable by every developer. We faced this problem and tried some options along the way before settling on the easiest and most approachable solution we could find.</p>

<h4 id="first-solution-sharepoint">First solution: Sharepoint</h4>

<p>This was the solution in place when I was hired. It basically boils down to a website where documents can be uploaded into a folder-like structure, with permissions to upload, read and modify them. While it worked, it had some serious problems:</p>

<ul>
  <li>lack of search across documents</li>
  <li>permissions had to be requested from an administrator to view documents (in truth, this could have been avoided by giving read permissions to everyone)</li>
  <li>permissions had to be requested to modify documents, even to correct a typo</li>
  <li>finding documents was not easy, as they could exist but you could be missing the permissions needed to see them</li>
  <li>on the plus side, its folder-like structure allowed content to be organized in a logical way</li>
</ul>

<h4 id="second-solution-the-wiki">Second solution: The wiki</h4>

<p>In an attempt to solve some of these problems we decided to use a wiki to write our documentation. It brought clear improvements, but also a disadvantage of its own:</p>

<ul>
  <li>we gained full-text search across all documentation</li>
  <li>permissions were no longer needed, allowing everyone to contribute to and amend the documentation with full traceability</li>
  <li>however, a logical organization of the documentation was missing, requiring each developer to maintain it by hand with links in every page, which was tedious and error-prone</li>
</ul>

<h3 id="the-final-solution">The final solution</h3>

<p>At this point we realized we needed a different solution, something that would let us:</p>

<ul>
  <li>organize our documentation hierarchically, with ease</li>
  <li>let everyone update it easily</li>
</ul>

<p>To reach this objective we used <a href="http://www.metalsmith.io/">Metalsmith</a> to create a git repository containing our documentation written in Markdown. Each time someone edits a page, the repository is built by the CI system and a static site is deployed. The structure of the site exactly mirrors the on-disk folder structure, which provides the topic organization.</p>

<p>In this configuration no permissions are needed, as every edit is recorded in the source control history and can easily be reverted.</p>
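
<p>For reference, the core of such a build script is quite small. This is only a sketch of a typical Metalsmith pipeline; the plugin names (<code>metalsmith-markdown</code>, <code>metalsmith-layouts</code>) are common community packages, and the actual build.js in the repository may differ:</p>

<div class="highlighter-rouge"><pre class="highlight"><code>// build.js (illustrative sketch, not the actual file)
var Metalsmith = require('metalsmith');
var markdown = require('metalsmith-markdown');
var layouts = require('metalsmith-layouts');

Metalsmith(__dirname)
  .source('./src')          // Markdown files, mirroring the folder structure
  .destination('./build')   // static site output
  .use(markdown())          // convert .md files to .html
  .use(layouts({ engine: 'handlebars' }))
  .build(function (err) {   // run the pipeline
    if (err) throw err;
  });
</code></pre>
</div>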

<p>If you want to try it you can download a customizable skeleton at https://github.com/Bjornej/KnowledgeBase. To run it, just execute</p>

<div class="highlighter-rouge"><pre class="highlight"><code>npm install
node --harmony build.js
</code></pre>
</div>

<p>and your static site will be built.</p>
]]></content>
  </entry>
  
  
  
  <entry>
    <title type="html"><![CDATA[Ten thousand deploys]]></title>
    <link href="http://bjornej.github.io/blog/2017/02/25/ten-thousand-deploys/"/>
    <updated>2017-02-25T00:00:00+00:00</updated>
    <id>http://bjornej.github.io/blog/2017/02/25/ten-thousand-deploys</id>
    <content type="html"><![CDATA[<p>I started developing professionally six years ago, getting a job fresh out of university. At the time I didn’t know much about all the topics surrounding development: continuous integration, continuous delivery, capacity planning, etc.</p>

<p>When I started developing my first application, what I was taught was: verify that it builds, then deploy it to a network folder with the Visual Studio <em>Publish</em> option. This worked fine while I was working alone, but it had numerous problems:</p>

<ul>
  <li>a build was not exactly reproducible, as it didn’t match what was in source control but what was on my computer</li>
  <li>I had to remember to build and deploy the correct configuration, which was different from the one used for development</li>
  <li>in one strange case my computer was the only one able to build the application, due to a specific version of a compiler that only I had installed</li>
</ul>

<p>All these things added up to many little problems, so I started looking for better solutions, reading books about continuous delivery and deployment. At the same time, the number of applications we were working on grew, magnifying the problems we had, so I started working on Bazooka, a tool to automate application deployments.</p>

<p>Last week I was looking at its statistics page and noticed one thing: in less than two years (from June 2015 to February 2017) over thirty developers worked on ninety applications making over ten thousand deploys.</p>

<p><img src="/images/graficoDeploy.png" alt="Deployments by month" /></p>

<p>As you can see usage has been steady with an increase during summer months as many of our applications have a usage peak in the summer.</p>

<p>Over 90 applications were published, some of which are very active while others are in maintenance mode and see only a few deploys per year.</p>

<p><img src="/images/grafico2.png" alt="Deployments by application" /></p>

<p>During these two years many bugs have been fixed and features added, but there are still some features that would make for an easier deployment experience, like release promotion through environments. But that is a topic for another post.</p>

]]></content>
  </entry>
  
  
  
  <entry>
    <title type="html"><![CDATA[OData and ASP.NET Core]]></title>
    <link href="http://bjornej.github.io/blog/2017/01/31/odata-and-asp-net-core/"/>
    <updated>2017-01-31T00:00:00+00:00</updated>
    <id>http://bjornej.github.io/blog/2017/01/31/odata-and-asp-net-core</id>
    <content type="html"><![CDATA[<p>At work we’re in the process of migrating some applications to ASP.NET Core to take advantage of some of its features:</p>

<ul>
  <li>the ability to self host in process without IIS making it perfect to create standalone services</li>
  <li>its design based on dependency injection</li>
</ul>

<p>While we would love to use the .NET Core version, too many libraries have not been ported yet, so we are forced to run on the full .NET Framework and keep an eye on future developments before moving to .NET Core.</p>

<p>While this solves many problems, some libraries that integrated with ASP.NET have not been ported; in our case what was missing was the OData support. Searching online you can find some information about it:</p>

<ul>
  <li>https://github.com/OData/WebApi/issues/772</li>
  <li>https://github.com/OData/WebApi/issues/229</li>
  <li>https://github.com/OData/WebApi/issues/744</li>
</ul>

<p>but all signs point to the porting work having been stopped due to other priorities. This was a showstopper for us, as we rely on OData to easily connect a React component representing a sorted, paginated and filterable table directly to a data source, without having to code all possible queries.</p>

<p>Fortunately we didn’t rely on the classic ODataController, but instead bound ODataQueryOptions as an action parameter and then built some extension methods to apply it and obtain a PageResult (a type representing your data along with the total number of elements in the collection, which is necessary to draw a paginated table).</p>

<div class="highlighter-rouge"><pre class="highlight"><code>    [HttpGet]
    public PageResult&lt;Person&gt; Teens(ODataQueryOptions opts)
    {
        return dataSource.Persons().Where(x =&gt; x.Age &lt; 18).PagedFilter(opts);
    }
</code></pre>
</div>

<p>The main problem is that this type by default won’t bind to the request, as in Web API this was done by <a href="https://www.symbolsource.org/Public/Metadata/NuGet/Project/Microsoft.AspNet.WebApi.OData/5.2.2/Release/.NETFramework,Version=v4.5/System.Web.Http.OData/System.Web.Http.OData/System.Web.Http.OData/OData/ODataQueryParameterBindingAttribute.cs">ODataQueryParameterBindingAttribute</a>. As the source is easily available thanks to Microsoft’s new openness, we can easily adapt it to the IModelBinder and IModelBinderProvider interfaces used by ASP.NET Core.</p>

<div class="highlighter-rouge"><pre class="highlight"><code>public class ODataQueryOptionsModelBinder : IModelBinder
{
    private struct AsyncVoid
    {
    }

    private static MethodInfo _createODataQueryOptions = typeof(ODataQueryOptionsModelBinder).GetMethod("CreateODataQueryOptions");

    private static readonly Task _defaultCompleted = Task.FromResult&lt;AsyncVoid&gt;(default(AsyncVoid));

    public Task BindModelAsync(ModelBindingContext bindingContext)
    {
        if (bindingContext == null)
        {
            throw new ArgumentNullException("bindingContext");
        }
        
        var request = bindingContext.HttpContext.Request;
        if (request == null)
        {
            throw new ArgumentNullException("request");
        }

        var actionDescriptor = bindingContext.ActionContext.ActionDescriptor;
        if (actionDescriptor == null)
        {
            throw new ArgumentNullException("actionDescriptor");
        }


        Type entityClrType = GetEntityClrTypeFromParameterType(actionDescriptor) ?? GetEntityClrTypeFromActionReturnType(actionDescriptor as ControllerActionDescriptor);

        Microsoft.Data.Edm.IEdmModel model = actionDescriptor.GetEdmModel(entityClrType);
        ODataQueryContext entitySetContext = new ODataQueryContext(model, entityClrType);
        ODataQueryOptions parameterValue = CreateODataQueryOptions(entitySetContext, request, entityClrType);
        bindingContext.Result = ModelBindingResult.Success(parameterValue);

        return _defaultCompleted;
    }

    private static ODataQueryOptions CreateODataQueryOptions(ODataQueryContext ctx, HttpRequest req, Type entityClrType) {
        var method = _createODataQueryOptions.MakeGenericMethod(entityClrType);
        var res = method.Invoke(null,new object[] { ctx,req}) as ODataQueryOptions;
        return res;
    }

    public static ODataQueryOptions&lt;T&gt; CreateODataQueryOptions&lt;T&gt;(ODataQueryContext context, HttpRequest request)
    {
        var req = new System.Net.Http.HttpRequestMessage(System.Net.Http.HttpMethod.Get, request.Scheme + "://" + request.Host + request.Path + request.QueryString);
        return new ODataQueryOptions&lt;T&gt;(context, req);
    }

    internal static Type GetEntityClrTypeFromActionReturnType(ControllerActionDescriptor actionDescriptor)
    {
        if (actionDescriptor.MethodInfo.ReturnType == null)
        {
            throw new Exception("Cannot use ODataQueryOptions when return type is null");
        }

        return TypeHelper.GetImplementedIEnumerableType(actionDescriptor.MethodInfo.ReturnType);
    }

    internal static Type GetEntityClrTypeFromParameterType(ActionDescriptor parameterDescriptor)
    {
        Type parameterType = parameterDescriptor.Parameters.First(x =&gt; x.ParameterType == typeof(ODataQueryOptions) || x.ParameterType.IsSubclassOf&lt;ODataQueryOptions&gt;()).ParameterType;

        if (parameterType.IsGenericType &amp;&amp;
            parameterType.GetGenericTypeDefinition() == typeof(ODataQueryOptions&lt;&gt;))
        {
            return parameterType.GetGenericArguments().Single();
        }

        return null;
    }
}
</code></pre>
</div>

<p>the ODataModelHelper</p>

<div class="highlighter-rouge"><pre class="highlight"><code>public static class ODataModelHelper
{
    private const string ModelKeyPrefix = "MS_EdmModel";

    private static System.Web.Http.HttpConfiguration configuration = new System.Web.Http.HttpConfiguration();

    internal static Microsoft.Data.Edm.IEdmModel GetEdmModel(this ActionDescriptor actionDescriptor, Type entityClrType)
    {
        if (actionDescriptor == null)
        {
            throw new ArgumentNullException("actionDescriptor");
        }

        if (entityClrType == null)
        {
            throw new ArgumentNullException("entityClrType");
        }

        if (actionDescriptor.Properties.ContainsKey(ModelKeyPrefix + entityClrType.FullName))
        {
            return actionDescriptor.Properties[ModelKeyPrefix + entityClrType.FullName] as Microsoft.Data.Edm.IEdmModel;
        }
        else
        {
            ODataConventionModelBuilder builder = new ODataConventionModelBuilder(ODataModelHelper.configuration, isQueryCompositionMode: true);
            EntityTypeConfiguration entityTypeConfiguration = builder.AddEntity(entityClrType);
            builder.AddEntitySet(entityClrType.Name, entityTypeConfiguration);
            Microsoft.Data.Edm.IEdmModel edmModel = builder.GetEdmModel();
            actionDescriptor.Properties[ModelKeyPrefix + entityClrType.FullName] = edmModel;
            return edmModel;

        }
    }
}
</code></pre>
</div>

<p>the IModelBinderProvider</p>

<div class="highlighter-rouge"><pre class="highlight"><code>public class ODataModelBinderProvider : IModelBinderProvider
{
    public IModelBinder GetBinder(ModelBinderProviderContext context)
    {
        if (context.Metadata.ModelType.GetTypeInfo() == typeof(ODataQueryOptions) ||
            context.Metadata.ModelType.GetTypeInfo().IsSubclassOf&lt;ODataQueryOptions&gt;())
        {
            return new ODataQueryOptionsModelBinder();
        }

        return null;
    }
}
</code></pre>
</div>

<p>and finally register it in the <strong>ConfigureServices</strong> method</p>

<div class="highlighter-rouge"><pre class="highlight"><code>var mvc = services.AddMvc(options =&gt;{
	options.ModelBinderProviders.Insert(0, new ODataModelBinderProvider());
});
</code></pre>
</div>
]]></content>
  </entry>
  
  
  
  <entry>
    <title type="html"><![CDATA[Bazooka 0.3 is ready]]></title>
    <link href="http://bjornej.github.io/blog/2017/01/30/your-filename/"/>
    <updated>2017-01-30T00:00:00+00:00</updated>
    <id>http://bjornej.github.io/blog/2017/01/30/your-filename</id>
    <content type="html"><![CDATA[<p>After more than one year and over five thousand deploys a new release of Bazooka is ready. Grab the new and shiny 0.3 version from <a href="https://github.com/BazookaDeploy/Bazooka/releases/tag/0.3">the release page</a>.</p>

<p>This version is a complete rewrite of the User Interface to make it more appealing and easier to add functionalities to.</p>

<p>It also comes with some new features:</p>

<ul>
  <li>the agent no longer stops if multiple packages are found when updating</li>
  <li>application groups can be reordered on the homepage, customizing the placement for every user</li>
  <li>an application environment configuration can be cloned from another environment</li>
  <li>an entire application configuration can be cloned from another one</li>
</ul>

<p>Stay tuned for more features.</p>

]]></content>
  </entry>
  
  
  
  <entry>
    <title type="html"><![CDATA[Local Windows Docker deployment with ASP.NET 5 and Docker Toolbox]]></title>
    <link href="http://bjornej.github.io/blog/2015/11/05/local-windows-docker-deployment-with-asp-net-5-and-docker-toolbox/"/>
    <updated>2015-11-05T00:00:00+00:00</updated>
    <id>http://bjornej.github.io/blog/2015/11/05/local-windows-docker-deployment-with-asp-net-5-and-docker-toolbox</id>
    <content type="html"><![CDATA[<p>If you are reading this article you probably already know what <a href="https://www.docker.com/">Docker</a> is and as a .NET developer you’ve always wanted to be able to use it to deploy your applications.</p>

<p>With the beta 8 of ASP.NET 5 a new deploy option is available that lets you deploy your website <a href="http://docs.asp.net/en/latest/publishing/docker.html">to a virtual machine in Azure running Docker</a>. This is actually pretty simple and well covered by the documentation, but what if you don’t have an Azure account or network access, or simply prefer to try things on your local computer?</p>

<p>This situation is not covered by the docs, so I’ll cover the necessary steps. Note that I’ve tried these instructions on Windows 10, but there should be no problem on Windows 7, 8 or 8.1.</p>

<h3 id="prerequisites">Prerequisites</h3>

<p>First of all make sure you’ve installed all of the following:</p>

<ul>
  <li><a href="https://docs.docker.com/windows/step_one/">docker machine</a></li>
  <li><a href="https://www.visualstudio.com/en-us/downloads/download-visual-studio-vs.aspx">Visual Studio 2015</a> (at least community edition)</li>
  <li><a href="https://www.microsoft.com/en-us/download/details.aspx?id=49442">Microsoft ASP.NET and Web Tools 2015 (Beta8)</a></li>
  <li><a href="https://visualstudiogallery.msdn.microsoft.com/0f5b2caa-ea00-41c8-b8a2-058c7da0b3e4">Visual Studio 2015 Tools for Docker - October Preview</a></li>
</ul>

<p>and verify your docker installation is working.</p>

<h3 id="deploying-to-docker">Deploying to Docker</h3>

<p>The first thing to do is to create a new ASP.NET Web Application with the <strong>ASP.NET Preview Templates</strong>. Choose <strong>Web application</strong> as in the image and uncheck the <em>host in the cloud</em> option.</p>

<p><img src="/images/dockercreate.png" alt="" /></p>

<p>Proceed with project creation and then build it by pressing <strong>Ctrl+Shift+B</strong>. To start the deploy simply right-click on the project and select the <strong>Publish</strong> option.</p>

<p><img src="/images/dockerpublish.png" alt="" /></p>

<p>Choose the <strong>docker containers</strong> option and then the <strong>Custom Docker host</strong>.</p>

<p><img src="/images/dockerhost.png" alt="dockerhost.png" /></p>

<p>Now you will have to specify the connection info for your local Docker host:</p>

<p><img src="/images/dockercustom.png" alt="" /></p>

<p>To obtain this info, launch the <strong>Docker quickstart terminal</strong> installed with the Docker Tools.</p>

<p><img src="/images/dockerterminal.png" alt="" /></p>

<p>You will see, drawn in green, the name of your machine and its IP, in this case 192.168.99.100. Return to the connection info dialog and insert <strong>tcp://192.168.99.100:2376</strong>. Then execute in the terminal</p>

<div class="highlighter-rouge"><pre class="highlight"><code>docker-machine config default
</code></pre>
</div>

<p>substituting default with your machine name. You will see all the config options to connect to your docker host. Copy all of it except the last parameter (-H tcp://192.168.99.100:2376), as it is not necessary, and paste it in the <strong>Docker Advanced Options &gt; Auth Options</strong> field.</p>

<p>Now you’re done. Press <strong>Publish</strong> and wait until the build terminates; the browser will automatically open your website running in Docker.</p>

<p><img src="/images/dockerrunning.png" alt="" /></p>

<p><img src="/images/dockerps.png" alt="" /></p>

]]></content>
  </entry>
  
  
  
  <entry>
    <title type="html"><![CDATA[Simulating actions on many elements in Javascript]]></title>
    <link href="http://bjornej.github.io/blog/2015/08/10/simulating-actions-on-many-elements-in-javascript/"/>
    <updated>2015-08-10T00:00:00+00:00</updated>
    <id>http://bjornej.github.io/blog/2015/08/10/simulating-actions-on-many-elements-in-javascript</id>
<content type="html"><![CDATA[<p>Sometimes when developing an application, especially in certain fields, you’ll receive the same old request:</p>

<div class="highlighter-rouge"><pre class="highlight"><code>I want to select all the elements in the table and [print|delete|close|whatever] them
</code></pre>
</div>

<p>While this request is (sometimes) reasonable, it carries a lot of problems that must be solved to implement it correctly:</p>

<ul>
  <li>how many elements can be selected together? 10, 20, 50, a thousand?</li>
  <li>how many users of the system can use this feature at the same time?</li>
  <li>does the user have to see the status of his pending jobs?</li>
</ul>

<p>This type of feature can easily cause problems if not thought through. Imagine a user coming in, selecting a thousand rows and clicking “Print”. Now the system has to print maybe a couple thousand pages, which may take some time.</p>

<p>If the action can be completed quickly there is usually no problem, but imagine a user that gives the command and doesn’t receive a timely response. Usually one of the things he tries is to execute the action again, which means your system is now printing double the pages.</p>

<p>This problem is compounded if many users try to use the same function, swamping your application with pending requests until it grinds to a halt, especially if these jobs are processed by a queue that cannot be emptied sufficiently fast.</p>

<p>In some of these cases you’ll find that a really simple solution can be used, one that can simply be called “cheating”. Suppose you have a list of elements to process and for each of these you have to make an AJAX call to the server. Instead of creating a new action to process all the elements in a batch, you can simply chain all the requests together to execute in sequence and show an update on screen, in this way:</p>

<div class="highlighter-rouge"><pre class="highlight"><code>var d = jQuery.Deferred();
var p = d.promise();

for (var i = 0; i &lt; chosen.length; i++) {
  p = p.then(_.bind(function (index) {
    return $.post("youraction", function (response) {
      // show an update on screen (index tells how far along we are)
    });
  }, this, i));
}

// resolving the initial deferred starts the whole chain
d.resolve();
</code></pre>
</div>

<p>As you can see, all we do is create an initial promise through jQuery and then concatenate a promise for each element to process. The last line starts the process by resolving the initial deferred, which triggers the first request, then the second, and so on.</p>
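<p>With modern JavaScript the same sequential chaining can be expressed more directly with async/await. This is just a sketch of the idea: <em>processAll</em>, <em>processElement</em> and <em>updateProgress</em> are illustrative names standing in for your AJAX call and on-screen update, not part of the original code.</p>

<div class="highlighter-rouge"><pre class="highlight"><code>// Process each chosen element in sequence, updating the UI after each one.
async function processAll(chosen, processElement, updateProgress) {
  let done = 0;
  for (const item of chosen) {
    // one request at a time, never in parallel
    await processElement(item);
    done = done + 1;
    updateProgress(done, chosen.length);
  }
}
</code></pre>
</div>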

<p>While I called this solution “cheating”, as it doesn’t really process all the elements in a single batch, it has some nice properties:</p>

<ul>
  <li>it solves the user’s problem fully, as it allows him to select some elements and apply an action to them in bulk</li>
  <li>it doesn’t swamp the system with huge requests, as only one element at a time is processed</li>
  <li>it eliminates the need to stop the user from re-queueing the same batch because he thinks it is taking too long, or to implement a way to show the user the status of his pending jobs.</li>
</ul>

<p>All in all, a simple solution that pleases almost everyone. The only drawback is that if the user closes the browser not all the elements will be processed, but in many cases you’ll find that this doesn’t matter.</p>
]]></content>
  </entry>
  
  
  
  <entry>
    <title type="html"><![CDATA[Supporting multiple Visual Studio versions in a single extension]]></title>
    <link href="http://bjornej.github.io/blog/2015/04/18/supporting-multiple-visual-studio-versions-in-a-single-extension/"/>
    <updated>2015-04-18T00:00:00+00:00</updated>
    <id>http://bjornej.github.io/blog/2015/04/18/supporting-multiple-visual-studio-versions-in-a-single-extension</id>
<content type="html"><![CDATA[<p>When you create a Visual Studio extension, to maximise its usefulness you try to make it work in all the most common Visual Studio versions, which usually means VS 2012, 2013 and 2015.</p>

<p>This is not a big problem if your extension doesn’t target specific capabilities introduced only in the latest versions, as every Visual Studio version installs the preceding versions’ libraries.</p>

<p>Sometimes, when you upgrade your extension to target the new Visual Studio versions, you receive a complaint from someone that your extension doesn’t work anymore. Further, this seems to happen only on machines with only an old VS version (usually 2012) installed, so you’re left to ponder what happened.</p>

<p>Searching the internet for the specific message you find the likely cause: upgrading to a newer version of Visual Studio updated your project references to target the new DLL versions, which can’t be found on a system that only has VS 2012 installed.</p>

<p>A better search finds many results <a href="https://github.com/jfromaniello/nestin/issues/2">here</a>, <a href="http://stackoverflow.com/questions/25606632/how-can-a-vs-extension-target-multiple-versions-in-regard-to-microsoft-visualstu">here</a> and <a href="https://connect.microsoft.com/VisualStudio/feedback/details/794961/previous-version-assemblies-cannot-load-in-visual-studio-2013-preview">here</a> suggesting some options:</p>

<ul>
  <li>creating two separate extensions to target VS2012 and VS2013</li>
  <li>dynamically loading the required library</li>
  <li>manually editing your csproj to remove the new references, paying attention to the fact that VS will always try to update them</li>
</ul>

<p>Solution 1 is a no-go as I’m lazy and for a simple extension like Gruntlauncher I don’t want to maintain two separate versions. Option 2 is doable but a little complex, while solution 3 is the easiest but feels a lot like battling with Visual Studio.</p>

<p>Luckily I found another option in the form of the <a href="https://github.com/tunnelvisionlabs/vsxdeps">VSSDK NuGet packages</a>. Once installed, these packages let you always reference the correct version of the library without having to maintain separate versions or resort to ugly hacks.</p>
]]></content>
  </entry>
  
  
  
  <entry>
    <title type="html"><![CDATA[Running Grunt/gulp as part of a TFS build]]></title>
    <link href="http://bjornej.github.io/blog/2015/03/08/running-grunt-gulp-as-part-of-a-tfs-build/"/>
    <updated>2015-03-08T00:00:00+00:00</updated>
    <id>http://bjornej.github.io/blog/2015/03/08/running-grunt-gulp-as-part-of-a-tfs-build</id>
<content type="html"><![CDATA[<p>In the past, to execute grunt as part of a TFS build, I created a nuget package that modifies the MSBuild file: <a href="http://bjornej.github.io/blog/2014/01/19/grunt-msbuild-and-tfs/">Grunt.MSBuild</a>.</p>

<p>Having updated TFS to the 2013 version, I’ve found there’s an easier way to do it by leveraging its ability to execute a Powershell script during the build. This can be done easily:</p>

<ul>
  <li>install grunt/gulp globally on every build agent. Pay attention to the fact that it must be installed as the user that runs the build agent, and the grunt/gulp location <strong>must</strong> be included in the PATH variable (otherwise you won’t be able to invoke grunt/gulp).</li>
  <li>
    <p>Add a powershell script inside your solution with the following lines</p>

    <div class="highlighter-rouge"><pre class="highlight"><code>  Push-Location "$PSScriptRoot\..\Source\Web"  
  Write-Host "npm package restore"
  &amp; "npm" install
  if ($LastExitCode -ne 0) {
    Write-Error "Npm package restore failed";
    exit 1;
  }

  Write-Host "Grunt requirejs"
  &amp; "grunt" requirejs
  if ($LastExitCode -ne 0) {
    Write-Error "Grunt requirejs failed";
    exit 1;
  }
  Pop-Location
</code></pre>
    </div>
  </li>
  <li>Modify your build definition to include the script by going to the Process tab -&gt; Build section -&gt; Advanced -&gt; Pre-build script path</li>
  <li>Run your newly configured build and enjoy.</li>
</ul>

<p>The script is really simple, can easily be extended, and sometimes must be adapted to your configuration.
The first line moves the current working directory to the directory containing your package.json file; the script then executes an <em>npm install</em> to restore all the needed node packages. After that it executes <em>grunt requirejs</em> and finally restores the original working directory.</p>

<p>This step can easily be replaced by running gulp, or extended by compiling your Sass, et cetera.</p>

]]></content>
  </entry>
  
  
  
  <entry>
    <title type="html"><![CDATA[Extracting a list of grunt and gulp tasks]]></title>
    <link href="http://bjornej.github.io/blog/2014/07/16/extracting-a-list-of-grunt-and-gulp-tasks/"/>
    <updated>2014-07-16T00:00:00+00:00</updated>
    <id>http://bjornej.github.io/blog/2014/07/16/extracting-a-list-of-grunt-and-gulp-tasks</id>
<content type="html"><![CDATA[<p>I use grunt and gulp a lot when working so, to improve my workflow with Visual Studio, I created the GruntLauncher extension, which can parse a <em>gulpfile</em> or <em>gruntfile</em> and extract the contained tasks, allowing you to execute them from inside VS.</p>

<p>To be able to do this I didn’t have many options, so I used a hack: <strong>executing</strong> the file in a javascript engine and reading the options object to identify the tasks.</p>

<p>While this works it has many unhandled corner cases:</p>

<ul>
  <li>all the grunt utility functions must be present and defined</li>
  <li>all the gulp utility functions must be present and defined</li>
  <li>all the common node functions should be present and defined (this is almost impossible to do as there are too many)</li>
</ul>

<p>While this was a hack it worked quite well, with just a handful of unhandled cases, but every time someone had a problem I got back to thinking about a better and more robust solution.</p>

<p>Then an idea dawned on me: why should I mock all these functions instead of simply asking grunt and gulp for the defined config options? While this seems easy to do, remember that I cannot simply invoke the command line and look at the results, as I need the whole options object for postprocessing.</p>

<p>With some experiments I came to the following node.js code for gulp:</p>

<div class="highlighter-rouge"><pre class="highlight"><code>var gulp = require('gulp');
require('./gulpfile');
console.log(Object.keys(gulp.tasks).join(' '));
</code></pre>
</div>

<p>and a similar snippet for grunt (in this case I also look for subtasks):</p>

<div class="highlighter-rouge"><pre class="highlight"><code>var grunt = require('grunt');
require('./gruntfile')(grunt); // the gruntfile exports a function that receives grunt

var param = grunt.config.data;
var names = [];
for(var prop in param){
	if(param.hasOwnProperty(prop)){
		names.push(prop);
		if(param[prop] === Object(param[prop])){
		    var arr=[];
			for(var innerProp in param[prop]){
				if(param[prop].hasOwnProperty(innerProp)){
					if(innerProp!=='options'){
						arr.push(prop+":"+innerProp);
					}
				}
			}
			if(arr.length &gt; 1){
				for (var i=0;i&lt; arr.length;i++){
					names.push(arr[i]);
				}
			}
		}
	}
}
</code></pre>
</div>

<p>With these I can let gulp and grunt do the work, trusting the files to be well formed and the necessary plugins installed, then simply invoke this code from .NET thanks to <a href="https://github.com/tjanczuk/edge">Edge.js</a>.</p>

<p>GruntLauncher is gonna get a serious refactoring…</p>
]]></content>
  </entry>
  
  
  
  <entry>
    <title type="html"><![CDATA[A Modern style checkbox]]></title>
    <link href="http://bjornej.github.io/blog/2014/07/15/a-modern-style-checkbox/"/>
    <updated>2014-07-15T00:00:00+00:00</updated>
    <id>http://bjornej.github.io/blog/2014/07/15/a-modern-style-checkbox</id>
<content type="html"><![CDATA[<p>As I was working on a project I had to create a personal area page where a user could set some options to tweak the behaviour of the program.</p>

<p>As always happens, some of these options were of the type <strong>enable/disable behaviour</strong>, which can be modelled with a simple checkbox.</p>

<p>While the checkbox works I felt the need for a more stylish solution and decided to try to replicate in CSS the Windows 8 checkbox.</p>

<p>First thing, I searched the internet looking for a ready solution and found something close to what I wanted in <a href="http://metroui.org.ua/forms.html">Metro UI</a>, which has a checkbox with the correct style but with a major defect: when clicked, the checkbox switches state immediately, ruining the overall effect.</p>

<p>Solving this problem was quite easy: it took only the addition of a CSS animation on the slider movement and the slider color change.</p>

<p>The result, along with the modified CSS code, can be seen here:</p>

<p data-height="213" data-theme-id="0" data-slug-hash="bszrE" data-default-tab="result" class="codepen">See the Pen <a href="http://codepen.io/paonic/pen/bszrE/">bszrE</a> by Paolo Nicodemo (<a href="http://codepen.io/paonic">@paonic</a>) on <a href="http://codepen.io">CodePen</a>.</p>
<script async="" src="//codepen.io/assets/embed/ei.js"></script>

<p>Two things in this example are interesting to note:</p>

<ul>
  <li>an additional span is required to style everything correctly, as some browsers (Firefox, if I remember correctly) don’t support the <em>:after</em> pseudoselector on an input field</li>
  <li>the animated transition is set not only on the inner span’s position but also on the original span, so even the background color is animated, giving a much more natural effect</li>
</ul>
]]></content>
  </entry>
  
  
  
  <entry>
    <title type="html"><![CDATA[Preventing double clicks in angular]]></title>
    <link href="http://bjornej.github.io/blog/2014/05/20/preventing-double-clicks-in-angular/"/>
    <updated>2014-05-20T00:00:00+00:00</updated>
    <id>http://bjornej.github.io/blog/2014/05/20/preventing-double-clicks-in-angular</id>
<content type="html"><![CDATA[<p>Last week I was working on a SPA made with Angular and I ran into a classic problem: the double click.</p>

<p>You know the situation: a button is clicked, a call is made to the server to create a new order and after the call finishes the form closes. In my case if you were fast enough you could click a couple of times on the button before the call finished, resulting in multiple calls and multiple orders.</p>

<p>To solve this problem once and for all I decided to try my hand at creating a custom directive. Looking at the Angular source code, I found the <strong>ng-click</strong> directive and modified it to create <a href="https://github.com/Bjornej/angular-single-click"><strong>ng-single-click</strong></a>.</p>

<p>This directive can be used by simply substituting ng-click with ng-single-click. If the invoked expression returns something similar to a Promise (meaning it has a finally function) it will automatically ignore all the clicks on the element between the start of the promise and its fulfillment or rejection.</p>
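<p>The core trick can be sketched without Angular at all: wrap a handler so that further invocations are ignored while the promise it returned is still pending. The <em>singleClick</em> helper below is illustrative only and is not the actual source of the directive.</p>

<div class="highlighter-rouge"><pre class="highlight"><code>// Wrap a click handler so repeated clicks are ignored until the
// promise returned by the handler settles.
function singleClick(handler) {
  var busy = false;
  return function () {
    if (busy) { return; }
    var result = handler.apply(this, arguments);
    if (result) {
      if (typeof result.finally === 'function') {
        busy = true;
        result.finally(function () { busy = false; });
      }
    }
  };
}
</code></pre>
</div>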

<p>PS: After some tests I noticed that stopping clicks without giving the user a hint that work is currently being done is not a good user experience so I created a <a href="https://github.com/Bjornej/angular-animated-click">second</a> version of this directive that adapts <a href="http://lab.hakim.se/ladda/">ladda-button</a> to show a working animation in the clicked button.</p>
]]></content>
  </entry>
  
  
  
  <entry>
    <title type="html"><![CDATA[GruntLauncher 1.6 is out]]></title>
    <link href="http://bjornej.github.io/blog/2014/05/19/gruntlauncher-1-6-is-out/"/>
    <updated>2014-05-19T00:00:00+00:00</updated>
    <id>http://bjornej.github.io/blog/2014/05/19/gruntlauncher-1-6-is-out</id>
<content type="html"><![CDATA[<p>A little late, the new version of GruntLauncher is available on the Visual Studio Gallery.</p>

<p>This new version brings three new features:</p>

<ul>
  <li>the ability to launch commands in the same directory as your gruntfile. Some people complained (rightly) that the commands were always executed at the project root, which was not always correct because the gruntfile could sit in a subfolder.</li>
  <li>support for grunt targets. Now if you have a task with at least two targets, all of them will be shown as options to invoke directly. If there’s only one target, only the task will be shown.</li>
  <li>npm install support thanks to <a href="https://github.com/MadsKristensen">Mads Kristensen</a>. Now when right-clicking on your package.json file you will see the option to install all the specified packages (npm install).</li>
</ul>

<p>One more piece of news: the next version of GruntLauncher will almost certainly get a rename, as the current name no longer reflects the features it has.</p>
]]></content>
  </entry>
  
  
  
  <entry>
    <title type="html"><![CDATA[GruntLauncher 1.5 is out]]></title>
    <link href="http://bjornej.github.io/blog/2014/03/13/gruntlauncher-1-5-is-out/"/>
    <updated>2014-03-13T00:00:00+00:00</updated>
    <id>http://bjornej.github.io/blog/2014/03/13/gruntlauncher-1-5-is-out</id>
    <content type="html"><![CDATA[<p>A new release of GruntLauncher is available on <a href="http://visualstudiogallery.msdn.microsoft.com/dcbc5325-79ef-4b72-960e-0a51ee33a0ff">VisualStudioGallery</a> for you to download.</p>

<p>This new release brings a lot of improvements:</p>

<ul>
  <li>better gruntfile parsing support</li>
  <li>
    <p>use of a submenu to group all grunt commands, useful when your gruntfile has many entries (thanks to Mads Kristensen, who did all the work)
 <img src="/images/grunt.png" alt="grunt" /></p>
  </li>
  <li>
    <p>support for executing bower update on all your packages, or on a specific package when right-clicking on it (again thanks to Mads Kristensen)
 <img src="/images/bowerall.png" alt="" />
 <img src="/images/bower.png" alt="" /></p>
  </li>
  <li>support for gulpfile parsing in the same way it is done with gruntfiles (I’ve begun to use gulp and support for it was easy to add)
 <img src="/images/gulp.png" alt="" /></li>
</ul>

<p>With these new functionalities I believe I will have to change the plugin name to something more significant but it will take me some time to come up with a decent name.</p>
]]></content>
  </entry>
  
  
  
  <entry>
    <title type="html"><![CDATA[Grunt, MsBuild and Tfs]]></title>
    <link href="http://bjornej.github.io/blog/2014/01/19/grunt-msbuild-and-tfs/"/>
    <updated>2014-01-19T00:00:00+00:00</updated>
    <id>http://bjornej.github.io/blog/2014/01/19/grunt-msbuild-and-tfs</id>
<content type="html"><![CDATA[<p>One of the most useful features of TFS (at least for me) is the ability to set it up to build your project and output the result in an automated fashion.</p>

<p>The ability to have a simple, unique and repeatable process allows you to build and deploy your project without manual intervention.</p>

<p>However, it can be tricky to customize the build templates used by TFS to include specific additional steps like invoking grunt and executing additional tasks. For this reason I created a nuget package, Grunt.MsBuild, that can help you integrate grunt execution in your TFS builds simply by installing it in your projects.</p>

<h4 id="what-does-it-do">What does it do</h4>

<p>The Grunt.MsBuild package extends your build with a new target that sequentially invokes</p>

<ul>
  <li>npm install (to install all the needed packages and dependencies)</li>
  <li>grunt build$Configuration (where $Configuration will be replaced by the current build configuration, allowing you to customize the build process in debug mode or release mode)</li>
</ul>
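<p>A gruntfile following this convention might register one task per configuration, for example (the task names and steps here are purely illustrative):</p>

<div class="highlighter-rouge"><pre class="highlight"><code>// With this gruntfile the package runs "grunt buildDebug" or
// "grunt buildRelease" depending on the current build configuration.
module.exports = function (grunt) {
  grunt.registerTask('buildDebug', ['concat']);
  grunt.registerTask('buildRelease', ['concat', 'uglify']);
};
</code></pre>
</div>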

<h4 id="version-differences">Version Differences</h4>

<p>To address all the possibilities I’ve created two different packages:</p>

<ul>
  <li>the local version assumes grunt-cli is installed as a local module and will use it to launch grunt</li>
  <li>the global version assumes grunt-cli is installed globally and available in your PATH</li>
</ul>

<p>This separation was necessary because, while it’s easy to install grunt-cli globally on your development machine, it may not be possible on the build server.</p>

<h4 id="running-it">Running it</h4>

<p>I’ve used the packages successfully on an on-premise version of TFS, but if you want to use them on Visual Studio Online it’s currently not possible, because it still uses node 0.6, which is not supported by grunt.</p>

<p>There are two solutions to this problem:</p>

<ul>
  <li>wait until node is updated on the build servers</li>
  <li>commit the node modules you depend on in TFS (I don’t recommend this solution as it can cause problems with the number of files to save)</li>
</ul>

<h4 id="where-can-i-get-it">Where can I get it</h4>

<p>You can download it from the <a href="https://github.com/Bjornej/Grunt.MsBuild">Github repository</a> or install them through NuGet.</p>
]]></content>
  </entry>
  
  
  
  <entry>
    <title type="html"><![CDATA[GruntLauncher for VS2013]]></title>
    <link href="http://bjornej.github.io/blog/2014/01/09/gruntlauncher-for-vs2013/"/>
    <updated>2014-01-09T00:00:00+00:00</updated>
    <id>http://bjornej.github.io/blog/2014/01/09/gruntlauncher-for-vs2013</id>
<content type="html"><![CDATA[<p>A new version of <a href="http://visualstudiogallery.msdn.microsoft.com/dcbc5325-79ef-4b72-960e-0a51ee33a0ff">GruntLauncher</a> is out and it brings support for Visual Studio 2013. Go forth and update it!</p>

]]></content>
  </entry>
  
  
  
  <entry>
    <title type="html"><![CDATA[GruntLauncher 1.1 is out]]></title>
    <link href="http://bjornej.github.io/blog/2014/01/06/gruntlauncher-1-1-is-out/"/>
    <updated>2014-01-06T00:00:00+00:00</updated>
    <id>http://bjornej.github.io/blog/2014/01/06/gruntlauncher-1-1-is-out</id>
<content type="html"><![CDATA[<p>As the title says, GruntLauncher 1.1 is out and available on the Visual Studio Gallery to update.</p>

<h3 id="whats-new">What’s new</h3>

<p>This version brings two improvements:</p>

<ol>
  <li>execution of grunt tasks is no longer blocking (this was due to a bug in node for Windows, but newer versions have solved it, so make sure you have a recent version installed, for example 0.10.22)</li>
  <li>long-running tasks like grunt watch can be started and run in the background, writing their output to the output pane. The command status will become checked, and clicking on it again will stop the process</li>
</ol>

<p><img src="/images/gruntLauncherDetail.png" alt="Running task" /></p>
]]></content>
  </entry>
  
  
</feed>
