<?xml version="1.0" encoding="UTF-8"?><rss version="2.0"
	xmlns:content="http://purl.org/rss/1.0/modules/content/"
	xmlns:wfw="http://wellformedweb.org/CommentAPI/"
	xmlns:dc="http://purl.org/dc/elements/1.1/"
	xmlns:atom="http://www.w3.org/2005/Atom"
	xmlns:sy="http://purl.org/rss/1.0/modules/syndication/"
	xmlns:slash="http://purl.org/rss/1.0/modules/slash/"
	>

<channel>
	<title>PowerShell Magazine</title>
	<atom:link href="https://www.powershellmagazine.com/feed/" rel="self" type="application/rss+xml" />
	<link>https://www.powershellmagazine.com</link>
	<description>For the most Powerful community</description>
	<lastBuildDate>Mon, 05 Apr 2021 13:40:51 +0000</lastBuildDate>
	<language>en-US</language>
	<sy:updatePeriod>hourly</sy:updatePeriod>
	<sy:updateFrequency>1</sy:updateFrequency>
	<generator>https://wordpress.org/?v=5.1.9</generator>
	<item>
		<title>Getting Started with PSArm</title>
		<link>https://www.powershellmagazine.com/2021/04/05/getting-started-with-psarm/</link>
				<comments>https://www.powershellmagazine.com/2021/04/05/getting-started-with-psarm/#respond</comments>
				<pubDate>Mon, 05 Apr 2021 13:40:50 +0000</pubDate>
		<dc:creator><![CDATA[Ravikanth C]]></dc:creator>
				<category><![CDATA[Azure]]></category>
		<category><![CDATA[Azure Resource Manager]]></category>
		<category><![CDATA[Module Spotlight]]></category>
		<category><![CDATA[PSArm]]></category>
		<category><![CDATA[modules]]></category>

		<guid isPermaLink="false">http://www.powershellmagazine.com/?p=13606</guid>
				<description><![CDATA[In the first part of this series, you learned about PSArm &#8212; a PowerShell embedded DSL &#8212; that you can... ]]></description>
								<content:encoded><![CDATA[
<p>In the first part of this <a href="/tags/psarm">series</a>, you learned about <a href="https://github.com/powershell/psarm">PSArm</a> &#8212; a PowerShell-embedded DSL &#8212; that you can use to declaratively define your Azure infrastructure and generate an ARM template. Because PSArm is a PowerShell DSL, it supports PowerShell language constructs and builds the object model for ARM templates on top of PowerShell. So, if you are already familiar with PowerShell, writing ARM templates is as simple as writing another PowerShell script. So, how do you get started?</p>



<h3>Installing PSArm</h3>



<p>You can get the PSArm module from the <a href="https://www.powershellgallery.com/packages/PSArm">PowerShell Gallery</a>.</p>



<pre class="crayon-plain-tag">Install-Module -Name PSArm -AllowPrerelease</pre>



<p>You can also build the module from the <a href="https://github.com/powershell/psarm">source code available on GitHub</a>.</p>



<p>When loaded, this module exports a number of functions and cmdlets.</p>



<pre class="crayon-plain-tag">PS C:\sandbox\psarm&gt; Get-Command -Module PSArm
 CommandType     Name                                               Version    Source
 -----------     ----                                               -------    ------                 
 Function        add                                                0.1.0      PSArm             
 Function        and                                                0.1.0      PSArm          
 Function        array                                              0.1.0      PSArm
 Function        base64                                             0.1.0      PSArm
 Function        base64ToJson                                       0.1.0      PSArm
 Function        base64ToString                                     0.1.0      PSArm
 Function        bool                                               0.1.0      PSArm
 Function        coalesce                                           0.1.0      PSArm
 Function        concat                                             0.1.0      PSArm
 Function        contains                                           0.1.0      PSArm
 Function        copyIndex                                          0.1.0      PSArm
 Function        createArray                                        0.1.0      PSArm
 Function        createObject                                       0.1.0      PSArm
 Function        dataUri                                            0.1.0      PSArm
 Function        dataUriToString                                    0.1.0      PSArm
 Function        dateTimeAdd                                        0.1.0      PSArm
 Function        deployment                                         0.1.0      PSArm
 Function        div                                                0.1.0      PSArm
 Function        empty                                              0.1.0      PSArm
 Function        endsWith                                           0.1.0      PSArm
 Function        environment                                        0.1.0      PSArm
 Function        equals                                             0.1.0      PSArm
 Function        extensionResourceId                                0.1.0      PSArm
 Function        false                                              0.1.0      PSArm
 Function        first                                              0.1.0      PSArm
 Function        float                                              0.1.0      PSArm
 Function        format                                             0.1.0      PSArm
 Function        greater                                            0.1.0      PSArm
 Function        greaterOrEquals                                    0.1.0      PSArm
 Function        guid                                               0.1.0      PSArm
 Function        if                                                 0.1.0      PSArm
 Function        indexOf                                            0.1.0      PSArm
 Function        int                                                0.1.0      PSArm
 Function        intersection                                       0.1.0      PSArm
 Function        json                                               0.1.0      PSArm
 Function        last                                               0.1.0      PSArm
 Function        lastIndexOf                                        0.1.0      PSArm
 Function        length                                             0.1.0      PSArm
 Function        less                                               0.1.0      PSArm
 Function        lessOrEquals                                       0.1.0      PSArm
 Function        list                                               0.1.0      PSArm
 Function        listAccountSas                                     0.1.0      PSArm
 Function        listAdminKeys                                      0.1.0      PSArm
 Function        listAuthKeys                                       0.1.0      PSArm
 Function        listChannelWithKeys                                0.1.0      PSArm
 Function        listClusterAdminCredential                         0.1.0      PSArm
 Function        listConnectionStrings                              0.1.0      PSArm
 Function        listCredential                                     0.1.0      PSArm
 Function        listCredentials                                    0.1.0      PSArm
 Function        listKeys                                           0.1.0      PSArm
 Function        listKeyValue                                       0.1.0      PSArm
 Function        listPackage                                        0.1.0      PSArm
 Function        listQueryKeys                                      0.1.0      PSArm
 Function        listRawCallbackUrl                                 0.1.0      PSArm
 Function        listSecrets                                        0.1.0      PSArm
 Function        listServiceSas                                     0.1.0      PSArm
 Function        listSyncFunctionTriggerStatus                      0.1.0      PSArm
 Function        max                                                0.1.0      PSArm
 Function        min                                                0.1.0      PSArm
 Function        mod                                                0.1.0      PSArm
 Function        mul                                                0.1.0      PSArm
 Function        newGuid                                            0.1.0      PSArm
 Function        not                                                0.1.0      PSArm
 Function        null                                               0.1.0      PSArm
 Function        or                                                 0.1.0      PSArm
 Function        padLeft                                            0.1.0      PSArm
 Function        parameters                                         0.1.0      PSArm
 Function        providers                                          0.1.0      PSArm
 Function        range                                              0.1.0      PSArm
 Function        reference                                          0.1.0      PSArm
 Function        replace                                            0.1.0      PSArm
 Function        resourceGroup                                      0.1.0      PSArm
 Function        resourceId                                         0.1.0      PSArm
 Function        skip                                               0.1.0      PSArm
 Function        split                                              0.1.0      PSArm
 Function        startsWith                                         0.1.0      PSArm
 Function        string                                             0.1.0      PSArm
 Function        sub                                                0.1.0      PSArm
 Function        subscription                                       0.1.0      PSArm
 Function        subscriptionResourceId                             0.1.0      PSArm
 Function        substring                                          0.1.0      PSArm
 Function        take                                               0.1.0      PSArm
 Function        tenantResourceId                                   0.1.0      PSArm
 Function        toLower                                            0.1.0      PSArm
 Function        toUpper                                            0.1.0      PSArm
 Function        trim                                               0.1.0      PSArm
 Function        true                                               0.1.0      PSArm
 Function        union                                              0.1.0      PSArm
 Function        uniqueString                                       0.1.0      PSArm
 Function        uri                                                0.1.0      PSArm
 Function        uriComponent                                       0.1.0      PSArm
 Function        uriComponentToString                               0.1.0      PSArm
 Function        utcNow                                             0.1.0      PSArm
 Function        variables                                          0.1.0      PSArm
 Cmdlet          ConvertFrom-ArmTemplate                            0.1.0      PSArm
 Cmdlet          ConvertTo-PSArm                                    0.1.0      PSArm
 Cmdlet          New-PSArmDependsOn                                 0.1.0      PSArm
 Cmdlet          New-PSArmEntry                                     0.1.0      PSArm
 Cmdlet          New-PSArmFunctionCall                              0.1.0      PSArm
 Cmdlet          New-PSArmOutput                                    0.1.0      PSArm
 Cmdlet          New-PSArmResource                                  0.1.0      PSArm                   
 Cmdlet          New-PSArmSku                                       0.1.0      PSArm
 Cmdlet          New-PSArmTemplate                                  0.1.0      PSArm
 Cmdlet          Publish-PSArmTemplate                              0.1.0      PSArm</pre>



<p>You can see that the exported functions mirror what the ARM template language offers.</p>



<h3>PSArm Syntax Basics </h3>



<p>A typical PSArm script starts with the <code>Arm</code> keyword. Within the <code>Arm</code> body, you define each Azure resource using the <code>Resource</code> keyword.</p>



<pre class="crayon-plain-tag">Arm {
    Resource &quot;identifier&quot; -Namespace &quot;Microsoft.resource&quot; -ApiVersion &quot;resourceApiVersion&quot; -Type &quot;resourceType&quot; {
        properties {
            &quot;propertyName&quot; &quot;value&quot;
        }
    }
    output &quot;Output from deployment&quot;
}</pre>



<p>The <code>Arm</code>, <code>Resource</code>, and <code>output</code> keywords are aliases defined in the module.</p>



<pre class="crayon-plain-tag">PS C:\sandbox\psarm&gt; (Get-Alias).Where({$_.Source -eq 'PSArm'})
 CommandType     Name                                               Version    Source    
 -----------     ----                                               -------    ------                 
 Alias           Arm -&gt; New-PSArmTemplate                           0.1.0      PSArm
 Alias           ArmArray -&gt; New-PSArmArray                         0.1.0      PSArm
 Alias           ArmElement -&gt; New-PSArmElement                     0.1.0      PSArm
 Alias           ArmSku -&gt; New-PSArmSku                             0.1.0      PSArm
 Alias           DependsOn -&gt; New-PSArmDependsOn                    0.1.0      PSArm
 Alias           Output -&gt; New-PSArmOutput                          0.1.0      PSArm
 Alias           RawCall -&gt; New-PSArmFunctionCall                   0.1.0      PSArm
 Alias           RawEntry -&gt; New-PSArmEntry                         0.1.0      PSArm
 Alias           Resource -&gt; New-PSArmResource                      0.1.0      PSArm</pre>



<p>With the <code>Arm</code> keyword, you can specify an optional <code>Name</code> parameter, which is used as the name of the deployment within the template. For the resource definition, you use the <code>Resource</code> keyword. You must specify a <code>Name</code> for the resource you want to provision, the <code>Namespace</code> of the resource, the <code>ApiVersion</code>, and the <code>Type</code>. As you enter arguments for these four parameters, PowerShell dynamically adds more parameters based on the type of resource you intend to provision.</p>



<figure class="wp-block-image size-large"><img src="https://www.powershellmagazine.com/wp-content/uploads/2021/04/psarmresourceparam-1024x281.png" alt="" class="wp-image-13619" srcset="https://www.powershellmagazine.com/wp-content/uploads/2021/04/psarmresourceparam-1024x281.png 1024w, https://www.powershellmagazine.com/wp-content/uploads/2021/04/psarmresourceparam-300x82.png 300w, https://www.powershellmagazine.com/wp-content/uploads/2021/04/psarmresourceparam-768x211.png 768w" sizes="(max-width: 1024px) 100vw, 1024px" /></figure>



<p>For example, as you can see in the above screenshot, as soon as I added the <code>Type</code> parameter and its argument, I got <code>Kind</code> and <code>Tags</code> as dynamic parameters. You can then use the <code>ArmSku</code> keyword to specify the SKU of the resource that you intend to provision. In the case of a storage account, this can be set to Standard_LRS or any other supported value. Each Azure resource may need additional properties for provisioning and configuration, and you can use the <code>properties</code> keyword for this purpose. PSArm gives you auto-completion of property names within the <code>properties</code> block.</p>



<figure class="wp-block-image size-large"><img src="https://www.powershellmagazine.com/wp-content/uploads/2021/04/psaramresourceprop-1024x507.png" alt="" class="wp-image-13620" srcset="https://www.powershellmagazine.com/wp-content/uploads/2021/04/psaramresourceprop-1024x507.png 1024w, https://www.powershellmagazine.com/wp-content/uploads/2021/04/psaramresourceprop-300x149.png 300w, https://www.powershellmagazine.com/wp-content/uploads/2021/04/psaramresourceprop-768x380.png 768w" sizes="(max-width: 1024px) 100vw, 1024px" /></figure>



<p>Finally, the <code>output</code> keyword can be used to retrieve properties required from the deployed resource objects. This keyword takes <code>Name</code>, <code>Type</code>, and <code>Value</code> as parameters.</p>



<h3>First PSArm Script</h3>



<p>Here is a complete PSArm script for provisioning a simple storage account.</p>



<pre class="crayon-plain-tag">Arm -Name myFirstTemplate -Body {
     Resource 'mypsarmsaccount' -Namespace 'Microsoft.Storage' -Type 'storageAccounts' -apiVersion '2019-06-01' -Kind 'StorageV2' -Location 'WestUS' {
         ArmSku 'Standard_LRS'
         Properties {
             accessTier 'Hot'
         }
     }
Output 'storageResourceId' -Type 'string' -Value (ResourceId 'Microsoft.Storage/storageAccounts' (Concat 'myPsArmSaccount'))
 }</pre>



<p>Make a note of the <code>ResourceId</code> and <code>Concat</code> functions used along with the <code>output</code> keyword; PSArm has function parity with what the ARM template language offers. You can save this script with a <code>.psarm.ps1</code> extension and generate the ARM template JSON using the <code>Publish-PSArmTemplate</code> cmdlet that the PSArm module provides.</p>



<pre class="crayon-plain-tag">Publish-PSArmTemplate -Path .\firstTemplate.psarm.ps1 `
                       -OutFile .\firstTemplate.json -Force</pre>



<p>Here is the generated ARM template JSON for the PSArm script that you just built.</p>



<pre class="crayon-plain-tag">{
   &quot;$schema&quot;: &quot;https://schema.management.azure.com/schemas/2019-04-01/deploymentTemplate.json#&quot;,
   &quot;contentVersion&quot;: &quot;1.0.0.0&quot;,
   &quot;metadata&quot;: {
     &quot;_generator&quot;: {
       &quot;name&quot;: &quot;psarm&quot;,
       &quot;version&quot;: &quot;0.1.0.0&quot;,
       &quot;psarm-psversion&quot;: &quot;5.1.19041.610&quot;,
       &quot;templateHash&quot;: &quot;5716551932369025750&quot;
     }
   },
   &quot;resources&quot;: [
     {
       &quot;name&quot;: &quot;myFirstTemplate&quot;,
       &quot;type&quot;: &quot;Microsoft.Resources/deployments&quot;,
       &quot;apiVersion&quot;: &quot;2019-10-01&quot;,
       &quot;properties&quot;: {
         &quot;mode&quot;: &quot;Incremental&quot;,
         &quot;expressionEvaluationOptions&quot;: {
           &quot;scope&quot;: &quot;inner&quot;
         },
         &quot;template&quot;: {
           &quot;$schema&quot;: &quot;https://schema.management.azure.com/schemas/2019-04-01/deploymentTemplate.json#&quot;,
           &quot;contentVersion&quot;: &quot;1.0.0.0&quot;,
           &quot;resources&quot;: [
             {
               &quot;name&quot;: &quot;mypsarmsaccount&quot;,
               &quot;apiVersion&quot;: &quot;2019-06-01&quot;,
               &quot;type&quot;: &quot;Microsoft.Storage/storageAccounts&quot;,
               &quot;kind&quot;: &quot;StorageV2&quot;,
               &quot;location&quot;: &quot;WestUS&quot;,
               &quot;sku&quot;: {
                 &quot;name&quot;: &quot;Standard_LRS&quot;
               },
               &quot;properties&quot;: {
                 &quot;accessTier&quot;: &quot;Hot&quot;
               }
             }
           ],
           &quot;outputs&quot;: {
             &quot;storageResourceId&quot;: {
               &quot;type&quot;: &quot;string&quot;,
               &quot;value&quot;: &quot;[resourceId('Microsoft.Storage/storageAccounts', concat('myPsArmSaccount'))]&quot;
             }
           }
         }
       }
     }
   ]
 }</pre>



<p>This ARM template can be deployed using your favorite command-line tool or using a template deployment in the Azure Portal.</p>



<p>When using Azure CLI,</p>



<pre class="crayon-plain-tag">az deployment group create --resource-group psarm --template-file .\firstTemplate.json</pre>



<p>When using Azure PowerShell,</p>



<pre class="crayon-plain-tag">New-AzResourceGroupDeployment -ResourceGroupName psarm -TemplateFile .\firstTemplate.json</pre>



<p>That is it. Congratulations! You just used a PowerShell-based DSL to generate and deploy an ARM template. In the next part of this series, you will learn more about parameterizing PSArm scripts.</p>
]]></content:encoded>
							<wfw:commentRss>https://www.powershellmagazine.com/2021/04/05/getting-started-with-psarm/feed/</wfw:commentRss>
		<slash:comments>0</slash:comments>
							</item>
		<item>
		<title>PSArm &#8211; PowerShell DSL For Azure Resource Manager &#8211; Introduction #PowerShell #AzureResourceManager</title>
		<link>https://www.powershellmagazine.com/2021/04/01/psarm-powershell-dsl-for-azure-resource-manager-introduction-powershell-azureresourcemanager/</link>
				<comments>https://www.powershellmagazine.com/2021/04/01/psarm-powershell-dsl-for-azure-resource-manager-introduction-powershell-azureresourcemanager/#respond</comments>
				<pubDate>Thu, 01 Apr 2021 10:10:27 +0000</pubDate>
		<dc:creator><![CDATA[Ravikanth C]]></dc:creator>
				<category><![CDATA[Azure]]></category>
		<category><![CDATA[Azure Resource Manager]]></category>
		<category><![CDATA[Module Spotlight]]></category>
		<category><![CDATA[PSArm]]></category>
		<category><![CDATA[modules]]></category>

		<guid isPermaLink="false">http://www.powershellmagazine.com/?p=13595</guid>
				<description><![CDATA[Those who worked on Azure Resource Manager (ARM) templates understand that complexity of writing and making sure that the template... ]]></description>
								<content:encoded><![CDATA[
<p>Anyone who has worked on Azure Resource Manager (ARM) templates knows that writing a template, and making sure it works as expected, gets harder as the complexity of a deployment grows. The ARM template language uses the JSON data representation format and, to be honest, is not really a language; it is fragments of functions and other programming constructs embedded into JSON that only Azure Resource Manager understands. There are tools, such as the Visual Studio Code extension, that attempt to simplify the ARM template authoring experience. But at the end of the day, when you have to debug an issue in a complex ARM template, it won&#8217;t be easy. There are other tools that aim to simplify provisioning Azure resources.</p>



<p><a href="https://www.terraform.io/">HashiCorp Terraform</a> is one of the most popular provisioning tools that can provision Azure resources. <a href="https://www.pulumi.com/">Pulumi</a> lets you define your Azure infrastructure in a supported general-purpose programming language such as Go, Python, C#, or F#. Both Terraform and Pulumi support multiple cloud providers. You may also have recently seen or read about <a href="https://docs.microsoft.com/en-us/azure/azure-resource-manager/templates/bicep-overview">Project Bicep</a>, a domain-specific language meant for generating ARM templates. The most recent entrant in this race is what the PowerShell product team announced: the <a href="https://devblogs.microsoft.com/powershell/announcing-the-preview-of-psarm/">PSArm module</a> (preview), which provides a PowerShell-embedded domain-specific language (DSL) for Azure Resource Manager (ARM) templates. This is an <a href="https://github.com/powershell/psarm">experimental module</a>, more of a proof-of-concept for creating a DSL within PowerShell, and certainly not meant for production use yet. It lets you leverage your existing PowerShell scripting knowledge to author ARM templates: you write the template as a PowerShell script using <a href="https://www.powershellgallery.com/packages/PSArm">PSArm</a> and then generate the ARM template JSON from it.</p>



<p>A comparison between Project Bicep and PSArm naturally arises, as both projects come from Microsoft.</p>



<h3>Project Bicep</h3>



<p>Bicep is a domain-specific language (DSL) designed for transpiling Bicep code into ARM templates; generating ARM templates is its only purpose. Bicep is not a general-purpose programming language. At the time of this writing, Bicep is in the very early stages of development, but it is supported by Microsoft for production use. Think of Bicep as a transparent abstraction over Azure Resource Manager. For someone who is getting started with Azure Resource Manager and Azure deployments, Bicep is a great start: it is simple and easy to learn. Bicep can even decompile (on a best-effort basis) your existing ARM templates to Bicep. Bicep seems to have taken some inspiration from how Terraform uses the HashiCorp Configuration Language (HCL), and it already has an excellent VS Code extension that makes authoring Bicep files easy.</p>



<p>Here is what a Bicep file looks like. This one creates a simple storage account.</p>



<pre class="crayon-plain-tag">param storageAccountName string

@allowed([
  'Hot',
  'Cool',
  'Archive'
])
param accessTier string = 'Hot'

@allowed([
  'WestUS2',
  'CentralUS'
])
param location string = 'WestUS2'

resource sa 'Microsoft.Storage/storageAccounts@2019-06-01' = {
  name: storageAccountName
  location: location
  sku: {
    name: 'Standard_LRS'
  }
  kind: 'StorageV2'
  properties: {
    accessTier: accessTier
  }
}</pre>



<p>You can build an ARM template from this Bicep file using the Bicep command-line tool.</p>



<pre class="crayon-plain-tag">bicep build main.bicep</pre>






<p>But since Bicep is a language meant only for ARM template generation, there may not be a way to interact with other parts of the system. For example, what if you had configuration artifacts stored in a CMDB and wanted to pull that data and use it in a Bicep program? That is not possible, at least for now. You are restricted to the language constructs available within Bicep.</p>



<h3>PSArm</h3>



<p>PSArm, on the other hand, is an embedded DSL module. If you are already familiar with the PowerShell language, it is quite natural to choose a DSL that works in PowerShell: there is no need to learn another language. With PowerShell, you get access to a wide range of APIs beyond what PSArm itself provides, which helps in creating complex and dynamic ARM templates. You get to use all aspects of the PowerShell language and the underlying infrastructure.</p>



<p>Here is what a PSArm script looks like. Again, this creates a simple storage account.</p>



<pre class="crayon-plain-tag">param(
    [Parameter(Mandatory)]
    [string]
    $StorageAccountName,

    [Parameter()]
    [ValidateSet('WestUS2', 'CentralUS')]
    [string]
    $Location = 'WestUS2',

    [Parameter()]
    [ValidateSet('Hot', 'Cool', 'Archive')]
    [string]
    $AccessTier = 'Hot'
)

Arm {
    Resource $StorageAccountName -Namespace 'Microsoft.Storage' -Type 'storageAccounts' -ApiVersion '2019-06-01' -Location $Location {
        ArmSku 'Standard_LRS'
        Properties {
            accessTier $AccessTier
        }
    }
}</pre>



<p>Here is how you can compile the above PSArm file into an ARM template.</p>



<pre class="crayon-plain-tag">Publish-PSArmTemplate -Path .\newStorageAccount.psarm.ps1 -Parameters @{
    storageAccountName = 'storageName'
    location = 'location'
}</pre>
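

<p>Because the body of a PSArm script is ordinary PowerShell, you can also reach out to any PowerShell API while building a template. As a purely hypothetical sketch (the <code>accounts.csv</code> file and its columns are made up for illustration; only the PSArm keywords shown earlier are used), here is a loop that emits one storage account resource per row of a CSV file, the kind of dynamic template generation that Bicep alone cannot do:</p>



<pre class="crayon-plain-tag"># Pull data from outside the template; any PowerShell API works here.
# accounts.csv is a hypothetical file with Name, Location, and AccessTier columns.
$accounts = Import-Csv -Path .\accounts.csv

Arm {
    # One Resource block is emitted per CSV row
    foreach ($account in $accounts) {
        Resource $account.Name -Namespace 'Microsoft.Storage' -Type 'storageAccounts' -ApiVersion '2019-06-01' -Location $account.Location {
            ArmSku 'Standard_LRS'
            Properties {
                accessTier $account.AccessTier
            }
        }
    }
}</pre>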



<p>You can see that both Bicep and PSArm files more or less have the same number of lines. PowerShell is a little more verbose given the nature of how PowerShell commands are built. </p>



<p>At this point in time, I don&#8217;t see this as Bicep vs. PSArm; it is about whether you want to learn a new language or use your existing skills. This was just a quick introduction to PSArm. Next in this <a href="/tags/psarm">series of articles on PSArm</a>, you will learn more about using PSArm.</p>
]]></content:encoded>
							<wfw:commentRss>https://www.powershellmagazine.com/2021/04/01/psarm-powershell-dsl-for-azure-resource-manager-introduction-powershell-azureresourcemanager/feed/</wfw:commentRss>
		<slash:comments>0</slash:comments>
							</item>
		<item>
		<title>PowerShell wrapper around the Calendarific API</title>
		<link>https://www.powershellmagazine.com/2020/03/23/powershell-wrapper-around-the-calendarific-api/</link>
				<comments>https://www.powershellmagazine.com/2020/03/23/powershell-wrapper-around-the-calendarific-api/#respond</comments>
				<pubDate>Mon, 23 Mar 2020 14:54:02 +0000</pubDate>
		<dc:creator><![CDATA[Ravikanth C]]></dc:creator>
				<category><![CDATA[Articles]]></category>
		<category><![CDATA[Online Only]]></category>
		<category><![CDATA[Calendarific]]></category>

		<guid isPermaLink="false">http://www.powershellmagazine.com/?p=13552</guid>
				<description><![CDATA[Calendarific offers a webservice for listing holidays from different countries. They also have an API that can be used by developers... ]]></description>
								<content:encoded><![CDATA[
<p><a href="https://calendarific.com/">Calendarific</a> offers a web service for listing holidays from different countries. They also have an API that developers can use to implement holiday queries in their own applications.</p>



<p>Accessing this API requires an API key which can be registered for free at&nbsp;<a href="https://calendarific.com/signup">https://calendarific.com/signup</a>.</p>



<p>The PSCalendarific module is a wrapper around the Calendarific API. You can install the module from the&nbsp;<a href="https://www.powershellgallery.com/packages/PSCalendarific/1.0.0.0">PowerShell Gallery</a>.</p>



<pre class="crayon-plain-tag">Install-Module -Name PSCalendarific -Force</pre>



<p>The following commands are available in this module.</p>



<pre class="crayon-plain-tag">Get-Command -Module PSCalendarific

CommandType     Name                                               Version    Source
-----------     ----                                               -------    ------
Function        Get-PSCalendarificCountry                          1.0.0.0    PSCalendarific
Function        Get-PSCalendarificDefaultConfiguration             1.0.0.0    PSCalendarific
Function        Get-PSCalendarificHoliday                          1.0.0.0    PSCalendarific
Function        Register-PSCalendarificDefaultConfiguration        1.0.0.0    PSCalendarific
Function        Unregister-PSCalendarificDefaultConfiguration      1.0.0.0    PSCalendarific</pre>



<h3>Register-PSCalendarificDefaultConfiguration</h3>



<p>This command helps set the parameter defaults for accessing the API. At present, it supports storing only the APIKey and Country values as the default configuration. An API key must always be provided, and&nbsp;<code>Get-PSCalendarificHoliday</code>&nbsp;requires a country name as well for listing holidays. Storing these settings locally lets the other commands in the module be used without explicitly providing either parameter.</p>



<blockquote class="wp-block-quote"><p>Note: Storing an API key on the local filesystem is not a good practice.</p></blockquote>



<pre class="crayon-plain-tag">Register-PSCalendarificDefaultConfiguration -APIKey '1562567d51443af046079a9bca8a84a358e2c393' -Country IN -Verbose
WARNING: This command stores specified parameters in a local file. API Key is sensitive information. If you do not 
prefer this, use Unregister-PSCalendarificDefaultConfiguration -APIKey to remove the API key from the store.</pre>



<p>This command has no mandatory parameters; you can specify either APIKey or Country, or both. When you need to update either configuration parameter, just specify the one you want to update.</p>



<h3><a href="https://github.com/rchaganti/PSCalendarific#unregister-pscalendarificdefaultconfiguration"></a>Unregister-PSCalendarificDefaultConfiguration</h3>



<p>This command removes the stored API key or deletes the configuration store altogether.</p>



<pre class="crayon-plain-tag">Unregister-PSCalendarificDefaultConfiguration -APIKey -Verbose</pre>



<p>If you do not specify any parameters, the configuration store gets deleted.</p>



<h3><a href="https://github.com/rchaganti/PSCalendarific#get-pscalendarificdefaultconfiguration"></a>Get-PSCalendarificDefaultConfiguration</h3>



<p>This command gets the stored defaults from the configuration store.</p>



<pre class="crayon-plain-tag">Get-PSCalendarificDefaultConfiguration -Verbose
APIKey                                   Country
------                                   -------
1562567d51443af046079a9bca8a84a358e2c393 IN</pre>



<h3><a href="https://github.com/rchaganti/PSCalendarific#get-pscalendarificholiday"></a>Get-PSCalendarificHoliday</h3>



<p>This command lists all holidays for a given country based on the parameters supplied. If you do not provide any parameters, this command tries to find and use the default parameter values from the configuration store.</p>



<pre class="crayon-plain-tag">Get-PSCalendarificHoliday</pre>



<p>The following are different parameters supported with this command.</p>



<figure class="wp-block-table"><table class=""><thead><tr><th>Parameter Name</th><th>Description</th><th>Default Value</th></tr></thead><tbody><tr><td>APIKey</td><td>Key to access the API</td><td>No default value. When not specified, the command tries to use the default from the configuration store.</td></tr><tr><td>Country</td><td>Country for which holidays are listed</td><td>No default value. When not specified, the command tries to use the default from the configuration store.</td></tr><tr><td>Year</td><td>Year for which the holidays need to be listed</td><td>Defaults internally to the current year.</td></tr><tr><td>Month</td><td>Month for which the holidays need to be listed</td><td>No default value; valid values are 1 through 12.</td></tr><tr><td>Day</td><td>Day for which the holidays need to be listed</td><td>No default value; valid values are 1 through 31.</td></tr><tr><td>Type</td><td>Type of holidays</td><td>No default value; valid values are national, local, religious, and observance.</td></tr></tbody></table></figure>
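<p>Putting the parameters from the table together, a filtered query might look like the following. This combination is illustrative; when APIKey and Country are omitted, the command falls back to the registered defaults as described above.</p>



<pre class="crayon-plain-tag"># List national holidays in December of the current year
Get-PSCalendarificHoliday -Year (Get-Date).Year -Month 12 -Type national</pre>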



<p>This module is available on GitHub as well in case <a href="https://github.com/rchaganti/PSCalendarific" target="_blank" rel="noreferrer noopener" aria-label="you want to contribute or create issues (opens in a new tab)">you want to contribute or create issues</a>.</p>
]]></content:encoded>
							<wfw:commentRss>https://www.powershellmagazine.com/2020/03/23/powershell-wrapper-around-the-calendarific-api/feed/</wfw:commentRss>
		<slash:comments>0</slash:comments>
							</item>
		<item>
		<title>PowerShell and DevOps Conference Asia 2020</title>
		<link>https://www.powershellmagazine.com/2020/03/12/powershell-and-devops-conference-asia-2020/</link>
				<comments>https://www.powershellmagazine.com/2020/03/12/powershell-and-devops-conference-asia-2020/#respond</comments>
				<pubDate>Thu, 12 Mar 2020 10:43:51 +0000</pubDate>
		<dc:creator><![CDATA[Ravikanth C]]></dc:creator>
				<category><![CDATA[Articles]]></category>
		<category><![CDATA[Community]]></category>
		<category><![CDATA[PSConf]]></category>
		<category><![CDATA[PSConfAsia]]></category>

		<guid isPermaLink="false">http://www.powershellmagazine.com/?p=13539</guid>
				<description><![CDATA[PowerShell Conference Asia 2019 was held in Bangalore (India). It was such a great event and fun hosting it here.... ]]></description>
								<content:encoded><![CDATA[
<figure class="wp-block-image size-large"><img src="https://www.powershellmagazine.com/wp-content/uploads/2020/03/1-1-1-1024x791.png" alt="" class="wp-image-13547" srcset="https://www.powershellmagazine.com/wp-content/uploads/2020/03/1-1-1-1024x791.png 1024w, https://www.powershellmagazine.com/wp-content/uploads/2020/03/1-1-1-300x232.png 300w, https://www.powershellmagazine.com/wp-content/uploads/2020/03/1-1-1-768x593.png 768w" sizes="(max-width: 1024px) 100vw, 1024px" /></figure>



<p><a rel="noreferrer noopener" aria-label="PowerShell Conference Asia 2019 (opens in a new tab)" href="http://psconference2017.azurewebsites.net/" target="_blank">PowerShell Conference Asia 2019</a> was held in Bangalore, India. It was such a great event, and hosting it there was a lot of fun. For the first time in the history of PowerShell Conference Asia, we had 220+ PowerShell lovers at the conference. The pre-conference workshops were very well received, and the remaining two days of the conference were equally fun. With 15+ international speakers, the attendees had a great time learning about their favorite features and getting their doubts cleared. </p>



<figure class="wp-block-image size-large"><img src="https://www.powershellmagazine.com/wp-content/uploads/2020/03/2-1-1024x791.png" alt="" class="wp-image-13548" srcset="https://www.powershellmagazine.com/wp-content/uploads/2020/03/2-1-1024x791.png 1024w, https://www.powershellmagazine.com/wp-content/uploads/2020/03/2-1-300x232.png 300w, https://www.powershellmagazine.com/wp-content/uploads/2020/03/2-1-768x593.png 768w, https://www.powershellmagazine.com/wp-content/uploads/2020/03/2-1.png 1600w" sizes="(max-width: 1024px) 100vw, 1024px" /></figure>



<p>We understand that the PowerShell community in India is really looking forward to the 2020 event, and there is already plenty of buzz about it in the local communities. We look forward to hosting even more PowerShell lovers than the previous year, and I am sure we are heading in that direction.</p>



<p>Today, we are delighted to announce PowerShell and DevOps Conference Asia 2020. We will be hosting it from November 5th to 7th in Bangalore again.</p>



<figure class="wp-block-image size-large"><img src="https://www.powershellmagazine.com/wp-content/uploads/2020/03/banner4-1024x602.png" alt="" class="wp-image-13544" srcset="https://www.powershellmagazine.com/wp-content/uploads/2020/03/banner4-1024x602.png 1024w, https://www.powershellmagazine.com/wp-content/uploads/2020/03/banner4-300x176.png 300w, https://www.powershellmagazine.com/wp-content/uploads/2020/03/banner4-768x452.png 768w" sizes="(max-width: 1024px) 100vw, 1024px" /></figure>



<p>We are still working out the details of the venue and will announce it as soon as it is finalized. We will ensure that this year&#8217;s conference is much bigger and better than the previous year&#8217;s. </p>



<p>If you are interested in submitting sessions, please wait for the call for papers announcement next week. </p>



<p>Stay tuned for more information!</p>



<p></p>
]]></content:encoded>
							<wfw:commentRss>https://www.powershellmagazine.com/2020/03/12/powershell-and-devops-conference-asia-2020/feed/</wfw:commentRss>
		<slash:comments>0</slash:comments>
							</item>
		<item>
		<title>The Case of Unknown Custom Object Property</title>
		<link>https://www.powershellmagazine.com/2020/02/10/the-case-of-unknown-custom-object-property/</link>
				<comments>https://www.powershellmagazine.com/2020/02/10/the-case-of-unknown-custom-object-property/#respond</comments>
				<pubDate>Mon, 10 Feb 2020 15:30:16 +0000</pubDate>
		<dc:creator><![CDATA[Ravikanth C]]></dc:creator>
				<category><![CDATA[Articles]]></category>
		<category><![CDATA[Online Only]]></category>
		<category><![CDATA[Debugging]]></category>
		<category><![CDATA[PowerShell]]></category>

		<guid isPermaLink="false">http://www.powershellmagazine.com/?p=13513</guid>
				<description><![CDATA[In the last two or three months, I have been busy working on a complete end to end deployment automation... ]]></description>
								<content:encoded><![CDATA[
<p>In the last two or three months, I have been busy working on a complete end-to-end deployment automation solution and writing a lot of PowerShell code. Literally, a lot of code. A simple-to-use deployment automation solution always has a complex implementation beneath it. I was dealing with several configuration files, dynamically deciding what configuration to fetch and update, and handling hundreds of lines of dynamically generated JSON for progress tracking, and so on. </p>



<p>While working with one such JSON (shown below), I was seeing a strange error when trying to update a specific property within the JSON.</p>



<pre class="crayon-plain-tag">{
     &quot;id&quot; : &quot;6894cb4e-907c-43d0-b79d-c4fb8ef422eb&quot;,
     &quot;description&quot; : &quot;Just another manifest for deployment&quot;,
     &quot;version&quot; : &quot;1.0.0.0&quot;,
     &quot;systems&quot; : [
         {
             &quot;serialNumber&quot; : &quot;123abc&quot;,
             &quot;ipAddress&quot; : &quot;8.9.10.11&quot;,
             &quot;status&quot; : &quot;pending&quot;
         },
         {
             &quot;serialNumber&quot; : &quot;456def&quot;,
             &quot;ipAddress&quot; : &quot;8.9.10.12&quot;,
             &quot;status&quot; : &quot;pending&quot;
         },
         {
             &quot;serialNumber&quot; : &quot;789ghi&quot;,
             &quot;ipAddress&quot; : &quot;8.9.10.13&quot;,
             &quot;status&quot; : &quot;pending&quot;
         }
     ]
 }</pre>



<p>This is a very minimal and simplified version of the JSON that I have in the automation framework. In this JSON, based on the status of the deployment, I need to update the status property of each system in the manifest. </p>



<p>Here is what I tried.</p>



<figure class="wp-block-image size-large"><img src="https://www.powershellmagazine.com/wp-content/uploads/2020/02/1-1024x334.png" alt="" class="wp-image-13518" srcset="https://www.powershellmagazine.com/wp-content/uploads/2020/02/1-1024x334.png 1024w, https://www.powershellmagazine.com/wp-content/uploads/2020/02/1-300x98.png 300w, https://www.powershellmagazine.com/wp-content/uploads/2020/02/1-768x250.png 768w, https://www.powershellmagazine.com/wp-content/uploads/2020/02/1.png 1141w" sizes="(max-width: 1024px) 100vw, 1024px" /></figure>



<p>Now, that error is pretty strange. To investigate this, I looked at the type of object that was getting returned from the Where() method.</p>



<figure class="wp-block-image size-large is-resized"><img src="https://www.powershellmagazine.com/wp-content/uploads/2020/02/2.png" alt="" class="wp-image-13519" width="404" height="69" srcset="https://www.powershellmagazine.com/wp-content/uploads/2020/02/2.png 819w, https://www.powershellmagazine.com/wp-content/uploads/2020/02/2-300x52.png 300w, https://www.powershellmagazine.com/wp-content/uploads/2020/02/2-768x133.png 768w" sizes="(max-width: 404px) 100vw, 404px" /></figure>
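<p>In case the screenshots are hard to read, the check boiled down to something like this, assuming the manifest shown above is saved as manifest.json:</p>



<pre class="crayon-plain-tag"># Load the manifest and inspect what Where() actually returns
$manifest = Get-Content -Path .\manifest.json -Raw | ConvertFrom-Json
$manifest.systems.Where({ $_.serialNumber -eq '123abc' }).GetType().FullName
# Where() returns a collection, not a single PSCustomObject,
# which is why assigning to a property on the result fails</pre>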



<p><strong>Update (2/11):</strong> Prasoon commented on this post and mentioned that we can use the Item() method on this collection to update the status property. Here is how we do it based on his suggestion:</p>



<pre class="crayon-plain-tag">$manifest.systems.Where({$_.serialNumber -eq '123abc'}).item(0).Status = 'Complete'</pre>



<hr class="wp-block-separator is-style-wide"/>



<p>What you see below is my investigation before Prasoon commented on this post!</p>



<p>It should ideally be a PSCustomObject, so the Where() method is clearly doing something to the custom object. I then tried using the index of an object within the systems collection.</p>



<figure class="wp-block-image size-large is-resized"><img src="https://www.powershellmagazine.com/wp-content/uploads/2020/02/3.png" alt="" class="wp-image-13520" width="404" height="247" srcset="https://www.powershellmagazine.com/wp-content/uploads/2020/02/3.png 808w, https://www.powershellmagazine.com/wp-content/uploads/2020/02/3-300x184.png 300w, https://www.powershellmagazine.com/wp-content/uploads/2020/02/3-768x470.png 768w" sizes="(max-width: 404px) 100vw, 404px" /></figure>



<p>This is good. So, I can work around the Where() method issue by using the index, but the catch here is that I need to dynamically determine the index of a specific object instance during orchestration. I tried a couple of methods.</p>



<figure class="wp-block-image size-large is-resized"><img src="https://www.powershellmagazine.com/wp-content/uploads/2020/02/4.png" alt="" class="wp-image-13521" width="462" height="56" srcset="https://www.powershellmagazine.com/wp-content/uploads/2020/02/4.png 880w, https://www.powershellmagazine.com/wp-content/uploads/2020/02/4-300x37.png 300w, https://www.powershellmagazine.com/wp-content/uploads/2020/02/4-768x95.png 768w" sizes="(max-width: 462px) 100vw, 462px" /></figure>



<p>I was skeptical about the above method of using Where() again, and it does fail: the index of the object instance returned using this method is always -1. </p>



<p>In the second method, I resorted to using a simple for loop to gather the index of the node.</p>



<pre class="crayon-plain-tag">$serialNumber = '456def'
for ($currentIndex = 0; $currentIndex -lt $manifest.systems.Count; $currentIndex++)
 {
     if ($manifest.systems[$currentIndex].serialNumber -eq $serialNumber)
     {
         break
     }
     else
     {
         continue
     }
 }
 $manifest.systems[$currentIndex].status = 'complete'</pre>



<p>The above snippet does not look optimal to me, but it works as expected.</p>
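<p>A slightly more compact way to get the same index, sketched here as a suggestion rather than a tested part of the framework, is to let .NET search the array:</p>



<pre class="crayon-plain-tag"># Find the index of the matching system using the static [array]::FindIndex method
$serialNumber = '456def'
$index = [array]::FindIndex(
    [object[]]$manifest.systems,
    [Predicate[object]]{ param($system) $system.serialNumber -eq $serialNumber }
)
# FindIndex returns -1 when no element matches the predicate
if ($index -ge 0) { $manifest.systems[$index].status = 'complete' }</pre>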



<figure class="wp-block-image size-large is-resized"><img src="https://www.powershellmagazine.com/wp-content/uploads/2020/02/5.png" alt="" class="wp-image-13524" width="247" height="120" srcset="https://www.powershellmagazine.com/wp-content/uploads/2020/02/5.png 372w, https://www.powershellmagazine.com/wp-content/uploads/2020/02/5-300x146.png 300w" sizes="(max-width: 247px) 100vw, 247px" /></figure>



<p>I have not figured out a more optimal way of handling this, but have you come across something like this? Do you see a better way to handle it?</p>
]]></content:encoded>
							<wfw:commentRss>https://www.powershellmagazine.com/2020/02/10/the-case-of-unknown-custom-object-property/feed/</wfw:commentRss>
		<slash:comments>0</slash:comments>
							</item>
		<item>
		<title>Secrets Management in PowerShell</title>
		<link>https://www.powershellmagazine.com/2020/02/07/secrets-management-in-powershell/</link>
				<comments>https://www.powershellmagazine.com/2020/02/07/secrets-management-in-powershell/#respond</comments>
				<pubDate>Fri, 07 Feb 2020 05:49:39 +0000</pubDate>
		<dc:creator><![CDATA[Ravikanth C]]></dc:creator>
				<category><![CDATA[Articles]]></category>
		<category><![CDATA[Online Only]]></category>
		<category><![CDATA[PowerShell]]></category>
		<category><![CDATA[secrets]]></category>

		<guid isPermaLink="false">http://www.powershellmagazine.com/?p=13503</guid>
				<description><![CDATA[In any method of automation secrets management is a very critical part. You wouldn&#8217;t want to store the plain-text credentials... ]]></description>
								<content:encoded><![CDATA[
<p>In any kind of automation, secrets management is a critical part. You wouldn&#8217;t want to store in plain text the credentials your automation needs to carry out its orchestration tasks. Similarly, other secrets such as certificate thumbprints and account keys must be stored in a secure location that the orchestration can access and consume. </p>



<p>Within PowerShell, we have always used the built-in credential manager to store such secrets. There are a bunch of modules out there in the <a href="https://www.powershellgallery.com/">PowerShell Gallery</a> that you can readily use.</p>



<pre class="crayon-plain-tag">Find-Module -Name &quot;*Credential*&quot;, &quot;*Secret*&quot;</pre>



<p>There are also modules that are wrappers around 3rd party vaults such as <a href="https://www.vaultproject.io/">Hashicorp Vault</a> or <a rel="noreferrer noopener" aria-label="SecureStore (opens in a new tab)" href="https://github.com/neosmart/SecureStore" target="_blank">SecureStore</a>. However, there was nothing that was officially supported by Microsoft (Azure Vault doesn&#8217;t count for secrets management in PowerShell) or PowerShell team until now.</p>



<p>At Ignite 2019, PowerShell team introduced <a rel="noreferrer noopener" aria-label="secrets management in PowerShell (opens in a new tab)" href="https://myignite.techcommunity.microsoft.com/sessions/83981?source=sessions" target="_blank">secrets management in PowerShell</a>. Today, <a rel="noreferrer noopener" aria-label="they have announced a development release version (opens in a new tab)" href="https://devblogs.microsoft.com/powershell/secrets-management-development-release/" target="_blank">PowerShell team announced a development release version</a> of a module for PowerShell secrets management. </p>



<pre class="crayon-plain-tag">Install-Module -Name Microsoft.PowerShell.SecretsManagement -AllowPrerelease</pre>



<figure class="wp-block-image size-large"><img src="https://www.powershellmagazine.com/wp-content/uploads/2020/02/commands-1024x181.png" alt="" class="wp-image-13507" srcset="https://www.powershellmagazine.com/wp-content/uploads/2020/02/commands-1024x181.png 1024w, https://www.powershellmagazine.com/wp-content/uploads/2020/02/commands-300x53.png 300w, https://www.powershellmagazine.com/wp-content/uploads/2020/02/commands-768x136.png 768w, https://www.powershellmagazine.com/wp-content/uploads/2020/02/commands.png 1210w" sizes="(max-width: 1024px) 100vw, 1024px" /></figure>



<p>This module uses the built-in credential manager for secrets management and provides the above commands for that purpose. The current design of the module allows extensibility, as per the PowerShell team blog post. Therefore, you should be able to add support for another vault by registering a PowerShell module written for the third-party vault, provided it adheres to the format required by the SecretsManagement module.</p>
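<p>A quick experiment with the built-in local vault might look like the following. The command names here are from the development release at the time of writing and may well change before general availability, so treat this as a sketch rather than a reference:</p>



<pre class="crayon-plain-tag"># Store a secret in the built-in local vault, read it back, and clean up
Add-Secret -Name DeployApiKey -Secret 'not-a-real-key'
Get-Secret -Name DeployApiKey
Get-SecretInfo
Remove-Secret -Name DeployApiKey</pre>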



<p>I have been using some existing modules for secrets management in my build and deployment automation. I mostly use the built-in Credential Manager for this purpose. In fact, I demonstrated how I use it in the <a rel="noreferrer noopener" aria-label="Garuda framework (opens in a new tab)" href="http://github.com/rchaganti/garuda" target="_blank">Garuda framework</a>. With the development release of this new module from the PowerShell team, I will start looking at moving my existing automation to it. The one advantage I see here is the module&#8217;s extensibility, which provides enough flexibility when moving from one type of vault to another or introducing a new one when necessary. I am looking forward to seeing what the community comes up with here.</p>
]]></content:encoded>
							<wfw:commentRss>https://www.powershellmagazine.com/2020/02/07/secrets-management-in-powershell/feed/</wfw:commentRss>
		<slash:comments>0</slash:comments>
							</item>
		<item>
		<title>PowerShell Conference Europe 2020</title>
		<link>https://www.powershellmagazine.com/2020/01/30/powershell-conference-europe-2020/</link>
				<comments>https://www.powershellmagazine.com/2020/01/30/powershell-conference-europe-2020/#respond</comments>
				<pubDate>Thu, 30 Jan 2020 19:53:00 +0000</pubDate>
		<dc:creator><![CDATA[Tobias Weltner]]></dc:creator>
				<category><![CDATA[Articles]]></category>
		<category><![CDATA[Community]]></category>

		<guid isPermaLink="false">http://www.powershellmagazine.com/?p=13480</guid>
				<description><![CDATA[June 2, 2020, PowerShell Conference Europe opens for the 5th time in Hannover, Germany. Here is a quick walkthrough for... ]]></description>
								<content:encoded><![CDATA[
<p>June 2, 2020, PowerShell Conference Europe opens for the 5th time in Hannover, Germany. Here is a quick walkthrough for this year’s event. </p>



<h2><strong>Learning New Things</strong></h2>



<p>As an
experienced <strong>PowerShell</strong>
professional, you know its awesome automation capabilities. </p>



<p>At the same time, you probably spent numerous hours googling for tricky answers, came across astonishing tricks and capabilities that you didn&#8217;t know existed, and may still have questions that none of your colleagues could fully answer. There is just so much you can do with <strong>PowerShell</strong>, and almost every month something new is added somewhere in the ecosystem. </p>



<p>You may
have the best colleagues or the most experienced trainers, yet there is <em>no
one in the world who knows it all</em>, let alone knows it best. </p>



<p>Just take a look at the tip of the knowledge iceberg, and check out these three lines:</p>



<pre class="crayon-plain-tag">Install-Module -Name ImportExcel -Scope CurrentUser -Force

$Path = &quot;$env:temp\report.xlsx&quot;

Get-Service | Export-Excel -Path $Path -Now</pre>



<p></p>



<p>In
virtually no time, it turns any <strong>PowerShell</strong> data into a beautiful Excel report. You don&#8217;t
even need <em>Microsoft Office</em> to be installed. Maybe you knew this, maybe
you use it all the time.</p>



<p>But what if
you didn&#8217;t? If you did not know about the <strong>ImportExcel</strong> module before, how
much extra work would that have cost you?</p>



<p>If you have never used Install-Module before, or want to know what goes on behind it (<em>PowerShellGet</em>, release pipelines, private repositories, and how to combine <em>Git</em> with the <em>PowerShell Gallery</em>), <em>psconf.eu</em> could be worth a rewarding visit. All of these topics are covered one way or another.</p>



<p>This example is just the tiny tip of the iceberg. Sometimes it&#8217;s just a matter of hinting at a powerful module that can make your day. Often, though, it&#8217;s a bit more complex to bring you to the next level and add new skills. Check the agenda below, the speakers, and their sessions. Chances are there are tons of topics just waiting for you to improve your skill set and tickle your intellect.</p>



<h2>Melting Pot of Creativity</h2>



<p> There is not <em>the one almighty super trainer</em> that can teach everything in one class, and the more you know the harder it gets to learn new things in standard classes anyway. But there is <em>one </em>place where you can meet them all: <em>psconf.eu</em>! </p>



<p>That&#8217;s why
five years ago, we decided to create the PowerShell Conference Europe: to bring
together bright heads and experienced folks and set the stage to have a great
time together. </p>



<p>Each year we invite 40 renowned top experts to deliver expert talks from various areas. Top people. Like <em>James O&#8217;Neill</em>, who co-created the wonderful <strong>ImportExcel</strong> module (together with <em>Doug Finke</em>) that I used in the sample above. A conference and place to learn about super useful work done by others. To meet the people behind it and say &#8220;thank you&#8221;. To ask (any) question, even if it is super hard and tricky, and still get the best answers. </p>



<p>Here is a list of our <strong>PowerShell Gladiators</strong> for this year (preliminary, with a few more to be added):</p>



<script type="text/javascript" src="https://sessionize.com/api/v2/y0loje4y/view/SpeakerWall"></script>



<p>Even these 40 people don&#8217;t know <em>everything</em>. This conference is not about 40 invited speakers delivering their talks unidirectionally to you. Their talks act as a starting point, to get the thinking started, to get discussions going. </p>



<p>We&#8217;ll again have a lot of valuable content taking place and being generated in coffee breaks, in lunch-break sessions, or by asking questions. This conference is a 4-day learning experience for advanced <strong>PowerShell</strong> professionals. Help steer <strong>PowerShell</strong> in the right direction and be a part of it. You can make a difference!</p>



<p>Plus, in some sense, it is an opportunity for companies to reward hard-working individuals. If you are the boss and wondering how you could say &#8220;Thank you&#8221;, sending your automation crew, your successful consultant, or your script guru to <em>psconf.eu</em> could be an idea. </p>



<p>Here&#8217;s a video from last year so you get a better impression:</p>



<iframe width="800" height="600" src="https://www.youtube.com/embed/oYFw8YNSWAg" frameborder="0" allow="accelerometer; autoplay; encrypted-media; gyroscope; picture-in-picture" allowfullscreen=""></iframe>



<p>To sign up and reserve your seat, or to get a proposal for your boss, please visit the registration page (<a href="https://psconf.eu/register.html">https://psconf.eu/register.html</a>).</p>



<p>This year, we are working on a delegate jacket. It would help us a lot if you signed up before March 15, for a number of reasons, including not having to guess jacket sizes. Anyone signing up by March 15 gets a <em>guaranteed</em> jacket size and is sure to get one of the limited seats. Anyone else gets a best-effort jacket size, and we will do our best at extrapolating sizes.</p>



<h2>The Specs: Conference &amp; Training Hybrid</h2>



<p><em>psconf.eu</em> is a lot of things:</p>



<p>&#8211; <strong>Classic
Conference</strong>: We experimented over the years with the number of sessions and
parallel tracks. We wanted to make sure you can always pick a session that
matters to you and build a dense agenda while at the same time not missing too
much. </p>



<p>&nbsp;&nbsp; Four days turned out to be the best trade-off between getting out of work, consuming adequate content, and justifying the travel. </p>



<p>&nbsp;&nbsp; Three tracks turned out to be the best trade-off between having rich choices and not missing too many other sessions. So psconf.eu runs four days, and on each day you have a choice of three parallel sessions.</p>



<p>&#8211; <strong>Focused
Sessions</strong>: We tinkered with session lengths and found that 45 minutes of
session time plus 15 minutes of Q&amp;A is the perfect way for speakers to
focus on the beef yet dig deep enough, and delegates to stay focused.</p>



<p>&#8211; <strong>Coffee
Break Talks</strong>: We started with just a few traditional coffee breaks only to
find that <em>Coffee Break Talks</em> are the perfect ending for <em>every</em>
session; personal talks with the speakers, professional discussions among
delegates provide extra value. A quick walk, a smoke, or checking emails may be
your way to guarantee that you are ready for action once the next session
starts. </p>



<p>&nbsp;&nbsp; At psconf.eu, there is now a coffee break
after <em>every</em> session.</p>



<p>&#8211; <strong>Great Environment</strong>: Don&#8217;t be mistaken: listening to dense sessions for a whole day is hard work. It is fun, but it is still exhausting. That&#8217;s why we make sure attendees get recharged whenever possible: no typical &#8220;conference sandwich&#8221; but instead a variety of healthy and yummy freshly cooked food, classic and vegetarian. Fresh fruits and biscuits in the afternoon. And big rooms with fresh air.&nbsp; </p>



<p>&#8211; <strong>On-Premises
AND Cloud</strong>: <strong>PowerShell</strong>
was born on-premises but is now <em>also</em> in the cloud. The word <em>&#8220;also&#8221;</em>
matters to us: <em>cloud</em> and <em>DevOps</em> turned into buzz words with many
focusing entirely on these while increasingly neglecting &#8220;on-premises&#8221;.
Not for us: you find sophisticated sessions and experts both for <em>on-premises</em>
tasks and <em>cloud</em> tasks, so you can pick what matters to you and learn new
skills where appropriate. </p>



<p>&nbsp; While we love &#8220;the latest &amp; greatest&#8221;, solid knowledge for realistic everyday tasks and topics is just as important and part of the agenda. </p>



<p>&#8211; <strong>Evening Event</strong>: Over the years we watched delegates personally grow and become experts in their areas, so this event is not unidirectional and not just about the 40 renowned speakers but also about the 300 delegates: interacting, discussing, exploring, and learning from each other are key, including networking and building professional international relationships. </p>



<img src="http://www.burg-koenigsworth.de/files/pages/396c28967e60f960692e917c3e6e1ec6.jpg" alt="Image" style="zoom:50%;">



<p>Beyond the technical sessions at daytime, we organize an official evening event to kick things off, to &#8220;break the ice&#8221; and provide the setting to get to know each other. It&#8217;s perfectly OK to just eat and drink, or just listen. However, if you attend the conference all by yourself and are open to getting to know new people, you definitely can.</p>



<p>&nbsp; This year, we&#8217;ll be at castle Königsworth, an ancient city castle with a big hall and a number of smaller rooms including the bar and fireplace room, perfect for hanging loose, discussing <strong>PowerShell</strong> ideas, or founding new user groups. Dinner and drinks included.</p>



<img src="http://www.burg-koenigsworth.de/files/pages/c2e86faae44cbcc56f1bd08a4b8a68a3.jpg" alt="Image" style="zoom:50%;">



<p>&nbsp;&nbsp; On the following days, typically groups form on their own and successfully tackle Hannover night life independently.&nbsp; </p>



<p>&#8211; <strong>Authoritative
First-Hand Information</strong>: Get authoritative firsthand information from the
people making <strong>PowerShell</strong>
and the services around it. </p>



<p>&nbsp; &#8211; <strong>PowerShell</strong> inventor <em>Jeffrey Snover</em> will be with us again, as will part of the <strong>PowerShell Team</strong> around <em>Steve Lee</em> and <em>Joey Aiello</em>. </p>



<p>&nbsp; &#8211; We welcome <em>Bruce Payette</em>, the key
developer of <strong>Windows PowerShell</strong>, and <em>Christoph Bergmeister,</em> one
of the open-source contributors to both <strong>PowerShell 7</strong> and the <strong>VSCode</strong>
extension. </p>



<p>&nbsp; &#8211; <strong>Amazon</strong> (AWS) sends <em>Angel Calvo</em>,
a former <strong>PowerShell</strong>
Manager and now General Manager at Amazon Web Services. </p>



<p>&nbsp; &#8211; <strong>Microsoft Germany</strong> sends <em>Miriam
Wiesner</em>, security program manager for Microsoft Defender ATP, and <em>Friedrich
Weinmann</em>, a premier field engineer for security. Both talk about securing <strong>PowerShell</strong> and your enterprise.</p>



<p>&#8211; <strong>Community</strong>:
<strong>PowerShell</strong>
is driven by a vibrant, creative and very friendly community. If you are
already a member of a <strong>PowerShell</strong> user group, you know how caring the ecosystem
is and how the community shares work through modules, blog posts, podcasts, and
more. So psconf.eu is the annual place to meet in person, think up new plans,
and just hang out and relax among people that share the passion.</p>



<p>&nbsp; If this community is still new to you, psconf.eu is a perfect starting place to meet the gang and many of the key members in person, find a user group near you, or get help founding one yourself. Don&#8217;t hesitate to attend this event if you feel you are a bit shy. You never walk alone (unless you want to), and there are plenty of opportunities to get connected. </p>



<p>&#8211; <strong>Session Recordings</strong>: A significant amount of effort each year goes into session recordings. We don&#8217;t turn the conference into a huge video tutorial but want to make sure each attendee can recap sessions with a video. We make all of these videos freely available (<a href="https://www.youtube.com/channel/UCxgrI58XiKnDDByjhRJs5fg">https://www.youtube.com/channel/UCxgrI58XiKnDDByjhRJs5fg</a>) after the conference on a best-effort basis. They can&#8217;t capture the many discussions, side events and personal talks. But they are very helpful to rewind through some of the topics and sessions and refresh the memory.</p>



<h2>Sessions</h2>



<p>Below you will find the preliminary agenda. There are still some blind spots while we are waiting for the <strong>PowerShell Team</strong> to finalize their sessions. Since <strong>PowerShell 7</strong> is being released this year, you can guess some of their topics.</p>



<p>Click a session to open a popup with the session abstract!</p>



<script type="text/javascript" src="https://sessionize.com/api/v2/y0loje4y/view/GridSmart"></script>



<p>A month prior to the conference when the agenda is finalized, we make available a conference app that you can use to start building your personal agenda and navigate the sessions during the conference.</p>



<h2>Community Module Authors</h2>



<p>Chances are
you are using community-authored <strong>PowerShell</strong> modules in your daily work. </p>



<p>Below is a
quick list of popular <strong>PowerShell</strong> modules published by this year’s speakers.&nbsp; <em>psconf.eu</em> would be an excellent time
to meet their creators, ask questions, toss in ideas, or just say &#8220;thank
you&#8221;:</p>



<p>&#8211; Universal Dashboard: <em>Adam Driscoll</em> is with us this year and presents his Universal Dashboard (<a href="https://www.powershellgallery.com/packages/UniversalDashboard">https://www.powershellgallery.com/packages/UniversalDashboard</a>) to create breathtaking web-based helpdesk UIs. He&#8217;s also an expert in building your own <strong>PowerShell</strong> hosts and has created the PowerShell extension for Visual Studio (<a href="https://marketplace.visualstudio.com/items?itemName=AdamRDriscoll.PowerShellToolsforVisualStudio2017-18561">https://marketplace.visualstudio.com/items?itemName=AdamRDriscoll.PowerShellToolsforVisualStudio2017-18561</a>). One of his lesser-known <strong>PowerShell</strong> modules is Snek (<a href="https://www.powershellgallery.com/packages/snek">https://www.powershellgallery.com/packages/snek</a>), a wrapper around <em>Python for .NET</em>.</p>



<p>&#8211; TypeToUml: <em>Anthony Allen</em> has created TypeToUml (<a href="https://www.powershellgallery.com/packages/TypeToUml">https://www.powershellgallery.com/packages/TypeToUml</a>) to create UML diagrams from .NET types. In his talks, he sheds light on <strong>PowerShell</strong> scoping and shares plenty of reusable code so you don&#8217;t have to reinvent the wheel all the time.</p>



<p>&#8211; EzLog: <em>Arnaud Petitjean</em> makes well-formatted <strong>PowerShell</strong> logs a snap with EzLog (<a href="https://www.powershellgallery.com/packages/EZLog">https://www.powershellgallery.com/packages/EZLog</a>) and freely shares his module KeeRest (<a href="https://github.com/apetitjean/KeeRest">https://github.com/apetitjean/KeeRest</a>) to expose a KeePass database via a REST API. He&#8217;s from France and has published a number of French books on <strong>PowerShell</strong>. At the conference, he focuses on secret management: how to build a secure REST API to expose passwords, and how to manage access permissions.</p>



<p>&#8211; ArmHelper: <em>Barbara Forbes</em> published ArmHelper (<a href="https://www.powershellgallery.com/packages/ARMHelper">https://www.powershellgallery.com/packages/ARMHelper</a>) which provides functions to help with the deployment of ARM templates. Barbara is an <em>Office365</em> and <em>Azure</em> expert and shares her know-how at 4bes.nl. Her talks at the conference help you discover <em>Azure PowerShell Functions</em> and find out how you can run some of your <strong>PowerShell</strong> tasks in the cloud.</p>



<p>&#8211;
PSScriptAnalyzer: <em>Christoph Bergmeister</em> is one of the masterminds
behind the <em>PSScriptAnalyzer</em> that analyzes <strong>PowerShell</strong> code in real-time and is responsible for
squiggle lines in <strong>VSCode</strong>. If you ever wanted to extend the cleverness of
this engine, or add your own rules and have the engine check your corporate <strong>PowerShell</strong> formatting rules, this is the
chance to get the know-how first-hand.</p>



<p>&#8211; PowerShell IoT: <em>Daniel Silva</em> is one of the most prominent <strong>PowerShell</strong> IoT lovers, and with his module (<a href="https://github.com/PowerShell/PowerShell-IoT">https://github.com/PowerShell/PowerShell-IoT</a>) he illustrates how to use <strong>PowerShell</strong> to control devices and build your own smart home. At the conference, Daniel helps you expand your skills in two directions: learn more about <em>IoT</em>, and embrace <em>C#</em> even if you are not a developer and are happy with scripting.</p>



<p>&#8211; RDExSessionInfo: <em>Evgenij Smirnov</em> created RDExSessionInfo (<a href="https://www.powershellgallery.com/packages/RDExSessionInfo">https://www.powershellgallery.com/packages/RDExSessionInfo</a>) to get extended information on RDS sessions. At the conference, he talks about consuming low-level APIs to extend PowerShell’s capabilities.</p>



<p>&#8211; Kubectl: if you have never heard of &#8220;Kubectl&#8221; or Kubernetes (<a href="https://kubernetes.io/docs/reference/kubectl/overview/">https://kubernetes.io/docs/reference/kubectl/overview/</a>), then <em>Felix Becker</em> and his module PSKubectl (<a href="https://www.powershellgallery.com/packages/PSKubectl">https://www.powershellgallery.com/packages/PSKubectl</a>) may be a great starting point: kubectl is the command line to manage Kubernetes, which is becoming the industry standard for orchestrating container deployments. PSKubectl wraps this inside <strong>PowerShell</strong>. If this made you curious, join Felix&#8217;s talk about <em>PSKubectl</em>. If containers and clusters aren&#8217;t your thing, join Felix as he sheds light on the secret treasures of the <strong>PowerShell</strong> formatting system.</p>



<p>&#8211; MicrosoftGraphAPI: <em>Jakob Gottlieb Svendsen</em> is our LEGO robot specialist with many more talents. He wrote MicrosoftGraphAPI (<a href="https://www.powershellgallery.com/packages/MicrosoftGraphAPI">https://www.powershellgallery.com/packages/MicrosoftGraphAPI</a>) to manage the Microsoft Graph functionality from <strong>PowerShell</strong>. At the conference, he&#8217;ll be talking about <strong>PowerShell</strong> on Raspberry Pi, and the making of his Azure-connected green house.</p>



<p>&#8211; ImportExcel: <em>James O&#8217;Neill</em> (working with <em>Doug Finke</em>) has probably created the single most useful community module there is: ImportExcel (<a href="https://www.powershellgallery.com/packages/ImportExcel">https://www.powershellgallery.com/packages/ImportExcel</a>) makes importing and exporting Excel data/xlsx files a snap and does not even require Office to be installed. </p>



<p>&nbsp;James talks about becoming a <strong>PowerShell</strong> parameter Ninja, and how to use multithreading in <strong>PowerShell</strong> to speed up tasks and do them in parallel, using his module Start-Parallel (<a href="https://www.powershellgallery.com/packages/Start-parallel">https://www.powershellgallery.com/packages/Start-parallel</a>).</p>



<p>&#8211; PSVersion: Ever wanted to turn the <strong>PowerShell</strong> version number into a meaningful friendly name? Then use PSVersion (<a href="https://www.powershellgallery.com/packages/PSVersion">https://www.powershellgallery.com/packages/PSVersion</a>) from <em>Jan Egil Ring</em>! </p>



<p>&nbsp; In his talks, Jan focuses on <em>Azure
Functions</em> in a hybrid world, and <em>Azure Policy Guest Configuration</em> which
in some respect works like Group Policies in the cloud and across domains and
platforms.</p>



<p>&#8211; ADCSTemplateParser: Senior cloud architect <em>Jan-Henrik Damaschke</em> created ADCSTemplateParser (<a href="https://www.powershellgallery.com/packages/ADCSTemplateParser">https://www.powershellgallery.com/packages/ADCSTemplateParser</a>). He also worked on asynchronous <strong>PowerShell</strong> logging (<a href="https://www.itinsights.org/PowerShell-async-logging/">https://www.itinsights.org/PowerShell-async-logging/</a>) which is highly interesting: don&#8217;t let writing logs slow down or block your scripts! He&#8217;s explaining his module and concepts at one of his talks. His second talk focuses on real-time communication (basically the stuff done by messengers like WhatsApp) via SignalR and Azure Functions.</p>



<p>&#8211; PowerForensics: Security expert <em>Jared Atkinson</em> published PowerForensics (<a href="https://www.powershellgallery.com/packages/PowerForensics">https://www.powershellgallery.com/packages/PowerForensics</a>), a digital forensics framework for <strong>PowerShell</strong>. At the conference, Jared talks about detection engineering to uncover hacker techniques, and ways for enterprises to approach intrusion detections and responses at scale.</p>



<p>&#8211; Pester: This module is so important, it is part of Windows. The latest version is available at <a href="https://www.powershellgallery.com/packages/Pester">https://www.powershellgallery.com/packages/Pester</a>. Pester is maintained by <em>Jakub Jareš</em> and is a <strong>PowerShell</strong> testing framework to make sure a script does what it is supposed to do, and won&#8217;t break when you add new things to it. At the conference, Jakub introduces version 5. If you have any questions about <em>Pester</em>, make sure you bring them.</p>



<p>&#8211; ArcAdminTools: Co-founder of the Polish <strong>PowerShell</strong> user group <em>Mateusz Czerniawski</em> has published a collection of useful admin tools called ArcAdminTools (<a href="https://www.powershellgallery.com/packages/ArcAdminTools">https://www.powershellgallery.com/packages/ArcAdminTools</a>). At the conference, he talks about <em>Azure Log Analytics</em> (ALA) and sheds light on <em>Microsoft Graph</em> and what you can do with it.</p>



<p>&#8211; AADInternals: <em>Dr. Nestori Syynimaa</em> is a leading <em>Office365</em> expert and has created AADInternals (<a href="https://www.powershellgallery.com/packages/AADInternals">https://www.powershellgallery.com/packages/AADInternals</a>): It utilizes several internal features of Azure Active Directory, Office 365, and related admin tools and can be used as an Azure AD hacking and pen-testing tool. With his intimate knowledge of <em>Azure</em> and <em>Office365</em>, Nestori talks about Azure AD security and how it can be attacked and abused.</p>



<p>&#8211; cChoco: <em>Paul Broadwith</em> is a DSC expert and has created the extremely successful cChoco (<a href="https://www.powershellgallery.com/packages/cChoco">https://www.powershellgallery.com/packages/cChoco</a>) DSC resource to use <em>Chocolatey</em> with DSC. At the conference, Paul is tackling two extremely hot topics: using SSH for remoting instead of WinRM, and how to automate the setup of brand-new computer hardware using <em>Boxstarter</em>.</p>



<p>&#8211; PSWriteColor: <em>Przemysław Kłys</em> is a &#8220;discovery&#8221; of last year’s <em>psconf.eu</em>. He had never talked before at large conferences, yet his sessions rocked last year. Meanwhile, he is a regular speaker at large conferences and has published a great number of modules (<a href="https://www.powershellgallery.com/profiles/Przemyslaw.Klys">https://www.powershellgallery.com/profiles/Przemyslaw.Klys</a>), for example PSWriteColor (<a href="https://www.powershellgallery.com/packages/PSWriteColor">https://www.powershellgallery.com/packages/PSWriteColor</a>): a wrapper around Write-Host to create beautiful colored output. At this year’s conference, he&#8217;ll use his set of free tools to create Active Directory and Office365 auto-documentation in Word, Excel, and HTML. Definitely a must-see.</p>



<p>&#8211; DBAchecks: <em>Rob Sewell</em> co-authored DBAchecks (<a href="https://www.powershellgallery.com/packages/dbachecks">https://www.powershellgallery.com/packages/dbachecks</a>) together with <em>Chrissy LeMaire</em>: a testing framework for SQL Server to ensure it is (and continues to be) compliant with your requirements. And as a database admin, of course you&#8217;ll know dbatools (<a href="https://www.powershellgallery.com/packages/dbatools">https://www.powershellgallery.com/packages/dbatools</a>), the community tool filled with commands to easily automate database deployment and administration. Rob was in charge of the psconf.eu call for papers and manages the speakers. At the conference, he is talking about <em>PowerShell Notebooks</em>, part of <em>Azure Data Studio</em>, and how useful they can be for you.</p>



<p>&#8211; PSHTML: <em>Stephane van Gulick</em> defines himself in one sentence: &#8220;I love computers&#8221;. He is into DevOps, but also into HTML. His module PSHTML (<a href="https://www.powershellgallery.com/packages/PSHTML">https://www.powershellgallery.com/packages/PSHTML</a>) can be used to create stunning reports and build entire responsive websites. At the conference he&#8217;s sharing how he discovered <strong>PowerShell</strong> classes and how you could benefit from classes, too.</p>



<p>&#8211; ISESteroids: <em>Dr. Tobias Weltner</em> originally created ISESteroids (<a href="https://www.powershellgallery.com/packages/ISESteroids">https://www.powershellgallery.com/packages/ISESteroids</a>) to make his life easier while adding missing functionality to the built-in <strong>PowerShell</strong> ISE. Soon, public interest turned this into a commercial-grade product for anyone working with <em>Windows PowerShell</em> and the <em>PowerShell ISE</em>. Tobias founded the <em>psconf.eu</em> conference and lives in Hannover.</p>



<h2>Book Signing</h2>



<p>Many of us
have learned <strong>PowerShell</strong>
using books, and we are humbled to have a number of renowned <strong>PowerShell</strong> book authors from around the world
with us. If you have learned by reading one of the books below, and still own
your copy, bring it to have it signed by the author:</p>



<p>&#8211; <strong>Windows
PowerShell 5.1 Biblia</strong></p>



<p>&nbsp; <strong>PowerShell Deep Dives</strong></p>



<p>&nbsp; <em>Bartek Bielawski</em> talks about <strong>PowerShell</strong> classes and how you can author DSC resources with them. He also helps you find your way into Git and, as a team, work with scripts in a safe and structured way.</p>



<p>&#8211; <strong>PowerShell
Core et Windows PowerShell</strong></p>



<p>&nbsp; <strong>Windows PowerShell : Fonctionnalités
avancées </strong></p>



<p>&nbsp; <strong>Windows PowerShell : Guide de référence
pour l&#8217;administration système</strong></p>



<p>&nbsp; <strong>PowerShell Deep Dives </strong></p>



<p>&nbsp; <em>Arnaud Petitjean</em> talks about managing secrets such as passwords and access permissions.</p>



<p>&#8211; <strong>PowerShell
in Action</strong></p>



<p>&nbsp;<em>Bruce Payette</em> is a founding member of
the <strong>PowerShell</strong>
team and now with AWS. At the conference, Bruce will explain some of the more
mysterious moving parts of the <strong>PowerShell</strong> architecture like PSHost, threads, and runspace
pools. He&#8217;ll focus on how they work and how they differ in remoting, and if you
ever attended one of his talks, you know the tons of practical and undocumented
tricks that come with it.</p>



<p>&#8211; <strong>Windows
PowerShell 5 &#8211; kurz und gut</strong></p>



<p>&nbsp; <em>Thorsten Butz</em> is a <strong>PowerShell</strong> trainer and &#8220;on-premises&#8221;
fan, and at the conference talks about querying Wikidata with a glimpse of
SPARQL.</p>



<p>&#8211; <strong>PowerShell 5: Windows-Automation für
Einsteiger und Profis</strong></p>



<p><strong>&nbsp; Windows
PowerShell: Grundlagen &amp; Scripting-Praxis für Einsteiger – alle Versionen</strong></p>



<p><em>&nbsp; </em><em>Dr. Tobias Weltner</em> is running <em>psconf.eu</em> and
delivering <strong>PowerShell</strong>
trainings throughout Europe.</p>



<h2>Get Your Seat!</h2>



<p>Don&#8217;t wait too long to get your seat! In the past three years, <em>psconf.eu</em> sold out every time.</p>



<p>To sign up and reserve your seat, or to get a proposal for your boss, please visit the registration page (<a href="https://psconf.eu/register.html">https://psconf.eu/register.html</a>).</p>



<p>Signing up
early has a number of advantages (for us, but also for you):</p>



<p>* Hotel
accommodation is still reasonable, and there is a variety of flights available
to Hannover Airport</p>



<p>* You have
the guarantee to get a seat</p>



<p>* We are
working on a <strong>PowerShell</strong>
delegate jacket. Obviously, the jacket needs to go to production at some time.
Anyone signing up until <em>March 15</em> gets his or her <em>guaranteed</em>
jacket size. We order jackets based on gender, so yes the jackets do look good
for female delegates as well! Of course, we do our best in extrapolating jacket
sizes and types for the rest but anyone signing up later gets a <em>best-effort</em>
jacket size and type.</p>



<p>* Signing
up early makes life for us a lot easier.</p>



<p>What if you
signed up early and can&#8217;t come? While conference tickets are never refundable
(or else a conference would be impossible to organize), they are transferable
at no cost.</p>



<p>We are
looking forward to seeing you at the PowerShell Conference Europe 2020!</p>
]]></content:encoded>
							<wfw:commentRss>https://www.powershellmagazine.com/2020/01/30/powershell-conference-europe-2020/feed/</wfw:commentRss>
		<slash:comments>0</slash:comments>
							</item>
		<item>
		<title>Parsing Failover Cluster Validation Report in PowerShell</title>
		<link>https://www.powershellmagazine.com/2020/01/30/parsing-failover-cluster-validation-report-in-powershell/</link>
				<comments>https://www.powershellmagazine.com/2020/01/30/parsing-failover-cluster-validation-report-in-powershell/#respond</comments>
				<pubDate>Thu, 30 Jan 2020 13:22:02 +0000</pubDate>
		<dc:creator><![CDATA[Ravikanth C]]></dc:creator>
				<category><![CDATA[Articles]]></category>
		<category><![CDATA[Online Only]]></category>
		<category><![CDATA[failover cluster]]></category>
		<category><![CDATA[PowerShell]]></category>

		<guid isPermaLink="false">http://www.powershellmagazine.com/?p=13457</guid>
				<description><![CDATA[If you have ever worked with the Test-Cluster command in the failover clustering module, you will know that this command... ]]></description>
								<content:encoded><![CDATA[
<p>If you have ever worked with the Test-Cluster command in the failover clustering module, you will know that this command generates an HTML report. This is visually good for, say, IT managers, but not very appealing to people who are automating an infrastructure build process. The command provides no passthru-type functionality that returns a result object which can be easily parsed or used in PowerShell.</p>



<figure class="wp-block-image size-large"><img src="https://www.powershellmagazine.com/wp-content/uploads/2020/01/clusterValidationHTML-1024x439.png" alt="" class="wp-image-13459" srcset="https://www.powershellmagazine.com/wp-content/uploads/2020/01/clusterValidationHTML-1024x439.png 1024w, https://www.powershellmagazine.com/wp-content/uploads/2020/01/clusterValidationHTML-300x129.png 300w, https://www.powershellmagazine.com/wp-content/uploads/2020/01/clusterValidationHTML-768x329.png 768w, https://www.powershellmagazine.com/wp-content/uploads/2020/01/clusterValidationHTML.png 1576w" sizes="(max-width: 1024px) 100vw, 1024px" /></figure>



<p>As a part of some larger automation that I was building, I needed to parse the validation result into a PowerShell object that could be used later in the orchestration. Parsing HTML isn&#8217;t what I wanted to do, but a little digging gave some clues about the XML report that gets generated when we run this command.</p>



<p>Behind the scenes, the Test-Cluster command generates an XML report every time it is run. This XML gets stored at C:\Windows\Temp. Looking at the XML, you can easily see that the schema was designed to make generating the HTML easy. So, it took only a few minutes to understand how the tests are categorized and come up with the script below.</p>



<pre class="crayon-plain-tag">[CmdletBinding()]
param
(
    [Parameter(Mandatory = $true)]
    [String]
    $ValidationXmlPath
)

# Load the validation report and get the nested channel elements
$xml = [xml](Get-Content -Path $ValidationXmlPath)
$channels = $xml.Report.Channel.Channel

$validationResultArray = New-Object -TypeName System.Collections.ArrayList

foreach ($channel in $channels)
{
    # Summary channels represent test categories; their id links them to the detail channels
    if ($channel.Type -eq 'Summary')
    {
        $channelSummaryHash = [PSCustomObject]@{}
        $summaryArray = New-Object -TypeName System.Collections.ArrayList

        $channelId = $channel.id
        $channelName = $channel.ChannelName.'#cdata-section'

        # Collect every detail channel that points back to this summary channel
        foreach ($summaryChannel in $channels.Where({$_.SummaryChannel.Value.'#cdata-section' -eq $channelId}))
        {
            $channelTitle = $summaryChannel.Title.Value.'#cdata-section'
            $channelResult = $summaryChannel.Result.Value.'#cdata-section'
            $channelMessage = $summaryChannel.Message.'#cdata-section'

            $summaryHash = [PSCustomObject] @{
                Title = $channelTitle
                Result = $channelResult
                Message = $channelMessage
            }

            $null = $summaryArray.Add($summaryHash)
        }

        $channelSummaryHash | Add-Member -MemberType NoteProperty -Name Category -Value $channelName
        $channelSummaryHash | Add-Member -MemberType NoteProperty -Name Results -Value $summaryArray

        $null = $validationResultArray.Add($channelSummaryHash)
    }
}

return $validationResultArray</pre>



<p>The input to the above script is the XML file that gets generated at C:\Windows\Temp. Once you run the script, you should see the output similar to what is shown below.</p>
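<p>By the way, the <em>'#cdata-section'</em> syntax in the script is simply how PowerShell&#8217;s XML adapter exposes CDATA nodes. Here is a minimal, self-contained sketch of that access pattern; the element names below are invented for illustration and are not the actual Test-Cluster report schema.</p>

```powershell
# Build a tiny XML document containing a CDATA node programmatically
# (element names are made up; this is NOT the real Test-Cluster report schema)
$doc  = New-Object -TypeName System.Xml.XmlDocument
$root = $doc.AppendChild($doc.CreateElement('Channel'))
$name = $root.AppendChild($doc.CreateElement('ChannelName'))
$name.SetAttribute('Scope', 'Cluster')   # an attribute keeps the element from collapsing to a plain string
$null = $name.AppendChild($doc.CreateCDataSection('Storage'))

# The adapter surfaces the CDATA text through the '#cdata-section' property
$text = $doc.Channel.ChannelName.'#cdata-section'
$text
```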



<figure class="wp-block-image size-large"><img src="https://www.powershellmagazine.com/wp-content/uploads/2020/01/clusterValidationResult-1024x344.png" alt="" class="wp-image-13474" srcset="https://www.powershellmagazine.com/wp-content/uploads/2020/01/clusterValidationResult-1024x344.png 1024w, https://www.powershellmagazine.com/wp-content/uploads/2020/01/clusterValidationResult-300x101.png 300w, https://www.powershellmagazine.com/wp-content/uploads/2020/01/clusterValidationResult-768x258.png 768w" sizes="(max-width: 1024px) 100vw, 1024px" /></figure>



<p>I have only added the property values that I really need in my scripts, but you can look at the XML and easily modify the above script to add other details as needed.</p>



<p>Comment on this <a href="https://gist.github.com/rchaganti/b4fc88f1615c6810b2c46b647ae4fe96">Gist</a> if you have any suggestions or have your own version of the script.</p>
]]></content:encoded>
							<wfw:commentRss>https://www.powershellmagazine.com/2020/01/30/parsing-failover-cluster-validation-report-in-powershell/feed/</wfw:commentRss>
		<slash:comments>0</slash:comments>
							</item>
		<item>
		<title>Weekly Module Spotlight: ImportExcel</title>
		<link>https://www.powershellmagazine.com/2019/09/09/weekly-module-spotlight-importexcel/</link>
				<comments>https://www.powershellmagazine.com/2019/09/09/weekly-module-spotlight-importexcel/#respond</comments>
				<pubDate>Mon, 09 Sep 2019 04:00:30 +0000</pubDate>
		<dc:creator><![CDATA[Ravikanth C]]></dc:creator>
				<category><![CDATA[Articles]]></category>
		<category><![CDATA[Module Spotlight]]></category>
		<category><![CDATA[Online Only]]></category>
		<category><![CDATA[importexcel]]></category>
		<category><![CDATA[modulespotlight]]></category>
		<category><![CDATA[opensource]]></category>

		<guid isPermaLink="false">http://www.powershellmagazine.com/?p=13425</guid>
				<description><![CDATA[In my previous job, I worked at a customer site that had multiple Windows and Unix Servers. One of the... ]]></description>
								<content:encoded><![CDATA[
<p>In my previous job, I worked at a customer site that had multiple Windows and Unix servers. One of the everyday tasks was to report the disk space utilization from these servers. When I joined the customer site, engineers there used to collect the statistics and create an Excel spreadsheet manually. This was not just time-consuming but boring as well. This task is something that needed to be automated. Period.</p>



<p>So, I went on to create a rather long VBScript that used WMI to collect disk usage statistics from the Windows servers and the Excel COM object to generate spreadsheets containing the reports. It certainly made my job easy. I just had to run it from my local workstation, and within a few seconds I would have the Excel report that I could mail to the IT manager.</p>



<p>But those of you who have worked with Excel COM objects and PowerShell know that it is not the best experience. Working with VBScript is a pain. When that is combined with COM objects, the pain of writing and testing a script increases severalfold.</p>



<p>You would be glad to hear that you don&#8217;t have to do that anymore if you know PowerShell even a little bit. Thanks to <a rel="noreferrer noopener" aria-label="ImportExcel (opens in a new tab)" href="https://github.com/dfinke/ImportExcel" target="_blank">ImportExcel</a>.</p>



<p>ImportExcel allows you to read and write Excel files without installing Microsoft Excel on your system. With this module, there is no need to bother with the cumbersome Excel COM object. With ImportExcel, creating tables, pivot tables, charts, and much more has become a lot easier.</p>



<p>Before you try any of the following examples, install ImportExcel module from the <a rel="noreferrer noopener" aria-label="PowerShell Gallery (opens in a new tab)" href="https://www.powershellgallery.com/packages/ImportExcel" target="_blank">PowerShell Gallery</a>.</p>



<p>Here is a simple first example for you!</p>



<pre class="crayon-plain-tag">Get-Process | Select-Object Company, Name, Handles | Export-Excel</pre>



<p>This command exports the values of selected properties from the process objects and opens an Excel spreadsheet automatically.</p>



<p>Here is another example from <a rel="noreferrer noopener" aria-label="ImportExcel GitHub repository (opens in a new tab)" href="https://github.com/dfinke/ImportExcel/blob/master/Examples/Charts/ChartAndTrendlines.ps1" target="_blank">ImportExcel GitHub repository</a> that generates charts.</p>



<pre class="crayon-plain-tag">$xlfile = &quot;$env:TEMP\trendLine.xlsx&quot;
Remove-Item $xlfile -ErrorAction SilentlyContinue

$data = ConvertFrom-Csv @&quot;
Region,Item,TotalSold
West,screws,60
South,lemon,48
South,apple,71
East,screwdriver,70
East,kiwi,32
West,screwdriver,1
South,melon,21
East,apple,79
South,apple,68
South,avocado,73
&quot;@

$cd = New-ExcelChartDefinition -XRange Region -YRange TotalSold -ChartType ColumnClustered -ChartTrendLine Linear
$data | Export-Excel $xlfile -ExcelChartDefinition $cd -AutoNameRange -Show</pre>



<p>Finally, here is something I showed at the PowerShell Conference Europe 2019. This uses the speaker and session data JSON and generates a spreadsheet.</p>



<pre class="crayon-plain-tag">if (-not (Get-Module -ListAvailable -Name ImportExcel -ErrorAction SilentlyContinue))
{
    Install-Module -Name ImportExcel -Force
}

$speakersJson = 'https://raw.githubusercontent.com/psconfeu/2019/master/data/speakers.json'
$sessionsJson = 'https://raw.githubusercontent.com/psconfeu/2019/master/sessions.json'

$speakers = ConvertFrom-Json (Invoke-WebRequest -UseBasicParsing -Uri $speakersJson).content
$sessions = ConvertFrom-Json (Invoke-WebRequest -UseBasicParsing -Uri $sessionsJson).content

# All Sessions Sheet
$sessions | Select-Object Name, Starts, Ends, Track, Speaker | 
            Export-Excel -Path .\psconfeu2019.xlsx -WorksheetName 'All Tracks' `
            -Title 'PowerShell Conference EU 2019 - Sessions' `
            -TitleBold -TitleFillPattern DarkDown -TitleSize 20 `
            -TableStyle Medium6 -BoldTopRow

# Track sheets
foreach ($i in 1..3)
{
    $trackSessions = $sessions.Where({$_.Track -eq &quot;Track $i&quot;})
    $trackSessions | Select-Object Name, Starts, Ends, Speaker |
        Export-Excel -Path .\psconfeu2019.xlsx -WorksheetName &quot;Track $i&quot; `
        -Title &quot;PowerShell Conference EU 2019 - Track $i&quot; `
        -TitleBold -TitleFillPattern DarkDown -TitleSize 20 `
        -TableStyle Medium6 -BoldTopRow        
}

# Add Speakers sheet
$speakers | Export-Excel -Path .\psconfeu2019.xlsx -WorksheetName 'Speakers' `
    -Title 'PowerShell Conference EU 2019 - Speakers' `
    -TitleBold -TitleFillPattern DarkDown -TitleSize 20 `
    -TableStyle Medium6 -BoldTopRow 

# Add chart for speaker country number
$chartDef = New-ExcelChart -Title 'PowerShell Conference EU 2019 - Speakers' `
                    -ChartType ColumnClustered `
                    -XRange Name -YRange Count `
                    -Width 800 -NoLegend -Column 3 

$speakers | Group-Object -Property Country | Select-Object Name, Count |  Sort-Object -Property Count -Descending |
    Export-Excel -path .\psconfeu2019.xlsx -AutoSize -AutoNameRange -ExcelChartDefinition $chartDef -WorksheetName SpeakerCountryChart -Show</pre>



<p>There are many other ways you can use this module in creating report dashboards. The GitHub <a rel="noreferrer noopener" aria-label="repository contains several examples (opens in a new tab)" href="https://github.com/dfinke/ImportExcel/tree/master/Examples" target="_blank">repository contains several examples</a> that you can use as a starting point. </p>
]]></content:encoded>
							<wfw:commentRss>https://www.powershellmagazine.com/2019/09/09/weekly-module-spotlight-importexcel/feed/</wfw:commentRss>
		<slash:comments>0</slash:comments>
							</item>
		<item>
		<title>Weekly Module Spotlight: Polaris</title>
		<link>https://www.powershellmagazine.com/2019/09/03/weekly-module-spotlight-polaris/</link>
				<comments>https://www.powershellmagazine.com/2019/09/03/weekly-module-spotlight-polaris/#respond</comments>
				<pubDate>Tue, 03 Sep 2019 05:41:44 +0000</pubDate>
		<dc:creator><![CDATA[Ravikanth C]]></dc:creator>
				<category><![CDATA[Articles]]></category>
		<category><![CDATA[Module Spotlight]]></category>
		<category><![CDATA[Online Only]]></category>
		<category><![CDATA[modulespotlight]]></category>
		<category><![CDATA[opensource]]></category>
		<category><![CDATA[polaris]]></category>
		<category><![CDATA[pshtml]]></category>

		<guid isPermaLink="false">http://www.powershellmagazine.com/?p=13403</guid>
				<description><![CDATA[I create HTTP REST APIs a lot in my proof-of-concept work and I generally use the .NET HTTPListener class for... ]]></description>
								<content:encoded><![CDATA[
<p>I create HTTP REST APIs a lot in my proof-of-concept work, and I generally use the <a rel="noreferrer noopener" aria-label=".NET HTTPListener (opens in a new tab)" href="https://docs.microsoft.com/en-us/dotnet/api/system.net.httplistener?view=netframework-4.8" target="_blank">.NET HTTPListener</a> class for this purpose. Using this class, we can create simple, programmatically controlled HTTP listeners. For a quick prototype, this totally makes sense as it is in-box and there is no need for any external modules or libraries. However, creating complex endpoints won&#8217;t be easy. This is where <a rel="noreferrer noopener" aria-label="Polaris (opens in a new tab)" href="https://github.com/PowerShell/Polaris" target="_blank">Polaris</a> comes into play.</p>



<p>Polaris is a cross-platform, minimalist web framework for PowerShell. This is an experimental module but stable enough to try it out. Before you can try out the examples, install Polaris module from <a rel="noreferrer noopener" aria-label="PS Gallery (opens in a new tab)" href="https://www.powershellgallery.com/packages/Polaris/" target="_blank">PS Gallery</a>.</p>



<p>Here is a quick example of how we create an HTTP endpoint.</p>



<pre class="crayon-plain-tag">New-PolarisGetRoute -Path &quot;/helloworld&quot; -Scriptblock {
    $Response.Send('Hello World!')
}

Start-Polaris</pre>



<p>Once these commands are executed, if you access <em>http://localhost:8080/helloworld</em> in a browser, you will see &#8216;Hello World!&#8217; returned as the response. In the above example, there is just one endpoint or route, which implements the HTTP GET method. Whenever this route gets accessed, the PowerShell commands specified as an argument to the <em>-Scriptblock</em> parameter get invoked. In this example, we just use the <em>$Response</em> automatic variable that Polaris provides and call its <em>.Send()</em> method to send the response back to the browser.</p>
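<p>If you would rather test the route from another console than from a browser, Invoke-RestMethod works as a quick client. A minimal sketch, assuming the route above is already registered and Polaris is listening on its default port 8080:</p>

```powershell
# Query the /helloworld route registered above; assumes Polaris is already
# running on the default port 8080 (pass -Port to Start-Polaris to change it)
$reply = Invoke-RestMethod -Uri 'http://localhost:8080/helloworld'
$reply

# Stop the listener when you are done experimenting
Stop-Polaris
```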



<p>Similarly, you can create routes for the other HTTP methods: POST, PUT, DELETE, and so on. In the next example, you will see how we can <a rel="noreferrer noopener" aria-label="combine what PSHTML provides with Polaris (opens in a new tab)" href="https://www.powershellmagazine.com/2019/08/21/weekly-module-spotlight-pshtml/" target="_blank">combine what PSHTML provides with Polaris</a>.</p>
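<p>As a quick illustration, here is a sketch of a POST route. <em>New-PolarisPostRoute</em> is the POST counterpart of the cmdlet used above; treating the <em>Body</em> property of the <em>$Request</em> automatic variable as the raw request payload is an assumption of this sketch.</p>



<pre class="crayon-plain-tag"># A sketch of a POST route that echoes the request body back
New-PolarisPostRoute -Path &quot;/echo&quot; -Scriptblock {
    # $Request is the automatic variable Polaris exposes in route script blocks
    $Response.Send(&quot;You sent: $($Request.Body)&quot;)
}

Start-Polaris -Port 8080</pre>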



<p>Save the following script as content.ps1 in a folder of your choice.</p>



<pre class="crayon-plain-tag">html {
    head {
        title &quot;Example 2&quot;
    }

    body {
        h1 {&quot;This is just an example of using PSHTML as view engine for Polaris&quot;}

        $Languages = @(&quot;PowerShell&quot;,&quot;Python&quot;,&quot;CSharp&quot;,&quot;Bash&quot;)

        &quot;My favorite languages are:&quot;
        ul{
          foreach($language in $Languages){
               li {
                    $Language
               }
           }
        }
    }

    Footer {
        h6 &quot;This is an h6 title in the footer&quot;
    }
}</pre>



<p>The following script creates the route that serves the HTML content generated by the above script.</p>



<pre class="crayon-plain-tag">New-PolarisGetRoute -Path &quot;/languages&quot; -Scriptblock {
    $response.SetContentType('text/html')
    $html = . &quot;C:\scripts\content.ps1&quot;
    $response.Send($html)
}

Start-Polaris -Port 8080</pre>



<p>Once you start Polaris and load the routes, you can access http://localhost:8080/languages to see the content generated from a PS1 script. It should look like this!</p>



<p><img class="wp-image-13420" style="width: 600px;" src="http://www.powershellmagazine.com/wp-content/uploads/2019/09/1.png" alt="" srcset="https://www.powershellmagazine.com/wp-content/uploads/2019/09/1.png 1024w, https://www.powershellmagazine.com/wp-content/uploads/2019/09/1-300x108.png 300w, https://www.powershellmagazine.com/wp-content/uploads/2019/09/1-768x278.png 768w" sizes="(max-width: 1024px) 100vw, 1024px" /></p>



<p>Hope you got the hang of what you can achieve with PSHTML and Polaris combined. I have built dashboards with just this combination and nothing more. In future posts, I will show one such example from my demo at PowerShell Conference Europe 2019.</p>
]]></content:encoded>
							<wfw:commentRss>https://www.powershellmagazine.com/2019/09/03/weekly-module-spotlight-polaris/feed/</wfw:commentRss>
		<slash:comments>0</slash:comments>
							</item>
		<item>
		<title>Weekly Module Spotlight: PSHTML</title>
		<link>https://www.powershellmagazine.com/2019/08/21/weekly-module-spotlight-pshtml/</link>
				<comments>https://www.powershellmagazine.com/2019/08/21/weekly-module-spotlight-pshtml/#respond</comments>
				<pubDate>Wed, 21 Aug 2019 04:00:12 +0000</pubDate>
		<dc:creator><![CDATA[Ravikanth C]]></dc:creator>
				<category><![CDATA[Articles]]></category>
		<category><![CDATA[Module Spotlight]]></category>
		<category><![CDATA[Online Only]]></category>
		<category><![CDATA[modulespotlight]]></category>
		<category><![CDATA[opensource]]></category>
		<category><![CDATA[pshtml]]></category>

		<guid isPermaLink="false">http://www.powershellmagazine.com/?p=13387</guid>
				<description><![CDATA[Ever wanted to generate HTML documents dynamically? Until now, you had to statically code the HTML tags in the PowerShell... ]]></description>
								<content:encoded><![CDATA[
<p>Ever wanted to generate HTML documents dynamically? Until now, you had to statically code the HTML tags in your PowerShell scripts and ensure that you open and close every needed HTML element. Cumbersome, tedious, and not very efficient.</p>



<p>Do that no more! <a href="https://github.com/Stephanevg/PSHTML">PSHTML</a> is here.</p>



<p>PSHTML is a cross-platform PowerShell module that generates HTML markup through a DSL, on both Windows and Linux.</p>



<p>With PSHTML, you can write HTML documents the same way you write PowerShell scripts. You can use every possible language artifact and generate HTML code dynamically based on the input and context. Let us see a quick example!</p>



<p>Before you try out the example below, install the module from the PS Gallery.</p>



<pre class="crayon-plain-tag">html {
    head {
        title 'This is a test HTML page'
    }

    body {
        h2 'PSHTML is cool!'

        p {
            'Using PSHTML offers code completion and syntax highlighting from the default PowerShell language.'
            'As PSHTML respects the W3C standards, any HTML errors will be spotted immediately.'
        }
    }

    footer {
        p {
            'This is footer. All credits reserved to PSHTML'
        }
    }
}</pre>



<p>This generates the following HTML text.</p>



<p><a href="https://www.powershellmagazine.com/wp-content/uploads/2019/08/html1.png" rel="lightbox[13387]"><img scale="0" class="wp-image-13392" style="width: 600px;" src="http://www.powershellmagazine.com/wp-content/uploads/2019/08/html1.png" alt="" srcset="https://www.powershellmagazine.com/wp-content/uploads/2019/08/html1.png 1429w, https://www.powershellmagazine.com/wp-content/uploads/2019/08/html1-300x149.png 300w, https://www.powershellmagazine.com/wp-content/uploads/2019/08/html1-768x381.png 768w, https://www.powershellmagazine.com/wp-content/uploads/2019/08/html1-1024x508.png 1024w" sizes="(max-width: 1429px) 100vw, 1429px" /></a></p>



<p>This is very easy. Let us give it some styles. </p>



<pre class="crayon-plain-tag">html {
    head {
        title 'This is a test HTML page'
    }

    body {
        h2 'PSHTML is cool!' -Style 'color:blue;background-color:powderblue'

        p -Style 'color:red' {
            'Using PSHTML offers code completion and syntax highlighting from the default PowerShell language.'
            'As PSHTML respects the W3C standards, any HTML errors will be spotted immediately.'
        }
    }

    footer {
        p -Style 'color:green' {
            'This is footer. All credits reserved to PSHTML'
        }
    }
}</pre>



<p>This results in an HTML page as shown below!</p>



<p><a href="https://www.powershellmagazine.com/wp-content/uploads/2019/08/html2.png" rel="lightbox[13387]"><img scale="0" class="wp-image-13396" style="width: 600px;" src="http://www.powershellmagazine.com/wp-content/uploads/2019/08/html2.png" alt="" srcset="https://www.powershellmagazine.com/wp-content/uploads/2019/08/html2.png 1006w, https://www.powershellmagazine.com/wp-content/uploads/2019/08/html2-300x97.png 300w, https://www.powershellmagazine.com/wp-content/uploads/2019/08/html2-768x248.png 768w" sizes="(max-width: 1006px) 100vw, 1006px" /></a></p>



<p>This is all good but very trivial. Let us try generating a table, using Bootstrap for the styles.</p>



<pre class="crayon-plain-tag">html {
    head {
        title 'Top 5 Processes - HTML report Powered by PSHTML'
        Link -href 'https://maxcdn.bootstrapcdn.com/bootstrap/4.1.3/css/bootstrap.min.css' -rel 'stylesheet'
        script -src 'https://ajax.googleapis.com/ajax/libs/jquery/3.3.1/jquery.min.js' -type 'text/javascript'
        script -src 'https://cdnjs.cloudflare.com/ajax/libs/popper.js/1.14.3/umd/popper.min.js' -type 'text/javascript'
        script -src 'https://maxcdn.bootstrapcdn.com/bootstrap/4.1.3/js/bootstrap.min.js' -type 'text/javascript'
    }

    body {
        $topFiveProcess = Get-Process | Sort-Object -Property CPU -Descending | Select-Object -First 5 -Property ID,ProcessName,CPU
        div -Content {
            table -class 'table table-striped' {
                thead {
                    tr {
                        th {
                            'ID'
                        }

                        th {
                            'ProcessName'
                        }

                        th {
                            'CPU'
                        }
                    }
                }
                                
                Tbody {
                    foreach ($process in $topFiveProcess)
                    {
                        tr {
                            td {
                                $process.id
                            }
            
                            td {
                                $process.ProcessName
                            }
            
                            td {
                                $process.CPU
                            }
                        }
                    }
                }
            }
        }
    }

    footer {
        p -Class 'lead' -Content 'All Credits Reserved. PSHTML!'
    }
}</pre>



<p>This results in a nice table shown below!</p>



<p><a href="https://www.powershellmagazine.com/wp-content/uploads/2019/08/html3.png" rel="lightbox[13387]"><img scale="0" class="wp-image-13400" style="width: 600px;" src="http://www.powershellmagazine.com/wp-content/uploads/2019/08/html3.png" alt="" srcset="https://www.powershellmagazine.com/wp-content/uploads/2019/08/html3.png 812w, https://www.powershellmagazine.com/wp-content/uploads/2019/08/html3-300x192.png 300w, https://www.powershellmagazine.com/wp-content/uploads/2019/08/html3-768x492.png 768w" sizes="(max-width: 812px) 100vw, 812px" /></a></p>



<p>See how easy that was! I will stop this article here, as this is not a PSHTML tutorial. I hope you now have a good idea of how useful the module is. Several community members have done some great work with PSHTML. <a href="https://github.com/Stephanevg/PSHTML" target="_blank" rel="noreferrer noopener" aria-label="Check out their work as well (opens in a new tab)">Check out their work as well</a>.</p>
]]></content:encoded>
							<wfw:commentRss>https://www.powershellmagazine.com/2019/08/21/weekly-module-spotlight-pshtml/feed/</wfw:commentRss>
		<slash:comments>0</slash:comments>
							</item>
		<item>
		<title>Pester Result Reporting With Suggestions And XSL Support</title>
		<link>https://www.powershellmagazine.com/2019/07/25/pester-result-reporting-with-suggestions-and-xsl-support/</link>
				<comments>https://www.powershellmagazine.com/2019/07/25/pester-result-reporting-with-suggestions-and-xsl-support/#respond</comments>
				<pubDate>Thu, 25 Jul 2019 16:00:07 +0000</pubDate>
		<dc:creator><![CDATA[Prasoon Karunan v]]></dc:creator>
				<category><![CDATA[Articles]]></category>
		<category><![CDATA[Community]]></category>
		<category><![CDATA[Online Only]]></category>
		<category><![CDATA[Pester]]></category>
		<category><![CDATA[DevOps]]></category>

		<guid isPermaLink="false">http://www.powershellmagazine.com/?p=13362</guid>
				<description><![CDATA[I believe there is no introduction required for pester in PowerShell community. If you have never heard of Pester, this... ]]></description>
								<content:encoded><![CDATA[
<p>I believe no introduction is required for Pester in the PowerShell community. If you have never heard of Pester, <a href="https://github.com/pester/pester/wiki" target="_blank" rel="noreferrer noopener" aria-label="this (opens in a new tab)">this</a> is the place to go first.</p>



<blockquote class="wp-block-quote"><p>This article uses a custom version of Pester which is not yet an official release. This custom version with these changes is currently available <a href="https://github.com/kvprasoon/Pester/tree/remark">here</a>.</p></blockquote>



<p>Pester is used both as a unit testing framework and as an operational validation tool. In both use cases, the output report plays an important role in presenting the test results. This article is about that output report, with an added capability.</p>



<p>Below is a Pester test script with a few tests.</p>



<pre class="crayon-plain-tag">param(
    [Parameter(Mandatory = $True)]
    [string]$ConfigPath
)

$Config = Get-Content -Path $ConfigPath | ConvertFrom-Json

Describe &quot;Describing validation tests post deployment&quot; {

    Context &quot;Post deployment validation tests for services&quot; {
        BeforeAll {
            $Config.service.expectedconfiguration | ForEach-Object -Process {

                $Name = $_.Name
                $Status = $_.Status
                $StartMode = $_.StartMode
                $Service = Get-Service -Name $Name

                it &quot;Service $Name status should be $Status&quot; {
                    $Service.Status | Should -Be $Status
                } 

                it &quot;Service $Name startmode should be $StartMode&quot; {
                    $Service.StartType | Should -Be $StartMode
                } 
            }
        }
    }

    Context &quot;Post deployment validation tests for folder permission&quot; {

        $Config.folderpermission.expectedconfiguration | ForEach-Object -Process {

            $User = $_.user
            $Permission = $_.permission
            $Path = $_.path

            it &quot;user $User should have $Permission permission on path $Path&quot; {
                $Access = (Get-Acl -Path $Path).Access | Where-Object -FilterScript { $_.IdentityReference -eq $User }
                $Access.FileSystemRights | Should -Contain $Permission
            } 
        }
    }

    Context &quot;Post deployment validation tests for firewall rule&quot; {

        $Config.firewallrule.expectedconfiguration | ForEach-Object -Process {

            $Rulename = $_.rulename
            $Direction = $_.direction
            $Rule = Get-NetFirewallRule -Name $RuleName -ErrorAction SilentlyContinue

            it &quot;A Firewall rule with name $RuleName should be available&quot; {
                $Rule | Should -Not -BeNullOrEmpty
            } 

            it &quot;Firewall rule $RuleName should be allowed for $Direction connection&quot; {
                $Rule.Direction | Should -Be $Direction
            } 
        }
    }
}</pre>



<p>Data for the above test script is shown below.</p>



<pre class="crayon-plain-tag">{
    &quot;service&quot;: {
        &quot;suggestion&quot;: {
            &quot;startmode&quot;: &quot;Open PowerShell as administrator and run 'Set-Service -Name {0} -StartType {1}'&quot;,
            &quot;status&quot;: &quot;Make the service '{0}' in {1} state.&quot;
        },
        &quot;expectedconfiguration&quot;: [
            {
                &quot;name&quot;: &quot;BITS&quot;,
                &quot;status&quot;: &quot;running&quot;,
                &quot;startmode&quot;: &quot;automatic&quot;
            },
            {
                &quot;name&quot;: &quot;wuauserv&quot;,
                &quot;status&quot;: &quot;running&quot;,
                &quot;startmode&quot;: &quot;automatic&quot;
            }
        ]
    },
    &quot;folderpermission&quot;: {
        &quot;suggestion&quot;: {
            &quot;message&quot;: &quot;Give {0} permission for {1} user on {2} folder.&quot;
        },
        &quot;expectedconfiguration&quot;: [
            {
                &quot;path&quot;: &quot;c:\\Deployment\\config&quot;,
                &quot;user&quot;: &quot;RDFC\\Test&quot;,
                &quot;permission&quot;: &quot;FullControl&quot;
            },
            {
                &quot;path&quot;: &quot;c:\\Deployment\\files&quot;,
                &quot;user&quot;: &quot;RDFC\\kvprasoon&quot;,
                &quot;permission&quot;: &quot;FullControl&quot;
            }
        ]
    },
    &quot;firewallrule&quot;: {
        &quot;suggestion&quot;: {
            &quot;rulename&quot;: &quot;Open wf.msc and create an {0} rule with name '{1}'.&quot;,
            &quot;direction&quot;: &quot;Open wf.msc and create the firewall rule '{0}' for {1} connection.&quot;
        },
        &quot;expectedconfiguration&quot;: [
            {
                &quot;rulename&quot;: &quot;Rule1&quot;,
                &quot;direction&quot;: &quot;Inbound&quot;
            },
            {
                &quot;rulename&quot;: &quot;Rule2&quot;,
                &quot;direction&quot;: &quot;Outbound&quot;
            }
        ]
    }
}</pre>



<p>In a nutshell, the above Pester test script validates service status, file system permissions, and firewall rules by reading data from the JSON configuration file.</p>



<p>You can execute the test as shown below.</p>



<pre class="crayon-plain-tag">Invoke-Pester -Script @{Path = 'C:\temp\OpsValidation.tests.ps1' ; Parameters = @{ConfigPath = 'c:\temp\OpsValidationConfig.json'}}</pre>



<p>This shows the test result in the console, along with a summary.</p>



<p><img class="wp-image-13369" style="width: 600px" src="http://www.powershellmagazine.com/wp-content/uploads/2019/07/0.png" alt="" srcset="https://www.powershellmagazine.com/wp-content/uploads/2019/07/0.png 831w, https://www.powershellmagazine.com/wp-content/uploads/2019/07/0-300x17.png 300w, https://www.powershellmagazine.com/wp-content/uploads/2019/07/0-768x43.png 768w" sizes="(max-width: 831px) 100vw, 831px" /></p>



<p>When it comes to reporting, Invoke-Pester has parameters that create a <a href="https://github.com/nunit/docs/wiki/NUnit-Project-XML-Format">NUnit</a>-based XML test result file. NUnit XML reports are mostly rendered using a report reader or by converting them to HTML. There are many tools in the market that convert NUnit XML to nice UI reports; <a href="http://relevantcodes.com/reportunit/">ReportUnit</a> (now <a href="https://github.com/extent-framework/extentreports-dotnet-cli">extent-reports</a>) is my favorite. But that tool uses jQuery and CSS referenced online, which made me think about adding this capability to Pester itself. Since the report is in XML format, my choice was to go with <a href="https://www.w3schools.com/xml/xsl_intro.asp">XSLT</a>. In short, <em>XSL is a steroid for XML</em>. This article doesn&#8217;t cover how it transforms XML to HTML; <a href="https://www.w3schools.com/xml/xsl_languages.asp">here</a> is a simple example of creating an XSL targeting an XML document. So XSL can transform XML to HTML, but how do we add the stylesheet reference to the NUnit report generated by Pester? This is done by adding a new parameter to accept the XSL path and writing it as a stylesheet reference in the XML that Pester generates. I&#8217;ve named the parameter <strong>&#8211;XSLPath</strong>. Below is an example execution with the XSL path.</p>
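<p>For illustration only (the exact output of the custom build may differ), a stylesheet reference is simply an xml-stylesheet processing instruction near the top of the result file:</p>



<pre class="crayon-plain-tag">&lt;?xml version=&quot;1.0&quot; encoding=&quot;utf-8&quot; standalone=&quot;no&quot;?&gt;
&lt;?xml-stylesheet type=&quot;text/xsl&quot; href=&quot;c:\temp\OpsValidation.xsl&quot;?&gt;
&lt;test-results ...&gt;</pre>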



<pre class="crayon-plain-tag">Invoke-Pester -Script @{Path = 'C:\temp\OpsValidation.tests.ps1' ; Parameters = @{ConfigPath = 'c:\temp\OpsValidationConfig.json'}} -OutputFile c:\temp\OpsValidation.xml -OutputFormat NUnitXml -XSLPath c:\temp\OpsValidation.xsl</pre>



<p>Once executed, open the XML report in a web browser to see the magic!</p>



<p>So far so good! But here comes the interesting part of this article.</p>



<p>We are now able to produce the report in HTML, which makes it easy to see the test failures in a web browser (human beings are more interested in analyzing failures than successes!). How about adding some suggestions or remarks to help end users and support engineers fix the issue and make the test pass?</p>



<p>Well, that has to be done for each test case. Yes, for each test case. This is done by adding a new parameter to the <strong>it</strong> function in Pester.</p>



<p>Below is an example with the suggestion feature.</p>



<pre class="crayon-plain-tag">it &quot;Service BITS status should be Running&quot; {
     $Service.Status | Should -Be &quot;Running&quot;
 } -Remark &quot;Open services.msc and start BITS service&quot;</pre>



<p>But there is a caveat: we can add a parameter to the It function, but how do we add this remark to the report XML? The report XML is in NUnit format, which has a predefined layout. The <a href="https://github.com/nunit/docs/wiki/Test-Result-XML-Format">test result layout</a> has to be followed in the report so that any NUnit report reader can parse the result.</p>



<p>Well, the Pester report doesn&#8217;t use all the attributes of the NUnit test result layout, and I found the <strong>Label</strong> attribute to be a good candidate for carrying remarks. With remark support for test cases, below is the final Pester test script.</p>
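<p>As a hypothetical rendering of the idea (the attribute names and values shown here are illustrative, not copied from an actual run), a failed test case in the result XML would carry the remark in its label attribute:</p>



<pre class="crayon-plain-tag">&lt;test-case name=&quot;Service BITS status should be Running&quot; result=&quot;Failure&quot; success=&quot;False&quot;
           label=&quot;Open services.msc and start BITS service&quot; /&gt;</pre>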



<p>Script: <a href="https://gist.github.com/kvprasoon/bec40fa50d6975fcdafa6536b61cf1aa">https://gist.github.com/kvprasoon/bec40fa50d6975fcdafa6536b61cf1aa</a></p>



<p>Test Configuration: <a href="https://gist.github.com/kvprasoon/2dd5fc64eec0653e4bdde6a18da526ff">https://gist.github.com/kvprasoon/2dd5fc64eec0653e4bdde6a18da526ff</a></p>



<p>Let&#8217;s execute and see the report with suggestions.</p>



<pre class="crayon-plain-tag">Invoke-Pester -Script @{Path = 'C:\temp\OpsValidation.tests.ps1' ; Parameters = @{ConfigPath = 'c:\temp\OpsValidationConfig.json'}} -OutputFile c:\temp\OpsValidation.xml -OutputFormat NUnitXml -XSLPath c:\temp\OpsValidation.xsl</pre>



<p>After opening the generated report in a web browser.</p>



<p><img class="wp-image-13371" style="width: 600px" src="http://www.powershellmagazine.com/wp-content/uploads/2019/07/1.png" alt="" srcset="https://www.powershellmagazine.com/wp-content/uploads/2019/07/1.png 1370w, https://www.powershellmagazine.com/wp-content/uploads/2019/07/1-300x106.png 300w, https://www.powershellmagazine.com/wp-content/uploads/2019/07/1-768x271.png 768w, https://www.powershellmagazine.com/wp-content/uploads/2019/07/1-1024x361.png 1024w" sizes="(max-width: 1370px) 100vw, 1370px" /></p>



<p>If you want to explore the code changes, Pester with these changes is currently available <a href="https://github.com/kvprasoon/Pester/tree/remark">here</a>.</p>
]]></content:encoded>
							<wfw:commentRss>https://www.powershellmagazine.com/2019/07/25/pester-result-reporting-with-suggestions-and-xsl-support/feed/</wfw:commentRss>
		<slash:comments>0</slash:comments>
							</item>
		<item>
		<title>Podcast &#8211; A vision for Continuously Integrated Data Center</title>
		<link>https://www.powershellmagazine.com/2019/07/15/podcast-a-vision-for-continuously-integrated-data-center/</link>
				<comments>https://www.powershellmagazine.com/2019/07/15/podcast-a-vision-for-continuously-integrated-data-center/#respond</comments>
				<pubDate>Mon, 15 Jul 2019 16:00:00 +0000</pubDate>
		<dc:creator><![CDATA[Ravikanth C]]></dc:creator>
				<category><![CDATA[Articles]]></category>
		<category><![CDATA[Online Only]]></category>
		<category><![CDATA[podcast]]></category>
		<category><![CDATA[DevOps]]></category>
		<category><![CDATA[IaC]]></category>

		<guid isPermaLink="false">http://www.powershellmagazine.com/?p=13354</guid>
				<description><![CDATA[I was recently featured on the Latest Shiny Podcast (@l8istsh9y) hosted by Rob Hirschfeld and Stephen Spector. I came across... ]]></description>
								<content:encoded><![CDATA[
<p>I was recently featured on the <a rel="noreferrer noopener" aria-label="Latest Shiny Podcast (opens in a new tab)" href="https://l8istsh9y.com/" target="_blank">Latest Shiny Podcast</a> (<a href="https://twitter.com/l8istsh9y">@l8istsh9y</a>) hosted by Rob Hirschfeld and Stephen Spector. I came across their podcast a while ago and listened to <a rel="noreferrer noopener" aria-label="their last two episodes (opens in a new tab)" href="https://soundcloud.com/user-410091210" target="_blank">their last two episodes</a>.</p>



<p>A few weeks ago Rob (I knew him from his Dell days) tweeted about a probable topic for an upcoming episode.</p>



<figure class="wp-block-embed-twitter wp-block-embed is-type-rich is-provider-twitter"><div class="wp-block-embed__wrapper">
<blockquote class="twitter-tweet" data-width="550" data-dnt="true"><p lang="en" dir="ltr">Who wants to rant on <a href="https://twitter.com/l8istsh9y?ref_src=twsrc%5Etfw">@l8istsh9y</a> about infrastructure as code <a href="https://twitter.com/hashtag/IaC?src=hash&amp;ref_src=twsrc%5Etfw">#IaC</a>? Seems fraught, so perfect podcast topic.</p>&mdash; Rob Hirschfeld (@zehicle) <a href="https://twitter.com/zehicle/status/1141490085822644229?ref_src=twsrc%5Etfw">June 19, 2019</a></blockquote><script async src="https://platform.twitter.com/widgets.js" charset="utf-8"></script>
</div></figure>



<p>Having written a couple of published books on PowerShell DSC and being in the infrastructure automation space, Infrastructure as Code (IaC) is close to my heart. Therefore, I just jumped in and said count me in! </p>



<p>We started with a discussion on IaC, but it eventually led to Rob naming what we discussed a vision for the continuously integrated data center! Indeed, that is (or should be) the goal. The rest is what you will hear in this episode of the podcast.</p>



<figure class="wp-block-embed-soundcloud wp-block-embed is-type-rich is-provider-soundcloud wp-embed-aspect-4-3 wp-has-aspect-ratio"><div class="wp-block-embed__wrapper">
<iframe title="A Vision for the Continuous Integrated Data Center by L8ist Sh9y Podcast" width="760" height="400" scrolling="no" frameborder="no" src="https://w.soundcloud.com/player/?visual=true&#038;url=https%3A%2F%2Fapi.soundcloud.com%2Ftracks%2F645844614&#038;show_artwork=true&#038;maxwidth=760&#038;maxheight=1000&#038;dnt=1"></iframe>
</div></figure>



<p>This was a fun episode. Let me know what your thoughts are. I will certainly find some time to write about this vision and the objectives.</p>
]]></content:encoded>
							<wfw:commentRss>https://www.powershellmagazine.com/2019/07/15/podcast-a-vision-for-continuously-integrated-data-center/feed/</wfw:commentRss>
		<slash:comments>0</slash:comments>
							</item>
		<item>
		<title>Garuda &#8211; Session Demo From #PSConfEU</title>
		<link>https://www.powershellmagazine.com/2019/07/11/garuda-session-demo-from-psconfeu/</link>
				<comments>https://www.powershellmagazine.com/2019/07/11/garuda-session-demo-from-psconfeu/#respond</comments>
				<pubDate>Thu, 11 Jul 2019 14:31:47 +0000</pubDate>
		<dc:creator><![CDATA[Ravikanth C]]></dc:creator>
				<category><![CDATA[Articles]]></category>
		<category><![CDATA[DevOps]]></category>
		<category><![CDATA[Online Only]]></category>
		<category><![CDATA[Pester]]></category>
		<category><![CDATA[Operations validation]]></category>
		<category><![CDATA[OVF]]></category>

		<guid isPermaLink="false">http://www.powershellmagazine.com/?p=13352</guid>
				<description><![CDATA[Posts in this series Distributed and Flexible Operations Validation Framework - Introduction Garuda - Architecture and Plan Garuda - Session... ]]></description>
								<content:encoded><![CDATA[<div class="multi_part_posts"><h5>Posts in this series</h5><ol><li><a href="https://www.powershellmagazine.com/2019/06/17/distributed-and-flexible-operations-validation-framework-introduction/" title="Distributed and Flexible Operations Validation Framework - Introduction">Distributed and Flexible Operations Validation Framework - Introduction</a></li><li><a href="https://www.powershellmagazine.com/2019/06/24/garuda-architecture-and-plan/" title="Garuda - Architecture and Plan">Garuda - Architecture and Plan</a></li><li><strong>Garuda - Session Demo From #PSConfEU</strong></li></ol></div>
<p>In the earlier parts of this series, I introduced you to the concepts and design of the Garuda framework. I demonstrated a proof-of-concept version of it at PowerShell Conference Europe.</p>



<p>The recording of that session is available.</p>



<figure class="wp-block-embed-youtube wp-block-embed is-type-rich is-provider-embed-handler wp-embed-aspect-16-9 wp-has-aspect-ratio"><div class="wp-block-embed__wrapper">
<iframe title="Ravikanth Chaganti - Designing a distributed flexible validation framework for Test in Production" width="760" height="428" src="https://www.youtube.com/embed/uSimQ7n-130?feature=oembed" frameborder="0" allow="accelerometer; autoplay; encrypted-media; gyroscope; picture-in-picture" allowfullscreen></iframe>
</div></figure>



<p>Instead of writing another article about how the POC works, I thought it would be easier for you to see it in action.</p>



<p>I am working on a complete overhaul of the framework and will have a new version soon on GitHub. Stay tuned!</p>
]]></content:encoded>
							<wfw:commentRss>https://www.powershellmagazine.com/2019/07/11/garuda-session-demo-from-psconfeu/feed/</wfw:commentRss>
		<slash:comments>0</slash:comments>
							</item>
		<item>
		<title>Inviting Women In Tech to PowerShell Conference Asia</title>
		<link>https://www.powershellmagazine.com/2019/07/10/inviting-women-in-tech-to-powershell-conference-asia/</link>
				<comments>https://www.powershellmagazine.com/2019/07/10/inviting-women-in-tech-to-powershell-conference-asia/#respond</comments>
				<pubDate>Wed, 10 Jul 2019 14:21:18 +0000</pubDate>
		<dc:creator><![CDATA[Ravikanth C]]></dc:creator>
				<category><![CDATA[Community]]></category>
		<category><![CDATA[PowerShell]]></category>
		<category><![CDATA[PSConfAsia]]></category>
		<category><![CDATA[WomenInTech]]></category>

		<guid isPermaLink="false">http://www.powershellmagazine.com/?p=13347</guid>
				<description><![CDATA[I have been talking to several automation engineers (for a vacant position) and realized there are many women who have... ]]></description>
								<content:encoded><![CDATA[
<p>I have been talking to several automation engineers (for a vacant position) and realized there are many women who have been doing some great work in the area of infrastructure automation. However, there have been very few women attendees or speakers at our user group meetings or conferences that I attended. </p>



<p>While there may be many reasons for this, the organizing committee of PowerShell Conference Asia decided that we invite women in tech (infrastructure automation, Cloud, and DevOps) to this year&#8217;s edition of our conference. </p>



<p>We have opened up <a href="http://bit.ly/psconfwomenintech">registration of intent to attend</a> the conference. All you have to do is provide your details. We will select five registrations at random and give them a full 3-day pass to the conference at no cost. For five more, we will offer a higher discount; the organizing committee will decide the percentage.</p>



<blockquote class="wp-block-quote"><p>The free or discounted entry entitles you to the conference pass only. If you need to travel to Bangalore to attend this conference, you must bear the travel and accommodation expenses yourself.</p></blockquote>



<p>This registration will end on 15th August 2019. We will announce the selected registrations on 20th August 2019. </p>



<p>Please share this registration information and help us enable women in the infrastructure automation, cloud, and DevOps space to attend PowerShell Conference Asia 2019!</p>
]]></content:encoded>
							<wfw:commentRss>https://www.powershellmagazine.com/2019/07/10/inviting-women-in-tech-to-powershell-conference-asia/feed/</wfw:commentRss>
		<slash:comments>0</slash:comments>
							</item>
		<item>
		<title>Garuda &#8211; Architecture and Plan</title>
		<link>https://www.powershellmagazine.com/2019/06/24/garuda-architecture-and-plan/</link>
				<comments>https://www.powershellmagazine.com/2019/06/24/garuda-architecture-and-plan/#respond</comments>
				<pubDate>Mon, 24 Jun 2019 04:01:55 +0000</pubDate>
		<dc:creator><![CDATA[Ravikanth C]]></dc:creator>
				<category><![CDATA[Articles]]></category>
		<category><![CDATA[DevOps]]></category>
		<category><![CDATA[Online Only]]></category>
		<category><![CDATA[Pester]]></category>
		<category><![CDATA[Operations validation]]></category>
		<category><![CDATA[OVF]]></category>

		<guid isPermaLink="false">http://www.powershellmagazine.com/?p=13339</guid>
				<description><![CDATA[Posts in this series Distributed and Flexible Operations Validation Framework - Introduction Garuda - Architecture and Plan Garuda - Session... ]]></description>
								<content:encoded><![CDATA[<div class="multi_part_posts"><h5>Posts in this series</h5><ol><li><a href="https://www.powershellmagazine.com/2019/06/17/distributed-and-flexible-operations-validation-framework-introduction/" title="Distributed and Flexible Operations Validation Framework - Introduction">Distributed and Flexible Operations Validation Framework - Introduction</a></li><li><strong>Garuda - Architecture and Plan</strong></li><li><a href="https://www.powershellmagazine.com/2019/07/11/garuda-session-demo-from-psconfeu/" title="Garuda - Session Demo From #PSConfEU">Garuda - Session Demo From #PSConfEU</a></li></ol></div>
<p>In the first part of this series, I mentioned the reasoning behind starting development of a new framework for operations validation. Towards the end, I introduced Garuda &#8212; a distributed and flexible operations validation framework. There are certain principles that drove the design of this framework &#8212; Distributed, Flexible, and Secure.</p>



<p>In this part of the series, you will see the architecture proposal and the plan I have to implement these features. </p>



<h2>Architecture</h2>



<p>To support the principles described, the framework needs a few moving parts. These moving parts provide the flexibility needed and give you a choice of tooling.</p>



<p><img class="wp-image-13340" style="width: 500px;" src="http://www.powershellmagazine.com/wp-content/uploads/2019/06/Garuda.png" alt="" srcset="https://www.powershellmagazine.com/wp-content/uploads/2019/06/Garuda.png 1153w, https://www.powershellmagazine.com/wp-content/uploads/2019/06/Garuda-300x139.png 300w, https://www.powershellmagazine.com/wp-content/uploads/2019/06/Garuda-768x355.png 768w, https://www.powershellmagazine.com/wp-content/uploads/2019/06/Garuda-1024x473.png 1024w" sizes="(max-width: 1153px) 100vw, 1153px" /></p>



<p>At a high-level, there are five components in this framework. </p>



<h3>Test Library</h3>



<p>The test library is just a collection of parameterized Pester test scripts. The parameterization helps in reuse. As a part of the Garuda core, you can group these tests and then use the publish engine to push the tests to remote targets. The tags within the tests are used in the test execution process to control what tests need to be executed within the test group published to the remote target. </p>
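

<p>A test script in such a library might look like the following sketch. The parameter name, tag values, and the service check itself are illustrative and not part of Garuda.</p>



<pre class="crayon-plain-tag"># Illustrative parameterized Pester test; parameter values come from configuration data
param (
    [Parameter(Mandatory = $true)]
    [String]
    $ServiceName
)

Describe 'Service health' -Tag 'Service', 'Windows' {
    It "$ServiceName should be running" {
        (Get-Service -Name $ServiceName).Status | Should -Be 'Running'
    }
}</pre>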



<h3>Garuda Core</h3>



<p>This is the main module that glues the remaining pieces together, and it is what you use to manage the test library. The core module provides the ability to parse the test library and generate the parameter information for each test script, which is eventually used to generate a parameter manifest or configuration data. One of the requirements for this framework is to enable grouping of tests, and the Garuda core gives you the ability to generate test groups based on what is in the library. You can then generate the necessary parameter manifest (configuration data) template for the test group that you want to publish to remote targets. Once the configuration data for the test group is ready, you can publish the tests to the remote targets.</p>



<h3>Publish Engine</h3>



<p>The publish engine is responsible for several things. </p>



<p>This module generates the necessary configuration or deploy script that does the following:</p>



<ol><li>Install necessary dependent modules (for test scripts, from a local repository or the PowerShell Gallery)</li><li>Copy test scripts from the selected test group to the remote target</li><li>Copy the configuration data (sans the sensitive data) to the remote target</li><li>As needed, store credentials in a credential vault on the remote target</li><li>Copy the Chakra engine to the remote targets</li><li>If selected, create the JEA endpoints for operators to retrieve test results from the remote targets</li><li>If selected, create scheduled tasks on the remote target for recurring test script execution</li></ol>



<p>Once the configuration or the deploy script is ready, the publish engine can enact it directly on the remote targets or just return the script for you to enact it yourself.</p>



<p>The publish engine is extensible and by default will support PowerShell DSC and <a href="https://github.com/RamblingCookieMonster/PSDeploy">PSDeploy</a> for publishing tests to the remote targets. Eventually, I hope the community will write providers for other configuration management platforms / tools. There will be abstractions within the engine to add these providers in a transparent manner.</p>



<p>The publish engine helps secure test execution by storing sensitive configuration data in a vault. It also runs the scheduled tasks as either the SYSTEM account or a specified user. The JEA endpoints configured on the remote targets help retrieve the test results securely with the least privileges needed.</p>



<p>You can publish multiple test groups to the same remote target. This helps implement the flexibility that IT teams need in the operations space for a given infrastructure or application workload. There can be multiple JEA endpoints, one for each team publishing test groups.</p>



<h3>Chakra</h3>



<p>Chakra is what executes the tests in the test group(s) on the remote targets. It is aware of the test parameters and can use the published configuration data and the sensitive data stored in the vault for unattended execution of tests. Chakra is also responsible for result consolidation; it can be configured to retain results for X number of days. All the test results for each group are stored as timestamped JSON files. The scheduled tasks created on the remote targets invoke Chakra at the specified intervals. Chakra also contains the cmdlets that are exposed through the JEA endpoints. Using these endpoints, test operators can retrieve the test results from all the remote targets at a central system.</p>
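

<p>The retention behavior, for example, could be as simple as the following sketch. The results path and the 30-day retention period are assumptions for illustration; they are not defined by the framework yet.</p>



<pre class="crayon-plain-tag"># Hypothetical retention cleanup: prune timestamped result JSON files older than 30 days
$resultPath = 'C:\Garuda\Results'    # assumed location, not defined by the framework
Get-ChildItem -Path $resultPath -Filter *.json -Recurse |
    Where-Object { $_.LastWriteTime -lt (Get-Date).AddDays(-30) } |
    Remove-Item -Force</pre>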



<h3>Report Engine</h3>



<p>The report engine is the final piece of this framework. It enables retrieving the results from the remote targets and transforming those results into something meaningful for IT managers. By default, there will be providers for reports based on <a href="https://github.com/Stephanevg/PSHTML">PSHTML</a>, <a href="https://github.com/dfinke/ImportExcel">ImportExcel</a>, and <a href="https://github.com/ironmansoftware/universal-dashboard">UniversalDashboard</a>. The report engine provides the abstractions for the community to add more reporting options.</p>



<h2>The Plan</h2>



<p>The initial release of the framework, or what I demonstrated at the PowerShell Conference Europe, was just a proof of concept. I am planning to break the framework down into the core components mentioned above. The GitHub repository for the framework will have issues created for each of these components, and I will start implementing the basics. The 1.0 release of this framework will support every detail mentioned above and will be completely usable and functional for production use.</p>



<h2>What about the naming?</h2>



<p>The names <em>Garuda</em> and <em>Chakra</em> come from Hindu mythology, and their meanings are connected to the concepts I am proposing for this framework. Garuda is a <a href="https://en.wikipedia.org/wiki/Garuda">bird from Hindu mythology</a> with a mix of human and bird features. It is deemed powerful and is the vehicle of the Hindu god Vishnu; it can travel anywhere and is considered the king of birds. The <a href="https://en.wikipedia.org/wiki/Sudarshana_Chakra">Chakra</a> is the weapon that Lord Vishnu carries. It is used to eliminate evil and is also known as the wheel of time. Garuda is the vehicle that transports Lord Vishnu and his Chakra to places where there is evil.</p>



<p>The Garuda Core combined with the Publish engine can take your operational validation tests to your remote targets. Chakra is the way to perform operations validation to ensure that your infrastructure is always healthy and functional. </p>



<p>In the next article in this series, you will see the framework in action.</p>
]]></content:encoded>
							<wfw:commentRss>https://www.powershellmagazine.com/2019/06/24/garuda-architecture-and-plan/feed/</wfw:commentRss>
		<slash:comments>0</slash:comments>
							</item>
		<item>
		<title>PSPublicAPI &#8211; Module For Listing Free APIs For Use in Software and Web Development</title>
		<link>https://www.powershellmagazine.com/2019/06/18/pspublicapi-module-for-listing-free-apis-for-use-in-software-and-web-development/</link>
				<comments>https://www.powershellmagazine.com/2019/06/18/pspublicapi-module-for-listing-free-apis-for-use-in-software-and-web-development/#respond</comments>
				<pubDate>Tue, 18 Jun 2019 14:54:14 +0000</pubDate>
		<dc:creator><![CDATA[Ravikanth C]]></dc:creator>
				<category><![CDATA[Articles]]></category>
		<category><![CDATA[Online Only]]></category>
		<category><![CDATA[Module]]></category>
		<category><![CDATA[open source]]></category>
		<category><![CDATA[publicapi]]></category>
		<category><![CDATA[REST]]></category>

		<guid isPermaLink="false">http://www.powershellmagazine.com/?p=13327</guid>
				<description><![CDATA[The Public APIs repository on GitHub has a list of free APIs that you can use in software and web... ]]></description>
								<content:encoded><![CDATA[
<p>The <a rel="noreferrer noopener" aria-label="Public APIs repository on GitHub (opens in a new tab)" href="https://github.com/public-apis/public-apis" target="_blank">Public APIs repository on GitHub</a> has a list of free APIs that you can use in software and web development. This is a great resource for finding out if there is a free public API for a specific task at hand. For example, if your application requires weather data, you can take a look at several <a rel="noreferrer noopener" aria-label="free API options available (opens in a new tab)" href="https://github.com/public-apis/public-apis#weather" target="_blank">free API options available</a> and select the one that works for you. I have been following this repository and they have recently added something useful &#8212; a <a rel="noreferrer noopener" aria-label="public API to query for public APIs (opens in a new tab)" href="https://api.publicapis.org/" target="_blank">public API to query for public APIs</a>! </p>



<p>I quickly created a new PowerShell module that wraps around the public API for the public APIs!</p>



<p>You can install this <a href="https://www.powershellgallery.com/packages/PSPublicAPI/" target="_blank" rel="noreferrer noopener" aria-label="module from the gallery (opens in a new tab)">module from the gallery</a> as well.</p>



<p><code>Install-Module -Name PSPublicAPI -Force</code></p>



<p>There are four commands in this module.</p>



<div class="wp-block-group">
<p><strong>Get-PSPublicAPICategory</strong> &#8211; Gets a list of categories for the public API.</p>



<p><strong>Get-PSPublicAPIHealth</strong> &#8211; Gets the health state of the public API service.</p>



<p><strong>Get-PSPublicAPIEntry</strong> &#8211; Gets specific APIs or all API entries from the public API service.</p>



<p><strong>Get-PSPublicAPIRandomEntry</strong> &#8211; Gets a random API entry from the public API service, or a random entry matching specific criteria.</p>
</div>
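

<p>For example, you can explore the service with these commands as-is; no parameters are required for a quick look.</p>



<pre class="crayon-plain-tag"># Verify that the public API service is reachable
Get-PSPublicAPIHealth

# List the categories of APIs the service knows about
Get-PSPublicAPICategory

# Pick a random API entry to explore
Get-PSPublicAPIRandomEntry</pre>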



<p>The commands are pretty much self-explanatory, and you can find the docs for each command <a href="https://github.com/rchaganti/PSPublicAPI/tree/master/docs" target="_blank" rel="noreferrer noopener" aria-label="here (opens in a new tab)">here</a>.</p>
]]></content:encoded>
							<wfw:commentRss>https://www.powershellmagazine.com/2019/06/18/pspublicapi-module-for-listing-free-apis-for-use-in-software-and-web-development/feed/</wfw:commentRss>
		<slash:comments>0</slash:comments>
							</item>
		<item>
		<title>Distributed and Flexible Operations Validation Framework &#8211; Introduction</title>
		<link>https://www.powershellmagazine.com/2019/06/17/distributed-and-flexible-operations-validation-framework-introduction/</link>
				<comments>https://www.powershellmagazine.com/2019/06/17/distributed-and-flexible-operations-validation-framework-introduction/#respond</comments>
				<pubDate>Mon, 17 Jun 2019 16:16:51 +0000</pubDate>
		<dc:creator><![CDATA[Ravikanth C]]></dc:creator>
				<category><![CDATA[Articles]]></category>
		<category><![CDATA[DevOps]]></category>
		<category><![CDATA[Online Only]]></category>
		<category><![CDATA[Pester]]></category>
		<category><![CDATA[Operations]]></category>
		<category><![CDATA[OVF]]></category>
		<category><![CDATA[Validation]]></category>

		<guid isPermaLink="false">http://www.powershellmagazine.com/?p=13324</guid>
				<description><![CDATA[Posts in this series Distributed and Flexible Operations Validation Framework - Introduction Garuda - Architecture and Plan Garuda - Session... ]]></description>
								<content:encoded><![CDATA[<div class="multi_part_posts"><h5>Posts in this series</h5><ol><li><strong>Distributed and Flexible Operations Validation Framework - Introduction</strong></li><li><a href="https://www.powershellmagazine.com/2019/06/24/garuda-architecture-and-plan/" title="Garuda - Architecture and Plan">Garuda - Architecture and Plan</a></li><li><a href="https://www.powershellmagazine.com/2019/07/11/garuda-session-demo-from-psconfeu/" title="Garuda - Session Demo From #PSConfEU">Garuda - Session Demo From #PSConfEU</a></li></ol></div>
<p>[6/18/2019 &#8211; Added PoshSpec]</p>



<p>Operations validation using PowerShell and Pester has been one of my favorite topics, and I have both personal and professional interest in this area. I have invested a good amount of time experimenting with the existing frameworks and creating a couple of my own. One of my PowerShell Conference EU sessions was on this topic and a new framework that I am developing. The session was well received, and there was a good amount of interest in the new framework.</p>



<p>In this series of articles, I will write about the need for a new framework and introduce the framework that I demonstrated at <a href="http://www.psconf.eu/">PowerShell Conference Europe</a>. There is still a lot of work to be done, and this series will act as a way to express where I want to take this framework and what the end goals are.</p>



<p>In this part, you will see what options are available today for operations validation and similar use cases, and what their limitations are. Towards the end, I will talk about the desired state requirements for a distributed and flexible operations validation framework.</p>



<h2>The Current State</h2>



<p>My work with operations validation started back in 2016 when <a href="https://www.youtube.com/watch?v=vmEeTBSWd5s">I had first demonstrated</a> using Pester for validating the functional state of clusters. This first implementation was <a href="https://github.com/rchaganti/InfraBlueprints/tree/Dev/HyperVConfigurations">tightly coupled with PowerShell DSC resource modules</a> and needed configuration data supplied with DSC configuration documents to perform the operations validations. This model worked very well for infrastructure that was configured using DSC. However, this is not really a generic or flexible framework for running operations validation.</p>



<h3>Microsoft Operations Validation Framework (OVF)</h3>



<p>Around the time I started working on the operations validations bundled with infrastructure blueprints, the PowerShell team published an open source <a href="https://github.com/PowerShell/Operation-Validation-Framework/">framework meant for operations validation</a>. It implements operational tests bundled within regular PowerShell modules. The cmdlets in the framework can discover the operations validation tests packaged in the installed modules and invoke them. You can specify a hashtable of values as the test script parameters. This is a distributed test execution model: tests can be copied to all nodes and invoked on each node. This is certainly what I wanted to start with, but the tight coupling between modules and tests is not what I really want. Instead, I want to be able to distribute chosen tests as groups of tests to any node. I could have written a wrapper script around OVF to achieve what I wanted, but there are other limitations. </p>
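

<p>For reference, the typical OVF workflow looks like the sketch below. The module name is a placeholder, and the -Overrides parameter for passing the hashtable may vary with the version of the OperationValidation module you have installed.</p>



<pre class="crayon-plain-tag"># Discover operations validation tests packaged in installed modules
Get-OperationValidation

# Invoke the tests from a specific module, passing test parameters as a hashtable
Invoke-OperationValidation -ModuleName MyOpsModule -Overrides @{ ComputerName = 'Node01' }</pre>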



<p>Packaging tests as modules is unnecessary overhead. If you have a huge library of tests and you need to dynamically determine the tests that run on the remote targets, you also need to be able to generate modules dynamically. And then you need to find a way to distribute those modules among the target nodes and ensure that they are kept up to date as you update the central repository.</p>



<p>The test parameters are passed as a hashtable; therefore, if you need to invoke the tests in an unattended manner (such as via a scheduled task), you need a wrapper script that reads some sort of configuration data and translates it into the relevant parameters. But then you need a way to publish that configuration data to the remote targets as well. </p>



<h4>PSHealthz</h4>



<p><a href="https://github.com/devblackops/pshealthz">PSHealthz</a> by Brandon Olin provides a web service endpoint to invoke tests packaged or published using OVF. It is an implementation of the <a href="https://msdn.microsoft.com/en-us/library/dn589789.aspx">Health Endpoint Monitoring Pattern</a> using PowerShell. The available tests can be retrieved using the /health endpoint, and tests can be executed on the target node using query parameters on the /health endpoint. PSHealthz is more of a way to list and invoke tests on target nodes using a REST API, but the OVF limitations I mentioned above still exist. </p>



<h3>Remotely and PSRemotely</h3>



<p><a href="https://github.com/PowerShell/Remotely">Remotely</a> is an open source PowerShell module from Microsoft that can be used for running Pester tests remotely &#8212; no surprises there! You can specify a set of remote targets in a file called machineConfig.csv and then use the Remotely keyword inside the It script block to run the tests on the remote targets. This module has several drawbacks and has been more experimental than anything remotely useful (pun intended!). In fact, it has been more than three years since there was any update. Although the tests run on the remote nodes (using PowerShell remoting), they are essentially triggered from a central location in a fan-out manner. Therefore, this module implements centralized test execution and reporting.</p>



<p><a href="https://github.com/DexterPosh/PSRemotely">PSRemotely</a> was born out of the need for running tests on a bunch of remote nodes while eliminating all of Remotely&#8217;s drawbacks and providing better control over what runs when and where. This module uses DSC-style configuration data to provide test parameters for each remote node. In fact, we implemented a complete validation suite using PSRemotely before writing yet another internal framework for operations validation of clusters. The major drawback of this module was the need to enable CredSSP so that delegated credentials, when needed, can be used on the remote targets. Also, there was no infrastructure awareness in PSRemotely. The number and type of tests that run are determined by the configuration data, and we had no control over grouping tests based on the type of infrastructure. With PSRemotely, test execution is distributed and reporting is centralized; therefore, PSRemotely implements a hybrid model. With this framework, Pester tags are the only way to separate tests into groups.</p>



<h3>DBAChecks and pChecksAD</h3>



<p>Both <a href="https://github.com/sqlcollaborative/dbachecks">DBAChecks</a> and <a href="https://github.com/mczerniawski/pChecksAD">pChecksAD</a> implement a more centralized test execution and reporting model. All tests stay on the local system, and you can design these tests to target remote systems using a cmdlet-provided method, or write your tests to use PowerShell remoting to target remote systems. These are purpose-built modules, but you can take clues from how they are implemented and write one for your specific use case. They are great at what they do, but not something that satisfies my requirements for a distributed and flexible operations validation framework.</p>



<h3>PoshSpec</h3>



<p><a href="https://github.com/ticketmaster/poshspec" target="_blank" rel="noreferrer noopener" aria-label="PoshSpec  (opens in a new tab)">PoshSpec</a> is another great project that enables simplified infrastructure validation and is very easy to extend. PoshSpec provides a DSL for validating known infrastructure resources. For example, you can write a test in the PoshSpec DSL to verify that a hotfix is installed without worrying about how to get a list of hotfixes. There is currently a <a rel="noreferrer noopener" aria-label="limited set of resources (opens in a new tab)" href="https://github.com/ticketmaster/poshspec/tree/master/Public" target="_blank">limited set of resources</a>. PoshSpec implements centralized test execution; it does not yet support remote execution, but there is an <a rel="noreferrer noopener" aria-label="RFC on that (opens in a new tab)" href="https://github.com/ticketmaster/poshspec/issues/6" target="_blank">RFC on that</a>. I created that issue back in 2016 and continued to work on my own frameworks to achieve what I really need.</p>
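

<p>The hotfix check mentioned above could be written in the PoshSpec DSL roughly as follows; the KB number and service name are illustrative, so check the PoshSpec docs for the exact resource syntax.</p>



<pre class="crayon-plain-tag">Describe 'Infrastructure validation' {
    # Verify a hotfix is installed without enumerating hotfixes yourself
    Hotfix KB3116900 { Should Not BeNullOrEmpty }

    # Verify a service is running
    Service w32time Status { Should Be Running }
}</pre>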



<h2>The Desired State</h2>



<p>You have seen, so far, the options available for performing operations validation, and you have read about the limitations these frameworks and modules pose. I will now translate those limitations into the requirements for a new operations validation framework.</p>



<h3>Distributed</h3>



<p>The new framework needs to support distribution (publishing) of tests to remote targets and should offer different methods for test distribution. For example, I should be able to publish tests to remote targets using PowerShell DSC or Ansible or Chef or Puppet.</p>



<p>The new framework should support distributed test execution. I want to be able to invoke tests on-demand or on a scheduled basis on the remote targets. The input parameters or configuration data needed for the tests should be local but the framework should provide a way to publish the configuration data as well. And, the secrets within the configuration data should be encrypted.</p>



<h3>Flexible</h3>



<p>The new framework should be flexible enough to allow integration with other modules and technologies. For example, I already mentioned that test distribution should support more than one method. </p>



<p>Within infrastructure management, more than one team is typically involved in bringing up the infrastructure. For example, if a SQL cluster is being managed, there may be a team solely responsible for OS deployment &amp; management while another takes care of SQL management. Each of these teams will have their own operations validation tests. The new framework should enable a way to publish multiple test groups to the remote targets and execute and report on them independently. </p>



<p>From a reporting point of view, the new framework should be capable of supporting multiple reporting methods like HTML, Excel, and so on.</p>



<h3>Secure</h3>



<p>The tests running on the remote targets need input parameters, and these may include secure strings and secrets. Since this configuration data needs to reside on the target nodes, the sensitive data should be stored in a safe manner; for example, credentials should go into a vault such as Windows Credential Manager. The new framework should support this.</p>
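

<p>As an illustration, storing a credential in Windows Credential Manager could be done with the community CredentialManager module from the PowerShell Gallery; the module choice and the target name here are examples, not a requirement of the new framework.</p>



<pre class="crayon-plain-tag"># One possible vault: the CredentialManager module from the PowerShell Gallery
Install-Module -Name CredentialManager -Force

# Store a credential on the target node for unattended test execution
New-StoredCredential -Target 'OpsValidation' -UserName 'svc-ops' -Password 'P@ssw0rd!' -Persist LocalMachine

# Retrieve it later from within a test script
$credential = Get-StoredCredential -Target 'OpsValidation'</pre>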



<p>Result retrieval from the remote targets happens at a central console. For this, the test operators need access only to invoke the test result retrieval from the remote targets. The framework should support a least-privilege way of doing this, such as implementing a JEA endpoint.</p>
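

<p>A constrained endpoint for result retrieval could be registered along these lines; the group name, endpoint name, and the role capability are illustrative.</p>



<pre class="crayon-plain-tag"># Session configuration that maps the operators group to a restricted role
New-PSSessionConfigurationFile -Path .\OpsValidation.pssc -SessionType RestrictedRemoteServer `
    -RoleDefinitions @{ 'CONTOSO\TestOperators' = @{ RoleCapabilities = 'TestResultReader' } }

# Register the JEA endpoint on the remote target
Register-PSSessionConfiguration -Name 'OpsValidation.Operator' -Path .\OpsValidation.pssc -Force</pre>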



<h2>Introducing Garuda</h2>



<p>I have been experimenting with implementing a totally new framework that satisfies most, if not all, of the desired state requirements. This is still in the <a href="https://github.com/rchaganti/garuda">proof-of-concept phase</a>; there is not even documentation around how to use it yet. This is what I demonstrated at the PowerShell Conference EU 2019 a week ago. I named this framework Garuda, and I will write about the naming choice in the next post.</p>



<p>Today&#8217;s article is an introduction to the thought process behind Garuda. In the next post, I will explain the architecture of Garuda and talk about how some of the desired state requirements are implemented.</p>



<p>BTW, just before my session at the EU conference, I had a couple of hours to kill and created this logo for the Garuda framework. You will know the meaning of this in the next post.</p>



<p style="text-align:center"><img class="wp-image-13317" style="width: 300px;" src="http://www.powershellmagazine.com/wp-content/uploads/2019/06/logo.png" alt="" srcset="https://www.powershellmagazine.com/wp-content/uploads/2019/06/logo.png 1415w, https://www.powershellmagazine.com/wp-content/uploads/2019/06/logo-300x124.png 300w, https://www.powershellmagazine.com/wp-content/uploads/2019/06/logo-768x316.png 768w, https://www.powershellmagazine.com/wp-content/uploads/2019/06/logo-1024x422.png 1024w" sizes="(max-width: 1415px) 100vw, 1415px" /></p>
]]></content:encoded>
							<wfw:commentRss>https://www.powershellmagazine.com/2019/06/17/distributed-and-flexible-operations-validation-framework-introduction/feed/</wfw:commentRss>
		<slash:comments>0</slash:comments>
							</item>
		<item>
		<title>#PSTip A Better Way to Generate HTTP Query Strings in PowerShell</title>
		<link>https://www.powershellmagazine.com/2019/06/14/pstip-a-better-way-to-generate-http-query-strings-in-powershell/</link>
				<comments>https://www.powershellmagazine.com/2019/06/14/pstip-a-better-way-to-generate-http-query-strings-in-powershell/#respond</comments>
				<pubDate>Fri, 14 Jun 2019 05:41:49 +0000</pubDate>
		<dc:creator><![CDATA[Ravikanth C]]></dc:creator>
				<category><![CDATA[Tips and Tricks]]></category>
		<category><![CDATA[HTTP]]></category>
		<category><![CDATA[PSTip]]></category>

		<guid isPermaLink="false">http://www.powershellmagazine.com/?p=13299</guid>
				<description><![CDATA[While working on a module that interacts with REST API, I came across a situation where I had to generate... ]]></description>
								<content:encoded><![CDATA[
<p>While working on a module that interacts with a REST API, I came across a situation where I had to generate query strings for HTTP GET operations. The number of parameters varies based on what is supplied to the function. This becomes a bit tricky since HTTP query strings have a certain format. </p>



<p>For example, https://localhost:443/?name=test&amp;age=25&amp;company=powershell+magazine.</p>



<p>As you can see in the above example, the first parameter in the query string should be prefixed with a question mark and the subsequent parameters are separated by ampersands. If there are spaces in a parameter value, the spaces should be replaced with a plus sign. This can be coded easily in PowerShell, but there is a better way using the <a rel="noreferrer noopener" aria-label="System.Web.HttpUtility (opens in a new tab)" href="https://docs.microsoft.com/en-us/dotnet/api/system.web.httputility?view=netcore-2.2" target="_blank">System.Web.HttpUtility</a> class in .NET.</p>



<p>The ParseQueryString method of the HttpUtility class parses a query string and gives us a key-value collection. To start with, we can pass this method an empty string. (In Windows PowerShell, load the assembly first with <code>Add-Type -AssemblyName System.Web</code>.)</p>



<pre class="crayon-plain-tag">$nvCollection = [System.Web.HttpUtility]::ParseQueryString([String]::Empty)</pre>



<p>We can then add the key value pairs to this collection.</p>



<pre class="crayon-plain-tag">$nvCollection.Add('name','powershell')
$nvCollection.Add('age',13)
$nvCollection.Add('company','automate inc')</pre>



<p>Once the parameters are added to the collection, we can build the URI and retrieve the query string.</p>



<pre class="crayon-plain-tag">$uriRequest = [System.UriBuilder]'https://localhost'
$uriRequest.Query = $nvCollection.ToString()
$uriRequest.Uri.OriginalString</pre>



<p>This is it. I created a function out of this for reuse. </p>



<pre class="crayon-plain-tag">function New-HttpQueryString
{
    [CmdletBinding()]
    param 
    (
        [Parameter(Mandatory = $true)]
        [String]
        $Uri,

        [Parameter(Mandatory = $true)]
        [Hashtable]
        $QueryParameter
    )

    # Add System.Web
    Add-Type -AssemblyName System.Web

    # Create a http name value collection from an empty string
    $nvCollection = [System.Web.HttpUtility]::ParseQueryString([String]::Empty)

    foreach ($key in $QueryParameter.Keys)
    {
        $nvCollection.Add($key, $QueryParameter.$key)
    }

    # Build the uri
    $uriRequest = [System.UriBuilder]$Uri
    $uriRequest.Query = $nvCollection.ToString()

    return $uriRequest.Uri.OriginalString
}</pre>
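

<p>Using the function is straightforward. With a single parameter, the output is deterministic; note how the space in the value is encoded as a plus sign.</p>



<pre class="crayon-plain-tag">New-HttpQueryString -Uri 'https://localhost' -QueryParameter @{ company = 'powershell magazine' }
# Returns: https://localhost/?company=powershell+magazine</pre>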



]]></content:encoded>
							<wfw:commentRss>https://www.powershellmagazine.com/2019/06/14/pstip-a-better-way-to-generate-http-query-strings-in-powershell/feed/</wfw:commentRss>
		<slash:comments>0</slash:comments>
							</item>
		<item>
		<title>PowerShell Conference Asia &#8211; Early Bird Ticket Sales</title>
		<link>https://www.powershellmagazine.com/2019/04/09/powershell-conference-asia-early-bird-ticket-sales/</link>
				<comments>https://www.powershellmagazine.com/2019/04/09/powershell-conference-asia-early-bird-ticket-sales/#respond</comments>
				<pubDate>Tue, 09 Apr 2019 12:34:26 +0000</pubDate>
		<dc:creator><![CDATA[Ravikanth C]]></dc:creator>
				<category><![CDATA[Community]]></category>
		<category><![CDATA[PSConfAsia]]></category>

		<guid isPermaLink="false">http://www.powershellmagazine.com/?p=13285</guid>
				<description><![CDATA[PowerShell Conference Asia 2019 edition will be in Bangalore, India. We closed our CFP towards end of February and finalized... ]]></description>
								<content:encoded><![CDATA[
<p><a rel="noreferrer noopener" aria-label="PowerShell Conference Asia 2019 (opens in a new tab)" href="http://psconf.asia" target="_blank">PowerShell Conference Asia 2019</a> edition will be in Bangalore, India. We <a rel="noreferrer noopener" aria-label="closed our CFP (opens in a new tab)" href="https://www.powershellmagazine.com/2018/12/31/powershell-conference-asia-2019-call-for-papers-2/" target="_blank">closed our CFP</a> towards the end of February and finalized (partially) a great set of international PowerShell experts. This conference, as always, will feature PowerShell product team members from Redmond. </p>



<p><a rel="noreferrer noopener" aria-label="psconf.asia (opens in a new tab)" href="http://psconf.asia" target="_blank">psconf.asia</a> has been updated to feature the confirmed speakers, and the list includes experts from the USA, Australia, Europe, and India. As I write this, we are yet to finalize a few more speakers, and I am sure we will have a fully loaded agenda for all three days. The pre-conf workshops include content from level 100 (PowerShell 101) to CI/CD for PowerShell professionals. On the pre-conf day, there is a track dedicated to deep-dive sessions for attendees who are already comfortable writing PowerShell scripts and modules. </p>



<p>At this point, we have opened the early bird discount sale (15% off the original ticket price). The tickets are priced in INR, and a 3-day pass currently costs less than 100 USD.</p>



<p>You can get the 3 day (includes pre-conf day) early bird pass @ <a rel="noreferrer noopener" aria-label=" (opens in a new tab)" href="https://imjo.in/N8t8G7" target="_blank">https://imjo.in/N8t8G7</a> and the 2 day (only full-conf days) early bird pass @ <a rel="noreferrer noopener" aria-label=" (opens in a new tab)" href="https://www.instamojo.com/@tecoholic/l2f610b18b43c4deb9f534b1082b7f414/" target="_blank">https://www.instamojo.com/@tecoholic/l2f610b18b43c4deb9f534b1082b7f414/</a> </p>



<p>If you have a group of 5 or more interested in attending this year&#8217;s conference, reach out to us at <a aria-label="get-help@psasia.org (opens in a new tab)" rel="noreferrer noopener" href="mailto:get-help@psasia.org" target="_blank">get-help@psasia.org</a>. We will let you know the best we can do for your group.</p>



<p>We have already sold a few tickets and are very happy with the progress over the last couple of days. Can&#8217;t wait to see you all in September.</p>
]]></content:encoded>
							<wfw:commentRss>https://www.powershellmagazine.com/2019/04/09/powershell-conference-asia-early-bird-ticket-sales/feed/</wfw:commentRss>
		<slash:comments>0</slash:comments>
							</item>
		<item>
		<title>PSWindowsAdminCenter: PowerShell Module to Manage Connections, Feeds, and Extensions</title>
		<link>https://www.powershellmagazine.com/2019/02/19/pswindowsadmincenter-powershell-module-to-manage-connections-feeds-and-extensions/</link>
				<comments>https://www.powershellmagazine.com/2019/02/19/pswindowsadmincenter-powershell-module-to-manage-connections-feeds-and-extensions/#respond</comments>
				<pubDate>Tue, 19 Feb 2019 14:42:53 +0000</pubDate>
		<dc:creator><![CDATA[Ravikanth C]]></dc:creator>
				<category><![CDATA[Articles]]></category>
		<category><![CDATA[Online Only]]></category>
		<category><![CDATA[PowerShell]]></category>
		<category><![CDATA[WAC]]></category>
		<category><![CDATA[Windows Admin Center]]></category>

		<guid isPermaLink="false">http://www.powershellmagazine.com/?p=13269</guid>
				<description><![CDATA[I had published a PowerShell DSC resource module, last month, called WindowsAdminCenterDsc that uses the PowerShell module that was made... ]]></description>
								<content:encoded><![CDATA[
<p>Last month, I published a PowerShell DSC resource module called <a rel="noreferrer noopener" aria-label="WindowsAdminCenterDsc (opens in a new tab)" href="https://www.powershellmagazine.com/2019/01/31/dsc-resource-module-to-install-and-configure-windows-admin-center/" target="_blank">WindowsAdminCenterDsc</a> that uses the PowerShell module made available with Windows Admin Center version 1812. That module uses the REST API that comes with Windows Admin Center to manage connections, feeds, and extensions. </p>



<p>I immediately verified that the API is available in version 1809.5 as well. So, I wanted to build another PowerShell module with features similar to, or beyond, those of the module that ships with version 1812. The goal was also to ensure that I can use this module in my build process to add newly deployed servers and clusters to Windows Admin Center in an automated manner.</p>



<blockquote class="wp-block-quote"><p><strong>Note</strong>: This module works with Windows Admin Center 1809.5 and above.</p></blockquote>



<p>This module can be installed from PowerShell Gallery:</p>



<pre class="crayon-plain-tag">Install-Module -Name PSWindowsAdminCenter</pre>



<p>The module includes the following commands:</p>

<table class="wp-block-table"><thead><tr><th>Command</th><th>Description</th></tr></thead><tbody><tr><td>Get-WacConnection</td><td>Gets connections added to Windows Admin Center for management.</td></tr><tr><td>Add-WacConnection</td><td>Adds a new connection to Windows Admin Center for management.</td></tr><tr><td>Get-WacFeed</td><td>Gets all extension feeds available in Windows Admin Center.</td></tr><tr><td>Add-WacFeed</td><td>Adds an extension feed to Windows Admin Center.</td></tr><tr><td>Remove-WacFeed</td><td>Removes an extension feed from Windows Admin Center.</td></tr><tr><td>Get-WacExtension</td><td>Gets all extensions available or installed in Windows Admin Center.</td></tr><tr><td>Install-WacExtension</td><td>Installs an extension.</td></tr><tr><td>Uninstall-WacExtension</td><td>Uninstalls an extension.</td></tr><tr><td>Update-WacExtension</td><td>Updates an extension.</td></tr></tbody></table>
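<p>As a sketch of how these commands might be combined to configure a fresh Windows Admin Center gateway: the cmdlet names below come from the table above, but the parameter names and values shown are illustrative assumptions, so check <em>Get-Help</em> for each command for the exact syntax.</p>

<pre class="crayon-plain-tag"># Illustrative sketch only; parameter names and values are assumptions
# Add an extension feed and install an extension from it
Add-WacFeed -GatewayEndpoint 'https://wac.contoso.com' -Feed 'https://aka.ms/win-admin-center-feed'
Install-WacExtension -GatewayEndpoint 'https://wac.contoso.com' -ExtensionId 'msft.sme.example'

# Add a server connection and confirm it shows up
Add-WacConnection -GatewayEndpoint 'https://wac.contoso.com' -ConnectionName 'server01.contoso.com'
Get-WacConnection -GatewayEndpoint 'https://wac.contoso.com'</pre>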



<p>This project is available in <a href="https://github.com/rchaganti/PSWindowsAdminCenter/" target="_blank" rel="noreferrer noopener" aria-label="my GitHub repository (opens in a new tab)">my GitHub repository</a>. I have a few TODOs:</p>



<ul><li>Add Export option to <em>Get-WacConnection</em> command so that you can export the connections details to a CSV file.</li><li>Add Import option to <em>Add-WacConnection</em> command so that you can import all connections from a CSV file.</li><li>Update <em>WindowsAdminCenterDsc </em>module to use the <em>PSWindowsAdminCenter </em>instead of the module that ships with WAC.</li></ul>



<p>If you see any issues or would like to see new features, feel free to <a href="https://github.com/rchaganti/PSWindowsAdminCenter/issues/new" target="_blank" rel="noreferrer noopener" aria-label="create an issue (opens in a new tab)">create an issue</a>.</p>
]]></content:encoded>
							<wfw:commentRss>https://www.powershellmagazine.com/2019/02/19/pswindowsadmincenter-powershell-module-to-manage-connections-feeds-and-extensions/feed/</wfw:commentRss>
		<slash:comments>0</slash:comments>
							</item>
		<item>
		<title>PowerShell DSC Resource Module to Install and Configure Windows Admin Center</title>
		<link>https://www.powershellmagazine.com/2019/01/31/dsc-resource-module-to-install-and-configure-windows-admin-center/</link>
				<comments>https://www.powershellmagazine.com/2019/01/31/dsc-resource-module-to-install-and-configure-windows-admin-center/#respond</comments>
				<pubDate>Thu, 31 Jan 2019 17:00:15 +0000</pubDate>
		<dc:creator><![CDATA[Ravikanth C]]></dc:creator>
				<category><![CDATA[Articles]]></category>
		<category><![CDATA[DevOps]]></category>
		<category><![CDATA[Online Only]]></category>
		<category><![CDATA[PowerShell DSC]]></category>
		<category><![CDATA[PSDSC]]></category>
		<category><![CDATA[WAC]]></category>
		<category><![CDATA[Windows Admin Center]]></category>

		<guid isPermaLink="false">http://www.powershellmagazine.com/?p=13210</guid>
				<description><![CDATA[Windows Admin Center (WAC) is the new web-based management application for managing Windows Servers, Failover Clusters, Hyper-Converged Infrastructure (HCI) clusters,... ]]></description>
								<content:encoded><![CDATA[
<p><a aria-label="Windows Admin Center (opens in a new tab)" rel="noreferrer noopener" href="https://docs.microsoft.com/en-us/windows-server/manage/windows-admin-center/understand/windows-admin-center" target="_blank">Windows Admin Center</a> (WAC) is the new web-based management application for managing Windows Servers, Failover Clusters, Hyper-Converged Infrastructure (HCI) clusters, and Windows 10 PCs. It is a free application and can be installed on Windows Server 2016, Windows Server 2019, or Windows 10. Unlike System Center Operations Manager (SCOM), WAC does not store any monitoring data locally and therefore provides only near-real-time information.</p>



<p>Ever since WAC was released, I had wanted to automatically onboard the servers and clusters I want to manage within WAC right after their deployment is complete. Earlier, there was no API available to do this. </p>



<p>With the release of WAC version 1812 (insider preview), a couple of PowerShell modules are bundled with WAC. These modules internally wrap the REST API for a few management tasks.</p>



<p>When I saw this, I immediately explored a design to implement DSC resources for WAC install/uninstall and configuration. And, the result is here: <a href="https://github.com/rchaganti/WindowsAdminCenterDsc">https://github.com/rchaganti/WindowsAdminCenterDsc</a></p>



<blockquote class="wp-block-quote"><p>This works only with Windows Admin Center 1812 insider preview and above.</p></blockquote>



<p>This REST API is available in 1809.5 as well, and I am working on creating a PowerShell module to wrap that API as a set of cmdlets. I will update this DSC resource module accordingly, without breaking the current DSC resource design.</p>



<p>This module contains a set of resources to install Windows Admin Center (WAC) and configure WAC feeds, extensions, and connections.</p>



<table class="wp-block-table"><thead><tr><th>DSC Resource Name</th><th>Description</th></tr></thead><tbody><tr><td>WacSetup</td><td>Installs Windows Admin Center. This is a composite resource and enables options to change the port and the certificate thumbprint to a local certificate instead of a self-signed certificate.</td></tr><tr><td>WacFeed</td><td>This resource supports adding and removing WAC extension feeds.</td></tr><tr><td>WacExtension</td><td>This resource supports installing and uninstalling WAC extensions.</td></tr><tr><td>WacServerConnection</td><td>Use this resource to add a Windows Server for management within WAC.</td></tr><tr><td>WacHciConnection</td><td>Use this resource to add an HCI cluster for management within WAC.</td></tr><tr><td>WacClusterConnection</td><td>Use this resource to add a Windows Failover Cluster for management within WAC.</td></tr></tbody></table>
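<p>To give a feel for how these resources might compose, here is a minimal configuration sketch. The resource names come from the table above, but the property names shown are illustrative assumptions; refer to the module documentation for the actual schema.</p>

<pre class="crayon-plain-tag"># Illustrative configuration; property names are assumptions
Configuration WacDeployment
{
    Import-DscResource -ModuleName WindowsAdminCenterDsc

    Node 'localhost'
    {
        # Install WAC on a non-default port
        WacSetup InstallWac
        {
            Port = 6516
        }

        # Add a server connection once WAC is installed
        WacServerConnection AddServer01
        {
            ConnectionName = 'server01.contoso.com'
            DependsOn      = '[WacSetup]InstallWac'
        }
    }
}</pre>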



<p>For complete documentation, see&nbsp;<a href="https://windowsadmincenterdsc.readthedocs.io/en/latest/">https://windowsadmincenterdsc.readthedocs.io/en/latest/</a>.</p>
]]></content:encoded>
							<wfw:commentRss>https://www.powershellmagazine.com/2019/01/31/dsc-resource-module-to-install-and-configure-windows-admin-center/feed/</wfw:commentRss>
		<slash:comments>0</slash:comments>
							</item>
		<item>
		<title>Using PowerShell for generating dynamic folders in Royal TS</title>
		<link>https://www.powershellmagazine.com/2019/01/03/using-powershell-for-generating-dynamic-folders-in-royal-ts/</link>
				<comments>https://www.powershellmagazine.com/2019/01/03/using-powershell-for-generating-dynamic-folders-in-royal-ts/#respond</comments>
				<pubDate>Thu, 03 Jan 2019 11:00:13 +0000</pubDate>
		<dc:creator><![CDATA[Jan Egil Ring]]></dc:creator>
				<category><![CDATA[Articles]]></category>

		<guid isPermaLink="false">http://www.powershellmagazine.com/?p=13184</guid>
				<description><![CDATA[Royal TS is a powerful tool for managing remote systems using many different protocols such as Remote Desktop (RDP), PowerShell,... ]]></description>
								<content:encoded><![CDATA[
<p><a href="https://www.royalapplications.com/">Royal
TS</a> is a powerful tool for managing remote systems using many different
protocols such as Remote Desktop (RDP), PowerShell, SSH, HTTPS, and many more.
In this article we will look at a new feature introduced in Royal TS 5.0
(released in December 2018) called dynamic folders.</p>



<p>Previously, I have written an <a href="https://www.powershellmagazine.com/2015/01/08/introducing-the-royal-ts-powershell-module/">article</a>
covering a PowerShell module for managing Royal TS documents, which is built
into Royal TS. </p>



<p>In that article I showcased a script
called <a href="https://github.com/janegilring/PSCommunity/blob/master/Royal%20TS/Update-RoyalFolder.ps1">Update-RoyalFolder.ps1</a>,
which could replicate server computer objects from a specified Active Directory
domain or Organizational Unit (OU).</p>



<p>This was very useful as the script could be
scheduled to update a Royal TS connections document, for example on a daily
basis.</p>



<p>The new Dynamic Folder Script feature allows
you to configure a script and the interpreter which populates the dynamic
folder content. </p>



<p>We start by creating a new folder of the Dynamic
Folder type:</p>



<figure class="wp-block-image is-resized"><img src="http://www.powershellmagazine.com/wp-content/uploads/2019/01/Royal_TS_Dynamic_Folders_01-1024x903.png" alt="" class="wp-image-13185" width="512" height="452" srcset="https://www.powershellmagazine.com/wp-content/uploads/2019/01/Royal_TS_Dynamic_Folders_01-1024x903.png 1024w, https://www.powershellmagazine.com/wp-content/uploads/2019/01/Royal_TS_Dynamic_Folders_01-300x265.png 300w, https://www.powershellmagazine.com/wp-content/uploads/2019/01/Royal_TS_Dynamic_Folders_01-768x678.png 768w, https://www.powershellmagazine.com/wp-content/uploads/2019/01/Royal_TS_Dynamic_Folders_01.png 1121w" sizes="(max-width: 512px) 100vw, 512px" /></figure>



<p>Give it a meaningful name, such as the
source we are going to dynamically retrieve data from:</p>



<figure class="wp-block-image is-resized"><img src="http://www.powershellmagazine.com/wp-content/uploads/2019/01/Royal_TS_Dynamic_Folders_03-1024x344.png" alt="" class="wp-image-13187" width="512" height="172" srcset="https://www.powershellmagazine.com/wp-content/uploads/2019/01/Royal_TS_Dynamic_Folders_03-1024x344.png 1024w, https://www.powershellmagazine.com/wp-content/uploads/2019/01/Royal_TS_Dynamic_Folders_03-300x101.png 300w, https://www.powershellmagazine.com/wp-content/uploads/2019/01/Royal_TS_Dynamic_Folders_03-768x258.png 768w" sizes="(max-width: 512px) 100vw, 512px" /></figure>



<p>On the Dynamic Folder Script window, we can
choose PowerShell to be the script interpreter:</p>



<figure class="wp-block-image is-resized"><img src="http://www.powershellmagazine.com/wp-content/uploads/2019/01/Royal_TS_Dynamic_Folders_02-1024x475.png" alt="" class="wp-image-13186" width="512" height="238" srcset="https://www.powershellmagazine.com/wp-content/uploads/2019/01/Royal_TS_Dynamic_Folders_02-1024x475.png 1024w, https://www.powershellmagazine.com/wp-content/uploads/2019/01/Royal_TS_Dynamic_Folders_02-300x139.png 300w, https://www.powershellmagazine.com/wp-content/uploads/2019/01/Royal_TS_Dynamic_Folders_02-768x357.png 768w, https://www.powershellmagazine.com/wp-content/uploads/2019/01/Royal_TS_Dynamic_Folders_02.png 1174w" sizes="(max-width: 512px) 100vw, 512px" /></figure>



<p>You will get an example script which shows
what kind of objects are expected, as well as how to convert them to JSON (which
is the output format Royal TS expects):</p>



<figure class="wp-block-image is-resized"><img src="http://www.powershellmagazine.com/wp-content/uploads/2019/01/Royal_TS_Dynamic_Folders_04-1024x759.png" alt="" class="wp-image-13188" width="512" height="380" srcset="https://www.powershellmagazine.com/wp-content/uploads/2019/01/Royal_TS_Dynamic_Folders_04-1024x759.png 1024w, https://www.powershellmagazine.com/wp-content/uploads/2019/01/Royal_TS_Dynamic_Folders_04-300x222.png 300w, https://www.powershellmagazine.com/wp-content/uploads/2019/01/Royal_TS_Dynamic_Folders_04-768x570.png 768w" sizes="(max-width: 512px) 100vw, 512px" /></figure>



<p>After modifying the options, such as
credentials, click OK. When right-clicking the folder we created, we can see
that we have the option of reloading the folder:</p>



<figure class="wp-block-image is-resized"><img src="http://www.powershellmagazine.com/wp-content/uploads/2019/01/Royal_TS_Dynamic_Folders_05.png" alt="" class="wp-image-13189" width="226" height="209" srcset="https://www.powershellmagazine.com/wp-content/uploads/2019/01/Royal_TS_Dynamic_Folders_05.png 452w, https://www.powershellmagazine.com/wp-content/uploads/2019/01/Royal_TS_Dynamic_Folders_05-300x277.png 300w" sizes="(max-width: 226px) 100vw, 226px" /></figure>



<p>Clicking this will trigger the PowerShell
script we just saw on the Dynamic Folder Script window, and the folder should
now look like this when using the provided example script.</p>



<figure class="wp-block-image is-resized"><img src="http://www.powershellmagazine.com/wp-content/uploads/2019/01/Royal_TS_Dynamic_Folders_06.png" alt="" class="wp-image-13190" width="137" height="104"/></figure>



<p>According to the <a href="https://content.royalapplications.com/Help/RoyalTS/V5/index.html?reference_dynamicfolder_advanced.htm">documentation</a>,
there are two options available for reloading a dynamic folder:</p>



<ul><li><strong><em>Automatically reload folder
contents</em></strong><em>
&#8211; If checked, Royal TS will automatically reload the folder contents when the
document is opened.</em></li><li><strong><em>Persist (cache) folder
contents</em></strong><em>
&#8211; If checked, Royal TS will save (cache) the contents of this dynamic folder
within the document.</em></li></ul>



<p>By default, neither of the two options is
enabled.</p>



<p>In order to populate the dynamic folder
with our own data using PowerShell, we can build custom objects with the
necessary properties:</p>


<pre class="brush: powershell; title: ; notranslate">
<pre class="crayon-plain-tag"># Build one connection object per computer (for example, inside ForEach-Object)
[PSCustomObject]@{
        Name                   = $PSItem.Name
        Type                   = 'RemoteDesktopConnection'
        ComputerName           = $PSItem.Name
        CredentialName         = 'DOMAIN\username'
        Path                   = 'MySubfolder'
    }</pre>
</pre>


<p>This example creates an object to be used with
a Remote Desktop Connection. If you want to build other connection types, see
the documentation for <a href="https://support.royalapplications.com/support/solutions/articles/17000070210">RoyalJSON
and Dynamic Folders</a>.</p>



<p>Next, we need to put all the objects we
have created in a hash table called ‘Objects’, as this is what the RoyalJSON
format expects:</p>


<pre class="brush: powershell; title: ; notranslate">
<pre class="crayon-plain-tag"># $Servers holds the custom objects created above
$RoyalTSObjects = @{}
$null = $RoyalTSObjects.Add('Objects',$Servers)</pre>
</pre>


<hr class="wp-block-separator"/>



<p>The final piece is to convert the hash table
to JSON format, which is very convenient to do in PowerShell:</p>


<pre class="brush: powershell; title: ; notranslate">
<pre class="crayon-plain-tag">$RoyalTSObjects | ConvertTo-Json</pre>
</pre>


<p>As you can see, for a PowerShell user it is
very straightforward to build a script for dynamically populating a folder with
connection objects in Royal TS.</p>
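<p>Putting the three steps together, a minimal complete Dynamic Folder Script looks like this. The server names here are placeholders; in practice they would come from a data source such as Active Directory:</p>

<pre class="crayon-plain-tag"># Minimal Dynamic Folder Script: build objects, wrap them, emit RoyalJSON
$Servers = foreach ($Name in 'SERVER01', 'SERVER02') {
    [PSCustomObject]@{
        Name           = $Name
        Type           = 'RemoteDesktopConnection'
        ComputerName   = $Name
        CredentialName = 'DOMAIN\username'
        Path           = 'MySubfolder'
    }
}

$RoyalTSObjects = @{}
$null = $RoyalTSObjects.Add('Objects', $Servers)

# Royal TS reads the JSON from the script's standard output
$RoyalTSObjects | ConvertTo-Json -Depth 3</pre>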



<p>We can populate it with any data we can
retrieve from PowerShell.</p>



<p>We will start by looking at an example of
how to accomplish this using Active Directory as the data source.</p>



<p>You need to install Remote Server Administration
Tools (RSAT) in order to leverage the Active Directory module for PowerShell
from a workstation. Starting with Windows 10 October 2018 Update, RSAT is
included as a set of &#8220;Features on Demand&#8221; in Windows 10 itself. From
PowerShell, it can be installed using this command:</p>


<pre class="brush: powershell; title: ; notranslate">
<pre class="crayon-plain-tag">Add-WindowsCapability -Online -Name Rsat.ActiveDirectory.DS-LDS.Tools~~~~0.0.1.0</pre>
</pre>


<p>A Royal TS community member has created a <a href="https://www.youtube.com/watch?v=pKurlGhMfoQ">YouTube video</a> explaining
more details about retrieving data from Active Directory for building a dynamic
folder in Royal TS, such as building the dynamic folder's subfolder structure
based on the Organizational Unit structure the computer objects are retrieved
from.</p>



<p>You can use your existing skills for
retrieving the computer accounts you want from Active Directory. I am re-using
an existing function I have created called Get-ServerFromAD. It retrieves
computer accounts with an operating system name starting with Windows Server*,
excludes Cluster Name Objects, and includes only computer accounts which have
logged on within the last specified number of days (30 by default).</p>
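<p>The core of such a function can be sketched as follows. The filter used to exclude Cluster Name Objects is illustrative; refer to the linked Get-ServerFromAD.ps1 for the real implementation:</p>

<pre class="crayon-plain-tag"># Requires the ActiveDirectory module from RSAT; CNO filter is illustrative
Get-ADComputer -Filter { OperatingSystem -like 'Windows Server*' } `
    -Properties OperatingSystem, LastLogonDate, ServicePrincipalNames |
    Where-Object {
        $_.LastLogonDate -ge (Get-Date).AddDays(-30) -and
        -not ($_.ServicePrincipalNames -match 'MSClusterVirtualServer')  # skip Cluster Name Objects
    } |
    Select-Object -ExpandProperty Name</pre>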



<p>You can find the complete script <a href="https://github.com/janegilring/PSCommunity/blob/master/Royal%20TS/Dynamic%20Folder%20Scripts/Get-ServerFromAD.ps1">here</a>.</p>



<p>I would recommend first running the script
manually in order to verify that data can be retrieved from Active Directory.
You may also want to customize options such as domain name and credentials. I
am using the script from a non-domain-joined laptop, hence I need to specify
credentials.</p>



<p>When the script is customized and verified,
paste it in the Dynamic Folder Script window and click OK:</p>



<figure class="wp-block-image is-resized"><img src="http://www.powershellmagazine.com/wp-content/uploads/2019/01/Royal_TS_Dynamic_Folders_11.png" alt="" class="wp-image-13198" width="600" height="454" srcset="https://www.powershellmagazine.com/wp-content/uploads/2019/01/Royal_TS_Dynamic_Folders_11.png 800w, https://www.powershellmagazine.com/wp-content/uploads/2019/01/Royal_TS_Dynamic_Folders_11-300x227.png 300w, https://www.powershellmagazine.com/wp-content/uploads/2019/01/Royal_TS_Dynamic_Folders_11-768x581.png 768w" sizes="(max-width: 600px) 100vw, 600px" /></figure>



<p>After re-loading the document, you should now see server computer accounts from Active Directory:</p>



<figure class="wp-block-image"><img src="http://www.powershellmagazine.com/wp-content/uploads/2019/01/Royal_TS_Dynamic_Folders_07.png" alt="" class="wp-image-13191"/></figure>



<p>I have also created a <a href="https://github.com/janegilring/PSCommunity/blob/master/Royal%20TS/Dynamic%20Folder%20Scripts/Get-ServerFromSCVMM.ps1">script</a>
for getting computer names from System Center Virtual Machine Manager (both
hosts and virtual machines), which can be used to populate a dynamic folder:</p>



<figure class="wp-block-image"><img src="http://www.powershellmagazine.com/wp-content/uploads/2019/01/Royal_TS_Dynamic_Folders_08.png" alt="" class="wp-image-13192"/></figure>



<p>In my example script I’ve created a flat structure, putting all computer names in a subfolder called VMM. Here it is possible to do all sorts of creative things, such as use Group-Object on the OperatingSystem property and then create SSH connections for Linux machines and RDP connections for Windows machines.</p>



<p>I plan to create other scripts to retrieve computer names from other sources, such as Azure, Amazon Web Services, and VMware. Royal Applications has a dedicated <a href="https://github.com/royalapplications/toolbox">repository</a> for various automation scripts &#8211; created both by the Royal TS team and the community – where I also will submit Pull Requests for my contributions.</p>



<p><strong>Using
PowerShell Core as Script Interpreter</strong></p>



<p>By navigating to the Royal TS Options, it
is possible to modify Script Interpreter settings in the Advanced section:</p>



<figure class="wp-block-image is-resized"><img src="http://www.powershellmagazine.com/wp-content/uploads/2019/01/Royal_TS_Dynamic_Folders_09.png" alt="" class="wp-image-13193" width="600" height="450" srcset="https://www.powershellmagazine.com/wp-content/uploads/2019/01/Royal_TS_Dynamic_Folders_09.png 800w, https://www.powershellmagazine.com/wp-content/uploads/2019/01/Royal_TS_Dynamic_Folders_09-300x225.png 300w, https://www.powershellmagazine.com/wp-content/uploads/2019/01/Royal_TS_Dynamic_Folders_09-768x576.png 768w" sizes="(max-width: 600px) 100vw, 600px" /></figure>



<p>By default, the PowerShell Script
Interpreter is configured with the following path:</p>



<p><em>%windir%\System32\WindowsPowerShell\v1.0\powershell.exe</em></p>



<p>If you would rather leverage PowerShell Core as the engine for the PowerShell Script Interpreter, simply change the path to pwsh.exe (if it is available in the system path):</p>



<figure class="wp-block-image"><img src="http://www.powershellmagazine.com/wp-content/uploads/2019/01/Royal_TS_Dynamic_Folders_10.png" alt="" class="wp-image-13194" srcset="https://www.powershellmagazine.com/wp-content/uploads/2019/01/Royal_TS_Dynamic_Folders_10.png 567w, https://www.powershellmagazine.com/wp-content/uploads/2019/01/Royal_TS_Dynamic_Folders_10-300x97.png 300w" sizes="(max-width: 567px) 100vw, 567px" /></figure>



<p>Alternatively, specify the full path:</p>



<p><em>C:\Program
Files\PowerShell\6\pwsh.exe</em></p>



<p>I have been using PowerShell Core without
issues for the two Dynamic Folder Scripts I have shown in this article. For the
Active Directory Dynamic Folder Script, PowerShell Core takes ~2.5 seconds
to run against my lab Active Directory instance, while Windows PowerShell
takes ~4 seconds.</p>



<p><strong>Summary</strong></p>



<p>In this article we have looked at how the
Dynamic Folder Script feature in Royal TS can be used to dynamically create
Royal TS connection objects based on data gathered by a PowerShell script.</p>



<p>We also looked at different sources we can
retrieve data from, such as Active Directory and System Center Virtual Machine
Manager.</p>



<p><em>Bonus
tip</em></p>



<p>If you are a Microsoft MVP, you can get an
NFR license for Royal TS by sending an e-mail to support (at) royalapplications
(dot) com with a link to your MVP profile.</p>
]]></content:encoded>
							<wfw:commentRss>https://www.powershellmagazine.com/2019/01/03/using-powershell-for-generating-dynamic-folders-in-royal-ts/feed/</wfw:commentRss>
		<slash:comments>0</slash:comments>
							</item>
		<item>
		<title>#PowerShell Conference Asia 2019 &#8211; Call for papers!</title>
		<link>https://www.powershellmagazine.com/2018/12/31/powershell-conference-asia-2019-call-for-papers-2/</link>
				<comments>https://www.powershellmagazine.com/2018/12/31/powershell-conference-asia-2019-call-for-papers-2/#respond</comments>
				<pubDate>Tue, 01 Jan 2019 03:32:02 +0000</pubDate>
		<dc:creator><![CDATA[Ravikanth C]]></dc:creator>
				<category><![CDATA[Articles]]></category>
		<category><![CDATA[Community]]></category>
		<category><![CDATA[PowerShell Conference]]></category>
		<category><![CDATA[PSConfAsia]]></category>

		<guid isPermaLink="false">http://www.powershellmagazine.com/?p=13180</guid>
				<description><![CDATA[We announced back in October that the next edition of PowerShell Conference Asia will be hosted in Bangalore (India) and... ]]></description>
								<content:encoded><![CDATA[
<p>We announced back in October that the next edition of <a rel="noreferrer noopener" aria-label="PowerShell Conference Asia (opens in a new tab)" href="http://psconf.asia" target="_blank">PowerShell Conference Asia</a> will be hosted in Bangalore (India) and is scheduled to happen from 19th to 21st September 2019.</p>



<p>Like the previous year, we will have day-long pre-conference workshops this year too. In addition, we are introducing pre-conference deep-dive talks. While the workshops are targeted more towards beginners, the pre-conference deep-dive talks are meant for those of you who are already at level 150 or 200.</p>



<p>The main conference starts on 20th September 2019. The two days of this conference will have exciting in-depth talks by experts from across the world.</p>



<p>Today, we are announcing the <a rel="noreferrer noopener" aria-label="call for papers (CFP) (opens in a new tab)" href="https://www.papercall.io/psconfasia2019" target="_blank">call for papers (CFP)</a> to invite interested speakers from around the world to share their knowledge at PowerShell Conference Asia 2019. If you have some exciting work that you want to share with a larger international PowerShell community, this is your chance. Go ahead and <a href="https://www.papercall.io/psconfasia2019" target="_blank" rel="noreferrer noopener" aria-label="submit your session proposals (opens in a new tab)">submit your session proposals</a>. We will be closing the CFP by the end of February. </p>



<p>If you are a PowerShell beginner or interested in networking with the international community of PowerShell experts and the PowerShell product team from Microsoft, stay tuned for our blind/early bird ticket sales that will start very soon.</p>


]]></content:encoded>
							<wfw:commentRss>https://www.powershellmagazine.com/2018/12/31/powershell-conference-asia-2019-call-for-papers-2/feed/</wfw:commentRss>
		<slash:comments>0</slash:comments>
							</item>
		<item>
		<title>Redfish Event Listener in PowerShell</title>
		<link>https://www.powershellmagazine.com/2018/11/13/redfish-event-listener-in-powershell/</link>
				<comments>https://www.powershellmagazine.com/2018/11/13/redfish-event-listener-in-powershell/#respond</comments>
				<pubDate>Tue, 13 Nov 2018 09:00:51 +0000</pubDate>
		<dc:creator><![CDATA[Ravikanth C]]></dc:creator>
				<category><![CDATA[Articles]]></category>
		<category><![CDATA[Online Only]]></category>
		<category><![CDATA[PowerShell]]></category>
		<category><![CDATA[redfish]]></category>

		<guid isPermaLink="false">http://www.powershellmagazine.com/?p=13096</guid>
				<description><![CDATA[The Redfish specification supports event mechanism through which the target Redfish devices can send events from different components in the... ]]></description>
								<content:encoded><![CDATA[<p>The <a href="https://www.dmtf.org/standards/redfish">Redfish specification</a> supports an event mechanism through which target Redfish devices can send events from different components in the system to an event listener. The <a href="https://github.com/rchaganti/PSRedfishEventListener">PSRedfishEventListener</a> project provides an event listener written in native PowerShell.</p>
<p>This module contains the following commands.</p>
<table class="wp-block-table" border="2">
<thead>
<tr><th>Command Name</th><th>Description</th></tr>
</thead>
<tbody>
<tr><td>Add-PSRedfishEventListenerHttpsBinding</td><td>Adds an SSL certificate binding for the HTTPS event listener for Redfish events.</td></tr>
<tr><td>Get-PSRedfishEventSubscription</td><td>Gets all event subscriptions from a Redfish device.</td></tr>
<tr><td>Remove-PSRedfishEventListenerHttpsBinding</td><td>Removes all SSL certificate bindings from the local system.</td></tr>
<tr><td>Register-PSRedfishEventSubscription</td><td>Registers a Redfish device with an event listener. A given Redfish device can send events to multiple event listeners.</td></tr>
<tr><td>Start-PSRedfishEventListener</td><td>Starts the Redfish event listener (HTTPS) on the local host. The HTTPS listener requires an SSL certificate binding, which must be created as a prerequisite using the <code>Add-PSRedfishEventListenerHttpsBinding</code> command.</td></tr>
<tr><td>Stop-PSRedfishEventListener</td><td>Stops the event listener that was started using the <code>Start-PSRedfishEventListener</code> command, either locally or on a remote system.</td></tr>
<tr><td>Send-PSRedfishTestEvent</td><td>Submits a test event, as required by the Redfish eventing specification. This is generally used to validate that the Redfish device can indeed send events to the configured destination and that the listener can process the received event messages as desired.</td></tr>
<tr><td>Unregister-PSRedfishEventSubscription</td><td>Removes an event subscription from the Redfish device.</td></tr>
</tbody>
</table>
<p><span class="md-expand">Full documentation of these commands with examples is available at: </span><span class="md-link md-expand" spellcheck="false"><a href="https://psredfishlistener.readthedocs.io/en/latest/">https://psredfishlistener.readthedocs.io/en/latest/</a></span></p>
<h3>Example Workflow</h3>
<ol>
<li>Start an event listener using the Start-PSRedfishEventListener command.</li>
</ol>
<p></p><pre class="crayon-plain-tag">Start-PSRedfishEventListener -IPAddress 172.16.102.76 -Port 9090 -LogPath C:\RedfishEvents</pre><p>The above command starts a new event listener bound to a specific IP address on the local host and to port 9090. The log path is set to C:\RedfishEvents.</p>
<ol start="2">
<li>Perform registration on a Redfish endpoint to send alerts to the listener. This is done using the Register-PSRedfishEventSubscription command.</li>
</ol>
<p></p><pre class="crayon-plain-tag">$credential = Get-Credential -Message 'Credentials to authenticate to the Redfish device ...'
Register-PSRedfishEventSubscription -EventDestination https://172.16.102.76 -IPAddress 172.16.100.21 -Credential $credential</pre><p>The above command will register (create an event subscription on) the Redfish device 172.16.100.21 to send all event types to the listener at https://172.16.102.76.</p>
<ol start="3">
<li>Test the event subscription using the Send-PSRedfishTestEvent command.</li>
</ol>
<p></p><pre class="crayon-plain-tag">$credential = Get-Credential -Message 'Credentials to authenticate to the Redfish device ...'

Send-PSRedfishTestEvent -IPAddress 172.16.100.21 -Credential $credential -EventDestination https://172.16.102.76 -Verbose</pre><p>The above command will submit a test event from the Redfish device with IP address 172.16.100.21 to the event listener at 172.16.102.76. The event type and message ID will be set to the defaults defined by the function.</p>
<ol start="4"><li>Stop the event listener using the Stop-PSRedfishEventListener command.</li></ol><pre class="crayon-plain-tag">Stop-PSRedfishEventListener -IPAddress '172.16.102.76' -Verbose</pre><p>I have a few new features lined up for the next release. Similar to PowerShell object events, the upcoming release will let you attach an action to a specific event type, from all Redfish device sources or from a specific source.</p>
]]></content:encoded>
							<wfw:commentRss>https://www.powershellmagazine.com/2018/11/13/redfish-event-listener-in-powershell/feed/</wfw:commentRss>
		<slash:comments>0</slash:comments>
							</item>
		<item>
		<title>Converting a PowerShell Project to use Azure DevOps Pipelines</title>
		<link>https://www.powershellmagazine.com/2018/09/20/converting-a-powershell-project-to-use-azure-devops-pipelines/</link>
				<comments>https://www.powershellmagazine.com/2018/09/20/converting-a-powershell-project-to-use-azure-devops-pipelines/#respond</comments>
				<pubDate>Thu, 20 Sep 2018 18:01:19 +0000</pubDate>
		<dc:creator><![CDATA[Daniel Scott-Raynsford]]></dc:creator>
				<category><![CDATA[Articles]]></category>
		<category><![CDATA[Azure]]></category>
		<category><![CDATA[Azure DevOps]]></category>
		<category><![CDATA[DevOps]]></category>

		<guid isPermaLink="false">http://www.powershellmagazine.com/?p=13064</guid>
				<description><![CDATA[Introduction Continuous Integration (CI) is the process of integrating code into a source code repository several times a day. Each... ]]></description>
								<content:encoded><![CDATA[<h2>Introduction</h2>
<p><em>Continuous Integration (CI)</em> is the process of integrating code into a <em>source code repository</em> several times a day. Each time code is pushed into the repository, an automated process runs to build and verify that the code continues to work. This is often called the “CI process” and needs to run on an <em>agent</em>, usually a Windows or Linux machine dedicated to the task. Developers and practitioners of “DevOps” have been using this practice for several years, but it is now becoming common, and even critical, among IT professionals as well.</p>
<p>If you’re putting your code in a <em>source code repository</em> within a corporate or private environment, you may have some private <em>CI tools</em> set up to run your “CI process”, for example Team Foundation Server or Jenkins.</p>
<p>But what if you’re running an <em>open source public project</em> on GitHub? Until now, you had to use one of the free (for open source public projects) CI systems available, such as <a href="https://www.appveyor.com/">AppVeyor</a>, which provides <em>Linux</em> and <em>Windows</em> agents, or <a href="https://docs.travis-ci.com/">TravisCI</a>, which provides <em>Linux</em> and <em>macOS</em> agents. For <strong>PowerShell</strong> projects you pretty much had only one option: <em>AppVeyor</em>. But with the ability for PowerShell to run across <em>multiple platforms</em> (Windows, Linux and macOS) with <a href="https://github.com/PowerShell/PowerShell">PowerShell Core</a>, multi-platform CI has become more important. This meant you needed to add <em>multiple CI systems</em> to your open source project to ensure your PowerShell Core module or code works correctly on Windows, Linux and macOS.</p>
<p>With the introduction of <a href="https://azure.microsoft.com/en-us/blog/announcing-azure-pipelines-with-unlimited-ci-cd-minutes-for-open-source/">Azure DevOps Pipelines</a> you can now use the same <em>CI process</em> across Windows, Linux and macOS using the same system. For <em>open source projects</em> this is free, and you can have 10 <em>CI processes</em> running concurrently. You can keep your existing TravisCI and AppVeyor processes if you’ve got them configured already – there is no restriction on how many different CI processes you can have running on a GitHub open source project.</p>
<p>In this article I’ll share my experiences moving one of my open source multi-platform <a href="https://github.com/PlagueHO/CosmosDB/">PowerShell Core module projects</a> over to use Azure DevOps Pipelines. This project already had a well-defined <em>CI process</em> set up using PowerShell scripts and <a href="https://github.com/pester/Pester">Pester</a>, <a href="https://github.com/PowerShell/PSScriptAnalyzer">PSScriptAnalyzer</a>, <a href="https://github.com/psake/psake">PSake</a>, <a href="https://github.com/RamblingCookieMonster/PSDepend">PSDepend</a>, <a href="https://github.com/RamblingCookieMonster/BuildHelpers">BuildHelpers</a>, and <a href="https://github.com/RamblingCookieMonster/PSDeploy">PSDeploy</a> modules. So, this article won’t be showing how to write PowerShell to create a <em>CI process,</em> as that is a book in itself.</p>
<p><em>Disclaimer: <strong>Azure DevOps</strong> is a rebranding (and more) of an existing cloud-based development tool called <strong>Visual Studio Team Services</strong>. I’ve been using Visual Studio Team Services build and release pipelines for a couple of years in a private corporate environment, so many of the techniques I implemented such as YAML build definitions weren’t new. However, the experience of getting my GitHub account set up to use Azure DevOps Pipelines was new to me.</em></p>
<h2>Getting Started</h2>
<p>To get started using <strong>Azure DevOps Pipelines</strong> with your open source project you’ve got to hook up your GitHub account to an Azure DevOps organization. The easiest way to do this is to find the <a href="https://github.com/marketplace/azure-pipelines">Azure Pipelines</a> service page in the <a href="https://github.com/marketplace">GitHub marketplace</a>:</p>
<p><img class="alignnone wp-image-13065" src="http://www.powershellmagazine.com/wp-content/uploads/2018/09/Azure_DevOps_Pipelines_01.png" alt="" width="600" height="293" srcset="https://www.powershellmagazine.com/wp-content/uploads/2018/09/Azure_DevOps_Pipelines_01.png 1324w, https://www.powershellmagazine.com/wp-content/uploads/2018/09/Azure_DevOps_Pipelines_01-300x146.png 300w, https://www.powershellmagazine.com/wp-content/uploads/2018/09/Azure_DevOps_Pipelines_01-768x375.png 768w, https://www.powershellmagazine.com/wp-content/uploads/2018/09/Azure_DevOps_Pipelines_01-1024x500.png 1024w" sizes="(max-width: 600px) 100vw, 600px" /></p>
<p>Once you’re there, just click <strong>Set up a plan</strong> and then click <strong>Install it for Free</strong>. You’ll then be able to review the details of your order.</p>
<p><em>Note: If your GitHub account is an <strong>owner</strong> of a public or private organizational account, then you may also choose the <strong>billing account</strong> under <strong>Billing information</strong>.</em></p>
<p>Once you’re ready, click <strong>Complete order and begin installation</strong>:</p>
<p><img class="alignnone wp-image-13066" src="http://www.powershellmagazine.com/wp-content/uploads/2018/09/Azure_DevOps_Pipelines_02.png" alt="" width="600" height="293" srcset="https://www.powershellmagazine.com/wp-content/uploads/2018/09/Azure_DevOps_Pipelines_02.png 1324w, https://www.powershellmagazine.com/wp-content/uploads/2018/09/Azure_DevOps_Pipelines_02-300x147.png 300w, https://www.powershellmagazine.com/wp-content/uploads/2018/09/Azure_DevOps_Pipelines_02-768x375.png 768w, https://www.powershellmagazine.com/wp-content/uploads/2018/09/Azure_DevOps_Pipelines_02-1024x500.png 1024w" sizes="(max-width: 600px) 100vw, 600px" /></p>
<p>GitHub then asks which repositories to <strong>Install Azure Pipelines</strong> into. This will grant <strong>Azure Pipelines</strong> permissions to perform certain tasks on any repositories you select. I chose to enable Azure Pipelines on just a single repository to start with, but you could select <strong>All repositories</strong>.</p>
<p><img class="alignnone size-full wp-image-13067" src="http://www.powershellmagazine.com/wp-content/uploads/2018/09/Azure_DevOps_Pipelines_03.png" alt="" width="541" height="831" srcset="https://www.powershellmagazine.com/wp-content/uploads/2018/09/Azure_DevOps_Pipelines_03.png 541w, https://www.powershellmagazine.com/wp-content/uploads/2018/09/Azure_DevOps_Pipelines_03-195x300.png 195w" sizes="(max-width: 541px) 100vw, 541px" /></p>
<p><em>It is easy to enable <strong>Azure Pipelines </strong>in additional GitHub repositories later by heading over to <strong>Applications</strong> in your </em><a href="https://github.com/settings/profile"><em>GitHub Settings</em></a><em> and clicking <strong>Configure</strong> next to <strong>Azure Pipelines</strong> under the installed GitHub Apps:</em></p>
<p><img class="alignnone wp-image-13068" src="http://www.powershellmagazine.com/wp-content/uploads/2018/09/Azure_DevOps_Pipelines_04.png" alt="" width="600" height="329" srcset="https://www.powershellmagazine.com/wp-content/uploads/2018/09/Azure_DevOps_Pipelines_04.png 1385w, https://www.powershellmagazine.com/wp-content/uploads/2018/09/Azure_DevOps_Pipelines_04-300x165.png 300w, https://www.powershellmagazine.com/wp-content/uploads/2018/09/Azure_DevOps_Pipelines_04-768x421.png 768w, https://www.powershellmagazine.com/wp-content/uploads/2018/09/Azure_DevOps_Pipelines_04-1024x562.png 1024w" sizes="(max-width: 600px) 100vw, 600px" /></p>
<p>Clicking <strong>Install </strong>takes you over to <strong>Azure DevOps</strong> where you will be required to login with a <a href="https://account.microsoft.com/account">Microsoft Account</a>. If you’ve already got an <em>Azure DevOps organization</em> (or a previous VSTS organization) you’ll be asked to select the organization to add the <strong>Azure DevOps Pipeline </strong>to. But if your <strong>Microsoft Account</strong> isn’t a member of an <em>Azure DevOps organization</em> then one will be created <em>automatically</em> for you. You can change the name of the <em>Azure DevOps organization</em> later if the default name doesn’t suit you.</p>
<p><img class="alignnone wp-image-13069" src="http://www.powershellmagazine.com/wp-content/uploads/2018/09/Azure_DevOps_Pipelines_05.png" alt="" width="600" height="328" srcset="https://www.powershellmagazine.com/wp-content/uploads/2018/09/Azure_DevOps_Pipelines_05.png 1153w, https://www.powershellmagazine.com/wp-content/uploads/2018/09/Azure_DevOps_Pipelines_05-300x164.png 300w, https://www.powershellmagazine.com/wp-content/uploads/2018/09/Azure_DevOps_Pipelines_05-768x420.png 768w, https://www.powershellmagazine.com/wp-content/uploads/2018/09/Azure_DevOps_Pipelines_05-1024x560.png 1024w" sizes="(max-width: 600px) 100vw, 600px" /></p>
<p>If a new<em> Azure DevOps organization </em>was created for you then a new project will be created in the <em>Azure DevOps organization </em>with the same name. This isn’t too obvious at first.</p>
<p>But if you chose (or had an option to choose) to use an existing <em>Azure DevOps organization </em>(e.g. if you had a previous VSTS organization attached to your Microsoft account) then you’ll be asked to select an existing <strong>project</strong> or create a new one. The flow is slightly different, but still very straightforward.</p>
<p>Whether or not you had a project and organization created for you or used existing ones you’ll be taken straight to the <strong>New Pipeline</strong> screen:</p>
<p><img class="alignnone wp-image-13070" src="http://www.powershellmagazine.com/wp-content/uploads/2018/09/Azure_DevOps_Pipelines_06.png" alt="" width="600" height="170" srcset="https://www.powershellmagazine.com/wp-content/uploads/2018/09/Azure_DevOps_Pipelines_06.png 1401w, https://www.powershellmagazine.com/wp-content/uploads/2018/09/Azure_DevOps_Pipelines_06-300x85.png 300w, https://www.powershellmagazine.com/wp-content/uploads/2018/09/Azure_DevOps_Pipelines_06-768x218.png 768w, https://www.powershellmagazine.com/wp-content/uploads/2018/09/Azure_DevOps_Pipelines_06-1024x290.png 1024w" sizes="(max-width: 600px) 100vw, 600px" /></p>
<p>This is a new experience that was not present in the old VSTS. The list of repositories that have been granted access to <strong>Azure DevOps Pipelines</strong> will be listed. Clicking the repository to create a pipeline for will display the <strong>Template</strong> step.</p>
<p>The <strong>Template</strong> step is where you can select from a list of CI build templates for common project types. Unfortunately, a template for PowerShell modules or projects is not provided, but this is fine: nothing fancy is needed when using the <a href="https://github.com/psake/psake">PowerShell PSake module</a> to keep all the build code in a <a href="https://github.com/PlagueHO/CosmosDB/blob/dev/psakefile.ps1">psakefile.ps1</a> in the GitHub repository (which I had done).</p>
<p>I just selected <strong>Starter pipeline</strong>:</p>
<p><img class="alignnone wp-image-13071" src="http://www.powershellmagazine.com/wp-content/uploads/2018/09/Azure_DevOps_Pipelines_07.png" alt="" width="600" height="329" srcset="https://www.powershellmagazine.com/wp-content/uploads/2018/09/Azure_DevOps_Pipelines_07.png 1385w, https://www.powershellmagazine.com/wp-content/uploads/2018/09/Azure_DevOps_Pipelines_07-300x164.png 300w, https://www.powershellmagazine.com/wp-content/uploads/2018/09/Azure_DevOps_Pipelines_07-768x421.png 768w, https://www.powershellmagazine.com/wp-content/uploads/2018/09/Azure_DevOps_Pipelines_07-1024x561.png 1024w" sizes="(max-width: 600px) 100vw, 600px" /></p>
<p>A <a href="https://docs.microsoft.com/en-us/azure/devops/pipelines/yaml-schema?view=vsts">YAML file</a> is displayed that describes the steps that will be performed each time the CI process runs. This file, named <strong>azure-pipelines.yml</strong>, will be committed into the root folder of the GitHub repository for us.</p>
<p><img class="alignnone wp-image-13072" src="http://www.powershellmagazine.com/wp-content/uploads/2018/09/Azure_DevOps_Pipelines_08.png" alt="" width="600" height="325" srcset="https://www.powershellmagazine.com/wp-content/uploads/2018/09/Azure_DevOps_Pipelines_08.png 1401w, https://www.powershellmagazine.com/wp-content/uploads/2018/09/Azure_DevOps_Pipelines_08-300x163.png 300w, https://www.powershellmagazine.com/wp-content/uploads/2018/09/Azure_DevOps_Pipelines_08-768x416.png 768w, https://www.powershellmagazine.com/wp-content/uploads/2018/09/Azure_DevOps_Pipelines_08-1024x555.png 1024w" sizes="(max-width: 600px) 100vw, 600px" /></p>
<p>What we would typically do is customize the YAML file to define jobs, steps, tasks and other configuration items that are used to drive our <em>CI process</em>. We can customize this file directly in this editor or do it later by changing it in the repository.</p>
<p>For the <em>CI Process</em> to build and test my <a href="https://github.com/PlagueHO/CosmosDB/blob/dev/azure-pipelines.yml">Cosmos DB PowerShell module</a> I used the following:</p>
<pre class="brush: plain; title: ; notranslate">

jobs:
  - job: Build_PS_Win2016
    pool:
      vmImage: vs2017-win2016
    steps:
    - powershell: |
        .\build.ps1 -Verbose
      displayName: 'Build and Test'
    - task: PublishTestResults@2
      inputs:
        testRunner: 'NUnit'
        testResultsFiles: '**/TestResults.xml'
        testRunTitle: 'PS_Win2016'
      displayName: 'Publish Test Results'

  - job: Build_PSCore_Ubuntu1604

    pool:
      vmImage: ubuntu-16.04

    steps:
    - script: |
        curl https://packages.microsoft.com/keys/microsoft.asc | sudo apt-key add -
        curl https://packages.microsoft.com/config/ubuntu/16.04/prod.list | sudo tee /etc/apt/sources.list.d/microsoft.list
        sudo apt-get update
        sudo apt-get install -y powershell
      displayName: 'Install PowerShell Core'

    - script: |
        pwsh -c '.\build.ps1'
      displayName: 'Build and Test'

    - task: PublishTestResults@2
      inputs:
        testRunner: 'NUnit'
        testResultsFiles: '**/TestResults.xml'
        testRunTitle: 'PSCore_Ubuntu1604'
      displayName: 'Publish Test Results'

  - job: Build_PSCore_MacOS1013
    pool:
      vmImage: xcode9-macos10.13
    steps:
    - script: |
        brew update
        brew tap caskroom/cask
        brew cask install powershell
      displayName: 'Install PowerShell Core'

    - script: |
        pwsh -c '.\build.ps1'
      displayName: 'Build and Test'

    - task: PublishTestResults@2
      inputs:
        testRunner: 'NUnit'
        testResultsFiles: '**/TestResults.xml'
        testRunTitle: 'PSCore_MacOS1013'
      displayName: 'Publish Test Results'

</pre>
<p>I’ll cover the content of this file further down. But for now, click <strong>Save and Run</strong>.</p>
<p>This gives you the option of <strong>committing directly to dev branch</strong> (or whatever the default branch of your repository is set to) or <strong>creating a new branch for this commit and start a pull request</strong>:</p>
<p><img class="alignnone wp-image-13073" src="http://www.powershellmagazine.com/wp-content/uploads/2018/09/Azure_DevOps_Pipelines_09.png" alt="" width="600" height="325" srcset="https://www.powershellmagazine.com/wp-content/uploads/2018/09/Azure_DevOps_Pipelines_09.png 1401w, https://www.powershellmagazine.com/wp-content/uploads/2018/09/Azure_DevOps_Pipelines_09-300x163.png 300w, https://www.powershellmagazine.com/wp-content/uploads/2018/09/Azure_DevOps_Pipelines_09-768x417.png 768w, https://www.powershellmagazine.com/wp-content/uploads/2018/09/Azure_DevOps_Pipelines_09-1024x555.png 1024w" sizes="(max-width: 600px) 100vw, 600px" /></p>
<p>What you choose to do will depend on if the GitHub repository allows contributors or owners to commit directly to the default branch or if changes must be made by way of a <em>Pull Request</em>. I used a <em>Pull Request </em>as this allowed me to get the CI process working before making changes to my default branch.</p>
<p>Clicking <strong>Save and run</strong> again will commit the file and create the new <strong>Azure</strong> <strong>Pipeline</strong>. It will also <em>manually</em> trigger a build using the <strong>Azure Pipeline</strong> and the YAML file that was created.</p>
<p><em>Note: Although this happens automatically, this is still considered a manual trigger, because this wasn’t triggered by a commit to the GitHub repository.</em></p>
<p><img class="alignnone wp-image-13074" src="http://www.powershellmagazine.com/wp-content/uploads/2018/09/Azure_DevOps_Pipelines_10.png" alt="" width="600" height="329" srcset="https://www.powershellmagazine.com/wp-content/uploads/2018/09/Azure_DevOps_Pipelines_10.png 1385w, https://www.powershellmagazine.com/wp-content/uploads/2018/09/Azure_DevOps_Pipelines_10-300x165.png 300w, https://www.powershellmagazine.com/wp-content/uploads/2018/09/Azure_DevOps_Pipelines_10-768x421.png 768w, https://www.powershellmagazine.com/wp-content/uploads/2018/09/Azure_DevOps_Pipelines_10-1024x562.png 1024w" sizes="(max-width: 600px) 100vw, 600px" /></p>
<p>If your CI process was successful with no errors (no PowerShell errors or test failures) then you can merge your Pull Request in GitHub:</p>
<p><img class="alignnone wp-image-13075" src="http://www.powershellmagazine.com/wp-content/uploads/2018/09/Azure_DevOps_Pipelines_11.png" alt="" width="600" height="272" srcset="https://www.powershellmagazine.com/wp-content/uploads/2018/09/Azure_DevOps_Pipelines_11.png 1385w, https://www.powershellmagazine.com/wp-content/uploads/2018/09/Azure_DevOps_Pipelines_11-300x136.png 300w, https://www.powershellmagazine.com/wp-content/uploads/2018/09/Azure_DevOps_Pipelines_11-768x348.png 768w, https://www.powershellmagazine.com/wp-content/uploads/2018/09/Azure_DevOps_Pipelines_11-1024x464.png 1024w" sizes="(max-width: 600px) 100vw, 600px" /></p>
<p>The setup of the CI process in Azure Pipelines is now completed. It was pretty easy, and I managed to get the basics working in around 30 minutes.</p>
<h2>Triggers</h2>
<p>By default, when you create an <strong>Azure DevOps Pipeline </strong>it is configured to be triggered when any <strong>commits</strong> are made to the default branch (usually <strong>dev</strong>) and for any <strong>Pull Requests</strong> made to the default branch. In my case I prefer my <em>CI process</em> to be triggered on changes to <em>any</em> branch. So, I needed to edit the <strong>Triggers</strong> section in the <strong>Build</strong> definition by editing it in the <strong>Azure DevOps</strong> interface:</p>
<p><img class="alignnone wp-image-13076" src="http://www.powershellmagazine.com/wp-content/uploads/2018/09/Azure_DevOps_Pipelines_12.png" alt="" width="600" height="310" srcset="https://www.powershellmagazine.com/wp-content/uploads/2018/09/Azure_DevOps_Pipelines_12.png 1335w, https://www.powershellmagazine.com/wp-content/uploads/2018/09/Azure_DevOps_Pipelines_12-300x155.png 300w, https://www.powershellmagazine.com/wp-content/uploads/2018/09/Azure_DevOps_Pipelines_12-768x396.png 768w, https://www.powershellmagazine.com/wp-content/uploads/2018/09/Azure_DevOps_Pipelines_12-1024x528.png 1024w" sizes="(max-width: 600px) 100vw, 600px" /></p>
<p>Once in the <strong>Build</strong> definition on the <strong>Triggers</strong> tab I changed the <strong>Continuous Integration</strong> trigger to contain a <strong>Branch</strong> <strong>Filter</strong> to include <strong>*</strong>:</p>
<p><img class="alignnone wp-image-13077" src="http://www.powershellmagazine.com/wp-content/uploads/2018/09/Azure_DevOps_Pipelines_13.png" alt="" width="600" height="310" srcset="https://www.powershellmagazine.com/wp-content/uploads/2018/09/Azure_DevOps_Pipelines_13.png 1335w, https://www.powershellmagazine.com/wp-content/uploads/2018/09/Azure_DevOps_Pipelines_13-300x155.png 300w, https://www.powershellmagazine.com/wp-content/uploads/2018/09/Azure_DevOps_Pipelines_13-768x396.png 768w, https://www.powershellmagazine.com/wp-content/uploads/2018/09/Azure_DevOps_Pipelines_13-1024x528.png 1024w" sizes="(max-width: 600px) 100vw, 600px" /></p>
<p>Then <strong>Save &amp; Queue</strong> (or just <strong>Save</strong>) the updated <strong>Build</strong> definition.</p>
<p>Note: This can also be done by adding a <strong>Triggers</strong> entry to the YAML definition in the source code repository (see <a href="https://docs.microsoft.com/en-us/azure/devops/pipelines/build/triggers?view=vsts&amp;tabs=yaml">this page</a> for more information).</p>
<h2>Build Badges</h2>
<p>A useful feature of most CI tools is that they allow you to easily display the status of your CI process pipelines in a README.MD or other documentation in your repository:</p>
<p><img class="alignnone wp-image-13078" src="http://www.powershellmagazine.com/wp-content/uploads/2018/09/Azure_DevOps_Pipelines_14.png" alt="" width="600" height="244" srcset="https://www.powershellmagazine.com/wp-content/uploads/2018/09/Azure_DevOps_Pipelines_14.png 648w, https://www.powershellmagazine.com/wp-content/uploads/2018/09/Azure_DevOps_Pipelines_14-300x122.png 300w" sizes="(max-width: 600px) 100vw, 600px" /></p>
<p><strong>Azure DevOps Pipelines</strong> is no exception. To get the example markdown to display the badge:</p>
<p><img class="alignnone wp-image-13079" src="http://www.powershellmagazine.com/wp-content/uploads/2018/09/Azure_DevOps_Pipelines_15.png" alt="" width="600" height="310" srcset="https://www.powershellmagazine.com/wp-content/uploads/2018/09/Azure_DevOps_Pipelines_15.png 1337w, https://www.powershellmagazine.com/wp-content/uploads/2018/09/Azure_DevOps_Pipelines_15-300x155.png 300w, https://www.powershellmagazine.com/wp-content/uploads/2018/09/Azure_DevOps_Pipelines_15-768x396.png 768w, https://www.powershellmagazine.com/wp-content/uploads/2018/09/Azure_DevOps_Pipelines_15-1024x528.png 1024w" sizes="(max-width: 600px) 100vw, 600px" /></p>
<p><img class="alignnone wp-image-13080" src="http://www.powershellmagazine.com/wp-content/uploads/2018/09/Azure_DevOps_Pipelines_16.png" alt="" width="600" height="284" srcset="https://www.powershellmagazine.com/wp-content/uploads/2018/09/Azure_DevOps_Pipelines_16.png 606w, https://www.powershellmagazine.com/wp-content/uploads/2018/09/Azure_DevOps_Pipelines_16-300x142.png 300w" sizes="(max-width: 600px) 100vw, 600px" /></p>
<p>I copied the <strong>Sample Markdown</strong> and pasted it into the README.MD in my project. This also makes it simple for anyone to click the <em>badge</em> and jump over to the project.</p>
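<p>For reference, the sample markdown follows this general shape (the organization, project, pipeline name and definition ID below are placeholders; copy the exact markdown from the dialog shown above rather than constructing it by hand):</p>
<pre class="brush: plain; title: ; notranslate">
[![Build Status](https://dev.azure.com/&lt;organization&gt;/&lt;project&gt;/_apis/build/status/&lt;pipeline-name&gt;?branchName=dev)](https://dev.azure.com/&lt;organization&gt;/&lt;project&gt;/_build/latest?definitionId=&lt;id&gt;&amp;branchName=dev)
</pre>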
<h2>Making it Public</h2>
<p>If you allowed the GitHub marketplace item for Azure DevOps Pipelines to create your Azure DevOps organization and project, then it will have created the organization as allowing <strong>Public</strong> projects and made this project <strong>Public</strong>:</p>
<p><img class="alignnone wp-image-13081" src="http://www.powershellmagazine.com/wp-content/uploads/2018/09/Azure_DevOps_Pipelines_17.png" alt="" width="600" height="282" srcset="https://www.powershellmagazine.com/wp-content/uploads/2018/09/Azure_DevOps_Pipelines_17.png 1320w, https://www.powershellmagazine.com/wp-content/uploads/2018/09/Azure_DevOps_Pipelines_17-300x141.png 300w, https://www.powershellmagazine.com/wp-content/uploads/2018/09/Azure_DevOps_Pipelines_17-768x361.png 768w, https://www.powershellmagazine.com/wp-content/uploads/2018/09/Azure_DevOps_Pipelines_17-1024x481.png 1024w" sizes="(max-width: 600px) 100vw, 600px" /></p>
<p>This does not mean anyone can edit the project or create/edit builds, but it does mean that anything that gets logged by the CI process during the build will be visible to everyone.</p>
<p><em>Note: Azure DevOps Pipelines will try to protect sensitive build variables by masking them, but we’ll cover that shortly.</em></p>
<p>However, when I set up my first project I was using an existing organization, which was configured to prevent <strong>Public</strong> projects. This meant that my project was configured as <strong>Private</strong> visibility only. This wasn’t ideal because all contributors and end consumers of an open source project need to be able to view the <em>CI process</em> logs and test results. I had to go into my <strong>Organization settings</strong> and set the <strong>Allow public projects</strong> <em>Security Policy </em>to <strong>On</strong>:</p>
<p><img class="alignnone wp-image-13082" src="http://www.powershellmagazine.com/wp-content/uploads/2018/09/Azure_DevOps_Pipelines_18.png" alt="" width="600" height="273" srcset="https://www.powershellmagazine.com/wp-content/uploads/2018/09/Azure_DevOps_Pipelines_18.png 1512w, https://www.powershellmagazine.com/wp-content/uploads/2018/09/Azure_DevOps_Pipelines_18-300x137.png 300w, https://www.powershellmagazine.com/wp-content/uploads/2018/09/Azure_DevOps_Pipelines_18-768x349.png 768w, https://www.powershellmagazine.com/wp-content/uploads/2018/09/Azure_DevOps_Pipelines_18-1024x466.png 1024w" sizes="(max-width: 600px) 100vw, 600px" /></p>
<h2>My azure-pipelines.yml File</h2>
<p>The most challenging part of setting up an Azure DevOps Pipeline is configuring the <strong>azure-pipelines.yml</strong> file, but it becomes quite clear how this works without too much research. All other continuous integration tools I’ve used require, or at least support, a YAML or some other declarative syntax file provided within the repository to control the CI process, so this isn’t unusual. This is often referred to as <strong>Pipeline as Code</strong>.</p>
<p><em>Note: With <strong>Azure DevOps Pipelines</strong>, you can also use a </em><a href="https://docs.microsoft.com/en-us/azure/devops/pipelines/get-started-designer?view=vsts&amp;tabs=new-nav"><em>visual designer</em></a><em> to define a build pipeline and then export the YAML file or just keep the visually designed pipeline and use that. I’d recommend exporting the YAML file and putting it in your repository because then the CI process definition itself is under source control.</em></p>
<h3>Jobs</h3>
<p>In my case I created a file that defined a simple <em>CI process</em> that contained <strong>three jobs</strong>:</p>
<ol>
<li>Build and test the module on an agent running Windows Server 2016 using PowerShell.</li>
</ol>
<pre class="brush: plain; title: ; notranslate">
  - job: Build_PS_Win2016
    pool:
      vmImage: vs2017-win2016
</pre>
<ol start="2"><li>Build and test the module on an agent running Ubuntu 16.04 using PowerShell Core.</li></ol>
<pre class="brush: plain; title: ; notranslate">
  - job: Build_PSCore_Ubuntu1604
    pool:
      vmImage: ubuntu-16.04
</pre>
<ol start="3"><li>Build and test the module on an agent running macOS 10.13 using PowerShell Core.</li></ol>
<pre class="brush: plain; title: ; notranslate">
  - job: Build_PSCore_MacOS1013
    pool:
      vmImage: xcode9-macos10.13
</pre>
<p>The <strong>jobs</strong> run in series each time the build triggers; each job runs on an agent in an <strong>Agent Pool</strong> using the <strong>vmImage</strong> specified. The available Agent Pools and the exact software installed onto them can be found in the <strong>Project Settings</strong>:</p>
<p><img class="alignnone wp-image-13083" src="http://www.powershellmagazine.com/wp-content/uploads/2018/09/Azure_DevOps_Pipelines_19.png" alt="" width="600" height="348" srcset="https://www.powershellmagazine.com/wp-content/uploads/2018/09/Azure_DevOps_Pipelines_19.png 1596w, https://www.powershellmagazine.com/wp-content/uploads/2018/09/Azure_DevOps_Pipelines_19-300x174.png 300w, https://www.powershellmagazine.com/wp-content/uploads/2018/09/Azure_DevOps_Pipelines_19-768x445.png 768w, https://www.powershellmagazine.com/wp-content/uploads/2018/09/Azure_DevOps_Pipelines_19-1024x593.png 1024w" sizes="(max-width: 600px) 100vw, 600px" /></p>
<h3>Steps &amp; Tasks</h3>
<p>Each <strong>job </strong>contained a single <strong>step</strong> with two or three <strong>tasks</strong>:</p>
<p>For the <strong>Build_PS_Win2016 </strong>job, I just had two tasks:</p>
<ol>
<li>Run the PowerShell script <strong>build.ps1</strong> in the repository (with <strong>-Verbose</strong> output enabled). The <strong>build.ps1</strong> script just runs the tests (using <a href="https://github.com/Pester/Pester">Pester</a>), builds the PowerShell help (using <a href="https://github.com/PowerShell/platyPS">PlatyPS</a>) and prepares the module for publishing. It does this by using <a href="https://github.com/Psake/PSake">PSake</a> to run tasks defined in the <strong>psakefile.ps1</strong>:</li>
</ol>
<pre class="brush: plain; title: ; notranslate">
    - powershell: |
        .\build.ps1 -Verbose
      displayName: 'Build and Test'
</pre>
<ol>
<li>Publish the results of the Pester tests back to the build so that they are available in the report:</li>
</ol>
<pre class="brush: plain; title: ; notranslate">
    - task: PublishTestResults@2
      inputs:
        testRunner: 'NUnit'
        testResultsFiles: '**/TestResults.xml'
        testRunTitle: 'PS_Win2016'
      displayName: 'Publish Test Results'
</pre>
<p>For the <strong>Build_PSCore_Ubuntu1604 </strong>job, I needed to perform similar tasks to the Windows one, but I also needed to add an additional <strong>task</strong>:</p>
<ol>
<li>Install <strong>PowerShell Core</strong>:</li>
</ol>
<pre class="brush: plain; title: ; notranslate">
    - script: |
        curl https://packages.microsoft.com/keys/microsoft.asc | sudo apt-key add -
        curl https://packages.microsoft.com/config/ubuntu/16.04/prod.list | sudo tee /etc/apt/sources.list.d/microsoft.list
        sudo apt-get update
        sudo apt-get install -y powershell
</pre>
<ol>
<li>Run the PowerShell script <strong>build.ps1</strong> in the repository (with <strong>-Verbose</strong> output enabled). The <strong>build.ps1</strong> script just runs the tests (using <a href="https://github.com/Pester/Pester">Pester</a>), builds the PowerShell help (using <a href="https://github.com/PowerShell/platyPS">platyPS</a>), and prepares the module for publishing. It does this by using <a href="https://github.com/Psake/PSake">PSake</a> to run tasks defined in the <strong>psakefile.ps1</strong>:</li>
</ol>
<pre class="brush: plain; title: ; notranslate">
    - script: |
        pwsh -c '.\build.ps1'
      displayName: 'Build and Test'
</pre>
<p><em>Note: This is slightly different to the Windows task: I can’t use the PowerShell task type, so I instead use the <strong>script</strong> task type, executing <strong>pwsh</strong> (PowerShell Core) and passing in the name of the script to run.</em></p>
<ol>
<li>Publish the results of the Pester tests back to the build so that they are available in the report:</li>
</ol>
<pre class="brush: plain; title: ; notranslate">
    - task: PublishTestResults@2
      inputs:
        testRunner: 'NUnit'
        testResultsFiles: '**/TestResults.xml'
        testRunTitle: 'PSCore_Ubuntu1604'
      displayName: 'Publish Test Results'
</pre>
<p><em>Note: the <strong>testRunTitle</strong> attribute allows the tests to be grouped by the different agents. If this is omitted, all the tests for each agent get bundled together, which makes it nearly impossible to tell which agent the tests failed on.</em></p>
<p>Finally, for the <strong>Build_PSCore_MacOS1013 </strong>job, I needed to perform similar tasks to the Windows one, but I also needed to add an additional <strong>task</strong>:</p>
<ol>
<li>Install <strong>PowerShell Core</strong>:</li>
</ol>
<pre class="brush: plain; title: ; notranslate">
    - script: |
        brew update
        brew tap caskroom/cask
        brew cask install powershell
      displayName: 'Install PowerShell Core'
</pre>
<ol>
<li>Run the PowerShell script <strong>build.ps1</strong> in the repository (with <strong>-Verbose</strong> output enabled). The <strong>build.ps1</strong> script just runs the tests (using <a href="https://github.com/Pester/Pester">Pester</a>), builds the PowerShell help (using <a href="https://github.com/PowerShell/platyPS">platyPS</a>), and prepares the module for publishing. It does this by using <a href="https://github.com/Psake/PSake">PSake</a> to run tasks defined in the <strong>psakefile.ps1</strong>:</li>
</ol>
<pre class="brush: plain; title: ; notranslate">
    - script: |
        pwsh -c '.\build.ps1'
      displayName: 'Build and Test'
</pre>
<p><em>Note: This is slightly different to the Windows task: I can’t use the PowerShell task type, so I instead use the <strong>script</strong> task type, executing <strong>pwsh</strong> (PowerShell Core) and passing in the name of the script to run.</em></p>
<ol>
<li>Publish the results of the Pester tests back to the build so that they are available in the report:</li>
</ol>
<pre class="brush: plain; title: ; notranslate">
    - task: PublishTestResults@2
      inputs:
        testRunner: 'NUnit'
        testResultsFiles: '**/TestResults.xml'
        testRunTitle: 'PSCore_MacOS1013'
      displayName: 'Publish Test Results'
</pre>
<p><em>Note: the <strong>testRunTitle</strong> attribute allows the tests to be grouped by the different agents. If this is omitted, all the tests for each agent get bundled together, which makes it nearly impossible to tell which agent the tests failed on.</em></p>
<p>As you can see, there is not too much functionality in the YAML file itself. The real work is done by the <strong>build.ps1</strong> and the <strong>psakefile.ps1</strong>. Both of these scripts work the same way no matter whether I’m using AppVeyor, Travis CI or Azure DevOps Pipelines, and I use pretty much the same code in all of them.</p>
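<p>To make this concrete, a minimal <strong>psakefile.ps1</strong> could look something like the following sketch. The task names and contents here are illustrative only, not the actual file from the repository:</p>
<pre class="brush: powershell; title: ; notranslate">
# Illustrative psakefile.ps1 sketch - task names are hypothetical
Task Default -Depends Build, Test

Task Build {
    # e.g. generate the external help with platyPS and stage the module files
    New-ExternalHelp -Path .\docs -OutputPath .\en-US -Force
}

Task Test {
    # run the Pester tests and write NUnit-format results for publishing
    Invoke-Pester -OutputFile TestResults.xml -OutputFormat NUnitXml
}
</pre>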
<h2>Pipeline Variables and Secrets</h2>
<p>The final thing I needed to do to complete my <strong>Azure DevOps Pipeline</strong>, was to add <em>environment variables</em> containing the Azure service principal details allowing my Pester tests to connect to my Azure account to create a real Cosmos DB to test against. These <em>environment variables</em> are sensitive (they grant access to my Azure account) and so must be treated with care and <strong>never committed into a source code repository</strong>.</p>
<p>The normal way of sharing sensitive information with the <strong>agent</strong> is to define <strong>pipeline variables</strong>:</p>
<p><img class="alignnone wp-image-13084" src="http://www.powershellmagazine.com/wp-content/uploads/2018/09/Azure_DevOps_Pipelines_20.png" alt="" width="600" height="202" srcset="https://www.powershellmagazine.com/wp-content/uploads/2018/09/Azure_DevOps_Pipelines_20.png 1593w, https://www.powershellmagazine.com/wp-content/uploads/2018/09/Azure_DevOps_Pipelines_20-300x101.png 300w, https://www.powershellmagazine.com/wp-content/uploads/2018/09/Azure_DevOps_Pipelines_20-768x258.png 768w, https://www.powershellmagazine.com/wp-content/uploads/2018/09/Azure_DevOps_Pipelines_20-1024x344.png 1024w" sizes="(max-width: 600px) 100vw, 600px" /></p>
<p>Any variable that is <strong>not</strong> a <strong>secret</strong> variable will be available to the <strong>agent</strong> as an environment variable. However, <strong>pipeline variables</strong> that are defined as secret will not be available as an <em>environment variable</em>. They are exposed in other ways, see <a href="https://docs.microsoft.com/en-us/azure/devops/pipelines/process/variables?view=vsts&amp;tabs=yaml%2Cbatch">this page</a> for more details.</p>
<p><em>Note: This is different behaviour than in AppVeyor or Travis CI which expose both secret and non-secret variables as environment variables. In my case I will need to adjust my <strong>build.ps1</strong> script to take account of this.</em></p>
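<p>For example, a secret <strong>pipeline variable</strong> can be passed into a script step through an explicit <strong>env</strong> mapping; the variable name <strong>azureApplicationPassword</strong> below is only an illustration:</p>
<pre class="brush: plain; title: ; notranslate">
    - powershell: |
        .\build.ps1 -Verbose
      displayName: 'Build and Test'
      env:
        azureApplicationPassword: $(azureApplicationPassword)
</pre>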
<p>Some other important notes about <strong>pipeline variables</strong>:</p>
<ul>
<li>Usually only the owner of the <strong>build definition</strong> can edit or see these build variables. Other <strong>public</strong> users cannot view or edit the <strong>build definition</strong>, so they cannot see the <strong>pipeline variables</strong>.</li>
<li><strong>Pipeline variables </strong>are not made available to builds <em>triggered</em> by a <strong>pull request </strong>because this would make them accessible to anyone who submitted the <strong>pull request</strong> to your repository. This behaviour can be changed, but it would compromise any variables declared in the <strong>build definition</strong>.</li>
<li><strong>Pipeline variables</strong> can be defined within the <strong>azure-pipelines.yml</strong> file as well, however, this would result in the values being committed into your source code repository which would compromise the values.</li>
</ul>
<h2>Next Steps</h2>
<p>There are still several tasks I have yet to complete before I’m completely satisfied that my <strong>Azure DevOps Pipelines</strong> CI (and continuous delivery) process is 100% finished:</p>
<ol>
<li>Change the integration test process to be able to access the <strong>pipeline variables </strong>that are declared as secret. This will enable the integration tests in my Cosmos DB module to use my personal Azure account to create a Cosmos DB account and run tests against it.</li>
<li>Output my module and related files (documentation etc.) as a <strong>build artefact</strong>. This makes them part of the build output and available for download if the build still exists. My AppVeyor <em>CI Process</em> currently does this, and I just need to add the additional tasks to the <strong>azure-pipelines.yml</strong>.</li>
<li>Move the process of publishing the module to the <a href="https://www.powershellgallery.com/">PowerShell Gallery</a> from my AppVeyor CI process into <strong>Azure DevOps Pipelines</strong> as a <strong>Release Pipeline</strong>. The <strong>Release Pipeline</strong> will be triggered from a <strong>Build Pipeline</strong> producing an artefact from the <strong>master</strong> branch.</li>
</ol>
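<p>For example, publishing the built module as a <strong>build artefact</strong> should only require a task along these lines (the path and artifact name here are hypothetical):</p>
<pre class="brush: plain; title: ; notranslate">
    - task: PublishBuildArtifacts@1
      inputs:
        pathToPublish: '$(Build.ArtifactStagingDirectory)'
        artifactName: 'CosmosDB'
      displayName: 'Publish Module Artifact'
</pre>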
<p><em>Note: <strong>Release Pipelines</strong> are often linked to a <strong>Continuous Delivery </strong>process. If there is interest, I will share a guide on how I set this up. However, at the time of writing this article, <strong>Release Pipelines </strong>in <strong>Azure DevOps</strong> cannot be controlled from a YAML file, so the more traditional <strong>visual designer</strong> method of defining a <strong>Release Pipeline</strong> is required.</em></p>
<h2>Summary</h2>
<p>I found the experience of enabling <strong>Azure DevOps Pipelines</strong> seamless and well thought-out. It provides several great features that aren’t as fully featured in other free CI tools, such as:</p>
<p><img class="alignnone wp-image-13085" src="http://www.powershellmagazine.com/wp-content/uploads/2018/09/Azure_DevOps_Pipelines_21.png" alt="" width="600" height="368" srcset="https://www.powershellmagazine.com/wp-content/uploads/2018/09/Azure_DevOps_Pipelines_21.png 1315w, https://www.powershellmagazine.com/wp-content/uploads/2018/09/Azure_DevOps_Pipelines_21-300x184.png 300w, https://www.powershellmagazine.com/wp-content/uploads/2018/09/Azure_DevOps_Pipelines_21-768x471.png 768w, https://www.powershellmagazine.com/wp-content/uploads/2018/09/Azure_DevOps_Pipelines_21-1024x628.png 1024w" sizes="(max-width: 600px) 100vw, 600px" /></p>
<ul>
<li>The <strong>Test</strong> <strong>Summary </strong>and test filters are also very useful when tracking down a failed test:</li>
</ul>
<p><img class="alignnone wp-image-13086" src="http://www.powershellmagazine.com/wp-content/uploads/2018/09/Azure_DevOps_Pipelines_22.png" alt="" width="600" height="305" srcset="https://www.powershellmagazine.com/wp-content/uploads/2018/09/Azure_DevOps_Pipelines_22.png 1330w, https://www.powershellmagazine.com/wp-content/uploads/2018/09/Azure_DevOps_Pipelines_22-300x153.png 300w, https://www.powershellmagazine.com/wp-content/uploads/2018/09/Azure_DevOps_Pipelines_22-768x391.png 768w, https://www.powershellmagazine.com/wp-content/uploads/2018/09/Azure_DevOps_Pipelines_22-1024x521.png 1024w" sizes="(max-width: 600px) 100vw, 600px" /></p>
<p>If you’re looking into implementing a CI process for an open source project, then <strong>Azure DevOps Pipelines</strong> is worth a look. And if you’re just wanting to add another layer of validation to a project with an existing CI process, then this should be a fun and easy implementation.</p>
]]></content:encoded>
							<wfw:commentRss>https://www.powershellmagazine.com/2018/09/20/converting-a-powershell-project-to-use-azure-devops-pipelines/feed/</wfw:commentRss>
		<slash:comments>0</slash:comments>
							</item>
		<item>
		<title>Configuring role-based access control (RBAC) for runbooks in Azure Automation</title>
		<link>https://www.powershellmagazine.com/2018/08/27/configuring-role-based-access-control-rbac-for-runbooks-in-azure-automation/</link>
				<comments>https://www.powershellmagazine.com/2018/08/27/configuring-role-based-access-control-rbac-for-runbooks-in-azure-automation/#respond</comments>
				<pubDate>Mon, 27 Aug 2018 16:00:58 +0000</pubDate>
		<dc:creator><![CDATA[Jan Egil Ring]]></dc:creator>
				<category><![CDATA[Articles]]></category>
		<category><![CDATA[Azure]]></category>
		<category><![CDATA[Azure Automation]]></category>

		<guid isPermaLink="false">http://www.powershellmagazine.com/?p=13054</guid>
				<description><![CDATA[When setting up runbooks in Azure Automation to invoke automation of a process through PowerShell or Python runbooks, the ability... ]]></description>
								<content:encoded><![CDATA[<p>When setting up runbooks in Azure Automation to automate a process through PowerShell or Python runbooks, the ability to assign permissions per runbook for user types such as power users is something we missed in the early days of Azure Automation.</p>
<p>When working with permissions in Azure, we have the concept of RBAC available:</p>
<p><em>Role-based access control (RBAC) enables access management for Azure resources. Using RBAC, you can segregate duties within your team and grant only the amount of access to users, groups, and applications that they need to perform their jobs.</em></p>
<p>During the past year, the ability to <a href="https://docs.microsoft.com/en-us/azure/automation/automation-role-based-access-control#configure-rbac-for-runbooks">configure RBAC for runbooks</a> was added to the service.</p>
<p>When I initially tested this feature, I granted a single test user permissions to an existing runbook.</p>
<pre class="brush: powershell; title: ; notranslate">
Connect-AzureRmAccount
Connect-AzureAD

# Resource Group name for the Automation Account
$rgName = &quot;AutomationWestEurope-Rg&quot;
# Name of the Automation Account
$automationAccountName = &quot;AutomationWestEurope&quot;
# Name of the runbook
$rbName = &quot;Invoke-RobocopyBackup&quot;
$userId = (Get-AzureADUser -ObjectId 'demo.user@powershell.no').ObjectId 

# Gets the Automation Account resource
$aa = Get-AzureRmResource -ResourceGroupName $rgName -ResourceType &quot;Microsoft.Automation/automationAccounts&quot; -ResourceName $automationAccountName

# Get the runbook resource
$rb = Get-AzureRmResource -ResourceGroupName $rgName -ResourceType &quot;Microsoft.Automation/automationAccounts/runbooks&quot; -ResourceName &quot;$automationAccountName/$rbName&quot;

# The Automation Job Operator role only needs to be assigned once per user
New-AzureRmRoleAssignment -ObjectId $userId -RoleDefinitionName &quot;Automation Job Operator&quot; -Scope $aa.ResourceId

# Adds the user to the Automation Runbook Operator role to the runbook scope
New-AzureRmRoleAssignment -ObjectId $userId -RoleDefinitionName &quot;Automation Runbook Operator&quot; -Scope $rb.ResourceId
</pre>
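<p>To confirm that the assignments took effect, you can list the role assignments for the user; a quick sketch:</p>
<pre class="brush: powershell; title: ; notranslate">
# List the role assignments for the test user to verify the two new roles
Get-AzureRmRoleAssignment -ObjectId $userId |
    Select-Object RoleDefinitionName, Scope
</pre>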
<p>I then logged into the Azure subscription using that user. As expected, the only resource the user has access to is the single runbook where permissions are assigned:</p>
<p><img class="alignnone size-full wp-image-13055" src="http://www.powershellmagazine.com/wp-content/uploads/2018/08/AARBAC_01.png" alt="" width="348" height="133" srcset="https://www.powershellmagazine.com/wp-content/uploads/2018/08/AARBAC_01.png 348w, https://www.powershellmagazine.com/wp-content/uploads/2018/08/AARBAC_01-300x115.png 300w" sizes="(max-width: 348px) 100vw, 348px" /></p>
<p>This means the user is unable to see the whole Automation account, including resources such as global variables, credentials and other shared resources.</p>
<p>After starting a runbook job, everything looked as expected, except for the Output stream being unavailable (“No access”):</p>
<p><img class="alignnone size-full wp-image-13056" src="http://www.powershellmagazine.com/wp-content/uploads/2018/08/AARBAC_02.png" alt="" width="256" height="236" /></p>
<p>After reaching out to the Automation team, I got in touch with Chris Sanders, who determined that the <em>Microsoft.Automation/automationAccounts/jobs/output/read</em> permission seemed to be missing.</p>
<p>I was recently asked to test again, since the bug is now fixed.</p>
<p>This time, the required permissions were in place and the Output stream was visible:</p>
<p><img class="alignnone wp-image-13057" src="http://www.powershellmagazine.com/wp-content/uploads/2018/08/AARBAC_03.png" alt="" width="600" height="363" srcset="https://www.powershellmagazine.com/wp-content/uploads/2018/08/AARBAC_03.png 943w, https://www.powershellmagazine.com/wp-content/uploads/2018/08/AARBAC_03-300x182.png 300w, https://www.powershellmagazine.com/wp-content/uploads/2018/08/AARBAC_03-768x465.png 768w" sizes="(max-width: 600px) 100vw, 600px" /></p>
<p>The ability to assign runbook permissions for users and groups is a very useful feature, making it possible to use the Azure Portal as the user interface for an automated process. It is then possible to perform tasks without granting the end user permissions directly to backend services such as a SQL database or local administrator permissions on a server.</p>
]]></content:encoded>
							<wfw:commentRss>https://www.powershellmagazine.com/2018/08/27/configuring-role-based-access-control-rbac-for-runbooks-in-azure-automation/feed/</wfw:commentRss>
		<slash:comments>0</slash:comments>
							</item>
		<item>
		<title>#PowerShell Conference Asia 2018 #PSConfAsia</title>
		<link>https://www.powershellmagazine.com/2018/08/14/powershell-conference-asia-2018-psconfasia/</link>
				<comments>https://www.powershellmagazine.com/2018/08/14/powershell-conference-asia-2018-psconfasia/#respond</comments>
				<pubDate>Tue, 14 Aug 2018 11:56:44 +0000</pubDate>
		<dc:creator><![CDATA[Ravikanth C]]></dc:creator>
				<category><![CDATA[Articles]]></category>
		<category><![CDATA[Community]]></category>
		<category><![CDATA[PSConfAsia]]></category>

		<guid isPermaLink="false">http://www.powershellmagazine.com/?p=13020</guid>
				<description><![CDATA[The PowerShell Conference Asia resumes for its 4th year this October, bringing speakers from Asia and around the world to... ]]></description>
								<content:encoded><![CDATA[<p>The <a href="http://psconf.asia/">PowerShell Conference Asia</a> resumes for its 4th year this October, bringing speakers from Asia and around the world to deliver in-depth PowerShell and DevOps sessions. I am excited to be a part of this great event for another year and can&#8217;t wait to meet all my old friends and make a few new friends. I will be bringing over a few copies of my <a href="https://www.apress.com/us/book/9781484234822">Pro PowerShell Desired State Configuration</a> book to give away!</p>
<p>Jeffrey Snover, father of PowerShell, will be at this year&#8217;s conference, and other speakers include the Microsoft PowerShell Product Team members from Redmond and a strong line-up of MVPs, well-known international speakers, and community contributors.&nbsp;They&#8217;ll cover in-depth topics on the PowerShell language and how you can use PowerShell to automate the technologies you use every day, both within the Microsoft technology stack and well beyond. There will be a strong focus on using PowerShell to enable DevOps practices, whether on-premises or in the cloud, as well as on Systems Administration and Security.</p>
<p>Our theme this year emphasizes the Open Source aspects of PowerShell development as we look to build even stronger community engagement and contribution.&nbsp;The main event runs on Friday and Saturday, but we also have a pre-conference day on Thursday for hands-on workshops.&nbsp;At the end of Day 1 (Friday), we have drinks and nibbles at a local bar where you can connect with peers and the speakers in a more relaxed setting. All included in your ticket price. Remember, we still have another full day of content on Saturday though!</p>
<p>Similar to my PowerShell Conference EU 2018 PS Drive, I made one for browsing the PS Conference Asia agenda as well. You can download the module from my GitHub repository&nbsp;<a href="https://github.com/rchaganti/psconfasiadrive">https://github.com/rchaganti/psconfasiadrive</a>.</p>
<p>Here is a quick snip of using this module.<a href="http://www.powershellmagazine.com/wp-content/uploads/2018/08/psconfasiadrive.gif" rel="lightbox[13020]"><img class="aligncenter wp-image-13024" src="http://www.powershellmagazine.com/wp-content/uploads/2018/08/psconfasiadrive-1024x537.gif" alt="" width="713" height="374" srcset="https://www.powershellmagazine.com/wp-content/uploads/2018/08/psconfasiadrive-1024x537.gif 1024w, https://www.powershellmagazine.com/wp-content/uploads/2018/08/psconfasiadrive-300x157.gif 300w, https://www.powershellmagazine.com/wp-content/uploads/2018/08/psconfasiadrive-768x403.gif 768w" sizes="(max-width: 713px) 100vw, 713px" /></a></p>
<p>If you are in the APJ / APAC region and have not yet registered to attend this conference, this is the right time to do so. You can use <a href="https://ti.to/psconfasia/psconfasia2018/discount/RaviLuvPoSH">this special link</a> for PowerShell Magazine readers to get a discount on the conference pass.</p>
]]></content:encoded>
							<wfw:commentRss>https://www.powershellmagazine.com/2018/08/14/powershell-conference-asia-2018-psconfasia/feed/</wfw:commentRss>
		<slash:comments>0</slash:comments>
							</item>
		<item>
		<title>Configuring Azure environment to set up Azure SQL Database Managed Instance (preview)</title>
		<link>https://www.powershellmagazine.com/2018/07/23/configuring-azure-environment-to-set-up-azure-sql-database-managed-instance-preview/</link>
				<comments>https://www.powershellmagazine.com/2018/07/23/configuring-azure-environment-to-set-up-azure-sql-database-managed-instance-preview/#respond</comments>
				<pubDate>Mon, 23 Jul 2018 16:00:34 +0000</pubDate>
		<dc:creator><![CDATA[Jovan Popovic]]></dc:creator>
				<category><![CDATA[Articles]]></category>
		<category><![CDATA[Azure]]></category>
		<category><![CDATA[SQL]]></category>

		<guid isPermaLink="false">http://www.powershellmagazine.com/?p=13010</guid>
				<description><![CDATA[Azure SQL Database Managed Instance (preview) is a new database service in Azure. This is a fully-managed latest version of... ]]></description>
								<content:encoded><![CDATA[<p><a href="https://docs.microsoft.com/en-us/azure/sql-database/sql-database-managed-instance-transact-sql-information">Azure SQL Database Managed Instance (preview)</a> is a new database service in Azure. This is a fully-managed latest version of SQL Server database engine hosted in Azure and managed by Azure and SQL Server teams.</p>
<p>Azure SQL Managed Instance is your private database engine PaaS resource that is placed in your Azure VNet with a private IP assigned from a range that you choose. You need to pre-configure the Azure environment where Managed Instances will be placed before creating new managed instances.</p>
<p>Although you can create and configure all necessary VNets, subnets, and network access rules using the Azure portal, this can be an error-prone task because you need to follow <a href="https://blogs.msdn.microsoft.com/sqlserverstorageengine/2018/03/14/how-to-configure-network-for-azure-sql-managed-instance/">the rules described here</a>. The better approach is to create a PowerShell script that creates and configures the Azure environment where you will place your Managed Instances. In the following sections, you will see one example of PowerShell code that configures the environment for Azure SQL Database Managed Instance.</p>
<h2>Configuring environment</h2>
<p>In order to provision SQL Database Managed Instances, you would need to do the following three important things:</p>
<ol>
<li>Create VNet where your Managed Instances will be injected.</li>
<li>Create a dedicated subnet in your VNet where Managed Instances should be placed.</li>
<li>Set a user-defined route table that enables communication between your Managed Instances in the private subnet and the Azure management service.</li>
</ol>
<p>In the following sections, we describe how to configure these elements using PowerShell. The assumption is that you already have an Azure subscription and that you can connect to your Azure account using PowerShell.</p>
<h2>Configuring virtual network</h2>
<p>In the first step, you need to create a VNet where your Managed Instances will be placed. In our example, the VNet will use the IP range 10.0.0.0/16 – you can change this according to your needs.</p>
<p>We also create a default subnet with the range 10.0.0.0/24 where you can place resources that need to communicate with Managed Instances (for example, VMs that host apps accessing a managed instance). <strong>Note that other resources cannot be placed in the subnet that is dedicated to Managed Instances. </strong>If you don’t need other resources in the VNet, you can use this subnet to place Managed Instances.</p>
<p>The following commands create a new resource group called myPowerShellMagazineResourceGroup and a VNet called myPowerShellMagazineNetwork in the West Central US region:</p>
<pre class="brush: powershell; title: ; notranslate">
$resourceGroup = 'myPowerShellMagazineResourceGroup'
$location = 'West Central US'
$vNetName = 'myPowerShellMagazineNetwork'

New-AzureRmResourceGroup -ResourceGroupName $resourceGroup -Location $location

$virtualNetwork = New-AzureRmVirtualNetwork -ResourceGroupName $resourceGroup -Location $location -Name $vNetName -AddressPrefix 10.0.0.0/16

$subnetConfig = Add-AzureRmVirtualNetworkSubnetConfig -Name default -AddressPrefix 10.0.0.0/24 -VirtualNetwork $virtualNetwork

</pre>
<p>Now, the first step is done and you need to create one or more additional subnets that would be dedicated to your Managed Instances.</p>
<h2>Configuring subnet for Managed Instance</h2>
<p>Every Managed Instance is placed in a subnet that defines the boundary of IP addresses that every instance can take. Managed Instances don’t have fixed IP addresses. The Azure service that controls and manages instances can move an instance to a different IP address if OS or database engine code is patched/upgraded, if a problem is detected and the instance needs to be moved to a new location, or if you want to assign more resources to the instance so it needs to be re-allocated to a machine with more resources.</p>
<p>Managed Instances assume that they can take any place/IP address within the IP range of the subnet, so if you accidentally put a VM or other resource in this subnet, it will clash with a Managed Instance. You need to plan carefully how big an IP range to assign to this subnet, because it cannot be changed once you create the first resource in the subnet. The IP range depends on the number of instances that you want to place in the subnet: you need at least two IP addresses per instance, plus 4 IP addresses reserved for internal services (this might change in the future).</p>
<p>The subnet is dedicated to SQL Database Managed Instances. It cannot contain any other resources, such as VMs, that could take an IP address in the subnet.</p>
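<p>As a rough sizing sketch based on the numbers above (two IP addresses per instance plus 4 reserved for internal services), you can estimate the minimum subnet size before committing to a prefix:</p>
<pre class="brush: powershell; title: ; notranslate">
# Estimate the minimum number of addresses for a planned number of instances
$instanceCount = 8
$requiredIPs   = ($instanceCount * 2) + 4   # 20 addresses in this example
# A /27 subnet provides 32 addresses, which covers this requirement
</pre>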
<p>Let’s create a new subnet called “mi” with the IP address range 10.0.1.0/24 (this could be changed according to your needs):</p>
<pre class="brush: powershell; title: ; notranslate">

$subnetConfigMi = Add-AzureRmVirtualNetworkSubnetConfig -Name mi -AddressPrefix 10.0.1.0/24 -VirtualNetwork $virtualNetwork

$virtualNetwork | Set-AzureRmVirtualNetwork

</pre>
<p>If you need more subnets where you want to group and organize your instances, you can repeat this code with different -Name and -AddressPrefix values.</p>
<h2>Enable access to Azure Management Service</h2>
<p>The final step is to configure access that enables the managed instances in the private IP range to communicate with the Azure service that manages them.</p>
<p>The Managed Instance subnet must have access to the Azure service that receives heartbeat signals from the Managed Instances placed in your subnet. These signals enable the service to check instance health and manage the instance (for example, perform regular backups, fail over the instance if something is wrong, etc.).</p>
<p>You need to create a route table containing a route with the address prefix 0.0.0.0/0 and the next hop type set to “Internet” to enable this communication, as shown in the following script:</p>
<pre class="brush: powershell; title: ; notranslate">

$routeTableMiManagementService = New-AzureRmRouteTable -Name 'myRouteTableMiManagementService' -ResourceGroupName $resourceGroup -location $location

Set-AzureRmVirtualNetworkSubnetConfig -VirtualNetwork $virtualNetwork -Name 'mi' -AddressPrefix 10.0.1.0/24 -RouteTable $routeTableMiManagementService |
Set-AzureRmVirtualNetwork

Get-AzureRmRouteTable -ResourceGroupName $resourceGroup -Name 'myRouteTableMiManagementService' |
Add-AzureRmRouteConfig -Name 'ToManagedInstanceManagementService' -AddressPrefix 0.0.0.0/0 -NextHopType 'Internet' |
Set-AzureRmRouteTable

</pre>
<p>NOTE: A user-defined route table with the described configuration is the current requirement during the public preview period. This requirement will change in the future to enable you to specify a narrower set of IP ranges for the traffic that goes outside the subnet. Always check the Managed Instance documentation for the latest security rules.</p>
<h2>Conclusion</h2>
<p>Once you finish the steps described in this article, you will have a prepared environment where you can create Azure SQL Database Managed Instances.</p>
<p>You can create Managed Instances using the Azure portal, ARM templates, Azure PowerShell, or Azure CLI. Whichever method you use, you need to provide the resource group, VNet, and subnet for your managed instances.</p>
<p>It is probably best to do this first in the Azure portal, because you can visually see how to configure the instance; then you can automate it using Azure PowerShell, following the steps from this article.</p>
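<p>For example, with the AzureRM.Sql module, creating an instance from PowerShell looks roughly like the following sketch. The cmdlet parameters were still evolving during the preview, and the instance name and sizes below are only placeholders:</p>
<pre class="brush: powershell; title: ; notranslate">
# Get the dedicated subnet created earlier
$subnet = Get-AzureRmVirtualNetworkSubnetConfig -Name 'mi' -VirtualNetwork $virtualNetwork

# Create the Managed Instance in that subnet (placeholder name and sizing)
New-AzureRmSqlManagedInstance -Name 'mypsmaginstance' `
    -ResourceGroupName $resourceGroup -Location $location `
    -SubnetId $subnet.Id `
    -AdministratorCredential (Get-Credential) `
    -LicenseType LicenseIncluded -StorageSizeInGB 32 -VCore 8 -SkuName GP_Gen4
</pre>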
]]></content:encoded>
							<wfw:commentRss>https://www.powershellmagazine.com/2018/07/23/configuring-azure-environment-to-set-up-azure-sql-database-managed-instance-preview/feed/</wfw:commentRss>
		<slash:comments>0</slash:comments>
							</item>
		<item>
		<title>#PSConfEU Agenda as a PowerShell Drive using #SHiPS</title>
		<link>https://www.powershellmagazine.com/2018/04/03/psconfeu-agenda-as-a-powershell-drive-using-ships/</link>
				<comments>https://www.powershellmagazine.com/2018/04/03/psconfeu-agenda-as-a-powershell-drive-using-ships/#respond</comments>
				<pubDate>Tue, 03 Apr 2018 16:24:41 +0000</pubDate>
		<dc:creator><![CDATA[Ravikanth C]]></dc:creator>
				<category><![CDATA[Articles]]></category>
		<category><![CDATA[Online Only]]></category>
		<category><![CDATA[PSConfEU]]></category>
		<category><![CDATA[PSDrive]]></category>
		<category><![CDATA[SHiPS]]></category>

		<guid isPermaLink="false">http://www.powershellmagazine.com/?p=12990</guid>
				<description><![CDATA[The SHiPS module has several use cases with structured data. I have written a few proof-of-concept modules using SHiPS to... ]]></description>
								<content:encoded><![CDATA[<p>The <a href="https://github.com/powershell/SHiPS">SHiPS</a> module has several use cases with structured data. I have written a few proof-of-concept modules using SHiPS to understand how it works and try out different design patterns.</p>
<p>One of my sessions at <a href="http://psconf.eu">PowerShell Conference EU 2018</a> is around using SHiPS. In the process of creating different demos for this session, I started implementing PS drives for several different things. One such module I created enables the ability to browse PowerShell Conference EU 2018 agenda as a PowerShell drive. I have an initial draft of this module at <a href="https://github.com/rchaganti/PSConfDrive">https://github.com/rchaganti/PSConfDrive</a>.</p>
<h2 id="how-to-install-the-module-">How to install the module?</h2>
<p>Since this is still a very early version of the module, I have not published it to the PowerShell Gallery yet. You need to download the <a href="https://github.com/rchaganti/PSConfDrive/archive/master.zip">zip archive</a> of the GitHub repository and extract it to a folder listed in <code>$env:PSModulePath</code>. You will also need the SHiPS module, which can be installed from the PowerShell Gallery.</p>
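<p>If you would rather script the download and extraction, something like the following works. This is only a sketch; it assumes the first folder in <code>$env:PSModulePath</code> is your per-user module path.</p>
<p></p><pre class="crayon-plain-tag"># Download the repository archive and unpack it into a module folder
$zipUrl  = 'https://github.com/rchaganti/PSConfDrive/archive/master.zip'
$zipPath = Join-Path -Path $env:TEMP -ChildPath 'PSConfDrive.zip'
Invoke-WebRequest -Uri $zipUrl -OutFile $zipPath

# Copy the extracted folder into the first path in $env:PSModulePath
$modulePath = ($env:PSModulePath -split ';')[0]
Expand-Archive -Path $zipPath -DestinationPath $env:TEMP -Force
Copy-Item -Path (Join-Path $env:TEMP 'PSConfDrive-master') -Destination (Join-Path $modulePath 'PSConfDrive') -Recurse -Force</pre><p></p>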
<p></p><pre class="crayon-plain-tag">Install-Module -Name SHiPS -Force</pre><p></p>
<p>The following commands will load the modules and map a PS drive.</p>
<p></p><pre class="crayon-plain-tag">Import-Module SHiPS -Force
Import-Module PSConfDrive -Force
New-PSDrive -Name PSConfEU -PSProvider SHiPS -Root psconfdrive#psconfeu</pre><p></p>
<p>Here is how you can use this PS drive for exploring the conference agenda.</p>
<p><img class="" src="https://i.imgur.com/cgdueER.gif" alt="" width="630" height="288" /></p>
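<p>Since SHiPS drives behave like any other PowerShell provider drive, the familiar item cmdlets apply. The paths below are illustrative; the actual container names depend on how the module models the agenda.</p>
<p></p><pre class="crayon-plain-tag"># Browse the mapped agenda drive like a file system
Get-ChildItem -Path PSConfEU:\

# Or walk the entire agenda tree
Get-ChildItem -Path PSConfEU:\ -Recurse</pre><p></p>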
<p>Once again, this is a POC only, and the design can still be optimized. If you plan to attend PSConfEU 2018, come to my session on SHiPS to understand how to use the module and choose the right design pattern for your own modules.</p>
]]></content:encoded>
							<wfw:commentRss>https://www.powershellmagazine.com/2018/04/03/psconfeu-agenda-as-a-powershell-drive-using-ships/feed/</wfw:commentRss>
		<slash:comments>0</slash:comments>
							</item>
		<item>
		<title>PowerShell Conference EU 2018</title>
		<link>https://www.powershellmagazine.com/2018/02/09/powershell-conference-eu-2018/</link>
				<comments>https://www.powershellmagazine.com/2018/02/09/powershell-conference-eu-2018/#respond</comments>
				<pubDate>Fri, 09 Feb 2018 14:51:40 +0000</pubDate>
		<dc:creator><![CDATA[Tobias Weltner]]></dc:creator>
				<category><![CDATA[Community]]></category>
		<category><![CDATA[Online Only]]></category>
		<category><![CDATA[2018]]></category>
		<category><![CDATA[PSConf EU]]></category>
		<category><![CDATA[PSConfEU]]></category>

		<guid isPermaLink="false">http://www.powershellmagazine.com/?p=12981</guid>
				<description><![CDATA[The US-based “PowerShell and DevOps Global Summit” sold out in record time this year, and the European “PowerShell Conference EU”... ]]></description>
								<content:encoded><![CDATA[<p>The US-based “<a href="http://powershell.org/summit">PowerShell and DevOps Global Summit</a>” sold out in record time this year, and the European “<a href="http://psconf.eu">PowerShell Conference EU</a>” (psconf.eu) is filling quickly. Both take place in April 2018. If they are new to you, then it’s probably because they are community-organized events, without a profit intention and with no huge marketing budget. That being said there is work being done to try to expand said marketing budget in the near future.&nbsp;</p>
<p>As a co-organizer of psconf.eu, I’d like to walk you through this year’s event and some of its ideas. Here is the official AfterMovie from PSConfEU 2017.</p>
<p><iframe width="560" height="315" src="https://www.youtube.com/embed/G38tN7B46Fg?controls=0" frameborder="0" allow="autoplay; encrypted-media" allowfullscreen="allowfullscreen"></iframe></p>
<p>Psconf.eu takes place this year April 17-20, in Hanover, Germany. The full agenda and last-minute information are available at www.psconf.eu.</p>
<p>There is no doubt that PowerShell is an essential skill set for modern administrators, and it has always evolved. When you look at the PowerShell ecosystem in the past 12 months, though, you’ll see an unprecedented pile of changes, some of which are transformative and even disruptive.</p>
<p>PowerShell went open-source, PowerShell Core 6 goes cross-platform and reaches out to Linux and macOS, admins are faced with two PowerShell “editions”, and Windows PowerShell is frozen. It’s an exciting mix of opportunities (reaching out to Linux/macOS, managing heterogeneous environments, using PowerShell in the cloud) and a deep desire for guidance (will I have to rewrite scripts with PowerShell Core 6? How can I develop code that works in all PowerShell editions? How safe is it to continue to rely on Windows PowerShell 5?).</p>
<p>The problem is: you can’t attend classes or get books for answers to any of these questions. By the time classes or books exist, they are inevitably outdated again. Classes and books are great for fundamentals, but too static to cope with cutting-edge topics, and Microsoft has been pretty cutting edge lately.</p>
<p>As an admin in a small shop, you may get away with ignoring the latest developments for a while. As a consultant or admin in larger enterprises, you cannot. First-hand &amp; cutting-edge information is what your future decisions are based on. It drives your business. Guaranteeing state of the art and safe operations is a must.</p>
<p>That’s why PowerShell conferences these days partially serve as training for hot topics, and feel more like intense advanced training for experienced PowerShell professionals. Instead of asking for permission to go to a conference, it would probably be more accurate to tap your training budget.</p>
<p>Should you feel you can’t get out of your job for 4 days, consider this: by bringing together experts from all disciplines to one location, and embedding them into a well-designed and rich agenda, surrounded by social and workshop events, these conferences provide the answers, guidance and orientation that save you endless hours of internet research, make sure you won’t reinvent the wheel, focus on future-proof techniques, and use the latest security guidelines.</p>
<h3>Getting Orientation and Guidance</h3>
<p>On day 1, we open the conference with delegate registration and then start together in the great historic Leibniz Saal. PowerShell inventor Jeffrey Snover kicks off the event with his keynote “PowerShell 2018 – State of the Art”, providing solid orientation and guidance: where do we stand, where do we go from here. Then, the conference fans out into three parallel tracks, most of which are delivered in English.</p>
<p>Some of these presentations dig deeper into PowerShell Core: Wojciech Sciesinski explains how to create future-proof PowerShell modules that are compatible with all PowerShell editions, and work cross-platform. PowerShell team members from Redmond showcase their current developments, Ben Gelens talks about the new DSC, and German .NET Guru Dr. Schwichtenberg explains what to do with PowerShell on Linux, and summarizes the essential .NET knowledge any ambitious PowerShell user should have.</p>
<h3>No-Nonsense Sessions &amp; Discussions</h3>
<p>If you have ever attended a classic conference, you know how exhausting it can be to listen to endless presentations and slide decks. At psconf.eu, all presentations are 45 minutes sharp, then open up into discussion. We want presentations to be concise and to the point, and prefer demos over endless slides. At the end, you have 15 minutes of Q&amp;A. “Presentations are great, but coffee breaks are where you meet people.” This is why psconf.eu has the highest coffee-break ratio in the industry: chat with the presenters, let the information sink in further, defrag your mind, and connect with others.</p>
<p>All materials will be downloadable, and sessions are recorded so you can replay them later. “Ask the Speakers” on day 1, “Ask the Experts” at every lunch, and the speakers and Microsoft finale at the end are all chances to get information for anything that wasn’t covered in the presentations. If you leave the event with a PowerShell question unanswered, then you did not ask.</p>
<h3>Social Event</h3>
<p>This is a personal conference where you get to know people, and know whom to ask. That’s why there is a limit on the number of delegates, and why we have social evenings. The legendary evening event on day 1 takes place in “Yukon Bay” again, an ancient gold digger town in the heart of the Hanover Zoo. It will be a big “Hello” for the alumni, and a warm “Welcome” to anyone new to psconf.eu. We’ll have great food and drinks, polar bears and seals, beer and wine, and the chance to hang loose and make new connections and friendships. Everyone hangs out, including speakers. You may want to continue to talk about PowerShell, but you may just as well kick back and enjoy the evening, the likelihood of which rises with time and the number of beers.</p>
<h3>Security – Essential Knowledge to Boost Your Career</h3>
<p>psconf.eu delivers 75 sessions and covers almost every aspect of PowerShell. It wouldn’t make sense to go over all sessions here. Visit <a href="http://www.psconf.eu">www.psconf.eu</a> instead and review the full agenda. Tip: hover over a session to view the abstract. The agenda is preliminary, and we hope to be able to implement a mobile-friendly version soon.</p>
<p>One topic stands out: Security. We want delegates to become security experts and really know the best practices and how to deal with unsafe code and attackers. “Investing in people” is the best protection you can get, and psconf.eu is the place where you can do this investment, and improve security awareness and skills:</p>
<p>Security expert Matt Graeber reviews “the current state of PowerShell Security Features and Bypasses”. This includes JEA (“Just Enough Admin”), which, when used correctly, can be tremendously effective at increasing security by reducing the blast radius of a compromise. Jan-Hendrik Peters and Raimund Andree from Microsoft Germany show you how: “Hands-on JEA”, complemented by David das Neves and Julien’s two-slot “The PowerShell Security Best Practice Live Demo”.</p>
<p>That’s literally just the tip of the iceberg. Red Teams and nation state threat actors alike are using PowerShell obfuscation to evade detection. Come see how the author of Invoke-Obfuscation and one of the original PowerShell developers tackle detecting obfuscation with PowerShell&#8217;s Abstract Syntax Tree and science in Revoke-Obfuscation (“Revoke-Obfuscation: PowerShell Obfuscation Detection (And Evasion) Using Science”). Attackers constantly update their tradecraft, forcing defenders to quickly build, tune, deploy &amp; maintain detections in never-ending sprints. Check out how applying DevOps practices &amp; frameworks like Pester, ScriptAnalyzer, &amp; custom fuzzers can drive robust methodology-based detection development (“DevSec Defense: How DevOps Practices Can Drive Detection Development For Defenders”)</p>
<p>Will Schroeder, one of the contributors of the “PowerShell Empire” post-exploitation agent, together with Jared Atkinson and Matt Graeber, sets up one of the three coding workshops on day 2 where you can get hands-on experience and learn how to check for security breaches in your own IT infrastructure.</p>
<h3>Plain Good Old PowerShell Knowledge</h3>
<p>Not every session is dead serious. The entire event is designed to have fun. Here are just a couple of sessions that are a bit eerie: Bartosz Bielawski dives into “PowerShell Yin-Yang: The Worst Practices and Tips &amp; Tricks”: “Every Yin has its Yang. Every Jedi has her Sith. Every bad practice can be balanced with an awesome trick. Join me to see the darkest places of the PowerShell scripting universe so that you know what to avoid! Get to know tricks that will impress your peers and tips that will make your life easier!”</p>
<p>At PSConfEU17, Mathias Jessen talked about regex, and some of its common applications. This year, he&#8217;ll dive head-first into some of the most bizarre functions .NET regex offers &#8211; the outer edge cases. Staffan Gustafsson takes a deep look into the PowerShell type system, including examples on how you can use it to adapt standard and third-party types to your own situation and workflow.&nbsp; And Jared Atkinson investigates .NET reflection and how to access the Windows API from within PowerShell.</p>
<p>So to wrap it up, we’d love to welcome you in Hanover! To register and reserve your seat, or review the agenda, please visit <a href="http://www.psconf.eu">www.psconf.eu</a>. Should you have any questions, please drop me a mail at <a href="mailto:tobias@powertheshell.com">tobias@powertheshell.com</a>.</p>
]]></content:encoded>
							<wfw:commentRss>https://www.powershellmagazine.com/2018/02/09/powershell-conference-eu-2018/feed/</wfw:commentRss>
		<slash:comments>0</slash:comments>
							</item>
		<item>
		<title>Azure DevTest Labs Artifact for installing PowerShell Core</title>
		<link>https://www.powershellmagazine.com/2018/01/22/azure-devtest-labs-artifact-for-installing-powershell-core/</link>
				<comments>https://www.powershellmagazine.com/2018/01/22/azure-devtest-labs-artifact-for-installing-powershell-core/#respond</comments>
				<pubDate>Mon, 22 Jan 2018 17:00:50 +0000</pubDate>
		<dc:creator><![CDATA[Ravikanth C]]></dc:creator>
				<category><![CDATA[Articles]]></category>
		<category><![CDATA[Azure]]></category>
		<category><![CDATA[Online Only]]></category>
		<category><![CDATA[DevTest Labs]]></category>
		<category><![CDATA[linux]]></category>
		<category><![CDATA[powershell core]]></category>
		<category><![CDATA[Windows]]></category>

		<guid isPermaLink="false">http://www.powershellmagazine.com/?p=12959</guid>
				<description><![CDATA[Unless you were living under a rock, PowerShell Core general availability isn&#8217;t any breaking news. PowerShell Core is a cross-platform... ]]></description>
								<content:encoded><![CDATA[<p>Unless you were living under a rock, <a href="https://github.com/powershell/powershell">PowerShell Core</a> <a href="https://blogs.msdn.microsoft.com/powershell/2018/01/10/powershell-core-6-0-generally-available-ga-and-supported/">general availability</a> isn&#8217;t any breaking news.</p>
<blockquote><p>PowerShell Core is a cross-platform (Windows, Linux, and macOS) automation and configuration tool/framework that works well with your existing tools and is optimized for dealing with structured data (e.g. JSON, CSV, XML, etc.), REST APIs, and object models. It includes a command-line shell, an associated scripting language and a framework for processing cmdlets.</p></blockquote>
<p>Sometime in 2016, I published the <a href="http://www.powershellmagazine.com/2016/08/22/azure-devtest-labs-artifact-for-installing-powershell-on-linux/">Azure DevTest Labs artifact for installing PowerShell for Linux</a> on Azure Linux virtual machines. Similarly, I have now created a new artifact for installing PowerShell Core on Windows VMs in Azure. This new artifact is not yet in the official artifacts repository; it lives in my <a href="https://github.com/rchaganti/azure-devtestlab">GitHub repository</a>. Therefore, to use it, you need to fork my repository and <a href="https://docs.microsoft.com/en-us/azure/devtest-lab/devtest-lab-add-artifact-repo">add it as an external repository source in your Azure DevTest lab</a>.</p>
<p><a href="http://www.powershellmagazine.com/wp-content/uploads/2018/01/1.png" rel="lightbox[12959]"><img class="aligncenter wp-image-12964" src="http://www.powershellmagazine.com/wp-content/uploads/2018/01/1.png" alt="" width="538" height="515" srcset="https://www.powershellmagazine.com/wp-content/uploads/2018/01/1.png 878w, https://www.powershellmagazine.com/wp-content/uploads/2018/01/1-300x287.png 300w, https://www.powershellmagazine.com/wp-content/uploads/2018/01/1-768x735.png 768w, https://www.powershellmagazine.com/wp-content/uploads/2018/01/1-32x32.png 32w" sizes="(max-width: 538px) 100vw, 538px" /></a></p>
<p>Once this custom repository is added, here is how you use the PowerShell Core artifact.</p>
<p>Select the Virtual Machines blade in the Azure DevTest Labs.</p>
<p><a href="http://www.powershellmagazine.com/wp-content/uploads/2018/01/2-2.png" rel="lightbox[12959]"><img class="aligncenter wp-image-12965" src="http://www.powershellmagazine.com/wp-content/uploads/2018/01/2-2-1024x344.png" alt="" width="546" height="183" srcset="https://www.powershellmagazine.com/wp-content/uploads/2018/01/2-2-1024x344.png 1024w, https://www.powershellmagazine.com/wp-content/uploads/2018/01/2-2-300x101.png 300w, https://www.powershellmagazine.com/wp-content/uploads/2018/01/2-2-768x258.png 768w, https://www.powershellmagazine.com/wp-content/uploads/2018/01/2-2.png 1215w" sizes="(max-width: 546px) 100vw, 546px" /></a>Click the VM, and then click on Artifacts and Apply Artifacts. In the search box, type PowerShell Core, and then click on the result in the Apply Artifacts blade.</p>
<p><a href="http://www.powershellmagazine.com/wp-content/uploads/2018/01/2.png" rel="lightbox[12959]"><img class="aligncenter wp-image-12967" src="http://www.powershellmagazine.com/wp-content/uploads/2018/01/2.png" alt="" width="477" height="404" srcset="https://www.powershellmagazine.com/wp-content/uploads/2018/01/2.png 878w, https://www.powershellmagazine.com/wp-content/uploads/2018/01/2-300x254.png 300w, https://www.powershellmagazine.com/wp-content/uploads/2018/01/2-768x650.png 768w" sizes="(max-width: 477px) 100vw, 477px" /></a></p>
<p>Click on the artifact and supply the parameters required.</p>
<p><a href="http://www.powershellmagazine.com/wp-content/uploads/2018/01/3.png" rel="lightbox[12959]"><img class="aligncenter wp-image-12968" src="http://www.powershellmagazine.com/wp-content/uploads/2018/01/3.png" alt="" width="334" height="382" srcset="https://www.powershellmagazine.com/wp-content/uploads/2018/01/3.png 456w, https://www.powershellmagazine.com/wp-content/uploads/2018/01/3-262x300.png 262w" sizes="(max-width: 334px) 100vw, 334px" /></a></p>
<p><strong>Package URL</strong> &#8211; The MSI URL for installing PowerShell Core. You can retrieve this from https://github.com/PowerShell/PowerShell/releases.</p>
<p><strong>Install C runtime for Windows OS prior to Windows Server 2016?</strong> &#8211; Select True on Windows Server 2012 R2 (required there if you want to use WinRM for PowerShell remoting); otherwise, select False.</p>
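<p>If you want to discover the latest MSI URL programmatically instead of copying it from the releases page, the public GitHub releases API can help. A sketch:</p>
<p></p><pre class="crayon-plain-tag"># Query the latest PowerShell Core release and pick the win-x64 MSI asset
$release = Invoke-RestMethod -Uri 'https://api.github.com/repos/PowerShell/PowerShell/releases/latest'
$packageUrl = ($release.assets | Where-Object { $_.name -like '*win-x64.msi' }).browser_download_url
$packageUrl</pre><p></p>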
<p>Click Add. The artifact installation might take a few minutes; once it is complete, you can access the install script&#8217;s verbose logs.</p>
<p><a href="http://www.powershellmagazine.com/wp-content/uploads/2018/01/4.png" rel="lightbox[12959]"><img class="aligncenter wp-image-12969" src="http://www.powershellmagazine.com/wp-content/uploads/2018/01/4-1024x693.png" alt="" width="530" height="359" srcset="https://www.powershellmagazine.com/wp-content/uploads/2018/01/4-1024x693.png 1024w, https://www.powershellmagazine.com/wp-content/uploads/2018/01/4-300x203.png 300w, https://www.powershellmagazine.com/wp-content/uploads/2018/01/4-768x520.png 768w, https://www.powershellmagazine.com/wp-content/uploads/2018/01/4.png 1025w" sizes="(max-width: 530px) 100vw, 530px" /></a></p>
<p>This is it. You can, of course, install these artifacts (Windows or Linux) at the time of Azure DTL VM provisioning itself. And, you can do this deployment via an ARM template as well.</p>
<p></p><pre class="crayon-plain-tag">{
    "$schema": "https://schema.management.azure.com/schemas/2015-01-01/deploymentTemplate.json",
    "contentVersion": "1.0.0.0",
    "parameters": {
        "newVMName": {
            "type": "string",
            "defaultValue": "PSCore1601"
        },
        "labName": {
            "type": "string",
            "defaultValue": "PSMagDTL"
        },
        "size": {
            "type": "string",
            "defaultValue": "Standard_A6"
        },
        "userName": {
            "type": "string",
            "defaultValue": "ravikanth"
        },
        "password": {
            "type": "securestring"
        },
        "PowerShell_Core_a.k.a_PowerShell_6_packageUrl": {
            "type": "string",
            "defaultValue": "https://github.com/PowerShell/PowerShell/releases/download/v6.0.0/PowerShell-6.0.0-win-x64.msi"
        },
        "PowerShell_Core_a.k.a_PowerShell_6_installCRuntime": {
            "type": "bool",
            "defaultValue": false
        },
        "labVirtualNetworkName" : {
            "type": "string"
        },
        "labSubnetName" : {
            "type" : "string"        
        }        
    },
    "variables": {
        "labVirtualNetworkId": "[resourceId('Microsoft.DevTestLab/labs/virtualnetworks', parameters('labName'), parameters('labVirtualNetworkName'))]",
        "vmId": "[resourceId ('Microsoft.DevTestLab/labs/virtualmachines', parameters('labName'), parameters('newVMName'))]",
        "vmName": "[concat(parameters('labName'), '/', parameters('newVMName'))]"
    },
    "resources": [
        {
            "apiVersion": "2017-04-26-preview",
            "type": "Microsoft.DevTestLab/labs/virtualmachines",
            "name": "[variables('vmName')]",
            "location": "[resourceGroup().location]",
            "properties": {
                "labVirtualNetworkId": "[variables('labVirtualNetworkId')]",
                "notes": "Windows Server 2016 Datacenter",
                "galleryImageReference": {
                    "offer": "WindowsServer",
                    "publisher": "MicrosoftWindowsServer",
                    "sku": "2016-Datacenter",
                    "osType": "Windows",
                    "version": "latest"
                },
                "size": "[parameters('size')]",
                "userName": "[parameters('userName')]",
                "password": "[parameters('password')]",
                "isAuthenticationWithSshKey": false,
                "artifacts": [
                    {
                        "artifactId": "[resourceId('Microsoft.DevTestLab/labs/artifactSources/artifacts', parameters('labName'), 'privaterepo596', 'windows-powershellcore')]",
                        "parameters": [
                            {
                                "name": "packageUrl",
                                "value": "[parameters('PowerShell_Core_a.k.a_PowerShell_6_packageUrl')]"
                            },
                            {
                                "name": "installCRuntime",
                                "value": "[parameters('PowerShell_Core_a.k.a_PowerShell_6_installCRuntime')]"
                            }
                        ]
                    }
                ],
                "labSubnetName": "[parameters('labSubnetName')]",
                "disallowPublicIpAddress": true,
                "storageType": "Standard",
                "allowClaim": false,
                "networkInterface": {
                    "sharedPublicIpAddressConfiguration": {
                        "inboundNatRules": [
                            {
                                "transportProtocol": "tcp",
                                "backendPort": 3389
                            }
                        ]
                    }
                }
            }
        }
    ],
    "outputs": {
        "labVMId": {
            "type": "string",
            "value": "[variables('vmId')]"
        }
    }
}</pre><p></p>
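<p>To kick off the deployment, you can use the AzureRM module. The resource group and file names below are placeholders for your own environment; any template parameters you omit are prompted for.</p>
<p></p><pre class="crayon-plain-tag"># Sign in and deploy the DevTest Labs VM template
Login-AzureRmAccount
New-AzureRmResourceGroupDeployment -ResourceGroupName 'PSMagDTLRG' `
                                   -TemplateFile .\PSCoreDtlVm.json `
                                   -labVirtualNetworkName 'DtlPSMagDTL' `
                                   -labSubnetName 'DtlPSMagDTLSubnet'</pre><p></p>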
<p>At the end of this template deployment, you will have a Windows Server 2016 VM with PowerShell Core installed.</p>
]]></content:encoded>
							<wfw:commentRss>https://www.powershellmagazine.com/2018/01/22/azure-devtest-labs-artifact-for-installing-powershell-core/feed/</wfw:commentRss>
		<slash:comments>0</slash:comments>
							</item>
		<item>
		<title>PowerShell UserVoice: Add Support for NPM Type Version Specification in Module Manifest and #Requires</title>
		<link>https://www.powershellmagazine.com/2018/01/08/powershell-user-voice-add-support-for-npm-type-version-specification-in-module-manifest-and-requires/</link>
				<comments>https://www.powershellmagazine.com/2018/01/08/powershell-user-voice-add-support-for-npm-type-version-specification-in-module-manifest-and-requires/#respond</comments>
				<pubDate>Mon, 08 Jan 2018 17:00:53 +0000</pubDate>
		<dc:creator><![CDATA[Ravikanth C]]></dc:creator>
				<category><![CDATA[Articles]]></category>
		<category><![CDATA[Online Only]]></category>
		<category><![CDATA[Module Specification]]></category>
		<category><![CDATA[PowerShell]]></category>

		<guid isPermaLink="false">http://www.powershellmagazine.com/?p=12949</guid>
				<description><![CDATA[If you have ever used Node.js, the packages.json file is used to specify the module dependencies. Here is an example:... ]]></description>
								<content:encoded><![CDATA[<p>If you have ever used Node.js, the packages.json file is used to specify the module dependencies. Here is an example:</p>
<p></p><pre class="crayon-plain-tag">{
  "name": "MyNodeJSApp",
  "version": "1.0.0",
  "description": "First Node JS Application",
  "main": "index.js",
  "scripts": {
    "test": "echo \"Error: no test specified\" &amp;&amp; exit 1"
  },
  "author": "Ravikanth Chaganti",
  "license": "MIT",
  "dependencies": {
    "express": "^4.16.2"
  }
}</pre><p></p>
<p>In the above snippet, the <code>dependencies</code> section specifies the express module with the version string <em>^4.16.2</em>. The version string here is prefixed with a caret (^). NPM supports different specification strings: we can prefix the version string with a tilde (~) as well, or simply use an asterisk (*) to mean the most recent or latest version of the module. Through version range comparators, versions can be specified in multiple ways. The <a href="https://github.com/npm/node-semver">node-semver repository</a> provides an in-depth view of this.</p>
<p>From the node-semver page,</p>
<p>A <code>comparator</code> is composed of an <code>operator</code> and a <code>version</code>. The set of primitive <code>operators</code> is:</p>
<ul>
<li><code>&lt;</code> Less than</li>
<li><code>&lt;=</code> Less than or equal to</li>
<li><code>&gt;</code> Greater than</li>
<li><code>&gt;=</code> Greater than or equal to</li>
<li><code>=</code> Equal. If no operator is specified, then equality is assumed, so this operator is optional, but MAY be included.</li>
</ul>
<p>For example, the comparator <code>&gt;=1.2.7</code> would match the versions <code>1.2.7</code>, <code>1.2.8</code>, <code>2.5.3</code>, and <code>1.3.9</code>, but not the versions <code>1.2.6</code> or <code>1.1.0</code>.</p>
<p>The tilde (~) and caret (^) ranges can be used as well.</p>
<h4>X-Ranges <code>1.2.x</code> <code>1.X</code> <code>1.2.*</code> <code>*</code></h4>
<p>Any of <code>X</code>, <code>x</code>, or <code>*</code> may be used to &#8220;stand in&#8221; for one of the numeric values in the <code>[major, minor, patch]</code> tuple.</p>
<ul>
<li><code>*</code> := <code>&gt;=0.0.0</code> (Any version satisfies)</li>
<li><code>1.x</code> := <code>&gt;=1.0.0 &lt;2.0.0</code> (Matching major version)</li>
<li><code>1.2.x</code> := <code>&gt;=1.2.0 &lt;1.3.0</code> (Matching major and minor versions)</li>
</ul>
<h4><a id="user-content-tilde-ranges-123-12-1" class="anchor" href="https://github.com/npm/node-semver#tilde-ranges-123-12-1" aria-hidden="true"></a>Tilde Ranges <code>~1.2.3</code> <code>~1.2</code> <code>~1</code></h4>
<p>Allows patch-level changes if a minor version is specified on the comparator. Allows minor-level changes if not.</p>
<h4><a id="user-content-caret-ranges-123-025-004" class="anchor" href="https://github.com/npm/node-semver#caret-ranges-123-025-004" aria-hidden="true"></a>Caret Ranges <code>^1.2.3</code> <code>^0.2.5</code> <code>^0.0.4</code></h4>
<p>Allows changes that do not modify the left-most non-zero digit in the <code>[major, minor, patch]</code> tuple. In other words, this allows patch and minor updates for versions <code>1.0.0</code> and above, patch updates for versions <code>0.X &gt;=0.1.0</code>, and <em>no</em> updates for versions <code>0.0.X</code>.</p>
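<p>To make the caret rule concrete, here is a minimal PowerShell sketch of the check for versions 1.0.0 and above (the 0.x special cases are omitted, and <code>Test-CaretRange</code> is a hypothetical helper, not an npm or PowerShell API):</p>
<p></p><pre class="crayon-plain-tag">function Test-CaretRange
{
    param
    (
        [version] $Version,  # candidate version
        [version] $Minimum   # the version after the caret, e.g. 1.2.3 for ^1.2.3
    )

    # ^1.2.3 := &gt;=1.2.3 &lt;2.0.0 for major versions &gt;= 1
    ($Version -ge $Minimum) -and ($Version.Major -eq $Minimum.Major)
}

Test-CaretRange -Version '1.4.0' -Minimum '1.2.3'   # True
Test-CaretRange -Version '2.0.0' -Minimum '1.2.3'   # False</pre><p></p>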
<h4>PowerShell UserVoice</h4>
<p>Coming to the subject of this article: if PowerShell module manifests and the #Requires statement supported similar version strings, we could specify module dependencies in a much more flexible way. To this extent, I have created a UserVoice item: <a href="https://windowsserver.uservoice.com/forums/301869-powershell/suggestions/32845762-support-for-npm-type-version-strings-in-powershell">https://windowsserver.uservoice.com/forums/301869-powershell/suggestions/32845762-support-for-npm-type-version-strings-in-powershell</a></p>
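<p>For contrast, this is what version constraints look like in PowerShell today: only explicit minimum, maximum, or exact versions. The module name and version numbers below are illustrative.</p>
<p></p><pre class="crayon-plain-tag">#Requires -Modules @{ ModuleName = 'SHiPS'; ModuleVersion = '0.8.0' }

# In a module manifest (.psd1), the same hashtable keys express a range:
RequiredModules = @(
    @{ ModuleName = 'SHiPS'; ModuleVersion = '0.8.0'; MaximumVersion = '0.9.9' }
)</pre><p></p>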
<p>If you think this would be a useful feature in PowerShell, go ahead and vote it up!</p>
]]></content:encoded>
							<wfw:commentRss>https://www.powershellmagazine.com/2018/01/08/powershell-user-voice-add-support-for-npm-type-version-specification-in-module-manifest-and-requires/feed/</wfw:commentRss>
		<slash:comments>0</slash:comments>
							</item>
		<item>
		<title>#PSDSC Configuration Versioning in a Deployment Pipeline</title>
		<link>https://www.powershellmagazine.com/2018/01/05/psdsc-configuration-versioning-in-a-deployment-pipeline/</link>
				<comments>https://www.powershellmagazine.com/2018/01/05/psdsc-configuration-versioning-in-a-deployment-pipeline/#comments</comments>
				<pubDate>Fri, 05 Jan 2018 17:00:00 +0000</pubDate>
		<dc:creator><![CDATA[Ravikanth C]]></dc:creator>
				<category><![CDATA[Articles]]></category>
		<category><![CDATA[DevOps]]></category>
		<category><![CDATA[Online Only]]></category>
		<category><![CDATA[PSDSC]]></category>

		<guid isPermaLink="false">http://www.powershellmagazine.com/?p=12772</guid>
				<description><![CDATA[When we talk about applications or software deployed in the infrastructure, we simply refer to the version of the application... ]]></description>
								<content:encoded><![CDATA[<p>When we talk about applications or software deployed in the infrastructure, we simply <strong>refer to the version of the application</strong> <strong>or software</strong> running in the infrastructure. At any point in time, we can look at the installed service or software and understand what version of that is currently running on the system, we need to remember that <a href="https://www.salesforce.com/blog/2013/07/selling-service-not-sales.html">selling is service</a> not sales . How about your node configurations? Does your node tell you what version of the configuration it is currently using?</p>
<p>For example, consider that you have a set of web server nodes and each configured using PowerShell DSC. As a part of the initial configuration, you deployed a set of web applications on the node. And, at some point in time later, you made multiple changes to the node configuration in terms of adding or removing web applications and updating application configurations. If multiple such source controlled configurations are deployed on each of these nodes, how exactly do you figure out what version of the node configuration is being used on each of the nodes? This can probably be achieved by a complete suite of tools but is there a way, today, you can do this by querying the node itself? The answer is no. At least, in the PowerShell DSC world.</p>
<p>One of the core concepts of Infrastructure as Code (IaC) is to version/source control your infrastructure configurations so that it becomes easy to track which version of the configuration a target node has, and to roll back to the last known working configuration when needed. But the compiled DSC MOF files do not carry this information today, nor are they aware of any source/version control systems; they need not be, either.</p>
<p>At this point in time, I track my node configuration version by adding a DSC resource that caches the node configuration version information. As a part of the build process, I can update the version of the configuration using this DSC resource. When I need to know what version of configuration a node is using, I simply invoke the <em>Get-DscConfiguration</em> cmdlet to verify that.</p>
<p>I packaged this DSC resource into a module of its own and published it to my GitHub account.</p>
<h4>ConfigurationDSC resource module</h4>
<p>Through the <a href="https://github.com/rchaganti/ConfigurationDsc">ConfigurationDSC</a> resource module, I plan to combine various resources that help in managing DSC-based configurations in a deployment pipeline. At this point in time, there is only one resource, <a href="https://github.com/rchaganti/ConfigurationDsc/tree/master/DscResources/ConfigurationVersion">ConfigurationVersion</a>, which helps in tracking the version of the configuration document. Here is an example of a configuration document using the <em>ConfigurationVersion</em> DSC resource.</p>
<p></p><pre class="crayon-plain-tag">Configuration VersionedConfiguration
{
    param
    (
        [Parameter(Mandatory = $true)]
        [String] $ConfigurationName,

        [Parameter(Mandatory = $true)]
        [String] $ConfigurationVersion
    )

    Import-DscResource -ModuleName PSDesiredStateConfiguration -ModuleVersion 1.1
    Import-DscResource -ModuleName ConfigurationDsc -ModuleVersion 1.0.0.0

    ConfigurationVersion WebServerConfiguration
    {
        Name = $ConfigurationName
        Version = $ConfigurationVersion
    }

    WindowsFeature WebServer
    {
        Name = 'Web-Server'
        IncludeAllSubFeature = $true
        Ensure = 'Present'
    }
}

VersionedConfiguration -ConfigurationName 'WebServerConfig' -ConfigurationVersion '1.0.0.0'</pre><p></p>
<p>Once this configuration is enacted, I can use the <em>Get-DscConfiguration </em>cmdlet to find the version of the node configuration.</p>
<p></p><pre class="crayon-plain-tag">Get-DscConfiguration | Where-Object { $_.CimClassName -eq 'ConfigurationVersion' } | Select -ExpandProperty Version</pre><p></p>
<p>What I have shown here is only a workaround, with its own pros and cons. I would rather see this added to the existing DSC feature set. A compiled MOF already carries a few configuration meta properties. For example, the MOF compiled from the above configuration contains the following instance of the <em>OMI_ConfigurationDocument</em> class.</p>
<p></p><pre class="crayon-plain-tag">instance of OMI_ConfigurationDocument
{
    Version="2.0.0";
    MinimumCompatibleVersion = "1.0.0";
    CompatibleVersionAdditionalProperties= {"Omi_BaseResource:ConfigurationName"};
    Author="Administrator";
    GenerationDate="01/05/2018 11:07:56";
    GenerationHost="MGMT01";
    Name="VersionedConfiguration";
};</pre><p></p>
<p>It would be ideal if we could add the configuration version from the source/version control system to this instance of OMI_ConfigurationDocument. I created a User Voice item for this: <a href="https://windowsserver.uservoice.com/forums/301869-powershell/suggestions/32825272-enable-configuration-version-tracking-for-compiled">https://windowsserver.uservoice.com/forums/301869-powershell/suggestions/32825272-enable-configuration-version-tracking-for-compiled</a></p>
<p>Go ahead and vote it up if you think this will be useful in your infrastructure as well.</p>
]]></content:encoded>
							<wfw:commentRss>https://www.powershellmagazine.com/2018/01/05/psdsc-configuration-versioning-in-a-deployment-pipeline/feed/</wfw:commentRss>
		<slash:comments>2</slash:comments>
							</item>
		<item>
		<title>Using NetworkAdapterProperty #PSDSC Resource to Configure Network Adapter Advanced Properties</title>
		<link>https://www.powershellmagazine.com/2018/01/04/using-networkadapterproperty-psdsc-resource-to-configure-network-adapter-advanced-properties/</link>
				<comments>https://www.powershellmagazine.com/2018/01/04/using-networkadapterproperty-psdsc-resource-to-configure-network-adapter-advanced-properties/#respond</comments>
				<pubDate>Thu, 04 Jan 2018 17:00:33 +0000</pubDate>
		<dc:creator><![CDATA[Ravikanth C]]></dc:creator>
				<category><![CDATA[Articles]]></category>
		<category><![CDATA[Online Only]]></category>
		<category><![CDATA[NetworkingDSC]]></category>
		<category><![CDATA[PSDSC]]></category>

		<guid isPermaLink="false">http://www.powershellmagazine.com/?p=12938</guid>
				<description><![CDATA[At times we need to set the physical adapter advanced properties such as VLAN ID. This can be done using... ]]></description>
								<content:encoded><![CDATA[<p>At times we need to set a physical adapter's advanced properties, such as the VLAN ID. This can be done using the <em>Set-NetAdapterAdvancedProperty</em> cmdlet. However, when using DSC-based configuration management, it makes sense to configure these advanced properties using DSC resource modules as well. Within the automation I have been working on for automated deployments of Storage Spaces Direct clusters, I needed to set adapter advanced properties, and that is what prompted me to write this new resource &#8212; <a href="https://github.com/rchaganti/NetworkingDSC">NetworkAdapterProperty</a>. It is a part of the <em>NetworkingDSC</em> module.</p>
<p><em>This is not a fork of the xNetworking module. I am adding only the resources that I am developing from scratch to NetworkingDsc. These resources will follow the HQRM guidelines.</em></p>
<h4>Resource Properties</h4>
<table border="1">
<thead>
<tr>
<th>Property Name</th>
<th>Description</th>
</tr>
</thead>
<tbody>
<tr>
<td>Id</td>
<td>Key property to uniquely identify the resource instance.</td>
</tr>
<tr>
<td>Name</td>
<td>Name of the network adapter.</td>
</tr>
<tr>
<td>DisplayName</td>
<td>Display Name of the advanced property.</td>
</tr>
<tr>
<td>DisplayValue</td>
<td>Value to be set on the advanced property.</td>
</tr>
<tr>
<td>Ensure</td>
<td>Specifies if the property needs to be configured (present) or reset to its default value (absent).</td>
</tr>
</tbody>
</table>
<p>The display name of a network adapter advanced property can be seen by using the <em>Get-NetAdapterAdvancedProperty</em> cmdlet. Depending on what the network adapter driver implements, this list changes between different network adapter models/vendors. Let&#8217;s see a couple of examples of using this resource.</p>
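<p>For instance, to see which display names a given adapter driver implements and what values each property accepts, you can run something like the following. The adapter name here is just a placeholder.</p>
<p></p><pre class="crayon-plain-tag"># List the advanced properties a specific adapter driver implements
Get-NetAdapterAdvancedProperty -Name 'SLOT 1 PORT 1' |
    Select-Object DisplayName, DisplayValue, ValidDisplayValues</pre><p></p>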
<h4>Configuring VLAN ID advanced property</h4>
<p>The display name of the advanced property for VLAN configuration on a physical adapter is usually <em>VLAN ID</em>.</p>
<p></p><pre class="crayon-plain-tag">Configuration PhysicalAdapterVLAN
{
    Import-DscResource -ModuleName PSDesiredStateConfiguration -ModuleVersion 1.1
    Import-DscResource -ModuleName NetworkingDsc -ModuleVersion 1.0.0.0

    NetworkAdapterProperty VLAN
    {
        Id = 'S1P1VLAN'
        Name = 'SLOT 1 PORT 1'
        DisplayName =  'VLAN ID'
        DisplayValue = '102'
        Ensure = 'Present'
    }
}</pre><p></p>
<h4>Configuring DCBX Mode on Mellanox CX4 adapters</h4>
<p>The following example is specific to Mellanox CX4 adapters. These adapters support firmware-controlled or host-controlled DCB exchange. We can configure this by changing the value of the <em>DcbxMode</em> property.</p>
<p></p><pre class="crayon-plain-tag">Configuration PhysicalAdapterDcbx
{
    Import-DscResource -ModuleName PSDesiredStateConfiguration -ModuleVersion 1.1
    Import-DscResource -ModuleName NetworkingDsc -ModuleVersion 1.0.0.0

    NetworkAdapterProperty S1P1DCBX
    {
        Id = 'S1P1DCBX'
        Name = 'SLOT 1 PORT 1'
        DisplayName =  'DcbxMode'
        DisplayValue = 'Host in charge'
        Ensure = 'Present'
    }

    NetworkAdapterProperty S1P2DCBX
    {
        Id = 'S1P2DCBX'
        Name = 'SLOT 1 PORT 2'
        DisplayName =  'DcbxMode'
        DisplayValue = 'Host in charge'
        Ensure = 'Present'
    }    
}</pre><p></p>
<p>This resource will eventually be made a part of NetworkingDsc in the official DSC resource kit. Stay tuned.</p>
]]></content:encoded>
							<wfw:commentRss>https://www.powershellmagazine.com/2018/01/04/using-networkadapterproperty-psdsc-resource-to-configure-network-adapter-advanced-properties/feed/</wfw:commentRss>
		<slash:comments>0</slash:comments>
							</item>
		<item>
		<title>Using NetQoSDSC #PSDSC Resource Module to Configure Network QoS</title>
		<link>https://www.powershellmagazine.com/2018/01/03/using-netqosdsc-psdsc-resource-module-to-configure-network-qos/</link>
				<comments>https://www.powershellmagazine.com/2018/01/03/using-netqosdsc-psdsc-resource-module-to-configure-network-qos/#respond</comments>
				<pubDate>Wed, 03 Jan 2018 17:00:22 +0000</pubDate>
		<dc:creator><![CDATA[Ravikanth C]]></dc:creator>
				<category><![CDATA[Articles]]></category>
		<category><![CDATA[Online Only]]></category>
		<category><![CDATA[DSC]]></category>
		<category><![CDATA[PSDSC]]></category>
		<category><![CDATA[resource modules]]></category>
		<category><![CDATA[Resources]]></category>

		<guid isPermaLink="false">http://www.powershellmagazine.com/?p=12933</guid>
				<description><![CDATA[As a part of larger hyper-converged infrastructure (based on S2D) configuration automation using PowerShell DSC, I have written quite a... ]]></description>
								<content:encoded><![CDATA[<p>As a part of larger hyper-converged infrastructure (based on S2D) configuration automation using PowerShell DSC, I have written quite a few new DSC resource modules. <a href="https://github.com/rchaganti/FailoverClusterDSC">FailoverClusterDSC</a> was one of the modules in that list. I added <a href="https://github.com/rchaganti/NetQoSDSC">NetQoSDSC</a> as well to ensure I have the automated means to configure the QoS policies in Windows Server 2016.</p>
<p>This module contains five resources at the moment.</p>
<table border="1">
<thead>
<tr>
<th>Resource Name</th>
<th>Description</th>
</tr>
</thead>
<tbody>
<tr>
<td>NetAdapterQoS</td>
<td>Enable/disable network adapter QoS.</td>
</tr>
<tr>
<td>NetQosDCBXSetting</td>
<td>Enable/disable DCBX willing state in the OS. This can be done at the global scope or for a specific interface.</td>
</tr>
<tr>
<td>NetQoSFlowControl</td>
<td>Enable or disable 802.1P flow control priorities.</td>
</tr>
<tr>
<td>NetQoSPolicy</td>
<td>Create and configure QoS policies.</td>
</tr>
<tr>
<td>NetQoSTrafficClass</td>
<td>Create and manage QoS traffic classes.</td>
</tr>
</tbody>
</table>
<h4>Enable/Disable Network Adapter QoS</h4>
<p>The <em>NetAdapterQoS</em> resource can be used to enable/disable QoS on a specific network adapter.</p>
<p></p><pre class="crayon-plain-tag">Configuration NetAdapterQoS
{
    Import-DscResource -ModuleName PSDesiredStateConfiguration -ModuleVersion 1.1
    Import-DscResource -ModuleName NetQoSDSC -ModuleVersion 1.0.0.0

    NetAdapterQoS EnableAdapterQoS
    {
        NetAdapterName = 'Storage1'
        Ensure = 'Present'
    }
}</pre><p></p>
<h4>Enable/Disable DCBX Willing mode</h4>
<p>DCBX willing mode can be enabled or disabled using the <em>NetQoSDCBXSetting</em> resource. This can be done at an interface level or at the global level in the operating system.</p>
<p></p><pre class="crayon-plain-tag">Configuration DisableGlobalDCBX
{
    Import-DscResource -ModuleName PSDesiredStateConfiguration -ModuleVersion 1.1
    Import-DscResource -ModuleName NetQosDsc -ModuleVersion 1.0.0.0

    NetQoSDcbxSetting DisableGlobal
    {
        InterfaceAlias = 'Global'
        Willing = $false
    }

    NetQoSDcbxSetting EnableStorage1
    {
        InterfaceAlias = 'Storage1'
        Willing = $true
    }
}</pre><p></p>
<h4>Enable/Disable Network QoS flow control priorities</h4>
<p>The <em>NetQosFlowControl</em> resource can be used to enable or disable 802.1P flow control priorities.</p>
<p></p><pre class="crayon-plain-tag">Configuration NetQoSFlowControl
{
    Import-DscResource -ModuleName PSDesiredStateConfiguration -ModuleVersion 1.1
    Import-DscResource -ModuleName NetQoSDSC -ModuleVersion 1.0.0.0

    NetQoSFlowControl EnableP3
    {
        Id = 'Priority3'
        Priority = 3
        Enabled = $true
    }

    NetQoSFlowControl DisableRest
    {
        Id = 'RestPriority'
        Priority = @(0,1,2,4,5,6,7)
        Enabled = $false
    }
}</pre><p></p>
<h4>Create new QoS policies</h4>
<p>New network QoS policies can be created using the <em>NetQoSPolicy</em> resource.</p>
<p></p><pre class="crayon-plain-tag">Configuration NewNetQoSPolicy
{
    Import-DscResource -ModuleName PSDesiredStateConfiguration -ModuleVersion 1.1
    Import-DscResource -ModuleName NetQoSDSC -ModuleVersion 1.0.0.0

    NetQosPolicy SMB
    {
        Name = 'SMB'
        PriorityValue8021Action = 3
        PolicyStore = 'localhost'
        NetDirectPortMatchCondition = 445
        Ensure = 'Present'
    }
}</pre><p></p>
<h4>Manage Network QoS Traffic classes</h4>
<p>The NetQoSTrafficClass resource can be used to manage the traffic classes in network QoS.</p>
<p></p><pre class="crayon-plain-tag">Configuration NewTrafficClass
{
    Import-DscResource -ModuleName PSDesiredStateConfiguration -ModuleVersion 1.1
    Import-DscResource -ModuleName NetQoSDSC -ModuleVersion 1.0.0.0

    NetQosTrafficClass SMB
    {
        Name = 'SMB'
        Algorithm = 'ETS'
        Priority = 3
        BandwidthPercentage = 50
        Ensure = 'Present'
    }
}</pre><p></p>
<p>This module, while code complete, needs some more work before it can be declared fully HQRM-compliant. I am working towards that by adding tests and better examples. Feel free to submit issues, feedback, or PRs.</p>
]]></content:encoded>
							<wfw:commentRss>https://www.powershellmagazine.com/2018/01/03/using-netqosdsc-psdsc-resource-module-to-configure-network-qos/feed/</wfw:commentRss>
		<slash:comments>0</slash:comments>
							</item>
		<item>
		<title>Hyper-V Server PowerShell Drive based on #SHiPS</title>
		<link>https://www.powershellmagazine.com/2018/01/02/hyper-v-server-powershell-drive-based-on-ships/</link>
				<comments>https://www.powershellmagazine.com/2018/01/02/hyper-v-server-powershell-drive-based-on-ships/#respond</comments>
				<pubDate>Tue, 02 Jan 2018 17:00:29 +0000</pubDate>
		<dc:creator><![CDATA[Ravikanth C]]></dc:creator>
				<category><![CDATA[Articles]]></category>
		<category><![CDATA[Hyper-V]]></category>
		<category><![CDATA[Online Only]]></category>
		<category><![CDATA[Providers]]></category>
		<category><![CDATA[SHiPS]]></category>

		<guid isPermaLink="false">http://www.powershellmagazine.com/?p=12920</guid>
				<description><![CDATA[In an earlier article, I had written about a PowerShell provider for Failover Clusters written using the SHiPS provider framework.... ]]></description>
								<content:encoded><![CDATA[<p>In an earlier article, I had written about a <a href="http://www.powershellmagazine.com/2017/12/21/failover-cluster-powershell-drive-based-on-ships/">PowerShell provider for Failover Clusters</a> written using the SHiPS provider framework. I have been experimenting with this a bit and made a few more providers.</p>
<p>In today&#8217;s article, I am introducing the <a href="https://github.com/rchaganti/HyperVDrive">Hyper-V Server PowerShell provider</a>.</p>
<p>Using this provider, you can connect to local and remote Hyper-V hosts and browse the virtual machines and virtual networks as if they are folders on a file system.</p>
<p><a href="http://www.powershellmagazine.com/wp-content/uploads/2018/01/HyperVDrive.gif" rel="lightbox[12920]"><img class="aligncenter wp-image-12923" src="http://www.powershellmagazine.com/wp-content/uploads/2018/01/HyperVDrive.gif" alt="" width="424" height="218" /></a></p>
<p>Once again, like every other provider I am writing, this one is experimental. I am writing these to understand what needs to be considered as part of the provider design and implementation. So, the final version of these providers may look and function differently.</p>
<div>
<h3>TODO:</h3>
<div>&#8211; Add support for Hyper-V Host properties as leaf</div>
<div>&#8211; Fix support for using the module on a system that has only the Hyper-V management tools installed and is not a Hyper-V host.</div>
<div>&#8211; Add formats for better output</div>
</div>
<p>Follow the <a href="https://github.com/rchaganti/HyperVDrive">Github repository</a> for information on any future updates.</p>
]]></content:encoded>
							<wfw:commentRss>https://www.powershellmagazine.com/2018/01/02/hyper-v-server-powershell-drive-based-on-ships/feed/</wfw:commentRss>
		<slash:comments>0</slash:comments>
							</item>
		<item>
		<title>Failover Cluster PowerShell Drive based on #SHiPS</title>
		<link>https://www.powershellmagazine.com/2017/12/21/failover-cluster-powershell-drive-based-on-ships/</link>
				<comments>https://www.powershellmagazine.com/2017/12/21/failover-cluster-powershell-drive-based-on-ships/#comments</comments>
				<pubDate>Thu, 21 Dec 2017 17:00:48 +0000</pubDate>
		<dc:creator><![CDATA[Ravikanth C]]></dc:creator>
				<category><![CDATA[Articles]]></category>
		<category><![CDATA[Online Only]]></category>
		<category><![CDATA[failover cluster]]></category>
		<category><![CDATA[PowerShell Provider]]></category>
		<category><![CDATA[SHiPS]]></category>

		<guid isPermaLink="false">http://www.powershellmagazine.com/?p=12910</guid>
				<description><![CDATA[Simple Hierarchy in PowerShell (SHiPS) is a module that simplifies implementing PowerShell providers. If you are new to PowerShell providers, a... ]]></description>
								<content:encoded><![CDATA[<p>Simple Hierarchy in PowerShell (<a href="https://github.com/PowerShell/SHiPS">SHiPS</a>) is a module that simplifies implementing PowerShell providers. If you are new to PowerShell providers, a PowerShell provider allows any data store to be exposed like a file system as if it were a mounted drive. In other words, the data in your data store can be treated like files and directories so that a user can navigate data via <em>Set-Location</em> (cd) and <em>Get-ChildItem</em> (dir or ls).</p>
<p>I have been looking at this and experimenting with a few providers of my own. I will write more about how to approach writing a PowerShell provider using SHiPS but wanted to give you a sneak peek into the Failover Cluster PowerShell Drive (<a href="https://github.com/rchaganti/FailoverClusterDrive">FailoverClusterDrive</a>).</p>
<p>Here is the Failover Cluster PowerShell Drive in action.</p>
<p><a href="https://camo.githubusercontent.com/ce719f09863b6955b3ab6b06b3e1f54926ceb62c/68747470733a2f2f692e696d6775722e636f6d2f466a5461706f472e676966"><img class="alignnone" src="https://camo.githubusercontent.com/ce719f09863b6955b3ab6b06b3e1f54926ceb62c/68747470733a2f2f692e696d6775722e636f6d2f466a5461706f472e676966" alt="" width="596" height="308" /></a></p>
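<p>To try it yourself, a SHiPS-based drive is mounted with a root of the form <em>ModuleName#RootClassName</em>. The sketch below assumes a hypothetical root class name; check the repository for the actual one.</p>
<p></p><pre class="crayon-plain-tag">Import-Module SHiPS
Import-Module FailoverClusterDrive

# -Root takes the form 'ModuleName#RootClassName';
# 'FailoverClusterRoot' is an assumed class name
New-PSDrive -Name FC -PSProvider SHiPS -Root 'FailoverClusterDrive#FailoverClusterRoot'

# Navigate the cluster like a file system
Set-Location FC:
Get-ChildItem</pre><p></p>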
<p>This is still an experimental module. SHiPS currently supports only get actions, so the mounted failover cluster drive is read-only. There are a few more additions I am still working on in my free time, and I will push another release early next year.</p>
<div>
<h3>TODO</h3>
<ul>
<li>Add support for Cluster Storage as a container</li>
<li>Add support for browsing cluster resource parameters as a container</li>
<li>Fix support for using the module on a system with RSAT-ClusteringTools and not a cluster node.</li>
<li>Add formats for better output</li>
</ul>
<p>Stay tuned!</p>
</div>
]]></content:encoded>
							<wfw:commentRss>https://www.powershellmagazine.com/2017/12/21/failover-cluster-powershell-drive-based-on-ships/feed/</wfw:commentRss>
		<slash:comments>1</slash:comments>
							</item>
		<item>
		<title>#PSDSC FailoverClusterDSC &#8211; Deploy a Storage Spaces Direct Cluster</title>
		<link>https://www.powershellmagazine.com/2017/12/19/psdsc-failoverclusterdsc-deploy-a-storage-spaces-direct-cluster/</link>
				<comments>https://www.powershellmagazine.com/2017/12/19/psdsc-failoverclusterdsc-deploy-a-storage-spaces-direct-cluster/#respond</comments>
				<pubDate>Tue, 19 Dec 2017 17:00:56 +0000</pubDate>
		<dc:creator><![CDATA[Ravikanth C]]></dc:creator>
				<category><![CDATA[Articles]]></category>
		<category><![CDATA[Online Only]]></category>
		<category><![CDATA[PowerShell DSC]]></category>
		<category><![CDATA[PSDSC]]></category>
		<category><![CDATA[S2D]]></category>
		<category><![CDATA[Storage Spaces Direct]]></category>

		<guid isPermaLink="false">http://www.powershellmagazine.com/?p=12902</guid>
				<description><![CDATA[I have been working on the FailoverClusterDsc resource module and finally had the chance to add some examples and make... ]]></description>
								<content:encoded><![CDATA[<p>I have been working on the <a href="https://github.com/rchaganti/FailoverClusterDSC"><em>FailoverClusterDsc</em> </a>resource module and finally had the chance to add some examples and make the repository public.</p>
<p><em>This is not a fork of the <a href="http://github.com/powershell/xFailoverCluster">xFailoverCluster</a> module. I am adding only the resources that I am developing from scratch to this module. These resources will follow the HQRM guidelines.</em></p>
<h3>Resources in this module</h3>
<table border="1">
<thead>
<tr>
<th>Resource Name</th>
<th>Description</th>
</tr>
</thead>
<tbody>
<tr>
<td>FailoverCluster</td>
<td>Creates a failover cluster.</td>
</tr>
<tr>
<td>FailoverClusterNode</td>
<td>Adds/removes a node to/from a failover cluster.</td>
</tr>
<tr>
<td>FailoverClusterQuorum</td>
<td>Configures a cluster disk/share/node majority quorum.</td>
</tr>
<tr>
<td>FailoverClusterCloudWitness</td>
<td>Configures cloud witness for failover cluster.</td>
</tr>
<tr>
<td>FailoverClusterResourceParameter</td>
<td>Configures a failover cluster resource parameter.</td>
</tr>
<tr>
<td>FailoverClusterS2D</td>
<td>Enables Storage Spaces Direct in a failover cluster.</td>
</tr>
<tr>
<td>WaitForFailoverCluster</td>
<td>Waits until a failover cluster becomes available.</td>
</tr>
<tr>
<td>WaitForFailoverClusterNode</td>
<td>Waits until a node joins a failover cluster.</td>
</tr>
</tbody>
</table>
<p>You can take a look at each of these resources to check what different configuration options are supported as of today.</p>
<p>Here is an example of creating and configuring a Storage Spaces Direct cluster.</p>
<p></p><pre class="crayon-plain-tag">$configData = @{
    AllNodes = @(
        @{
            NodeName = 'localhost'
            thumbprint = '25A1359A27FB3F2D562D7508D98E7189F2A1F1B0'
            CertificateFile = 'C:\PublicKeys\S2D4N01.cer'
            PsDscAllowDomainUser = $true
        }
    )
}

Configuration CreateS2DCluster
{
    param
    (
        [Parameter(Mandatory = $true)]
        [pscredential]
        $Credential,

        [Parameter(Mandatory = $true)]
        [String[]]
        $ParticipantNodes,

        [Parameter(Mandatory = $true)]
        [String]
        $ClusterName,

        [Parameter(Mandatory = $true)]
        [String]
        $StaticAddress,

        [Parameter(Mandatory = $true)]
        [String[]]
        $IgnoreNetworks,

        [Parameter(Mandatory = $true)]
        [String]
        $QuorumResource,

        [Parameter(Mandatory = $true)]
        [String]
        $QuorumType
        
    )

    Import-DscResource -ModuleName FailoverClusterDsc

    Node $AllNodes.NodeName
    {
        FailoverCluster CreateCluster
        {
            ClusterName = $ClusterName
            StaticAddress = $StaticAddress
            NoStorage = $true
            IgnoreNetwork = $IgnoreNetworks
            Ensure = 'Present'
            PsDscRunAsCredential = $Credential
        }

        WaitForFailoverCluster WaitForCluster
        {
            ClusterName = $ClusterName
            PsDscRunAsCredential = $Credential
        }

        Foreach ($node in $ParticipantNodes)
        {
            FailoverClusterNode $node
            {
                NodeName = $node
                ClusterName = $ClusterName
                PsDscRunAsCredential = $Credential
                Ensure = 'Present'
            }
        }

        FailoverClusterQuorum FileShareQuorum
        {
            IsSingleInstance = 'Yes'
            QuorumType = $QuorumType
            Resource = $QuorumResource
        }

        FailoverClusterS2D EnableS2D
        {
            IsSingleInstance = 'Yes'
            Ensure = 'Present'
        }
    }
}

CreateS2DCluster -Credential (Get-Credential) -ConfigurationData $configData `
                                           -QuorumType 'NodeAndFileShareMajority' `
                                           -QuorumResource '\\sofs\share' `
                                           -ClusterName 'S2D4NCluster' `
                                           -StaticAddress '172.16.102.45' `
                                           -IgnoreNetworks @('172.16.103.0/24','172.16.104.0/24') `
                                           -ParticipantNodes @('S2D4N02','S2D4N03','S2D4N04')</pre><p></p>
<p>In the above pattern, I am creating a failover cluster and then adding the remaining nodes using the same configuration document. You can, however, have the node addition configuration using the <em>FailoverClusterNode </em>resource as a separate configuration document that gets enacted on the participant node.</p>
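<p>Such a separate per-node configuration document might look like the following. This is only a minimal sketch following the example above; the configuration and instance names are illustrative.</p>
<p></p><pre class="crayon-plain-tag">Configuration JoinS2DCluster
{
    param
    (
        [Parameter(Mandatory = $true)]
        [pscredential]
        $Credential,

        [Parameter(Mandatory = $true)]
        [String]
        $ClusterName
    )

    Import-DscResource -ModuleName FailoverClusterDsc

    Node localhost
    {
        # Block until the cluster is reachable
        WaitForFailoverCluster WaitForCluster
        {
            ClusterName = $ClusterName
            PsDscRunAsCredential = $Credential
        }

        # Add the local node (assumes the MOF is compiled on the node itself)
        FailoverClusterNode AddThisNode
        {
            NodeName = $env:COMPUTERNAME
            ClusterName = $ClusterName
            Ensure = 'Present'
            PsDscRunAsCredential = $Credential
        }
    }
}</pre><p></p>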
<p>The failover cluster configuration requires administrator privileges and these resources do not have a Credential parameter of their own and depend on <em>PSDscRunAsCredential</em>. Therefore, you need at least PowerShell 5.0 to use these resources.</p>
<p>I am looking at expanding the resource modules to beyond what is there at the moment. If you see any issues or have feedback, feel free to create an issue in my repository. These resources lack tests today. I would be glad to accept any PRs for tests.</p>
]]></content:encoded>
							<wfw:commentRss>https://www.powershellmagazine.com/2017/12/19/psdsc-failoverclusterdsc-deploy-a-storage-spaces-direct-cluster/feed/</wfw:commentRss>
		<slash:comments>0</slash:comments>
							</item>
		<item>
		<title>Containerizing a web application</title>
		<link>https://www.powershellmagazine.com/2017/11/20/containerizing-a-web-application/</link>
				<comments>https://www.powershellmagazine.com/2017/11/20/containerizing-a-web-application/#comments</comments>
				<pubDate>Mon, 20 Nov 2017 17:00:50 +0000</pubDate>
		<dc:creator><![CDATA[Jan Egil Ring]]></dc:creator>
				<category><![CDATA[Articles]]></category>
		<category><![CDATA[Containers]]></category>
		<category><![CDATA[Nano]]></category>
		<category><![CDATA[containers]]></category>
		<category><![CDATA[docker]]></category>
		<category><![CDATA[powershell core]]></category>

		<guid isPermaLink="false">http://www.powershellmagazine.com/?p=12777</guid>
				<description><![CDATA[In this article, we will look at different options for containerizing a web application. We will go through the following... ]]></description>
								<content:encoded><![CDATA[<p>In this article, we will look at different options for containerizing a web application.</p>
<p>We will go through the following deployment scenarios, going from traditional options to cloud services and containers:</p>
<ul>
<li>Deploy to local machine</li>
<li>Deploy to an Infrastructure as a Service (IaaS) VM</li>
<li>Deploy to a Platform as a Service (PaaS) website</li>
<li>Deploy to a container running Windows Server Core</li>
<li>Deploy to a container running Nano Server</li>
</ul>
<p>Our example application is the <a href="https://poshtools.com/powershell-universal-dashboard/">PowerShell Universal Dashboard</a> – a web application built on ASP.NET Core, PowerShell Core, ReactJS, and a handful of JavaScript libraries.</p>
<p>This means it can support cross-platform deployments, running on Windows, Linux, and macOS.</p>
<p>That is already a very flexible range of options, but we can get even more options by using containers.</p>
<p><em>The PowerShell Universal Dashboard PowerShell module allows for creation of web-based dashboards. The client- and server-side code for the dashboard is authored completely with PowerShell. Charts, monitors, tables and grids can easily be created with the cmdlets included in the module. The module is a cross-platform module and will run anywhere PowerShell Core can run.</em></p>
<p><img class="alignnone wp-image-12786" src="http://www.powershellmagazine.com/wp-content/uploads/2017/11/Containerizing_a_web_application_00.png" alt="" width="600" height="250" srcset="https://www.powershellmagazine.com/wp-content/uploads/2017/11/Containerizing_a_web_application_00.png 1652w, https://www.powershellmagazine.com/wp-content/uploads/2017/11/Containerizing_a_web_application_00-300x125.png 300w, https://www.powershellmagazine.com/wp-content/uploads/2017/11/Containerizing_a_web_application_00-768x320.png 768w, https://www.powershellmagazine.com/wp-content/uploads/2017/11/Containerizing_a_web_application_00-1024x427.png 1024w" sizes="(max-width: 600px) 100vw, 600px" /></p>
<h2><strong>Deployment options</strong></h2>
<p><strong>Deploy to local machine</strong></p>
<p>Let us first look at how to install and setup the application locally on a client machine.</p>
<p>Install the UniversalDashboard module from the PowerShell Gallery using PowerShellGet:</p>
<p></p><pre class="crayon-plain-tag">Install-Module UniversalDashboard</pre><p></p>
<p>When the module is installed you will have access to a number of new commands:</p>
<p><img class="alignnone wp-image-12781" src="http://www.powershellmagazine.com/wp-content/uploads/2017/11/Containerizing_a_web_application_03.png" alt="" width="600" height="649" srcset="https://www.powershellmagazine.com/wp-content/uploads/2017/11/Containerizing_a_web_application_03.png 682w, https://www.powershellmagazine.com/wp-content/uploads/2017/11/Containerizing_a_web_application_03-277x300.png 277w" sizes="(max-width: 600px) 100vw, 600px" /></p>
<p><em>Start-UDDashboard</em> is the main command for starting a new instance of a dashboard.</p>
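<p>As a minimal sketch, a dashboard can be built and started like this. The title, content, and port number are arbitrary.</p>
<p></p><pre class="crayon-plain-tag">Import-Module UniversalDashboard

$dashboard = New-UDDashboard -Title 'Demo' -Content {
    New-UDCard -Title 'Hello' -Text 'Served by PowerShell Universal Dashboard'
}

# Returns a server object that can later be passed to Stop-UDDashboard
$server = Start-UDDashboard -Dashboard $dashboard -Port 10001</pre><p></p>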
<p>For example usage of the other commands, I encourage you to have a look at the <a href="http://www.poshud.com/Components">Components section</a> of the demo website.</p>
<p>The example dashboard I am going to use for this article is one from a real-world scenario. During an onboarding process, there was a need to gather some data (an employee number) from an external company. The external company was sent a link to an instance of PowerShell Universal Dashboard, which would take the input from the external user and send it as a parameter to an <a href="https://docs.microsoft.com/en-us/azure/automation/automation-webhooks">Azure Automation webhook</a>. The webhook would start an Azure Automation runbook, which would register the data in the internal system.</p>
<p>The dashboard for this scenario is available in <a href="https://gist.github.com/janegilring/05f916b9852616125030938f94b0f5e4">this Gist</a>.</p>
<p>When the UniversalDashboard PowerShell module is installed, we can simply run the dashboard.ps1 script to start the dashboard. Here is a screenshot from performing this in Visual Studio Code:</p>
<p><img class="alignnone wp-image-12780" src="http://www.powershellmagazine.com/wp-content/uploads/2017/11/Containerizing_a_web_application_02.png" alt="" width="600" height="448" srcset="https://www.powershellmagazine.com/wp-content/uploads/2017/11/Containerizing_a_web_application_02.png 1108w, https://www.powershellmagazine.com/wp-content/uploads/2017/11/Containerizing_a_web_application_02-300x224.png 300w, https://www.powershellmagazine.com/wp-content/uploads/2017/11/Containerizing_a_web_application_02-768x573.png 768w, https://www.powershellmagazine.com/wp-content/uploads/2017/11/Containerizing_a_web_application_02-1024x764.png 1024w" sizes="(max-width: 600px) 100vw, 600px" /></p>
<p>At this point, the website should be up and running. We can test by navigating to http://localhost/register/123</p>
<p><img class="alignnone wp-image-12779" src="http://www.powershellmagazine.com/wp-content/uploads/2017/11/Containerizing_a_web_application_01.png" alt="" width="600" height="350" srcset="https://www.powershellmagazine.com/wp-content/uploads/2017/11/Containerizing_a_web_application_01.png 634w, https://www.powershellmagazine.com/wp-content/uploads/2017/11/Containerizing_a_web_application_01-300x175.png 300w" sizes="(max-width: 600px) 100vw, 600px" /></p>
<p>In this example, we are also taking input from the URL. 123 can be any number (in the example scenario, a service request ID), and will be available as a parameter variable in the New-UDInput command.</p>
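<p>While the full script is in the Gist referenced above, the overall shape of such a dashboard can be sketched as follows. This is a minimal, hypothetical sketch based on the UniversalDashboard cmdlets of that era; the page URL, parameter names, and webhook URI are placeholders, not the actual Gist contents:</p>
<pre class="brush: powershell; title: ; notranslate">
# Hypothetical sketch: a page that reads an ID from the URL and posts input to a webhook
$Page = New-UDPage -Url &quot;/register/:requestid&quot; -Endpoint {
    param($requestid)   # taken from the URL, e.g. /register/123

    New-UDInput -Title &quot;Register employee number&quot; -Endpoint {
        param($EmployeeNumber)

        # Forward the collected data to an Azure Automation webhook (URI is a placeholder)
        $Body = @{ RequestId = $requestid; EmployeeNumber = $EmployeeNumber } | ConvertTo-Json
        Invoke-RestMethod -Uri 'https://&lt;automation-account&gt;.azure-automation.net/webhooks?token=&lt;token&gt;' -Method Post -Body $Body

        New-UDInputAction -Content (New-UDCard -Text &quot;Thank you, your registration was received.&quot;)
    }
}

$Dashboard = New-UDDashboard -Title &quot;Registration&quot; -Pages @($Page)
Start-UDDashboard -Dashboard $Dashboard -Port 80
</pre>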
<p>At this point, we have the application up and running on a local computer as a PowerShell module. For production use, we want to run this on a server platform.</p>
<p>As stated in the <a href="https://adamdriscoll.gitbooks.io/powershell-tools-documentation/content/powershell-pro-tools-documentation/universal-dashboard/running-dashboards.html">product documentation</a>, the PowerShell Universal Dashboard can be hosted in Azure or IIS:</p>
<p><em>To host a dashboard in Azure or IIS, you will need to deploy the entire module to your site or WebApp. In the root module directory, place your dashboard.ps1. You need to specify the -Wait parameter on Start-UDDashboard so that the script blocks and waits for requests in Azure or IIS. Specifying the port isn&#8217;t necessary because Azure and IIS use dynamic port tunneling.</em></p>
<p><em>IIS requires that the </em><a href="https://docs.microsoft.com/en-us/aspnet/core/hosting/aspnet-core-module">ASP.NET Core module</a><em> is installed.</em></p>
<p><strong>Deploy to an Infrastructure as a Service (IaaS) VM running Internet Information Services (IIS)</strong></p>
<p>The other option is to simply run the website in IIS on a Windows Server instance, which can run anywhere: on-premises, in a public cloud, or at a hosting provider.</p>
<p>In this example, we are using a virtual machine running in Azure based on the Azure Marketplace <a href="https://azuremarketplace.microsoft.com/en-us/marketplace/apps/Microsoft.WindowsServer">Windows Server Datacenter, version 1709</a> image.</p>
<p>Step 1 – Install the web server role: Install-WindowsFeature -Name Web-Server</p>
<p>Step 2 – Install the .NET Core Windows Server Hosting bundle as described <a href="https://docs.microsoft.com/en-us/aspnet/core/publishing/iis?tabs=aspnetcore2x">here</a> (needed since the PowerShell Universal Dashboard is built on .NET Core and not the built-in .NET Desktop edition).</p>
<p>Step 3 – Copy the UniversalDashboard PowerShell module to the path where the website is running, for example C:\inetpub\wwwroot if the Default Web Site is leveraged.</p>
<p>Step 4 – Copy the dashboard.ps1 and license.lic files to the same directory:</p>
<p><img class="alignnone wp-image-12783" src="http://www.powershellmagazine.com/wp-content/uploads/2017/11/Containerizing_a_web_application_05.png" alt="" width="600" height="304" srcset="https://www.powershellmagazine.com/wp-content/uploads/2017/11/Containerizing_a_web_application_05.png 793w, https://www.powershellmagazine.com/wp-content/uploads/2017/11/Containerizing_a_web_application_05-300x152.png 300w, https://www.powershellmagazine.com/wp-content/uploads/2017/11/Containerizing_a_web_application_05-768x389.png 768w" sizes="(max-width: 600px) 100vw, 600px" /></p>
<p>That&#8217;s it – at this point, the website should be up and running.</p>
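<p>Condensed into a script, steps 1&#8211;4 look roughly like this (a hedged sketch; the hosting bundle URL is a placeholder and the paths assume the Default Web Site):</p>
<pre class="brush: powershell; title: ; notranslate">
# Step 1 - install the web server role
Install-WindowsFeature -Name Web-Server

# Step 2 - install the .NET Core Windows Server Hosting bundle (download URL is a placeholder)
Invoke-WebRequest -Uri 'https://download.microsoft.com/&lt;hosting-bundle&gt;.exe' -OutFile C:\Temp\DotNetCore-WindowsHosting.exe
Start-Process -FilePath C:\Temp\DotNetCore-WindowsHosting.exe -ArgumentList '/quiet' -Wait

# Step 3 - copy the UniversalDashboard module to the path where the website is running
Copy-Item -Path .\UniversalDashboard -Destination C:\inetpub\wwwroot -Recurse

# Step 4 - copy the dashboard script and license file to the same directory
Copy-Item -Path .\dashboard.ps1, .\license.lic -Destination C:\inetpub\wwwroot
</pre>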
<p><strong>Deploy to a Platform as a Service (PaaS) website</strong></p>
<p>The route for the mentioned real-world scenario was to host the module in a public cloud service, in this case Azure App Service:</p>
<p><img class="alignnone wp-image-12782" src="http://www.powershellmagazine.com/wp-content/uploads/2017/11/Containerizing_a_web_application_04.png" alt="" width="600" height="152" srcset="https://www.powershellmagazine.com/wp-content/uploads/2017/11/Containerizing_a_web_application_04.png 1118w, https://www.powershellmagazine.com/wp-content/uploads/2017/11/Containerizing_a_web_application_04-300x76.png 300w, https://www.powershellmagazine.com/wp-content/uploads/2017/11/Containerizing_a_web_application_04-768x195.png 768w, https://www.powershellmagazine.com/wp-content/uploads/2017/11/Containerizing_a_web_application_04-1024x260.png 1024w" sizes="(max-width: 600px) 100vw, 600px" /></p>
<p>There are many different options for deploying a web application to an instance of Azure App Service. Microsoft has provided some examples for us to use:</p>
<ol>
<li><a href="https://docs.microsoft.com/en-us/azure/app-service/scripts/app-service-powershell-deploy-github?toc=%2fpowershell%2fmodule%2ftoc.json">Create a web app with deployment from GitHub</a></li>
<li><a href="https://docs.microsoft.com/en-us/azure/app-service/scripts/app-service-powershell-continuous-deployment-github?toc=%2fpowershell%2fmodule%2ftoc.json">Create a web app with continuous deployment from GitHub</a></li>
<li><a href="https://docs.microsoft.com/en-us/azure/app-service/scripts/app-service-powershell-deploy-ftp?toc=%2fpowershell%2fmodule%2ftoc.json">Create a web app and deploy code with FTP</a></li>
<li><a href="https://docs.microsoft.com/en-us/azure/app-service/scripts/app-service-powershell-deploy-local-git?toc=%2fpowershell%2fmodule%2ftoc.json">Create a web app and deploy code from a local Git repository</a></li>
<li><a href="https://docs.microsoft.com/en-us/azure/app-service/scripts/app-service-powershell-deploy-staging-environment?toc=%2fpowershell%2fmodule%2ftoc.json">Create a web app and deploy code to a staging environment</a></li>
</ol>
<p>Option #3 was used to build the demo website for this article, but in a production environment a more appropriate method would be to leverage continuous integration, deploying files to the website based on commits to source control.</p>
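<p>For reference, the FTP approach (option #3) boils down to something like the following, loosely condensed from the linked Microsoft example. The web app and resource group names are placeholders, and the cmdlets are from the AzureRM module current at the time:</p>
<pre class="brush: powershell; title: ; notranslate">
# Retrieve the publishing profile for the web app (names are placeholders)
$xml = [xml](Get-AzureRmWebAppPublishingProfile -Name 'mywebapp' -ResourceGroupName 'myResourceGroup' -OutputFile $null)

# Extract the FTP endpoint and credentials from the profile
$FtpProfile = $xml.publishData.publishProfile | Where-Object publishMethod -eq 'FTP'
$Credential = New-Object System.Net.NetworkCredential($FtpProfile.userName, $FtpProfile.userPWD)

# Upload a file over FTP using the STOR method
$WebClient = New-Object System.Net.WebClient
$WebClient.Credentials = $Credential
$WebClient.UploadFile(&quot;$($FtpProfile.publishUrl)/dashboard.ps1&quot;, 'STOR', 'C:\Deploy\dashboard.ps1')
</pre>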
<p>We get many benefits from leveraging a PaaS offering such as Azure App Service:</p>
<ul>
<li>No servers to manage (no patching, monitoring, etc)</li>
<li>We can add custom domains and SSL certificates as part of the service</li>
<li>Scale up (bigger VM sizes)</li>
<li>Scale in and out (more VM instances)</li>
<li>Deploy the application using continuous delivery such as Visual Studio Team Services</li>
</ul>
<p>There are also many other benefits such as pre-authentication, load balancing and more.</p>
<h2><strong>Containerization</strong></h2>
<p>Next, we will look at leveraging containers &#8211; a solution to the problem of how to get software to run reliably when moved from one computing environment to another.</p>
<p>This is a relatively new technology that promises to change the IT landscape in the same way virtualization did in the early 2000s.</p>
<p><strong>Deploy to a container running Windows Server Core</strong></p>
<p>Our first example of containerizing the PowerShell Universal Dashboard will be based on Windows Server Core, version 1709. Since we already have the application up and running on a native operating system, it should be easy in this case to transform it into a container.</p>
<p>The files for the following demos are available <a href="https://github.com/janegilring/PSCommunity/tree/master/Containers/PowerShell%20Universal%20Dashboard">here</a>.</p>
<p>In the WindowsServerCoreDemoWebsite, we have the following files:</p>
<p><img class="alignnone size-full wp-image-12784" src="http://www.powershellmagazine.com/wp-content/uploads/2017/11/Containerizing_a_web_application_06.png" alt="" width="245" height="88" /></p>
<p>dashboard.ps1 and license.lic are the same files we used when running the application in a native operating system. These will be referenced in the Dockerfile to be copied into the container image.</p>
<p>In this example, we are using Docker – a container management tool – to build and deploy our demos. From the <a href="https://docs.docker.com/engine/reference/builder/">documentation</a>:</p>
<p><em>Docker can build images automatically by reading the instructions from a Dockerfile. A Dockerfile is a text document that contains all the commands a user could call on the command line to assemble an image. Using docker build users can create an automated build that executes several command-line instructions in succession.</em></p>
<p>Let us have a look at the <a href="https://github.com/janegilring/PSCommunity/blob/master/Containers/PowerShell%20Universal%20Dashboard/WindowsServerCoreDemoWebsite/Dockerfile">Dockerfile</a> which defines our Windows Server Core, version 1709 demo website:</p>
<p><img class="alignnone wp-image-12785" src="http://www.powershellmagazine.com/wp-content/uploads/2017/11/Containerizing_a_web_application_07.png" alt="" width="600" height="292" srcset="https://www.powershellmagazine.com/wp-content/uploads/2017/11/Containerizing_a_web_application_07.png 1029w, https://www.powershellmagazine.com/wp-content/uploads/2017/11/Containerizing_a_web_application_07-300x146.png 300w, https://www.powershellmagazine.com/wp-content/uploads/2017/11/Containerizing_a_web_application_07-768x374.png 768w, https://www.powershellmagazine.com/wp-content/uploads/2017/11/Containerizing_a_web_application_07-1024x499.png 1024w" sizes="(max-width: 600px) 100vw, 600px" /></p>
<p>We are using an image from Docker Hub – a central repository for Docker images – which is built by Microsoft and has IIS pre-installed, so we do not have to install and configure the Web-Server role ourselves.</p>
<p>Next, we are downloading and installing the .NET Core Windows Server Hosting bundle like we did when running the application on a native operating system.</p>
<p>PowerShellGet is used to download the UniversalDashboard module.</p>
<p>The last step is to copy the dashboard.ps1 and license.lic files, as well as to expose the port the website is running on.</p>
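<p>Putting those steps together, the Dockerfile described above follows roughly this shape (a hedged sketch; the base image tag and download URL are assumptions, not the exact file from the repository):</p>
<pre class="brush: plain; title: ; notranslate">
# Base image from Docker Hub with IIS pre-installed
FROM microsoft/iis:windowsservercore-1709
SHELL [&quot;powershell&quot;, &quot;-Command&quot;]

# Download and install the .NET Core Windows Server Hosting bundle (URL is a placeholder)
RUN Invoke-WebRequest -Uri 'https://download.microsoft.com/&lt;hosting-bundle&gt;.exe' -OutFile C:\DotNetCore-WindowsHosting.exe ; Start-Process -FilePath C:\DotNetCore-WindowsHosting.exe -ArgumentList '/quiet' -Wait

# Install the UniversalDashboard module with PowerShellGet
RUN Install-PackageProvider -Name NuGet -Force ; Install-Module -Name UniversalDashboard -Force

# Copy the dashboard script and license file into the website root, and expose the port
COPY dashboard.ps1 license.lic C:/inetpub/wwwroot/
EXPOSE 80
</pre>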
<pre class="brush: powershell; title: ; notranslate">
cd &quot;~\Git\PSCommunity\Containers\PowerShell Universal Dashboard&quot;

# Note: Remember to switch to Windows containers before building the Dockerfile (Linux is the default after installing Docker for Windows)
docker build WindowsServerCoreDemoWebsite -t psmag:demowebsite --no-cache
docker build NanoDemoWebsite -t psmag:nanodemowebsite --no-cache

#region 1 Windows Server Core
$ContainerID = docker run -d --rm psmag:demowebsite
$ContainerIP = docker inspect -f &quot;{{ .NetworkSettings.Networks.nat.IPAddress }}&quot; $ContainerID

# Verify that the website is up and running
Start-Process -FilePath iexplore.exe -ArgumentList http://$ContainerIP/register/123
Start-Process -FilePath chrome.exe -ArgumentList http://$ContainerIP/register/123

# Optionally, connect to a container instance interactively to inspect the environment.
# The IIS image has a service monitor as an entrypoint, so we need to override it to get into the container interactively
docker run --entrypoint=powershell -it psmag:demowebsite

docker stop $ContainerID

#endregion

#region 2 Nano Server 1709
$ContainerID = docker run -d --rm psmag:nanodemowebsite
$ContainerIP = docker inspect -f &quot;{{ .NetworkSettings.Networks.nat.IPAddress }}&quot; $ContainerID

# Verify that the website is up and running
Start-Process -FilePath iexplore.exe -ArgumentList http://$ContainerIP/register/123
Start-Process -FilePath chrome.exe -ArgumentList http://$ContainerIP/register/123

# Optionally, connect to the container instance interactively to inspect the environment
docker exec -ti $ContainerID pwsh #pwsh for Nano/powershell for Server Core

docker stop $ContainerID

#endregion
</pre>
<p>When the Dockerfile is ready, we can use docker.exe to build a container image (the docker build commands in the script above).</p>
<p>When the image is successfully built, we are ready to test it by starting a container instance from our new image (the docker run command).</p>
<p>If you have made modifications to dashboard.ps1 or simply want the latest version of the UniversalDashboard module, re-run the docker build command and the image will be updated with any changes.</p>
<p><strong>Deploy to a container running Nano Server</strong></p>
<p>Before we try to make our demo application run on Nano Server, there are some <a href="https://docs.microsoft.com/en-us/windows-server/get-started/nano-in-semi-annual-channel">important changes</a> to Nano Server introduced in Windows Server 1709 to be aware of:</p>
<p><em>Starting with the new feature release of Windows Server, version 1709, Nano Server will be available only as a container base OS image. You must run it as a container in a container host, such as a Server Core installation of Windows Server. Running a container based on Nano Server in the new feature release differs from earlier releases in these ways:</em></p>
<ul>
<li><em>Nano Server has been optimized for .NET Core applications.</em></li>
<li><em>Nano Server is even smaller than the Windows Server 2016 version.</em></li>
<li><em>PowerShell Core, .NET Core, and WMI are no longer included by default, but you can include PowerShell Core and .NET Core container packages when building your container.</em></li>
<li><em>There is no longer a servicing stack included in Nano Server. Microsoft publishes an updated Nano container to Docker Hub that you redeploy.</em></li>
<li><em>You troubleshoot the new Nano Container by using Docker.</em></li>
<li><em>You can now run Nano containers on IoT Core.</em></li>
</ul>
<p>One more thing that is useful to know, but not mentioned in the referenced article, is that IIS is not available in Nano Server 1709.</p>
<p>This means we need to take a different approach to get the application running in a Nano-based container.</p>
<p>PowerShell Universal Dashboard is built on top of <a href="https://github.com/aspnet/KestrelHttpServer">Kestrel</a> &#8211; a cross-platform web server for ASP.NET Core. This means we can simply run Start-UDDashboard from PowerShell Core in Nano Server 1709 to get the web application up and running.</p>
<p><a href="https://twitter.com/rickstrahl">Rick Strahl</a> has written a great article about <a href="https://weblog.west-wind.com/posts/2016/Jun/06/Publishing-and-Running-ASPNET-Core-Applications-with-IIS">Publishing and Running ASP.NET Core Applications with IIS</a> where he mentions the following:</p>
<p><em>Kestrel is a .NET Web Server implementation that has been heavily optimized for throughput performance. It&#8217;s fast and functional in getting network requests into your application, but it&#8217;s ‘just’ a raw Web server. It does not include Web management services as a full featured server like IIS does. If you run on Windows you will likely want to run Kestrel behind IIS to gain infrastructure features like port 80/443 forwarding via Host Headers, process lifetime management and certificate management to name a few.</em></p>
<p><em>The bottom line for all of this is if you are hosting on Windows you&#8217;ll want to use IIS and the AspNetCoreModule.</em></p>
<p>Some of the limitations can be overcome by leveraging different options such as a PaaS service or custom reverse proxy to publish the application externally, as well as a container orchestration tool such as Kubernetes for scaling and managing the application.</p>
<p>With these limitations in mind, let us go on and see if we can get this working on the latest version of Nano Server.</p>
<p>As mentioned previously, PowerShell Core has been removed from Nano Server starting with the 1709 release. The first step would be to build a new container image where PowerShell Core is included.</p>
<p>Luckily, the PowerShell team has published <a href="https://github.com/PowerShell/PowerShell/blob/master/docker/release/nanoserver/Dockerfile">the Dockerfile</a> for the official PowerShell Core Docker image on GitHub.</p>
<p>In our <a href="https://github.com/janegilring/PSCommunity/blob/master/Containers/PowerShell%20Universal%20Dashboard/NanoDemoWebsite/Dockerfile">custom Dockerfile</a>, we are leveraging almost all of the Dockerfile used to build PowerShell Core. We are also using the concept of <a href="https://docs.docker.com/engine/userguide/eng-image/multistage-build/#use-multi-stage-builds">multi-stage builds</a> to include ASP.NET Core by referencing FROM microsoft/aspnetcore:2.0-nanoserver-1709 in the Dockerfile.</p>
<p>The only code we need to add is to download the UniversalDashboard PowerShell module as well as copy the dashboard.ps1 file as we did when running on Server Core:</p>
<p><img class="alignnone wp-image-12787" src="http://www.powershellmagazine.com/wp-content/uploads/2017/11/Containerizing_a_web_application_08.png" alt="" width="600" height="164" srcset="https://www.powershellmagazine.com/wp-content/uploads/2017/11/Containerizing_a_web_application_08.png 808w, https://www.powershellmagazine.com/wp-content/uploads/2017/11/Containerizing_a_web_application_08-300x82.png 300w, https://www.powershellmagazine.com/wp-content/uploads/2017/11/Containerizing_a_web_application_08-768x210.png 768w" sizes="(max-width: 600px) 100vw, 600px" /></p>
<p>When using IIS, license.lic was automatically read by the application. This is not the case when using the module directly from PowerShell, hence we have added Set-UDLicense to specify the license inside dashboard.ps1.</p>
<p>Since IIS is not available, we need to add an entry point in the Dockerfile in order to launch the dashboard.ps1 file. This will launch the website by calling Start-UDDashboard.</p>
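<p>In other words, the tail of dashboard.ps1 and the entry point look roughly like this. This is a hedged sketch: the way the license is loaded and the exact parameter names are assumptions based on the UniversalDashboard cmdlets of the time:</p>
<pre class="brush: powershell; title: ; notranslate">
# End of dashboard.ps1 - set the license explicitly, since IIS is not reading license.lic for us
Set-UDLicense -License (Get-Content -Path .\license.lic -Raw)

# Start the Kestrel-hosted dashboard; -Wait blocks so the container's entry process stays alive
Start-UDDashboard -Dashboard $Dashboard -Port 80 -Wait

# Corresponding entry point in the Dockerfile:
# ENTRYPOINT [&quot;pwsh.exe&quot;, &quot;-File&quot;, &quot;C:\\dashboard.ps1&quot;]
</pre>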
<p>You might notice that PowerShell Core is invoked as pwsh.exe. With the release of PowerShell Core 6.0.0-beta.9, the PowerShell Core binary was renamed from powershell.exe to pwsh.exe on Windows, and from powershell to pwsh on Linux/Unix/macOS. <a href="https://twitter.com/markekraus">Mark Kraus</a> has written a great <a href="https://get-powershellblog.blogspot.no/2017/10/why-pwsh-was-chosen-for-powershell-core.html">article</a> with more background on that change.</p>
<p>A final note about Nano Server: On the <a href="https://azuremarketplace.microsoft.com/en-us/marketplace/apps/Microsoft.WindowsServer">Azure Marketplace offering for Windows Server</a>, there is an offer called Windows Server Datacenter, version 1709 with Containers. If you deploy an instance of that image, Docker will be pre-installed and you can download the latest official Microsoft image of Nano Server from Docker Hub by running docker pull microsoft/nanoserver:1709.</p>
<p><strong>Summary</strong></p>
<p>In this article, we have explored various options for hosting a web application in different environments, starting with a native operating system and ending up with a very small container image based on Nano Server.</p>
]]></content:encoded>
							<wfw:commentRss>https://www.powershellmagazine.com/2017/11/20/containerizing-a-web-application/feed/</wfw:commentRss>
		<slash:comments>1</slash:comments>
							</item>
		<item>
		<title>PSRemotely – Authoring workflow (Part 1)</title>
		<link>https://www.powershellmagazine.com/2017/06/29/psremotely-authoring-workflow-part-1/</link>
				<comments>https://www.powershellmagazine.com/2017/06/29/psremotely-authoring-workflow-part-1/#comments</comments>
				<pubDate>Thu, 29 Jun 2017 16:00:00 +0000</pubDate>
		<dc:creator><![CDATA[Deepak Dhami]]></dc:creator>
				<category><![CDATA[Articles]]></category>
		<category><![CDATA[DevOps]]></category>
		<category><![CDATA[Online Only]]></category>
		<category><![CDATA[Pester]]></category>
		<category><![CDATA[PSRemotely]]></category>

		<guid isPermaLink="false">http://www.powershellmagazine.com/?p=12743</guid>
				<description><![CDATA[Introduction After setting up the context in the previous post, it is time to look at how the authoring workflow looks... ]]></description>
								<content:encoded><![CDATA[<h2>Introduction</h2>
<p>After setting up the <a href="http://www.powershellmagazine.com/2017/04/07/psremotely-framework-to-enable-remote-operations-validation/">context in the previous post</a>, it is time to look at what the authoring workflow looks like when using Pester to write operations validation tests, and then leveraging the PSRemotely DSL to target remote nodes.<br />
This workflow consists of the following stages:</p>
<ol>
<li>Getting your tests ready, target/test a single node.</li>
<li>Prepare configuration data (abstract hardcoded values).</li>
<li>Use PSRemotely for remote operations validation.</li>
<li>Debugging failures.</li>
<li>Reporting.</li>
</ol>
<p><strong>Note</strong> – Stages 1-3 will be covered in this post and there will be another post on Stages 4 &amp; 5.</p>
<p>Since <a href="https://github.com/DexterPOSH/RemoteOpsValidationLib/tree/design">PSRemotely</a> was born out of the need to validate an engineered solution, it excels at validating solutions where the nodes behave consistently and have to be tested for similar configurations.</p>
<h2>Scenario</h2>
<p>To illustrate the point, I am taking the example of deploying the <a href="https://technet.microsoft.com/en-us/windows-server-docs/storage/storage-spaces/hyper-converged-solution-using-storage-spaces-direct">Hyper-converged solution using Storage Spaces Direct</a>. As per the referenced article, the deployment has three stages:</p>
<ol>
<li>Deploy Windows Server</li>
<li>Configure the Network</li>
<li>Configure Storage Spaces Direct</li>
</ol>
<p>Ideally, operations validation should run after each step to verify that the entire solution is configured as per best practices. To keep today’s post simple, we will validate only the first step, deploying Windows Server, but similar steps apply when authoring validation tests for the other stages in the deployment workflow.</p>
<p>Now take a look at the referenced link and gather the list of configurations that need to be in place on each node as per step 1.</p>
<ol>
<li>Deploy Windows Server 2016.</li>
<li>Verify the domain account is a member of the local administrator group.</li>
</ol>
<p>So now we have the configurations we need to check on each node just before we configure networking on top of them. You can follow the commits on this <a href="https://github.com/DexterPOSH/RemoteOpsValidationLib/tree/design">branch</a> on this test repository to see the changes made as part of the authoring workflow.</p>
<h2>Stage 1 – Get your tests ready.</h2>
<p>This stage consists of authoring tests using Pester/PoshSpec for operations validation.<br />
Let us start by translating the configurations gathered above into independent Pester Describe blocks.</p>
<p>Below is a very crude way to determine that Windows Server 2016 is installed on the node. There are two Pester assertions: the first asserts that the OS type is a server (OSType 18), and the second that the installed OS caption matches Windows Server 2016.</p>
<p></p><pre class="crayon-plain-tag"># Ensure that Server 2016 OS installation is done.
Describe "Operating system installed validation" {
    $OS = Get-CimInstance -Class Win32_OperatingSystem

    Context "Server Type check" {
        It "Should have the OSType as WinNT" {
            $OS.OSType | Should Be 18
        }
    }

    Context 'Server 2016 check' {
        It "Should have server 2016 installed" {
           $OS.Caption | Should BeLike '*Server 2016*'
        }
    }
}</pre><p></p>
<p>Here is another independent test for validating that the domain account is a member of the local administrators group on a node.</p>
<p></p><pre class="crayon-plain-tag"># Validate that the domain account is part of the local administrators group on a node.
Describe "Domain account is local administrator validation" {
    $LocalGroupMember = Get-LocalGroupMember -Group Administrators -Member "S2DClusterAdmin" -ErrorAction SilentlyContinue
    It "Should be member of local admins group" {
        $LocalGroupMember | Should NOT BeNullOrEmpty
    }
}</pre><p></p>
<h2>Stage 2 – Prepare node configuration data</h2>
<p>If you look at the authored Pester Describe blocks that validate the configuration on the nodes, they may use environment-specific data hard-coded into the tests, e.g. the domain user name in the above example.</p>
<p>So we now need to collect all this environment-specific data and decouple it from our tests.<br />
Start with the empty configuration data below (place it in the EnvironmentConfigData.psd1 file) and start populating it (it follows the DSC-style configuration data syntax).</p>
<p></p><pre class="crayon-plain-tag">@{
    AllNodes = @(
        @{
            # common node information hashtable
            NodeName = '*'; # do not edit
        },
        @{
            # Individual node information hashtable
            NodeName = 'Node1'
        },
        @{
            NodeName = 'Node2'
        },
        @{
            NodeName = 'Node3'
        },
        @{
            NodeName = 'Node4'
        }
    )
}</pre><p></p>
<p>Start by placing the values inside the node configuration data, following the general rule of thumb of mapping common data to the common node information hashtable and node-specific details to the individual node hashtables.</p>
<p>Now, in the previous tests, the only input is the domain user name. We can add that to the common node information hashtable, since the domain user&#8217;s membership in the local administrators group needs to be validated on all the nodes in the solution. The configuration data now looks like this:</p>
<p></p><pre class="crayon-plain-tag">@{    
    AllNodes = @(
        @{
            # common node information hashtable
            NodeName = '*'; # do not edit
            DomainUser = 'S2DClusterAdmin'
        },
        @{
            # Individual node information hashtable
            NodeName = 'Node1'
        },
        @{
            NodeName = 'Node2'
        },
        @{
            NodeName = 'Node3'
        },
        @{
            NodeName = 'Node4'
        }
    )
}</pre><p></p>
<h2>Stage 3 &#8211; Using PSRemotely for remote ops validation</h2>
<p>At this stage in the authoring workflow, we have our tests ready along with the environment configuration data in hand. Before using PSRemotely to target all the nodes for deployment readiness, we have to ask: how do we connect over PSRemoting to these nodes?</p>
<ul>
<li>Are the nodes domain joined?</li>
<li>Do we connect using DNS name resolution or the IPv4/IPv6 addresses of the remote nodes?</li>
<li>Do we connect using the logged-in user account or an alternate account?</li>
</ul>
<p>Based on the answers to these questions, usage of the PSRemotely DSL varies a bit, and most scenarios are documented. For this scenario, DNS name resolution of the nodes is used (the nodes are already domain joined) and the logged-in user account is used to connect to the remote nodes.</p>
<p>Now it is time to wrap our existing operations validation tests inside the PSRemotely DSL. The DSL consists of two keywords, <strong>PSRemotely</strong> and <strong>Node</strong>. PSRemotely is the outermost keyword, which lets you:</p>
<ul>
<li>Specify that all ops validation tests are housed inside a &lt;filename&gt;.PSRemotely.ps1 file.</li>
<li><a href="http://psremotely.readthedocs.io/en/latest/Example-ConfigurationData/">Specify the environment configuration data</a> e.g. hashtable/ .psd1 file/ .json file.</li>
<li><a href="http://psremotely.readthedocs.io/en/latest/Example-CredentialHash/">Specify credentials to be used to connect to each node</a> (if required).</li>
<li><a href="http://psremotely.readthedocs.io/en/latest/Example-PreferNodeProperty/">Specify a node specific property in the configuration data</a> to be used for initiating the PSRemoting session.</li>
<li><a href="http://psremotely.readthedocs.io/en/latest/Example-Custom-Variables/">Populate custom variables in the remote node’s execution context</a>.</li>
</ul>
<p>The Node keyword is where we target and organize our tests based on environment-specific data; it is very similar to the Node keyword in DSC. If you would like to <a href="http://psremotely.readthedocs.io/en/latest/PSRemotely-Node-Based-Target/">target different validations at nodes based on configuration data, this can be done using the Node keyword</a>.</p>
<p>Getting back to the problem at hand, let’s wrap our existing Pester tests inside the PSRemotely DSL. This is straightforward and looks like the code below. We can save the contents of the snippet in a file called S2DValidation.PSRemotely.ps1 (PSRemotely only accepts files with the .PSRemotely.ps1 extension).</p>
<p><strong>Note</strong> &#8211;  Take note of how the hard coded value for domain username (S2DClusterAdmin) from the standalone Pester tests is replaced with node specific configuration data e.g. $Node.DomainUser.</p>
<p></p><pre class="crayon-plain-tag"># Use the PSRemotely DSL to wrap the existing Pester tests and target remote nodes
PSRemotely -Path "$PSScriptRoot\EnvironmentConfigData.psd1" {
    # All the nodes in the solution are to be targeted
    Node $AllNodes.NodeName {
        # Ensure that Server 2016 OS installation is done.
        Describe "Operating system installed validation" {
            $OS = Get-CimInstance -Class Win32_OperatingSystem
            Context "Server Type check" {
                It "Should have the OSType as WinNT" {
                    $OS.OSType | Should Be 18
                }
            }
            Context 'Server 2016 check' {
                It "Should have server 2016 installed" {
                $OS.Caption | Should BeLike '*Server 2016*'
                }
            }
        }
        # Validate that the domain account is part of the local administrators group on a node.
        Describe "Domain account is local administrator validation" {
          $LocalGroupMember = Get-LocalGroupMember -Group Administrators -Member "$($Node.DomainUser)" -ErrorAction SilentlyContinue
            It "Should be member of local admins group" {
                $LocalGroupMember | Should NOT BeNullOrEmpty
            }
        }
    }
}</pre><p></p>
<p>We are all set, with two files in the directory: EnvironmentConfigData.psd1 and S2DValidation.PSRemotely.ps1. It is finally time to <a href="http://psremotely.readthedocs.io/en/latest/Invoke-PSRemotely/">invoke PSRemotely</a> and give remote operations validation a go.</p>
<p>We can run Invoke-PSRemotely in the current directory to run all the operations validations housed inside it, or specify a path to a file ending with the *.PSRemotely.ps1 extension.</p>
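<p>Both invocation styles look like this (a hedged sketch; verify the exact parameter names in the linked documentation):</p>
<pre class="crayon-plain-tag"># Run all *.PSRemotely.ps1 files found in the current directory
Invoke-PSRemotely

# Or point it at a specific file
Invoke-PSRemotely -Script .\S2DValidation.PSRemotely.ps1</pre>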
<p><iframe title="PSRemotely demo for PSMag" width="760" height="570" src="https://www.youtube.com/embed/nsewGyRRiUw?feature=oembed" frameborder="0" allow="accelerometer; autoplay; encrypted-media; gyroscope; picture-in-picture" allowfullscreen></iframe></p>
<p>As shown in the above video, a JSON object is returned for each node targeted. In the returned JSON object, the Status property is true if all the tests (Describe blocks) passed on the remote node. The Tests property is an array of the individual tests (Describe blocks) run on the remote node; if all the tests pass, an empty JSON array of <em>TestResult</em> objects is returned, otherwise the error record thrown by Pester is returned.</p>
<p><a href="http://www.powershellmagazine.com/wp-content/uploads/2017/06/1.png" rel="lightbox[12743]"><img class="aligncenter wp-image-12744" src="http://www.powershellmagazine.com/wp-content/uploads/2017/06/1-1024x331.png" alt="" width="492" height="159" srcset="https://www.powershellmagazine.com/wp-content/uploads/2017/06/1-1024x331.png 1024w, https://www.powershellmagazine.com/wp-content/uploads/2017/06/1-300x97.png 300w, https://www.powershellmagazine.com/wp-content/uploads/2017/06/1-768x248.png 768w, https://www.powershellmagazine.com/wp-content/uploads/2017/06/1.png 1089w" sizes="(max-width: 492px) 100vw, 492px" /></a></p>
<p>For the node which failed one of the validations, the JSON object looks like below. Individual <em>TestResult</em> will contain more information on the failing tests on the remote nodes.</p>
<p><a href="http://www.powershellmagazine.com/wp-content/uploads/2017/06/2.png" rel="lightbox[12743]"><img class="aligncenter wp-image-12745" src="http://www.powershellmagazine.com/wp-content/uploads/2017/06/2-1024x401.png" alt="" width="483" height="189" srcset="https://www.powershellmagazine.com/wp-content/uploads/2017/06/2-1024x401.png 1024w, https://www.powershellmagazine.com/wp-content/uploads/2017/06/2-300x118.png 300w, https://www.powershellmagazine.com/wp-content/uploads/2017/06/2-768x301.png 768w, https://www.powershellmagazine.com/wp-content/uploads/2017/06/2.png 1455w" sizes="(max-width: 483px) 100vw, 483px" /></a></p>
<p>For the failed node, we can quickly verify that, of the two validations targeted at the remote node, only one is failing.</p>
<p><a href="http://www.powershellmagazine.com/wp-content/uploads/2017/06/3.png" rel="lightbox[12743]"><img class="aligncenter wp-image-12746" src="http://www.powershellmagazine.com/wp-content/uploads/2017/06/3-1024x173.png" alt="" width="479" height="81" srcset="https://www.powershellmagazine.com/wp-content/uploads/2017/06/3-1024x173.png 1024w, https://www.powershellmagazine.com/wp-content/uploads/2017/06/3-300x51.png 300w, https://www.powershellmagazine.com/wp-content/uploads/2017/06/3-768x130.png 768w, https://www.powershellmagazine.com/wp-content/uploads/2017/06/3.png 1275w" sizes="(max-width: 479px) 100vw, 479px" /></a></p>
<p>There could be many reasons why the operational validation tests fail on a remote node. In the next post, we will take a look at how to connect to the underlying PSSession used by PSRemotely to debug these failures.</p>
]]></content:encoded>
							<wfw:commentRss>https://www.powershellmagazine.com/2017/06/29/psremotely-authoring-workflow-part-1/feed/</wfw:commentRss>
		<slash:comments>2</slash:comments>
							</item>
		<item>
		<title>Infrastructure Blueprints &#8211; Adding Pre-deployment Validations</title>
		<link>https://www.powershellmagazine.com/2017/06/28/infrastructure-blueprints-adding-pre-deployment-validations/</link>
				<comments>https://www.powershellmagazine.com/2017/06/28/infrastructure-blueprints-adding-pre-deployment-validations/#comments</comments>
				<pubDate>Wed, 28 Jun 2017 16:00:22 +0000</pubDate>
		<dc:creator><![CDATA[Ravikanth C]]></dc:creator>
				<category><![CDATA[Articles]]></category>
		<category><![CDATA[DevOps]]></category>
		<category><![CDATA[Online Only]]></category>
		<category><![CDATA[DSC]]></category>
		<category><![CDATA[IaC]]></category>
		<category><![CDATA[PSDSC]]></category>

		<guid isPermaLink="false">http://www.powershellmagazine.com/?p=12733</guid>
				<description><![CDATA[In one of my earlier articles here, I wrote about the Infrastructure Blueprints project. Over the weekend, I published an... ]]></description>
								<content:encoded><![CDATA[<p>In one of my <a href="http://www.powershellmagazine.com/2017/05/15/infrastructure-blueprints-a-way-to-share-psdsc-configurations/">earlier articles</a> here, I wrote about the <a href="https://github.com/rchaganti/InfraBlueprints">Infrastructure Blueprints</a> project. Over the weekend, I published an update to this project.</p>
<ul>
<li>Renamed Hyper-VConfigurations composite resource module to HyperVConfigurations. This is a breaking change.</li>
<li>Added SystemConfigurations composite resource module containing one composite configuration that includes domain join, remote desktop, timezone, and IE enhanced security configurations.</li>
<li>Added Pre-Deploy tests under Diagnostics for each composite resource.</li>
</ul>
<p>Here is a summary of all composite resources in this repository:</p>
<table border="1">
<thead>
<tr>
<th>Module Name</th>
<th>Composite Resources</th>
</tr>
</thead>
<tbody>
<tr>
<td><a href="https://www.powershellgallery.com/packages/HyperVConfigurations">HyperVConfigurations</a></td>
<td><a href="https://github.com/rchaganti/InfraBlueprints/tree/Dev/HyperVConfigurations/DSCResources/HyperVSwitchEmbeddedTeam">HyperVSwitchEmbeddedTeam</a></td>
</tr>
<tr>
<td></td>
<td><a href="https://github.com/rchaganti/InfraBlueprints/tree/Dev/HyperVConfigurations/DSCResources/HyperVSwitchEmbeddedTeamForS2D">HyperVSwitchEmbeddedTeamForS2D</a></td>
</tr>
<tr>
<td></td>
<td><a href="https://github.com/rchaganti/InfraBlueprints/tree/Dev/HyperVConfigurations/DSCResources/HyperVSwitchNativeTeam">HyperVSwitchNativeTeam</a></td>
</tr>
<tr>
<td><a href="https://www.powershellgallery.com/packages/SystemConfigurations">SystemConfigurations</a></td>
<td><a href="https://github.com/rchaganti/InfraBlueprints/tree/Dev/SystemConfigurations/DSCResources/SystemDomainJoinWithCustomTimezone">SystemDomainJoinWithCustomTimezone</a></td>
</tr>
</tbody>
</table>
<p>Let&#8217;s come to the subject of today&#8217;s post. In any infrastructure deployment, even if you are not automating everything, there is a set of prerequisite checks you would perform. For example, if your goal is to deploy a switch embedded team (SET) for a Hyper-V host configuration, you have to check for the existence of the physical network adapters that you plan to use in the SET configuration. There are many such pre-deployment checks to perform. So, when using these infrastructure blueprints, it is ideal to package the pre-deployment tests into the composite resource module itself.</p>
<p>To address this need, I added PreDeploy test scripts under the Diagnostics tests for each composite resource.</p>
<p><a href="http://www.powershellmagazine.com/wp-content/uploads/2017/06/Predeploy.png" rel="lightbox[12733]"><img class="aligncenter wp-image-12737" src="http://www.powershellmagazine.com/wp-content/uploads/2017/06/Predeploy-1024x354.png" alt="" width="606" height="209" srcset="https://www.powershellmagazine.com/wp-content/uploads/2017/06/Predeploy-1024x354.png 1024w, https://www.powershellmagazine.com/wp-content/uploads/2017/06/Predeploy-300x104.png 300w, https://www.powershellmagazine.com/wp-content/uploads/2017/06/Predeploy-768x265.png 768w, https://www.powershellmagazine.com/wp-content/uploads/2017/06/Predeploy.png 1447w" sizes="(max-width: 606px) 100vw, 606px" /></a></p>
<p>The PreDeploy folder is where all my pre-deployment tests are stored. Here is the pre-deployment test script for the SET team.</p>
<p></p><pre class="crayon-plain-tag">Describe 'Predeploy tests for Hyper-V Deployment with Switch Embedded Teaming and related network Configuration' {
    Context "Operating System version tests" {
        $OS = Get-CimInstance -Class Win32_OperatingSystem
        It "Should have the OSType as WinNT" {
            $OS.OSType | Should Be 18
        }
        
        It "Should have server 2016 installed" {
           $OS.Caption | Should BeLike '*Server 2016*'
        }        
    }

    Context 'Network adapters should exist' {
        Foreach ($adapter in $configurationData.AllNodes.NetAdapterName)
        {
            It "Network adapter named '$adapter' should exist" {
                Get-NetAdapter -Name $adapter -ErrorAction SilentlyContinue | Should Not BeNullOrEmpty
            }
        }
    }
}</pre><p></p>
<p>In the above test script, we check that the OS version is indeed Windows Server 2016 to ensure that the SET configuration can be deployed. We also check for the presence of the physical network adapters listed in the configuration data to ensure that the SET configuration completes without errors.</p>
<p><a href="http://www.powershellmagazine.com/wp-content/uploads/2017/06/DeployValidation.png" rel="lightbox[12733]"><img class="aligncenter wp-image-12739" src="http://www.powershellmagazine.com/wp-content/uploads/2017/06/DeployValidation.png" alt="" width="416" height="235" srcset="https://www.powershellmagazine.com/wp-content/uploads/2017/06/DeployValidation.png 765w, https://www.powershellmagazine.com/wp-content/uploads/2017/06/DeployValidation-300x169.png 300w" sizes="(max-width: 416px) 100vw, 416px" /></a></p>
<p>The above flow summarizes the deployment workflow. We execute the pre-deployment tests first and perform the deployment only once all of these tests succeed. Once the deployment is complete, we run either the comprehensive or the simple operational tests and put the system into operations only when these tests succeed.</p>
<p>Whatever orchestration script or method you plan to use, building this workflow into the process will certainly help you create a resilient deployment pipeline.</p>
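<p>The gating in this workflow is straightforward to script. Here is a minimal sketch: the Pester-style result objects are stubbed with <code>[pscustomobject]</code> placeholders, and in a real pipeline they would come from something like <code>Invoke-Pester -PassThru</code>, which returns an object carrying a <em>FailedCount</em> property; the test paths in the comments are hypothetical.</p>

```powershell
function Test-DeploymentGate {
    param ([Parameter(Mandatory)] $TestResult)

    # Proceed only when no tests in the result set failed
    $TestResult.FailedCount -eq 0
}

# In a real pipeline: $preDeploy = Invoke-Pester -Script .\Diagnostics\PreDeploy -PassThru
$preDeploy = [pscustomobject]@{ FailedCount = 0 }

if (-not (Test-DeploymentGate -TestResult $preDeploy)) {
    throw 'Pre-deployment validation failed; aborting deployment.'
}

# ... perform the deployment here (for example, Start-DscConfiguration) ...

# In a real pipeline: $operations = Invoke-Pester -Script .\Diagnostics\Simple -PassThru
$operations = [pscustomobject]@{ FailedCount = 0 }

if (-not (Test-DeploymentGate -TestResult $operations)) {
    throw 'Operational validation failed; do not put the system into operations.'
}
```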
]]></content:encoded>
							<wfw:commentRss>https://www.powershellmagazine.com/2017/06/28/infrastructure-blueprints-adding-pre-deployment-validations/feed/</wfw:commentRss>
		<slash:comments>1</slash:comments>
							</item>
		<item>
		<title>Discover and Package Dependent Resource Modules for a #PSDSC Configuration</title>
		<link>https://www.powershellmagazine.com/2017/06/27/discover-and-package-dependent-resource-modules-for-a-psdsc-configuration/</link>
				<comments>https://www.powershellmagazine.com/2017/06/27/discover-and-package-dependent-resource-modules-for-a-psdsc-configuration/#respond</comments>
				<pubDate>Tue, 27 Jun 2017 16:00:45 +0000</pubDate>
		<dc:creator><![CDATA[Ravikanth C]]></dc:creator>
				<category><![CDATA[Articles]]></category>
		<category><![CDATA[Online Only]]></category>
		<category><![CDATA[PSDSC]]></category>

		<guid isPermaLink="false">http://www.powershellmagazine.com/?p=12708</guid>
				<description><![CDATA[If you have ever used the Publish-AzureRmVMDscConfiguration cmdlet in the Azure PowerShell tools, you may know already that this command... ]]></description>
								<content:encoded><![CDATA[<p>If you have ever used the <a href="https://docs.microsoft.com/en-us/powershell/module/azurerm.compute/publish-azurermvmdscconfiguration?view=azurermps-4.0.0" target="_blank" rel="noopener noreferrer">Publish-AzureRmVMDscConfiguration</a> cmdlet in the Azure PowerShell tools, you may know already that this command discovers module dependencies for a configuration and packages all dependencies along with the configuration as a zip archive.</p>
<p></p><pre class="crayon-plain-tag">Publish-AzureRmVMDscConfiguration ".\MyConfiguration.ps1" -OutputArchivePath ".\MyConfiguration.ps1.zip"</pre><p></p>
<p>When I first used this cmdlet, I felt this was a really good idea for on-premises build processes and immediately tried to find out how it discovers module dependencies. I was almost certain that it was not just text parsing but a little more than that. This exploration led me to the source code for this cmdlet, and I saw clear traces of the <a href="https://msdn.microsoft.com/en-us/library/system.management.automation.language.ast(v=vs.85).aspx" target="_blank" rel="noopener noreferrer">AST</a> being used.</p>
<p>The second instance where I came across the use of the AST in finding resource module dependencies was the <em>Configuration</em> function in the <em>PSDesiredStateConfiguration</em> module. Starting with WMF 5.0, this function has a runtime parameter called <em>ResourceModuleTuplesToImport</em>.</p>
<p><a href="http://www.powershellmagazine.com/wp-content/uploads/2017/05/COnfigFunction.png" rel="lightbox[12708]"><img class="aligncenter wp-image-12712" src="http://www.powershellmagazine.com/wp-content/uploads/2017/05/COnfigFunction-1024x328.png" alt="" width="592" height="190" srcset="https://www.powershellmagazine.com/wp-content/uploads/2017/05/COnfigFunction-1024x328.png 1024w, https://www.powershellmagazine.com/wp-content/uploads/2017/05/COnfigFunction-300x96.png 300w, https://www.powershellmagazine.com/wp-content/uploads/2017/05/COnfigFunction-768x246.png 768w, https://www.powershellmagazine.com/wp-content/uploads/2017/05/COnfigFunction.png 1388w" sizes="(max-width: 592px) 100vw, 592px" /></a></p>
<p></p><pre class="crayon-plain-tag">PS C:\&gt; (Get-Command Configuration | Select-Object -ExpandProperty Parameters).ResourceModuleTuplesToImport


Name            : ResourceModuleTuplesToImport
ParameterType   : System.Collections.Generic.List`1[System.Tuple`3[System.String[],Microsoft.PowerShell.Commands.ModuleSpecification[],System.Version]]
ParameterSets   : {[__AllParameterSets, System.Management.Automation.ParameterSetMetadata]}
IsDynamic       : False
Aliases         : {}
Attributes      : {__AllParameterSets, System.Management.Automation.ArgumentTypeConverterAttribute}
SwitchParameter : False</pre><p></p>
<p>The argument for <em>ResourceModuleTuplesToImport</em> gets populated at runtime &#8212; when a <em>Configuration</em> gets loaded for the first time. To be specific, when you create a configuration document and load it into memory, the AST is processed and the argument to this parameter is populated. You can trace this back to <a href="https://github.com/PowerShell/PowerShell/blob/master/src/System.Management.Automation/engine/parser/ast.cs">ast.cs</a>. Here is a part of that code.</p>
<p></p><pre class="crayon-plain-tag">///////////////////////////
// get import parameters
var bodyStatements = Body.ScriptBlock.EndBlock.Statements;
var resourceModulePairsToImport = new List&lt;Tuple&lt;string[], ModuleSpecification[], Version&gt;&gt;();
var resourceBody = (from stm in bodyStatements where !IsImportCommand(stm, resourceModulePairsToImport) select (StatementAst)stm.Copy()).ToList();</pre><p></p>
<p>So, the whole magic of deriving the dependent modules happens in the <em>IsImportCommand</em> method. Once I reviewed the code there, it wasn&#8217;t tough to reverse engineer it into PowerShell.</p>
<p>I published my scripts to <a href="https://github.com/rchaganti/PSDSCUtils">https://github.com/rchaganti/PSDSCUtils</a>. Let&#8217;s take a look at the script now.</p>
<p></p><pre class="crayon-plain-tag">[CmdletBinding()]
param (
    [Parameter(Mandatory)]
    [String] $ConfigurationScript,

    [Parameter()]
    [Switch] $Package,

    [Parameter()]
    [String] $PackagePath
)

$ConfigurationScriptContent = Get-Content -Path $ConfigurationScript -Raw
$ast = [System.Management.Automation.Language.Parser]::ParseInput($ConfigurationScriptContent, [ref]$null, [ref]$null)
$configAst = $ast.FindAll({ $args[0] -is [System.Management.Automation.Language.ConfigurationDefinitionAst]}, $true)
$moduleSpecifcation = @()
foreach ($config in $configAst)
{
    $dksAst = $config.FindAll({ $args[0] -is [System.Management.Automation.Language.DynamicKeywordStatementAst]}, $true)

    foreach ($dynKeyword in $dksAst)
    {
        [System.Management.Automation.Language.CommandElementAst[]] $cea = $dynKeyword.CommandElements.Copy()
        $allCommands = [System.Management.Automation.Language.CommandAst]::new($dynKeyword.Extent, $cea, [System.Management.Automation.Language.TokenKind]::Unknown, $null)
        foreach ($importCommand in $allCommands)
        {
            if ($importCommand.CommandElements[0].Value -eq 'Import-DscResource')
            {
                [System.Management.Automation.Language.StaticBindingResult]$spBinder = [System.Management.Automation.Language.StaticParameterBinder]::BindCommand($importCommand, $false)
            
                $moduleNames = ''
                $resourceNames = ''
                $moduleVersion = ''
                foreach ($item in $spBinder.BoundParameters.GetEnumerator())
                { 
                    $parameterName = $item.key
                    $argument = $item.Value.Value.Extent.Text

                    #Check if the parametername is Name
                    $parameterToCheck = 'Name'
                    $parameterToCheckLength = $parameterToCheck.Length
                    $parameterNameLength = $parameterName.Length

                    if (($parameterNameLength -le $parameterToCheckLength) -and ($parameterName.Equals($parameterToCheck.Substring(0,$parameterNameLength))))
                    {
                        $resourceNames = $argument.Split(',')
                    }

                    #Check if the parametername is ModuleName
                    $parameterToCheck = 'ModuleName'
                    $parameterToCheckLength = $parameterToCheck.Length
                    $parameterNameLength = $parameterName.Length
                    if (($parameterNameLength -le $parameterToCheckLength) -and ($parameterName.Equals($parameterToCheck.Substring(0,$parameterNameLength))))
                    {
                        $moduleNames = $argument.Split(',')
                    }

                    #Check if the parametername is ModuleVersion
                    $parameterToCheck = 'ModuleVersion'
                    $parameterToCheckLength = $parameterToCheck.Length
                    $parameterNameLength = $parameterName.Length
                    if (($parameterNameLength -le $parameterToCheckLength) -and ($parameterName.Equals($parameterToCheck.Substring(0,$parameterNameLength))))
                    {
                        if (-not ($argument.Contains(',')))
                        {
                            $moduleVersion = $argument
                        }
                        else
                        {
                            throw 'Cannot specify more than one moduleversion' 
                        }
                    }
                }

                #Get the module details
                #"Module Names: " + $moduleNames
                #"Resource Name: " + $resourceNames
                #"Module Version: " + $moduleVersion 

                if($moduleVersion)
                {
                    if (-not $moduleNames)
                    {
                        throw '-ModuleName is required when -ModuleVersion is used'
                    }

                    if ($moduleNames.Count -gt 1)
                    {
                        throw 'Cannot specify more than one module when ModuleVersion parameter is used'
                    }
                }

                if ($resourceNames)
                {
                    if ($moduleNames.Count -gt 1)
                    {
                        throw 'Cannot specify more than one module when the Name parameter is used'
                    }
                }
            
                #We have multiple combinations of parameters possible
                #Case 1: All three are provided: ModuleName, ModuleVersion, and Name
                #Case 2: ModuleName and ModuleVersion are provided
                #Case 3: Only Name is provided
                #Case 4: Only ModuleName is provided
                
                #Case 1, 2, and 4
                #At the moment, there is no error check on the resource names supplied as argument to -Name
                if ($moduleNames)
                {
                    foreach ($module in $moduleNames)
                    {
                        if (-not ($module -eq 'PSDesiredStateConfiguration'))
                        {
                            $moduleHash = @{
                                ModuleName = $module
                            }

                            if ($moduleVersion)
                            {
                                $moduleHash.Add('ModuleVersion',$moduleVersion)
                            }
                            else
                            {
                                $availableModuleVersion = Get-RecentModuleVersion -ModuleName $module
                                $moduleHash.Add('ModuleVersion',$availableModuleVersion)
                            }

                            $moduleInfo = Get-Module -ListAvailable -FullyQualifiedName $moduleHash -Verbose:$false -ErrorAction SilentlyContinue
                            if ($moduleInfo)
                            {
                                #TODO: Check if listed resources are equal or subset of what module exports
                                $moduleSpecifcation += $moduleInfo
                            }
                            else
                            {
                                throw "No module exists with name ${module}"
                            }
                        }
                    }    
                }

                #Case 3
                #Foreach resource, we need to find a module
                if ((-not $moduleNames) -and $resourceNames)
                {
                    $moduleHash = Get-DscModulesFromResourceName -ResourceNames $resourceNames -Verbose:$false
                    foreach ($module in $moduleHash)
                    {
                        $moduleInfo = Get-Module -ListAvailable -FullyQualifiedName $module -Verbose:$false   
                        $moduleSpecifcation += $moduleInfo 
                    }
                }
            }
        }
    }
}

if ($Package)
{
    #Create a temp folder
    $null = mkdir "${env:temp}\modules" -Force -Verbose:$false

    #Copy all module folders to a temp folder
    foreach ($module in $moduleSpecifcation)
    {
        $null = mkdir "${env:temp}\modules\$($module.Name)"
        Copy-Item -Path $module.ModuleBase -Destination "${env:temp}\modules\$($module.Name)" -Container -Recurse -Verbose:$false
    }

    #Create an archive with all needed modules
    Compress-Archive -Path "${env:temp}\modules" -DestinationPath $PackagePath -Force -Verbose:$false

    #Remove the folder
    Remove-Item -Path "${env:temp}\modules" -Recurse -Force -Verbose:$false
}
else
{
    return $moduleSpecifcation
}

function Get-DscModulesFromResourceName
{
    [CmdletBinding()]
    param (
        [Parameter(Mandatory)]
        [string[]] $ResourceNames
    )

    process
    {
        $moduleInfo = Get-DscResource -Name $ResourceNames -Verbose:$false | Select -Expand ModuleName -Unique
        $moduleHash = @()
        foreach ($module in $moduleInfo)
        {
            $moduleHash += @{
                 ModuleName = $module
                 ModuleVersion = (Get-RecentModuleVersion -ModuleName $module)
            }
        }

        return $moduleHash
    }
}

function Get-DscResourcesFromModule
{
    [CmdletBinding()]
    param (
        [Parameter(Mandatory)]
        [String] $ModuleName,

        [Parameter()]
        [Version] $ModuleVersion
    )

    process
    {
        $resourceInfo = Get-DscResource -Module $ModuleName -Verbose:$false
        if ($resourceInfo)
        {
            if ($ModuleVersion)
            {
                $resources = $resourceInfo.Where({$_.Module.Version -eq $ModuleVersion})
                return $resources.Name
            }
            else
            {
                #check if there are multiple versions of the modules; if so, return the most recent one
                $mostRecentVersion = Get-RecentModuleVersion -ModuleName $ModuleName
                Get-DscResourcesFromModule -ModuleName $ModuleName -ModuleVersion $mostRecentVersion
            }
        }
    }
}

function Get-RecentModuleVersion
{
    [CmdletBinding()]
    param (
        [Parameter(Mandatory)]
        [String] $ModuleName
    )

    process
    {
        $moduleInfo = Get-Module -ListAvailable -Name $ModuleName -Verbose:$false | Sort -Property Version
        if ($moduleInfo)
        {
            return ($moduleInfo[-1].Version).ToString()
        }
    }
}</pre><p></p>
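<p>A side note on the repeated length-and-substring checks in the script above: they emulate PowerShell&#8217;s parameter-prefix matching, so an abbreviation such as <em>-Module</em> still resolves to <em>ModuleName</em>. Extracted into a helper, the idea looks like this (a simplified sketch; unlike the real parameter binder, it does not detect ambiguous prefixes such as <em>Module</em> matching both <em>ModuleName</em> and <em>ModuleVersion</em>):</p>

```powershell
# Simplified sketch of the prefix check used repeatedly in the script above
function Test-ParameterPrefix {
    param (
        [Parameter(Mandatory)][string] $TypedName,  # what appears in the script, e.g. 'Module'
        [Parameter(Mandatory)][string] $FullName    # the declared parameter, e.g. 'ModuleName'
    )

    # A typed name matches when it is no longer than the full name and
    # equals the full name truncated to the typed length
    ($TypedName.Length -le $FullName.Length) -and
        $TypedName.Equals($FullName.Substring(0, $TypedName.Length))
}
```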
<p>Here is how you use this script:</p>
<p>With just the <em>-ConfigurationScript</em> parameter, this script emits ModuleInfo objects representing the modules that are imported in the configuration script.</p>
<p><a href="http://www.powershellmagazine.com/wp-content/uploads/2017/06/ModuleList-1.png" rel="lightbox[12708]"><img class="aligncenter wp-image-12728" src="http://www.powershellmagazine.com/wp-content/uploads/2017/06/ModuleList-1-1024x152.png" alt="" width="605" height="90" srcset="https://www.powershellmagazine.com/wp-content/uploads/2017/06/ModuleList-1-1024x152.png 1024w, https://www.powershellmagazine.com/wp-content/uploads/2017/06/ModuleList-1-300x45.png 300w, https://www.powershellmagazine.com/wp-content/uploads/2017/06/ModuleList-1-768x114.png 768w" sizes="(max-width: 605px) 100vw, 605px" /></a></p>
<p>In case you need to package the modules into a zip archive, you can use the <em>-Package</em> and <em>-PackagePath</em> parameters.</p>
<p></p><pre class="crayon-plain-tag">.\Get-DSCResourceModulesFromConfiguration.ps1 -ConfigurationScript C:\Scripts\VMDscDemo.ps1 -Package -PackagePath C:\Scripts\modules.zip</pre><p></p>
<p>There are many use cases for this. I use this extensively in my Hyper-V lab configurations. What are your use cases?</p>
]]></content:encoded>
							<wfw:commentRss>https://www.powershellmagazine.com/2017/06/27/discover-and-package-dependent-resource-modules-for-a-psdsc-configuration/feed/</wfw:commentRss>
		<slash:comments>0</slash:comments>
							</item>
		<item>
		<title>Cisco UCS Custom Resource for the Windows PowerShell Desired State Configuration (DSC)</title>
		<link>https://www.powershellmagazine.com/2017/05/29/cisco-ucs-custom-resource-for-the-windows-powershell-desired-state-configuration-dsc/</link>
				<comments>https://www.powershellmagazine.com/2017/05/29/cisco-ucs-custom-resource-for-the-windows-powershell-desired-state-configuration-dsc/#comments</comments>
				<pubDate>Mon, 29 May 2017 16:00:59 +0000</pubDate>
		<dc:creator><![CDATA[Sumanth BR]]></dc:creator>
				<category><![CDATA[Articles]]></category>
		<category><![CDATA[Online Only]]></category>
		<category><![CDATA[Cisco]]></category>
		<category><![CDATA[PSDSC]]></category>

		<guid isPermaLink="false">http://www.powershellmagazine.com/?p=12717</guid>
				<description><![CDATA[The Cisco UCS PowerTool Suite is a set of PowerShell modules for Cisco UCS Manager, Cisco IMC (C-Series stand-alone servers)... ]]></description>
								<content:encoded><![CDATA[<p>The Cisco UCS PowerTool Suite is a set of PowerShell modules for Cisco UCS Manager, Cisco IMC (C-Series stand-alone servers) and Cisco UCS Central that helps in configuration and management of Cisco UCS domains and solutions.  The Cisco UCS PowerTool Suite 2.0.1 release added a new module <strong>Cisco.Ucs.DesiredStateConfiguration </strong>which consists of custom resources for configuring Cisco UCS Manager and Cisco IMC using the PowerShell DSC platform. You can download the latest version of the UCS PowerTool Suite from <a href="https://software.cisco.com/download/release.html?i=!y&amp;mdfid=286305108&amp;softwareid=284574017&amp;release=2.0.2" target="_blank" rel="noopener noreferrer">cisco.com</a>. Refer to <a href="https://communities.cisco.com/docs/DOC-37154" target="_blank" rel="noopener noreferrer">Cisco UCS PowerTool Suite</a> page on Cisco Communities for more resources.</p>
<p>PowerShell Desired State Configuration (DSC) is a management platform which enables you to configure, deploy, and manage systems. DSC provides <strong>declarative</strong>, <strong>autonomous</strong> and <strong>idempotent</strong> deployment, configuration and conformance for standards-based managed elements. For more information on DSC refer to the <a href="https://msdn.microsoft.com/en-us/powershell/dsc/overview" target="_blank" rel="noopener noreferrer">PowerShell DSC documentation</a>.</p>
<p>The Cisco UCS DSC resources aid in achieving Configuration as Code, in turn helping you follow the DevOps model.</p>
<p>The Cisco UCS DSC module provides six DSC custom resources which cover the majority of use cases.  You can view the custom UCS resources by running the <em>Get-DscResource</em> cmdlet as shown below.</p>
<p><a href="http://www.powershellmagazine.com/wp-content/uploads/2017/05/1.png" rel="lightbox[12717]"><img class="aligncenter wp-image-12718" src="http://www.powershellmagazine.com/wp-content/uploads/2017/05/1.png" alt="" width="604" height="107" srcset="https://www.powershellmagazine.com/wp-content/uploads/2017/05/1.png 843w, https://www.powershellmagazine.com/wp-content/uploads/2017/05/1-300x53.png 300w, https://www.powershellmagazine.com/wp-content/uploads/2017/05/1-768x137.png 768w" sizes="(max-width: 604px) 100vw, 604px" /></a></p>
<h2>Solution Architecture Overview</h2>
<p>Before getting into the details of the resources, let’s review some basic concepts of DSC and the overall architecture of the UCS PowerTool DSC solution.</p>
<p>The DSC management platform consists of three main components.</p>
<ul>
<li><strong>Configuration: </strong>This is where you define the configurations that need to be applied in a declarative manner. Once you run this configuration, DSC will take care of ensuring the system is in the state that is defined in the configuration.</li>
<li><strong>Resources:</strong> These are the building blocks for the configurations.</li>
<li><strong>Local Configuration Manager (LCM):</strong> This is the engine that facilitates the interaction between resources and configurations. The LCM regularly polls the state of the system and takes appropriate actions based on the resource. The LCM runs on every target node.</li>
</ul>
<p>In DSC there are two ways to deploy a configuration.</p>
<ul>
<li><strong>Push Mode</strong> &#8211; Push mode is a manual deployment of DSC resources to target nodes. Once the configuration is compiled the user runs the Start-DscConfiguration cmdlet to push the configuration to the desired nodes.</li>
<li><strong>Pull Mode</strong> &#8211; Pull mode configures nodes to check in to a DSC web pull server to retrieve the configuration files. When a new configuration is available, the LCM downloads and applies it to the target node.</li>
</ul>
<p>To utilize DSC functionality with Cisco UCS Manager or Cisco IMC, an intermediate server is required. The intermediate server is a Windows Server with the required Windows Management Framework (WMF), PowerShell, and the UCS PowerTool Suite installed. A typical architecture is shown in the figure below.</p>
<p><a href="http://www.powershellmagazine.com/wp-content/uploads/2017/05/2.jpg" rel="lightbox[12717]"><img class="aligncenter wp-image-12719" src="http://www.powershellmagazine.com/wp-content/uploads/2017/05/2.jpg" alt="" width="585" height="337" srcset="https://www.powershellmagazine.com/wp-content/uploads/2017/05/2.jpg 720w, https://www.powershellmagazine.com/wp-content/uploads/2017/05/2-300x173.jpg 300w" sizes="(max-width: 585px) 100vw, 585px" /></a></p>
<p><strong>Central Server</strong>—This server is used to write the UCS DSC configuration scripts for Cisco UCS Manager or Cisco IMC. This can be configured as a pull server if the method of deployment is <strong>Pull Mode</strong>.</p>
<p><strong>Intermediate Server</strong>— The Central server deploys the configuration to the Intermediate server. This server applies the configuration to the Cisco UCS Manager or Cisco IMC using the UCS PowerTool DSC cmdlets.</p>
<h2>Cisco UCS Manager DSC Resources</h2>
<p>There are four custom resources provided for configuring Cisco UCS Manager.</p>
<ul>
<li><strong>UcsSyncMoWithReference:</strong> This resource syncs configuration from a reference UCS Manager to any target UCS Managers.</li>
<li><strong>UcsManagedObject:</strong> This resource configures any UCS Manager Managed Object (MO) by specifying the MO details.</li>
<li><strong>UcsScript:</strong> This resource allows for the execution of UCS Manager PowerTool cmdlets.</li>
<li><strong>UcsSyncFromBackup:</strong> This resource applies configuration from a backup file to any target UCS Managers.</li>
</ul>
<h3>Generating DSC configuration document for UCS Manager GUI operations</h3>
<p>To simplify the process of authoring DSC configuration documents for UCS Manager, use the <strong>ConvertTo-UcsDscConfig</strong> cmdlet. This cmdlet is similar to the <strong>ConvertTo-UcsCmdlet</strong> cmdlet, which generates the UCS PowerTool cmdlets for the actions performed in the UCS Manager GUI. Creating a configuration document is a simple two-step process.</p>
<ol>
<li>Launch the UCS Manager GUI, either using the <strong>Start-UcsGuiSession</strong> cmdlet or manually, and then run the <strong>ConvertTo-UcsDscConfig</strong> cmdlet. If <em>-OutputFilePath</em> is supplied, the DSC configuration will be written to the specified file; otherwise, the ConvertTo-UcsDscConfig cmdlet writes its output to the UCS PowerTool console session.</li>
<li>Create the required configuration using the UCS Manager GUI. When you are done creating configurations, open the output file specified in step 1; the required DSC configuration document will have been auto-generated. If no output file was specified, copy and paste the UCS PowerTool console output into a file.</li>
</ol>
<p>Once you have the auto-generated document, you just need to customize a few environment-related settings such as the configuration data, the UCS Manager connection details, and the credentials.</p>
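<p>As a reference, here is a hypothetical skeleton showing where those environment-specific pieces live in a configuration document; the configuration name, node name, and parameter names are made up for illustration, and the auto-generated UCS resource blocks are elided.</p>

```powershell
# Hypothetical skeleton of a customized UCS DSC configuration document
$configData = @{
    AllNodes = @(
        # The intermediate server that applies the configuration to UCS
        @{ NodeName = 'IntermediateServer01' }
    )
}

Configuration UcsBaseline
{
    param (
        [Parameter(Mandatory)][pscredential] $UcsCredential,
        [Parameter(Mandatory)][string] $UcsConnectionString
    )

    Import-DscResource -ModuleName Cisco.Ucs.DesiredStateConfiguration

    Node $AllNodes.NodeName
    {
        # ... auto-generated UCS resource blocks go here ...
    }
}

# Compile the configuration into a MOF document
UcsBaseline -ConfigurationData $configData `
            -UcsCredential (Get-Credential) `
            -UcsConnectionString 'ucs-mgr.example.com'
```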
<h3>UcsSyncMoWithReference Custom Resource</h3>
<p>If you have more than one UCS domain in your datacenter and want to maintain a baseline configuration across all the UCS domains, use the UcsSyncMoWithReference resource.</p>
<p>You can create a configuration using this resource by specifying the Distinguished Name (DN) of the Managed Object (MO) that needs to be synced.  Here is an example of how you can sync a Service Profile, Service Profile Template, and LDAP Groups.</p>
<p></p><pre class="crayon-plain-tag">UcsSyncMoWithReference SyncServiceProfile
{
    UcsCredentials         = $ucsCredential
    UcsConnectionString    = $ucsConnString
    RefUcsCredentials      = $refUcsCredential
    RefUcsConnectionString = $refUcsConString
    Ensure                 = "Present"
    Identifier             = "2"
    Hierarchy              = $true
    Dn                     = "org-root/ls-SPExchangeServer"
}

UcsSyncMoWithReference SyncSpTemplate
{
    UcsCredentials         = $ucsCredential
    UcsConnectionString    = $ucsConnString
    RefUcsCredentials      = $refUcsCredential
    RefUcsConnectionString = $refUcsConString
    Ensure                 = "Present"
    Identifier             = "3"
    Hierarchy              = $true
    Dn                     = "org-root/ls-SPTSqlServer"
}

UcsSyncMoWithReference SyncLDAPGroups
{
    UcsCredentials         = $ucsCredential
    UcsConnectionString    = $ucsConnString
    RefUcsCredentials      = $refUcsCredential
    RefUcsConnectionString = $refUcsConString
    Ensure                 = "Present"
    Identifier             = "4"
    DeleteNotPresent       = $true
    # Sync all the LDAP groups by specifying the DN and Hierarchy = $true
    Hierarchy              = $true
    Dn                     = "sys/ldap-ext"
}</pre><p></p>
<p>In the above example I have specified the DN of the SP, the SP Template, and the LDAP group. By specifying Ensure=&#8221;Present&#8221;, the UcsSyncMoWithReference resource ensures that the MOs are created on the UCS domain. You can also specify what action to take when the target has additional MOs that are not present on the reference UCS. If you want to delete those additional MOs, specify DeleteNotPresent = $true, as done in the LDAP sync configuration above. Refer to the UCS Manager PowerTool User Guide for more details on the properties of this resource.</p>
<h3>UcsManagedObject Custom Resource</h3>
<p>This is a generic resource provided to configure any MO in UCS Manager. To use this resource, you need to be familiar with the MO definitions and properties. One way to make use of this resource is to generate the configuration automatically, as explained in the earlier section. If you are writing the configuration manually, refer to the <a href="http://www.cisco.com/c/en/us/td/docs/unified_computing/ucs/sw/api/b_ucs_api_book.html">UCS Manager XML API Programmer’s Guide</a>. Below is an example configuration that creates an Org in UCS Manager. There are a few key things to consider while creating the configuration: you need to specify the DN, the XML API Class ID, and the Property Map.</p>
<p></p><pre class="crayon-plain-tag">UcsManagedObject CreateOrganisationDemo
{
    Ensure              = "Present"
    ModifyPresent       = $true
    ClassId             = "orgOrg"
    Dn                  = "org-root/org-DSCDemoOrg"
    PropertyMap         = "Descr = test for DSC with certificate `nName = DSCDemoOrg"
    UcsCredentials      = $ucsCredential
    UcsConnectionString = $connectionString
    Identifier          = "2"
}</pre><p></p>
<p>You need to specify the properties of the managed object as key-value pairs, using the following format:</p>
<p></p><pre class="crayon-plain-tag">`&lt;key1&gt;=&lt;value1&gt; `&lt;key2&gt;=&lt;value2&gt;</pre><p></p>
<h3>UcsScript Custom Resource</h3>
<p>This is a generic resource provided to execute UCS Manager PowerTool cmdlets in a script. You can use this resource in cases where the configuration is complex and needs to be written as a script. You can also generate the configuration automatically for this resource as explained earlier.  Below is an example configuration of renaming a Service Profile.</p>
<p></p><pre class="crayon-plain-tag">UcsScript RenameServiceProfileDemo
        {
            Ensure = "Present"
            Dn = "org-root/ls-dscdemo"
            Script = "Get-UcsOrg -Level root | Get-UcsServiceProfile -Name 'TestSP' -LimitScope | Rename-UcsServiceProfile -NewName 'dscdemo' "
            UcsCredentials = $ucsCredential
            UcsConnectionString = $connectionString
            Identifier ="1"
        }</pre><p></p>
<p>If the configuration script is complex, you can specify multiple DNs in a comma-separated format.</p>
<h3>Configuration Example</h3>
<p>This section shows how to put all of these pieces together in a DSC configuration document.</p>
<p>For all of the examples above, you need to specify environment settings, UCS connection details, and credentials.</p>
<p>The UCS connection string needs to be specified in the following format:</p>
<p></p><pre class="crayon-plain-tag">Name=&lt;ipAddress&gt; [`nNoSsl=&lt;bool&gt;][`nPort=&lt;ushort&gt;] [`nProxyAddress=&lt;proxyAddress&gt;] [`nUseProxyDefaultCredentials=&lt;bool&gt;]</pre><p></p>
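<p>For example, a connection string for a UCS Manager reachable at a given address and a non-default port might look like this (the address and port values are purely illustrative):</p>
<p></p><pre class="crayon-plain-tag">$connectionString = "Name=10.65.183.5 `nNoSsl=false `nPort=8080"</pre><p></p>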
<p>UCS Manager credentials need to be specified as a PSCredential object. You can encrypt the credentials with a certificate to keep them secure. For information on using certificates for encryption, refer to the <a href="https://msdn.microsoft.com/en-us/powershell/dsc/securemof" target="_blank" rel="noopener noreferrer">Microsoft DSC documentation</a>.</p>
<p>Below is an example configuration that has all the details.</p>
<p></p><pre class="crayon-plain-tag">$configData = @{
    AllNodes = @(
        @{
            # The name of the node we are describing
            NodeName = "10.105.219.128"

            # The path to the .cer file containing the public key of the
            # certificate used to encrypt credentials for this node
            CertificateFile = "C:\Certificate\MyCertificate.cer"

            # The thumbprint of the certificate used to decrypt
            # the credentials on the target node
            Thumbprint = "558CF40844CDC6303D25494FB007189F75BEE060"
        }
    )
}

Configuration AutoGeneratedConfig
{
    param (
        [Parameter(Mandatory = $true)]
        [PSCredential] $ucsCredential,

        [Parameter(Mandatory = $true)]
        [string] $connectionString
    )

    Import-DscResource -ModuleName Cisco.Ucs.DesiredStateConfiguration

    Node "10.105.219.128"
    {
        LocalConfigurationManager
        {
            CertificateId     = $node.Thumbprint
            ConfigurationMode = 'ApplyOnly'
            RefreshMode       = 'Push'
        }

        UcsManagedObject UcsManagedObject1
        {
            Ensure              = "Present"
            ModifyPresent       = $true
            ClassId             = "equipmentLocatorLed"
            Dn                  = "sys/chassis-1/blade-1/locator-led"
            PropertyMap         = "Id = 1 `nBoardType = single `nAdminState = on"
            UcsCredentials      = $ucsCredential
            UcsConnectionString = $connectionString
            Identifier          = "1"
        }

        UcsManagedObject UcsManagedObject2
        {
            Ensure              = "Present"
            ModifyPresent       = $true
            ClassId             = "orgOrg"
            Dn                  = "org-root/org-SubOrg2"
            PropertyMap         = "Descr = test for DSC with certificate `nName = SubOrg2"
            UcsCredentials      = $ucsCredential
            UcsConnectionString = $connectionString
            Identifier          = "2"
        }
    }
}

try
{
    $Error.Clear()
    $credential = Get-Credential

    AutoGeneratedConfig -ConfigurationData $configData `
                        -ucsCredential $credential `
                        -connectionString "Name=10.65.183.5" `
                        -OutputPath "C:\DscDemo\AutoGeneratedConfig"
}
catch
{
    Write-Host $Error
    exit
}</pre><p></p>
<p>When you run this script, PowerShell generates the corresponding MOF files. You can deploy the configuration based on the LCM configuration mode. If it is set to Push mode, you can enact the configuration using the following syntax.</p>
<p></p><pre class="crayon-plain-tag">Start-DscConfiguration -Path "C:\DscDemo\AutoGeneratedConfig\" -Wait -Verbose -force</pre><p></p>
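<p>After the enact completes, you can optionally verify the state on the node with the standard DSC cmdlets:</p>
<p></p><pre class="crayon-plain-tag"># Check whether the node is in the desired state
Test-DscConfiguration -Verbose

# Retrieve the current configuration as applied by the LCM
Get-DscConfiguration</pre><p></p>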
]]></content:encoded>
							<wfw:commentRss>https://www.powershellmagazine.com/2017/05/29/cisco-ucs-custom-resource-for-the-windows-powershell-desired-state-configuration-dsc/feed/</wfw:commentRss>
		<slash:comments>2</slash:comments>
							</item>
		<item>
		<title>#PSDSC Doing It Right &#8211; Configuration vs Orchestration</title>
		<link>https://www.powershellmagazine.com/2017/05/24/psdsc-doing-it-right-configuration-vs-orchestration/</link>
				<comments>https://www.powershellmagazine.com/2017/05/24/psdsc-doing-it-right-configuration-vs-orchestration/#respond</comments>
				<pubDate>Wed, 24 May 2017 16:00:49 +0000</pubDate>
		<dc:creator><![CDATA[Ravikanth C]]></dc:creator>
				<category><![CDATA[Articles]]></category>
		<category><![CDATA[DevOps]]></category>
		<category><![CDATA[Online Only]]></category>
		<category><![CDATA[DoingItRight]]></category>
		<category><![CDATA[PSDSC]]></category>
		<category><![CDATA[PSDSCDIR]]></category>

		<guid isPermaLink="false">http://www.powershellmagazine.com/?p=12622</guid>
				<description><![CDATA[In the last part of this series, we looked at why resource granularity is important. In this part, we will... ]]></description>
								<content:encoded><![CDATA[<p>In the last part of <a href="http://www.powershellmagazine.com/tag/psdscdir/">this series</a>, we looked at why <a href="http://wp.me/p1KBLb-3hX">resource granularity</a> is important. In this part, we will see the difference between configuration items and orchestration steps.</p>
<p>When using any configuration management platform or tool, the imperative resource modules that &#8220;make it so&#8221; are the most important. The declarative configuration documents combine the resource definitions, and the imperative scripts get called behind the scenes to do the job. Going by the resource granularity principle alone, it can be very tempting to write a resource for everything. But this is where we need to apply some filters. There are many such filters, but let&#8217;s start with the <em><strong>configuration vs orchestration filter</strong></em>, which is the subject of this article.</p>
<p>Take a look at this flowchart as an example.</p>
<p><a href="http://www.powershellmagazine.com/wp-content/uploads/2017/05/deployflow.png" rel="lightbox[12622]"><img class="aligncenter wp-image-12680" src="http://www.powershellmagazine.com/wp-content/uploads/2017/05/deployflow.png" alt="" width="125" height="699" /></a></p>
<p>The flowchart above describes deploying an OS through automated processes in my lab. It is very high-level and abstracted. Notice the step before the complete OS deploy: we need to set the bare metal system to perform a one-time PXE boot so that the WDS server kick-starts the WinPE boot and completes the required OS deployment steps. In a scenario where you have artifacts developed even for bare metal configuration, it is certainly possible to model this one-time PXE boot as a resource configuration as well. After all, it is just a BIOS attribute configuration.</p>
<p>I have put together an example for this.</p>
<p></p><pre class="crayon-plain-tag">$configData = 
@{
    AllNodes = 
    @(
        @{
            NodeName = 'localhost'
            PSDscAllowPlainTextPassword = $true
        }
    )
}

Configuration PEPXEBootDemo
{
    Import-DscResource -ModuleName DellPEWsManTools -Name PEOneTimeBoot
    Import-DscResource -ModuleName PSDesiredStateConfiguration

    Node $AllNodes.NodeName {
        PEOneTimeBoot PEPXEBootDemo
        {
            Id = 'Node17-UniqueBoot'
            DRACIPAddress = '172.16.100.17'
            DRACCredential = (Get-Credential)
            OneTimeBootMode = 'Enabled'
            OneTimeBootDevice = 'NIC.PxeDevice.1-1'
        }
    }
}

PEPXEBootDemo -ConfigurationData $configData</pre><p></p>
<p>The above configuration document uses the <em>PEOneTimeBoot</em> resource from the <em>DellPEWSManTools</em> resource module. The <em>PEOneTimeBoot</em> is a proxy DSC resource.</p>
<p>Here is what it does when we enact this configuration.</p>
<p><a href="http://www.powershellmagazine.com/wp-content/uploads/2017/05/PEBoot.png" rel="lightbox[12622]"><img class="aligncenter wp-image-12682" src="http://www.powershellmagazine.com/wp-content/uploads/2017/05/PEBoot-1024x306.png" alt="" width="615" height="184" srcset="https://www.powershellmagazine.com/wp-content/uploads/2017/05/PEBoot-1024x306.png 1024w, https://www.powershellmagazine.com/wp-content/uploads/2017/05/PEBoot-300x90.png 300w, https://www.powershellmagazine.com/wp-content/uploads/2017/05/PEBoot-768x230.png 768w" sizes="(max-width: 615px) 100vw, 615px" /></a></p>
<p>All is well and good. Once the bare metal system restarts, it boots into PXE and completes OS deployment. Now, here is the interesting part. If I use the <em>Test-DscConfiguration</em> cmdlet to check whether my node is in the desired state, here is what I see.</p>
<p><a href="http://www.powershellmagazine.com/wp-content/uploads/2017/05/PEBootDrift.png" rel="lightbox[12622]"><img class="aligncenter wp-image-12684" src="http://www.powershellmagazine.com/wp-content/uploads/2017/05/PEBootDrift.png" alt="" width="621" height="85" srcset="https://www.powershellmagazine.com/wp-content/uploads/2017/05/PEBootDrift.png 974w, https://www.powershellmagazine.com/wp-content/uploads/2017/05/PEBootDrift-300x41.png 300w, https://www.powershellmagazine.com/wp-content/uploads/2017/05/PEBootDrift-768x105.png 768w" sizes="(max-width: 621px) 100vw, 621px" /></a></p>
<p>There is a configuration drift. And, it will always be like this. This is because the <em>PEOneTimeBoot</em> resource is configuring a BIOS attribute that is short-lived: it gets reset or disabled after the configuration is applied. So, when we check the attribute after the enact is complete, <em>Test-DscConfiguration</em> will always find that this resource has drifted from the expected configuration. When we aggregate the resource desired state at the complete system level, this drift in PEOneTimeBoot rolls up as drift for the whole system. And this is what makes <em>PEOneTimeBoot</em> an incorrect choice for a DSC resource.</p>
<p>In practice, this is an orchestration step, not a configuration item. As part of the deployment process orchestration, we need a step that PXE boots the bare metal system so that OS deployment can complete, and then performs any other post-OS configuration tasks using DSC.</p>
<p>Therefore, when designing and developing a DSC resource module, apply this filter to check whether the resource configuration in question is short-lived. In other words, check whether the resource configuration is really an orchestration step rather than a configuration item.</p>
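<p>As a sketch of the alternative, the one-time PXE boot can live in an imperative orchestration script, with DSC reserved for the post-OS configuration. Note that <em>Invoke-OneTimePxeBoot</em> and <em>Wait-OsDeployment</em> below are hypothetical helpers, not part of any module:</p>
<p></p><pre class="crayon-plain-tag"># Orchestration script -- each step runs once, in order, and is not
# expected to be idempotent. The two helper functions are hypothetical.
Invoke-OneTimePxeBoot -DRACIPAddress '172.16.100.17' -Device 'NIC.PxeDevice.1-1'
Wait-OsDeployment -ComputerName 'Node17'

# Configuration (DSC) takes over only after the OS is deployed
Start-DscConfiguration -Path 'C:\Deploy\PostOSConfig' -Wait -Verbose</pre><p></p>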
<p>Stay tuned for more in this series!</p>
]]></content:encoded>
							<wfw:commentRss>https://www.powershellmagazine.com/2017/05/24/psdsc-doing-it-right-configuration-vs-orchestration/feed/</wfw:commentRss>
		<slash:comments>0</slash:comments>
							</item>
		<item>
		<title>#PSDSC Doing It Right &#8211; Resource Granularity</title>
		<link>https://www.powershellmagazine.com/2017/05/23/psdsc-doing-it-right-resource-granularity/</link>
				<comments>https://www.powershellmagazine.com/2017/05/23/psdsc-doing-it-right-resource-granularity/#respond</comments>
				<pubDate>Tue, 23 May 2017 16:00:02 +0000</pubDate>
		<dc:creator><![CDATA[Ravikanth C]]></dc:creator>
				<category><![CDATA[Articles]]></category>
		<category><![CDATA[Online Only]]></category>
		<category><![CDATA[DevOps]]></category>
		<category><![CDATA[DoingItRight]]></category>
		<category><![CDATA[PSDSC]]></category>
		<category><![CDATA[PSDSCDIR]]></category>

		<guid isPermaLink="false">http://www.powershellmagazine.com/?p=12645</guid>
				<description><![CDATA[This is the first article in the #PSDSC Doing It Right series. This series of articles will introduce you to... ]]></description>
								<content:encoded><![CDATA[<p>This is the first article in the <a href="http://www.powershellmagazine.com/tag/psdscdir/">#PSDSC Doing It Right series</a>. This series of articles will introduce you to some of the best practices we have learned through some real-world implementations of PowerShell DSC-based configuration management. Read on!</p>
<p>There are several custom DSC resource modules available either via the official <a href="https://github.com/PowerShell/DscResources/">PowerShell team&#8217;s DSC resource kit</a> or via other community repositories. There are guidelines on <a href="https://github.com/PowerShell/DscResources/blob/master/CONTRIBUTING.md">contributing to the official resource kit.</a> This includes a set of requirements for creating <a href="https://github.com/PowerShell/DscResources/blob/master/HighQualityModuleGuidelines.md">High Quality DSC Resource modules</a> and the <a href="https://github.com/PowerShell/DscResources/blob/master/StyleGuidelines.md">style guidelines</a> that must be followed when creating these High Quality Resource Modules.</p>
<p>These guidelines discuss how you should structure DSC resource modules, how and what type of tests you should include, how you must document your module, what coding style guidelines should be followed, and so on, but not how you should design your resource module or what a resource within the module should represent. This is not just about the logic in the resource module&#8217;s imperative scripts, but about what and how those imperative scripts configure. In other words, the granularity of resource configuration should be one of the design considerations when writing resource modules.</p>
<p>Let&#8217;s look at an example to understand this.</p>
<p>Here is the <a href="https://github.com/PowerShell/xHyper-V/tree/dev/DSCResources/MSFT_xVMHyperV">xVMHyperV</a> resource in <a href="https://github.com/PowerShell/xHyper-V">xHyper-V</a> module. This resource has a bunch of properties and supports creation of a VM and some of its related components such as network adapters.</p>
<p><a href="http://www.powershellmagazine.com/wp-content/uploads/2017/05/xVMHyperV.png" rel="lightbox[12645]"><img class="aligncenter wp-image-12654" src="http://www.powershellmagazine.com/wp-content/uploads/2017/05/xVMHyperV.png" alt="" width="589" height="476" srcset="https://www.powershellmagazine.com/wp-content/uploads/2017/05/xVMHyperV.png 732w, https://www.powershellmagazine.com/wp-content/uploads/2017/05/xVMHyperV-300x243.png 300w" sizes="(max-width: 589px) 100vw, 589px" /></a></p>
<p>If I look at the properties listed here, I see a few issues with the way this resource is designed.</p>
<ul>
<li>This resource takes an array of switch names (<em>SwitchName</em> property) and then attaches a network adapter for each switch that is specified in the resource instance configuration. This takes care of adding new adapters in the future through the same configuration. However, it fails when you want to remove an adapter. Even if you want to implement that logic in the same resource, it becomes complex.</li>
<li>While there is support for multiple network adapters, there is no VLAN ID configuration available in the resource. There is no way we can configure other network adapter settings such as bandwidth settings, DHCP Guard, and other supported configuration settings.</li>
<li>This module does a lot of heavy lifting when it comes to virtual hard drive and dynamic memory configurations. However, VHDX QoS configuration is not possible.</li>
</ul>
<p>While this resource takes care of certain VM settings, it excludes many of the other VM settings. Adding all this support within the same <em>xVMHyperV</em> resource module will only increase its complexity. Instead, separating out the configuration into multiple smaller resource modules would be a good design.</p>
<p>One of the things that I consider when starting out with developing a new resource module is to first understand the resource itself. Let us look at the VM example again. Here is how I would represent the VM and its related components in a hierarchical manner.</p>
<p><a href="http://www.powershellmagazine.com/wp-content/uploads/2017/05/DIR-ResourceGranularity.png" rel="lightbox[12645]"><img class="aligncenter wp-image-12655" src="http://www.powershellmagazine.com/wp-content/uploads/2017/05/DIR-ResourceGranularity-1024x220.png" alt="" width="632" height="136" srcset="https://www.powershellmagazine.com/wp-content/uploads/2017/05/DIR-ResourceGranularity-1024x220.png 1024w, https://www.powershellmagazine.com/wp-content/uploads/2017/05/DIR-ResourceGranularity-300x65.png 300w, https://www.powershellmagazine.com/wp-content/uploads/2017/05/DIR-ResourceGranularity-768x165.png 768w" sizes="(max-width: 632px) 100vw, 632px" /></a></p>
<p>Once I have this representation, I look at which items in the hierarchy are best suited to be resources on their own. For example, the processor in the VM context need not be a resource on its own. When we create a new VM, we generally assign the number of virtual CPUs and that is it. But the processor settings such as Resource Control, NUMA Topology, and so on can be packaged into a different resource so that the VM resource need not have all the logic to manage these special configuration settings. The same thought process applies to VM memory as well. I want to be able to manage the dynamic memory settings only when I need to, rather than packaging them into the VM resource configuration.</p>
<p>However, when it comes to network adapters, I want to add or remove them without really touching the VM resource configuration, and configure other settings for those adapters only when I need them. So, I would create separate resources for network adapters and their associated settings.</p>
<p>Simply put, the more complex your resource becomes, the more complex your tests need to be. Without proper tests for a complex resource, you end up creating something that is substandard. Period.</p>
<p>The granular resource design gives me flexibility while reducing the complexity. You may argue that the configuration documents tend to become very long and difficult to write with so many granular resources just to create a single VM. Yes, if you are doing this only one time, it shouldn&#8217;t matter. But, if you plan to reuse these configurations, composite resources solve this exact problem. I combine long and complex configurations into composite resources and use them very often. These composite resources become my <a href="http://www.powershellmagazine.com/2017/05/15/infrastructure-blueprints-a-way-to-share-psdsc-configurations/">infrastructure blueprints</a>.</p>
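<p>For example, a composite resource might wrap the granular resources into a single reusable blueprint. The following is only a sketch: the configuration name, parameters, and the exact resource properties shown are illustrative, not a published module.</p>
<p></p><pre class="crayon-plain-tag">Configuration VMBlueprint
{
    param (
        [Parameter(Mandatory)]
        [string] $VMName,

        [Parameter(Mandatory)]
        [string] $VhdPath,

        [string] $SwitchName = 'Management'
    )

    Import-DscResource -ModuleName xHyper-V

    # Granular resources composed into one reusable unit
    xVMHyperV $VMName
    {
        Name    = $VMName
        VhdPath = $VhdPath
    }

    xVMNetworkAdapter "$VMName-NIC"
    {
        Id         = "$VMName-NIC"
        Name       = "$VMName-NIC"
        VMName     = $VMName
        SwitchName = $SwitchName
        Ensure     = 'Present'
        DependsOn  = "[xVMHyperV]$VMName"
    }
}</pre><p></p>
<p>A configuration document can then declare one <em>VMBlueprint</em> instance per VM instead of repeating the underlying granular resources.</p>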
<p>Here is how my SimpleVM resource looks.</p>
<p><a href="http://www.powershellmagazine.com/wp-content/uploads/2017/05/SimpleVM.png" rel="lightbox[12645]"><img class="aligncenter wp-image-12671" src="http://www.powershellmagazine.com/wp-content/uploads/2017/05/SimpleVM.png" alt="" width="463" height="334" srcset="https://www.powershellmagazine.com/wp-content/uploads/2017/05/SimpleVM.png 536w, https://www.powershellmagazine.com/wp-content/uploads/2017/05/SimpleVM-300x217.png 300w" sizes="(max-width: 463px) 100vw, 463px" /></a></p>
<p><strong>Note</strong>: This is not yet in my public release of <a href="https://github.com/rchaganti/DSCResources/tree/master/cHyper-V">cHyper-V</a> module. If you want to give this a try, just give a shout and I will be able to invite you to try a few more things along with this.</p>
<p>This SimpleVM resource provides functionality that is good enough to create a VM that has a VHD attached and the required memory and CPU configuration. You can choose to leave the default network adapter, or remove it if you want to use xVMNetworkAdapter to add one or more adapters separately.</p>
<p>Using this method for resource design also helped me create resources that are used for different types of components. Consider my <a href="https://github.com/PowerShell/xNetworking/tree/dev/Modules/xNetworking/DSCResources/MSFT_xNetAdapterRDMA">xNetAdapterRDMA</a> resource for an example. This resource is used to enable or disable RDMA on a network adapter. This is a very simple resource. It just flips the RDMA configuration on a network adapter.</p>
<p><a href="http://www.powershellmagazine.com/wp-content/uploads/2017/05/xNetAdapterRDMA.png" rel="lightbox[12645]"><img class="aligncenter wp-image-12668" src="http://www.powershellmagazine.com/wp-content/uploads/2017/05/xNetAdapterRDMA.png" alt="" width="465" height="160" srcset="https://www.powershellmagazine.com/wp-content/uploads/2017/05/xNetAdapterRDMA.png 608w, https://www.powershellmagazine.com/wp-content/uploads/2017/05/xNetAdapterRDMA-300x103.png 300w" sizes="(max-width: 465px) 100vw, 465px" /></a></p>
<p>I have also written the <a href="https://github.com/PowerShell/xHyper-V/tree/dev/DSCResources/MSFT_xVMNetworkAdapter">xVMNetworkAdapter </a>resource which adds/removes VM network adapters in the management OS or to the virtual machines. So, when I have to add a VM network adapter that is RDMA capable, I could have implemented the logic to make it so in the xVMNetworkAdapter itself. But, that design, while making the resource complex, would prevent me from using the same functionality with physical adapters. So, if I were to configure physical adapter RDMA settings, I would have ended up writing a different resource or do it some other way. Instead, I chose to separate the RDMA configuration into its own module so that I can share that functionality across different types of network adapters.</p>
<p>To summarize, when designing a resource module, you should look at how and what the resource would do and structure your module in a way that allows flexible resource configurations.</p>
<p>Stay tuned for more in this series.</p>
]]></content:encoded>
							<wfw:commentRss>https://www.powershellmagazine.com/2017/05/23/psdsc-doing-it-right-resource-granularity/feed/</wfw:commentRss>
		<slash:comments>0</slash:comments>
							</item>
		<item>
		<title>Working with binary files in PowerShell</title>
		<link>https://www.powershellmagazine.com/2017/05/22/working-with-binary-files-in-powershell/</link>
				<comments>https://www.powershellmagazine.com/2017/05/22/working-with-binary-files-in-powershell/#comments</comments>
				<pubDate>Mon, 22 May 2017 16:00:45 +0000</pubDate>
		<dc:creator><![CDATA[Jan Egil Ring]]></dc:creator>
				<category><![CDATA[Articles]]></category>
		<category><![CDATA[binary]]></category>

		<guid isPermaLink="false">http://www.powershellmagazine.com/?p=12701</guid>
				<description><![CDATA[This article is co-authored by Jan Egil Ring and Øyvind Kallstad. In this article, we will look at how binary... ]]></description>
								<content:encoded><![CDATA[<p><em>This article is co-authored by </em><em>Jan Egil Ring</em><em> and Ø</em><em>yvind Kallstad</em><em>.</em></p>
<p>In this article, we will look at how binary files can be interpreted in PowerShell by looking at a real world challenge as an example.</p>
<p><strong>The challenge</strong></p>
<p>In System Center Data Protection Manager (DPM), an agent is installed on protected computers to manage backup and restore operations. If there are a lot of DPM servers in the environment and you do not know which DPM server is protecting an agent, you need a way to find this information on the local machine where the DPM agent is installed. You could look at all the DPM servers, but the agent might be inactive and left over from a decommissioned DPM server.</p>
<p>There aren&#8217;t any official references on this topic besides some guidance in a <a href="https://social.technet.microsoft.com/Forums/en-US/d2570d6c-3213-410f-85bc-7062cef607b4/dpm-2012-sp1-is-there-a-way-to-find-out-which-dpm-server-an-agent-is-currently-backup-up-to?forum=dataprotectionmanager">forum thread</a> on the TechNet Forums:</p>
<p><em>Open an administrative command prompt, then run:</em></p>
<p><em>     C:\&gt; type &#8220;C:\Program Files\Microsoft Data Protection Manager\DPM\ActiveOwner&#8221;</em></p>
<p><em>The beginning of the first line returned will be the FQDN in Unicode of the DPM Server owning the agent on the protected server/client</em></p>
<p>Let&#8217;s try:</p>
<p><img class="alignnone wp-image-12702" src="http://www.powershellmagazine.com/wp-content/uploads/2017/05/Working-with-binary-files-in-PowerShell_01.png" alt="" width="600" height="125" srcset="https://www.powershellmagazine.com/wp-content/uploads/2017/05/Working-with-binary-files-in-PowerShell_01.png 640w, https://www.powershellmagazine.com/wp-content/uploads/2017/05/Working-with-binary-files-in-PowerShell_01-300x62.png 300w" sizes="(max-width: 600px) 100vw, 600px" /></p>
<p>This gives us the information we want, but not in a very convenient format. Optimally, we would like to gather this information remotely via PowerShell remoting and get an object back with information about the DPM agent. This could be a function containing version information in addition to the name of the DPM server(s) an agent is attached to.</p>
<p>In PowerShell we can use Get-Content (or its alias type) to get the same information:</p>
<p><img class="alignnone wp-image-12703" src="http://www.powershellmagazine.com/wp-content/uploads/2017/05/Working-with-binary-files-in-PowerShell_02.png" alt="" width="600" height="68" srcset="https://www.powershellmagazine.com/wp-content/uploads/2017/05/Working-with-binary-files-in-PowerShell_02.png 842w, https://www.powershellmagazine.com/wp-content/uploads/2017/05/Working-with-binary-files-in-PowerShell_02-300x34.png 300w, https://www.powershellmagazine.com/wp-content/uploads/2017/05/Working-with-binary-files-in-PowerShell_02-768x88.png 768w" sizes="(max-width: 600px) 100vw, 600px" /></p>
<p>At this point, I was thinking that regular expressions might be an appropriate way to solve this challenge. I presented the challenge to my colleague Øyvind, who had experience working with binary files like this in PowerShell.</p>
<p><strong>Working with binary files</strong></p>
<p>Usually you would want some kind of documentation of the file format in question before trying to parse a binary file. Unfortunately, we couldn&#8217;t find any for this file type. However, reading the file contents raw, it seems that the information we want is right at the beginning of the file. If you look at the raw representation of the string we want to extract, you see that each character appears to have a space between it and the next. This tells us that it&#8217;s a Unicode-encoded string.</p>
<p>It&#8217;s useful to use a dedicated hex editor when working with binary file types. In the following screenshot, I&#8217;m using the 010 Editor, and as you can see, the editor has confirmed our suspicion that the text is in Unicode format.</p>
<p><img class="alignnone wp-image-12704" src="http://www.powershellmagazine.com/wp-content/uploads/2017/05/Working-with-binary-files-in-PowerShell_03.png" alt="" width="600" height="337" srcset="https://www.powershellmagazine.com/wp-content/uploads/2017/05/Working-with-binary-files-in-PowerShell_03.png 1583w, https://www.powershellmagazine.com/wp-content/uploads/2017/05/Working-with-binary-files-in-PowerShell_03-300x168.png 300w, https://www.powershellmagazine.com/wp-content/uploads/2017/05/Working-with-binary-files-in-PowerShell_03-768x431.png 768w, https://www.powershellmagazine.com/wp-content/uploads/2017/05/Working-with-binary-files-in-PowerShell_03-1024x574.png 1024w" sizes="(max-width: 600px) 100vw, 600px" /></p>
<p>What we don&#8217;t know is the length of this field, so we must do some guesswork. Since this information refers to a hostname or a domain name, we can assume it&#8217;s not going to contain any spaces. We also hope that there will be at least one space between this field and the next one. Building on these assumptions, we can create a do-while loop that keeps reading bytes until we encounter a pair of bytes that, when converted to a Unicode string, equals an empty string. The resulting string should be the data that we are after.</p>
<p>Note that since we are reading Unicode-encoded strings, we need to read 2 bytes in each pass.</p>
<p>The way to read data from a binary file is to set up a BinaryReader object. This class has the ReadBytes method that we will use to read bytes from the binary stream.</p>
<p>We also need some way of converting the binary data to something meaningful. Since we already have identified the data as Unicode string, we can use the Unicode.GetString static method in the System.Text.Encoding class in .NET for this.</p>
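<p>To make the approach concrete, here is a minimal, self-contained sketch of that loop. The sample bytes are fabricated here so the snippet can run on its own; the real script opens the file with [System.IO.File]::OpenRead and wraps the stream in a BinaryReader in the same way.</p>
<p></p><pre class="crayon-plain-tag"># Fabricated sample data: a UTF-16 LE string terminated by a space,
# standing in for the beginning of the real binary file
$bytes  = [System.Text.Encoding]::Unicode.GetBytes('SERVER01 trailing-data')
$stream = New-Object System.IO.MemoryStream(,$bytes)
$reader = New-Object System.IO.BinaryReader($stream)

$result = ''
do {
    $chunk = $reader.ReadBytes(2)   # 2 bytes = one UTF-16 code unit
    $char  = [System.Text.Encoding]::Unicode.GetString($chunk)
    if ($char.Trim() -ne '') { $result += $char }
} while ($chunk.Length -eq 2 -and $char.Trim() -ne '')

$reader.Close()
$result   # -&gt; SERVER01</pre><p></p>
<p>The loop stops either at the first whitespace character or when the stream runs out of full 2-byte chunks, whichever comes first.</p>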
<p>That&#8217;s all we really need for this particular case. You can find the full code at <a href="https://gist.github.com/janegilring/afec8213d3d14e4f436d0f9d88804f74">https://gist.github.com/janegilring/afec8213d3d14e4f436d0f9d88804f74</a></p>
<p>If you are interested in learning more about how to parse data from binary file formats, Øyvind did a talk about this at PowerShell Conference Europe 2017. The video recording of the talk is available <a href="https://www.youtube.com/watch?v=3ilvoOZNDLE">here</a>.</p>
]]></content:encoded>
							<wfw:commentRss>https://www.powershellmagazine.com/2017/05/22/working-with-binary-files-in-powershell/feed/</wfw:commentRss>
		<slash:comments>1</slash:comments>
							</item>
		<item>
		<title>Infrastructure Blueprints &#8211; An Easier and Better Way to Share #PSDSC Configurations</title>
		<link>https://www.powershellmagazine.com/2017/05/15/infrastructure-blueprints-a-way-to-share-psdsc-configurations/</link>
				<comments>https://www.powershellmagazine.com/2017/05/15/infrastructure-blueprints-a-way-to-share-psdsc-configurations/#respond</comments>
				<pubDate>Mon, 15 May 2017 16:00:20 +0000</pubDate>
		<dc:creator><![CDATA[Ravikanth C]]></dc:creator>
				<category><![CDATA[Articles]]></category>
		<category><![CDATA[DevOps]]></category>
		<category><![CDATA[Online Only]]></category>
		<category><![CDATA[DSC]]></category>
		<category><![CDATA[IaC]]></category>
		<category><![CDATA[PSDSC]]></category>

		<guid isPermaLink="false">http://www.powershellmagazine.com/?p=12627</guid>
				<description><![CDATA[A while ago, I wrote an article to introduce Infrastructure Blueprints. Today&#8217;s post is about a more refined version of... ]]></description>
								<content:encoded><![CDATA[<p>A while ago, I wrote an article to <a href="http://www.powershellmagazine.com/2016/05/13/devops-infrastructure-as-code-and-powershell-dsc-infrastructure-blueprints/">introduce Infrastructure Blueprints</a>. Today&#8217;s post is about a more refined version of that project. These infrastructure blueprints are the starting point for a bigger discussion on dynamic infrastructure. We will discuss that later. Read on!</p>
<p>Within Infrastructure as Code (IaC) practices, there is a strong emphasis on repeatable and reusable automation enabled by configuration management tools. This is referred to as Configuration as Code. Configuration as Code enables consistent methods to configure IT systems. And, when we integrate these processes into DevOps practices, we can ensure that configuration across the different stages of deployment (Development / Test / Staging / Production) is done in an efficient and consistent manner.</p>
<p>One of the best practices in Configuration as Code is to ensure that the configurations we deploy are reusable. This means that the configurations are parameterized and have the environmental data separated from the structural configuration data.</p>
<p>PowerShell Desired State Configuration (DSC) supports the separation of environmental configuration from structural configuration using the <a href="https://msdn.microsoft.com/en-us/powershell/dsc/configdata">configuration data</a> artifacts. And, sharing of parameterized configurations is done using composite configuration modules, or what we call <a href="https://msdn.microsoft.com/en-us/powershell/dsc/authoringresourcecomposite">composite resources</a>. Composite configurations are very useful when a node configuration requires a combination of multiple resources and becomes long and complex.</p>
<p>For example, building a Hyper-V cluster includes configuring host networking, domain join, updating firewall rules, creating/joining a cluster, and so on. Each node in the cluster should have this configuration done in a consistent manner. Therefore, the configurations that are applied for each of these items can be made reusable using the composite configuration methods.</p>
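<p>To illustrate, a composite resource is simply a parameterized configuration saved as a .schema.psm1 file inside a resource module. The following is a minimal, hypothetical sketch; the configuration and parameter names are illustrative and not taken from the actual InfraBlueprints repository.</p>
<p></p><pre class="crayon-plain-tag"># Hypothetical composite resource: assigns a static IPv4 address to a NIC.
# Saved as HostNetwork.schema.psm1 inside the composite module's DSCResources folder.
Configuration HostNetwork
{
    param (
        [Parameter(Mandatory)]
        [String] $InterfaceAlias,

        [Parameter(Mandatory)]
        [String] $IPAddress
    )

    Import-DscResource -ModuleName xNetworking

    # The reusable building block wrapped by this composite resource
    xIPAddress StaticIP
    {
        InterfaceAlias = $InterfaceAlias
        IPAddress      = $IPAddress
        AddressFamily  = 'IPv4'
    }
}</pre><p></p>
<p>A node configuration can then consume HostNetwork with just the two parameters, instead of repeating the underlying resource declarations for every node.</p>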
<p>A real-world deployment pipeline implemented using IaC practices should also validate the infrastructure configuration at various stages of the deployment. In the PowerShell world, this is done using <a href="https://github.com/pester/pester">Pester</a>. Within DSC too, Pester plays a very important role in validating the desired state after a configuration is applied, and in the operations validation space.</p>
<h1><a id="user-content-infrastructure-blueprints" class="anchor" href="https://github.com/rchaganti/InfraBlueprints#infrastructure-blueprints"></a>Infrastructure Blueprints</h1>
<p>The <a href="https://github.com/rchaganti/InfraBlueprints">infrastructure blueprints</a> project provides guidelines on enabling reusable and repeatable DSC configurations combined with Pester validations that are identified by <a href="https://github.com/PowerShell/Operation-Validation-Framework">Operations Validation Framework</a>. As a part of this repository, there will be a set of composite resource modules for various common configurations that you will see in a typical IT infrastructure.</p>
<p>Infrastructure Blueprints are essentially composite resource packages that contain node configurations and Pester tests that validate desired state and integration after the configuration is applied.</p>
<p>This repository contains multiple composite configuration modules. Each module contains multiple composite resources with ready to use examples and tests that validate the configuration.</p>
<p>The following folder structure shows how these composite modules are packaged.</p>
<p><a href="http://www.powershellmagazine.com/wp-content/uploads/2017/05/module_structure.png" rel="lightbox[12627]"><img class="aligncenter wp-image-12632" src="http://www.powershellmagazine.com/wp-content/uploads/2017/05/module_structure-271x300.png" alt="" width="500" height="554" srcset="https://www.powershellmagazine.com/wp-content/uploads/2017/05/module_structure-271x300.png 271w, https://www.powershellmagazine.com/wp-content/uploads/2017/05/module_structure.png 718w" sizes="(max-width: 500px) 100vw, 500px" /></a></p>
<ul>
<li><em>Diagnostics</em> folder contains the Simple and Comprehensive tests for performing operations validation.
<ul>
<li><strong>Simple</strong>: A set of tests that validate the functionality of infrastructure at the desired state.</li>
<li><strong>Comprehensive</strong>: A set of tests that perform a comprehensive operations validation of the infrastructure at the desired state.</li>
<li>For ease of identification, the test script names include <em>Simple</em> or <em>Comprehensive</em> within the file name.</li>
</ul>
</li>
</ul>
<p>The Operations Validation Framework can be used to retrieve the list of tests in this module and invoke the relevant ones.</p>
<p><a href="http://www.powershellmagazine.com/wp-content/uploads/2017/05/OVF.png" rel="lightbox[12627]"><img class="aligncenter wp-image-12633" src="http://www.powershellmagazine.com/wp-content/uploads/2017/05/OVF-300x96.png" alt="" width="600" height="191" srcset="https://www.powershellmagazine.com/wp-content/uploads/2017/05/OVF-300x96.png 300w, https://www.powershellmagazine.com/wp-content/uploads/2017/05/OVF-768x245.png 768w, https://www.powershellmagazine.com/wp-content/uploads/2017/05/OVF-1024x326.png 1024w, https://www.powershellmagazine.com/wp-content/uploads/2017/05/OVF.png 1092w" sizes="(max-width: 600px) 100vw, 600px" /></a></p>
<p>Once you know the composite resource that is applied on the system, you can invoke either simple or comprehensive tests using the <em>Invoke-OperationValidation</em> cmdlet.</p>
<p><a href="http://www.powershellmagazine.com/wp-content/uploads/2017/05/ovf-run.png" rel="lightbox[12627]"><img class="aligncenter wp-image-12634" src="http://www.powershellmagazine.com/wp-content/uploads/2017/05/ovf-run-300x130.png" alt="" width="600" height="259" srcset="https://www.powershellmagazine.com/wp-content/uploads/2017/05/ovf-run-300x130.png 300w, https://www.powershellmagazine.com/wp-content/uploads/2017/05/ovf-run-768x332.png 768w, https://www.powershellmagazine.com/wp-content/uploads/2017/05/ovf-run-1024x443.png 1024w, https://www.powershellmagazine.com/wp-content/uploads/2017/05/ovf-run.png 1099w" sizes="(max-width: 600px) 100vw, 600px" /></a></p>
<ul>
<li><em>Examples</em> folder contains sample configuration data for each composite configuration and also a configuration document that demonstrates how to use the composite resource.</li>
<li><em>CompositeModuleName</em>.psd1 is the module manifest for the composite configuration module.</li>
</ul>
<p>This manifest contains the <em>RequiredModules</em> key that has all the required modules for the composite configuration to work. This is listed as a module specification object. For example, the <em>RequiredModules</em> key for Hyper-VConfigurations composite module contains the following hashtable.</p>
<p></p><pre class="crayon-plain-tag"># Modules that must be imported into the global environment prior to importing this module
RequiredModules = @(
    @{ModuleName='cHyper-v';ModuleVersion='3.0.0.0'},
    @{ModuleName='xNetworking';ModuleVersion='2.12.0.0'}
)</pre><p></p>
<p>These composite modules are available in the PowerShell Gallery as well. Therefore, having <em>RequiredModules</em> in the module manifest enables automatic download of all module dependencies.</p>
<p><a href="http://www.powershellmagazine.com/wp-content/uploads/2017/05/findmodule.png" rel="lightbox[12627]"><img class="aligncenter wp-image-12635" src="http://www.powershellmagazine.com/wp-content/uploads/2017/05/findmodule-300x49.png" alt="" width="600" height="98" srcset="https://www.powershellmagazine.com/wp-content/uploads/2017/05/findmodule-300x49.png 300w, https://www.powershellmagazine.com/wp-content/uploads/2017/05/findmodule-768x126.png 768w, https://www.powershellmagazine.com/wp-content/uploads/2017/05/findmodule.png 921w" sizes="(max-width: 600px) 100vw, 600px" /></a></p>
<p></p><pre class="crayon-plain-tag">PS C:\&gt; Install-Module -Name Hyper-VConfigurations -Force -Verbose
VERBOSE: Using the provider 'PowerShellGet' for searching packages.
VERBOSE: The -Repository parameter was not specified.  PowerShellGet will use all of the registered repositories.
VERBOSE: Getting the provider object for the PackageManagement Provider 'NuGet'.
VERBOSE: The specified Location is 'https://www.powershellgallery.com/api/v2/' and PackageManagementProvider is 'NuGet'.
VERBOSE: Searching repository 'https://www.powershellgallery.com/api/v2/FindPackagesById()?id='Hyper-VConfigurations'' for ''.
VERBOSE: Total package yield:'1' for the specified package 'Hyper-VConfigurations'.
VERBOSE: Performing the operation "Install-Module" on target "Version '1.0.0.0' of module 'Hyper-VConfigurations'".
VERBOSE: The installation scope is specified to be 'AllUsers'.
VERBOSE: The specified module will be installed in 'C:\Program Files\WindowsPowerShell\Modules'.
VERBOSE: The specified Location is 'NuGet' and PackageManagementProvider is 'NuGet'.
VERBOSE: Downloading module 'Hyper-VConfigurations' with version '1.0.0.0' from the repository 'https://www.powershellgallery.com/api/v2/'.
VERBOSE: Searching repository 'https://www.powershellgallery.com/api/v2/FindPackagesById()?id='Hyper-VConfigurations'' for ''.
VERBOSE: Searching repository 'https://www.powershellgallery.com/api/v2/FindPackagesById()?id='cHyper-v'' for ''.
VERBOSE: Searching repository 'https://www.powershellgallery.com/api/v2/FindPackagesById()?id='xNetworking'' for ''.
VERBOSE: InstallPackage' - name='cHyper-V', version='3.0.0.0',destination='C:\Users\ravikanth_chaganti\AppData\Local\Temp\1037779645'
VERBOSE: DownloadPackage' - name='cHyper-V', version='3.0.0.0',destination='C:\Users\ravikanth_chaganti\AppData\Local\Temp\1037779645\cHyper-V\cHyper-V.nupkg', uri='https://www.powershe
llgallery.com/api/v2/package/cHyper-V/3.0.0'
VERBOSE: Downloading 'https://www.powershellgallery.com/api/v2/package/cHyper-V/3.0.0'.
VERBOSE: Completed downloading 'https://www.powershellgallery.com/api/v2/package/cHyper-V/3.0.0'.
VERBOSE: Completed downloading 'cHyper-V'.
VERBOSE: InstallPackageLocal' - name='cHyper-V', version='3.0.0.0',destination='C:\Users\ravikanth_chaganti\AppData\Local\Temp\1037779645'
VERBOSE: InstallPackage' - name='xNetworking', version='3.2.0.0',destination='C:\Users\ravikanth_chaganti\AppData\Local\Temp\1037779645'
VERBOSE: DownloadPackage' - name='xNetworking', version='3.2.0.0',destination='C:\Users\ravikanth_chaganti\AppData\Local\Temp\1037779645\xNetworking\xNetworking.nupkg', uri='https://www
.powershellgallery.com/api/v2/package/xNetworking/3.2.0'
VERBOSE: Downloading 'https://www.powershellgallery.com/api/v2/package/xNetworking/3.2.0'.
VERBOSE: Completed downloading 'https://www.powershellgallery.com/api/v2/package/xNetworking/3.2.0'.
VERBOSE: Completed downloading 'xNetworking'.
VERBOSE: InstallPackageLocal' - name='xNetworking', version='3.2.0.0',destination='C:\Users\ravikanth_chaganti\AppData\Local\Temp\1037779645'
VERBOSE: InstallPackage' - name='Hyper-VConfigurations', version='1.0.0.0',destination='C:\Users\ravikanth_chaganti\AppData\Local\Temp\1037779645'
VERBOSE: DownloadPackage' - name='Hyper-VConfigurations', version='1.0.0.0',destination='C:\Users\ravikanth_chaganti\AppData\Local\Temp\1037779645\Hyper-VConfigurations\Hyper-VConfigura
tions.nupkg', uri='https://www.powershellgallery.com/api/v2/package/Hyper-VConfigurations/1.0.0'
VERBOSE: Downloading 'https://www.powershellgallery.com/api/v2/package/Hyper-VConfigurations/1.0.0'.
VERBOSE: Completed downloading 'https://www.powershellgallery.com/api/v2/package/Hyper-VConfigurations/1.0.0'.
VERBOSE: Completed downloading 'Hyper-VConfigurations'.
VERBOSE: InstallPackageLocal' - name='Hyper-VConfigurations', version='1.0.0.0',destination='C:\Users\ravikanth_chaganti\AppData\Local\Temp\1037779645'
VERBOSE: Installing the dependency module 'cHyper-V' with version '3.0.0.0' for the module 'Hyper-VConfigurations'.
VERBOSE: Module 'cHyper-V' was installed successfully.
VERBOSE: Installing the dependency module 'xNetworking' with version '3.2.0.0' for the module 'Hyper-VConfigurations'.
VERBOSE: Module 'xNetworking' was installed successfully.
VERBOSE: Module 'Hyper-VConfigurations' was installed successfully.</pre><p></p>
<p>As you can see in the above <em>Install-Module</em> cmdlet output, the required modules are downloaded from the gallery. Thanks to <a href="https://blog.netnerds.net/2017/05/powershell-gallery-metapackages/">Chrissy</a> for this tip.</p>
<p>You can contribute to this project by submitting a pull request. All you need is a set of composite resource modules packaged as a PowerShell module with DSC resources. Of course, ensure you add clear examples and tests.</p>
]]></content:encoded>
							<wfw:commentRss>https://www.powershellmagazine.com/2017/05/15/infrastructure-blueprints-a-way-to-share-psdsc-configurations/feed/</wfw:commentRss>
		<slash:comments>0</slash:comments>
							</item>
		<item>
		<title>PSRemotely – Framework to Enable Remote Operations Validation</title>
		<link>https://www.powershellmagazine.com/2017/04/07/psremotely-framework-to-enable-remote-operations-validation/</link>
				<comments>https://www.powershellmagazine.com/2017/04/07/psremotely-framework-to-enable-remote-operations-validation/#comments</comments>
				<pubDate>Fri, 07 Apr 2017 16:00:40 +0000</pubDate>
		<dc:creator><![CDATA[Deepak Dhami]]></dc:creator>
				<category><![CDATA[Articles]]></category>
		<category><![CDATA[DevOps]]></category>
		<category><![CDATA[Online Only]]></category>
		<category><![CDATA[Pester]]></category>
		<category><![CDATA[PSRemotely]]></category>

		<guid isPermaLink="false">http://www.powershellmagazine.com/?p=12611</guid>
				<description><![CDATA[Before we get started with what is PSRemotely, here is some background. As part of my work in an engineering... ]]></description>
								<content:encoded><![CDATA[<p>Before we get started with what is PSRemotely, here is some background.</p>
<p>As part of my work in an engineering team, I am tasked with writing scripts which will validate the <strong><em>underlying infrastructure</em></strong> before the automation (using PowerShell DSC) kicks in to deploy the solution.</p>
<p>Below are the different generic phases which comprise the whole automation process:</p>
<ul>
<li><strong>Pre-deployment</strong> – getting the base infrastructure ready, the bare minimum required for automation. For example, the network configuration on the nodes.</li>
<li><strong>Deployment</strong> – deployment of the solution leveraging PowerShell DSC.</li>
<li><strong>Post-deployment</strong> – scripts/runbooks configuring or tweaking the environment.</li>
</ul>
<p>What I meant by validating the <strong><em>underlying infrastructure</em></strong> above is that the compute and storage physical hosts/nodes have a valid IP configuration, connectivity to the AD/DNS infrastructure, and so on: the <strong>key</strong> components that we needed to test and validate to gain confidence in our readiness to deploy the engineered solution on top of it.</p>
<p><strong>Note</strong> – Our solution had scripts in place that would configure the network based on some input parameters and record this in a manifest XML file. After the script ran, we would assume that everything was in place. These assumptions at times cost us a lot of effort in troubleshooting.</p>
<p>In short, the initial idea was to have scripts validating what the scripts did in an earlier step. So it began: I started writing PowerShell functions, using workflows (to get parallel execution on the nodes). This was a decent solution until there were requests to add validation tests for virtually everything in the solution stack, e.g. DNS configuration, network connectivity, proxy configuration, the disks (SSD/HDD) attached to the storage nodes, and so on.</p>
<p>Phew! It was a nightmare maintaining it.</p>
<p>Rays of hope: <a href="https://github.com/pester/pester">Pester</a>, <a href="https://github.com/Ticketmaster/poshspec">PoshSpec</a>, and <a href="https://github.com/PowerShell/Remotely">Remotely</a>!</p>
<p>We started looking at how some of the open source PowerShell modules could help us perform operations validation. At this time, Pester was gaining traction in the community for operations validation.</p>
<h2>Using Pester</h2>
<p>We moved away from using standalone scripts for the operations validation and started converting our scripts into Pester tests. It is not surprising to see that many operations people find it easier to relate to using Pester for Ops validation, since we have been doing this validation for ages manually. Pester just makes it easy to automate all of it.</p>
<p>For example, in our solution each compute node gets three NIC cards, pre-deployment script configures them. If we had to test whether the network adapter’s configuration was indeed correct, it would look something like below using Pester:</p>
<p></p><pre class="crayon-plain-tag">Describe "TestIPConfiguration" {
    It "Should have a valid IP address on the Management NIC" {
        (Get-NetIPAddress -AddressFamily IPv4 -InterfaceAlias 'vEthernet(Management)' | Select-Object -ExpandProperty IPAddress) |
            Should be '10.10.10.1' 
    }

    It "Should have a valid IP address on the Storage1 NIC" {
       (Get-NetIPAddress -AddressFamily IPv4 -InterfaceAlias 'vEthernet(Storage1)' | Select-Object -ExpandProperty IPAddress) |
            Should be '10.20.10.1'  
    }

    It "Should have a valid IP address on the Storage2 NIC" {
        (Get-NetIPAddress -AddressFamily IPv4 -InterfaceAlias 'vEthernet(Storage1)' | Select-Object -ExpandProperty IPAddress) |
            Should be '10.30.10.1'  
}</pre><p></p>
<h2>Using PoshSpec &amp; Pester</h2>
<p>PoshSpec added another layer of abstraction to our infrastructure tests in the form of its own DSL.<br />
Below is how our tests started looking with the use of Pester and PoshSpec.</p>
<p><strong>Note</strong> – For validation of the IPv4 address, another keyword named IPv4Address was added to PoshSpec, which would essentially call Get-NetIPAddress and return the IPv4 address assigned to the NIC interface with the specified alias.</p>
<p></p><pre class="crayon-plain-tag">Describe "TestIPConfiguration" {
    Context "Validate the Management NIC " {
        IPv4Address 'vEthernet(Management)' {Should be '10.10.10.1'} 
    }

   Context "Validate the Storage1 NIC" {
        IPv4Address 'vEthernet(Storage1)' {Should be '10.20.10.1'} 
    }

    Context "Validate the Storage2 NIC" {
        IPv4Address 'vEthernet(Storage2)' {Should be '10.30.10.1'}
    }
}</pre><p></p>
<p>Using Pester and PoshSpec to write the tests certainly made maintaining them easier, but we still had a problem at hand: how do we target the above tests at all the nodes in the solution?</p>
<p><a href="https://github.com/PowerShell/Remotely">Remotely</a>??</p>
<p>At some point <a href="http://www.powershellmagazine.com/author/ravikanth/">Ravi</a> was <a href="https://github.com/rchaganti/Remotely">tinkering</a> with this particular PowerShell module and suggested to take a look at it. It was promising to begin with as he added support for passing Credential hash to Remotely. We would have to specify a hash table with the computer name as key and credential as value to Remotely and it would take care of connecting to those nodes, executing the script block in the remote runspace. At this point things started falling in place for what we had in mind. Our tests started looking nice and concise:</p>
<p></p><pre class="crayon-plain-tag">$CredHash = @{
    'ComputeNode1' = Get-Credential
    'ComputeNode2' = Get-Credential
}
Describe "TestIPConfiguration" {
    Context "Validate the Management NIC " {
       Remotely ComputeNode1, ComputeNode2 {IPv4Address 'vEthernet(Management)' {Should be '10.10.10.1'}} 
    }

   Context "Validate the Storage1 NIC" {
        Remotely ComputeNode1, ComputeNode2 {IPv4Address 'vEthernet(Storage1)' {Should be '10.20.10.1'}} 
    }

    Context "Validate the Storage2 NIC" {
        Remotely ComputeNode1, ComputeNode2 {IPv4Address 'vEthernet(Storage2)' {Should be '10.30.10.1'}} 
    }
}</pre><p></p>
<p>Soon we realized that the assertions above, e.g. {Should Be ’10.10.10.1’}, had to be created dynamically by reading the manifest XML file that drives the whole deployment, since it contains the expected configuration of the remote nodes.</p>
<p>We wanted our tests to be generic so that we could target them at all the nodes that are part of the solution. We were looking to organize our tests as shown below, where node-specific details, e.g. $ManagementIPv4Address, would be read from the manifest file and created on the fly, either on the local machine or on the remote node:</p>
<p></p><pre class="crayon-plain-tag">$CredHash = @{
    'ComputeNode1' = Get-Credential
    'ComputeNode2' = Get-Credential
}
Describe "TestIPConfiguration" {
    Context "Validate the Management NIC " {
       Remotely ComputeNode1, ComputeNode2 {IPv4Address 'vEthernet(Management)' {Should be  $ManagementIPv4Address}} 
    }

   Context "Validate the Storage1 NIC" {
        Remotely ComputeNode1, ComputeNode2 {IPv4Address 'vEthernet(Storage1)' {Should be $Storage1IPv4Address}} 
    }

    Context "Validate the  Storage2 NIC" {
        Remotely ComputeNode1, ComputeNode2 {IPv4Address 'vEthernet(Storage2)' {Should be $Storage2IPv4Address}}  
    }
}</pre><p></p>
<p>The above syntax looks quite descriptive and decouples the validation tests and environment details too.</p>
<p>But there were some downsides to the above approach.</p>
<ul>
<li>It required rewriting our existing tests to accommodate the Remotely keyword for executing the script block remotely while running the assertions locally.</li>
<li>Remotely connects to all the nodes each time it runs a PoshSpec-based ops validation test, which results in a lot of overhead when running a large number of validation tests.</li>
<li>It was troublesome to pass environment-specific data to the remote nodes, e.g. passing the expected IPv4 address in the above tests.</li>
<li>To run Pester/PoshSpec tests on the remote nodes, these modules need to be present on the remote node to begin with.</li>
</ul>
<p>The existing Remotely framework was meant to execute a script block against a remote runspace, but it was not specifically built to perform operations validation remotely.</p>
<h2>Enter PSRemotely</h2>
<p>After trying to integrate Remotely with Pester/PoshSpec-based tests, we had a general idea of what we needed from a framework/DSL if it was to give us the capability of orchestrating operations validation remotely on the nodes. Below are some of the features we had in mind, along with the arguments for implementing them:</p>
<ul>
<li>Target Pester/PoshSpec based operations validation tests on the remote nodes.</li>
<li>Allow specifying environment data separately from the tests, so that the same tests can be applied across nodes.<br />
<em><em>We decided on the ability to use DSC-style configuration data here for specifying node-specific environment details.</em></em></li>
<li>Easier debugging on the remote nodes, in case tests fail.<br />
<em><em>If something failed on the remote node during validation, we should be able to connect to the underlying PowerShell remoting session and debug issues.</em></em></li>
<li>Allow re-running specific tests on the remote nodes.<br />
<em><em>When a test fails, performing a quick remediation action and validating that the specific test now passes is a good feature to have when you have a lot of tests in your suite.</em></em></li>
<li>Self-contained solution.<br />
<em><em>Have the framework bootstrap the remote nodes with the required module versions (Pester &amp; PoshSpec) under the hood. Remote nodes might not have internet connectivity.</em></em></li>
<li>Allow copying required artifacts to remote nodes.<br />
<em><em>For our solution, we require a manifest file with details about the deployment to be copied on each node.</em></em></li>
<li>Use PowerShell remoting as underlying transport mechanism for everything.</li>
<li>Return bare-minimum JSON output if everything passes. If a test fails, return the error record thrown by Pester.</li>
</ul>
<p>And, <a href="https://github.com/DexterPOSH/PSRemotely">PSRemotely</a> was born!</p>
<p>So, after a lot of discussions with Ravi, I finally had a clear picture of how the remote operations validation DSL should look:</p>
<p></p><pre class="crayon-plain-tag">$CredentialHash = @{
    'ComputeNode1' = Get-Credential
    #'ComputeNode2' = Import-CliXML # If a node is missed here, current user creds are used with PSRemoting
}

# Configuration data, can be generated separately or specified from a .psd1 or .json file
$ConfigData = @{
    AllNodes = @(
        @{
            NodeName = '*' 
            DomainFQDN = 'dexter.lab'
        },
        @{
            NodeName = "ComputeNode1"
            ManagementIPv4Address = '10.10.10.1'
            Storage1IPv4Address = '10.20.10.1'
            Storage2IPv4Address = '10.30.10.1'
            Type = 'Compute'
        },
        @{
            NodeName = "ComputeNode2"
            ManagementIPv4Address = '10.10.10.2'
            Storage1IPv4Address = '10.20.10.2'
            Storage2IPv4Address = '10.30.10.2'
            Type = 'Compute'
        }
    )
}

# PSRemotely tests, specify CredentialHash and Configuration data to the PSRemotely
PSRemotely -ConfigurationData $ConfigData -CredentialHash $CredentialHash {
    Node $AllNodes.Where({$_.Type -eq 'Compute'}).NodeName {
        
        # Below are the existing Pester/PoshSpec-based tests, with changes on how node specific data is supplied
        Describe 'TestIPConfiguration' -Tag IP {
            
            Context "Validate the Management NIC" {
                    # pre-deployment script always creates NICs with these names
                    IPv4Address 'vEthernet(Management)' {Should be  $Node.ManagementIPv4Address}
                }

            Context "Validate the Storage1 NIC" {
                    IPv4Address 'vEthernet(Storage1)' {Should be $Node.Storage1IPv4Address} 
                }

            Context "Validate the Storage2 NIC" {
                IPv4Address 'vEthernet(Storage2)' {Should be $Node.Storage2IPv4Address}
            }
        }       
    }
}</pre><p></p>
<p>After getting a clear idea of the features required in the framework and how we wanted the DSL to look, I started working on it. This post has set up the context for why we began working on something entirely new from scratch.</p>
<p>Join me in the second post where I try to explain how to use <a href="https://github.com/DexterPOSH/PSRemotely">PSRemotely</a> to target remote nodes for operations validation.</p>
]]></content:encoded>
							<wfw:commentRss>https://www.powershellmagazine.com/2017/04/07/psremotely-framework-to-enable-remote-operations-validation/feed/</wfw:commentRss>
		<slash:comments>1</slash:comments>
							</item>
		<item>
		<title>PowerShell Conference EU 2017 &#8211; Speakers and Sessions</title>
		<link>https://www.powershellmagazine.com/2017/02/02/powershell-conference-eu-2017-speakers-and-sessions/</link>
				<comments>https://www.powershellmagazine.com/2017/02/02/powershell-conference-eu-2017-speakers-and-sessions/#respond</comments>
				<pubDate>Thu, 02 Feb 2017 22:43:28 +0000</pubDate>
		<dc:creator><![CDATA[Tobias Weltner]]></dc:creator>
				<category><![CDATA[Articles]]></category>
		<category><![CDATA[News]]></category>

		<guid isPermaLink="false">http://www.powershellmagazine.com/?p=12574</guid>
				<description><![CDATA[Modern administrators are responsible for business-critical automation, influence IT architectural design, and are a crucial part of corporate security both... ]]></description>
								<content:encoded><![CDATA[<p>Modern administrators are responsible for business-critical automation, influence IT architectural design, and are a crucial part of corporate security both in daily operations and in conceptual planning. PowerShell is the driving force behind most of this, and rapidly expands and evolves.</p>
<p><a href="http://www.psconf.eu"><img class="alignnone wp-image-12577" src="http://www.powershellmagazine.com/wp-content/uploads/2017/02/pic_jeffrey.jpg" width="500" height="281" srcset="https://www.powershellmagazine.com/wp-content/uploads/2017/02/pic_jeffrey.jpg 1024w, https://www.powershellmagazine.com/wp-content/uploads/2017/02/pic_jeffrey-300x169.jpg 300w, https://www.powershellmagazine.com/wp-content/uploads/2017/02/pic_jeffrey-768x432.jpg 768w" sizes="(max-width: 500px) 100vw, 500px" /></a></p>
<p>While you can find plenty of PowerShell beginner classes teaching fresh admins the fundamentals, it&#8217;s much harder to find places where experienced admins can learn relevant new information and be on an equal footing with speakers and other delegates. <a href="http://www.psconf.eu" target="_blank" rel="noopener noreferrer">psconf.eu</a> is such a place, and launched last year as a completely new PowerShell event format.</p>
<p><em>“Can’t wait for another #psconfeu!”</em>, <em>“You don’t want to miss this”</em>, <em>“best conference”</em>, <em>“highlight of the year”</em>: this is just some of the feedback you find when you search Twitter for #psconfeu. Let&#8217;s take a look at what makes psconf.eu so special, and what to expect this year!</p>
<p><a href="http://www.psconf.eu"><img class="alignnone wp-image-12598" src="http://www.powershellmagazine.com/wp-content/uploads/2017/02/IMG_3689-1.jpg" width="500" height="375" srcset="https://www.powershellmagazine.com/wp-content/uploads/2017/02/IMG_3689-1.jpg 3264w, https://www.powershellmagazine.com/wp-content/uploads/2017/02/IMG_3689-1-300x225.jpg 300w, https://www.powershellmagazine.com/wp-content/uploads/2017/02/IMG_3689-1-768x576.jpg 768w, https://www.powershellmagazine.com/wp-content/uploads/2017/02/IMG_3689-1-1024x768.jpg 1024w" sizes="(max-width: 500px) 100vw, 500px" /></a></p>
<p>This year, the conference has an unprecedented speaker lineup: five of the six powershellmagazine.com editors will be there and speaking. PowerShell inventor Jeffrey Snover opens the conference with his keynote &#8220;State of the Union&#8221;. Microsoft sends six PowerShell team members covering PowerShell 6, Open Source, DSC, PowerShell for VSCode, and more. So you&#8217;re not just getting answers &#8211; you are talking directly to the people <em>actually doing</em> these things, and getting first-hand information, including where the PowerShell journey will go next.</p>
<p><a href="http://www.psconf.eu" target="_blank" rel="noopener noreferrer">psconf.eu</a> covers all major areas of PowerShell, and the agenda information below uses color-coding to help you better find your favorite topics:</p>
<table style="text-align: left; width: 100%;" border="1" cellspacing="2" cellpadding="2">
<tbody>
<tr>
<td>Group</td>
<td>Description</td>
</tr>
<tr>
<td style="background-color: rgba(30, 100, 200,0.3);">PowerShell</td>
<td>Covers aspects of the PowerShell language</td>
</tr>
<tr>
<td style="background-color: rgba(102, 102, 51,0.3);">Cross-Platform and OpenSource</td>
<td>Focuses on running PowerShell cross-platform (PowerShell on Linux) and collaborating in open-source projects</td>
</tr>
<tr>
<td style="background-color: rgba(255, 0, 0,0.5);">Security</td>
<td>Topics concerned with writing secure code, identifying security issues, and using PowerShell to audit and report security-related incidents</td>
</tr>
<tr>
<td style="background-color: rgba(0,255,0,0.3);">Pester and Test Driven Development</td>
<td>Using Pester to ensure that PowerShell code complies with desired results</td>
</tr>
<tr>
<td style="background-color: rgba(255,255,0,0.3);">DSC and DevOps</td>
<td>Addresses Desired State Configuration, Azure, and DevOps strategies to use PowerShell for deployments and configuration</td>
</tr>
<tr>
<td style="background-color: rgba(255, 153, 51,0.3);">Automation</td>
<td>Automating specific tasks such as AD, virtualization, and Azure</td>
</tr>
<tr>
<td style="background-color: rgba(0, 255, 255,0.3);">PowerShell as a Service / Nano Server</td>
<td>Running PowerShell code as a service, by using containers and Nano Server</td>
</tr>
<tr>
<td style="background-color: rgba(255, 153, 255,0.3);">Database</td>
<td>Using PowerShell to manage databases like SQL Server</td>
</tr>
<tr>
<td style="background-color: rgba(0, 0, 128,0.3);">PowerShell Team</td>
<td>Sessions held by members of the Microsoft PowerShell Team. The &#8220;Speakers Roundtable&#8221; on day 1 is delivered by all speakers, including the PowerShell Team members.</td>
</tr>
</tbody>
</table>
<h2>Special Trainings Day</h2>
<p>Right before the main conference starts, you can participate in an optional pre-conference Special Trainings day. These are definitely no MOC-style courses: the workshops aim to bring experienced people up to speed, so you can polish fundamentals, warm up, and get the most out of the main conference sessions by preparing for topics that may be new to you &#8211; like DSC, JEA security, or building GUI tools. Workshops are available in English and in German, and you can suggest more topics by contacting us via the contact form at the bottom of <a href="http://www.psconf.eu" target="_blank" rel="noopener noreferrer">psconf.eu</a>.</p>
<h2>Opening Day</h2>
<p>On day 1, we open the conference with delegate registration and then start together in the great historic Leibniz Saal. PowerShell inventor Jeffrey Snover delivers the keynote &#8220;State of the Union&#8221;, summarizing where PowerShell stands today and where it is heading now that it is open source and available on Linux and OS X.</p>
<table style="text-align: left; width: 100%;" border="1" cellspacing="2" cellpadding="2">
<tbody>
<tr>
<td>Time</td>
<td>Track 1 (German)</td>
<td>Track 2 (English)</td>
<td>Track 3 (English)</td>
<td>Track 4 (English)</td>
</tr>
<tr>
<td>8:00 &#8211; 9:00</td>
<td colspan="4" rowspan="1">Delegate Registration</td>
</tr>
<tr>
<td>9:00 &#8211; 9:15</td>
<td title="Conference Opening with a quick introduction and orientation" colspan="4" rowspan="1" data-toggle="tooltip">Opening Ceremony</td>
</tr>
<tr>
<td>9:15 &#8211; 10:00</td>
<td style="background-color: rgba(0, 0, 128,0.3);" colspan="4" rowspan="1">State of the Union (Keynote)<br />
Jeffrey Snover</td>
</tr>
<tr>
<td>10:15 &#8211; 11:00</td>
<td style="background-color: rgba(30, 100, 200,0.3);" colspan="4" rowspan="1">PowerShell Warm-Up: Quiz &amp; Quirks<br />
Tobias Weltner</td>
</tr>
<tr>
<td>11:00 &#8211; 11:45</td>
<td style="background-color: rgba(255, 0, 0,0.5);" colspan="4" rowspan="1">Catch Me If You Can &#8211; PowerShell Red vs. Blue<br />
Will Schroeder</td>
</tr>
<tr>
<td>12:00 &#8211; 13:00</td>
<td colspan="4" rowspan="1">Lunch</td>
</tr>
<tr>
<td>13:00 &#8211; 14:00</td>
<td style="background-color: rgba(0,255,0,0.3);" title="Learn in this session why a testing framework is important for PowerShell and its progress. Get an insight into how to set up Pester on your system and take your first steps with Pester in the enterprise. Take home some code snippets for your first experiments that will simplify your start." data-toggle="tooltip">Pester 101 (German)(E1)<br />
Rinon Belegu</td>
<td style="background-color: rgba(102, 102, 51,0.3);" title="PowerShell was always bound to the single operating system and for quite some time it was one of the core elements of Windows. It was hard to imagine it running on any other OS. Also, as a core element of Microsoft’s flagship it didn’t seem like a perfect candidate for an open source project. Last year both things changed. Join this session to see PowerShell running on Linux and to learn how important this change is to the entire PowerShell ecosystem. Learn how you can participate to make PowerShell better for everybody!" data-toggle="tooltip">Hell freezing over: PowerShell on Linux and GitHub(B1)<br />
Bartosz Bielawski</td>
<td style="background-color: rgba(255, 0, 0,0.5);" title="Too many people having too many rights is a widespread problem. When the F1 team I worked for noticed that scripts were running as administrator, but could be modified by less privileged users, that was the spur to build a JEA solution and without using the aids introduced in WMF 5.0, it needed to be built from the ground up. It offered the chance to use centralized scripts to make IT tasks easier to carry out, and more consistent, as well as improving security. This &quot;notes from the field&quot; session will explain how the solution was developed, covering constrained endpoints, solving credentials problems, and improving the experience for those who used the solution, and will also look at the lessons - technical and otherwise - that were learnt in the process." data-toggle="tooltip">Building a Just Enough Admin solution in formula one(A1)<br />
James O&#8217;Neill</td>
<td style="background-color: rgba(30, 100, 200,0.3);" title="The GUI, calling it our mortal enemy would probably be a bit too far. There are some obvious use-cases in which using a GUI to discover new functionality and to get acquainted to a new product would be a solid approach. In this session we will do exactly that, we will setup a number of different components on a server while using PowerShell logging to record what is happening on the background. We will analyze these logs and retrieve a step-by-step playbook to reproduce the configuration. Level 300-400" data-toggle="tooltip">Breakdown the GUI: PowerShell logging to automate everything(D1)<br />
Jaap Brasser</td>
</tr>
<tr>
<td>14:15 &#8211; 15:15</td>
<td style="background-color: rgba(255,255,0,0.3);" title="The requirement: a one-click domain controller deployment experience, starting from bare Windows 2012 R2 or 2016 servers. DSC is used for everything from disk layout through network settings and the DC setup up to DNS server configuration, in a globally distributed structure. The configurations are generated via PowerShell from a database to allow individual customizations, and existing DSC resources are used and extended. Experiences from a project with a globally distributed company." data-toggle="tooltip">Automatisiertes DC Deployment mit DSC(E2)<br />
Raimund Andree, Jan-Hendrik Peters</td>
<td style="background-color: rgba(0, 255, 255,0.3);" title="In this session I want take the audience on the journey of why, what and how Windows Server Containers. This session will cover real life scenario's that traditional teams run into and how to overcome these in a very modern agile approach. This session will contain numerous demo's such as introducing the concept of Docker for Windows Server, Creating a highly available web application and some of the major advantages of virtual machines. This is aimed at people who may have little to no knowledge in Docker and or Containers on Nano Server." data-toggle="tooltip">Getting started with Windows Server Containers(B2)<br />
Flynn Bundy</td>
<td style="background-color: rgba(255, 0, 0,0.5);" title="How sure are you that your PowerShell code is prepared to handle anything that a user might throw at it? What if the user was an attacker attempting to circumvent security controls by exploiting a vulnerability in your script? This may sound unrealistic but this is a legitimate concern of the PowerShell team when including PowerShell code in the operating system. In a high-security environment where strict AppLocker or Device Guard rules are deployed, PowerShell exposes a large attack surface that can be used to circumvent security controls. While constrained language mode goes a long way in preventing malicious PowerShell code from executing, attackers will seek out vulnerabilities in trusted signed code in order to circumvent security controls. This talk will cover numerous different ways in which attackers can influence the execution of your code in unanticipated ways. A thorough discussion of mitigations against such attacks will then follow." data-toggle="tooltip">Defensive Coding Strategies for a High-Security Environment(A2)<br />
Matthew Graeber</td>
<td style="background-color: rgba(102, 102, 51,0.3);" title="Looking to get more contributors for your Open Source project? Once you've got them, what's awesome and what sucks about your baby being popular?In this session, Community Leader Rob Sewell and PowerShell MVP Chrissy LeMaire talk about their real world experiences with building popular PowerShell repos and the catalysts for their projects becoming owned and beloved by the open source community." data-toggle="tooltip">Creating and Maintaining Successful Open Source Projects(D2)<br />
Chrissy LeMaire</td>
</tr>
<tr>
<td>15:30 &#8211; 16:30</td>
<td style="background-color: rgba(255, 153, 51,0.3);" title="You think you know Hyper-V? Really? There are some awesome features hidden in Hyper-V's belly you perhaps don't know - but you have to use PowerShell to unleash them." data-toggle="tooltip">Virtual Machines auf Ihrem Desktop mit Powershell und Hyper-V(E3)<br />
Holger Voges</td>
<td style="background-color: rgba(30, 100, 200,0.3);" title="In this session Mathias and Øyvind will walk you through a real-life scenario and look at different optimization techniques for getting the best possible performance out of every part of the code. You will learn how to check for, and spot, performance bottlenecks and how to solve them. We will also cover using Runspaces for parallel processing, utilizing the WIN32 API to get faster code, algorithmic optimization and much much more!" data-toggle="tooltip">When every second counts!(B3)<br />
Mathias Jessen &amp; Øyvind Kallstad</td>
<td style="background-color: rgba(0, 255, 255,0.3);" title="Nano Server is small. Nano Server is fast. And secure. Nano Server is the future of Windows Server. In this session we will show you different management techniques, the power of running Nano Server as a container, and what happens behind the scenes when you use Server Management Tools (SMT). Have you noticed I haven't mentioned PowerShell yet? PowerShell is the core of Nano Server's power." data-toggle="tooltip">Look! Up in the sky! It&#8217;s a bird. It&#8217;s a plane. It&#8217;s Nanoman!(A3)<br />
Aleksandar Nikolic, Jan Egil Ring</td>
<td style="background-color: rgba(255,255,0,0.3);" title="More Organisations are beginning to evaluate or make the move to DevOps as technology has made it more practical. Microsoft have released a Release Pipeline whitepaper and demo guide to get you set up in TFS, but what does it mean to actually implement this? In this session, Matt will walk through how he has helped a large Microsoft customer make the transition to Infra-as-Code for one of their core technologies. We will take a look at the timeline of events and activities that got us there, what a Release Pipeline actually looks like in practice with its supporting components and how it works together. More importantly, we will talk about the cultural resistance and challenges that were overcome along the way and what you might consider doing to get your organisation over those same hurdles." data-toggle="tooltip">The Release Pipeline in Practice(D3)<br />
Matt Hitchcock</td>
</tr>
<tr>
<td>16:45 &#8211; 17:45</td>
<td style="background-color: rgba(30, 100, 200,0.3);" title="You may have heard that PowerShell now supports classes. But what exactly are a class, an object, or constructors? And what on earth is an overloaded constructor? In this session you will get an introduction to object-oriented programming and how it can expand your PowerShell horizon." data-toggle="tooltip">Sind Sie auch eine Klasse für sich? Oder doch &#8220;nur&#8221; ein Objekt?(E4)<br />
Rinon Belegu</td>
<td style="background-color: rgba(0,255,0,0.3);" title="Relevant to any technicians who use checklists. Intro level and idea forming, getting people to look at the way they can use Pester I was required to prove that I had successfully installed and configured a backup solution across a large estate. I had a number of success criteria that had to be met. Checking all of these by hand (eye) would have been error prone, so I wrote a test to do this for me and an easy for management to read HTML report using PowerShell and Pester. The session has come from that situation and is about enabling you to provide an easy to read output to quickly and repeatedly show that infrastructure is as expected for a set of checks. There are many use cases for this type of solution; DR testing, installation, first line checks, presentation setups After this session you will have a basic understanding of how Pester works and the capability to examine your checklists and create your own validation tests and provide some reporting for management. You will pick up ideas of processes and checks that can be automated within your company I am able to perform the following checks in 15 minutes across hundreds of servers and thousands of databases and I think it is cool and useful and everyone should be able to do so and take the functionality to create their own. Every Job in Ola Hallengrens Maintenance Solution exists, is enabled, has a schedule, has succeeded, has 2 job steps, has a generate restore script job step, the root backup folder is contactable, for every database the correct folders exist for the full, differential and log backup depending on recovery model in that folder, that each of those backup folders has files in it, that the most recent file in each of those folders is less than the required frequency for those jobs" data-toggle="tooltip">Green is bad Red is Good &#8211; Turning your Checklists into Pester Tests(B4)<br />
Rob Sewell</td>
<td style="background-color: rgba(255, 0, 0,0.5);" title="What administration and automation tool do state-sponsored actors like APT 29, criminal actors like Phineas Fisher, and IT staff all rely upon heavily? PowerShell. Integrated into many offensive toolkits, PowerShell is gaining respect in offensive circles as “Microsoft’s Post-Exploitation Language”. In a quest to combat this perceived threat, many defenders attempt to disable PowerShell rather than realizing its defensive potential. In this talk, we will cover PowerShell defensive tools and techniques that you can effectively use to detect malicious activity without the need for a bloated host-based agent. Don’t be afraid of PowerShell! Fight back, instill fear in the attacker, and reclaim your enterprise! It’s time to acknowledge PowerShell as &quot;Microsoft’s Incident Response Language&quot;!" data-toggle="tooltip">Flip the Script: PowerShell “Microsoft’s Incident Response Language”(A4)<br />
Jared Atkinson</td>
<td style="background-color: rgba(255,255,0,0.3);" title="PS DSC existed for a few years now and has continuously evolved with new and Powerful features. This sessions rapidly demonstrates some tips and tricks and know-hows from complex real-world deployments." data-toggle="tooltip">PowerShell Desired State Configuration &#8211; A rapid fire. Tips and Tricks, Know-hows(D4)<br />
Ravikanth Chaganti</td>
</tr>
<tr>
<td>18:00 &#8211; 18:30</td>
<td style="background-color: rgba(0, 0, 128,0.3);" colspan="4" rowspan="1">Ask the Speakers Roundtable</td>
</tr>
<tr>
<td>19:00 &#8211; 01:00</td>
<td colspan="4" rowspan="1">Evening Event at Yukon Bay/Zoo Hannover<br />
Meeting Point: 19h Zoo Main Entrance. <span style="color: #ff6600;">Bring your badge (required to enter Zoo)</span>!<br />
<small><small>We&#8217;ll leave at 18:30h from Leibniz Saal and walk down to the Zoo together. You can also get there on your own, take a car (if you have one and don&#8217;t drink), the tram (one station), or a cab. Just make sure you are at the Zoo Main Entrance at 19:00h (7 pm) sharp, and <span style="color: #ff6600;">bring your badge. Your badge is required to enter the Zoo</span>.<br />
</small></small>Dinner and Drinks</td>
</tr>
</tbody>
</table>
<p>We&#8217;ll then warm up with &#8220;Quiz and Quirks&#8221;: I extracted some of the funniest and strangest questions asked in our internal MVP forum, and you get the chance to test your knowledge and learn new things: would you have been able to answer the MVP questions easily? We are all in this conference together, and this session encourages interaction, asking questions, and being part of it.</p>
<p>Will &#8220;Harmj0y&#8221; Schroeder then closes the morning with an awesome security presentation. His presentation last year was crammed, so this year we wanted him to deliver it in the largest room we have: listen closely to learn where attackers can sneak into your systems, and how PowerShell can be used both as a forensic and a penetration-testing tool.</p>
<p>We all then have a good lunch together and move into the workshop area. After lunch, we fan out into four tracks to give you a choice: track 1 will be in German, and tracks 2, 3, and 4 in English.</p>
<p>In the afternoon, you have the chance to meet many of the renowned speakers in person, and make connections. After the first round of presentations, we all come back to Leibniz Saal for the &#8220;Ask the Speakers Roundtable&#8221;.</p>
<p><a href="http://www.psconf.eu"><img class="alignnone wp-image-12586" src="http://www.powershellmagazine.com/wp-content/uploads/2017/02/IMG_3676.jpg" width="500" height="375" srcset="https://www.powershellmagazine.com/wp-content/uploads/2017/02/IMG_3676.jpg 3264w, https://www.powershellmagazine.com/wp-content/uploads/2017/02/IMG_3676-300x225.jpg 300w, https://www.powershellmagazine.com/wp-content/uploads/2017/02/IMG_3676-768x576.jpg 768w, https://www.powershellmagazine.com/wp-content/uploads/2017/02/IMG_3676-1024x768.jpg 1024w" sizes="(max-width: 500px) 100vw, 500px" /></a></p>
<p>Then, we walk over to the Hannover Zoo to enjoy the Evening Event.</p>
<h2>Evening Event in the Zoo</h2>
<p>The evening event is absolutely unique: we have &#8220;Yukon Bay&#8221;, an old-time gold-rush town, all to ourselves. We&#8217;ll have great food and drinks, beer and wine, and the chance to hang loose and make new connections and friendships. Everyone will be there, including all speakers. You may want to continue talking about PowerShell, but you may just as well kick back and enjoy the evening.</p>
<p><a href="http://www.psconf.eu"><img class="alignnone wp-image-12584" src="http://www.powershellmagazine.com/wp-content/uploads/2017/02/yukon.jpg" width="500" height="360" srcset="https://www.powershellmagazine.com/wp-content/uploads/2017/02/yukon.jpg 2048w, https://www.powershellmagazine.com/wp-content/uploads/2017/02/yukon-300x216.jpg 300w, https://www.powershellmagazine.com/wp-content/uploads/2017/02/yukon-768x553.jpg 768w, https://www.powershellmagazine.com/wp-content/uploads/2017/02/yukon-1024x737.jpg 1024w" sizes="(max-width: 500px) 100vw, 500px" /></a></p>
<h2>Conference Day 2 + 3</h2>
<p>On the remaining two days, we fan out into five parallel tracks. Track 1 stays German, and the other four tracks deliver English sessions. Nineteen Microsoft MVPs, six PowerShell team members, former MVPs, engineers from Microsoft Germany, and many other awesome PowerShell experts deliver presentations covering pretty much all areas relevant to PowerShell.</p>
<p>Here&#8217;s the preliminary agenda for day 2:</p>
<table style="text-align: left; width: 100%;" border="1" cellspacing="2" cellpadding="2">
<tbody>
<tr>
<td>Time</td>
<td>Track 1 (German)</td>
<td>Track 2 (English)</td>
<td>Track 3 (English)</td>
<td>Track 4 (English)</td>
<td>Track 5 (English)</td>
</tr>
<tr>
<td>8:30 &#8211; 9:30</td>
<td style="background-color: rgba(255, 153, 51,0.3);" title="In this talk we will look at the basic components for a successful deployment of environments in Azure, explain the terminology, and illustrate it with a walkthrough. We will then give an outlook on how complex lab environments, including the setup of Active Directory, SQL, and DFS, can be created fully automatically without writing JSON templates or runbooks. - Basic components for a successful deployment: Resource Groups, Storage Accounts, Virtual Networks, Virtual NICs, Public IPs, Gateways, Load Balancers, Virtual Machines - Walkthrough: creating virtual machines with all prerequisites taken into account - Outlook: automated creation of complex labs on Azure without JSON templates or runbooks." data-toggle="tooltip">Azure 101 &#8211; PowerShell in der Cloud(E5)<br />
Jan-Hendrik Peters, Raimund Andrée</td>
<td style="background-color: rgba(30, 100, 200,0.3);" title="Pilots joke that a landing you walk away from is OK, and if you can use the aircraft again it's a good landing. There is useful PowerShell at the &quot;crash landing&quot; level - but we need a higher standard when sharing with peers, colleagues or the wider community. That's the difference between your hard work being used and recommended by others or being forgotten. The presenter of the session wrote one of the most downloaded modules ever, and contributed the 10 cmdletments for reusable code to Manning's PowerShell Deep Dives book - but several years have passed since then and distributing code has become easier, and ideas of good and bad practice have evolved, so the time has come to see which established ideas still hold, and which new ones separate the modules that succeed from those that don't." data-toggle="tooltip">What makes a good shared PowerShell module?(B5)<br />
James O&#8217;Neill</td>
<td style="background-color: rgba(255,255,0,0.3);" title="We've seen in the last few years, how the traditional 'IT Pros' or Ops paradigm has shifted towards the practices previously embraced by the Devs only. But I believe Dev is the New Ops, so let's explore how SOA and Microservices applied to PowerShell can help the 'IT Pros'. In this session, after bootstrapping your Message Queue (MQ) knowledge and its use for Inter-Process Communication (IPC), we’ll introduce PSRabbitMQ, a module to interact with the message broker. Then we'll take a deep dive into a demo of PowerShell Microservices communicating using JSON messages sent via a Message Queue. In the second demo, we'll see a multi-node, multi-thread, cross-technology and cross-platform communications, with Python services using the Celery library being called, and calling PowerShell Services on distributed systems. Since PowerShell and .Net core are now open-source and the RabbitMQ .Net client will soon target .Net core, the tools and techniques demonstrated in this presentation may be a path to cross-platform PowerShell Microservices on Windows, Nano, Linux, and all sorts of containers!" data-toggle="tooltip">PowerShell Microservices: At the Service of DevOps(A5)<br />
Gael Colas</td>
<td style="background-color: rgba(0, 0, 128,0.3);" title="TBD" data-toggle="tooltip">TBD(D5)<br />
Joey Aiello (Microsoft)</td>
<td style="background-color: rgba(255,255,0,0.3);" title="Still trying to get your head around Desired State Configuration? Start with what you know! Cloud and Datacenter Management MVP Will Anderson takes you through learning to build DSC configurations using scripts that you’ve already created. In this mega-session, you’ll learn: • Why DSC is important when it comes to auditing, compliance, and enforcement. • To familiarize yourself with the DSC declarative language format." data-toggle="tooltip">Conceptualizing Desired State Configuration Using Your Own Scripts (Part I)(C5)<br />
Will Anderson</td>
</tr>
<tr>
<td>9:45 &#8211; 10:45</td>
<td style="background-color: rgba(255, 153, 51,0.3);" title="For our training center we had to automate the installation, patching, and deployment of the student PCs. I will show how we managed to automate the full process of deploying a training environment - patching and resetting the workstations back to default after each training - without using System Center Configuration Manager, just with PowerShell and Windows PE." data-toggle="tooltip">Ein Test-Lab mit reinen PowerShell Bordmitteln aufbauen(E6)<br />
Holger Voges</td>
<td style="background-color: rgba(30, 100, 200,0.3);" title="&quot;If you want speed, you should not use Powershell for log parsing&quot;. I bet you've heard this before. The alternative tool of choice seems to be Log Parser. Can it really outperform Powershell? And is this only because of speed? Does Powershell win the battle when it comes to complexity and flexibility? Let's have a competition! We will look at what both of them have to offer and how they work. We will also try to push the limits of these technologies to see how far we can go. Perhaps we can even cheat a bit in Powershell ;-)" data-toggle="tooltip">Log parser vs Powershell smackdown(B6)<br />
André Kamman</td>
<td style="background-color: rgba(255, 0, 0,0.5);" title="So Timmy, your new college intern needs to run some commands against the 3000 servers in your server farm. You let Timmy PowerShell remote in on the default endpoints since manually configuring new endpoints on all 3000 is not easy. Timmy then “accidentally” starts deleting content that he should have never touched on nodes he should have never been on. Sound familiar? The problem that many IT Pros have with remoting endpoints is that they do not know how to deploy them in mass. Constrained endpoints can allow your staff to work with specific components of PowerShell only, preventing the next Timmy from making a costly mistake. In this deep dive, we will take a look at PowerShell remoting and learn how to use it to deploy constrained endpoints rapidly across your organization allowing you to follow the “Principle of Least Privilege” in the PowerShell world." data-toggle="tooltip">Accelerated Constrained Endpoint Deployment(A6)<br />
Jason Yoder</td>
<td style="background-color: rgba(102, 102, 51,0.3);" title="Talk about OSS xPlat PS. Demo heavy. Based on my earlier session https://github.com/bgelens/Presentations/tree/master/092016_DuPSUG but updated of course :-)" data-toggle="tooltip">Start-NewEra -Repo PowerShell(D6)<br />
Ben Gelens</td>
<td style="background-color: rgba(255,255,0,0.3);" title="Still trying to get your head around Desired State Configuration? Start with what you know! Cloud and Datacenter Management MVP Will Anderson takes you through learning to build DSC configurations using scripts that you’ve already created. In this mega-session, you’ll learn: • Why DSC is important when it comes to auditing, compliance, and enforcement. • To familiarize yourself with the DSC declarative language format." data-toggle="tooltip">Conceptualizing Desired State Configuration Using Your Own Scripts (Part II)(C6)<br />
Will Anderson</td>
</tr>
<tr>
<td>11:00 &#8211; 12:00</td>
<td style="background-color: rgba(255, 153, 51,0.3);" title="Create a complete lab via PowerShell without worrying about network configuration, DCs, PKI, or differencing disks - all with just a few lines of code, regardless of whether you target Windows Server 2008 R2 or 2016. Impossible? David and Raimund demonstrate the AutomatedLab module in action with many hands-on examples and also explain some deep-dive areas of the code." data-toggle="tooltip">AutomatedLab – das erste voll flexible Hydration Tool (E7)<br />
David das Neves, Raimund Andrée</td>
<td style="background-color: rgba(30, 100, 200,0.3);" title="What are your options when the script you write takes too long to run? In this session we take a deep dive into what you can do to make your PowerShell code as fast as possible, how you can measure it, and solutions to common pitfalls. We will look at how classes can be used to speed up many operations, the costs of the pipeline, I/O, Linq etc." data-toggle="tooltip">Performant PowerShell(B7)<br />
Staffan Gustafsson</td>
<td style="background-color: rgba(0, 0, 128,0.3);" title="TBD" data-toggle="tooltip">TBD(A7)<br />
Bruce Payette (Microsoft)</td>
<td style="background-color: rgba(102, 102, 51,0.3);" title="Are you in desperate need to enable your Linux folks a way to manage resources offered on your Windows servers? Would you like to understand techniques used and control what people can, and what they can’t do? If that is the case - this session is for you. I will try to cover multiple techniques you can use to allow such connection, including pywinrm with constrained endpoints, PowerShell Remoting over SSH, and if time allows (and Microsoft/community delivers) – via PowerShell remoting using WinRM." data-toggle="tooltip">Knock, knock, knock – Linux at your door(D7)<br />
Bartosz Bielawski</td>
<td style="background-color: rgba(30, 100, 200,0.3);" title="Ever thought about sending chat messages from your automation platform, for example to get rid of all that email? Or to get notified via chat when a critical system is down? In this session we will cover how to chat with PowerShell using the Lync SDK and the Skype for Business UCWA (Unified Communications Web API). As these don't cover posting messages to a persistent chat room, I'll also demo a poor man's solution on using a HTTP API to post messages to a persistent chat room." data-toggle="tooltip">Chatting with PowerShell – Skype for Business(C7)<br />
Daniël Both</td>
</tr>
<tr>
<td>12:00 &#8211; 13:00</td>
<td colspan="5" rowspan="1">Lunch &amp;<br />
Ask the Experts</td>
</tr>
<tr>
<td>13:00 &#8211; 14:00</td>
<td style="background-color: rgba(30, 100, 200,0.3);" title="Do you know about all the pitfalls, craziness and exceptions in our beloved PowerShell? Join us for the PowerShell QUIZ, where we will transform all the weirdness into an entertaining event. Yes, we will adopt the famous &quot;Who wants to be ..&quot; format and crown the successful winner. &quot;Phone a Friend&quot; and prove that you're the greatest nerd! And of course: we will have some gorgeous prizes for YOU. Get-Fun -force" data-toggle="tooltip">Who wants to be a Shellionaire?(E8)<br />
Thorsten Butz</td>
<td style="background-color: rgba(255,255,0,0.3);" title="Step up your automation game by implementing an automated release pipeline! In this talk we will see how we can leverage the Release Pipeline for more than just publishing modules to the PowerShell Gallery. We will demonstrate how you can manage infrastructure deployment, change management and decommissioning using Cloud Templates, Configuration Management, Source Control and Continuous Integration services." data-toggle="tooltip">Take your Automated Release Pipeline to the next level!(B8)<br />
Jan Egil Ring &amp; Øyvind Kallstad</td>
<td style="background-color: rgba(0,255,0,0.3);" title="I'm a DBA, and configuration drift is one of the most important battles I need to fight. Many, if not all, monitoring tools will warn you if something which has been configured has gone wrong. But not a lot of them will tell you if something was never configured in the first place. In my search for a solution I ran into Pester tests, and the Operations Validation Framework (OVF) in particular. The existing OVF framework itself was not &quot;dynamic&quot; enough for my needs, so I decided to roll my own. Based on the Convention over Configuration paradigm my code tests the whole landscape and can quickly identify if some configuration does not adhere to (my own) best practices or even if exceptions that I've put in place are still configured as such. I will demo based on code that is actively used in production and my samples will use SQL Server because I'm a database guy. But this session is also useful if you're not a SQL admin." data-toggle="tooltip">Fighting Configuration Drift with Dynamic Pester Tests(A8)<br />
André Kamman</td>
<td style="background-color: rgba(30, 100, 200,0.3);" title="How long do you wait for PowerShell to complete? I had a student from my PowerShell class take a 640-hour manual task down to just 1 hour and 20 minutes utilizing PowerShell. Automation is great, but writing code that runs faster gives you and your organization a tactical advantage. Simply put, the faster things work, the more money you save. What difference does a quarter of a second make? Well, try working in an environment of 100,000 nodes. Just a delay of 0.25 seconds per node will cost you an extra 7 hours. This deep dive will take a look at various ways to gauge the speed of your code execution and tactics on how to accelerate your goals through the use of methods, background jobs, and simple coding techniques. Whether it is through remoting on-premises or reaching out to the cloud, we will make things happen faster." data-toggle="tooltip">FASTER!!!! Make Your Code Run Faster!(D8)<br />
Jason Yoder</td>
<td style="background-color: rgba(255,255,0,0.3);" title="Although DSC has matured with WMF5, and the documentation around it is improving, it is difficult to find concrete examples of DSC implementations that treat servers like cattle and not pets, while keeping it flexible enough for real life. The basic principles of separating Configuration Data from Configuration code are well covered, but resources showing how to implement a flexible Configuration Data set for a real life environment are scarce. In this session we will revisit and expand on Steve Murawski's 2014 session 'Building Scalable Configurations with DSC' along with the module he created while at Stack Exchange, and updated by Dave Wyatt. We'll see how we can test our DSC configurations and data, integrate it to our Build Pipeline, leverage the information into Operation Validation, and how advanced configuration data management using the DscConfiguration module makes DSC a first class platform for Policy-Driven Infrastructure. We'll then look at how this file-based configuration registry can evolve to a database store such as a NoSQL database (e.g. Redis), acting as a CMDB and responding to dynamic events." data-toggle="tooltip">Making sense of DSC Configuration Data(C8)<br />
Gael Colas</td>
</tr>
<tr>
<td>14:15 &#8211; 15:15</td>
<td style="background-color: rgba(255, 153, 51,0.3);" title="There are several cmdlets to manage Group Policies with PowerShell. I will show how to use them and then introduce my PowerShell module for advanced administration and Group Policy health checks." data-toggle="tooltip">Group Policy Administration with PowerShell(E9)<br />
Holger Voges</td>
<td style="background-color: rgba(255, 153, 51,0.3);" title="Enterprise IT applications span across multiple infrastructures and usually include multiple tiers of services. PowerShell DSC evolved to be more than just a simple configuration management platform and covers more landscape than any other configuration management tool. In this session, we look at a complex enterprise application deployment completely using PowerShell DSC. In the process, we look at some best practices and guidance in creating such a complex end to end configuration using PSDSC." data-toggle="tooltip">Approaching Complex On-premise deployments with PSDSC(B9)<br />
Ravikanth Chaganti</td>
<td style="background-color: rgba(0, 0, 128,0.3);" title="TBD" data-toggle="tooltip">TBD(A9)<br />
Mark Gray (Microsoft)</td>
<td style="background-color: rgba(255, 153, 255,0.3);" title="With the new sqlserver module released with SQL 2016 I decided to revisit the SQL Server Provider. Come and take a look through it with me and I will show you what works and what doesn’t and give you ideas on how you can make use of it." data-toggle="tooltip">Using the SQL Server Provider(D9)<br />
Rob Sewell</td>
<td style="background-color: rgba(255, 153, 51,0.3);" title="As deployments get complex and thousands of devices are being used to perform similar tasks, the ability to programmatically control your networking device is essential for network owners. This session will provide practical tips and tricks for automating a NetScaler VPX deployment and configuration using NITRO (NetScaler’s REST API) and PowerShell. In this session, you will learn: • About the REST API and NetScaler’s NITRO-based API framework • How NITRO with PowerShell can automate the NetScaler configuration with REST API calls • PowerShell scripting to automate the deployment of NetScaler VPX on XenServer or vSphere hypervisors with a fixed IP address. Presentation link: You can check out the presentation at Citrix Synergy 2016 at the following link: http://live.citrixsynergy.com/2016/player/ondemandplayer.php?presentation_id=458df873-6e5d-414c-b095-3399fa8e667b" data-toggle="tooltip">NetScaler Automation – Talking NITRO with PowerShell(C9)<br />
Esther Barthel</td>
</tr>
<tr>
<td>15:30 &#8211; 16:30</td>
<td style="background-color: rgba(255, 153, 51,0.3);" title="Hyper-V and System Center Virtual Machine Manager 2016 are gaining more and more ground in the virtualization market. In this session, learn how to install and configure VMM with PowerShell. Create administrative accounts and network configurations in no time, and add your Hyper-V servers. In this practical session, Virtual Machine Manager 2016 is brought into operation and connected to Azure for a hybrid solution." data-toggle="tooltip">Managing System Center Virtual Machine Manager 2016 with PowerShell(E10)<br />
Rinon Belegu</td>
<td style="background-color: rgba(255,255,0,0.3);" title="In a number of projects around DSC we wanted to focus on developing DSC Resources, DSC Configurations and testing this in a Multi-Pull Server scenario. Being able to rapidly deploy a lab specific to each with AD, PKI and DSC Pull Servers already deployed was a huge timesaver. This session will walk through the deployment process automated with AutomatedLab. We will also demo the result of a large DSC project to automate the configuration of domain controllers within this lab which covers DSC Partial Configurations on multiple DSC Pull Servers, automated creation of DSC Configurations and DSC Reporting. " data-toggle="tooltip">Mastering a complex DSC lab infrastructure for rapid DSC deployment(B10)<br />
Raimund Andree, Jan-Hendrik Peters</td>
<td style="background-color: rgba(255, 0, 0,0.5);" title="The escalation of privileges to Administrator rights is an extremely common step of many malicious attack chains. As patching practices have matured, attackers are less frequently able to rely upon actual exploits of the Windows operating system to achieve elevation. However, enterprise deployment of ‘gold’ workstation images nearly always includes third-party (and often in-house designed) software that often introduces misconfigurations which may lead to privilege escalation opportunities. This talk will cover my tool PowerUp, a PowerShell script that automates checks for many of these escalation vectors. I will cover several common misconfigurations that we’ve seen in the field, the technical implementations of the checks that PowerUp uses, and how PowerUp can help complicate an attacker’s workflow." data-toggle="tooltip">Defending Your “Gold”(A10)<br />
Will Schroeder</td>
<td style="background-color: rgba(30, 100, 200,0.3);" title="An advanced regular expression session with a focus on accuracy, performance and Unicode support(!). Learn about common tricks, pattern construction and catastrophic backtracking." data-toggle="tooltip">RegEx 2.0: Full Coverage(D10)<br />
Mathias R Jessen</td>
<td style="background-color: rgba(255, 153, 51,0.3);" title="This session will provide practical tips and tricks for automating a NetScaler VPX deployment and configuration using NITRO (NetScaler's REST API) and PowerShell. In this session you will learn: - About the REST API and NetScaler's NITRO-based API framework - How NITRO with PowerShell can automate the NetScaler configuration with REST API calls - PowerShell scripting to automate the deployment of NetScaler VPX on XenServer or vSphere hypervisors with a fixed IP address." data-toggle="tooltip">NetScaler Automation – Talking NITRO with PowerShell(C10)<br />
Esther Barthel</td>
</tr>
<tr>
<td>16:45 &#8211; 17:45</td>
<td style="background-color: rgba(30, 100, 200,0.3);" title="Do you know of any other topic that is at once so challenging and riddled with quirks, yet really represents basic knowledge? I don't. We will talk about supposedly simple configuration tasks (shares, access rights, drive mappings) as well as &quot;Mandatory Integrity Level&quot; and &quot;Dynamic Access Control&quot;. You will learn that explorer.exe is the natural enemy of administrators, and that powershell.exe may not be the final answer but can be a big step towards your peace of mind. We will learn how to configure file system ACLs with PowerShell, how to gain access to any folder regardless of the permissions found on it, and how to steer around the many inconsistencies of the built-in tools when configuring file servers." data-toggle="tooltip">What you always wanted to know about file servers but never dared to ask(E11)<br />
Thorsten Butz</td>
<td style="background-color: rgba(255,255,0,0.3);" title="I began running a DevOps enablement project in mid-2016. In this session I'll share some background &amp; stories of what we got wrong (and a few things we got right) as well as resources, tools &amp; ideas that really helped me." data-toggle="tooltip">From Zero to 4 Deployments a Week &#8211; Lessons Learned adopting Agile Ops Methodologies(B11)<br />
Benjamin Hodge</td>
<td style="background-color: rgba(255, 0, 0,0.5);" title="Just Enough Administration, also known as JEA, has been around for several years. It has been actively updated by the PowerShell team and you might have experimented with the technology. In this session, I will take you through the process of setting up your first JEA configuration, discuss the pitfalls and common issues. Furthermore, I will show several ways to analyze and secure your configuration. Level 300" data-toggle="tooltip">Building your own JEA Configuration(A11)<br />
Jaap Brasser</td>
<td style="background-color: rgba(0, 0, 128,0.3);" title="TBD" data-toggle="tooltip">TBD(D11)<br />
Kenneth Hansen (Microsoft)</td>
<td style="background-color: rgba(255,255,0,0.3);" title="Deploying development and test environments via PowerShell Desired State Configuration (DSC) with Lability is now a breeze. Unfortunately, technologies such as Active Directory and Hyper-V are traditionally difficult to perform integration testing on due to numerous factors and platform requirements. Could we exploit Lability in a Continuous Integration or Continuous Delivery pipeline to facilitate integration testing? Join in this session to see how we can use Server 2016, Lability and DSC to deploy an Active Directory integration testing environment using nested virtualisation. We will then use the open source Active Directory DSC resource module from Github to perform some integration tests using Pester and the Operation Validation Framework (OVF)." data-toggle="tooltip">Integrating Lability in a CI/CD Release Pipeline(C11)<br />
Iain Brighton</td>
</tr>
<tr>
<td>17:45</td>
<td colspan="5" rowspan="1">Informal Meeting<br />
in Workshop Foyer<br />
Team up and find a bar/restaurant</td>
</tr>
</tbody>
</table>
<p>To give delegates the chance to connect with speakers even better, we&#8217;ll have designated &#8220;Ask the Experts&#8221; areas during lunch so you can socialize with the people from your favorite field of interest.</p>
<p>Net length for all presentations is 45 minutes sharp. We want presentations to be concise and to the point, and we prefer demos over endless slides. At the end of each slot, you have 15 minutes of Q&amp;A. If you are hungry for more, you can ask the speaker to extend: we have breakout-session rooms available where you have all the time you need for extensive Q&amp;A, and where you can hold spontaneous whiteboard sessions or meet with user groups.</p>
<p><a href="http://www.psconf.eu"><img class="alignnone wp-image-12587" src="http://www.powershellmagazine.com/wp-content/uploads/2017/02/interactive_session.jpg" width="500" height="375" srcset="https://www.powershellmagazine.com/wp-content/uploads/2017/02/interactive_session.jpg 600w, https://www.powershellmagazine.com/wp-content/uploads/2017/02/interactive_session-300x225.jpg 300w" sizes="(max-width: 500px) 100vw, 500px" /></a></p>
<p>The conference ends on May 5 at 16:30h (4:30 pm). We provide a baggage room to leave your stuff. Here is the preliminary agenda for day 3:</p>
<table style="text-align: left; width: 100%;" border="1" cellspacing="2" cellpadding="2">
<tbody>
<tr>
<td>Time</td>
<td>Track 1 (German/English)</td>
<td>Track 2 (English)</td>
<td>Track 3 (English)</td>
<td>Track 4 (English)</td>
<td>Track 5 (English)</td>
</tr>
<tr>
<td>8:30 &#8211; 9:30</td>
<td style="background-color: rgba(255, 153, 51,0.3);" title="From Docker to Hyper-V containers: ever since Microsoft introduced containers with Windows Server 2016 and Windows 10, everyone has been talking about them. We will take a practical look at this young technology - not only at Microsoft's implementation, but also at the original on Linux. Naturally, PowerShell takes center stage here as well when it comes to setting up and maintaining the resources." data-toggle="tooltip">about_Containers #A note from the field(E12)<br />
Thorsten Butz</td>
<td style="background-color: rgba(255, 0, 0,0.5);" title="" data-toggle="tooltip">Using Credentials &#8220;securely&#8221; in the Release Pipeline(B12)<br />
Matt Hitchcock</td>
<td style="background-color: rgba(0, 255, 255,0.3);" title="Following the latest Windows Server 2016 release, Lability has been expanded to support both Server 2016 and Nano server. Nano server - as you may be aware - comes with its own set of unique management and deployment challenges. Fortunately, Lability helps abstract some of these changes and treats them the same as full Windows Server or Server Core (where possible). Come and join this session to see how we can use PowerShell Desired State Configuration (DSC) to deploy a hyper-converged Nano server cluster using nested virtualisation, Lability and a single configuration document." data-toggle="tooltip">Deploying Nano server with Lability and DSC(A12)<br />
Iain Brighton</td>
<td style="background-color: rgba(0, 0, 128,0.3);" title="TBD" data-toggle="tooltip">TBD(D12)<br />
David Wilson (Microsoft)</td>
<td style="background-color: rgba(255, 153, 51,0.3);" title="A session packed with ideas for runbook design patterns, best practices and other useful tips and tricks. Get inspired and learn how PowerShell runbooks should be designed and structured" data-toggle="tooltip">Azure Automation &#8211; Advanced Runbook Design(C12)<br />
Jakob Gottlieb Svendsen</td>
</tr>
<tr>
<td>9:45 &#8211; 10:45</td>
<td style="background-color: rgba(255, 0, 0,0.5);" title="PowerShell is becoming ever more interesting for enterprises - but do you also know all the relevant security topics? Execution policy, logging and PowerShell remoting are only the elementary set you should know about. In this session, David gives you a quick overview of all the topics that should matter to enterprises and puts an end to widespread misconceptions." data-toggle="tooltip">Using PowerShell securely in the enterprise(E13)<br />
David das Neves</td>
<td style="background-color: rgba(255, 153, 51,0.3);" title="A picture says more than a thousand words - this is very true in the area of performance reporting. This session will show how to collect such performance data in a vSphere environment in an efficient way, demonstrate several methods to visualise the collected data, and discuss what to visualise so as not to overwhelm the reader/viewer." data-toggle="tooltip">Visualising performance in VMware vSphere(B13)<br />
Luc Dekens</td>
<td style="background-color: rgba(255, 153, 255,0.3);" title="In this session, I’ll feature some incredibly useful PowerShell commands from the dbatools module. This SQL Server-centric toolset, which has evolved into an open-source community project with over 20 contributors, includes commands like Remove-SqlDatabaseSafely, Set-SqlTempDbConfiguration, Find-SqlDuplicateIndex, and Test-SqlPowerPlan. Join me for this electric, demo-heavy session to learn how to greatly simplify your life as a DBA." data-toggle="tooltip">Modern Database Administration using PowerShell and dbatools(A13)<br />
Chrissy LeMaire</td>
<td style="background-color: rgba(30, 100, 200,0.3);" title="In this joint session with Jason Yoder, we will go into the nitty-gritty details of some fun and interesting things that you can do with PowerShell which it probably was never intended to do. We will cover several different concepts and examples. We will explain the code and the thoughts behind the design of the scripts, functions and blocks of code; in short, you will get an uncensored view of the world of PowerShell. Level 200-300" data-toggle="tooltip">PowerShell Uncensored(D13)<br />
Jaap Brasser</td>
<td style="background-color: rgba(255, 153, 51,0.3);" title="Written a ton of awesome PowerShell scripts but find it hard getting your end users or your support people to use them? Let's solve this with ChatOps. We will have a quick introduction to ChatOps and then dive into installing a Hubot (a chat bot) on Windows. After learning the basics of Hubot, we will run through some code examples so you can easily bring your scripts to users through chat. Finally we will touch on what security and authentication methods you can use when getting into ChatOps with PowerShell." data-toggle="tooltip">ChatOps with PowerShell(C13)<br />
Matthew Hodgkins</td>
</tr>
<tr>
<td>11:00 &#8211; 12:00</td>
<td style="background-color: rgba(102, 102, 51,0.3);" title="In this talk we will look at the new possibilities that PowerShell on Linux brings. Starting with obtaining PowerShell for the various systems, we will work through scripts to see what the advantages of PowerShell are compared to classic *nix shells. PowerShell is not complete without Desired State Configuration: we will look at which resources are available on Linux and how they can be used successfully in push and pull scenarios. - Obtaining/installation - Scripting on Linux - What works, and what are the advantages over classic *nix shells? - DSC on Linux: OMI, obtaining/installation, available resources, using a pull server, pushing configurations" data-toggle="tooltip">PowerShell on Linux &#8211; Cross-platform scripting(E14)<br />
Jan-Hendrik Peters</td>
<td style="background-color: rgba(255, 0, 0,0.5);" title="Expect a fast-paced, demo-heavy session about Just Enough Administration (JEA), a new PowerShell security feature. Learn how to create boundaries for your users, and give them the minimal set of privileges they need to perform their tasks. In this session, we’ll demonstrate how to author, deploy, use, and audit a JEA configuration, including the improvements in the latest version of PowerShell. If you want PowerPoint slides, look for another session." data-toggle="tooltip">JEA Deep Dive in 45 Minutes(B14)<br />
Aleksandar Nikolic</td>
<td style="background-color: rgba(255,255,0,0.3);" title="Nano server, the new deployment model in Windows Server 2016, allows faster deployment and servicing of not just virtual environments but bare-metal servers too. When we combine the Nano Server deployment model with a configuration management platform like PowerShell DSC and build a release pipeline, we get an immutable infrastructure. This session demonstrates a complete release pipeline built with Nano and PowerShell DSC and demonstrates how an immutable web infrastructure gets deployed and serviced. This session also includes some interesting demos of using IoT enabled devices to monitor the release pipeline and infrastructure configuration." data-toggle="tooltip">Build an immutable application infrastructure with Nano Server, PowerShell DSC, and the release pipeline(A14)<br />
Ravikanth Chaganti</td>
<td style="background-color: rgba(30, 100, 200,0.3);" title="“Never reinvent the wheel.” My professors taught me that this is the first rule of Computer Science. PowerShell has a lot of hidden gems if you just dig deep enough to find them. We are going to go through some of these mysterious variables and references and show you how to leverage them to your advantage. A little deep dive and a little to make you laugh. We will learn new tricks to use in your coding and a few to make the work place a little more fun. Why code more when the work has already been done for you?" data-toggle="tooltip">Demystifying $THIS, $_, $PSITEM, $$, $Whatever…(D14)<br />
Jason Yoder</td>
<td style="background-color: rgba(30, 100, 200,0.3);" title="In this session we will go through all you need to know to start working with binary file formats with PowerShell." data-toggle="tooltip">Working with binary file formats(C14)<br />
Øyvind Kallstad</td>
</tr>
<tr>
<td>12:00 &#8211; 13:00</td>
<td colspan="5" rowspan="1">Lunch &amp;<br />
Ask the Experts</td>
</tr>
<tr>
<td>13:00 &#8211; 14:00</td>
<td style="background-color: rgba(0, 0, 128,0.3);" title="Angel is General Product Manager at Microsoft for Azure Configuration Management and Automation. This includes PowerShell, DSC, OMS Automation and Configuration. So Angel is the perfect speaker to provide an update on Microsoft's current and future plans for Windows, Linux and Azure, and the road ahead to make PowerShell the best tool for DevOps and to enable Artificial Intelligence for IT" data-toggle="tooltip">PowerShell Present and Future(E15)<br />
Angel Calvo (Microsoft)</td>
<td style="background-color: rgba(255, 0, 0,0.5);" title="With the relentless proliferation of compiled and script-based malware, trusting prevention and detection to antivirus solutions alone simply won't cut it. The only ideal method of effectively blocking binaries and scripts on a host is with a robust whitelisting solution. Device Guard is one such solution offered by Microsoft for Windows 10 and Server 2016 and, if implemented properly, can eliminate an entire suite of attacks your organization may face. Additionally, the only interface into configuring Device Guard is with PowerShell. Device Guard, like any other whitelisting solution, will never be impervious to bypasses. A robust solution will, however, provide mechanisms to block known bypasses. Device Guard provides such functionality in addition to providing features that can effectively block rogue administrators from altering policies or disabling the service. This talk will cover PowerShell-based configuration and deployment of a restrictive whitelisting policy, bypasses to the policy through exploitation of trusted applications, and mitigation strategies for effectively blocking such bypasses." data-toggle="tooltip">Architecting a Modern Defense using Device Guard and PowerShell(B15)<br />
Matthew Graeber</td>
<td style="background-color: rgba(30, 100, 200,0.3);" title="What do you have to do to make your cmdlets as usable as the best of the ones that ship with PowerShell? We look at handling of Paths, globbing, typed output, ShouldProcess and how to design your cmdlets and types together so they flow in a pipeline." data-toggle="tooltip">Professional cmdlets and module design(A15)<br />
Staffan Gustafsson</td>
<td style="background-color: rgba(255, 153, 51,0.3);" title="Understanding Agile IT Operations systems like Continuous Delivery &amp; Continuous Integration can be difficult to visualise &amp; understand when you're trying to get started. In this session we'll build a complete CI/CD Pipeline in Azure to show you how the different stages &amp; elements work together &amp; make it easier to understand the benefits &amp; thinking behind the theory." data-toggle="tooltip">Building Your First CI/CD Pipeline in Azure(D15)<br />
Benjamin Hodge</td>
<td style="background-color: rgba(255,255,0,0.3);" title="Cloud and Datacenter Management MVP Will Anderson takes you through the use of Azure’s Desired State Configuration to deliver VM deployment solutions in the cloud that conform to your organizations’ requirements. Learn the different methods to deploy your desired state configurations into the Azure environment to manage both your Service Management and Resource Manager IaaS deployments." data-toggle="tooltip">Using Desired State Configuration in Azure(C15)<br />
Will Anderson</td>
</tr>
<tr>
<td>14:15 &#8211; 15:15</td>
<td style="background-color: rgba(0,255,0,0.3);" title="Testing code is now a requirement for the majority of open-source projects - including the Microsoft DSC resources on GitHub. Most of us have probably written a Pester test or two. But do you know what makes a good or a bad unit test? When you look at your code, do you know what you should or shouldn’t be testing? Can you tell the difference between an integration test and a unit test? In this session we will use personal examples of tests written when first starting out to highlight both good and bad testing practices, unit and integration tests and maybe a few Pester tips and gotchas. Note: whilst this session uses Pester for its examples, this session is more about the philosophy of software testing and less about the intricacies of Pester." data-toggle="tooltip">Writing Effective Tests(E16)<br />
Iain Brighton</td>
<td style="background-color: rgba(30, 100, 200,0.3);" title="It all started as a side project created by (at the time) a former member of the PowerShell team, Jason Shirk. After Jason got back “home”, this project made its way back to the PowerShell core. In this session I will try to share with you the current “state of the art”, and show you how you can create dynamic completers for your modules. I will also share how adding custom completers will look in the future, when PowerShell version 5 becomes the de facto standard version. Expect a lot of demos and some examples of how you can create module/command specific completers and completers that will work for any command on your system." data-toggle="tooltip">TabExpansionPlusPlus in examples(B16)<br />
Bartosz Bielawski</td>
<td style="background-color: rgba(102, 102, 51,0.3);" title="In August of 2016, Microsoft officially open sourced the PowerShell project. This is an important step towards making PowerShell a single platform for automation of management/administration tasks on all operating system variants, not just Windows. Naturally, many open-source PowerShell developers rushed to make sure their script modules were compatible with PowerShell Core and open source PowerShell on non-Windows operating systems. Porting script modules turns out to be a “relatively” painless process, but binary modules create additional problems to overcome. In this presentation, I will discuss writing binary PowerShell modules targeting PowerShell Core and open source PowerShell, specifically focused on modules built to run in *nix environments. I will cover writing P/Invoke definitions for Linux and Mac OSX function calls, reviewing source code to see how Microsoft does it, and how to add operating system detection logic. Finally, I will demonstrate running PowerForensics on a non-Windows platform." data-toggle="tooltip">Porting Binary PowerShell Modules to Linux/OSX(A16)<br />
Jared Atkinson</td>
<td style="background-color: rgba(255,255,0,0.3);" title="In this session we'll explore using the script resource, resource kit resources, and creating your own resources." data-toggle="tooltip">What to do when inbox DSC Resources don&#8217;t cover your goal(D16)<br />
Ben Gelens</td>
<td style="background-color: rgba(30, 100, 200,0.3);" title="Developers know what Continuous Delivery means and we don't really have to explain to them what's available and why it is important. ITPros though, have a bigger challenge. Testing our code can be complicated, because it often touches the infrastructure. Do we then need a complete test infrastructure? Doesn't that make the test project time-consuming and cumbersome? In this session I'll show what can be done out of the box, and we will investigate what is possible if we think outside of that box a bit. (And we'll talk about the limitations as well of course)" data-toggle="tooltip">Test your PowerShell code with AppVeyor for ITPros(C16)<br />
André Kamman</td>
</tr>
<tr>
<td>15:30 &#8211; 16:30</td>
<td style="background-color: rgba(255,255,0,0.3);" title="To use DSC in a &quot;foreign&quot; environment, some hurdles have to be taken. The vSphereDSC module is one attempt at doing this. It allows one to use DSC to manage a VMware vSphere environment. The session shows the concepts behind the vSphereDSC DSC Resource module, how to add resources in the module and how to use the module." data-toggle="tooltip">DSC in a vSphere Environment(E17)<br />
Luc Dekens</td>
<td style="background-color: rgba(255, 153, 51,0.3);" title="This session will focus on concepts of automation using OMS/Azure Automation, which is part of Microsoft Operations Management Suite (OMS). It will showcase a number of scenarios while explaining the technology behind them. Join this session for advanced integrations using Microsoft Graph API, Web Services, and more!" data-toggle="tooltip">Automation in a hybrid world(B17)<br />
Jakob Gottlieb Svendsen</td>
<td style="background-color: rgba(0, 0, 128,0.3);" title="TBD" data-toggle="tooltip">TBD(A17)<br />
PowerShell Team (Microsoft)</td>
<td style="background-color: rgba(255,255,0,0.3);" title="In this session, I'd like to take the audience on a journey from the perspective of a developer and also from the perspective of an Ops guy. This will focus on how DSC makes the life-cycle of application development and server configuration that bit easier. The session will focus on a few key pieces of Windows tech, such as CI/CD with AppVeyor, Chocolatey for package management, GitHub for source control, and of course DSC for configuration management. I will also focus on some of the pain points for developers and Ops guys and point out ways a proper CI/CD pipeline can solve these. This presentation will be based on how we use DSC within our CI/CD pipeline at my place of work (Coolblue)." data-toggle="tooltip">DSC: From commit to production.(D17)<br />
Flynn Bundy</td>
<td style="background-color: rgba(255, 153, 51,0.3);" title="SCOM provides a lot of monitoring out of the box. When there is a need to extend the built-in monitoring, PowerShell is there to help you out. This session will cover how to create discoveries, monitors, and performance collection rules using PowerShell in a SCOM management pack. In this session we'll also cover how to get data out of SCOM with PowerShell using command channels and subscriptions." data-toggle="tooltip">Monitoring with PowerShell in SCOM(C17)<br />
Daniël Both</td>
</tr>
<tr>
<td>16:30</td>
<td colspan="5" rowspan="1">Conference ends</td>
</tr>
</tbody>
</table>
<p>Of course, we are working on conference memorabilia to take home &#8211; these remain secret until the conference starts, except for our famous &#8220;PowerShell container&#8221;: you&#8217;ll receive one of these limited collector&#8217;s items, of course.</p>
<p><a href="http://www.psconf.eu"><img class="alignnone wp-image-12589" src="http://www.powershellmagazine.com/wp-content/uploads/2017/02/mugs.jpg" width="500" height="500" srcset="https://www.powershellmagazine.com/wp-content/uploads/2017/02/mugs.jpg 600w, https://www.powershellmagazine.com/wp-content/uploads/2017/02/mugs-150x150.jpg 150w, https://www.powershellmagazine.com/wp-content/uploads/2017/02/mugs-300x300.jpg 300w, https://www.powershellmagazine.com/wp-content/uploads/2017/02/mugs-80x80.jpg 80w, https://www.powershellmagazine.com/wp-content/uploads/2017/02/mugs-200x200.jpg 200w, https://www.powershellmagazine.com/wp-content/uploads/2017/02/mugs-32x32.jpg 32w, https://www.powershellmagazine.com/wp-content/uploads/2017/02/mugs-50x50.jpg 50w, https://www.powershellmagazine.com/wp-content/uploads/2017/02/mugs-64x64.jpg 64w, https://www.powershellmagazine.com/wp-content/uploads/2017/02/mugs-96x96.jpg 96w, https://www.powershellmagazine.com/wp-content/uploads/2017/02/mugs-128x128.jpg 128w" sizes="(max-width: 500px) 100vw, 500px" /></a></p>
<h2>Register Now &#8211; Seats are Limited</h2>
<p>We want this conference to stay personal. So even though we decided to further increase the number of speakers and tracks, there&#8217;s room for a maximum of roughly 250 delegates. Half of these seats were taken before we published the agenda, and last year&#8217;s event sold out. We&#8217;d love to see you join this fun PowerShell event!</p>
<p>When you register on <a href="http://www.psconf.eu" target="_blank" rel="noopener noreferrer">www.psconf.eu</a>, your seat is immediately reserved for 30 days, and you receive an invoice. Pay it within 30 days, and your seat is secured.</p>
<p>If you have any additional questions, please use the contact form at the bottom of <a href="http://www.psconf.eu" target="_blank" rel="noopener noreferrer">www.psconf.eu</a> to get in touch. Looking forward to seeing you soon!</p>
<p>Tobias</p>
]]></content:encoded>
							<wfw:commentRss>https://www.powershellmagazine.com/2017/02/02/powershell-conference-eu-2017-speakers-and-sessions/feed/</wfw:commentRss>
		<slash:comments>0</slash:comments>
							</item>
	</channel>
</rss>
