Building a Package Builder Web Endpoint

Package Builder - Web Edition

Preface: This isn’t officially supported by Chocolatey. But, I’m nothing if not adventurous. What’s the saying? “Where there’s a will, there’s a way”? Anyway, I digress, onto the good stuff.

Earlier today I had a conversation with myself, as often happens. It went a little something like this.

Self: You haven't done something ridiculous in a while.
Busy-Self: I know, self, because I'm busy!
Self: But _this_ would be so cool!!
Busy-Self: Yes, but, _points at list_
Self: .....
Busy-Self: .....

So with Self winning the argument, as per usual, I set out on the task of building a web front end for the Package Builder feature in Chocolatey for Business, which enables you to take a standalone installer, be that an .exe, .msi, or .msu, and turn it into a fully functioning Chocolatey package in as little as 5 seconds, depending on the size of the installer. Yes, I know, you’re going to need a C4B license for this. Sorry.

Goals

I had the following goals in mind:

  • Simple interface
  • No complex code (aka regex, cause I friggin hate it)
  • RESTful would be nice, but not a must-have

The Solution

In developing the solution I went through a few iterations using PowerShell Universal. If you have not yet checked that out, you really should. It can turn anyone with some PowerShell chops into a WebDev* in just a couple of minutes. I particularly love it for how fast it allows me to build out these harebrained proof of concept ideas.

* It takes years to be a good WebDev

Initially I wanted to do a web form, wherein the end user could fill in some info, click a button, and out pops a Chocolatey package. That ultimately worked, but I didn’t quite like the user experience. That’s way more work than just right-clicking a file after all, and not very extensible.

Then the lightbulb moment hit. PowerShell Universal has an awesome API endpoint system! I friggin love working with RESTful API stuff, and kicked myself for not just going that route in the first place.


Creating the REST Api Endpoint

Once you have PowerShell Universal running (choco install powershelluniversal -y for the lazy efficient), open up a web browser and head to http://localhost:5000. You’ll be asked to log in. Use the username admin, and any password. This is the default behavior. You can secure it more later if you want, but for the purposes of this blog post and the POC nature of the project, we’re just gonna YOLO it.

Once logged in, follow these steps:

  • Go to the API section on the left-hand navigation.

PowerShell Universal Navigation

  • Then you’ll want to add an endpoint.

Add an Endpoint

  • Configure the endpoint for POST requests

Configure Endpoint

  • Edit the endpoint to, you know, do something by clicking ‘View Endpoint’

Edit Endpoint

  • Select ‘Edit’ once on the ‘View Endpoint’ page, and write the code you want your endpoint to execute

Make it do something

  • Save your changes

Save the script

The code that I’m currently using is conveniently provided below.


Code:

$package = $headers.Package
$fileName = $headers.File

$tempPath = 'C:\tmp'
$file = "$tempPath\$($fileName)"
[System.IO.File]::WriteAllBytes("$file", $Data)

function New-ChocolateyPackage {
    [CmdletBinding()]
    param(
        [Parameter(Mandatory)]
        [string]
        $File,

        [Parameter(Mandatory)]
        [string]
        $PackageName,

        [Parameter(Mandatory)]
        [string]
        $OutputDirectory
    )

    process {

        #Script blocks the event subscriptions use to surface choco's output in the endpoint log
        $writeOutput = { if ($EventArgs.Data) { Write-Output $EventArgs.Data } }
        $writeError  = { if ($EventArgs.Data) { Write-Error $EventArgs.Data } }

        $statements = "new $PackageName --file $File --build-package --output-directory $OutputDirectory"

        $process = New-Object System.Diagnostics.Process
        $process.EnableRaisingEvents = $true

        Register-ObjectEvent -InputObject $process -SourceIdentifier "LogOutput_ChocolateyProc" -EventName OutputDataReceived -Action $writeOutput | Out-Null
        Register-ObjectEvent -InputObject $process -SourceIdentifier "LogErrors_ChocolateyProc" -EventName ErrorDataReceived -Action $writeError | Out-Null

        $psi = New-Object System.Diagnostics.ProcessStartInfo
        $psi.FileName = 'C:\ProgramData\chocolatey\bin\choco.exe'
        $psi.Arguments = $statements
        #Redirection is required for the OutputDataReceived/ErrorDataReceived events to fire
        $psi.UseShellExecute = $false
        $psi.RedirectStandardOutput = $true
        $psi.RedirectStandardError = $true

        $process.StartInfo = $psi
        $null = $process.Start()
        $process.BeginOutputReadLine()
        $process.BeginErrorReadLine()
        $process.WaitForExit()
        $process.Dispose()

        #Clean up the subscriptions so repeated calls don't collide on the same SourceIdentifier
        Unregister-Event -SourceIdentifier "LogOutput_ChocolateyProc" -ErrorAction SilentlyContinue
        Unregister-Event -SourceIdentifier "LogErrors_ChocolateyProc" -ErrorAction SilentlyContinue
    }
}

Start-Sleep 3
New-ChocolateyPackage -File $file -PackageName $package -OutputDirectory C:\processed
Start-Sleep 3

Get-ChildItem 'C:\processed' -Exclude *.nupkg | Remove-Item -Recurse -Force
Remove-Item $file -Force

System Setup

If you’re going to use the code above verbatim, you’ll need to ensure a few things. You will need the following directories to exist on the system:

  1. C:\tmp
  2. C:\processed

The script temporarily writes a copy of the installer to C:\tmp; Chocolatey then picks the file up from there and builds the package. It’s designed this way to simulate a remote call to the API. The C:\processed directory is where the completed Chocolatey packages are stored.
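Assuming neither directory exists on your box yet, a quick way to create both:

```powershell
# Create the working directories the endpoint expects
'C:\tmp', 'C:\processed' | ForEach-Object {
    if (-not (Test-Path -Path $_)) {
        $null = New-Item -Path $_ -ItemType Directory
    }
}
```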


End Result

Once you’ve done all of that, it’s off to run some PowerShell! To make this work, we are going to need to provide a header. I don’t quite 100% like this part, but for the purposes of “could this even work”, it’s good enough.

Open up an elevated PowerShell console and run the following, editing it with an actual installer file name, and the name you wish to give the package:

$header = @{
    File = 'someinstallerfilename.exe'
    Package = 'GiveThePackageAName'
}

Next, we just need to craft a call to the API with Invoke-RestMethod:

$irmParams = @{
    Uri = 'http://localhost:5000/packagebuilder'
    Method = 'Post'
    Headers = $header
    InFile = '.\someinstallerfilename.exe' #path to the installer file named in the header
}

Invoke-RestMethod @irmParams
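If you end up calling the endpoint a lot, the header and the REST call above can be wrapped into a small client function. This is just a convenience sketch; the function name and parameters here are my own invention, not part of PowerShell Universal:

```powershell
# Hypothetical helper that wraps the header + Invoke-RestMethod call from above
function Send-ChocoPackageRequest {
    [CmdletBinding()]
    param(
        [Parameter(Mandatory)]
        [string]
        $InstallerPath,

        [Parameter(Mandatory)]
        [string]
        $PackageName,

        [Parameter()]
        [string]
        $Uri = 'http://localhost:5000/packagebuilder'
    )

    $irmParams = @{
        Uri     = $Uri
        Method  = 'Post'
        Headers = @{
            # The endpoint reads these two headers to name the file and package
            File    = (Split-Path -Leaf $InstallerPath)
            Package = $PackageName
        }
        InFile  = $InstallerPath
    }

    Invoke-RestMethod @irmParams
}
```

Usage would look like `Send-ChocoPackageRequest -InstallerPath .\someinstallerfilename.exe -PackageName 'GiveThePackageAName'`.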

Caveats

This is a proof of concept. Please don’t consider it Gospel and run off and use it in Production until you’ve really vetted the solution. Also, PowerShell Universal requires a license if you are going to use authentication to the REST endpoint. (You should support Adam anyways, he’s really good people.)

Because of the nature of Invoke-RestMethod, large installers aren’t going to work here. There may be some additional tweaking and poking to be done to make it work, but as of the time of writing, anything over 100MB threw a big ol’ temper tantrum when I tried to use it.


Universal Automation For Chocolatey Package Internalizer

So @adamdriscoll created this new amazing product called Universal Automation. It’s a PowerShell job scheduling platform built on top of his awesome Universal Dashboard product. I was fortunate enough to be invited into the Private Beta, so I’ve been playing with it off and on over the last couple of weeks. I had the idea this evening to see “Can I use this to do Chocolatey Package Internalization?” We have documentation on setting it up in Jenkins over here. Note, this is a business feature, so if you are running Open Source or have a Pro License, I’m sorry, this blog post probably won’t be much help, other than showing off how awesome Universal Automation is.

The Setup

Since I use vagrant, it was really quick for me to spin up an environment. I keep a copy of our chocolatey test environment handy, and I’ve updated it to run on Server 2019. So using my Vagrantey module I just did a quick Start-VagrantEnvironment -Environment Choco-Test and brought up a box.

Next I installed a couple of choco packages:

choco install vscode vscode-powershell googlechrome nexus-repository -y

I configured my nexus repository for chocolatey, and then configured chocolatey to talk to it.

After I had that work completed I installed the latest build of Universal Automation onto the box. Unfortunately, I can’t give much detail there, you’ll have to wait for the public beta, but I can say it’s as simple as importing a couple modules, starting up a server and dashboard, and you are off to the races.

Turning scripts into a module

I decided that it would be far easier to have the scripts we provide in a module, it’s just much cleaner that way. So I whipped up a quick ChocoInternalizer module by simply copying the scripts from above into a folder named ChocoInternalizer. Then in VSCode I whipped up a quick psm1 and psd1 file, made sure my cmdlets were proper functions, and did a quick test with Import-Module C:\Git\ChocoInternalizer\ChocoInternalizer.psd1 and verified all my cmdlets showed up with Get-Command -Module ChocoInternalizer.
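For anyone following along, New-ModuleManifest takes most of the pain out of the psd1 portion. The path and names below match my setup; swap in your own:

```powershell
# Generate the module manifest next to the psm1 (adjust the path for your repo)
New-ModuleManifest -Path 'C:\Git\ChocoInternalizer\ChocoInternalizer.psd1' `
    -RootModule 'ChocoInternalizer.psm1' `
    -ModuleVersion '0.1.0' `
    -FunctionsToExport '*'
```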

One quick note: I did make a slight adjustment to the scripts. In ConvertTo-ChocoObject I changed the split to be on the ‘|’ character, and modified the other scripts to use the -r flag of choco to limit the output to something a little easier to digest. You absolutely don’t have to do that, I just like to mess with @pauby.
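For reference, here is a sketch of what that adjustment looks like; this is my approximation, not the shipped script. choco’s -r output emits one name|version pair per line, which makes parsing trivial:

```powershell
# Sketch: turn 'name|version' lines from choco's -r output into objects
function ConvertTo-ChocoObject {
    [CmdletBinding()]
    param(
        [Parameter(Mandatory, ValueFromPipeline)]
        [string]
        $InputObject
    )
    process {
        # Split on the pipe delimiter that -r output uses
        $name, $version = $InputObject.Split('|')
        [pscustomobject]@{
            Name    = $name
            Version = $version
        }
    }
}

# Usage: choco list --local-only -r | ConvertTo-ChocoObject
```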

With that good to go, I set off to do the work in Universal Automation.

Getting things going in UA

The documentation has us start a UA Server, and then a dashboard to access all of its goodness at http://localhost:10001, so I did just that, opened up Chrome, and was welcomed by the UA Homepage. It looked something like this, without all the history:

UA Homepage

Adding the Script

  1. Universal Automation is all about scheduling scripts to run. Click on the Scripts tab, and at the bottom click “New Script”.

  2. On the General tab give it a nice Name and Description; future you will thank present you for being descriptive here.

  3. Click over to the Script tab. Here is where you define the PowerShell script you want to run. I used the following:
    Import-Module C:\Git\ChocoInternalizer\ChocoInternalizer.psd1 -Force
    Get-UpdatedPackage -LocalRepo 'http://localhost:8081/repository/choco-local' -LocalRepoApiKey $SuperSecretKey -RemoteRepo 'https://chocolatey.org/api/v2'
    
    • Don’t worry about that $SuperSecretKey bit, we’ll get to that in a moment
  4. Click on the Finalize tab and give a good commit message. UA uses git internally to track scripts, or you can point to your own Git Repository if you wish.

  5. Click Create Script

Scheduling the script

You probably want to schedule this to run at specific intervals. You can do so in UA using either cron syntax, or a dummy-proof Easy Mode that just lets you select from a pre-defined list of intervals.

To schedule your script, do the following:

  1. Click on the script that you just created.
  2. Click the “Schedule” Button in the upper-right side of the window.
  3. Click the “Easy Schedule” tab and select an option.
  4. Click “Submit”. You can verify the schedule by clicking on the Schedule tab after the window closes.

That $SuperSecretKey

Click on the “Universal Automation” banner to go back to the home page. From there do the following:

  1. Click on the Variables tab.
  2. Click “New Variable”
  3. Give it a name, in my case SuperSecretKey, and a value, which is the API key for my local repository server.
  4. Click “Submit”

Let’s Make Some Magic!

Ok. We are done. This thing is ready to run! Click on the Scripts tab and press the “> Run” button. Give it a bit of time; depending on your package count, this might take a minute.

Once it completes you can click on the “Past Jobs” tab from the home page, click the script you just ran, and then select the “View” button to see the output.

Wrap-up

That’s it! We just set up packages to internalize from the community repository in just a few short minutes. I think it took me around an hour to get everything set up the first time, as I was installing modules, writing a module, and tinkering with things a bit. I encourage you to check out Universal Automation when it enters public beta next week. It’s a really powerful tool to centralize a lot of your code that might be running across different systems for “reasons” that don’t need to be “reasons” anymore! Thanks for reading!


Stop Installing Software Manually

Today’s post is a little bit biased. I’ll be talking about the choco upgrade command and how it can save you oodles of time moving forward keeping your system up to date. Why am I biased? Well, that’s because I’m currently a Senior Support Engineer for Chocolatey Software.

The Concept

Chocolatey, or ‘choco’ for short, is a software package management application for Windows. With right around 7,000 packages available on the Community Repository, the software makes it dead simple not only to get what you need to be productive, but to easily maintain that software over time.

Using it

Getting Chocolatey installed is dead simple. Open up an administrative PowerShell prompt and enter Set-ExecutionPolicy Bypass -Force -Scope Process; iex ((New-Object System.Net.WebClient).DownloadString('https://chocolatey.org/Install.ps1')). If you want to see what that install script contains before running it, you can view it here.

Once you have chocolatey installed, you will be able to install any packages listed on the community repository. As an example I’ll demonstrate installing Google Chrome, and Visual Studio Code.

You could do it this way:

choco install googlechrome -y
choco install vscode -y

The -y here just skips the confirmation prompts about “Do you really want to run this script to install this software?”

We can shorten that command down to one line though, as choco accepts a space separated list of package names. The following does the same thing as above:

choco install googlechrome vscode -y

Similar to installs, uninstalls are just as simple. Simply replace install with uninstall, and Bob’s your uncle!

Updates over time

Now that you are managing Google Chrome and VSCode with Chocolatey, it’s time to upgrade those packages to their latest versions. You can upgrade all currently installed software packages in one fell swoop with choco upgrade all. Alternatively, you can supply the name of a single package to the upgrade command to update only that specific application’s package.
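For example, using the packages installed earlier:

```shell
# Upgrade everything Chocolatey manages in one shot
choco upgrade all -y

# Or upgrade just one package
choco upgrade googlechrome -y
```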

If you’ve installed a particular version of an application with Chocolatey and must maintain that version over time, but still wish to upgrade everything else with choco upgrade all, don’t worry, we thought of that too. Using the choco pin command you can hold a package at a specific version, and it will be skipped during a choco upgrade operation.
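A quick sketch of the pin workflow:

```shell
# Hold googlechrome at its currently installed version
choco pin add -n=googlechrome

# See what's currently pinned
choco pin list

# Remove the pin when you're ready to let it upgrade again
choco pin remove -n=googlechrome
```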

Wrap up

That’s it. Short and sweet. Running one line of PowerShell to get choco up and running on your system can save you hours of time otherwise spent keeping software packages up to date manually. I hope you found this information helpful. Chocolatey is Open Source for everyone, with a Business Edition available if you would like to leverage it in your enterprise. The Business Edition comes with features like:

  • Package Builder (Right Click => Create Chocolatey Package functionality.)
  • Package Internalizer (Bring in a Community Repository including any binaries for use on your own private repositories)
  • Self Service (Non-Admin users can install packages from a catalog via a wonderful GUI application similar to SCCM’s Software Center)
  • Central Management (Centralized reporting of all machines with choco installed, detailing installed packages, and their versions, with more features coming soon)

Imposter Syndrome Is Bull

Imposter Syndrome is bull@^$%. There. I said it.

I wanted to take a little bit of time to recap my weekend spent in Raleigh, North Carolina, at the PowerShell Saturday event put on by the local user group there. Man, what a great time.

It was exciting to see so many people (around 100!) come together to learn from one another. There were many first time speakers at the event, and it was refreshing to see and hear them be successful.

Community is the biggest and best aspect of PowerShell to me, personally. I feel thankful every day that I’m in a position that affords me the opportunity to reach out and assist people from all over the world resolve complex problems in their environments using PowerShell.

If you’re on the fence about becoming more active, or are scared that what you’ve done to date isn’t “good enough”, just stop. You’ve solved a complex problem that will help someone else, I guarantee it. So share! There are so many amazing communities and avenues available to you to do so, and I challenge you to step outside of your comfort zone and join in the conversation. There’s a very active community of folks in the PowerShell Slack channel, which is helpfully bridged to Discord if you prefer that chat client.

You’ll also find many of us on Twitter. (Seriously, follow me!) You’ll be able to connect with a ton of people directly off of my following/followers list.

Look in your local area for a PowerShell User Group. There’s a section of Slack dedicated to many of them, so jump in and say “Hi”, and get details on when meetings are. If there are none close to you, many of the larger ones in big metro areas have the ability to join remotely via Webex, or Zoom, or some other online web conference platform. It’s a great way to join in if you are geographically unable to attend in person.

Share that first script, throw it in a gist and link to it somewhere. Ask questions (there aren’t any stupid ones, and someone who tells you otherwise is lying). YOU make the community great, and I look forward to sharing in your success in the future. In closing, for real, Imposter Syndrome is bullshit, so come join the fun!


Pipeline Build Scripts Demystified

There are a myriad of ways to leverage build scripts inside of Azure DevOps Pipelines, and all the other popular CI/CD providers. In fact, there are whole modules dedicated to it, including InvokeBuild, PSDepend, psake, and others.

I hate all of them. Not necessarily because they are bad, quite the opposite. I think they are each quite good and very well written. However, when I build out my pipelines I like to keep things as simple as possible. This helps me to reduce the complexity of the pipeline and make troubleshooting things much much easier. In this blog post I’m going to provide a copy of the build script that I’m currently using for my PSChocoConfig module in Azure Pipelines. I’ll give the whole script to you up front, but then we’ll break it down in sections as we walk through it.

The code

Here’s the full script:


[cmdletBinding()]
Param(
    [Parameter()]
    [Switch]
    $Test,

    [Parameter()]
    [Switch]
    $Build,

    [Parameter()]
    [Switch]
    $Deploy
)

#Make some variables, shall we?
$invocationPath = "$(Split-Path -Parent $MyInvocation.MyCommand.Definition)"
$PSModuleRoot = Split-Path -Parent $invocationPath
$TestPath = Join-Path $PSModuleRoot "Tests"

#Do Stuff based on passed Args
Switch($true){

    $Test {

        If(-not (Get-Module Pester)){
            Install-Module -Name Pester -SkipPublisherCheck -Force
        }

        Invoke-Pester -Script $TestPath -OutputFile "$($env:Build_ArtifactStagingDirectory)\PSChocoConfig.Results.xml" -OutputFormat 'NUnitXml'

        #
        Get-ChildItem $env:Build_ArtifactStagingDirectory
    }

    $Build {

        If(Test-Path "$($env:Build_ArtifactStagingDirectory)\PSChocoConfig"){
            Remove-Item "$($env:Build_ArtifactStagingDirectory)\PSChocoConfig" -Recurse -Force
        }

        $null = New-Item "$($env:Build_ArtifactStagingDirectory)\PSChocoConfig" -ItemType Directory

        Get-ChildItem $PSModuleRoot\Public\*.ps1 | Foreach-Object {

            Get-Content $_.FullName | Add-Content "$($env:Build_ArtifactStagingDirectory)\PSChocoConfig\PSChocoConfig.psm1"
        }

        Copy-Item "$PSModuleRoot\PSChocoConfig.psd1" "$($env:Build_ArtifactStagingDirectory)\PSChocoConfig"

        #Verification of contents
        Get-ChildItem -Path "$($env:Build_ArtifactStagingDirectory)\PSChocoConfig" -Recurse

        #Verify we can load the module and see cmdlets
        Import-Module "$($env:Build_ArtifactStagingDirectory)\PSChocoConfig\PSChocoConfig.psd1"
        Get-Command -Module PSChocoConfig

    }

    $Deploy {


        Try {

            $deployCommands = @{
                Path = (Resolve-Path -Path "$($env:Build_ArtifactStagingDirectory)\PSChocoConfig")
                NuGetApiKey = $env:NuGetApiKey
                ErrorAction = 'Stop'
            }

            Publish-Module @deployCommands

        }

        Catch {

            throw $_

        }

    }

    default {

        echo "Please Provide one of the following switches: -Test, -Build, -Deploy"
    }

}

Breaking things down

Part 1: Testing

Let’s have a look at the first section of the script.


$Test {

        If(-not (Get-Module Pester)){
            Install-Module -Name Pester -SkipPublisherCheck -Force
        }

        Invoke-Pester -Script $TestPath -OutputFile "$($env:Build_ArtifactStagingDirectory)\PSChocoConfig.Results.xml" -OutputFormat 'NUnitXml'

        #
        Get-ChildItem $env:Build_ArtifactStagingDirectory
    }

Inside of Azure Pipelines I have a build step which I’ve called ‘Run Pester Tests’. It calls the build.ps1 file from the Build directory in my repository, and I pass in the -Test argument to the script. Because “Test” has been passed in, the Switch statement evaluates to “True” for that switch, and thus the test code is executed.

The If statement simply puts the latest version of the Pester module onto the build agent if it is not there. If you’re running this pipeline on a self-hosted agent, where you control what modules exist on the agent box, this step will just be skipped, as the check for the module will pass and it will happily move along to actually running the tests.

After we have Pester installed, we invoke all of the *.Tests.ps1 files located in the Tests directory of the repository. The -Script parameter of Invoke-Pester accepts an array of paths, and will execute all the test files that it comes across.

You’ll also notice that I’m passing in an -OutputFile and -OutputFormat parameter, specifying an xml file name and the NUnitXml format. I do this, as the next step after running the tests is to publish those results to the pipeline. This gives you a very nice chart-style view of your test results after the pipeline executes, and managers love eye candy, am I right?!

The last line can be ignored, it’s simply there to verify that the xml file that I publish test results to is stored as I expect it to be.

Here’s the YAML code for that pipeline step in Azure:

steps:
- task: PowerShell@2
  displayName: 'Run Pester Tests'
  inputs:
    targetType: filePath
    filePath: ./Build/build.ps1
    arguments: '-Test'

Part 2 : Building the module

After I have run the Pester tests and published their results, I build the module. I develop with all of the functions split into their own individual ps1 files in a Public folder, and keep a psm1 that dot-sources everything in that folder when I load the psd1. This lets me test quickly as I develop, but it’s not ultimately how the module should behave when published for public consumption.
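That dev-time psm1 is tiny; a minimal version, assuming the Public folder layout described above, looks something like this:

```powershell
# Dev-time PSChocoConfig.psm1: dot-source every function file in Public
# so edits are picked up on the next Import-Module -Force
Get-ChildItem -Path "$PSScriptRoot\Public\*.ps1" | ForEach-Object {
    . $_.FullName
}
```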

The build script pulls the contents of each of those ps1 files and, using Add-Content, writes them to a fresh copy of a PSChocoConfig.psm1 file. Then I just use Copy-Item to lift and shift my psd1 file over to the PSChocoConfig directory that I create in the pipeline’s ArtifactStagingDirectory.

Here’s what that code looks like:

$Build {

        If(Test-Path "$($env:Build_ArtifactStagingDirectory)\PSChocoConfig"){
            Remove-Item "$($env:Build_ArtifactStagingDirectory)\PSChocoConfig" -Recurse -Force
        }

        $null = New-Item "$($env:Build_ArtifactStagingDirectory)\PSChocoConfig" -ItemType Directory

        Get-ChildItem $PSModuleRoot\Public\*.ps1 | Foreach-Object {

            Get-Content $_.FullName | Add-Content "$($env:Build_ArtifactStagingDirectory)\PSChocoConfig\PSChocoConfig.psm1"
        }

        Copy-Item "$PSModuleRoot\PSChocoConfig.psd1" "$($env:Build_ArtifactStagingDirectory)\PSChocoConfig"

        #Verification of contents
        Get-ChildItem -Path "$($env:Build_ArtifactStagingDirectory)\PSChocoConfig" -Recurse

        #Verify we can load the module and see cmdlets
        Import-Module "$($env:Build_ArtifactStagingDirectory)\PSChocoConfig\PSChocoConfig.psd1"
        Get-Command -Module PSChocoConfig

    }

Again, the last three lines of this section are simply verification that shows up in the output of the pipeline logs so I can trust that things worked correctly.

And here is what that YAML looks like:

steps:
- task: PowerShell@2
  displayName: 'Build Module'
  inputs:
    targetType: filePath
    filePath: ./Build/build.ps1
    arguments: '-Build'

Part 3 : Publishing to PSGallery

The final step in the pipeline is to publish the latest version of the module to the PowerShell Gallery. I do all the work to prep for release on the repository side, like making sure I’ve bumped the version inside the psd1 file. If I forget, the step will fail, which will fail the build, so I’ll know right away what I did.

Here’s what that deploy code looks like:

$Deploy {


        Try {

            $deployCommands = @{
                Path = (Resolve-Path -Path "$($env:Build_ArtifactStagingDirectory)\PSChocoConfig")
                NuGetApiKey = $env:NuGetApiKey
                ErrorAction = 'Stop'
            }

            Publish-Module @deployCommands

        }

        Catch {

            throw $_

        }

    }

You’ll notice the $env variable there. I’ve defined it as a secret variable in the Pipeline Variables section of the build pipeline, and reference it in the script via the $(NuGetApiKey) variable in the YAML.

And here’s that YAML for the build step:

steps:
- task: PowerShell@2
  displayName: 'Deploy to PSGallery'
  inputs:
    targetType: filePath
    filePath: ./Build/build.ps1
    arguments: '-Deploy'
  env:
    NuGetApiKey: $(NuGetApiKey)

Wrapping Up

I hope you’ve found my approach to pipelines useful. Sometimes keeping it simple is the best way to approach things, and this method works quite well. If you enjoyed this article, or have any feedback, please feel free to leave me a message here or drop me a line on Twitter @steviecoaster. Thanks for reading! Until next time…


Chocolatey Metapackages Explained

Chocolatey has the concept of a “meta” package. These types of packages don’t typically include any installation, upgrade, or uninstallation logic of their own. Instead, they provide a means to bundle existing packages as dependencies, giving you a single package to install, which in turn installs all of those dependent packages.

Controlling what packages you wish to include is handled entirely in the dependencies section of a nuspec file. In the following sections, we will work through the creation of a meta package.

Creating your metapackage

Open up an administrative PowerShell window and issue the following commands:


#configure a location for your metapackage
mkdir C:\packages

#enter the newly created directory
Set-Location C:\packages

#Create a new package
choco new -n metapackage

This sequence of commands will create a new C:\packages folder, and scaffold out a package named “metapackage” inside of that directory.

Let’s clean up this newly scaffolded package. Issue the following command:


Remove-Item C:\packages\metapackage\tools -Recurse -Force

This will remove the tools folder and all the files inside of it, since we don’t need them. Next, open up the metapackage.nuspec file in your favorite editor. I’ll be using VSCode moving forward.


code C:\packages\metapackage\metapackage.nuspec

To make things a little easier, just replace the code inside of this file with the following:


<?xml version="1.0" encoding="utf-8"?>
<package xmlns="http://schemas.microsoft.com/packaging/2015/06/nuspec.xsd">
  <metadata>
    <id>metapackage</id>
    <version>0.1.0</version>
    <!-- <packageSourceUrl>Where is this Chocolatey package located (think GitHub)? packageSourceUrl is highly recommended for the community feed</packageSourceUrl>-->
    <!-- owners is a poor name for maintainers of the package. It sticks around by this name for compatibility reasons. It basically means you. -->
    <!--<owners>__REPLACE_YOUR_NAME__</owners>-->
    <title>metapackage (Install)</title>
    <authors>Your Name Here</authors>
    <tags>metapackage SPACE_SEPARATED</tags>
    <summary>Installs included packages onto a system</summary>
    <description>
        #Chocolatey MetaPackage Installer

        Installs the following applications:

        - Google Chrome
        - Foxit Reader
        - VLC Media Player
    </description>
    <!-- Specifying dependencies and version ranges? https://docs.nuget.org/create/versioning#specifying-version-ranges-in-.nuspec-files -->
    <dependencies>
      <dependency id="googlechrome"  />
      <dependency id="foxitreader"  />
      <dependency id="vlc"  />
    </dependencies>
  </metadata>
</package>

In our example package, we are going to be installing Google Chrome, Foxit Reader, and VLC at their latest versions available from our internal repository. In your environment, you may use any packages you have available in your internal repositories. You may also specify strict version requirements for each package. The versioning follows the NuGet standard, which you can reference here.
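If you do want to lock dependencies to particular versions, the version attribute on each dependency element accepts NuGet version-range notation. A sketch of the options, with purely illustrative version numbers:

```xml
<dependencies>
  <!-- any available version, as in the example above -->
  <dependency id="googlechrome" />
  <!-- minimum version: 1.0.0 or higher -->
  <dependency id="foxitreader" version="1.0.0" />
  <!-- exactly this version, no other -->
  <dependency id="vlc" version="[3.0.0]" />
</dependencies>
```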

Packaging your metapackage

Once you have all of the packages you require for your metapackage to be complete, we can pack it to create the package nupkg. Issue the following command:

#Enter the package files directory
Set-Location C:\packages\metapackage

#Pack the package
choco pack metapackage.nuspec

Pushing your package

Next, let’s push this up to our internal repo.


#Enter the package files directory
Set-Location C:\packages\metapackage

#Push the package
choco push metapackage.0.1.0.nupkg -s $YourRepositoryUrl

Testing your package

Now that we have our nupkg generated, we can test it:


choco install metapackage -y

You should see that all 3 of our packages are installed on the system.

Uninstalling metapackages

Care must be taken when you wish to remove a metapackage. By default, issuing choco uninstall metapackage will not remove the dependencies installed by the package. To uninstall the metapackage and all of its dependencies, use the -x or --remove-dependencies command line argument.

We’ll uninstall the metapackage and all of its dependencies next.


choco uninstall metapackage -x

Wrapping up

If you’ve followed along with this document, you should have successfully created a metapackage and worked through the installation and uninstallation of that package. If you are stuck, our awesome community is here to help. Join the conversation on Gitter. Or, if you are a business customer, reach out through our commercial support channels.

Chocolatey metapackages….they’re sweet!


Using Azure Pipelines To Publish Chocolatey Packages

Hey folks! Wow, it’s been quite the dry spell, eh? Apologies, I’ve been super busy with my new role with Chocolatey Software. It’s involved a lot of PowerShell, and the opportunity to go to a few conferences and network with you, the Community. If we’ve met in person, great! I’m so glad I’ve got to meet you! If we haven’t yet, look out for me at some upcoming events like AnsibleFest, Microsoft Ignite! Orlando, and next year at PowerShell Summit to start. Now….onto business.

Premise

The topic here today is one near and dear to me, involving PowerShell, CI/CD, and Chocolatey! This post will be a walk-through guide to getting set up to deploy Chocolatey packages as code in your environments.

Prereqs

To be successful here you’ll need the following:

  • An organization on Azure Pipelines (don’t worry, you can sign up for free here)
  • Chocolatey installed on the server/workstation to run this against
  • An Azure pipelines build Agent
  • The Chocolatey Extension added to your pipeline from the Azure Pipelines Marketplace

First things first, install Chocolatey

You can install Chocolatey by following the instructions on chocolatey.org. There’s a lot of information there, but the line you’ll ultimately be after is this one:

Set-ExecutionPolicy Bypass -Scope Process -Force; iex ((New-Object System.Net.WebClient).DownloadString('https://chocolatey.org/install.ps1'))
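One gotcha worth noting: on older systems, Windows PowerShell may default to an SSL/TLS version that chocolatey.org rejects, and the download will fail. A hedged sketch of the usual workaround, run in the same session before the install line above:

```powershell
# Force TLS 1.2 (enum value 3072) for this session; older Windows PowerShell
# builds may otherwise attempt SSL3/TLS 1.0 and fail the download.
[System.Net.ServicePointManager]::SecurityProtocol = [System.Net.ServicePointManager]::SecurityProtocol -bor 3072
```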

If you are a Chocolatey For Business customer, you’ve likely already got choco installed, or have it available to install via an internal repository server, so we’ll skip that part here.

Next: An Azure DevOps Account

You’ll need to sign up for Azure DevOps in order to get an organization. You can do that by following the link here to the Sign Up page. I recommend the ‘Start Free’ link, rather than signing up with GitHub, so as not to mix things up too much.

Getting things ready in Azure: Step 1

The first thing to do is navigate to Security by clicking the circle with your first initial in it, in the top right-hand corner of the Azure DevOps window.

Once there, click on Personal Access Tokens, and create a new one. Here you’ll have options for expiration time, Scope, and various access levels. Select ‘All Scopes’ at the bottom, and give ‘Agent Pools’ the ‘Read & Manage’ right. Go ahead and also select ‘Read & Execute’ for the ‘Build’ scope, since that is the type of pipeline we will be working on.

Click ‘Create’.

IMPORTANT: Make sure you keep the PAT somewhere safe, like Notepad or, better, a password vault, as once you navigate away from the page you cannot view it again.

Getting things ready in Azure: Step 2

Next, we’ll need to create an Agent pool. Click ‘Azure DevOps’ in the upper left-hand area of your browser window to be taken back to the landing page. In the bottom left-hand corner, select Organization Settings.

From the menu on the left, under Pipelines, select ‘Agent pools’. Click ‘Add Pool’ and give it a meaningful name. I used ‘Choco’…obviously.

Getting things ready in Azure: Step 3

Now it’s time to add an agent. Since we are using Chocolatey, this is as easy as running the following:

choco install azure-pipelines-agent -y --param="'/Token:yourtokenhere /Pool:PoolNameFromStep2 /Url:https://dev.azure.com/yourorgname'"

Once you run the above, you should be able to go back into Organization Settings, and under Agent Pool select the Pool you created, and then Agents to view your newly connected Agent.

Create your package

Create a Chocolatey package. Put this new package into source control. I used GitHub. I’ll also give you some sample yaml to include in the repo, which will be used in the next step to create the pipeline.

pool:
  name: Choco
trigger:
  branches:
    include:
    - master
    exclude:
    - develop
steps:
- task: gep13.chocolatey-azuredevops.chocolatey-azuredevops.ChocolateyCommand@0
  displayName: 'Chocolatey pack'
  inputs:
    packNuspecFileName: 'yourpackagenuspec.nuspec'
- task: gep13.chocolatey-azuredevops.chocolatey-azuredevops.ChocolateyCommand@0
  displayName: 'Chocolatey push'
  inputs:
    command: push
    pushNupkgFileName: 'yourpackagename.nupkg'
    pushSource: 'http://yourinternalrepo/url'
    pushApikey: $(ChocoApiKey)
    pushForce: true

Change the above yaml to suit your needs. The only things you should need to change are the file names, the URL, the agent pool, and the ApiKey variable. More on that variable in the next step.
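For reference, the nuspec file that the pack step points at can be quite minimal. A sketch, using the hypothetical file names from the sample yaml (your real id, version, and description will differ):

```xml
<?xml version="1.0" encoding="utf-8"?>
<package xmlns="http://schemas.microsoft.com/packaging/2015/06/nuspec.xsd">
  <metadata>
    <!-- id + version become yourpackagename.1.0.0.nupkg after choco pack -->
    <id>yourpackagename</id>
    <version>1.0.0</version>
    <authors>Your Name</authors>
    <description>Package built and pushed by the Azure Pipelines build.</description>
  </metadata>
</package>
```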

Create a pipeline

Back in Azure DevOps, if you have not created a Project, do so now. I recommend naming it the same as the package you created.

Inside the project, select Pipelines, and then Builds, and create a new Build Pipeline. Select the Use the classic editor link at the bottom of the options. Select GitHub as your source, configure any connections it requires, and then select your source repository and default branch (which should be master). On the next page select Configuration as Code, select the yaml file from your repository, and save the pipeline.

Let’s talk quickly about that ApiKey variable. Under your new Build Pipeline click ‘Edit’ from the menu if you’ve already saved it. You’ll notice when you edit the pipeline, your yaml file is shown. In the right-hand corner is a ‘Variables’ button. Click that button, and then select the ‘+’ sign to add a new variable. Give it a meaningful name, and paste in the API key that is used for your repository server. Ensure you select the ‘Keep this value secret’ checkbox. This will eliminate the risk of it being exposed in your yaml and at runtime in logs.
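To illustrate how this fits together: non-secret values can be declared right in the yaml, while the secret stays in the Variables UI, and both are referenced with the same $(Name) syntax. A sketch, assuming the variable is named ChocoApiKey as in the sample pipeline:

```yaml
# Non-secret values can live in the yaml; ChocoApiKey is defined as a
# secret variable in the pipeline UI and resolved at queue time.
variables:
  pushSourceUrl: 'http://yourinternalrepo/url'

steps:
- task: gep13.chocolatey-azuredevops.chocolatey-azuredevops.ChocolateyCommand@0
  displayName: 'Chocolatey push'
  inputs:
    command: push
    pushNupkgFileName: 'yourpackagename.nupkg'
    pushSource: $(pushSourceUrl)
    pushApikey: $(ChocoApiKey)
```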

It’s time to run!

If you’ve been able to follow all of the steps above, you should be good to go. Since you’ve got your package files ready to go on your repository, you should be able to queue up the pipeline, and let it run. Check it for errors, and correct any that show up.

If you have any questions, feel free to reach out to me on Twitter.

Read More

Pscmdlet Unwrapped

I’m sure we’ve all seen it in code online. We’re checking out some code, probably on GitHub, and we notice folks using this strange $PSCmdlet thing. When I first started I considered this variable “black magic”, and really, after getting used to it and what it does, I still kinda feel like that’s what it is. But let’s dive in and unwrap it so we can see how the magic trick actually works.

Read More

It's Ok To Fail

We’ve all hit that low spot, when we are working with an especially difficult problem, and the code we are writing to try and solve it just won’t work the way we want it to. I often find that it is in these moments, when I’m contemplating throwing in the IT towel and becoming a hermit, that I grow the most.

Read More

Write Help! It Matters!

There’s nothing more frustrating than running Get-Help against a cmdlet and not getting a lot of useful information back. The PowerShell Help System is a robust tool that makes your scripts extremely discoverable to their users.

Read More

Adding Pipeline Support to Your Scripts!

You’ve probably used the pipeline 1,000 times and not thought about how exactly you can send the output of one command to another. It’s actually incredibly simple to do, and enables you to have a LOT of flexibility in your scripts!

Read More

Setting up Jekyll to build with AzDOS

I spent the better part of the weekend exploring the various ways I could leverage CI/CD to streamline getting content that I’ve written onto my blog. This post will be about my journey, and hopefully provide a useful resource to you, should you choose to do something similar.

Read More

Using Toast notifications in PowerShell

Notifications can be quite a bit more than just those annoying pop-ups you see down by your system clock from time to time. When created using PowerShell, you can get an incredible amount of functionality out of them.

Read More