Beginner’s Guide to Using the Classic Azure DevOps Build Process

Introduction

For one of our recent modernization projects, I had the opportunity to implement Continuous Integration (CI) using the Azure DevOps Build and Release process, so I thought I would share what I learned.  This is part 1 of a two-part series, covering the Build process here and the Release process in part two.  Neither part is meant to be comprehensive, but together they make a good ‘getting started’ guide for creating your own builds.

There are currently two different approaches in Azure DevOps for implementing continuous integration.  I’m only going to touch on the classic editor version, which Microsoft refers to as ‘Builds and Releases’.  The other approach is called ‘Azure Multi-stage Pipelines’ – which I believe is now out of preview mode – and stores its build definitions in YAML files; I hope to learn more about it in the future.  Since my work on this project took me down the classic editor path, that is what I will be focusing on today.  If you decide you want to play with the YAML version, there are a couple of handy tools available to help you start making the transition from classic to YAML.

For our project, the purpose of the pipeline was to create a SharePoint .sppkg package file that can be deployed into multiple environments.  If you are not working on a SharePoint O365 project, your steps will be different but will produce a similar outcome.  The pipeline basically mimics what we do manually to create this package.  The big gain is that the package is created automatically each time source code is merged into the master branch.  We no longer have to do each step by hand and then place the .sppkg package where our QA person can find it.

The Build Process

The first thing you need is a project with an associated repository in Azure DevOps.  If you currently do not have one, you can follow one of the getting-started guides Microsoft provides to set something up.  Once you have your repository added and some code to build, you are ready to go.

In this example, I’ll be working with a Git repository hosting a SharePoint O365 SPFx project built with React, which uses Node.js, Gulp and npm.

Let’s Get Started

To get started, click on the Pipelines link found under the Pipelines menu (huh?) and you will be shown a screen like the one below.  Note that I already have two builds here.  The first is our official pipeline used for Dev, QA and Production builds, while the second is a build my co-worker added in order to do some of his own release testing.  You can have as many pipelines for a project as you need or want, which can come in quite handy.

Click the big blue ‘New pipeline’ button found in the upper right-hand corner, which will take you to a screen with lots of confusing-looking options.  Since I’m focusing on the classic editor, we will ignore the top portion and click on the small link at the bottom called ‘Use the classic editor’.  The classic editor walks you through a wizard-like process to guide you as you create your build.

The first thing it wants to know is where your code is.  It will default to the current project, current repository, and the master branch of said repository.  If you have multiple repositories in your project, you can change to any of them and can select alternate branches for your build.  In most cases, we will accept what is chosen for us already.

Let’s Continue

Click the Continue button and we are taken to a screen where we can select a template to get us started, or we can build from scratch with an empty job.  When starting out, I recommend selecting a template that is close to what you need as your jumping-off point.  As you get more proficient with the tool, you can start from an empty job and build everything yourself.

As you can see, there are many templates to choose from.  We are using Node.js with Gulp, so I selected that template to get me started.  After selecting your template, click on the Apply button.

And now the real fun begins.  Notice that the template has put in key tasks for us already.

Skeleton

Now that we have the skeleton, let’s fill it out.  If we were to generate our package file manually each time, these are the steps we would need to do:

  • Open PowerShell (or your favorite other terminal) and change to the correct project’s folder.
  • Make sure you are in the correct branch (usually master) and that your source code is current.
  • Now run the following in order:
    • gulp clean
    • gulp build
    • gulp bundle --ship (--ship means to prepare it as a standalone app package that can run in production)
    • gulp package-solution --ship
  • Then we copy the output created to source control or send it to the QA person.

Not really a process I want to follow each time I push a batch of changes to master.  Here is what the same steps look like when using the Pipelines process (taken from our current QA/Prod pipeline).  Note that I have steps in place for creating both a QA/Prod app package and a Debug app package that can be used by devs.  In the example you will follow here, I’ll only add the QA/Prod tasks for brevity.  Feel free to add the dev steps on your own.

Pipeline Walkthrough

Let’s walk through each section/step, starting with the Pipeline itself.  Here you specify the name of the pipeline and tell it which agent pool and agent to use.  Fortunately, Microsoft provides us with a pool of hosted agents, so we don’t have to build our own.  Unless you just want to, of course.  What Microsoft provides are virtual machine images that run your pipeline, and there are many to choose from.  I’ll admit to not learning enough about this when setting up my job and ended up just using the default agent, which for me was Ubuntu-16.04.  Probably not the smartest thing to do, but … it has worked out so far.  😊
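
As an aside, if you ever peek at the YAML that this graphical choice maps to, the agent selection is just a short pool definition.  A minimal sketch, assuming the same hosted image I used:

    # Microsoft-hosted agent image; swap in whichever image your project needs
    pool:
      vmImage: 'ubuntu-16.04'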

For more information on Agents, both creating your own and using a Microsoft-hosted agent, give these links a look.

https://docs.microsoft.com/en-us/azure/devops/pipelines/agents/agents?view=azure-devops&tabs=browser

https://docs.microsoft.com/en-us/azure/devops/pipelines/agents/hosted?view=azure-devops&tabs=yaml

Next Step

The next step in the process is to specify the source location.  This should already be populated for you but feel free to change to some other codebase.  As you can see, it handles multiple types of repositories.

We’re now ready to tackle the meat of the process, which is building out our agent job.  If you started with the same template I did, you will see that several tasks are already added for us.  We just need to configure these and add our other tasks, configuring and ordering them as we go.

Since our project is Node-based, the first thing we need to do is have Node installed on the agent.  To do this, I clicked on ‘Agent job 1’ and then clicked on the big ‘+’ sign to the right.  This brought up a long alphabetical list of tasks from which we can cherry-pick the things we need.  Scroll down this list until you see the ‘Node.js tool installer’ and add it to the agent job.  Note that it was automatically added as the last task.  Of course, it should be the first task, so simply drag and drop it from the bottom to the top.  Tip:  If you select a task in the agent job first, it will add the new one below the selected task instead of at the bottom.

Our project requires Node 10.x, so after adding the task, I changed the configuration to specify the version we need.  I also changed the Display name to something more meaningful.  Tip:  Most tasks have an ‘info’ button that will give you more information on what exactly the task will be doing.
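
For reference, here is roughly how this task’s settings translate into YAML pipeline syntax (the classic editor stores these same values for you, so treat this as a sketch rather than something you need to type in):

    # Install Node on the (freshly created) agent before anything else runs
    - task: NodeTool@0
      displayName: 'Install Node 10.x'
      inputs:
        versionSpec: '10.x'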

The Next Step

The next task – npm install – needs no further configuration by me, as it already does what I need it to do: run ‘npm install’ to pull down the project’s dependencies.  (Since this is a newly created VM with each run, it has to do the installs (node/npm/gulp) with each run in order to bundle our code correctly.)
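
The YAML equivalent of this step is about as bare-bones as it gets – the task defaults to the ‘install’ command, which runs against the package.json in the repo root:

    # Restore the project's npm dependencies (gulp and friends come in via package.json)
    - task: Npm@1
      displayName: 'npm install'
      inputs:
        command: 'install'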

The next several tasks all revolve around gulp.  You will see that the template adds one gulp task for us with some default values already set.  You will need to tell it specifically what you need it to do using the ‘gulp Task(s)’ and ‘Arguments’ fields.  For our first task, we will specify that it run ‘build’ with no arguments.  Tip:  It’s also a good idea to get into the habit of changing the Display name to be meaningful and descriptive.  It really helps when revisiting the Pipeline later.

Also, note that the ‘JUnit Test Results’ section is sometimes set to run (the Publish checkbox is checked).  This section is used for sending any test results from the gulp task to Azure Pipelines.  I haven’t delved into this or any unit testing yet, so within my processes, I have unchecked it.  In the future, I hope to learn more about automated testing, but for now, I just keep it unchecked for each gulp task.
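
Putting that together, a sketch of this first gulp task in YAML form might look like the below (note publishJUnitResults turned off, matching the unchecked box):

    # Run 'gulp build' with no extra arguments
    - task: Gulp@1
      displayName: 'gulp build'
      inputs:
        gulpFile: 'gulpfile.js'
        targets: 'build'
        publishJUnitResults: false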

Adding Gulp Tasks

We now add two more gulp tasks and configure them to match the steps we would typically do manually, specifically ‘bundle’ and ‘package-solution’.  When you run gulp bundle on an SPFx project, it creates a ‘dev’ build by default.  We need to tell it to get ready for QA/production instead, so we will use the Arguments field for this by adding ‘--ship’ as an argument.

And now the package step, which is almost identical to the bundle step.
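
Sketched in YAML, this pair of tasks would look something like the below – the same task type each time, with only the target and display name changing:

    # Bundle the solution for production
    - task: Gulp@1
      displayName: 'gulp bundle --ship'
      inputs:
        targets: 'bundle'
        arguments: '--ship'
        publishJUnitResults: false

    # Create the production-ready app package
    - task: Gulp@1
      displayName: 'gulp package-solution --ship'
      inputs:
        targets: 'package-solution'
        arguments: '--ship'
        publishJUnitResults: false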

Congrats!

You just generated a production-ready .sppkg package.  But … where is it?  Right now, it is sitting in your project’s folder structure on the build agent, so we need to copy it elsewhere in order to ultimately publish it to an artifacts location.

Microsoft provides us with many predefined variables that we can use to help us with this process.  To move our file to a location we can use later, we need to use two tasks.  The first is the ‘Copy files’ task.  Within this task, we need to tell it what we are copying and where we want it to go.  For the target, we are going to use the Build.ArtifactStagingDirectory variable and specify a ‘prod’ folder within it.  I’m also specifying what exactly I want to copy.  Since all I need for my SharePoint app is the .sppkg file, that is all we are going to copy.
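
In YAML form, the copy step would look something like this.  The ‘**/*.sppkg’ pattern is my shorthand for ‘just grab the package file’ – adjust it if your project produces more than one:

    # Copy only the .sppkg package into the staging area, under a 'prod' subfolder
    - task: CopyFiles@2
      displayName: 'Copy .sppkg to staging (prod)'
      inputs:
        Contents: '**/*.sppkg'
        TargetFolder: '$(Build.ArtifactStagingDirectory)/prod'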

To learn more about predefined variables, check out this link.  https://docs.microsoft.com/en-us/azure/devops/pipelines/build/variables?view=azure-devops&tabs=yaml

The template we selected includes an ‘Archive files’ task by default.  This task compresses the copied files into an archive format of your choice.  Since we are only dealing with one file, I removed this task from my pipeline.

Final Task

This leaves us with the final task needed to access our file easily – the ‘Publish artifacts’ task.  This task does not require any changes if you follow the model Microsoft recommends of using the staging directory variable and a ‘drop’ folder.  You will see below that it is already prepopulated with what we need by default.
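
For completeness, here is that default configuration sketched in YAML – publish everything in the staging directory as an artifact named ‘drop’:

    # Publish the staging directory contents as the 'drop' artifact
    - task: PublishBuildArtifacts@1
      displayName: 'Publish artifact: drop'
      inputs:
        PathtoPublish: '$(Build.ArtifactStagingDirectory)'
        ArtifactName: 'drop'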

We are now ready to run our pipeline and create our package.  One really cool thing is that once you save and queue the job, you can track its progress by drilling down into the job and watching the tasks as they run.  Below is a snapshot of our actual production build.  I’ve selected the Copy files task to show you how the predefined variables are expanded as they are incorporated into the build.  In this run, all the tasks were successful and our file ended up in the drop folder.

So let’s go find it!  From Pipelines > Pipelines on the left, select your specific pipeline and then the run that just completed, and you should see the screen below.  From here you can go find the file you just created.  To do this, click on the ‘x published’ link found under the ‘Related’ section, which will take you to the ‘Artifacts’ list.

Finding the File

Here you will find the ‘drop’ folder and can drill down to anything that was published to it.  For SharePoint .sppkg files, the file is usually buried deep in a SharePoint folder.  Remember that in the Copy files task, we specified $(Build.ArtifactStagingDirectory)/prod as our target folder.  The target folder is found just under the ‘drop’ folder.  Drill down until you get to the file(s) you need, where you will then have the ability to download them.

This is a snapshot of actual output from our pipeline.  While in development, it’s nice to have a debug (dev) version and a prod version of the app package, so I added some extra steps in order to create and publish both, as you see here.

Conclusion

And that is it!  You now have a fairly robust automated build tied to your repository.  I hope this has been helpful.  Keep an eye out for my next blog post, which will show you how to create a Release process tied to this build, one that can be configured to run either automatically or manually, depending on your specific needs.

Enjoy!

Caroline Sosebee
