Deploying your SPA to Azure with Gulp

Rob Greyling
20 Feb 2015

This question keeps coming up from time to time, and while it may not always be the same front-end, this tutorial should at least set you on the right path to getting your build automated with Azure. Who knows, it may work for you - hopefully it does!

OK, so you've got a front-end SPA written in Durandal, and a back-end written in .NET taking the form of a NancyFx REST API?

No? Well close enough...if yours is similar, you can probably follow along most of the way, but the key thing we have in common is that you want to take advantage of being able to automatically deploy on Azure Websites using their nifty link direct to your source control - effectively using the cloud as your build and deployment server right? Alrighty then, let's go!

What you'll need

First, I'm going to make a few assumptions that you'll need to have in place before you can pick up from this article:

  • You have an Azure account where you can login to the portal and create a hosted Azure website;
  • You have your code checked in to some kind of source control (like GitHub, for example) that can be accessed over the web. It doesn't matter if it's private - just as long as Azure would be able to reach it if given the right credentials;
  • Your Durandal based website is working locally and now you're ready to deploy.

Assembling the puzzle pieces

First you need to choose a source control branch that you want to deploy, and make sure Azure is paying attention to it. You can follow this tutorial to get that part set up. After that, you can follow on from here... go ahead, I'll wait :)

Ok, now we need to get our local environment set up. First you'll need to have installed node.js, at which point you should be able to open a console (Administrator level) at your web project root folder and run this:

npm install gulp -g

We now have gulp installed which we will use as the second half of our build process - the first half being MSBuild for the .NET bits.

Taking the 'ass' out of assets

One of the big questions that comes up is "how do I version my stuff so that regular visitors get the new asset updates?". It's a good question, and you could use one of the cache-busting techniques often suggested.

Of course, there are numerous posts out there that will tell you NOT to add your version to the querystring. The biggest problem is that proxies around the web often do not cache pages with querystrings, which completely defeats your reason for using it in the first place. What you end up with is some visitors getting your latest scripts, while others who happen to arrive via a Squid proxy, for example, never update to your latest scripts and get strange or broken experiences.
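
To make that concrete, here's a tiny sketch of the two URL styles. The helper function is my own invention for illustration, not part of the build we set up later - it just shows how the version-stamped filenames used throughout this article are put together:

```javascript
// Hypothetical helper: stamp the release version into the file name
// itself, so every release gets a brand new, plainly cacheable URL.
function versionedAsset(base, ext, version) {
  return base + '.min-' + version + '.' + ext;
}

// Querystring style - proxies like Squid may refuse to cache this:
var querystringStyle = '/content/css/site.css?v=1.5.17';

// Filename style - a clean URL that any proxy is happy to cache:
var filenameStyle = '/content/css/' + versionedAsset('site', 'css', '1.5.17');

console.log(filenameStyle); // /content/css/site.min-1.5.17.css
```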

I say, forget about versioning assets automagically - there's no point. It doesn't take long to bump a version by hand, and you should be keeping control of that stuff anyway, especially when it's going to production; in development you can just tell Chrome or Firefox dev tools not to cache.

I felt I needed to address that reasoning first, because you'll see versioning built into the next parts as we set up the build, and it might otherwise have been confusing.

Getting it all ready to build automatically

As you probably know, most SPAs run off one index.html in the root folder of the project. Since you have the website working locally, I assume this is already working as expected. The next thing I do is put a bit of logic in the Nancy module, just enough to tell the view engine what to spit out when rendering index.html:

public class HomeModule : NancyModule
{
    public HomeModule()
    {
        this.Get["/robots.txt"] = p => this.Response.AsFile("robots.txt");
        this.Get["/sitemap.xml"] = p => this.Response.AsFile("sitemap.xml");
        this.Get["/"] = x => this.HomePage();
        this.Get["/{path*}"] = x => this.HomePage();
    }

    private dynamic HomePage()
    {
#if DEBUG
        dynamic model = new { IsDebug = true };
#else
        dynamic model = new { IsDebug = false };
#endif
        return this.View["index", model];
    }
}

You can obviously send any viewmodel details that make sense for your app, but right now we're interested in whether or not this is a debug or release build, as you can see. Now that we know, we can do the following in index.html (or a Razor file if that's your bag, but I'm just using the SSVE bundled with NancyFx, which does the job nicely):

In the <head> tag:

@If.IsDebug
<link rel="stylesheet" href="/content/lib/bootstrap/css/bootstrap.min.css" />
<link rel="stylesheet" href="/content/lib/font-awesome/css/font-awesome.min.css" />
<link rel="stylesheet" href="/content/lib/datepicker/css/datepicker.min.css" />
<link rel="stylesheet" href="/content/lib/durandal/css/durandal.min.css" />
<link rel="stylesheet" href="/content/css/ie10mobile.min.css" />
<link rel="stylesheet" href="/content/css/site.css" />
@EndIf

@IfNot.IsDebug
<link rel="stylesheet" href="/content/css/deps.min-1.5.17.css" />
<link rel="stylesheet" href="/content/css/site.min-1.5.17.css" />
@EndIf

And lower down:

@If.IsDebug
<script src="/content/lib/require/require.js" data-main="/app/main"></script>
@EndIf

@IfNot.IsDebug
<script type="text/javascript" src="/content/lib/site.min-1.5.17.js"></script>
@EndIf

As you can see, the debug build contains all of my assets as I would expect during development, but I'm expecting that all of these will be combined and minified to my site.min asset files when it comes to production. More on how to create those later.

But first, let's get MSBuild outputting this somewhere we can test it. Create yourself a batch file in the solution root folder that runs MSBuild and outputs to a build folder.

The reason for doing this is not because you don't trust Visual Studio to build, but because you want your local efforts to mimic the deployment environment as much as possible and for that you need MSBuild. It should end up looking something like this:


c:\Windows\Microsoft.NET\Framework\v4.0.30319\MSBuild.exe MySpa.Web\MySpa.Web.csproj /nologo /verbosity:m /t:Build /t:pipelinePreDeployCopyAllFilesToOneFolder /p:_PackageTempDir=../build;AutoParameterizationWebConfigConnectionStrings=false;Configuration=Release

Go ahead and double click it and check that the website pieces output to the build folder as intended. This will be the first step you execute when it comes to testing this locally before Azure can get its grubby paws on the release.

Unfortunately, once you get past .NET code into JavaScript and HTML territory, MSBuild doesn't distinguish which files should be deployed and which should be ignored, so we need to make sure that the output to the build folder is as clean as we can get it. For that, we're going to add a set of Gulp tasks that take care of it for us.

Gulp it all down - yes all of it

In the beginning, the custom build tool for Durandal was called Weyland, which was actually pretty darn good. It is, however, no longer maintained or supported, and the recommended approach is to use gulp. This will also help in the long run if you decide to move to Durandal vNext - AKA Aurelia - where gulp is also the recommended approach, so you won't need to throw those skills away when migrating.

First thing you want to do is create a package.json file and a gulpfile.js in the root folder of your solution. Now, I could give you all the commands to install each package you need from npm individually, but I'll just provide an example package.json here and you can use it as a starting point:

{
  "name": "MySpa",
  "abbr": "ms",
  "instance": "MySpa.Web",
  "version": "1.5.17",
  "description": "Run Gulp",
  "main": "gulpfile.js",
  "repository": {
    "type": "git",
    "url": "https://your.repository.url"
  },
  "devDependencies": {
    "gulp": "^3.8.7",
    "gulp-concat": "^2.3.5",
    "gulp-debug": "~1.0.0",
    "gulp-jshint": "^1.8.4",
    "gulp-rename": "^1.2.0",
    "gulp-sass": "^0.7.3",
    "gulp-uglify": "^1.0.1",
    "gulp-durandal": "~1.1.5",
    "gulp-util": "~3.0.1",
    "karma": "~0.12.23",
    "karma-jasmine": "~0.1.5",
    "karma-chrome-launcher": "~0.1.4",
    "karma-phantomjs-launcher": "~0.1.4",
    "requirejs": "~2.1.15",
    "karma-requirejs": "~0.2.2",
    "karma-html-reporter": "~0.2.4",
    "gulp-minify-css": "~0.3.8",
    "gulp-clean": "~0.3.1",
    "rimraf": "~2.2.8",
    "gulp-rimraf": "~0.1.0",
    "run-sequence": "~0.3.6",
    "gulp-minify-html": "~0.1.5"
  },
  "dependencies": {
    "gulp": "^3.8.7",
    "gulp-concat": "^2.3.5",
    "gulp-debug": "~1.0.0",
    "gulp-jshint": "^1.8.4",
    "gulp-rename": "^1.2.0",
    "gulp-sass": "^0.7.3",
    "gulp-uglify": "^1.0.1",
    "gulp-durandal": "~1.1.5",
    "gulp-util": "~3.0.1",
    "karma": "~0.12.23",
    "karma-jasmine": "~0.1.5",
    "karma-chrome-launcher": "~0.1.4",
    "karma-phantomjs-launcher": "~0.1.4",
    "requirejs": "~2.1.15",
    "karma-requirejs": "~0.2.2",
    "karma-html-reporter": "~0.2.4",
    "gulp-minify-css": "~0.3.8",
    "gulp-clean": "~0.3.1",
    "rimraf": "~2.2.8",
    "gulp-rimraf": "~0.1.0",
    "run-sequence": "~0.3.6",
    "gulp-minify-html": "~0.1.5"
  }
}

Notice near the top of the file - there is the version specified for the JavaScript side of the equation - we'll be using that later in gulp.

Once you've got the file saved, then run this command from the console to download all the packages:

npm install

You can of course use npm to make sure you have the latest versions of those packages above, but I'll leave that as an exercise for the reader.
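
A quick side note on those version ranges, since the file mixes caret (^) and tilde (~) prefixes: roughly speaking, ^ accepts any update within the same major version, while ~ also pins the minor version. Here's a deliberately simplified sketch of the idea - my own toy function, not npm's real semver grammar, and it ignores pre-release tags and npm's special rules for 0.x caret ranges:

```javascript
// Simplified illustration of npm's range operators.
// A version satisfies a range if it is >= the base version and stays
// within the boundary the operator allows.
function satisfies(range, version) {
  var op = range[0];
  var min = range.slice(1).split('.').map(Number);
  var v = version.split('.').map(Number);
  // The candidate must be at least the base version.
  for (var i = 0; i < 3; i++) {
    if (v[i] > min[i]) break;
    if (v[i] < min[i]) return false;
  }
  if (op === '^') return v[0] === min[0];                     // same major
  if (op === '~') return v[0] === min[0] && v[1] === min[1];  // same major.minor
  return false;
}

console.log(satisfies('^3.8.7', '3.9.0'));  // true  - minor bump is fine
console.log(satisfies('~1.0.0', '1.1.0'));  // false - tilde pins the minor
```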

Next comes the gulpfile.js where we tell gulp exactly how to construct our site. We've already got half of it done in a build folder because of the MSBuild step above, and now we just need to polish it. Gulp is just written using plain old JavaScript, so first we will tell it to import all the bits we need for our tasks:

var pkg = require('./package.json');

// Include gulp
var gulp = require('gulp');

// Include Our Plugins
var runSequence = require('run-sequence');
var clean = require('gulp-clean');
var jshint = require('gulp-jshint');
var minifyCSS = require('gulp-minify-css');
var concat = require('gulp-concat');
var uglify = require('gulp-uglify');
var rename = require('gulp-rename');
var debug = require('gulp-debug');
var durandal = require('gulp-durandal');

Now add a paths object to provide easy access for the tasks to the files you want them to deal with:

var paths = {
    root: pkg.instance,
    rootFiles: [
        pkg.instance + '/*.html',
        pkg.instance + '/favicon.ico',
        pkg.instance + '/Web.config',
        pkg.instance + '/robots.txt',
        pkg.instance + '/App_Offline.bak'
    ],
    bin: pkg.instance + '/bin/*.dll',
    app: pkg.instance + '/app',
    scripts: pkg.instance + '/app/**/*.js',
    html: pkg.instance + '/app/**/*.html',
    sass: pkg.instance + '/content/sass/**/*.scss',
    libs: pkg.instance + '/content/lib/**/*.js',
    fonts: [
        pkg.instance + '/content/lib/bootstrap/fonts/bootstrap/*.*',
        pkg.instance + '/content/lib/font-awesome/fonts/*.*'
    ],
    img: [
        pkg.instance + '/content/img/*.*'
    ],
    css: [
        pkg.instance + '/content/lib/bootstrap/css/bootstrap.min.css',
        pkg.instance + '/content/lib/font-awesome/css/font-awesome.min.css',
        pkg.instance + '/content/lib/datepicker/css/datepicker.min.css',
        pkg.instance + '/content/lib/durandal/css/durandal.min.css',
        pkg.instance + '/content/css/ie10mobile.min.css'
    ]
};
Next we add our main task which is executed by default when running gulp from the console.

// Default Task
gulp.task('default', ['build']);

If you run that it will break, because we've told gulp that our default task is called 'build' and we haven't defined it yet - so let's do that in the file:

gulp.task('build', function (callback) {
    runSequence(
        'clean',
        ['lint', 'sass', 'deps-css', 'fonts'],
        'durandal',
        callback);
});

First thing you may notice above is the runSequence call. You see, gulp runs tasks concurrently by default. This is not always a good thing, since you don't want to be cleaning a folder out and pushing files into it at the same time. That would be hilari.... no, no it wouldn't.

So we use that runSequence plugin to make sure certain calls are made sequentially, and others are run in parallel. Everything you see grouped in the array, we are telling gulp can be done in parallel and won't affect each other. Otherwise, it should use the sequence you specify.
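
If you're curious what that sequencing boils down to, it's essentially chaining task callbacks. Here's my own stripped-down toy version of the idea, not the plugin's actual code:

```javascript
// Toy version of sequential task running: each step starts only after
// the previous one signals completion through its callback.
function runInSequence(tasks, done) {
  (function next(i) {
    if (i === tasks.length) { return done(); }
    tasks[i](function () { next(i + 1); });
  })(0);
}

var order = [];
runInSequence([
  function (cb) { order.push('clean'); cb(); },
  function (cb) { order.push('build'); cb(); },
  function (cb) { order.push('durandal'); cb(); }
], function () {
  console.log(order.join(' -> ')); // clean -> build -> durandal
});
```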

So first, we run our 'clean' task - oh crap, we don't have that yet. Come to think of it, we don't have any of those tasks listed yet. In that case, we better add them - here you go:

//Clean build folders
gulp.task('clean', function () {
    return gulp.src(['build/app', 'build/content/sass', 'build/content/lib', 'build/specs', 'build/test-main.js'], { read: false })
        .pipe(clean());
});

// Lint Task
gulp.task('lint', function () {
    return gulp.src(paths.scripts)
        .pipe(jshint({ sub: true })) // ignore dot notation errors
        .pipe(jshint.reporter('default'));
});

// Compile Our Sass
gulp.task('sass', function () {
    return gulp.src(paths.sass)
        //.pipe(debug({ verbose: true }))
        .pipe(concat('site-' + pkg.version + '.css'))
        .pipe(minifyCSS({ keepBreaks: true }))
        .pipe(rename('site.min-' + pkg.version + '.css'))
        .pipe(gulp.dest('build/content/css'));
});

// Compile Our dependency CSS
gulp.task('deps-css', function () {
    return gulp.src(paths.css)
        .pipe(concat('deps-' + pkg.version + '.css'))
        .pipe(minifyCSS({ keepBreaks: true }))
        .pipe(rename('deps.min-' + pkg.version + '.css'))
        .pipe(gulp.dest('build/content/css'));
});

// Copy our fonts
gulp.task('fonts', function () {
    return gulp.src(paths.fonts)
        .pipe(gulp.dest('build/content/fonts'));
});

// Bundle the Durandal app into a single file
gulp.task('durandal', function () {
    return durandal({
        output: 'site.min-' + pkg.version + '.js',
        almond: true,
        minify: true
    })
    .pipe(gulp.dest('build/content/lib'));
});
Notice that we only clean out certain folders - for example we leave build/bin alone because those are the .NET files needed to run the site. We only use Gulp to clean out the folders where content and scripts live so that they get properly recycled on the build. Of course there are a million ways to skin this cat, so decide on a strategy that works for you.

Also notice the use of pkg.version in naming the output css and js files. That will sync up with what we have output from the .NET side.

The 'durandal' task is the bit of magic which crawls through your Durandal main.js require bits and pulls all the dependency graphs together into one output file. I would recommend setting minify to false at first while you're testing, because it takes incredibly long to uglify, as you may or may not have experienced elsewhere. Once you know it's working, flip minify to true and check that nothing got lost in translation.

I have a few other gulp tasks that I use because I run a TDD setup, but those are out of scope for this article. Also, be aware that you can break your gulpfile into multiple files when they get too big, but it's easy to find tutorials on that as well so I'll leave it up to you.

Now you should be able to call gulp from the console and watch your project output to the same build folder that MSBuild used, but now with a clean build.

Testing the output locally

Right! So you have this build folder with a potentially working site - now you need to test it to see if all looks well before you dive into the Azure swimming pool. You'll want to test on IIS if you can, or on something that'll run the .NET code. For me, the easiest thing is to just install Microsoft WebMatrix. It's small, lightweight and quickly lets you test whether your site is running the way you want.

Once installed, you should be able to right-click on your build folder and hit the "Open with WebMatrix" option, which will fire up an IIS Express instance on that folder. Once it's open, just hit Run in the interface and your browser will open the site. Fingers crossed - it's working! If not, then at least you didn't put it live, right?! =)

Deployment Script and KuduSync

Assuming things are running smoothly on your local deployed site, you can be pretty sure it'll run up on Azure in its minified form. In order to get the deployment going automatically, you'll need to set up the following files in your solution root folder. First, a .deployment file that tells Kudu which script to run:

[config]
command = deploy.cmd

and then deploy.cmd. Here's the file first and I'll do some explaining after:

@if "%SCM_TRACE_LEVEL%" NEQ "4" @echo off

:: ----------------------
:: KUDU Deployment Script
:: Version: 0.1.10
:: ----------------------

:: Prerequisites
:: -------------

:: Verify node.js installed
where node 2>nul >nul
IF %ERRORLEVEL% NEQ 0 (
  echo Missing node.js executable, please install node.js, if already installed make sure it can be reached from current environment.
  goto error
)

:: Setup
:: -----

setlocal enabledelayedexpansion

SET ARTIFACTS=%~dp0%..\artifacts

IF NOT DEFINED DEPLOYMENT_SOURCE (
  SET DEPLOYMENT_SOURCE=%~dp0%.
)

IF NOT DEFINED DEPLOYMENT_TARGET (
  SET DEPLOYMENT_TARGET=%ARTIFACTS%\wwwroot
)

IF NOT DEFINED NEXT_MANIFEST_PATH (
  SET NEXT_MANIFEST_PATH=%ARTIFACTS%\manifest

  IF NOT DEFINED PREVIOUS_MANIFEST_PATH (
    SET PREVIOUS_MANIFEST_PATH=%ARTIFACTS%\manifest
  )
)

IF NOT DEFINED KUDU_SYNC_CMD (
  :: Install kudu sync
  echo Installing Kudu Sync
  call npm install kudusync -g --silent
  IF !ERRORLEVEL! NEQ 0 goto error

  :: Locally just running "kuduSync" would also work
  SET KUDU_SYNC_CMD=%appdata%\npm\kuduSync.cmd
)

IF NOT DEFINED DEPLOYMENT_TEMP (
  SET DEPLOYMENT_TEMP=%temp%\___deployTemp%random%
)

IF NOT DEFINED MSBUILD_PATH (
  SET MSBUILD_PATH=%WINDIR%\Microsoft.NET\Framework\v4.0.30319\msbuild.exe
)

:: Deployment
:: ----------

echo Handling .NET Web Application deployment. Starting %TIME%

:: 1. Restore NuGet packages
IF /I "MySpa.sln" NEQ "" (
  echo Restoring NuGet packages: Starting %TIME%
  call :ExecuteCmd "%NUGET_EXE%" restore "%DEPLOYMENT_SOURCE%\MySpa.sln"
  IF !ERRORLEVEL! NEQ 0 goto error
  echo Restoring NuGet packages: Finished %TIME%
)

:: 2. Build to the temporary path
echo Building VS Solution: Starting %TIME%
IF /I "%IN_PLACE_DEPLOYMENT%" NEQ "1" (
  call :ExecuteCmd "%MSBUILD_PATH%" "%DEPLOYMENT_SOURCE%\MySpa.Web\MySpa.Web.csproj" /nologo /verbosity:m /t:Build /t:pipelinePreDeployCopyAllFilesToOneFolder /p:_PackageTempDir="%DEPLOYMENT_SOURCE%\build";AutoParameterizationWebConfigConnectionStrings=false;Configuration=Release /p:SolutionDir="%DEPLOYMENT_SOURCE%\.\\" %SCM_BUILD_ARGS%
) ELSE (
  call :ExecuteCmd "%MSBUILD_PATH%" "%DEPLOYMENT_SOURCE%\MySpa.Web\MySpa.Web.csproj" /nologo /verbosity:m /t:Build /p:AutoParameterizationWebConfigConnectionStrings=false;Configuration=Release /p:SolutionDir="%DEPLOYMENT_SOURCE%\.\\" %SCM_BUILD_ARGS%
)
echo Building VS Solution: Finished %TIME%

IF !ERRORLEVEL! NEQ 0 goto error

:: 3. Restore npm packages
IF /I "package.json" NEQ "" (
  echo Installing npm packages: Starting %TIME%
  call npm install --production
  IF !ERRORLEVEL! NEQ 0 goto error
  echo Installing npm packages: Finished %TIME%
)

:: 4. Restore Gulp packages and run Gulp tasks
IF /I "gulpfile.js" NEQ "" (
  echo Installing Gulp dependencies: Starting %TIME%
  call npm install gulp
  IF !ERRORLEVEL! NEQ 0 goto error
  echo Installing Gulp dependencies: Finished %TIME%
  echo Running Gulp deployment: Starting %TIME%
  call :ExecuteCmd "%DEPLOYMENT_SOURCE%\node_modules\.bin\gulp"
  IF !ERRORLEVEL! NEQ 0 goto error
  echo Running Gulp deployment: Finished %TIME%
)

:: 5. KuduSync
IF /I "%IN_PLACE_DEPLOYMENT%" NEQ "1" (
  echo Running Kudu Sync: Starting %TIME%
  call :ExecuteCmd "%KUDU_SYNC_CMD%" -v 50 -f "%DEPLOYMENT_SOURCE%\build" -t "%DEPLOYMENT_TARGET%" -n "%NEXT_MANIFEST_PATH%" -p "%PREVIOUS_MANIFEST_PATH%" -i ".git;.hg;.deployment;deploy.cmd"
  IF !ERRORLEVEL! NEQ 0 goto error
  echo Running Kudu Sync: Finished %TIME%
)


:: Post deployment stub
IF DEFINED POST_DEPLOYMENT_ACTION call "%POST_DEPLOYMENT_ACTION%"
IF !ERRORLEVEL! NEQ 0 goto error

goto end

:: Execute command routine that will echo out when error
:ExecuteCmd
setlocal
set _CMD_=%*
call %_CMD_%
if "%ERRORLEVEL%" NEQ "0" echo Failed exitCode=%ERRORLEVEL%, command=%_CMD_%
exit /b %ERRORLEVEL%

:error
endlocal
echo An error has occurred during web site deployment.
call :exitSetErrorLevel
call :exitFromFunction 2>nul

:exitSetErrorLevel
exit /b 1

:exitFromFunction
()

:end
endlocal
echo Finished successfully.

This is a script that targets an app called KuduSync that lives up on Azure. KuduSync runs on node.js itself and gives you the ability to do numerous nifty things. I would check out the Kudu project if you'd like to know more.

Most of this Kudu file is boilerplate - environment vars and so on that don't really change between projects. I've used almost the same script on numerous projects now and it works pretty well. As you can see, it performs the following basic functions, which you may notice are very similar to what you set up on your local environment:

  • Ensures node.js is installed
  • Ensures all the NuGet packages referenced in your .NET solution are available
  • Runs the MSBuild command to output the .NET bits to the build folder
  • Checks that all the npm packages are installed
  • Checks that gulp is installed and runs the default gulp task
  • Uses KuduSync to sync the output build folder with the live website files assuming the build was successful which means your website should now be live!

Azure tips and tricks

You might be wondering if Azure is just a big black box that you throw your code at and hope it works, or if there is actually a way you can peek inside to know what's going on under the hood. You'll be happy to hear there are a number of places you can poke your nose in to see if things are going well.

Firstly, you can FTP to your website, where you will find not only your website files, but alongside them all the web logs and the deployment folders used to store NuGet and node.js packages etc. You have access to browse all of that. You can find the FTP credentials and host in the Azure portal on the dashboard for your website if you look at the publish profile settings - so take a look, it's well worth knowing what files are there.

The second thing you can do is take a look at the processes running during a deployment. In the new Azure portal, you'll notice a "Processes" tile you can open to see them. I sometimes use this to kill the node.js or w3wp process if it looks like they have hung - a handy tip to know.

Another thing you need to know is that it takes WAY longer to build up on Azure than locally because the CPU in your beefy dev rig will far outstrip the instance your website is running on. And the minification of your code seems to take the most time in this process.

So I recommend checking how long it takes on your local machine, and allowing at least 3 times longer before getting trigger happy with the kill process button. I also recommend going to your website application settings (the same place you put connection strings and app settings) and adding SCM_COMMAND_IDLE_TIMEOUT with a value of around 600 alongside your other app settings. That gives Kudu 10 minutes of inactivity before it barfs. You likely won't need that long, but not having the setting in place will frustrate you, because a timeout means kicking off the build once again.

Lastly, be ready for your very first deployment to fail. The reason, I've found, is the amount of time it takes NuGet and npm to download all their respective packages on the first run. Don't worry: run it again, and since the files already there won't be downloaded a second time, it will likely pass. It's still a bit of a kick in the teeth the first time around. Good luck!


So hopefully this has been helpful in getting you started (and done) deploying automatically to Azure with as little fuss as possible. I've gotta say that once this stuff is in place, deployment and releasing in general becomes much less of a chore and you're more likely to get more frequent awesomeness out to your users, giving you a good reason to be down the pub having a well-deserved tipple of choice...

Until next time... RobertTheGrey
