
Symphony of Gulp and Bower for Front-end Web Development

Gulp and Bower. For some a nightmare and for some their best friends, but for all a must-have in modern web development. In this post I'm not going to talk about how to use one or the other; there are a gazillion (± 100) other tutorials out there covering that. Take a look at this for Bower or this for Gulp. Instead, I'm going to talk about how they can ease your life and work together perfectly.

Bower

What is Bower and why should you use it?

Bower is a package manager for simple installation of front-end libraries: libraries like AngularJS, ReactJS, Bootstrap, Materialize and thousands more. It was created by two Twitter employees back in 2012.

Installing and updating scripts is as easy as writing bower install angularjs --save in your command line or terminal. It means you don't need to go to your browser, search for and download Bootstrap (for example) and finally extract and copy the required files into your project. Now imagine updating every single package in your project when there are 20 of them: an hour of pure boredom that will haunt you for days. Instead, you can use Bower, type bower update and do whatever you want while the magic is happening.
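
Just for illustration, the --save flag records what you installed in your project's bower.json, so a teammate only needs to run bower install to get the same packages. The names and versions below are just an example:

{
  "name": "my-app",
  "dependencies": {
    "jquery": "^2.2.0",
    "bootstrap": "^3.3.6"
  }
}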

The downsides

If it sounds too good to be true, you probably have a fanboy talking. There are indeed a couple of downsides.

First of all, any tinkering with the source code of the packages is futile; the next update will overwrite everything. So no more throwing components out of Bootstrap to reduce its size just because you don't need them.

Second, the package you're searching for might not exist. You don't really have a lot of options in that case. You can either revert to the good old find, download, extract and copy way, beg someone to write the bower.json file for the package, or do it yourself. You can also download an entire git repository with Bower, which leads us to two other problems.

Sometimes a git repository is exactly what you get: a folder with a bunch of files, when you almost always want only one or two. Or you get five versions of the same JavaScript library: one is probably the bundled file, one its minified version, and a couple more are there for no apparent reason. Things like this make it hard to automate everything, which is what we are going to (try to) solve.

Gulp

Gulp is a task/build runner for development. Usually it handles most, ideally all, of the tasks related to the development workflow. There are a lot of packages for simple tasks such as compiling Sass to CSS or TypeScript to JavaScript, minifying and bundling things; I wouldn't even be surprised if there is a package that makes you coffee (and I don't mean CoffeeScript).
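
To give you a taste, here is a minimal sketch of a Sass task (assuming the gulp-sass plugin is installed; the paths are placeholders):

var gulp = require('gulp');
var sass = require('gulp-sass');

// compile every .scss file under ./scss and write the resulting CSS to ./css
gulp.task('sass', function() {
    return gulp.src('./scss/**/*.scss')
        .pipe(sass().on('error', sass.logError))
        .pipe(gulp.dest('./css'));
});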

After the first hour of toying with Gulp, the frustration and nothing working, it's expected that you'll look up who the author is, find his entire extended family and put them on your kill list. But when it finally works and you realize how powerful it is, it'll always be your first weapon of choice for front-end web development. Just bear in mind that your disk might collapse into a black hole due to the insanely large number of files that come with NodeJS packages.

The problem with Gulp and Bower

Bower (package directory structure)

jquery
    dist
        jquery.js
        jquery.min.js
        jquery.min.map
        jquery.slim.js
        jquery.slim.min.js
        jquery.slim.min.map
    sizzle
        // More JavaScript files
    src
        // another 109 JavaScript files
    ...

Gulp

var gulp = require('gulp');
var concat = require('gulp-concat'); // bundles the piped files into one

gulp.task('scripts', function() {
    return gulp.src('./lib/*.js')
        .pipe(concat('all.js'))
        .pipe(gulp.dest('./dist/'));
});

As mentioned, Bower gives you a bunch of files that are, if you're lucky, only triplicated. On the other side you have Gulp, which should get only one file: jquery.js.

The Solution

One option is to manually maintain a JSON list of the files that are relevant to you. I honestly hope that at least someone sees this as a bad idea and a waste of time, and is too lazy to do it.

Instead of manually tracking which files we want, we're going to automate it. Automate everything! The first and only problem is how to find the files that are relevant to us. After inspecting the Bower packages, you can see that every bower.json has a main property, which tells us which files are the relevant ones.

{
  "name": "jquery",
  "main": "dist/jquery.js",
  "license": "MIT",
  "ignore": [
    "package.json"
  ],
  ...
}

Now we have everything we need. To get those main files into Gulp, there is a package called main-bower-files.

var gulp = require('gulp');
var concat = require('gulp-concat');
var mainBowerFiles = require('main-bower-files');

gulp.task('scripts', function() {
    return gulp.src(mainBowerFiles('/**/*.js'))
        .pipe(concat('all.js'))
        .pipe(gulp.dest('./dist/'));
});
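
The same trick works for stylesheets, since main-bower-files accepts a glob filter. A quick sketch, assuming the installed packages actually list CSS among their main files:

gulp.task('styles', function() {
    return gulp.src(mainBowerFiles('/**/*.css'))
        .pipe(concat('all.css'))
        .pipe(gulp.dest('./dist/'));
});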

I have Visual Studio and NuGet!

Visual Studio is awesome. NuGet too, as long as front-end web development isn't involved. Packages are usually a couple of months old, everybody who had fifteen minutes of spare time created their own package even if one already existed (now figure out which is the best among the worst), and you have even less control than with Bower. So do yourself a favor and forget that NuGet exists for front-end web development.

Conclusion

Sit back and enjoy how everything installs and updates for you at the click of a button :).

 

Disclaimer: the problems mentioned exist purely through my perspective. If your experience is different and you can convince me otherwise, please do so!


Continuous Integration from Visual Studio Team Services to Azure with NodeJs, ASP.NET 5, PHP...

NodeJs, ASP.NET 5, PHP, you name it: it can run on Azure as an App Service (previously known as Web App). But in a modern development cycle this simply isn't enough. Things like CD (continuous deployment) and CI (continuous integration) are a must for a modern development process. Azure supports continuous integration out of the box, but you can quickly stumble upon an obstacle if you want to perform custom build/install steps such as NodeJs or ASP.NET 5 package restore, JavaScript bundling and minification...

Visual Studio Team Services (VSTS, formerly Visual Studio Online) is absolutely awesome at this. It gives you free source control, work item management and, most importantly for me, continuous deployment and integration. But as always, not everything is as good as it seems. Things may quickly get complicated when you step outside of Microsoft's waters.

Setting up the build process

My usual build process (below is specific to the ASP.NET 5 website I used; if you're interested in the ASP.NET 5 build script, take a look at it here):

[Screenshot: build process]

Note: the Azure Web App Deployment step is DISABLED, replaced by the PowerShell script.

I solved all of my problems with a simple PowerShell script:

param($websiteName, $packOutput)

$website = Get-AzureWebsite -Name $websiteName

# get the scm url to use with MSDeploy.  By default this will be the second in the array
$msdeployurl = $website.EnabledHostNames[1]


$publishProperties = @{'WebPublishMethod'='MSDeploy';
                        'MSDeployServiceUrl'=$msdeployurl;
                        'DeployIisAppPath'=$website.Name;
                        'Username'=$website.PublishingUsername;
                        'Password'=$website.PublishingPassword}


$publishScript = "${env:ProgramFiles(x86)}\Microsoft Visual Studio 14.0\Common7\IDE\Extensions\Microsoft\Web Tools\Publish\Scripts\default-publish.ps1"


Stop-AzureWebsite -Name $websiteName
. $publishScript -publishProperties $publishProperties  -packOutput $packOutput
Start-AzureWebsite -Name $websiteName

In short, it connects to Azure, finds the required Web App, stops it, syncs only the differences to keep it up to date, and starts the Web App back up.

Usage itself is pretty simple. Add an Azure PowerShell build step to your build process. On the right side, in the configuration, specify the Azure connection type and subscription. In the script path field, select the path to the PowerShell script and provide the arguments, which tell it which Web App to deploy to and which directory to sync (example below).

My usual configuration is something like:

-websiteName WebsiteName -packOutput $(Build.SourcesDirectory)\bin\output

Don't forget to replace WebsiteName with your actual website name; see the documentation of build variables on MSDN.

[Screenshot: build step settings]

Enabling continuous integration

Simply go to the Triggers tab and check the Continuous integration (CI) checkbox to enable it.

[Screenshot: enabling continuous integration]

Conclusion

I am aware that Azure also supports custom deployment scripts, but I prefer the approach described above. That way you keep out unnecessary files produced during the build and things that don't really need to be deployed, keeping your production environment clean, while also integrating your entire development cycle with unit tests, load tests or building a C++ library if your app needs it.


ASP.NET 5 install and build script

I really like ASP.NET 5, but I hate it when I'm left without Visual Studio to deploy it, or when I'm doing continuous integration from Visual Studio Team Services or other tools (such as VS Code). To ease my pain I came up with a simple PowerShell script that does everything for me. By everything I mean: install the .NET Execution Environment (DNX), restore the required packages for the selected project, build it and publish it.

# bootstrap DNVM into this session.
&{$Branch='dev';iex ((new-object net.webclient).DownloadString('https://raw.githubusercontent.com/aspnet/Home/dev/dnvminstall.ps1'))}

# load up the global.json so we can find the DNX version
$globalJson = Get-Content -Path $PSScriptRoot\global.json -Raw -ErrorAction Ignore | ConvertFrom-Json -ErrorAction Ignore

if($globalJson)
{
    $dnxVersion = $globalJson.sdk.version
}
else
{
    Write-Warning "Unable to locate global.json to determine the DNX version, using 'latest'"
    $dnxVersion = "latest"
}

# install DNX
# only installs the default (x86, clr) runtime of the framework.
# If you need additional architectures or runtimes you should add additional calls
# ex: & $env:USERPROFILE\.dnx\bin\dnvm install $dnxVersion -r coreclr
& $env:USERPROFILE\.dnx\bin\dnvm install $dnxVersion -Persistent

# run DNU restore on all project.json files in the src folder,
# including 2>&1 to redirect stderr to stdout for badly behaved tools
Get-ChildItem -Path $PSScriptRoot\src -Filter project.json -Recurse | ForEach-Object { & dnu restore $_.FullName 2>&1 }

# Restore packages
dnu restore

# Build to check for errors
dnu build

# Publish with the active runtime
dnu publish --runtime active

Usage is extremely simple: open PowerShell and run it!
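
For example (a hypothetical setup where the script is saved as build.ps1 in the solution root, next to global.json and the src folder):

# run from the solution root; build.ps1 is just an example name
.\build.ps1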