Monday 7 May 2018

Exploring ASP.NET Core 2 and Angular 5 Applications with Docker

When embarking on large software projects, developers weigh their options for front-end frameworks — Angular, React, Vue, Polymer, and Ember, to name a few. As for backend frameworks, their choices range from ASP.NET to ASP.NET Core, Node.js, Ruby on Rails, Django, and beyond.
Angular and .NET technology, such as .NET Core, are popular solutions for many teams. These frameworks are backed by companies at the forefront of new technologies: ASP.NET Core and .NET Core are supported by Microsoft, while Angular is supported by Google.
In addition to the support infrastructure of the companies behind them, .NET technology and Angular allow teams to write powerful web applications that — when controlled properly — can serve business needs indefinitely.
Projects that involve highly complex architecture demand frequent yet seamless testing and build deployments. These are essential to keeping downtime low and delivering excellent experiences for end users.
How do the best teams build high-performing software solutions for the business?

Use Docker to Improve Your Angular 5, ASP.NET Core 2 Apps

Development teams build more successful applications with Angular or .NET technology when they use Docker. Dockerizing applications is key to testing and building web applications in a lightweight yet rapidly executable way.

Why Should .NET Technology Teams Know About Docker?

Docker is a platform that facilitates the deployment of software in containers. The Docker website provides a helpful definition of containers and the important role they play in software development:
“A container image is a lightweight, stand-alone, executable package of a piece of software that includes everything needed to run it: code, runtime, system tools, system libraries, settings. Available for both Linux and Windows based apps, containerized software will always run the same, regardless of the environment.”
A container is a software component that sits on top of another piece of software in an isolated environment. The container can be used to create an isolated environment in which to develop, test, and launch software.
Then, whether deployed on Linux or Windows machines, the containerized applications will run the same way, regardless of any customized settings that differ from the environment in which their code was written and tested.
Hyper-V, VMware, or another virtualization tool must be enabled in order to use containers.

How Do Teams Benefit by Using a Container?

There are a few reasons why developers prefer to use containers over normal virtual machines, or VMs.
  • Environment Consistency. Containers encapsulate all necessary application files and software dependencies. They serve as a building block that can be deployed on any compute resource regardless of software, operating system, or hardware configurations. When you run an application in a container, what you run on your local environment runs the same way on QA, staging, and production.
  • Developer Productivity. A developer no longer needs to install SDKs, track library versions, set up registry keys, and configure hosts — not to mention all the extra work normally required when being onboarded to a project. The only things the developer needs are the Dockerfile and Docker Compose files. With the container, everything else is pre-configured. This leads to faster integration of new team members and better overall team productivity.
  • Easy delivery and integration. Using Docker is useful during the continuous integration (CI) and continuous deployment (CD) processes. Containers allow developers to track versions of application code and their dependencies. With a single command, teams can generate new environments for different software versions, roll back versions, deploy a container to cloud providers, or implement containers in CI pipelines.
  • Operational efficiency. Containers allow technical teams to run multiple applications on the same instance — a major boost in the efficient use of computing resources. With containers, one can specify the exact amount of memory, disk space, and CPU to be used by a container (a brief example follows this list).
  • Team efficiency. When several teams must develop multiple applications and services for the same product, containers allow them to focus their energy on more complicated tasks. An Angular team working on Linux can start the .NET and PHP services on their development machines with the same configuration they will run in production. Meanwhile, another team can create APIs, databases, web applications, storage, or other solutions on Linux or Windows for future purposes. This can be a big benefit when one system involves multiple technology stacks. It also helps to reduce the huge costs of getting new software up and running.
  • Portability between on-premise and cloud providers. If an application runs in a container, support teams can easily switch operations from an on-premise solution to a cloud provider, or vice versa. All you need to do is start a new instance on the provider and redirect any existing domains. Since it’s relatively easy to switch from local servers to the cloud, there is no reason why teams couldn’t switch between cloud providers as well. This is an effective way for product owners to control the costs of services while also deriving the most value from storage and support services.
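As a brief, hedged example of the resource limits mentioned above (the container and image names are placeholders, not taken from the project), a single docker run command can cap memory and CPU:

    # Cap the container at 512 MB of memory and 1.5 CPUs (names are placeholders)
    docker run -d --name todos-api --memory=512m --cpus=1.5 my-app:1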

Docker isn’t the only containerizing service available on the market.

What is the Structure of a Container?

To answer this, imagine a team wants to run an Angular 5 application with nginx in a container.
They will have to deploy their own application in a container, my-app, that uses nginx as its base image; nginx in turn uses debian as its base image. Relying on base images ensures that the application runs in the same environment every time.
  1. debian:stretch-slim
  2. nginx:1
  3. my-app:1 
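A minimal Dockerfile sketch of this layering (the paths are assumptions): building my-app on top of nginx automatically pulls in nginx's own base image, debian.

    # my-app:1 is built on top of nginx:1, which is itself built on debian:stretch-slim
    FROM nginx:1
    # Copy the compiled Angular application into nginx's web root (path is an assumption)
    COPY dist/ /usr/share/nginx/html/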

How Do You Dockerize an Application?

Below is a demonstration of how to Dockerize — or containerize — an application. In this example, a simple to-do list application, written in Angular and served by a .NET Core API for storing data, will be Dockerized.
By learning how to Dockerize an application, businesses can implement containers in their own development environments, saving themselves time and money as they speed development cycles along.
The code for the application in question can be found here in our company GitHub account.

Project Layout

Here is a simple overview of the project directory structure. Understanding this layout should make the Dockerizing process easier to follow.
  • /scripts/ – Docker compose files and scripts
  • /src/ – application source code
    • /client/ – front-end Angular application
      • /src/ – Angular source code for the todos front-end
      • dockerfile – Angular front-end Dockerfile
      • docker.nginx.default.conf – nginx configuration for the Angular front-end
    • /server/ – .NET Core solution for the todos API
      • /docker-compose/ – the Docker Compose project (.dcproj)
      • /MyApp.Api/ – Todos ASP.NET Core API
      • /MyApp.Tests/ – Todos API tests
      • MyApp.sln – Todos ASP.NET Core API solution file
    • /configs/ – some configurations like stylecop.json.

How Do You Start Dockerizing an Application?

Separate containers can be built for the front-end and back-end applications, but in the end they will run together smoothly.
Before this is possible, a single docker-compose.yml describing the application’s services is needed. This file lives in the /scripts/ folder.
Then both applications can run like this:
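The original snippet is not reproduced here, so what follows is only a sketch of what the /scripts/docker-compose.yml and the corresponding command might look like; the service names, ports, and build contexts are assumptions based on the project layout above.

    # /scripts/docker-compose.yml (sketch)
    version: '3'
    services:
      client:
        build:
          context: ../src/client
          dockerfile: dockerfile
        ports:
          - "80:80"
      api:
        build:
          context: ../src/server
          dockerfile: MyApp.Api/Dockerfile
        ports:
          - "5000:80"

With that file in place, both applications start together with a single command:

    docker-compose -f ./scripts/docker-compose.yml up --build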

Why Docker Compose is Key to Building Lightweight Solutions

Docker Compose, a tool which is typically installed with Docker, helps to define and run multi-container Docker applications as a single entity.
Creating a docker-compose.yml file allows developers to configure an application’s services, making it easier to build, test, and launch an application with a single command.
More information about running multi-container Docker applications with Docker Compose can be found here.

How to Dockerize a .NET Technology Application

Visual Studio 2017 and ASP.NET Core have built-in Docker support.
Developers can build, run, and debug .NET technology in a Docker container inside Visual Studio using Visual Studio Tools for Docker. Here is an article on how to Dockerize an application in the official ASP.NET Core documentation.
Overall, when you select a web application and choose “Add Docker Support“, Visual Studio will create an extra docker-compose project (docker-compose.dcproj) and add the following files:
  • Dockerfile: describes the environment in which your application will run. The base container is microsoft/aspnetcore (a sketch of this file follows the list).
  • The two docker-compose files — “docker-compose.yml” and “docker-compose.override.yml” — which define the build and run Docker services.
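As a sketch (the assembly name is an assumption), the generated Dockerfile for an ASP.NET Core 2 project typically looks something like this:

    # Dockerfile generated by Visual Studio Tools for Docker (sketch)
    FROM microsoft/aspnetcore:2.0
    ARG source
    WORKDIR /app
    EXPOSE 80
    # Visual Studio publishes the app to obj/Docker/publish and passes that path as 'source'
    COPY ${source:-obj/Docker/publish} .
    ENTRYPOINT ["dotnet", "MyApp.Api.dll"]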
The “Add Docker Support” option in Visual Studio’s context menu.
When Dockerizing a .NET Core application, begin by moving the docker-compose file to the /scripts folder so all docker-compose services are in a single location. Left like this, however, the .dcproj will not work.

The docker-compose project requires the docker-compose.yml to be in the same directory as the .dcproj file. This can be accomplished with a symbolic link to the file.
Open a command prompt in your .dcproj folder and execute the link creation command:
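The original command is not shown here; on Windows, creating the link from the .dcproj folder might look like the following (the relative path is an assumption based on the project layout above, and mklink typically requires an elevated prompt or developer mode):

    :: create a symbolic link named docker-compose.yml pointing at the shared compose file
    mklink docker-compose.yml ..\..\..\scripts\docker-compose.yml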

With the symbolic link in place, you can run the docker-compose project and it will use the docker-compose.yml in the scripts directory.
Once an MSSQL container is added to the compose file, it looks as follows:
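The original compose file is not reproduced here; a sketch with an MSSQL service added might look like this (the image tag, password, and service names are assumptions):

    # /scripts/docker-compose.yml with an MSSQL service (sketch)
    version: '3'
    services:
      db:
        image: microsoft/mssql-server-linux:2017-latest
        environment:
          ACCEPT_EULA: "Y"
          SA_PASSWORD: "Your_strong_password1"
        ports:
          - "1433:1433"
      api:
        build:
          context: ../src/server
          dockerfile: MyApp.Api/Dockerfile
        depends_on:
          - db
        ports:
          - "5000:80"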

This method works when run with docker-compose up. Visual Studio, however, tries to resolve the build context relative to the .dcproj file. To fix this, developers must use a docker-compose.override.yml in the .dcproj folder to override the context path.
When executing the docker-compose up command without specifying a file, docker-compose tries to find and use “docker-compose.yml” and “docker-compose.override.yml” in the same folder.
Our docker-compose.override.yml will look like this:
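The original file is not reproduced here; what follows is a sketch of a docker-compose.override.yml placed next to the .dcproj, overriding the build context so Visual Studio resolves it correctly (paths and service names are assumptions):

    # docker-compose.override.yml in the .dcproj folder (sketch)
    version: '3'
    services:
      api:
        build:
          context: ..
          dockerfile: MyApp.Api/Dockerfile
        ports:
          - "5000:80"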

Now when we run our project from Visual Studio, it should start and run successfully.
The project running with Docker support inside Visual Studio.

Account for CI & CD Functionality in .NET Technology

We also need to build the application for environments other than Visual Studio. This is vital for integration with CI and CD servers and tools.
For that reason, the Docker Tools create an extra yml file called “docker-compose.ci.build.yml”. With this file, we can build the project outside of Visual Studio by executing something like “docker-compose -f docker-compose.ci.build.yml up ci-build”.
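The generated file usually looks roughly like the sketch below; the solution name and image tag are assumptions.

    # docker-compose.ci.build.yml generated by the Docker Tools (sketch)
    version: '3'
    services:
      ci-build:
        image: microsoft/aspnetcore-build:2.0
        volumes:
          - .:/src
        working_dir: /src
        command: /bin/bash -c "dotnet restore ./MyApp.sln && dotnet publish ./MyApp.sln -c Release -o ./obj/Docker/publish"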
But in this file, several dotnet commands are chained together into a single command line, which is not a flexible approach. Instead, a build script that can be invoked with different parameters to do different tasks — build, run tests, analyze, publish — is more convenient for busy development teams.
While there are many approaches to this kind of build scripting, in this instance I will use a simple tool called dotnet-script. It is easily installed in a container and is a runner for C# scripts, so we can use C# to build C# — or anything else for that matter.
I have written several helper methods, which you can find in /scripts/builder/csx/common.csx. The main build commands are in the file /scripts/builder/build.csx and are very similar to the original docker-compose.ci.build.yml: they execute dotnet restore, dotnet test, and dotnet publish in sequence.
See the code snippet in the next section.

Dockerizing an Angular 5 Application

When Dockerizing the Angular application, first add the build scripts. This is a simpler process, requiring developers to add only the yarn install and ng build.
See how to build an Angular application on the framework’s GitHub page.
Our final build.csx file should look like this:
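The original script is not reproduced here. As a rough sketch only, assuming a hypothetical Run(command, arguments, workingDirectory) helper defined in common.csx, the file might look like this:

    // build.csx (sketch); Run is a hypothetical helper from common.csx, not the article's actual code
    #load "csx/common.csx"

    // Back end: restore, test, and publish the ASP.NET Core solution
    Run("dotnet", "restore MyApp.sln", "src/server");
    Run("dotnet", "test MyApp.Tests/MyApp.Tests.csproj", "src/server");
    Run("dotnet", "publish MyApp.Api/MyApp.Api.csproj -c Release -o obj/Docker/publish", "src/server");

    // Front end: install packages and build the Angular application
    Run("yarn", "install", "src/client");
    Run("ng", "build --prod", "src/client");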
Now we can set up a common file that holds all of the build services.
This could be docker-compose.ci.build.yml, but for convenience when using docker-compose, I will use a docker-compose.override.yml file in the /scripts folder.

Why Shouldn’t You Use the docker-compose.yml?

The Docker Tools for Visual Studio cannot tolerate anything besides ASP.NET and MSSQL services in the docker-compose file. Therefore, developers must “trick” them.
First, put anything that is not a .NET technology-based application in the override file or in another file to which you can later refer. This will include the following (a sketch of such a file follows the list):
  • ASP.NET Core build service
  • Angular build service
  • Angular run service
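A sketch of what such a /scripts/docker-compose.override.yml might contain, matching the three services above; the image name, commands, and paths are assumptions rather than the article's actual file.

    # /scripts/docker-compose.override.yml (sketch)
    version: '3'
    services:
      api-build:                                   # ASP.NET Core build service
        image: my-team/aspnetcore-angular-build    # custom build image, described below
        volumes:
          - ..:/src
        working_dir: /src
        # exact dotnet-script invocation depends on how the tool is installed
        command: dotnet script ./scripts/builder/build.csx
      client-build:                                # Angular build service
        image: my-team/aspnetcore-angular-build
        volumes:
          - ../src/client:/app
        working_dir: /app
        command: /bin/bash -c "yarn install && ng build --prod"
      client:                                      # Angular run service
        build:
          context: ../src/client
          dockerfile: dockerfile
        ports:
          - "80:80"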

The Angular Dockerfile

In the same way that Visual Studio created a Dockerfile in the API folder, we should create the Angular application’s Dockerfile in the Angular folder. We can use nginx as the base container.
See more about nginx configuration on the nginx wiki page.
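A sketch of the Angular Dockerfile, assuming the compiled output lands in dist/ and using the nginx configuration file from the project layout:

    # /src/client/dockerfile (sketch)
    FROM nginx:1
    # Replace the default nginx site configuration with the project's own
    COPY docker.nginx.default.conf /etc/nginx/conf.d/default.conf
    # Serve the compiled Angular application from nginx's web root
    COPY dist/ /usr/share/nginx/html/
    EXPOSE 80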


About the Base Build Container

When building an ASP.NET application, a developer can simply use the microsoft/aspnetcore-build base image. But now, with Angular and dotnet-script in the mix, some extra components must be installed in order for the build to function properly.
How is this done?
The developer can create a new build image on top of microsoft/aspnetcore-build and install yarn, dotnet-script, and @angular/cli.
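A hedged sketch of such a build image follows; the base tag and install commands are assumptions, and the dotnet-script installation step in particular depends on which release of the tool is used.

    # Custom build image on top of microsoft/aspnetcore-build (sketch)
    FROM microsoft/aspnetcore-build:2.0

    # Node.js and npm already ship with this image; add yarn and the Angular CLI
    RUN npm install -g yarn @angular/cli

    # dotnet-script must also be installed here so build.csx can run inside the container;
    # the exact install command depends on the dotnet-script release and is omitted as an assumption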

Build and Publish With Docker to Drive Your Efficiency

At the end of a project, teams normally build and deploy the finished application to a server. All that is needed to deploy the API application to a server at 100.100.100.100 is this:
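The original commands are not reproduced here. One hedged sketch, assuming the server's Docker daemon is reachable over TCP on the default port and the compose file from /scripts is used:

    # Point the Docker CLI and docker-compose at the remote host, then bring the services up
    export DOCKER_HOST=tcp://100.100.100.100:2375
    docker-compose -f ./scripts/docker-compose.yml up -d --build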

Even the most efficient software builds are resource-intensive. When time and talent are dedicated to building large applications to serve equally large and complex organizations, development teams must equip themselves with the right tools that make their work as streamlined as possible.


The magic of containers and the approach described above is that, once set up, starting images and services — databases, Redis cache, RabbitMQ, Elasticsearch, identity providers, background jobs, or anything else you may need — becomes possible with a single command.
