Monday, 20 March 2017

Run any Windows app on any device with Azure RemoteApp

Important
Azure RemoteApp is being discontinued. Read the announcement for details.
You can run a Windows application anywhere on any device, right now, seriously - just by using Azure RemoteApp. Whether it's a custom application written 10 years ago, or an Office app, your users no longer have to be tied to a specific operating system (like Windows XP) for those few applications.
With Azure RemoteApp, your users can also use their own Android or Apple devices and get the same experience they got on Windows (or on Windows Phones). This is accomplished by hosting your Windows application in a collection of Windows virtual machines on Azure - your users can access them from anywhere they have an internet connection.
Read on for an example of exactly how to do this.
In this article, we're going to share Access with all of our users. However, you can use ANY app. As long as you can install your app on a Windows Server 2012 R2 computer, you can share it using the steps below. You can review the app requirements to make sure your app will work.
Please note that because Access is a database, and we want that database to be useful, we will do a few extra steps to let users reach the Access data share. If your app isn't a database, or you don't need your users to be able to access a file share, you can skip those steps in this tutorial.
Note
You need an Azure account to complete this tutorial:
  • You can open an Azure account for free: You get credits you can use to try out paid Azure services, and even after they're used up you can keep the account and use free Azure services, such as Websites. Your credit card will never be charged, unless you explicitly change your settings and ask to be charged.
  • You can activate MSDN subscriber benefits: Your MSDN subscription gives you credits every month that you can use for paid Azure services.

Create a collection in RemoteApp

Start by creating a collection. The collection serves as a container for your apps and users. Each collection is based on an image - you can create your own or use one provided with your subscription. For this tutorial, we're using the Office 2013 trial image - it contains the app that we want to share.
  1. In the Azure portal, scroll down in the left-hand navigation tree until you see RemoteApp. Open that page.
  2. Click Create a RemoteApp collection.
  3. Click Quick create and enter a name for your collection.
  4. Select the region you want to use to create your collection. For the best experience, select the region that is closest geographically to the location where your users will access the app. For example, in this tutorial, the users will be located in Redmond, Washington. The closest Azure region is West US.
  5. Select the billing plan you want to use. The basic billing plan puts 16 users on a large Azure VM, while the standard billing plan puts 10 users on a large Azure VM. As a general example, the basic plan works great for data entry-type workflows. For a productivity app, like Office, you'd want the standard plan.
  6. Finally, select the Office 2013 Professional image. This image contains Office 2013 apps. Just a reminder: this image is only good for trial collections and proofs of concept (POCs). You can't use this image in a production collection.
  7. Now, click Create RemoteApp collection.
(Image: Create a cloud collection in RemoteApp)
This starts creating your collection; it can take up to an hour.
Now you're ready to add your users.

Share the app with users

Once your collection has been created successfully, it’s time to publish Access to users and add the users who should have access to it.
If you have navigated away from the Azure RemoteApp node while the collection was being created, start by making your way back to it from the Azure home page.
  1. Click the collection you created earlier to access additional options and configure the collection. (Image: A new RemoteApp cloud collection)
  2. On the Publishing tab, click Publish at the bottom of the screen, and then click Publish Start menu programs. (Image: Publish a RemoteApp program)
  3. Select the apps you want to publish from the list. For our purposes, we chose Access. Click Complete, and wait for the apps to finish publishing. (Image: Publishing Access in RemoteApp)
  4. Once the app has finished publishing, head over to the User Access tab to add all the users that need access to your apps. Enter user names (email addresses) for your users, and then click Save.
(Image: Add users to RemoteApp)
  5. Now, it’s time to tell your users about these new apps and how to access them. To do this, send your users an email pointing them to the Remote Desktop client download URL. (Image: The client download URL for RemoteApp)

Configure access to Access

Some apps need extra configuration after you deploy them through RemoteApp. In particular, for Access, we're going to create a file share on Azure that any user can access. (If you don't want to do that, you can create a hybrid collection [instead of our cloud collection] that lets your users access files and information on your local network.) Then, we'll need to tell our users to map a local drive on their computer to the Azure file system.
The first part is for you, the admin. After that, there are some steps for your users.
  1. Start by publishing the command prompt (cmd.exe): in the Publishing tab, click Publish > Publish Start menu programs, select cmd, and click Complete.
  2. Next, publish File Explorer: click Publish > Publish program using path, then enter the name of the app and the path. For our purposes, use "File Explorer" as the name and "%SYSTEMDRIVE%\windows\explorer.exe" as the path. (Image: Publish the cmd.exe file)
  3. Now you need to create an Azure storage account. We named ours "accessstorage," so pick a name that's meaningful to you. (To misquote Highlander, there can be only one "accessstorage.") (Image: Our Azure storage account)
  4. Now go back to your dashboard so you can get the path to your storage (endpoint location). You'll use this in a bit, so make sure you copy it somewhere. (Image: The storage account path)
  5. Next, once the storage account has been created, you need the primary access key. Click Manage access keys, and then copy the primary access key.
  6. Now, set the context of the storage account and create a new file share for Access. Run the following cmdlets in an elevated Windows PowerShell window:
     
     $ctx=New-AzureStorageContext <account name> <account key>
     $s = New-AzureStorageShare <share name> -Context $ctx
    
    So for our share, these are the cmdlets we run:
     
     $ctx=New-AzureStorageContext accessstorage <key>
     $s = New-AzureStorageShare <share name> -Context $ctx
    
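Before handing over to users, you can sanity-check the share from the same elevated PowerShell window. This is a sketch using the same classic Azure PowerShell storage cmdlets as above; it assumes $ctx and $s from the previous step are still in scope, and the test file is purely illustrative:

```powershell
# Confirm the new file share exists in the storage account
Get-AzureStorageShare -Context $ctx | Select-Object Name

# Upload a small test file to verify write access to the share
"hello from RemoteApp" | Out-File .\test.txt
Set-AzureStorageFileContent -Share $s -Source .\test.txt
```

If the share name shows up in the list and the upload succeeds, your users will be able to map the share in the next section.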
Now, it's the users' turn. First, have your users install a RemoteApp client. Next, they need to map a drive from their account to the Azure file share you created and add their Access files. Here's how they do that:
  1. In the RemoteApp client, access the published apps. Start the cmd.exe program.
  2. Run the following command to map a drive from your computer to the file share:
     
     net use z: \\<accountname>.file.core.windows.net\<share name> /u:<user name> <account key>
    
    If you append /persistent:yes to the command, the mapped drive will persist across sessions.
  3. Now, launch the File Explorer app from RemoteApp. Copy any Access files you want to use in the shared app to the file share. (Image: Putting Access files in an Azure share)
  4. Finally, open Access, and then open the database that you just shared. You should see your data in Access running from the cloud. (Image: A real Access database running from the cloud)
Now you can use Access on any of your devices - just make sure you install a RemoteApp client.


Tuesday, 21 February 2017

Application Insights and Semantic Logging for Service Fabric Microservices

Borrowing heavily from MSDN documentation, the term semantic logging refers specifically to the use of strongly typed events and a consistent structure in log messages. In Service Fabric, semantic logging is baked right into the platform and tooling. For example, if we look at the auto-generated .cs file for any actor, stateful, or stateless service, we see examples of logging via the ServiceEventSource or ActorEventSource classes:
ServiceEventSource.Current.ServiceTypeRegistered(Process.GetCurrentProcess().Id, typeof(AcmeService).Name);

When an event such as the one above is logged, it includes a payload containing individual variables as typed values that match a pre-defined schema. Moreover, as we’ll see later on in this article, when the event is routed to a suitable destination, such as Application Insights, the event’s payload is written as discrete elements, making it much easier to analyze, correlate, and query. For those new to Application Insights, the official introduction provides a good starting point.
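To make the "strongly typed events" idea concrete, here is a sketch of roughly what such an event method looks like inside a ServiceEventSource class; the event source name, event ID, and message template are illustrative rather than copied from any particular project template:

```csharp
using System.Diagnostics.Tracing;

[EventSource(Name = "MyCompany-AcmeApp-AcmeService")]
internal sealed class ServiceEventSource : EventSource
{
    public static readonly ServiceEventSource Current = new ServiceEventSource();

    // Each event has a fixed ID and a typed parameter list - this is the
    // schema that makes the log semantic rather than free-form text.
    private const int ServiceTypeRegisteredEventId = 3;

    [Event(ServiceTypeRegisteredEventId, Level = EventLevel.Informational,
        Message = "Service host process {0} registered service type {1}")]
    public void ServiceTypeRegistered(int hostProcessId, string serviceType)
    {
        // The payload travels as discrete, typed fields, not a single string.
        WriteEvent(ServiceTypeRegisteredEventId, hostProcessId, serviceType);
    }
}
```

Because hostProcessId and serviceType are written as separate fields, a sink such as Application Insights can index, filter, and query them individually.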

Having briefly defined semantic logging and mentioned that it’s baked into Service Fabric, we should clarify that ServiceEventSource and ActorEventSource inherit from EventSource, which, in turn, writes events to ETW. Event Tracing for Windows (ETW) is an efficient kernel-level tracing facility built into Windows that logs kernel- or application-defined events.

Given the above, we now turn our attention to exporting these ETW events to Application Insights or for that matter to any other supported target via two libraries, the Microsoft library aptly named Semantic Logging (formerly known as the Semantic Logging Application Block or SLAB) and the SemanticLogging.ApplicationInsights library (also known as SLAB_AppInsights).

As all my Service Fabric projects use the .NET Core xproj structure (see previous articles), I ended up contributing to Fidel’s excellent library by converting the SemanticLogging.ApplicationInsights project to .NET Core xproj. My humble contribution has been merged into the master SemanticLogging.ApplicationInsights branch by Fidel and is used in the rest of this article. As the NuGet package is somewhat behind, we’ll start by downloading the master branch directly from GitHub and adding it to our Visual Studio 2015 solution. Your solution will end up looking something like this:
(Image: the Visual Studio solution with the SemanticLogging.ApplicationInsights project added)

In your Service Fabric service (in my example AcmeService) edit the project.json:
{
  "title": "AcmeService",
  "description": "AcmeService",
  "version": "1.0.0-*",
  "buildOptions": {
    "emitEntryPoint": true,
    "preserveCompilationContext": true,
    "compile": {
      "exclude": [
        "PackageRoot"
      ]
    }
  },
  "dependencies": {
    "Microsoft.ServiceFabric": "5.1.150",
    "Microsoft.ServiceFabric.Services": "2.1.150",
    "EnterpriseLibrary.SemanticLogging": "2.0.1406.1",
    "SemanticLogging.ApplicationInsights": "1.0.0-*",
    "Microsoft.Extensions.PlatformAbstractions": "1.0.0",
    "Microsoft.Extensions.Configuration": "1.0.0",
    "Microsoft.Extensions.Configuration.FileExtensions": "1.0.0",
    "Microsoft.Extensions.Configuration.Json": "1.0.0",
    "Microsoft.Extensions.Configuration.Binder": "1.0.0"
  },
  "frameworks": {
    "net46": {
    }
  },
  "runtimes": {
    "win7-x64": {}
  }
}

Add an appsettings.Development.json file and make sure to set your ASPNETCORE_ENVIRONMENT environment variable accordingly. You will also need to set the Application Insights InstrumentationKey.
{
  "Logging": {
    "IncludeScopes": false,
    "LogLevel": {
      "Default": "Debug",
      "System": "Information",
      "Microsoft": "Information"
    }
  },
  "ApplicationInsights": {
    "InstrumentationKey": "YOUR KEY GOES HERE"
  }
}

We’ll add an AppSettings class so that we can bind our settings file to a strongly typed object:
namespace AcmeService
{
    public class AppSettings
    {
        public AppSettings()
        {
            ApplicationInsights = new ApplicationInsightsOptions();
        }
        public ApplicationInsightsOptions ApplicationInsights { get; set; }
    }
    public class ApplicationInsightsOptions
    {
        public string InstrumentationKey { get; set; }
    }
}

In a previous article we looked at how to share ASP.NET Core appsettings.json with Service Fabric microservices, so we’ll re-use the same logic and create a ConfigurationHelper:
using Microsoft.Extensions.PlatformAbstractions;
using Microsoft.Extensions.Configuration;
using System;
namespace AcmeService
{
    public static class ConfigurationHelper
    {
        public static AppSettings GetAppSettings()
        {
            var appSettings = new AppSettings();
            var configRoot = GetConfigurationRoot();
            configRoot.Bind(appSettings);
            return appSettings;
        }
        public static IConfigurationRoot GetConfigurationRoot()
        {
            IConfigurationRoot configuration = null;
            var basePath = PlatformServices.Default.Application.ApplicationBasePath;
            var environmentName = Environment.GetEnvironmentVariable("ASPNETCORE_ENVIRONMENT");
            if (!string.IsNullOrEmpty(environmentName))
            {
                var configurationBuilder = new ConfigurationBuilder()
                    .SetBasePath(basePath)
                    .AddJsonFile($"appsettings.{environmentName}.json");
                configuration = configurationBuilder.Build();
            }
            return configuration;
        }
    }
}

Now for the secret sauce: we create a LoggingHelper class that returns an ObservableEventListener. The class configures the Application Insights sink from the SemanticLogging.ApplicationInsights library:

listener.LogToApplicationInsights(...) 

and subscribes to Service Fabric ServiceEventSource events using the Semantic Logging library:

listener.EnableEvents(ServiceEventSource.Current.Name, EventLevel.Verbose); 
using Microsoft.ApplicationInsights.Extensibility;
using Microsoft.Practices.EnterpriseLibrary.SemanticLogging;
using System;
using System.Collections.Generic;
using System.Diagnostics.Tracing;
namespace AcmeService
{
    public static class LoggingHelper
    {
        public static ObservableEventListener GetEventListener()
        {
            ObservableEventListener listener = new ObservableEventListener();
            try
            {
                var appSettings = ConfigurationHelper.GetAppSettings();
                if (appSettings != null)
                {
                    TelemetryConfiguration.CreateDefault();
                    TelemetryConfiguration.Active.InstrumentationKey = appSettings.ApplicationInsights.InstrumentationKey;
                    listener.LogToApplicationInsights(TelemetryConfiguration.Active.InstrumentationKey, new List<ITelemetryInitializer>(TelemetryConfiguration.Active.TelemetryInitializers).ToArray());
                }
                listener.EnableEvents(ServiceEventSource.Current.Name, EventLevel.Verbose);
            }
            catch (Exception ex)
            {
                ServiceEventSource.Current.Message(ex.ToString());
            }
            return listener;
        }
    }
}

All that is now left is the addition of a “one-liner” to your Service Fabric Microservice (Program.cs) to enable Semantic Logging:

private static readonly ObservableEventListener _listener = LoggingHelper.GetEventListener(); 
using Microsoft.Practices.EnterpriseLibrary.SemanticLogging;
using Microsoft.ServiceFabric.Services.Runtime;
using System;
using System.Diagnostics;
using System.Threading;
namespace AcmeService
{
    internal static class Program
    {
        private static readonly ObservableEventListener _listener = LoggingHelper.GetEventListener();
        /// <summary>
        /// This is the entry point of the service host process.
        /// </summary>
        private static void Main()
        {
            try
            {
                ServiceRuntime.RegisterServiceAsync("AcmeServiceType",
                    context => new AcmeService(context)).GetAwaiter().GetResult();
                ServiceEventSource.Current.ServiceTypeRegistered(Process.GetCurrentProcess().Id, typeof(AcmeService).Name);
                // Prevents this host process from terminating so services keep running.
                Thread.Sleep(Timeout.Infinite);
            }
            catch (Exception e)
            {
                ServiceEventSource.Current.ServiceHostInitializationFailed(e.ToString());
                throw;
            }
        }
    }
}

And that’s about it! See how simple it is to get your Service Fabric application events sent to Application Insights? Because the event producer (your Service Fabric application) is decoupled from the target through the magic of ETW and the Semantic Logging libraries, the exact same approach, with minimal code changes, lets me target Elasticsearch as the event destination instead. In fact, for your systems, you might prefer to send some events to Application Insights and others to an Elasticsearch cluster. Lastly, if you find any of the above useful in your projects, do consider contributing to Fidel’s excellent library or creating completely new sinks for Semantic Logging!
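To illustrate that last point, here is a sketch of what an Elasticsearch variant of LoggingHelper might look like. It assumes the Semantic Logging Elasticsearch sink package (EnterpriseLibrary.SemanticLogging.Elasticsearch); the LogToApplicationInsights call is simply swapped for LogToElasticsearch, whose argument list here (instance name, endpoint URL, index prefix, document type) is recalled from the sink's API and should be checked against the version you install:

```csharp
using Microsoft.Practices.EnterpriseLibrary.SemanticLogging;
using System.Diagnostics.Tracing;

namespace AcmeService
{
    public static class ElasticLoggingHelper
    {
        public static ObservableEventListener GetEventListener()
        {
            var listener = new ObservableEventListener();

            // Same producer, different sink: only the subscription target changes.
            // Arguments: instance name, Elasticsearch endpoint, index prefix, document type.
            listener.LogToElasticsearch(
                "AcmeService",
                "http://localhost:9200",
                "slab",
                "etw");

            // The ETW subscription is identical to the Application Insights version.
            listener.EnableEvents(ServiceEventSource.Current.Name, EventLevel.Verbose);
            return listener;
        }
    }
}
```

Nothing in the service itself changes; Program.cs still just holds a static ObservableEventListener returned by the helper.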

