Nov 22, 2019
 

Background

In our development process we believe a lot in microservices and the CI/CD philosophy. To support that we have introduced feature toggles to control the features that are already delivered to the customers. When a feature is stable enough we enable its toggle on the next update and the new feature becomes usable in the product.

With the passage of time and the maturity of the product we moved to a tenant-based solution. The biggest challenge we faced there was toggling functionality per customer. We had our toggles loaded at the start-up of the application: after start-up the objects were registered based on the toggles and the application was started. In a single-tenant installation that was not a problem, as we could easily restart the customer to load the latest toggles. With a multi-tenant installation that is much harder, as we would have to restart all the tenants in order to load the new toggles.

So in this post I am going to address this problem and also discuss the solution that we implemented. Our old system had the architecture sketched below.

Here are my reference classes:

So I have two services: a service class On and a service class Off. The service class On will be called if the toggle is set to ON; the service class Off will be called if the toggle is set to OFF.
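
The original listing was an embedded screenshot, so here is a minimal sketch of what those reference classes could look like (the exact names and bodies are assumptions):

using System;

public interface IService
{
    void Serve();
}

public class ServiceOn : IService
{
    // called when the toggle is set to ON
    public void Serve() => Console.WriteLine("Serving from the ON implementation");
}

public class ServiceOff : IService
{
    // called when the toggle is set to OFF
    public void Serve() => Console.WriteLine("Serving from the OFF implementation");
}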

Here is where I am using the toggle and the service. The following steps are performed in this class (a sketch follows the list):

  1. Load the toggle
  2. Initialize the service collection
  3. Register based on toggle
  4. Initialize the service
  5. Access the Serve method
  6. Ask the user whether to continue and, if yes, continue from step 5
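
Here is a minimal sketch of that startup flow, assuming the toggle lives in a toggles.txt file and Microsoft.Extensions.DependencyInjection is used; the file name and registration details are assumptions:

using System;
using System.IO;
using Microsoft.Extensions.DependencyInjection;

public class Program
{
    public static void Main()
    {
        // 1. load the toggle once, at startup
        var toggle = File.ReadAllText("toggles.txt").Trim();

        // 2. initialize the service collection
        var services = new ServiceCollection();

        // 3. register based on the toggle
        if (toggle == "ON")
            services.AddTransient<IService, ServiceOn>();
        else
            services.AddTransient<IService, ServiceOff>();

        var provider = services.BuildServiceProvider();

        string answer;
        do
        {
            // 4. initialize the service and 5. access the Serve method
            var service = provider.GetRequiredService<IService>();
            service.Serve();

            // 6. ask the user whether to continue
            Console.Write("Continue? (y/n): ");
            answer = Console.ReadLine();
        } while (answer == "y");
    }
}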

Now when I run this code I will see the output of whichever implementation was selected at startup.

Note that even if I change the content of toggles.txt, the new toggles will not take effect until I restart the application.

So our solution was to handle it with some changes in the way we load and register the services. Instead of loading the toggle at the beginning, we added that part to our interceptor. Before I go into details, let me talk a bit about the interceptor.

In the old solution we had the toggle information loaded before we accessed the object, so we already knew what type of object it was. In the new solution we write an interceptor that is called every time we invoke that service object, which gives us the opportunity to evaluate the toggle before choosing the type of the object. Let's see the code for a better understanding.
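
The original listing was a screenshot, so below is a minimal sketch of what the registration could look like. I am assuming Castle DynamicProxy here; the RegisterCapabilityComponent extension and the CapabilityInterceptor (sketched further down) are hypothetical names based on the description:

using Castle.DynamicProxy;
using Microsoft.Extensions.DependencyInjection;

public static class ServiceCollectionExtensions
{
    // hypothetical extension method: registers the interface behind a
    // dynamic proxy so the implementation is chosen on every call
    public static IServiceCollection RegisterCapabilityComponent<TInterface>(
        this IServiceCollection services) where TInterface : class
    {
        services.AddSingleton(provider =>
            new ProxyGenerator().CreateInterfaceProxyWithoutTarget<TInterface>(
                new CapabilityInterceptor()));
        return services;
    }
}

public class Program
{
    public static void Main()
    {
        var services = new ServiceCollection();

        // register both implementations so either one can serve a call
        services.AddTransient<ServiceOn>();
        services.AddTransient<ServiceOff>();

        // register the service interface via the capability extension
        services.RegisterCapabilityComponent<IService>();

        var provider = services.BuildServiceProvider();
        provider.GetRequiredService<IService>().Serve(); // routed through the interceptor
    }
}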

The most important thing to note here is the registration block: we register both the On and the Off service, and then we register the service interface using RegisterCapabilityComponent. That registration is added via an extension method on the service collection.
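
And here is a sketch of the interceptor itself, again assuming Castle DynamicProxy; in this sketch the interceptor creates the implementations itself, and the toggle file name is an assumption:

using System.IO;
using Castle.DynamicProxy;

public class CapabilityInterceptor : IInterceptor
{
    private readonly ServiceOn _on = new ServiceOn();
    private readonly ServiceOff _off = new ServiceOff();

    public void Intercept(IInvocation invocation)
    {
        // the toggle is read at invocation time, not at startup, so edits
        // to toggles.txt take effect without restarting the application
        var toggle = File.ReadAllText("toggles.txt").Trim();
        object target = toggle == "ON" ? (object)_on : _off;

        // forward the intercepted call to the implementation the toggle selects
        invocation.ReturnValue = invocation.Method.Invoke(target, invocation.Arguments);
    }
}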

Every time we access the "Serve" method this capability interceptor is called, and the toggles are loaded to determine the type of the object. The type is based on the provided toggles, so the toggles are read and the type is determined before the method invocation. Let's run the program again to see the output.

As mentioned above, the toggles are now updated without stopping the application, and the latest change takes effect. This is how we solved the multi-feature multi-tenant dynamic invocation problem. I know this could be done in a million ways, and we could also optimize the code to be more efficient. This code example is just to highlight the problem and a potential solution.

The code is available at

https://github.com/alineutron/toggledependencyinjection

 Posted by at 10:34 am
Nov 22, 2019
 

Step 1: Register an app in Active Directory app registrations

You need to do a new application registration

Once that application is registered you need the application ID, which is also called the client ID. The other thing that you need to configure is the permissions.

Under permissions you need to grant the app access to read Log Analytics data as a user.

Step 2: Get a client secret from Azure

You can get that from the Keys section of the app that you just registered: make a new key and be sure to save the value.

Once you have that, let's switch to Log Analytics.

Step 3: Get the workspace ID from Log Analytics

Step 4: Go to Access control (IAM)

Add a new role assignment

Role: Contributor

Select: the app that you created

Save

Now you have set up the plumbing and the two-way communication between the app and Log Analytics. Now comes the code part, so we need the following code.

Let's walk through the code a bit.

The first thing that we do in GetMetricBeat is get the access token. Before we move forward, let's see how that is done. To get the token we construct the authority URL: for Log Analytics it is 'https://login.windows.net/' + tenantID. Once that is constructed we create the client credential for the app that we registered, providing the clientAppId and clientAppSecret, and use it to acquire the access token.
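
The original code was shown as a screenshot; here is a minimal sketch of that token acquisition, assuming the ADAL package (Microsoft.IdentityModel.Clients.ActiveDirectory). The helper name GetAccessTokenAsync is hypothetical:

using System.Threading.Tasks;
using Microsoft.IdentityModel.Clients.ActiveDirectory;

public static class TokenHelper
{
    // hypothetical helper: acquires a token for the Log Analytics API
    public static async Task<string> GetAccessTokenAsync(
        string tenantId, string clientAppId, string clientAppSecret)
    {
        // the authority URL is 'https://login.windows.net/' + tenantID
        var authContext = new AuthenticationContext("https://login.windows.net/" + tenantId);

        // client credential for the app registered in step 1
        var credential = new ClientCredential(clientAppId, clientAppSecret);

        var result = await authContext.AcquireTokenAsync("https://api.loganalytics.io", credential);
        return result.AccessToken;
    }
}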

After that we construct a very basic Log Analytics query, which is sent in JSON format.

To access the logs we have to create an HTTP request to 'https://api.loganalytics.io/v1/workspaces/{workspaceId}/query', providing the workspace ID. The HTTP client must be given the access token for a valid request. Once we post the request, a JSON string is returned that we can parse for our result.
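
A sketch of that request could look like this; the helper name and the example query format comment are assumptions:

using System.Net.Http;
using System.Net.Http.Headers;
using System.Text;
using System.Threading.Tasks;

public static class LogAnalyticsClient
{
    // hypothetical helper: posts a Kusto query to the Log Analytics API
    public static async Task<string> QueryAsync(
        string workspaceId, string accessToken, string query)
    {
        using (var client = new HttpClient())
        {
            // the request is only valid with the access token attached
            client.DefaultRequestHeaders.Authorization =
                new AuthenticationHeaderValue("Bearer", accessToken);

            // the query is sent in the body as JSON, e.g. {"query":"Usage | take 10"}
            var body = "{\"query\":\"" + query + "\"}";
            var content = new StringContent(body, Encoding.UTF8, "application/json");

            var response = await client.PostAsync(
                "https://api.loganalytics.io/v1/workspaces/" + workspaceId + "/query", content);
            response.EnsureSuccessStatusCode();

            // the result is a JSON string that can be parsed further
            return await response.Content.ReadAsStringAsync();
        }
    }
}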

 Posted by at 10:28 am
Nov 12, 2019
 

To build our project from the command line we need the executable MSBuild.exe. Normally it is part of the PATH environment variable, so open a command window, type MSBuild.exe and hit enter. If you see an error saying that the command is not found, then it is not part of the environment variables.

Normally it is located at C:\Program Files (x86)\MSBuild\14.0\Bin, so either open this path and run the command from there, or add the path to the environment variables.

After that it's really easy to run the build; all you have to do is write the command

 

MSBuild.exe solutionname.sln

 

This command will build the solution. It can come in really handy if you plan to make your own power tools, for example by invoking MSBuild from code as sketched below.
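
A minimal sketch of shelling out to MSBuild from C#; this assumes MSBuild.exe is on the PATH and the solution name is the placeholder used above:

using System;
using System.Diagnostics;

public class BuildRunner
{
    public static void Main()
    {
        var startInfo = new ProcessStartInfo
        {
            // assumes MSBuild.exe is on the PATH; otherwise use the full path
            FileName = "MSBuild.exe",
            Arguments = "solutionname.sln",
            UseShellExecute = false,
            RedirectStandardOutput = true
        };

        using (var process = Process.Start(startInfo))
        {
            // print the build log and wait for the build to finish
            Console.WriteLine(process.StandardOutput.ReadToEnd());
            process.WaitForExit();
        }
    }
}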

 Posted by at 10:31 pm
Nov 12, 2019
 

Let's start by thinking about what we want to do here. There could be a requirement to log all the web calls that a user makes, so that we have an audit trail.

Audit trails are good for keeping accountability in the application, so that problems and breaches are easy to detect and track.

If you want to add an audit trail at the request level it is really easy. You can make a filter that intercepts each web request and logs that action. You can either log the action name, or decorate each action and use that for logging.

Let's start by writing some code. First we need to write that attribute, or filter.

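The original listing was an image; here is a minimal sketch of such a filter for ASP.NET Core MVC (logging to the console is an assumption, a real implementation would use a proper logger):

using System;
using Microsoft.AspNetCore.Mvc.Filters;

public class LogThisAttribute : ActionFilterAttribute
{
    // invoked after the action method has finished executing
    public override void OnActionExecuted(ActionExecutedContext context)
    {
        // log the action name for the audit trail
        Console.WriteLine($"Audit: {context.ActionDescriptor.DisplayName} at {DateTime.UtcNow:O}");
        base.OnActionExecuted(context);
    }
}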

If you notice, this is a very basic class deriving from ActionFilterAttribute. By doing that we get access to the overridable methods of ActionFilterAttribute, one of which is OnActionExecuted. This method is invoked when the action method has finished executing.

This method will be called on each web request and you can log/validate the action call here.

Let's say there are some actions that you don't want to log, like API calls. In those cases you can write exclusion attributes, which are simple marker attributes.

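A sketch of such an exclusion attribute (a plain marker attribute):

using System;

// marker attribute: actions decorated with this are skipped by the audit log
[AttributeUsage(AttributeTargets.Method | AttributeTargets.Class)]
public class DoNotLogThisAttribute : Attribute
{
}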

To use this we will go to our LogThisAttribute and add an exclusion so that we can skip logging if the action method is decorated with the DoNotLogThis attribute.

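The updated filter could check for that marker like this; reaching the action's MethodInfo via ControllerActionDescriptor is one way to do it, and is an assumption about the original code:

using System;
using Microsoft.AspNetCore.Mvc.Controllers;
using Microsoft.AspNetCore.Mvc.Filters;

public class LogThisAttribute : ActionFilterAttribute
{
    public override void OnActionExecuted(ActionExecutedContext context)
    {
        // skip logging when the action is decorated with DoNotLogThis
        if (context.ActionDescriptor is ControllerActionDescriptor descriptor &&
            descriptor.MethodInfo.GetCustomAttributes(typeof(DoNotLogThisAttribute), false).Length > 0)
        {
            return;
        }

        Console.WriteLine($"Audit: {context.ActionDescriptor.DisplayName} at {DateTime.UtcNow:O}");
        base.OnActionExecuted(context);
    }
}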

Now that we have all the building blocks setup lets start using it.

I am going to use it in my basic ASP.NET Core MVC web application. To use this I need to configure it in the Startup.cs class.

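A sketch of that registration in Startup.cs:

using Microsoft.Extensions.DependencyInjection;

public class Startup
{
    public void ConfigureServices(IServiceCollection services)
    {
        // add the audit filter to the global filters so it runs on every action
        services.AddMvc(options =>
        {
            options.Filters.Add(new LogThisAttribute());
        });
    }
}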

If you notice, the filter is added to the global Filters collection so it is called on every action call. This is added in the MVC middleware options.

NOTE: In the OnActionExecuted method one could say: I only want to log on successful execution, otherwise I want to skip the logging. There is no direct way of handling that, except to throw a validation exception indicating that the method execution was unsuccessful. You then have to inspect that exception in the context and make decisions based on it.

You can find the code at

https://github.com/alineutron/Lab/tree/master/dotnet/HttpLogIntercepter

 

 Posted by at 10:28 pm
Nov 12, 2019
 

Azure graph search is a search that you can do from the Azure CLI. Normally you can't just do it directly; you have to install the extension for Azure Resource Graph. To do that, open the CLI and add the extension.

>az extension add --name resource-graph

This will add the resource-graph extension to the Azure CLI.

You can view the list of installed extensions by using this command.

>az extension list

After that you are ready to take advantage of graph queries. There are many commands you can run; you can find them by running

>az graph query -h

Queries use the Kusto 'project' operator to pick the columns to list. For example, you can view what is under a certain subscription by running this command

  • az graph query -q "project id" -s "subscriptionid"

This will give you a JSON representation of the output. If you want a more readable output you need to pass the table output parameter.

  • az graph query -q "project id, name" -o table

If you want to add another command to the query you have to pipe it. The easiest example is to sort the output, so run this command

  • az graph query -q "project id, name | order by name" -o table
 Posted by at 10:20 pm
Nov 12, 2019
 

Most of the configuration that we do in the Azure portal can also be expressed in the form of ARM templates. Azure Resource Manager (ARM) templates dictate how the resources will be provisioned on Azure.

In this blog I will follow a very basic guide to creating an ARM template. This guide consists of six different steps that I will follow.

 

This command is used to validate an ARM template:

az group deployment validate --resource-group videolunch --template-file .\template.json --parameters .\parameters.json

These steps are used to make an ARM template:

step 1: use git

step 2: validate and then commit

step 3: reduce the number of parameters. Remove them from the parameters JSON and add variables there instead

step 4: use unique strings

step 5: use variables. Use variables instead of constants

step 6: use t-shirt sizes or smart options. Use readable words

To create a template you can use the template manager in the Azure portal, or in Visual Studio you can create a cloud project and then add an ARM template.

The template contains a list of parameters, variables, resources, and outputs.

 

Template format

In its simplest structure, a template has the following elements:


{
  "$schema": "https://schema.management.azure.com/schemas/2015-01-01/deploymentTemplate.json#",
  "contentVersion": "",
  "apiProfile": "",
  "parameters": {},
  "variables": {},
  "functions": [],
  "resources": [],
  "outputs": {}
}

$schema (required): Location of the JSON schema file that describes the version of the template language.
For resource group deployments, use: https://schema.management.azure.com/schemas/2015-01-01/deploymentTemplate.json#
For subscription deployments, use: https://schema.management.azure.com/schemas/2018-05-01/subscriptionDeploymentTemplate.json#

contentVersion (required): Version of the template (such as 1.0.0.0). You can provide any value for this element. Use this value to document significant changes in your template. When deploying resources using the template, this value can be used to make sure that the right template is being used.

apiProfile (optional): An API version that serves as a collection of API versions for resource types. Use this value to avoid having to specify API versions for each resource in the template. When you specify an API profile version and don't specify an API version for the resource type, Resource Manager uses the API version for that resource type that is defined in the profile.
The API profile property is especially helpful when deploying a template to different environments, such as Azure Stack and global Azure. Use the API profile version to make sure your template automatically uses versions that are supported in both environments. For a list of the current API profile versions and the resource API versions defined in the profile, see API Profile.
For more information, see Track versions using API profiles.

parameters (optional): Values that are provided when deployment is executed to customize resource deployment.

variables (optional): Values that are used as JSON fragments in the template to simplify template language expressions.

functions (optional): User-defined functions that are available within the template.

resources (required): Resource types that are deployed or updated in a resource group or subscription.

outputs (optional): Values that are returned after deployment.

Each element has properties you can set; the official ARM template documentation describes the sections of the template in greater detail.

 Posted by at 10:19 pm
Nov 12, 2019
 

Below are some of the useful Azure commands that I use. I will not explain them fully, just briefly mention their output.

Login:

First of all you need to connect to Azure, and to do that you need to run the command below. When you run it you will be prompted with a message.

az login

You will need to open the browser to authenticate yourself. Once you do that, the output in PowerShell will change to show the list of all the subscriptions you are allowed to see.

List subscriptions

Let's list all the subscriptions again. Run the following command to do that.

az account list

You will see the list of all the subscriptions you are allowed to see

 

Show active subscription

Now you need to verify which subscription you are on. If you have only one subscription then you don't have to worry about that. But if you have a dev and a production subscription, it is always good to make sure what your active subscription is. The following command shows the active (default) subscription.

az account show

 

Change the subscription

All the commands that you run are against the active subscription. If you have more than one subscription, the following command can be used to switch.

az account set -s subscription_id

These are the basic commands that are used to connect to Azure and get yourself started. After that, pretty much everything that you do on the portal can be automated and written as commands in PowerShell.

 Posted by at 10:17 pm
Nov 12, 2019
 

This part of the build chain in DevOps was a huge riddle for me; I never quite got my head around it. But now that I have some idea, I will try to tell the story with a top-down approach.

So in a development process the code starts on the developer machine. In order to share this code with the other developers we use code repositories, normally online/central repositories which all the developers can access. We can achieve this by using GitHub or VSTS or any other code repository.

Once the code is pushed to an online repository, it not only serves the purpose of sharing the code, it is also used to build the code. Once the code is pushed there, it is built. After a successful build, the application artifacts are generated and published for release. These artifacts could be deployable files or executables.

The release-ready artifacts are then directed towards the installation platform, which is normally a service on a machine, and the artifacts are deployed there. In our case this will be a web app in Azure cloud services.

So to achieve this process I first created the 'code on the developer machine' part: a hello-world website in .NET Core. For my example I will use GitHub as the online repository; you can also use the Git repos available in VSTS, which VSTS can use directly.

  • GitHub: a website that hosts git repositories. Azure DevOps gives integration with GitHub, but in Visual Studio you can also use Git directly.
  • VSTS Git: this uses the Azure DevOps git, and once you push the code here, it is all available in DevOps.

I am using GitHub to host my code, so I will push all my code to my GitHub repository. I also added a .gitignore file of the Visual Studio type, as I don't want to push unwanted files. Finally my code is now on GitHub. But how can I introduce an end-to-end process where every code push re-deploys my website?

So this is the process that I followed.

First of all you need to create a new organization; I have already done that with the name Asadmirza0855. Once that is done you need to open it. To open an organization on VSTS just click it and you will be in that context.

 


Once you have opened that organization, we need to create a new project in it.


You can create a public or private project by clicking on the top left corner where it says 'new project'. I have already created a project called CICD3.

Now let's set up the build process. To do that you need to go to the Build menu under Pipelines. Once you are there, hit 'New'. Now you will be able to create the build process here. This is a four-step process.

Step 1: Where is your code? In our case we will select GitHub. It is going to authenticate you on GitHub and then fetch all the repos from there.


Step 2: After that you need to select which repo you want to build.


Step 3: In this step you need to configure the type of the application you are trying to build. VSTS provides some predefined templates, so I will use those. In my case I will use ASP.NET Core.


In the review step it is going to generate the YAML file that I will use for the build.


Mine is not the standard generated file; I have modified it a bit. There are two main tasks: one builds the project, which generates the artifact, and the other publishes that artifact. You can also use variables in the YAML file. Now commit that YAML file, and once it is pushed the build will be triggered.

The next step is to create a release pipeline. But first let's create the web app in the Azure portal. I will not explain how to create a web app here; I assume you know that. Once that web app is created we will create a release pipeline using the 'Azure App Service deployment' service template.

  1. Click New under Releases and select Azure App Service Deployment.


I have selected my project CICD3; then I need to select the latest version, so it knows which version to pick, and the source alias.

  2. You also need to select a trigger. I have selected the very basic trigger: release after every new build. You also need to update the task and deployment parameters.


  3. In the default stage, when you click on the task you will be presented with a page where you need to provide the web app name and the Azure subscription under which that web app exists.

You can also view this configuration as a YAML file.

 

Once all of this setup is done you can trigger the build with a small dummy push, and the whole chain will run: build, publish the artifact, and release it to the web app.


 Posted by at 8:34 am
Nov 11, 2019
 

OK, so this is something that I was not really aware of how to do. The assignment was to read data from Log Analytics in Azure and show it in one of our applications using C#.

The challenge was to understand how to do it. So I started searching Google and found some very interesting links.

https://dev.loganalytics.io/documentation/Tools/CSharp-Sdk

This link mentions the SDKs, but they are limited due to limitations in the OpenAPI specification. I was going to try it out to see if it works or not, but I couldn't quite understand that part; maybe it is super easy and obvious, but as I said I am not fully aware, so I decided to skip it.

The other way that I found is mentioned on this link

https://docs.microsoft.com/en-us/rest/api/loganalytics/query/get

This is the more direct way of accessing it, but it is geared towards access via PowerShell or Python. There is still no way mentioned there to access it via C#, so I found these two links:

https://stackoverflow.com/questions/53915236/querying-azure-log-analytics-from-c-sharp-application

https://blogs.technet.microsoft.com/livedevopsinjapan/2017/08/23/log-analytics-log-search-rest-api-for-c/

From the second link I was able to figure out how to get the bearer token, and from the first link I checked how to make the API call.

Initially I was getting a 403 Forbidden, so I created a new app and gave it Log Analytics access. I did another run and it still didn't work. Then I realized that my Log Analytics workspace should also allow that app to access the logs, so I had to add a role assignment in the workspace for that app. I did that, and after that I got a 404 because the log table that I accessed didn't exist. I queried another table named 'Usage' instead, and it worked fine.

 Posted by at 10:22 pm
Nov 11, 2019
 

I was unable to run the functions: the HostBuilder was working quite fine, but it was unable to connect to the Azure function and I was getting a connection refused exception. The thing that I was missing was the storage emulator.

I will start by going through making a hello world Azure Function in Visual Studio.

We have now selected the HttpTrigger template. Every time an HTTP call is made to the specified URL, this function will be triggered. For this demo we will select the authorization level Anonymous, meaning anyone can access the function.

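The generated function was shown as a screenshot; a hello-world HttpTrigger roughly matching the standard template looks like this (the function and parameter names may differ from my actual code):

using Microsoft.AspNetCore.Http;
using Microsoft.AspNetCore.Mvc;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Extensions.Http;
using Microsoft.Extensions.Logging;

public static class HelloWorldFunction
{
    [FunctionName("HelloWorld")]
    public static IActionResult Run(
        // anonymous authorization level: anyone can call the function
        [HttpTrigger(AuthorizationLevel.Anonymous, "get", "post", Route = null)] HttpRequest req,
        ILogger log)
    {
        log.LogInformation("C# HTTP trigger function processed a request.");

        // echo the optional 'name' query parameter
        string name = req.Query["name"];
        return new OkObjectResult($"Hello, {(string.IsNullOrEmpty(name) ? "world" : name)}");
    }
}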

Once you start the function you will see a bulk of output in the console; I explain those options later. At the end of the output you will see the URL that is used to access the function.

Open this URL in a browser or Postman and put a breakpoint at the beginning of the function. When you access the URL it will hit the breakpoint.

 


Let's look at the startup logs in a bit more detail. We have:

LoggerFilterOptions: with level and rules. This is used for logging, if any is provided; the log level and the filter rules are shown here.

FunctionResultAggregatorOptions: the function result aggregator controls how function invocation results are batched together before they are reported.

SingletonOptions: these options ensure the singularity of functions, i.e. that only one instance of the function is running.

HttpOptions: we also have HTTP options, where the maximum number of concurrent and outstanding requests is mentioned. You can also define the route prefix here.

 Posted by at 2:42 pm