Getting Started with Bicep

Install Bicep in your local environment using PowerShell:

# Create the install folder
$installPath = "$env:USERPROFILE\.bicep"
$installDir = New-Item -ItemType Directory -Path $installPath -Force
$installDir.Attributes += 'Hidden'
# Fetch the latest Bicep CLI binary
(New-Object Net.WebClient).DownloadFile("https://github.com/Azure/bicep/releases/latest/download/bicep-win-x64.exe", "$installPath\bicep.exe")
# Add bicep to your PATH
$currentPath = (Get-Item -path "HKCU:\Environment" ).GetValue('Path', '', 'DoNotExpandEnvironmentNames')
if (-not $currentPath.Contains("%USERPROFILE%\.bicep")) { setx PATH ($currentPath + ";%USERPROFILE%\.bicep") }
if (-not $env:path.Contains($installPath)) { $env:path += ";$installPath" }
# Verify you can now access the 'bicep' command.
bicep --help
# Done!

Install the Bicep extension in VS Code.

See the Bicep documentation: https://docs.microsoft.com/en-us/azure/azure-resource-manager/bicep/

Deploy your main.bicep file using PowerShell:

Connect-AzAccount
$context = Get-AzSubscription -SubscriptionName 'Concierge Subscription'
Set-AzContext $context

Get-AzSubscription
Set-AzDefault -ResourceGroupName learn-8ea5f9c3-7fab-4c37-8fc7-f424f67ef1a5

New-AzResourceGroupDeployment -TemplateFile main.bicep

Using Key Vault Secrets as Parameter Values

Assume your main Bicep file deploys a SQL server that needs an administrator login and password:

resource sqlServer 'Microsoft.Sql/servers@2020-11-01-preview' = {
  name: sqlServerName
  location: location
  properties: {
    administratorLogin: sqlServerAdministratorLogin
    administratorLoginPassword: sqlServerAdministratorPassword
  }
}

So you define these two as secure parameters:

@secure()
@description('The administrator login username for the SQL server.')
param sqlServerAdministratorLogin string

@secure()
@description('The administrator login password for the SQL server.')
param sqlServerAdministratorPassword string

But you shouldn't pass these sensitive values through your pipeline in plain text. Instead, reference them in parameters.json so the values are fetched from Key Vault at deployment time:

{
    "$schema": "https://schema.management.azure.com/schemas/2019-04-01/deploymentParameters.json#",
    "contentVersion": "1.0.0.0",
    "parameters": {
      "sqlServerAdministratorLogin": {
        "reference": {
          "keyVault": {
            "id": "/subscriptions/a597e5fe-3c45-4412-b944-53e730b31c57/resourceGroups/learn-bicep-rg/providers/Microsoft.KeyVault/vaults/example-bicep-kv"
          },
          "secretName": "sqlServerAdministratorLogin"
        }
      },
      "sqlServerAdministratorPassword": {
        "reference": {
          "keyVault": {
            "id": "/subscriptions/a597e5fe-3c45-4412-b944-53e730b31c57/resourceGroups/learn-bicep-rg/providers/Microsoft.KeyVault/vaults/example-bicep-kv"
          },
          "secretName": "sqlServerAdministratorPassword"
        }
      }
    }
  }
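If you generate this parameters file from a script, the reference structure is easy to build programmatically. A minimal Python sketch; the helper names are my own, only the JSON shape comes from the example above, and the vault resource ID is a placeholder:

```python
import json

# Hypothetical helper: wrap a secret name in the Key Vault reference shape
# that deploymentParameters.json expects.
def keyvault_param(vault_id: str, secret_name: str) -> dict:
    return {"reference": {"keyVault": {"id": vault_id}, "secretName": secret_name}}

def build_parameters(vault_id: str, secret_names: list) -> dict:
    # One Key Vault reference per secret name, all pointing at the same vault.
    return {
        "$schema": "https://schema.management.azure.com/schemas/2019-04-01/deploymentParameters.json#",
        "contentVersion": "1.0.0.0",
        "parameters": {name: keyvault_param(vault_id, name) for name in secret_names},
    }

params = build_parameters(
    "<vault-resource-id>",  # placeholder: the full /subscriptions/.../vaults/... ID
    ["sqlServerAdministratorLogin", "sqlServerAdministratorPassword"],
)
document = json.dumps(params, indent=2)
```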

Using Conditions

Deploying a resource may also depend on a given parameter. In that case, put the condition before the object definition:

@allowed([
  'Development'
  'Production'
])
param environmentName string

// define your condition
var auditingEnabled = environmentName == 'Production'

resource auditingSettings 'Microsoft.Sql/servers/auditingSettings@2020-11-01-preview' = if (auditingEnabled) {
  parent: server
  name: 'default'
  properties: {
  }
}

You may also want to use conditions in the properties section. Assuming you have defined an auditStorageAccount resource in the same Bicep file, you can use its properties to assign values in auditingSettings too:

resource auditStorageAccount 'Microsoft.Storage/storageAccounts@2021-02-01' = if (auditingEnabled) {
  name: auditStorageAccountName
  location: location
  sku: {
    name: storageAccountSkuName
  }
  kind: 'StorageV2'
}
resource auditingSettings 'Microsoft.Sql/servers/auditingSettings@2020-11-01-preview' = if (auditingEnabled) {
  parent: server
  name: 'default'
  properties: {
    state: 'Enabled'
    storageEndpoint: environmentName == 'Production' ? auditStorageAccount.properties.primaryEndpoints.blob : ''
    storageAccountAccessKey: environmentName == 'Production' ? listKeys(auditStorageAccount.id, auditStorageAccount.apiVersion).keys[0].value : ''
  }
}
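The ternaries above boil down to "Production gets real values, everything else gets empty strings". A small Python sketch of the same decision logic, with illustrative names:

```python
def audit_settings(environment_name: str, blob_endpoint: str, access_key: str) -> dict:
    # Mirrors the Bicep ternaries: only Production gets a real endpoint and key.
    is_production = environment_name == "Production"
    return {
        "state": "Enabled",
        "storageEndpoint": blob_endpoint if is_production else "",
        "storageAccountAccessKey": access_key if is_production else "",
    }

prod = audit_settings("Production", "https://audit.blob.core.windows.net/", "key123")
dev = audit_settings("Development", "https://audit.blob.core.windows.net/", "key123")
```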

Use Copy loops

You can use an array to create multiple resources in a loop:

param storageAccountNames array = [
  'saauditus'
  'saauditeurope'
  'saauditapac'
]

resource storageAccountResources 'Microsoft.Storage/storageAccounts@2021-01-01' = [for storageAccountName in storageAccountNames: {
  name: storageAccountName
  location: resourceGroup().location
  kind: 'StorageV2'
  sku: {
    name: 'Standard_LRS'
  }
}]

Loop based on a count

Bicep provides the range() function, which creates an array of numbers from a start index and a count.

resource storageAccountResources 'Microsoft.Storage/storageAccounts@2021-01-01' = [for i in range(1,4): {
  name: 'sa${i}'
  location: resourceGroup().location
  kind: 'StorageV2'
  sku: {
    name: 'Standard_LRS'
  }
}]
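Note that Bicep's range(startIndex, count) takes a count, not an exclusive end value, so range(1, 4) yields 1, 2, 3, 4. A quick Python sketch of the names this loop produces (bicep_range is my own helper mimicking that semantics):

```python
def bicep_range(start: int, count: int) -> list:
    # Bicep's range(start, count) returns `count` integers beginning at `start`;
    # Python's built-in range is half-open, so shift the end accordingly.
    return list(range(start, start + count))

names = [f"sa{i}" for i in bicep_range(1, 4)]
```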

Access the iteration index

When you need both the value in each iteration and the index of that value, you can specify an index variable alongside the array iterator:

param locations array = [
  'westeurope'
  'eastus2'
  'eastasia'
]

resource sqlServers 'Microsoft.Sql/servers@2020-11-01-preview' = [for (location, i) in locations: {
  name: 'sqlserver-${i+1}'
  location: location
  properties: {
    administratorLogin: administratorLogin
    administratorLoginPassword: administratorLoginPassword
  }
}]

Filter items with loops

Sometimes you only want to deploy some of the items in an array, and each item may carry several associated properties:

param sqlServerDetails array = [
  {
    name: 'sqlserver-we'
    location: 'westeurope'
    environmentName: 'Production'
  }
  {
    name: 'sqlserver-eus2'
    location: 'eastus2'
    environmentName: 'Development'
  }
  {
    name: 'sqlserver-eas'
    location: 'eastasia'
    environmentName: 'Production'
  }
]

resource sqlServers 'Microsoft.Sql/servers@2020-11-01-preview' = [for sqlServer in sqlServerDetails: if (sqlServer.environmentName == 'Production') {
  name: sqlServer.name
  location: sqlServer.location
  properties: {
    administratorLogin: administratorLogin
    administratorLoginPassword: administratorLoginPassword
  }
  tags: {
    environment: sqlServer.environmentName
  }
}]
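The filter behaves like a list comprehension with an if clause. A Python sketch of which servers survive the filter, using the data from the parameter above:

```python
sql_server_details = [
    {"name": "sqlserver-we", "location": "westeurope", "environmentName": "Production"},
    {"name": "sqlserver-eus2", "location": "eastus2", "environmentName": "Development"},
    {"name": "sqlserver-eas", "location": "eastasia", "environmentName": "Production"},
]

# Only Production entries survive, just like the `if` clause in the Bicep loop.
deployed = [s["name"] for s in sql_server_details if s["environmentName"] == "Production"]
```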

Use loops with resource properties

param subnetNames array = [
  'api'
  'worker'
]

var subnetCount = 2

resource virtualNetworks 'Microsoft.Network/virtualNetworks@2020-11-01' = [for (location, i) in locations : {
  name: 'vnet-${location}'
  location: location
  properties: {
    addressSpace:{
      addressPrefixes:[
        '10.${i}.0.0/16'
      ]
    }
    subnets: [for j in range(1, subnetCount): {
      name: 'subnet-${j}'
      properties: {
        addressPrefix: '10.${i}.${j}.0/24'
      }
    }]
  }
}]
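The nested loops combine the outer index i and the inner index j into address prefixes. A Python sketch of the resulting address plan (the function name is mine):

```python
locations = ["westeurope", "eastus2", "eastasia"]
subnet_count = 2

def vnet_prefixes(locations: list, subnet_count: int) -> dict:
    # Outer loop index i picks the /16 block; inner loop j picks each /24 inside it,
    # matching '10.${i}.0.0/16' and '10.${i}.${j}.0/24' in the Bicep above.
    return {
        f"vnet-{loc}": {
            "addressPrefix": f"10.{i}.0.0/16",
            "subnets": [f"10.{i}.{j}.0/24" for j in range(1, subnet_count + 1)],
        }
        for i, loc in enumerate(locations)
    }

plan = vnet_prefixes(locations, subnet_count)
```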

Variable loops

The next example creates an array that contains the values item1, item2, item3, item4, and item5.

var items = [for i in range(1, 5): 'item${i}']
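For comparison, the same loop in Python; note that Python's range end is exclusive, so 1 through 5 is range(1, 6):

```python
# Equivalent of the Bicep variable loop: ['item1', ..., 'item5'].
items = [f"item{i}" for i in range(1, 6)]
```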

Don’t use outputs to return secrets, such as access keys or passwords. Outputs are logged, and they aren’t designed for handling secure data.

Getting started with Python

This article covers what a .NET developer on Windows needs to set up Python and work with SQL Server.

Install Latest Python for Windows: https://www.python.org/downloads/windows/
-> My choice today: 3.10.3 Windows installer (64-bit)

  • Install the Python extension for Visual Studio Code, which enables Jupyter Notebooks
  • Install Python for VSCode, which brings syntax highlighting
  • Install the Python Extension Pack, which supports IntelliSense
  • Install pyodbc for ODBC connectivity:
    pip install pyodbc

Sample code to run:

import pyodbc 
conn_str = ("Driver={SQL Server};"
            "Server=tcp:myserver.database.windows.net;"
            "Database=mydb;"
            "UID=sqlUser;"
            "PWD=myPassWord;"
            "Trusted_Connection=no;") 
conn = pyodbc.connect(conn_str)
cursor = conn.cursor()
sql_str = ("SET NOCOUNT ON; \n"
           "select * from myTable"
          )
cursor.execute(sql_str)
for row in cursor:
    print(row)
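Assembling the connection string from key/value pairs makes quoting mistakes harder to introduce. A small sketch; the helper is mine, and the newer "{ODBC Driver 17 for SQL Server}" driver name is an assumption about what is installed on your machine:

```python
def build_conn_str(settings: dict) -> str:
    # Join key=value pairs with semicolons, the format pyodbc expects.
    return ";".join(f"{k}={v}" for k, v in settings.items()) + ";"

conn_str = build_conn_str({
    "Driver": "{ODBC Driver 17 for SQL Server}",  # assumption: this driver is installed
    "Server": "tcp:myserver.database.windows.net",
    "Database": "mydb",
    "UID": "sqlUser",
    "PWD": "myPassWord",
    "Trusted_Connection": "no",
})
```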

Nice to read more: https://wiki.python.org/moin/BeginnersGuide/Programmers

Next:

  • Installing Jupyter Notebook from https://jupyter.org/install
  • Installing Pandas as part of Anaconda:
    https://pandas.pydata.org/pandas-docs/stable/getting_started/install.html

Purge Old Azure Log Analytics Ingested Data

This article is based on the Microsoft documentation for the Log Analytics purge API.

Assuming you have a Log Analytics workspace in a resource group, you can call the API url:

POST https://management.azure.com/subscriptions/your-subscription-id/resourceGroups/your-resource-group-rg/providers/Microsoft.OperationalInsights/workspaces/your-log-analytics-name/purge?api-version=2020-08-01

You need to pass Authorization as Bearer token in the header.

The body of the POST request will contain a filter and a table like this:

{
  "table": "Heartbeat",
  "filters": [
    {
      "column": "TimeGenerated",
      "operator": "<",
      "value": "2021-10-09T00:00:00"
    }
  ]
}
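If you script the purge call, the request body can be built like this. A Python sketch; the function name is mine, and only the JSON shape comes from the example above:

```python
import json

def build_purge_body(table: str, column: str, before_iso: str) -> str:
    # Purge every record in `table` where `column` is older than `before_iso`.
    body = {
        "table": table,
        "filters": [{"column": column, "operator": "<", "value": before_iso}],
    }
    return json.dumps(body)

payload = build_purge_body("Heartbeat", "TimeGenerated", "2021-10-09T00:00:00")
```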

The response will have a header like

x-ms-status-location: https://management.azure.com/subscriptions/{subscriptioId}/resourceGroups/{resourceGroupName}/providers/microsoft.operationalinsights/workspaces/{workspaceName}/operations/purge-xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx?api-version=2017-01-01-preview

This is a GET URL you can call to check the status of the operation. The same URL is also returned in the body of the response to the initial POST request.

The status will be something like:

{
    "status": "pending"
}

Tip: You can find the records to delete using simple queries like these:

W3CIISLog 
| where TimeGenerated > ago(32d)
| summarize count() by bin(TimeGenerated, 1d)


Heartbeat 
| where TimeGenerated > ago(32d)
| summarize count() by bin(TimeGenerated, 1d)

Get Function Token CLI

Notice that the name of the token is the one created under App keys in the Function App.

function GetFunctionToken {
    param (
        [string]$funcResGroupName,
        [string]$funcName
    )

    # Get the function token ($subscriptionId must already be set in the session)
    $url = "/subscriptions/$subscriptionId/resourceGroups/$funcResGroupName/providers/Microsoft.Web/sites/$funcName/host/default/listKeys?api-version=2018-11-01"

    $functionCode = az rest --method post --uri $url --query functionKeys.appkeyName --output tsv
    return $functionCode
}

Test & Feedback extension

This extension can be installed on Chrome, Edge, and Firefox on any operating system. You simply connect the extension to your organization's Azure DevOps. It helps you write notes, take screenshots, and file a bug while you are exploring the website your team is responsible for.

See more:   https://docs.microsoft.com/en-us/azure/devops/test/perform-exploratory-tests?view=azure-devops 

You can even work in Standalone mode, which saves the report to your browser's Downloads folder.

Azure Synapse Analytics Day

This page summarizes some highlights from the Azure Immersion Workshop on Analytics.

On GitHub:

What is Delta Lake: from docs.microsoft.com 

Delta Lake is an open-source storage layer that brings ACID (atomicity, consistency, isolation, and durability) transactions to Apache Spark and big data workloads.

It supports many features like ACID transactions, time travel, and an open format that builds on Apache Parquet as its baseline.

Comparing Synapse Analytics versus Azure Data Factory
https://docs.microsoft.com/en-us/azure/synapse-analytics/data-integration/concepts-data-factory-differences 

Tutorial: Get Started 
https://docs.microsoft.com/en-us/azure/synapse-analytics/get-started

White Paper: POLARIS – The Distributed SQL Engine in Azure Synapse
https://www.vldb.org/pvldb/vol13/p3204-saborit.pdf 

Unleash Your SAP Data
https://go.qlik.com/Qlik-Unleash-Your-SAP-Data.html

Experience with free account
https://azure.microsoft.com/en-us/free/synapse-analytics/

Application Insights – Sampling

 

In Application Insights, sampling is an effective way to reduce the amount of telemetry data that is sent to the service, and with it the storage costs that full telemetry would incur.

By default, sampling is already enabled when you use the ASP.NET or ASP.NET Core SDKs.

For ASP.NET applications you can configure adaptive sampling by tuning parameters in the ApplicationInsights.config file. Some of the settings are:

<MaxTelemetryItemsPerSecond>5</MaxTelemetryItemsPerSecond>

This limits the number of items sent to Application Insights per second.

<EvaluationInterval>00:00:15</EvaluationInterval>

This is the interval at which the current rate of telemetry is re-evaluated.

<InitialSamplingPercentage>100</InitialSamplingPercentage>

This is the percentage of telemetry that is sent when the application has just started.

In ASP.Net Core applications, there is no ApplicationInsights.config file, so the configuration is done via code.

Adaptive sampling is enabled by default for all ASP.NET Core applications
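Sampling in the SDK is deterministic, so related telemetry items are kept or dropped together. A much-simplified Python sketch of hash-based fixed-rate sampling; this is not the SDK's actual algorithm, just the idea:

```python
import zlib

def should_sample(operation_id: str, sampling_percentage: float) -> bool:
    # Deterministic: the same operation id always gets the same decision,
    # so all telemetry belonging to one operation is kept or dropped together.
    score = zlib.crc32(operation_id.encode()) % 100
    return score < sampling_percentage

# At 100% everything is kept; lower percentages drop a stable subset of operations.
kept = [op for op in ("op-1", "op-2", "op-3") if should_sample(op, 100.0)]
```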

For more information on adaptive sampling, see the link below:

https://docs.microsoft.com/en-us/azure/azure-monitor/app/sampling

Azure Active Directory Commands

You may be able to read registrations in Azure AD even when you don't have enough permissions to view them in the Azure portal. For example, you can request the names of the groups you are in, or even the members of a group or an application.

The first thing you need to do in PowerShell is make sure you have the Azure AD module installed. Then log in to your Azure AD:

# Connect to Azure AD
Connect-AzureAD
# In case Connect-AzureAD is not recognized as a cmdlet, install it:
# Install-Module AzureAD -Force

Now you can query the AAD. The following are some samples:

# Which applications have I been assigned a role in?
Get-AzureADUser -SearchString "Pouya Panahy" | Get-AzureADUserAppRoleAssignment -All $true

# Get the list of groups that I am part of
Get-AzureADUser -SearchString "Pouya Panahy" `
  | Get-AzureADUserMembership -All $true `
  | Sort-Object -Property DisplayName 

# Which role assignments do I have directly?
Get-AzRoleAssignment -SignInName 'p.panahy@company.nl'

# Show all rights I've got
Get-AzRoleAssignment -SignInName 'p.panahy@company.nl' -ExpandPrincipalGroups  `
 | Sort-Object -Property DisplayName `
 | Select-Object ObjectType, RoleDefinitionName, DisplayName, Scope `
 | Format-Table

# Is my application registered?
Get-AzureADUser -SearchString "Pouya Panahy" `
 | Get-AzureADUserCreatedObject -All $true `
 | Sort-Object -Property ObjectType `
 | Select-Object ObjectType, AppId, DisplayName, HomePage, IdentifierUris `
 | Format-Table

# Looking for an application that someone else has registered
Get-AzureADServicePrincipal -All $true -Filter "startswith(DisplayName, 'AppName')"

# Who has access to my resources in a given resource group?
Get-AzRoleAssignment -Scope "/subscriptions/xxxxxxxx-xxxx-xxxx-dxxx-xxxxxxxxxxxx/resourceGroups/res-grp-name"  `
 | Sort-Object -Property RoleDefinitionName, DisplayName `
 | Select-Object ObjectType, RoleDefinitionName, DisplayName, Scope `
 | Format-Table

# List the members of a group
Get-AzureAdGroup -All $true -SearchString 'Group Name' | Get-AzureADGroupMember

 

OWASP test in Release Pipeline

On this page we add some tasks to an Azure release pipeline to run the tests.

Prerequisites

There is already a Docker image containing the ZAP files and a Python script called zap-baseline.py that runs the process. The image, owasp/zap2docker-stable, requires a shared folder to write the report to. To mount a file share, I use an Azure storage account containing a share called security. I generate the key to access the share and start the process.

When the process has completed, you need a file called OWASPToNUnit3.xslt to convert the report into an NUnit file that can be published as a test result.

OWASP Stage Tasks

There are 3 tasks in this stage:

  1. OWASP in Azure CLI
    which sets up a Container Instance that runs the tests
  2. Transforming PowerShell Script
    which uses a PowerShell script to transform the result into NUnit
  3. Publish Test Results
    which makes the result visible in the pipeline as Test Results

Stage Tasks Yaml

steps:
  - task: AzureCLI@2
    displayName: 'OWASP in Azure CLI'
    inputs:
      azureSubscription: 'Owasp_grp_sp'
      scriptType: ps
      scriptLocation: inlineScript
      inlineScript: |
        $key = '"the-Key-to-Storage-Account-shared-location=="'
        $ZAP_COMMAND = "/zap/zap-baseline.py -t """"https://the-url-to-test.something"""" -x OWASP-ZAP-Report.xml"

        az container create `
          --resource-group owasp_grp `
          --name owasp `
          --image owasp/zap2docker-stable `
          --ip-address public `
          --ports 8080 `
          --azure-file-volume-account-name owaspstore1000 `
          --azure-file-volume-account-key $key `
          --azure-file-volume-share-name security `
          --azure-file-volume-mount-path /zap/wrk/ `
          --command-line $ZAP_COMMAND
        az storage file download `
          --account-name owaspstore1000 `
          --account-key $key `
          -s security `
          -p OWASP-ZAP-Report.xml `
          --dest "$env:SYSTEM_DEFAULTWORKINGDIRECTORY\OWASP-ZAP-Report.xml"
       
  - powershell: | 
     ## The powershell task for converting the test report 
     $XslPath = "$($Env:SYSTEM_DEFAULTWORKINGDIRECTORY)\_Managed-Security/OWASPToNUnit3.xslt"
     $XmlInputPath = "$($Env:SYSTEM_DEFAULTWORKINGDIRECTORY)\OWASP-ZAP-Report.xml"
     $XmlOutputPath = "$($Env:SYSTEM_DEFAULTWORKINGDIRECTORY)\Converted-OWASP-ZAP-Report.xml"
     $XslTransform = New-Object System.Xml.Xsl.XslCompiledTransform
     $XslTransform.Load($XslPath)
     $XslTransform.Transform($XmlInputPath, $XmlOutputPath)
    displayName: 'Transforming PowerShell Script'


  - task: PublishTestResults@2
    displayName: 'Publish Test Results Converted-OWASP-ZAP-Report.xml'
    inputs:
      testResultsFormat: NUnit
      testResultsFiles: 'Converted-OWASP-ZAP-Report.xml'
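Conceptually, the transform step reads the ZAP XML report and turns its alert items into test results. A Python sketch that parses a trimmed stand-in report; the element names follow ZAP's XML report format, and the sample data is invented:

```python
import xml.etree.ElementTree as ET

# A trimmed, invented stand-in for OWASP-ZAP-Report.xml.
sample_report = """<?xml version="1.0"?>
<OWASPZAPReport>
  <site name="https://the-url-to-test.something">
    <alerts>
      <alertitem><alert>X-Frame-Options Header Not Set</alert><riskcode>2</riskcode></alertitem>
      <alertitem><alert>Cookie Without Secure Flag</alert><riskcode>1</riskcode></alertitem>
    </alerts>
  </site>
</OWASPZAPReport>"""

def count_alerts(report_xml: str, min_risk: int = 0) -> int:
    # Count alert items at or above a risk threshold (0=info ... 3=high).
    root = ET.fromstring(report_xml)
    return sum(
        1
        for item in root.iter("alertitem")
        if int(item.findtext("riskcode", "0")) >= min_risk
    )

total = count_alerts(sample_report)
medium_or_higher = count_alerts(sample_report, min_risk=2)
```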

 

 

Docker and Webapp

On this page we create a Web App that serves a Docker image and see how to put it into CI/CD.

Create a Web App

If you are creating a web app in the Azure portal, start by selecting Docker Container as the Publish property on the Basics tab. Choose Linux as the Operating System. For this example a Standard SKU will do fine.
Next, on the Docker tab, select your existing azurecr.io container registry and specify the Image:Tag you want to serve.

My template looks something like the following:

"resources": [{
        "apiVersion": "2018-11-01",
        "name": "[parameters('name')]",
        "type": "Microsoft.Web/sites",
        "location": "[parameters('location')]",
        "tags": {},
        "dependsOn": ["[concat('Microsoft.Web/serverfarms/', parameters('hostingPlanName'))]"],
        "properties": {
            "name": "[parameters('name')]",
            "siteConfig": {
                "appSettings": [{
                        "name": "DOCKER_REGISTRY_SERVER_URL",
                        "value": "[parameters('dockerRegistryUrl')]"
                    }, {
                        "name": "DOCKER_REGISTRY_SERVER_USERNAME",
                        "value": "[parameters('dockerRegistryUsername')]"
                    }, {
                        "name": "DOCKER_REGISTRY_SERVER_PASSWORD",
                        "value": "[parameters('dockerRegistryPassword')]"
                    }, {
                        "name": "WEBSITES_ENABLE_APP_SERVICE_STORAGE",
                        "value": "false"
                    }
                ],
                "linuxFxVersion": "[parameters('linuxFxVersion')]",
                "appCommandLine": "",
                "alwaysOn": "[parameters('alwaysOn')]"
            },
            "serverFarmId": "[concat('/subscriptions/', parameters('subscriptionId'),'/resourcegroups/', parameters('serverFarmResourceGroup'), '/providers/Microsoft.Web/serverfarms/', parameters('hostingPlanName'))]",
            "clientAffinityEnabled": false
        }
    }, {
        "apiVersion": "2018-11-01",
        "name": "[parameters('hostingPlanName')]",
        "type": "Microsoft.Web/serverfarms",
        "location": "West Europe",
        "kind": "linux",
        "tags": {},
        "dependsOn": [],
        "properties": {
            "perSiteScaling": false,
            "maximumElasticWorkerCount": 1,
            "isSpot": false,
            "reserved": true,
            "isXenon": false,
            "hyperV": false,
            "targetWorkerCount": 0,
            "targetWorkerSizeId": 0
        },
        "sku": {
            "Tier": "Standard",
            "Name": "S1",
            "size": "S1",
            "family": "S",
            "capacity": 1
        }
    }
]

 

From a security standpoint, I have to point out that the Web App connects to Azure Container Registry using 3 configuration items, i.e. the server URL, username, and password. These items are visible in the Azure portal under Configuration:
DOCKER_REGISTRY_SERVER_URL, DOCKER_REGISTRY_SERVER_USERNAME, DOCKER_REGISTRY_SERVER_PASSWORD

Deploy Docker Image

In both cases where the Docker image is pulled from Container Registry (a Container Instance as well as a Web App Docker instance), you need to restart the instance to pick up a newly pushed image.

Another option is to move the pull task into an Azure pipeline. My example is defined as follows:

steps:
- task: AzureRmWebAppDeployment@4
  displayName: 'Deploy Azure App Service'
  inputs:
    azureSubscription: '$(Parameters.ConnectedServiceName)'
    appType: 'Web App for Containers (Linux)'
    WebAppName: '$(Parameters.WebAppName)'
    DockerNamespace: 'https://securedcontainerregistry.azurecr.io'
    DockerRepository: 'https://securedcontainerregistry.azurecr.io/securedazurelib'
    DockerImageTag: 'latest'
    StartupCommand: ''