Getting Started with Bicep

Install Bicep in your local environment:

# Create the install folder
$installPath = "$env:USERPROFILE\.bicep"
$installDir = New-Item -ItemType Directory -Path $installPath -Force
$installDir.Attributes += 'Hidden'
# Fetch the latest Bicep CLI binary
(New-Object Net.WebClient).DownloadFile("https://github.com/Azure/bicep/releases/latest/download/bicep-win-x64.exe", "$installPath\bicep.exe")
# Add bicep to your PATH
$currentPath = (Get-Item -path "HKCU:\Environment" ).GetValue('Path', '', 'DoNotExpandEnvironmentNames')
if (-not $currentPath.Contains("%USERPROFILE%\.bicep")) { setx PATH ($currentPath + ";%USERPROFILE%\.bicep") }
if (-not $env:path.Contains($installPath)) { $env:path += ";$installPath" }
# Verify you can now access the 'bicep' command.
bicep --help
# Done!
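
With the CLI installed you can already compile a Bicep file into an ARM template (this assumes a main.bicep in the current folder):

# Transpile main.bicep into a plain ARM JSON template (produces main.json)
bicep build main.bicep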

Install the Bicep extension for VS Code

See the Bicep documentation: https://docs.microsoft.com/en-us/azure/azure-resource-manager/bicep/

Deploy your main.bicep file using PowerShell:

# Sign in to Azure
Connect-AzAccount
# Select the subscription to deploy to
$context = Get-AzSubscription -SubscriptionName 'Concierge Subscription'
Set-AzContext $context

# Check the active subscription and set the default resource group
Get-AzSubscription
Set-AzDefault -ResourceGroupName learn-8ea5f9c3-7fab-4c37-8fc7-f424f67ef1a5

# Deploy the Bicep file to the default resource group
New-AzResourceGroupDeployment -TemplateFile main.bicep

Using Key Vault Secrets as Parameter Values

Assume your main Bicep file needs an administrator login and password for a SQL server resource:

resource sqlServer 'Microsoft.Sql/servers@2020-11-01-preview' = {
  name: sqlServerName
  location: location
  properties: {
    administratorLogin: sqlServerAdministratorLogin
    administratorLoginPassword: sqlServerAdministratorPassword
  }
}

So you define these two values as secure parameters:

@secure()
@description('The administrator login username for the SQL server.')
param sqlServerAdministratorLogin string

@secure()
@description('The administrator login password for the SQL server.')
param sqlServerAdministratorPassword string

But you shouldn't pass these sensitive values as plain text through your pipeline. Instead, you reference them from Key Vault in the parameters.json file:

{
    "$schema": "https://schema.management.azure.com/schemas/2019-04-01/deploymentParameters.json#",
    "contentVersion": "1.0.0.0",
    "parameters": {
      "sqlServerAdministratorLogin": {
        "reference": {
          "keyVault": {
            "id": "/subscriptions/a597e5fe-3c45-4412-b944-53e730b31c57/resourceGroups/learn-bicep-rg/providers/Microsoft.KeyVault/vaults/example-bicep-kv"
          },
          "secretName": "sqlServerAdministratorLogin"
        }
      },
      "sqlServerAdministratorPassword": {
        "reference": {
          "keyVault": {
            "id": "/subscriptions/a597e5fe-3c45-4412-b944-53e730b31c57/resourceGroups/learn-bicep-rg/providers/Microsoft.KeyVault/vaults/example-bicep-kv"
          },
          "secretName": "sqlServerAdministratorPassword"
        }
      }
    }
}
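
You then pass the parameter file to the deployment (a minimal sketch, assuming the file above is saved as parameters.json next to main.bicep):

# Secrets are resolved from Key Vault at deployment time; the vault must have
# been created with enabledForTemplateDeployment set to true.
New-AzResourceGroupDeployment `
  -TemplateFile main.bicep `
  -TemplateParameterFile parameters.json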

Using Conditions

Deploying a resource may also depend on a given parameter. In that case, put an if condition before the object definition:

@allowed([
  'Development'
  'Production'
])
param environmentName string

// define your condition
var auditingEnabled = environmentName == 'Production'

resource auditingSettings 'Microsoft.Sql/servers/auditingSettings@2020-11-01-preview' = if (auditingEnabled) {
  parent: server
  name: 'default'
  properties: {
  }
}
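
Template parameters surface as additional cmdlet parameters on New-AzResourceGroupDeployment, so you can drive the condition from PowerShell (a sketch; any other required parameters will be prompted for):

# The auditing settings resource is deployed only when environmentName is 'Production'
New-AzResourceGroupDeployment `
  -TemplateFile main.bicep `
  -environmentName 'Production'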

You may also want to use conditions in the properties section. Assuming you have defined an auditStorageAccount resource in the same Bicep file, you can use its properties to assign values to the auditing settings too:

resource auditStorageAccount 'Microsoft.Storage/storageAccounts@2021-02-01' = if (auditingEnabled) {
  name: auditStorageAccountName
  location: location
  sku: {
    name: storageAccountSkuName
  }
  kind: 'StorageV2'
}
resource auditingSettings 'Microsoft.Sql/servers/auditingSettings@2020-11-01-preview' = if (auditingEnabled) {
  parent: server
  name: 'default'
  properties: {
    state: 'Enabled'
    storageEndpoint: environmentName == 'Production' ? auditStorageAccount.properties.primaryEndpoints.blob : ''
    storageAccountAccessKey: environmentName == 'Production' ? listKeys(auditStorageAccount.id, auditStorageAccount.apiVersion).keys[0].value : ''
  }
}

Use Copy loops

You can loop over an array to create multiple resources:

param storageAccountNames array = [
  'saauditus'
  'saauditeurope'
  'saauditapac'
]

resource storageAccountResources 'Microsoft.Storage/storageAccounts@2021-01-01' = [for storageAccountName in storageAccountNames: {
  name: storageAccountName
  location: resourceGroup().location
  kind: 'StorageV2'
  sku: {
    name: 'Standard_LRS'
  }
}]
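
The same works for array parameters; you can override the default at deployment time (a sketch with example account names):

# Creates one storage account per name in the array
New-AzResourceGroupDeployment `
  -TemplateFile main.bicep `
  -storageAccountNames @('saauditdev1', 'saauditdev2')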

Loop based on a count

Bicep provides the range() function, which creates an array of numbers. Loop over it when you need a fixed number of resources:

resource storageAccountResources 'Microsoft.Storage/storageAccounts@2021-01-01' = [for i in range(1,4): {
  name: 'sa${i}'
  location: resourceGroup().location
  kind: 'StorageV2'
  sku: {
    name: 'Standard_LRS'
  }
}]

Access the iteration index

When you need both the value in each iteration and its index, you can specify an index variable alongside the array iterator:

param locations array = [
  'westeurope'
  'eastus2'
  'eastasia'
]

resource sqlServers 'Microsoft.Sql/servers@2020-11-01-preview' = [for (location, i) in locations: {
  name: 'sqlserver-${i+1}'
  location: location
  properties: {
    administratorLogin: administratorLogin
    administratorLoginPassword: administratorLoginPassword
  }
}]

Filter items with loops

You can combine a loop with a condition to deploy only the array items that match a filter. Here each item carries several associated properties, and only the production servers are deployed:

param sqlServerDetails array = [
  {
    name: 'sqlserver-we'
    location: 'westeurope'
    environmentName: 'Production'
  }
  {
    name: 'sqlserver-eus2'
    location: 'eastus2'
    environmentName: 'Development'
  }
  {
    name: 'sqlserver-eas'
    location: 'eastasia'
    environmentName: 'Production'
  }
]

resource sqlServers 'Microsoft.Sql/servers@2020-11-01-preview' = [for sqlServer in sqlServerDetails: if (sqlServer.environmentName == 'Production') {
  name: sqlServer.name
  location: sqlServer.location
  properties: {
    administratorLogin: administratorLogin
    administratorLoginPassword: administratorLoginPassword
  }
  tags: {
    environment: sqlServer.environmentName
  }
}]

Use loops with resource properties

You can also use a loop inside a resource's properties. This example creates a virtual network per location and uses a nested loop to define its subnets:

param subnetNames array = [
  'api'
  'worker'
]

resource virtualNetworks 'Microsoft.Network/virtualNetworks@2020-11-01' = [for (location, i) in locations: {
  name: 'vnet-${location}'
  location: location
  properties: {
    addressSpace:{
      addressPrefixes:[
        '10.${i}.0.0/16'
      ]
    }
    subnets: [for (subnetName, j) in subnetNames: {
      name: subnetName
      properties: {
        addressPrefix: '10.${i}.${j}.0/24'
      }
    }]
  }
}]

Variable loops

The next example creates an array that contains the values item1, item2, item3, item4, and item5.

var items = [for i in range(1, 5): 'item${i}']

Don’t use outputs to return secrets, such as access keys or passwords. Outputs are logged, and they aren’t designed for handling secure data.

Getting started with Python

This article covers all you need to set up Python on Windows as a .NET developer working with SQL Server.

Install Latest Python for Windows: https://www.python.org/downloads/windows/
-> My choice today: 3.10.3 Windows installer (64-bit)

  • Install the Python extension for Visual Studio Code, which enables Jupyter Notebooks
  • Install Python for VSCode, which brings syntax highlighting
  • Install the Python Extension Pack, which adds IntelliSense support
  • Install the pyodbc package for ODBC access:
    pip install pyodbc

Sample code to run:

import pyodbc

# Connection string for an Azure SQL database (replace server, database and credentials)
conn_str = ("Driver={SQL Server};"
            "Server=tcp:myserver.database.windows.net;"
            "Database=mydb;"
            "UID=sqlUser;"
            "PWD=myPassWord;"
            "Trusted_Connection=no;")
conn = pyodbc.connect(conn_str)
cursor = conn.cursor()
# SET NOCOUNT ON suppresses row-count messages that can interfere with result sets
sql_str = ("SET NOCOUNT ON;\n"
           "select * from myTable")
cursor.execute(sql_str)
for row in cursor:
    print(row)

Further reading: https://wiki.python.org/moin/BeginnersGuide/Programmers

Next:

  • Installing Jupyter Notebook from https://jupyter.org/install
  • Installing Pandas as part of Anaconda:
    https://pandas.pydata.org/pandas-docs/stable/getting_started/install.html

Purge Old Azure Log Analytics Ingested Data

This article is based on the Microsoft documentation for the Log Analytics workspace purge REST API.

Assuming you have a Log Analytics workspace in a resource group, you can call this API URL:

POST https://management.azure.com/subscriptions/your-subscription-id/resourceGroups/your-resource-group-rg/providers/Microsoft.OperationalInsights/workspaces/your-log-analytics-name/purge?api-version=2020-08-01

You need to pass an Authorization header containing a Bearer token.

The body of the POST request contains the table name and a filter, like this:

{
  "table": "Heartbeat",
  "filters": [
    {
      "column": "TimeGenerated",
      "operator": "<",
      "value": "2021-10-09T00:00:00"
    }
  ]
}

The response will have a header like:

x-ms-status-location: https://management.azure.com/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/microsoft.operationalinsights/workspaces/{workspaceName}/operations/purge-xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx?api-version=2017-01-01-preview

Send a GET request to this URL to check the status of the purge operation. The same operation ID is also returned in the body of the initial POST response.

The status will be something like:

{
    "status": "pending"
}
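
A minimal PowerShell sketch of the whole flow using Invoke-AzRestMethod from the Az module (the subscription, resource group, workspace name and cutoff date are placeholders):

# Issue the purge request against the workspace
$path = "/subscriptions/your-subscription-id/resourceGroups/your-resource-group-rg/providers/Microsoft.OperationalInsights/workspaces/your-log-analytics-name/purge?api-version=2020-08-01"
$body = @{
    table   = 'Heartbeat'
    filters = @(@{ column = 'TimeGenerated'; operator = '<'; value = '2021-10-09T00:00:00' })
} | ConvertTo-Json -Depth 4
$response = Invoke-AzRestMethod -Path $path -Method POST -Payload $body

# Poll the status endpoint from the x-ms-status-location response header
$statusUrl = $response.Headers.GetValues('x-ms-status-location')[0]
(Invoke-AzRestMethod -Uri $statusUrl -Method GET).Content | ConvertFrom-Json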

Tip: You can find the records to delete using a simple query like this:

W3CIISLog 
| where TimeGenerated > ago(32d)
| summarize count() by bin(TimeGenerated, 1d)


Heartbeat 
| where TimeGenerated > ago(32d)
| summarize count() by bin(TimeGenerated, 1d)

Get Function Token CLI

This PowerShell function fetches a function key through the Azure CLI. Notice the name of the key, which is created under App keys in the Function App.

function GetFunctionToken {
    param (
        [string]$funcResGroupName,
        [string]$funcName,
        [string]$subscriptionId
    )

    # List the host keys of the Function App via the ARM REST API
    $url = "/subscriptions/$subscriptionId/resourceGroups/$funcResGroupName/providers/Microsoft.Web/sites/$funcName/host/default/listKeys?api-version=2018-11-01"

    # 'appkeyName' is the name of the app key to read; replace it with your own key name
    $functionCode = az rest --method post --uri $url --query functionKeys.appkeyName --output tsv
    return $functionCode
}
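
A hypothetical call (the resource group, function app and subscription ID are placeholders), using the returned key to invoke a function:

$token = GetFunctionToken -funcResGroupName 'my-func-rg' -funcName 'my-func-app' -subscriptionId 'xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx'
Invoke-RestMethod -Uri "https://my-func-app.azurewebsites.net/api/MyFunction?code=$token"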

Publish packages to nuget.org

You can follow different paths to publish your package to nuget.org. The prerequisites and steps are described in the Microsoft documentation.

If you get an Error 403 (The specified API Key is invalid, has expired, or does not have permission to access the specified package.), it simply indicates that the package name you chose has already been published and you are not allowed to overwrite it. That makes sense: once you upload your package, you don't want other people changing it without an API key you provided.

One good practice is to use a different API key for creating new packages than for pushing new versions of existing ones. That way a deployment pipeline that only holds the versioning key fails if the package name does not exist, instead of silently creating a brand-new package on every deployment.
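
For nuget.org itself the push is a single CLI call (the package file name and the environment variable holding the API key are placeholders):

# Push the package to nuget.org with an API key
dotnet nuget push MyPackage.1.0.0.nupkg --api-key $env:NUGET_API_KEY --source https://api.nuget.org/v3/index.json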

When publishing a NuGet package to Azure DevOps Artifacts, you do something similar with the push command in a DotNetCoreCLI task:

    - task: DotNetCoreCLI@2
      displayName: 'dotnet push'
      inputs:
        command: push
        packagesToPush: '$(Build.ArtifactStagingDirectory)/*.nupkg'
        nuGetFeedType: 'internal'
        publishVstsFeed: 'feed-name-or-id'
      condition: and(succeeded(), eq(variables['Build.SourceBranch'], 'refs/heads/master'))

Test & Feedback extension

This extension can be installed on Chrome, Edge, and Firefox on any operating system. You simply connect the extension to your organization's Azure DevOps. It helps you write notes, take screenshots, and file bugs while you explore the website your team is responsible for.

See more: https://docs.microsoft.com/en-us/azure/devops/test/perform-exploratory-tests?view=azure-devops

You can even work in Standalone mode, which saves the report to your browser's Downloads folder.

Azure Synapse Analytics Day

This page summarizes some highlights from the Azure Immersion Workshop on Analytics.

What is Delta Lake? (from docs.microsoft.com)

Delta Lake is an open-source storage layer that brings ACID (atomicity, consistency, isolation, and durability) transactions to Apache Spark and big data workloads.

It supports many features like ACID transactions, time travel, and an open format that uses Apache Parquet as its baseline format.

Comparing Synapse Analytics versus Azure Data Factory
https://docs.microsoft.com/en-us/azure/synapse-analytics/data-integration/concepts-data-factory-differences 

Tutorial: Get Started 
https://docs.microsoft.com/en-us/azure/synapse-analytics/get-started

White Paper: POLARIS – The Distributed SQL Engine in Azure Synapse
https://www.vldb.org/pvldb/vol13/p3204-saborit.pdf 

Unleash Your SAP Data
https://go.qlik.com/Qlik-Unleash-Your-SAP-Data.html

Experience with free account
https://azure.microsoft.com/en-us/free/synapse-analytics/

Application Insights – Sampling

Sampling is an effective way to reduce the volume of telemetry that your application sends to Application Insights, and with it the storage costs you would incur if every telemetry item were kept.

By default, sampling is already enabled when you use the ASP.NET or ASP.NET Core SDKs.

For ASP.NET applications, you can configure adaptive sampling by tuning parameters in the ApplicationInsights.config file. Some of the settings are:

<MaxTelemetryItemsPerSecond>5</MaxTelemetryItemsPerSecond>

This limits the number of telemetry items sent to Application Insights per second.

<EvaluationInterval>00:00:15</EvaluationInterval>

This is the interval at which the current rate of telemetry is reevaluated.

<InitialSamplingPercentage>100</InitialSamplingPercentage>

This is the percentage of telemetry that is sent when the application has just started.

In ASP.NET Core applications there is no ApplicationInsights.config file, so the configuration is done in code. Adaptive sampling is enabled by default for all ASP.NET Core applications.

For more information on adaptive sampling, see:

https://docs.microsoft.com/en-us/azure/azure-monitor/app/sampling

Azure Active Directory Commands

You may be able to read registrations in AAD from PowerShell even when you don't have enough permissions to do so from the Azure Portal. You can, for example, request the names of the groups you are in, or even the members of a group or an application.

The first thing you need to do in PowerShell is make sure the Azure AD module is installed. Then log in to your Azure AD:

# Connect to Azure AD
Connect-AzureAD
# In case Connect-AzureAD is not recognized as a cmdlet, install it:
# Install-Module AzureAD -Force

Now you can query the AAD. The following are some samples:

# Which applications have I been assigned a role in?
Get-AzureADUser -SearchString "Pouya Panahy" | Get-AzureADUserAppRoleAssignment -All $true

# Get the list of groups that I am part of
Get-AzureADUser -SearchString "Pouya Panahy" `
  | Get-AzureADUserMembership -All $true `
  | Sort-Object -Property DisplayName 

# Which roles am I assigned directly?
Get-AzRoleAssignment -SignInName 'p.panahy@company.nl'

# Show all rights I've got, including those inherited through group membership
Get-AzRoleAssignment -SignInName 'p.panahy@company.nl' -ExpandPrincipalGroups  `
 | Sort-Object -Property DisplayName `
 | Select-Object ObjectType, RoleDefinitionName, DisplayName, Scope `
 | Format-Table

# Is my application registered?
Get-AzureADUser -SearchString "Pouya Panahy" `
 | Get-AzureADUserCreatedObject -All $true `
 | Sort-Object -Property ObjectType `
 | Select-Object ObjectType, AppId, DisplayName, HomePage, IdentifierUris `
 | Format-Table

# Look for an application that someone else has registered
Get-AzureADServicePrincipal -All $true -Filter "startswith(DisplayName, 'AppName')"

# Who has access to my resources in a given resource group?
Get-AzRoleAssignment -Scope "/subscriptions/xxxxxxxx-xxxx-xxxx-dxxx-xxxxxxxxxxxx/resourceGroups/res-grp-name"  `
 | Sort-Object -Property RoleDefinitionName, DisplayName `
 | Select-Object ObjectType, RoleDefinitionName, DisplayName, Scope `
 | Format-Table

# List the members of a group
Get-AzureAdGroup -All $true -SearchString 'Group Name' | Get-AzureADGroupMember

OWASP test in Release Pipeline

On this page we are going to add some tasks to an Azure Release pipeline to run the tests.

Prerequisites

There is a Docker image containing the ZAP files and a Python script called zap-baseline.py that runs the scan. The image, owasp/zap2docker-stable, requires a shared folder to put the report in. To mount a file share, I use an Azure storage account containing a file share called security. I generate the key to access the share and start the process.

When the process has completed, you need a file called OWASPToNUnit3.xslt to convert the report into an NUnit file that can be published as a test result.

OWASP Stage Tasks

There are 3 tasks in this stage:

  1. OWASP in Azure CLI
    which sets up a container instance that runs the tests
  2. Transforming PowerShell Script
    which uses a PowerShell script to transform the result into NUnit format
  3. Publish Test Results
    which makes the result visible in the pipeline as Test Results

Stage Tasks YAML

steps:
  - task: AzureCLI@2
    displayName: 'OWASP in Azure CLI'
    inputs:
      azureSubscription: 'Owasp_grp_sp'
      scriptType: ps
      scriptLocation: inlineScript
      inlineScript: |
        $key = 'the-Key-to-Storage-Account-shared-location=='
        $ZAP_COMMAND = '/zap/zap-baseline.py -t https://the-url-to-test.something -x OWASP-ZAP-Report.xml'

        # Run the baseline scan in a container instance that mounts the file share on /zap/wrk/
        az container create `
           --resource-group owasp_grp `
           --name owasp `
           --image owasp/zap2docker-stable `
           --ip-address public `
           --ports 8080 `
           --azure-file-volume-account-name owaspstore1000 `
           --azure-file-volume-account-key $key `
           --azure-file-volume-share-name security `
           --azure-file-volume-mount-path /zap/wrk/ `
           --command-line $ZAP_COMMAND
        # Download the report from the file share into the working directory
        az storage file download `
           --account-name owaspstore1000 `
           --account-key $key `
           -s security `
           -p OWASP-ZAP-Report.xml `
           --dest "$env:SYSTEM_DEFAULTWORKINGDIRECTORY\OWASP-ZAP-Report.xml"
       
  - powershell: | 
     ## The powershell task for converting the test report 
     $XslPath = "$($Env:SYSTEM_DEFAULTWORKINGDIRECTORY)\_Managed-Security\OWASPToNUnit3.xslt"
     $XmlInputPath = "$($Env:SYSTEM_DEFAULTWORKINGDIRECTORY)\OWASP-ZAP-Report.xml"
     $XmlOutputPath = "$($Env:SYSTEM_DEFAULTWORKINGDIRECTORY)\Converted-OWASP-ZAP-Report.xml"
     $XslTransform = New-Object System.Xml.Xsl.XslCompiledTransform
     $XslTransform.Load($XslPath)
     $XslTransform.Transform($XmlInputPath, $XmlOutputPath)
    displayName: 'Transforming PowerShell Script'


  - task: PublishTestResults@2
    displayName: 'Publish Test Results Converted-OWASP-ZAP-Report.xml'
    inputs:
      testResultsFormat: NUnit
      testResultsFiles: 'Converted-OWASP-ZAP-Report.xml'