Az CLI Core on PowerShell Core – Escaping Parameters

If you are deploying Azure solutions, you will sooner or later need infrastructure as code (IaC) to create repeatable, idempotent deployments.
These should ideally be parameterized and automated, e.g. using Azure DevOps pipelines. The Az CLI can be a great asset here!
In contrast to ARM templates or Terraform scripts, Az CLI scripts do not require a steep learning curve, because they look much more familiar to anybody who has ever written a batch script.
I also appreciate the compactness and the easier handling, which help a lot to flatten the otherwise steep learning curve into IaC.
In addition, as a developer, I feel much more at home working in a "real" programming/scripting environment, instead of wrapping my mind around JSON templates that have program workflow assets bolted on as needed.

A few things to keep in mind are:

  • Run Az CLI / PowerShell Core scripts on Linux or macOS build machines – the Windows version is not idempotent!
  • The Az CLI is quite new, and a lot of commands are still in their early stages ("experimental", as the product group calls them).
  • Not all Azure infrastructure areas are covered by the Az CLI yet. This can be mitigated by calling ARM templates from the CLI script, if required. This is not ideal, but real life seldom is black and white. 🙂 The white areas on the map are also shrinking fast!

One of the biggest obstacles when getting started with the Az CLI in combination with PowerShell Core – at least for me – was the handling of parameters.
To get over this, it helps to understand that the Az CLI is written in Python. Therefore, if you need to escape parameters – for example, when installing a script extension on a VM – you need to use the Python escape mechanisms, not the PowerShell ones, because the Python code is on the receiving end.

az vm extension set -n VMAccessForLinux --publisher Microsoft.OSTCExtensions --version 1.4 \
--vm-name MyVm --resource-group MyResourceGroup \
--protected-settings '{"username":"user1", "ssh_key":"ssh_rsa …"}'

--protected-settings expects JSON input containing quotation marks, which need to be escaped. And, as stated before, do not use the PowerShell escape character (the backtick `) for this, but the following template:

JSON escape sequence template for the Az CLI:
'{\"value\":\"ParameterValue\"}'

For the sample --protected-settings, this would read:

'{\"username\":\"user1\", \"ssh_key\":\"ssh_rsa …\"}'

The backslash will help you out!
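
Putting it all together, the full call from a PowerShell Core session would look roughly like this (just a sketch of the sample above – the backtick at the end of a line is only PowerShell's line continuation character, replacing the Bash-style backslash, and the resource names are of course placeholders):

az vm extension set -n VMAccessForLinux --publisher Microsoft.OSTCExtensions --version 1.4 `
  --vm-name MyVm --resource-group MyResourceGroup `
  --protected-settings '{\"username\":\"user1\", \"ssh_key\":\"ssh_rsa …\"}'

The escaped double quotes survive PowerShell's own string handling and arrive intact at the Python side of the Az CLI.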

If the JSON input gets longer, doing this manually is quite tedious, so I am also going to show another technique using JSON parameter files in one of my upcoming posts.
That approach is also great for handling dynamic input.
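
As a small foretaste (a minimal sketch only – as far as I know, --protected-settings also accepts the path to a JSON file, but please verify with az vm extension set --help for your CLI version), you can keep the plain, unescaped JSON in a file, here hypothetically named protected-settings.json, and hand the file to the command instead of an inline string:

# protected-settings.json contains the plain, unescaped JSON, e.g.:
# {"username":"user1", "ssh_key":"ssh_rsa …"}
az vm extension set -n VMAccessForLinux --publisher Microsoft.OSTCExtensions --version 1.4 `
  --vm-name MyVm --resource-group MyResourceGroup `
  --protected-settings ./protected-settings.json

No escaping is needed, and the file can easily be generated dynamically in a pipeline.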

Stay tuned. 🙂

Alexander

Cloud News 2021 / Q1

The Wechsler Consulting Cloud News – 2021 / Q1 – episode tackles the newest information from Microsoft Ignite 2021 around Azure and Azure IoT. Some very interesting changes regarding local vs. centralized computing as well as creators/innovators were announced.

The focus topic this time is Azure IoT Device Update – a long-desired device management capability for Azure IoT Hub.

Links

Microsoft Ignite 2021
Decentralized Computing / Empower creators & innovators
https://myignite.microsoft.com/sessions/5f16199c-9aae-40f2-a7ba-157a477faefd

Microsoft Mesh
Introducing Microsoft Mesh | Here can be anywhere.

Focus Topic
Azure IoT – Device Update
Introduction to Device Update for Azure IoT Hub | Microsoft Docs

Wechsler Consulting – Cloud Campus
Model your world – with Digital Twins!
Azure IOTHub – time to update endpoint filter

#WechslerConsultingCloudCampus #AzureIOT  #Azure

Model your world – with Digital Twins!

And I mean this literally. What if you could access your town, streets, places, buildings, etc. digitally via a twin? Everything needs to be secure and privacy guaranteed, of course (yes, I am German/European :-)), but would this not open up a lot of possibilities for new solutions and business models?
One would be able to conquer the space around us and easily provide location-based services. Just add a little bit of augmented reality and/or smart screens, and science-fiction-like scenarios become possible.
Sounds good? Well, then this episode of the Channel 9 IoT Show is a must-see for you:

If you also add a spoonful of IoT Plug and Play to the stew, things can really take off!

Have a great week
Alexander

.NET 5.0 is out!

… or generally available (GA), as Microsoft tends to say.
Technically, this is absolutely great news, because the newest version of the Microsoft development runtime brings a lot of new features, fixes and performance improvements.
It also cleans out a lot of the architectural wanderings the .NET platform has undergone in recent years.
A really good summary of the new features and changes can be found in the .NET Core documentation.

V5 – A new engine for .NET!


Nevertheless, I always stand in wonder at how the marketing folks find the most confusing names for new products. Must be a contest.
With .NET 5.0, this is hilarious!
It is not the classic .NET Framework, but is based on .NET Core, while the ASP.NET and Entity Framework parts keep "Core" in their names – and, by the way, it does not replace .NET Standard.
I give it 10 out of 10 obfuscation points…

But nevertheless, developers, this is a great runtime and SDK release, so let's get over the naming accidents.
As always! 😉

Alexander

Rent a gateway! – Azure Stack Edge

Some of us know the problem:

Bandwidth terror through an abundance of chatty sensors!

For example, in a manufacturing building, network traffic would go through the roof if "everything" were directly connected to the Internet (although it is the Internet of Things), and, of course, it would be a security nightmare, too.
Well, let us leave the latter topic aside for a moment and stick to the traffic requirements.
The Cloud promise was: connect everything to the Internet, and in the Cloud, magically, everything gets done!
While this is not false in a lot of scenarios, it is not always true!
Depending on the use cases in focus, there are quite a few scenarios where distributed, smart architectures have significant benefits over a centralized approach.
In these cases, devices on the "edge" come into play. They are gateway devices running pre-processing logic and providing storage capabilities to handle part of the overall system workload on-premises, at the edge to the Internet. By doing this, they transform raw events into higher-quality events – for example, reporting temperature readings only if set limits are exceeded. The higher-quality events are passed on to the Cloud solution and are handled there to trigger the related business logic.
Benefits of this design include

  • Significantly lower traffic on the central system
  • Better manageability, monitoring and security of the data flow
  • Robustness against network outages (at least in some scenarios)

but it comes with challenges as well, such as:

  • Handling of business logic on the edge
  • Device management of IoT devices as well as edge devices

This needs to be taken into consideration!

However, if you are a vendor creating Cloud solutions that experience a lot of data ingress, sooner or later you will end up installing edge devices to sort out the raw events spamming your backend. You will buy devices and/or talk to the customer to get these devices installed in the on-premises data center.
The drawback with this approach is that it adds a lot of upfront cost to your solution.
Edge devices might be quite capable, full-fledged and therefore expensive servers that a customer may also want to include in their own system management to keep them patched and secure. This often triggers time-consuming approval processes to get things into place.
All of this may kill your project or POC before it has even started!

So, is there a smarter way to approach this issue?
As you may have guessed from the title of this post, there is. Microsoft is extending its Cloud-native rent-my-system approach to edge hardware and software. In this case, the system of interest is called Azure Stack Edge (formerly known as Azure Data Box Edge – Microsoft likes the renaming game, as we know).
There is a very informative IoT Show episode on this solution, still using the old name.

A benefit of the Cloud-rental approach is that the Stack Edge devices are managed centrally via an Azure service. The service allows the installation and management of Azure IoT Edge modules, which take over the responsibility for the distributed logic in the system. Looking at IoT devices, it is especially interesting that they can be connected to a local (edge) instance of IoT Hub and also be managed from there, which gives you the best of both worlds: the devices are safe behind the firewall, but still accessible via the edge gateway for administrative purposes!
Microsoft operates the edge devices as an appliance, which means it takes over responsibility for any OS (Azure Stack Edge runs on Linux) and runtime patches.
The deployed IoT Edge modules can be ready-made building blocks from Microsoft or third-party vendors, as well as self-developed edge modules suiting the implemented solution.

This is really powerful, because it gives solution developers the flexibility to pull existing commercial building blocks, e.g. for AI or high-speed processing, from the Azure Marketplace and focus on the business needs of the solution.
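
To give an idea of what the module handling looks like in practice, here is a minimal Az CLI sketch (it assumes the azure-iot CLI extension; the hub name MyHub, the device id my-edge-device and the manifest file name are hypothetical placeholders – the deployment manifest is the JSON document describing the desired modules and routes):

# one-time setup: add the IoT extension to the Az CLI
az extension add --name azure-iot

# push the desired module set (defined in the deployment manifest) to the edge device
az iot edge set-modules --hub-name MyHub --device-id my-edge-device --content ./deployment.manifest.json

From then on, the IoT Edge runtime on the device pulls the module images and keeps them running.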

If a customer needs a test at their location, the rental model is hard to beat. Just ship your devices plus a pre-configured edge device, and a POC can be up and running in minutes, without costing a fortune in hardware. If it does not suit, it can be stopped at any time without wasting further cost and energy.

Looks like a quite innovative, efficient and modern approach to me! 🙂

Alexander


Free Blazor Training with Carl Franklin!

Get more efficient and productive creating web apps with Microsoft’s new Blazor technology!
Carl Franklin is a long-time coding veteran on the Microsoft platform and has seen some frameworks come and go throughout his career.
It is interesting to see that Blazor caught his attention and, even more, his excitement!
Carl is a renowned speaker, coach and trainer. Therefore, you should not miss his free "Blazor Train" course series! In it, he provides a first-hand impression of what it is like to use C# and .NET Core across the complete stack of your solution.

More than worth a look!

New OPC UA – “Getting Started” course released!

The OPC Unified Architecture communication standard is one of the main technical pillars you can build a digital transformation project on.
This has been especially true for the industrial automation market over the last few years, but OPC UA is currently making huge inroads into other verticals, such as retail and logistics, driven by companies transforming to be Industry 4.0-ready.

Connect devices using OPC UA and .NET Core!

Its stability, resiliency and security features are hard to match and truly at enterprise level.
Our new Wechsler Consulting interactive online course provides an introduction to the OPC UA standard and pragmatically explains its different building blocks by implementing a simple example in .NET Core, using the open-source OPC UA reference stack released by the OPC Foundation.

There is no faster way to get involved with OPC UA!

In addition – we take care of you!
This course includes two 45-minute calls with the author to answer your questions, discuss ideas and get inspiration for your current or future project.