Azure Digital Twins reach Production

Azure Digital Twins are the virtual counterparts of systems, sensors or even complete factories in the real world.
The Digital Twin concept has been around for a while, and I have used it in several customer projects to get an as-good-as-real-time view of the state of complex systems. It comes with the additional benefit of having historical data, e.g. to follow up on errors or to predict the future with the help of machine learning algorithms. In addition, the ability to simulate and test possible future situations or different development scenarios with a close-to-reality model cannot be overstated!
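Azure Digital Twins describes such virtual counterparts with models written in the Digital Twins Definition Language (DTDL). As a minimal sketch, an interface for a hypothetical machine twin could look like the following (the dtmi identifier, property and relationship names are made up for illustration):

```python
import json

# A minimal DTDL v2 interface for a hypothetical machine twin.
# The dtmi id and the contents below are illustrative only.
machine_model = {
    "@id": "dtmi:example:factory:Machine;1",
    "@type": "Interface",
    "@context": "dtmi:dtdl:context;2",
    "displayName": "Machine",
    "contents": [
        # Static metadata about the twin
        {"@type": "Property", "name": "serialNumber", "schema": "string"},
        # A live sensor value streamed from the real device
        {"@type": "Telemetry", "name": "temperature", "schema": "double"},
        # Twins can be wired into a graph, e.g. machine -> conveyor
        {"@type": "Relationship", "name": "feeds"},
    ],
}

print(json.dumps(machine_model, indent=2))
```

Uploaded to an Azure Digital Twins instance, a model like this lets you instantiate twins, wire them into a graph via relationships, and query the live and historical state of the whole system.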

While these custom implementations work great, it must be admitted that significant effort is necessary to reach this goal.
Due to this, I consider Azure Digital Twins the arrival of a game-changing platform service for future IoT solutions. Azure Digital Twins save a lot of development effort, are very well integrated with other Azure IoT offerings such as IoT Hub and IoT Central, and build on IoT Plug & Play. This is taking the fast lane!
It is a powerful combination of services, which is going to revolutionize the way IoT solutions will be built in the coming years.
The solid development story behind Digital Twins is supported by great tools for visualization and reporting. This is something often neglected by standard IoT approaches. Any neglect in this area is dangerous, because capable reporting and querying functionality is essential to run, maintain and evolve your solutions in the field.

I predict that Azure Digital Twins will be seen quite often in upcoming solutions.
🙂

Alexander

Plug & Play coming to Azure IoT Solutions

Many of us remember Windows Plug & Play, and we certainly have some painful memories of it, especially from its early years.
However, over time and with a lot of sweat and tears from the Microsoft product group, it evolved into a cool and robust feature of the Windows OS that has made the life of many IT-Professionals easier.


The exciting news is that Plug & Play is now coming to Azure IoT!

I am really thrilled about its capabilities! It is a new feature, and therefore, yes, there will be some rough edges to expect as well as occasionally missing tool support along the journey. But as an IoT architect, I would call this a very promising approach to tackling the device provisioning problem we have in every solution.
There are communication technologies available that try to manage this problem at the company network level (such as OPC UA), but none of these has been able to develop a sound cloud-native strategy yet.
Plug & Play's deep integration into Azure services such as IoT Hub and Digital Twins has the potential to develop into a killer feature!
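The core mechanism is simple: an IoT Plug & Play device announces the DTDL model it implements when it connects, so IoT Hub and downstream services know its capabilities without manual configuration. Over MQTT this happens via a `model-id` parameter in the CONNECT username. A sketch of how that username is built (the hostname, device id and dtmi below are placeholders, and the exact `api-version` depends on your hub):

```python
from urllib.parse import quote

def mqtt_username(hub_hostname: str, device_id: str, model_id: str,
                  api_version: str = "2021-04-12") -> str:
    """Build the MQTT CONNECT username a Plug & Play device uses to announce
    its DTDL model id to IoT Hub. All values here are illustrative."""
    return (f"{hub_hostname}/{device_id}/?api-version={api_version}"
            f"&model-id={quote(model_id, safe='')}")

# Hypothetical hub, device and model id:
print(mqtt_username("myhub.azure-devices.net", "device01",
                    "dtmi:example:Thermostat;1"))
```

In practice the device SDKs do this for you, typically by accepting the model id as a parameter when the client is created.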

There is a great and detailed video by Olivier Bloch and Stefan Wick on the Azure IoT Show.

To me, this is just the beginning, and I am looking forward to seeing more interesting developments around IoT Plug & Play happening in the coming months.
I can see room for a lot of IoT development process enhancements, modelling tools, and solution templates, to name just a few of the possible fields of innovation!

Alexander

Windows CE App Container on Windows 10 IoT Core

Microsoft is providing a way to “modernize” older Windows CE applications by moving these onto Windows 10 IoT Core using a new feature called Windows CE App Containers.
This is certainly well-intended, but customers should really double-check their use case to see if it really makes sense to follow that path, just to avoid ending up in a cul-de-sac.
As a former Windows Embedded MVP and Windows Embedded Silver Partner, I am very aware of the variety of existing CE applications, and only in rare cases would I feel good about recommending to a customer that they containerize an existing CE app.


If you feel the need to modernize an existing Windows CE system, there are several options you should consider first, depending on the nature of your application.

Here is a quick list of options that comes to my mind:

  • Hard real-time systems written in C or C++
    • Neither Windows 10 IoT Core nor Enterprise is hard-real-time-capable, due to Windows 10’s preemptive scheduler
    • Have a look into alternative hardware and operating systems from other vendors, or, quite interestingly, Azure Sphere from Microsoft, which supports hard real time and is security-hardened for IoT at the same time.
      It also includes support for the ThreadX real-time operating system (also recently acquired by Microsoft).
  • Normal UI or service applications written in C, C++, Java or .NET Compact Framework
    • Check whether these applications can be modernized by a new design leveraging cloud technology!
      Looking at the Microsoft Azure ecosystem, candidates would be Azure IoT and Azure IoT Edge, as well as serverless approaches such as Azure Functions and Logic Apps.
      Keep in mind that when modernizing applications, it almost never makes sense just to adapt to the newest technology level! Think about redesigning your processes and architecture, and streamline end-user experiences leveraging modern cloud technologies!
    • Move your application onto cross-platform technologies such as .NET Core and ASP.NET Blazor!
      This often shakes off the chains of being bound to a certain hardware/OS combination, and ideally you are able to grow a family of devices using the same software across different hardware devices and OSes.
    • Use a Cloud native, distributed architectural approach to be able to grow and advance your solution organically
    • Change the communication strategy in your solution from connected, directed calls (as often found in older applications) towards asynchronous, message-based communication.
      This will add a lot of robustness and extensibility to your system!
  • Applications using certain Windows CE Apps or desktop features
    • Port your application to Windows 10 IoT Enterprise; this will be the only future-proof path, as App Containers as well as IoT Core are going to reach end of life at the end of this decade.
    • There may be rare cases justifying CE App Container as a transition/bridge solution, but these must be thoroughly analyzed!
      App Container support is not just lift and shift and comes with at least “some” porting effort.
      Check whether this effort really is as small as the marketing department says, weighed against the possible porting/redesign efforts explained above.
      I always recommend 20% of the estimated porting costs as a threshold: if the expected containerizing effort is higher, go for a redesign.
    • Keep in mind that containerizing only buys you time; you will need to port the app anyway!
  • Really large and complex applications, which are expensive to port
    • OK, the first mistake was to put such a large and complex application on a small embedded device running Windows CE!
      I am pretty sure that with this kind of application you are having other troubles as well, such as performance and resource-management problems on the device.
    • The best thing is to port your application to a capable Windows 10 IoT Enterprise embedded PC system right away.
      Do not waste money on a bridge solution, as it may cause additional problems and is not really suited to solving the existing ones.
      A redesign is a must to make your app more manageable and fix existing issues!
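The move from connected, directed calls towards asynchronous, message-based communication recommended above can be sketched in a few lines. The producer only publishes messages to a queue and does not need to know its consumers, which is what makes the system robust and extensible (the message contents and handler below are illustrative):

```python
import queue
import threading

# Instead of calling a handler function directly, the producer publishes
# messages to a queue; consumers are decoupled and can be added or replaced
# without touching the producer.
messages: queue.Queue = queue.Queue()
processed = []

def consumer():
    while True:
        msg = messages.get()
        if msg is None:          # sentinel value shuts the consumer down
            break
        processed.append(f"handled {msg}")
        messages.task_done()

worker = threading.Thread(target=consumer)
worker.start()

for reading in ("temp=21.5", "temp=22.0"):
    messages.put(reading)        # fire-and-forget publish, no direct call

messages.put(None)               # tell the consumer we are done
worker.join()
print(processed)
```

In a real solution the in-process queue would be replaced by a broker such as an IoT Hub or Service Bus queue, but the decoupling principle is the same.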

Yes, there certainly are more approaches and arguments, but I think the ones laid out above cover most of the ground of this discussion.


If you need some ideas on how to handle the transition in your specific use case, or if there are other questions, just drop me a line and we will find a way to help you out!

Alexander

Live Video Analytics

Quite often, new and innovative solutions require at least some technical effort. IoT systems, for example, need to be deployed, calibrated, and provisioned with network and power.
Depending on the use case, Live Video Analytics, a feature of Azure Media Services, might be able to reduce this effort. All you need is a camera and, ideally, an IoT Edge device connected to it. This is especially helpful in dynamic environments, such as delivery entrances, machine ports or storage racks, where a lot of different things are going on simultaneously. In these dynamic environments, dedicated sensors are often hard to calibrate and locate.
Video Analytics uses AI models to detect motion and objects, and can even go down to detecting and reading text, such as numbers on license plates, addresses on parcels, etc.
Microsoft describes some of the interesting possibilities and scenarios quite well in this recent blog post.

Detect workers and cargo in a video stream

What I like is that the AI video analytics models can be run on an edge device. This saves a lot of bandwidth and also keeps your possibly sensitive video material on-premises! There are models available for use, but you can also build custom ones and thus create, adapt and fine-tune the detection for a use case. Existing video streams can also be used for processing, which, in some cases, enables you to start right away with implementing the IoT Edge solution.
As the analytics models are able to create events to be consumed by an Event Hub, they can be used as a publisher of triggers on which to build business solutions.
Use Azure serverless capabilities, and you have a sophisticated video analytics system for your use case up and running in days, or even just hours.
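As a sketch of such a trigger, the serverless function on the consuming side typically just filters the inference events for the detections it cares about. The event shape below is a simplified illustration, not the exact Live Video Analytics payload, and the entity names and threshold are made up:

```python
# Hypothetical, simplified detection events as they might arrive from an
# Event Hub; the real inference payload is richer than this.
events = [
    {"entity": "person", "confidence": 0.91},
    {"entity": "forklift", "confidence": 0.88},
    {"entity": "person", "confidence": 0.42},
]

def triggers(events, entity="person", threshold=0.8):
    """Keep only confident detections of the wanted entity,
    e.g. to kick off downstream business logic."""
    return [e for e in events
            if e["entity"] == entity and e["confidence"] >= threshold]

print(triggers(events))
```

Wired into an Azure Function with an Event Hub trigger, a filter like this is all the glue code needed between detection and business action.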

Alexander

Microsoft Build 2020 coming up this week!

Microsoft teams have geared up, this time, of course, virtually, to present all the newest stuff from their development repositories. Looking at Azure IoT, there is quite some interesting information in the pipeline.
The focus this year seems to be on Azure Sphere and Digital Twins, although they might come up with some new stuff as well.

So tune in, I’ll be there. 🙂

Wechsler Consulting joins OPC Foundation

We are very happy to announce that Wechsler Consulting has joined the OPC Foundation as a corporate member.


The OPC Unified Architecture communication standard has been tremendously successful over the past years in the industrial automation space, but it has a lot of beneficial functionality to offer other vertical markets as well.


Complete and seamless interoperability between devices is definitely the goal when a company is transforming its business digitally, and OPC UA provides everything needed to be a great asset in this context.
We at Wechsler Consulting support our customers’ efforts with architectural consulting and training. Being a corporate member of the OPC Foundation standards body enables us to stay on top of the newest developments and decisions for this very capable and efficient communication standard.
Due to this, we will be able to apply these, as fast as possible, in real-world projects.

We have already started working on a “Getting Started” online course to help interested customers jump-start their OPC UA projects.
The currently available course preview is free and gives a first glimpse of what to expect, laying the groundwork for being successful with OPC Unified Architecture.

Computer Vision – Read API

Business solutions quite often need to deal with data in different formats and sometimes you cannot even rely on having the information available digitally.

Do not get caught in an analog trap


One way to solve this problem is to “digitize” required data on demand, and this is when Azure Computer Vision comes in handy. Microsoft just released a new Read API that helps apply optical character recognition (OCR) to any type of picture, whether scanned or photographed.

There is a C# sample available that enables you to jump-start your document recognition and processing operations.
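Independent of language, the pattern is always the same: submit the image, poll until the analysis completes, then walk the pages and lines of the result. A sketch of that last step in Python, using a trimmed, illustrative version of the Read API's JSON result (field names may differ between API versions):

```python
# A simplified, hand-made illustration of a Read API analysis result;
# the real response carries more fields (bounding boxes, angles, etc.).
read_result = {
    "analyzeResult": {
        "readResults": [
            {"lines": [{"text": "Invoice 4711"},
                       {"text": "Total: 42.00 EUR"}]},
        ]
    }
}

def extract_text(result: dict) -> str:
    """Flatten all recognized lines on all pages into plain text."""
    pages = result["analyzeResult"]["readResults"]
    return "\n".join(line["text"] for page in pages for line in page["lines"])

print(extract_text(read_result))
```

From here, the extracted text can flow into whatever document processing your business solution needs.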

Digital Transformation – a few things to consider

Nearly all businesses currently strive to transform digitally, and I was able to support a few of them on their journey. Digital Transformation, without a doubt, is more than important for a company to reach the next level of efficiency, data insight, scalability and so on. These arguments, and many more, have been discussed a lot, and at first glance, all seems understandable and perhaps not so hard to execute.
Nevertheless, once you start that digital transformation journey, a lot of important and underestimated aspects come into view that were not so obvious at the beginning.

More than just pushing a button – Digital Transformation

It’s foremost a cultural change, not a technological one
Quite often, companies at the beginning of the transformation believe that starting one Cloud-based solution project is already transforming their business. Of course, a single project, even if it may be a great start to find out how things work technologically, does not digitally transform a company. Therefore, quite often these projects end up as Cloud islands with little or limited benefit to anybody. The challenge to overcome in order to be successful is to identify a company’s solution and data silos and make these accessible in a secure and transparent way to everybody who could benefit from them.
Because only then are you able to hoist the treasures of insight and information lying in this data. Breaking up these silos is not easy. Not because it is technically hard to do, but because of the mindset change required of employees to make this possible. Established attitudes from the past, such as “I have knowledge and data, therefore I am needed,” need to be changed into “I provide data and am able to share it most efficiently, along with insights.” This turns a lot of internal kingdoms into rubble, and due to this, you are most likely to find significant resistance here.

Digital Transformation makes no sense without a company-wide perspective
As already stated, a single Cloud solution does not transform a company. It is the whole company that needs to migrate into a new mindset and a new digital eco-system. Think of it just as installing a new operating system onto parts of the old hardware, while replacing outdated hardware components with fresh and shiny ones from outside (aka the Cloud). This requires spending some thought on what the important components of your business are and what the best way to transform them would be.
Of course, all this needs to occur without interrupting the current (well-running) business, to ideally enable a smooth transition. And yes, you guessed it, this is far from easy to do.
Nevertheless, it can be done if you plan and prepare well. You need to do a thorough inventory of what is currently going on, which enables you to decide on migration paths and migration priorities for each of your business components.

Should the new company setup be Cloud-native or hybrid (combining Cloud and on-premise)?
Do not waste time on that question; it is always hybrid! Meaning that if you have an established business, there are always components that cannot or should not be transferred to the Cloud in the foreseeable future. There is nothing bad about that. Cloud technology does not have to be the only solution for all business problems. Be pragmatic and do what makes sense! Choose to move to the Cloud where the largest benefits are to be expected.
In any case, set up the required infrastructure to support a sound hybrid Cloud scenario.

Get identity management right!
Setting up that hybrid Cloud scenario, you will automatically run into the identity management question. Should I sync all my users into the Cloud? Or only a few chosen ones? But is this secure?
It is more than wise to sync all your users into a Cloud directory. Everything else is pretty useless, looking at future applications.
If set up correctly, a Cloud identity store is more secure than an on-premise one, because of the enhanced monitoring and auditing capabilities in the Cloud, along with the additional supervision by the dedicated security teams of the Cloud providers.
These teams are fighting off attacks every minute. Especially as a small to mid-size company, you will never be able to match their expertise and execution level with your own resources!
But let us get back to identity management. There must be only one, and this is pretty important! If there are currently several identity systems in your enterprise, the first task is to consolidate them. Otherwise, you are going to end up in an identity synchronization nightmare, which will be a security nightmare at the same time.
After that, it is very important to keep on-premise and Cloud identities in sync, because for Cloud solutions the traditional means of enterprise protection, network perimeters, have lost their effectiveness. This is due to the fact that we are communicating on public networks most of the time, which can neither be controlled nor considered safe! As Cloud offerings grow, a lot of Software as a Service (SaaS) components will be part of a solution, and with them, you have no or only limited ways to establish network perimeters.
Network perimeters as a protection mechanism are normally replaced in Cloud solutions by identity perimeters, which are able to span a complete solution context. You may still find network perimeters, such as V-Nets, securing certain solution parts, e.g. micro-service clusters, API management, etc., but these cannot play a role in securing the overall solution.
Therefore, to get security right for a Cloud solution, sound identity management enabling robust endpoint protection is key. This endpoint-centric protection is often referred to as “defense in depth”: you no longer rely on the protection of any network, but rely heavily on protecting the different communication endpoints a solution is using.

Outsmart your old data center
When migrating to the Cloud, a thorough understanding of Cloud technology benefits, its strengths and weaknesses for your business solutions, needs to be established up-front. Educate your teams and experts early and get them fully aboard. Do not underestimate this challenge! Many transformation projects have died because teams did not know what to do in the Cloud and were not convinced.
In addition, just copying solution VMs from an on-premise data center into the Cloud (lift and shift) is seldom an efficient and rewarding approach. Instead, try to re-think your business processes and solution aspects from the new perspective Cloud computing is offering. Ask yourself whether, having the power of the Cloud eco-system at hand, you could simplify, streamline or even get rid of a business process in a migrated solution.
This is the spot where you can really make the difference! Set up highly scalable, flexible and easy-to-change business processes that enable you to stay ahead of the competition.
To achieve this, your development teams need to re-think their traditional implementation approach. Classical enterprise solutions require clusters of VMs, load balancers, etc. Try to get rid of as many of these components as possible by using the highly efficient low-code/serverless approach, which is only possible in Cloud solutions.
And again, there will be some non-technical challenges to face, such as the “not invented here” paradigm, along with developer pride (“I want to build micro-services I can fully control!”), that need to be overcome.
As soon as the dust settles and the new approach can be applied, you will experience a boost in efficiency, along with the ability to make real-time changes to business processes, which will become a huge competitive advantage!

Consider Digital Transformation as a journey!
You will not be able to achieve everything overnight. Some items will be implemented fast; some are going to take more time. Have a well-planned, company-wide approach ready and implement the means and mechanisms to execute sound governance over all migration activities.
Governance is very important for all involved players to stay on the same page. Re-evaluate and re-think approaches and technologies pragmatically as you go and as technology evolves (it does so fast in the Cloud), but keep your eyes on the target: transforming your company into a digital one. As more and more components reach the new eco-system, it will not be too hard for your teams to initiate the next round of innovation.

Alexander