Nevertheless, test the filter in your development environment before rolling it out to production. It should normally not have any negative effects if you are using the endpoints in a standard way, but you only ever know after trying it out. 🙂
Azure Digital Twins are the virtual counterparts of systems, sensors or even complete factories in the real world. The digital-twin concept has been around for a while, and I have used it in several customer projects to get an as-good-as-real-time view of the state of complex systems. It comes with the additional benefit of having historical data, e.g. to follow up on errors or to predict the future with the help of machine learning algorithms. In addition, the ability to simulate and test possible future situations or different development scenarios with a close-to-reality model cannot be overrated!
While these custom implementations work great, it must be admitted that significant effort is necessary to reach this goal. That is why I consider Azure Digital Twins the arrival of a game-changing platform service for future IoT solutions. Azure Digital Twins saves a lot of development effort, integrates very well with other Azure IoT offerings such as IoT Hub and IoT Central, and builds on IoT Plug & Play. This is taking the fast lane! It is a powerful combination of services that is going to revolutionize the way IoT solutions will be built in the coming years. The good development story behind Twins is supported by great tools for visualization and reporting. This is something often neglected by standard IoT approaches. Any neglect in this area is dangerous, because capable reporting and querying functionality is essential to run, maintain and evolve your solutions in the field.
I predict the Azure Digital Twins will be seen quite often in upcoming solutions. 🙂
IoT Central is Microsoft’s low-code, low-effort, easy-to-use approach to the world of embedded projects. This is quite a demanding challenge, because real-world problems tend to be complex, and what can you do to make them simple in a tool? Well, normally you start by defining an environment to get rid of at least some of the parameters and thus reduce complexity. This is a valid approach, but for a tool/service vendor it carries the danger that the overlap between your defined environment and the common real-world use cases of customers is not large enough or, in the worst case, non-existent. Azure IoT Central, in the beginning, felt a bit like: great base features, but not enough to cover the complete spectrum of a project's demands. Therefore, to me it was good for samples or a quick POC for a project. However, the IoT Central team kept improving steadily, and so the product is getting more serious as we speak.
The newest update provides some very interesting features, like jobs that can be executed on devices (very important for device management), webhook improvements regarding identity management, device templates supporting IoT Plug & Play, as well as improvements to the dashboard.
At least for me, that is enough new stuff to justify a closer and serious second look at IoT Central!
In nearly every IoT project I have had the opportunity to work on, time series data played a very important role. The problem with this type of data is that it normally comes in larger volumes and is therefore not always easy to handle. This is especially true in projects where you have to cope with small storage on devices and no central data store, which makes it very hard, if not nearly impossible, to get a global view of the behavior of these solutions over time. One could work with thresholds and alerts, but this approach never gives you the chance to detect trends and get “ahead of the wave” to react better, faster and more precisely to certain events. Some of the industrial communication standards, such as OPC UA and SCADA, try to tackle this issue by providing historic data functionality in their communication layers, but this is just a single aspect of a comprehensive data solution.
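The difference between plain threshold alerts and trend detection can be sketched in a few lines of Python. This is a toy illustration, not tied to any specific Azure service; the window size, slope limit and temperature figures are made-up examples:

```python
from statistics import mean

def threshold_alert(readings, limit):
    """Classic approach: alert only once a value crosses the limit."""
    return any(r > limit for r in readings)

def trend_alert(readings, window=3, slope_limit=0.5):
    """Trend approach: alert when the rolling average rises faster
    than slope_limit per sample, i.e. before the hard limit is hit."""
    averages = [mean(readings[i:i + window])
                for i in range(len(readings) - window + 1)]
    slopes = [b - a for a, b in zip(averages, averages[1:])]
    return any(s > slope_limit for s in slopes)

# A temperature series that is still below the 80-degree limit,
# but clearly heading there:
series = [70, 71, 73, 76, 79]
print(threshold_alert(series, limit=80))  # no alert yet
print(trend_alert(series))                # trend is already visible
```

The threshold check stays silent until the damage is done, while the trend check fires while there is still time to react.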
Cloud architectures can help here, if you have the chance to collect time series data either centrally or on the edge. A very valuable asset in Azure in this context is Time Series Insights. It is a cloud service that allows you to query, transform, visualize and correlate your different data streams into comprehensive views and insights. There are also connectors to reporting tools such as Power BI available. Using the M365 infrastructure with Power Automate, or Azure Logic Apps and Functions, serverless integration into corporate business and control processes is also not a problem.
Many of us remember Windows Plug & Play and we certainly have some painful memories with it, especially originating in its early years. However, over time and with a lot of sweat and tears from the Microsoft product group, it evolved into a cool and robust feature of the Windows OS that has made the life of many IT-Professionals easier.
I am really thrilled about its capabilities! It is a new feature, so yes, there will be some rough edges as well as occasionally missing tool support along the journey, but as an IoT architect, I would call this a very promising approach to tackling the device provisioning problem we have in every solution. There are communication technologies available that try to manage this problem on the company network level (such as OPC UA), but none of these have been able to develop a sound Cloud-native strategy yet. The deep integration of Plug & Play into Azure services such as IoT Hub and Digital Twins has the potential to develop into a killer feature!
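At the heart of IoT Plug & Play are device models written in DTDL (Digital Twins Definition Language). A minimal model could look roughly like this; the interface name and the capabilities are hand-written examples for illustration, not taken from a real device:

```json
{
  "@context": "dtmi:dtdl:context;2",
  "@id": "dtmi:com:example:Thermostat;1",
  "@type": "Interface",
  "displayName": "Example Thermostat",
  "contents": [
    {
      "@type": "Telemetry",
      "name": "temperature",
      "schema": "double"
    },
    {
      "@type": "Property",
      "name": "targetTemperature",
      "schema": "double",
      "writable": true
    }
  ]
}
```

A device announcing such a model ID lets IoT Hub and Digital Twins know its capabilities without any custom provisioning code, which is exactly the "plug and play" promise.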
There is a great and detailed video by Olivier Bloch and Stefan Wick on the Azure IoT Show.
To me, this is just the beginning, and I am looking forward to seeing more interesting developments around IoT Plug & Play in the following months. I can see room for a lot of IoT development process enhancements, modeling tools and solution templates, to name just a few of the possible fields of innovation!
Microsoft is providing a way to “modernize” older Windows CE applications by moving them onto Windows 10 IoT Core using a new feature called Windows CE App Containers. This is certainly well-intended, but customers should really double-check their use case to see whether it makes sense to follow down that path, just to avoid ending up in a cul-de-sac. As a former Windows Embedded MVP and Windows Embedded Silver Partner, I am very aware of the variety of existing CE applications, and only in rare cases would I feel good recommending to a customer that they containerize an existing CE app.
If you feel the need to modernize an existing Windows CE system, there are several options you should consider first, depending on the nature of your application.
Here is a quick list of options that comes to my mind:
Hard real-time systems written in C or C++
Neither Windows 10 IoT Core nor Enterprise is hard-real-time capable, due to Windows 10’s preemptive scheduler
Have a look into alternative hardware and operating systems from other vendors, or, quite interesting, Azure Sphere from Microsoft that supports hard real-time and is security hardened for IoT at the same time. It also includes support for the ThreadX real-time operating system (also recently acquired by Microsoft).
Normal UI or service applications written in C, C++, Java or .NET Compact Framework
Check whether these applications can be modernized by a new design leveraging Cloud technology! Candidates in the Microsoft Azure ecosystem would be Azure IoT and Azure IoT Edge, as well as serverless approaches such as Azure Functions and Logic Apps. Keep in mind that when modernizing applications, it almost never makes sense to just adapt to the newest technology level! Think about redesigning your processes and architecture and streamlining end-user experiences by leveraging modern Cloud technologies!
Move your application to cross-platform technologies such as .NET Core and ASP.NET Blazor! This often shakes off the chains of being bound to a certain hardware/OS combination, and ideally you are able to grow a family of devices using the same software across different hardware and OSes.
Use a Cloud native, distributed architectural approach to be able to grow and advance your solution organically
Change the communication strategy in your solution from connected, directed calls (as often found in older applications) towards asynchronous, message-based communication. This will add a lot of robustness and extensibility to your system!
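The decoupling that message-based communication buys you can be sketched in a few lines of Python, with a plain in-process queue standing in for a real broker such as IoT Hub; the topic names and handler are made-up examples:

```python
import queue

# Instead of calling a device handler directly, the producer just
# drops a message on a queue and moves on. It neither knows nor
# cares which consumers exist or whether they are currently alive.
broker = queue.Queue()

def publish(topic, payload):
    broker.put({"topic": topic, "payload": payload})

def consume(handlers, count):
    """Pull messages and dispatch by topic; unknown topics are
    simply skipped, which makes adding new message types safe."""
    results = []
    for _ in range(count):
        msg = broker.get()
        handler = handlers.get(msg["topic"])
        if handler:
            results.append(handler(msg["payload"]))
    return results

publish("telemetry/temperature", 21.5)
publish("telemetry/humidity", 48)
handled = consume({"telemetry/temperature": lambda v: f"temp={v}"}, count=2)
print(handled)  # only the topic we registered a handler for is processed
```

A new consumer or message type can be added without touching the producer at all, which is exactly where the robustness and extensibility come from.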
Applications using certain Windows CE Apps or desktop features
Port your application to Windows IoT Enterprise; this will be the only future-proof path, as App Containers as well as IoT Core are going to reach end of life by the end of this decade.
There may be rare cases justifying a transition via CE App Container as a bridge solution, but these must be analyzed thoroughly! App Container support is not just lift and shift and comes with at least “some” porting effort. Check whether this effort really is as small as the marketing department says, against the possible porting/redesign efforts explained above. I always recommend 20% of the estimated porting costs as a threshold. If the expected containerizing effort is higher, go for redesign.
Keep in mind that containerizing only buys you time; you will need to port the app anyway!
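My 20% rule of thumb from above is easy to put into a small decision helper. This is my personal heuristic, not an official Microsoft guideline, and the cost figures are example values:

```python
def migration_advice(porting_cost, container_cost, threshold=0.2):
    """Recommend containerizing only if its effort stays below
    `threshold` (default 20%) of the full porting cost. Even then
    it merely buys time, since the port is still needed later."""
    if container_cost <= threshold * porting_cost:
        return "containerize as a bridge, then port"
    return "go for redesign/port right away"

# Example: the full port is estimated at 100 person-days.
print(migration_advice(100, 15))  # 15% of the port: bridge is acceptable
print(migration_advice(100, 35))  # 35% of the port: redesign directly
```

The point of the threshold is simply that any money spent on the container is spent twice, so it must be a small fraction of the inevitable porting cost to pay off.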
Really large and complex applications, which are expensive to port
OK, the first mistake is to put such a large and complex application on a small embedded device running Windows CE! I am pretty sure that with this kind of application, you are having other troubles as well, such as performance and resource management problems on the device.
The best thing is to port your application to a capable Windows 10 IoT Enterprise embedded PC system right away. Do not waste money on a bridge solution, as it may cause additional problems and is not really suited to solving the existing ones. A redesign is a must to make your app more manageable and fix existing issues!
Yes, there certainly are more approaches and arguments, but I think the ones laid out above cover most of the ground of this discussion.
If you need some ideas on how to handle the transition in your specific use case, or if there are other questions, just drop me a line and we will find a way to help you out!
Sometimes, it is the small things that make life as an IT-Professional easier!
Anybody who has had to move and restore huge virtual disks knows what I am talking about. There are quite often two types of data on a disk: unimportant data, easily restorable or always included in a base image, and important data, which normally is the user's business data. In the past, one could not separate this data easily, but was forced to do the heavy lifting at the virtual disk level.
Selective Disk Backup is exactly what helps us out here, giving us a finer level of control over what to back up and restore.
What makes it interesting is that:
You can save a lot of storage space, as business data normally is much smaller
Restore times are shortened significantly due to the smaller volume
As stated in the headline, the service is in preview! Just the right time to take it for a test drive! 🙂
SignalR, at least to me, is an absolutely underrated technology when it comes to real-time messaging. Disclaimer for my embedded readers: yes, at least if you do not require millisecond response times, but that is seldom the case in standard applications.
What I really appreciate about this service is the pub/sub messaging approach, which scales great and reduces the dependencies between client and publisher to a minimum. All of this comes as PaaS in a serverless fashion! This is definitely worth a look!
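The pub/sub decoupling that makes this model scale so well boils down to the following pattern. This is a toy in-process sketch of the general idea, not the SignalR API itself (the real service does this across the network over WebSockets); the hub class and topic names are invented for illustration:

```python
from collections import defaultdict

class Hub:
    """Minimal pub/sub hub: publishers and subscribers only share
    a topic name, never a direct reference to each other."""
    def __init__(self):
        self._subscribers = defaultdict(list)

    def subscribe(self, topic, callback):
        self._subscribers[topic].append(callback)

    def publish(self, topic, message):
        # Fan-out: every subscriber of the topic receives the message.
        for callback in self._subscribers[topic]:
            callback(message)

hub = Hub()
received = []
hub.subscribe("stock/MSFT", lambda m: received.append(("dashboard", m)))
hub.subscribe("stock/MSFT", lambda m: received.append(("logger", m)))
hub.publish("stock/MSFT", 310.5)
print(received)  # both subscribers got the update; the publisher knows neither
```

Adding a third subscriber requires no change to the publisher, which is why the approach scales and keeps dependencies to a minimum.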
Having your infrastructure as code in the Cloud is certainly the way to go! Unfortunately, it still feels like working close to the metal in Azure. Microsoft has recognized these problems and is striving to provide more comfort and tooling, as Mark Russinovich states in his interesting blog post on the topic.
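To illustrate what “close to the metal” means here: even a single storage account in Bicep, Microsoft's DSL on top of ARM templates, requires you to spell out resource types, API versions, SKUs and kinds explicitly. A minimal sketch; the resource name and API version are example values:

```bicep
param location string = resourceGroup().location

resource storage 'Microsoft.Storage/storageAccounts@2021-04-01' = {
  name: 'stdemodata001'
  location: location
  kind: 'StorageV2'
  sku: {
    name: 'Standard_LRS'
  }
}
```

This is already far friendlier than raw ARM JSON, but you can see why higher-level tooling and abstractions are still very welcome.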
This is great news, especially looking at the tool support for VS Code. However, in enterprise scenarios Visual Studio is still the tool of choice and could use some love here as well. As a VS Enterprise license is not cheap, this would be well deserved.