Power Platform Dataflow Tutorial Deep Dive
Azure Peering Service for Dataverse:
Requirements for enterprise access to cloud services are constantly evolving in terms of the number of connections and users, data payload, multi-region and multi-continent access, and so on.
To address this, different types of enterprise architectures are often used: geo-partitioned, hybrid-cloud, global load balancing, and CDN architectures.
Create a service principal with Power Platform CLI
To integrate with Dynamics 365, you typically need to set up a service principal in Azure.
To achieve this, you must complete several steps:
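As a minimal sketch of what those steps can look like with the Power Platform CLI (the URL and environment ID below are placeholders, and the exact flags may vary by pac version):

```bash
# Authenticate against the target environment (placeholder URL)
pac auth create --url https://yourorg.crm.dynamics.com

# Create an Entra ID app registration plus service principal and register it
# as an application user in the environment (placeholder environment ID)
pac admin create-service-principal --environment 00000000-0000-0000-0000-000000000000
```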
Dataverse Dynamics 365 Load testing for Model-driven app
In today's digital landscape, where user expectations for seamless, high-performing applications are at an all-time high, ensuring the reliability and scalability of software systems is paramount. Performance testing emerges as a critical practice in the software development lifecycle, aimed at evaluating the responsiveness, stability, and scalability of applications under various conditions.
PowerApps Model Driven UI Testing with Playwright
The User Name in D365 CRM is different from that in Office 365
Today, I'd like to share a note about the User Name sync mechanism from AAD (Azure Active Directory) to Dynamics 365 CRM.
How to use Azure Table Storage with C#
Azure Table Storage is a very useful service available in an Azure Storage Account. It is a NoSQL datastore that accepts authenticated calls from inside and outside the Azure cloud, while remaining scalable and fully managed by Azure.
It is therefore a real accelerator for projects that need to store unstructured data.
You can of course manipulate the data manually from the Azure portal; here, however, I will show you how to use the API to manipulate the data from your C# code.
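A minimal sketch of that API, assuming the Azure.Data.Tables client library; the connection string, table name, and entity values below are placeholders:

```csharp
using System;
using Azure.Data.Tables;

// Placeholder connection string and table name.
var connectionString = "<your-storage-account-connection-string>";
var tableClient = new TableClient(connectionString, "Orders");

// Create the table if it does not exist yet.
await tableClient.CreateIfNotExistsAsync();

// Insert an entity; PartitionKey + RowKey uniquely identify it.
var order = new TableEntity("customer-001", "order-42")
{
    { "Product", "Keyboard" },
    { "Quantity", 2 }
};
await tableClient.AddEntityAsync(order);

// Read the entity back by its keys.
TableEntity stored = await tableClient.GetEntityAsync<TableEntity>("customer-001", "order-42");
Console.WriteLine($"{stored.RowKey}: {stored.GetInt32("Quantity")} x {stored.GetString("Product")}");
```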
When creating a CI/CD pipeline for your project, at some point you have to define a connection to your environment. In the case of Dataverse, the connection string will contain client ID and client secret values. It's always a good idea to store secret values in a secure place instead of putting them in clear text in your pipeline definition file (YAML) and potentially pushing them to your code repository.
Azure DevOps provides a number of possible solutions to address just that:
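For example, one common option (a sketch with placeholder group and variable names, not the only approach) is to keep the secrets in a variable group, optionally linked to Azure Key Vault, and map them into pipeline steps explicitly:

```yaml
# azure-pipelines.yml (sketch; group and variable names are placeholders)
variables:
  - group: dataverse-connection   # variable group, optionally linked to Azure Key Vault

steps:
  - script: |
      echo "Connecting to $(DataverseUrl) with client id $(ClientId)"
      # compose the Dataverse connection string from the mapped secret at runtime
    displayName: Connect to Dataverse
    env:
      CLIENT_SECRET: $(ClientSecret)  # secret variables must be mapped explicitly into scripts
```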