Conditionally Secure/Mask Column in Microsoft Dataverse

Introduction

Hello everyone, glad to have you on my blog!

In this blog post, I am going to share some useful information about a security feature of Microsoft Dataverse that I have been struggling to find a solution for over the past couple of weeks. The requirement was to conditionally secure/mask a Microsoft Dataverse column. There are numerous business requirements where you have to conditionally secure/mask a Dataverse column. It seems easy to implement at first glance; however, it's not as easy as you might think. Let's consider the following security requirements:

  • Mask the last 4 digits of the account number (1234XXXX) if the account is PREMIUM.
  • Users should only be able to see the last 4 digits of a customer's credit card number, for example: XXXX-XXXX-XXXX-4444.
  • Users should not be able to see a customer's mobile number if he/she has opted to keep it PRIVATE.
The solution becomes more complex if you get these security requirements in a later phase of the project. It is relatively easy to design the solution if such requirements arrive in the initial phase (before production go-live), because then you only have to deal with new data. However, if such requirements come in a later phase, some data has already been exposed to end users, and you have to design a solution that works on both new and existing data. You might therefore also need to run a bulk update job to apply the masking to existing data.

What possible solutions come to mind?

Field Level Security

Field Level Security is one of the most popular features that immediately comes to mind whenever you think about securing a column's value. However, it cannot be used to conditionally secure a column's value (as in the requirements above). It is a good solution when you have to completely secure/mask the column's value (e.g. *******), but it is not a fit for requirements where you have to secure/mask only a few characters/digits of the value (e.g. 1234**** or ****ABCD).

One more limitation of Field Level Security is that it cannot be applied to columns that are used as an Alternate Key in Microsoft Dataverse.

Plug-In

A plug-in is the best way to conditionally and unconditionally secure Dataverse data on the server side. However, plug-ins require triggers to run the business logic. If you already have data in production, then Create and Update triggers won't help; you have to run the business logic on existing data too. So the only trigger point that meets the requirement is RetrieveMultiple. However, I personally don't recommend writing plug-ins on the Retrieve and RetrieveMultiple messages, because they can hamper your app's performance, make your model-driven apps unresponsive, and slow down client interaction. Check more details here.

Having said that, it's your choice whether you still want to go ahead with a plug-in based solution. I still think there should be a better option, so that you don't need to compromise app performance.

JavaScript

JavaScript is a client-side language, so it can only secure/mask a column's value on forms or views. The data can still be exposed from various places such as Advanced Find, Export to Excel, Web APIs, Power Automate, plug-ins, workflows, etc.

PCF Control

A PCF control is a custom component built with HTML/TypeScript/JavaScript, so it too can only secure data on the client side. The data can still be exposed from various places such as Advanced Find, Export to Excel, Web APIs, Power Automate, plug-ins, workflows, etc.

So is it really not possible to implement it?

Until you dare to take on new challenges, you won't discover your full potential. So I love taking on challenges and working on challenging requirements. 😊

However, this requirement had me stuck for the past couple of weeks, since I couldn't find a solution in any article or Microsoft Docs on the internet, nor from any community experts. So I thought of posting it as a new idea on the Power Apps Ideas portal. You can check it here.

Additionally, I posted it on my LinkedIn wall to get the views of my fellow MVPs, experts, and Dynamics specialists. It turns out this is quite a common business requirement, yet the solution was unknown to everyone. I really appreciate everyone who voted for my idea and agreed that there should be some built-in feature in Microsoft Dataverse to implement such requirements.

So without further ado – here is the solution.

Option 1 – Power Fx Formula Type Column

A Power Fx formula column is a new type of Dataverse column introduced by Microsoft that empowers citizen developers to express business logic on top of existing columns, and to reference columns from other tables, directly in Dataverse.

The major advantage of using a formula column is that all calculations are done at the Dataverse level, so the results can be seen and used in all Dataverse endpoints, including the Dataverse for Teams table view, canvas and model-driven Power Apps, Power Automate, Power BI, and the Dataverse APIs, to name just a few.

Additionally, in a formula column all calculations are done on the fly, as added projections on the SQL SELECT query. Therefore, results are always real time.

You can find more details here.

Note: At the time of writing this blog, the Power Fx formula column is in PREVIEW. Therefore, it's not recommended for production use until it is generally available. You can check the release plan here.

Example: I have used a Power Fx formula column to conditionally mask the Mobile Number on the Contact form. My requirement is to mask the last 5 digits of the mobile number if the customer has opted to keep it hidden; otherwise, display the complete mobile number to CRM users.

To implement that, I created a formula column and wrote a Power Fx expression to conditionally mask the last 5 digits of the mobile number.

Creating Formula Type Column
Writing Power Fx Expression in Formula Type Column

Power Fx Expression: If('Keep it secret', Concatenate(Left('Mobile Phone', 5), "*****"), 'Mobile Phone')

Description: If the Keep it secret (Boolean) column is set to true, then take the first 5 digits of the mobile number and concatenate them with *****. The outcome looks like this on the form and view.

Contact Form
Contact View

Major Advantages of using formula type column are:

  • It works on Read operations. All calculations are done on the fly, as added projections on the SQL SELECT query, so results are always real time.
  • It executes the expression server side, so security is never compromised. The masking cannot be bypassed via forms, views, Advanced Find, Export to Excel, etc., nor via plug-ins, Power Automate, workflows, or the Web API.
  • You don't need to worry about updating existing data if you create this column in a later phase of the project, so no bulk update job is needed for legacy data.
  • You can conditionally mask/secure the column's value.

Option 2 – Calculated Column

Obviously, the Power Fx formula column is the next-generation column type in Dataverse. However, if you can't yet use formula columns and you urgently need this business logic in production, you can use a calculated column instead to conditionally secure/mask the column's value; it does the same job.

Creating Calculated Column
Formula in Calculated Column
Outcome
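In case the screenshots don't render for you, the calculated column definition follows the same pattern as the Power Fx version. The classic calculated-column editor builds the formula from conditions and actions rather than free text, so the following is only pseudocode of what the editor produces; the column names are from my example (and TRIMRIGHT assumes a 10-digit mobile number, keeping the first 5 characters by removing the last 5):

```
IF 'Keep it secret' equals Yes
    SET 'Masked Mobile' TO CONCAT(TRIMRIGHT(mobilephone, 5), "*****")
ELSE
    SET 'Masked Mobile' TO mobilephone
```

Note that calculated columns expose a different function set than Power Fx (for example CONCAT and TRIMRIGHT instead of Concatenate and Left), so the formula is expressed slightly differently even though the logic is identical.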

Will Microsoft replace Calculated and Rollup Columns with Power Fx?

As per Microsoft – yes, in the future, but not at present, and there will be no rush; those existing technologies are tried and true. "Only when we have more feedback and experience, and we've worked through any functionality gaps, will we draw up and communicate a long-term migration plan. In time formula columns should be able to do everything that can be done today, and much more, in an easier to use and Power Platform consistent manner."

So, as of now there is no plan to completely deprecate calculated and rollup columns from Microsoft Dataverse. Power Fx formula columns are being introduced by Microsoft to overcome the limitations of calculated/rollup columns, along with some advanced expressions for writing Excel-like business logic.

Hence, the Power Fx formula column is the future, and it is the long-term migration path from calculated and rollup columns. Having said that, until Microsoft brings all the features of calculated and rollup columns into Power Fx, they will not be deprecated, so we can continue to use them in our Dataverse environments.

Conclusion

So, we finally have a feature in Microsoft Dataverse that can mask a column conditionally. Although Power Fx formula columns are not that mature right now, they are on Microsoft's release plan, so they will definitely become more powerful than calculated and rollup columns in the future. Some capabilities, like masking/unmasking based on security roles or on a related table, are not possible right now, but will hopefully arrive in the near future.

Important Tip: One challenge I still want to highlight is that you can't apply this to an existing column. For example, suppose you created an account number column of type Single Line of Text and deployed it to production. If the client later asks to mask the account number based on account type (Standard/Premium), you can't change the original column. You'll have to create a new column (of formula or calculated type) and write the Power Fx expression/formula to read the value from the original column, mask it, and surface it in the new column. Additionally, you will have to apply field level security to the original column so that users cannot read the unmasked value from it. A bit tricky, but you have a solution!
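For instance, the credit card requirement from the beginning of this post (show only the last 4 digits) could be handled the same way: create a new formula column over the original, field-level-secured card number column with an expression like the following (the column name 'Card Number' is illustrative and will differ in your environment):

```powerfx
// Show only the last 4 digits, e.g. XXXX-XXXX-XXXX-4444
Concatenate("XXXX-XXXX-XXXX-", Right('Card Number', 4))
```

Because the formula column executes server side, end users only ever receive the masked value through any Dataverse endpoint, while the raw 'Card Number' column stays protected by field level security.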

Please don't forget to share your valuable feedback/input if you found this information worth reading.

Also, subscribe/follow to get notified about more interesting blog posts in the future. Stay tuned!

Thank You!


Power Apps Portal – Create Left Navigation Menu

Introduction

Hey everyone, hope you are doing well and staying safe!

In this #PowerGuideTip, I am going to share a tip to customize the portal's default toolbar menu. As you may know, the portal toolbar menu (web links) currently has two major UI limitations:

  • By default, the portal menu/toolbar can only appear in the header.
  • It displays the navigation menu only up to one level deep. As depicted in the pic below, you can see the child links only of Power Guide Tips, not the child links of Power Platform and Dynamics 365.

Business Requirement

Customizing the portal UI is one of the most common business requirements: your client wants the portal UI (login, registration, home, and toolbar menu) to match other existing applications and websites. Recently, I came across a UI requirement where I had to show the navigation menu in a left panel instead of at the top.

Solution

I am going to share a code snippet based on JavaScript, HTML, CSS, and Liquid code, where:

  • HTML + CSS – design the custom toolbar menu.
  • JavaScript – handles the client-side validations and logic.
  • Liquid code – dynamically fetches the portal web links from the Web Link Set table.

Alright, let's get into more detail and implement the requirement by performing the following steps:

Note: I have created the following web links under the Primary Navigation Web Link Set.

Step 2 – Open the 'Mobile Header' content snippet (Portal Management > Content Snippets) and paste the following code:
<span style="font-size: 22px;cursor:pointer;color: black;" onclick="openNav()">☰</span>
<a href="~/"><img src="/powerguidelogo3" alt="Power Guide" style="width: 214px;margin-top: -14px;"></a>
<meta name="viewport" content="width=device-width, initial-scale=1">
<style>
  body {
    font-family: "Lato", sans-serif;
  }
  .sidenav {
    height: 100%;
    width: 0;
    position: fixed;
    z-index: 1;
    top: 0;
    left: 0;
    background-color: white;
    overflow-x: hidden;
    transition: 0.5s;
    padding-top: 60px;
  }
  .sidenav a {
    padding: 0px 8px 3px 32px;
    text-decoration: none;
    font-size: 16px;
    color: #818181;
    display: block;
    transition: 0.3s;
    margin-left: 26%;
  }
  .sidenav a:hover {
    color: #f1f1f1;
    text-decoration: underline !important;
  }
  .sidenav .closebtn {
    position: absolute;
    top: 20px;
    font-size: 32px;
    margin-left: 0px;
    font-weight: normal;
  }
  #sublink_L1 {
    font-size: 15px;
    font-weight: normal;
    margin-left: 5%;
  }
  #sublink_L2 {
    font-size: 15px;
    font-weight: normal;
    margin-left: 15%;
  }
  @media screen and (max-height: 450px) {
    .sidenav {
      padding-top: 15px;
    }
    .sidenav a {
      font-size: 18px;
    }
  }
</style>
<div id="mySidenav" class="sidenav" style="width: 0px;">
  <img src="/powerguidelogo3" style="width: 54.52%;margin-left: 22%;top: 0px;position: absolute;">
  <br>
  <a href="javascript:void(0)" class="closebtn" onclick="closeNav()">×</a>
  {% assign primary_nav = weblinks["Primary Navigation"] %}
  {% if primary_nav %}
    {% for parentLink in primary_nav.weblinks %}
    <a href="{{ parentLink.url }}">{{ parentLink.name | escape }}</a>
    {% assign sublinks_L1 = parentLink.weblinks %}
        {% for sublink_L1 in sublinks_L1 %}
            {% if sublink_L1 %}
                <a href="{{ sublink_L1.url }}">
                <p id="sublink_L1"><span>- </span>{{ sublink_L1.name | escape }}</p>
                </a>
            {% endif %}
            {% assign sublinks_L2 = sublink_L1.weblinks %}
            {% for sublink_L2 in sublinks_L2 %}
                {% if sublink_L2 %}
                <a href="{{ sublink_L2.url }}">
                <p id="sublink_L2"><span>- </span>{{ sublink_L2.name | escape }}</p>
                </a>
                {% endif %}
            {% endfor %}
        {% endfor %}
    {% endfor %}
  {% endif %}
</div>
<script src="https://code.jquery.com/jquery-3.5.1.min.js"></script>
<script>
  function openNav() {
    $("#mySidenav").css('width', '415px');
    $("body").css('backgroundColor', '#eee');
    $('.navbar').css('backgroundColor', '#eee');
  }
  function closeNav() {
    $("#mySidenav").css('width', '0');
    $("body").css('backgroundColor', 'White');
    $('.navbar').css('backgroundColor', 'White');
  }
</script>
  • Update line 2 (portal logo and company name) as per your business requirement. This line of code will already be present in your content snippet.
  • Update the CSS (lines 4-57) as per your UI guidelines.
  • Line 63 – change the web link set name to the one you want to customize. I have used 'Primary Navigation' for the demonstration; however, it might be different in your case, for example Default, Secondary Navigation, Profile Navigation, etc.
  • Lines 63-84 – the Liquid code fetches the web links under 'Primary Navigation' along with the sublinks (child links) of each web link. In the given code, I have fetched the sublinks only to the second depth. As depicted in the pic below, D1 represents depth-1 child links and D2 represents depth-2 child links (child links of a child link).
Step 3 – Hide portal default toolbar menu from Header template

Open the Header web template and add style="display:none;" at the following places.

Demo

  • Important Tip: You can create a new content snippet for the code I shared above and include that snippet in your Mobile Header content snippet.

Thank you for reading this article and hope you found it useful and valuable. Please share your feedback and hit subscribe to learn such more useful #PowerGuideTips. Stay tuned!

Power Apps Portals – Extend Portal Trial Period

Introduction

A portal is always created as a trial. A trial portal, which expires after 30 days, is useful for trying out its capabilities at no cost. After it expires, the portal is suspended and shut down. Seven days after it’s suspended, the trial portal is deleted.  As an administrator, you can convert a trial or suspended portal to a production portal. When converting a portal from trial to production, you must ensure that the environment is also a production environment. You can’t convert a trial portal to a production portal in a trial environment.

It is always recommended to convert your portal to production to avoid this situation. However, there are numerous situations where you set up a 30-day Power Platform trial environment to play around with the portal features, design a proof of concept (POC), demonstrate portal features to a client, or maybe for your own learning and R&D. And you might need to extend the portal trial for another 30 days rather than converting it to production, due to not having the appropriate license or having limited storage capacity.

Today, in this #PowerGuideTip, I am going to share a Tip to extend the portal trial period.

Note: When the trial portal expires, the portal configuration still remains in Dataverse.

Requirement

  • Extend portal trial (for MVPs)
  • Extend portal trial (for Non-MVPs)

As you can see below, I have two portals in two different environments: one has been suspended for 7 days after its 30-day trial period, and the other has been deleted after the 30-day trial plus the 7-day suspension period. When I try to convert the suspended portal to production, I get a license issue. However, I want to extend my trial portal in order to connect it with my portal configuration already available in Dataverse.

Portal is in Suspended state for 7 days after 30 days trial and will be removed after 3 days
Suspended Portal
Portal got deleted after 30 days trial and no longer in the App list
Deleted Portal
Can’t convert my Trial Portal to Production due to not having appropriate license

Solution

For MVPs

As you may know, as an MVP you get a free Microsoft 365 subscription for one year, where you can leverage Office 365, Dynamics 365, and Power Platform capabilities at no cost. However, when you install the portal in that environment, it still expires after 30 days as usual. Since your environment will be up and running as long as you hold the MVP award, you would also like to keep your portal alive until then. So here is the way to do that…

Step 1 – Go to PowerApps Maker Portal and Choose your Environment

Step 2 – Create a new portal (of the same type that was installed earlier) and choose 'Use data from existing website record'

Step 3 – Browse the portal

Portal is again available for 30 days
Portal is Up and Running

For Non MVPs

For non-MVPs, you get the trial environment for only 30 days; however, you can extend it for another 30 days. Hence, your trial portal can also be extended for only 30 more days. There might also be a situation where you have a Dynamics 365 license but want to install the portal in your licensed environment only for POC/demo/R&D purposes at no cost; in that case, you can still perform the same steps as described below:

Step 1 – Go to the Office Admin page and extend your Dynamics 365 trial period (skip this step if you already have a Dynamics 365 license)

Step 2 – Perform the same Steps (1-3) described For MVPs

Useful Articles and References

Thank you for reading this article and hope you found it useful and valuable. Please share your feedback and hit subscribe to learn such more useful #PowerGuideTips. Stay tuned!

Power Apps – Set/Reset values in Choice and Lookup Columns

Hello folks! In this #PowerGuideTip, I am going to share a tip to set/reset values in the Choice (formerly Option Set) and Lookup columns of Microsoft Dataverse in Power Apps.

Requirements

I have come across numerous business requirements, and have also seen folks asking in the community, about setting/resetting a value in a choice (dropdown/option set) or lookup field on a button click. This article helps you fulfill the following business requirements in Power Apps:

  • Set value in Choice (Option Set) field of Microsoft Dataverse on-demand in Edit form.
  • Set value in Lookup field on-demand in Edit form.
  • Autopopulate value in Choice field in Edit form.
  • Autopopulate value in Lookup field in Edit form.
  • Reset/Clear Choice and Lookup fields in Edit form.

Note: Power Apps uses the Combo Box control to render Choice and Lookup type columns in a Power Apps Edit form.

Prerequisite

I already have a Power Apps canvas app where I have added an Edit form for the Account table (Microsoft Dataverse). The form has the following controls:

Solution

For Lookup:

To set/reset the value in the lookup field, I have used a global variable called 'gvar_primaryContact' that holds the value I want to set on click of the 'Set Lookup' button. Since it's a lookup field, you need to retrieve the record first from Dataverse using the LookUp function.

Finally, you need to bind that global variable to the 'DefaultSelectedItems' property of the combo box.

Step 1: Power Fx expression on the OnSelect property of the 'Set Lookup' button.
Step 2: Power Fx expression on the OnSelect property of the 'Reset Lookup' button.
Step 3: Power Fx expression on the DefaultSelectedItems property of the 'Combo Box' control.
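In case the screenshots are not visible, the expressions look roughly like this (the variable, table, column, and record names are from my demo app and will differ in yours):

```powerfx
// 'Set Lookup' button OnSelect: fetch the contact record and store it in the variable
Set(gvar_primaryContact, LookUp(Contacts, 'Full Name' = "Yvonne McKay (sample)"));

// 'Reset Lookup' button OnSelect: clear the variable
Set(gvar_primaryContact, Blank());

// Combo box DefaultSelectedItems: bind the variable
gvar_primaryContact
```

Clearing the variable with Blank() empties the combo box the next time DefaultSelectedItems is evaluated.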

For Choices (Option Set):

To set/reset the value in the Choice (Option Set) field, I have used a global variable called 'gvar_businessType' that holds the value I want to set on click of the 'Set Dropdown' button. Since it's a choice field, you need to first retrieve its option value from Dataverse using the LookUp function.

Step 1: Power Fx expression on the OnSelect property of the 'Set Dropdown' button.
Step 2: Power Fx expression on the OnSelect property of the 'Reset Dropdown' button.
Step 3: Power Fx expression on the DefaultSelectedItems property of the 'Combo Box' control.
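Again, in case the screenshots are not visible, the choice expressions follow the same shape. The names below are from my demo app, and the exact Choices() reference and option name depend on your own table and column:

```powerfx
// 'Set Dropdown' button OnSelect: pick the desired option from the column's choice list
Set(gvar_businessType, LookUp(Choices(Accounts.'Business Type'), Value = 'Business Type (Accounts)'.Commercial));

// 'Reset Dropdown' button OnSelect: clear the variable
Set(gvar_businessType, Blank());

// Combo box DefaultSelectedItems: bind the variable
gvar_businessType
```

The LookUp over Choices() retrieves the option record that the combo box expects, which is why a plain text label cannot be assigned directly.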

Important Tip: In this example, I have set/reset the value on a button click. If you want to auto-populate the value as soon as the form loads, you can use the Power Fx expression (which I have written on the OnSelect property of the button) directly on the DefaultSelectedItems property of the Combo Box. (See below.)

Hope you find this #PowerGuideTip helpful. Stay tuned for further useful tips.

Power Apps Portals – View Quick View Subgrid Data

Recently, while working on a portal, I had a requirement to display a Quick View form which had a subgrid of related records. Usually, when we display a subgrid on the portal, it can be configured through Basic/Advanced Form Metadata; for example, we can add a Create action, a Download action, and a View Details action. However, a Quick View subgrid is quite different from a normal subgrid because it displays related records one level deeper. For example, if I have an Account form on the portal and I use a quick view form to display the related cases of that account's primary contact, then the relationship is one level deeper, i.e. Account > Contact > Case. Therefore, these kinds of subgrids cannot be configured using Basic Form Metadata.

Requirement

Add a hyperlink/View Details icon to view records in a subgrid rendered within a Quick View form.

Why is a Quick View subgrid different from a normal subgrid?

A Quick View subgrid is rendered within an iframe on the portal form, while a normal subgrid is rendered directly on the portal page. Also, a Quick View subgrid cannot be configured like a normal subgrid: adding a Create button, a Download button, or a View Details button is not possible using Basic/Advanced Form Metadata.

Business Logic

To add the View Details icon next to each table row, we need to loop through the subgrid (table) rows rendered within the iframe. However, getting iframe content is not as easy as with other HTML controls. I couldn't find a way to get the subgrid (table) load event within the iframe the same way as for a normal subgrid. Therefore, I used a polling approach: wait until the table (or its rows) exists, then append a new table cell (td) that shows the View icon.

jQuery Code

Before using the snippet, inspect the page to capture two values: the iframe's id (<IframeId>) and the data-entity attribute of the subgrid rows (<Attribute_SchemaName>).

var checkQuickViewIFrameLoaded = setInterval(function () {
    if ($("#<IframeId>").contents().find('[data-entity="<Attribute_SchemaName>"]').length > 0) {
        console.log("Iframe contents are loaded!");
        $("#<IframeId>").contents().find('[data-entity="<Attribute_SchemaName>"]').each(function () {
            var recordId = $(this).attr("data-id");
            var path = "https://" + window.location.host + window.location.pathname + "<Detailed page partial name>";
            // Append a cell with a View Details icon linking to the record's detail page
            $(this).append("<td><a target='_blank' href='" + path + "?id=" + recordId + "'><span class='glyphicon glyphicon-new-window'></span></a></td>");
        });
        clearInterval(checkQuickViewIFrameLoaded);
    }
}, 100); // check every 100 ms

Note: Replace the placeholder values (<IframeId>, <Attribute_SchemaName>, and <Detailed page partial name>) in the above code snippet with your own HTML element details.

PowerApps Portals Build Tools – Automate Portal Deployment

Introduction

This tool empowers developers to enable CI/CD (continuous integration/continuous deployment) of portal configuration. You can use it to check the portal configuration into source control and move portal configuration to any environment using the Power Apps CLI.

For more details about Power Apps CLI support for Power Apps Portals, you can check these articles:

The extension provides the following tasks:

  • Portal Tool Installer – installs prerequisite tools such as node.js and npm, which are needed to connect, authenticate, download, and upload the portal configuration.
  • Export Portal Configuration – exports the portal configuration from the Power Platform environment and saves it in source control.
  • Import Portal Configuration – imports the portal configuration into a Power Platform environment.

Pre-requisites

  1. Set up an Azure DevOps account.
  2. Create a project in Azure DevOps.
  3. Create a repository and folder structure (to store portal configurations).
  4. Install PowerApps Portals Build Tools.
  5. Register an Azure AD app and create an application user in the D365 environment.
  6. Assign permissions to manage the repository.

Search for Project Collection Build Service. Choose the identity Project Collection Build Service ({your organization}) (not the group Project Collection Build Service Accounts ({your organization})). By default, this identity can read from the repo but cannot push any changes back to it. Grant permissions needed for the Git commands you want to run. Typically you’ll want to grant:

  • Create branch: Allow
  • Contribute: Allow
  • Read: Allow
  • Create tag: Allow
My Git Repository – Folder Structure

Steps to configure

Step 1 – Install the extension in your Azure DevOps Instance

  1. Create a build pipeline in Azure DevOps and find the tool named "Power Apps Portals Build Tools".
  2. Click on “Get it free” to get this tool installed in your Azure DevOps instance.

Step 2 – Portal Tool Installer

The Portal Tool Installer task installs prerequisite software and tools such as npm and node.js on the Azure DevOps VM. These are required to connect, authenticate, and download/upload the portal data from/to your Power Platform environment.


Step 3 – Export Portal Configuration

The Export Portal Configuration task is used to export (download) the portal configuration (forms, lists, web pages, content snippets, etc.) from your Power Platform environment and store it in source control. To use this task in your build (CI) pipeline, you need to pass some mandatory information such as the D365 instance URL, application ID, client secret, tenant ID, website ID, and the folder path in your source control where you want to save the portal configuration.

Here are the mandatory details that you need to provide:

  • Dynamics 365 URL: https://<orgname>.crm.dynamics.com
  • Application Id: register an Azure AD app in Azure and capture this value from there
  • Client Secret Key: register an Azure AD app in Azure and capture this value from there
  • Azure Tenant Id: Capture this value from your Azure AD
  • Portal Website id: Capture this value from Dynamics 365 instance, where your portals is installed
  • Save export data (All): folder path in the repository where you want to store the complete portal configuration data. For example, if you have a folder in the repository named Power-Platform-Solution, and inside that another folder called PowerApps-Portals where you want to export the portal configuration, then this parameter value must be: Power-Platform-Solution\PowerApps-Portals
  • Save export data (Selected): Folder path in repository, where you want to store only selected portal configuration data.
  • Exclude data: here you can provide the list of all tables/files (comma-separated) whose data you don't want to import into the target instance. For example, if you don't want to migrate site-setting, content-snippet, and website-language data to the target instance during portal deployment, specify those file/table folder names in comma-separated format (e.g. site-setting,content-snippet,website-language). Please see the screenshot below to check the folder/file names of the portal components.

Important Note: I have used an output variable to pass the Exclude data value to the next step of the pipeline. Hence, set the reference name (portal) of the output variable as shown in the following screenshot.


Note: For security reasons, you should use pipeline variables to store confidential details like the application ID and client secret.

Step 4 – Commit data to source control

Once you export the portal configuration, it is only available on your Azure DevOps virtual machine. You need to commit that data to the master branch of your source control. To do that, add a PowerShell task with the following script.

git config --global user.email "arpit@bizappmvp.onmicrosoft.com"
git config --global user.name "Arpit Shrivastava"
git config --global core.longpaths true
git checkout master
git add Portal-Components
git add Portal-Components\Portal-Configuration\Starter-Portal
git add Portal-Components\Portal-Deployment\Starter-Portal
git commit -a -m "changes"
git push origin master

Understand the script and folder path

Lines 1-2 Commands: configure the Git identity used for the commit. Provide the account details of a build user who has the appropriate permissions to perform operations in the Git repo. Check this article for more details.
Line 3 Command: optional. Add this command if you have a deeply nested folder structure in your Git repo. Git for Windows limits file paths to 260 characters by default; if you maintain 3-4 levels of subfolder nesting to store the portal configuration, add this line to avoid the error: "Unable to index file – Adding files failed in the folder due to filename too long".
Line 4 Command: checks out your branch. In my case the branch name is 'master', so I am checking out 'master'. Please replace it with your branch name.
Lines 5-7 Commands: if your Git repo has the first folder structure shown below, then on line 5 you need to mention 'Portal-Solution' (the root folder of the portal), and on lines 6 and 7 you should mention 'Portal-Solution\Portal-Configuration' and 'Portal-Solution\Portal-Deployment' respectively.

-Repo
-Portal-Solution (Root Folder 1)
–Portal-Configuration
–Portal-Deployment
-D365-Solution (Root Folder 2)

And if the folder structure is like the second example below, then on line 5 you need to mention 'Portal-Solution' (the root folder of the portal), and on lines 6 and 7 you should mention 'Portal-Solution\Portal-Configuration\Starter-Portal' and 'Portal-Solution\Portal-Deployment\Starter-Portal' respectively (if you are deploying the starter portal).

-Repo
-Portal-Solution (Root Folder 1)
–Portal-Configuration
—Starter-Portal
—Customer-Portal
—Partner-Portal
–Portal-Deployment
—Starter-Portal
—Customer-Portal
—Partner-Portal
-D365-Solution (Root Folder 2)

Line 8 Command: commits the changes to the repository.
Line 9 Command: pushes the local repo changes to the remote Git repo.

Note: You only need to replace the highlighted parts of the script, using the folder paths as per the instructions above.
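To see how the whole nine-line script hangs together, here is a self-contained sketch you can run locally against a scratch repository. The email, name, and sample files are placeholders; in the real pipeline these commands run inside your Azure DevOps checkout and the final step is git push origin master instead of git log:

```shell
set -e
repo=$(mktemp -d)                    # scratch repo standing in for the DevOps checkout
cd "$repo"
git init -q .
git config user.email "build-user@example.com"   # lines 1-2: hypothetical build account
git config user.name  "Build User"
git config core.longpaths true                   # line 3: avoids "filename too long" with deep nesting
# Create sample exported portal files so the adds have something to stage
mkdir -p Portal-Components/Portal-Configuration/Starter-Portal \
         Portal-Components/Portal-Deployment/Starter-Portal
echo "exported config" > Portal-Components/Portal-Configuration/Starter-Portal/webpage.yml
echo "exported config" > Portal-Components/Portal-Deployment/Starter-Portal/webpage.yml
git add Portal-Components                                      # line 5: portal root folder
git add Portal-Components/Portal-Configuration/Starter-Portal  # line 6
git add Portal-Components/Portal-Deployment/Starter-Portal     # line 7
git commit -q -m "changes"                                     # line 8
git log --oneline                                              # line 9 in the pipeline: git push origin master
```

Running it prints a single commit named "changes", confirming the staged portal folders were committed.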

Step 5 – PowerShell Script to exclude data from being exported

This script contains the logic to remove, from the repository, the portal component folders that you don’t want to migrate to the target instance. To manage this, I have created two folders (Portal-Source and Portal-Deployment) in my repository. The Portal-Source folder holds the complete portal data export, and the Portal-Deployment folder holds only the selected portal data export (the part that I want to import).

Currently, there is no way in Power Apps CLI to do an incremental or selective portal component deployment. The only alternative I found is to remove the corresponding folder of each excluded portal component.

git config --global user.email "arpit@bizappmvp.onmicrosoft.com"
git config --global user.name "Arpit Shrivastava"
$tables = "$(portal.ExcludedTables)"
$lists = $tables.Split(",")
Write-Output $lists
foreach ($l in $lists) {
    git rm -r "Portal-Components\Portal-Deployment\Starter-Portal\<name of your portal website>\$l"
    Write-Output "$l removed..."
}
git add .
git commit -a -m "removed folder"
git push origin master

Note: In the above Git Commands,
1. Replace “arpit@bizappmvp.onmicrosoft.com” and “Arpit Shrivastava” with your Azure DevOps account details in the above PowerShell script.
2. Replace Portal-Components\Portal-Deployment\Starter-Portal\arpitpowerguide with your repository path, where you want to keep the portal configuration that will be migrated to the target environment.
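The heart of the script is splitting the comma-separated excluded-table list and looping over it. The same logic looks like this in Bash, with a hypothetical value standing in for the portal.ExcludedTables pipeline variable:

```shell
# Hypothetical value of the portal.ExcludedTables pipeline variable
excluded_tables="web-files,content-snippets,site-settings"

# Split on commas and print each folder the script would delete with git rm -r
IFS=',' read -ra folders <<< "$excluded_tables"
for f in "${folders[@]}"; do
    echo "$f removed..."
done
```

Each entry in the variable maps to one exported component folder, so excluding a component is just a matter of adding its folder name to the comma-separated list.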

Step 6 – Publish Artifact

This step is used to store the build artifacts to the staging directory.

Please point the ‘Path to publish’ field to the folder where you are storing the selective portal data export in the repository.

Step 7 – Create Deployment Profile

Please read the following articles to understand the significance of deployment profiles and how to use them:

You need to manually create the deployment profiles inside the folder containing the portal content (see below). In the example below, I have to deploy the portal to the SIT, UAT, and PROD environments. Therefore, I have created three different deployment profiles.


Note: Creating deployment profiles is a one-time activity. From then on, you may only need to change their content (if required).
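For reference, a deployment profile is a YAML file named &lt;profile&gt;.deployment.yml (for example, UAT.deployment.yml) that lists, per table, the records and column values to override in that target environment. The sketch below is illustrative only: the table, record id, and values are hypothetical placeholders, so replace them with the actual settings you need to vary per environment:

```yaml
# UAT.deployment.yml - hypothetical overrides applied when deploying to UAT
adx_sitesetting:
  - adx_sitesettingid: 00000000-0000-0000-0000-000000000000   # id of the site setting record to override
    adx_name: "Authentication/OpenIdConnect/AzureAD/RedirectUri"
    adx_value: "https://uat-portal.example.com/signin-azure-ad"
```

This keeps environment-specific values (URLs, secrets references, app ids) out of the exported configuration itself.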

Step 8 – Import Portal Configuration (Use it in Release Pipeline)

The Import Portal Configuration task is used to import (upload) the portal configuration (forms, lists, web pages, content snippets, etc.) to your Power Platform environment. In order to use this task in your release (CD) pipeline, you need to pass some mandatory information such as the D365 instance URL, application id, client secret, tenant id, and the folder path in your source control where you have exported/stored the portal configurations.

Here are the mandatory details that you need to provide:

  • Dynamics 365 URL: https://<org-name>.crm.dynamics.com
  • Application Id: Register an Azure AD app in azure and capture this value from there
  • Client Secret Key: Register an Azure AD app in azure and capture this value from there
  • Azure Tenant Id: Capture this value from your Azure AD
  • Portal data: Point this to your build artifact (as shown below)
  • Deployment Profile Name: Provide the name of the deployment profile that you have created for your target environment. Say your UAT deployment profile is UAT.deployment.yml; then the name of the deployment profile would be UAT.
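In other words, the value you pass is just the profile file name with the .deployment.yml suffix removed, which you can sanity-check like this:

```shell
profile_file="UAT.deployment.yml"              # profile file created in Step 7
profile_name="${profile_file%.deployment.yml}" # strip the suffix to get the name to pass
echo "$profile_name"                           # prints: UAT
```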

Step 9 – Update Website Binding (Only one time)

Earlier, the website ids of portals of the same type used to be identical. However, as per recent updates, you can now have multiple portals installed in your Dataverse environment, so portal website ids can no longer be the same. That means the ‘Starter Portal’ installed in the Development environment will have a different website id from the Starter Portal installed in the UAT environment. So now, you need to perform the following steps once you have successfully deployed the portal configuration to the target environment (SIT, UAT, PROD, etc.):

  1. Delete the existing Website record from the target environment that got created during portal installation. Say you are deploying the Starter Portal from your Development environment to the UAT environment. Then, after migrating the portal configuration, you will see two website records: one that got created during portal installation, and another that was just created by the portal deployment.
  2. Navigate to Power Apps Portal Admin Center
  3. Under Portal Details, update the portal binding by choosing the newly deployed website.
  4. Wait for a few seconds, and your portal will be up and running.

Note: This is a one-time activity that needs to be performed on each target environment after the first portal deployment. From then on, subsequent deployments will simply update the portal configuration in the target environment.

Step 10 – Confirm the deployed changes

To confirm the changes deployed to the target environment, clear the server-side cache, or use Sync Configuration in Power Apps portals Studio.

Demo

Useful Articles and References

PowerApps Portals Build Tools – An Azure DevOps Extension to Automate Portal Deployment

About PowerApps Portals Build Tools

This tool empowers developers to enable CI/CD (Continuous Integration/Continuous Deployment) of portal configuration. You can use this tool to check-in the portal configuration to source control and move the portal configuration to any environment using Power Apps CLI.

For more details about Power Apps CLI support for Power Apps Portals, you can check these articles:

The extension provides the following tasks:
  • Portal Tool Installer: Installs prerequisite tools like Node.js and npm needed to connect, authenticate, download, and upload the portal configuration.
  • Export Portal Configuration: Exports the portal configuration from the Power Platform environment and saves it in source control.
  • Import Portal Configuration: Imports the portal configuration into the Power Platform environment.
Steps to configure
Step 1 – Install the extension in your Azure DevOps Instance
  1. Create a build pipeline in Azure DevOps and find the tool with the name “Power Apps Portal Build Tools”.
  2. Click on “Get it free” to get this tool installed in your Azure DevOps instance.

 

Step 2 – Portal Tool Installer

The Portal Tool Installer task is used to install all prerequisite software and tools, like npm, Node.js, etc., on the Azure DevOps VM. These tools are required to connect, authenticate, download, and upload the portal data from/to your Power Platform environment.

 

Step 3 – Export Portal Configuration

The Export Portal Configuration task is used to export (download) the portal configuration (forms, lists, web pages, content snippets, etc.) from your Power Platform environment and store it in source control. In order to use this task in your build (CI) pipeline, you need to pass some mandatory information such as the D365 instance URL, application id, client secret, tenant id, website id, and the folder path in your source control where you want to save the portal configurations.

Here are the mandatory details that you need to provide:

  • Dynamics 365 URL: https://<org-name>.crm.dynamics.com
  • Application Id: Register an Azure AD app in azure and capture this value from there
  • Client Secret Key: Register an Azure AD app in azure and capture this value from there
  • Azure Tenant Id: Capture this value from your Azure AD
  • Portal Website id: Capture this value from Dynamics 365 instance, where your portals are installed
  • Save export data: Repository path where you want to store your portal configuration data. For example, if you have a folder in the repository named Power-Platform-Solution, and inside it another folder called PowerApps-Portals where you want to export the portal configuration, then this parameter value must be: Power-Platform-Solution\PowerApps-Portals

Step 4 – Commit data to source control

Once you export the portal configuration, it is only available on your Azure DevOps virtual machine. You need to commit that data to your source control in the master branch. To do that, you should add a PowerShell task with the following script.

Note: In the git add command (line 4 in the script, as shown below), give the folder path where you want to store the portal configuration in the repository.

Step 5 – Import Portal Configuration

The Import Portal Configuration task is used to import (upload) the portal configuration (forms, lists, web pages, content snippets, etc.) to your Power Platform environment. In order to use this task in your release (CD) pipeline, you need to pass some mandatory information such as the D365 instance URL, application id, client secret, tenant id, and the folder path in your source control where you have exported/stored the portal configurations.

Here are the mandatory details that you need to provide:

  • Dynamics 365 URL: https://<org-name>.crm.dynamics.com
  • Application Id: Register an Azure AD app in azure and capture this value from there
  • Client Secret Key: Register an Azure AD app in azure and capture this value from there
  • Azure Tenant Id: Capture this value from your Azure AD
  • Portal data: Repository path where you stored the portal configuration using the Export Portal Configuration task. For example, if your portal configuration is stored in the Power-Platform-Solution\PowerApps-Portals\Starter-Portal folder, then you should choose the Starter-Portal folder in the folder path (as shown below)
Demo and Usage

PowerApps – Highlight First Record in Canvas App List Gallery


Introduction

Hey Everyone!

I hope you all had fun and great learning this weekend in PowerApps Developer Bootcamp.

I would like to express my sincere appreciation to all the speakers and participants who generously helped us make this event a success 💪

During the event, one of our participants had a question about highlighting the first record in the Canvas App List Gallery control. I found this requirement quite interesting and tried to google it to check whether someone else had already implemented it before. However, I couldn’t find anything on the internet.

So, I thought of implementing it using my experience and knowledge of PowerApps.

Today in this blog post, I am going to share #PowerGuideTip32, where I’ll show you a simple trick to highlight the first record in the List Gallery.

So let’s get started…


Implementation

Step 1 – Add a List Screen in your Canvas App.

Step 2 – Connect with Data Source to display the records

Step 3 – Add a Rectangle Icon to the screen (above the Gallery control).

(a) Make sure the Rectangle icon is added outside your List Gallery control. To do that, you can select the Gallery control from the left navigation panel, drag it down a bit, and then place the Rectangle icon right above the Gallery (as shown below).

(b) Make sure the height and width of the Rectangle are similar to those of the List Gallery’s first record (as shown below), so that it can cover the complete information displayed in the record.

(c) Change the Background color of the Rectangle as per your need.

Step 4 – Right-click the Gallery control in the left navigation panel and choose Reorder > Bring to front

The idea behind this setting is to place the Rectangle icon in the background of the Gallery’s first record.

Step 5 – Now drag and adjust the whole Gallery so that its first record appears on top of the Rectangle (as shown below)

Step 6 – Run the app and see the result.

Demo of Implementation


Important Note: In this example, I haven’t used any query or formula like LookUp(…), First(…), or Filter(…) to find the first record of the table. This is just a simple example to showcase the capabilities of the out-of-the-box controls of the Canvas App in order to highlight the first record of a List Gallery.

I hope you found this Tip useful.

Stay tuned for the next Tip – #PowerGuideTip33.

PowerApps – Get Canvas App Logged-In User Role from Dynamics 365

Introduction
Hi Everyone,
Hope you are doing great and staying safe!
Today, in this blog post I am going to share #PowerGuideTip31, where I’ll demonstrate how to read a Dynamics 365 security role in a PowerApps Canvas App.
There are many business use cases, where we need to design an App based on the Dynamics 365 security role. One of the best examples is:
A role-based Canvas App, where we need to navigate the user to different screens or run different processes/flows based on the role assigned in the Dynamics 365 application.
I have already demonstrated this feature in an earlier blog post, where I used Power Automate to get the Dynamics 365 security role in the Canvas App. However, today I am going to retrieve it directly in the Canvas App without using any external components.
Let’s get started…
I have designed a Canvas App that allows –
  • Event Organizers to manage the organization events.
  • Speakers to submit their session.
  • Participants to register for an event.

Implementation

Step 1: Design a Login Screen

Step 2: Use an expression to retrieve the Canvas App logged-in user’s security roles from Microsoft Dataverse

I am using the following expression (Microsoft Power Fx) on the OnStart property of the Canvas App.
That means every time you open the app, all security roles assigned to the logged-in user automatically get stored in a Canvas App collection.

ClearCollect(UserRoles, (LookUp(Users, domainname = User().Email).'Security Roles (systemuserroles_association)').Name)
Here, User().Email returns the logged-in user’s email address.

LookUp(Users, domainname = User().Email) – retrieves the user record from the Dataverse Users table whose email address equals the logged-in user’s email address.
'Security Roles (systemuserroles_association)'.Name – returns the names of all security roles assigned to the logged-in user.
Finally, I am storing the list of all security roles in Collection, so that I can read it throughout my app.

Important Note: The App OnStart event does not trigger when you play/run the app from the Power Apps Studio editor, so you won’t see any values stored in the collection. To populate it, you need to manually trigger the OnStart event by clicking Run OnStart (see below).




Step 3: Restrict Canvas App user’s login based on security role
Once you have the security roles of the logged-in user, the next step is to write an expression on the Login Screen to check whether the user has sufficient privileges to access the app.
To do that, I have used the following expression on click of the ‘Event Organizer’ control:
If(IsBlank(LookUp(UserRoles, Name="Event Organizer")), Navigate(RoleErrorMessage, ScreenTransition.CoverRight, {rolemissing:"Event Organizer"}), Navigate(HomeScreen))

Explanation: Here, I am checking whether the Canvas App collection contains the Event Organizer role. If the lookup returns blank, the user is redirected to the Error Screen; otherwise, the user is redirected to the Home Screen.
And I am using the following expression on click of the ‘Participants’ control:

If(IsBlank(LookUp(UserRoles, Name="Participant")), Navigate(RoleErrorMessage, ScreenTransition.CoverRight, {rolemissing:"Participant"}), Navigate(HomeScreen))

Explanation: Here, I am checking whether the Canvas App collection contains the Participant role. If the lookup returns blank, the user is redirected to the Error Screen; otherwise, the user is redirected to the Home Screen.


Step 4: Test and Demo

Useful Tip
In the above example, I retrieved the user’s security roles in the Canvas App OnStart event and stored them in a collection, then used those roles to design a role-based Canvas App. However, you might also have a requirement to show the list of the logged-in user’s roles on a Canvas App screen. To do that, perform the following steps:
Add a List type of screen from the screen list

Set the ‘Items’ property of the list screen to the collection name (created in Step 2)


I hope you found this PowerGuideTip helpful.

Stay tuned for #PowerGuideTip32

Canvas Apps – Auto populate Option Set value in the Canvas App Form

Introduction

Hi Everyone,

Hope you are well and staying safe!
In today’s #PowerGuideTip30, I will show you how to auto-populate or set a value in an Option Set field in a Canvas App.
I have designed a Canvas App for event management purposes, with a screen that the Event Organizer uses to create an event. This screen has an Option Set field called ‘Event Status’ that shows the status of the event.
I don’t want to allow the Event Organizer to set the Event Status value manually; instead, I want to auto-populate its value to ‘Submitted’.



Solution

To auto-populate or set a value in an Option Set field, you should use the Choices() expression.
  • The Choices function returns a table of the possible values for a lookup column.
  • Use the Choices function to provide a list of choices for your user to select from. This function is commonly used with the Combo box/ Drop Down control in edit forms.
Syntax
Choices( column-reference )
column-reference – Required. A lookup column of a data source or an Option Set column. Don’t enclose the column name in double quotes. The reference must be direct to the column of the data source and not pass through a function or control.
Implementation
Choices('Event Status (Events)'.Submitted)
where 'Event Status (Events)' represents the Event Status column of the Event table in Microsoft Dataverse, and Submitted is the Option Set value that I want to auto-populate.

Note: Since I want to auto-populate the Option Set value, I have used the Choices(…) expression directly on the Items property of the Combo box. If you want to set the value on a button click instead, you can use the same expression on the OnSelect property of the button control.
Useful Reference
I hope you found this PowerGuideTip helpful. 

Stay tuned for #PowerGuideTip31