Power Platform Security & Governance Considerations

I hope this checklist helps you before and during a Power Platform project engagement:

Security & Governance Considerations

  1. What are the constructs of the solutions that have been developed and are being developed?
    These were identified during the project’s kick-off workshop [Example]:
    • Dataverse
    • SharePoint
    • Azure Blob Storage
    • Azure SQL
    • Exchange
    • Data Lake
  2. How do these constructs fit together at design time and at run time?
  3. It’s best we familiarize ourselves with [Client]’s environment governance. Environments can be used to target different audiences and/or serve different purposes, such as development, testing, and production.
  4. Access to Power Platform solutions and artifacts starts with having a license; the type of license a user holds determines the assets and data that user can access. Can you please give us an overview of licensing at [Client]?
  5. Are there any data loss prevention (DLP) policies in place? If so, what are they, and what is their scope across the environments?
  6. Consider Microsoft Intune: it can set mobile application protection policies for both the Power Apps and Power Automate mobile apps on Android and iOS. Are there any such policies set at [Client] that we need to be aware of?
  7. Consider location-based conditional access: organizations with Azure AD Premium can define conditional access policies in Azure for Power Apps and Power Automate. This allows granting or blocking access based on user or group, device, and location. Does [Client] have any location-based policies?
  8. How are security roles and access managed for the Dataverse tables associated with the solution?
  9. An on-premises data gateway acts as a bridge, providing quick and secure data transfer between on-premises data (data that is not in the cloud) and the Power BI, Power Automate, Logic Apps, and Power Apps services. Does [Client] use an on-premises gateway? If so, can we understand whether any clusters are installed and which on-premises data sources are connected?
  10. With custom connectors, developers can capitalize on existing organizational investments in REST API services, or create new APIs to expose complex server-side operations that are not available with the out-of-the-box connectors. Are there any custom connectors? If so, what is the high-level architecture of how they work?
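To make the location-based conditional access point (item 7) concrete, such a policy can be expressed through the Microsoft Graph conditional access API. The following is a minimal sketch only; the display name is invented, and the app, group, and named-location IDs are placeholders you would replace with [Client]'s real values:

```json
{
  "displayName": "Block Power Platform outside trusted locations",
  "state": "enabledForReportingButNotEnforced",
  "conditions": {
    "applications": { "includeApplications": [ "<power-apps-app-id>" ] },
    "users": { "includeGroups": [ "<makers-group-id>" ] },
    "locations": {
      "includeLocations": [ "All" ],
      "excludeLocations": [ "<trusted-named-location-id>" ]
    }
  },
  "grantControls": { "operator": "OR", "builtInControls": [ "block" ] }
}
```

Starting the policy in report-only mode (`enabledForReportingButNotEnforced`) is a common precaution, since a misconfigured block policy can lock makers out of their own apps.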
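On custom connectors (item 10): a custom connector is defined by an OpenAPI (Swagger 2.0) description of the underlying REST API, so reviewing those definitions is a quick way to understand the architecture. A minimal, hypothetical sketch of such a definition (the host, path, and operation are placeholders, not [Client]'s actual API):

```json
{
  "swagger": "2.0",
  "info": { "title": "Orders API", "version": "1.0" },
  "host": "api.example.com",
  "basePath": "/v1",
  "schemes": [ "https" ],
  "paths": {
    "/orders/{id}": {
      "get": {
        "operationId": "GetOrder",
        "summary": "Retrieve a single order",
        "parameters": [
          { "name": "id", "in": "path", "required": true, "type": "string" }
        ],
        "responses": { "200": { "description": "The order record" } }
      }
    }
  }
}
```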

Performance & Optimization

  1. We can take an approach of studying the existing apps’ performance; in particular, we should focus on:
    • App Design – the app might be client-heavy, meaning it loads large sets of data into collections up front. It might also have long formulas in OnStart that trigger unnecessary data calls on screens, which in turn return large sets of records. To review the app design, we will profile the app using ‘Monitor’.
    • Bottleneck in the data source – there are many possible causes of a bottleneck at the source: the back-end machine hosting the data source may be low on resources, back-end SQL instances may be blocking resources, or the on-premises data gateway may be unhealthy.
    • Temporary throttling of high-volume requests at the back end – depending on how the canvas app is designed, it can generate many data calls within a short time. If some of them exceed the connector’s throttling limits, the app may be temporarily throttled. Although there are many data sources to choose from, choosing the right data source and connector is important from many perspectives—architecture, performance, maintenance, and scalability. ‘Monitor’ can be used to profile the app and investigate the problem.
  2. Object naming convention – As objects are created in a Power Apps application, it’s important to use consistent naming conventions for screens, controls, and data sources. This approach makes applications easier to maintain, can help improve accessibility, and makes the code easier to read wherever those objects are referenced. We will make sure the following objects align with best-practice naming conventions: screen names, control names, data source names, variable names, collection names, etc.
  3. We will always adhere to best practices to improve performance of canvas applications, such as:
    • Limit data connections (under 30)
    • Limit the number of controls (under 500)
    • Optimize the ‘OnStart’ property
    • Cache lookup data
    • Avoid control dependency between screens
    • Use delegation
    • Use Delayed Load (currently experimental; to be considered once generally available)
    • Optimization of working with large datasets
    • Republish the app regularly
    • Build responsive apps
    • And many more
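To make the App Design point (item 1 above) concrete, here is a sketch in Power Fx; the table, column, and collection names are hypothetical. A client-heavy app pulls an entire table into memory in OnStart, whereas the lighter alternative defers a filtered retrieval to the screen that actually needs it:

```
// Heavy: pulls the whole Orders table into a collection when the app starts
// App.OnStart
ClearCollect(colAllOrders, Orders);

// Lighter: load only what the screen needs, when it is shown
// OrdersScreen.OnVisible
ClearCollect(colOpenOrders, Filter(Orders, Status = "Open"));
```

Running the app under ‘Monitor’ will show the difference as fewer and smaller data calls at startup.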
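As an illustration of the naming conventions in item 2, one common scheme (the specific prefixes are an example, not a mandate) uses a short type prefix in camel case:

```
scrOrderDetails      // screen
galOpenOrders        // gallery control
btnSubmitOrder       // button control
dsOrders             // data source
colCachedDepartments // collection
varIsLoading         // variable
```

Whatever prefixes are chosen, the key point is consistency across the app, so a formula reader can tell at a glance what kind of object each reference points at.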
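A sketch of the ‘Cache lookup data’ practice from item 3, again in Power Fx with hypothetical names: fetch slow-changing reference data once into a collection, then resolve lookups against that local copy instead of calling the connector on every lookup.

```
// App.OnStart (or a loading screen): cache slow-changing reference data once
ClearCollect(colDepartments, Departments);

// Later, anywhere in the app: resolve against the local cache,
// avoiding a connector round trip per lookup
LookUp(colDepartments, DepartmentID = ThisItem.DepartmentID).DepartmentName
```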


Testing

  1. Testing is an important part of the software development life cycle (SDLC). Testing helps ensure the quality of the delivered app: it can identify issues or defects early in the release process and provides an opportunity to fix them, making the app more reliable before changes are released. What is the current testing process at [Client]? Is [Client] utilizing Power Apps Test Studio? The tool is meant for automating tests for canvas apps.
  2. Canvas app tests can be automated using the Azure Pipelines classic editor in Azure DevOps Services. Is there any existing test automation using Azure Pipelines?
