Power Apps can return at most 2,000 rows from a non-delegable query (the data row limit). This makes loading large datasets challenging. But what if you need to load tens of thousands of rows? 😎
In this blog, I’ll show you how to bypass the row limit and dynamically load all rows from the Users table using ForAll, Sequence, and CountRows — no fixed limit, no delegation warnings!
Here, I'm using the Users table from Dataverse as an example.
🎯 Problem Overview
By default, Power Apps can only retrieve and store up to:
✅ 2,000 rows
If you try to load more than this, you might encounter:
❌ Delegation warnings
❌ Incomplete data retrieval
❌ Performance issues
💡 Goal:
✅ Load all rows from a Dataverse table (even tens of thousands)
✅ Use batching to bypass the row limit
✅ Keep the loading process dynamic and scalable
✅ Solution: Dynamic Loading Using ForAll, Sequence, and CountRows
To dynamically load all rows from the Users table, you need to:
- Create a Custom Integer Column
  - In the Users table, create a new column named Mirroring_ID of type Integer.
  - This column will act as a unique incremental ID for filtering and batching.
- Use Power Automate to Assign Sequential Values
  - Create a Power Automate flow using the "When a row is added, modified, or deleted" trigger.
  - Use a Row Number expression to update the Mirroring_ID column with a unique value.
- Use PowerFX to Load Data in Batches
  - Use ForAll, Sequence, and Filter to load rows dynamically in chunks of 2,000.
💻 PowerFX Code Example
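Here's a sketch of the full formula, for example in a button's OnSelect. The names MyCollection, TotalRecords, PageSize, and MaxPages are illustrative, and it assumes Mirroring_ID has been populated with sequential values starting at 1:

```powerfx
// Remove any existing data to avoid duplication
Clear(MyCollection);

// Total rows and batch size
Set(TotalRecords, CountRows(Users));
Set(PageSize, 2000);
Set(MaxPages, RoundUp(TotalRecords / PageSize, 0));

// Loop through each batch number: 1, 2, …, MaxPages
ForAll(
    Sequence(MaxPages) As Page,
    With(
        {
            StartRange: (Page.Value - 1) * PageSize,
            EndRange: Page.Value * PageSize
        },
        ForAll(
            // Pull only the rows that fall in the current batch
            Filter(
                Users,
                Mirroring_ID > StartRange,
                Mirroring_ID <= EndRange
            ) As Row,
            // Skip duplicates, then collect
            If(
                !(Row.Mirroring_ID in MyCollection.Mirroring_ID),
                Collect(MyCollection, Row)
            )
        )
    )
);
```

Because the Filter conditions are simple integer comparisons on Mirroring_ID, each batch query stays delegable to Dataverse.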
🔎 How It Works
1. Clear the Collection
👉 Clear(MyCollection)
– Removes any existing data from the collection to avoid duplication.
2. Get Total Records and Calculate Batches
👉 CountRows(Users)
– Returns the total number of rows in the Users table.
👉 PageSize = 2000
– Defines the batch size (limited to 2,000).
👉 RoundUp(TotalRecords / PageSize, 0)
– Calculates the total number of batches needed.
Example: If there are 8,500 records → 8500 / 2000 = 4.25 → rounded up to 5 batches ✅
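This step might look like the following in PowerFX (the variable names are illustrative):

```powerfx
Set(TotalRecords, CountRows(Users));                 // e.g. 8500
Set(PageSize, 2000);                                 // batch size
Set(MaxPages, RoundUp(TotalRecords / PageSize, 0));  // RoundUp(4.25, 0) = 5
```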
3. Load Records in Batches Using ForAll
👉 Sequence(MaxPages)
– Generates a list of batch numbers (1, 2, 3, …).
👉 ForAll()
– Loops through each batch number to load records.
Example: MaxPages = 5 → loops through values [1, 2, 3, 4, 5] ✅
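As a quick way to see what Sequence produces (BatchNumbers is just a throwaway collection for illustration):

```powerfx
// Sequence(MaxPages) returns a one-column table: Value = 1, 2, …, MaxPages
ClearCollect(BatchNumbers, Sequence(MaxPages))
```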
4. Filter Records in the Current Batch
👉 Filter()
– Filters the records based on the calculated range.
👉 StartRange
and EndRange
– Define the start and end of each batch.
Example:
- Batch 1 → Mirroring_ID between 0 and 2000 ✅
- Batch 2 → Mirroring_ID between 2000 and 4000 ✅
- Batch 3 → Mirroring_ID between 4000 and 6000 ✅
- Batch 4 → Mirroring_ID between 6000 and 8000 ✅
- Batch 5 → Mirroring_ID between 8000 and 8500 ✅
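One way to derive each batch's range from its batch number (names are illustrative):

```powerfx
// For batch number Page.Value: batch 1 → 0–2000, batch 2 → 2000–4000, …
With(
    {
        StartRange: (Page.Value - 1) * PageSize,
        EndRange: Page.Value * PageSize
    },
    Filter(Users, Mirroring_ID > StartRange, Mirroring_ID <= EndRange)
)
```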
5. Add Records to the Collection
👉 If(!(Mirroring_ID in MyCollection.Mirroring_ID))
– Checks for duplicates before adding the record.
👉 Collect(MyCollection, ThisRecord)
– Collects the record into the collection.
Example:
- If Mirroring_ID = 1001 already exists → skip ✅
- If Mirroring_ID = 1002 is new → add to collection ✅
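The duplicate check from this step, as a standalone sketch (MyCollection is the assumed collection name):

```powerfx
// Only collect rows whose Mirroring_ID is not already in the collection
If(
    !(ThisRecord.Mirroring_ID in MyCollection.Mirroring_ID),
    Collect(MyCollection, ThisRecord)
)
```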
💡 Why This Solution Works
✅ Handles large datasets using batch-based loading
✅ Bypasses delegation warnings
✅ Loads data dynamically without hardcoding row limits
✅ Efficient handling of records using Filter() and Sequence()
🌟 Best Practices
✔️ Keep batch size at 2,000 rows to stay within delegation limits.
✔️ Ensure Mirroring_ID is unique and of type Integer.
✔️ Monitor app performance using the Monitor Tool in Power Apps.
✔️ Test with smaller datasets before rolling out to production.
🏆 Complete Example with Explanation
| Step | Action | Example |
|---|---|---|
| 1 | Create a custom integer column Mirroring_ID | Type = Integer ✅ |
| 2 | Assign sequential values using Power Automate | Row Number expression ✅ |
| 3 | Load the first 2,000 rows using ClearCollect() | Loads initial batch ✅ |
| 4 | Calculate number of batches using RoundUp() | 5 batches for 8,500 rows ✅ |
| 5 | Loop through batches using Sequence() | [1, 2, 3, 4, 5] ✅ |
| 6 | Load records using Filter() and Collect() | Batch-by-batch ✅ |
| 7 | Store all rows in a single collection | Unlimited rows ✅ |
🚀 Conclusion
By combining CountRows(), Sequence(), and ForAll(), you can dynamically load unlimited rows from the Users table without hitting delegation limits — fast and efficient! 😎