This post covers how to configure a Power Automate Flow to get items from a SharePoint list at maximum speed. Super useful if you have an interactive process and you do not want to keep users waiting.
This post is part of a three-part series of blog posts. Before proceeding, please read this post, which describes the objectives and the pros and cons of each method. It also provides links to the exported flows and the sample data I used to build the flows:
The many ways to retrieve more than 5,000 items from a SharePoint list using Power Automate
If you have not used a parallel branch before, this blog post is worth reading.
Before getting into the technicalities, have a look at the complete flow. If you have already built the simpler SharePoint API version detailed in this post, it will make sense.
General Premise
This flow retrieves the SharePoint items using the same method as the flow provided in this post, but the work of collecting data from SharePoint is split over four parallel branches which execute at the same time, instead of waiting for each batch of 5,000 items to be collected before retrieving the next.
Changes from the SharePoint API Flow
Several changes were made to the SharePoint API flow:
Variable Declarations
There are some extra values declared and defined at the start of the flow which control how the four branches choose which section of data to return:
Get the Lowest ID from the SharePoint list using the REST API
The lowest ID in the list is retrieved with the Send an HTTP request to SharePoint action, using the following query:
_api/Web/Lists/GetByTitle('5000 Item Test')/Items?$select=ID&$filter=ID gt 0&$top=1&$orderby=ID asc
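The ID returned by this call is referenced when the boundaries for each branch are calculated. Assuming the action above is named getLowestID (as it is in the branch queries later in this post), the lowest ID can be read with this expression:
body('getLowestID')?['value'][0]['ID']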
Get the Highest ID from SharePoint list using the REST API
The same as the flow step above, but repeated to get the highest ID, using the following query:
_api/Web/Lists/GetByTitle('5000 Item Test')/Items?$select=ID&$filter=ID gt 0&$top=1&$orderby=ID desc
Get the Total Item Count from the SharePoint List using the REST API
The total item count of the list is returned by using the ItemCount property of the SharePoint REST API:
_api/Web/Lists/GetByTitle('5000 Item Test')/ItemCount
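The count returned by this call is stored in the itemCount variable used by the parallel branch control block below. The exact expression depends on the name you give the action and the Accept header you send; a sketch, assuming the action is named getItemCount and the response is returned without metadata (a verbose response would instead nest the value under d):
int(body('getItemCount')?['value'])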
Declare the control variables for each parallel branch
Because there are four parallel threads, all of the variables which operate within each branch are declared four times.
In the previous example, there was a single integer, array and boolean variable to control the do until loop that was retrieving SharePoint data. Now those variables need to be declared for each parallel branch:
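The naming convention can be inferred from the expressions used later in the post; a sketch of the declarations (the boolean names are an assumption):
- intID, intID2, intID3, intID4 (integer) – the last ID retrieved by each branch, used as the moving lower boundary of its filter
- arrItems, arrItems2, arrItems3, arrItems4 (array) – accumulate the items returned by each branch
- blnFinished, blnFinished2, blnFinished3, blnFinished4 (boolean) – hypothetical names, flag when a branch has retrieved its final page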
The main control block for the Parallel Branches
In the control block for the parallel branches, a compose action is used to divide the total item count by four to determine the amount of data each branch should retrieve:
div(variables('itemCount'),4)
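Each branch's ID boundaries are then derived from this quarter value and the lowest ID retrieved earlier. For example, the upper boundary of branch one (which is also the lower boundary of branch two) is calculated as:
add(body('getLowestID')?['value'][0]['ID'], mul(outputs('DivideTotalByFour'), 1))
Note that div() performs integer division, so any remainder is picked up by the final branch, which has no upper ID boundary.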
The control mechanism for the do until loops was modified to be more efficient by changing the step which checks whether there were any results from SharePoint:
if(empty(body('HTTP_to_SharePoint_to_get_First_Quarter_of_Results').value), true, if(less(length(body('HTTP_to_SharePoint_to_get_First_Quarter_of_Results').value), 5000), true, false))
This removes an iteration of the loop by checking whether either of the following conditions is true:
- The result set from SharePoint was empty
- The number of results returned from SharePoint was less than 5000
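Inside each do until loop the steps follow the same pattern as the single-branch flow; a sketch of the expressions used for branch one (the other branches use their own numbered variables). A Compose action combines the existing array with the latest batch, because a variable cannot reference itself directly in a Set variable action:
union(variables('arrItems'), body('HTTP_to_SharePoint_to_get_First_Quarter_of_Results')?['value'])
A Set variable action then copies the Compose output back into arrItems, and intID is moved on to the last ID that was returned:
last(body('HTTP_to_SharePoint_to_get_First_Quarter_of_Results')?['value'])?['ID']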
SharePoint API Query for each branch
The SharePoint API query is different for each parallel execution of the data collection.
Each branch collects data according to the batch size defined by dividing the total item count by four. Branch one begins at the lowest ID, branch two begins where branch one ends, and so on, so that all items are collected.
Branch 1
_api/Web/Lists/GetByTitle('5000 Item Test')/Items?$select=ID,EmpNo,Title,Last,Furloughed,Telephone,Email,Brand&$filter=ID gt @{variables('intID')} and ID le @{add(body('getLowestID')?['value'][0]['ID'], mul(outputs('DivideTotalByFour'), 1))}&$top=5000
Branch 2
_api/Web/Lists/GetByTitle('5000 Item Test')/Items?$select=ID,EmpNo,Title,Last,Furloughed,Telephone,Email,Brand&$filter=ID gt @{add(body('getLowestID')?['value'][0]['ID'], mul(outputs('DivideTotalByFour'), 1))} and ID gt @{variables('intID2')} and ID lt @{add(body('getLowestID')?['value'][0]['ID'], mul(outputs('DivideTotalByFour'), 2))}&$top=5000
Branch 3
_api/Web/Lists/GetByTitle('5000 Item Test')/Items?$select=ID,EmpNo,Title,Last,Furloughed,Telephone,Email,Brand&$filter=ID ge @{add(body('getLowestID')?['value'][0]['ID'], mul(outputs('DivideTotalByFour'), 2))} and ID gt @{variables('intID3')} and ID lt @{add(body('getLowestID')?['value'][0]['ID'], mul(outputs('DivideTotalByFour'), 3))}&$top=5000
Branch 4
_api/Web/Lists/GetByTitle('5000 Item Test')/Items?$select=ID,EmpNo,Title,Last,Furloughed,Telephone,Email,Brand&$filter=ID ge @{add(body('getLowestID')?['value'][0]['ID'], mul(outputs('DivideTotalByFour'), 3))} and ID gt @{variables('intID4')}&$top=5000
Compile all results and send back to PowerApps
The response step simply uses a union() expression to bring all of the SharePoint data back together in a single step:
union(variables('arrItems'),variables('arrItems2'),variables('arrItems3'),variables('arrItems4'))
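If the results are being returned to Power Apps, the Response action also needs a JSON schema that matches the selected columns. A minimal sketch, assuming every column comes back as a string apart from ID; adjust the property names and types to match your own list:
{
    "type": "array",
    "items": {
        "type": "object",
        "properties": {
            "ID": { "type": "integer" },
            "EmpNo": { "type": "string" },
            "Title": { "type": "string" },
            "Last": { "type": "string" },
            "Furloughed": { "type": "string" },
            "Telephone": { "type": "string" },
            "Email": { "type": "string" },
            "Brand": { "type": "string" }
        }
    }
}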
Summary
Using parallel branches to fetch SharePoint data can provide a huge speed benefit over a standard do until loop. It comes at the expense of a more complicated flow, but it can be worth it to provide the best possible user experience.
It may be easier to download the exported flow and modify it to your requirements than to build it from scratch.
You can download the flow and sample data from this post.
Liz Van der Peet says
Hi Paul,
Terrific example of speeding up processing.
Would love to see an example of how best to create / update SP list items. (I have a need to email 8000 people, then have them click a button in email that adds an item to SP list.)
Paulie says
Can you explain a little more? It sounds like it could be the basis for a good post. When the end user clicks the email button that will only be one item, so I don’t see that you need parallel processing on that part. Is the outbound email to the 8000 being generated from the SP list? (i.e. looping and creating an email list)
Noor says
Hi Paulie
It works successfully thank you sooooooo much
Can you guide me on how to compare the combined array with an Excel table? Note that I am trying to make a flow that updates an SP list from Excel (add new, delete and update items).
Appreciate your support
Rishi Gupta says
We are doing a bulk create to SharePoint from Power Automate. We are sending a batch of 1000 records. The batch is running in parallel. We are trying to create around 2500 records. It works fine for the first run and gets completed in less than 15 mins. But for the second or third run the flow takes more than 2 hrs to complete. Sometimes it takes around 5-6 hrs to complete. Do you know what could be happening?
Mo says
Hi, This is amazing and just what I needed. The only two areas I am struggling with are:
1. Even though I have updated the field names and the JSON Schema, the results in PowerApps are those with your list field names, e.g. ‘furlough’, etc.
2. It is easy to add string and integer field types, but I also have field types of Choices, LookUp and PeopleSelector. How can I include these as part of the URI?
Many thanks
J-F says
Hello. Thanks a lot for sharing this, it is perfect for me and it works. It’s something new for me but your explanations are very clear and helpful. I adapted your flow with a variable to make a first filter on the SP list (it’s only because I can do it with the data set I received).
I have some questions if you have some time. Is it possible (or a good idea) to adapt this flow to delete and recreate my SP Online list from an Excel file (120k rows) each week, for example (or maybe updating is a better way instead of delete/create)? If I want my Power Apps app to always be available with all the data (without interruption), would switching between two lists be a good idea? Or is updating the SP list definitely the best solution?
Thank you in advance for your time and have a nice day ! 🙂
Paulie says
120k rows is a lot, you could use the SharePoint batch insert functionality but I’d consider using another data source for the task. Is that possible?
J-F says
Hello. No, it’s not possible, I only receive that file each week (sometimes with changes, sometimes not). I had a quick look at batching on the net (I read that it’s also possible with Power Automate). I’ll try it that way.
Thank you in advance for your time and have a nice day!
Dave says
Hi, fantastic blog, thank you. I have set up the flow with 4 parallel branches and I am retrieving data from libraries that contain 100,000 items. I was expecting 100,000 to be retrieved, divided by 4, with 25,000 being funnelled to each branch to be processed. However, the maximum items retrieved per run is 25,000, which then divided by 4 amounts to 5,000 each (I am also using a premium connector).
Do you know why this may be occurring?
Thank you
Dave
Nick M. says
Awesome solution, thank you for sharing with the community. 🙂
What if the item count and item IDs are not the same? For example, if files are deleted, the highest ID would be greater than the item count.
Perhaps the solution would work better if you divided the highest ID instead of the item count. This would cover all items in the list.
chandan says
If I want to get lookup columns, how can I get them? Please help me to get lookup columns.
Hichem says
Hi there,
I want to use that because I need to send 23000 documents to be signed through the Adobe Sign connector.
So my SharePoint list has 23,000 lines.
How do I link that to the Adobe Sign connector when there are parallel branches? Can someone help me or guide me?
Paulie says
Wow, 23k documents. Why not just use an apply to each with additional concurrency?
You’ll need to be careful, as you could easily go over your API limit if you execute too quickly.