This post is part of a series of blog posts which demonstrate how to use the SharePoint Batch API in Power Automate. In this post I will demonstrate how to create SharePoint items in batches of up to 1,000. The main advantages of using the batch API are:
- Drastically improved performance.
- A potential 1,000-fold reduction in the use of Power Automate API actions.
This Flow is very similar to the post I wrote on batch deleting items from a SharePoint list.
Table of contents
Example Data
In this flow I am going to import an Excel file containing 3 columns and 15,000 rows of data into a SharePoint list with the same configuration:
The data source is unimportant. Any JSON array containing the data that you want to insert into SharePoint will be fine.
The Standard Power Automate Method
The most common method of importing rows of data into SharePoint with Power Automate is to use an apply to each loop to iterate through a JSON array, with a “Create Item” action inside the loop. It looks something like this:
The great thing about this method is that it is very simple to do, so if you are new to Power Automate it’s a great way to get data from a variety of sources into a SharePoint list.
For small datasets, this method is perfect. But it does have two problems:
- Performance is quite slow. A large dataset will take many hours to complete.
- A large number of rows will consume many API actions from your daily allowance.
I tested this flow using the first 1,000 items from the Excel table and it took 7 minutes and 15 seconds to complete. So to do all 15,000 rows would have taken approximately 1 hour and 50 minutes.
I’m not going to go into too much detail about the method above, because if you’re reading this, you’re probably trying to find a faster method!
Batch Create SharePoint List Items
It is possible, using the SharePoint $batch API, to bundle up to 1,000 actions to SharePoint into a single HTTP request. This provides drastically increased performance and reduces the number of API actions required by a factor of up to 1,000.
However, it is much more complicated to implement, so it is only worth the effort if you need the extra performance.
To create exactly the same 1,000 rows of data using the batch method took only 1 minute and 46 seconds, so the total time for all 15,000 records would be approximately 26 minutes, roughly an hour and 20 minutes faster than the standard method.
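Conceptually, each call to the $batch endpoint wraps many individual item-creation requests into one multipart HTTP body. Here is a rough Python sketch of that structure, based on the batchTemplate used later in this post. The function and variable names are my own, and Content-Length is shown as a simple character count for illustration:

```python
import json
import uuid

def build_batch_body(site_url, list_name, items):
    """Assemble a SharePoint $batch multipart body that creates one
    list item per entry in `items` (mirrors the flow's batchTemplate)."""
    batch_guid = uuid.uuid4()
    changeset_guid = uuid.uuid4()
    # SharePoint entity type name; spaces in list names are encoded as _x0020_
    entity_type = f"SP.Data.{list_name.replace(' ', '_x0020_')}ListItem"

    parts = []
    for item in items:
        payload = dict(item, __metadata={"type": entity_type})
        parts.append(
            f"--changeset_{changeset_guid}\n"
            "Content-Type: application/http\n"
            "Content-Transfer-Encoding: binary\n\n"
            f"POST {site_url}_api/web/lists/getByTitle('{list_name}')/items HTTP/1.1\n"
            "Content-Type: application/json;odata=verbose\n\n"
            f"{json.dumps(payload)}\n"
        )
    changeset = "\n".join(parts)

    # Wrap the changeset in the outer batch envelope
    return (
        f"--batch_{batch_guid}\n"
        f'Content-Type: multipart/mixed; boundary="changeset_{changeset_guid}"\n'
        f"Content-Length: {len(changeset)}\n"
        "Content-Transfer-Encoding: binary\n\n"
        f"{changeset}\n"
        f"--changeset_{changeset_guid}--\n\n"
        f"--batch_{batch_guid}--"
    )
```

In the flow itself this assembly is done by the batchTemplate compose, a Select, and a join; the sketch above just shows the shape of the body that ends up being POSTed to `/_api/$batch`.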
Batch API Flow Overview
As per my method for deleting items from a SharePoint list, I have created a flow which should be simple for you to modify. Here is a screenshot of the flow:
The flow follows this process:
- Data is retrieved from the data source
- A variable named LoopControl is defined which is used to determine when the Do Until loop should end.
- The Scope Create SharePoint Items begins:
- A compose action called Settings defines some settings for the flow.
- A Select action is used to reformat the incoming data to shape it correctly for the list.
- A template is defined for the changeset actions.
- A do until loop iterates through all of the JSON data in batches until no more remains.
- A select action is used to reformat the array items.
- A compose action is used to join the output of the select.
- The output of the compose is sent to SharePoint and contains the batch of actions to be completed.
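The paging performed by the Select action inside the do-until loop, `take(skip(data, batchSize * iterationIndex), batchSize)`, can be sketched like this. This Python generator is illustrative only; the names are my own:

```python
def batches(rows, batch_size=1000):
    """Yield successive slices of `rows`, mirroring the flow's
    take(skip(data, batchSize * iterationIndex), batchSize) expression."""
    index = 0
    while True:
        batch = rows[batch_size * index : batch_size * (index + 1)]
        if not batch:  # equivalent to LoopControl reaching 0
            break
        yield batch
        index += 1
```

For example, 15,000 rows with a batch size of 1,000 yields 15 batches, and the loop stops as soon as a slice comes back empty, which is exactly what the LoopControl variable detects in the flow.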
Still not fast enough?
The flow above, although much more complicated than the standard method, provides a great performance benefit. What I noticed is that the performance bottleneck has shifted from Power Automate to SharePoint.
So I modified the flow, which adds a small amount of extra complexity, but doubles the effective speed:
| Method | Time to Create 1k List Items | Difference |
| --- | --- | --- |
| Standard Apply/Each Loop | 435 seconds | – |
| Batch API | 106 seconds | -329 seconds |
| Batch API Threaded | 54 seconds | -381 seconds |
This is achieved by splitting each batch of 1,000 into two simultaneous batches of 500 and running them in an apply to each loop with concurrency enabled. The apply to each loop uses an expression as its source:
if(
  equals(variables('LoopControl'), 0),
  json('[]'),
  createArray(
    take(body('Select'), div(length(body('Select')), 2)),
    skip(body('Select'), div(length(body('Select')), 2))
  )
)
This expression does a few things:
- Checks if there are any remaining items in the dataset, and if not returns an empty array.
- If there are items remaining it determines the length of the items and splits the array into two halves.
- The two halves are then run in parallel which doubles the speed of the SharePoint batch command.
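The splitting logic of that expression can be sketched in Python (illustrative only; the function name is my own):

```python
def split_for_threads(batch):
    """Split one batch into two halves to be sent as simultaneous
    $batch requests (mirrors the createArray/take/skip expression)."""
    if not batch:  # LoopControl is 0, so return an empty array
        return []
    half = len(batch) // 2  # integer division, like div() in the expression
    return [batch[:half], batch[half:]]
```

Note that with an odd number of items the second half receives the extra item, because `take` gets `div(length, 2)` items and `skip` returns everything after them.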
Flow Code and Implementation
Because this flow is somewhat complicated I am providing a number of options for you to download and implement it.
Sample Data and Flow Exports
Download my sample Excel Data File and Exports of the flows to import into your environment here:
- Sample Excel Data
- Create SharePoint Items Batch API Method
- Threaded Create SharePoint Items Batch API Method
To work with my sample data you will need to put the Excel file into your OneDrive and modify the List Rows Present in a Table action accordingly.
After Importing the flow you will need to update the settings compose action at the beginning of the scope with your SharePoint site address and list name:
Which version is right for you depends on how many items you have and what your performance demands are. I recommend the non-threaded version unless you need the added performance of the threaded one.
Paste the Scope into a Flow
If you prefer to build your own flow you can do the following:
- Create a variable called LoopControl.
- Copy the Scope code below and paste it into your flow.
- Update the settings compose action.
- Change the Select action GenerateSPData to draw from your own data source, but retain the first field (__metadata).
The scope code for the non-threaded version:
{ "id": "8869cabe-b30e-400c-b0f2-3d8a-d53fb36c", "brandColor": "#8C3900", "connectionReferences": { "shared_excelonlinebusiness": { "connection": { "id": "/providers/Microsoft.PowerApps/apis/shared_excelonlinebusiness/connections/shared-excelonlinebu-917cacf6-0b56-4b3f-ae9f-e703589cabec" } }, "shared_sharepointonline_1": { "connection": { "id": "/providers/Microsoft.PowerApps/apis/shared_sharepointonline/connections/shared-sharepointonl-46297fd3-182c-4215-bd63-ab2ad4fa0c11" } } }, "connectorDisplayName": "Control", "icon": "data:image/svg+xml;base64,PHN2ZyB3aWR0aD0iMzIiIGhlaWdodD0iMzIiIHZlcnNpb249IjEuMSIgdmlld0JveD0iMCAwIDMyIDMyIiB4bWxucz0iaHR0cDovL3d3dy53My5vcmcvMjAwMC9zdmciPg0KIDxwYXRoIGQ9Im0wIDBoMzJ2MzJoLTMyeiIgZmlsbD0iIzhDMzkwMCIvPg0KIDxwYXRoIGQ9Im04IDEwaDE2djEyaC0xNnptMTUgMTF2LTEwaC0xNHYxMHptLTItOHY2aC0xMHYtNnptLTEgNXYtNGgtOHY0eiIgZmlsbD0iI2ZmZiIvPg0KPC9zdmc+DQo=", "isTrigger": false, "operationName": "Create_SharePoint_Items", "operationDefinition": { "type": "Scope", "actions": { "settings": { "type": "Compose", "inputs": { "siteAddress": "https://accendo1.sharepoint.com/sites/PowerAutomateText/", "listName": "Invoices Development Test", "batchSize": 1000 }, "runAfter": {}, "description": "list to batch delete items from", "trackedProperties": { "meta": { "type": "SP.Data.@{outputs('settings')?['listName']}ListItem" }, "batchGUID": "@{guid()}", "changeSetGUID": "@{guid()}" } }, "GenerateSPData": { "type": "Select", "inputs": { "from": "@body('List_rows_present_in_a_table')?['value']", "select": { "__metadata": "@json(concat('{\"type\":\"SP.Data.', replace(outputs('settings')['listName'], ' ', '_x0020_'), 'ListItem\"}'))", "Customer": "@item()?['Customer']", "InvNum": "@int(item()?['Invoice Number'])", "Invoice_x0020_Amount": "@float(item()?['Invoice Amount'])" } }, "runAfter": { "settings": [ "Succeeded" ] } }, "batchTemplate": { "type": "Compose", "inputs": "--changeset_@{actions('settings')?['trackedProperties']['changeSetGUID']}\nContent-Type: 
application/http\nContent-Transfer-Encoding: binary\n\nPOST @{outputs('settings')['siteAddress']}_api/web/lists/getByTitle('@{outputs('settings')['listName']}')/items HTTP/1.1\nContent-Type: application/json;odata=verbose\n\n|RowData|\n", "runAfter": { "GenerateSPData": [ "Succeeded" ] } }, "Do_until": { "type": "Until", "expression": "@equals(variables('LoopControl'), 0)", "limit": { "count": 60, "timeout": "PT1H" }, "actions": { "Select": { "type": "Select", "inputs": { "from": "@take(skip(body('GenerateSPData'), mul(outputs('settings')['batchSize'], iterationIndexes('Do_until'))), outputs('settings')['batchSize'])", "select": "@replace(outputs('batchTemplate'), '|RowData|', string(item()))" }, "runAfter": {} }, "Set_variable": { "type": "SetVariable", "inputs": { "name": "LoopControl", "value": "@length(body('Select'))" }, "runAfter": { "Select": [ "Succeeded" ] } }, "batchData": { "type": "Compose", "inputs": "@join(body('Select'), decodeUriComponent('%0A'))", "runAfter": { "Set_variable": [ "Succeeded" ] } }, "SendBatch": { "type": "ApiConnection", "inputs": { "host": { "connection": { "name": "@json(decodeBase64(triggerOutputs().headers['X-MS-APIM-Tokens']))['$connections']['shared_sharepointonline_1']['connectionId']" } }, "method": "post", "body": { "method": "POST", "uri": "/_api/$batch", "headers": { "X-RequestDigest": "digest", "Content-Type": "multipart/mixed;boundary=batch_@{actions('settings')?['trackedProperties']['batchGUID']}" }, "body": "--batch_@{actions('settings')?['trackedProperties']['batchGUID']}\nContent-Type: multipart/mixed; boundary=\"changeset_@{actions('settings')?['trackedProperties']['changeSetGUID']}\"\nContent-Length: @{length(outputs('batchData'))}\nContent-Transfer-Encoding: binary\n\n@{outputs('batchData')}\n--changeset_@{actions('settings')?['trackedProperties']['changeSetGUID']}--\n\n--batch_@{actions('settings')?['trackedProperties']['batchGUID']}--" }, "path": 
"/datasets/@{encodeURIComponent(encodeURIComponent(outputs('settings')['siteAddress']))}/httprequest", "authentication": { "type": "Raw", "value": "@json(decodeBase64(triggerOutputs().headers['X-MS-APIM-Tokens']))['$ConnectionKey']" } }, "runAfter": { "batchData": [ "Succeeded" ] }, "limit": { "timeout": "P1D" }, "metadata": { "flowSystemMetadata": { "swaggerOperationId": "HttpRequest" } } }, "Results": { "type": "Compose", "inputs": "@base64ToString(body('sendBatch')['$content'])", "runAfter": { "SendBatch": [ "Succeeded" ] } } }, "runAfter": { "batchTemplate": [ "Succeeded" ] } } }, "runAfter": { "Initialize_variable": [ "Succeeded" ] } } }
Here is the scope code for the threaded version:
{ "id": "79273084-e0e1-450a-a878-1dda-e7a2b8a4", "brandColor": "#8C3900", "connectionReferences": { "shared_excelonlinebusiness": { "connection": { "id": "/providers/Microsoft.PowerApps/apis/shared_excelonlinebusiness/connections/shared-excelonlinebu-917cacf6-0b56-4b3f-ae9f-e703589cabec" } }, "shared_sharepointonline_1": { "connection": { "id": "/providers/Microsoft.PowerApps/apis/shared_sharepointonline/connections/shared-sharepointonl-46297fd3-182c-4215-bd63-ab2ad4fa0c11" } } }, "connectorDisplayName": "Control", "icon": "data:image/svg+xml;base64,PHN2ZyB3aWR0aD0iMzIiIGhlaWdodD0iMzIiIHZlcnNpb249IjEuMSIgdmlld0JveD0iMCAwIDMyIDMyIiB4bWxucz0iaHR0cDovL3d3dy53My5vcmcvMjAwMC9zdmciPg0KIDxwYXRoIGQ9Im0wIDBoMzJ2MzJoLTMyeiIgZmlsbD0iIzhDMzkwMCIvPg0KIDxwYXRoIGQ9Im04IDEwaDE2djEyaC0xNnptMTUgMTF2LTEwaC0xNHYxMHptLTItOHY2aC0xMHYtNnptLTEgNXYtNGgtOHY0eiIgZmlsbD0iI2ZmZiIvPg0KPC9zdmc+DQo=", "isTrigger": false, "operationName": "Create_SharePoint_Items", "operationDefinition": { "type": "Scope", "actions": { "settings": { "type": "Compose", "inputs": { "siteAddress": "https://accendo1.sharepoint.com/sites/PowerAutomateText/", "listName": "Invoices", "batchSize": 1000 }, "runAfter": {}, "description": "list to batch delete items from", "trackedProperties": { "meta": { "type": "SP.Data.@{outputs('settings')?['listName']}ListItem" }, "batchGUID": "@{guid()}", "changeSetGUID": "@{guid()}" } }, "GenerateSPData": { "type": "Select", "inputs": { "from": "@body('List_rows_present_in_a_table')?['value']", "select": { "__metadata": "@json(concat('{\"type\":\"SP.Data.', replace(outputs('settings')['listName'], ' ', '_x0020_'), 'ListItem\"}'))", "Customer": "@item()?['Customer']", "InvNum": "@int(item()?['Invoice Number'])", "Invoice_x0020_Amount": "@float(item()?['Invoice Amount'])" } }, "runAfter": { "settings": [ "Succeeded" ] } }, "batchTemplate": { "type": "Compose", "inputs": "--changeset_@{actions('settings')?['trackedProperties']['changeSetGUID']}\nContent-Type: 
application/http\nContent-Transfer-Encoding: binary\n\nPOST @{outputs('settings')['siteAddress']}_api/web/lists/getByTitle('@{outputs('settings')['listName']}')/items HTTP/1.1\nContent-Type: application/json;odata=verbose\n\n|RowData|\n", "runAfter": { "GenerateSPData": [ "Succeeded" ] } }, "Do_until": { "type": "Until", "expression": "@equals(variables('LoopControl'), 0)", "limit": { "count": 60, "timeout": "PT1H" }, "actions": { "Select": { "type": "Select", "inputs": { "from": "@take(skip(body('GenerateSPData'), mul(outputs('settings')['batchSize'], iterationIndexes('Do_until'))), outputs('settings')['batchSize'])", "select": "@replace(outputs('batchTemplate'), '|RowData|', string(item()))" }, "runAfter": {} }, "Apply_to_each": { "type": "Foreach", "foreach": "@if\r\n(\r\n equals(variables('LoopControl'), 0),\r\n json('[]'),\r\n createArray\r\n (\r\n take(body('Select'), div(length(body('Select')), 2)),\r\n skip(body('Select'), div(length(body('Select')), 2))\r\n )\r\n)", "actions": { "batchData": { "type": "Compose", "inputs": "@join(item(), decodeUriComponent('%0A'))", "runAfter": {} }, "SendBatch": { "type": "ApiConnection", "inputs": { "host": { "connection": { "name": "@json(decodeBase64(triggerOutputs().headers['X-MS-APIM-Tokens']))['$connections']['shared_sharepointonline_1']['connectionId']" } }, "method": "post", "body": { "method": "POST", "uri": "/_api/$batch", "headers": { "X-RequestDigest": "digest", "Content-Type": "multipart/mixed;boundary=batch_@{actions('settings')?['trackedProperties']['batchGUID']}" }, "body": "--batch_@{actions('settings')?['trackedProperties']['batchGUID']}\nContent-Type: multipart/mixed; boundary=\"changeset_@{actions('settings')?['trackedProperties']['changeSetGUID']}\"\nContent-Length: \nContent-Transfer-Encoding: binary\n\n@{outputs('batchData')}\n--changeset_@{actions('settings')?['trackedProperties']['changeSetGUID']}--\n\n--batch_@{actions('settings')?['trackedProperties']['batchGUID']}--" }, "path": 
"/datasets/@{encodeURIComponent(encodeURIComponent(outputs('settings')['siteAddress']))}/httprequest", "authentication": { "type": "Raw", "value": "@json(decodeBase64(triggerOutputs().headers['X-MS-APIM-Tokens']))['$ConnectionKey']" } }, "runAfter": { "batchData": [ "Succeeded" ] }, "limit": { "timeout": "P1D" }, "metadata": { "flowSystemMetadata": { "swaggerOperationId": "HttpRequest" } } }, "Results": { "type": "Compose", "inputs": "@base64ToString(body('sendBatch')['$content'])", "runAfter": { "SendBatch": [ "Succeeded" ] } } }, "runAfter": { "Set_variable": [ "Succeeded" ] }, "runtimeConfiguration": { "concurrency": { "repetitions": 10 } } }, "Set_variable": { "type": "SetVariable", "inputs": { "name": "LoopControl", "value": "@length(body('Select'))" }, "runAfter": { "Select": [ "Succeeded" ] } } }, "runAfter": { "batchTemplate": [ "Succeeded" ] } } }, "runAfter": { "Initialize_variable": [ "Succeeded" ] } } }
To copy the Scopes above simply copy the code into your clipboard, like this:
Conclusion
This is a great way to improve the performance of creating items in a SharePoint list. I have made my best attempt to make it as simple as possible for you to duplicate and I hope you find it useful. Please do let me know in the comments if you have managed to speed up your flows using this method. I’d be interested to know how long it took before and how much time you saved.
Support This Website
Thank you for visiting. It’s because of readers like you that I continue to write on this blog.
If you found value in what I share, I’ve set up a ‘Buy Me a Coffee’ page as a way to show your support.
Mezza says
A very thorough and detailed post! Thanks!
Pratibha says
For me this flow runs fine without any error, but it isn't writing anything to the list, i.e. it isn't creating items, even though I can see the values coming from the database. What could be the possible reason for it?
Paulie says
This means that the HTTP request was a valid format, but the contents of the batch request were not. So you need to look at the results compose action and see what the error returned from SharePoint was. My guess is that you have a string in a number column or vice versa. Have a look at the results and see what the error is.
Lluis says
Hi, thanks for all of this, it really is much faster. Just a question: I have tried with an Excel file and it only imports 1,024 records out of 65,000… The flow takes 6 minutes to end, with an error, and not all records have been imported. Any idea?
Thanks again!
tachytelic says
Lluis, you need to check the settings on “List rows present in a Table”. The threshold value is probably set to 1,000
Steven says
Hi Paulie, I got this error,
Request to Azure Resource Manager failed with error: '{"error":{"code":"ActionRepetitionNotFound","message":"The specified repetition '000002-000000' of workflow action 'Results' is not found."}}'.
Steven says
Hi Paulie, thank you so much, please post Batch Update multi items?
Donavan Marais says
Great post! I followed it to the T, however, I couldn’t work out the tracked properties for the scope settings. When I ran my flow, I encountered this error:
Unable to process template language expressions in action 'batchTemplate' inputs at line '1' and column '23313': 'The template language expression 'actions('settings')?['trackedProperties']['changeSetGUID']' cannot be evaluated because property 'changeSetGUID' cannot be selected.'
What needs to be stored in the tracked properties?
Thanks
Paulie says
If you copy the scope from the post and paste it into your own flow you will be able to examine what is in the tracked properties of the settings action, did you recreate the flow manually?
Donavan Marais says
I copied it over as per your instructions.
Donavan Marais says
Did you perhaps set the changeSetGUID in a variable?
Donavan Marais says
I managed to get past that issue by adding a compose for both batchGUID and changeSetGUID. Next hurdle was the sendBatch action. Got this error:
{
“status”: 400,
“message”: “The ‘Content-Type’ header value ‘application/http\\nContent-Transfer-Encoding: binary’ is invalid. When this is the start of the change set, the value must be ‘multipart/mixed’; otherwise it must be ‘application/http’.\r\nclientRequestId: 11a6b56d-e054-43a0-befe-c09bdd5a4a14\r\nserviceRequestId: bad8e29f-904e-c000-579d-aed473ac947b”,
“source”: “https://test.sharepoint.com/sites/test/_api/$batch”,
“errors”: [
“-1”,
“Microsoft.Data.OData.ODataException”
]
}
My Headers are configured as per yours:
Content-Type: multipart/mixed;boundary=batch_@{actions('settings')?['trackedProperties']['batchGUID']}
Donavan Marais says
Apologies, I found the issue. Corrected and it’s working great now. Thank you so much.
Paulie says
Well done! Glad you got it working. Lots of people are getting on quite well with it now.
Donavan Marais says
It was definitely user error on my part! Again, thanks for an excellent article. You saved me on my project. Super happy with the results. Took 6 minutes to create 3,200 list items.
Paulie says
It does provide a massive speed boost! How long did it take the normal way to create 3,200 items?
Donavan Marais says
Unfortunately it timed out each time I tried to do it the normal way
Shatha says
Thank you so much
But does this method work if I need to update records in a SharePoint list?
Paulie says
This will not update existing records. I have not yet written up how to perform an update using this method.
Derek Crawford says
Hey Paul, thank you for your videos, they are helping me grow tremendously. Is there a way to circumvent the returned 5K record limit during the ‘List rows present in a table’? Perhaps a nested Do Until, like you implemented in your “Solutions to the SharePoint 5000 item limit using Power Automate” April 2020 post?
http://tachytelic.net/2020/04/many-ways-get-sharepoint-items-power-automate/
BK says
Hi Paulie, this was very helpful and I have incorporated your way to assign tasks to users (The task lists and number of users are dynamic based on user selection in Power Apps). I made slight modifications to your approach and incorporated it into my flow. The way you have organized and developed the flow was also very helpful. I learnt some new things. Thanks for great help.
On the other note, have you ever done uploading of document (PDF and Office documents, it works for ascii texts, but never got it work for PDF and office documents) via SharePoint HTTP action? If you have, any help would be appreciated. The reason for me to use HTTP action is because the document library (there are 50 document libraries and it is not feasible to write a switch statement in Power Automate) changes based on user selection (currently I am using non-SharePoint HTTP action which is a premium connector. This works fine. I am looking to see if I can eliminate the premium connector). Any help or suggestions would be greatly appreciated.
Paulie says
Hi BK,
Pleased to hear that you found it helpful. Yes, I have uploaded documents via SharePoint HTTP Action, and it is possible to do any file type.
You cannot create a switch statement, but you can easily create an array of document libraries and then filter that list based on the user selection and then dynamically select the correct library. You could do this within a couple of actions and avoid the need for a switch or a condition entirely.
Paul
BK says
Paulie, thanks for the reply. For me the problem lies with how the file content is being passed to the SP HTTP action. Every time I do this, the file gets created but the file (PDF and Office documents specifically) is corrupted.
Tyler says
Hello Paulie,
Thanks for the demos and great tool with both the batch delete and batch create. I’m using batch delete to reduce the actions in a lot of flows that delete old data on a schedule, but I have a flow where I actually want to use both batch create and batch delete because I need to archive data before deleting it.
Unfortunately I keep getting the following error:
HTTP/1.1 400 Bad Request
CONTENT-TYPE: application/json;odata=verbose;charset=utf-8
{"error":{"code":"-1, Microsoft.SharePoint.Client.InvalidClientQueryException","message":{"lang":"en-US","value":"A type named 'SP.Data.OlderRecords_Archive_and_MMListItem' could not be resolved by the model. When a model is available, each type name must resolve to a valid type."}}}
–batchresponse_8a3589a4-81ec-46a9-a2d8-060b2a09a147–
I suspect one of two things, either there’s something wrong with the list name or there’s something wrong with the mapping of values in the select action.
I checked the name of the site and list and they are both good. Is it possible for SharePoint to have a different name for a list if I recently changed the name?
I checked the value mapping in the select action. I made sure all the values are the correct type and this shouldn’t be difficult because I’m creating in an archive list that is a copy of the original list where the Get items action is pulling from. I also tried a couple variations on the choice column names, like making sure the Status Value value is labeled with the Status/Value label it’s given in the original Get items action.
But I’m still getting the same error. Do you have a better idea what may be wrong?
Thank You,
Paulie says
I have no idea what the error means really. If you exclude the status column completely does it then work?
Tyler says
I figured out what the specific issue is.
I was able to get this to work on a test list on a different SharePoint site in the collection with a siteAddress set to “https://metropolitanministries.sharepoint.com/sites/ProcessImprovementTeam/”. The flow just won’t work when I try to use the site that is the homepage for the SharePoint collection where the siteAddress is “https://metropolitanministries.sharepoint.com/”.
I don’t know yet if there is a way to alter the flow to get it to work with the different siteAddress structure of the homepage.
GG says
Works great, but underscores in SP list names should be replaced as well else you get the type resolution error from comments above. This worked for me:
json(concat('{"type":"SP.Data.', replace(replace(outputs('settings_2')['listName'], ' ', '_x0020_'), '_', '_x005f_'), 'ListItem"}'))
AK says
Hi Paul,
This is awesome. One query I have is “Can we set the item permission when bulk creating items. Like give edit access to a particular email I’d.”
Or maybe is there a similar way to bulk update the list item permissions.
Thank you so much as always.
Regards,
AK
AK says
Hi Paul, This is awesome. One query I have is “Can we set the item permission when bulk creating items. Like give edit access to a particular email I’d.”
Or maybe is there a similar way to bulk update the list item permissions.
Thank you so much as always.
Regards,
AK
AK says
Hi Paul,
This is awesome. One query I have is “Can we set the item permission when bulk creating items. Like give edit access to a particular email I’d.”
Or maybe is there a similar way to bulk update the list item permissions.
Thank you so much as always.
Regards, AK
Paul Graham says
You have changed my life and the way I use Power Automate. I am very very grateful x
Paulie says
Paul – Glad to be providing a life changing service!
john good says
Thanks for these posts! I was able to use both the delete list and create list flows with SQL Server connections. Went from 3 hours to 8 minutes.
Paulie says
Wow – 3 hours to 8 minutes is a seriously good result!
Shahar says
Hey Paulie,
Thanks for this great recourse, I got a question:
What if I want to let the user upload a file via Power Apps (or a form) and from there dynamically “translate” the Excel file to a SharePoint list, would that be possible with this method?
Amit Lohogaonkar says
Great! Can this be used for updating files in different libraries on the same site?
X.S says
I am trying to import 9,725 items from SQL to SharePoint but I am having two issues: the final count of items is 8,632, and although the flow apparently runs, on the SendBatch step I am getting this error:
{
“status”: 400,
“message”: “The maximum number of bytes allowed to be read from the stream has been exceeded. After the last read operation, a total of 1055925 bytes has been read from the stream; however a maximum of 1048576 bytes is allowed.\r\nclientRequestId: ee1b1838-7a91-47fe-bbf5-3c683d9038f9\r\nserviceRequestId: 32200ba0-4073-1000-5c68-3c147bfbadd6”,
“source”: “https://xxsite.sharepoint.com/sites/xxxx/_api/$batch”,
“errors”: [
“-1”,
“Microsoft.Data.OData.ODataException”
]
}
Any suggestions?
Ha Le says
Got it to work. Went from ~1 h 50 min to around 16 minutes… biggest time saver ever!
Jesslyn says
Thank you so much for both the Delete list and Update list flows. We are migrating from SharePoint server where our large lists worked, but not so much in SharePoint Online. These flows are lifesavers.
zo says
Running into the same issue X.S. is, anyone have a solution?
tahsin says
Hello,
I’m trying to import example List with 3 Rows.
ListName is = testBatchImport
__metadata = json(concat('{"type":"SP.Data.', replace(outputs('settings')['listName'], ' ', '_x0020_'), 'ListItem"}'))
I got following Bad Request:
–batchresponse_46e39e77-1f85-42de-8096-5c0fbfd87563
Content-Type: application/http
Content-Transfer-Encoding: binary
HTTP/1.1 400 Bad Request
CONTENT-TYPE: application/json;odata=verbose;charset=utf-8
{“error”:{“code”:”-1, Microsoft.SharePoint.Client.InvalidClientQueryException”,”message”:{“lang”:”de-DE”,”value”:”Ein Typ mit dem Namen \”SP.Data.testBatchImportListItem\” konnte vom Modell nicht aufgel\u00f6st werden. Wenn ein Modell verf\u00fcgbar ist, muss jeder Typname in einen g\u00fcltigen Typ aufgel\u00f6st werden k\u00f6nnen.”}}}
–batchresponse_46e39e77-1f85-42de-8096-5c0fbfd87563
Content-Type: application/http
Content-Transfer-Encoding: binary
Any suggestions?
zo says
@tahsin, capitalize the ‘t’ in your ListEntityType path to read SP.Data.TestBatchImportListItem
tahsin says
@zo, I capitalized the 't' in my ListEntityType path but am still getting a bad request.
–batch_e10aee85-3362-44e0-9fb1-7ecd3d917e16
Content-Type: multipart/mixed; boundary=”changeset_29189f5f-ccfc-41ce-9161-f4d0d3ae7798″
Content-Length:
Content-Transfer-Encoding: binary
–changeset_29189f5f-ccfc-41ce-9161-f4d0d3ae7798
Content-Type: application/http
Content-Transfer-Encoding: binary
POST https://arfmann.sharepoint.com/sites/IFARotorionDigitalerSCMKollege/_api/web/lists/getByTitle('TestBatchImport')/items HTTP/1.1
Content-Type: application/json;odata=verbose
{“__metadata”:{“type”:”SP.Data.TestBatchImportListItem”},”A”:”1″,”B”:”1″,”C”:”1000″}
#.2 row
#.3 row
–batchresponse_22ed63a5-7aa4-4e37-8deb-87fba8e17e0b
Content-Type: application/http
Content-Transfer-Encoding: binary
HTTP/1.1 400 Bad Request
CONTENT-TYPE: application/json;odata=verbose;charset=utf-8
{“error”:{“code”:”-1, Microsoft.SharePoint.Client.InvalidClientQueryException”,”message”:{“lang”:”de-DE”,”value”:”Die Eigenschaft \”A\” ist f\u00fcr den Typ \”SP.Data.TestBatchImportListItem\” nicht vorhanden. Stellen Sie sicher, dass nur Eigenschaftennamen verwendet werden, die im Typ definiert sind.”}}}
–batchresponse_22ed63a5-7aa4-4e37-8deb-87fba8e17e0b
Content-Type: application/http
Content-Transfer-Encoding: binary
Max1981 says
Hello together, hello Paul,
First: Paul, thanks a lot for this post. I saw it last year but I didn't have the time to rebuild my process. Today I finally used your “template” and the time saving is great.
I’m using the PowerPlatform/PowerAppsForAdmin-Management connectors to save information in SharePoint. Unfortunately, the batch runs into in error if I’m using batches in size 1000:
Content-Type: application/http\r\nContent-Transfer-Encoding: binary\r\n\r\nHTTP/1.1 500 Internal Server Error\r\nCONTENT-TYPE: application/json;odata=verbose;charset=utf-8\r\n\r\n{\”error\”:{\”code\”:\”-2146233083, Microsoft.SharePoint.Client.ClientServiceTimeoutException\”,\”message\”:{\”lang\”:\”en-US\”,\”value\”:\”The operation has timed out.\”}}}\r\n–batchresponse_d404c9a9-b2c5-48d8-ae29-41ba2a92c4b9–\r\n”
So I reduced it to the size 200 and it works fine now. So I will play around with it and also try to get information if the response has an error in it.
Maybe this information is helpful for someone.
Jesslyn says
Thank you again for this flow. I am having success on one list, but get this error:
A type named 'SP.Data.IPT_x0020_PE/VC_x0020_ListListItem' could not be resolved by the model. When a model is available, each type name must resolve to a valid type.
on another list. I can’t for the life of me figure out what is going on.
??????? ???????? says
Hello Paul, I am working on a project as a junior and I wanted to try your method, but it didn't work out. The flow ran successfully, but there's no change in my list, the records never show up. I've already checked everything and I can't find the issue (( please help me, I'd appreciate it. In my Excel file I have 15,000 records, but 'Do until' has 3 steps (variable LoopControl: 1000 in the first step, 24 in the second, then 0). I changed Settings and GenerateSPData (besides __metadata / json(concat('{"type":"SP.Data.', replace(outputs('settings')['listName'], ' ', '_x0020_'), 'ListItem"}'))
Jesslyn says
I thought I would follow up on the errors that I got in case it would be of some help to others. It seems that the code does not like certain symbols in the column names. I had to replace both ‘/’ and ‘:’. I was having inconsistent results, either those columns were returning null values or the code threw errors. Since the SharePoint list displayed those symbols, it really didn’t matter what the Excel file shows (for us) so we’re good
Vitali Varabyeu says
I don't have any spaces or similar characters in my column names.
Josh Butcher says
That's a brilliant solution! If I have a source Excel file with up to 10k rows, does this solution handle it as-is, or should I loop the List Rows action + this batch create (e.g. 1k items)? How did you get to this answer, really?
Paulie says
The example flow in the post should do exactly what you want. Bring in all 10k rows from Excel in one action and then use a do until loop to create the items in batches of 1,000. It does depend how many columns there are in your source Excel file though. If it is very wide you might need to reduce the batch size a little bit to prevent timeouts.
Alex L. says
At the SendBatch action, I got this error message: "The maximum number of bytes allowed to be read from the stream has been exceeded. After the last read operation, a total of 1055877 bytes has been read from the stream; however a maximum of 1048576 bytes is allowed. (…)"
Any idea of how to fix this? My excel file only has about 7K rows.
Paulie says
It must be the number of columns pushing you over the size limit then. If you look in the settings action there is a batch size setting. Try reducing it to 750 and see if it works then.
Alex L. says
@Paulie, thank you so much for writing this article and for your advice about reducing the batch size. I don't get the over-the-limit error message anymore, and my flow takes 8 min 20 s for ~7.6k rows.
I also want to thank @Jesslyn for recommending to remove symbols from the column titles.
Once again, thank you very much!
Valdeir Barbosa says
Regards Paul, thank you for delivering this content, really appreciate it!!
When I try to run my flow I get the message "The 'from' property value in the 'select' action inputs is of type 'Object'. The value must be an array." I started learning batch methods a few days ago; please can you help me figure out where I went wrong?
Avik says
Hi Paul,
Thank you as always.
I wanted to ask how we can batch create items in a SharePoint list where one column is a "people" column and the Excel file has a column with email IDs.
Can you guide me on bulk updating this "people" column?
Regards,
Avik
Gianluca M. says
Very cool. Should also reduce the number of API calls, right? Is it 1 API call for each 1’000 items batch? That would be an ENORMOUS advantage, apart from being fast!
Richard Davis says
Great post. I am trying to apply this method to another upload process but am getting this error message:
{“error”:{“code”:”-1, Microsoft.SharePoint.Client.InvalidClientQueryException”,”message”:{“lang”:”en-US”,”value”:”A type named ‘SP.Data.UserApplicationAccess_TestListItem’ could not be resolved by the model. When a model is available, each type name must resolve to a valid type.”}}}
I am a newb so not exactly sure where I went wrong, Any help or direction would be greatly appreciated. Thanks
Richard Davis says
Follow-up: I found the issue, replacing the underscore… something so simple!
David says
Hi Paul,
This is really a great time-saver, thank you so much. I attempted to use the “My Clipboard” option to paste your code, but for some reason, it doesn’t provide an action/operation choice to paste to. Anyway, I was able to get it to work by reviewing the “peek code” with your code. The Flow only added two records to my Sharepoint list. I looked at the “SendBatch” operation output and it shows the following errors for all the other records to be loaded (998): Invalid JSON. More than one value was found at the root of the JSON content. JSON content can only have one value at the root level, which is an array, an object or a primitive value.” Any ideas on what is causing this? Thank you,
David says
Hi Paul,
I realized my error…it was adding an “O” instead of “0” in the %0A batchData input. What was taking 1 to 2 hours is now down to 10 minutes…amazing!
Paulie says
Well done David – yes, it's a great way to boost performance.
Will says
Thank You So Much! This was very helpful and what I needed for a list I need to delete and reupload daily but the calls were too high.
I'm not an IT person and am just learning these flows for work. It took me hours trying to get a batch create to work, so I gave up. Then I saw the batch update, whose flow had some extra details that allowed me to figure out how this works. I got the flow working, and I think the notes below could make it easier for others to use.
Issue: Sharepoint List Display Name vs Web Address Name
The list name from Settings is used in multiple places; however, the formatting has to be adjusted based on where it is used. In the batchTemplate the SP list name needs to be the regular display name. In the GenerateSPData __metadata it needs to be the web address name. OP tried to help the __metadata field by adding some replace values for underscores, but that only works if the name never changes.
If a SharePoint list is created with the initial name "InitialSharePointName" and then the display name is changed to "New Display Name", it will never call the correct web address, since the web address doesn't change from the initial name, just like column names.
Solution to the first issue: add a second row in the Settings and differentiate the SharePoint display name and the SharePoint web address name.
Original Settings Format:
{
"siteAddress": "https://OrgName.sharepoint.com/sites/SiteName/",
"listName": "ExampleListName",
"batchSize": 1000
}
New Settings Format:
{
"siteAddress": "https://OrgName.sharepoint.com/sites/SiteName/",
"SPListDisplayTitleName": "(Copy SharePoint Name from List Settings)",
"SPListWebAddressName": "(Copy the list name section from the Web Address in List Settings)",
"batchSize": 1000
}
After changing the settings you have to go into where they are used and update the expressions.
1. Batch Template: change the getByTitle output in the PATCH statement to: PATCH @{outputs('settings')['siteAddress']}_api/web/lists/getByTitle('@{outputs('settings')['SPListDisplayTitleName']}')/items(|ID|) HTTP/1.1
2. Generate SP Data: change the __metadata value field to contain json(concat('{"type":"SP.Data.', replace(replace(replace(outputs('settings')['SPListWebAddressName'], '_', '_x005f_'), ' ', '_x0020_'), '-', '_x002d_'), 'ListItem"}'))
I hope this helps some people because this flow really helped me get what I needed and saved a massive amount of calls.
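Will's triple replace() above encodes the characters SharePoint escapes when it builds internal type names. As a sanity check, the same encoding can be mirrored outside Power Automate; the sketch below is a hypothetical Python equivalent (the helper name is made up, the encoding pairs come straight from the expression above):

```python
def sp_type_name(list_web_address_name: str) -> str:
    """Build the SP.Data.<Name>ListItem type name used in __metadata,
    encoding underscore, space, and hyphen the way SharePoint does."""
    encoded = (list_web_address_name
               .replace("_", "_x005f_")   # underscore first, same order as the expression
               .replace(" ", "_x0020_")   # then spaces
               .replace("-", "_x002d_"))  # then hyphens
    return f"SP.Data.{encoded}ListItem"

print(sp_type_name("Fleet App"))  # -> SP.Data.Fleet_x0020_AppListItem
print(sp_type_name("My_List"))    # -> SP.Data.My_x005f_ListListItem
```

Replacing the underscore first matters: the later replacements insert underscores of their own, which must not be re-encoded.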
Paulie says
Nice one William – thank you for the useful comments too.
Mark says
Hi Paul,
I love your blog; you have helped me (without knowing) so many times before, and now I'm back to try my hand at your batch create flow.
I have chosen the threaded version of the flow for performance reasons.
When I replace the Excel fields using the dynamic pick list at the Select action called 'GenerateSPData', despite having a unique key/value pair in the dataset,
Power Automate is 'helpfully' creating an Apply to each loop. I could create a 'first' expression to get around this, but wonder if there is something that I am missing?
Spoorthy says
Hey Paul! Thanks for the blog! It's great! I ran the Power Automate flow and it worked perfectly fine. But I have done the same in Logic Apps and I get this error:
{"code": "-1, Microsoft.SharePoint.Client.InvalidClientQueryException",
"message": {
"lang": "en-US",
"value": "Invalid request."
}}
Output of Batch:
--batchresponse_14847879-408c-4477-a6f7-279c15af55ae
Content-Type: application/http
Content-Transfer-Encoding: binary
HTTP/1.1 400 Bad Request
CONTENT-TYPE: application/json;odata=verbose;charset=utf-8
{"error":{"code":"-1, Microsoft.SharePoint.Client.InvalidClientQueryException","message":{"lang":"en-US","value":"Invalid request."}}}
--batchresponse_14847879-408c-4477-a6f7-279c15af55ae
Content-Type: application/http
Content-Transfer-Encoding: binary
Any Idea why i get this issue?
Oliver says
Hello! This is fantastic, and i’ve been able to implement the ‘delete batch’ function with great results.
I have a question about the create though – will this work if you pass in JSON as the data source? I tried and ran into some errors when it came to the creating list items.
My data source pathway is: .CSV file > parse to JSON > currently I have it writing line by line to a Sharepoint list (roughly 3000 entries, takes 45-75 minutes), I would love to do this in a batch but can’t get it to work!
Paulie says
You coming to Scottish summit this weekend? If so I will help you in person?
Oliver says
I’m not coming unfortunately!
Yi says
I don't know why it does not generate output. The flow runs successfully, but it does not create items in the list.
GOP says
Hi, many thanks for this article.
It is working on my side, except for lookup column.
I tried many ways to construct the JSON sent to HTTP, and none accept my formatting for a lookup column:
{ "formValues":
[{"FieldName": "TheLookupFieldName", "FieldValue": "TheValue"}], "bNewDocumentUpdate": true }
If you have an idea….
(Batch insert using rest with lookup columns)
Thx
Robin Roy says
Hi,
Can you tell me how we can populate a person or group column using batch update?
Thanks.
Leonardo says
Thanks for the content.
Here the flow has no errors. But when executing no lines are added in sharepoint. When I go to check the steps after running, the “results” step gives the error:
Unable to process template language expressions in action 'Results' inputs at line '0' and column '0': 'The template language expression 'base64ToString(body('sendBatch')['$content'])' cannot be evaluated because property '$content' cannot be selected. Property selection is not supported on values of type 'String'. Please see https://aka.ms/logicexpressions for usage details.'.
How can I solve this problem?
Edward says
Does it work for lookup and choice fields?
Pam says
Hi Paulie,
Thank you for this article, it has been very helpful.
I have tried to implement this script for a flow I am building for my organization, but I keep encountering a problem at the GenerateSPData item. Whenever I try mapping the columns, I keep getting an "Apply to each".
The only difference is that I am using a SQL Server data source, and all my columns are string data type.
I am new to Power Automate and trying to learn as I progress, but this has been a thorn in my flesh for a few days now. If anyone can help with a solution, I'd be grateful, as I can't find a single case where SQL Server is used for a bulk update of a SharePoint list.
Paulie says
Happy to help you out, can you use the get in touch form or connect on LinkedIn please?
Kate says
Hi Paul,
Thanks for your article. It's really helpful for a new Power Automate developer like me.
I was trying to post multiple items to a SharePoint list, and the flow ran successfully.
But the items were not created in the SharePoint list.
I ran the send batch action with the body in another flow, but got this error message:
"A supported MIME type could not be found that matches the content type of the response. None of the supported type(s) 'multipart/mixed' matches the content type 'application/json; charset=utf-8'."
Is it possible to give me some hint?
Fernando says
Hi Paul,
I tested the flow and it works, but it only creates 5,000 items (out of 7k).
What could be the problem?
Thanks for your help.
Dalian says
Hello,
I tried both methods (threaded and non-threaded); all steps are green, but inside the 'Results' step I get
HTTP/1.1 400 Bad Request
{"error":{"code":"-1, Microsoft.SharePoint.Client.InvalidClientQueryException","message":{"lang":"en-US","value":"A type named 'SP.Data.Duplicate_Reference_just_testListItem' could not be resolved by the model. When a model is available, each type name must resolve to a valid type."}}}
Dalian says
Underscore was the issue in the list name for me.
json(concat('{"type":"SP.Data.', replace(outputs('settings')['listName'], '_', '_x005f_'), 'ListItem"}'))
Aaron says
Has anyone managed to get past the issue where, when changing the Select action GenerateSPData to draw from your own data source, Power Automate adds an 'Apply to each' that cannot be deleted?
Running the flow with the Apply to each is (I think) causing the error 'The maximum number of bytes allowed to be read from the stream has been exceeded.'
Ali says
Hi Paul,
First of all, thank you for providing this amazing workaround; it cut my time from 3 hours to 15 minutes.
When preparing Excel data in GenerateSPData, how can we send a hyperlink? What should we put in @item…?
Peter says
Done! After a few hours looking for a stupid detail in my target list, it finally works! My monthly import is now down from 24h+ (because of too many API calls) to a few minutes (still optimizing).
Thank you very much!
PS: I had error code 400 (I think because of a "t" instead of a "T") and 500 "SharePoint.SPListDataValidationException" because of a column with conditions, which I could remove.
Terri says
This will be really helpful if I managed to get it to work. I downloaded and imported your solution to my workflow. However, I keep getting this message below. The workflow runs successfully but it doesn’t create the item.
"body": {
  "error": {
    "code": "-1, Microsoft.SharePoint.Client.InvalidClientQueryException",
    "message": {
      "lang": "en-US",
      "value": "An unexpected 'PrimitiveValue' node was found when reading from the JSON reader. A 'StartObject' node was expected."
    }
  }
},
"statusCode": 400
Can you help me out?
Ivo says
Hello! I’m having this issue when trying to send batch to the list:
"body": {
  "error": {
    "code": "-1, Microsoft.SharePoint.Client.InvalidClientQueryException",
    "message": {
      "lang": "es-ES",
      "value": "A type named 'SP.Data.Proyectos_x0020_AsignacionesListItem' could not be resolved by the model. When a model is available, each type name must resolve to a valid type."
    }
  }
}
Could you please help me?
Thanks!
Seweryn says
Hi, thanks for all of this; it really goes much faster. But just a question: I tried with an Excel file and it only imports 568 records out of 2,000… The flow takes 6 minutes to end, with an error, and not all records have been imported. Any idea?
Thanks again!
Udula Sachithra Jayawardena says
I'm utterly amazed by how speedy it is. A few seconds, and I have 1,000 records in a SharePoint list. I encountered multiple errors and managed to resolve them all. Now I'm trying to run a load test by uploading 600,000 records to see how it fares. But to my dismay, the flow succeeds and only uploads 791 records. What could possibly be the reason?
Mariette says
Hi Paulie,
Thank you for this, very helpful. I am not receiving any errors, but I only retrieve 1024 rows (4 columns). The pagination for List Rows present in a table is switched on with value 1000. Do I have to implement a Loop Until or do you have any other suggestions? Thank you
Matt says
Is anyone else having an issue with it not loading all of the rows? For instance, I have run it several times on different lists of data, and on the last one it loaded only 46 of the 50 rows, with no reason as to why it left off the others.
vitss says
Excellent post!!!
Like some others, I received an error:
{"error":{"code":"-1, Microsoft.SharePoint.Client.InvalidClientQueryException","message":
{"lang":"nl-NL","value":"A type named 'SP.Data.ListItem' could not be resolved by the model. When a model is available,
each type name must resolve to a valid type."}}}
Solved that by making a new list without special characters like underscores;
suddenly it worked like a charm.
Paulie says
Thanks for the kind words, glad you found it useful.
Michael says
Hi Paulie,
Many thanks for this excellent post!
I have an error when I want to create items with lookup columns.
How can we populate a lookup column with this method ?
Thanks in advance !
Steven Vits says
Hello,
I combined some posts in one flow.
I read a CSV file and convert it to a JSON file.
I delete all items in an SP list and repopulate it with the renewed data from the CSV.
Finally I post a batch of all the items in the CSV.
As a check I count the items in the CSV and the items in the SP list (using @{body('Send_an_HTTP_request_to_SharePoint')?['d']?['ItemCount']}).
Unfortunately the item counts are not the same: 2,240 for the CSV and 2,217 in SP.
Do you have any suggestions to troubleshoot?
Ahemed says
Hi, I want to map Excel data with 3 email IDs to a SharePoint multi-person column. How do we do this?
Mridul says
I get the below error:
{
“status”: 404,
“message”: “clientRequestId: 2b5d1088-ac41-4c42-ad26-8c6a4f523433\r\nserviceRequestId: 339dafa0-5096-3000-976a-62fe73368fb5”,
“source”: “https://cnxmail.sharepoint.com/sites/OperationsDelivery/MY_KL_PMM/Lists/_api/$batch”,
“errors”: []
}
I can see the inputs are gathered, but the SendBatch section doesn’t seem to be working. Any suggestions?
Maroun says
If I created a folder in the list and I want to add the items inside this folder, not in the list directly, how can I do that?
javier reyes says
Hi, if I want to do this using Power Apps as a trigger, how can I reduce the time to under 3 min? Excel does not allow me to use 2 conditions on the filter query.
Andrew BRA says
Excellent post; I have already used it in several flows and it worked well.
But I tried it with another flow which uses Lookup and Choice columns and it does not work. Has someone found a solution?
Miguel says
Hi,
I'm also not able to run the code properly.
I've noticed the code is not the same as in the video. For example, in GenerateSPData, the expression of "__metadata" is different (and missing a '.').
I don't get any error, but SP does not get populated, so I know the problem could be in the GenerateSPData section; trying my best to find the needle in the haystack. Thank you.
StephCORKING says
This is amazing! Thank you very much for sharing
RODRIGO PROCÓPIO says
First of all, Thanks for this amazing post!!!
I have a question: Can we add an ID row in this flow to use all the data in power apps?
Best regards, Paul
Paulie says
You can – but you will get an ID row from SharePoint anyway?
RODRIGO PROCÓPIO says
Not exactly, I need an ID row that is not the automatic one from SP, to use in Power Apps and avoid delegation problems. Do you have any suggestions?
Thanks again for the help! :)
Paulie says
Yep, use the range expression – or the guid expression.
RODRIGO PROCÓPIO says
Nice, I'll study these expressions. Thank you so much Paul.
Andre says
Hi Mr. Rodrigo, for ID columns you should use ColumnNameId. For example, if the column's name is City, then you create a mapped column CityId and set its value from Excel, another file, another list, or a variable. I'm not sure if I understood your question correctly.
—-
RODRIGO PROCÓPIO says
MAY 29, 2023 AT 5:51 PM
Not exactly, I need an ID row that is not the automatic one from SP, to use in Power Apps and avoid delegation problems. Do you have any suggestions?
Thanks again for the help! :)
Andrew BRA says
I solved it by using the same column name with ID appended, like ColumnNameID, and mapping its value from an Excel file, list, variable, or another file.
:)
——
Andrew BRA says
MAY 22, 2023 AT 7:41 PM
Excellent post; I have already used it in several flows and it worked well.
But I tried it with another flow which uses Lookup and Choice columns and it does not work. Has someone found a solution?
Michael says
This is great, but is there a way to filter the data first before loading it? I have a data table of 16K+ rows. I currently filter out data I don't need loaded to the SharePoint list during the "Apply to each" stage using controls. There are a number of criteria, so I was hoping there would be an easy way to add a complex filter of multiple criteria before loading. Or am I better off just loading it all and deleting what I don't need?
Paulie says
I would drop the Apply to each step. Just use a Filter Array action (or use an OData query on the Get items action to retrieve only the records that you actually want).
Michael says
Hi, I am using a Filter Array to filter out the data that I don't want to create. I then use a Parse JSON so I can select only the fields I want in my SharePoint list; however, this seems to be an issue when I get to the GenerateSPData step. I get this message: Unable to process template language expressions in action 'GenerateSPData' inputs at line '0' and column '0': 'The template language expression 'body('Parse_JSON_2')?['body']' cannot be evaluated because property 'body' cannot be selected. Array elements can only be selected using an integer index. Please see https://aka.ms/logicexpressions for usage details.'. Any thoughts?
Mick says
Hi Paulie, I have a flow that uses your batch delete process first to clear out the existing SharePoint list items, and it works great. I then use the above non-threaded create list flow and encounter an issue: HTTP/1.1 400 Bad Request
CONTENT-TYPE: application/json;odata=verbose;charset=utf-8
{"error":{"code":"-1, Microsoft.SharePoint.Client.InvalidClientQueryException","message":{"lang":"en-US","value":"A type named 'SP.Data.Fleet_AppListItem' could not be resolved by the model. When a model is available, each type name must resolve to a valid type."}}}
--batchresponse_38e8fe99-0d04-4607-b951-69ef59b802a1
Content-Type: application/http
Content-Transfer-Encoding: binary
The interesting part is that in the Settings step the site address and the list name are the exact same. the only difference is that the Settings of the creation flow, it has a line for batch size.
Any ideas where the issue could be?
Udula Jayawardena says
I had the same exact issue with an SP list while following Paulie’s method. You need to get the internal name of the SP list, not the one in settings. Following blog explains how to get the internal name.
https://vblogs.in/solved-a-type-named-sp-data-listname-could-not-be-resolved-by-the-model/
Melanie says
If anyone is having an issue using this with a "Person" column (as I was), you can use the "Id" of the person instead of the name or email address.
You have to rename your SharePoint column in the "GenerateSPData" to suffix it with "Id". If your column name is "Owner", change it to "OwnerId". Then pass through the integer Id for the person, and it will work.
I had to do a bit of extra work to obtain the Id for my persons by using the connector "Send an HTTP request to SharePoint" with this as the Uri:
_api/web/SiteUsers/getByEmail('')?$select=Id
Put the user's email between the quotes.
Hope this helps someone!
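Melanie's two steps, building the getByEmail URI and writing the Id-suffixed field, can be sketched in a few lines. A hypothetical Python illustration (the helper names are made up; it only builds the URI and payload fragment, it does not perform the authenticated request):

```python
def get_by_email_uri(email: str) -> str:
    """REST URI that returns the numeric site user Id for an email address."""
    return f"_api/web/SiteUsers/getByEmail('{email}')?$select=Id"

def person_field(column_name: str, user_id: int) -> dict:
    """Person columns are written by appending 'Id' to the column name."""
    return {f"{column_name}Id": user_id}

print(get_by_email_uri("jane@contoso.com"))
# -> _api/web/SiteUsers/getByEmail('jane@contoso.com')?$select=Id
print(person_field("Owner", 12))
# -> {'OwnerId': 12}
```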
Prithwis Roy says
This is a good solution for optimizing a flow; I implemented this logic and got a huge duration difference. But I'm stuck badly at one step: I'm not able to send data into a person-type column. Can anyone help me with this? How do I send data to a person-type column with this batch send method?
Scott says
How would you modify this to update existing records?
I have an array built that looks like the below, and need to pass this to SP to update the indicated records.
[ID] matches the IDs in my SP list. I want to update the [DateForPC] field, and [PurchaseOrderMatch] needs to be ignored, as it's not in the SP list being updated.
[
{
"ID": 1078,
"PurchaseOrderMatch": true,
"DateForPC": "2023-04-28"
},
{
"ID": 1103,
"PurchaseOrderMatch": true,
"DateForPC": "2021-06-30"
}
]
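One way to approach Scott's question: batch updates work like batch creates, except each sub-request is a PATCH (MERGE) against the item's Id with an If-Match header, and unwanted fields are simply left out of the payload. The sketch below is a hypothetical Python illustration of building one such sub-request from the array above (the site URL, list name, and header choices are assumptions, not taken from the post):

```python
import json

# Item array from the comment above; "PurchaseOrderMatch" is skipped
# because it is not a column in the target list.
items = [
    {"ID": 1078, "PurchaseOrderMatch": True, "DateForPC": "2023-04-28"},
    {"ID": 1103, "PurchaseOrderMatch": True, "DateForPC": "2021-06-30"},
]

def merge_request(site: str, list_name: str, item: dict,
                  skip=("PurchaseOrderMatch",)) -> str:
    """Build one PATCH sub-request for a $batch changeset.
    Fields in `skip` (and the ID itself) are left out of the payload."""
    payload = {k: v for k, v in item.items() if k != "ID" and k not in skip}
    return (
        f"PATCH {site}_api/web/lists/getByTitle('{list_name}')/items({item['ID']}) HTTP/1.1\r\n"
        "Content-Type: application/json;odata=nometadata\r\n"
        "If-Match: *\r\n"  # update regardless of the item's current ETag
        "\r\n"
        + json.dumps(payload)
    )

print(merge_request("https://contoso.sharepoint.com/sites/Demo/", "MyList", items[0]))
```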
Renan Lira says
The flow works perfectly, but I ran into a problem with dates. I live in Brazil, and here we use the dd/mm/yyyy format. In my Excel database there were 2,000 items, and in the SharePoint list there were around 1,700. I found this strange, because the flow didn't return any errors. When I went to analyse the JSON, I realised that items with a date such as 31/05/2023 had not been read. This is because Flow converted my dates to mm/dd/yyyy, and obviously there is no month 31. This is very strange because, reading my JSON output, all the dates were in the correct dd/mm/yyyy format; but in the "Send Batch" action, when identifying the ID of these cases in the JSON, this message appeared: "value": "Cannot convert a primitive value to the expected type 'Edm.DateTime'. See the inner exception for more details." However, my flow still doesn't show any errors.
Could you please help me?
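One way around the problem Renan describes is to normalise dates to the unambiguous ISO 8601 form (yyyy-MM-dd), which Edm.DateTime accepts, before the batch body is built. A minimal Python sketch of the conversion:

```python
from datetime import datetime

def to_iso(date_ddmmyyyy: str) -> str:
    """Convert a dd/mm/yyyy string to ISO 8601 (yyyy-MM-dd)."""
    return datetime.strptime(date_ddmmyyyy, "%d/%m/%Y").strftime("%Y-%m-%d")

print(to_iso("31/05/2023"))  # -> 2023-05-31
```

Inside the flow itself the same normalisation could likely be done in the GenerateSPData step; the point is never to let an ambiguous dd/mm value reach SharePoint as-is.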
Raul says
Hello, great post. I have an issue: when the batch inserts, it does so with quotes in the SharePoint list, for example "Value". Do you know why this is happening?
Thanks.
Patrick says
Hi there. I've been using flows built with this logic for upwards of a year now with excellent results – until 3 weeks ago. Using a batch size of 1000, I've started having individual record creations/modifications timing out within a single Send an HTTP request to SharePoint block (NOT the entire action timing out; just individual items within the batch are returning statusCode 500). I've tried reducing the batch size, which has perhaps reduced the volume of timeouts, but I have not been able to completely eradicate the errors this way, even with the batch size as low as 80. So now, to minimise errors, I have reduced the speed of my flows by a factor of 10, if not more, and I am still experiencing errors.
Do you know if there is any way I can adjust the Send an HTTP request to SharePoint action – or any other action in this pattern – to handle these errors?
Jenny says
Hi, much appreciated for this post! I have encountered an error as below while trying, anyone has any idea? Thank you!
"A node of type 'StartArray' was read from the JSON reader when trying to read the start of an entry. A 'StartObject' node was expected."
ThaoVy Ha says
Hi- I am using this to update my SharePoint site list from an Excel sheet. However, I am getting an error message at the “Set Variable” step where it is saying that the “Loop Control” variable is not initialized. Any ideas on how to fix this? This is really useful btw!
Paulie says
You need to initialise the variable first before the scope
Jenny says
Hi, I have followed the YouTube video and guidance from the comment session and somehow the error has gone. Not too sure which step helped exactly but below are the changes made.
1. Changed the "__metadata" expression from json(concat('{"type":"SP.Data.', replace(outputs('settings')['listName'], ' ', '_x0020_'), 'ListItem"}')) to json(concat('{"type":"SP.Data.', outputs('settings')['listName'], 'ListItem"}')).
2. Enabled “Pagination” for “List rows present in a table”.
3. Followed the advice above to remove any symbol in the column names.
Once again, thanks Paulie for the post!
Paulie says
Well done! Did you get a good result?
Jay says
I was already using batch for creates and deletes, but the threaded version is a game changer. Outstanding!! :)
When I was looking at batch creating a person column in an SP list, it required the Id of the person (append Id to the name of your person column, then supply the Id), e.g.
Person column name: ThePerson
Create a person in batch:
“ThePersonId”: 12
I didn’t go down this route because I needed to get all of the Id’s for people in the SP tenant and then loop through them in order to create a nice batch list.
If anyone knows a way of batch creating a person column without using the person Id – that would be awesome!
Christian says
Hi!
After a few trials and errors I found out why my import only inserted 9,690 out of 10,497 rows. I tried to import rows where the numbers had a dot as thousands delimiter and a comma as decimal delimiter, which broke my column formatted as "Number" in the list. So a good starting point, if not all rows show up, is to convert all columns to text in your SharePoint list and see if that helps. :)
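Christian's delimiter issue, European-formatted numbers such as 1.234,56 breaking a Number column, can also be fixed by normalising the values before the batch is built. A minimal sketch, assuming dot-as-thousands and comma-as-decimal input:

```python
def to_number(text: str) -> float:
    """Convert '1.234,56' (dot thousands separator, comma decimal) to a float."""
    return float(text.replace(".", "").replace(",", "."))

print(to_number("1.234,56"))  # -> 1234.56
print(to_number("2.000"))     # -> 2000.0
```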
Jenny says
Yes Paulie, it’s creating 12 line items per second now!
Paulie says
Well done Jenny!
Mikael says
I took the time to recreate the flow block by block in order to fully understand its logic… and it works! I also integrated it into a child flow so that other (less expert) people can use it. I'll try to do the same with batch update and batch delete.
Thank you very much, I’ve learnt a lot
Paulie says
Glad it was helpful to you. It's a tricky flow for sure.
Akshay Manke says
Hello,
I would like to get the people picker field value from an Excel file. I am trying to do the steps mentioned by Melanie, as below, but I am missing something and the flow fails. Can anyone please help with the exact steps to get the username added?
"If anyone is having an issue using this with a "Person" column (as I was), you can use the "Id" of the person instead of the name or email address.
You have to rename your SharePoint column in the "GenerateSPData" to suffix it with "Id". If your column name is "Owner", change it to "OwnerId". Then pass through the integer Id for the Person, and it will work.
I had to do a bit of extra work to obtain the Id for my Persons by using the connector "Send an HTTP request to SharePoint" with this as the Uri:
_api/web/SiteUsers/getByEmail('')?$select=Id (put the user's email between the quotes).
Hope this helps someone!"
Renan Lira says
That's perfect, it works fine! But I have a question: is it possible to send a batch to SQL? I mean, basically change the action from "create an item" in SharePoint to creating a line in SQL?
Dhinesh Kumar Selvaraj says
Hi Paul, Thank you for the great demo with detailed instructions.
I am trying to update multiple entries in SharePoint using the same Send HTTP approach, but I am facing the below error in Send HTTP:
InvalidTemplate. Unable to process template language expressions in action 'SendBatch' inputs at line '0' and column '0': 'The template language expression 'parameters('$connections')['shared_sharepointonline']['connectionId']' cannot be evaluated because property 'shared_sharepointonline' doesn't exist.
I'm getting this error only when I use the batch process. Send HTTP works fine if I update records without the batch process. Any inputs or suggestions for this issue would be really helpful.
Callum says
Great tutorial, thank you Paul! Everything is working well, but has something changed in Power Automate or SharePoint? Because it is running correctly, but I'm not saving any time in comparison to the old (Apply to each) method… any ideas? Before it was 12 minutes; now it is also 12 minutes…
Edit: I have the same "problem" with your other "Power Automate Flow to batch delete SharePoint List Items"… deletion is unfortunately not speeding up.
Paulie says
Yes. I think something has changed. It's marginally quicker, but nothing like it was before. Perhaps so many people implemented this method that Microsoft changed something to limit the rate that SharePoint could operate at.
Jean-Nicolas Turcotte says
I tried to implement this (before I read your last reply that it's not faster).
I still want to use it, as it probably uses fewer API actions to run.
But I am unable to make it run.
I get a:
InvalidTemplate. Unable to process template language expressions in action 'SendBatch' inputs at line '0' and column '0': 'The template language expression 'json(decodeBase64(triggerOutputs().headers['X-MS-APIM-Tokens']))['$connections']['shared_sharepointonline_1']['connectionId']' cannot be evaluated because property 'X-MS-APIM-Tokens' doesn't exist, available properties are 'Accept-Language, User-Agent, x-ms-workflow-id, x-ms-workflow-version, x-ms-workflow-name, x-ms-workflow-system-id, x-ms-workflow-run-id, x-ms-workflow-run-tracking-id, x-ms-workflow-operation-name, x-ms-execution-location, x-ms-workflow-subscription-id, x-ms-workflow-resourcegroup-name, x-ms-tracking-id, x-ms-correlation-id, x-ms-client-request-id, x-ms-client-tracking-id, x-ms-action-tracking-id, x-ms-activity-vector, Content-Type, Content-Length'. Please see https://aka.ms/logicexpressions for usage details.'.
I believe it could be due to empty or blank values in my Excel table. When using "for each list rows" I added an if(empty…) function.
Like this: "field_1": "@if(empty(item()?['LOGCODE']), null, item()?['LOGCODE'])",
Any idea what I am doing wrong?
Sri Lakshmi says
Hi Paulie,
I am following your suggestion, but the below error is occurring at the 'Apply to each' in GenerateSPData.
The action ‘GenerateSPData’ was executed for ‘1024’ iterations resulting in an aggregated result size that exceeded the maximum value ‘209715200’ bytes allowed. Please reduce the number of iterations to ensure that the aggregated results size is less than ‘209715200’ bytes.
santosh says
Can we do this for a document library, creating folders? I have a scenario where we need to create 8,000 folders in bulk. Can you please help me change the properties so the HTTP action creates the folders in bulk?
Pritish says
Hey Paul, this flow will definitely be a life saver for me. Quick question: I have copied the flow and followed your instructions, however when I run it the flow only adds about 1,024 items and stops. I checked, and it seems the ‘List rows present in a table’ action only gets that many items and doesn’t repeat to get the complete 15,000 records from the mock-up sample file. Any suggestions?
Pritish says
Update: I increased the threshold limit to 5,000 and it loaded 5,120 items, but it still didn’t go through the entire Excel file of 15,000 records.
Paulie says
Sounds like you need to enable pagination on the action where you’re getting data from Excel.
Sri Lakshmi says
Hi Paulie, I found out why I was facing that error:
i.e. I was mapping the SP data directly from ‘List rows present in a table’ instead of using item()?[‘VolumeId’].
Just sharing, in case it is useful for beginners.
Sri Lakshmi says
Hi Paulie,
I am also facing the same issue: out of 6,532 records it is copying only 5,120.
I have enabled pagination and set the threshold to 5,000 in the ‘List rows present in a table’ settings.
What else do I need to do? Please advise.
Areeba says
Hi Paulie,
Thanks for the awesome post!
I tried to implement the flow with a SQL table. I added a Compose on the ‘Get rows’ action output, where I am getting all rows from SQL (42,500), but the flow is creating only 27,000 items in SharePoint. I turned on pagination and all the settings look good, as I am able to see all items in the Compose action. There are a limited number of columns in the table.
Does anyone have any idea what the possible reasons could be?
Srikanth M S says
Hi Paulie,
The flow runs successfully but no entries are being made in SharePoint. I am not able to find out what’s wrong. Any help would be appreciated.
Thanks
David Haygood says
Hi M S Srikanth,
I’m not Paulie, but I’m happy to help because Paulie has provided so many great solutions I have implemented in my work. Look at the “Results” output to see if there are any errors. Sometimes the name of a column in GenerateSPData doesn’t match the column names you have in the SharePoint list.
J.K. says
Hi Pauli,
I ran into an issue where I constantly received a “Bad Request” and found out that for the __metadata, the internal list name is required.
{
“siteAddress”: “https://accendo1.sharepoint.com/sites/PowerAutomateText/”,
“listName”: “My List Name”,
“listInternalName”: “MyListName”,
“batchSize”: 1000
}
and change to:
json(concat('{"type":"SP.Data.', outputs('settings')['listInternalName'], 'ListItem"}'))
However, if the internal list name has special characters or blanks, they need to be replaced:
” ” (blank) -> “_x0020_”, “_” (underscore) -> “_x005f_”, etc.
Example: “My List_Name” -> “My_x0020_List_x005f_Name”
Hope that helps!
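The replacement rule described above can be sketched as a small helper (Python for illustration; the function name is hypothetical, and only the two replacements mentioned in the comment are handled):

```python
def encode_internal_name(name: str) -> str:
    """Apply the escaping convention from the comment above:
    spaces become _x0020_ and underscores become _x005f_."""
    out = []
    for ch in name:
        if ch == " ":
            out.append("_x0020_")
        elif ch == "_":
            out.append("_x005f_")
        else:
            out.append(ch)
    return "".join(out)

print(encode_internal_name("My List_Name"))  # My_x0020_List_x005f_Name
```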
Deborah says
Thank you for your solutions – I have used the Batch Delete and it works perfectly.
However, when I try this solution to upload from an Excel file and map the fields, it automatically wraps that element in an Apply to Each.
Any idea why? Thank you!
Santhosh says
Excellent article, thank you so much.
Srikanth M S says
Hi Paulie,
The flow runs successfully but no entries are being made in SharePoint. When I checked the Results step, the following error was returned:
–batchresponse_a68d0afe-8837-4bce-a8b3-f25ea9df389c
Content-Type: application/http
Content-Transfer-Encoding: binary
HTTP/1.1 400 Bad Request
CONTENT-TYPE: application/json;odata=verbose;charset=utf-8
{“error”:{“code”:”-1, Microsoft.SharePoint.Client.InvalidClientQueryException”,”message”:{“lang”:”en-US”,”value”:”A type named ‘SP.Data.EmployeeDetailsListItem’ could not be resolved by the model. When a model is available, each type name must resolve to a valid type.”}}}
–batchresponse_a68d0afe-8837-4bce-a8b3-f25ea9df389c
Content-Type: application/http
Content-Transfer-Encoding: binary
Any help in resolving this would be greatly appreciated.
Thanks
Mohammed Merchant says
Thank you Paul. The flow worked and was much faster than the single create item technique. I used 1,500 items to test; it took 2:34 min and all items were created in the SP list. The count on the SP list column was stuck at 1,118, but when I scrolled all the way down, all 1,500 invoice items were there. I’ll re-run to double check. This will definitely help with getting data from ServiceNow, which is in the range of 10K records. Thank you again.
james penna says
Paul, when using batch delete it looks like SharePoint list items don’t end up in the recycle bin. Is there any way to make that happen?
sai says
Hi Paul, I hope you’re doing well!
Can you guide me on how to achieve the scenario below using batch processing in a Power Automate flow? I have thousands of files that need batch processing.
In Azure Functions I deployed code which takes files from the Azure storage account, converts multipage PDF documents to image files, and stores the converted images in a different folder path in the same storage account.
How can I do this batch processing with Power Automate? Please help me with the flow logic.
Craig says
Hi,
This is fantastic and I am looking forward to using it. The main issue I have is that I would like the batch to either update or insert depending on an index value.
I have a list of ~22,000 records which will be exported weekly.
Paulie says
I have a flow for that, but I haven’t published it as it was somewhat complex. It’s totally doable though.
Ryan says
Thank you so much for posting this. It has been working really well for me over the past few years. I have a workflow that takes about 22000 rows from a SQL database and posts them to SharePoint daily. However, recently, we have noticed that the Batch API is not getting all the rows to SharePoint. It varies, but sometimes it will be short by about 100 rows give or take. We started looking through the results for each step of the loop and found some Status Code 500 request timeouts. Any idea how we can avoid this? We already had our batch size trimmed down to 500, do we need to go smaller? Should we introduce a delay inside the loop to give the API some breathing room? Thanks in advance!