It's easy to develop Power Automate Flows that process a few items at a time, but processing hundreds or even thousands at once is much harder. We either end up with a Flow that is incredibly slow or one that fails outright with 429 "Too Many Requests" errors.
This is especially prevalent in Flows that work with SharePoint, for example, where rate limiting can be quite aggressive at times.
Rate limiting isn't a bug; it's a deliberate constraint protecting shared services from being overwhelmed. But the limits stack in non-obvious ways, with restrictions at the connector level, SharePoint service level, and Power Platform level all potentially affecting your flow.
These limits can also change in the backend without us being notified. They are the limits I found online, so I'll do my best to keep this article updated if something changes. If you see something that isn't correct, leave a comment or send me an email and I'll fix it.
I'll write an article about how to build Power Automate Flows so that you don't run into these issues at all (or at least less frequently), but today let's look at what the potential issues are.
The idea is to keep a running list of limitations, either ones that aren't explicitly documented or ones I found online, so that we have all the information in one place.
SharePoint Connector Limits
The SharePoint connector in Power Automate has its own throttling limits separate from SharePoint service limits. These are the limits you'll hit first in most scenarios.
600 API calls per connection per 60 seconds: This is the primary constraint. If your flow makes more than 600 calls to SharePoint actions within a minute, you'll receive HTTP 429 errors. The count is per connection, not per flow, so multiple flows using the same connection share this quota. Power Automate usually slows itself down so that you don't hit this cap, but the first thing to try is reducing the number of calls your flow makes.
Bandwidth considerations: Large file operations may hit separate bandwidth limits at the SharePoint service level (50 GB ingress / 100 GB egress per hour per user). While less common than API call limits, this matters for flows moving many large files. I have a related article on Handling Large Files in Power Automate.
When throttled, SharePoint returns an HTTP 429 response with a "Retry-After" header indicating how many seconds to wait before retrying. Power Automate's built-in retry policy handles this automatically in most cases, but understanding the limit helps you design flows that avoid throttling entirely.
So if your Flow starts running slowly, it may simply be respecting SharePoint's instructions: lowering the frequency of operations and waiting for the next retry window.
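To make the mechanics concrete, here's a minimal Python sketch of what honoring that header looks like if you call the SharePoint REST API directly (for example from a custom script rather than a flow). The URL, headers, and token are placeholders; inside Power Automate the built-in retry policy does the equivalent for you.

```python
import time
import requests

def sharepoint_get(url, headers, max_retries=5):
    """Call a SharePoint REST endpoint and honor 429 Retry-After responses."""
    for _ in range(max_retries):
        response = requests.get(url, headers=headers)
        if response.status_code != 429:
            response.raise_for_status()
            return response
        # SharePoint tells us how many seconds to back off before retrying.
        wait_seconds = int(response.headers.get("Retry-After", "30"))
        time.sleep(wait_seconds)
    raise RuntimeError("Still throttled after retrying")

# Hypothetical usage: read list items from a fictional site.
# response = sharepoint_get(
#     "https://contoso.sharepoint.com/sites/demo/_api/web/lists/getbytitle('Tasks')/items",
#     headers={
#         "Authorization": "Bearer <token>",
#         "Accept": "application/json;odata=nometadata",
#     },
# )
```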
SharePoint Service Limits: User-Level
Beyond connector limits, SharePoint Online itself imposes service-level throttling to protect the service. These limits apply to the user account making the requests.
3,000 requests per 5 minutes: A broader time window than the connector's 1-minute window, but still a constraint for bulk operations.
10 requests per second per user (delegated search): This limit applies specifically to search API calls when using delegated permissions. Other operations have different per-second limits.
50 GB ingress per hour / 100 GB egress per hour: File transfer limits that matter for flows moving large volumes of data.
These limits are per user account, so using multiple connections with different accounts can help distribute load. However, this approach requires careful coordination and adds complexity.
SharePoint Service Limits: Application-Level
SharePoint also applies tenant-wide throttling based on your organization's license count. These limits use a "resource unit" system where different operations consume different amounts.
Resource unit limits vary by time window and license count:
| Licenses | Per Minute | Per 5 Minutes | Per 24 Hours |
|---|---|---|---|
| 0-1,000 | 1,250 | 18,750 | 1,200,000 |
| 1,001-5,000 | 2,500 | 37,500 | 2,400,000 |
| 5,001-15,000 | 3,750 | 56,250 | 3,600,000 |
| 15,001-50,000 | 5,000 | 75,000 | 4,800,000 |
| 50,000+ | 6,250 | 93,750 | 6,000,000 |
Different operations consume different resource units:
| Operation | Resource Units |
|---|---|
| Single item query, download file | 1 |
| Multi-item query, create, update, delete, upload | 2 |
| Permission operations (including $expand=permissions) | 5 |
This system means that permission-heavy operations cost significantly more than simple reads. A flow that checks permissions on 100 items consumes 500 resource units, while reading those same 100 items costs only 100.
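If it helps to see the arithmetic, here's a small Python sketch that maps a hypothetical flow's mix of operations to resource units using the costs above and compares the total to the per-minute budget for a small tenant. The operation counts are made up for illustration.

```python
# Resource unit costs per operation, as listed in the table above.
RESOURCE_UNITS = {
    "single_item_query": 1,
    "file_download": 1,
    "multi_item_query": 2,
    "create_update_delete_upload": 2,
    "permission_operation": 5,
}

# Per-minute resource unit budgets by license band (from the table above).
PER_MINUTE_BUDGET = {
    "0-1,000": 1_250,
    "1,001-5,000": 2_500,
    "5,001-15,000": 3_750,
    "15,001-50,000": 5_000,
    "50,000+": 6_250,
}

# Hypothetical flow: read 100 items, update 100 items, check permissions on 100 items.
operations = {
    "single_item_query": 100,
    "create_update_delete_upload": 100,
    "permission_operation": 100,
}

total_units = sum(RESOURCE_UNITS[op] * count for op, count in operations.items())
print(f"Total resource units: {total_units}")  # 100*1 + 100*2 + 100*5 = 800

budget = PER_MINUTE_BUDGET["0-1,000"]
print(f"Share of a small tenant's per-minute budget: {total_units / budget:.0%}")  # 64%
```

With these made-up numbers, a single run of this modest flow already eats roughly two-thirds of a small tenant's per-minute allowance, and most of that comes from the permission checks.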
Power Platform Request Limits
Power Platform itself imposes daily request limits based on your licensing tier. These are separate from SharePoint limits and apply across all connectors.
| License | Official Limit | Transition Period Limit |
|---|---|---|
| Microsoft 365 / Free | 6,000/user/day | 10,000/flow/day |
| Power Automate Premium | 40,000/user/day | 200,000/flow/day |
| Power Automate Process (per-flow) | 250,000/license/day | 500,000/license/day |
Microsoft is currently in a transition period with more generous limits while organizations adapt. The stricter "official" limits will be enforced after the transition ends—check Microsoft's documentation for current enforcement dates.
Notice the "transition period". These figures come directly from Microsoft's site and can change without any warning, so it's possible they have already changed by the time you're reading this.
For light automation, even the official limits are generous. For bulk processing flows that run frequently, they become a real constraint.
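As a rough sanity check, here's a quick Python sketch that estimates a flow's daily request consumption and compares it with the official limits in the table above. The run counts and actions per item are assumptions, not measurements.

```python
# Official Power Platform request limits per day, from the table above.
DAILY_LIMITS = {
    "Microsoft 365 / Free": 6_000,
    "Power Automate Premium": 40_000,
    "Power Automate Process (per-flow)": 250_000,
}

# Hypothetical bulk-processing flow: 24 runs per day, each touching 500 items
# with 4 actions per item plus a little fixed overhead per run.
runs_per_day = 24
items_per_run = 500
actions_per_item = 4
overhead_actions_per_run = 10

daily_requests = runs_per_day * (items_per_run * actions_per_item + overhead_actions_per_run)
print(f"Estimated requests per day: {daily_requests:,}")  # 48,240

for license_name, limit in DAILY_LIMITS.items():
    status = "OK" if daily_requests <= limit else "over the limit"
    print(f"{license_name}: {status} ({daily_requests / limit:.0%} of {limit:,})")
```

In this example, the flow blows straight past the Microsoft 365 limit and even exceeds the Premium per-user limit, which is where a per-flow license or a redesign starts to look attractive.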
If you start hitting these limits, give me a call and I'll try to help. Usually, for a lot less money, we can build something lighter and much more performant, adapted to your needs. Just ask me for an estimate or contact me directly.
Organizations processing large data volumes may need to purchase additional request capacity or optimize flows to use fewer actions, but extra capacity can quickly become too expensive.
Concurrency and Throttling in Loops
A common cause of throttling is running SharePoint actions inside "Apply to each" loops with concurrency enabled. By default, "Apply to each" loops run sequentially—one item at a time. However, many users enable concurrency in Settings to speed up their flows, which can quickly trigger throttling.
You can read about concurrency control and how it works here, in case you're not familiar with it.
When you enable concurrency, Power Automate processes multiple items simultaneously. The degree of parallelism ranges from 1 to 50, with 20 as the default when first enabled.
Calculating Parallelism × SharePoint actions per item × Items processed per minute (per parallel branch) = API calls per minute gives us a sense of the scale and how many requests we'll make. Microsoft can handle them, but the objective is to avoid throttling and to keep our flows from being slowed down.
If you set parallelism to 20, each iteration makes 3 SharePoint calls, and each branch processes 50 items per minute, that's 20 × 3 × 50 = 3,000 API calls per minute, far exceeding the 600 limit.
The solution is to carefully configure concurrency control on the "Apply to each" action. Set the degree of parallelism low enough that: (Parallelism × Actions × Items per minute) < 600. For many scenarios, a parallelism of 5-10 provides a good balance between speed and staying under limits.
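Here's the same back-of-the-envelope maths as a small Python sketch, assuming the "items per minute" figure is per parallel branch as in the example above. The action and item counts are illustrative.

```python
CONNECTOR_LIMIT_PER_MINUTE = 600  # SharePoint connector: 600 calls per connection per minute

def projected_calls_per_minute(parallelism, actions_per_item, items_per_minute_per_branch):
    """Apply the Parallelism x Actions x Items-per-minute estimate from above."""
    return parallelism * actions_per_item * items_per_minute_per_branch

def max_safe_parallelism(actions_per_item, items_per_minute_per_branch):
    """Largest degree of parallelism that keeps the projection under the connector limit."""
    return max(1, CONNECTOR_LIMIT_PER_MINUTE // (actions_per_item * items_per_minute_per_branch))

# Worked example from the text: 3 SharePoint actions per item, 50 items/minute per branch.
print(projected_calls_per_minute(20, 3, 50))  # 3000 -- five times the 600 limit
print(max_safe_parallelism(3, 50))            # 4    -- a safer degree of parallelism
```

With these particular numbers even a parallelism of 5 would project slightly over the limit, so the right value always depends on how many actions each iteration actually makes.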
Batch Operations Alternative
Instead of making hundreds of individual SharePoint calls, consider using SharePoint's HTTP API for batch operations. Batch operations let you create or update multiple items in a single API request.
A batch request packages multiple operations into one HTTP call to SharePoint's batch endpoint. SharePoint processes them together and returns a combined response, dramatically reducing both execution time and API call consumption, and largely eliminating throttling concerns for bulk operations.
The tradeoff is complexity—batch operations require constructing specific OData batch request formats and parsing batched responses. For flows that regularly process hundreds or thousands of items, this complexity is worth the reliability gain.
Debugging is also quite hard since we send requests in a very specific format, and if something goes wrong we only get an error message that may, or may not, point us to the actual issue.
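To give a sense of what that "very specific format" looks like, here's a rough Python sketch that assembles a $batch body creating a few list items in one request. The site URL, list name, and field payload are placeholders, authentication and the request digest are omitted, and in a flow you would typically build a body like this with string operations and post it via the "Send an HTTP request to SharePoint" action.

```python
import uuid

# Hypothetical site and list; authentication is left out entirely.
SITE_URL = "https://contoso.sharepoint.com/sites/demo"
LIST_ENDPOINT = f"{SITE_URL}/_api/web/lists/getbytitle('Tasks')/items"

def build_batch_body(titles):
    """Assemble a multipart/mixed $batch body that creates one list item per title.

    This is only a sketch of the OData batch shape (a changeset of POSTs);
    the exact item payload depends on your list's entity type.
    """
    batch_id = f"batch_{uuid.uuid4()}"
    changeset_id = f"changeset_{uuid.uuid4()}"

    parts = [f"--{batch_id}",
             f"Content-Type: multipart/mixed; boundary={changeset_id}", ""]
    for title in titles:
        parts += [
            f"--{changeset_id}",
            "Content-Type: application/http",
            "Content-Transfer-Encoding: binary",
            "",
            f"POST {LIST_ENDPOINT} HTTP/1.1",
            "Content-Type: application/json;odata=nometadata",
            "",
            f'{{"Title": "{title}"}}',
            "",
        ]
    parts += [f"--{changeset_id}--", f"--{batch_id}--", ""]
    return batch_id, "\r\n".join(parts)

batch_id, body = build_batch_body(["Item 1", "Item 2", "Item 3"])
# The whole payload goes out as a single request:
#   POST {SITE_URL}/_api/$batch
#   Content-Type: multipart/mixed; boundary={batch_id}
print(body)
```

Keep in mind that a single batch can only hold a limited number of operations, so very large jobs still need to be split into multiple batches.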
Final Thoughts
API throttling in Power Automate isn't one limit—it's multiple overlapping constraints from the SharePoint connector, SharePoint service, and Power Platform.
The 600 API calls per minute connector limit is what you'll hit first, especially when concurrency is enabled on loops processing many items.
Understanding how concurrency multiplies API calls is key to staying under limits—and remembering that sequential processing (concurrency off) is often the safest default. For bulk operations, batch API calls provide a path to handling large volumes reliably.
Design with these limits in mind from the start, not after throttling breaks your production flows.
Photo by Markus Spiske on Unsplash