Handling Large Files in Power Automate

Power Automate can handle large files, but how large? SharePoint Online supports files up to 250 GB, which sounds generous until you try moving a 200 MB file between sites using a Flow and hit an error about exceeding the buffer size. The problem isn’t SharePoint—it’s Power Automate’s message size limit of 100 MB, which becomes an effective limit of roughly 70-75 MB once Base64 encoding overhead is factored in.

If your automation involves files larger than a few dozen megabytes, you need to understand these limits and know which workarounds actually work.

The Real Limits

SharePoint Online’s 250 GB per file limit is what the service supports, but that’s not the limit your flows will hit. Power Automate has a 100 MB message size limit that applies to data moving between actions in a flow.

When you use the “Get file content” action, the file content is encoded as Base64, which increases the size by roughly a third (the encoded output is about 4/3 of the original). A 70 MB file becomes roughly 93 MB after Base64 encoding, and an 80 MB file, at around 107 MB encoded, exceeds the 100 MB message limit.
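To make the overhead concrete, here’s a tiny standalone Python snippet (nothing to do with any flow, just the arithmetic) that measures the roughly 4/3 growth Base64 introduces:

```python
import base64
import os

# Illustration only: 10 MB of random bytes stands in for file content.
raw = os.urandom(10 * 1024 * 1024)
encoded = base64.b64encode(raw)

print(f"raw:    {len(raw):>12,} bytes")
print(f"base64: {len(encoded):>12,} bytes")
print(f"growth: {len(encoded) / len(raw):.3f}x")  # ~1.333x

# Scaled up, an 80 MB file encodes to roughly 107 MB,
# which is over the 100 MB (104,857,600-byte) message limit.
```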

List item attachments have an even stricter connector limit of 90 MB, though SharePoint itself supports up to 250 MB for attachments. This discrepancy means your flows will fail before reaching SharePoint’s actual limits.

But why do these limits exist? I go into more detail in my article Understanding Binary and Base64 in Power Automate, but to oversimplify: systems need a structured, reliable way to communicate with each other. Files introduce complexity, so we flatten them into Base64 text to transfer the information. Systems also need to stay healthy, which means strict limits that still keep the most common cases working. That’s what we see here. SharePoint is prepared to accept large files and has some very complex strategies to manage them, but other tools may not. So Microsoft opts to allow large files, but not huge ones, covering more than 90% of the cases while keeping the systems running.

Common Errors for Large Files

When you exceed the limits, Power Automate returns specific error messages that clearly indicate the problem:

“Cannot write more bytes to the buffer than the configured maximum buffer size: 104857600” means you’ve exceeded the 100 MB message limit (104,857,600 bytes, i.e. 100 × 1024 × 1024). This typically happens when using the “Get file content” action. As mentioned before, the message limit applies to everything passed between actions; it’s hard to reach with plain text data alone, but it can happen.

“RequestEntityTooLarge – The request is larger than 94371840 bytes” indicates that the “Create file” action has a stricter limit than “Get file content”: 94,371,840 bytes works out to 90 × 1024 × 1024, roughly 90 MB. Even if you successfully retrieve the file, creating it elsewhere may fail.

Both errors point to the same underlying issue: you’re trying to pass too much data through the flow.

Solution 1: Use Copy File Action

The most straightforward workaround is to use the SharePoint “Copy file” action instead of the “Get file content” + “Create file” pattern. It works because we’re not pulling the file into Power Automate and then sending it back to SharePoint to be saved. We’re telling the API where to find the file and asking the backend to make a copy of it. That’s much more efficient, and it keeps everything within the message limit.
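To illustrate the idea, here’s a rough, hypothetical sketch of a server-side copy done directly against SharePoint’s REST API via the SP.MoveCopyUtil.CopyFileByPath endpoint. This isn’t necessarily what the connector calls on your behalf, and the site URLs, paths, and token below are placeholders, but it shows the principle: the request carries paths, not file content.

```python
import requests

# Sketch only: ask SharePoint to copy a file server-side.
# Site URLs, paths, and the token are placeholders.
site = "https://contoso.sharepoint.com/sites/Source"
token = "<access token acquired via OAuth>"

payload = {
    "srcPath": {"DecodedUrl": f"{site}/Shared Documents/big-report.pptx"},
    "destPath": {"DecodedUrl": "https://contoso.sharepoint.com/sites/Archive/Shared Documents/big-report.pptx"},
}

resp = requests.post(
    f"{site}/_api/SP.MoveCopyUtil.CopyFileByPath",
    json=payload,
    headers={"Authorization": f"Bearer {token}", "Accept": "application/json"},
)
resp.raise_for_status()
print("Copy requested:", resp.status_code)
```

However you trigger it, the bytes move inside SharePoint; the request and response stay tiny.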

This approach has no practical file size limit (up to SharePoint’s 250 GB maximum) because the file data never enters your flow. It works for copying files between SharePoint sites, between libraries on the same site, or between SharePoint and OneDrive.

The limitation is that “Copy file” only works within the Microsoft ecosystem. You can’t use it to send files to third-party services or to transform the file content during the copy, since the whole point is that the data stays on Microsoft’s servers.

Solution 2: Microsoft Graph API

For files larger than 100 MB that need to be uploaded to SharePoint or OneDrive, the Microsoft Graph API provides better support than the standard SharePoint connector. Graph API includes resumable upload capabilities that handle large files through chunked uploads.

Implementing this requires using the “HTTP” action to make direct calls to Graph API endpoints. You’ll need to handle authentication (typically using OAuth 2.0), construct the proper API requests, and manage the chunked upload process.
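Here’s a rough sketch of that pattern in Python, assuming you already have an OAuth 2.0 access token and know the target drive ID (the token, drive ID, and file names below are placeholders). The same sequence of requests can be reproduced with the “HTTP” action in a flow: create an upload session, then PUT the file in chunks with Content-Range headers. Graph expects chunk sizes that are multiples of 320 KiB.

```python
import requests

GRAPH = "https://graph.microsoft.com/v1.0"
token = "<access token acquired via OAuth 2.0>"   # placeholder
drive_id = "<target drive id>"                    # placeholder
local_path = "big-report.pptx"                    # placeholder
chunk_size = 10 * 320 * 1024  # must be a multiple of 320 KiB

# 1. Create a resumable upload session for the destination path.
session = requests.post(
    f"{GRAPH}/drives/{drive_id}/root:/Reports/big-report.pptx:/createUploadSession",
    headers={"Authorization": f"Bearer {token}"},
    json={"item": {"@microsoft.graph.conflictBehavior": "replace"}},
).json()
upload_url = session["uploadUrl"]  # pre-authenticated; no auth header needed below

# 2. PUT the file in chunks, telling Graph which byte range each one covers.
with open(local_path, "rb") as f:
    total = f.seek(0, 2)  # file size in bytes
    f.seek(0)
    offset = 0
    while offset < total:
        chunk = f.read(chunk_size)
        end = offset + len(chunk) - 1
        resp = requests.put(
            upload_url,
            data=chunk,
            headers={
                "Content-Length": str(len(chunk)),
                "Content-Range": f"bytes {offset}-{end}/{total}",
            },
        )
        resp.raise_for_status()  # 202 for intermediate chunks, 200/201 for the last
        offset += len(chunk)

print("Upload complete")
```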

The complexity is higher than using standard SharePoint actions, but Graph API is the official Microsoft-recommended approach for large file operations. It’s more reliable than trying to work around SharePoint connector limits.

I’ll have an article detailing this in the future and link it here.

Solution 3: Azure Blob Storage (or S3) as Intermediate Storage

For scenarios where files need to move between non-SharePoint systems or require transformation, Azure Blob Storage can act as intermediate storage. Blob Storage supports much larger file sizes and doesn’t have the same message size constraints as Power Automate.

The pattern involves uploading large files to Blob Storage first, processing them there if needed, and then moving them to their final destination. For large files this is often advisable anyway, since Blob Storage is designed for large objects and isn’t constrained by the flow’s message size limit.
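As a sketch of the staging step, done outside the flow itself (for example, in a script or an Azure Function the flow calls; the connection string and names are placeholders), the azure-storage-blob SDK will split large uploads into blocks for you:

```python
from azure.storage.blob import BlobClient

# Sketch only: stage a large file in Blob Storage before it moves on
# to its final destination. Connection string and names are placeholders.
blob = BlobClient.from_connection_string(
    conn_str="<storage account connection string>",
    container_name="staging",
    blob_name="big-report.pptx",
)

# The SDK uploads large files in blocks behind the scenes,
# so there is no 100 MB-style ceiling to worry about here.
with open("big-report.pptx", "rb") as data:
    blob.upload_blob(data, overwrite=True, max_concurrency=4)

print("Staged at:", blob.url)
```

From there, the flow (or another process) can hand around the blob’s URL instead of the file content, and only pull the bytes back down at the final destination.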

This approach adds infrastructure complexity and potential costs for Blob Storage, but it provides flexibility for complex file processing scenarios.

Final Thoughts

Power Automate doesn’t have a file problem, and it doesn’t unfairly limit usage. It has limits on the size of messages, and it has to in order to function correctly.
Files are usually much larger than text, so they’re usually what pushes the message size over the limit.
My point in this article is to show you that there are multiple solutions and that, depending on your use case or requirements, there are several alternatives you can choose from.

You can follow me on Mastodon (new account), Twitter (I’m getting out, but there are still a few people who are worth following) or LinkedIn. Or email works fine as well 🙂

Photo by Christa Dodoo on Unsplash

2 thoughts on “Handling Large Files in Power Automate”

  1. I just ran into a related issue and found your post from 4 hours ago; weird, but good! The behavior I’m seeing is that a “When a new email arrives in a shared mailbox (V2)” trigger does not fire when the email includes an attachment over a certain size. My test file was 75 MB. Then I tested with 32 MB and 50 MB attachments respectively and the trigger immediately fired on each of those. Repeating with 75 resulted in no trigger again. I’m surprised it didn’t trigger an error, so it’s possible this is a different issue than you describe above.
