
PowerShell script failing when retrieving storage details for large SharePoint site

CHANDRAMOHAN, Anish 0 Reputation points
2026-03-09T11:29:06.6666667+00:00

Hi,

I am running a PowerShell script to retrieve the storage usage of files and folders older than January 1, 2024 from a SharePoint site.

The site is around 2 TB in size, and the execution fails.

Has anyone encountered such issues when processing very large SharePoint sites? If so, are there recommended approaches or best practices for handling such large document libraries or sites with significant storage?

Any guidance would be appreciated.

Microsoft 365 and Office | SharePoint Server | For business

2 answers

Sort by: Most helpful
  1. Steven-N 21,805 Reputation points Microsoft External Staff Moderator
    2026-03-09T12:45:58.2733333+00:00

    Hi CHANDRAMOHAN, Anish,

    Thank you for reaching out to the Microsoft Q&A forum.

    As far as I know, processing a 2 TB SharePoint site via PowerShell requires a strategic approach. Standard item-by-item iteration (for example, Get-PnPListItem or Get-PnPFolderItem without strict optimization) will almost certainly fail due to a combination of SharePoint API throttling (HTTP 429 Too Many Requests), request timeouts, and client-side memory pressure.
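
    If a client-side crawl is still needed, a throttling-aware pattern pages results rather than pulling everything in one call. A minimal sketch using PnP PowerShell (the site URL, library name, and field names below are placeholders, and recent PnP.PowerShell versions may also require a registered app ClientId for interactive sign-in):

         # Sketch only: assumes the PnP.PowerShell module is installed
         Connect-PnPOnline -Url "https://contoso.sharepoint.com/sites/LargeSite" -Interactive
         # -PageSize keeps each request small so no single call fetches the whole library
         $items = Get-PnPListItem -List "Documents" -PageSize 2000 -Fields "FileLeafRef","File_x0020_Size","Modified"
         # Filter client-side by modified date, then total the file sizes (bytes)
         $old = $items | Where-Object { $_["Modified"] -lt [datetime]"2024-01-01" }
         $bytes = ($old | ForEach-Object { [long]$_["File_x0020_Size"] } | Measure-Object -Sum).Sum
         "{0:N2} GB older than 2024-01-01" -f ($bytes / 1GB)

    Even with paging, a 2 TB site can take a long time to enumerate, which is why the service-side reporting approach is usually preferable.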

    However, to better assist you, could you please provide the following additional information:

    • Which PowerShell cmdlet are you using to retrieve the storage details for the large SharePoint site?
    • What is the exact error message you see when the execution fails? You may provide a screenshot if available (please make sure to hide any personal information).

    Best regards


    Note: Please follow the steps in our documentation to enable e-mail notifications if you want to receive the related email notification for this thread.  


  2. Q&A Assist
    2026-03-09T11:29:22.3533333+00:00
    AI generated content. This question contains an answer created with AI from Q&A Assist.

    For very large SharePoint Online sites and libraries, a direct, custom PowerShell crawl of all items often runs into timeouts, throttling, or memory limits, especially around the 2 TB scale. For storage and age analysis, use the built‑in version storage reporting and analysis tooling instead of enumerating every file manually.

    Recommended approach for large sites:

    1. Use the built‑in version storage usage report jobs
      • Install and use the SharePoint Online Management Shell.
      • Connect as a SharePoint Administrator.
      • For a whole site or OneDrive, submit a background report job:
             New-SPOSiteFileVersionExpirationReportJob -Identity $siteUrl -ReportUrl $reportUrl
        
      • Track its progress:
             Get-SPOSiteFileVersionExpirationReportJobProgress -Identity $siteUrl -ReportUrl $reportUrl
        
      • For a specific library in that large site:
             New-SPOListFileVersionExpirationReportJob -Site $siteUrl -List $libName -ReportUrl $reportUrl
             Get-SPOListFileVersionExpirationReportJobProgress -Site $siteUrl -List $libName -ReportUrl $reportUrl
        
      These jobs run on the service side and are designed to handle large sites and libraries more reliably than client‑side scripts.
    2. Analyze the generated report instead of live-enumerating content
      • The report is delivered as a CSV file to the location specified by -ReportUrl.
      • Use the provided analysis script (PowerShell 7) to process the report locally:
             . "<path to AnalyzeReportFile.ps1>" -ReportLocalFilePath "<path to the file version expiration What-If report .csv file>"
        
      • The script outputs:
        • Current Expiration Schedule (time‑series of versions and size expired)
        • Target Expiration Schedule (if testing different expiration dates)
        • Files with fewer than 10 versions after immediate deletion
      • From this CSV, filter by date (e.g., versions or files older than 2024‑01‑01) and aggregate storage usage without putting load on SharePoint during the query.
    3. Use the reports to remediate storage and metadata pressure
      • Large sites often hit metadata/storage limits due to many versions and duplicates.
      • Use the report to identify:
        • Old versions that can be expired.
        • Libraries or folders with heavy version bloat.
      • Then adjust versioning and clean up old versions/duplicates in lists and libraries to reduce storage and metadata usage.
    4. General best practices for very large document repositories
      • Prefer service‑side reporting (as above) over client‑side full crawls.
      • Consider structuring content so that individual site collections and libraries stay within practical operational bounds (even though SharePoint supports very large libraries, performance and manageability are better when content is logically partitioned).
      • For extremely large or business‑critical sites, treat them as “very large sites” and plan/monitor them explicitly, as recommended for large‑scale repositories.
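
    The progress cmdlets in step 1 can be wrapped in a simple polling loop so the job can run unattended. A sketch (the completion check and the polling interval are illustrative, not the cmdlet's documented output format):

         do {
             $status = Get-SPOSiteFileVersionExpirationReportJobProgress -Identity $siteUrl -ReportUrl $reportUrl
             Write-Host $status
             Start-Sleep -Seconds 300   # re-check every 5 minutes
         } while ($status -notmatch "Complete")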

    Using the version storage report jobs plus offline CSV analysis is the supported pattern for handling storage/age analysis at multi‑TB scale and avoids the failures seen with direct PowerShell enumeration.
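
    To make the offline analysis concrete, here is a sketch of filtering the report CSV by date. The column names LastModifiedDate and SizeInBytes are assumptions, not the report's actual schema; check the CSV header first and adjust:

         $rows = Import-Csv -Path ".\VersionExpirationReport.csv"
         $cutoff = [datetime]"2024-01-01"
         # Keep only rows older than the cutoff, then total their size
         $old = $rows | Where-Object { [datetime]$_.LastModifiedDate -lt $cutoff }
         $bytes = ($old | ForEach-Object { [long]$_.SizeInBytes } | Measure-Object -Sum).Sum
         "{0:N2} GB in versions older than {1:yyyy-MM-dd}" -f ($bytes / 1GB), $cutoff

    Because this runs entirely against the local CSV, it places no load on SharePoint and can be re-run with different cutoff dates.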


