
SSIS Data Flow has trouble when it has to return many fields

Bob-7209 66 Reputation points
2026-02-05T20:09:28.96+00:00

I have an SSIS project that calls stored procedures to create extract files via OLE DB Source & Flat File Destination. The stored procedure returns 150 fields and sometimes a few million records. This causes SSIS & Visual Studio 2019 to take a long time to do anything.

Is there anything I can do to improve the performance?

Could I break my large Data Flow into smaller ones with 50 fields in each, and then merge them together? If so, how could I accomplish this?

SQL Server Integration Services

Answer recommended by moderator
  1. Lakshmi Narayana Garikapati 920 Reputation points Microsoft External Staff Moderator
    2026-02-20T09:18:59.3933333+00:00

    Hi Bob-7209 ,

    Running the package directly on the server was a good call—that gives SSIS more CPU and memory to work with.

    If you need to run it locally, the best option is to tune the Data Flow buffer properties rather than splitting the extract into multiple flows. Increasing DefaultBufferSize (e.g., 50–100 MB depending on available RAM) and adjusting DefaultBufferMaxRows so each buffer holds more rows can reduce overhead significantly.
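    The buffer settings above can also be overridden at execution time with dtexec, so the saved package stays unchanged while you experiment. This is a minimal sketch; the package path and the Data Flow task name ("Data Flow Task") are assumptions to replace with your own:

```shell
REM Run the package with a 100 MB buffer and 50,000 rows per buffer.
REM "\Package\Data Flow Task" must match the name of your actual Data Flow task.
dtexec /File "C:\SSIS\ExtractPackage.dtsx" ^
  /Set "\Package\Data Flow Task.Properties[DefaultBufferSize];104857600" ^
  /Set "\Package\Data Flow Task.Properties[DefaultBufferMaxRows];50000"
```

    With 150 columns per row, keep an eye on the product of row width and DefaultBufferMaxRows: if it exceeds DefaultBufferSize, SSIS silently reduces the rows per buffer.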

    Splitting by columns isn’t ideal because SSIS doesn’t have a native way to merge column sets back into a single flat file. You’d end up writing multiple partial files and then stitching them together outside SSIS, which adds complexity and usually slows things down.

    If buffer tuning still isn’t enough, another option is to let SQL Server handle the export directly with bcp or SQLCMD. These tools are optimized for bulk export and can be faster than SSIS for very wide datasets.
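    As a rough sketch of the bcp route, assuming a hypothetical procedure, server, database, and output path (and that the procedure returns a single result set):

```shell
REM Export the procedure's result set in character mode with a pipe delimiter.
REM Server, database, procedure, and file path are placeholders.
bcp "EXEC dbo.usp_MyExtract" queryout "C:\Extracts\extract.txt" ^
  -S MyServer -d MyDatabase -T -c -t "|"
```

    The -c switch writes character data (no format file needed for a plain flat file), and -T uses Windows authentication; swap in -U/-P if you use SQL logins.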

    In short: keep the flow intact, push as much work as possible into SQL, tune buffers for local runs, and consider bcp/SQLCMD if you need maximum speed for flat file output.

    https://learn.microsoft.com/en-us/sql/integration-services/data-flow/flat-file-destination?view=sql-server-ver17

    https://learn.microsoft.com/en-us/sql/integration-services/data-flow/data-flow-performance-features?view=sql-server-ver17


2 additional answers

  1. Yitzhak Khabinsky 27,096 Reputation points
    2026-02-10T15:08:02.07+00:00

    Hi @Bob-7209,

    It would be much better to use bcp.exe for your scenario.

    bcp is SQL Server's command-line utility for exactly this kind of bulk export. You can launch bcp from an SSIS package (for example, with an Execute Process Task) and pass it all the needed parameters.

    Here is its download link: https://learn.microsoft.com/en-us/sql/tools/bcp-utility?view=sql-server-ver17&tabs=windows


  2. Q&A Assist
    2026-02-05T20:09:39.2566667+00:00
    AI generated content. This answer was created with AI from Q&A Assist.

    To improve the performance of your SSIS Data Flow that is handling a large number of fields and records, consider the following strategies:

    1. Break Down the Data Flow: Yes, breaking your large Data Flow into smaller ones can help. You can create multiple Data Flow tasks, each handling a subset of fields (e.g., 50 fields each). This can reduce the memory load and processing time for each individual Data Flow.
    2. Use a Staging Table: Instead of processing all records at once, you can load the data into a staging table first. This allows you to manipulate and process the data in smaller batches, which can be more efficient.
    3. Optimize Data Types: Ensure that the data types in your Flat File Connection Manager match the source data types. Mismatched data types can lead to performance issues.
    4. Adjust Buffer Sizes: Configure the buffer size and the number of rows in each buffer for your Data Flow tasks. You can set the DefaultBufferSize and DefaultBufferMaxRows properties to optimize performance based on your data size.
    5. Parallel Execution: If your system resources allow, consider configuring your Data Flow tasks for parallel execution. This can help in utilizing multiple threads and speeding up the overall process.
    6. Use Data Viewers: While debugging, enable Data Viewers to monitor data as it flows through the Data Flow. This can help identify bottlenecks or issues without processing the entire dataset at once.

    By implementing these strategies, you should see an improvement in performance when handling large datasets in SSIS.
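    The batching idea in point 2 can be sketched as a series of exports over key ranges, so no single query has to materialize millions of rows at once. Everything here (server, database, staging table, key column, ranges) is hypothetical:

```shell
REM Each call exports one key range to its own file; all names are placeholders.
bcp "SELECT * FROM dbo.ExtractStaging WHERE Id BETWEEN 1 AND 1000000" queryout "C:\Extracts\part1.txt" -S MyServer -d MyDatabase -T -c -t "|"
bcp "SELECT * FROM dbo.ExtractStaging WHERE Id BETWEEN 1000001 AND 2000000" queryout "C:\Extracts\part2.txt" -S MyServer -d MyDatabase -T -c -t "|"

REM Concatenate the parts into the final extract file.
copy /b "C:\Extracts\part1.txt"+"C:\Extracts\part2.txt" "C:\Extracts\extract.txt"
```

    Note this splits by rows, not by columns, so the parts can simply be appended; column-wise splits would need a join step that SSIS does not provide natively.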

