Export large DataTable data to a .csv file in a C# Windows application
Instead of calling StreamWriter.Write(..) all the time, you may consider using a StringBuilder: append all strings to the builder and write to disk only once!
string filePath = @"e:\temp\test.csv";
string delimiter = ",";
#region init DataTable
DataTable dt = new DataTable();
dt.Columns.Add(new DataColumn("a", typeof(string)));
dt.Columns.Add(new DataColumn("b", typeof(string)));
dt.Columns.Add(new DataColumn("c", typeof(string)));
dt.Columns.Add(new DataColumn("d", typeof(string)));
dt.Columns.Add(new DataColumn("e", typeof(string)));
dt.Columns.Add(new DataColumn("f", typeof(string)));
dt.Columns.Add(new DataColumn("g", typeof(string)));
dt.Columns.Add(new DataColumn("h", typeof(string)));
dt.Columns.Add(new DataColumn("i", typeof(string)));
dt.Columns.Add(new DataColumn("j", typeof(string)));
dt.Columns.Add(new DataColumn("k", typeof(string)));
dt.Columns.Add(new DataColumn("l", typeof(string)));
dt.Columns.Add(new DataColumn("m", typeof(string)));
dt.Columns.Add(new DataColumn("n", typeof(string)));
dt.Columns.Add(new DataColumn("o", typeof(string)));
dt.Columns.Add(new DataColumn("p", typeof(string)));
for (int i = 0; i < 100000; i++)
{
    DataRow dr = dt.NewRow();
    for (int j = 0; j < dt.Columns.Count; j++)
    {
        dr[j] = "test" + i + " " + j;
    }
    dt.Rows.Add(dr);
}
#endregion
Stopwatch sw = new Stopwatch();
sw.Start();
StringBuilder sb = new StringBuilder();
foreach (DataRow dr in dt.Rows)
{
    sb.AppendLine(string.Join(delimiter, dr.ItemArray));
}
File.WriteAllText(filePath, sb.ToString());
sw.Stop();
Console.WriteLine(sw.ElapsedMilliseconds);
Console.ReadLine();
EDIT
100,000 rows took me 271 ms and created a file of around 18 MB.
As @aiodintsov pointed out, there may be problems with using a StringBuilder once the data runs to several MB. So I created an example based on his comment. It worked fine for me and exported 1,000,000 rows within 2685 ms.
Stopwatch sw = new Stopwatch();
sw.Start();
using (StreamWriter swr =
    new StreamWriter(File.Open(filePath, FileMode.CreateNew), Encoding.Default, 1000000))
{
    // adjust the buffer size and Encoding to your needs;
    // note that FileMode.CreateNew throws if the file already exists
    foreach (DataRow dr in dt.Rows)
    {
        swr.WriteLine(string.Join(delimiter, dr.ItemArray));
    }
}
sw.Stop();
Console.WriteLine(sw.ElapsedMilliseconds);
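Note that this example writes only the data rows. If you also need a header line, write the column names before the row loop. Here's a minimal, self-contained sketch assuming the same kind of DataTable and delimiter as above (WriteCsvWithHeader is just a hypothetical helper name):
using System.Data;
using System.IO;
using System.Linq;
using System.Text;

static void WriteCsvWithHeader(DataTable table, string path, string delimiter)
{
    // FileMode.Create overwrites an existing file, unlike FileMode.CreateNew.
    using (var writer = new StreamWriter(File.Open(path, FileMode.Create), Encoding.Default, 1000000))
    {
        // Header line: the column names joined by the delimiter.
        writer.WriteLine(string.Join(delimiter,
            table.Columns.Cast<DataColumn>().Select(c => c.ColumnName)));

        // Data rows, exactly as in the example above.
        foreach (DataRow row in table.Rows)
        {
            writer.WriteLine(string.Join(delimiter, row.ItemArray));
        }
    }
}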
I've modified your code a little bit to use a StringBuilder as a buffer. That requires more RAM but should be more efficient. Play around with the initial Capacity and MaxCapacity to avoid OutOfMemoryExceptions and for maximum efficiency:
public void CreateFastCSVFile(DataTable table, string strFilePath)
{
    const int capacity = 5000000;
    const int maxCapacity = 20000000;

    // First we will write the headers.
    StringBuilder csvBuilder = new StringBuilder(capacity);
    csvBuilder.AppendLine(string.Join(",", table.Columns.Cast<DataColumn>().Select(c => c.ColumnName)));

    // Create the CSV file and write all from the StringBuilder
    using (var sw = new StreamWriter(strFilePath, false))
    {
        foreach (DataRow dr in table.Rows)
        {
            // flush the buffer to disk once it has grown past maxCapacity
            if (csvBuilder.Capacity >= maxCapacity)
            {
                sw.Write(csvBuilder.ToString());
                csvBuilder = new StringBuilder(capacity);
            }
            csvBuilder.AppendLine(string.Join(",", dr.ItemArray));
        }
        sw.Write(csvBuilder.ToString());
    }
}
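One caveat that applies to all of the snippets here: string.Join does not quote or escape the field values, so any value containing the delimiter, a quote, or a line break will corrupt the CSV. If your data may contain such characters, escape each field first. A minimal sketch (EscapeCsvField is a hypothetical helper, not part of the code above):
static string EscapeCsvField(object value)
{
    // Convert.ToString turns DBNull into an empty string.
    string s = Convert.ToString(value);
    // Quote the field if it contains a comma, quote, or line break,
    // doubling embedded quotes (RFC 4180 style).
    if (s.Contains(",") || s.Contains("\"") || s.Contains("\n") || s.Contains("\r"))
    {
        return "\"" + s.Replace("\"", "\"\"") + "\"";
    }
    return s;
}
// Usage inside the row loop instead of joining ItemArray directly:
// csvBuilder.AppendLine(string.Join(",", dr.ItemArray.Select(EscapeCsvField)));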
Here's a simple measurement with sample data (10,000,000 / 100 lakh DataRows).
Sample data:
var TblData = new DataTable();
TblData.Columns.Add("FeeID", typeof(int));
TblData.Columns.Add("Amount", typeof(int));
TblData.Columns.Add("FeeItem", typeof(string));
TblData.Columns.Add("Type", typeof(char));
for (int i = 0; i < 1000000; i++)
{
    TblData.Rows.Add(9, 8500, "Admission Free", 'T');
    TblData.Rows.Add(9, 950, "Annual Fee", 'T');
    TblData.Rows.Add(9, 150, "Application Free", 'T');
    TblData.Rows.Add(9, 850, "Boy's Uniform", DBNull.Value);
    TblData.Rows.Add(9, 50, DBNull.Value, 'R');
    TblData.Rows.Add(10, 7500, "Admission Free", 'T');
    TblData.Rows.Add(11, 900, "Annual Fee", 'T');
    TblData.Rows.Add(11, 150, "Application Free", 'T');
    TblData.Rows.Add(11, 850, DBNull.Value, 'T');
    TblData.Rows.Add(11, 50, "Computer Free", 'R');
}
int rowCount = TblData.Rows.Count; // 10000000
Measurement (less than 30 sec. for a 207 MB file seems to be OK):
var watch = new System.Diagnostics.Stopwatch();
watch.Start();
CreateFastCSVFile(TblData, @"C:\Temp\TestCSV.csv");
watch.Stop();
Console.Write("Elapsed: {0}", watch.Elapsed); // 00:00:26 for 207 MB CSV-file