Export huge amount of data to CSV

Nullsig
php-forum Fan User
Posts: 981
Joined: Thu Feb 17, 2011 6:52 am
Location: Racine, WI

Re: Export huge amount of data to CSV

Post by Nullsig » Mon May 14, 2012 11:24 am

Where is the data stored?


Re: Export huge amount of data to CSV

Post by Nullsig » Tue May 15, 2012 5:37 am

Why don't you just dump the data directly from the database?
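For example, if the MySQL server is allowed to write files, the whole dump can be done in a single statement with SELECT ... INTO OUTFILE. A sketch, where the table, columns, and output path are placeholders:

Code:

SELECT blah
FROM blah
WHERE blah
INTO OUTFILE '/tmp/export.csv'
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
LINES TERMINATED BY '\n';

Note the file is written on the database server itself, not the client, and the MySQL user needs the FILE privilege.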

If that isn't possible because you have to process the data with a script before dumping it, you could do something like this to split the output across multiple files:

Code:

$sql = "SELECT blah FROM blah WHERE blah blah blah";
if($rs = mysql_query($sql)){
	$i = 1; //this will be appended to the end of the file name.
	//This will be used to detect when to create a new file.
	//Initialize at zero to trigger the file creation at the beginning.
	$j = 0;
	while($row = mysql_fetch_assoc($rs)){
		if($j % 10000 == 0){ //The 10000 here means that each file will contain 10000 rows
			if(isset($fp)){
				//On the first iteration the $fp variable will not be set,
				//therefore we will not try to close it.
				fclose($fp);
			}

			//initialize the current file (note: the fopen mode must be a quoted string)
			$fp = fopen('randomFile' . $i . '.csv', 'w');
			$i++;

			//Reset the row counter to zero so each file gets exactly 10000 rows.
			//Technically you don't need to do this, but for
			//REALLY large datasets you may overflow the integer value.
			//This also helps if you aren't incrementing $j on every row.
			$j = 0;
		}

		//Process your row here, then write the processed data out, e.g.:
		fputcsv($fp, $row);

		$j++; //count the current row. This can be done on successful write to the file too.
	}

	//close the last file once the loop is done
	if(isset($fp)){
		fclose($fp);
	}
}
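
Worth noting: the mysql_* functions are deprecated (and removed in PHP 7), and mysql_query buffers the entire result set in memory, which defeats the purpose on a huge export. The same split-file loop with mysqli, using an unbuffered query so rows stream one at a time, would look roughly like this (connection details and the query are placeholders):

Code:

//placeholder credentials; adjust for your server
$db = mysqli_connect('localhost', 'user', 'pass', 'database');

//MYSQLI_USE_RESULT streams rows instead of buffering the whole result set
$rs = mysqli_query($db, "SELECT blah FROM blah", MYSQLI_USE_RESULT);

$i = 1;    //file name suffix
$j = 0;    //rows written to the current file
$fp = null;
while($row = mysqli_fetch_assoc($rs)){
	if($j % 10000 == 0){ //10000 rows per file
		if($fp){
			fclose($fp);
		}
		$fp = fopen('randomFile' . $i . '.csv', 'w');
		$i++;
		$j = 0;
	}

	fputcsv($fp, $row); //handles CSV quoting and escaping for you
	$j++;
}
if($fp){
	fclose($fp);
}
mysqli_free_result($rs);
mysqli_close($db);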
 
