Rohit Chopra Thu Mar 24 09:07:48 -0400 2011

Subject: Retrieving 200,000 records from the DB.

Hey guys. I absolutely love ActiveRecord so far. Awesome library.

I do have a question, though. I am trying to retrieve about 200,000 records from a table (about 12 columns per row), and I know that using ::all() to fetch every record will just eat up all the memory. Is there a way to loop through all the records and fetch them one at a time in a while loop? I have been looking for a get_num_rows() or something similar and I don't see it.

Help is much appreciated. Thanks


Chris R Fri Mar 25 12:47:07 -0400 2011

Try fetching the records in batches (perhaps 100 at a time). You'll need the limit and offset finder options to do this.

// Fetch all but limit to 10 total books starting at the 6th book
Book::find('all', array('limit' => 10, 'offset' => 5));

See http://www.phpactiverecord.org/projects/main/wiki/Finders#limit-offset for more details.
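Putting that together, a batching loop might look something like this (a rough sketch; Book and the batch size of 100 are just examples):

// Walk the whole table in fixed-size batches to keep memory bounded
$offset = 0;
$batch_size = 100;
do {
    $books = Book::find('all', array('limit' => $batch_size, 'offset' => $offset));
    foreach ($books as $book) {
        // process each record here
    }
    $offset += $batch_size;
} while (count($books) == $batch_size);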

Erik Wiesenthal Wed Sep 14 12:19:22 -0400 2011

Using limit and offset will result in a lot of queries to the DB.
It would be great to have a query->row style method that returns the next result row, or at least some way to access the PDO functions for the connection.
Recently I built an invoice system which parses thousands of records and processes each one, and does that for each client. The entire result set could not be held in memory, due to PHP and Apache limits, so I fetched each row and processed it one by one. So there must be a lot of use cases for this functionality.
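In plain PDO, that row-by-row approach looks something like this (a minimal sketch; the DSN, credentials, and table name are placeholders, and the unbuffered-query setting is MySQL-specific):

// Stream rows one at a time instead of loading the whole result set
$pdo = new PDO('mysql:host=localhost;dbname=billing', 'user', 'password');
// Disable buffered queries so MySQL does not hand us the full set at once
$pdo->setAttribute(PDO::MYSQL_ATTR_USE_BUFFERED_QUERY, false);
$stmt = $pdo->query('SELECT * FROM invoices');
while ($row = $stmt->fetch(PDO::FETCH_ASSOC)) {
    // process one row at a time; memory use stays flat
}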
