Sw1fty,
To rename the file to another name and then create a blank file in its place requires restarting Apache (though some systems seem to be able to get away with just a reload, I think...).  On my system, I'm able to clear the file's contents and have Apache continue logging to it without any issues, but from what Raphael said, this may not be the same on every system.
For a system like mine, where you can wipe the contents with no apparent consequences, you could still keep the data around: copy the log to another file and then truncate the original in place, or cat the contents and append them to another file (a weekly file, for example), then let logrotate run on those weekly files.
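Something like this is what I'm picturing for the copy-then-truncate approach (just a sketch; the path is made up, and note there's a small window between the copy and the truncate where a busy server could log lines that get lost):

    use strict;
    use warnings;

    my $log    = '/var/log/apache2/access.log';   # made-up path, adjust for your system
    my $weekly = "$log.weekly";

    # Append the current contents to the weekly file...
    open my $in,  '<',  $log    or die "open $log: $!";
    open my $out, '>>', $weekly or die "open $weekly: $!";
    print {$out} $_ while <$in>;
    close $out or die "close $weekly: $!";
    close $in;

    # ...then truncate the live log in place, so Apache keeps writing
    # to the same file and no restart is needed (at least on a system
    # like mine).
    truncate $log, 0 or die "truncate $log: $!";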
For a system that does have problems with this, I think you're stuck restarting Apache (or at least reloading it, e.g. with apachectl graceful) every time you clear or rename the file.
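And if you do end up going the rename route, it would be something along these lines (again just a sketch with a made-up path):

    use strict;
    use warnings;

    my $log = '/var/log/apache2/access.log';

    # Move the live log aside; Apache keeps writing to the old inode
    # until it's told to reopen its logs...
    rename $log, "$log.old" or die "rename $log: $!";

    # ...so follow up with a graceful reload (needs the right privileges).
    system('apachectl', 'graceful') == 0 or die "apachectl graceful failed: $?";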
 
I just took a look through this code and might see an issue.  I'm not a Perl expert, though...
 
 
if ($rdata ne '_no_') {

    my $rlog = $rdata;

    ($rs, $rdata) = get_file($rlog);
    return ($rs, '') if ($rs != 0);

    my @rows = split(/\n/, $rdata);

    foreach (@rows) {
        my $line = "$_\n";
        $sum += $1 if ($line =~ /(\d+)\n$/);
    }

    $rs = del_file($rlog);
    return ($rs, '') if ($rs != 0);
}
Does the get_file() call above actually read the whole file into memory?  If these files are large and that's the case, could we change it to return a filehandle and work through the file line by line instead?  I know that would probably be slower, but it would use much less memory, which seems to be the issue here...
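Something like this is what I'm picturing (just a sketch, untested; it assumes $rlog is a plain readable path, and get_file()'s error-code behavior would have to be preserved some other way):

    open my $fh, '<', $rlog or return (1, '');   # 1 standing in for whatever error code fits
    while (my $line = <$fh>) {
        chomp $line;                             # strip the newline the old code re-added
        $sum += $1 if ($line =~ /(\d+)$/);       # same trailing-number match as before
    }
    close $fh;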
Just some ideas; please don't attack me if I'm wrong.
 
-
Jesse