Quote:- Logging traffic usage to a MySQL database degrades over time, so when you have 1 year of traffic logs on a server with 300 domains the queries become painfully slow (more cpu usage, blah blah...).
I'd suggest using MySQL logging - and just keeping the last 4 weeks in the database.
I've only good things to say about mod_log_sql.
It has a fallback where it writes its log information to a file (in SQL format) ... so the data can be replayed once the MySQL server is up again.
This also solved the "too many open files" problem for me on a lot of customer servers I support.
Export into awstats can be done via a pipe.
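A minimal sketch of such a pipe, assuming a mod_log_sql-style `access_log` table (the column names are my guess, and sqlite3 stands in for a MySQL connection): dump the rows back into Apache "combined" log format, which awstats can consume through its `LogFile="command |"` setting.

```python
# Hypothetical sketch: turn mod_log_sql-style rows back into Apache
# "combined" log lines for awstats. Table and column names are
# assumptions; sqlite3 stands in for a MySQL connection, and
# time_stamp is assumed to be stored pre-formatted.
import sqlite3

COMBINED = ('{host} - {user} [{ts}] "{req}" {status} {sent} '
            '"{ref}" "{agent}"')

def dump_access_log(conn, vhost):
    """Yield combined-format log lines for one virtual host."""
    rows = conn.execute(
        "SELECT remote_host, remote_user, time_stamp, request_line, "
        "status, bytes_sent, referer, agent "
        "FROM access_log WHERE virtual_host = ? ORDER BY time_stamp",
        (vhost,))
    for host, user, ts, req, status, sent, ref, agent in rows:
        yield COMBINED.format(host=host, user=user or '-', ts=ts,
                              req=req, status=status, sent=sent,
                              ref=ref or '-', agent=agent or '-')
```

In awstats.conf this would then be wired up as something like `LogFile="python dump_log.py example.com |"` (script name hypothetical).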
Some years ago I paid Jan Kneschke via Amazon to add support for generating stats directly out of the database in modlogan
- I think it's worth trying to implement this in awstats as well (reading the log information out of the db).
I had one dedicated MySQL server running for ~20 servers. It worked very well, so performance is not the issue.
All I had to do was write a good cleanup script
(which deleted old data and removed the tables of domains no longer in use).
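That cleanup can be sketched roughly like this (my own adaptation, not the original script: I assume a single `access_log` table with a `virtual_host` column and a unix-time `time_stamp`, so "removing tables of unused domains" becomes purging their rows; sqlite3 stands in for MySQL):

```python
# Rough cleanup sketch under the assumptions stated above: expire rows
# older than four weeks, then purge rows of domains we no longer host.
import sqlite3
import time

FOUR_WEEKS = 28 * 24 * 3600

def cleanup(conn, active_domains, now=None):
    cutoff = (int(time.time()) if now is None else now) - FOUR_WEEKS
    # 1. expire everything older than four weeks
    conn.execute("DELETE FROM access_log WHERE time_stamp < ?", (cutoff,))
    # 2. purge rows of domains that are no longer hosted here
    marks = ",".join("?" * len(active_domains))
    conn.execute(
        "DELETE FROM access_log WHERE virtual_host NOT IN (%s)" % marks,
        tuple(active_domains))
    conn.commit()
```

Run from cron, this keeps the table small enough that the slow-query problem from the quote never shows up.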
Traffic totals can then be calculated on top of the gathered data in the db.
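That calculation is basically one GROUP BY over the logged bytes; a minimal sketch under the same assumed schema (sqlite3 standing in for MySQL):

```python
# Sum the transferred bytes per virtual host from the logged rows.
# Table and column names are assumptions, as above.
import sqlite3

def traffic_per_vhost(conn):
    """Return {virtual_host: total bytes sent}."""
    return dict(conn.execute(
        "SELECT virtual_host, SUM(bytes_sent) "
        "FROM access_log GROUP BY virtual_host"))
```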
Perhaps we can also look at how gplhost solved this problem.
Another approach would be mod_log_spread ... but that goes more into clustered environments.