Quote: many people around the world use fcgi for conversion and I don't see why it shouldn't work for longer than a few minutes
I don't think so. Both the FastCGI protocol and the fcgid implementation are designed to run web scripts, that is, fast-responding scripts. You can subvert them to run long data-crunching tasks, but that is only a hack, and there are several reasons why it is undesirable:
- When running under fcgid, Apache workers stay locked until the script responds. Your long-running tasks are therefore tying up Apache workers, which can even degenerate into a DoS condition.
- Fcgid-spawned processes are expected to terminate within a given time range. Otherwise, when Apache is restarted, fcgid has to kill them after a timeout, which both delays the Apache restart and terminates the running script improperly.
- If the client cancels the request (using the "stop" button in their browser) or loses the connection, the script is killed before the long operation can finish its task.
A much better solution is to delegate such long tasks to a background, asynchronous process. Something like the following (this is rough code, but the idea should be clear):
PHP Code:
function convert_video($id, $infile, $outfile) {
    function shutdown() {
        posix_kill(posix_getpid(), SIGHUP);
    }
    // Go to background
    $pid = pcntl_fork();             // fork
    if ($pid < 0) return false;      // Unable to fork
    else if ($pid) return true;      // Parent: the conversion runs in background
    else {
        // Child, but not detached from the main process yet. Clean up:
        ob_end_clean();      // Discard the output buffer and close
        fclose(STDIN);       // Close all of the standard
        fclose(STDOUT);      // file descriptors as we
        fclose(STDERR);      // will be running in background.
        register_shutdown_function('shutdown');
        // Try to become session leader
        if (posix_setsid() < 0) exit(1);
        // And do the final fork out of apache/fastcgi control
        if (pcntl_fork()) exit(0);
        // Perform your long running task. You *must* reopen any db connection,
        // as it will have been closed by the first child.
        mysql_connect($host, $user, $pass);
        mysql_select_db($db);
        // ffmpeg takes the output file as a positional argument
        $cmd = 'ffmpeg -i ' . escapeshellarg($infile) . ' ' . escapeshellarg($outfile) . ' 2>&1';
        exec($cmd, $output, $ret);   // $output and $ret are filled by reference
        $result = $ret == 0 ? 'ok' : 'fail';
        $log = implode("\n", $output);
        $qry = "INSERT INTO processed_videos (id, file, result, log) VALUES ('" .
            mysql_real_escape_string($id) . "', '" .
            mysql_real_escape_string($outfile) . "', '" .
            mysql_real_escape_string($result) . "', '" .
            mysql_real_escape_string($log) . "')";
        mysql_query($qry);
        // Exit cleanly when the task has been completed.
        exit(0);
    }
}
An even better solution would be simply to submit the tasks to some backend (which might very well be a cron job running every X minutes) that makes sure to process at most N conversions at a time, to avoid resource exhaustion on the server.
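As a rough sketch of that queue idea (the pending_videos table, its columns, and the connection variables are placeholders you would adapt to your schema), a cron-driven worker could look something like this:

PHP Code:
// convert_worker.php -- run from cron, e.g. */5 * * * * php convert_worker.php
// Processes at most $MAX_JOBS queued conversions per run to cap server load.
$MAX_JOBS = 2;

mysql_connect($host, $user, $pass);
mysql_select_db($db);

$res = mysql_query("SELECT id, infile, outfile FROM pending_videos
                    WHERE status = 'queued' LIMIT $MAX_JOBS");
while ($row = mysql_fetch_assoc($res)) {
    $id = mysql_real_escape_string($row['id']);
    // Mark the job as taken so a concurrent cron run does not pick it up again
    mysql_query("UPDATE pending_videos SET status = 'working' WHERE id = '$id'");
    $cmd = 'ffmpeg -i ' . escapeshellarg($row['infile']) . ' '
         . escapeshellarg($row['outfile']) . ' 2>&1';
    exec($cmd, $output, $ret);
    $status = $ret == 0 ? 'done' : 'failed';
    mysql_query("UPDATE pending_videos SET status = '$status' WHERE id = '$id'");
}

Because the worker runs entirely outside Apache, there are no locked workers, no fcgid timeouts, and a cancelled browser request cannot kill a conversion in progress.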