I'm running into this issue and was wondering if anyone else has encountered it.
Basically I have a bit of MEL script that's batch processing a load of Maya files and writing data out to text files. After around 180 files I get a warning saying "Internal file descriptor table is full", and the next time fopen is called after that warning it fails to open the file for writing.
I've tried to reproduce this by simply making a loop that fopens, fprints and fcloses, but it didn't happen in that case even after 2000 iterations. I think it must be related to the amount of data written, but I've been unable to narrow it down more than that.
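For reference, this is roughly what my test loop looked like (the path is just a throwaway temp file, adjust for your setup):

```mel
// Attempted reproduction: open, write and close the same file 2000 times.
// $testPath is a hypothetical throwaway location.
string $testPath = "C:/temp/fdtest.txt";
int $i;
for ($i = 0; $i < 2000; $i++)
{
    int $fid = `fopen $testPath "w"`;
    fprint $fid ("iteration " + $i + "\n");
    fclose $fid;  // closed every time, so no descriptors leak here
}
```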
Does anyone know what exactly this internal file descriptor table is? I'm guessing it's either the software buffer that gets written to before fprint writes to disk (which would make more sense if it's related to the volume of data written), or simply a list of open file IDs that's hitting its limit.
Any suggestions as to how I might avoid filling up this table, or a way to flush it, would be much appreciated.
Thanks.
Replies
Last time I had this problem it was because the script was hitting some case in the code where it would skip or error out before fclose was ever called, so the file was being left open.
Basically the only way to fix this is to make sure that every path where your script errors out (or returns early) still ends with a call to fclose; otherwise the file never gets closed, and the leaked handles eventually fill up the file descriptor table.
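In case it helps, a sketch of the pattern I use (the proc and variable names here are made up, not from your script): wrap the write in catch so that an error during fprint still falls through to fclose instead of leaking the handle.

```mel
// Sketch: every exit path either never opened the file, or closes it.
// writeReport, $outPath and $data are hypothetical names.
proc writeReport(string $outPath, string $data)
{
    int $fid = `fopen $outPath "w"`;
    if ($fid == 0)
    {
        warning ("Could not open " + $outPath);
        return;  // nothing was opened, so nothing to close
    }

    // catch returns true if the enclosed command errored;
    // either way, execution continues on to fclose below.
    if (catch(`fprint $fid $data`))
        warning ("Write failed for " + $outPath);

    fclose $fid;  // reached on both the success and the error path
}
```

The key point is that fclose sits after the catch, not inside the happy path, so an fprint failure can no longer skip it.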
Thanks for the help, this has been driving me mad :P
Easy fix once I knew what to look for. Thanks again.