When I try to create a new job from a path, I get the following error:
Timeout occurred when trying to determine directory size.
I have quite a few files in my home and scratch areas, as do many of our more experienced users. Is there any way to increase the time limit, or to avoid traversing the entire directory tree?
At this time there is no configuration option to alter the timeout for the disk usage check.
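The timeout comes from walking the whole tree to compute its size. As a rough way to check whether a given directory is likely to trip it, you can time the same kind of scan yourself. This is only an illustration: the 10-second limit and the `~/scratch` path here are placeholders, not the app's actual values.

```shell
# "du -s" walks the entire directory tree to total its size, which is
# the same kind of work the disk usage check does. "timeout" aborts the
# scan after 10 seconds, mimicking a bounded check.
# Replace ~/scratch with the path you plan to create the job from.
timeout 10 du -sh ~/scratch \
  || echo "size scan exceeded 10s; this tree is too large for a quick check"
```

If `du` itself takes a long time here, creating a job from that path will also be slow, since the whole directory gets copied. A small staging directory containing just the job script and its input files avoids both the timeout and the copy.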
That said, it sounds like you might be creating a job from a specified path incorrectly. Creating a job from a path assumes that you have a job script and some data files in a directory, and this workflow copies that entire directory into the new job's directory.
$MYJOBS_NUMBER is just the primary key in the database.
I should also note that if you do have a very large directory (in number or total size of files), we are considering changes that would skip the copying step.
This helps. It would be nice to have a full set of user docs, though. All we have found are the Ohio-specific intro docs, which are more site-specific than what we would like to share with our people.