Job Composer template: "The directory is too large to copy"

Hello,

Some of our users have run into this error when creating a new template and providing a source path:

The directory is too large to copy. The directory should be less than 1073741824 bytes.

I see that this is defined in /var/www/ood/apps/sys/myjobs/app/models/filesystem.rb, but it is hard coded to 1024*1024*1024:

    def max_copy_safe_dir_size
      @max_copy_safe_dir_size ||= 1024*1024*1024
    end

Can this value be changed, or are there restrictions that make this the largest size allowed?

Thanks
Vasile

The upload limits are likely what you would need to change for this to be larger.

You can see how those are set here:
https://osc.github.io/ood-documentation/latest/customization.html#set-upload-limits

The nginx_file_upload_max setting in /etc/ood/config/nginx_stage.yml is the part I think you are trying to change.
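For reference, the relevant entry in that file would look something like this (a minimal sketch; the 10 GiB value is only an example, and the exact accepted format is described in the docs linked above):

    # /etc/ood/config/nginx_stage.yml
    # Maximum upload size; I believe this is passed through to NGINX's
    # client_max_body_size. Example value: 10 GiB expressed in bytes.
    nginx_file_upload_max: '10737418240'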

Hello,

That is set to 10 GiB, so I am not sure that is the setting.
The error "The directory is too large to copy. The directory should be less than 1073741824 bytes." does not come from trying to upload a file; it appears when using the template creator and providing a path in the section: Path (the template will be created by copying files from this source path).

1073741824 bytes is only 1 GiB, which aligns with the code I posted. The error message comes from a function in /var/www/ood/apps/sys/myjobs/app/models/filesystem.rb, around lines 63-66:

    if size.blank?
      safe, error = false, "Failed to properly parse the output of the du command."
    elsif size.to_i > self.class.max_copy_safe_dir_size
      safe, error = false, "The directory is too large to copy. The directory should be less than #{self.class.max_copy_safe_dir_size} bytes."
    end
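For illustration, here is a self-contained sketch of what that check appears to do (the du invocation, method name, and return shape are assumptions for a standalone example, not the app's exact code):

    require "open3"

    # Hypothetical standalone reproduction of the Job Composer's size check.
    MAX_COPY_SAFE_DIR_SIZE = 1024 * 1024 * 1024  # 1 GiB, matching the hard-coded value

    def copy_safe?(dir)
      # `du -sb` prints the total size in bytes followed by the path.
      out, _err, status = Open3.capture3("du", "-sb", dir)
      size = status.success? ? out.split.first : nil
      if size.nil? || size.empty?
        [false, "Failed to properly parse the output of the du command."]
      elsif size.to_i > MAX_COPY_SAFE_DIR_SIZE
        [false, "The directory is too large to copy. The directory should be less than #{MAX_COPY_SAFE_DIR_SIZE} bytes."]
      else
        [true, nil]
      end
    end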

Please advise, as the two settings seem to be different.
Thanks
Vasile

Sorry about that, you are correct: the limit for copying the data is hard coded. So if you wanted to change that for the Job Composer, you would have to edit that value in the file.
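For example, a minimal sketch of that edit in /var/www/ood/apps/sys/myjobs/app/models/filesystem.rb (10 GiB here is just an example value, and note that a local edit like this may be overwritten when the app is updated):

    def max_copy_safe_dir_size
      # Raised from the hard-coded 1 GiB default; 10 GiB as an example.
      @max_copy_safe_dir_size ||= 10 * 1024 * 1024 * 1024
    end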

That said, the workflow that is typically used is not to copy the large data files over, but rather to have the script reference the data where it sits on a filesystem set up for larger files. That way only the script is copied and passed around by the Job Composer, and not the entire data set as well.
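For example, a minimal sketch of a job script along those lines (the scheduler directive, paths, and program name are placeholders):

    #!/bin/bash
    #SBATCH --job-name=analysis
    # Reference the data set where it already lives on the large-file
    # filesystem instead of copying it into the template directory.
    DATA_DIR=/fs/scratch/project/big_dataset   # placeholder path
    ./analyze --input "$DATA_DIR" --output "$HOME/results"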
