No such file or directory - /opt/slurm/bin/qsub

Hello All,

I am trying to install Open OnDemand for the first time. I managed to configure the basic parts, like the web UI and the shell. When I try to submit a job using the Job Composer on a Slurm cluster, I get the error below. It seems that after I click the submit button, it calls the `qsub` command, i.e. the PBS scheduler:

```
App 883218 output: [2023-10-17 17:10:50 -0400 ] INFO "[787c705a-50c8-49a3-9e57-d482c830b637] execve = [{"PBS_DEFAULT"=>"", "LD_LIBRARY_PATH"=>":"}, "/opt/slurm/bin/qsub", "-A", "research", "-j", "oe"]"
App 883218 output: [2023-10-17 17:10:50 -0400 ] INFO "[787c705a-50c8-49a3-9e57-d482c830b637] method=PUT path=/pun/sys/myjobs/workflows/1/submit format=html controller=WorkflowsController action=submit status=500 error='Errno::ENOENT: No such file or directory - /opt/slurm/bin/qsub' duration=20.73 view=0.00 db=5.07"
App 883218 output: [2023-10-17 17:10:50 -0400 ] FATAL "[787c705a-50c8-49a3-9e57-d482c830b637]
[787c705a-50c8-49a3-9e57-d482c830b637] Errno::ENOENT (No such file or directory - /opt/slurm/bin/qsub):
[787c705a-50c8-49a3-9e57-d482c830b637]
[787c705a-50c8-49a3-9e57-d482c830b637] config/initializers/open3_extensions.rb:6:in `capture3'
[787c705a-50c8-49a3-9e57-d482c830b637] app/models/resource_mgr_adapter.rb:37:in `qsub'
[787c705a-50c8-49a3-9e57-d482c830b637] app/models/workflow.rb:291:in `each'
[787c705a-50c8-49a3-9e57-d482c830b637] app/models/workflow.rb:291:in `submit_jobs'
[787c705a-50c8-49a3-9e57-d482c830b637] app/models/workflow.rb:259:in `submit'
[787c705a-50c8-49a3-9e57-d482c830b637] app/controllers/workflows_controller.rb:178:in `block in submit'
[787c705a-50c8-49a3-9e57-d482c830b637] app/controllers/workflows_controller.rb:173:in `submit'"
```

Any idea how to fix this?

my_cluster.yml file:

```yaml
title: "Slurm Cluster"
type: "OodCluster::Cluster"
type: "OodCluster::Servers::Ssh"
host: "
type: "OodCluster::Servers::Slurm"
host: "
bin: "/opt/slurm/bin"
conf: "/opt/slurm/etc/slurm.conf"

#squeue: "/opt/slurm/bin/squeue"
#sbatch: "/usr/local/bin/sbatch_wrapper"
```



Hello and welcome!

There are several things off in that cluster.yml, so it may be easier to ask first whether you have been to the cluster configuration page:

I don't think I've seen anyone use `v1:` before, but that is likely the first problem. Really, the whole file looks wrong.

Look over the page I shared, adjust your fields based on what you see there, and see if you can get this to connect or at least generate a new error. That will be much easier than trying to untangle what is there currently.

Also, please use formatting! It is there for a reason, and troubleshooting will only be harder for both of us if you keep posting without wrapping raw text in backticks.

Hello Travert,

Thanks a lot. It worked after I changed to v2.

Initially, I followed the same link that you provided; somehow it didn't work, and I tried so many things in cluster.yml, v1 being one of them :slight_smile:
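For future readers, a minimal v2-style cluster file looks roughly like the sketch below (the hostname is a placeholder; the bin/conf paths are the ones from this thread, so adjust them for your site):

```yaml
# /etc/ood/config/clusters.d/my_cluster.yml
v2:
  metadata:
    title: "Slurm Cluster"
  login:
    host: "login.example.com"   # placeholder: your cluster's login node
  job:
    adapter: "slurm"
    bin: "/opt/slurm/bin"       # directory containing sbatch, squeue, etc.
    conf: "/opt/slurm/etc/slurm.conf"
```

With `adapter: "slurm"`, OnDemand calls `sbatch` from the configured `bin` directory instead of `qsub`, which is what the original error was about.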

