Using Open OnDemand to submit jobs to Kubernetes cluster

I’m trying to use Open OnDemand to submit jobs to a remote Kubernetes cluster. The goal is to run jobs and retrieve their output back to the OOD/VM filesystem.

Setup:

  • form.yml
  • manifest.yml (specifies cluster and batch connect template)
  • submit.yml.erb (runs a container: ubuntu:22.04)
  • script.sh.erb (inside the template/ folder)
  • after.sh (inside the template/ folder)

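For context, my submit.yml.erb is roughly along these lines. This is only a simplified sketch based on my understanding of the ood_core Kubernetes adapter's native format — the mount section in particular is what I've been experimenting with, and the field names should be checked against your OOD version's docs:

```yaml
# Sketch only -- field names follow my understanding of the
# ood_core Kubernetes adapter's native schema; verify against
# the docs for your OOD version before relying on them.
script:
  native:
    container:
      name: "main"
      image: "ubuntu:22.04"
      command: ["/bin/bash", "-l", "./script.sh"]
    mounts:
      # Hypothetical host mount of the user's HOME, so that anything
      # the job writes there lands on a filesystem OOD can see.
      - type: host
        name: home
        host_type: Directory
        path: "<%= Etc.getpwnam(ENV['USER']).dir %>"
        destination: "<%= Etc.getpwnam(ENV['USER']).dir %>"
```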
I’ve tried using an after.sh script, but it never executes and does not appear in the OOD session folder. I also tried volume-mounting a PVC, but mounting host paths or PVCs into the OOD output location fails with permission errors. Running kubectl cp manually from the local machine does copy the pod output, but I want this automated so that clicking Submit is enough.
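For reference, the manual step I'd like to automate looks roughly like the sketch below. The pod name, namespace, and in-container `/results` path are placeholders, and the kubectl call is guarded so the sketch is safe to run without a cluster on hand:

```shell
# Placeholders -- substitute the real pod name, namespace, and paths.
POD=my-ood-job-pod
NS=my-namespace

# OOD runs per-session scripts from the session's staged directory,
# so copying into "$PWD/output" would land results where OOD shows them.
OUT_DIR="$PWD/output"
mkdir -p "$OUT_DIR"

# Guarded: only attempt the copy if kubectl is actually available.
if command -v kubectl >/dev/null 2>&1; then
  kubectl cp "$NS/$POD:/results" "$OUT_DIR"
fi
```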

What I’m trying to do:

  1. Click Launch in OOD → job submitted to Kubernetes pod
  2. Pod runs the container job
  3. Results appear in the OOD session output folder.

Questions:

  1. Is there a recommended way to map pod output back to the standard OOD job
    output/ directory?
  2. Can OOD natively support PVC-backed output storage for Kubernetes jobs?

Any help or best practices for retrieving pod output back into OOD would be greatly appreciated.

Hi, and welcome! Sorry for the delay on this response.

I believe the only way to do this is by mounting host paths in the container. I’m not quite sure why you’d be getting permission failures in this case: the pod should be running with the user’s real UID and GIDs, and the host path it mounts should be their HOME.

Which is to say: the pod runs as the user and writes to a location they own, so it should just work.

I guess my first question would be: is the pod in fact running as the user?
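A quick way to check, assuming you have kubectl access (the pod name and namespace below are placeholders to substitute):

```shell
# Placeholders -- substitute your session's actual pod and namespace.
POD=my-ood-job-pod
NS=my-namespace

# Guarded so this sketch doesn't fail outright without cluster access.
if command -v kubectl >/dev/null 2>&1; then
  # UID requested in the pod spec (empty if none was set):
  kubectl get pod "$POD" -n "$NS" \
    -o jsonpath='{.spec.securityContext.runAsUser}'
  echo
  # UID the container is actually running as:
  kubectl exec -n "$NS" "$POD" -- id -u
fi

# Both of the above should match your UID on the OOD host:
id -u
```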