Slow UI with bashrc-configured conda init

I have been noticing some lag (and my users even more so… silently) that I have narrowed down to conda envs being loaded via .bashrc.

Looking at the server load and top… I see conda being spun up on actions performed in the web UI.

The fast fix for my account is to comment out the conda init from bashrc, and things get speedy again… but I’m wondering if anyone else has noticed this, and if so, how have you gotten around it? I would love to use bash --noprofile for OnDemand sessions on a global scale, as editing bashrc for each user isn’t super sustainable…
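For reference, a less drastic per-user sketch than commenting the whole block out would be lazy-loading conda, so the init cost is only paid when conda is actually used (the /opt/conda path below is just a placeholder for your site's install):

```bash
# ~/.bashrc -- hypothetical lazy-load wrapper, one possible alternative to
# commenting out the generated "conda initialize" block entirely.
# The real initialization only runs the first time `conda` is actually invoked.
conda() {
    unset -f conda                       # remove this stub so it only runs once
    # NOTE: /opt/conda is a placeholder; point this at your site's conda install.
    __conda_setup="$('/opt/conda/bin/conda' 'shell.bash' 'hook' 2>/dev/null)"
    if [ -n "$__conda_setup" ]; then
        eval "$__conda_setup"            # defines the real `conda` shell function
    else
        export PATH="/opt/conda/bin:$PATH"
    fi
    unset __conda_setup
    conda "$@"                           # re-run the original invocation
}
```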

Anyone else notice this? Any way to change how OnDemand calls bash globally?
Many thanks for reading!

The UI shouldn’t be loading ~/.bashrc at all.

The only things I can think of are the Linux host adapter, or using submit_hosts other than the machine that you’ve installed OOD on.

Both of these facilities use ssh (and therefore bash, to start a session on another machine), and that can load your ~/.bashrc.

Do you use either of these mechanisms?

That was my thought as well (it shouldn’t be doing that…). I was surprised to see conda popping up and causing I/O load that slows down sessions… Looking at pstree:
├─PassengerAgent─┬─PassengerAgent─┬─ruby─┬─sh───bash───bash───conda

On further debugging… adding '/usr/bin/cat /proc/$$/cmdline >> logfile.log' to my own bashrc file and re-running pointed me at a beta test of a TensorBoard install whose form.yml.erb was running: m_cmd = "bash -il -c 'module -t avail py-tensorflow |& grep tensorflow' 2>/dev/null"
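For anyone repeating this trick, here is a slightly more readable version of that tracing line (since /proc/$$/cmdline is NUL-separated; the log path is arbitrary):

```bash
# ~/.bashrc -- temporary tracing to see which process is sourcing this file.
# /proc/$$/cmdline is NUL-separated, so translate NULs to spaces for readability.
# Remove this once you've found the culprit.
tr '\0' ' ' < "/proc/$$/cmdline" >> "$HOME/bashrc_callers.log"
echo "" >> "$HOME/bashrc_callers.log"
```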

Hope this helps someone that may stumble into this in the future!


Good to hear it. BTW, you can use auto_modules_tensorflow to do the same thing. We read this info from a file, and only once.

https://osc.github.io/ood-documentation/latest/how-tos/app-development/interactive/form.html#automatic-predefined-attributes
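A minimal form.yml sketch of that approach, assuming the attribute name follows the Lmod module name (taken here as py-tensorflow, matching the grep above; the cluster name is a placeholder):

```yaml
# form.yml -- minimal sketch; "my_cluster" is a placeholder, and the attribute
# name is assumed to follow the module name ("py-tensorflow" from the grep above).
---
cluster: "my_cluster"
form:
  - auto_modules_py-tensorflow   # version dropdown populated from cached module data
  - bc_num_hours
```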
