Hey there
I’m working on providing an XFCE desktop for our users. From previous threads here in the forum I believe there are working setups out there, but I wasn’t able to find a complete example anywhere and couldn’t piece a working one together myself.
My goal is to provide an XFCE desktop based on Rocky Linux 9 (which our HPC cluster currently runs). I’m currently using a version of the example from the OOD documentation (Batch Connect VNC Container Options — Open OnDemand 4.0.0 documentation), adapted for Rocky 9, to build the container:
```
Bootstrap: docker
From: rockylinux/rockylinux:9-minimal

%environment
    # export is needed so these reach the container's runtime environment
    export PATH="/opt/TurboVNC/bin:$PATH"
    export LANGUAGE="en_US.UTF-8"
    export LC_ALL="en_US.UTF-8"
    export LANG="en_US.UTF-8"

%post
    microdnf -y install dnf
    dnf install -y dnf-plugins-core
    dnf config-manager --set-enabled crb
    dnf -y update && dnf -y upgrade
    dnf install -y epel-release
    dnf install -y xfdesktop xfwm4 xfce4-session xfce4-settings xfce4-terminal
    dnf install -y xkbcomp
    dnf install -y python3-pip xorg-x11-xauth
    pip3 install ts
    dnf install -y https://yum.osc.edu/ondemand/latest/compute/el9Server/x86_64/python3-websockify-0.11.0-1.el9.noarch.rpm
    dnf install -y https://yum.osc.edu/ondemand/latest/compute/el9Server/x86_64/turbovnc-3.1.1-1.el9.x86_64.rpm
    dnf clean all
    chown root:root /opt/TurboVNC/etc/turbovncserver-security.conf
    rm -rf /var/cache/dnf/*
    rm -f /var/log/*.log
```
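One thing I’ve been meaning to double-check in the image: my script below calls `/usr/bin/dbus-launch` inside the container with its stderr discarded (`2>/dev/null`), so a missing binary would fail silently. As far as I can tell, on EL9 `dbus-launch` is shipped in the `dbus-x11` package, and I’m not sure it gets pulled in transitively by the XFCE packages, so the definition might additionally need (assumption on my side):

```
# Possible %post addition – assumption: dbus-launch is not installed
# transitively; on EL9 it is provided by the dbus-x11 package.
dnf install -y dbus-x11
```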
My submit.yml.erb uses the `vnc_container` template from batch_connect:
```yaml
---
batch_connect:
  template: "vnc_container"
  websockify_cmd: "/usr/bin/websockify"
  container_path: "/path/to/xfce-desktop.sif"
  container_bindpath: ""
  container_module: ""
  container_command: "apptainer"
script:
  native:
    # (SLURM parameters to run the job on our HPC)
```
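In case it matters: as far as I understand, with `container_bindpath` empty only Apptainer’s default binds (typically `$HOME`, `/tmp`, `/proc`, and whatever the site config adds) are visible inside the container. If the desktop needed other host paths (scratch space, software trees), they would be listed there. A hypothetical example (these paths are placeholders, not from my setup):

```yaml
container_bindpath: "/etc/hosts,/scratch,/opt/apps"
```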
Unfortunately, I haven’t found a matching script.sh.erb to actually run the container. One thread suggested simply copying the xfce.sh script from the batch_connect template (template/desktops/xfce.sh) and running it inside the container, but that didn’t work for me.
I previously created a native desktop app (where the XFCE desktop environment runs directly on a compute node of the cluster) with some customization, and I tried to reuse those commands by running them inside the container. The resulting script.sh.erb:
```bash
#!/usr/bin/env bash

# Clean the environment
module purge

# Set working directory to home directory
cd "${HOME}"

# xfce configuration maintenance
## Clean up previous monitors
if [[ -f "${HOME}/.config/monitors.xml" ]]; then
  mv "${HOME}/.config/monitors.xml" "${HOME}/.config/monitors.xml.bak"
fi

## Copy over the default panel if one doesn't exist, otherwise xfce4-panel will prompt the user
PANEL_CONFIG="${HOME}/.config/xfce4/xfconf/xfce-perchannel-xml/xfce4-panel.xml"
if [[ ! -e "${PANEL_CONFIG}" ]]; then
  mkdir -p "$(dirname "${PANEL_CONFIG}")"
  cp "/etc/xdg/xfce4/panel/default.xml" "${PANEL_CONFIG}"
fi

## Set Xfce4 Terminal as a login shell (sets proper TERM)
TERM_CONFIG="${HOME}/.config/xfce4/terminal/terminalrc"
if [[ ! -e "${TERM_CONFIG}" ]]; then
  mkdir -p "$(dirname "${TERM_CONFIG}")"
  sed 's/^ \{4\}//' > "${TERM_CONFIG}" << EOL
    [Configuration]
    CommandLoginShell=TRUE
EOL
else
  sed -i \
    '/^CommandLoginShell=/{h;s/=.*/=TRUE/};${x;/^$/{s//CommandLoginShell=TRUE/;H};x}' \
    "${TERM_CONFIG}"
fi

#
# Launch Xfce Window Manager and Panel
#
echo "Setting environment variables..."

# Pass DISPLAY through to the container
export APPTAINERENV_DISPLAY="$DISPLAY"
export SEND_256_COLORS_TO_REMOTE=1
export APPTAINERENV_SEND_256_COLORS_TO_REMOTE="$SEND_256_COLORS_TO_REMOTE"

# Set the user folders
export XDG_CONFIG_HOME="<%= session.staged_root.join("config") %>"
export APPTAINERENV_XDG_CONFIG_HOME="$XDG_CONFIG_HOME"
export XDG_DATA_HOME="<%= session.staged_root.join("share") %>"
export APPTAINERENV_XDG_DATA_HOME="$XDG_DATA_HOME"
export APPTAINERENV_PATH="$PATH"
export APPTAINERENV_LD_LIBRARY_PATH="$LD_LIBRARY_PATH"

echo "Starting desktop instance"
module restore
set -x

# Launch dbus-launch with full path to prevent conda PATH issues (https://github.com/OSC/ondemand/issues/700)
export $(apptainer exec instance://$INSTANCE_NAME /usr/bin/dbus-launch 2>/dev/null)
export APPTAINERENV_DBUS_SESSION_BUS_ADDRESS="$DBUS_SESSION_BUS_ADDRESS"
export APPTAINERENV_DBUS_SESSION_BUS_PID="$DBUS_SESSION_BUS_PID"

echo "Starting xfce desktop..."

# Launch the xfce window manager in the background so the rest of the script runs
apptainer exec instance://$INSTANCE_NAME xfwm4 --compositor=off --sm-client-disable &

apptainer exec instance://$INSTANCE_NAME xsetroot -solid "#D3D3D3"

apptainer exec instance://$INSTANCE_NAME xfsettingsd --sm-client-disable --daemon

# disable ssh autostart
apptainer exec instance://$INSTANCE_NAME xfconf-query -c xfce4-session -p /startup/ssh-agent/enabled -n -t bool -s false
# disable gpg autostart
apptainer exec instance://$INSTANCE_NAME xfconf-query -c xfce4-session -p /startup/gpg-agent/enabled -n -t bool -s false

# launch the desktop panel in the foreground (used by OOD to monitor if the desktop is still running)
apptainer exec instance://$INSTANCE_NAME xfce4-panel --sm-client-disable
```
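For context on the `export $(... dbus-launch ...)` line: as far as I understand, `dbus-launch` prints two `KEY=VALUE` lines on stdout, and the unquoted command substitution word-splits them into arguments for `export`. A standalone sketch with simulated output (the address and PID below are made up for illustration):

```shell
#!/usr/bin/env bash
# Simulate the two KEY=VALUE lines dbus-launch prints on stdout
# (the values are made up, just to show the mechanism).
fake_dbus_output='DBUS_SESSION_BUS_ADDRESS=unix:abstract=/tmp/dbus-demo
DBUS_SESSION_BUS_PID=12345'

# Word splitting turns each line into one KEY=VALUE argument to export,
# which is what `export $(dbus-launch)` relies on.
export $(echo "$fake_dbus_output")

echo "$DBUS_SESSION_BUS_ADDRESS"
echo "$DBUS_SESSION_BUS_PID"
```

In the real script the values come from `dbus-launch` inside the container, and because of the `2>/dev/null` any failure there is silent, which makes this a spot worth checking in the logs.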
Unfortunately, this script did not work as expected either: the session (and the desktop) starts, but noVNC can’t connect to it.
Since I’m not sure whether the root cause is an error in my configuration or a missing setting on our HPC cluster, I’d appreciate it if someone could share a working example of running a desktop container, so I can test whether that works here.
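In the meantime, to narrow things down on my side, I’ve been checking from the compute node whether anything is listening on the VNC port at all. A minimal sketch using bash’s built-in `/dev/tcp` redirection (5901 is just an example port; the real one would come from the session’s connection info):

```shell
#!/usr/bin/env bash
# Probe a local TCP port using bash's /dev/tcp; prints "open" or "closed".
# The port number is an example – substitute the one from your session.
check_port() {
  local port="$1"
  if (exec 3<>"/dev/tcp/127.0.0.1/${port}") 2>/dev/null; then
    echo "port ${port} open"
  else
    echo "port ${port} closed"
  fi
}

check_port 5901
```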
Please let me know if you’d like to inspect additional files (or if you have already spotted the error). I’d really appreciate any support or examples.