Xfce4 Desktop app using vnc_container

Hi all, I’m using OOD 3.0.1 and I’m trying to use the vnc_container template from the tutorial (Batch Connect VNC Container Options — Open OnDemand 3.1.0 documentation) to build a Desktop app. I built the image from the provided .def file, and all of the required commands exist inside the container, including the Xfce binaries. However, when I put xfce4-session in my script.sh.erb, it is run outside of the container and therefore isn’t found. How can I make sure the commands run inside the container? I couldn’t find any options…

Thank you so much,
Bernardo

My submit.yml.erb is:

---
batch_connect:
    template: "vnc_container"
    websockify_cmd: "/usr/bin/websockify"
    container_path: "/share/apps-x86/containers/singularity/rockylinux_8.sif"
    container_command: "singularity"
    container_module: ""
    container_bindpath: ""

The output file reads:

Loading ...
 Starting instance...
 INFO:    Instance stats will not be available - requires cgroups v2 with systemd as manager.
 INFO:    Converting SIF file to temporary sandbox...
 INFO:    instance started successfully
 Setting VNC password...
 Starting VNC server...
 59548
 
 Desktop 'TurboVNC: cnx035:1 (malaca)' started on display cnx035:1
 
 Log file is vnc.log
 Successfully started VNC server on cnx035:5901...
 Script starting...
 Before starting up
 Starting websocket server...
 /usr/lib/python3.6/site-packages/websockify/websocket.py:31: UserWarning: no 'numpy' module, HyBi protocol will be slower
   warnings.warn("no 'numpy' module, HyBi protocol will be slower")
 WebSocket server settings:
   - Listen on :53570
   - No SSL/TLS support (no cert file)
   - Backgrounding (daemon)
 Scanning VNC log file for user authentications...
 Generating connection YAML file...
 Cleaning up...
 Killing Xvnc process ID 54
 /var/spool/slurm/d/job06085/slurm_script: line 25: vncserver: command not found
 INFO:    Stopping bd8b3cb3-e1db-4917-8b8d-910b48ec35fc instance of /tmp/rootfs-254472145/root (PID=59392)

I was able to solve this.
The template runs the container just to start the VNC server, and you can’t easily make that same instance also run Xfce. So to launch Xfce/GNOME/etc. you need to add a line to your script.sh.erb:

singularity exec /path/to/image/rockylinux_8.sif /path/to/script/xfce.sh

xfce.sh is the script file from the original Desktop app’s “templates” directory.
Essentially you end up running two parallel Singularity containers, each with its own app.
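
For context, a minimal script.sh.erb along those lines could look something like this (just a sketch; the image and xfce.sh paths are placeholders for my own setup):

#!/usr/bin/env bash

# Clean the environment and start from the home directory
module purge
cd "${HOME}"

# The template's container only runs the VNC server and websockify;
# launch the Xfce session in a second container alongside it
singularity exec /path/to/image/rockylinux_8.sif /path/to/script/xfce.sh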


Hi @BMalaca. I’m trying to do the same thing right now. I modified my script.sh.erb as you mentioned:

apptainer exec xfce.sif "desktops/xfce.sh"

My job starts successfully but when I try to connect I get the “Failed to connect to server” error.

Did you have to make any other changes? Here’s some of the relevant lines from the logs:

Log file is vnc.log
Successfully started VNC server on node:5901...
Script starting...
Launching desktop 'xfce'...
Starting websocket server...
Failed to init libxfconf: Error spawning command line “dbus-launch --autolaunch= --binary-syntax --close-stderr”: Child process exited with code 1.
Failed to init libxfconf: Error spawning command line “dbus-launch --autolaunch= --binary-syntax --close-stderr”: Child process exited with code 1.
D-Bus library appears to be incorrectly set up: see the manual page for dbus-uuidgen to correct this issue. (UUID file '/var/lib/dbus/machine-id' should contain a hex string of length 32, not length 0, with no other text; UUID file '/etc/machine-id' should contain a hex string of length 32, not length 0, with no other text)
/usr/lib/python3.6/site-packages/websockify/websocket.py:31: UserWarning: no 'numpy' module, HyBi protocol will be slower
  warnings.warn("no 'numpy' module, HyBi protocol will be slower")
WebSocket server settings:
  - Listen on :34688
  - No SSL/TLS support (no cert file)
  - Backgrounding (daemon)
Scanning VNC log file for user authentications...
Generating connection YAML file...

Hi and welcome @BMalaca! Sorry we didn’t get to this earlier.

Shouldn’t it use the existing container instance? We populate the INSTANCE_NAME environment variable and export it for use in script.sh.erb.

apptainer exec "instance://$INSTANCE_NAME" "desktops/xfce.sh"

That was my initial thought too, Jeff. I bind-mounted /var/lib/dbus/machine-id (config sketch after the log below) and am now getting the following in my output.log:


Desktop 'TurboVNC: node137:1 (user)' started on display node137:1

Log file is vnc.log
Successfully started VNC server on node137:5901...
Script starting...
Launching desktop 'xfce'...
Starting websocket server...
_IceTransmkdir: Owner of /tmp/.ICE-unix should be set to root
/usr/lib/python3.6/site-packages/websockify/websocket.py:31: UserWarning: no 'numpy' module, HyBi protocol will be slower
  warnings.warn("no 'numpy' module, HyBi protocol will be slower")
WebSocket server settings:
  - Listen on :47677
  - No SSL/TLS support (no cert file)
  - Backgrounding (daemon)
Scanning VNC log file for user authentications...
Generating connection YAML file...

The job is running, but I’m still getting the “Can’t connect to server” VNC error.
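
For reference, one way to express that machine-id bind mount is through the vnc_container template’s container_bindpath option in submit.yml.erb (a sketch; the image path is a placeholder and your site may need additional binds):

---
batch_connect:
    template: "vnc_container"
    websockify_cmd: "/usr/bin/websockify"
    container_path: "/path/to/xfce.sif"
    container_command: "apptainer"
    container_bindpath: "/var/lib/dbus/machine-id"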

After a lot of finagling I got this to work on my system, booting up the VisIt application inside the container.

I know for sure it runs inside the container because of all the dynamic library issues I hit: I couldn’t bind mount /lib64 directly and had to bind each .so file individually.

#!/usr/bin/env bash

# Clean the environment
module purge

# Set working directory to home directory
cd "${HOME}"

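# Pass the VNC server's DISPLAY through to the apptainer exec calls below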
export APPTAINERENV_DISPLAY=$DISPLAY

#
# Launch Xfce Window Manager and Panel
#
export SEND_256_COLORS_TO_REMOTE=1
export APPTAINERENV_SEND_256_COLORS_TO_REMOTE="$SEND_256_COLORS_TO_REMOTE"

export XDG_CONFIG_HOME="<%= session.staged_root.join("config") %>"
export APPTAINERENV_XDG_CONFIG_HOME="$XDG_CONFIG_HOME"

export XDG_DATA_HOME="<%= session.staged_root.join("share") %>"
export APPTAINERENV_XDG_DATA_HOME="$XDG_DATA_HOME"

export XDG_CACHE_HOME="/tmp/xdg"
export APPTAINERENV_XDG_CACHE_HOME="$XDG_CACHE_HOME"

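# Start a D-Bus session bus inside the running instance and export its
# address/PID so the Xfce components launched below can reach it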
export $(apptainer exec instance://$INSTANCE_NAME dbus-launch 2>/dev/null)
export APPTAINERENV_DBUS_SESSION_BUS_ADDRESS=$DBUS_SESSION_BUS_ADDRESS
export APPTAINERENV_DBUS_SESSION_BUS_PID=$DBUS_SESSION_BUS_PID

module restore
set -x
apptainer exec instance://$INSTANCE_NAME xfwm4 --compositor=off --sm-client-disable &
apptainer exec instance://$INSTANCE_NAME xsetroot -solid "#D3D3D3"
apptainer exec instance://$INSTANCE_NAME xfsettingsd --sm-client-disable --daemon
apptainer exec instance://$INSTANCE_NAME xfce4-panel --sm-client-disable &

set +x
#
# Start Visit
#

# Load the required environment
module load xalt/latest <%= context.auto_modules_visit %>

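# Forward the module-provided PATH and LD_LIBRARY_PATH into the container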
export APPTAINERENV_PATH=$PATH
export APPTAINERENV_LD_LIBRARY_PATH=$LD_LIBRARY_PATH

sleep 5

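# Run VisIt in the foreground inside the same instance; the session ends when it exits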
apptainer exec "instance://$INSTANCE_NAME" visit -small