Avoid launching a web browser

Hi Jeff and OOD team,

I am implementing an AI platform via open-webui. Currently I launch the standard desktop app, start the open-webui service, and then open a web browser at http://localhost:8080. This is not user-friendly.

Instead of manually launching a web browser, I want to go through OOD’s proxy, similar to JupyterLab or VS Code. I have a view.html.erb and need to tweak the parameters.

Please advise! A Zoom session would be highly appreciated.
Wei

I’m not 100% sure I have the availability to meet. I guess I can offer these tips:

  • You need to boot the app on a port we allocate and export in the port environment variable (you can see in Jupyter how this is done). A rough sketch of this follows below.
  • You likely need to fix the app’s root URL so it is not just / but instead /node/$host/$port (again, you can see in Jupyter how this is done).
  • Your view.html.erb should handle any authentication you need by providing the password/credentials in a <form>, header, or query parameters.
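
To make the first two bullets concrete, here’s a rough sketch in the spirit of the Jupyter app’s before.sh.erb; find_port and create_passwd are OOD helpers, and the base-URL comment is only illustrative since every app configures its prefix differently:

# template/before.sh.erb (sketch)
# Pick a free port on the allocated node and export it, along with the host,
# so that script.sh.erb and view.html.erb can both see them
port=$(find_port ${host})
export host port

# Random password that the view's <form> can post to the app
password="$(create_passwd 16)"
export password

# The app must then serve under the proxied prefix rather than "/",
# i.e. something equivalent to Jupyter's base_url of /node/${host}/${port}/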

If you can share the app maybe I can take a look asynchronously.

Thank you, Jeff!

I will modify/debug further before sharing the code with you.

Wei

Hi Jeff,

Thank you!
I can start ollama and launch its GUI via open-webui using the standard desktop app:

This works but is not user-friendly. Therefore, I want to create an LM app similar to VS Code Server or Jupyter Notebook.

[wfeinstein@n0000 lrc_LM]$ cat view.html.erb
<form action="/node/<%= host %>/<%= port %>/login" method="post" target="_blank">
  <input type="hidden" name="password" value="<%= password %>">
  <button class="btn btn-primary" type="submit">
    <i class="fa fa-eye"></i> Connect to Open WebUI
  </button>
</form>

[wfeinstein@n0000 lrc_LM]$ cat template/before.sh.erb
# Export the module function if it exists
[[ $(type -t module) == "function" ]] && export -f module
# Find available port to run server on
export port=$(find_port ${host})
# Export compute node the script is running on
export host="${host}"
# Generate SHA1 encrypted password (requires OpenSSL installed)
SALT="$(create_passwd 16)"
password="$(create_passwd 16)"
PASSWORD_SHA1="$(echo -n "${password}${SALT}" | openssl dgst -sha1 | awk '{print $NF}')"
source /global/home/groups/scs/wfeinstein/ollama/startup.sh

[wfeinstein@n0000 lrc_LM]$ cat template/script.sh.erb
#!/usr/bin/env bash
echo "TIMING - Starting main script at: $(date)"
cd "${HOME}"
module purge
<%- unless context.modules.blank? -%>
module load <%= context.modules %>
<%- end -%>
# Start Ollama in the background
ollama &
# Wait for Ollama to initialize (adjust sleep time as needed)
sleep 10
open-webui serve --port ${port}

The app didn’t launch; output.log has some info for further debugging. In addition, “connection.yaml” didn’t get created in the same directory.

[root@perceus-00 output]# cat 3940c616-a301-4ab8-8606-60ae373c6447/output.log
Setting VNC password…
Starting VNC server…
Desktop ‘TurboVNC: n0357.lr6:1 (wfeinstein)’ started on display n0357.lr6:1
Log file is vnc.log
Successfully started VNC server on n0357.lr6:5901…
Script starting…
Waiting for open-webui server to open port 31011…
TIMING - Starting wait at: Sat Feb 22 11:27:23 PST 2025
TIMING - Starting main script at: Sat Feb 22 11:27:23 PST 2025
/global/home/groups/scs/wfeinstein/ollama/usr/bin/ollama
Usage:
ollama [flags]
ollama [command]
Available Commands:
serve Start ollama
create Create a model from a Modelfile
show Show information for a model
run Run a model
stop Stop a running model
pull Pull a model from a registry
push Push a model to a registry
list List models
ps List running models
cp Copy a model
rm Remove a model
help Help about any command
Flags:
-h, --help help for ollama
-v, --version Show version information
Use “ollama [command] --help” for more information about a command.
Loading WEBUI_SECRET_KEY from file, not provided as an environment variable.
Loading WEBUI_SECRET_KEY from /global/home/users/wfeinstein/.webui_secret_key
INFO [alembic.runtime.migration] Context impl SQLiteImpl.
INFO [alembic.runtime.migration] Will assume non-transactional DDL.
INFO [open_webui.env] ‘ENABLE_SIGNUP’ loaded from the latest database entry
INFO [open_webui.env] ‘DEFAULT_LOCALE’ loaded from the latest database entry
INFO [open_webui.env] ‘DEFAULT_PROMPT_SUGGESTIONS’ loaded from the latest database entry
WARNI [open_webui.env]
WARNING: CORS_ALLOW_ORIGIN IS SET TO ‘*’ - NOT RECOMMENDED FOR PRODUCTION DEPLOYMENTS.
INFO [open_webui.env] Embedding model set: sentence-transformers/all-MiniLM-L6-v2
/global/home/groups/scs/wfeinstein/ollama/local/site-packages/pydub/utils.py:170: RuntimeWarning: Couldn’t find ffmpeg or avconv - defaulting to ffmpeg, but may not work
warn(“Couldn’t find ffmpeg or avconv - defaulting to ffmpeg, but may not work”, RuntimeWarning)
WARNI [langchain_community.utils.user_agent] USER_AGENT environment variable not set, consider setting it to identify your requests.
/global/home/groups/scs/wfeinstein/ollama/local/site-packages/open_webui
/global/home/groups/scs/wfeinstein/ollama/local/site-packages
/global/home/groups/scs/wfeinstein/ollama/local
Running migrations
(Open WebUI ASCII-art banner)
v0.5.12 - building the best open-source AI user interface.

Fetching 30 files: 100%|██████████| 30/30 [00:00<00:00, 24750.02it/s]
INFO: Started server process [2061819]
INFO: Waiting for application startup.
INFO: Application startup complete.
INFO: Uvicorn running on http://0.0.0.0:31011 (Press CTRL+C to quit)
Discovered open-webui server listening on port 31011!
TIMING - Wait ended at: Sat Feb 22 11:27:46 PST 2025
Starting websocket server…
[websockify]: pid: 2062089 (proxying 25880 ==> localhost:31011)
[websockify]: log file: ./websockify.log
[websockify]: waiting …
[websockify]: failed to launch!
Cleaning up…
Killing Xvnc process ID 2061649

Hope the code and log file above are helpful for debugging.

Thanks,

It looks like the web service did start on localhost:31011, and websockify tried to proxy port 25880 to it before failing. Any ideas how I should tweak this?

Thanks,

Hi Wei,

I am also working on a similar project with Open WebUI. The problem I have encountered is that Open WebUI does not currently allow you to set a different root URL for the server instance; see Jeff’s second point above. This is important for reverse proxy applications, as he explains in his post. Running these servers behind a reverse proxy is fairly common, and many other apps support it; for instance, Gradio has this feature built in.

Open WebUI uses a mix of absolute and relative paths to locate the files it serves out. I have had some success modifying the source code, but it is still a work in progress.
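
A quick way to see the problem (the hostname, node, and port below are just placeholders) is to fetch the page through OOD’s per-node proxy and look at the link targets it emits:

# anything that starts with a bare "/" escapes the /rnode/<host>/<port>/ prefix
# and comes back as a 404 from OOD instead of being served by Open WebUI
curl -s "https://ondemand.example.edu/rnode/n0001/8080/" \
  | grep -Eo '(href|src)="[^"]+"' | sort -u | head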

Hi Anderss,

Thank you for sharing; I’m glad to know you are working on a similar project with Open WebUI.
Do you have a few minutes to share some of your work in progress, including Open WebUI’s mixed paths?

Thank you,
Wei

Hi Wei,

Sure, although I don’t have a lot of answers to share. My week is pretty full but how about we meet next Monday afternoon? I am in Eastern time.

Best,
Sean


Hi Sean,

Sounds great next Monday afternoon. I will set up a Zoom call if you can share your email or vice versa.

Thank you,
Wei


Hi Wei,

Sure: anderss at wfu.edu

Anytime between 3–5 PM EST works for me.

Best,
Sean

Hi Sean,

Thank you! Sent you a Zoom invite.

Wei

@wfeinstein it was nice meeting you today - this is the OSC app that I was talking about. However, looking at it, the third-party application (Stable Diffusion web UI) is based on Gradio, not Open WebUI as I may have thought (Gradio is a different popular web framework).

Taking a quick look at Open WebUI, it doesn’t appear to support setting the base URL, which is something that’s required for OOD to proxy to it correctly. Indeed, reading this discussion, you can see someone else asking about OOD as well.

That said - there appears to be some hack in that discussion with the svelte config that you can use, maybe.
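
I haven’t tried it, but that hack seems to amount to baking a base path into the SvelteKit config and rebuilding the frontend; very roughly (the sed pattern is a guess at how the config is laid out, so treat this as pseudo-steps):

# untested sketch: hard-code a base path into svelte.config.js and rebuild
cd open-webui
sed -i "s|kit: {|kit: { paths: { base: '/node/${host}/${port}' },|" svelte.config.js
npm install
npm run build   # regenerates the frontend with the prefixed URLs baked in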

@lcrownover is on that discussion, maybe he knows more about this?

I have worked with OpenWeb-UI a little and, based on some of the PRs that have tried to fix this, they do not seem to want to take patches for this issue. So I came up with a way to do this with nginx and rewriting. Basically I have an app that runs three containers:

  1. Ollama running on the default port.
  2. OpenWeb-UI running on the default port.
  3. nginx with a templated rewrite config (Open OnDemand OpenWeb-UI nginx Rewriting code · GitHub) on a generated port.

Here are the relevant lines from my script.sh.erb:

export OLLAMA_HOST=127.0.0.1:11434
export OLLAMA_MODELS=/cluster/ollama_models
apptainer run --nv --writable-tmpfs library://ondemand/ollama:latest serve &
mkdir openwebui  # this is where the database shows up, and is per-instance
apptainer run --overlay openwebui --env-file env_file library://ondemand/openwebui:latest &
apptainer run --writable-tmpfs \
     --env FORWARD_PORT=${port} \
     --env FORWARD_HOST=${host} \
     -B ./templates:/etc/nginx/templates \
     -B ./:/var/log/nginx/ \
     -B ./conf:/etc/nginx/conf.d/ \
     --app docker \
     library://ondemand/nginx:latest nginx

The file in the gist goes in the template/templates/ directory so that it can be found by nginx.
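
In case the template part isn’t obvious: the nginx image’s entrypoint runs envsubst over whatever is mounted at /etc/nginx/templates/ and writes the result into /etc/nginx/conf.d/, which is how FORWARD_HOST and FORWARD_PORT from the --env flags end up in the final config. Roughly (the template file name here is made up):

# effectively what happens at container start
envsubst '${FORWARD_HOST} ${FORWARD_PORT}' \
  < /etc/nginx/templates/openwebui.conf.template \
  > /etc/nginx/conf.d/openwebui.conf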

Disclaimer: this is not production-ready, but it does work. There are a few known issues:

  1. Reconnecting after the initial click requires clearing cookies in the browser.
  2. OpenWeb-UI seems to rewrite the URL in the browser, so you lose the ability to copy-paste the URL to get back to where you are.

Some Singularity Gists:

If you improve on these scripts, let me know!


Hi all,
@wfeinstein showed me this thread, which is exactly what I’ve been working on for the past few months. My talk at the GOOD conference was related to this. Evan’s setup rewriting HTML is one way to approach this, but it isn’t exact since there is some minified JavaScript in open-webui that won’t fit those patterns.
I did manually rewrite open-webui to support relative URIs (some others have as well); the branch is available here:

But now, to do this automatically without requiring rewrites, we use a proxy to put the web service on a secondary root domain (for details, see my talk). I’m aiming to open-source it soon and will try to post here when it is ready.

Harry, thanks for chiming in. Your talk was excellent and we are looking forward to seeing your finalized proxy project. I know that this topic is also of great interest to the OOD core developer team.

It’s good to see the various ways people are going about this. I struggled with it a bit and ultimately ended up running it with the vnc template. I run the container and launch Firefox full screen. Here is my script.sh.erb; it’s not super elegant, but it gets the job done.

#!/usr/bin/env bash

module purge

cd "${HOME}"

(
export SEND_256_COLORS_TO_REMOTE=1
export XDG_CONFIG_HOME="<%= session.staged_root.join("config") %>"
export XDG_DATA_HOME="<%= session.staged_root.join("share") %>"
export XDG_CACHE_HOME="$(mktemp -d)"
module restore
set -x
xfwm4 --compositor=off --sm-client-disable
xsetroot -solid "#D3D3D3"
xfsettingsd --sm-client-disable
xfce4-panel --sm-client-disable
) &

export PORT=$(ruby -e 'require "socket"; puts Addrinfo.tcp("", 0).bind {|s| s.local_address.ip_port }')
module load apptainer
mkdir ~/openwebui
cd ~/openwebui
mkdir ollama open-webui static scratch
echo "$(head -c 12 /dev/random | base64)" > .webui_secret_key
export WEBUI_SECRET_KEY=$(cat ~/openwebui/.webui_secret_key)
apptainer exec -B ollama:/root/.ollama -B open-webui:/app/backend/data -B scratch:/scratch -B static:/app/backend/static --env "PORT=$PORT" --env "WEBUI_SECRET_KEY=$WEBUI_SECRET_KEY" --env "OLLAMA_MODELS=/software/models/ood/openwebui-ollama" -B /scratch/network/$USER:/scratch/network/$USER /software/containers/ood/openwebui-ollama/openwebui-ollama.sif /app/backend/start.sh &
sleep 25
firefox http://localhost:$PORT --kiosk --fullscreen

I couldn’t decide whether this is off topic and should be a private message, but I finally decided that others might benefit from this discussion, since a number of people seem to be trying to get open-webui working under OOD.

I am trying to use this branch and running into some problems. I can’t get the UI to render properly no matter what I set FRONTEND_APP_ROOT to, or even if I don’t set it at all.

Maybe you can help me.

I’ve written the following Dockerfile (will move it to Apptainer/Singularity once I have it working). I suppose I could modify the one that comes with the repo but it seems pretty complicated.

FROM python:3.11

RUN apt-get update -y && DEBIAN_FRONTEND=noninteractive apt-get install -y \
    curl git libpq-dev ffmpeg nodejs npm


WORKDIR /tmp
RUN git clone https://github.com/hsmallbone/open-webui.git

WORKDIR /tmp/open-webui
RUN git checkout relative-urls

RUN python -m pip install -e .
RUN npm run build


EXPOSE 8080
EXPOSE 5173

CMD ["sh", "-c", "npm run dev & /usr/local/bin/open-webui serve"]


Then I run it like this (ollama is not running on this machine yet but right now I’m just trying to get the UI to show up):

docker run --rm --name openwebui -p 8090:8080 -p 5173:5173 -v $HOME/.open-webui:/app/backend/data open-webui

Notice that here I am not setting a value for FRONTEND_APP_ROOT - is it ok to do that or is a value required?

Now when I go to http://hostname:5173 I get a broken image icon. Viewing source, I see links like this:

<link rel="icon" type="image/png" href="/@098555d4-163c-464d-9936-98d084e61beb@/favicon/favicon-96x96.png" sizes="96x96" />
<link rel="icon" type="image/svg+xml" href="/@098555d4-163c-464d-9936-98d084e61beb@/favicon/favicon.svg" />
<link rel="shortcut icon" href="/@098555d4-163c-464d-9936-98d084e61beb@/favicon/favicon.ico" />
<link rel="apple-touch-icon" sizes="180x180" href="/@098555d4-163c-464d-9936-98d084e61beb@/favicon/apple-touch-icon.png" />

I am not sure where the @098555d4-163c-464d-9936-98d084e61beb@ came from but looking at the network tab in the browser dev tools, I can see that all those requests failed with a 404.

Setting a value for FRONTEND_APP_ROOT (and then using that in my url) does not seem to change this situation.

Also possibly relevant - I see this in the output:

WARNI [open_webui.main] Frontend build directory not found at '/tmp/open-webui/backend/open_webui/frontend'. Serving API only.   

but it does seem like it’s at least trying to serve the UI, so I am confused about this.

Can you share how you are building/running/accessing your branch?

And thanks for doing the work to get open-webui working with OOD!

Thanks,
Dan

Hi @karcaw,

This is very cool. Really appreciate all the work you put into this. I have followed along and (with a few tweaks) gotten this to almost work.

I have ollama, nginx, and open-webui running. I can go to the URL (e.g. https://openondemand.fredhutch.org/rnode/gizmoj16/48299/) and I get a 404, but the 404 comes from open-webui and not apache or nginx.

Here’s a short video of what happens when I delete OOD’s cookies and reload the page - you can see it shows the Open-WebUI splash screen and seems to want to load it, but then it shows the 404.

Looking at the network request in the browser’s dev tools, I don’t actually see any 404s. The URL rewriting seems to work for the most part. For example, I can hit the URLs shown there and they work (for example https://openondemand.fredhutch.org/rnode/gizmoj16/48299/manifest.json).

Of course I can go directly to http://gizmoj16:8080/ and everything works fine, but that’s not through nginx + OOD.

When I view source on the Open-WebUI page as shown by OOD and the direct-to-8080 version, the contents seem almost the same except for the rewritten URLs; the difference must be something dynamic in the DOM.

In the javascript console I do see a couple of errors that I don’t see in the direct-to-8080 version, so maybe those are significant?

In a different browser I get a little more info, and this might be a clue:

It seems like maybe the websocket is not being proxied (or having its URL rewritten) properly?

In the nginx config there is this line in the location / section:

        sub_filter '/ws' '/rnode/$server_name/${FORWARD_PORT}/ws';

and then below that there is another separate section for the websocket:

    location /ws {
        proxy_pass http://localhost:8080/ws;
        proxy_http_version 1.1;
        proxy_set_header Upgrade $http_upgrade;
        proxy_set_header Connection $connection_upgrade;
        proxy_set_header Host $host;
        proxy_cache_bypass $http_upgrade;
    }

Could those sections be conflicting somehow?
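
In case it helps narrow things down, here is the quick check I’ve been running from gizmoj16 against the nginx port (the /ws/socket.io path is my guess at what the client actually uses); a healthy proxy should answer with 101 Switching Protocols:

# raw websocket handshake through nginx on the generated port (48299 as above)
curl -i -N \
  -H "Connection: Upgrade" -H "Upgrade: websocket" \
  -H "Sec-WebSocket-Version: 13" \
  -H "Sec-WebSocket-Key: $(head -c 16 /dev/urandom | base64)" \
  "http://localhost:48299/ws/socket.io/?EIO=4&transport=websocket"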

Appreciate any light you could shed - happy to provide more info if that is needed.

Thanks
Dan

Edited to add - do you know what version of Open-WebUI you are running? Maybe something has changed with it - I could try running the same version as you. I seem to be running v0.6.5 (version number appears after the ascii art in the output.log file).

I was running a 5.18 version of openwebui when this was all working, and I can confirm that the new version (6.2) is failing as you describe. Back to the debug chair for me.

There is no conflict between the two sections you mentioned. The /ws location block puts in the bits to forward the websocket properly, whereas the sub_filter bits rewrite the web page code as it passes through the nginx server, so that it makes links to the proper URL.
