Avoid launching a web browser

Hi Jeff and OOD team,

I am implementing an AI platform via open-webui. I can launch the standard desktop app, start the open-webui service, and then open a web browser at http://localhost:8080. This is not user-friendly.

Instead, I want to bypass the web browser and serve the app through a proxy, the way the JupyterLab and VS Code apps do. I have a view.html.erb and need to tweak its parameters.

Please advise! A Zoom session would be highly appreciated.
Wei

I’m not 100% sure I have the availability to meet. I guess I can offer these tips:

  • You need to boot the app on a port we allocate, which we expose through the port environment variable (you can see how this is done in the Jupyter app).
  • You likely need to fix the app's root URL so it is not just / but instead /node/$host/$port (again, see how the Jupyter app does this).
  • Your view.html.erb should handle any authentication you need by supplying the password/credentials in a <form>, header, or query parameters.
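The first two tips above can be sketched as follows. This is a rough stand-in, not OOD's actual code: in a real before.sh.erb, `find_port` is a helper OOD provides, and the variable name the app reads for its root URL (ROOT_URL here) is app-specific; a Python one-liner picks a free port so the snippet runs anywhere.

```shell
#!/usr/bin/env bash
# Sketch of the launch-time pattern: allocate a port, export it, and
# build the proxy-prefixed root URL. find_port/create_passwd would
# normally come from OOD itself; python3 stands in for find_port here.
host="${host:-$(hostname)}"
port=$(python3 -c 'import socket; s = socket.socket(); s.bind(("", 0)); print(s.getsockname()[1])')
export host port
# The app must serve under the proxy prefix, not "/". ROOT_URL is an
# illustrative name; the real variable depends on the app being wrapped.
export ROOT_URL="/node/${host}/${port}"
echo "root URL: ${ROOT_URL}"
```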

If you can share the app maybe I can take a look asynchronously.

Thank you, Jeff!

I will modify/debug further before sharing the code with you.

Wei

Hi Jeff,

Thank you!
I can start ollama and launch its GUI via open-webui using the standard desktop app:

This works but is not user-friendly. Therefore, I want to create an LM app similar to the VS Code server or Jupyter Notebook apps.

[wfeinstein@n0000 lrc_LM]$ cat view.html.erb
<form action="/node/<%= host %>/<%= port %>/login" method="post" target="_blank">
  <input type="hidden" name="password" value="<%= password %>">
  <button class="btn btn-primary" type="submit">
    <i class="fa fa-eye"></i> Connect to Open WebUI
  </button>
</form>

[wfeinstein@n0000 lrc_LM]$ cat template/before.sh.erb
# Export the compute node the script is running on
export host="${host}"
# Export the module function if it exists
[[ $(type -t module) == "function" ]] && export -f module
# Find an available port to run the server on
export port=$(find_port ${host})
# Generate a SHA1-hashed password (requires OpenSSL)
SALT="$(create_passwd 16)"
password="$(create_passwd 16)"
PASSWORD_SHA1="$(echo -n "${password}${SALT}" | openssl dgst -sha1 | awk '{print $NF}')"
source /global/home/groups/scs/wfeinstein/ollama/startup.sh
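The digest pipeline in before.sh.erb can be checked in isolation. In this sketch, `openssl rand -hex` stands in for OOD's `create_passwd` helper so it runs anywhere:

```shell
#!/usr/bin/env bash
# Stand-alone check of the password-digest pattern from before.sh.erb.
# create_passwd is an OOD helper; "openssl rand -hex" substitutes here.
SALT="$(openssl rand -hex 8)"
password="$(openssl rand -hex 8)"
# Same pipeline as the template: hash password+salt, keep the hex digest.
PASSWORD_SHA1="$(echo -n "${password}${SALT}" | openssl dgst -sha1 | awk '{print $NF}')"
echo "digest: ${PASSWORD_SHA1}"
```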

[wfeinstein@n0000 lrc_LM]$ cat template/script.sh.erb
#!/usr/bin/env bash
echo "TIMING - Starting main script at: $(date)"
cd "${HOME}"
module purge
<%- unless context.modules.blank? -%>
module load <%= context.modules %>
<%- end -%>
# Start the Ollama server in the background ("ollama" alone only prints usage)
ollama serve &
# Wait for Ollama to initialize (adjust sleep time as needed)
sleep 10
open-webui serve --port ${port}
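Rather than a fixed `sleep 10`, the script could poll until the port actually answers. A sketch using bash's /dev/tcp, demonstrated against a throwaway listener (standing in for the real server) so it runs anywhere:

```shell
#!/usr/bin/env bash
# Poll until a TCP port accepts connections, instead of a fixed sleep.
wait_for_port() {
  local host="$1" port="$2" tries="${3:-30}"
  for ((i = 0; i < tries; i++)); do
    (echo > "/dev/tcp/${host}/${port}") 2>/dev/null && return 0
    sleep 1
  done
  return 1
}

# Demo listener (stands in for the real server); prints its port, then idles.
python3 -c 'import socket, time
s = socket.socket(); s.bind(("127.0.0.1", 0)); s.listen(1)
print(s.getsockname()[1], flush=True); time.sleep(5)' > /tmp/demo_port.$$ &
sleep 1
demo_port=$(cat /tmp/demo_port.$$)
wait_for_port 127.0.0.1 "${demo_port}" && up=yes
echo "port ${demo_port} up: ${up}"
```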

The app didn't launch; output.log has some info for further debugging. In addition, "connection.yaml" didn't get created in the same directory.

[root@perceus-00 output]# cat 3940c616-a301-4ab8-8606-60ae373c6447/output.log
Setting VNC password...
Starting VNC server...
Desktop 'TurboVNC: n0357.lr6:1 (wfeinstein)' started on display n0357.lr6:1
Log file is vnc.log
Successfully started VNC server on n0357.lr6:5901...
Script starting…
Waiting for open-webui server to open port 31011…
TIMING - Starting wait at: Sat Feb 22 11:27:23 PST 2025
TIMING - Starting main script at: Sat Feb 22 11:27:23 PST 2025
/global/home/groups/scs/wfeinstein/ollama/usr/bin/ollama
Usage:
ollama [flags]
ollama [command]
Available Commands:
serve Start ollama
create Create a model from a Modelfile
show Show information for a model
run Run a model
stop Stop a running model
pull Pull a model from a registry
push Push a model to a registry
list List models
ps List running models
cp Copy a model
rm Remove a model
help Help about any command
Flags:
-h, --help help for ollama
-v, --version Show version information
Use "ollama [command] --help" for more information about a command.
Loading WEBUI_SECRET_KEY from file, not provided as an environment variable.
Loading WEBUI_SECRET_KEY from /global/home/users/wfeinstein/.webui_secret_key
INFO [alembic.runtime.migration] Context impl SQLiteImpl.
INFO [alembic.runtime.migration] Will assume non-transactional DDL.
INFO [open_webui.env] 'ENABLE_SIGNUP' loaded from the latest database entry
INFO [open_webui.env] 'DEFAULT_LOCALE' loaded from the latest database entry
INFO [open_webui.env] 'DEFAULT_PROMPT_SUGGESTIONS' loaded from the latest database entry
WARNI [open_webui.env]
WARNING: CORS_ALLOW_ORIGIN IS SET TO '*' - NOT RECOMMENDED FOR PRODUCTION DEPLOYMENTS.
INFO [open_webui.env] Embedding model set: sentence-transformers/all-MiniLM-L6-v2
/global/home/groups/scs/wfeinstein/ollama/local/site-packages/pydub/utils.py:170: RuntimeWarning: Couldn't find ffmpeg or avconv - defaulting to ffmpeg, but may not work
warn("Couldn't find ffmpeg or avconv - defaulting to ffmpeg, but may not work", RuntimeWarning)
WARNI [langchain_community.utils.user_agent] USER_AGENT environment variable not set, consider setting it to identify your requests.
/global/home/groups/scs/wfeinstein/ollama/local/site-packages/open_webui
/global/home/groups/scs/wfeinstein/ollama/local/site-packages
/global/home/groups/scs/wfeinstein/ollama/local
Running migrations
___ __ __ _ _ _ ___
/ _ \ _ __ ___ _ __ \ \ / /__| |__ | | | |_ _|
| | | | '_ \ / _ \ '_ \ \ \ /\ / / _ \ '_ \| | | || |
| |_| | |_) | __/ | | | \ V V / __/ |_) | |_| || |
\___/| .__/ \___|_| |_| \_/\_/ \___|_.__/ \___/|___|
|_|
v0.5.12 - building the best open-source AI user interface.

Fetching 30 files: 100%|██████████| 30/30 [00:00<00:00, 24750.02it/s]
INFO: Started server process [2061819]
INFO: Waiting for application startup.
INFO: Application startup complete.
INFO: Uvicorn running on http://0.0.0.0:31011 (Press CTRL+C to quit)
Discovered open-webui server listening on port 31011!
TIMING - Wait ended at: Sat Feb 22 11:27:46 PST 2025
Starting websocket server...
[websockify]: pid: 2062089 (proxying 25880 ==> localhost:31011)
[websockify]: log file: ./websockify.log
[websockify]: waiting ...
[websockify]: failed to launch!
Cleaning up...
Killing Xvnc process ID 2061649

Hope the code and log file above are helpful for debugging.

It looks like the web service started, with websockify proxying port 25880 to localhost:31011, but websockify then failed to launch. Any ideas on how I should tweak it?

Thanks,
Wei

Hi Wei,

I am also working on a similar project with Open WebUI. The problem that I have encountered is that Open WebUI does not currently allow you to set a different root URL for the server instance; see Jeff’s second point above. This is important for reverse proxy applications as he explains in his post. Using these servers within a reverse proxy is a relatively common thing that many other apps support; for instance, Gradio has this feature built-in.

Open WebUI uses a mix of absolute and relative paths to indicate where different files are located to be served out. I have had some success modifying the source code but it is still a work in progress.
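As a rough illustration of that mixed-path problem, serving an app behind a proxy prefix means every hard-coded absolute URL has to be rewritten. The demo below is made up for illustration (the file name and layout are not Open WebUI's real tree); it just shows the kind of rewrite involved:

```shell
#!/usr/bin/env bash
# Illustrative only: prefixing absolute src/href references so they
# resolve behind a reverse-proxy prefix like /node/$host/$port.
host="${host:-n0000}"; port="${port:-31011}"
PREFIX="/node/${host}/${port}"
demo=$(mktemp -d)
printf '<script src="/static/app.js"></script>\n' > "${demo}/index.html"
# Rewrite absolute paths to include the proxy prefix.
sed -i "s|src=\"/|src=\"${PREFIX}/|g; s|href=\"/|href=\"${PREFIX}/|g" "${demo}/index.html"
cat "${demo}/index.html"
```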

Hi Anderss,

Thank you for sharing, and glad to know you are working on a similar project with Open WebUI.
Do you have a few minutes to share some of your work in progress, including Open WebUI's mixed paths?

Thank you,
Wei

Hi Wei,

Sure, although I don’t have a lot of answers to share. My week is pretty full but how about we meet next Monday afternoon? I am in Eastern time.

Best,
Sean

Hi Sean,

Sounds great next Monday afternoon. I will set up a Zoom call if you can share your email or vice versa.

Thank you,
Wei

Hi Wei,

Sure: anderss at wfu.edu

Anytime between 3–5 PM EST works for me.

Best,
Sean

Hi Sean,

Thank you! Sent you a Zoom invite.

Wei