I am implementing an AI platform via Open WebUI. I can launch the standard Desktop app, start the open-webui service, and then open a web browser at http://localhost:8080, but this is not user-friendly.
I want to bypass the manual browser step and instead connect through a reverse proxy, the way the JupyterLab and VSCode apps do. I have a view.html.erb and need to tweak the parameters.
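For reference, here is the kind of minimal view.html.erb I am starting from. This is only a sketch: it assumes the portal's /rnode reverse-proxy route is available and that host and port come from the app's connection file.

<%# Hypothetical sketch of a view.html.erb: a button that opens Open WebUI
    through the portal's /rnode/<host>/<port>/ reverse-proxy route %>
<form action="/rnode/<%= host %>/<%= port %>/" method="get" target="_blank">
  <button class="btn btn-primary" type="submit">Connect to Open WebUI</button>
</form>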
Please advise! A Zoom session would be highly appreciated.
Wei
[wfeinstein@n0000 lrc_LM]$ cat template/before.sh.erb
# Export the module function if it exists
[[ $(type -t module) == "function" ]] && export -f module
# Find available port to run server on
export port=$(find_port ${host})
# Export compute node the script is running on
export host="${host}"
# Generate SHA1 encrypted password (requires OpenSSL installed)
SALT="$(create_passwd 16)"
password="$(create_passwd 16)"
PASSWORD_SHA1="$(echo -n "${password}${SALT}" | openssl dgst -sha1 | awk '{print $NF}')"
source /global/home/groups/scs/wfeinstein/ollama/startup.sh
[wfeinstein@n0000 lrc_LM]$ cat template/script.sh.erb
#!/usr/bin/env bash
echo "TIMING - Starting main script at: $(date)"
cd "${HOME}"
module purge
<%- unless context.modules.blank? -%>
module load <%= context.modules %>
<%- end -%>
# Start Ollama in the background
ollama &
# Wait for Ollama to initialize (adjust sleep time as needed)
sleep 10
open-webui serve --port ${port}
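As an aside, I suspect the fixed sleep could race against a slow startup; here is a sketch of a readiness poll I may try instead. It assumes Ollama listens on its documented default port 11434; the rest is untested on my end.

# Hypothetical alternative to the fixed sleep: wait up to 60 s until the
# Ollama API answers on its default port before starting the web UI
timeout 60 bash -c 'until curl -sf http://localhost:11434/ >/dev/null; do sleep 1; done'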
The app didn't launch; output.log has some info for further debugging. In addition, "connection.yaml" didn't get created in the same directory.
[root@perceus-00 output]# cat 3940c616-a301-4ab8-8606-60ae373c6447/output.log
Setting VNC password...
Starting VNC server...
Desktop 'TurboVNC: n0357.lr6:1 (wfeinstein)' started on display n0357.lr6:1
Log file is vnc.log
Successfully started VNC server on n0357.lr6:5901...
Script starting...
Waiting for open-webui server to open port 31011...
TIMING - Starting wait at: Sat Feb 22 11:27:23 PST 2025
TIMING - Starting main script at: Sat Feb 22 11:27:23 PST 2025
/global/home/groups/scs/wfeinstein/ollama/usr/bin/ollama
Usage:
ollama [flags]
ollama [command]
Available Commands:
serve Start ollama
create Create a model from a Modelfile
show Show information for a model
run Run a model
stop Stop a running model
pull Pull a model from a registry
push Push a model to a registry
list List models
ps List running models
cp Copy a model
rm Remove a model
help Help about any command
Flags:
-h, --help help for ollama
-v, --version Show version information
Use "ollama [command] --help" for more information about a command.
Loading WEBUI_SECRET_KEY from file, not provided as an environment variable.
Loading WEBUI_SECRET_KEY from /global/home/users/wfeinstein/.webui_secret_key
INFO [alembic.runtime.migration] Context impl SQLiteImpl.
INFO [alembic.runtime.migration] Will assume non-transactional DDL.
INFO [open_webui.env] 'ENABLE_SIGNUP' loaded from the latest database entry
INFO [open_webui.env] 'DEFAULT_LOCALE' loaded from the latest database entry
INFO [open_webui.env] 'DEFAULT_PROMPT_SUGGESTIONS' loaded from the latest database entry
WARNI [open_webui.env]
WARNING: CORS_ALLOW_ORIGIN IS SET TO '*' - NOT RECOMMENDED FOR PRODUCTION DEPLOYMENTS.
INFO [open_webui.env] Embedding model set: sentence-transformers/all-MiniLM-L6-v2
/global/home/groups/scs/wfeinstein/ollama/local/site-packages/pydub/utils.py:170: RuntimeWarning: Couldn't find ffmpeg or avconv - defaulting to ffmpeg, but may not work
warn("Couldn't find ffmpeg or avconv - defaulting to ffmpeg, but may not work", RuntimeWarning)
WARNI [langchain_community.utils.user_agent] USER_AGENT environment variable not set, consider setting it to identify your requests.
/global/home/groups/scs/wfeinstein/ollama/local/site-packages/open_webui
/global/home/groups/scs/wfeinstein/ollama/local/site-packages
/global/home/groups/scs/wfeinstein/ollama/local
Running migrations
___ __ __ _ _ _ ___
/ _ \ _ __ ___ _ __ \ \ / /__| |__ | | | |_ _|
| | | | '_ \ / _ \ '_ \ \ \ /\ / / _ \ '_ \| | | || |
| |_| | |_) | __/ | | | \ V V / __/ |_) | |_| || |
\___/| .__/ \___|_| |_| \_/\_/ \___|_.__/ \___/|___|
|_|
v0.5.12 - building the best open-source AI user interface.
Fetching 30 files: 100%|██████████| 30/30 [00:00<00:00, 24750.02it/s]
INFO: Started server process [2061819]
INFO: Waiting for application startup.
INFO: Application startup complete.
INFO: Uvicorn running on http://0.0.0.0:31011 (Press CTRL+C to quit)
Discovered open-webui server listening on port 31011!
TIMING - Wait ended at: Sat Feb 22 11:27:46 PST 2025
Starting websocket server...
[websockify]: pid: 2062089 (proxying 25880 ==> localhost:31011)
[websockify]: log file: ./websockify.log
[websockify]: waiting ...
[websockify]: failed to launch!
Cleaning up...
Killing Xvnc process ID 2061649
Hope the code and log file above are helpful for debugging.
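One clue I can see in the log myself: the Usage text above is exactly what ollama prints when invoked with no subcommand, so the bare "ollama &" line in script.sh.erb may never have started the API server. If that is the cause, the fix would look like this (a sketch, untested on my end):

# "ollama" alone only prints usage; "ollama serve" starts the API server
ollama serve &

sleep 10
open-webui serve --port "${port}"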
I am also working on a similar project with Open WebUI. The problem I have encountered is that Open WebUI does not currently let you set a different root URL for the server instance; see Jeff's second point above. This matters for reverse-proxy deployments, as he explains in his post. Running these servers behind a reverse proxy is fairly common, and many other apps support it; Gradio, for instance, has the feature built in.
Open WebUI uses a mix of absolute and relative paths to indicate where the files it serves are located. I have had some success modifying the source code, but it is still a work in progress.
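If it is useful, this is roughly how I have been locating the absolute paths. It is a hypothetical sketch: it just greps the installed package for root-relative URL paths (the kind that break behind a path-prefixed proxy).

# Hypothetical sketch: find the installed open_webui package, then search its
# bundled files for hard-coded root-relative paths such as "/api/..."
PKG=$(python -c 'import open_webui, os; print(os.path.dirname(open_webui.__file__))')
grep -rn '"/api' "$PKG" | head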
Thank you for sharing, and I'm glad to know you are working on a similar project with Open WebUI.
Do you have a few minutes to share some of your work in progress, including how you are handling Open WebUI's mixed paths?