PrivateGPT Reverse

Hello, I am working on setting up PrivateGPT in OOD. It launches as a web app using uvicorn with a Gradio UI. I am able to launch the service: it comes up on a port and OOD connects to it. When I hit the button to connect I do see the UI, but parts of the page don't load properly. There is some JavaScript and CSS that should be loaded. I see the following in the application log when I make the connection.

11:59:11.747 [INFO ] uvicorn.access - - "GET / HTTP/1.1" 200

If I then launch a remote desktop and open Firefox, I can reach the service and it loads just fine. I then see the following in the application log.

12:00:04.644 [INFO ] uvicorn.access - - "GET / HTTP/1.1" 200
12:00:04.804 [INFO ] uvicorn.access - - "GET /assets/Index-52a9d5ff.css HTTP/1.1" 200
12:00:04.849 [INFO ] uvicorn.access - - "GET /info HTTP/1.1" 200
12:00:04.879 [INFO ] uvicorn.access - - "GET /theme.css HTTP/1.1" 200
12:00:05.236 [INFO ] uvicorn.access - - "GET /assets/Index-e45a2b11.css HTTP/1.1" 200
12:00:05.239 [INFO ] uvicorn.access - - "GET /assets/Index-64f7cc27.js HTTP/1.1" 200
12:00:05.244 [INFO ] uvicorn.access - - "GET /assets/Index-a90cda25.css HTTP/1.1" 200
12:00:05.245 [INFO ] uvicorn.access - - "GET /assets/Index-b2efa79d.js HTTP/1.1" 200
12:00:05.466 [INFO ] uvicorn.access - - "POST /run/predict HTTP/1.1" 200
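The difference between the two logs is telling: through OOD only the GET / arrives, and the asset requests never show up. One common cause is that the page references its assets with root-relative URLs, which the browser resolves against the server root instead of the per-node proxy prefix. A minimal sketch of that resolution (the hostname, node, and port below are made-up placeholders):

```python
# Sketch: how root-relative asset URLs escape a path-prefix proxy.
# The host/port in `page` are hypothetical placeholders.
from urllib.parse import urljoin

page = "https://ondemand.example.edu/rnode/node123/8085/"

# A relative reference stays under the /rnode prefix:
ok = urljoin(page, "assets/Index-64f7cc27.js")

# A root-relative reference ("/assets/...") resolves against the server
# root, so the request never reaches the proxied app:
broken = urljoin(page, "/assets/Index-64f7cc27.js")

print(ok)      # .../rnode/node123/8085/assets/Index-64f7cc27.js
print(broken)  # https://ondemand.example.edu/assets/Index-64f7cc27.js
```

That would explain why the page works from a browser on the compute node itself (there the server root and the app root are the same) but not through the OOD proxy.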

I did find a GitHub issue describing a similar-sounding problem behind an nginx proxy, which was resolved. I am unsure whether it applies here, but it may be a step in the right direction. The fix was to use the following settings.

        proxy_buffering off;
        proxy_redirect off;
        proxy_http_version 1.1;
        proxy_set_header Upgrade $http_upgrade;
        proxy_set_header Connection "upgrade";
        proxy_set_header Host $host;

Thank You,
Steven Mohr

Yeah, Gradio can be kind of wonky. When I deployed the Stable Diffusion web UI (built on Gradio) we were able to supply a --subpath argument. Looking at the source code for this app, in the settings.yml you can only set the port, so that doesn't seem like an option for you.

It may just work for you if you use /rnode in your view.html.erb. That may proxy correctly (I'm not sure whether you started with /node or not).
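For reference, this is the sort of view.html.erb I mean: a connect button pointing at the per-node /rnode proxy path. This is a sketch; the exact markup in your app may differ, and host and port are the connection values the batch connect session exposes.

```erb
<%# Sketch: connect button using the /rnode proxy path.
    `host` and `port` come from the session's connection info. %>
<form action="/rnode/<%= host %>/<%= port %>/" method="get" target="_blank">
  <button class="btn btn-primary" type="submit">Connect to PrivateGPT</button>
</form>
```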

That said, you may have to use an nginx proxy within the job to set the URLs right. It's not often that you should need to, but some apps just don't work right behind a proxy or can't recognize that they're behind one.

Hi Jeff,

is the nginx proxy documented somewhere? Or if not can you please document it? I think we’ll have to explore this option with quite a few of the LLM frontends that are out there (like the OpenWebUI we talked about on Tuesday).


This was before my time, so I’ve never done it personally, but I am aware of this repository (that at a glance uses an nginx container).

But again, I didn't develop that personally and have indeed never even used the app; it was decommissioned at some point.

@jeff.ohrstrom I wonder if this could be a potential solution for things like CryoSPARC, which also can't handle URIs being changed?

Looks like it could, but again I can't offer much guidance there either. It looks like nginx has substitute-type directives.
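For anyone exploring that route, the directive is sub_filter from nginx's ngx_http_sub_module, which rewrites strings in response bodies as they pass through. A rough sketch of rewriting root-relative asset URLs (the upstream address, port, and /rnode prefix are placeholders):

```nginx
# Sketch: rewrite absolute asset URLs in proxied HTML with sub_filter.
# Upstream address and the /rnode prefix below are placeholders.
location / {
    proxy_pass http://127.0.0.1:8001;

    # sub_filter cannot rewrite compressed responses, so ask the
    # upstream for uncompressed bodies:
    proxy_set_header Accept-Encoding "";

    sub_filter 'href="/assets/' 'href="/rnode/$host/8001/assets/';
    sub_filter 'src="/assets/'  'src="/rnode/$host/8001/assets/';
    sub_filter_once off;
}
```

Note this only catches URLs in the HTML; paths constructed inside the JavaScript bundle itself would slip through.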

Thanks for the quick responses on this. I had no luck with node set in the view.html.erb. With rnode I can at least reach the UI in some capacity. In the PrivateGPT settings there is an option to set a path; so far, changes to this setting haven't made much of a difference.

enabled: true
path: /

I don’t see path in any of the settings ymls, but I think this is what you’d set it to if it is indeed a configuration option.

path: '/rnode/$host/$port'

or this, below; sometimes trailing slashes matter.

path: '/rnode/$host/$port/'

Maybe with no quotes. IDK how the templating system interacts with environment variables, but quoting and not quoting are other variations you may need to try.
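If the path option does work, one way to get real values into it is to render the settings file at job start from the batch script, since the host and port are only known then. A sketch, assuming the ui/enabled/path layout above and that $port is set by the batch connect template (the file name settings-job.yaml is made up):

```shell
#!/bin/bash
# Sketch: write a per-job settings file with the proxied UI path.
# The ui/enabled/path keys follow the snippet above; $port normally
# comes from the batch connect template, and the host is the compute node.
host=$(hostname)
port="${port:-8001}"

cat > settings-job.yaml <<EOF
ui:
  enabled: true
  path: '/rnode/${host}/${port}/'
EOF

cat settings-job.yaml
```

You would then point PrivateGPT at the rendered file when launching it in the job script.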

Is it possible for the reverse proxy Open OnDemand uses to connect to a Unix domain socket? This app uses uvicorn and can be started with any options it supports.

Yes and no.

Yes in the sense that all Passenger apps boot on Unix sockets. So when you access /pun/sys/dashboard, for example, Apache connects to the NGINX instance that booted the application through a Unix socket.

No in the sense that this topic is about a batch connect application. Batch connect (or interactive) applications boot on a compute node, i.e., a different machine than the one running OnDemand. Connecting to that machine through a Unix socket is impossible, because the connection has to travel over the network from the OnDemand machine to the compute node.
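To illustrate why: a Unix domain socket is addressed by a filesystem path, so only processes that share that filesystem (i.e., run on the same machine) can connect to it. A small self-contained sketch:

```python
# Sketch: a Unix domain socket is just an entry in the local filesystem,
# which is why a process on another machine (e.g. the OnDemand host)
# has no way to connect to one that lives on a compute node.
import os
import socket
import tempfile
import threading

path = os.path.join(tempfile.mkdtemp(), "gp.sock")

server = socket.socket(socket.AF_UNIX, socket.SOCK_STREAM)
server.bind(path)            # creates a socket file at `path`
server.listen(1)

def serve():
    conn, _ = server.accept()
    conn.sendall(b"hello from the compute node")
    conn.close()

threading.Thread(target=serve, daemon=True).start()

client = socket.socket(socket.AF_UNIX, socket.SOCK_STREAM)
client.connect(path)         # only possible from the same machine
msg = client.recv(64).decode()
print(msg)                   # prints "hello from the compute node"
client.close()
server.close()
```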

Makes perfect sense.

I am still stuck on this. I took the suggestion, built an nginx container, and configured it as a reverse proxy. As inspiration I used the forum post about this specific app from my initial post and the Shiny app nginx config that was posted.

server {
    listen 8085;

    location / {
        proxy_pass http://unix:/tmp/gp.sock;
        proxy_buffering off;
        proxy_redirect off;
        proxy_http_version 1.1;
        proxy_set_header Upgrade $http_upgrade;
        proxy_set_header Connection "upgrade";
        proxy_set_header Host $host;
    }
}
I tried starting uvicorn on an IP and I also tried it using a Unix socket:

    ..., host="", port=settings().server.port)
    ..., uds="/tmp/gp.sock", port=settings().server.port)

If I start a remote desktop and launch the app and the proxy in that session, I can then open a web browser and access the app on the reverse proxy's port just fine. The page loads and everything looks right.

When I go to reach the app through Open OnDemand, I can reach it, but I still have the same issues: I can see it make the initial GET, yet it doesn't load the CSS and JS.

I figured using nginx might give a bit more flexibility and help, but there is still something in Open OnDemand that isn't working right. Do you have any suggestions?

Feel free to close this. I got pretty close but could not get the page to render properly with either the proxy or the reverse proxy. Ultimately I had the job launch a VNC session and start Firefox in full-screen kiosk mode. It's not ideal, but it gets the job done.