Open OnDemand Chat Interface

Hello all,

I’m happy to say I can finally share the OnDemand chat interface I’ve been working on for some time. This is a frontend-only, OpenAI API streaming/RAG-compatible chat interface for Open OnDemand, designed to be as minimally invasive to the primary web app as possible. It is an updated version of the app I demoed last year at the SC OOD booth. It was run in production on version 1.8 but has been modified for version 3.1.
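For anyone curious what "frontend-only, OpenAI API streaming compatible" means in practice, here is a minimal sketch of how a browser frontend can stream tokens from an OpenAI-compatible chat completions endpoint. The endpoint path, model name, and payload shape below are illustrative assumptions, not the actual configuration used in the repo, and it ignores edge cases like SSE events split across chunks.

```typescript
// Minimal sketch: stream a chat completion from an OpenAI-compatible endpoint
// in the browser. URL and model name are placeholders, not the repo's values.
async function streamChat(prompt: string, onToken: (t: string) => void): Promise<void> {
  const response = await fetch("/v1/chat/completions", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      model: "example-model",          // placeholder model name
      stream: true,                    // request SSE-style streamed chunks
      messages: [{ role: "user", content: prompt }],
    }),
  });

  if (!response.ok || !response.body) {
    throw new Error(`Request failed: ${response.status}`);
  }

  const reader = response.body.getReader();
  const decoder = new TextDecoder();

  // Each chunk contains one or more "data: {...}" lines; "[DONE]" ends the stream.
  while (true) {
    const { done, value } = await reader.read();
    if (done) break;
    for (const line of decoder.decode(value, { stream: true }).split("\n")) {
      const trimmed = line.trim();
      if (!trimmed.startsWith("data:")) continue;
      const payload = trimmed.slice(5).trim();
      if (payload === "[DONE]") return;
      const delta = JSON.parse(payload).choices?.[0]?.delta?.content;
      if (delta) onToken(delta); // append the new token to the chat window
    }
  }
}
```

The point of this pattern is that the whole exchange happens client-side against any OpenAI-compatible backend, which is what keeps the integration non-invasive to the OnDemand web app itself.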

The code is open source here - GitHub - idaholab/HPC_OOD_Chat: a web-based, front-end-only chatbot interface.

Additionally, we wrote a paper on the results of running this in production for a few months - Supplementing HPC Support with a Science Gateway AI Assistant (Technical Report) | OSTI.GOV

I am not much of a web developer, so I apologize in advance for bugs and weird ways of doing things. Happy to improve this though if anyone wants to submit any PRs!

Here are a few examples of what this looks like in the dashboard:

[screenshot: chat interface in the OnDemand dashboard]


Thank you for sharing this, it looks great! I’m excited to try it out and read the paper. I think this could provide a lot of convenience and utility for OnDemand users. I’m very interested in how users reacted to it and what they decided to use it for.


Hi Brandon, this is great. Would you be able to present this at the August Tips and Tricks call?


Hi Martin, definitely. I thought the August Tips and Tricks call was going to be a PEARC recap, though?

Thanks Brandon, let me clarify: I thought we had sort of previewed PEARC last time. I’ll get in touch directly.

Any chance there was a recording of this demo?