Open OnDemand server generating too many OPTIONS/preflight requests

Hi Jeff,

Thank you for your response.

We’ve been looking at the Splunk logs, and it appears the OPTIONS calls are coming from web browsers using applications served by our Open OnDemand server, rather than from the Open OnDemand server itself. Currently, it looks like our IdP servers are receiving between 30,000 and 50,000 OPTIONS requests per hour. I selected the IP address that accounted for the highest number of OPTIONS requests in my sample period and cross-referenced it with our httpd logs; for that particular instance, it appears to be a user’s JupyterLab session. I have not yet been able to investigate whether these requests are primarily associated with JupyterLab sessions.
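
For reference, here is a minimal sketch of the kind of per-IP tally I have been doing outside of Splunk (the log path is a placeholder, and it assumes the access log records the client IP as the first field in the usual combined format):

#!/usr/bin/env python3
# Count OPTIONS requests per client IP in a combined-format access log.
import re
from collections import Counter

LOG_PATH = "/var/log/idp/jetty_access.log"  # placeholder path; adjust for the IdP host

# client-ip ident user [timestamp] "OPTIONS ..."
pattern = re.compile(r'^(\S+) \S+ \S+ \[[^\]]+\] "OPTIONS ')

counts = Counter()
with open(LOG_PATH) as fh:
    for line in fh:
        m = pattern.match(line)
        if m:
            counts[m.group(1)] += 1

# Show the ten client IPs sending the most OPTIONS requests.
for ip, n in counts.most_common(10):
    print(f"{n:8d}  {ip}")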

I have been trying to find a way to see the HTTP OPTIONS requests from the browser end when I start a sample OOD session, but it appears that, for security reasons, most browser developer tools no longer provide a way to track this sort of network traffic from the browser side.
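
One thing I may try as a workaround is replaying a single OPTIONS request from outside the browser to see how the IdP responds. A rough sketch, where the URL is only a placeholder to be replaced with a full SAMLRequest URL from the logs and the headers are my guess at what a browser preflight would send:

#!/usr/bin/env python3
# Replay one OPTIONS request against the IdP SSO endpoint and inspect the response.
import requests

# Placeholder URL; paste a complete SAMLRequest URL from the jetty_access records here.
url = "https://idp.example.edu/idp/profile/SAML2/Redirect/SSO?SAMLRequest=..."

headers = {
    # Assumed values for what a browser would attach to a CORS preflight.
    "Origin": "https://ood.hpc.virginia.edu",
    "Access-Control-Request-Method": "GET",
}

resp = requests.options(url, headers=headers, allow_redirects=False)
print(resp.status_code)                              # e.g. the 403 seen in the Splunk records
print(resp.headers.get("Allow"))                     # methods the endpoint reports as allowed
print(resp.headers.get("Access-Control-Allow-Origin"))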

From what I understand, an OPTIONS request asks the server which HTTP methods are allowed for a resource, and browsers also send OPTIONS requests automatically as CORS preflight checks before certain cross-origin requests. I’m able to see records like the following in our Splunk instance (I’ve obscured the IP address as X.X.X.X), and they are all of this form:

jetty_access: X.X.X.X - - [27/Jan/2025:19:29:53 +0000] “OPTIONS /idp/profile/SAML2/Redirect/SSO?SAMLRequest=fZLRToMwFIZfhfR%2BFFgI0gwS3C5cMpUM9MIbU%2BjZaAIt9pSpby8bU7eb3bZ%2Fv3P%2BL10g79qeZYNt1BY%2BBkDrfHWtQna6SMhgFNMcJTLFO0Bma1ZkjxsWuB7rjba61i1xMkQwVmq11AqHDkwB5iBreNluEtJY2yOjVGvhNn3tHqTZSyW5C2KgRSOrSrdgGxdR0yM7oPlzURJnNS4jFT9i%2FyE45qXoXWnxGjQe0nGhnWzhTNmCkAZqS4vimTjrVULe%2FaqqhBdVtRfHggdzHooIAgh3IKIwqu%2FGGOIAa4WWK5uQwAvCmefPgqj0YxbELJy%2FESc%2F976XSki1vy2pmkLIHsoyn03NXsHgqdUYIOniqJqdBpsL%2Bbex%2FNc4SW%2F6xT%2B%2FC3oxZxras6cRvF7lupX1t5O1rf5cGuAWEuITmk5Prr9H%2BgM%3D&RelayState=ss%3Amem%3A615a75527ad07fd9c56bad04f878a1cdc428fd2ea360e75affc5a409a268911c&SigAlg=http%3A%2F%2Fwww.w3.org%2F2001%2F04%2Fxmldsig-more%23rsa-sha256&Signature=Pkvtv8ZIhc2gadSZjvEUW89qGVwuxAm9jLzGizFg%2F46LiQWg9NvNyYKmpUZQeuxL%2FyY%2FH5ICMqCExD3jGt9abU073NmJHSBTysFMJK0OlCf40U9DvI7SK%2FYJK61Lp0AiUZOycctzQxVJG3oJJupqsq1U0dmP27WhTm%2B1zl%2F3uBbDkeX3T5KUeYJhk016RfZsbGM13u3Fk5b52OXiOn8tHp6un8cB7jsPs0mYtVQt5X9eLerl9OICo38FpHv7Y6qauipC3K3Db70vgZjQFt7fKfQT1l9GmZXql%2B0pTlfCxkm%2Bux657KsrsNXsyPmIpgYt15FFPsNjpVhGYDvC9zjHw11lcx06gfnHfIJJLtC4ZSDWPD3dME%2BuUnfdfm%2BTFprAOptl0j5TPmdV2uQ6FAIjF4ZXP5p%2F%2F1Omc80F6bdsbr4q2AulgE0COSMfr6T9UcQmL555T6ynQEAFUAzj6pD%2FE%2BqQB595KJwUCSU%2Fexgwtgs5MjaNvJoMCMkUY36j8xAq HTTP/1.1” 403 20 “https://ood.hpc.virginia.edu/” “Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/129.0.0.0 Safari/537.36”

When I decode the SAML requests from the Splunk records, they are of the form:
<samlp:AuthnRequest xmlns:samlp="urn:oasis:names:tc:SAML:2.0:protocol" AssertionConsumerServiceURL="https://ood.hpc.virginia.edu/Shibboleth.sso/SAML2/POST" Destination="NetBadge Message" ID="_1bbbd07bc099da23a5d7e2e5fed757c8" IssueInstant="2025-01-27T19:29:53Z" ProtocolBinding="urn:oasis:names:tc:SAML:2.0:bindings:HTTP-POST" Version="2.0"><saml:Issuer xmlns:saml="urn:oasis:names:tc:SAML:2.0:assertion">https://ood.hpc.virginia.edu/shibboleth</saml:Issuer><samlp:NameIDPolicy AllowCreate="1"/></samlp:AuthnRequest>
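
For completeness, this is roughly how I decoded the SAMLRequest parameter from those URLs, following the standard HTTP-Redirect binding handling (URL-decode, base64-decode, then inflate the raw DEFLATE stream):

#!/usr/bin/env python3
# Decode a SAMLRequest value taken from an HTTP-Redirect binding URL.
import base64
import sys
import urllib.parse
import zlib

saml_request = sys.argv[1]  # the SAMLRequest= value copied out of the log line

decoded = urllib.parse.unquote(saml_request)
raw = base64.b64decode(decoded)
xml = zlib.decompress(raw, -15).decode("utf-8")  # -15 = raw DEFLATE, no zlib header
print(xml)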

I’m hoping to get some suggestions for how to continue tracking down what might be happening, so any ideas you have would be helpful.

Michele