try-working 2 days ago [-]
This has been happening with chat bots for three years now and it's never going to stop. You simply don't expose raw prompting and completions to the user like this on a customer-facing website.
steve-atx-7600 2 days ago [-]
That makes it sound trivial. It seems desirable to put an LLM in front of an API (obviously with auth/authorization as needed) so that it can be called via natural language. But, to avoid wasting LLM resources on a Chipotle chat bot, you'd need to make the LLM classify the input text as an "acceptable" request or not and dead-end anything that isn't. That sounds harder and more prone to exploits than you make it seem.
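The classify-then-dead-end pattern described above can be sketched roughly as follows. This is a minimal illustration, not a hardened design: `classify_intent` stands in for a real LLM classification call and is stubbed here with keyword rules so the example runs offline, and the intent taxonomy (`ALLOWED_INTENTS`) is invented for the example.

```python
# Sketch of putting a classifier gate in front of a natural-language API.
# Assumption: in a real system, classify_intent would be an LLM call that
# labels the request against a fixed taxonomy; here it is a keyword stub.

ALLOWED_INTENTS = {"order_status", "store_hours", "menu_question"}

def classify_intent(user_text: str) -> str:
    """Hypothetical classifier stub standing in for an LLM call."""
    text = user_text.lower()
    if "order" in text:
        return "order_status"
    if "hours" in text or "open" in text:
        return "store_hours"
    if "menu" in text:
        return "menu_question"
    return "off_topic"

def handle_request(user_text: str) -> str:
    intent = classify_intent(user_text)
    if intent not in ALLOWED_INTENTS:
        # Dead-end anything outside the whitelist before any
        # tool-calling LLM or backend API is ever invoked.
        return "Sorry, I can only help with orders, hours, and the menu."
    return f"routing to backend handler for: {intent}"
```

The weakness the comment points at is exactly this gate: a keyword stub is trivially bypassed, and even a real LLM classifier can be steered by adversarial phrasing, so the classifier itself becomes part of the attack surface.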
try-working 1 day ago [-]
What I mean is that these chat bots shouldn't be built in the first place. They are insecure and offer nothing to users; moreover, a chat bot interface tacked onto an e-commerce site does nothing that the GUI doesn't do better.