Hacker News

You have to rebuild all your prompts when switching providers.


If the superlative LLM can’t handle prompts from another provider, it just isn’t the superlative LLM.

This area by definition has no moats. English is not proprietary.

Use case is everything.


Switching to another LLM isn't always about quality. Being able to host something yourself at a lower or equal quality might be preferred due to cost or other reasons; in this case, there's no assumption that the "new" model will have comparable outputs to another LLM's specific prompt style.

In a lot of cases you can swap models easily enough, but all the prompt tweaking you did originally will probably need to be redone against the new model's black box.


Hosting something yourself also has educational value: just experimenting is how new applications and technologies get discovered and created.


Do you? They're natural language, right?


You don't have to, but they will have been optimized for one model. It's unlikely they'll work as well on a different model.


I can't wait for TolkienAPI, where prompts will have to be written in Quenya.


I can’t wait to hire Stephen Colbert to write prompts then


No problem, just ask ChatGPT to translate it into Quenya.


I imagine AI would be able to perform the translation. "Given the following prompt, which is optimized for $chatbot1, optimize it for $chatbot2".
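A minimal sketch of that idea, assuming you just build the meta-prompt yourself and send it to whatever model you like (the function name and template wording here are invented for illustration, not any provider's API):

```python
# Hypothetical meta-prompt for porting a prompt from one model to another.
# All names and wording below are illustrative.
TRANSLATE_TEMPLATE = (
    "Given the following prompt, which is optimized for {source_model}, "
    "rewrite it so it is optimized for {target_model}. "
    "Preserve the intent and constraints exactly.\n\n"
    "PROMPT:\n{prompt}"
)

def build_translation_request(prompt: str, source_model: str, target_model: str) -> str:
    """Fill in the meta-prompt; the result would be sent to an LLM of your choice."""
    return TRANSLATE_TEMPLATE.format(
        source_model=source_model, target_model=target_model, prompt=prompt
    )

request = build_translation_request(
    "You are a terse assistant. Answer in bullet points.",
    source_model="$chatbot1",
    target_model="$chatbot2",
)
print(request)
```

Whether the target model actually produces a well-optimized port is, of course, the open question.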


Technically true, but given the way these prompts are (or can be) template-ized, it should be relatively trivial to do so.
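A minimal sketch of what that template-izing might look like, keeping the task body separate from provider-specific framing (the wrapper strings and provider names are invented for illustration):

```python
from string import Template

# The task itself, written once, provider-agnostic.
TASK = "Summarize the following article in three sentences:\n$article"

# Provider-specific framing lives in one place; formats here are hypothetical.
PROVIDER_WRAPPERS = {
    "provider_a": "You are a helpful assistant.\n\n$task",
    "provider_b": "[INST] $task [/INST]",
}

def render(provider: str, article: str) -> str:
    """Fill the task template, then wrap it in the provider's framing."""
    task = Template(TASK).substitute(article=article)
    return Template(PROVIDER_WRAPPERS[provider]).substitute(task=task)

print(render("provider_b", "LLM prompts are rarely portable across vendors."))
```

Swapping providers then means swapping one wrapper entry, not rewriting every prompt, though the per-model tweaking mentioned upthread still applies to the task body itself.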



