Allow user to provide their own llm client #106
Replies: 2 comments 8 replies
-
I think this looks good. @vinid had an initial WIP PR here, which had a similar goal of reducing the implementation work for a bring-your-own-engine setup. I really like what you're suggesting here too! I'm fine with either approach, whichever you two decide is useful.
-
Maybe it could be something as simple as this, I don't know. But wouldn't this introduce breaking changes? (Obviously I'm just showing it for OpenAI here.) Then you also wouldn't need
In fact, you have basically implemented this already with
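To make the "pass your own client" idea concrete, here is a minimal sketch. The `OpenAIEngine` name, constructor signature, and `generate` method are assumptions for illustration, not the library's actual API:

```python
from typing import Any


class OpenAIEngine:
    """Hypothetical engine that wraps a user-supplied OpenAI client."""

    def __init__(self, client: Any, model: str = "gpt-4o-mini"):
        # The user constructs and configures the client themselves
        # (api_key, base_url, timeouts, proxies, ...), so the library
        # doesn't have to mirror every client option.
        self.client = client
        self.model = model

    def generate(self, prompt: str) -> str:
        # Standard OpenAI-style chat completion call.
        response = self.client.chat.completions.create(
            model=self.model,
            messages=[{"role": "user", "content": prompt}],
        )
        return response.choices[0].message.content
```

Usage would be something like `engine = OpenAIEngine(client=openai.OpenAI(base_url=...))`, so any OpenAI-compatible endpoint works without the library ever touching credentials.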
-
I'm not a fan of the approach whereby the 'Engine' is created from scratch. It would be better if the user passed an LLM client into an adaptor function.
E.g.
Why?
Additionally, why not provide a way for users to add their own custom engine as a Callable -> str?
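The two ideas above (an adaptor function taking an existing client, and a plain prompt-to-string callable as a custom engine) might sketch out like this. `CallableEngine` and `from_client` are hypothetical names, not existing API:

```python
from typing import Callable

# A custom engine is just anything that maps a prompt string to a
# completion string.
EngineFn = Callable[[str], str]


class CallableEngine:
    """Hypothetical wrapper turning any prompt -> str callable into an
    engine object the library can use."""

    def __init__(self, fn: EngineFn):
        self.fn = fn

    def generate(self, prompt: str) -> str:
        return self.fn(prompt)


def from_client(client, model: str) -> CallableEngine:
    """Hypothetical adaptor: build an engine from a user-supplied
    OpenAI-style client instead of constructing one from scratch."""

    def call(prompt: str) -> str:
        resp = client.chat.completions.create(
            model=model,
            messages=[{"role": "user", "content": prompt}],
        )
        return resp.choices[0].message.content

    return CallableEngine(call)
```

The nice property of the Callable route is that local models, retries, caching, or test stubs all plug in the same way: `CallableEngine(lambda p: my_local_model(p))`.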
I'd be happy to give this a go but wanted to check for whether people thought this was a good idea first.