Hi Jack 👋,

Probably a long shot (given the large API differences), but are there any plans to support AWS Bedrock?

Given Amazon's investment in Anthropic, we've seen an uptick in interest in using Bedrock to run LLMs on their existing cloud infrastructure, and we'd like to continue using magentic.
Probably something better for LiteLLM?
From the docs at https://docs.aws.amazon.com/bedrock/latest/userguide/model-parameters.html, the request format (including messages) and the response format differ between the models hosted on Bedrock. So magentic would need to know how to map to/from each of these, which would probably be easiest as a separate ChatModel for each.
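To illustrate the problem, here is a minimal sketch of the per-model request mapping that would be needed. Nothing below is part of magentic; the field names follow the Bedrock inference-parameters docs linked above, and the helper names are hypothetical:

```python
import json

def anthropic_request(prompt: str, max_tokens: int = 256) -> dict:
    # Anthropic models on Bedrock take a prompt/max_tokens_to_sample body
    return {
        "prompt": f"\n\nHuman: {prompt}\n\nAssistant:",
        "max_tokens_to_sample": max_tokens,
    }

def titan_request(prompt: str, max_tokens: int = 256) -> dict:
    # Amazon Titan text models use inputText/textGenerationConfig instead
    return {
        "inputText": prompt,
        "textGenerationConfig": {"maxTokenCount": max_tokens},
    }

# The actual call would go through boto3's bedrock-runtime client, e.g.
# (requires AWS credentials, so not runnable here):
#   client = boto3.client("bedrock-runtime")
#   client.invoke_model(
#       modelId="anthropic.claude-v2",
#       body=json.dumps(anthropic_request("Hello")),
#   )
```

Each model family would need its own builder like these (and a matching response parser), which is why a separate ChatModel per provider seems like the natural shape.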
Other models on Bedrock might similarly be accessible through their provider's SDK, which could be a nice way to implement this because then those providers would also be supported in magentic. This is currently only the case for OpenAI on Azure.