[Bug]: sagemaker_chat provider does not correctly pass the model_id for the SageMaker Inference Component #9909
Labels
bug
What happened?

Steps to reproduce

This appears to be an issue specific to the sagemaker_chat model provider: it does not forward the inference component name (model_id) to the Boto3 invoke_endpoint() API.

Relevant log output
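For context, a minimal sketch of the expected behavior: the Boto3 sagemaker-runtime invoke_endpoint() call accepts an InferenceComponentName parameter, which must be set when the endpoint hosts inference components. The helper below is hypothetical (it is not LiteLLM's actual code); it only illustrates how a model_id value could be mapped onto that parameter.

```python
# Hypothetical sketch: forwarding a LiteLLM-style `model_id` value as the
# Boto3 `InferenceComponentName` parameter of invoke_endpoint().
# `build_invoke_kwargs` is an illustrative helper, not LiteLLM's implementation.
import json


def build_invoke_kwargs(endpoint_name, payload, model_id=None):
    """Build the kwargs dict for sagemaker_runtime.invoke_endpoint()."""
    kwargs = {
        "EndpointName": endpoint_name,
        "ContentType": "application/json",
        "Body": json.dumps(payload),
    }
    if model_id:
        # Without this mapping, endpoints backed by inference components
        # reject the request, since no component is addressed.
        kwargs["InferenceComponentName"] = model_id
    return kwargs


kwargs = build_invoke_kwargs(
    "my-endpoint",
    {"messages": [{"role": "user", "content": "hi"}]},
    model_id="my-inference-component",
)
# The call itself would then be:
# boto3.client("sagemaker-runtime").invoke_endpoint(**kwargs)
print(kwargs["InferenceComponentName"])
```

The bug reported here is that the forwarding step above (setting InferenceComponentName from model_id) does not happen in the sagemaker_chat provider.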
Are you a ML Ops Team?
No
What LiteLLM version are you on ?
1.65.7
Twitter / LinkedIn details
https://www.linkedin.com/in/dgallitelli/