
TensorRT-LLM (TRT-LLM) LMI model format artifacts not found when deploying #2498

Open · joshight opened this issue Oct 28, 2024 · 1 comment
Labels: bug (Something isn't working)

@joshight

Description


Model artifacts are in the (TRT-LLM) LMI model format:

```
aws s3 ls ***
                           PRE 1/
2024-10-25 14:59:16        739 config.json
2024-10-25 14:59:16      11222 config.pbtxt
2024-10-25 14:59:16        194 generation_config.json
2024-10-25 14:59:16         21 requirements.txt
2024-10-25 14:59:16        444 special_tokens_map.json
2024-10-25 14:59:16    9085698 tokenizer.json
2024-10-25 14:59:16      52097 tokenizer_config.json
```
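For reference, a minimal serving.properties for pre-compiled TRT-LLM artifacts on LMI typically points the container at the S3 prefix holding the compiled engine and enables the TensorRT-LLM rolling batch backend. This is only a sketch; the bucket path and tensor parallel degree below are placeholders, not values from this issue:

```properties
# Hypothetical serving.properties sketch for pre-compiled TRT-LLM artifacts (all values are placeholders)
engine=MPI
# S3 prefix containing the compiled TRT-LLM engine and tokenizer files
option.model_id=s3://my-bucket/trtllm-compiled-model/
# Use the TensorRT-LLM rolling batch backend
option.rolling_batch=trtllm
# Must match the tensor parallel degree the engine was compiled with
option.tensor_parallel_degree=1
```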

The latest DJL TensorRT-LLM container is being used: 763104351884.dkr.ecr.region.amazonaws.com/djl-inference:0.29.0-tensorrtllm0.11.0-cu124

DJL looks for Hugging Face artifacts to convert and fails when it does not find any:

OSError: Error no file named pytorch_model.bin, model.safetensors, tf_model.h5, model.ckpt.index or flax_model.msgpack found in directory /tmp/.djl.ai/download/bf8a789f03a76ad3e0773d75f1a9a366b66e57ba.

Expected Behavior


The model was converted and quantized prior to deployment, so the expectation is that these artifacts should deploy properly with the DJL TensorRT-LLM container.

Error Message


OSError: Error no file named pytorch_model.bin, model.safetensors, tf_model.h5, model.ckpt.index or flax_model.msgpack found in directory /tmp/.djl.ai/download/bf8a789f03a76ad3e0773d75f1a9a366b66e57ba.

How to Reproduce?


  1. Convert Hugging Face format artifacts to quantized TRT-LLM artifacts using a DJL container in a SageMaker notebook
  2. Push the quantized TensorRT-LLM artifacts to a new S3 path
  3. Deploy to SageMaker using the DJL container 763104351884.dkr.ecr.region.amazonaws.com/djl-inference:0.29.0-tensorrtllm0.11.0-cu124 (a minimal deployment sketch follows this list)
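
For context, deploying such artifacts with the SageMaker Python SDK might look like the sketch below, assuming a recent SDK version that accepts an uncompressed S3 prefix via the S3DataSource form of model_data. The role ARN, bucket path, region, and instance type are placeholders, not the values used in this issue:

```python
import sagemaker
from sagemaker.model import Model

# Placeholders -- substitute your own role, artifact location, and region/account in the image URI
role = "arn:aws:iam::123456789012:role/MySageMakerRole"
image_uri = "763104351884.dkr.ecr.us-east-1.amazonaws.com/djl-inference:0.29.0-tensorrtllm0.11.0-cu124"
model_data_prefix = "s3://my-bucket/trtllm-compiled-model/"  # prefix holding the TRT-LLM artifacts

model = Model(
    image_uri=image_uri,
    # Point at the uncompressed S3 prefix rather than a model.tar.gz
    model_data={
        "S3DataSource": {
            "S3Uri": model_data_prefix,
            "S3DataType": "S3Prefix",
            "CompressionType": "None",
        }
    },
    role=role,
    sagemaker_session=sagemaker.Session(),
)

predictor = model.deploy(
    initial_instance_count=1,
    instance_type="ml.g5.12xlarge",
    # Allow extra startup time for the container to load the compiled engine
    container_startup_health_check_timeout=900,
)
```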

Steps to reproduce


See above

What have you tried to solve it?

Tried different paths in serving.properties without any success.

@joshight added the bug label on Oct 28, 2024
@siddvenk (Contributor)

Can you provide the serving.properties configs you are using?
