Deploying in Yatai using bentoml API #2278
Unanswered
mlopezfernandez asked this question in Q&A
Replies: 1 comment
@mlopezfernandez Thank you for the encouraging words! Right now, deployment is available through the UI and through kubectl for the recently released 0.2.0 version, which added CRD support. You can write a Kubernetes YAML file similar to this:

```yaml
apiVersion: serving.yatai.ai/v1alpha1
kind: BentoDeployment
metadata:
  name: demo
spec:
  bento_tag: iris_classifier:nxapohu4codzplg6
  resources:
    limits:
      cpu: 1000m
    requests:
      cpu: 500m
```

and then apply it with kubectl.
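For reference, the same BentoDeployment manifest can also be built programmatically rather than written by hand. A minimal sketch follows; the deployment name and bento tag mirror the YAML example above, and the commented-out submission step assumes the official `kubernetes` Python client and a configured cluster (an assumption, not something stated in the answer):

```python
# Build a BentoDeployment custom resource as a plain dict.
# Field names follow the YAML manifest shown above.
def bento_deployment(name: str, bento_tag: str,
                     cpu_request: str = "500m",
                     cpu_limit: str = "1000m") -> dict:
    return {
        "apiVersion": "serving.yatai.ai/v1alpha1",
        "kind": "BentoDeployment",
        "metadata": {"name": name},
        "spec": {
            "bento_tag": bento_tag,
            "resources": {
                "limits": {"cpu": cpu_limit},
                "requests": {"cpu": cpu_request},
            },
        },
    }

manifest = bento_deployment("demo", "iris_classifier:nxapohu4codzplg6")

# Submitting it requires cluster access and the `kubernetes` client
# (assumption -- shown here only as a sketch):
#
#   from kubernetes import client, config
#   config.load_kube_config()
#   client.CustomObjectsApi().create_namespaced_custom_object(
#       group="serving.yatai.ai", version="v1alpha1",
#       namespace="default", plural="bentodeployments", body=manifest)
```

This is equivalent to `kubectl apply`-ing the YAML file, just driven from Python instead of the command line.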
Original question:
I am delighted with how bentoml and yatai work. It has been easy to set up a small example locally and later deploy it to AWS backed by S3. I have been able to deploy different models, but all through the graphical interface. Would it be possible to do this using the bentoml API?