deploy: d965291
NeuralChatBot committed Jan 13, 2025
1 parent d965291 commit 8414a29
Showing 372 changed files with 14,751 additions and 10,979 deletions.
31 changes: 18 additions & 13 deletions 404.html
@@ -780,25 +780,30 @@
 </ul>
 </li>
 <li class="toctree-l2"><a class="reference internal" href="microservices/index.html#lvms-microservice">Lvms Microservice</a><ul>
-<li class="toctree-l3"><a class="reference internal" href="GenAIComps/comps/lvms/llama-vision/README.html">LVM Microservice</a><ul>
-<li class="toctree-l4"><a class="reference internal" href="GenAIComps/comps/lvms/llama-vision/README.html#start-microservice-with-docker">🚀 Start Microservice with Docker</a></li>
+<li class="toctree-l3"><a class="reference internal" href="GenAIComps/comps/lvms/src/README.html">LVM Microservice</a><ul>
+<li class="toctree-l4"><a class="reference internal" href="GenAIComps/comps/lvms/src/README.html#build-image-run">Build Image &amp; Run</a></li>
+<li class="toctree-l4"><a class="reference internal" href="GenAIComps/comps/lvms/src/README.html#test">Test</a></li>
 </ul>
 </li>
-<li class="toctree-l3"><a class="reference internal" href="GenAIComps/comps/lvms/llava/README.html">LVM Microservice</a><ul>
-<li class="toctree-l4"><a class="reference internal" href="GenAIComps/comps/lvms/llava/README.html#start-microservice-with-python-option-1">🚀1. Start Microservice with Python (Option 1)</a></li>
-<li class="toctree-l4"><a class="reference internal" href="GenAIComps/comps/lvms/llava/README.html#start-microservice-with-docker-option-2">🚀2. Start Microservice with Docker (Option 2)</a></li>
+<li class="toctree-l3"><a class="reference internal" href="GenAIComps/comps/lvms/src/integrations/dependency/llama-vision/README.html">LVM Microservice</a><ul>
+<li class="toctree-l4"><a class="reference internal" href="GenAIComps/comps/lvms/src/integrations/dependency/llama-vision/README.html#start-microservice-with-docker">🚀 Start Microservice with Docker</a></li>
 </ul>
 </li>
-<li class="toctree-l3"><a class="reference internal" href="GenAIComps/comps/lvms/predictionguard/README.html">LVM Prediction Guard Microservice</a><ul>
-<li class="toctree-l4"><a class="reference internal" href="GenAIComps/comps/lvms/predictionguard/README.html#start-microservice-with-python">🚀1. Start Microservice with Python</a></li>
-<li class="toctree-l4"><a class="reference internal" href="GenAIComps/comps/lvms/predictionguard/README.html#start-microservice-with-docker-option-2">🚀2. Start Microservice with Docker (Option 2)</a></li>
-<li class="toctree-l4"><a class="reference internal" href="GenAIComps/comps/lvms/predictionguard/README.html#consume-lvm-service">🚀3. Consume LVM Service</a></li>
+<li class="toctree-l3"><a class="reference internal" href="GenAIComps/comps/lvms/src/integrations/dependency/llava/README.html">LVM Microservice</a><ul>
+<li class="toctree-l4"><a class="reference internal" href="GenAIComps/comps/lvms/src/integrations/dependency/llava/README.html#start-microservice-with-python-option-1">🚀1. Start Microservice with Python (Option 1)</a></li>
+<li class="toctree-l4"><a class="reference internal" href="GenAIComps/comps/lvms/src/integrations/dependency/llava/README.html#start-microservice-with-docker-option-2">🚀2. Start Microservice with Docker (Option 2)</a></li>
 </ul>
 </li>
-<li class="toctree-l3"><a class="reference internal" href="GenAIComps/comps/lvms/video-llama/README.html">LVM Microservice</a><ul>
-<li class="toctree-l4"><a class="reference internal" href="GenAIComps/comps/lvms/video-llama/README.html#start-microservice-with-docker">🚀1. Start Microservice with Docker</a></li>
-<li class="toctree-l4"><a class="reference internal" href="GenAIComps/comps/lvms/video-llama/README.html#test">✅ 2. Test</a></li>
-<li class="toctree-l4"><a class="reference internal" href="GenAIComps/comps/lvms/video-llama/README.html#clean">♻️ 3. Clean</a></li>
+<li class="toctree-l3"><a class="reference internal" href="GenAIComps/comps/lvms/src/integrations/dependency/predictionguard/README.html">LVM Prediction Guard Microservice</a><ul>
+<li class="toctree-l4"><a class="reference internal" href="GenAIComps/comps/lvms/src/integrations/dependency/predictionguard/README.html#start-microservice-with-python">🚀1. Start Microservice with Python</a></li>
+<li class="toctree-l4"><a class="reference internal" href="GenAIComps/comps/lvms/src/integrations/dependency/predictionguard/README.html#start-microservice-with-docker-option-2">🚀2. Start Microservice with Docker (Option 2)</a></li>
+<li class="toctree-l4"><a class="reference internal" href="GenAIComps/comps/lvms/src/integrations/dependency/predictionguard/README.html#consume-lvm-service">🚀3. Consume LVM Service</a></li>
+</ul>
+</li>
+<li class="toctree-l3"><a class="reference internal" href="GenAIComps/comps/lvms/src/integrations/dependency/video-llama/README.html">LVM Microservice</a><ul>
+<li class="toctree-l4"><a class="reference internal" href="GenAIComps/comps/lvms/src/integrations/dependency/video-llama/README.html#start-microservice-with-docker">🚀1. Start Microservice with Docker</a></li>
+<li class="toctree-l4"><a class="reference internal" href="GenAIComps/comps/lvms/src/integrations/dependency/video-llama/README.html#test">✅ 2. Test</a></li>
+<li class="toctree-l4"><a class="reference internal" href="GenAIComps/comps/lvms/src/integrations/dependency/video-llama/README.html#clean">♻️ 3. Clean</a></li>
 </ul>
 </li>
 </ul>
31 changes: 18 additions & 13 deletions latest/404.html
@@ -779,25 +779,30 @@
 </ul>
 </li>
 <li class="toctree-l2"><a class="reference internal" href="microservices/index.html#lvms-microservice">Lvms Microservice</a><ul>
-<li class="toctree-l3"><a class="reference internal" href="GenAIComps/comps/lvms/llama-vision/README.html">LVM Microservice</a><ul>
-<li class="toctree-l4"><a class="reference internal" href="GenAIComps/comps/lvms/llama-vision/README.html#start-microservice-with-docker">🚀 Start Microservice with Docker</a></li>
+<li class="toctree-l3"><a class="reference internal" href="GenAIComps/comps/lvms/src/README.html">LVM Microservice</a><ul>
+<li class="toctree-l4"><a class="reference internal" href="GenAIComps/comps/lvms/src/README.html#build-image-run">Build Image &amp; Run</a></li>
+<li class="toctree-l4"><a class="reference internal" href="GenAIComps/comps/lvms/src/README.html#test">Test</a></li>
 </ul>
 </li>
-<li class="toctree-l3"><a class="reference internal" href="GenAIComps/comps/lvms/llava/README.html">LVM Microservice</a><ul>
-<li class="toctree-l4"><a class="reference internal" href="GenAIComps/comps/lvms/llava/README.html#start-microservice-with-python-option-1">🚀1. Start Microservice with Python (Option 1)</a></li>
-<li class="toctree-l4"><a class="reference internal" href="GenAIComps/comps/lvms/llava/README.html#start-microservice-with-docker-option-2">🚀2. Start Microservice with Docker (Option 2)</a></li>
+<li class="toctree-l3"><a class="reference internal" href="GenAIComps/comps/lvms/src/integrations/dependency/llama-vision/README.html">LVM Microservice</a><ul>
+<li class="toctree-l4"><a class="reference internal" href="GenAIComps/comps/lvms/src/integrations/dependency/llama-vision/README.html#start-microservice-with-docker">🚀 Start Microservice with Docker</a></li>
 </ul>
 </li>
-<li class="toctree-l3"><a class="reference internal" href="GenAIComps/comps/lvms/predictionguard/README.html">LVM Prediction Guard Microservice</a><ul>
-<li class="toctree-l4"><a class="reference internal" href="GenAIComps/comps/lvms/predictionguard/README.html#start-microservice-with-python">🚀1. Start Microservice with Python</a></li>
-<li class="toctree-l4"><a class="reference internal" href="GenAIComps/comps/lvms/predictionguard/README.html#start-microservice-with-docker-option-2">🚀2. Start Microservice with Docker (Option 2)</a></li>
-<li class="toctree-l4"><a class="reference internal" href="GenAIComps/comps/lvms/predictionguard/README.html#consume-lvm-service">🚀3. Consume LVM Service</a></li>
+<li class="toctree-l3"><a class="reference internal" href="GenAIComps/comps/lvms/src/integrations/dependency/llava/README.html">LVM Microservice</a><ul>
+<li class="toctree-l4"><a class="reference internal" href="GenAIComps/comps/lvms/src/integrations/dependency/llava/README.html#start-microservice-with-python-option-1">🚀1. Start Microservice with Python (Option 1)</a></li>
+<li class="toctree-l4"><a class="reference internal" href="GenAIComps/comps/lvms/src/integrations/dependency/llava/README.html#start-microservice-with-docker-option-2">🚀2. Start Microservice with Docker (Option 2)</a></li>
 </ul>
 </li>
-<li class="toctree-l3"><a class="reference internal" href="GenAIComps/comps/lvms/video-llama/README.html">LVM Microservice</a><ul>
-<li class="toctree-l4"><a class="reference internal" href="GenAIComps/comps/lvms/video-llama/README.html#start-microservice-with-docker">🚀1. Start Microservice with Docker</a></li>
-<li class="toctree-l4"><a class="reference internal" href="GenAIComps/comps/lvms/video-llama/README.html#test">✅ 2. Test</a></li>
-<li class="toctree-l4"><a class="reference internal" href="GenAIComps/comps/lvms/video-llama/README.html#clean">♻️ 3. Clean</a></li>
+<li class="toctree-l3"><a class="reference internal" href="GenAIComps/comps/lvms/src/integrations/dependency/predictionguard/README.html">LVM Prediction Guard Microservice</a><ul>
+<li class="toctree-l4"><a class="reference internal" href="GenAIComps/comps/lvms/src/integrations/dependency/predictionguard/README.html#start-microservice-with-python">🚀1. Start Microservice with Python</a></li>
+<li class="toctree-l4"><a class="reference internal" href="GenAIComps/comps/lvms/src/integrations/dependency/predictionguard/README.html#start-microservice-with-docker-option-2">🚀2. Start Microservice with Docker (Option 2)</a></li>
+<li class="toctree-l4"><a class="reference internal" href="GenAIComps/comps/lvms/src/integrations/dependency/predictionguard/README.html#consume-lvm-service">🚀3. Consume LVM Service</a></li>
+</ul>
+</li>
+<li class="toctree-l3"><a class="reference internal" href="GenAIComps/comps/lvms/src/integrations/dependency/video-llama/README.html">LVM Microservice</a><ul>
+<li class="toctree-l4"><a class="reference internal" href="GenAIComps/comps/lvms/src/integrations/dependency/video-llama/README.html#start-microservice-with-docker">🚀1. Start Microservice with Docker</a></li>
+<li class="toctree-l4"><a class="reference internal" href="GenAIComps/comps/lvms/src/integrations/dependency/video-llama/README.html#test">✅ 2. Test</a></li>
+<li class="toctree-l4"><a class="reference internal" href="GenAIComps/comps/lvms/src/integrations/dependency/video-llama/README.html#clean">♻️ 3. Clean</a></li>
 </ul>
 </li>
 </ul>
31 changes: 18 additions & 13 deletions latest/CONTRIBUTING.html
@@ -781,25 +781,30 @@
 </ul>
 </li>
 <li class="toctree-l2"><a class="reference internal" href="microservices/index.html#lvms-microservice">Lvms Microservice</a><ul>
-<li class="toctree-l3"><a class="reference internal" href="GenAIComps/comps/lvms/llama-vision/README.html">LVM Microservice</a><ul>
-<li class="toctree-l4"><a class="reference internal" href="GenAIComps/comps/lvms/llama-vision/README.html#start-microservice-with-docker">🚀 Start Microservice with Docker</a></li>
+<li class="toctree-l3"><a class="reference internal" href="GenAIComps/comps/lvms/src/README.html">LVM Microservice</a><ul>
+<li class="toctree-l4"><a class="reference internal" href="GenAIComps/comps/lvms/src/README.html#build-image-run">Build Image &amp; Run</a></li>
+<li class="toctree-l4"><a class="reference internal" href="GenAIComps/comps/lvms/src/README.html#test">Test</a></li>
 </ul>
 </li>
-<li class="toctree-l3"><a class="reference internal" href="GenAIComps/comps/lvms/llava/README.html">LVM Microservice</a><ul>
-<li class="toctree-l4"><a class="reference internal" href="GenAIComps/comps/lvms/llava/README.html#start-microservice-with-python-option-1">🚀1. Start Microservice with Python (Option 1)</a></li>
-<li class="toctree-l4"><a class="reference internal" href="GenAIComps/comps/lvms/llava/README.html#start-microservice-with-docker-option-2">🚀2. Start Microservice with Docker (Option 2)</a></li>
+<li class="toctree-l3"><a class="reference internal" href="GenAIComps/comps/lvms/src/integrations/dependency/llama-vision/README.html">LVM Microservice</a><ul>
+<li class="toctree-l4"><a class="reference internal" href="GenAIComps/comps/lvms/src/integrations/dependency/llama-vision/README.html#start-microservice-with-docker">🚀 Start Microservice with Docker</a></li>
 </ul>
 </li>
-<li class="toctree-l3"><a class="reference internal" href="GenAIComps/comps/lvms/predictionguard/README.html">LVM Prediction Guard Microservice</a><ul>
-<li class="toctree-l4"><a class="reference internal" href="GenAIComps/comps/lvms/predictionguard/README.html#start-microservice-with-python">🚀1. Start Microservice with Python</a></li>
-<li class="toctree-l4"><a class="reference internal" href="GenAIComps/comps/lvms/predictionguard/README.html#start-microservice-with-docker-option-2">🚀2. Start Microservice with Docker (Option 2)</a></li>
-<li class="toctree-l4"><a class="reference internal" href="GenAIComps/comps/lvms/predictionguard/README.html#consume-lvm-service">🚀3. Consume LVM Service</a></li>
+<li class="toctree-l3"><a class="reference internal" href="GenAIComps/comps/lvms/src/integrations/dependency/llava/README.html">LVM Microservice</a><ul>
+<li class="toctree-l4"><a class="reference internal" href="GenAIComps/comps/lvms/src/integrations/dependency/llava/README.html#start-microservice-with-python-option-1">🚀1. Start Microservice with Python (Option 1)</a></li>
+<li class="toctree-l4"><a class="reference internal" href="GenAIComps/comps/lvms/src/integrations/dependency/llava/README.html#start-microservice-with-docker-option-2">🚀2. Start Microservice with Docker (Option 2)</a></li>
 </ul>
 </li>
-<li class="toctree-l3"><a class="reference internal" href="GenAIComps/comps/lvms/video-llama/README.html">LVM Microservice</a><ul>
-<li class="toctree-l4"><a class="reference internal" href="GenAIComps/comps/lvms/video-llama/README.html#start-microservice-with-docker">🚀1. Start Microservice with Docker</a></li>
-<li class="toctree-l4"><a class="reference internal" href="GenAIComps/comps/lvms/video-llama/README.html#test">✅ 2. Test</a></li>
-<li class="toctree-l4"><a class="reference internal" href="GenAIComps/comps/lvms/video-llama/README.html#clean">♻️ 3. Clean</a></li>
+<li class="toctree-l3"><a class="reference internal" href="GenAIComps/comps/lvms/src/integrations/dependency/predictionguard/README.html">LVM Prediction Guard Microservice</a><ul>
+<li class="toctree-l4"><a class="reference internal" href="GenAIComps/comps/lvms/src/integrations/dependency/predictionguard/README.html#start-microservice-with-python">🚀1. Start Microservice with Python</a></li>
+<li class="toctree-l4"><a class="reference internal" href="GenAIComps/comps/lvms/src/integrations/dependency/predictionguard/README.html#start-microservice-with-docker-option-2">🚀2. Start Microservice with Docker (Option 2)</a></li>
+<li class="toctree-l4"><a class="reference internal" href="GenAIComps/comps/lvms/src/integrations/dependency/predictionguard/README.html#consume-lvm-service">🚀3. Consume LVM Service</a></li>
+</ul>
+</li>
+<li class="toctree-l3"><a class="reference internal" href="GenAIComps/comps/lvms/src/integrations/dependency/video-llama/README.html">LVM Microservice</a><ul>
+<li class="toctree-l4"><a class="reference internal" href="GenAIComps/comps/lvms/src/integrations/dependency/video-llama/README.html#start-microservice-with-docker">🚀1. Start Microservice with Docker</a></li>
+<li class="toctree-l4"><a class="reference internal" href="GenAIComps/comps/lvms/src/integrations/dependency/video-llama/README.html#test">✅ 2. Test</a></li>
+<li class="toctree-l4"><a class="reference internal" href="GenAIComps/comps/lvms/src/integrations/dependency/video-llama/README.html#clean">♻️ 3. Clean</a></li>
 </ul>
 </li>
 </ul>