

github-actions[bot] edited this page Oct 22, 2023 · 14 revisions

how-to-use-functions-with-GPT-chat-API

Overview

Description: The "Use Functions with Chat Models" chat flow illustrates how to use the LLM tool's Chat API with external functions, extending the capabilities of GPT models. The Chat Completion API includes an optional `functions` parameter, which can be used to supply function specifications. This allows the model to generate arguments that conform to the given specifications. Note, however, that the API does not execute any function calls itself; executing function calls using the model's output is the developer's responsibility.

### Inference samples

| Inference type | Python sample (Notebook) | CLI with YAML |
|--|--|--|
| Real time | deploy-promptflow-model-python-example | deploy-promptflow-model-cli-example |
| Batch | N/A | N/A |

### Sample inputs and outputs (for real-time inference)

#### Sample input

```json
{
  "inputs": {
    "question": "How about London next week?"
  }
}
```

#### Sample output

```json
{
  "outputs": {
    "answer": "Function generation requested, function = get_n_day_weather_forecast, args = { 'location': 'London', 'num_days': 7, 'format': 'celsius' }"
  }
}
```
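The pattern described above can be sketched in Python: the developer declares a function specification, passes it via the `functions` parameter, and then inspects the assistant message for a `function_call` to execute. The JSON schema details of `get_n_day_weather_forecast` below are an illustrative assumption (only the function name and argument names appear in the sample output above), and the assistant message is simulated rather than fetched from a live Chat Completion call.

```python
import json

# Illustrative function specification for the Chat Completion API's optional
# 'functions' parameter. The schema is an assumption; only the name and
# argument names come from the sample output above.
FUNCTIONS = [
    {
        "name": "get_n_day_weather_forecast",
        "description": "Get an N-day weather forecast for a location.",
        "parameters": {
            "type": "object",
            "properties": {
                "location": {"type": "string"},
                "num_days": {"type": "integer"},
                "format": {"type": "string", "enum": ["celsius", "fahrenheit"]},
            },
            "required": ["location", "num_days"],
        },
    }
]

def dispatch_function_call(message: dict) -> str:
    """The API only suggests a call; executing it is up to the developer."""
    call = message.get("function_call")
    if call is None:
        # No function requested; return the plain text answer.
        return message.get("content") or ""
    # Arguments arrive as a JSON-encoded string, not a dict.
    args = json.loads(call["arguments"])
    if call["name"] == "get_n_day_weather_forecast":
        # A real implementation would query a weather service here.
        return (f"Function generation requested, function = {call['name']}, "
                f"args = {args}")
    raise ValueError(f"Unknown function: {call['name']}")

# Simulated assistant message, shaped like a Chat Completion response choice:
message = {
    "role": "assistant",
    "content": None,
    "function_call": {
        "name": "get_n_day_weather_forecast",
        "arguments": '{"location": "London", "num_days": 7, "format": "celsius"}',
    },
}
print(dispatch_function_call(message))
```

In a real deployment, `FUNCTIONS` would be passed alongside the chat messages, and `dispatch_function_call` would run after each model response, feeding the function's result back into the conversation for a final answer.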

Version: 4

View in Studio: https://ml.azure.com/registries/azureml/models/how-to-use-functions-with-GPT-chat-API/version/4

Properties

is-promptflow: True

azureml.promptflow.section: gallery

azureml.promptflow.type: chat

azureml.promptflow.name: Use Functions with Chat Models

azureml.promptflow.description: Combining external functions to extend the capabilities of GPT chat models.

inference-min-sku-spec: 2|0|14|28

inference-recommended-sku: Standard_DS3_v2
