Releases: jackmpcollins/magentic
v0.15.0
What's Changed
- Use Ellipses in functions with docstrings by @jackmpcollins in #109
- Use function instead of FunctionCall in FunctionResultMessage by @jackmpcollins in #110
- Add format method to Message. Allow registering new types with message_to_openai_message by @jackmpcollins in #114
- poetry update by @jackmpcollins in #115
- Add last_message property to Chat by @jackmpcollins in #117
Full Changelog: v0.14.1...v0.15.0
`Message` subclasses now have a `format` method to customize their value using the function arguments when used with the `@chatprompt` decorator. This allows for more complex templating than the string templating that is currently supported. For a real use case with GPT-4-vision, see in-progress PR #116.
```python
from typing import Any

from magentic import chatprompt
from magentic.chat_model.message import Message


class UppercaseMessage(Message[str]):
    def format(self, **kwargs: Any) -> "UppercaseMessage":
        upper_kwargs = {k: str(v).upper() for k, v in kwargs.items()}
        return UppercaseMessage(self.content.format(**upper_kwargs))


@chatprompt(
    UppercaseMessage("hello {x}"),
)
def say_hello(x: str) -> str: ...


say_hello.format("world")
# [UppercaseMessage('hello WORLD')]
```
In addition, the OpenAI JSON serialization for custom `Message` subclasses can be added without modifying `magentic` itself! This uses the `functools.singledispatch` decorator. Together, these changes allow users to create `Message` classes that they can template exactly as needed, which will be useful as the variety of types of inputs to LLMs increases.
```python
from typing import Any

from openai.types.chat import ChatCompletionMessageParam

from magentic.chat_model.message import Message
from magentic.chat_model.openai_chat_model import OpenaiMessageRole, message_to_openai_message


class CustomMessage(Message[str]):
    def format(self, **kwargs: Any) -> "CustomMessage":
        return CustomMessage(self.content)


@message_to_openai_message.register
def _(message: CustomMessage) -> ChatCompletionMessageParam:
    return {"role": OpenaiMessageRole.USER.value, "content": message.content}


message_to_openai_message(CustomMessage("hello"))
# {'role': 'user', 'content': 'hello'}
```
PR #114
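The registration hook above relies on the standard library's `functools.singledispatch`, which dispatches on the type of the first argument. A minimal stdlib-only sketch of the same extension pattern (the names here are illustrative, not part of magentic):

```python
from functools import singledispatch


@singledispatch
def to_payload(message: object) -> dict:
    # Fallback for unregistered message types
    raise NotImplementedError(f"Unsupported type: {type(message)}")


class GreetingMessage:
    def __init__(self, content: str) -> None:
        self.content = content


# Register a handler for a new type without modifying to_payload itself
@to_payload.register
def _(message: GreetingMessage) -> dict:
    return {"role": "user", "content": message.content}


to_payload(GreetingMessage("hello"))
# {'role': 'user', 'content': 'hello'}
```

Because registration happens at the call site rather than inside the dispatcher, downstream code can add serialization for its own types without patching the library.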
Warning
Breaking Change: the `FunctionResultMessage` init param `function_call: FunctionCall` has been replaced by `function: Callable`.
Only the name of the function is needed when serializing the `FunctionResultMessage`, so the whole `FunctionCall` is not required. This simplifies creating `@chatprompt` functions where the chat contains a function call, e.g.
```python
from magentic import (
    chatprompt,
    AssistantMessage,
    UserMessage,
    FunctionCall,
    FunctionResultMessage,
)


def plus(a: int, b: int) -> int:
    return a + b


@chatprompt(
    UserMessage("Use the plus function to add 1 and 2."),
    AssistantMessage(FunctionCall(plus, 1, 2)),
    FunctionResultMessage(3, plus),
)
def do_math() -> str: ...


do_math()
# 'The sum of 1 and 2 is 3.'
```
PR #110
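The reasoning behind the change: when a function result is serialized for the OpenAI API, only the function's name appears in the message payload, and the callable itself provides that. A rough sketch of the shape (the helper here is illustrative, not magentic's actual serializer; field names follow the OpenAI function-role message format):

```python
from typing import Any, Callable


def serialize_function_result(function: Callable[..., Any], result: Any) -> dict:
    # The payload needs only the function's name, not its call arguments,
    # so passing the callable is enough -- no FunctionCall object required.
    return {"role": "function", "name": function.__name__, "content": str(result)}


def plus(a: int, b: int) -> int:
    return a + b


serialize_function_result(plus, 3)
# {'role': 'function', 'name': 'plus', 'content': '3'}
```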
v0.14.1
What's Changed
- Add dependabot config by @jackmpcollins in #85
- Run poetry update by @jackmpcollins in #98
- Add ruff to pyproject.toml. Update ruff. by @jackmpcollins in #107
- Handle functions with *args and **kwargs by @jackmpcollins in #106
Dependabot
- Bump jinja2 from 3.1.2 to 3.1.3 by @dependabot in #81
- Bump jupyter-lsp from 2.2.0 to 2.2.2 by @dependabot in #82
- Bump notebook from 7.0.0 to 7.0.7 by @dependabot in #83
- Bump jupyterlab from 4.0.3 to 4.0.11 by @dependabot in #84
- Bump actions/checkout from 3 to 4 by @dependabot in #86
- Bump actions/setup-python from 4 to 5 by @dependabot in #87
- Bump pytest from 7.4.0 to 7.4.4 by @dependabot in #89
- Bump mypy from 1.4.1 to 1.8.0 by @dependabot in #92
- Bump openai from 1.1.1 to 1.9.0 by @dependabot in #88
- Bump litellm from 1.0.0 to 1.18.9 by @dependabot in #90
- Bump pytest from 7.4.4 to 8.0.0 by @dependabot in #93
- Bump pytest-asyncio from 0.21.1 to 0.23.4 by @dependabot in #94
- Bump openai from 1.10.0 to 1.12.0 by @dependabot in #105
- Bump pytest-asyncio from 0.23.3 to 0.23.5 by @dependabot in #103
- Bump litellm from 1.20.6 to 1.23.14 by @dependabot in #108
Full Changelog: v0.14.0...v0.14.1
v0.14.0
What's Changed
- Add stop param by @jackmpcollins and @mnicstruwig in #80
Full Changelog: v0.13.0...v0.14.0
v0.13.0
What's Changed
- Bump jupyter-server from 2.7.2 to 2.11.2 by @dependabot in #75
- Allow setting api_key in OpenaiChatModel by @jackmpcollins in #76
Full Changelog: v0.12.0...v0.13.0
v0.12.0
What's Changed
- Bump aiohttp from 3.8.6 to 3.9.0 by @dependabot in #70
- Add OpenAI seed param for deterministic sampling by @jackmpcollins in #71
Full Changelog: v0.11.1...v0.12.0
v0.11.1
v0.11.0
What's Changed
- Add support for Azure via OpenaiChatModel by @jackmpcollins in #65
Full Changelog: v0.10.0...v0.11.0
v0.10.0
v0.9.1
Full Changelog: v0.9.0...v0.9.1
v0.9.0
What's Changed
- Add LiteLLM backend by @jackmpcollins in #54
Full Changelog: v0.8.0...v0.9.0
Example of LiteLLM backend
```python
from magentic import prompt
from magentic.chat_model.litellm_chat_model import LitellmChatModel


@prompt(
    "Talk to me! ",
    model=LitellmChatModel("ollama/llama2"),
)
def say_hello() -> str: ...


say_hello()
```
See the Backend/LLM Configuration section of the README for how to set the LiteLLM backend as the default.
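That configuration is driven by environment variables; assuming the settings names documented in the README, making LiteLLM the default backend looks roughly like:

```shell
# Assumed settings names -- see the README's Backend/LLM Configuration section
export MAGENTIC_BACKEND=litellm
export MAGENTIC_LITELLM_MODEL=ollama/llama2
```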