
Releases: jackmpcollins/magentic

v0.15.0

20 Feb 02:41

What's Changed

Full Changelog: v0.14.1...v0.15.0


Message subclasses now have a format method that customizes their value using the function arguments when used with the @chatprompt decorator. This allows more complex templating than the plain string templating currently supported. For a real use case with GPT-4-vision, see the in-progress PR #116

from typing import Any
from magentic import chatprompt
from magentic.chat_model.message import Message


class UppercaseMessage(Message[str]):
    def format(self, **kwargs: Any) -> "UppercaseMessage":
        # Uppercase all template values before substituting them into the content
        upper_kwargs = {k: str(v).upper() for k, v in kwargs.items()}
        return UppercaseMessage(self.content.format(**upper_kwargs))


@chatprompt(
    UppercaseMessage("hello {x}"),
)
def say_hello(x: str) -> str:
    ...


say_hello.format("world")
# [UppercaseMessage('hello WORLD')]

In addition, the OpenAI JSON serialization for custom Message subclasses can be added without modifying magentic itself, using the functools.singledispatch decorator. Together, these changes allow users to create Message classes they can template exactly as needed, which will become more useful as the variety of input types accepted by LLMs increases.

from typing import Any

from magentic.chat_model.openai_chat_model import OpenaiMessageRole, message_to_openai_message
from magentic.chat_model.message import Message
from openai.types.chat import ChatCompletionMessageParam


class CustomMessage(Message[str]):
    def format(self, **kwargs: Any) -> "CustomMessage":
        return CustomMessage(self.content)


# Register the OpenAI serialization for CustomMessage via singledispatch
@message_to_openai_message.register
def _(message: CustomMessage) -> ChatCompletionMessageParam:
    return {"role": OpenaiMessageRole.USER.value, "content": message.content}


message_to_openai_message(CustomMessage("hello"))
# {'role': 'user', 'content': 'hello'}
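
Putting the two together, such a custom message can then be used directly in a @chatprompt, with the registered handler taking care of serialization (a minimal sketch based on the snippets above):

@chatprompt(
    CustomMessage("hello"),  # serialized by the handler registered above
)
def greet() -> str:
    ...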

PR #114


Warning

Breaking Change: the FunctionResultMessage init param function_call: FunctionCall has been replaced by function: Callable

Only the name of the function is needed when serializing the FunctionResultMessage, so the whole FunctionCall is not required. This simplifies creating a @chatprompt function where the chat contains a function call, e.g.

from magentic import (
    chatprompt,
    AssistantMessage,
    UserMessage,
    FunctionCall,
    FunctionResultMessage,
)


def plus(a: int, b: int) -> int:
    return a + b


@chatprompt(
    UserMessage("Use the plus function to add 1 and 2."),
    AssistantMessage(FunctionCall(plus, 1, 2)),
    FunctionResultMessage(3, plus),
)
def do_math() -> str:
    ...


do_math()
# 'The sum of 1 and 2 is 3.'
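
For migration, the change looks roughly like this (a sketch; the old keyword name function_call is taken from the breaking-change note above):

# Before v0.15: the whole FunctionCall was required
FunctionResultMessage(3, function_call=FunctionCall(plus, 1, 2))

# From v0.15: pass the function itself
FunctionResultMessage(3, plus)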

PR #110

v0.14.1

18 Feb 04:56

What's Changed

Dependabot

Full Changelog: v0.14.0...v0.14.1

v0.14.0

08 Jan 06:41

What's Changed

Full Changelog: v0.13.0...v0.14.0

v0.13.0

06 Dec 09:16

What's Changed

Full Changelog: v0.12.0...v0.13.0

v0.12.0

29 Nov 05:59

What's Changed

Full Changelog: v0.11.1...v0.12.0

v0.11.1

25 Nov 21:30

What's Changed

Full Changelog: v0.11.0...v0.11.1

v0.11.0

25 Nov 18:54

What's Changed

Full Changelog: v0.10.0...v0.11.0

v0.10.0

15 Nov 05:28

What's Changed

Full Changelog: v0.9.1...v0.10.0

v0.9.1

07 Nov 05:04

v0.9.0

06 Nov 05:08

What's Changed

Full Changelog: v0.8.0...v0.9.0


Example of using the LiteLLM backend:

from magentic import prompt
from magentic.chat_model.litellm_chat_model import LitellmChatModel


@prompt(
    "Talk to me! ",
    model=LitellmChatModel("ollama/llama2"),
)
def say_hello() -> str:
    ...


say_hello()

See the Backend/LLM Configuration section of the README for how to set the LiteLLM backend as the default.
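
For example, a minimal sketch of setting the default via environment variables, assuming the MAGENTIC_BACKEND and MAGENTIC_LITELLM_MODEL variable names (see the README for the authoritative configuration):

import os

# Assumed env var names: magentic reads its settings from the environment,
# so set these before the first prompt-function call
os.environ["MAGENTIC_BACKEND"] = "litellm"
os.environ["MAGENTIC_LITELLM_MODEL"] = "ollama/llama2"

from magentic import prompt


@prompt("Talk to me! ")
def say_hello() -> str:
    ...


say_hello()  # now uses the LiteLLM backend by default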