
Releases: jackmpcollins/magentic

v0.7.0

02 Oct 07:21

What's Changed

Full Changelog: v0.6.0...v0.7.0


Chat Prompting

The @chatprompt decorator works just like @prompt but allows you to pass chat messages as a template rather than a single text prompt. This can be used to provide a system message, or for few-shot prompting where you supply example responses to guide the model's output. Format fields denoted by curly braces {example} are filled in across all messages; use the escape_braces function to prevent a string from being interpreted as a template.

from magentic import chatprompt, AssistantMessage, SystemMessage, UserMessage
from magentic.chatprompt import escape_braces

from pydantic import BaseModel


class Quote(BaseModel):
    quote: str
    character: str


@chatprompt(
    SystemMessage("You are a movie buff."),
    UserMessage("What is your favorite quote from Harry Potter?"),
    AssistantMessage(
        Quote(
            quote="It does not do to dwell on dreams and forget to live.",
            character="Albus Dumbledore",
        )
    ),
    UserMessage("What is your favorite quote from {movie}?"),
)
def get_movie_quote(movie: str) -> Quote:
    ...


get_movie_quote("Iron Man")
# Quote(quote='I am Iron Man.', character='Tony Stark')
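The escaping that escape_braces relies on is plain str.format brace-doubling; a minimal stdlib sketch of the mechanism (the escape function here is a local stand-in, not magentic's implementation):

```python
def escape(text: str) -> str:
    # Double each brace so str.format treats it as a literal character
    # rather than the start of a format field
    return text.replace("{", "{{").replace("}", "}}")


raw = "use the {format} method"
escaped = escape(raw)
print(escaped)           # use the {{format}} method
print(escaped.format())  # use the {format} method
```

Passing an escaped string into a message means its braces survive the template-filling step intact.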

v0.6.0

25 Sep 06:04

What's Changed

New Contributors

Full Changelog: v0.5.0...v0.6.0

v0.5.0

15 Sep 08:59

What's Changed

Full Changelog: v0.4.1...v0.5.0


from magentic import prompt_chain


async def get_current_weather(location, unit="fahrenheit"):
    """Get the current weather in a given location"""
    return {
        "location": location,
        "temperature": "72",
        "unit": unit,
        "forecast": ["sunny", "windy"],
    }


@prompt_chain(
    template="What's the weather like in {city}?",
    functions=[get_current_weather],
)
async def describe_weather(city: str) -> str:
    ...


output = await describe_weather("Boston")
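Since describe_weather is a coroutine function, the await above must run inside an event loop. A self-contained sketch of that pattern, with a plain coroutine standing in for the decorated function (describe_weather_stub is illustrative, not part of magentic):

```python
import asyncio


async def describe_weather_stub(city: str) -> str:
    # Stand-in for the prompt_chain-decorated coroutine above
    return f"The weather in {city} is 72 and sunny."


async def main() -> str:
    return await describe_weather_stub("Boston")


output = asyncio.run(main())
print(output)
```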

v0.4.1

13 Sep 09:08

What's Changed

New Contributors

Full Changelog: v0.4.0...v0.4.1

v0.4.0

10 Sep 07:33

What's Changed

Full Changelog: v0.3.0...v0.4.0


Configuration

The order of precedence of configuration is

  1. Arguments passed when initializing an instance in Python
  2. Environment variables

The following environment variables can be set.

Environment Variable           Description
MAGENTIC_OPENAI_MODEL          OpenAI model, e.g. "gpt-4"
MAGENTIC_OPENAI_TEMPERATURE    OpenAI temperature (float)

v0.3.0

09 Sep 09:51

What's Changed

Full Changelog: v0.2.0...v0.3.0


Object Streaming

Structured outputs can also be streamed from the LLM by using the return type annotation Iterable (or AsyncIterable). This allows each item to be processed while the next one is being generated. See the example in examples/quiz for how this can be used to improve user experience by quickly displaying/using the first item returned.

from collections.abc import Iterable
from time import time

from magentic import prompt
from pydantic import BaseModel


class Superhero(BaseModel):
    name: str
    age: int
    power: str
    enemies: list[str]


@prompt("Create a Superhero team named {name}.")
def create_superhero_team(name: str) -> Iterable[Superhero]:
    ...


start_time = time()
for hero in create_superhero_team("The Food Dudes"):
    print(f"{time() - start_time:.2f}s : {hero}")

# 2.23s : name='Pizza Man' age=30 power='Can shoot pizza slices from his hands' enemies=['The Hungry Horde', 'The Junk Food Gang']
# 4.03s : name='Captain Carrot' age=35 power='Super strength and agility from eating carrots' enemies=['The Sugar Squad', 'The Greasy Gang']
# 6.05s : name='Ice Cream Girl' age=25 power='Can create ice cream out of thin air' enemies=['The Hot Sauce Squad', 'The Healthy Eaters']
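The AsyncIterable variant consumes items the same way, just with async for; a self-contained sketch with an async generator standing in for the model stream (stream_heroes is illustrative, not part of magentic):

```python
import asyncio
from collections.abc import AsyncIterable


async def stream_heroes(team: str) -> AsyncIterable[str]:
    # Stand-in for an AsyncIterable-returning prompt function: each name
    # becomes available before the next one is generated
    for name in ["Pizza Man", "Captain Carrot", "Ice Cream Girl"]:
        await asyncio.sleep(0)  # simulate waiting on the model stream
        yield name


async def main() -> list[str]:
    # Each item can be processed here while the next is still pending
    return [name async for name in stream_heroes("The Food Dudes")]


heroes = asyncio.run(main())
print(heroes)
```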

v0.2.0

15 Aug 07:17

What's Changed

Full Changelog: v0.1.4...v0.2.0


Streaming

The StreamedStr (and AsyncStreamedStr) class can be used to stream the output of the LLM. This allows you to process the text while it is being generated, rather than receiving the whole output at once. Multiple StreamedStr instances can be created at the same time to stream LLM outputs concurrently. In the example below, generating descriptions for multiple countries takes approximately the same amount of time as generating one for a single country.

from magentic import prompt, StreamedStr


@prompt("Tell me about {country}")
def describe_country(country: str) -> StreamedStr:
    ...


# Print the chunks while they are being received
for chunk in describe_country("Brazil"):
    print(chunk, end="")
# 'Brazil, officially known as the Federative Republic of Brazil, is ...'


# Generate text concurrently by creating the streams before consuming them
streamed_strs = [describe_country(c) for c in ["Australia", "Brazil", "Chile"]]
[str(s) for s in streamed_strs]
# ["Australia is a country ...", "Brazil, officially known as ...", "Chile, officially known as ..."]

v0.1.4

12 Aug 06:01

What's Changed

  • Remove ability to use function docstring as template by @jackmpcollins in #3
  • Raise StructuredOutputError from ValidationError to clarify error by @jackmpcollins in #4

Full Changelog: v0.1.3...v0.1.4

v0.1.3

31 Jul 05:57

Main Changes

Commits

  • b7adc1a Support async prompt functions (#2)
  • 428596e Add example for RAG with wikipedia
  • 7e96aae Add test for parsing/serializing str|None
  • 2d676c9 Use all to explicitly export from top-level
  • c74d121 poetry add --group examples wikipedia
  • 6fe55d9 Add examples/quiz
  • e389547 Set --cov-report=term-missing for pytest-cov

Full Changelog: v0.1.2...v0.1.3

v0.1.2

21 Jul 07:41

Main Changes

  • Handle pydantic models as dictionaries values in DictFunctionSchema.serialize_args
  • Exclude unset parameters when creating FunctionCall in FunctionCallFunctionSchema.parse_args
  • Add FunctionCall.__eq__ method
  • Increase test coverage

Commits

  • 506d689 poetry update - address aiohttp CVE
  • feac090 Update README: improve first example, add more explanation
  • dab90cf poetry add jupyter --group examples
  • 992e65e poetry add pytest-cov
  • a05f057 Test FunctionCallFunctionSchema serialize_args, and FunctionCall
  • ed8e9d9 Test AnyFunctionSchema serialize_args
  • 606cb30 Test DictFunctionSchema serialize_args
  • ae6218e Test OrderedDict works with parse_args
  • 82c1d41 Tidy function_schemas creation in Model.complete

Full Changelog: v0.1.1...v0.1.2