A library for interacting with Claude AI and ChatGPT.
At present only text requests and responses are supported.
Install via Composer:
composer require elliotjreed/ai
There are two classes, one for Claude AI, and one for ChatGPT.
$claude = new ElliotJReed\AI\ClaudeAI\Prompt('API KEY', 'claude-3-haiku-20240307');
$chatGPT = new ElliotJReed\AI\ChatGPT\Prompt('API KEY', 'gpt-4o-mini');
Each takes your API key as the first constructor argument and the model you want to use as the second.
You can optionally provide a Guzzle HTTP client:
$claude = new ElliotJReed\AI\ClaudeAI\Prompt('API KEY', 'claude-3-haiku-20240307', new \GuzzleHttp\Client());
$chatGPT = new ElliotJReed\AI\ChatGPT\Prompt('API KEY', 'gpt-4o-mini', new \GuzzleHttp\Client());
This can be useful when you are using a framework such as Symfony: you can autowire the service and reference a configured Guzzle client.
Here's an example of a Symfony integration in the services.yaml file:
guzzle.client.ai:
    class: GuzzleHttp\Client
    arguments:
        - {
            timeout: 10,
            headers: {
                'User-Agent': 'My Symfony Project'
            }
        }

ElliotJReed\AI\ClaudeAI\Prompt:
    class: ElliotJReed\AI\ClaudeAI\Prompt
    arguments:
        $apiKey: '%env(string:CLAUDE_API_KEY)%'
        $model: 'claude-3-haiku-20240307'
        $client: '@guzzle.client.ai'

ElliotJReed\AI\ChatGPT\Prompt:
    class: ElliotJReed\AI\ChatGPT\Prompt
    arguments:
        $apiKey: '%env(string:CHATGPT_API_KEY)%'
        $model: 'gpt-4o-mini'
        $client: '@guzzle.client.ai'
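With the services registered as above, the prompt can then be constructor-injected into your own services. The class and method names below are illustrative only, not part of the library:

```php
<?php

namespace App\Service;

use ElliotJReed\AI\ClaudeAI\Prompt;
use ElliotJReed\AI\Entity\Request;

final class AskAi
{
    // Symfony autowires the configured Prompt service defined in services.yaml
    public function __construct(private readonly Prompt $prompt)
    {
    }

    public function ask(string $question): string
    {
        $request = (new Request())->setInput($question);

        return $this->prompt->send($request)->getContent();
    }
}
```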
For a really simple Claude request and response:
<?php
require_once __DIR__ . '/vendor/autoload.php';
$prompt = new ElliotJReed\AI\ClaudeAI\Prompt('API KEY', 'claude-3-haiku-20240307');
$request = (new ElliotJReed\AI\Entity\Request())
->setInput('Which programming language will outlive humanity?');
$response = $prompt->send($request);
echo 'Used input tokens: ' . $response->getUsage()->getInputTokens() . \PHP_EOL;
echo 'Used output tokens: ' . $response->getUsage()->getOutputTokens() . \PHP_EOL;
echo 'Response from AI: ' . $response->getContent() . \PHP_EOL;
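Because requests are sent over HTTP via Guzzle, it may be worth guarding against transport failures. The exact exceptions this library throws are not documented here, so catching Guzzle's base exception interface is an assumption in this sketch:

```php
<?php

require_once __DIR__ . '/vendor/autoload.php';

$prompt = new ElliotJReed\AI\ClaudeAI\Prompt('API KEY', 'claude-3-haiku-20240307');

$request = (new ElliotJReed\AI\Entity\Request())
    ->setInput('Which programming language will outlive humanity?');

try {
    $response = $prompt->send($request);
    echo 'Response from AI: ' . $response->getContent() . \PHP_EOL;
} catch (\GuzzleHttp\Exception\GuzzleException $exception) {
    // Covers network failures, timeouts, and non-2xx responses from the API
    error_log('AI request failed: ' . $exception->getMessage());
}
```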
You can also provide a role, as well as additional context, data, and examples; you can set the temperature (between 0 and 1, essentially how "creative" you want the AI to be) and the maximum number of tokens to use (recommended when the user input comes from an indirect source, for example an online chatbot):
<?php
require_once __DIR__ . '/vendor/autoload.php';
$prompt = new ElliotJReed\AI\ClaudeAI\Prompt('API KEY', 'claude-3-haiku-20240307');
$request = (new ElliotJReed\AI\Entity\Request())
->setContext('The user input is coming from a software development advice website which provides information to aspiring software developers.')
->setRole('You are an expert in software development')
->setInstructions('Answer the user\'s query in a friendly, clear, and concise manner')
->setInput('Which programming language will outlive humanity?')
->setTemperature(0.5)
->setMaximumTokens(600)
->setExamples([(new ElliotJReed\AI\Entity\Example())
->setPrompt('Which programming language do you think will still be used in the year 3125?')
->setResponse('I think PHP will be around for at least another 7 million years.')
])
->setData('You could add some JSON, CSV, or YAML data here.');
$response = $prompt->send($request);
echo 'Used input tokens: ' . $response->getUsage()->getInputTokens() . \PHP_EOL;
echo 'Used output tokens: ' . $response->getUsage()->getOutputTokens() . \PHP_EOL;
echo 'Response from AI: ' . $response->getContent() . \PHP_EOL;
If you want to keep a conversation going (like you would on ChatGPT or Claude's website or app), you can pass through the history from the previous response to a new request:
<?php
require_once __DIR__ . '/vendor/autoload.php';
$prompt = new ElliotJReed\AI\ClaudeAI\Prompt('API KEY', 'claude-3-haiku-20240307');
$request = (new ElliotJReed\AI\Entity\Request())
->setContext('The user will ask various ethical questions posited through an online chat interface.')
->setRole('You are a philosopher and ethicist who favours utilitarian methodology when answering ethical questions.')
->setInstructions('Answer ethical questions using British English only, referencing the works of Jeremy Bentham, John Stuart Mill, and Peter Singer.')
->setInput('Should we all be vegan?')
->setTemperature(0.8)
->setMaximumTokens(600);
$response = $prompt->send($request);
echo 'Used input tokens: ' . $response->getUsage()->getInputTokens() . \PHP_EOL;
echo 'Used output tokens: ' . $response->getUsage()->getOutputTokens() . \PHP_EOL . \PHP_EOL;
echo 'Response from AI: ' . $response->getContent() . \PHP_EOL;
$secondResponse = $prompt->send($request
->setInput('Elaborate on your response, providing 3 bullet points for arguing in favour of veganism, and 3 bullet points arguing against.')
->setHistory($response->getHistory()));
echo 'Used input tokens: ' . $secondResponse->getUsage()->getInputTokens() . \PHP_EOL;
echo 'Used output tokens: ' . $secondResponse->getUsage()->getOutputTokens() . \PHP_EOL . \PHP_EOL;
echo 'Response from AI: ' . $secondResponse->getContent() . \PHP_EOL;
For a really simple ChatGPT request and response:
<?php
require_once __DIR__ . '/vendor/autoload.php';
$prompt = new ElliotJReed\AI\ChatGPT\Prompt('API KEY', 'gpt-4o-mini');
$request = (new ElliotJReed\AI\Entity\Request())
->setInput('Which programming language will outlive humanity?');
$response = $prompt->send($request);
echo 'Used input tokens: ' . $response->getUsage()->getInputTokens() . \PHP_EOL;
echo 'Used output tokens: ' . $response->getUsage()->getOutputTokens() . \PHP_EOL;
echo 'Response from AI: ' . $response->getContent() . \PHP_EOL;
You can also provide a role, as well as additional context, data, and examples; you can set the temperature (between 0 and 1, essentially how "creative" you want the AI to be) and the maximum number of tokens to use (recommended when the user input comes from an indirect source, for example an online chatbot):
<?php
require_once __DIR__ . '/vendor/autoload.php';
$prompt = new ElliotJReed\AI\ChatGPT\Prompt('API KEY', 'gpt-4o-mini');
$request = (new ElliotJReed\AI\Entity\Request())
->setContext('The user input is coming from a software development advice website which provides information to aspiring software developers.')
->setRole('You are an expert in software development')
->setInstructions('Answer the user\'s query in a friendly, clear, and concise manner')
->setInput('Which programming language will outlive humanity?')
->setTemperature(0.5)
->setMaximumTokens(600)
->setExamples([(new ElliotJReed\AI\Entity\Example())
->setPrompt('Which programming language do you think will still be used in the year 3125?')
->setResponse('I think PHP will be around for at least another 7 million years.')
])
->setData('You could add some JSON, CSV, or YAML data here.');
$response = $prompt->send($request);
echo 'Used input tokens: ' . $response->getUsage()->getInputTokens() . \PHP_EOL;
echo 'Used output tokens: ' . $response->getUsage()->getOutputTokens() . \PHP_EOL;
echo 'Response from AI: ' . $response->getContent() . \PHP_EOL;
If you want to keep a conversation going (like you would on ChatGPT or Claude's website or app), you can pass through the history from the previous response to a new request:
<?php
require_once __DIR__ . '/vendor/autoload.php';
$prompt = new ElliotJReed\AI\ChatGPT\Prompt('API KEY', 'gpt-4o-mini');
$request = (new ElliotJReed\AI\Entity\Request())
->setContext('The user will ask various ethical questions posited through an online chat interface.')
->setRole('You are a philosopher and ethicist who favours utilitarian methodology when answering ethical questions.')
->setInstructions('Answer ethical questions using British English only, referencing the works of Jeremy Bentham, John Stuart Mill, and Peter Singer.')
->setInput('Should we all be vegan?')
->setTemperature(0.8)
->setMaximumTokens(600);
$response = $prompt->send($request);
echo 'Used input tokens: ' . $response->getUsage()->getInputTokens() . \PHP_EOL;
echo 'Used output tokens: ' . $response->getUsage()->getOutputTokens() . \PHP_EOL . \PHP_EOL;
echo 'Response from AI: ' . $response->getContent() . \PHP_EOL;
$secondResponse = $prompt->send($request
->setInput('Elaborate on your response, providing 3 bullet points for arguing in favour of veganism, and 3 bullet points arguing against.')
->setHistory($response->getHistory()));
echo 'Used input tokens: ' . $secondResponse->getUsage()->getInputTokens() . \PHP_EOL;
echo 'Used output tokens: ' . $secondResponse->getUsage()->getOutputTokens() . \PHP_EOL . \PHP_EOL;
echo 'Response from AI: ' . $secondResponse->getContent() . \PHP_EOL;
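Because both Prompt classes accept the same Request entity, one request can be sent to either provider. A sketch, assuming both API keys are available:

```php
<?php

require_once __DIR__ . '/vendor/autoload.php';

// The same Request entity works with either provider.
$request = (new ElliotJReed\AI\Entity\Request())
    ->setInput('Which programming language will outlive humanity?')
    ->setMaximumTokens(300);

$claudeResponse = (new ElliotJReed\AI\ClaudeAI\Prompt('CLAUDE API KEY', 'claude-3-haiku-20240307'))
    ->send($request);

$chatGptResponse = (new ElliotJReed\AI\ChatGPT\Prompt('CHATGPT API KEY', 'gpt-4o-mini'))
    ->send($request);

echo 'Claude: ' . $claudeResponse->getContent() . \PHP_EOL;
echo 'ChatGPT: ' . $chatGptResponse->getContent() . \PHP_EOL;
```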
PHP 8.2 or above and Composer are expected to be installed.
For instructions on how to install Composer visit getcomposer.org.
After cloning this repository, change into the newly created directory and run:
composer install
or if you have installed Composer locally in your current directory:
php composer.phar install
This will install all dependencies needed for the project.
Henceforth, the rest of this README will assume composer is installed globally (i.e. if you are using composer.phar you will need to use composer.phar instead of composer in your terminal / command line).
Unit testing in this project is via PHPUnit.
All unit tests can be run by executing:
composer phpunit
To have PHPUnit stop and report on the first failing test encountered, run:
composer phpunit:debug
A standard for code style can be important when working in teams, as it means that less time is spent by developers processing what they are reading (as everything will be consistent).
Code formatting is automated via PHP-CS-Fixer. PHP-CS-Fixer will not fix line lengths, which do form part of the PSR-2 coding standard, so these will produce warnings when checked by PHP Code Sniffer.
These can be run by executing:
composer phpcs
All the tests can be run by executing:
composer test
Checking for outdated Composer dependencies can be performed by executing:
composer outdated
Checking that the composer.json is valid can be performed by executing:
composer validate --no-check-publish
If GNU Make is installed, you can replace the above composer command prefixes with make.
All the tests can be run by executing:
make test
Specific output formats better suited to CI platforms are included as Composer scripts.
To output unit test coverage in text and Clover XML format (which can be used for services such as Coveralls):
composer phpunit:ci
To output PHP-CS-Fixer (dry run) and PHPCS results in checkstyle format (which GitHub Actions will use to output a readable format):
composer phpcs:ci
Look at the example in .github/workflows/php.yml.
This project is licensed under the MIT License - see the LICENCE.md file for details.