---
title: "Getting Started with xAI’s Grok API: Your First AI Integration"
description: "Discover how to use xAI’s Grok API to create powerful AI-driven applications with ease."
image: "https://imagedelivery.net/K11gkZF3xaVyYzFESMdWIQ/ae694e47-5544-44a3-aa71-4833ab837200/full"
authorUsername: "TommyA"
---

# Getting Started with xAI’s Grok API: Your First AI Integration

Hello! Tommy here, and I’m excited to guide you through xAI’s Grok API! This tutorial is designed to help you feel confident and comfortable as you start building with the Grok API, all within the user-friendly environment of Google Colab.

We’ll explore how to interact with the API in different ways, using tools like the Anthropic SDK, OpenAI Python package, LangChain-OpenAI package, and Python’s `requests` library. Whether you’re new to working with APIs or just beginning your AI journey, this guide will make it easy to understand and apply.

By the end of this tutorial, you’ll have a solid foundation for working with the Grok API and be ready to integrate it into your own AI projects. Plus, I’ll include a link to the Colab notebook I used for this tutorial at the end so you can try everything hands-on. Let’s dive in and get started! 🎉

## **Setting Up Your Environment**

To start interacting with the xAI Grok API, you need to prepare your Google Colab environment. This involves installing the necessary libraries, setting your API key securely, and ensuring compatibility.

### **Step 1: Install Required Libraries**

Run this command in your Colab notebook to install all the required libraries:

```bash
%%capture
!pip install anthropic openai langchain-openai httpx==0.27.2 --force-reinstall --quiet
```

The `--force-reinstall` flag ensures that the correct versions of the packages are installed, avoiding dependency conflicts.

### **Step 2: Restart the Colab Kernel**

After installing the libraries, restart the Colab kernel to avoid runtime issues:

```python
import os
os.kill(os.getpid(), 9)
```

Once the kernel restarts, re-run the installation cell if needed.

### **Step 3: Set Your API Key**

In Google Colab, you can use the `userdata` module to securely retrieve your API key. Here's how you can set it up:

```python
from google.colab import userdata

# Retrieve the API key from Colab's secure storage
api_key = userdata.get('XAI_API_KEY')
```

If you prefer, you can replace `userdata.get('XAI_API_KEY')` with the key itself, though avoid hard-coding keys in notebooks you plan to share:

```python
api_key = "your_xai_api_key"
```

This API key will be used in all the following examples to authenticate your requests to the **Grok API**.
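
Before moving on, it can help to confirm the key is actually available in the runtime. The optional check below only verifies that a non-empty value was loaded; it never prints the key itself:

```python
# Optional sanity check: make sure a non-empty key was loaded.
# If this fails, add XAI_API_KEY under Colab's "Secrets" panel (key icon in the sidebar)
# or assign the key directly as shown above.
assert api_key, "No API key found - set XAI_API_KEY in Colab secrets or assign it directly"
print(f"API key loaded ({len(api_key)} characters)")
```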

## **Interacting with the Grok API**

Here’s how to interact with the Grok API using four different methods: **Anthropic SDK**, **OpenAI Python package**, **LangChain-OpenAI package**, and **Python’s `requests` library**.

1. ### **Using the Anthropic SDK**

The Anthropic SDK is a simple way to send requests to the Grok API.

```python
from anthropic import Anthropic

# Set up the Anthropic client
client = Anthropic(
    api_key=api_key,
    base_url="https://api.x.ai",
)

# Send a message
message = client.messages.create(
    model="grok-beta",
    max_tokens=128,
    system="You are a top-notch English tutor, guide me appropriately",
    messages=[
        {"role": "user", "content": "How do I differentiate between an adverb and an adjective?"},
    ],
)

# Print the response
print(message.content[0].text)
```

**Explanation**:

- The `Anthropic` client initializes with your API key and the xAI base URL.
- The `messages.create` method sends a system role (to define the assistant’s behavior) and a user query.
- The model processes the query and generates a response.
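
If you’d rather see the reply appear piece by piece, the Anthropic SDK also exposes a streaming helper. The sketch below reuses the same `client`; it assumes the Grok endpoint supports streaming through this compatibility layer, so treat it as optional:

```python
# Stream the reply instead of waiting for the full message
with client.messages.stream(
    model="grok-beta",
    max_tokens=128,
    system="You are a top-notch English tutor, guide me appropriately",
    messages=[
        {"role": "user", "content": "Give me one quick tip for spotting adverbs."},
    ],
) as stream:
    for text in stream.text_stream:
        print(text, end="", flush=True)
```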

2. ### **Using the OpenAI Python Package**

The OpenAI Python package offers another way to interact with the Grok API, since xAI exposes an OpenAI-compatible endpoint. If you run into import errors here, make sure you restarted the kernel as described in Step 2.

```python
from openai import OpenAI

# Configure the OpenAI client
client = OpenAI(
    api_key=api_key,
    base_url="https://api.x.ai/v1",
)

# Send a chat completion request
completion = client.chat.completions.create(
    model="grok-beta",
    messages=[
        {"role": "system", "content": "You are a top-notch English tutor, guide me appropriately"},
        {"role": "user", "content": "How do I differentiate between an adverb and an adjective?"},
    ],
)

# Print the response
print(completion.choices[0].message.content)
```
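
The OpenAI client supports incremental streaming as well. Here’s a minimal sketch that reuses the `client` configured above and prints chunks as they arrive (again, assuming the endpoint accepts `stream=True`):

```python
# Request a streamed completion and print each chunk as it arrives
stream = client.chat.completions.create(
    model="grok-beta",
    messages=[
        {"role": "user", "content": "Summarize the difference between adverbs and adjectives in one sentence."},
    ],
    stream=True,
)
for chunk in stream:
    delta = chunk.choices[0].delta.content
    if delta:  # the final chunk may carry no content
        print(delta, end="", flush=True)
```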

3. ### **Using LangChain-OpenAI**

LangChain enables more advanced workflows by integrating OpenAI models with modular features.

```python
from langchain_openai import OpenAI
import os

# Point LangChain's OpenAI integration at the xAI endpoint
os.environ["OPENAI_API_KEY"] = api_key
os.environ["OPENAI_API_BASE"] = "https://api.x.ai/v1"

# Initialize the OpenAI model
llm = OpenAI(
    model="grok-beta",
    max_tokens=50,
)

# Generate text
prompt = "Write a story about a brave knight."
output = llm.invoke(prompt)
print(f"AI-generated response:\n{output}")
```

**Explanation**:

- The LangChain package allows for prompt chaining and advanced task management (see the sketch after this list).
- The model generates output based on a user-provided prompt.
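
To give a feel for what prompt chaining looks like in practice, here’s a minimal sketch that pipes a `PromptTemplate` into the `llm` initialized above. It assumes `langchain-core` (installed as a dependency of `langchain-openai`) and a reasonably recent LangChain release that supports the `|` composition syntax:

```python
from langchain_core.prompts import PromptTemplate

# Build a reusable prompt template and pipe it into the model
template = PromptTemplate.from_template(
    "Explain {topic} to a complete beginner in two sentences."
)
chain = template | llm

# Invoke the chain with a concrete topic
print(chain.invoke({"topic": "the difference between adverbs and adjectives"}))
```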

4. ### **Using Python’s `requests` Library**

The `requests` library is a Python-native way to interact with the Grok API.

```python
import requests

# Define the endpoint and headers
url = "https://api.x.ai/v1/chat/completions"
headers = {
    "Authorization": f"Bearer {api_key}",
    "Content-Type": "application/json"
}
data = {
    "model": "grok-beta",
    "messages": [
        {
            "role": "system",
            "content": "You are a top-notch English tutor, guide me appropriately"
        },
        {
            "role": "user",
            "content": "Explain intonations to me"
        }
    ],
    "stream": False,
    "temperature": 0
}

# Send the POST request
response = requests.post(url, headers=headers, json=data)

# Print the response
result = response.json()
print(result['choices'][0]['message']['content'])
```
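
The snippet above assumes the request succeeds. In practice it’s worth checking the status code before parsing, since an invalid key or model name comes back as an HTTP error rather than a completion. A small, optional guard might look like this:

```python
# Check the HTTP status before parsing the body
if response.status_code == 200:
    result = response.json()
    print(result['choices'][0]['message']['content'])
else:
    # The error body usually explains what went wrong (bad key, unknown model, etc.)
    print(f"Request failed with status {response.status_code}: {response.text}")
```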

## **Conclusion**

Congratulations on taking your first steps with xAI’s Grok API! In this tutorial, we set up your Colab environment, configured your API key, and explored four different ways to interact with the API: the Anthropic SDK, the OpenAI Python package, LangChain-OpenAI, and Python’s `requests` library.

If you successfully generated responses from the API, give yourself a big pat on the back—you’re well on your way to building with xAI! Remember, the best way to learn is by experimenting and trying new ideas. The journey has just begun, and the possibilities are endless. 🎉

Find the link to the **Google Colab notebook** used for this tutorial [here](https://colab.research.google.com/drive/16bzV5EtJ2DSxJh73nLgkDNtckyjG_-4r?usp=sharing).

## **What’s Next?**

Now that you’ve learned how to interact with xAI’s Grok API, here are a few ideas for what you can try next:

- Experiment with different system roles and prompts to see how they change the API’s responses.
- Adjust parameters like `temperature` and `max_tokens` to customize the output for your specific use cases (see the sketch after this list).
- Explore building simple applications like a chatbot, a content generator, or even an AI-powered quiz tool using the methods covered in this tutorial.
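
As a quick starting point for parameter tuning, here’s a minimal sketch that reuses the OpenAI client configured earlier; the values are only illustrative:

```python
# Higher temperature -> more varied output; max_tokens caps the reply length
completion = client.chat.completions.create(
    model="grok-beta",
    messages=[
        {"role": "user", "content": "Suggest three playful blog titles about English grammar."},
    ],
    temperature=0.9,
    max_tokens=80,
)
print(completion.choices[0].message.content)
```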
