# xAI
xAI is an AI company that develops the Grok family of large language models with advanced reasoning capabilities. The strands-xai package (GitHub) provides a community-maintained integration for the Strands Agents SDK, enabling seamless use of xAI’s Grok models with powerful server-side tools including real-time X platform access, web search, and code execution.
## Installation
xAI integration is available as a separate community package:
```bash
pip install strands-agents strands-xai
```

## Usage
After installing strands-xai, you can import and initialize the xAI provider.
```python
from strands import Agent
from strands_xai import xAIModel

model = xAIModel(
    client_args={"api_key": "xai-key"},  # or set XAI_API_KEY env var
    model_id="grok-4-1-fast-non-reasoning-latest",
)

agent = Agent(model=model)
response = agent("What's trending on X right now?")
print(response.message)
```

## With Strands Tools
You can use regular Strands tools just like with any other model provider:
```python
from strands import Agent, tool
from strands_xai import xAIModel

@tool
def calculate(expression: str) -> str:
    """Evaluate a mathematical expression."""
    try:
        result = eval(expression)
        return f"Result: {result}"
    except Exception as e:
        return f"Error: {e}"

@tool
def get_weather(city: str) -> str:
    """Get the current weather for a city."""
    return f"Weather in {city}: Sunny, 22°C"

model = xAIModel(
    client_args={"api_key": "xai-key"},
    model_id="grok-4-1-fast-non-reasoning-latest",
)

agent = Agent(model=model, tools=[calculate, get_weather])
response = agent("What's 15 * 7 and what's the weather in Paris?")
```

## Configuration
### Environment Variables
```bash
export XAI_API_KEY="your-api-key"
```
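If the environment variable is set, you can rely on it instead of passing an API key explicitly. This is a minimal sketch that assumes the underlying xAI client reads `XAI_API_KEY` when no `api_key` is supplied in `client_args` (as the comment in the Usage example above suggests):

```python
from strands import Agent
from strands_xai import xAIModel

# Assumes XAI_API_KEY is set in the environment; no api_key passed here
model = xAIModel(model_id="grok-4-1-fast-non-reasoning-latest")

agent = Agent(model=model)
response = agent("Hello!")
print(response.message)
```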
### Model Configuration

The supported configurations are:
| Parameter | Description | Example | Default |
|---|---|---|---|
| `model_id` | Grok model identifier | `grok-4-1-fast-reasoning-latest` | `grok-4-1-fast-non-reasoning-latest` |
| `client_args` | xAI client arguments | `{"api_key": "xai-key"}` | `{}` |
| `params` | Model parameters dict | `{"temperature": 0.7}` | `{}` |
| `xai_tools` | Server-side tools list | `[web_search(), x_search()]` | `[]` |
| `reasoning_effort` | Reasoning level (grok-3-mini only) | `"high"` | `None` |
| `use_encrypted_content` | Enable encrypted reasoning | `True` | `False` |
| `include` | Optional features | `["inline_citations"]` | `[]` |
Model Parameters (set in the `params` dict):

- `temperature` - Sampling temperature (0.0-2.0), default varies by model
- `max_tokens` - Maximum tokens in the response, default: 2048
- `top_p` - Nucleus sampling parameter (0.0-1.0), default varies by model
- `frequency_penalty` - Frequency penalty (-2.0 to 2.0), default: 0
- `presence_penalty` - Presence penalty (-2.0 to 2.0), default: 0
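As a rough sketch of how these options fit together (parameter names are taken from the table above; defaults and accepted values may differ by model):

```python
from strands_xai import xAIModel

model = xAIModel(
    client_args={"api_key": "xai-key"},
    model_id="grok-4-1-fast-reasoning-latest",
    params={
        "temperature": 0.7,    # sampling temperature (0.0-2.0)
        "max_tokens": 2048,    # cap on response length
        "top_p": 0.9,          # nucleus sampling (0.0-1.0)
    },
    use_encrypted_content=True,    # enable encrypted reasoning
    include=["inline_citations"],  # optional features
)
```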
Available Models:
- `grok-4-1-fast-reasoning` - Fast reasoning with encrypted thinking
- `grok-4-1-fast-non-reasoning` - Fast model without reasoning
- `grok-3-mini` - Compact model with visible reasoning
- `grok-3-mini-non-reasoning` - Compact model without reasoning
- `grok-4-1-reasoning` - Full reasoning capabilities
- `grok-4-1-non-reasoning` - Full model without reasoning
- `grok-code-fast-1` - Code-optimized model
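Any of these identifiers can be passed as `model_id`. For example, a minimal sketch selecting the code-optimized model (using the name as listed above):

```python
from strands import Agent
from strands_xai import xAIModel

# Code-optimized Grok model for programming tasks
model = xAIModel(
    client_args={"api_key": "xai-key"},
    model_id="grok-code-fast-1",
)

agent = Agent(model=model)
response = agent("Write a Python function that checks whether a string is a palindrome")
```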
## Advanced Features
### Server-Side Tools
xAI models come with built-in server-side tools executed by xAI’s infrastructure, providing unique capabilities:
```python
from strands import Agent
from strands_xai import xAIModel
from xai_sdk.tools import web_search, x_search, code_execution

# Server-side tools are automatically available once configured
model = xAIModel(
    client_args={"api_key": "xai-key"},
    model_id="grok-4-1-fast-reasoning-latest",
    xai_tools=[web_search(), x_search(), code_execution()],
)

agent = Agent(model=model)
# The model can autonomously use the web_search, x_search, and code_execution tools
response = agent("Search X for recent AI developments and analyze the sentiment")
```

Built-in Server-Side Tools:
- X Search: Real-time access to X platform posts, trends, and conversations
- Web Search: Live web search capabilities across diverse data sources
- Code Execution: Python code execution for data analysis and computation
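You can also enable only the server-side tools you need. For example, a sketch that configures just the Code Execution tool for data analysis, following the same pattern as above:

```python
from strands import Agent
from strands_xai import xAIModel
from xai_sdk.tools import code_execution

# Only the server-side Python execution tool is enabled
model = xAIModel(
    client_args={"api_key": "xai-key"},
    model_id="grok-4-1-fast-reasoning-latest",
    xai_tools=[code_execution()],
)

agent = Agent(model=model)
response = agent("Compute the compound growth of 1000 at 5% per year over 10 years")
```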
### Real-Time X Platform Access
Grok has exclusive real-time access to X platform data:
```python
# Access real-time X data and trends
response = agent("What are people saying about the latest tech announcements on X?")

# Analyze trending topics
response = agent("Find trending hashtags related to AI and summarize the discussions")
```

### Hybrid Tool Usage
Combine xAI’s server-side tools with your own Strands tools for maximum flexibility:
```python
from strands import Agent, tool
from strands_xai import xAIModel
from xai_sdk.tools import x_search

@tool
def calculate(expression: str) -> str:
    """Evaluate a mathematical expression."""
    try:
        result = eval(expression)
        return f"Result: {result}"
    except Exception as e:
        return f"Error: {e}"

@tool
def get_weather(city: str) -> str:
    """Get the current weather for a city."""
    return f"Weather in {city}: Sunny, 22°C"

model = xAIModel(
    client_args={"api_key": "xai-key"},
    model_id="grok-4-1-fast-reasoning-latest",
    xai_tools=[x_search()],  # Server-side X search
)

# Combine server-side and client-side tools
agent = Agent(model=model, tools=[calculate, get_weather])
response = agent("Search X for AI news, calculate 15*7, and tell me the weather in Tokyo")
```

This powerful combination allows the agent to:
- Search the X platform in real time (server-side)
- Perform calculations (client-side)
- Get weather information (client-side)

All of this happens in a single conversation.
### Reasoning Models
Access models with visible reasoning capabilities:
```python
# Use a reasoning model to see the thinking process
model = xAIModel(
    client_args={"api_key": "xai-key"},
    model_id="grok-3-mini",  # Shows reasoning steps
    reasoning_effort="high",
    params={"temperature": 0.3},
)

agent = Agent(model=model)
response = agent("Analyze the current AI market trends based on X discussions")
```
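For the grok-4-1 reasoning models, which use encrypted rather than visible thinking, you can enable `use_encrypted_content` from the configuration table instead; note that `reasoning_effort` applies only to grok-3-mini. A hedged sketch:

```python
from strands import Agent
from strands_xai import xAIModel

# grok-4-1 fast reasoning with encrypted reasoning content enabled
model = xAIModel(
    client_args={"api_key": "xai-key"},
    model_id="grok-4-1-fast-reasoning-latest",
    use_encrypted_content=True,
    params={"temperature": 0.3},
)

agent = Agent(model=model)
response = agent("Summarize the key drivers behind recent AI infrastructure investment")
```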