
# xAI

xAI is an AI company that develops the Grok family of large language models with advanced reasoning capabilities. The strands-xai package (GitHub) provides a community-maintained integration for the Strands Agents SDK, enabling seamless use of xAI's Grok models with powerful server-side tools, including real-time X platform access, web search, and code execution.

xAI integration is available as a separate community package:

```shell
pip install strands-agents strands-xai
```

After installing strands-xai, you can import and initialize the xAI provider:

```python
from strands import Agent
from strands_xai import xAIModel

model = xAIModel(
    client_args={"api_key": "xai-key"},  # or set the XAI_API_KEY env var
    model_id="grok-4-1-fast-non-reasoning-latest",
)

agent = Agent(model=model)
response = agent("What's trending on X right now?")
print(response.message)
```

You can use regular Strands tools just like with any other model provider:

```python
from strands import Agent, tool
from strands_xai import xAIModel

@tool
def calculate(expression: str) -> str:
    """Evaluate a mathematical expression."""
    try:
        # Note: eval() executes arbitrary code; sandbox or restrict it in production.
        result = eval(expression)
        return f"Result: {result}"
    except Exception as e:
        return f"Error: {e}"

@tool
def get_weather(city: str) -> str:
    """Get the current weather for a city."""
    return f"Weather in {city}: Sunny, 22°C"

model = xAIModel(
    client_args={"api_key": "xai-key"},
    model_id="grok-4-1-fast-non-reasoning-latest",
)

agent = Agent(model=model, tools=[calculate, get_weather])
response = agent("What's 15 * 7 and what's the weather in Paris?")
```
Instead of passing the key in client_args, you can authenticate via an environment variable:

```shell
export XAI_API_KEY="your-api-key"
```
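When the key lives in the environment, it can be read explicitly and passed through client_args. This is a minimal sketch of that pattern; the placeholder fallback value is illustrative, not part of the strands-xai API:

```python
import os

# Read the API key from the environment; fall back to a placeholder if unset.
# The fallback value here is purely illustrative.
api_key = os.environ.get("XAI_API_KEY", "xai-key")
client_args = {"api_key": api_key}
```

This keeps secrets out of source code while remaining compatible with the `client_args={"api_key": ...}` form used throughout this page.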

The supported configurations are:

| Parameter | Description | Example | Default |
| --- | --- | --- | --- |
| `model_id` | Grok model identifier | `grok-4-1-fast-reasoning-latest` | `grok-4-1-fast-non-reasoning-latest` |
| `client_args` | xAI client arguments | `{"api_key": "xai-key"}` | `{}` |
| `params` | Model parameters dict | `{"temperature": 0.7}` | `{}` |
| `xai_tools` | Server-side tools list | `[web_search(), x_search()]` | `[]` |
| `reasoning_effort` | Reasoning level (grok-3-mini only) | `"high"` | `None` |
| `use_encrypted_content` | Enable encrypted reasoning | `True` | `False` |
| `include` | Optional features | `["inline_citations"]` | `[]` |

Model Parameters (in params dict):

  • temperature - Sampling temperature (0.0-2.0), default: varies by model
  • max_tokens - Maximum tokens in response, default: 2048
  • top_p - Nucleus sampling parameter (0.0-1.0), default: varies by model
  • frequency_penalty - Frequency penalty (-2.0 to 2.0), default: 0
  • presence_penalty - Presence penalty (-2.0 to 2.0), default: 0
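The ranges above can be checked before a params dict reaches the API. The validator below is an illustrative sketch based on the documented ranges, not a function provided by strands-xai:

```python
def validate_params(params: dict) -> dict:
    """Check sampling parameters against the documented ranges (sketch only)."""
    ranges = {
        "temperature": (0.0, 2.0),
        "top_p": (0.0, 1.0),
        "frequency_penalty": (-2.0, 2.0),
        "presence_penalty": (-2.0, 2.0),
    }
    for name, (low, high) in ranges.items():
        if name in params and not low <= params[name] <= high:
            raise ValueError(f"{name}={params[name]} is outside [{low}, {high}]")
    if "max_tokens" in params and params["max_tokens"] <= 0:
        raise ValueError("max_tokens must be a positive integer")
    return params
```

A dict that passes can then be handed to `xAIModel(params=...)` unchanged.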

Available Models:

  • grok-4-1-fast-reasoning - Fast reasoning with encrypted thinking
  • grok-4-1-fast-non-reasoning - Fast model without reasoning
  • grok-3-mini - Compact model with visible reasoning
  • grok-3-mini-non-reasoning - Compact model without reasoning
  • grok-4-1-reasoning - Full reasoning capabilities
  • grok-4-1-non-reasoning - Full model without reasoning
  • grok-code-fast-1 - Code-optimized model
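For the grok-4-1 family, the reasoning/speed split above can be encoded in a small lookup. This helper is hypothetical, for illustration only; the model ids come from the list above, and the earlier examples on this page append a `-latest` suffix to pin the most recent revision:

```python
# Hypothetical helper mapping capability choices to the grok-4-1 model ids
# listed above; not part of the strands-xai package.
GROK_4_1_MODELS = {
    ("reasoning", "fast"): "grok-4-1-fast-reasoning",
    ("non-reasoning", "fast"): "grok-4-1-fast-non-reasoning",
    ("reasoning", "full"): "grok-4-1-reasoning",
    ("non-reasoning", "full"): "grok-4-1-non-reasoning",
}

def pick_model(reasoning: bool, fast: bool) -> str:
    """Return a grok-4-1 model id for the requested capability mix."""
    key = ("reasoning" if reasoning else "non-reasoning",
           "fast" if fast else "full")
    return GROK_4_1_MODELS[key]
```

The grok-3-mini variants and grok-code-fast-1 fall outside this family and would be selected directly.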

xAI models come with built-in server-side tools executed by xAI's infrastructure, providing unique capabilities:

```python
from strands import Agent
from strands_xai import xAIModel
from xai_sdk.tools import web_search, x_search, code_execution

# Server-side tools run on xAI's infrastructure, not in your process
model = xAIModel(
    client_args={"api_key": "xai-key"},
    model_id="grok-4-1-fast-reasoning-latest",
    xai_tools=[web_search(), x_search(), code_execution()],
)

agent = Agent(model=model)
# The model can autonomously invoke web_search, x_search, and code_execution
response = agent("Search X for recent AI developments and analyze the sentiment")
```

Built-in Server-Side Tools:

  • X Search: Real-time access to X platform posts, trends, and conversations
  • Web Search: Live web search capabilities across diverse data sources
  • Code Execution: Python code execution for data analysis and computation

Grok has exclusive real-time access to X platform data:

```python
# Access real-time X data and trends
response = agent("What are people saying about the latest tech announcements on X?")

# Analyze trending topics
response = agent("Find trending hashtags related to AI and summarize the discussions")
```

Combine xAI's server-side tools with your own Strands tools for maximum flexibility:

```python
from strands import Agent, tool
from strands_xai import xAIModel
from xai_sdk.tools import x_search

@tool
def calculate(expression: str) -> str:
    """Evaluate a mathematical expression."""
    try:
        # Note: eval() executes arbitrary code; sandbox or restrict it in production.
        result = eval(expression)
        return f"Result: {result}"
    except Exception as e:
        return f"Error: {e}"

@tool
def get_weather(city: str) -> str:
    """Get the current weather for a city."""
    return f"Weather in {city}: Sunny, 22°C"

model = xAIModel(
    client_args={"api_key": "xai-key"},
    model_id="grok-4-1-fast-reasoning-latest",
    xai_tools=[x_search()],  # Server-side X search
)

# Combine server-side and client-side tools
agent = Agent(model=model, tools=[calculate, get_weather])
response = agent("Search X for AI news, calculate 15*7, and tell me the weather in Tokyo")
```

This powerful combination allows the agent to:

  • Search X platform in real-time (server-side)
  • Perform calculations (client-side)
  • Get weather information (client-side)
  • All in a single conversation!

Access models with visible reasoning capabilities:

```python
# Use a reasoning model to see the thinking process
model = xAIModel(
    client_args={"api_key": "xai-key"},
    model_id="grok-3-mini",  # Emits visible reasoning steps
    reasoning_effort="high",
    params={"temperature": 0.3},
)

agent = Agent(model=model)
response = agent("Analyze the current AI market trends based on X discussions")
```