# lex-claude

Anthropic Claude API integration for LegionIO. Provides runners for creating messages, listing models, counting tokens, and managing message batches.

Wraps the Anthropic Claude REST API as named runners consumable by any LegionIO task chain. Use this extension when you need direct access to the full Anthropic API surface (including async Batches) within the LEX runner/actor lifecycle. For simple chat/embed workflows, consider legion-llm instead.
## Installation

```shell
gem install lex-claude
```

Or add to your Gemfile:

```ruby
gem 'lex-claude'
```

## Runners

### Messages

- `create` - Create a message (chat completion) with Claude
- `count_tokens` - Count input tokens for a message request

### Models

- `list` - List available Claude models
- `retrieve` - Get details for a specific model

### Batches

- `create_batch` - Create an asynchronous message batch
- `list_batches` - List message batches
- `retrieve_batch` - Get details for a specific batch
- `cancel_batch` - Cancel an in-progress batch
- `batch_results` - Retrieve results for a completed batch
## Configuration

Set your API key in your LegionIO settings:

```json
{
  "claude": {
    "api_key": "sk-ant-..."
  }
}
```

## Usage

```ruby
require 'legion/extensions/claude/client'

client = Legion::Extensions::Claude::Client.new(api_key: ENV['ANTHROPIC_API_KEY'])
```
```ruby
# Create a message
result = client.create(
  model: 'claude-opus-4-6',
  messages: [{ role: 'user', content: 'Hello, Claude!' }],
  max_tokens: 1024
)
puts result[:result]['content'].first['text']
```
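To carry a conversation across turns, append the assistant's reply to the `messages` array before the next `create` call (the standard Anthropic Messages API shape). A minimal sketch; the stub class below stands in for `Legion::Extensions::Claude::Client` so it runs offline, and simply echoes input in the documented result shape. Swap in the real client for live calls.

```ruby
# Stub that mimics the documented create() result shape:
# { result: { 'content' => [{ 'text' => ... }] } }. For illustration only.
class StubClient
  def create(model:, messages:, max_tokens:)
    { result: { 'content' => [{ 'type' => 'text',
                                'text' => "You said: #{messages.last[:content]}" }] } }
  end
end

client = StubClient.new
messages = [{ role: 'user', content: 'Hello, Claude!' }]

reply = client.create(model: 'claude-opus-4-6', messages: messages, max_tokens: 1024)
text  = reply[:result]['content'].first['text']

# Append the assistant turn, then ask a follow-up in the same thread.
messages << { role: 'assistant', content: text }
messages << { role: 'user', content: 'And one more thing...' }

followup = client.create(model: 'claude-opus-4-6', messages: messages, max_tokens: 1024)
puts followup[:result]['content'].first['text']
```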
```ruby
# List models
models = client.list
puts models[:result]['data'].map { |m| m['id'] }
```
```ruby
# Count tokens
tokens = client.count_tokens(
  model: 'claude-opus-4-6',
  messages: [{ role: 'user', content: 'How many tokens is this?' }]
)
puts tokens[:result]['input_tokens']
```
```ruby
# Create an async batch
batch = client.create_batch(
  requests: [
    { custom_id: 'req-1',
      params: {
        model: 'claude-opus-4-6',
        messages: [{ role: 'user', content: 'Hello' }],
        max_tokens: 100
      } }
  ]
)
puts batch[:result]['id']
```

## Dependencies

- `faraday` (>= 2.0) - HTTP client
- `multi_json` - JSON parser abstraction

## Requirements
- Ruby >= 3.4
- LegionIO framework (optional for standalone client usage)
- Anthropic API key
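A batch created with `create_batch` finishes asynchronously, so a typical workflow polls `retrieve_batch` until processing ends and then fetches `batch_results`. A hedged sketch: the runner names and the `[:result]` wrapper follow this extension's documented API, while the `processing_status` field and its `ended` value come from Anthropic's Batches API; a stub stands in for the real client so the sketch runs offline.

```ruby
# Stub mimicking the documented batch runners: retrieve_batch reports
# 'in_progress' twice, then 'ended'; batch_results returns one result
# line per request. For illustration only.
class StubClaudeClient
  def initialize
    @polls = 0
  end

  def retrieve_batch(id)
    @polls += 1
    status = @polls < 3 ? 'in_progress' : 'ended'
    { result: { 'id' => id, 'processing_status' => status } }
  end

  def batch_results(id)
    { result: [{ 'custom_id' => 'req-1', 'result' => { 'type' => 'succeeded' } }] }
  end
end

# Poll until the batch reports 'ended', then return its result lines.
def wait_for_batch(client, batch_id, interval: 0)
  loop do
    status = client.retrieve_batch(batch_id)[:result]['processing_status']
    break if status == 'ended'
    sleep interval
  end
  client.batch_results(batch_id)[:result]
end

results = wait_for_batch(StubClaudeClient.new, 'msgbatch_123')
results.each { |r| puts "#{r['custom_id']}: #{r['result']['type']}" }
```

In production, pass a sensible `interval` (a few seconds or more) to avoid hammering the API while the batch runs.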
## See Also

- `lex-bedrock` - Access Claude models via AWS Bedrock instead of Anthropic directly
- `legion-llm` - High-level LLM interface including Anthropic via ruby_llm
- `extensions-ai/CLAUDE.md` - Architecture patterns shared across all AI extensions
## License

MIT