
Strands Agents SDK Multi-Agent — Structure Workflows with Graph

Introduction

In the previous article, we learned how agents autonomously hand off tasks in a Swarm. But since Swarm agents decide autonomously, execution order is unpredictable. Sometimes you need explicit control: "research → analysis → report."

Just define nodes and edges with GraphBuilder and a deterministic workflow based on dependencies runs automatically.

In this article, we'll try:

  1. Sequential pipeline — research → analysis → report in series
  2. Parallel processing — Branch and merge for concurrent agent execution
  3. Conditional branching — Control edge traversal with condition functions
  4. Feedback loop — Build a review-revise iterative workflow

See the official documentation at Graph.

Setup

Use the same environment from Part 1. All examples use the same model configuration and can be run as independent .py files. Write the common setup at the top, then add each example's code below it.

Python (common setup)
from strands import Agent
from strands.models import BedrockModel
from strands.multiagent import GraphBuilder
from strands.multiagent.graph import GraphState
 
bedrock_model = BedrockModel(
    model_id="us.anthropic.claude-sonnet-4-20250514-v1:0",
    region_name="us-east-1",
)

Sequential Pipeline — Series Configuration

Let's build a 3-step pipeline: "research a topic, analyze the findings, and compile a report."

First, the key differences from Swarm:

                   | Swarm                      | Graph
Execution order    | Agents decide autonomously | Explicitly defined by edges
Parallel execution | None (sequential handoffs) | Nodes without dependencies run in parallel
Use case           | Exploratory tasks          | Structured workflows

Add nodes with GraphBuilder and define dependencies with edges.

Python (Graph construction and execution)
builder = GraphBuilder()
builder.add_node(researcher, "research")
builder.add_node(analyst, "analysis")
builder.add_node(report_writer, "report")
builder.add_edge("research", "analysis")
builder.add_edge("analysis", "report")
builder.set_entry_point("research")
 
graph = builder.build()
result = graph("What is Amazon Bedrock?")

add_node registers agents as nodes, add_edge defines dependencies. set_entry_point specifies the starting node.

01_sequential.py full code (copy-paste)
01_sequential.py
from strands import Agent
from strands.models import BedrockModel
from strands.multiagent import GraphBuilder
 
bedrock_model = BedrockModel(
    model_id="us.anthropic.claude-sonnet-4-20250514-v1:0",
    region_name="us-east-1",
)
 
researcher = Agent(
    name="researcher", model=bedrock_model,
    system_prompt="You are a research specialist. Provide key facts about the given topic in bullet points.",
    callback_handler=None,
)
analyst = Agent(
    name="analyst", model=bedrock_model,
    system_prompt="You are an analysis specialist. Analyze the research provided and identify the top 3 insights.",
    callback_handler=None,
)
report_writer = Agent(
    name="report_writer", model=bedrock_model,
    system_prompt="You are a report writing specialist. Write a concise 2-3 sentence summary based on the analysis.",
    callback_handler=None,
)
 
builder = GraphBuilder()
builder.add_node(researcher, "research")
builder.add_node(analyst, "analysis")
builder.add_node(report_writer, "report")
builder.add_edge("research", "analysis")
builder.add_edge("analysis", "report")
builder.set_entry_point("research")
 
graph = builder.build()
result = graph("What is Amazon Bedrock?")
 
print(f"Status: {result.status}")
print(f"Execution order: {[node.node_id for node in result.execution_order]}")
 
for node_id, node_result in result.results.items():
    print(f"\n--- {node_id} ---")
    print(f"  Status: {node_result.status}")
    print(f"  Time: {node_result.execution_time}ms")
Terminal
python -u 01_sequential.py

Result

Output
Status: Status.COMPLETED
Execution order: ['research', 'analysis', 'report']
 
--- research ---
  Status: Status.COMPLETED
  Time: 9915ms
 
--- analysis ---
  Status: Status.COMPLETED
  Time: 4876ms
 
--- report ---
  Status: Status.COMPLETED
  Time: 3787ms

execution_order is ['research', 'analysis', 'report'] — executed in the exact order defined by edges. Per-node execution times are also available.
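As a quick illustration, the per-node times printed above (in milliseconds, from this particular run) can be totaled to estimate the pipeline's wall-clock time, since the three nodes run strictly in series:

```python
# Per-node execution times from the run above (ms); actual values vary per run.
times = {"research": 9915, "analysis": 4876, "report": 3787}

# In a strictly sequential pipeline, total wall-clock time is simply the sum.
total_ms = sum(times.values())
print(f"Total: {total_ms} ms")  # 18578 ms
```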

When calling build(), you'll see a warning: "Graph without execution limits may run indefinitely if cycles exist." This is fine for DAGs (acyclic graphs), but for the feedback loop section below, set_max_node_executions is required.

Parallel Processing — Branch and Merge

Let's build a workflow where "an analysis team and a fact-checking team verify research results simultaneously, then merge both into a report."

Run analysis and fact_check in parallel after research, then merge results into report.

Python (Graph construction and execution)
builder = GraphBuilder()
builder.add_node(researcher, "research")
builder.add_node(analyst, "analysis")
builder.add_node(fact_checker, "fact_check")
builder.add_node(report_writer, "report")
 
builder.add_edge("research", "analysis")
builder.add_edge("research", "fact_check")
builder.add_edge("analysis", "report")
builder.add_edge("fact_check", "report")
 
builder.set_entry_point("research")
graph = builder.build()
result = graph("What is Amazon Bedrock?")

Just draw edges from research to both analysis and fact_check for parallel execution.

Python's Graph uses OR semantics: a node becomes eligible to run once any single incoming edge's source completes. In this run, fact_check finished before analysis, and report ultimately ran with access to the results of both.

02_parallel.py full code (copy-paste)
02_parallel.py
from strands import Agent
from strands.models import BedrockModel
from strands.multiagent import GraphBuilder
 
bedrock_model = BedrockModel(
    model_id="us.anthropic.claude-sonnet-4-20250514-v1:0",
    region_name="us-east-1",
)
 
researcher = Agent(
    name="researcher", model=bedrock_model,
    system_prompt="You are a research specialist. Provide key facts about the given topic.",
    callback_handler=None,
)
analyst = Agent(
    name="analyst", model=bedrock_model,
    system_prompt="You are an analysis specialist. Analyze the research and identify key insights.",
    callback_handler=None,
)
fact_checker = Agent(
    name="fact_checker", model=bedrock_model,
    system_prompt="You are a fact-checking specialist. Verify the claims in the research and note any issues.",
    callback_handler=None,
)
report_writer = Agent(
    name="report_writer", model=bedrock_model,
    system_prompt="You are a report writing specialist. Write a concise summary combining the analysis and fact-check results.",
    callback_handler=None,
)
 
builder = GraphBuilder()
builder.add_node(researcher, "research")
builder.add_node(analyst, "analysis")
builder.add_node(fact_checker, "fact_check")
builder.add_node(report_writer, "report")
builder.add_edge("research", "analysis")
builder.add_edge("research", "fact_check")
builder.add_edge("analysis", "report")
builder.add_edge("fact_check", "report")
builder.set_entry_point("research")
 
graph = builder.build()
result = graph("What is Amazon Bedrock?")
 
print(f"Status: {result.status}")
print(f"Execution order: {[node.node_id for node in result.execution_order]}")
print(f"Total nodes: {result.total_nodes}")
print(f"Completed nodes: {result.completed_nodes}")
 
for node_id, node_result in result.results.items():
    print(f"\n--- {node_id} ---")
    print(f"  Status: {node_result.status}")
    print(f"  Time: {node_result.execution_time}ms")
Terminal
python -u 02_parallel.py

Result

Output
Status: Status.COMPLETED
Execution order: ['research', 'fact_check', 'analysis', 'report']
Total nodes: 4
Completed nodes: 4
 
--- research ---
  Status: Status.COMPLETED
  Time: 11632ms
 
--- fact_check ---
  Status: Status.COMPLETED
  Time: 8699ms
 
--- analysis ---
  Status: Status.COMPLETED
  Time: 11888ms
 
--- report ---
  Status: Status.COMPLETED
  Time: 7396ms

execution_order is ['research', 'fact_check', 'analysis', 'report'] — after research, fact_check and analysis ran, then report last. Since fact_check (8.7s) and analysis (11.9s) run in parallel, total execution time is shorter than sequential.
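To see where the savings come from, here is a back-of-the-envelope comparison using the per-node times from this run (ms; values vary per run). The parallel stage costs only as much as its slower branch:

```python
# Per-node execution times from the run above (ms); actual values vary per run.
times = {"research": 11632, "fact_check": 8699, "analysis": 11888, "report": 7396}

# If all four nodes ran in series, the cost would be the full sum.
sequential_ms = sum(times.values())

# With fact_check and analysis in parallel, only the slower branch counts.
parallel_ms = times["research"] + max(times["analysis"], times["fact_check"]) + times["report"]

print(f"Sequential estimate: {sequential_ms} ms")  # 39615 ms
print(f"Parallel estimate:   {parallel_ms} ms")    # 30916 ms
```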

Conditional Branching — Controlling Edges with condition Functions

Let's build a workflow that "classifies user questions and routes technical questions to a tech specialist, business questions to a business specialist."

The graphs so far had fixed paths. To branch dynamically, pass a function to the condition parameter of add_edge; the edge is traversed only when the function returns True.

Python (Graph construction and execution)
builder = GraphBuilder()
builder.add_node(classifier, "classifier")
builder.add_node(tech_specialist, "tech")
builder.add_node(business_specialist, "business")
builder.add_edge("classifier", "tech", condition=is_technical)
builder.add_edge("classifier", "business", condition=is_business)
builder.set_entry_point("classifier")
 
graph = builder.build()
result = graph("How does container orchestration work in Kubernetes?")

The condition function receives GraphState and references the previous node's result to return True / False. GraphState is an object that holds the current execution state of the Graph (results and status of each node). If the classifier outputs "technical," it routes to tech; if "business," to business.

03_conditional.py full code (copy-paste)
03_conditional.py
from strands import Agent
from strands.models import BedrockModel
from strands.multiagent import GraphBuilder
from strands.multiagent.graph import GraphState
 
bedrock_model = BedrockModel(
    model_id="us.anthropic.claude-sonnet-4-20250514-v1:0",
    region_name="us-east-1",
)
 
classifier = Agent(
    name="classifier", model=bedrock_model,
    system_prompt="You are a classifier. Classify the input as either 'technical' or 'business'. Reply with ONLY the word 'technical' or 'business'.",
    callback_handler=None,
)
tech_specialist = Agent(
    name="tech_specialist", model=bedrock_model,
    system_prompt="You are a technical specialist. Provide a technical explanation in 2-3 sentences.",
    callback_handler=None,
)
business_specialist = Agent(
    name="business_specialist", model=bedrock_model,
    system_prompt="You are a business specialist. Explain the business impact in 2-3 sentences.",
    callback_handler=None,
)
 
def is_technical(state: GraphState) -> bool:
    result = state.results.get("classifier")
    if not result or not result.result:
        return False
    for block in result.result.message.get('content', []):
        if 'text' in block and 'technical' in block['text'].lower():
            return True
    return False
 
def is_business(state: GraphState) -> bool:
    result = state.results.get("classifier")
    if not result or not result.result:
        return False
    for block in result.result.message.get('content', []):
        if 'text' in block and 'business' in block['text'].lower():
            return True
    return False
 
builder = GraphBuilder()
builder.add_node(classifier, "classifier")
builder.add_node(tech_specialist, "tech")
builder.add_node(business_specialist, "business")
builder.add_edge("classifier", "tech", condition=is_technical)
builder.add_edge("classifier", "business", condition=is_business)
builder.set_entry_point("classifier")
 
graph = builder.build()
 
print("=== Technical question ===")
result1 = graph("How does container orchestration work in Kubernetes?")
print(f"Execution order: {[node.node_id for node in result1.execution_order]}")
 
print("\n=== Business question ===")
graph2 = builder.build()
result2 = graph2("What is the ROI of migrating to cloud computing?")
print(f"Execution order: {[node.node_id for node in result2.execution_order]}")
Terminal
python -u 03_conditional.py

Result

Output
=== Technical question ===
Execution order: ['classifier', 'tech']
 
=== Business question ===
Execution order: ['classifier', 'business']

Technical questions route to classifier → tech, business questions to classifier → business. Edges where the condition returns False are not traversed, so unnecessary nodes don't execute.

Feedback Loop — Review-Revise Iteration

Let's build a workflow that "writes a haiku, has a reviewer check quality, revises if rejected, and publishes when approved."

Using conditional edges, add a reverse edge from reviewer back to draft_writer to create a feedback loop.

Python (Graph construction and execution)
builder = GraphBuilder()
builder.add_node(draft_writer, "draft_writer")
builder.add_node(reviewer, "reviewer")
builder.add_node(publisher, "publisher")
 
builder.add_edge("draft_writer", "reviewer")
builder.add_edge("reviewer", "draft_writer", condition=needs_revision)
builder.add_edge("reviewer", "publisher", condition=is_approved)
 
builder.set_entry_point("draft_writer")
builder.set_max_node_executions(10)
builder.set_execution_timeout(300)
builder.reset_on_revisit(True)
 
graph = builder.build()
result = graph("Write a haiku about Python programming")

Three key points:

  • set_entry_point("draft_writer") — Cyclic graphs fail entry-point auto-detection, so an explicit entry point is required
  • set_max_node_executions(10) — Caps total node executions to prevent infinite loops
  • reset_on_revisit(True) — Resets node state when revisited. Without this, previous conversation history persists and confuses the agent
04_feedback.py full code (copy-paste)
04_feedback.py
from strands import Agent
from strands.models import BedrockModel
from strands.multiagent import GraphBuilder
from strands.multiagent.graph import GraphState
 
bedrock_model = BedrockModel(
    model_id="us.anthropic.claude-sonnet-4-20250514-v1:0",
    region_name="us-east-1",
)
 
draft_writer = Agent(
    name="draft_writer", model=bedrock_model,
    system_prompt="You are a draft writer. Write or revise a haiku about programming based on the feedback provided. Output ONLY the haiku (3 lines).",
    callback_handler=None,
)
reviewer = Agent(
    name="reviewer", model=bedrock_model,
    system_prompt="""You are a strict haiku reviewer. Check if the text is a valid haiku (5-7-5 syllable pattern).
If it needs revision, start your response with 'REVISION NEEDED:' followed by specific feedback.
If it's approved, start your response with 'APPROVED:' followed by a brief comment.""",
    callback_handler=None,
)
publisher = Agent(
    name="publisher", model=bedrock_model,
    system_prompt="You are a publisher. Format the approved haiku nicely with a title.",
    callback_handler=None,
)
 
def needs_revision(state: GraphState) -> bool:
    result = state.results.get("reviewer")
    if not result or not result.result:
        return False
    for block in result.result.message.get('content', []):
        if 'text' in block and 'REVISION NEEDED' in block['text'].upper():
            return True
    return False
 
def is_approved(state: GraphState) -> bool:
    result = state.results.get("reviewer")
    if not result or not result.result:
        return False
    for block in result.result.message.get('content', []):
        if 'text' in block and 'APPROVED' in block['text'].upper():
            return True
    return False
 
builder = GraphBuilder()
builder.add_node(draft_writer, "draft_writer")
builder.add_node(reviewer, "reviewer")
builder.add_node(publisher, "publisher")
builder.add_edge("draft_writer", "reviewer")
builder.add_edge("reviewer", "draft_writer", condition=needs_revision)
builder.add_edge("reviewer", "publisher", condition=is_approved)
builder.set_entry_point("draft_writer")
builder.set_max_node_executions(10)
builder.set_execution_timeout(300)
builder.reset_on_revisit(True)
 
graph = builder.build()
result = graph("Write a haiku about Python programming")
 
print(f"Status: {result.status}")
print(f"Execution order: {[node.node_id for node in result.execution_order]}")
print(f"Total executions: {len(result.execution_order)}")
 
for node_id, node_result in result.results.items():
    print(f"\n--- {node_id} ---")
    print(f"  Status: {node_result.status}")
    if node_result.result and node_result.result.message:
        for block in node_result.result.message.get('content', []):
            if 'text' in block:
                print(f"  Output: {block['text'][:200]}")
                break
Terminal
python -u 04_feedback.py

Result

Output
Status: Status.COMPLETED
Execution order: ['draft_writer', 'reviewer', 'publisher']
Total executions: 3
 
--- draft_writer ---
  Status: Status.COMPLETED
  Output: Code flows like water
Indentation guides the path
Serpent's logic coils
 
--- reviewer ---
  Status: Status.COMPLETED
  Output: APPROVED: This is a beautifully crafted haiku...
 
--- publisher ---
  Status: Status.COMPLETED
  Output: # **Flowing Code**
*A Haiku on Python Programming*
...

The first draft was APPROVED, so it completed in 3 steps: draft_writer → reviewer → publisher. If revision were needed, the execution_order would be longer: draft_writer → reviewer → draft_writer → reviewer → publisher.
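Because every execution, including revisits, appears in execution_order, the number of revision rounds can be derived from it. A tiny sketch, assuming the node ids from this example and a hypothetical run with one rejection:

```python
# execution_order node ids from a hypothetical run with one rejected draft.
order = ["draft_writer", "reviewer", "draft_writer", "reviewer", "publisher"]

# Every draft_writer execution beyond the first is a revision round.
revisions = order.count("draft_writer") - 1
print(f"Revision rounds: {revisions}")  # 1
```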

Summary

  • Just define nodes and edges with GraphBuilder for deterministic workflows — add_node registers agents, add_edge defines dependencies. Execution order is explicitly controlled by edges.
  • Nodes without dependencies automatically run in parallel — Just draw edges from one node to multiple targets. Python's Graph uses OR semantics.
  • Conditional edges enable dynamic branching — Pass a function to add_edge's condition parameter. Reference previous node results from GraphState to decide branching.
  • Feedback loops are built with reverse conditional edges — For cyclic graphs, set_entry_point, set_max_node_executions, and reset_on_revisit are required.


Shinya Tahara

Solutions Architect @ AWS

I'm a Solutions Architect at AWS, providing technical guidance primarily to financial industry customers. I share learnings about cloud architecture and AI/ML on this site. The views and opinions expressed on this site are my own and do not represent the official positions of my employer.
