
AWS + Blockchain + AI: Building a Multi-Chain Intelligence Copilot with AMB Query and Bedrock

Apr 27, 2026 · 7 min read


Scenario

A compliance team needs an internal copilot that explains suspicious wallet activity across Bitcoin and Ethereum and prepares human-readable incident notes for investigators.

Why this architecture

Amazon Managed Blockchain (AMB) Query exposes standardized APIs over public blockchain data. Amazon Bedrock can summarize and explain that data for analysts. Together, the two services cut custom ETL work and speed up investigation workflows.

Key constraints from AWS docs

  • AMB Query currently runs in us-east-1.
  • API access is SigV4-authenticated.
  • It supports Bitcoin and Ethereum networks (mainnets + testnets listed in docs).

Architecture

graph TD
  Analyst[Compliance Analyst] --> UI[Internal Copilot UI]
  UI --> API[FastAPI Backend]
  API --> AMBQ[Amazon Managed Blockchain Query]
  API --> S3[(S3 Raw Query Snapshots)]
  API --> ATH[(Athena for Historical Analysis)]
  API --> BR[Amazon Bedrock]
  API --> DDB[(DynamoDB Case State)]
  API --> CW[CloudWatch + Alarms]

Trade-offs

  • Fully serverless path (Lambda/API Gateway) is cheaper at low volume.
  • ECS/Fargate gives steadier latency for heavy interactive workloads.
  • Caching is mandatory to avoid repeated expensive blockchain lookups.
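The caching point above can be sketched with a small in-memory TTL cache. The `TTLCache` name and the 300-second default are illustrative assumptions; a shared store such as ElastiCache would replace this in a multi-instance deployment.

```python
import time
from typing import Any, Callable, Dict, Tuple


class TTLCache:
    """Minimal in-memory cache keyed by (network, tx_id) with per-entry expiry."""

    def __init__(self, ttl_seconds: float = 300.0):
        self.ttl = ttl_seconds
        self._store: Dict[Tuple[str, str], Tuple[float, Any]] = {}

    def get_or_fetch(self, network: str, tx_id: str, fetch: Callable[[], Any]) -> Any:
        key = (network, tx_id)
        hit = self._store.get(key)
        if hit and time.monotonic() - hit[0] < self.ttl:
            return hit[1]  # fresh cached value, no blockchain lookup
        value = fetch()  # cache miss or expired: hit the upstream API once
        self._store[key] = (time.monotonic(), value)
        return value
```

Confirmed transactions are immutable, so a long TTL is usually safe for them; mempool or unconfirmed data warrants a much shorter one.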

Step-by-step tutorial

1) Bootstrap

# bash (Linux/macOS)
export AWS_REGION=us-east-1
export PROJECT=chain-copilot
export ACCOUNT_ID=$(aws sts get-caller-identity --query Account --output text)
export BUCKET=${PROJECT}-${ACCOUNT_ID}-${AWS_REGION}

# PowerShell (Windows)
$env:AWS_REGION = "us-east-1"
$env:PROJECT = "chain-copilot"
$env:ACCOUNT_ID = (aws sts get-caller-identity --query Account --output text)
$env:BUCKET = "$($env:PROJECT)-$($env:ACCOUNT_ID)-$($env:AWS_REGION)"

2) Create storage and case table

aws s3api create-bucket --bucket "$BUCKET" --region "$AWS_REGION"
aws dynamodb create-table \
  --table-name ${PROJECT}-cases \
  --attribute-definitions AttributeName=pk,AttributeType=S AttributeName=sk,AttributeType=S \
  --key-schema AttributeName=pk,KeyType=HASH AttributeName=sk,KeyType=RANGE \
  --billing-mode PAY_PER_REQUEST \
  --sse-specification Enabled=true
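The pk/sk composite key created above supports a one-case, many-events access pattern. The key layout below is an assumption for illustration, not a prescribed schema: all events for a case share a partition key, and the sort key orders them chronologically.

```python
from datetime import datetime, timezone


def case_event_item(case_id: str, event_type: str, payload: dict) -> dict:
    """Build a DynamoDB item for the cases table: pk groups by case,
    sk is a timestamp so a Query on pk returns the case timeline in order."""
    ts = datetime.now(timezone.utc).isoformat()
    return {
        "pk": f"CASE#{case_id}",
        "sk": f"EVENT#{ts}",
        "event_type": event_type,
        "payload": payload,
    }
```

Querying `pk = CASE#<id>` then returns the full investigation timeline in a single request.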

3) Query AMB data from Python

import boto3

client = boto3.client("managedblockchain-query", region_name="us-east-1")


def get_eth_tx(transaction_hash: str):
    # Ethereum lookups pass the transactionHash parameter
    return client.get_transaction(
        network="ETHEREUM_MAINNET",
        transactionHash=transaction_hash
    )


def get_btc_tx(transaction_id: str):
    # Bitcoin lookups pass the transactionId parameter
    return client.get_transaction(
        network="BITCOIN_MAINNET",
        transactionId=transaction_id
    )
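The two helpers differ only in which identifier parameter they pass. A small pure function can centralize that branching; the mapping below mirrors the functions above and is a sketch, not an exhaustive list of AMB Query networks.

```python
def tx_lookup_kwargs(network: str, tx_id: str) -> dict:
    """Map a network to the get_transaction keyword argument it expects."""
    if network == "ETHEREUM_MAINNET":
        return {"network": network, "transactionHash": tx_id}
    if network == "BITCOIN_MAINNET":
        return {"network": network, "transactionId": tx_id}
    raise ValueError(f"Unsupported network: {network}")


# usage: client.get_transaction(**tx_lookup_kwargs("ETHEREUM_MAINNET", "0x..."))
```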

4) FastAPI service for investigators

from fastapi import FastAPI, HTTPException
from pydantic import BaseModel
import boto3
import json

app = FastAPI(title="Blockchain Intelligence Copilot")
mbq = boto3.client("managedblockchain-query", region_name="us-east-1")
bedrock = boto3.client("bedrock-runtime", region_name="us-east-1")

class TxRequest(BaseModel):
    network: str
    tx_id: str

@app.post("/investigate")
def investigate(req: TxRequest):
    # Validate up front so a bad network returns 400 instead of being
    # swallowed by the generic handler below and re-raised as a 500.
    if req.network not in ("ETHEREUM_MAINNET", "BITCOIN_MAINNET"):
        raise HTTPException(status_code=400, detail="Unsupported network")
    try:
        if req.network == "ETHEREUM_MAINNET":
            tx = mbq.get_transaction(network=req.network, transactionHash=req.tx_id)
        else:
            tx = mbq.get_transaction(network=req.network, transactionId=req.tx_id)
    except Exception as e:
        raise HTTPException(status_code=502, detail=str(e))  # upstream failure

    # default=str handles datetime fields in the boto3 response
    prompt = (
        "Summarize this blockchain transaction for compliance review: "
        f"{json.dumps(tx, default=str)[:12000]}"
    )
    body = {"inputText": prompt}
    llm_resp = bedrock.invoke_model(
        modelId="amazon.titan-text-lite-v1",
        contentType="application/json",
        accept="application/json",
        body=json.dumps(body)
    )
    # The response body is a JSON document; extract the generated text
    # rather than returning raw bytes to the caller
    result = json.loads(llm_resp["body"].read())
    return {"transaction": tx, "analysis": result["results"][0]["outputText"]}

5) Persist query snapshots

import boto3
import json
import os
from datetime import datetime, timezone

s3 = boto3.client("s3")
# Resolve the bucket created in step 1 from the environment
BUCKET = os.environ["BUCKET"]


def persist_snapshot(case_id: str, payload: dict):
    key = f"snapshots/{case_id}/{datetime.now(timezone.utc).isoformat()}.json"
    s3.put_object(Bucket=BUCKET, Key=key, Body=json.dumps(payload, default=str).encode("utf-8"))
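If snapshots will later be scanned by Athena, a Hive-style date-partitioned key layout keeps those scans cheap by letting Athena prune by date. The `dt=` prefix convention below is an assumed layout, shown as an alternative to the flat per-case key above.

```python
from datetime import datetime, timezone
from typing import Optional


def snapshot_key(case_id: str, when: Optional[datetime] = None) -> str:
    """Hive-style partitioned S3 key so Athena can prune partitions by date."""
    ts = when or datetime.now(timezone.utc)
    return (
        f"snapshots/dt={ts.strftime('%Y-%m-%d')}/"
        f"case={case_id}/{ts.strftime('%H%M%S%f')}.json"
    )
```

An Athena table partitioned on `dt` then reads only the days an investigation touches instead of the whole bucket.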

6) Monitoring and alerting

aws sns create-topic --name ${PROJECT}-alerts
aws cloudwatch put-metric-alarm \
  --alarm-name ${PROJECT}-api-errors \
  --namespace AWS/Lambda \
  --metric-name Errors \
  --dimensions Name=FunctionName,Value=${PROJECT}-api \
  --statistic Sum --period 60 --evaluation-periods 5 --threshold 5 \
  --comparison-operator GreaterThanOrEqualToThreshold \
  --alarm-actions arn:aws:sns:${AWS_REGION}:${ACCOUNT_ID}:${PROJECT}-alerts

Security design

  • Least-privilege IAM policies scoped to the AMB Query APIs
  • No client-side AWS credentials
  • Encryption at rest for S3 and DynamoDB with KMS-managed keys
  • Audit API usage with CloudTrail

Cost optimization

  • Cache transaction lookups by hash/ID
  • Store raw snapshots once, analyze them many times
  • Batch historical analytics through Athena instead of repeated live queries

Pricing note: verify AMB Query, Bedrock, S3, and Athena pricing on official AWS pricing pages.

Production checklist

  • Retry and timeout handling for AMB Query calls
  • Data retention policy for transaction snapshots
  • Redaction policy for sensitive investigation notes
  • Analyst feedback loop for false-positive summaries
  • Budget and anomaly alarms configured
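The retry item on the checklist can be sketched with a stdlib backoff wrapper. The attempt count and delays below are assumptions; in practice botocore's built-in retry configuration already covers throttling, and this pattern is for failures above the SDK layer.

```python
import time
from functools import wraps


def with_backoff(max_attempts: int = 4, base_delay: float = 0.5):
    """Retry transient failures with exponential backoff (0.5s, 1s, 2s, ...)."""
    def decorator(fn):
        @wraps(fn)
        def wrapper(*args, **kwargs):
            for attempt in range(max_attempts):
                try:
                    return fn(*args, **kwargs)
                except Exception:
                    if attempt == max_attempts - 1:
                        raise  # out of retries: surface the real error
                    time.sleep(base_delay * (2 ** attempt))
        return wrapper
    return decorator
```

Wrapping the AMB Query helpers with this decorator keeps retry policy out of the request-handling code.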

References

  • https://docs.aws.amazon.com/managed-blockchain/latest/ambq-dg/key-concepts.html
  • https://docs.aws.amazon.com/managed-blockchain/latest/ambq-dg/getting-started.html
  • https://aws.amazon.com/documentation-overview/managed-blockchain/