Python SDK
Zero-dependency Python client that mirrors the Rust SDK one-for-one. Works on Python 3.8+, uses only the standard library (urllib, json). The optional websockets extra unlocks streaming.
This page is a reference: every public method, every argument, every return shape, grouped by the product ladder (Level 0 → Level 5). For the narrative onboarding, see Quick Start. For the wire format, see the API Reference.
pip install sequence-markets # REST only — zero dependencies
pip install "sequence-markets[stream]" # + WebSocket streaming
PyPI: sequence-markets. Import name is sequence_markets.
Client
Sequence(api_key, base_url="https://api.sequencemkts.com")
Construct a client. No network traffic happens on construction — the token is validated on the first request.
| Arg | Type | Default | Description |
|---|---|---|---|
| api_key | str | — | seq_live_... or seq_test_.... The seq_test_* prefix forces sandbox routing for every order regardless of the per-call sandbox flag. |
| base_url | str | production URL | Override for local (http://localhost:50052), staging, or a self-hosted CC. Trailing slash is stripped. |
from sequence_markets import Sequence, SequenceError
seq = Sequence("seq_live_...") # production
seq = Sequence("seq_test_...", "http://localhost:50052") # local sandbox
SequenceError
Every method raises SequenceError on a non-2xx response. It exposes:
| Attribute | Description |
|---|---|
| status | HTTP status code (int) |
| message | Server's error.message or first 200 chars of the body |
try:
    seq.buy("ETH-USD", 50)
except SequenceError as e:
    if e.status == 429:
        time.sleep(1)  # rate limited — back off
    elif e.status == 409:
        ...  # kill switch active or duplicate graph_id
    else:
        raise
A small number of methods raise ValueError for client-side validation before any HTTP call (e.g. amend(...) with neither new_price nor new_qty).
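The 429 branch above generalizes into a small backoff wrapper. A minimal sketch: the helper only assumes the exception exposes a numeric status attribute, as SequenceError does, and the StubError class exists purely so the demo runs without a network.

```python
import time

def call_with_backoff(fn, retries=3, base_delay=0.5, retryable=(429, 503)):
    """Retry fn() on transient statuses with exponential backoff.

    Works with any exception exposing a `.status` int (SequenceError does);
    non-retryable statuses and exhausted retries re-raise.
    """
    for attempt in range(retries + 1):
        try:
            return fn()
        except Exception as e:
            if getattr(e, "status", None) not in retryable or attempt == retries:
                raise
            time.sleep(base_delay * (2 ** attempt))

# Demo without a network: a stub error carrying .status, and a call that
# rate-limits twice before succeeding.
class StubError(Exception):
    def __init__(self, status):
        self.status = status

calls = {"n": 0}
def flaky():
    calls["n"] += 1
    if calls["n"] < 3:
        raise StubError(429)
    return "ok"

print(call_with_backoff(flaky, base_delay=0.01))  # → ok
```

In real code the try body would be a seq.buy(...) / seq.sell(...) call and the except clause would name SequenceError.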
Level 0 — Connect
Credentials are encrypted at rest under the per-profile CREDENTIAL_ENCRYPTION_KEY pepper and written via POST /v1/credentials/{venue}. Sequence never sees your plaintext key after the initial call.
connect_venue(venue, api_key, api_secret, passphrase=None, extra_json=None)
Store venue API credentials. Idempotent — subsequent calls overwrite.
| Arg | Type | Required | Description |
|---|---|---|---|
| venue | str | ✓ | binance, coinbase, kraken, okx, bybit, kalshi, polymarket, hyperliquid, bitget, cryptocom, dex_ethereum, solana |
| api_key | str | ✓ | Venue-issued API key. Empty string for Polymarket (signer-only flow). |
| api_secret | str | ✓ | Venue-issued secret. Empty string for Polymarket. For Kalshi: the full RSA PEM (PKCS#1 or PKCS#8). |
| passphrase | str | ✗ | OKX, Coinbase Advanced Trade, Crypto.com, Bitget |
| extra_json | Any | ✗ | Venue-specific metadata — see the table below |
Returns None. Raises SequenceError on invalid credentials.
Venue quick reference for the tricky ones:
| Venue | api_key | api_secret | passphrase | extra_json |
|---|---|---|---|---|
| Kalshi | key ID (UUID) | full RSA PEM | — | — |
| Polymarket | "" | "" | — | {signer_private_key, proxy_address?, builder?} |
| Hyperliquid | agent-wallet address | agent-wallet private-key hex | vault address (optional) | — |
# CEX — flat api_key + api_secret
seq.connect_venue("binance", api_key="AK…", api_secret="SK…")
seq.connect_venue("okx", api_key="AK…", api_secret="SK…", passphrase="••")
# Kalshi — RSA keypair
seq.connect_venue(
    "kalshi",
    api_key="dc9d0cd5-076a-4ea8-b406-1847279dac4b",
    api_secret="-----BEGIN RSA PRIVATE KEY-----\nMIIE...\n-----END RSA PRIVATE KEY-----",
)
# Polymarket — EIP-712 signer in extra_json; L2 creds are derived on first order
seq.connect_venue(
    "polymarket",
    api_key="",
    api_secret="",
    extra_json={
        "signer_private_key": "0xabc…",
        "proxy_address": "0xdef…",  # optional
        "builder": "0x…",           # optional
    },
)
disconnect_venue(venue)
Delete stored credentials for a venue. The live edge keeps its existing WS subscriptions until the next reconnect — call this before rotating keys, not mid-session.
venues() → list[dict]
List every venue with credentials on record. Each item: {venue: str, connected: bool, last_heartbeat_ns?: int, error?: str}.
for v in seq.venues():
    print(f"{v['venue']:10} {'✓' if v['connected'] else '✗'}")
health() → bool
Unauthenticated liveness check against /v1/health/live. Returns True on 2xx, False otherwise — never raises.
Level 1 — See
quote(symbol, depth=10) → dict
NBBO + per-venue BBO + merged book. The one call every strategy starts with.
| Arg | Type | Default | Description |
|---|---|---|---|
| symbol | str | — | Trading pair (BTC-USD), Kalshi ticker (KXBTCZ-…), Polymarket slug (fed-decision-…), or raw token ID |
| depth | int | 10 | Book levels per side. Clamped to [1, 100]. Polymarket markets routinely have 50+ passive levels — raise this if you're running a market-making quote loop |
Return shape:
{
  "symbol": "BTC-USD",
  "nbbo": {"bid": 73984.12, "ask": 73985.03, "mid": 73984.57, "spread_bps": 1.23},
  "venues": {
    "binance": {"bid": 73984.12, "ask": 73985.03, "bid_size": 12.4, "ask_size": 8.1, "age_ms": 42},
    "coinbase": {"bid": 73983.87, "ask": 73985.28, "bid_size": 0.8, "ask_size": 1.2, "age_ms": 55},
  },
  "book": {
    "bids": [{"price": 73984.12, "size": 12.4, "venue": "binance"}, …],
    "asks": [{"price": 73985.03, "size": 8.1, "venue": "binance"}, …],
  }
}
quote(symbol, depth=10, nowait=False) → dict
(full parameter list — nowait skips the up-to-2s cold-subscribe wait. Research-only; trading code must never pass nowait=True since the first call will return zeros if the symbol isn't in the edge's sticky set.)
Every returned quote includes a source field telling you which path served it:
| Value | Meaning |
|---|---|
| "edge_nbbo" | Live WebSocket cache hit — sub-ms |
| "edge_rpc_snapshot" | Cold-path REST via the edge (Polymarket bulk /books or Kalshi signed /orderbook) — 100–300ms, also kicks off a background WS subscribe |
| "venue_public_api" | Graceful-degradation direct HTTP — edge RPC was unreachable |
| "unavailable" | No path produced book data (market genuinely empty, venue 404'd, etc.) — not an error |
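In practice the table collapses into a couple of predicates. A sketch of how a caller might gate on source; the function names here are illustrative, not part of the SDK.

```python
LIVE_SOURCES = {"edge_nbbo"}
BOOK_SOURCES = {"edge_nbbo", "edge_rpc_snapshot", "venue_public_api"}

def safe_to_trade(quote):
    """Trading loops should demand the sub-ms live WS cache."""
    return quote.get("source") in LIVE_SOURCES

def has_book(quote):
    """Research can accept any path that produced book data;
    "unavailable" is not an error, just no data."""
    return quote.get("source") in BOOK_SOURCES

print(safe_to_trade({"source": "edge_rpc_snapshot"}))  # → False
print(has_book({"source": "edge_rpc_snapshot"}))       # → True
```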
quotes(symbols) → list[dict]
Thin /v1/quotes?symbols=... GET pass-through. For bulk-scan workloads (50+ symbols) prefer quotes_batch() — it's dramatically faster on cold prediction-market paths thanks to server-side fan-out and the Polymarket bulk /books endpoint.
bbo = seq.quotes(["BTC-USD", "ETH-USD", "SOL-USD"])
quotes_batch(symbols, depth=10, nowait=True, concurrency=64, instrument_type=None) → dict
POST /v1/quotes_batch. Server-side tokio fan-out with semaphore-bounded concurrency. Per-symbol errors return as partials so one bad symbol can't fail the batch.
# Scan all Kalshi + Polymarket markets in one call
r = seq.quotes_batch(
    symbols=[*kalshi_tickers, *polymarket_token_ids],
    depth=20,
    nowait=True,     # default — see note below
    concurrency=64,  # cap 256
)
for q in r["quotes"]:
    if "error" in q:
        continue
    print(q["symbol"], q["nbbo"]["bid"], q["nbbo"]["ask"], q["source"])
| Arg | Default | Description |
|---|---|---|
| symbols | — | Up to 2000 symbols, any mix of CEX pairs, Kalshi tickers, Polymarket slugs / token_ids, URLs |
| depth | 10 | Book levels per side (1–100) uniformly applied |
| nowait | True | Defaults to True because serial 2s subscribe-waits are catastrophic at batch scale. Set False only if every symbol is already in the edge's sticky set and you need first-call data |
| concurrency | 64 | Max concurrent upstream fetches (cap 256). Raise for faster scans, lower for nicer venue manners |
| instrument_type | auto | Uniform override — spot, perp, or prediction. Auto-detects per symbol otherwise |
Return shape:
{
  "quotes": [
    {"symbol": "BTC-USD", "source": "edge_nbbo", "nbbo": {...}, ...},
    {"symbol": "<token>", "source": "edge_rpc_snapshot", "nbbo": {...}, ...},
    {"symbol": "bad", "error": "resolver: unknown symbol"},
  ],
  "count": 3,
  "elapsed_ms": 215,
}
See Market Data → source field for how to interpret the per-entry source.
positions_unified(venues=None, kinds=None, include_closed=False) → dict
Unified positions across fiat, crypto, perps, and event contracts. One endpoint, one shape — replaces the removed balances(), positions(), perp_positions(), and portfolio() methods.
Fills come from three reconciled sources (fills → write path, venue balance APIs, venue portfolio APIs) and all land in the same positions_v2 table.
# All active positions across every venue
resp = seq.positions_unified()
print(f"NAV = ${resp['totals']['nav_usd_1e9']/1e9:,.2f}")
# Cash only (fiat + crypto stablecoins)
cash = seq.positions_unified(kinds=["fiat", "crypto"])
# Event contracts on Kalshi + Polymarket
events = seq.positions_unified(
    kinds=["event_contract"],
    venues=["kalshi", "polymarket"],
)
# Include closed / resolved / redeemed rows for historical reporting
historical = seq.positions_unified(include_closed=True)
Returns:
{
  "positions": [
    {
      "id": 101,
      "venue": "coinbase",
      "instrument": {"kind": "crypto", "symbol": "USDC"},
      "qty_1e8": 237_848_346_884,
      "locked_qty_1e8": 0,
      "available_qty_1e8": 237_848_346_884,
      "entry_price_usd_1e9": 1_000_000_000,
      "current_price_usd_1e9": 1_000_000_000,
      "current_value_usd_1e9": 2_378_483_468_840,
      "unrealized_pnl_usd_1e9": 0,
      "realized_pnl_usd_1e9": 0,
      "fees_usd_1e9": 0,
      "lifecycle": {"state": "active"},
      "transferable": True,
      "settled": False,
      "settled_mark_usd_1e9": None,
    },
    # ... more rows for perps, event contracts, fiat cash.
    # Settled event contracts look like this:
    # {
    #   "venue": "kalshi",
    #   "instrument": {"kind": "event_contract", "market_slug": "KX...", "outcome": "YES"},
    #   "qty_1e8": 500_000_000,
    #   "current_price_usd_1e9": 1_000_000_000,  # $1 — winning side
    #   "current_value_usd_1e9": 5_000_000_000,
    #   "lifecycle": {"state": "resolved", "outcome": "YES",
    #                 "economic_value_usd_1e9": 5_000_000_000, "resolved_at": 1_776_...},
    #   "settled": True,
    #   "settled_mark_usd_1e9": 1_000_000_000,
    # }
  ],
  "totals": {
    "nav_usd_1e9": 48_228_483_468_840,
    "cash_usd_1e9": 2_478_483_468_840,
    "unrealized_pnl_usd_1e9": 750_000_000_000,
    "realized_pnl_usd_1e9": 0,
    "fees_usd_1e9": 0,
    "num_positions": 4,
  },
  "as_of_ns": 1_776_828_335_325_953_000,
}
instrument.kind is a tagged union:
| kind | Additional fields |
|---|---|
| "fiat" | code (e.g., "USD") |
| "crypto" | symbol (e.g., "BTC", "USDC") |
| "perp" | symbol, base (e.g., "BTC-PERP" / "BTC") |
| "event_contract" | market_slug, outcome, token_id (optional), expiry_ts (optional) |
balances(), positions(), perp_positions(), portfolio() still exist but emit a DeprecationWarning and hit endpoints that now return 404. Migrate to positions_unified() before the next SDK major.
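The 1e9/1e8 fixed-point fields invite a small aggregation helper. A sketch over the response shape above; value_by_venue is local post-processing, not an SDK method.

```python
from collections import defaultdict

def value_by_venue(resp):
    """Sum current USD value per venue from a positions_unified() response.
    current_value_usd_1e9 is 1e9-scaled, so divide once at the end."""
    totals = defaultdict(int)
    for p in resp["positions"]:
        totals[p["venue"]] += p["current_value_usd_1e9"]
    return {v: usd_1e9 / 1e9 for v, usd_1e9 in totals.items()}

sample = {"positions": [
    {"venue": "coinbase", "current_value_usd_1e9": 2_378_483_468_840},
    {"venue": "kalshi",   "current_value_usd_1e9": 5_000_000_000},
]}
print(value_by_venue(sample))  # per-venue USD values, e.g. kalshi → 5.0
```

Summing in integer fixed-point and converting to float only at the edge avoids accumulating rounding error across many rows.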
symbols() → list[dict]
Every tradable instrument. Item: {symbol, base, quote, venues: [venue_id], type: "spot"|"perp"|"prediction"}.
price_history(symbol, range="1d", fidelity_secs=60, *, venue=None, limit=None, paginate=False, start_ts=None, end_ts=None, underlying=False) → dict
One primitive that covers CEX, perp, DEX, Kalshi, and Polymarket histories.
| Arg | Type | Default | Description |
|---|---|---|---|
| symbol | str | — | Same universe as quote(). For Polymarket append :yes / :no on non-binary slugs |
| range | str | "1d" | 1h, 6h, 1d, 1w, 1m, 3m, 6m, 1y, 2y, 5y, max. Ignored when fidelity_secs=0 or when both start_ts and end_ts are set |
| fidelity_secs | int | 60 | Bar cadence. 0 switches to tick mode — returned points carry qty_1e8, side, venue_id |
| venue | str | — | Required for long-range (>24h) queries on CEX/perp/DEX symbols; ignored for Polymarket / Kalshi / short-range. Values: binance, coinbase, kraken, okx, bybit, hyperliquid, dex_ethereum, solana, kalshi, polymarket |
| limit | int | auto | Tick mode only. Default 100 (live tape), 50 000 cap on Polymarket lifecycle backfill |
| paginate | bool | False | Polymarket-only. Walk back in 7-day windows to bypass the venue's ~5k-point per-call cap |
| start_ts / end_ts | int | — | Explicit unix-seconds window. Both-set overrides range |
| underlying | bool | False | Include the Chainlink/Binance underlying curve for Polymarket crypto updowns |
Return shape:
{
  "symbol": "BTC-USDC",
  "source": "tape",  # or "polymarket_rest", "kalshi_rest", "venue_proxy",
                     # "venue_ticks", "binance_aggtrades_historical", ...
  "window": {"start_ts_s": 1776000000, "end_ts_s": 1776086400},
  "points": [
    {"ts_ns": 1776000000000000000, "price_1e9": 73984123456789,
     "qty_1e8": None, "side": None, "venue_id": None},  # bar mode
    …
  ],
  "underlying": None  # or {symbol, source, warning, anchor_price_usd,
                      #     final_price_usd, agrees_with_settlement, points}
}
# Year of Binance hourly BTC-USDT
btc = seq.price_history("BTC-USDT", range="1y", fidelity_secs=3600, venue="binance")
# Last 200 BTC-USDC trade prints
ticks = seq.price_history("BTC-USDC", fidelity_secs=0, limit=200)
for p in ticks["points"]:
    print(p["ts_ns"], p["price_1e9"] / 1e9, p["qty_1e8"] / 1e8, p["side"])
# Polymarket 5-min up/down lifecycle, with Chainlink reference
hist = seq.price_history("btc-updown-5m-1776537600", fidelity_secs=0,
                         limit=50_000, underlying=True)
funding_rates(venue=None, symbol=None) → dict
Perpetual funding rates, normalized to bps_per_hour so Binance (8h cadence) and Hyperliquid (continuous) are directly comparable.
Return: {unit: "bps_per_hour", rates: [{venue, symbol, rate_bps_per_hour, predicted_rate_bps_per_hour, mark_price_1e9, open_interest_1e8, next_settlement_ns, snapshot_age_ms}, …]}.
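Because everything is normalized to bps_per_hour, a cross-venue carry scan is a few lines of local filtering. A sketch over the return shape above; widest_funding_spread is illustrative local code, not an SDK method, and matching symbols by prefix is a simplification.

```python
def widest_funding_spread(resp, base="BTC"):
    """Largest funding gap between two venues for one underlying: the raw
    ingredient of a carry trade (receive funding where it is highest,
    pay where it is lowest)."""
    rows = [r for r in resp["rates"] if r["symbol"].startswith(base)]
    if len(rows) < 2:
        return None
    hi = max(rows, key=lambda r: r["rate_bps_per_hour"])
    lo = min(rows, key=lambda r: r["rate_bps_per_hour"])
    return (hi["venue"], lo["venue"], hi["rate_bps_per_hour"] - lo["rate_bps_per_hour"])

sample = {"unit": "bps_per_hour", "rates": [
    {"venue": "binance",     "symbol": "BTC-USDT-PERP", "rate_bps_per_hour": 1.25},
    {"venue": "hyperliquid", "symbol": "BTC-PERP",      "rate_bps_per_hour": 0.25},
]}
print(widest_funding_spread(sample))  # → ('binance', 'hyperliquid', 1.0)
```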
funding_rate(venue, symbol) → dict | None
Single-instrument lookup. Returns None if the rate isn't currently tracked (venue not connected, or symbol unknown).
funding_history(symbol, venue, range="1m", start_ts=None, end_ts=None) → dict
Historical funding. rate_1e9 is a signed fraction × 1e9 — positive means longs pay shorts. Divide by 1e9, then ×10 000 for per-settlement bps.
# Past 30 days of Binance BTC-USDT funding
r = seq.funding_history("BTC-USDT", "binance", "1m")
for p in r["points"]:
    bps = p["rate_1e9"] / 1e9 * 10_000
    print(f"{p['ts_ns']}: {bps:+.3f} bps")
markets(venue="polymarket", **filters) → dict
Unified prediction-market discovery across Kalshi + Polymarket. Every market comes back in the same shape so you can call seq.quote(m["yes_token_id"]) without branching on venue.
| Filter | Type | Description |
|---|---|---|
| q | str | Free-text search across question + slug |
| slug | str | Direct slug lookup (short-circuits the search) |
| tag | str | Polymarket tag slug: bitcoin, politics, sports, … |
| tag_id / related_tags / exclude_tag_id | int / bool / int | Polymarket tag-id paths |
| active | bool = True | Currently-trading only. Pair with closed=True for full-history discovery — see Status filters |
| closed | bool = False | Include resolved markets |
| limit | int = 20 | Page size; clamped to [1, 1000] (server cap is 1000) |
| offset | int = 0 | Offset pagination — avoid for deep walks, use cursor instead |
| cursor | str \| None | Opaque pagination cursor from the prior response's next_cursor. Use for exhaustive page-by-page walks; for full-catalog walks prefer iter_markets() which threads cursors automatically |
| order | str = "" | volume, volume_week, volume_total, liquidity, end_date, start_date, created, competitive, closed_time. Empty default = no client-side rerank — the SDK previously defaulted to volume, which silently triggered Kalshi's 4× over-fetch path and capped exhaustive walks at ~25% of the catalog. Pass order="volume" only when you actually want top-by-volume and aren't paginating |
| expand | str \| None | "outcomes" flat-maps each multi-outcome Polymarket event into N binary rows (one per outcome token) |
Each market item:
{
  "venue": "polymarket",
  "slug": "fed-decision-in-october",
  "question": "Will the Fed cut rates in October?",
  "outcomes": ["Yes", "No"],
  "outcome_token_ids": ["128…", "129…"],
  "yes_token_id": "128…",
  "no_token_id": "129…",
  "condition_id": "0x…",
  "neg_risk": False,
  "volume_24h": 412_903.12,
  "volume_week": 1_820_004.55,
  "volume_total": 5_210_448.88,
  "liquidity": 98_221.40,
  "end_date": "2026-10-31T23:59:59Z",
  "start_date": "2026-09-01T00:00:00Z",
  "url": "https://polymarket.com/event/fed-decision-in-october"
}
iter_markets(venue="polymarket", page_size=1000, **filters) → Iterator[dict]
Yield every market matching filters, page by page, threading next_cursor between calls. Stops when the server reports has_more=false. Same filter kwargs as markets() except cursor/offset/limit are managed internally.
# Walk Kalshi's currently-trading catalog (~43k rows, ~14 s).
for m in seq.iter_markets(venue="kalshi", active=True):
    process(m)
# Every Polymarket binary outcome (skip placeholder slots).
for m in seq.iter_markets(venue="polymarket", expand="outcomes", active=True):
    if not m.get("is_placeholder_outcome"):
        process(m)
For "everything Kalshi/PM ever recorded," pair active=True, closed=True — see Status filters for the full matrix.
market(slug, venue="polymarket") → dict | None
Single-market lookup — thin wrapper around markets(slug=slug, limit=1). Returns None if the venue doesn't know the slug.
search_markets(query, limit=20, **kwargs) → dict
Free-text convenience wrapper — markets(q=query, limit=limit, **kwargs).
new_markets(limit=20, **kwargs) → dict
Newest-first — wrapper around markets(order="created"). Dedupe on condition_id, not slug/token_id — slugs churn on recurring contracts (5-min BTC, daily SPY, …).
seen = set()
while True:
    for m in seq.new_markets(limit=50)["markets"]:
        if m["condition_id"] not in seen:
            seen.add(m["condition_id"])
            print("NEW:", m["slug"], m["question"])
    time.sleep(10)
settlement(slug, underlying=False) → dict
Prediction-market ground-truth settlement. The resolved_outcome is always what the venue actually paid — never reconstructed from our Chainlink read.
| Arg | Description |
|---|---|
| slug | Polymarket event slug, :yes/:no slug, raw token ID, or full URL |
| underlying | True to include the Chainlink curve resampled inside the event window (slower — bounded on-chain RPC) |
Return:
{
  "slug": "btc-updown-5m-1776537600",
  "question": "Bitcoin up or down 5min…",
  "outcomes": ["Up", "Down"],
  "outcome_token_ids": ["…", "…"],
  "yes_token_id": "…", "no_token_id": "…",
  "condition_id": "0x…",
  "neg_risk": False,
  "status": "resolved",      # "open" | "resolving" | "resolved"
  "resolved_outcome": "Up",  # None while unresolved
  "outcome_source": "polymarket_gamma",
  "yes_price": 1.0, "no_price": 0.0,
  "event_start_s": 1776537600, "event_end_s": 1776537900,
  "resolution_source": "polymarket_uma",
  "underlying": {            # only when underlying=True
    "symbol": "BTCUSD", "source": "rtds_chainlink",
    "warning": "bounded RPC read, ±1 block",
    "anchor_price_usd": 73_984.12,
    "final_price_usd": 73_991.03,
    "agrees_with_settlement": True,  # False → drop for ML training
    "points": [{"ts_ns": …, "price_usd": …}, …]
  }
}
edges() → Any
Connected venue edges with heartbeat state. Useful in CI — assert all(e["status"] == "connected" for e in seq.edges()["edges"]).
account() → Any
Account profile: tier, rate-limit budget, feature flags.
risk_limits() → Any
Profile-level risk limits (max_notional_usd, max_drawdown_usd, kill_switch_active, …).
fees(ticker, venues=None) → dict
Maker/taker fees per venue for a ticker. null taker/maker = "do not route here," not "free."
seq.fees("KXMLBHR-26APR292140KCATH-ATHLBUTLER4-1:YES",
         venues=["kalshi", "polymarket"])
# {"ticker": "...", "fees": [{"venue": "kalshi", "maker_bps": None, "taker_bps": 350}, ...]}
/v1/fees returns the fee components and curve formula (fee_model, components.rate_bps, formula.taker_fee_usd) so latency-sensitive callers can compute the exact realized fee locally — fetch once per ticker, cache, evaluate per fill. For a server-authoritative answer that also includes size-aware slippage + total landed cost, use preview() (/v1/orders/preview).
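The null-means-do-not-route convention makes venue selection a filter-then-min. A sketch over the fees() response shape above; cheapest_taker_venue is local code, not an SDK call.

```python
def cheapest_taker_venue(fees_resp):
    """Lowest taker fee among routable venues. A null taker_bps means
    'do not route here', not 'free', so those rows are dropped first."""
    routable = [f for f in fees_resp["fees"] if f["taker_bps"] is not None]
    if not routable:
        return None
    return min(routable, key=lambda f: f["taker_bps"])["venue"]

sample = {"ticker": "...", "fees": [
    {"venue": "kalshi",     "maker_bps": None, "taker_bps": 350},
    {"venue": "polymarket", "maker_bps": 0,    "taker_bps": None},  # do not route
]}
print(cheapest_taker_venue(sample))  # → kalshi
```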
preview(symbol, side, qty, order_type="market") → dict
Pre-trade dry-run. Returns {symbol, side, qty_1e8, candidate_venues, best_venue, estimated_fee_bps, worst_case_fee_bps, estimated_slippage_bps, estimated_total_cost_bps, historical_ttff, nbbo, source, generated_at_ns}. Does not submit.
Cold-resilient by default. On a cold ticker (one that hasn't been quote-warmed since the last CC restart) the handler runs the same fallback chain /v1/quotes uses — dynamic-subscribe with up-to-2s wait → edge-routed REST snapshot → venue public-API. First call on a cold prediction-market ticker takes ~200-500ms and returns real fee + slippage + NBBO; subsequent calls are sub-ms via the warmed cache. No explicit quote() warm step needed.
The source field on the response reports which path served the NBBO (same values as quote(): edge_nbbo, edge_rpc_snapshot, venue_public_api, unavailable). Trading bots can gate on source == "edge_nbbo" for live-WS guarantees; research code can accept any non-unavailable source.
NBBO fields (bid_px_1e9, ask_px_1e9, mid_px_1e9, spread_bps) are all nullable. One-sided books (e.g. tail-risk Kalshi props with no bidders on the favored side) return null on the missing side rather than serializing sentinel values.
Same cold-fallback applies to /v1/intelligence/depth, /v1/intelligence/slippage, /v1/intelligence/routing, and /v1/execution/forecast — calling any of these on a cold ticker triggers a warming cycle so the response carries real data instead of empty arrays.
kill_switch_engage(reason=None) → dict
kill_switch_clear() → dict
cancel_all_on_venue(venue) → Any
Risk ops. kill_switch_engage halts new submission for the caller's identity (existing open orders are NOT cancelled — pair with cancel_all_on_venue for stop-everything). kill_switch_clear is admin-only.
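Because engaging the kill switch leaves resting orders alive, a stop-everything routine is two steps. A sketch under the methods above; the recording stub stands in for a real client so the flow is visible without credentials.

```python
def stop_everything(seq, reason="manual halt"):
    """Halt new submissions first, then sweep cancels across every venue
    with credentials on record (kill_switch_engage does NOT cancel opens)."""
    seq.kill_switch_engage(reason=reason)
    for v in seq.venues():
        seq.cancel_all_on_venue(v["venue"])

# Demo with a recording stub in place of a live Sequence client:
class StubSeq:
    def __init__(self):
        self.calls = []
    def kill_switch_engage(self, reason=None):
        self.calls.append(("kill", reason))
    def venues(self):
        return [{"venue": "kalshi"}, {"venue": "binance"}]
    def cancel_all_on_venue(self, venue):
        self.calls.append(("cancel_all", venue))

stub = StubSeq()
stop_everything(stub, reason="ops drill")
print(stub.calls[0])  # → ('kill', 'ops drill')
```

Ordering matters: engaging the switch first guarantees no new order slips in between the cancels.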
Level 2 — Trade
Every buy() / sell() is sent as a 1-node execution graph — that's how urgency, policy, max_slippage_bps, horizon_ms, and tif reach the execution engine. The simple-order JSON shape in the REST docs is sugar; the SDK always uses the full graph payload.
buy(symbol, qty, **opts) → dict
sell(symbol, qty, **opts) → dict
| Opt | Type | Description |
|---|---|---|
| venue | str | Pin to a single venue. Omit for SOR across every connected venue |
| limit_price | float | Omit for market. Prediction markets take 0.0–1.0 |
| urgency | "low" / "medium" / "high" | Tunes SOR aggression (tactic selection + horizon) |
| policy | str | Override SOR: sor, ioc_sweep, passive_limit, aggressive_chase, passive_ladder, time_drip |
| max_slippage_bps | int | Hard cap — reject fills worse than this |
| horizon_ms | int | Soft deadline for passive/laddered tactics |
| tif | "GTC" / "IOC" / "FOK" | Time-in-force (limit orders) |
| instrument_type | "spot" / "perp" / "prediction" | Default "spot"; prediction-market URLs auto-normalize |
| sandbox | bool | Force paper-fill routing even on a seq_live_* key |
Return:
{"graph_id": "graph_7d769…", "status": "active", "node_count": 1, "edge_count": 0}
seq.buy("ETH-USD", 50, urgency="high", max_slippage_bps=10)
seq.sell("BTC-USD", 0.5, venue="coinbase", limit_price=75000.0, tif="GTC")
# Prediction markets: prices are 0.0–1.0, `venue=` pins Kalshi vs Polymarket
seq.buy("KXPGATOUR-ZUCONO26-THVI", 1, venue="kalshi", limit_price=0.02)
seq.buy("4394372887385518214471608448209527405...", 10, venue="polymarket", limit_price=0.10)
# Cancel via graph_id (or the full node_order_id — the SDK strips the suffix)
seq.cancel("graph_f679e9c25e9e4ecbaacceb71d62bd9a9")
# Perp: instrument_type kicks in automatically when the symbol is a known perp
seq.buy("ETH-USD-PERP", 5, instrument_type="perp", venue="hyperliquid")
The prediction-market round-trip is fully on the same /v1/orders graph path — the SDK just marshals venue=, limit_price=, and qty into a 1-node ExecutionGraph. A real Kalshi submit takes ~75 ms end-to-end (edge → https://api.elections.kalshi.com/trade-api/v2/portfolio/orders → 201 Created). Polymarket takes ~260 ms on the first order per session (one extra roundtrip for L2-cred derivation), then ~130 ms thereafter.
order(node_order_id) → dict
Fetch a single order. Raises SequenceError(404) if the ID is unknown. The node_order_id format is graph:<graph_id>:<node_id>:<seq>.
orders(limit=50, offset=0, symbol=None) → dict
List orders with pagination. Return: {orders, total, limit, offset, has_more}. side and status come back UPPERCASE:
side: "BUY" / "SELL"
status: "NEW" / "PENDING" / "ACCEPTED" / "PARTIAL" / "FILLED" / "COMPLETED" / "CANCELLED" / "REJECTED" / "FAILED"
(The graph-status endpoint uses lowercase snake_case for the same states — pending, partial_fill, filled, etc. The flat orders endpoint is grandfathered UPPERCASE to match venue-native status strings.)
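If you consume both surfaces, normalize early. A sketch: the PARTIAL → partial_fill pairing is stated above, while plain lowercasing for the remaining states is an assumption about the undocumented pairs.

```python
# Grandfathered UPPERCASE flat-order statuses → graph-style snake_case.
FLAT_TO_GRAPH = {
    "PARTIAL": "partial_fill",  # explicit: the names differ, not just the case
}

def graph_style_status(flat_status):
    """Normalize an orders() status to the graph-status spelling.
    Assumption: every pair other than PARTIAL differs only in case."""
    return FLAT_TO_GRAPH.get(flat_status, flat_status.lower())

print(graph_style_status("PARTIAL"))  # → partial_fill
print(graph_style_status("FILLED"))   # → filled
```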
cancel(graph_or_order_id) → None
Accepts either a graph ID (graph_abc…) or a node order ID (graph:graph_abc…:spot:1) — it strips the suffix. Cancels every active node in the graph.
fills(limit=50, offset=0, symbol=None) → Any
Fill history. Each fill: {fill_id, node_order_id, symbol, side, qty_1e8, price_1e9, venue, fee_1e9, timestamp_ns}.
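The fixed-point fill fields compose directly into execution metrics. A minimal VWAP sketch over the fill shape above (local arithmetic, not an SDK call).

```python
def fills_vwap(fills):
    """Quantity-weighted average price in USD. qty_1e8 and price_1e9 are
    fixed-point per the fill shape; the scales cancel except one 1e9."""
    total_qty = sum(f["qty_1e8"] for f in fills)
    if total_qty == 0:
        return None
    notional = sum(f["qty_1e8"] * f["price_1e9"] for f in fills)
    return notional / total_qty / 1e9

sample = [
    {"qty_1e8": 100_000_000, "price_1e9": 100_000_000_000},  # 1.0 @ $100
    {"qty_1e8": 100_000_000, "price_1e9": 200_000_000_000},  # 1.0 @ $200
]
print(fills_vwap(sample))  # → 150.0
```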
amend(order_id, new_price=None, new_qty=None) → dict
Venue-agnostic amend — change price, qty, or both on a resting order. At least one of the two must be provided.
Return: {"order_id": str, "mode": str, "status": str}.
| mode | Meaning |
|---|---|
"cancel_replace" | Today's behavior on every venue. Queue position is lost. Expect lower fill rates on heavily-quoted books |
"native_atomic" | Future — Kalshi only, once CC→edge wiring for /portfolio/orders/{id}/amend lands. Queue preserved |
r = seq.amend(order_id, new_price=0.55, new_qty=10)
if r["mode"] == "cancel_replace":
    track_queue_loss(r["order_id"])
Raises ValueError("amend: must pass at least one of new_price or new_qty") on empty call.
decrease(order_id, reduce_by) → dict
Shrink a resting order without touching price. Two round-trips today (reads current size, issues amend). Returns {order_id, mode: "cancel_replace_shrink", status}.
Raises SequenceError(400) if reduce_by would take the remaining qty to ≤ 0 — cancel instead.
batch(venue) → _BatchBuilder
Open a batch of submits + cancels targeted at one venue. Returns a builder with these fluent methods:
| Method | Description |
|---|---|
| .buy(symbol, qty) / .sell(symbol, qty) | Market submit |
| .buy_limit(symbol, qty, price) / .sell_limit(symbol, qty, price) | Limit submit. For prediction markets, price is 0.0–1.0 |
| .cancel(order_id) | Enqueue a cancel |
| .emulate() | Compatibility no-op. Graph-level SDK batching is serial today |
| .end() | Flush. Returns {"mode": "serial", "responses": [...]} |
| len(b) | Number of ops queued |
Also usable as a context manager — .end() fires on successful exit.
batch() submits one graph root per op and returns mode="serial" today. Kalshi's transport supports /portfolio/orders/batched (20-op atomic), and Polymarket's transport supports CLOB POST /orders (15 orders, per-order results), but the SDK graph path does not preserve a native venue batch window end-to-end yet.
# Kalshi — graph-level SDK batch is serial today
with seq.batch("kalshi") as b:
    b.buy_limit("KXBTCZ-26DEC31-T99000", 10, 0.55)
    b.buy_limit("KXBTCZ-26DEC31-T100000", 10, 0.50)
    b.cancel("some_resting_order_id")
# b.result == {"mode": "serial", "responses": [...]}
# Polymarket — graph-level SDK batch is serial today
seq.batch("polymarket").buy("TOKEN_ID", 10).end()
redeem(slug, venue) → dict
Claim winnings from a resolved prediction market. Uniform across venues — the mode field announces what actually happened.
| mode | Meaning |
|---|---|
"auto_settled" | Kalshi settles on the venue side. Balance already reflects payout, no action taken |
"relayer_submitted" | Polymarket. L2-HMAC-signed POST fired to the CLOB relayer. USDC credit asynchronously in minutes |
"noop" | Nothing to redeem for this identity on this market |
Return: {mode, venue, slug, condition_id?, token_ids?, note}. condition_id and token_ids are populated on Polymarket.
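Combining positions_unified() with redeem() gives a redemption sweep. A sketch under the shapes above: it assumes instrument.market_slug is populated on event-contract rows (it is in the positions_unified() shape), and the stub stands in for a live client.

```python
def sweep_redemptions(seq):
    """redeem() every settled event-contract position. "noop" rows are
    skipped; "auto_settled" (Kalshi) and "relayer_submitted" (Polymarket)
    are collected for the caller."""
    claimed = []
    resp = seq.positions_unified(kinds=["event_contract"], include_closed=True)
    for p in resp["positions"]:
        if not p.get("settled"):
            continue
        slug = p["instrument"]["market_slug"]
        r = seq.redeem(slug, p["venue"])
        if r["mode"] != "noop":
            claimed.append((p["venue"], slug, r["mode"]))
    return claimed

# Demo stub: one settled Kalshi row, one still-open Polymarket row.
class StubSeq:
    def positions_unified(self, **kw):
        return {"positions": [
            {"venue": "kalshi", "settled": True,
             "instrument": {"kind": "event_contract", "market_slug": "KX-A"}},
            {"venue": "polymarket", "settled": False,
             "instrument": {"kind": "event_contract", "market_slug": "pm-b"}},
        ]}
    def redeem(self, slug, venue):
        return {"mode": "auto_settled", "venue": venue, "slug": slug}

print(sweep_redemptions(StubSeq()))  # → [('kalshi', 'KX-A', 'auto_settled')]
```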
Level 3 — Orchestrate
Execution graphs are the universal primitive: a graph with 1 node is a market buy; a 2-node graph with an edge is a conditional hedge; a 1-node graph with pacing is a TWAP. Build them with the node() / edge() / graph() helpers.
node(node_id, symbol, side, qty, *, venue=None, policy=None, urgency=None, limit_price=None, instrument_type="spot") → dict
Build a node dict (doesn't submit anything). activation is set automatically by graph() based on whether the node has incoming edges.
edge(from_node, to_node, trigger="on_fill", value=None) → dict
Build an edge dict. trigger is one of:
| Trigger | Fires when |
|---|---|
| on_accepted | SOR ACKs the parent |
| on_first_fill | First fill lands |
| on_fill | Every fill (streaming hedge) |
| fill_pct | Fill ratio ≥ value. Requires value=0.0–1.0 |
| on_full_fill | Parent 100% filled |
| on_timeout | {"on_timeout": {"ms": value}} — pass value=5000 for 5s |
| on_cancel | Parent cancelled |
| on_done | Any terminal state |
| on_price | Pass a dict directly: {"on_price": {"symbol": …, "direction": "above"/"below", "reference": "graph_entry_price", "offset_pct": 5.0}} |
For on_price and complex sizing, drop to the dict form inline — the helper covers the common 80%.
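When a trigger sits outside the helper's keyword surface, the dict form is easy to build by hand. A sketch for on_timeout, whose wire shape ({"on_timeout": {"ms": …}}) is given in the trigger table above; timeout_edge is a local convenience, not part of the SDK.

```python
def timeout_edge(edge_id, from_node, to_node, ms):
    """Dict-form edge that fires if the parent is still live after ms
    milliseconds, e.g. escalating a passive entry into an aggressive chase."""
    return {
        "edge_id": edge_id,
        "from_node": from_node,
        "to_node": to_node,
        "trigger": {"on_timeout": {"ms": ms}},
    }

e = timeout_edge("e_escalate", "entry", "chase", 5000)
print(e["trigger"])  # → {'on_timeout': {'ms': 5000}}
```

The result drops straight into graph(edges=[...]) alongside helper-built edges.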
graph(nodes, edges=None, sandbox=False) → dict
Submit an execution graph. Auto-sets activation: nodes with no incoming edge are root (fire immediately), the rest are wait (fire on edge trigger).
Return: {graph_id, status, node_count, edge_count}.
# Spot + perp hedge — perp sells after spot is 50% filled
graph = seq.graph(
    nodes=[
        seq.node("spot", "ETH-USD", "buy", 200, urgency="medium"),
        seq.node("hedge", "ETH-USD-PERP", "sell", 200, venue="hyperliquid",
                 instrument_type="perp"),
    ],
    edges=[seq.edge("spot", "hedge", trigger="fill_pct", value=0.5)],
)
# Bracket: entry → take-profit + 5% stop-loss
graph = seq.graph(
    nodes=[
        seq.node("entry", "BTC-USD", "buy", 0.1, urgency="high"),
        seq.node("tp", "BTC-USD", "sell", 0.1, limit_price=80_000),
        seq.node("sl", "BTC-USD", "sell", 0.1),
    ],
    edges=[
        seq.edge("entry", "tp", trigger="on_full_fill"),
        {  # on_price needs the dict form
            "edge_id": "e_sl",
            "from_node": "entry", "to_node": "sl",
            "trigger": {"on_price": {
                "symbol": "BTC-USD", "direction": "below",
                "reference": "graph_entry_price", "offset_pct": -5.0,
            }},
        },
    ],
)
graph_status(graph_id) → dict
{
  "graph_id": "graph_…",
  "client_id": "client_…",
  "status": "active",      # pending | active | partial_fill | completed
                           # | aborted | expired | paused
  "nodes": {
    "spot": {
      "status": "filled",  # pending | armed | executing | filled
                           # | partial_fill | cancelled | rejected
      "node_order_id": "graph:…:spot:1",
      "target_qty_1e8": 20_000_000_000,
      "filled_qty_1e8": 20_000_000_000,
      "avg_fill_price_1e9": 2_325_530_000_000,
      "fill_count": 3,
    },
    "hedge": {…}
  },
  "edges": {"e_spot_hedge": {"status": "fired", "fired": True}}
}
graph_cancel(graph_id) → None
Cancel every active node in the graph. Already-filled quantities are NOT reversed — this is a stop-now, not a rollback.
graph_resume(graph_id) → None
Resume a paused graph. Pending nodes re-evaluate their edge triggers.
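A common orchestration loop polls graph_status() until a terminal state. A sketch using the status list above; the stub client makes the demo self-contained, and a real caller would pass the Sequence client instead.

```python
import time

TERMINAL = {"completed", "aborted", "expired"}

def wait_for_graph(seq, graph_id, timeout_s=60.0, poll_s=0.5):
    """Poll until the graph leaves the live states. Returns the final
    graph_status() dict; raises TimeoutError if it never settles."""
    deadline = time.monotonic() + timeout_s
    while time.monotonic() < deadline:
        st = seq.graph_status(graph_id)
        if st["status"] in TERMINAL:
            return st
        time.sleep(poll_s)
    raise TimeoutError(f"graph {graph_id} still live after {timeout_s}s")

# Demo stub that walks active → partial_fill → completed:
class StubSeq:
    def __init__(self):
        self.statuses = ["active", "partial_fill", "completed"]
    def graph_status(self, graph_id):
        return {"graph_id": graph_id, "status": self.statuses.pop(0)}

print(wait_for_graph(StubSeq(), "graph_demo", poll_s=0)["status"])  # → completed
```

Note that "paused" is deliberately not terminal here: a paused graph can be revived with graph_resume().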
Level 4 — Automate
Deploy WASM strategies to the venue edges. Colocated — no network hop from strategy to OMS on the fill path.
deploy_algo(name, wasm_path, symbols) → dict
Read wasm_path, base64-encode, POST to /v1/deployments with start_immediately=True.
| Arg | Description |
|---|---|
| name | Human label — shows up in logs and the dashboard |
| wasm_path | Local path to the compiled .wasm |
| symbols | List of symbols the algo subscribes to. The CC maps each symbol to its hosting edge and pushes the WASM blob there |
Return: {deployment_id, name, symbols, size_bytes, status, pushed_to, capable_edges, edges: [...]}. pushed_to < capable_edges means at least one edge rejected the push — check algo_status(id).edges[*].status.
algo_status(deployment_id) → dict
Full deployment state: per-edge position, P&L, fill count, callback latency.
algo_start(deployment_id) → None
algo_stop(deployment_id) → None
Pause/resume across every hosting edge. Stopping does NOT cancel the algo's resting orders — call cancel() on each, or use CancelAll in the algo's on_stop hook.
algo_undeploy(deployment_id) → None
Stop + delete. Irreversible.
Algos keyed by symbol
/v1/algos/* is the user-facing surface for algos (one running WASM per symbol per client). /v1/deployments/* is the underlying CRUD primitive.
seq.algos() # list, default flat
seq.algos(detail="edge") # per-edge breakdown
seq.algo("ETH-USD") # status for one symbol
seq.algo_start_by_symbol("ETH-USD")
seq.algo_stop_by_symbol("ETH-USD")
seq.algo_logs_by_symbol("ETH-USD")
seq.algo_stats("ETH-USD")
Hosted strategies
Lifecycle for hosted strategies (Level 4-5). Strategy create flows through sequence strategy push (CLI) since it requires a base64-encoded artifact; read/start/stop/logs are SDK-native.
seq.strategies() # GET /v1/strategies
seq.strategy("strat_abc") # GET /v1/strategies/:id
seq.strategy_start("strat_abc")
seq.strategy_stop("strat_abc")
seq.strategy_logs("strat_abc")
seq.strategy_delete("strat_abc")
algo_logs(deployment_id, limit=200) → Any
Strategy println!/log! output, newest-first.
algo_mesh(deployment_id) → Any
Mesh topology: per-edge label, peer latencies, message counts. Empty on single-venue deployments.
Level 5 — Monitor
Most Level 5 endpoints return free-form JSON — the structure stabilizes post-v1. Keep your parsing defensive (.get(..., default)).
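Defensive parsing in practice means never indexing a Level 5 payload directly. A sketch for the tca response, using only the field names advertised below — missing keys come back as None instead of raising:

```python
def tca_summary(payload):
    """Defensively pull the advertised TCA fields from a tca() response.
    Absent keys become None rather than raising KeyError, since the
    Level 5 shapes stabilize post-v1."""
    fields = ("implementation_shortfall_bps", "spread_cost_bps",
              "market_impact_bps", "savings_vs_benchmark_bps", "savings_usd")
    return {f: payload.get(f) for f in fields}
```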
tca(symbol=None) → Any
Transaction cost analysis. Cross-desk aggregate, or per-symbol with symbol=…. Returns implementation_shortfall_bps, spread_cost_bps, market_impact_bps, savings_vs_benchmark_bps, savings_usd.
intel(symbol) → Any
Market structure snapshot: per-venue market-share, spread regime, toxicity score, quote-to-trade ratio.
slippage(symbol, side="buy") → Any
Slippage curve for $1k / $10k / $100k / $1M notional. Used by SOR internally; surfaced here for pre-trade sizing.
routing(symbol, side="buy", qty=1.0) → Any
SOR's recommended leg-split for this order. Same endpoint the kernel uses.
depth(symbol) → Any
L2 depth aggregated across every venue, book-side sorted.
forecast(symbol, side, qty) → Any
Pre-trade execution forecast: expected slippage, expected horizon, expected venue mix. Cheaper than submitting a sandbox order.
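A common pattern is to gate an order on the forecast before submitting. A sketch — the 'expected_slippage_bps' key is an assumption (the reference names the category "expected slippage" but not the field), so treat it as hypothetical and check a real response:

```python
def within_budget(forecast, max_slippage_bps):
    """Gate an order on a forecast() response. The 'expected_slippage_bps'
    key is an assumed field name; a missing value fails closed."""
    slip = forecast.get("expected_slippage_bps")
    return slip is not None and slip <= max_slippage_bps
```

Wired up: f = seq.forecast("BTC-USD", "buy", 2.0); submit only if within_budget(f, 5.0).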
execution_live(node_order_id) → Any
In-flight monitoring for a single order. Updates as fills land.
execution_review(node_order_id) → Any
Post-trade review — arrival price, VWAP, venue breakdown, realized slippage.
execution_summary() → Any
Desk-wide KPIs: 24h notional, hit rate, avg slippage, top symbols.
trace(node_order_id) → Any
Nanosecond-precision lifecycle trace: every event from SOR admission → edge dispatch → venue ACK → first fill → terminal state. The reference tool for latency regressions.
Streaming
Three WebSocket surfaces — the generic event bus (/v1/stream), and two prediction-market lifecycle streams. All three require pip install "sequence-markets[stream]".
async stream(channels) → async iterator of (channel, data)
Subscribe to any mix of channels on the unified /v1/stream endpoint.
| Channel | Emits |
|---|---|
prices:{symbol} | NBBO updates ({bid, ask, mid, spread_bps, ts_ns}) |
book:{symbol} | L2 depth updates |
orders | Your order lifecycle events |
fills | Your fills |
routing | SOR routing decisions — a stream of TCA-ready {legs, est_cost_bps, chosen} records |
tca | Post-completion Transaction Cost Analysis reports — one event per order with achieved_vwap_1e9, benchmark_vwap_1e9, total_fees_1e9, venues_used, execution_time_ms, full cost decomp (fee_cost_bps, spread_cost_bps, market_impact_bps, implementation_shortfall_bps, savings_vs_benchmark_bps). For prediction-venue fills four extra fields populate: is_prediction, achieved_implied_prob_bps, arrival_implied_prob_bps, probability_slippage_bps. Same payload the legacy /v1/tca/stream SSE emits, multiplexed onto the unified WS so you don't need a second connection |
traces:{deployment_id} | Algo callback traces |
funding | Every funding-rate update across every venue |
funding:{venue} | One venue only |
funding:{venue}:{symbol} | One instrument only |
import asyncio
async def main():
async for channel, data in seq.stream(["prices:BTC-USD", "fills"]):
if channel.startswith("prices:"):
print(channel, data["bid"], data["ask"])
elif channel == "fills":
print("fill:", data)
asyncio.run(main())
The close path is deterministic — the iterator cleans up on break, GeneratorExit, or CancelledError with a 2-second close timeout. Safe to stop from a Jupyter cell.
async stream_new_markets() → async iterator of dict
Every prediction market the moment it's minted. On connect, the server flushes the last ~128 events from its replay ring, then streams live. Deduped across multi-region edges.
Each event:
{
"kind": "new_market",
"venue": "polymarket",
"slug": "btc-updown-5m-…",
"condition_id": "0x…",
"question": "Bitcoin up or down…",
"outcomes": ["Up", "Down"],
"outcome_token_ids": ["…", "…"],
"neg_risk": False,
"tick_1e9": 10_000_000, # 0.01 tick
"fee_schedule": {"exponent": 1, "rate_1e8": 7_200_000,
"taker_only": True, "rebate_rate_1e8": 20_000_000},
"start_ts_s": 1776657000,
"end_ts_s": 1776657300,
"observed_at_ns": 1776657000000000000
}
async for m in seq.stream_new_markets():
if "btc-updown" in m["slug"]:
    seq.buy(m["outcome_token_ids"][0], qty=5, venue="polymarket")
async stream_resolved_markets() → async iterator of dict
Sibling stream — every resolution, with winning token ID (paid $1) and losing token ID (paid $0).
{
"kind": "market_resolved",
"venue": "polymarket",
"condition_id": "0x…",
"winning_outcome": "Up",
"winning_token_id": "…",
"losing_token_id": "…",
"observed_at_ns": …
}
Pair the two streams to see the full lifecycle.
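Both event kinds carry condition_id, so joining the streams is a dictionary keyed on it. A minimal sketch of the join — the tracker class is illustrative, not part of the SDK:

```python
class MarketLifecycle:
    """Hypothetical tracker: feed it events from stream_new_markets() and
    stream_resolved_markets(); it joins mint and resolution on condition_id."""

    def __init__(self):
        self.markets = {}  # condition_id -> {"market": ..., "resolution": ...}

    def on_event(self, event):
        cid = event.get("condition_id")
        if event.get("kind") == "new_market":
            self.markets[cid] = {"market": event, "resolution": None}
        elif event.get("kind") == "market_resolved":
            # Resolution may arrive for a market minted before we connected.
            self.markets.setdefault(cid, {"market": None})["resolution"] = event
```

Run the two async iterators as separate tasks and route every event through on_event.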
Fixed-point conventions
Every wire payload uses the same scale:
| Suffix | Scale | Example |
|---|---|---|
_1e8 | × 10⁸ | 1 BTC = 100_000_000, 0.5 ETH = 50_000_000 |
_1e9 | × 10⁹ | $50,000 = 50_000_000_000_000 |
_bps | basis points | 10 bps = 0.10% |
_ns | nanoseconds since Unix epoch | 1705406400000000000 |
| (no suffix) | human float | bid: 73984.57 |
The SDK does not wrap these in Qty/Px classes (that's the Rust SDK) — the responses are plain dicts. Convert explicitly when you need floats: qty = row["qty_1e8"] / 1e8.
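Since the SDK returns plain dicts, the conversions above reduce to three one-liners — a sketch you can keep alongside your parsing code (helper names are ours, not the SDK's):

```python
def from_1e8(v):
    """Quantity fields: _1e8 integer -> float units (1 BTC = 100_000_000)."""
    return v / 1e8

def from_1e9(v):
    """Price / notional fields: _1e9 integer -> float dollars."""
    return v / 1e9

def bps_to_frac(bps):
    """Basis points -> fraction (10 bps = 0.10% = 0.001)."""
    return bps / 10_000

qty = from_1e8(20_000_000_000)     # 200.0 units
px = from_1e9(2_325_530_000_000)   # 2325.53
```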
Sandbox
Two ways to get paper fills against live market data:
- Per-call: seq.buy("BTC-USD", 0.01, sandbox=True). Everything else uses your live routing.
- Whole-session: log in with a seq_test_* key — every order routes through the sandbox adapter regardless of sandbox=.
Sandbox fills are settled against the live order book mid/touch with realistic slippage and fees. Balances and positions are isolated from your live account.