AI-Assisted Development

j17 is designed to work well with AI coding assistants. Your entire backend -- spec, docs, and patterns -- fits comfortably in a context window.

Why j17 works with AI

| Property | Why it matters |
|---|---|
| JSON specs | LLMs generate and parse JSON reliably. No compilation errors. |
| HTTP API | Universal, no SDK lock-in. Every language, every framework. |
| Declarative handlers | No hidden state. Predictable behavior. |
| Small surface area | Docs + spec + examples fit in ~50k tokens. |
| Consistent patterns | Once the AI learns one aggregate, it knows them all. |

The workflow

  1. Load the docs -- Download the full docs as a single file from j17.app/j17-docs.md and add it to your AI context, or point Claude Code / Cursor at the j17 docs directory.

  2. Describe your domain -- "I'm building an invoicing system. Invoices can be created, have line items added, be sent, and be paid."

  3. Get a spec -- The AI generates your spec.json with event types, schemas, and handlers.

  4. Refine -- "Add a was_overdue event that fires when payment is 30 days late."

  5. Ship -- Upload the spec, start POSTing events.

Example prompt

```
I'm using j17 for a task management app. I need:
- Tasks with title, description, due date, status
- Tasks can be created, updated, completed, deleted
- Tasks belong to projects
- Projects can be created and archived

Generate a j17 spec.json with appropriate event types and Tick handlers.
```
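
For a prompt like this, the AI might return a spec roughly along these lines. This is a hand-written sketch: the handler shape matches the examples in these docs, but the surrounding layout (`aggregates`, `events`, `schema` keys) is an illustrative assumption, not the exact j17 schema.

```json
{
  "aggregates": {
    "task": {
      "events": {
        "was_created": {
          "schema": {"title": "string", "description": "string", "due_date": "string", "status": "string", "project_id": "string"},
          "handler": [{"merge": {"target": "", "value": "$.data"}}]
        },
        "was_updated": {
          "handler": [{"merge": {"target": "", "value": "$.data"}}]
        },
        "was_completed": {
          "handler": [{"set": {"target": "status", "value": "completed"}}]
        }
      }
    },
    "project": {
      "events": {
        "was_created": {
          "handler": [{"merge": {"target": "", "value": "$.data"}}]
        },
        "was_archived": {
          "handler": [{"set": {"target": "archived", "value": true}}]
        }
      }
    }
  }
}
```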

No Rust required. WASM when you need it.

Other "logic on the server" platforms require writing Rust or TypeScript, compiling to WASM, and debugging opaque runtime errors -- for everything, even simple operations.

j17 specs are JSON. Your AI already speaks JSON fluently. The six declarative Tick operations (set, merge, append, remove, increment, decrement) cover 80% of use cases without any code.
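To build intuition for how declarative operations like these apply to an aggregate's state, here is a minimal Python sketch. It is an illustration of the idea only, not j17's actual engine, and the exact semantics (e.g. the `by` argument, root-level merge on an empty target) are assumptions.

```python
# Toy interpreter for declarative state operations, in the spirit of
# j17's Tick handlers. Semantics here are illustrative assumptions.

def apply_op(state: dict, op: dict) -> dict:
    (name, args), = op.items()  # each op is a single-key dict
    target, value = args.get("target"), args.get("value")
    if name == "set":
        state[target] = value
    elif name == "merge":
        # merge into the root when target is "", else into a nested dict
        dest = state if target == "" else state.setdefault(target, {})
        dest.update(value)
    elif name == "append":
        state.setdefault(target, []).append(value)
    elif name == "remove":
        state.pop(target, None)
    elif name == "increment":
        state[target] = state.get(target, 0) + args.get("by", 1)
    elif name == "decrement":
        state[target] = state.get(target, 0) - args.get("by", 1)
    return state

state = {}
handler = [
    {"merge": {"target": "", "value": {"title": "Write docs", "status": "open"}}},
    {"append": {"target": "tags", "value": "writing"}},
    {"increment": {"target": "revision"}},
]
for op in handler:
    state = apply_op(state, op)

print(state)
# {'title': 'Write docs', 'status': 'open', 'tags': ['writing'], 'revision': 1}
```

Because each operation is plain data, a handler is just a list an AI can emit, diff, and refine without a compile step.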

When Claude writes a j17 handler, it's writing this:

```json
{
  "handler": [{"merge": {"target": "", "value": "$.data"}}]
}
```

Not this:

```rust
#[spacetimedb::reducer]
fn update_task(ctx: &ReducerContext, id: u64, title: String) -> Result<(), String> {
    if let Some(mut task) = Task::filter_by_id(&id) {
        task.title = title;
        Task::update_by_id(&id, task);
        Ok(())
    } else {
        Err("Task not found".to_string())
    }
}
```

One of these is easy for an AI to get right. One is a minefield of type errors, borrow checker fights, and WASM compilation failures.

But if you need custom logic, j17 supports WASM handlers. Write complex validation, call external services, implement business rules that don't fit declarative patterns. The escape hatch exists -- you just don't have to use it for simple operations.

Reference applications

We maintain reference applications that demonstrate j17 patterns in real-world contexts. Load these into your AI's context for better results:

| App | Description | Good for learning |
|---|---|---|
| gather | Social event planning | Multi-aggregate relationships, invitations, RSVPs |
| invoicer | Simple invoicing | Line items, state machines, calculated fields |
| taskboard | Kanban-style tasks | Projects, ordering, status transitions |

Loading into Claude Code

```bash
# Add a reference app to your context
claude --add-dir /path/to/j17-examples/gather

# Or fetch directly
claude "Read the gather app spec at https://github.com/17jewels/examples/gather/spec.json and use it as a reference for my project"
```

Loading into Cursor / Copilot

Add the examples directory to your workspace, or paste the relevant spec.json into your conversation.

The AI will pick up patterns like:

  • How to structure multi-step workflows (sagas)
  • When to use append vs set
  • How to model relationships between aggregates
  • Naming conventions for events (was_created, had_item_added, etc.)

Tips for AI-assisted j17 development

Be specific about your domain -- "Users have profiles with name and email" gives better results than "I need user management."

Ask for the spec first -- Get the data model right before asking for frontend code or API calls.

Iterate on handlers -- "That had_item_added handler should append to items array, not replace it."

Request test events -- "Generate 5 example events I can POST to test this spec."
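
For the task-management example above, five test events might look like the following sketch. The path layout mirrors the `/<aggregate>/<id>/<event>` POST URLs used elsewhere in these docs; bundling them as a JSON array with a `path` field is an illustrative assumption, not a j17 file format.

```json
[
  {"path": "/project/p1/was_created", "data": {"name": "Website redesign"}},
  {"path": "/task/t1/was_created", "data": {"title": "Draft homepage", "status": "open", "project_id": "p1"}},
  {"path": "/task/t1/was_updated", "data": {"description": "Include hero section"}},
  {"path": "/task/t1/was_completed", "data": {}},
  {"path": "/project/p1/was_archived", "data": {}}
]
```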

Testing AI-generated output

If you're just exploring, POST events directly to your instance and see what happens. But for real projects:

Use staging environments

Every j17 instance comes with separate staging and production environments. Different URLs, different API keys, isolated data.

```bash
# Staging
curl -X POST https://myapp-staging.j17.dev/user/123/was_created \
  -H "Authorization: Bearer $STAGING_KEY" \
  -H "Content-Type: application/json" \
  -d '{"data": {"name": "Alice"}, "metadata": {"actor": {"type": "user", "id": "123"}}}'

# Production
curl -X POST https://myapp.j17.dev/user/123/was_created \
  -H "Authorization: Bearer $PROD_KEY" \
  -H "Content-Type: application/json" \
  -d '{"data": {"name": "Alice"}, "metadata": {"actor": {"type": "user", "id": "123"}}}'
```

Let your AI generate specs and events against staging. Promote to production when you're confident.
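
In test scripts, a small helper keeps the two environments straight. This sketch assumes the hostname and header scheme from the curl examples above; the function itself is hypothetical, not part of any j17 SDK.

```python
# Build the URL and headers for a j17 event POST, switching between
# the staging and production environments shown in the curl examples.

def event_request(env: str, aggregate: str, agg_id: str, event: str, api_key: str):
    hosts = {
        "staging": "https://myapp-staging.j17.dev",
        "production": "https://myapp.j17.dev",
    }
    url = f"{hosts[env]}/{aggregate}/{agg_id}/{event}"
    headers = {
        "Authorization": f"Bearer {api_key}",
        "Content-Type": "application/json",
    }
    return url, headers

url, headers = event_request("staging", "user", "123", "was_created", "sk_test")
print(url)  # https://myapp-staging.j17.dev/user/123/was_created
```

Passing `env` explicitly (rather than reading a global) makes it hard for a generated test script to hit production by accident.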

Automated testing

Your spec is JSON. Your events are JSON. Test them like any other data:

```bash
# Validate spec schema
j17 spec validate spec.json

# Dry-run events against a spec
j17 events validate spec.json events.json

# Run your test suite against staging
j17 test --env staging
```
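
Because events are plain JSON, you can also pre-flight AI-generated payloads in ordinary code before POSTing them. The rules below (data must be an object, actor metadata required) are illustrative assumptions, not j17's real validation logic.

```python
# Toy pre-flight check for event payloads before POSTing them.
# The rules here are illustrative assumptions, not j17's validation.

def validate_event(event: dict) -> list[str]:
    errors = []
    if not isinstance(event.get("data"), dict):
        errors.append("data must be a JSON object")
    actor = event.get("metadata", {}).get("actor", {})
    if "type" not in actor or "id" not in actor:
        errors.append("metadata.actor needs type and id")
    return errors

good = {"data": {"name": "Alice"},
        "metadata": {"actor": {"type": "user", "id": "123"}}}
bad = {"data": "oops"}

print(validate_event(good))  # []
print(validate_event(bad))   # two error messages
```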

Seeding and test data

Ask your AI to generate seed data:

```
Generate 50 realistic test events for my invoicing spec:
- 10 customers created
- 20 invoices with 2-5 line items each
- Mix of paid, pending, and overdue states
```
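
You can also generate (or double-check) seed data like this with a small deterministic script. This sketch covers the invoicing prompt above; the event names and payload fields are illustrative assumptions.

```python
import random

# Deterministic seed-data generator for the invoicing example.
# Event names and payload shapes are illustrative assumptions.
random.seed(17)

events = []
for c in range(10):
    events.append({"path": f"/customer/c{c}/was_created",
                   "data": {"name": f"Customer {c}"}})
for i in range(20):
    cust = random.randrange(10)
    events.append({"path": f"/invoice/i{i}/was_created",
                   "data": {"customer_id": f"c{cust}"}})
    for _ in range(random.randint(2, 5)):  # 2-5 line items per invoice
        events.append({"path": f"/invoice/i{i}/had_item_added",
                       "data": {"unit_price": random.choice([50, 100, 150])}})
    events.append({"path": f"/invoice/i{i}/{random.choice(['was_paid', 'was_sent', 'was_overdue'])}",
                   "data": {}})

print(len(events))  # 10 customer events + per-invoice create/items/status events
```

Seeding the RNG means every run produces the same fixture, which makes staging tests repeatable.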

Mass event ingress

For cold starts or when migrating existing data, use batch ingestion:

```bash
# Import from JSON lines file
j17 events import events.jsonl --env staging

# Pipe from another source
your-export-script | j17 events import - --env staging
```
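
If your existing export is a single JSON array rather than JSON lines, a few lines of Python convert it. This is a generic sketch, nothing j17-specific.

```python
import json

# Convert a JSON array export into JSON lines: one event per line,
# ready to pipe into a batch importer.

def to_jsonl(records: list[dict]) -> str:
    return "\n".join(json.dumps(r) for r in records)

records = [
    {"path": "/user/1/was_created", "data": {"name": "Alice"}},
    {"path": "/user/2/was_created", "data": {"name": "Bob"}},
]
print(to_jsonl(records))
```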

Full documentation for each of these is coming. For now, the CLI help (j17 --help) covers the basics.

Coming soon

  • Claude Code custom commands for j17
  • Prompt templates for common patterns
  • One-click "Generate spec from description"

Can't find what you need? support@j17.app