Implications Reference

Complete reference for the implications system. See Implications guide for practical patterns.

Overview

Implications are j17's reactive event system: when event A happens, automatically create event(s) B. All implied events are written atomically with the trigger event -- the trigger and its implications succeed or fail as a unit.

Implications are defined per event type in your spec:

{
  "aggregate_types": {
    "order": {
      "events": {
        "was_placed": {
          "schema": { ... },
          "handler": [ ... ],
          "implications": [
            { ... },
            { ... }
          ]
        }
      }
    }
  }
}

There are seven implication types, from pure-data declarative to full custom code:

| Type | Key | Purpose |
| --- | --- | --- |
| tick | emit | Declarative condition + emit |
| map | map | Fan-out: one event per array item |
| scheduled | schedule | Delayed emission with cancel conditions |
| pipeline | pipeline | Chained steps |
| wasm | wasm | WASM blob for complex logic |
| binary_container | binary_container | Pre-built executable image |
| runtime_container | runtime_container | Code + lockfile (you write code, we build) |

Tick

The most common type. Optionally test a condition, then emit an event to a target aggregate.

Basic emit

{
  "emit": {
    "aggregate_type": "notification",
    "id": "admin",
    "event_type": "was_queued",
    "data": {"message": "New order received"}
  }
}

| Field | Required | Description |
| --- | --- | --- |
| aggregate_type | yes | Target aggregate type |
| id | yes | Target aggregate ID -- literal string or JSONPath expression |
| event_type | yes | Event type to emit on the target |
| data | no | Event data -- literals, JSONPath expressions, or template operators |

Conditional emit

Add a condition to gate the implication on a predicate. Conditions use the same predicate syntax as tick handlers (equals, not_equals, gt, gte, lt, lte, in, not_in, exists, and, or, not).

{
  "condition": {"equals": ["$.data.priority", "urgent"]},
  "emit": {
    "aggregate_type": "alert",
    "id": "ops-team",
    "event_type": "was_triggered",
    "data": {"source": "$.key"}
  }
}

This implication only fires when $.data.priority equals "urgent".
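To make the condition semantics concrete, here is a minimal sketch of how a predicate like the one above could be evaluated against the context object. The `resolve` and `check` helpers are hypothetical illustrations of the documented syntax, not j17's implementation, and only a few of the predicate operators are shown.

```python
def resolve(path, ctx):
    """Resolve a '$.a.b' path against the context dict; non-path literals pass through."""
    if not (isinstance(path, str) and path.startswith("$")):
        return path
    node = ctx
    for part in path.lstrip("$.").split("."):
        node = node.get(part) if isinstance(node, dict) else None
    return node

def check(cond, ctx):
    """Evaluate a predicate object like {"equals": [...]} against the context."""
    op, args = next(iter(cond.items()))
    if op == "equals":
        return resolve(args[0], ctx) == resolve(args[1], ctx)
    if op == "and":
        return all(check(c, ctx) for c in args)
    if op == "not":
        return not check(args[0], ctx)
    raise ValueError(f"unsupported predicate in this sketch: {op}")

ctx = {"key": "order:1", "data": {"priority": "urgent"}}
check({"equals": ["$.data.priority", "urgent"]}, ctx)  # True
```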

Dynamic target ID

Use a JSONPath expression for the id field to route implied events dynamically. Any string starting with $ is treated as a path expression:

{
  "emit": {
    "aggregate_type": "user_timeline",
    "id": "$.metadata.actor.id",
    "event_type": "had_activity_added",
    "data": {
      "source_key": "$.key",
      "source_type": "$.type"
    }
  }
}

Accessing event data

All JSONPath expressions resolve against a context object containing:

| Path | Description |
| --- | --- |
| $.key | Trigger event key (e.g., "order:abc123") |
| $.type | Trigger event type (e.g., "was_placed") |
| $.data.* | Trigger event payload |
| $.metadata.* | Trigger event metadata (actor, timestamp, etc.) |
| $.state.* | Source aggregate's current state (see State access semantics) |

{
  "condition": {"gte": ["$.data.amount", 100]},
  "emit": {
    "aggregate_type": "loyalty",
    "id": "$.metadata.actor.id",
    "event_type": "had_points_earned",
    "data": {
      "order_key": "$.key",
      "amount": "$.data.amount",
      "customer_tier": "$.state.tier"
    }
  }
}

Map

Fan-out pattern: emit one event per item in an array. Useful for order line items, batch operations, and similar one-to-many patterns.

{
  "map": {
    "in": "$.data.items",
    "as": "$item",
    "emit": {
      "aggregate_type": "inventory",
      "id": "$item.product_id",
      "event_type": "was_reserved",
      "data": {
        "quantity": "$item.qty",
        "order_id": "$.key"
      }
    }
  }
}

| Field | Required | Description |
| --- | --- | --- |
| in | yes | JSONPath to the array to iterate |
| as | yes | Binding name for the current item (e.g., $item) |
| emit | yes | Emit template -- can use both event paths ($.data.*) and item binding ($item.*) |
| condition | no | Predicate to filter items before emitting |
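The fan-out mechanics can be sketched as follows. This is a hypothetical illustration, not j17's engine: `$item.*` paths are substituted from the current item, and full JSONPath resolution against the trigger event is reduced to handling `$.key` only.

```python
def fanout(items, as_name, emit_template, trigger_key):
    """Emit one event per array item, substituting the item binding in the template."""
    events = []
    for item in items:
        def sub(v):
            if isinstance(v, str) and v.startswith(as_name + "."):
                return item[v[len(as_name) + 1:]]   # "$item.qty" -> item["qty"]
            return trigger_key if v == "$.key" else v
        events.append({
            "aggregate_type": emit_template["aggregate_type"],
            "id": sub(emit_template["id"]),
            "event_type": emit_template["event_type"],
            "data": {k: sub(v) for k, v in emit_template["data"].items()},
        })
    return events

items = [{"product_id": "p1", "qty": 2}, {"product_id": "p2", "qty": 1}]
emit = {"aggregate_type": "inventory", "id": "$item.product_id",
        "event_type": "was_reserved",
        "data": {"quantity": "$item.qty", "order_id": "$.key"}}
fanout(items, "$item", emit, "order:1")  # two implied events, ids "p1" and "p2"
```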

Map with condition filter

Only emit for items matching a condition:

{
  "map": {
    "in": "$.data.items",
    "as": "$item",
    "condition": {"equals": ["$item.requires_shipping", true]},
    "emit": {
      "aggregate_type": "warehouse",
      "id": "$item.warehouse_id",
      "event_type": "had_pick_requested",
      "data": {
        "product_id": "$item.product_id",
        "qty": "$item.qty",
        "order_id": "$.key"
      }
    }
  }
}

Object iteration

The map construct works with arrays only. To iterate over object keys, use $entries to convert an object to an array of {key, value} pairs:

{
  "map": {
    "in": "$.data.role_changes.$entries",
    "as": "$entry",
    "emit": {
      "aggregate_type": "user",
      "id": "$entry.key",
      "event_type": "had_role_updated",
      "data": {"role": "$entry.value"}
    }
  }
}
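The `$entries` conversion is equivalent to the following one-liner (a hypothetical helper shown for clarity, not j17's code):

```python
def entries(obj):
    """Convert an object into the [{key, value}, ...] shape that map can iterate."""
    return [{"key": k, "value": v} for k, v in obj.items()]

role_changes = {"user:1": "admin", "user:2": "viewer"}
entries(role_changes)
# → [{"key": "user:1", "value": "admin"}, {"key": "user:2", "value": "viewer"}]
```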

Scheduled

Delayed implications emit events at a future time, with optional cancel conditions. This is a saga-lite pattern for coarse-grained business logic delays.

{
  "schedule": {
    "delay": "24h",
    "emit": {
      "aggregate_type": "notification",
      "id": "$.metadata.actor.id",
      "event_type": "cart_abandonment_reminder",
      "data": {"cart_id": "$.key"}
    },
    "cancel_on": [
      {
        "aggregate_type": "cart",
        "id": "$.key",
        "event_type": "was_checked_out"
      }
    ]
  }
}

| Field | Required | Description |
| --- | --- | --- |
| delay | yes | Duration string before firing |
| emit | yes | Event to emit after delay |
| cancel_on | no | Array of event patterns that cancel this scheduled event if they occur before the delay expires |

Delay format

The delay field accepts duration strings with a numeric value and unit suffix:

| Suffix | Unit | Example | Milliseconds |
| --- | --- | --- | --- |
| s | seconds | "300s" | 300,000 |
| m | minutes | "30m" | 1,800,000 |
| h | hours | "24h" | 86,400,000 |
| d | days | "7d" | 604,800,000 |

Minimum delay is 5 minutes. Any delay shorter than "5m" (300,000ms) is rejected at spec validation time.
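A parser for this format, including the 5-minute minimum, could look like the sketch below. This is an illustration of the documented rules, not j17's validator.

```python
UNIT_MS = {"s": 1_000, "m": 60_000, "h": 3_600_000, "d": 86_400_000}
MIN_DELAY_MS = 300_000  # "5m" -- the documented minimum

def parse_delay(delay: str) -> int:
    """Parse a duration string like '24h' into milliseconds, enforcing the minimum."""
    value, suffix = delay[:-1], delay[-1:]
    if suffix not in UNIT_MS or not value.isdigit():
        raise ValueError(f"bad duration: {delay!r}")
    ms = int(value) * UNIT_MS[suffix]
    if ms < MIN_DELAY_MS:
        raise ValueError(f"delay {delay!r} is below the 5-minute minimum")
    return ms

parse_delay("24h")   # 86_400_000
parse_delay("300s")  # 300_000 -- exactly the minimum, so accepted
```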

Cancel conditions

Each cancel condition matches on a specific event pattern. If any matching event occurs during the delay window, the scheduled event is cancelled.

{
  "cancel_on": [
    {
      "aggregate_type": "order",
      "id": "$.key",
      "event_type": "was_completed"
    },
    {
      "aggregate_type": "order",
      "id": "$.key",
      "event_type": "was_cancelled"
    }
  ]
}

| Field | Required | Description |
| --- | --- | --- |
| aggregate_type | yes | Aggregate type to watch |
| id | yes | Aggregate ID -- JSONPath expression resolved at schedule time |
| event_type | yes | Event type that triggers cancellation |

Pipeline

Chain multiple steps where transform steps enrich context for downstream emit steps. Pipeline steps can be any implication type except pipeline itself.

{
  "pipeline": [
    {"wasm": {"blob_name": "enrich.wasm", "mode": "transform"}},
    {
      "emit": {
        "aggregate_type": "notification",
        "id": "$.enriched.recipient",
        "event_type": "was_queued",
        "data": "$.enriched.payload"
      }
    }
  ]
}

Pipeline steps are executed in order. A step in transform mode returns enriched context that becomes available to subsequent steps. A step in emit mode produces implied events.

Valid step types: tick, map, wasm, binary_container, runtime_container, scheduled.
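The execution model can be sketched as a fold over the step list: transform results merge into the context, emit results accumulate. The step shape (`mode`/`fn` dicts) is a hypothetical simplification for illustration, not j17's internal representation.

```python
def run_pipeline(steps, ctx):
    """Run steps in order; transform steps enrich the context, emit steps produce events."""
    events = []
    for step in steps:
        if step["mode"] == "transform":
            ctx = {**ctx, **step["fn"](ctx)}   # enriched context flows downstream
        else:  # "emit"
            events.extend(step["fn"](ctx))
    return events

steps = [
    {"mode": "transform", "fn": lambda ctx: {"enriched": {"recipient": "ops"}}},
    {"mode": "emit", "fn": lambda ctx: [{
        "aggregate_type": "notification",
        "id": ctx["enriched"]["recipient"],
        "event_type": "was_queued",
    }]},
]
run_pipeline(steps, {"key": "order:1"})  # one implied event with id "ops"
```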

Wasm

For logic that exceeds what declarative tick can express, use a WASM blob. The blob receives the trigger event and context as JSON, and returns an array of events to emit.

Short form

{
  "wasm": "order-notifier.wasm"
}

Long form

{
  "wasm": {
    "blob_name": "order-processor.wasm",
    "entrypoint": "compute_implications",
    "mode": "emit"
  }
}

| Field | Required | Default | Description |
| --- | --- | --- | --- |
| blob_name | yes | -- | Name of the WASM blob (uploaded via admin API) |
| entrypoint | no | "compute_implications" | Exported function name |
| mode | no | "emit" | "emit" returns events; "transform" returns enriched context (for pipelines) |

The WASM function receives JSON input:

{
  "event": {
    "key": "order:abc123",
    "type": "was_placed",
    "data": { ... },
    "metadata": { ... },
    "state": { ... }
  }
}

And returns an array of events to emit (in emit mode), or an enriched context object (in transform mode).
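Expressed in Python for readability, the emit-mode contract looks like the sketch below: the function takes the input document shown above and returns a list of events. The field names mirror the documented input; the threshold rule itself is an arbitrary example.

```python
def compute_implications(input_doc):
    """Emit-mode contract: input document in, list of events to emit out."""
    event = input_doc["event"]
    if event["data"].get("amount", 0) >= 100:   # arbitrary example rule
        return [{
            "aggregate_type": "loyalty",
            "id": event["metadata"]["actor"]["id"],
            "event_type": "had_points_earned",
            "data": {"order_key": event["key"], "amount": event["data"]["amount"]},
        }]
    return []   # no implied events for this trigger

compute_implications({"event": {
    "key": "order:abc123", "type": "was_placed",
    "data": {"amount": 150}, "metadata": {"actor": {"id": "user:456"}}, "state": {},
}})  # one loyalty event targeting user:456
```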

Binary Container

Ship a pre-built executable image. Same interface as WASM (receives event JSON on stdin, returns events on stdout) but runs as a sandboxed container.

{
  "binary_container": {
    "image": "myregistry/order-processor:v2",
    "mode": "emit"
  }
}

| Field | Required | Default | Description |
| --- | --- | --- | --- |
| image | yes | -- | Container image reference |
| mode | no | "emit" | "emit" or "transform" |

Runtime Container

Ship your code and lockfile; j17 builds and runs the container. Supported runtimes: node, elixir, ruby, python.

{
  "runtime_container": {
    "runtime": "node",
    "entrypoint": "implications/order.js",
    "mode": "emit"
  }
}

| Field | Required | Default | Description |
| --- | --- | --- | --- |
| runtime | yes | -- | One of: node, elixir, ruby, python |
| entrypoint | yes | -- | Path to your handler file |
| mode | no | "emit" | "emit" or "transform" |
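A minimal Python handler might look like the sketch below. It assumes runtime containers use the same stdin/stdout JSON interface documented for binary containers -- that assumption, and the audit-log event it emits, are illustrative only.

```python
import json
import sys

def handle(input_doc):
    """Emit-mode handler: take the event document, return a list of events to emit."""
    event = input_doc["event"]
    return [{
        "aggregate_type": "audit",
        "id": "log",
        "event_type": "was_recorded",
        "data": {"source_key": event["key"], "source_type": event["type"]},
    }]

def main():
    # Assumed container entrypoint: event document on stdin, events array on stdout.
    doc = json.load(sys.stdin)
    json.dump(handle(doc), sys.stdout)
```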

Data Template Operators

The data field in emit templates supports three DSL operators for constructing complex event payloads. Operators are recognized as single-key objects where the key is the operator name.

concat -- string concatenation

Concatenates an array of values into a single string. JSONPath expressions are resolved first. Numbers and booleans are coerced to strings.

{
  "data": {
    "message": {"concat": ["Order ", "$.key", " was placed by ", "$.metadata.actor.id"]}
  }
}

Result: "Order order:abc123 was placed by user:456"

coalesce -- first non-null value

Returns the first resolved value that is not null. Useful for fallback chains.

{
  "data": {
    "display_name": {"coalesce": ["$.data.display_name", "$.data.name", "Anonymous"]}
  }
}

If $.data.display_name is null or missing, falls back to $.data.name, then to "Anonymous".

merge -- shallow object merge

Merges an array of objects. Later values override earlier ones. Non-object values are skipped.

{
  "data": {"merge": [
    "$.state.defaults",
    {
      "updated_by": "$.metadata.actor.id",
      "timestamp": "$.metadata.timestamp"
    }
  ]}
}

Nesting operators

Operators can be nested inside each other:

{
  "data": {"merge": [
    {"message": {"concat": ["Order ", "$.data.order_id", " processed"]}},
    {"name": {"coalesce": ["$.data.display_name", "$.data.email"]}},
    {"status": "pending"}
  ]}
}
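The three operators' documented semantics are small enough to state directly in code. These hypothetical helpers assume their inputs are already JSONPath-resolved values:

```python
def concat(values):
    """Join values into one string; numbers and booleans are coerced to strings."""
    return "".join(str(v) for v in values)

def coalesce(values):
    """Return the first value that is not None (null)."""
    return next((v for v in values if v is not None), None)

def merge(values):
    """Shallow-merge objects; later values override earlier ones, non-objects are skipped."""
    out = {}
    for v in values:
        if isinstance(v, dict):
            out.update(v)
    return out

concat(["Order ", "order:abc123", " was placed"])  # "Order order:abc123 was placed"
coalesce([None, None, "Anonymous"])                # "Anonymous"
merge([{"a": 1, "b": 1}, "skipped", {"b": 2}])     # {"a": 1, "b": 2}
```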

State Access Semantics

Pre-batch state (S0)

When accessing $.state.* in implications, you get the aggregate's state before any events in the current request are applied. This is called S0 (state zero).

Single event write:

- You submit event A
- Implications for A see S0 (state before A)
- After commit: state is S0 + A = S1

Batch event write:

- You submit events A, B, C in one request
- Implications for A see S0
- Implications for B see S0 (not S0 + A)
- Implications for C see S0 (not S0 + A + B)
- After commit: state is S0 + A + B + C = S3

Why S0?

S0 plus the trigger event data contains the same information as intermediate states. The trigger event data is available via $.data.*, so implications can reference both current state and the event payload without ambiguity.

All implications in a batch see the same state -- no coupling between sibling events, and no ordering dependencies within a batch.
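The batch behavior can be illustrated with a hypothetical fold, where `apply` stands in for whatever state reduction the aggregate uses:

```python
from functools import reduce

def apply(state, event):
    """Illustrative state reduction: merge event data into state."""
    return {**state, **event["data"]}

s0 = {"email": "old@example.com", "tier": "gold"}
batch = [
    {"type": "email_changed", "data": {"email": "new@example.com"}},
    {"type": "tier_changed", "data": {"tier": "platinum"}},
]

# Every implication in the batch resolves $.state against S0, regardless of position:
seen_by_implications = [s0 for _ in batch]

# The committed state folds all events in:
committed = reduce(apply, batch, s0)
# committed == {"email": "new@example.com", "tier": "platinum"}
# ...yet the second event's implication still saw email == "old@example.com" via $.state
```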

Best practices

Reference stable fields from state and changed fields from the event:

{
  "emit": {
    "aggregate_type": "audit",
    "id": "log",
    "event_type": "was_recorded",
    "data": {
      "customer_name": "$.state.customer_name",
      "new_email": "$.data.email"
    }
  }
}

Avoid designing implications that expect to see state changes from sibling events in the same batch. If event A sets a field and event B's implication needs that field, either include the value in event B's data, use a single combined event, or make event B a separate request after A completes.

Audit Trail

All implied events automatically include implied_by metadata for traceability:

{
  "implied_by": {
    "key": "order:abc123",
    "event_type": "was_placed",
    "depth": 1
  }
}

| Field | Description |
| --- | --- |
| key | Key of the event that triggered this implication |
| event_type | Type of the trigger event |
| depth | How many levels deep in the implication chain (1 = direct implication, 2 = implied by an implied event, etc.) |

Safety Limits

Implications have built-in protection against runaway chains:

| Limit | Default | Description |
| --- | --- | --- |
| max_depth | 5 | Maximum implication chain depth (A implies B implies C implies D implies E) |
| max_total | 100 | Maximum total implied events from a single trigger event |

Exceeding either limit returns an error and the entire transaction (trigger event plus all implied events) is rejected. Nothing is written.

Cycle detection

The engine detects static cycles at spec validation time. A spec where aggregate type A's event implies an event on aggregate type B, which in turn implies the same event back on A, will be rejected when the spec is submitted.

For example, if order:was_placed implies inventory:was_reserved, and inventory:was_reserved implies order:was_placed, the spec will fail validation with a cycle error.

API Response

When implications fire, the write API response includes the count of implied events:

{
  "stream_id": "1706789012345-0",
  "implied_count": 3
}

Limitations

  1. Atomic only. Tick, map, and pipeline implications are written atomically with the trigger event. There is no async retry for these types. Scheduled implications are the mechanism for deferred execution.

  2. No external calls. All implication types that run in the write path (tick, map, wasm, containers) are pure data transformations. No network calls, no I/O. If you need external data, fetch it before writing the trigger event and include it in the event data.

  3. Arrays only for map. The map construct iterates arrays. Use $entries to convert objects to [{key, value}, ...] arrays before mapping.

  4. Depth limits compound with fan-out. If a map emits 10 events and each of those has implications that emit 10 more, you hit 100 events at depth 2. Design fan-out chains carefully against the max_total limit.
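The fan-out arithmetic in point 4 is worth checking directly: with a uniform fan-out of f implied events per event, depth d alone contributes f**d events, and the running total is the sum across depths.

```python
fanout = 10

at_depth_2 = fanout ** 2               # 100 events at depth 2 alone
running_total = fanout + fanout ** 2   # 110 implied events across depths 1 and 2
# 110 already exceeds the default max_total of 100, so this chain would be rejected.
```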

See Also

Can't find what you need? support@j17.app