Generative AI Workflow

Upload → process → notify → display

Overview

A user uploads inputs to S3, Lambda triggers hand the job (through an ngrok webhook) to a local Flask orchestrator that drives a ComfyUI runner, and outputs are written back to S3. Events propagate via the Lambdas, webhooks, and Pusher so the frontend updates progressively.

User → Frontend (Next.js)
 → S3 (inputs/<job_id>/…) → Lambda: S3ToComfyUI → webhook (ngrok) → Flask API (PREPARE_JOB) → Pusher
 → Local /jobs/<job_id> → ComfyUI workflow(s)
 → S3 (outputs/<job_id>/<stage>/…) → Lambda: NotifyWebhookOnS3Upload → webhook (ngrok) → Flask → Pusher
 → Frontend (refresh UI) → User sees results
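
The S3 prefixes in this flow are the contract shared by the frontend, the Lambdas, and the Flask orchestrator. A minimal sketch of that key layout as Python helpers; the bucket and prefixes come from the flow above, the helper names and the one-object-per-job index entry are illustrative assumptions:

  # s3_layout.py - illustrative helpers for the S3 key conventions used throughout this flow.
  BUCKET = "jig-react"

  def input_prefix(job_id: str) -> str:
      # User-uploaded media plus the request.json manifest live here.
      return f"inputs/{job_id}/"

  def output_key(job_id: str, stage: str, filename: str) -> str:
      # Each pipeline stage writes its results under its own sub-prefix.
      return f"outputs/{job_id}/{stage}/{filename}"

  def job_index_key(client_or_user_id: str, job_id: str) -> str:
      # Per-client job index that the frontend Job Dock reads on load
      # (whether this is one object per job or a single appended list is not specified).
      return f"job-index/by-client/{client_or_user_id}/{job_id}.json"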

Architecture Diagram

[Frontend: React/Next.js @ jigpx.com]
   │ User selects task + parameters
   │ Identity:
   │   - Anonymous → clientId in browser cache
   │   - Signed-in (Supabase, Google login) → userId
   │ API issues jobId + presigned S3 URL
   │ request.json manifest created
   │ Job record appended to job-index/by-client/{clientId or userId}
[S3 Bucket: jig-react]
   │ Input uploaded → inputs/{jobId}/...
   ▼  (S3 event: ObjectCreated in inputs/)
[Lambda: S3ToComfyUI]
   │ Validates input + manifest
   │ Updates status.json = "accepted"
   │ Sends JobCreated webhook → ngrok (Flask)
[ngrok Tunnel]
   │ Secure public HTTPS → local Flask API
   │ Lets AWS events reach Flask
[Flask Orchestrator]
   │ Receives JobCreated webhook
   │ Prepares workflow graph + parameters from request.json
   │ Connects to local ComfyUI server via WebSocket
   │ Submits job graph (enqueue run)
   │ Emits Pusher "job:accepted" to frontend
[ComfyUI Worker (GPU)]
   │ Workflow executes
   │ WebSocket sends status messages back to Flask:
   │   - job started, node queued
   │   - node completed
   │   - previews / intermediate updates
   │ Flask updates status.json accordingly
   │ Flask emits Pusher "job:progress" events
[S3 Bucket: jig-react]
   │ Final outputs written → outputs/{jobId}/...
   │ output.json manifest written (lists all result files + metadata)
   ▼  (S3 event: ObjectCreated in outputs/)
[Lambda: NotifyWebhookOnS3Upload]
   │ Notifies Flask via ngrok webhook (JobOutput)
[Flask Orchestrator]
   │ Confirms outputs (using output.json)
   │ Updates status.json = "completed" or "failed"
   │ Emits Pusher "job:completed|failed" with signed output URLs
[Frontend Job Dock]
   │ Loads job list from job-index/by-client/{clientId or userId}
   │ Subscribes to Pusher channels for each job
   │ Receives live updates & results instantly
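
The Flask orchestrator is the hub of this diagram. A minimal sketch of its inbound JobCreated webhook, assuming a Flask app and the pusher Python client; the route name, the per-job channel name, and the hand-off to prepare_job are illustrative, not confirmed details:

  # flask_webhooks.py - sketch of the orchestrator's inbound webhook (reached via the ngrok tunnel).
  import pusher
  from flask import Flask, request, jsonify

  from prepare_job import prepare_job        # step-3 sketch (illustrative module)

  app = Flask(__name__)
  pusher_client = pusher.Pusher(app_id="...", key="...", secret="...", cluster="...")

  @app.post("/webhooks/job-created")         # route name is illustrative
  def job_created():
      payload = request.get_json(force=True) # posted by the S3ToComfyUI Lambda
      job_id = payload["job_id"]

      # Hand off to PREPARE_JOB (in production, push this to a background worker
      # so the webhook returns quickly).
      prepare_job(job_id)

      # Tell the browser the job was accepted; it listens on a per-job channel.
      pusher_client.trigger(f"job-{job_id}", "job:accepted", {"job_id": job_id})
      return jsonify({"ok": True})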

1) Upload and intent

  • Frontend requests a pre-signed PUT URL for S3.
  • Uploads media to s3://jig-react/inputs/<job_id>/…
  • Writes the request.json job manifest in the same prefix (see the sketch below).
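
In production the browser performs these uploads with fetch against the pre-signed URLs; the same two PUTs in Python, assuming the API already returned the job_id and one pre-signed URL per object (the dictionary keys below are illustrative):

  # upload_inputs.py - sketch of step 1; the real client is the Next.js frontend.
  import json
  import requests

  def upload_job(presigned_put_urls: dict, image_path: str, task: str, params: dict) -> None:
      # Upload the media file to s3://jig-react/inputs/<job_id>/...
      with open(image_path, "rb") as f:
          requests.put(presigned_put_urls["input"], data=f).raise_for_status()

      # Upload the request.json manifest describing the task and its parameters.
      manifest = json.dumps({"task": task, "params": params})
      requests.put(presigned_put_urls["manifest"], data=manifest).raise_for_status()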

2) Start the job (inputs trigger)

  • S3 event on inputs/ invokes Lambda S3ToComfyUI.
  • The Lambda validates the input and manifest, marks status.json = "accepted", and posts a JobCreated webhook ({job_id, keys_changed}) to Flask through the ngrok tunnel (sketched below).
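
A minimal sketch of that Lambda, using only the standard library and assuming the Flask webhook URL (through ngrok) arrives in an environment variable named WEBHOOK_URL (an assumed name); validation and the status.json update are omitted for brevity:

  # s3_to_comfyui.py - sketch of the Lambda fired by ObjectCreated events under inputs/.
  import json
  import os
  import urllib.parse
  import urllib.request

  WEBHOOK_URL = os.environ["WEBHOOK_URL"]    # ngrok HTTPS URL of the Flask API

  def handler(event, context):
      # Keys arrive URL-encoded in the S3 event payload.
      keys = [urllib.parse.unquote_plus(r["s3"]["object"]["key"]) for r in event["Records"]]
      job_id = keys[0].split("/")[1]         # inputs/<job_id>/<file>

      body = json.dumps({"type": "JobCreated", "job_id": job_id, "keys_changed": keys}).encode()
      req = urllib.request.Request(WEBHOOK_URL, data=body,
                                   headers={"Content-Type": "application/json"})
      urllib.request.urlopen(req, timeout=10)
      return {"statusCode": 200}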

3) Prepare job (API/worker)

  • Flask receives the JobCreated webhook, emits Pusher "job:accepted" to the frontend, and runs PREPARE_JOB.
  • Creates /jobs/<job_id>/, downloads referenced S3 objects.
  • Parses request.json → builds the execution plan/graph.
  • Normalizes parameters to each node's expected inputs; queues work for ComfyUI (see the PREPARE_JOB sketch below).
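
A sketch of PREPARE_JOB under a few assumptions: the manifest follows the request.json shape used above, ComfyUI listens locally on port 8188, and the graph is enqueued through its /prompt HTTP endpoint (the common pattern is to enqueue over HTTP and track progress over the WebSocket; build_workflow_graph is an illustrative placeholder):

  # prepare_job.py - sketch of PREPARE_JOB: fetch inputs, build a graph, enqueue it.
  import json
  import os
  import uuid

  import boto3
  import requests

  BUCKET = "jig-react"
  COMFYUI_HTTP = "http://127.0.0.1:8188"     # assumed local ComfyUI address
  s3 = boto3.client("s3")

  def build_workflow_graph(manifest: dict, workdir: str) -> dict:
      # Placeholder: real code would load a workflow template and patch node inputs
      # from manifest["params"] and the downloaded files in workdir.
      raise NotImplementedError

  def prepare_job(job_id: str) -> str:
      workdir = f"/jobs/{job_id}"
      os.makedirs(workdir, exist_ok=True)

      # 1. Download everything under inputs/<job_id>/ into the local job folder.
      listing = s3.list_objects_v2(Bucket=BUCKET, Prefix=f"inputs/{job_id}/")
      for obj in listing.get("Contents", []):
          s3.download_file(BUCKET, obj["Key"], os.path.join(workdir, os.path.basename(obj["Key"])))

      # 2. Parse the manifest and turn it into a ComfyUI workflow graph.
      with open(os.path.join(workdir, "request.json")) as f:
          manifest = json.load(f)
      graph = build_workflow_graph(manifest, workdir)

      # 3. Enqueue the graph; progress is then tracked over the WebSocket (step 4).
      client_id = str(uuid.uuid4())
      requests.post(f"{COMFYUI_HTTP}/prompt",
                    json={"prompt": graph, "client_id": client_id}).raise_for_status()
      return client_id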

4) Generation (one or multiple stages)

  • ComfyUI runs stages; partials may be produced per stage.
  • Writes outputs to s3://jig-react/outputs/<job_id>/<stage>/…
  • Lightweight Pusher "job:progress" events can surface per-stage progress (relay loop sketched below).
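
A sketch of that relay loop, assuming the websocket-client package, ComfyUI's /ws status feed, and the message types ComfyUI emits (progress, executed, executing); exact payload shapes and the per-job Pusher channel name should be treated as assumptions:

  # comfy_progress.py - sketch: relay ComfyUI WebSocket status messages to Pusher.
  import json

  import websocket                           # pip install websocket-client

  COMFYUI_WS = "ws://127.0.0.1:8188/ws"      # assumed local ComfyUI address

  def relay_progress(job_id: str, client_id: str, pusher_client) -> None:
      ws = websocket.create_connection(f"{COMFYUI_WS}?clientId={client_id}")
      try:
          while True:
              frame = ws.recv()
              if isinstance(frame, bytes):
                  continue                   # binary frames carry preview images; skipped here
              msg = json.loads(frame)
              data = msg.get("data", {})
              if msg.get("type") == "progress":
                  # Per-node progress (e.g. sampler steps) -> job:progress
                  pusher_client.trigger(f"job-{job_id}", "job:progress",
                                        {"value": data.get("value"), "max": data.get("max")})
              elif msg.get("type") == "executed":
                  # A node finished; intermediates may now exist.
                  pusher_client.trigger(f"job-{job_id}", "job:progress",
                                        {"node_done": data.get("node")})
              elif msg.get("type") == "executing" and data.get("node") is None:
                  break                      # ComfyUI signals end-of-run with node=None
      finally:
          ws.close()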

5) Notify on outputs (outputs trigger)

  • S3 event on outputs/ invokes Lambda NotifyWebhookOnS3Upload.
  • Lambda posts a JobOutput webhook to Flask via ngrok; Flask rebroadcasts via Pusher: {job_id, new_files: […]} (handler sketched below).
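
A sketch of the Flask side of that webhook, reusing the app, pusher_client, and bucket conventions from the earlier sketches; the route name and the "files" field of output.json are assumptions:

  # job_output_webhook.py - sketch of the JobOutput webhook handler on Flask.
  import json

  import boto3
  from flask import request, jsonify

  from flask_webhooks import app, pusher_client   # objects from the earlier orchestrator sketch

  BUCKET = "jig-react"
  s3 = boto3.client("s3")

  @app.post("/webhooks/job-output")                # route name is illustrative
  def job_output():
      payload = request.get_json(force=True)       # {"job_id": ..., "new_files": [...]}
      job_id, new_files = payload["job_id"], payload.get("new_files", [])

      if not any(key.endswith("output.json") for key in new_files):
          # Intermediate/stage outputs: surface them as progress only.
          pusher_client.trigger(f"job-{job_id}", "job:progress", {"new_files": new_files})
          return jsonify({"ok": True})

      # The final manifest landed: confirm outputs and sign a GET URL per result file.
      manifest = json.loads(
          s3.get_object(Bucket=BUCKET, Key=f"outputs/{job_id}/output.json")["Body"].read())
      urls = [s3.generate_presigned_url("get_object",
                                        Params={"Bucket": BUCKET, "Key": key},
                                        ExpiresIn=3600)
              for key in manifest["files"]]        # "files" field is an assumed shape

      pusher_client.trigger(f"job-{job_id}", "job:completed",
                            {"job_id": job_id, "outputs": urls})
      return jsonify({"ok": True})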

6) Frontend updates

  • Job page subscribes to the Pusher channel for job_id.
  • New thumbnails/results appear; status updates live.
  • On final-stage completion, a job:completed event arrives and downloads are enabled.

User experience

  • Immediate “Processing…” after upload.
  • Progressive results if partials are surfaced.
  • Final artifacts downloadable from outputs/<job_id>/…

Implementation notes

  • IDs and foldering: UUID job_id for inputs/ and outputs/.
  • Event payloads: send only new keys to avoid large messages.
  • Idempotency: use status.json state markers to prevent duplicate runs (see the sketch after this list).
  • Errors: write error.json in outputs/ and emit job:failed.
  • Security: serve outputs via pre-signed GET or a CDN.
  • Progress truth: store request.json, status.json, and manifest.json in S3.
  • Chaining UX: persist a graph manifest for visualization and branching.
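
The idempotency and error notes above, as a small sketch: status.json records the last known state so duplicate S3 or webhook events become no-ops, and failures write error.json before emitting the failure event. The status.json location and the transition table are assumptions, and an S3 read-then-write is a marker, not a lock:

  # job_state.py - sketch of idempotent status transitions and error reporting.
  import json

  import boto3

  BUCKET = "jig-react"
  s3 = boto3.client("s3")

  # Allowed forward transitions; anything else is a duplicate or out-of-order event.
  ALLOWED = {None: {"accepted"}, "accepted": {"completed", "failed"}}

  def _status_key(job_id: str) -> str:
      # Kept outside inputs/ and outputs/ so status writes do not re-trigger the Lambdas
      # (location is an assumption; the notes only say status.json lives in S3).
      return f"job-state/{job_id}/status.json"

  def advance_status(job_id: str, new_status: str) -> bool:
      key = _status_key(job_id)
      try:
          current = json.loads(s3.get_object(Bucket=BUCKET, Key=key)["Body"].read())["status"]
      except s3.exceptions.NoSuchKey:
          current = None
      if new_status not in ALLOWED.get(current, set()):
          return False                       # duplicate or regressive event: do nothing
      s3.put_object(Bucket=BUCKET, Key=key, Body=json.dumps({"status": new_status}).encode())
      return True

  def report_error(job_id: str, message: str, pusher_client) -> None:
      # Persist the failure, then notify the frontend on the per-job channel.
      s3.put_object(Bucket=BUCKET, Key=f"outputs/{job_id}/error.json",
                    Body=json.dumps({"error": message}).encode())
      pusher_client.trigger(f"job-{job_id}", "job:failed", {"job_id": job_id, "error": message})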