Building Creative Systems with AI

I design AI-enabled pipelines and production-ready workflows that help creative teams move faster without losing taste, authorship, or control.

Grounded in over a decade of motion design, 3D, and real-world production, I build systems with the same care and intention as the visuals they support.

This site focuses on the systems and workflows I build and teach using applied AI.

For selected motion and 3D work, visit notjustbsdesign.com

Trusted by teams at

Microsoft | Nike | Uber | A16Z | Techstars | SIGGRAPH Speaker | Camp Mograph Instructor

What I Do

I design creative workflows using applied AI, with an emphasis on things that actually hold up in production.

Creative Systems & Pipeline Architecture

I design and prototype AI-enabled workflows for creative teams and organizations. This often involves reviewing existing processes, identifying where automation genuinely helps, and building systems that fit how people already work rather than forcing tool-driven change.

Strategic Motion Design & 3D Production

I direct and produce motion and 3D work, often using AI as an accelerant inside traditional pipelines. The goal isn’t novelty, but clarity, speed, and maintaining creative intent under real deadlines.

Education & Advisory

I teach artists, studios, and organizations how to integrate generative tools thoughtfully through workshops, talks, and longer-term advisory engagements — focused on understanding tradeoffs, not chasing trends.

R&D: Creative Systems & Prototypes

This section covers ongoing experiments in creative systems and applied AI. Some tools are built for personal use, others for teaching or collaboration, but all of them are shaped by real production constraints rather than theoretical demos, and all continue to evolve through real-world use.

Editing Automation

Personal Production Tool

I built an internal editing automation tool to reduce the most repetitive parts of video editing while keeping creative decisions intact. The system transcribes recorded footage, detects redundant takes, and generates an edit decision list (EDL) for Premiere that suggests where to cut and which takes to keep.

Alongside the edit structure, it produces subtitle files and analyzes the content to surface follow-up ideas: social posts, derivative clips, and additional content worth recording. I use this workflow regularly for tutorials and short-form social videos, where speed matters but editorial judgment still needs to stay human. The result is faster turnaround on educational content without flattening that judgment.
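As a rough sketch of the shape of this pipeline (not the production tool itself), the Python below transcribes a recording with the open-source Whisper model, keeps only the last of any near-duplicate takes, and writes a bare-bones CMX3600-style EDL. The model choice, similarity threshold, frame rate, and file names are illustrative assumptions.

```python
# Rough sketch, not the production tool: transcribe -> drop redundant takes -> EDL.
# Assumes the open-source `whisper` package; threshold, fps, and paths are placeholders.

import difflib
import whisper

FPS = 23.976        # assumed frame rate of the source footage
SIMILARITY = 0.85   # assumed threshold for treating two takes as the same line


def to_timecode(seconds: float, fps: float = FPS) -> str:
    """Convert seconds to an HH:MM:SS:FF timecode string."""
    fps_i = int(round(fps))
    total_frames = int(round(seconds * fps))
    total_secs, frames = divmod(total_frames, fps_i)
    return f"{total_secs // 3600:02d}:{(total_secs // 60) % 60:02d}:{total_secs % 60:02d}:{frames:02d}"


def keep_last_takes(segments: list[dict]) -> list[dict]:
    """Keep only the final occurrence of near-identical lines (assumed to be the good take)."""
    kept: list[dict] = []
    for seg in segments:
        text = seg["text"].strip().lower()
        # Drop any earlier segment that this one essentially re-reads.
        kept = [
            k for k in kept
            if difflib.SequenceMatcher(None, k["text"].strip().lower(), text).ratio() < SIMILARITY
        ]
        kept.append(seg)
    return kept


def write_edl(segments: list[dict], path: str) -> None:
    """Write a bare-bones CMX3600-style EDL: one video cut per kept segment."""
    record_in = 0.0
    with open(path, "w") as edl:
        edl.write("TITLE: AUTO ROUGH CUT\nFCM: NON-DROP FRAME\n\n")
        for i, seg in enumerate(segments, start=1):
            duration = seg["end"] - seg["start"]
            edl.write(
                f"{i:03d}  AX       V     C        "
                f"{to_timecode(seg['start'])} {to_timecode(seg['end'])} "
                f"{to_timecode(record_in)} {to_timecode(record_in + duration)}\n"
            )
            record_in += duration


if __name__ == "__main__":
    result = whisper.load_model("base").transcribe("recorded_take.mp4")
    good_takes = keep_last_takes(result["segments"])
    write_edl(good_takes, "rough_cut.edl")
```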

Notes on this prototype →

Synthetic Performance Workflow

Personal Production System

This workflow uses AI-driven lip sync and digital doubles to generate realistic on-camera performance without the need to reshoot or be camera-ready. I built it to automate parts of my own video production, particularly for social content and tutorial inserts, where matching existing footage or quickly adding lines would otherwise require new shoots.

The emphasis is on realism and continuity: maintaining believable performance while reducing friction in the recording process. It’s especially useful for short-form content and instructional videos, where speed and consistency matter more than traditional capture setups, and for maintaining continuity across serialized content without reshooting.
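A minimal orchestration sketch of the idea, with the open-source Wav2Lip model standing in for the actual lip-sync step: an existing “digital double” plate is re-voiced with a newly recorded line, so framing, lighting, and wardrobe stay continuous. The paths and checkpoint name are placeholders, not the system described above.

```python
# Minimal orchestration sketch, not the production system: re-voice an existing
# "digital double" plate with a newly recorded line via the open-source Wav2Lip
# inference script. Paths, checkpoint, and the choice of Wav2Lip are assumptions.

import subprocess
from pathlib import Path

PLATE = Path("plates/talking_head_neutral.mp4")   # existing on-camera footage
NEW_LINE = Path("audio/added_line.wav")           # freshly recorded or TTS audio
OUTPUT = Path("out/talking_head_added_line.mp4")

OUTPUT.parent.mkdir(parents=True, exist_ok=True)

# The lip-sync model rewrites only the mouth region to match the new audio,
# so framing, lighting, and wardrobe stay continuous with the original plate.
subprocess.run(
    [
        "python", "Wav2Lip/inference.py",
        "--checkpoint_path", "Wav2Lip/checkpoints/wav2lip_gan.pth",
        "--face", str(PLATE),
        "--audio", str(NEW_LINE),
        "--outfile", str(OUTPUT),
    ],
    check=True,
)
```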

Workflow notes →

Hybrid Generative Workflow

SIGGRAPH Case Study

Presented at SIGGRAPH, this case study examines how structured 3D data from Cinema 4D can be used to guide and constrain generative AI output for animation and look development. Scene layout, camera data, and motion are used as inputs to preserve art direction while still allowing for generative variation.

The workflow was applied primarily to animation, with additional use in stills for look development. The goal was to integrate generative tools into an existing pipeline in a way that maintains control, continuity, and creative intent — rather than treating AI as a separate or destructive step.
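One way to make that concrete, assuming a depth ControlNet with Stable Diffusion rather than the specific toolchain shown at SIGGRAPH: depth passes rendered from the Cinema 4D scene carry layout, camera framing, and motion, and the generator is constrained to follow them frame by frame. The model IDs, prompt, seed handling, and file paths below are illustrative assumptions.

```python
# Illustrative sketch only: depth passes from the 3D scene condition the generator,
# so composition and camera framing come from Cinema 4D, not from the prompt.
# Models, paths, prompt, and seed here are assumptions, not the SIGGRAPH pipeline.

from pathlib import Path

import torch
from PIL import Image
from diffusers import ControlNetModel, StableDiffusionControlNetPipeline

controlnet = ControlNetModel.from_pretrained(
    "lllyasviel/sd-controlnet-depth", torch_dtype=torch.float16
)
pipe = StableDiffusionControlNetPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5", controlnet=controlnet, torch_dtype=torch.float16
).to("cuda")

PROMPT = "brushed metal product, soft studio lighting"  # art-direction prompt
SEED = 42  # a fixed seed keeps the generative variation consistent across frames

out_dir = Path("renders/generated")
out_dir.mkdir(parents=True, exist_ok=True)

for depth_path in sorted(Path("renders/depth").glob("frame_*.png")):
    # The depth pass carries scene layout and camera data exported from C4D,
    # so the generated frame stays locked to the art-directed 3D composition.
    depth = Image.open(depth_path).convert("RGB")
    frame = pipe(
        PROMPT,
        image=depth,
        num_inference_steps=30,
        generator=torch.Generator("cuda").manual_seed(SEED),
    ).images[0]
    frame.save(out_dir / depth_path.name)
```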

SIGGRAPH case details →

Let’s Talk

If you’re reaching out, it’s usually for one of the following reasons. If you’re not sure which applies, start with a short call and we’ll figure it out.

Studios & Teams
Designing creative systems, pipelines, or production workflows.

Organizations
Workshops, talks, or internal training.

Individuals
Education, consulting, or community membership.