Human-AI Creative Animation Tool
Autodesk Maya
PlayStation Visual Arts

Project Summary: At PlayStation Visual Arts Studio, I partnered with the R&D team to turn an experimental AI and computer-vision system into a production-ready animation tool. The breakthrough technology automatically translated raw motion-capture video into fully rigged 3D animation. While the underlying engine was technically impressive, its prototype UI prevented adoption: animators struggled to operate the system independently and remained reliant on engineers, limiting engagement and slowing production velocity.

My mission was to design the interaction and workflow experience that enabled animators to confidently edit AI-generated animation, reduce dependency on technical support, and shorten iteration time in Maya.

My Role: End-to-End Product & UX Strategy / Human–AI Workflow Architecture / Interaction Design / Prototyping & Validation

Collaborators

Software Engineers x2
AI Software Engineer x1

Target Audience

Animators
Animation Tech Artists

Platform

Autodesk Maya

Stakeholders

R&D Senior Manager
Senior Software Engineers
Animation Technical Artists
Engineering Manager

Duration

3 Months

Challenge

The initial demo UI surfaced the power of the AI system but did not align with animator workflows. Critical issues included:

  • High dependency on engineering support for basic tasks

  • AI-flagged problem poses buried in dense tables, requiring manual timeline scrubbing

  • No conceptual alignment with how animators structure their work

  • High cognitive load and inconsistent screen hierarchy

Despite the promise of the technology, the lack of a usable interface and workflow integration resulted in low engagement and limited adoption. The opportunity was to design an experience that transformed the AI output into a purposeful creative assistant, empowering animators rather than burdening them.
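To illustrate the "buried in dense tables" problem, the sketch below shows one way AI-flagged poses could be turned into a prioritized review queue that lets an animator jump straight to the worst frames instead of scrubbing the timeline. This is a hypothetical model, not the studio's implementation: the `FlaggedPose` fields, the `severity` score, and the 0.5 threshold are all assumptions made for the example.

```python
from dataclasses import dataclass

# Hypothetical data model: one AI-flagged pose per timeline frame.
@dataclass
class FlaggedPose:
    frame: int
    joint: str
    severity: float  # assumed model confidence (0-1) that the pose needs correction

def build_review_queue(flags, threshold=0.5):
    """Order flagged poses so the worst problems surface first,
    replacing manual timeline scrubbing with a ranked worklist."""
    actionable = [f for f in flags if f.severity >= threshold]
    return sorted(actionable, key=lambda f: (-f.severity, f.frame))

flags = [
    FlaggedPose(frame=120, joint="wrist_L", severity=0.92),
    FlaggedPose(frame=45, joint="ankle_R", severity=0.30),
    FlaggedPose(frame=88, joint="spine_02", severity=0.75),
]

queue = build_review_queue(flags)
for f in queue:
    print(f.frame, f.joint, f.severity)
```

In a Maya context, selecting a queue entry would set the playhead to that frame; here the queue is simply printed, highest severity first, with low-confidence flags filtered out.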

Business Goals


  • Enable animators to collaborate with AI to create high-quality animation sequences, without engineering mediation

Immersive Workflow Research

I conducted primary research with animators, animation supervisors, and technical artists to understand:

  • Existing animation workflows in Maya

  • Pain points in pose correction and performance refinement

  • How animators interpret system feedback and prioritize tasks


Impact Metrics
Based on comparative usability testing, stakeholder analytics reviews, tool usage reporting, and user satisfaction surveys.

60%

Increase in Adoption

95%

Increase in Task Completion Rate

100%

User Delight and Satisfaction

Design Stack

The Project

Strategy Insight:

AI does not create value by itself; workflow integration does.

While the underlying AI and computer-vision technology was capable of generating high-quality animation from mocap data, its impact was limited by how that intelligence was surfaced to animators. The initial demo treated AI output as raw data rather than as actionable guidance, forcing animators to rely on engineers to interpret and operate the system.

The strategic shift was to reframe the system not as an “AI animation tool,” but as a human–AI collaboration system that translates machine intelligence into clear, phased workflows matching how animators think and work inside Maya.

By aligning the interface to animators’ natural cognitive phases (pose correction → solve & refinement), the tool moved from being technically impressive to operationally valuable. This shift unlocked adoption, animator autonomy, and measurable production efficiency.

Decision Tradeoffs

What Was Intentionally Simplified:
We reduced the surface area of raw AI diagnostic data and consolidated it into workflow-aligned action zones, prioritizing clarity and task progression over exposing every system-generated signal.

Rather than reflecting the engineering data structure directly in the UI, we reorganized the interface around animator cognitive phases (Pose Correction → Solve & Refinement), reducing cognitive load and enabling faster, more confident decision-making.
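The regrouping described above can be sketched in miniature: raw diagnostics arrive typed but unorganized, and the UI layer buckets them into the two animator-facing phases rather than mirroring the engineering data structure. The flag-type names and the phase mapping below are illustrative assumptions, not the production schema.

```python
# Hypothetical mapping from raw AI flag types to the two
# animator-facing workflow phases (action zones in the UI).
PHASE_BY_FLAG_TYPE = {
    "bad_pose": "Pose Correction",
    "joint_limit": "Pose Correction",
    "foot_slide": "Solve & Refinement",
    "jitter": "Solve & Refinement",
}

def group_by_phase(diagnostics):
    """diagnostics: list of dicts like {"frame": 12, "type": "jitter"}.
    Returns {phase_name: [diagnostics...]} so the UI can present one
    consolidated action zone per phase instead of one dense table."""
    phases = {"Pose Correction": [], "Solve & Refinement": []}
    for d in diagnostics:
        phase = PHASE_BY_FLAG_TYPE.get(d["type"])
        if phase:  # unmapped signals are held back, not surfaced
            phases[phase].append(d)
    return phases
```

The key design choice this encodes: signals without a clear animator action (`PHASE_BY_FLAG_TYPE.get` returns `None`) are deliberately withheld, trading diagnostic completeness for task clarity.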

What Was Postponed:
Advanced automation features, such as auto-approval of flagged poses and deeper predictive correction modeling, were deferred until animators demonstrated sustained trust in the AI system and sufficient production validation data was available.

This prevented premature automation from undermining creative control or introducing unintended animation artifacts into the pipeline.

What Was Not Built:
Full automation of the animation solve and one-click “finalization” workflows were intentionally excluded.

While technically feasible, removing animator oversight would have compromised creative ownership, reduced trust in the system, and risked over-reliance on machine output. The tool was designed to amplify human craft, not replace it.

Design Principles That Guided the Human–AI Animation System


1. Design for Collaboration, Not Automation

AI should amplify human expertise, not replace it. The interface was designed to support collaboration between animator and machine, preserving creative intent while accelerating execution.

2. Align Systems to Human Cognition

Interfaces should reflect how experts think and work, not how systems store data. The two-phase workflow model (correction → refinement) became the organizing structure of the UI.

3. Context Over Control

The system provides guidance and context rather than enforcement. Animators decide when and how to act on AI suggestions, which builds trust and autonomy.

4. Reduce Cognitive Load Before Adding Features

Clarity is a feature. Removing unnecessary density and surfacing only relevant actions at each phase improved speed, accuracy, and confidence.

5. Validate Structure Before Polish

Interaction architecture and workflow alignment were validated in mid-fidelity before visual refinement, ensuring the tool worked correctly in real production environments before final styling.

6. Respect Production Reality

Design decisions accounted for Maya constraints, long work sessions, multi-monitor setups, and performance considerations, prioritizing reliability and usability over visual flourish.

Lessons Learned

This project reinforced a core design principle:

Advanced systems only create value when they are embedded into workflows that humans understand, trust, and can act upon.

Successful tools are not those with the most features; they are those that align with how people think, work, and create.

The Human-AI Animation System design transformed a technical demo into a practical source of value for animators, bringing structure, clarity, and focus to machine-assisted animation.

“Merilly's vision and dedication to creating an inclusive environment where everyone can feel heard and valued were truly inspiring. I would highly recommend her to any organization seeking a highly motivated and capable team player who is committed to making a difference.”

Jillian Moore

Staff Software Engineer | PlayStation