Live2D Animation & Mapping
The Mapping tab is where you bring your Live2D model to life by connecting agent behavior to animations. The system has three layers: Clips define what happens, Rules define when it happens, and your agent’s state drives the whole thing automatically.
Architecture Overview
```
Agent State (emotion, action, tool call)
        │
        ▼
┌──────────┐  all-match (every matching rule fires)
│  Rules   │ ─────────────────────► Which clips to play?
└──────────┘
        │
        ▼
┌──────────┐  clips play in parallel
│  Clips   │ ──► Parameter changes over time
└──────────┘
        │  conflicts resolved by priority + blend mode
        ▼
Live2D Model Parameters Updated
```

Animation Clips
A clip is a reusable animation — a set of parameter changes over time. Think of it like a short animation sequence you can trigger from multiple rules.
Creating Clips
Three ways to create clips:
1. Manual creation
- Click Add Clip in the Mapping tab
- Add parameter tracks and set keyframes
- Use the full-screen timeline editor for precise control
2. Import from Expression
- Click Create From Expression
- Select a model expression preset (`.exp3.json`)
- The system creates a clip with enter → hold → release phases
- Configure durations for each phase
3. Import from Motion
- Click Import Motion As Clip
- Select a model motion (`.motion3.json`)
- The motion’s curves are converted into editable keyframes
Clip Properties
| Property | Options | Description |
|---|---|---|
| Blend Mode | Add, Multiply, Overwrite | How this clip combines with other active clips |
| Playback | One Shot, Loop, Repeat N | How many times the clip plays |
| Completion | Complete, Bound | Whether clip finishes if the trigger ends early |
| End Behavior | Stop, Hold then Release | What happens when the clip finishes |
| Release Curve | Duration (ms) | Smooth fade-out when the clip ends |
Full-Screen Editor
Click Open Full Editor for the standalone timeline editor, which offers:
- Per-parameter keyframe tracks with per-track modifiers
- Cubic-bezier curve editing with visual curve preview
- Scrub, pan, and zoom controls
- Real-time preview synchronized with the model
Keyframe Editing Shortcuts
| Action | Shortcut | Description |
|---|---|---|
| Select keyframe | Click | First click selects, second click opens inline edit panel |
| Cycle curve type | Ctrl+Click | Cycles: Linear → Bezier → Spring → Step |
| Add/Remove keyframe | Diamond button | Toggle — adds if none at playhead, removes if exists |
Keyframe icons indicate curve type at a glance:
- ◇ Diamond (sky blue) — Linear interpolation
- ● Circle (emerald) — Cubic Bezier
- ◆ Rounded diamond (violet) — Spring
- ■ Square (amber) — Step (instant jump)
New keyframes automatically inherit the curve setting from the previous keyframe on the same track.
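As a rough illustration of how a track might be sampled and how curve inheritance works, here is a minimal sketch covering the Linear and Step cases (the type and function names are hypothetical, not the editor’s API):

```typescript
// Hypothetical sketch of per-track sampling for Linear and Step curves,
// plus the curve-inheritance rule for newly added keyframes.
type CurveType = "linear" | "step";

interface Keyframe {
  time: number;     // seconds
  value: number;
  curve: CurveType; // curve used when interpolating toward the NEXT key
}

function sampleTrack(track: Keyframe[], t: number): number {
  if (t <= track[0].time) return track[0].value;
  const last = track[track.length - 1];
  if (t >= last.time) return last.value;
  let i = 0;
  while (track[i + 1].time < t) i++;          // find the surrounding pair
  const a = track[i];
  const b = track[i + 1];
  if (a.curve === "step") return a.value;     // hold until the next key
  const u = (t - a.time) / (b.time - a.time); // normalized 0..1
  return a.value + (b.value - a.value) * u;   // linear interpolation
}

// New keyframes inherit the curve setting of the previous key on the track.
function addKeyframe(track: Keyframe[], time: number, value: number): void {
  const prev = [...track].reverse().find(k => k.time <= time);
  track.push({ time, value, curve: prev ? prev.curve : "linear" });
  track.sort((k1, k2) => k1.time - k2.time);
}
```

Bezier and Spring curves would slot into the same structure as additional `CurveType` cases.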
Per-Track Modifiers
Modifiers are procedural effects that transform keyframe values — inspired by After Effects expressions. They run on top of your keyframe animation, adding organic movement without additional keyframes.
Adding Modifiers
Two ways to add a modifier to a parameter track:
- Timeline: Click the [+] button next to the track label, or click a modifier badge to edit it
- Inspector: Select a keyframe on the track, then use the Add dropdown in the Modifiers section
Modifier Types
| Modifier | Badge | Effect | Key Parameters |
|---|---|---|---|
| Wiggle | [W] | Perlin noise offset — organic random jitter | frequency (Hz), amplitude |
| Drift | [D] | Smooth wandering — slow organic drift around the center value | radius, speed |
| Sine | [S] | Pure sine wave — regular periodic oscillation | amplitude, speed (Hz), phase |
| Pulse | [P] | Random-interval impulse — occasional bursts | peak offset, interval range, duration |
| Spring | [K] | Damped oscillation — shock → recover bounce | initial offset, stiffness, damping |
| Loop | [L] | Repeat keyframes — cycle or pingpong | mode (cycle / pingpong) |
| Noise | [N] | Simplified Wiggle — fixed-amplitude noise | amplitude, speed |
| Clamp | [C] | Value limiter — keeps output in safe range | min, max |
How Modifiers Work
Modifiers chain after keyframe interpolation. At each frame:
```
Keyframe value (interpolated) → Modifier 1 → Modifier 2 → ... → Final value
```

- Wiggle, Drift, Sine, Noise add an offset to the keyframed value
- Loop modifies the sample time (runs before interpolation)
- Clamp restricts the output range
- Pulse and Spring are triggered effects that decay over time
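The chain described above can be sketched as follows; the modifier signatures and factory functions are illustrative assumptions, not the engine’s actual API:

```typescript
// Sketch of the modifier chain: each modifier maps the running value to a
// new value, using wall-clock time.
type Modifier = (value: number, timeSec: number) => number;

// Sine: adds a periodic offset on top of the keyframed value.
const sine = (amplitude: number, hz: number, phase = 0): Modifier =>
  (v, t) => v + amplitude * Math.sin(2 * Math.PI * hz * t + phase);

// Clamp: restricts the final output to a safe range.
const clamp = (min: number, max: number): Modifier =>
  (v) => Math.min(max, Math.max(min, v));

// Keyframe value (interpolated) → Modifier 1 → Modifier 2 → ... → Final value
function applyModifiers(keyframed: number, mods: Modifier[], timeSec: number): number {
  return mods.reduce((v, mod) => mod(v, timeSec), keyframed);
}
```

Because the time argument is wall-clock time, a looping clip samples a different part of the sine wave on every iteration.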
Modifier + Keyframe Interaction
Modifiers use wall-clock time, not clip time. This means:
- Loop clips: Each loop iteration gets different Wiggle/Drift offsets — the animation never looks exactly the same twice
- Release: When a clip enters release, modifiers stop and the value smoothly fades back to baseline
- Hold-then-release: During the hold phase (bound-to-trigger), modifiers continue running
Track Blend Mode
Each track has a Blend Mode that controls how its output combines with other active clips:
| Mode | Behavior | Use Case |
|---|---|---|
| Override (default) | Higher priority clips replace lower ones | Expression animations, action clips |
| Add | Value is added on top of other clips’ output | Breathing, sway (should persist during emotion clips) |
| Multiply | Value multiplies other clips’ output | Special effects |
Set blend mode to Add for tracks that should layer (like breathing) — they’ll continue even when higher-priority emotion clips are active.
Virtual Focus Parameters
Two special virtual parameters let you control the model’s gaze direction:
| Parameter | Range | Description |
|---|---|---|
| `__focus_x__` | [-1, 1] | Horizontal gaze direction (left ↔ right) |
| `__focus_y__` | [-1, 1] | Vertical gaze direction (down ↔ up) |
These appear in the parameter list under the Virtual group. When written, they feed into the engine’s focusController, which automatically drives the head angle, body angle, and eyeball parameters based on the model’s built-in coupling.
Virtual Focus + Drift = Natural Gaze Wandering
Create a clip with `__focus_x__` and `__focus_y__` tracks, add Drift modifiers, and assign it to a Base Behavior — the model will naturally look around with smooth, organic eye movement.
Use the Template button in the Clips tab → Virtual Focus to set this up in one click.
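For intuition, here is one way a Drift-style modifier could produce smooth wandering for the focus tracks. This is purely illustrative: the real Drift modifier likely uses smoothed noise, and only the `radius`/`speed` parameter names mirror the modifier table above.

```typescript
// Illustrative Drift-style wandering: two slow sine waves at incommensurate
// frequencies give a smooth path that doesn't visibly repeat.
function driftOffset(center: number, radius: number, speed: number, tSec: number): number {
  const wander =
    0.6 * Math.sin(2 * Math.PI * speed * tSec) +
    0.4 * Math.sin(2 * Math.PI * speed * 1.37 * tSec + 1.0);
  return center + radius * wander; // stays within center ± radius
}

// Example: gaze wandering safely inside the [-1, 1] focus range
const focusX = (t: number) => driftOffset(0, 0.8, 0.1, t);
```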
Clip Templates
The Template dropdown in the Clips tab provides one-click presets:
| Template | What It Creates |
|---|---|
| Virtual Focus | `__focus_x__` + `__focus_y__` tracks with Drift modifiers (loop clip) |
| Body Sway + Breathing | `ParamBodyAngleZ` + `ParamBreath` tracks with Wiggle modifiers (loop clip) |
Templates create the clip only — assign it to a Base Behavior or Rule to activate it.
Mapping Rules
Rules connect your agent’s state to clips. When the agent’s state matches a rule’s condition, the associated clips play.
Rule Structure
Each rule has:
- Condition — When to trigger (emotion, action, tool call, or idle)
- Clip Groups — Which clips to play when triggered
- Play Mode — Single clip, all clips, or random selection
Condition Types
| Type | Trigger | Example |
|---|---|---|
| Expression | emotion + action combination | emotion = happy, action = speaking |
| Tool Call | Agent uses a specific tool | web_search, manage_memory |
| Idle | Agent is not doing anything | Background animation |
Both emotion and action support an “Any” wildcard — match regardless of that dimension.
All-Match (Parallel Playback)
All rules whose conditions are satisfied fire simultaneously — not first-match-wins. If multiple rules match, their clips all play in parallel.
```
Rule 1: emotion=angry, action=Any    → play "angry" clip
Rule 2: emotion=Any, action=speaking → play "talk" clip
Rule 3: emotion=happy, action=Any    → play "smile" clip
Rule 4: idle                         → play "breathing" clip
```

If the agent is angry and speaking, both Rule 1 and Rule 2 match — the “angry” clip and the “talk” clip play together.
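A sketch of all-match evaluation (the types and the `"any"` wildcard spelling are illustrative; the point is that every matching rule contributes its clips, with no first-match short-circuit):

```typescript
// All-match rule evaluation: collect clips from EVERY rule whose
// condition matches the current agent state.
interface AgentState { emotion: string; action: string; }
interface MappingRule { emotion: string; action: string; clips: string[]; }

function matchingClips(rules: MappingRule[], state: AgentState): string[] {
  return rules
    .filter(r =>
      (r.emotion === "any" || r.emotion === state.emotion) &&
      (r.action === "any" || r.action === state.action))
    .flatMap(r => r.clips);
}

// The example rules above, for an agent that is angry AND speaking:
const exampleRules: MappingRule[] = [
  { emotion: "angry", action: "any", clips: ["angry"] },
  { emotion: "any", action: "speaking", clips: ["talk"] },
  { emotion: "happy", action: "any", clips: ["smile"] },
];
```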
Conflict Resolution (Clip Priority)
When multiple clips write the same parameter, conflicts are resolved by clip priority (set per clip, not per rule):
| Priority | Level | Use Case |
|---|---|---|
| 0 | Idle | Background breathing, subtle movement |
| 1 | Normal | Default for most clips |
| 2 | Emotion | Expression-triggered clips |
| 3 | Force | Highest — overrides everything |
Higher priority clips win. Within the same priority, blend mode determines how values combine (Add layers on top, Multiply scales, Overwrite replaces).
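Priority-then-blend resolution can be sketched like this (the names and the exact ordering semantics are assumptions based on the tables above):

```typescript
// Per-parameter conflict resolution: the highest-priority writers win
// outright; among them, values combine in order by blend mode.
type BlendMode = "overwrite" | "add" | "multiply";

interface ClipWrite {
  priority: number; // 0 Idle, 1 Normal, 2 Emotion, 3 Force
  blend: BlendMode;
  value: number;
}

function resolveParameter(baseline: number, writes: ClipWrite[]): number {
  if (writes.length === 0) return baseline; // no clip active → default state
  const top = Math.max(...writes.map(w => w.priority));
  return writes
    .filter(w => w.priority === top) // lower-priority writers are discarded
    .reduce((acc, w) => {
      switch (w.blend) {
        case "overwrite": return w.value;
        case "add":       return acc + w.value;
        case "multiply":  return acc * w.value;
        default:          return acc;
      }
    }, baseline);
}
```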
Idle & Base Behaviors
Idle behaviors are long-lived background animations that play when no other rule matches. Unlike regular rules, they loop continuously and blend with other animations.
Base behaviors run in both idle and active states, providing a persistent animation layer (like subtle breathing or blinking).
Expression Mapping Integration
Live2D animation mapping works alongside the broader Expression Settings system:
- Vocabulary (Expression Settings → Vocabulary tab) — Define the emotion and action words your agent can express
- Mapping Rules (Live2D Settings → Mapping tab) — Map those emotions/actions to Live2D clips
- Agent Output — During conversation, the LLM outputs emotions and actions that automatically trigger matching rules
The emotion/action vocabulary you define in Expression Settings becomes the available conditions in your Live2D mapping rules.
Use Case Walkthroughs
Use Case 1: Basic Conversation Animation
A simple setup for an agent that smiles when happy and looks thoughtful when thinking.
Step 1: Import model (Model tab)
- Upload your Live2D model ZIP
Step 2: Set idle motion (Parameters tab)
- Choose a gentle breathing/blinking motion as default idle
Step 3: Create clips (Mapping tab → Clips)
- “happy” — `ParamEyeLOpen: 0.7`, `ParamMouthForm: 0.8` (squinting smile)
- “thinking” — `ParamAngleX: -8`, `ParamEyeLOpen: 0.5` (head tilt, half-closed eyes)
- “speaking” — `ParamMouthOpenY` cycling 0 → 0.6 → 0 (mouth movement loop)
Step 4: Create rules (Mapping tab → Rules)
- emotion = happy → play “happy” clip
- emotion = thinking → play “thinking” clip
- action = speaking → play “speaking” clip (loop)
Step 5: Test
- Use the sequence tester to verify each rule triggers correctly
Use Case 2: Tool Call Animation
Animate the agent differently when using different tools.
Create clips:
- “search” — `ParamEyeLOpen: 1.0`, `ParamBrowLY: 0.3` (wide eyes, raised brows)
- “write” — `ParamAngleY: -5`, `ParamEyeLOpen: 0.6` (looking down, focused)
Create rules:
- tool_call = web_search → play “search” clip
- tool_call = manage_memory → play “write” clip
Use Case 3: Import from Model Expressions
Quickly create animations from your model’s bundled expressions.
- In Mapping tab, click Create From Expression
- Select an expression preset (e.g., “smile”)
- Set timing: Enter 300ms, Hold 2000ms, Release 500ms
- The system auto-creates a clip with smooth parameter transitions
- Create a rule mapping emotion = happy → the imported clip
- Repeat for other expressions
Use Case 4: Desktop Pet Mode
Once your Live2D setup is complete:
- Install the AnySoul desktop app
- Enable Desktop Pet mode
- Your Live2D model renders in a transparent overlay on your desktop
- All animation rules work exactly the same — emotions, actions, and tool calls drive the pet’s animations in real-time
Default State (Idle)
The Default (Idle) section at the bottom of the Mapping tab lets you set baseline parameter values. These values are applied when no clips or rules are active.
Use this to set a neutral expression — slightly open eyes, relaxed mouth, centered head position — that serves as the “home” state your animations depart from and return to.
Next Steps
- Parameters Reference — Deep dive into each parameter
- Getting Started — Go back to model import basics