
Live2D Animation & Mapping

The Mapping tab is where you bring your Live2D model to life by connecting agent behavior to animations. The system has three layers: Clips define what happens, Rules define when it happens, and your agent’s state drives the whole thing automatically.

Agent State (emotion, action, tool call)
          │
          ▼
   ┌───────────┐   all-match (every matching rule fires)
   │   Rules   │ ─────────────────────► Which clips to play?
   └───────────┘
          │
          ▼
   ┌───────────┐   clips play in parallel
   │   Clips   │ ──► Parameter changes over time
   └───────────┘
          │   conflicts resolved by priority + blend mode
          ▼
   Live2D Model Parameters Updated

A clip is a reusable animation — a set of parameter changes over time. Think of it like a short animation sequence you can trigger from multiple rules.

Three ways to create clips:

1. Manual creation

  • Click Add Clip in the Mapping tab
  • Add parameter tracks and set keyframes
  • Use the full-screen timeline editor for precise control

2. Import from Expression

  • Click Create From Expression
  • Select a model expression preset (.exp3.json)
  • The system creates a clip with enter → hold → release phases
  • Configure durations for each phase

3. Import from Motion

  • Click Import Motion As Clip
  • Select a model motion (.motion3.json)
  • The motion’s curves are converted into editable keyframes
| Property | Options | Description |
| --- | --- | --- |
| Blend Mode | Add, Multiply, Overwrite | How this clip combines with other active clips |
| Playback | One Shot, Loop, Repeat N | How many times the clip plays |
| Completion | Complete, Bound | Whether the clip finishes if the trigger ends early |
| End Behavior | Stop, Hold then Release | What happens when the clip finishes |
| Release Curve | Duration (ms) | Smooth fade-out when the clip ends |
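The enter → hold → release phases of an expression-imported clip can be pictured as a weight envelope applied to the clip's parameter targets. This is an illustrative sketch, not the engine's actual code; the `PhaseConfig` type and `phaseWeight` function are hypothetical names:

```typescript
// Hypothetical sketch of the enter → hold → release envelope for an
// expression-imported clip. Names are illustrative, not the real API.
type PhaseConfig = { enterMs: number; holdMs: number; releaseMs: number };

/** Weight in [0, 1] applied to the clip's parameter targets at time t (ms). */
function phaseWeight(t: number, cfg: PhaseConfig): number {
  const { enterMs, holdMs, releaseMs } = cfg;
  if (t < 0) return 0;
  if (t < enterMs) return t / enterMs;              // enter: fade in
  if (t < enterMs + holdMs) return 1;               // hold: full strength
  const rel = t - enterMs - holdMs;
  if (rel < releaseMs) return 1 - rel / releaseMs;  // release: fade back out
  return 0;                                         // clip finished
}
```

With the timings used later in this guide (Enter 300 ms, Hold 2000 ms, Release 500 ms), the weight ramps up over the first 300 ms, stays at 1 until 2300 ms, then ramps back to 0 by 2800 ms.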

Click Open Full Editor for the standalone timeline editor, which offers:

  • Per-parameter keyframe tracks with per-track modifiers
  • Cubic-bezier curve editing with visual curve preview
  • Scrub, pan, and zoom controls
  • Real-time preview synchronized with the model
| Action | Shortcut | Description |
| --- | --- | --- |
| Select keyframe | Click | First click selects, second click opens the inline edit panel |
| Cycle curve type | Ctrl+Click | Cycles: Linear → Bezier → Spring → Step |
| Add/Remove keyframe | Diamond button | Toggle: adds a keyframe if none exists at the playhead, removes it if one does |

Keyframe icons indicate curve type at a glance:

  • ◇ Diamond (sky blue) — Linear interpolation
  • ● Circle (emerald) — Cubic Bezier
  • ◆ Rounded diamond (violet) — Spring
  • ■ Square (amber) — Step (instant jump)

New keyframes automatically inherit the curve setting from the previous keyframe on the same track.
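As a mental model of why the curve type matters, here is an illustrative sketch (not the editor's actual code) of sampling a value between two keyframes. The curve stored on the earlier keyframe controls the segment, which is also why inheriting the previous keyframe's curve keeps a track's feel consistent. The smoothstep stands in for a full cubic-bezier evaluator:

```typescript
// Illustrative sketch: sampling between two keyframes by curve type.
// The curve on the EARLIER keyframe governs the segment between them.
type CurveType = "linear" | "step" | "bezier";
type Keyframe = { time: number; value: number; curve: CurveType };

function sample(a: Keyframe, b: Keyframe, t: number): number {
  const u = (t - a.time) / (b.time - a.time); // normalized progress, 0..1
  switch (a.curve) {
    case "step":
      return a.value; // hold the value until the next keyframe (instant jump)
    case "linear":
      return a.value + (b.value - a.value) * u;
    case "bezier": {
      // smoothstep as a stand-in for a full cubic-bezier evaluator
      const s = u * u * (3 - 2 * u);
      return a.value + (b.value - a.value) * s;
    }
  }
}
```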

Modifiers are procedural effects that transform keyframe values — inspired by After Effects expressions. They run on top of your keyframe animation, adding organic movement without additional keyframes.

Two ways to add a modifier to a parameter track:

  1. Timeline: Click the [+] button next to the track label, or click a modifier badge to edit it
  2. Inspector: Select a keyframe on the track, then use the Add dropdown in the Modifiers section
| Modifier | Badge | Effect | Key Parameters |
| --- | --- | --- | --- |
| Wiggle | [W] | Perlin-noise offset: organic random jitter | frequency (Hz), amplitude |
| Drift | [D] | Smooth wandering: slow organic drift around the center value | radius, speed |
| Sine | [S] | Pure sine wave: regular periodic oscillation | amplitude, speed (Hz), phase |
| Pulse | [P] | Random-interval impulse: occasional bursts | peak offset, interval range, duration |
| Spring | [K] | Damped oscillation: shock → recover bounce | initial offset, stiffness, damping |
| Loop | [L] | Repeats keyframes: cycle or ping-pong | mode (cycle / ping-pong) |
| Noise | [N] | Simplified Wiggle: fixed-amplitude noise | amplitude, speed |
| Clamp | [C] | Value limiter: keeps output in a safe range | min, max |

Modifiers chain after keyframe interpolation. At each frame:

Keyframe value (interpolated) → Modifier 1 → Modifier 2 → ... → Final value
  • Wiggle, Drift, Sine, Noise add an offset to the keyframed value
  • Loop modifies the sample time (runs before interpolation)
  • Clamp restricts the output range
  • Pulse and Spring are triggered effects that decay over time
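The chain above can be sketched in a few lines. This is an illustrative model, not the engine's implementation: each modifier is a function from (value, wall-clock time) to value, applied in order after keyframe interpolation. A Sine modifier adds an offset and a Clamp restricts the result:

```typescript
// Sketch of the per-frame modifier chain (names hypothetical):
// Keyframe value (interpolated) → Modifier 1 → Modifier 2 → ... → Final value
type Modifier = (value: number, wallTimeSec: number) => number;

// Offset modifier: adds a sine-wave oscillation to the keyframed value
const sine = (amplitude: number, hz: number): Modifier =>
  (v, t) => v + amplitude * Math.sin(2 * Math.PI * hz * t);

// Limiter modifier: keeps the final output in a safe range
const clamp = (min: number, max: number): Modifier =>
  (v) => Math.min(max, Math.max(min, v));

function applyChain(keyframeValue: number, wallTimeSec: number, chain: Modifier[]): number {
  return chain.reduce((v, mod) => mod(v, wallTimeSec), keyframeValue);
}
```

Note the chain samples wall-clock time, which is what makes looping clips look slightly different on every iteration.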

Modifiers use wall-clock time, not clip time. This means:

  • Loop clips: Each loop iteration gets different Wiggle/Drift offsets — the animation never looks exactly the same twice
  • Release: When a clip enters release, modifiers stop and the value smoothly fades back to baseline
  • Hold-then-release: During the hold phase (bound-to-trigger), modifiers continue running

Each track has a Blend Mode that controls how its output combines with other active clips:

| Mode | Behavior | Use Case |
| --- | --- | --- |
| Override (default) | Higher-priority clips replace lower ones | Expression animations, action clips |
| Add | Value is added on top of other clips’ output | Breathing, sway (should persist during emotion clips) |
| Multiply | Value multiplies other clips’ output | Special effects |

Set blend mode to Add for tracks that should layer (like breathing) — they’ll continue even when higher-priority emotion clips are active.
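A minimal sketch of how one parameter's final value could be assembled from active tracks, assuming hypothetical types (the engine's real combination logic may differ in details like ordering):

```typescript
// Illustrative sketch: combining multiple tracks writing one parameter.
type Blend = "override" | "add" | "multiply";
type TrackOutput = { value: number; blend: Blend };

function combine(base: number, outputs: TrackOutput[]): number {
  let result = base;
  for (const o of outputs) {
    if (o.blend === "override") result = o.value; // replace what's below
    else if (o.blend === "add") result += o.value; // layer on top
    else result *= o.value;                        // scale
  }
  return result;
}
```

For example, an emotion clip overriding a mouth parameter to 0.8 while an Add-mode breathing track contributes 0.1 yields 0.9: the breathing layer survives the override.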

Two special virtual parameters let you control the model’s gaze direction:

| Parameter | Range | Description |
| --- | --- | --- |
| `__focus_x__` | [-1, 1] | Horizontal gaze direction (left ↔ right) |
| `__focus_y__` | [-1, 1] | Vertical gaze direction (down ↔ up) |

These appear in the parameter list under the Virtual group. When written, they feed into the engine’s focusController, which automatically drives the head angle, body angle, and eyeball parameters based on the model’s built-in coupling.

Virtual Focus + Drift = Natural Gaze Wandering


Create a clip with __focus_x__ and __focus_y__ tracks, add Drift modifiers, and assign it to a Base Behavior — the model will naturally look around with smooth, organic eye movement.

Use the Template button in the Clips tab → Virtual Focus to set this up in one click.
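To build intuition for what this template produces, here is a hedged sketch of drift-style gaze wandering. The two detuned low-frequency sines are an assumption standing in for the real Drift modifier, and the function name is hypothetical; the point is that slowly varying values written to `__focus_x__` / `__focus_y__` make the focusController move the head and eyes smoothly:

```typescript
// Illustrative sketch: slow, smooth gaze targets for __focus_x__ / __focus_y__.
// Two detuned sines approximate the Drift modifier's organic wandering.
function driftFocus(tSec: number, radius = 0.6): { x: number; y: number } {
  const x = radius * 0.5 * (Math.sin(0.23 * tSec) + Math.sin(0.11 * tSec + 1.7));
  const y = radius * 0.5 * (Math.sin(0.17 * tSec) + Math.sin(0.29 * tSec + 0.4));
  // Keep output inside the virtual parameter range [-1, 1]
  return { x: Math.max(-1, Math.min(1, x)), y: Math.max(-1, Math.min(1, y)) };
}
```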

The Template dropdown in the Clips tab provides one-click presets:

| Template | What It Creates |
| --- | --- |
| Virtual Focus | `__focus_x__` + `__focus_y__` tracks with Drift modifiers (loop clip) |
| Body Sway + Breathing | ParamBodyAngleZ + ParamBreath tracks with Wiggle modifiers (loop clip) |

Templates create the clip only — assign it to a Base Behavior or Rule to activate it.

Rules connect your agent’s state to clips. When the agent’s state matches a rule’s condition, the associated clips play.

Each rule has:

  • Condition — When to trigger (emotion, action, tool call, or idle)
  • Clip Groups — Which clips to play when triggered
  • Play Mode — Single clip, all clips, or random selection
| Type | Trigger | Example |
| --- | --- | --- |
| Expression | emotion + action combination | emotion = happy, action = speaking |
| Tool Call | Agent uses a specific tool | web_search, manage_memory |
| Idle | Agent is not doing anything | Background animation |

Both emotion and action support an “Any” wildcard that matches regardless of that dimension.

All rules whose conditions are satisfied fire simultaneously — not first-match-wins. If multiple rules match, their clips all play in parallel.

Rule 1: emotion=angry, action=Any → play "angry" clip
Rule 2: emotion=Any, action=speaking → play "talk" clip
Rule 3: emotion=happy, action=Any → play "smile" clip
Rule 4: idle → play "breathing" clip

If the agent is angry and speaking, both Rule 1 and Rule 2 match — the “angry” clip and the “talk” clip play together.
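The all-match evaluation can be sketched as follows, using illustrative types (an `undefined` field models the “Any” wildcard; the idle rule is omitted since it triggers on inactivity rather than on emotion/action state):

```typescript
// Sketch of all-match rule evaluation: every satisfied rule contributes
// its clip, rather than stopping at the first match.
type Cond = { emotion?: string; action?: string }; // undefined = "Any"
type Rule = { cond: Cond; clip: string };

function matchingClips(
  state: { emotion: string; action: string },
  rules: Rule[]
): string[] {
  return rules
    .filter(r =>
      (r.cond.emotion === undefined || r.cond.emotion === state.emotion) &&
      (r.cond.action === undefined || r.cond.action === state.action))
    .map(r => r.clip);
}
```

Running this against the three expression rules above with state `{ emotion: "angry", action: "speaking" }` returns both "angry" and "talk".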

When multiple clips write the same parameter, conflicts are resolved by clip priority (set per clip, not per rule):

| Priority | Level | Use Case |
| --- | --- | --- |
| 0 | Idle | Background breathing, subtle movement |
| 1 | Normal | Default for most clips |
| 2 | Emotion | Expression-triggered clips |
| 3 | Force | Highest, overrides everything |

Higher priority clips win. Within the same priority, blend mode determines how values combine (Add layers on top, Multiply scales, Overwrite replaces).
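Putting priority and blend mode together, conflict resolution for a single parameter could look like this sketch (types and ordering are assumptions, not the engine's exact algorithm): the highest-priority Overwrite clip sets the base, and Add/Multiply clips layer on top of whatever won.

```typescript
// Illustrative sketch: resolving several clips writing one parameter.
type ActiveClip = { value: number; priority: number; blend: "overwrite" | "add" | "multiply" };

function resolveParam(clips: ActiveClip[], base = 0): number {
  // The highest-priority Overwrite clip wins the base value (if any exists)...
  let result = base;
  let winner = -Infinity;
  for (const c of clips) {
    if (c.blend === "overwrite" && c.priority > winner) {
      winner = c.priority;
      result = c.value;
    }
  }
  // ...then Add / Multiply clips layer on top of the winning value.
  for (const c of clips) {
    if (c.blend === "add") result += c.value;
    else if (c.blend === "multiply") result *= c.value;
  }
  return result;
}
```

For example, an idle breathing clip (Add, priority 0) still contributes its offset when an emotion clip (Overwrite, priority 2) replaces a neutral clip (Overwrite, priority 1).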

Idle behaviors are long-lived background animations that play when no other rule matches. Unlike regular rules, they loop continuously and blend with other animations.

Base behaviors run in both idle and active states, providing a persistent animation layer (like subtle breathing or blinking).

Live2D animation mapping works alongside the broader Expression Settings system:

  1. Vocabulary (Expression Settings → Vocabulary tab) — Define the emotion and action words your agent can express
  2. Mapping Rules (Live2D Settings → Mapping tab) — Map those emotions/actions to Live2D clips
  3. Agent Output — During conversation, the LLM outputs emotions and actions that automatically trigger matching rules

The emotion/action vocabulary you define in Expression Settings becomes the available conditions in your Live2D mapping rules.

A simple setup for an agent that smiles when happy and looks thoughtful when thinking.

Step 1: Import model (Model tab)

  • Upload your Live2D model ZIP

Step 2: Set idle motion (Parameters tab)

  • Choose a gentle breathing/blinking motion as default idle

Step 3: Create clips (Mapping tab → Clips)

  • “happy” → ParamEyeLOpen: 0.7, ParamMouthForm: 0.8 (squinting smile)
  • “thinking” → ParamAngleX: -8, ParamEyeLOpen: 0.5 (head tilt, half-closed eyes)
  • “speaking” → ParamMouthOpenY cycling 0 → 0.6 → 0 (mouth movement loop)

Step 4: Create rules (Mapping tab → Rules)

  • emotion = happy → play “happy” clip
  • emotion = thinking → play “thinking” clip
  • action = speaking → play “speaking” clip (loop)

Step 5: Test

  • Use the sequence tester to verify each rule triggers correctly

Animate the agent differently when using different tools.

Create clips:

  • “search” → ParamEyeLOpen: 1.0, ParamBrowLY: 0.3 (wide eyes, raised brows)
  • “write” → ParamAngleY: -5, ParamEyeLOpen: 0.6 (looking down, focused)

Create rules:

  • tool_call = web_search → play “search” clip
  • tool_call = manage_memory → play “write” clip

Quickly create animations from your model’s bundled expressions.

  1. In Mapping tab, click Create From Expression
  2. Select an expression preset (e.g., “smile”)
  3. Set timing: Enter 300ms, Hold 2000ms, Release 500ms
  4. The system auto-creates a clip with smooth parameter transitions
  5. Create a rule mapping emotion = happy → the imported clip
  6. Repeat for other expressions

Once your Live2D setup is complete:

  1. Install the AnySoul desktop app
  2. Enable Desktop Pet mode
  3. Your Live2D model renders in a transparent overlay on your desktop
  4. All animation rules work exactly the same — emotions, actions, and tool calls drive the pet’s animations in real-time

The Default (Idle) section at the bottom of the Mapping tab lets you set baseline parameter values. These values are applied when no clips or rules are active.

Use this to set a neutral expression — slightly open eyes, relaxed mouth, centered head position — that serves as the “home” state your animations depart from and return to.