This page exists within the Old ArtZone Wiki section of this site.

Tutorial 1: Getting Started in Mimic for Lightwave

Author: Anonymous

Tools Needed

  • Mimic 2 Pro for Lightwave
  • Lightwave


This tutorial explains how to set up your character for use in Mimic with the BasicTemplate DMC. This is not the only way to set up your character. Morphs can be named whatever you want them to be, but by following the conventions set in this tutorial you will be able to load your character into Mimic and have it working in seconds.

Step 1 - Modeler: Setting Up the Character



The first steps for setting up a character to be used with Mimic happen in Modeler. We're going to work with a basic female human head for this tutorial.

There are two fundamental steps here: one is to set up endomorphs for the different mouth positions (phonemes), gestures, and expressions; the other is to set up the correct bone structure and weight map to animate head-movement gestures.

Step 2 - Endomorphs

There are 11 phonemes in the basic configuration of Mimic.



The list below shows the morph targets created for our character in Lightwave. If you name your morphs exactly as shown here, they will automatically associate themselves with the phonemes, gestures and expressions.

The following pages illustrate the different morphs created for this character and what phoneme, gesture or expression each is applied to.


Note: Lightwave displays morphs in alphabetical order. This list has been rearranged for the purposes of this tutorial.

Step 3 - Phonemes



1) Make sure that your open-mouth phonemes, like the 'AA' sound, incorporate jaw movement as well. Keep in mind that the upper teeth should never move. Think in terms of the underlying skeletal structure and rotate your lower teeth around the pivot point where the jaw would rotate, just under the ears.
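The jaw-pivot idea can be sketched as a simple rotation about a fixed point. This is a minimal Python illustration of the geometry only, not anything from Lightwave or Mimic, and the profile coordinates are hypothetical numbers invented for the example:

```python
import math

def rotate_about_pivot(point, pivot, angle_deg):
    """Rotate a 2D (y, z) profile point around a pivot, the way a jaw
    hinges just under the ear. Angle is in degrees."""
    a = math.radians(angle_deg)
    dy, dz = point[0] - pivot[0], point[1] - pivot[1]
    return (pivot[0] + dy * math.cos(a) - dz * math.sin(a),
            pivot[1] + dy * math.sin(a) + dz * math.cos(a))

# Hypothetical profile coordinates (y = height, z = depth), not from any real mesh:
jaw_pivot   = (1.55, -0.05)   # roughly under the ear
lower_tooth = (1.45,  0.10)   # a point on the lower teeth: rotates with the jaw
upper_tooth = (1.47,  0.10)   # a point on the upper teeth: never moves

opened = rotate_about_pivot(lower_tooth, jaw_pivot, -12)  # 12 degrees of jaw drop
```

Because the lower teeth rotate rather than translate, their distance from the pivot never changes, which is exactly what keeps an open-mouth morph from looking like the jaw is sliding.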

2) When you create morphs for hard consonants like the 'K' sound, you may want to add some neck constriction, pushing inward to simulate the muscles closing the throat. If you have a male character with an Adam's apple, move it upwards as well.

Step 4 - Gestures and Expressions





Step 5 - Bones and Weightmaps

There are a couple of additional steps you should add to your rigging process to take advantage of Mimic's gestures track. The gestures track is where Mimic applies a somewhat random animation to automatically generate blinks, as well as eyebrow and head movement.

Without Mimic, the bone setup on the left might be an appropriate rig for the head and neck, but we're going to split the neck into two bones, one for normal animation and one specifically for Mimic.


The reason for doing this is that the animation data Mimic applies to this bone won't overwrite any keyframes you may have set in Lightwave. So the key is to animate only the lower neck bone in Lightwave, and reserve the upper bone for Mimic-specific animation.
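The split-neck idea can be sketched numerically. In this illustrative Python model (an assumption for clarity, not Lightwave's actual bone system), the head's final pitch is simply the sum of the two bones' rotation channels, so keys on one channel never overwrite keys on the other:

```python
def head_pitch(frame, lower_keys, mimic_keys):
    """Combined head pitch from two independent bone channels.
    Each channel is a {frame: degrees} dict; we use a nearest-previous-key
    lookup as a stand-in for real curve interpolation."""
    def value_at(keys, f):
        prior = [v for (kf, v) in sorted(keys.items()) if kf <= f]
        return prior[-1] if prior else 0.0
    return value_at(lower_keys, frame) + value_at(mimic_keys, frame)

# Hypothetical keyframe data:
lower = {0: 0.0, 30: 15.0}   # your own Lightwave keyframes on the lower neck bone
mimic = {0: 0.0, 10: -3.0}   # Mimic-generated nods on the upper 'mimic' bone
```

At frame 30 the head carries both your 15-degree keyframed turn and Mimic's 3-degree nod, because the two channels live on separate bones.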

The next step is to create a weightmap named 'mimic', the same name as the special neck bone we created. This weightmap defines which polygons deform with the bone that is being animated in Mimic. Keep in mind that you don't have to associate this weightmap with any bones in your rig in Lightwave; it's just there so that you can see the motion of the bone in the Mimic interface.


Note: There are many different philosophies regarding rigging. Your personal rigging style may vary, but the key issues that transcend individual rigs are the creation of a special bone just for Mimic animation, and the creation of a weightmap to associate with that bone.

Step 6 - Layout: Applying Mimic to a Character

Mimic is an object displacement plug-in, so it is applied to your character in the deform tab of the object properties palette. To launch the Mimic interface, double click 'DAZMimic Pro' when it appears in the displacement list. Once you launch Mimic, the Morphmixer displacement will automatically be added to the character as well.



Step 7 - The Session Manager

The Session Manager is where you load the audio that Mimic will analyze to create the lip-sync. You can also enter (or load from a text file) the words that are being spoken to help Mimic in the analysis. The trick to using this feature is to spell phonetically, the way the words sound (for example, 'nite' rather than 'night').

Additionally, the Session Manager allows you to import a video file that you can place conveniently beside your 3D character for visual reference. If you choose, you can use the audio from a video file instead of importing a separate WAV file.


Configuration File (DMC)

Configuration files are pre-configured preferences for how phonemes, gestures and expressions are set up. In the Base version, you can only load configuration files, but in the Pro version, you can alter these preferences and save your own custom configurations for later use. All of the phonemes, gestures and expressions in this tutorial are part of the 'Basic Template' DMC provided with Mimic.

Clicking 'OK' will begin the sound analysis and take you to the Mimic interface.

Step 8 - The Interface


The interface consists of the display window, the video window, the timeline, and the phoneme, gesture and expression lists. The timeline is very similar to most non-linear editing systems. Hitting the play button will play the animation in real time based on the specified frame range. You can make edits while the animation plays and see immediate feedback. This is especially useful when you specify a narrow range of frames around a particular phoneme (or group of phonemes) that you are trying to get just right. Additionally, the timeline allows you to edit the animation curves of the gestures.


Hint: If you zoom in on just the part of the head you need to see animated, and then click the closed-eye button in the display window, the rest of your figure will be hidden, greatly speeding up the OpenGL preview.

Each of the three groups of animation channels (phonemes, gestures and expressions) has its own unique attributes:

1. Phonemes - These are generated by analyzing the audio imported into the Session Manager and are automatically placed in the timeline.

2. Expressions - These allow you to layer additional facial expressions onto your character. The animation is generated manually, by dragging an expression from the expressions list into the timeline.

3. Gestures - These are pre-defined to create animation for eyebrows, blinks, and head motion (which is controlled through a bone instead of morphs). Unlike Expressions, the animation for these channels is generated by Mimic, just like phonemes. The animation that Mimic generates is somewhat random, but effectively adds a spark of life to a speaking character with very little work.

(Note: there are also gestures for the eyes, intended to counteract the motion of the head so the eyes continue looking at the same spot. However, a very popular rigging technique is to target the eye bones to a 'look at' target, which basically makes this feature of Mimic unnecessary.)
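The "somewhat random" gesture generation can be pictured with a small sketch. This is only an assumption about the general approach (jittered timing around a mean interval), not Mimic's actual algorithm, and every number here is invented for illustration:

```python
import random

def generate_blinks(num_frames, fps=30, mean_gap=3.0, seed=1):
    """Place semi-random blinks on a morph channel: roughly one every
    mean_gap seconds, jittered by +/-50%. Returns (frame, strength) keys
    forming a quick close and a slightly slower open for each blink."""
    rng = random.Random(seed)
    keys, f = [], 0
    while True:
        f += int(rng.uniform(0.5, 1.5) * mean_gap * fps)  # jittered gap
        if f >= num_frames:
            break
        keys += [(f - 2, 0.0), (f, 1.0), (f + 3, 0.0)]    # close fast, open slower
    return keys

blink_keys = generate_blinks(300)  # ten seconds at 30 fps
```

Even this crude jitter avoids the dead, metronomic look of perfectly regular blinks, which is the "spark of life" the gestures track is after.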

Step 9 - The Timeline

Mimic does all the work of analyzing your sound file and placing the proper phonemes and gestures in the timeline, but you are still able to modify the results.

In the timeline, you can select phonemes, drag them around, move the in and out points, and adjust their strength. Moving a phoneme's position and its in and out points is done simply by clicking and dragging. To adjust the strength of a clip, select it and alt(option)-drag on the red bar. If you have multiple clips selected at once, all clips will adjust to the same strength.


Gestures work more like Lightwave's graph editor. Clicking on keys will allow you to move them up and down, or forward and backward in time. Clicking on the curve where there is no key will create a new one.

Expressions are dragged into the timeline manually, as previously discussed. Once the clips are in the timeline, they can be edited the same way as phonemes. One important note on expressions is that the morphs associated with expressions interpolate between clips, so you should place the 'neutral' expression at the beginning and end of any expressions you place in the timeline.
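Why the 'neutral' bookends matter can be shown with a toy interpolation model. This is a simplified sketch of clip-to-clip blending, not Mimic's actual code, and the clip data is hypothetical:

```python
def morph_weight(frame, clips):
    """Linearly interpolate one expression morph's weight between clip keys.
    clips: sorted list of (frame, weight) pairs; values hold flat outside
    the first and last clip."""
    if frame <= clips[0][0]:
        return clips[0][1]
    if frame >= clips[-1][0]:
        return clips[-1][1]
    for (f0, w0), (f1, w1) in zip(clips, clips[1:]):
        if f0 <= frame <= f1:
            t = (frame - f0) / (f1 - f0)
            return w0 + t * (w1 - w0)

# A lone 'smile' clip at frame 50 bleeds across the whole shot,
# because there is nothing for it to interpolate toward:
smile_only = [(50, 1.0)]
# Neutral clips at the ends make it fade in and back out:
bookended  = [(0, 0.0), (50, 1.0), (100, 0.0)]
```

With the bookended version the smile ramps from nothing at frame 0, peaks at frame 50, and is gone again by frame 100; without the neutral clips the character smiles for the entire timeline.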


If a phoneme that Mimic generated is incorrect, you can easily change it by selecting the correct phoneme in the phoneme list and right-clicking on the phoneme in the timeline. Select 'replace with palette selection' from the popup menu that appears. You can also simply drag phonemes from the phoneme list to the timeline.