VTS Starter Guide: How to load Live2D Models into VTube Studio
So you’ve got yourself a stunning Live2D model and can’t wait to debut it on stream. This walk-through will teach you how to import the model into VTube Studio, calibrate face tracking for lively expressions, and map your favorite hand-crafted poses to a single key-press. I will use a free premade model as an example to explain the steps. The whole setup takes less than thirty minutes. Let's start. :3
Free premade model information:
Illustrator: CB (@__ceeb__)
Rigger: patcha (@pat__cha)
Importing Your Model
1. Install VTube Studio from Steam and launch it.
2. Open the side panel (hover over the left edge, and double-click if it has disappeared) and click "Change VTS Model".
3. Press the IMPORT button that appears; a file window pointing to the software's model folder opens.
4. Drag the entire folder you received from the rigger (it must contain the .model3.json file) into that window. If the new model doesn't show up immediately, close and restart VTube Studio; your avatar should then appear.
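For reference, a typical Live2D model folder looks roughly like this. The file names below are illustrative (every rigger organizes things a little differently); the important part is that the .model3.json file sits inside and points at the other files:

```
MyModel/
├── MyModel.model3.json    ← the file VTube Studio looks for
├── MyModel.moc3           ← the rigged model data
├── MyModel.physics3.json  ← physics settings (hair/clothes sway)
├── MyModel.4096/
│   └── texture_00.png     ← the model's textures
└── expressions/
    └── Glasses.exp3.json  ← toggleable expressions
```

If the .model3.json is missing, ask your rigger for the full export rather than moving files around yourself.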
Face-tracking Tune-up
A beautiful model only comes alive when it copies your grin and eyebrow raises accurately.
1. While the model is on screen, double-click (or double-tap on mobile) to reveal the menu.
2. Choose General Settings, then open the "Webcam, Tracking, Video & Visual Settings" tab.
Webcam: Select your camera and resolution; 720p is the sweet spot for most laptops, at whatever frame rate your camera supports (60 fps if available).
Tracking Sensitivity
• Blink: Start around 25–40; lower values leave the eyes half open.
• Mouth: Start around 25–40; overdoing it makes the lips flutter when you simply breathe.
• Smile: 40–60 makes the model more approachable; adjust freely to suit the model's style.
• Eyes: 55–70 gives quick winks without accidental half-blinks.
• Brow: 55–70; a flexible brow makes expressions more vivid.
Then sit up straight, look at the camera, hold a natural expression for three seconds, and release. Done.
Install the VTube Studio phone app, connect via Wi-Fi, and select ARKit/USB for smoother jaw and eyebrow data—great for walking streams.
To set this up: first, download the VTS app on your phone and connect it to the same Wi-Fi network as your PC. Open the settings page and you will see a page like the image down below. Then find the IP address and port number shown on the same page in VTube Studio on your PC. Enter that IP address and port number on your mobile device and tap "Connect to PC". Good job! Your phone camera is now feeding the face tracking.
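If the phone can't find your PC, the usual culprit is that the two devices aren't on the same network. This small Python sketch prints the LAN address your PC is actually using, so you can compare it with the IP shown in VTube Studio (this is a generic stdlib trick, not part of VTube Studio itself):

```python
import socket

def local_ip() -> str:
    """Return the LAN IP the OS would use for outbound traffic.

    "Connecting" a UDP socket sends no packets; it only asks the OS
    which local interface/address would route toward the target.
    """
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as s:
        s.connect(("8.8.8.8", 80))  # any routable address works here
        return s.getsockname()[0]

print(local_ip())
```

The printed address should match the IP VTube Studio displays; if your phone's Wi-Fi address starts with a different prefix (e.g. 192.168.0.x vs. 192.168.1.x), the devices are on different networks.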
Tip: Enabling streaming mode will make the face tracking more stable and responsive.
One-key Fixed Expressions
Most riggers ship extra poses: sunglasses, heart eyes, sweat-drop, etc. Mapping them to a hot-key keeps your hands free for streaming.
1. Double-click to open the menu again.
2. Find Settings and click Hotkeys & Expressions.
3. First time only: hit "Create New Hotkeys for All Unused Expressions". The software lists every pose it found.
You'll find JSON files in the folder provided by your rigger. Most of the time, each JSON file represents an expression or action.
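As a reference, an expression file is plain JSON. A minimal, illustrative example that toggles a "glasses" parameter might look like the snippet below; the parameter ID ("ParamGlasses" here) is an assumption and depends entirely on how your model was rigged, so don't edit these files unless you know your model's parameter names:

```json
{
  "Type": "Live2D Expression",
  "Parameters": [
    { "Id": "ParamGlasses", "Value": 1, "Blend": "Add" }
  ]
}
```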
In the example in the picture, suppose you want your avatar to put on the glasses when you press Numpad 8 on your keyboard: find the "Glasses" entry, click the tiny RECORD button, then press the Numpad 8 key. The field updates instantly.
Close settings and tap Numpad 8 at any time; the glasses will pop on until you tap once more.
Repeat for the rest of your triggers (angry, blush, dejected). Keep the shortcuts close together so you can hit them without looking away from the monitor.
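For the curious: VTube Studio also exposes a public WebSocket API (enabled in settings, served on ws://localhost:8001 by default) that can trigger these same hotkeys from external tools like a Stream Deck alternative. As a hedged sketch, here is what a hotkey-trigger request payload looks like; the hotkey ID below is a placeholder, and a one-time authentication handshake (not shown) is required before VTube Studio will accept the request:

```python
import json

def hotkey_request(hotkey_id: str, request_id: str = "trigger-1") -> str:
    """Build a HotkeyTriggerRequest payload for VTube Studio's WebSocket API."""
    return json.dumps({
        "apiName": "VTubeStudioPublicAPI",
        "apiVersion": "1.0",
        "requestID": request_id,
        "messageType": "HotkeyTriggerRequest",
        # The hotkey ID comes from your own hotkey list; this one is made up.
        "data": {"hotkeyID": hotkey_id},
    })

print(hotkey_request("MyGlassesHotkeyID"))
```

You would send this string over a WebSocket connection to VTube Studio after authenticating; for keyboard-only setups, the in-app hotkeys above are all you need.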
That’s it—you’re ready to hit START VIRTUAL CAMERA and add that source to OBS, Zoom, or Discord. Happy streaming!