June 2025 Product Updates Webinar Recap: Avatar IV, Gesture Control, Product Placement, and More
The latest HeyGen product updates that give you full creative control
Joyce Kei
June 26, 2025 Webinar - Product Update: Tools for Next-Level AI Video
Webinar Guests:
Nik Nolte, Head of AIS @HeyGen
Adam Halper, Product Manager @HeyGen
The era of “good enough” AI video is officially over. In our latest Product Updates Webinar, Nik Nolte (Head of AIS @HeyGen) and Adam Halper (Product Manager @HeyGen) introduced a powerful lineup of new features built to give creators more control, expression, and quality than ever before.
🎥 If you missed the live session, watch a replay of the full webinar below, or read on for a quick recap of everything we announced—and why it matters for your workflow.
What’s New at HeyGen
1. Avatar IV – Now Available in Studio
HeyGen’s most advanced avatar model is now integrated directly into Studio. That means:
Full-body, audio-driven motion for more natural, expressive performances.
Compatible with public avatars and your own trained looks.
Studio features like captions, B-roll, and voice direction now work seamlessly with Avatar IV.
Pro tip: Use a high-resolution image with a slightly open mouth for the best results.
2. Gesture Control for Video Avatars
Custom gestures are now easier than ever. Record specific gestures during your avatar training (like a thumbs up or wave), then assign them to exact words in your script with real-time preview and fine-tuning.
Great for: UGC creators, educators, and anyone looking to add precise motion cues to video.
3. Product Placement Workflows
You can now feature branded products naturally in your content. Here are two ways:
Shortcut Tool: Upload a product and a person → generate a UGC-style product video instantly.
Avatar Tab: Train a product once, then prompt your avatar to wear, hold, or interact with it in flexible ways.
Perfect for showcasing outfits, logos, or props within a branded video scene!
4. Custom Motion Prompting
Custom motion prompting is now available in Avatar IV, letting you control gestures, expressions, and even eye contact through text prompts.
Tip: Keep prompts short for best results (e.g. “waving,” “sipping coffee”).
This is a Beta feature. It’s still improving, and we want your feedback!
5. Voice Engine Upgrades: Meet Fish
HeyGen now supports Fish, a new premium voice engine built for:
Greater naturalness and emotion
Better voice clone similarity
Improved accent retention
Available for all new voice clones today, with support for existing clones rolling out soon.
6. Voice Director
Direct your voice like a pro. This new feature lets you control:
Emotion (e.g. excited, calm, angry)
Tone and pacing
Expression presets or custom prompts
This works especially well with Avatar IV since motion is synced with the audio emotion.
7. Google Veo in Add Motion
Google’s Veo model is now live in Add Motion! This is perfect for:
Creating dynamic, action-focused clips (like lifting weights or dancing).
Full-body and background motion scenes.
What to choose: Google Veo is best used for expressive motion. For lip-synced dialogue, Avatar IV is still the go-to.
8. Sneak Peek: HeyGen Video Agent
Finally, we previewed our biggest leap yet… HeyGen’s upcoming Video Agent.
An AI-powered creative operating system that acts as your scriptwriter, editor, and director.
Generate entire campaigns from prompts or raw footage, in minutes.
Bonus: Avatar IV Quick Tips
Make the most of HeyGen’s new expressive avatar model with these quick tips:
Use a high-resolution image with a slightly open mouth (not fully closed or wide open)
Choose a photo expression that matches your intended tone — avoid “cheese” smiles unless you want high energy
Motion is driven by voice — use expressive audio for better realism
You can upload, record, or mirror audio to match your avatar’s tone
Custom motion prompting supports gestures, facial expressions, and eye contact
Keep gesture prompts short to avoid looping or overuse
Prompts like “sipping coffee” can work even without visible props in the source image
Results are non-deterministic — try again if the first output isn’t ideal
Use the “More Expressive” toggle for more dynamic movement; turn off if it feels over-animated
Enable the Prompt Refinement toggle to interpret simple phrases like “smiling while talking”
Final Takeaway
HeyGen is building the future of AI video, one update at a time. These new features were designed to help creators move faster, tell richer stories, and get pixel-perfect control over their output.