mvntSTUDIO: An AI Dance Generator by K-POP Choreographers—Right Idea, Early Days
2026-02-28 | ProductHunt | Official Website | Hacker News

Interface Breakdown: Paste a YouTube link on the left, and preview a 3D avatar dancing in real-time on the right. The interaction is highly intuitive—drop a link, wait 3 minutes, and a choreography is born.
30-Second Quick Judgment
What it does: Give it a song (YouTube link or audio file), and the AI automatically generates a 3D dance animation. It’s not just using templates; it "originates" choreography based on the music's rhythm.
Is it worth watching?: Worth bookmarking, but don't spend too much time on it just yet. The product is a v0.1 public beta, and even MP4 downloads aren't ready. The real interest lies in the team—choreographers for BTS/Seventeen + modern dance entrepreneurs, backed by Epic MegaGrants and NVIDIA Inception. This isn't just a casual side project.
Three Questions That Matter
Is it relevant to me?
Target User Personas:
- Short Video Creators: Want to add unique dance animations to TikTok/Reels.
- K-POP / Dance Enthusiasts: Want to visualize the choreography in their heads.
- Game/Animation Developers: Need 3D animation assets for character dances.
- Professional Choreographers: Use AI for creative brainstorming and rapid prototyping.
Am I the target user? If you create short video content, work in 3D animation, or do anything related to dance/K-POP, this is for you. If you're a general developer or PM, it's mostly for "eye-opening" value.
Is it useful to me?
| Dimension | Benefit | Cost |
|---|---|---|
| Time | Generate choreo in ~3 minutes; traditional workflows require hiring choreographers plus mocap gear | Current quality is limited; may require post-editing |
| Money | Completely free for now | Future pricing is unknown |
| Effort | Just drop a YouTube link; zero barrier to entry | Very early stage; missing features (no MP4 download, no finger/facial movement) |
ROI Judgment: At this stage, play with it for the experience, but don't rely on it for formal projects. However, if you're in dance/animation, it's worth joining their community—this team's background suggests the product will iterate significantly.
Is it fun to use?
The "Wow" Factors:
- Zero-Barrier Choreography: Paste link → 3 mins later → Result. The experience itself is quite magical.
- Image to Dance: Upload any character image and make it dance (powered by Tripo AI v3). People are making One Punch Man and SpongeBob dance; the community gallery is full of these quirky creations.

Community Gallery: Users making characters like Patrick, Baby Shark, and anime icons dance. This "meme-like" playfulness is the product's biggest draw right now.
Founder's Quote:
"We simply thought it'd be just purely fun to open them like a public testbed, share the process, and play them around together as a community." — @jooooooooonjung (Source)
For Independent Developers
Tech Stack
- Core Model: mvnt-m4, a proprietary diffusion-based music-to-motion model.
- Infrastructure: EDGE++, improved from Stanford's EDGE (CVPR 2023).
- Music Encoding: Frozen Jukebox model extracts features → transformer-based diffusion model generates 5-second dance segments.
- 3D Character Rendering: Tripo AI v3 (for Image to Dance).
- Training Data: Proprietary mocap dataset, significantly larger than AIST++ (5.2 hours, 10 dance styles), sourced from professional choreographers.
- Deployment: Web-based, WIP migration to g7e instances for faster inference.
- Product Form: Browser-based playground.
Core Technical Logic
Simply put: Music → Jukebox extracts rhythm/melody features → Diffusion model "denoises" to generate motion sequences → Rendered into 3D animation.
The innovation of EDGE was replacing GAN/Transformer schemes with diffusion models, which naturally support editing (e.g., specifying a joint's movement or interpolating between keyframes). mvntSTUDIO trained EDGE++ on a much larger dataset for higher quality and multi-style support.
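The pipeline above can be sketched in a few lines. Everything here is illustrative: the shapes, the frame rate, and both "models" are placeholders standing in for the frozen Jukebox encoder and the transformer denoiser (none of this is the real mvnt-m4 or EDGE code).

```python
import numpy as np

SEQ_LEN = 150     # e.g. a 5-second segment at 30 fps (assumed numbers)
POSE_DIM = 72     # e.g. 24 joints x 3 rotation params (assumed)
AUDIO_DIM = 4800  # stand-in for the Jukebox feature width
STEPS = 50        # diffusion denoising steps (assumed)

def extract_audio_features(audio: np.ndarray) -> np.ndarray:
    """Placeholder for the frozen Jukebox encoder: audio -> per-frame features."""
    return np.random.default_rng(0).standard_normal((SEQ_LEN, AUDIO_DIM))

def denoiser(x_t: np.ndarray, audio_feats: np.ndarray, t: int) -> np.ndarray:
    """Placeholder for the transformer denoiser; a real one attends over audio_feats."""
    return x_t * (1.0 - 1.0 / (t + 1))  # here we merely shrink the noise

def generate_dance(audio: np.ndarray) -> np.ndarray:
    feats = extract_audio_features(audio)
    # Start from pure noise and iteratively denoise, conditioned on the music.
    x = np.random.default_rng(1).standard_normal((SEQ_LEN, POSE_DIM))
    for t in reversed(range(STEPS)):
        x = denoiser(x, feats, t)
    return x  # (SEQ_LEN, POSE_DIM) motion sequence, to be retargeted onto a 3D rig

motion = generate_dance(np.zeros(22050 * 5, dtype=np.float32))
print(motion.shape)  # (150, 72)
```

Longer dances would then be produced by generating overlapping 5-second segments and blending them, which is how EDGE handles sequences beyond its training window.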
Open Source Status
- mvntSTUDIO itself: Not open source. The GitHub org (github.com/mvnt) has no public code.
- Base Paper: EDGE is fully open source → GitHub - Stanford-TML/EDGE.
- Similar Projects: LODGE (CVPR 2024, long-sequence generation), DanceRevolution, Bailando.
- Build Difficulty: Medium-High. You can run the EDGE code, but reaching product-grade quality requires massive amounts of high-quality mocap data (the core moat). A demo could take 3-6 months, but data collection is the real hurdle.
Business Model
- Current: Free playground (user validation phase).
- Parent Company MVNT's Path:
  - Emote Publisher: Selling dance IP (K-POP, traditional) to game companies.
  - Zepeto Partnership: Selling virtual dance goods on Naver Z's metaverse platform (20M MAU).
  - Choreography Rights Management: Helping choreographers collect royalties (similar to ASCAP for music).
Big Tech Risk
Risky, but not fatal. AI commentators like Linus Ekenstam have noted Google Labs' AISOMA project is also working on AI choreography. However, this niche has two natural barriers:
- Data Barrier: High-quality mocap requires professional dancers. MVNT's co-founder is a top K-POP choreographer—a resource big tech can't just buy overnight.
- Copyright Barrier: MVNT holds a choreography brokerage license from the Korea Copyright Commission, giving them a first-mover advantage in K-POP IP.
For Product Managers
Pain Point Analysis
- Core Issue: AI content covers text, images, music, and video, but "dance/motion" is almost a vacuum.
- Severity: Medium-frequency need. For creators, adding dance animations currently requires expensive choreographers/mocap or unoriginal templates. It's not a "life or death" pain point yet.
- Deeper Pain: Choreographers create iconic K-POP dances but don't get royalties like songwriters. MVNT wants to change this fundamental industry structure.
User Personas
| User Type | Frequency | Willingness to Pay |
|---|---|---|
| TikTok/Short Video Creators | High | Low-Mid (mostly free) |
| Game Developers (Character Motion) | Mid | High (B2B) |
| K-POP Entertainment Companies | Low freq, high value | High (IP Trading) |
| Professional Choreographers | Low | Low (Tool assistance) |
Feature Breakdown
| Feature | Type | Description |
|---|---|---|
| Music to Dance | Core | YouTube link → 3D dance in < 3 mins |
| Image to Dance | Core | Upload character image → Make it dance |
| Community Gallery | Growth | Showcasing user work, social proof |
| Screen Recording | Basic | Export video (MP4 download WIP) |
| Choreography Assistant | Planned | Tools for professional choreographers |
| MVNT Studio Compose | Planned | B2B motion generation for games/virtual environments |
Competitor Comparison
| Dimension | mvntSTUDIO | DeepMotion ($17/mo) | Krikey AI (Free/$15/mo) | Freebeat |
|---|---|---|---|---|
| Core Capability | Music→Original Choreo | Video→3D Mocap | Template-driven 3D | Music→2D Dance Video |
| Innovation | AI choreographs from scratch | Replicates human motion | One-stop animation | Beat-matched visuals |
| Output Format | 3D Animation (WebGL) | FBX | Video/FBX/GIF | Video |
| Data Source | Licensed Mocap | User-uploaded video | Preset templates | AI Generated |
| Best Scenario | Creative Choreo/K-POP | Game Animation | Education/Marketing | Short Videos |
Key Difference: Most competitors do "motion transfer"—applying existing dances to characters. mvntSTUDIO does "motion generation"—creating new choreography directly from music. This is a fundamental distinction.
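The distinction is easiest to see at the interface level. The signatures below are hypothetical, not any vendor's real API: transfer needs a dance to already exist on video, while generation needs only a song.

```python
import numpy as np

def motion_transfer(video_frames: np.ndarray) -> np.ndarray:
    """DeepMotion-style: extract an EXISTING dance from video and retarget it.

    Input is (frames, H, W, 3) video; output is (frames, pose_dim) joint data
    copied from the performer. Stubbed here with zeros.
    """
    return np.zeros((video_frames.shape[0], 72))

def motion_generation(audio: np.ndarray, fps: int = 30) -> np.ndarray:
    """mvntSTUDIO-style: invent NEW choreography conditioned only on music.

    Input is raw mono audio (22.05 kHz assumed); output length is determined
    by the song, not by any reference performance. Stubbed here with zeros.
    """
    seconds = audio.shape[0] // 22050
    return np.zeros((seconds * fps, 72))

transferred = motion_transfer(np.zeros((90, 64, 64, 3)))  # 90-frame clip in
generated = motion_generation(np.zeros(22050 * 5))        # 5 s of audio in
print(transferred.shape, generated.shape)  # (90, 72) (150, 72)
```

The practical consequence: a transfer tool can never output a dance nobody has performed, which is exactly the gap mvntSTUDIO targets.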
Key Takeaways
- "Playground" Launch Strategy: Don't wait for perfection. Open for testing and let quirky community creations (making any character dance) drive virality.
- UGC Workflow Integration: They explicitly mention exporting dances as motion references for Kling AI or Seedance, positioning themselves as a link in the creative toolchain.

Workflow Strategy: Suno generates music → Tripo generates character → mvntSTUDIO generates dance → Kling AI generates final video. This pipeline positioning is very clever.
For Tech Bloggers
Founder Story
This team has a great narrative:
- Choi Youngjun: A legendary K-POP choreographer, head of TEAM SAME, active since 2010 with 700+ choreographies. He’s worked with BTS, Seventeen, TWICE, and Wanna One. Recipient of the 2020 Prime Minister's Award. He experienced firsthand the "fame without fortune" struggle—iconic dances go viral globally, but choreographers only get a one-time fee, no royalties.
- Jung Eui-jun: CEO, modern dance major, and former founding member of the co-living space 'nonce'. He bridges the gap between dance and entrepreneurship.
Motivation: Seeing that choreography isn't protected as IP, they decided to use AI + Blockchain to build a choreography IP system. mvntSTUDIO is the consumer-facing product of this vision.
Controversy / Discussion Angles
- Will AI replace choreographers? MVNT says "no"—they emphasize that all training data is licensed and the goal is to help choreographers earn royalties. However, the topic remains ripe for debate. CalMatters reported on AI dance tests, concluding "AI still isn't great at dancing"—which is both a challenge and an opportunity.
- K-POP Choreography Copyright: A fascinating angle. K-POP is global, but the creators of the moves aren't profiting. Naver Z (Zepeto) has already signed with MVNT to protect these rights.
- Web3 Elements: MVNT's GitHub describes them as "Web 3.0 Entertainments," but the current product has no obvious blockchain/token elements. Worth asking about.
Hype Data
- ProductHunt: 2 upvotes; almost no traction yet.
- Hacker News: Show HN post is live, early stage.
- Twitter/X: Founder's post has low engagement, but prominent AI influencer Linus Ekenstam mentioned Google's competitor AISOMA (86 likes).
- Overall Judgment: Extremely early; public awareness is near zero. But the narrative of "BTS choreographer building an AI tool" is highly shareable.
Content Suggestions
- Best Angle: "The BTS Choreographer's AI Startup Story"—blending tech with K-POP culture for a deep, character-driven piece.
- Trend Jacking: If AI dancing takes off on TikTok (like AI fashion did), this is the first dedicated tool.
- Comparison Review: Put mvntSTUDIO, DeepMotion, Krikey, and Freebeat to the test using the same song.
For Early Adopters
Pricing Analysis
| Tier | Price | Features | Is it enough? |
|---|---|---|---|
| Current (v0.1) | Completely Free | Music to Dance, Image to Dance, Screen Recording, Gallery | Fun to play with, but feature-incomplete |
| Future | Unannounced | Likely paid tiers | TBD |
Quick Start Guide
- Setup Time: 1 minute.
- Learning Curve: Extremely low—just paste a link.
- Steps:
  1. Go to mvnt.studio
  2. Paste a YouTube link or upload a .wav file.
  3. Wait ~3 minutes.
  4. Watch the 3D character dance; use Screen Recording to export.
  5. (Optional) Upload a character image to make a custom avatar dance.
Pitfalls and Gripes
- No MP4 Download: You have to screen record, which is annoying. Founders say it's WIP.
- No Finger/Facial Movement: Characters look a bit like marionettes, lacking detail.
- Dance Quality: The v0.1 m4 model is still improving; m4.1 is in development.
- Very Early Stage: It's explicitly a "public testbed," so manage your expectations.
- Social Login: Not yet supported.
Security and Privacy
- Data Storage: Cloud-processed (uploaded music/images are handled on servers).
- Privacy Policy: No clear privacy policy page found yet.
- Security Audit: None.
Alternatives
| Alternative | Advantage | Disadvantage |
|---|---|---|
| DeepMotion | High mocap precision, FBX export | Starts at $17/mo, no original choreo |
| Krikey AI | One-stop 3D animation, free tier | Template-driven, limited creativity |
| Freebeat | Music-driven video, great beat matching | 2D video, not 3D animation |
| EDGE Open Source | Free and self-deployable | Requires tech skills, average quality |
For Investors
Market Analysis
- 3D Mocap Market: $319.25M (2026) → $595.37M (2031), CAGR 13.27%.
- Generative AI Animation Market: $652.1M (2024) → $13.4B (2033), CAGR 39.8%—the fastest-growing segment.
- Animation & VFX Total Market: $220.69B (2026), CAGR 11.86%.
- Drivers: Explosion of short-form video, falling AI inference costs, marker-less mocap maturity, and Metaverse/Virtual Human demand.
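The growth figures above are internally consistent and can be sanity-checked with the standard compound-annual-growth-rate formula, (end/start)^(1/years) − 1:

```python
def cagr(start: float, end: float, years: int) -> float:
    """Compound annual growth rate: (end/start)**(1/years) - 1."""
    return (end / start) ** (1 / years) - 1

# 3D mocap market: $319.25M (2026) -> $595.37M (2031), 5 years
print(f"{cagr(319.25, 595.37, 5):.2%}")  # ~13.3%, matching the cited 13.27%

# Generative AI animation: $652.1M (2024) -> $13.4B (2033), 9 years
print(f"{cagr(652.1, 13_400, 9):.2%}")   # ~39.9%, matching the cited 39.8% to rounding
```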
Competitive Landscape
| Tier | Players | Positioning |
|---|---|---|
| Big Tech Labs | Google Labs (AISOMA), Meta | Research stage, not yet products |
| Leading Tools | DeepMotion, Move.ai | Pro-grade mocap for studios |
| Mid-tier | Krikey, Viggle, Freebeat | Consumer-grade animation/video tools |
| New Entrants | mvntSTUDIO | The only player focused on "Music→Original Choreo" |
| Open Source | Stanford EDGE, LODGE | Academic papers + implementations |
Timing Analysis
- Why now?:
  - AI has covered text/image/audio/video; "motion/dance" is the final frontier.
  - Short video platforms (TikTok) have caused dance content demand to skyrocket.
  - Diffusion models have just reached the "usable" quality threshold for motion.
  - K-POP choreography copyright issues are gaining traction, with new legislation in Korea.
- Risk: Technology isn't fully mature (AI dance quality is still far from human level); it may take 1-2 years to be production-ready.
Team Background
- CEO Jung Eui-jun: Modern dance + startup background.
- Co-founder Choi Youngjun: Top K-POP choreographer (BTS/Seventeen/TWICE), 700+ works.
- Team Size: Undisclosed (estimated small team of 5-10).
- Unique Advantage: Rare combination of top-tier industry resources and AI technical capability.
Funding Status
- Raised: Seed round (amount undisclosed).
- Investors: Mashup Ventures, NVIDIA Inception Program, Google for Startups.
- Mashup Ventures Quote: "MVNT has the potential to become the Universal Music Group of choreography copyright."
- Others: Epic Games MegaGrant, Korea Copyright Commission license.
- Naver Z Partnership: Choreo IP distribution deal with Zepeto (20M MAU).
Conclusion
The Bottom Line: mvntSTUDIO has the right direction (AI choreography is a vacuum) and the right team (K-POP pros + AI tech), but the product is extremely early. It’s time to "watch," not necessarily to "use" for production.
| User Type | Recommendation |
|---|---|
| Developers | Wait and see. Moat is data, not just algorithms. Start with EDGE open source if building similar tools. |
| Product Managers | Bookmark. The "playground + gallery" strategy and UGC workflow integration are smart lessons. |
| Bloggers | Follow. The "BTS choreographer" story is great, but wait for better quality before a full review. |
| Early Adopters | Go play. It's free; experience the "music-to-dance" magic, but don't use it for pro work yet. |
| Investors | Worth tracking. Rare team resources in a 39.8% CAGR market, but needs 1-2 years for maturity. |
Resource Links
| Resource | Link |
|---|---|
| Official Website | mvnt.studio |
| ProductHunt | mvntSTUDIO |
| Hacker News | Show HN |
| Parent Company | mvnt.world |
| Tech Docs | MVNT Docs |
| GitHub | github.com/mvnt |
| EDGE Paper | edge-dance.github.io |
| Crunchbase | MVNT |
| Founder Twitter | @jooooooooonjung |
| Funding News | WOWTALE |
2026-02-28 | Trend-Tracker v7.3