WELL, THIS IS THE 21st CENTURY, YOU KNOW: Hatsune Miku, the Animated Vocaloid Who Headlined Dallas, Should Run for President:

When the Fort Worth resident says “it,” she doesn’t mean the avatar; she means Miku’s devoted fan base. The real show here is not the 3D anime vixen twirling on stage, but the solidarity of the fans, who know every beat, outfit, virtual sidekick and glow stick color change. A 3D animation brought these disparate people together to listen to music with lyrics in a language most can’t understand.

Enthusiasm, mass appeal, razor-sharp marketing and crowd manipulation: Hatsune Miku is too good for pop entertainment. She should run for president.

Like any epic talent, Miku came from humble beginnings. Her origin story is rooted in music industry software, not talent recruitment.

Yamaha developed the vocaloid concept in the early 2000s. The idea was simple: Instead of paying vocal artists to sing, what if researchers could make a synthesizer that could approximate a human voice? This “singer in a box” could open up new creative avenues, especially in the world of synth-pop.

Here’s how it works, more or less. A voice actor provides samples of sounds for a digital library. Users type in the lyrics and melody, and the voice follows along. The reason this doesn’t sound like the computer from WarGames is that the software includes a Synthesis Engine that converts pitch, manipulates timbre and adjusts timing. The software also adds stress to pronunciations and vibrato, but it can’t approximate a shout. (Grunge is safe from vocaloids. For now.)
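For the curious, the pipeline above can be sketched in a few lines of code. This is a toy, not Vocaloid’s actual engine: real vocaloids splice recorded phoneme samples from the voice library, while here each “phoneme” is faked as a pure tone (an assumption for illustration). But the control flow matches the description: the user supplies lyrics plus a melody, and the engine renders each note’s pitch, timing and vibrato.

```python
import math

SAMPLE_RATE = 22050  # samples per second

def midi_to_hz(note):
    """Convert a MIDI note number to frequency in hertz (A4 = 69 = 440 Hz)."""
    return 440.0 * 2 ** ((note - 69) / 12)

def render_note(phoneme, midi_note, seconds, vibrato_hz=5.0, vibrato_depth=0.01):
    """Render one (phoneme, pitch, duration) event as a list of audio samples.

    In a real vocaloid, `phoneme` would select a recorded sample from the
    voice library; here it is carried along but rendered as a sine tone.
    """
    base = midi_to_hz(midi_note)
    n = int(seconds * SAMPLE_RATE)
    samples = []
    phase = 0.0
    for i in range(n):
        t = i / SAMPLE_RATE
        # Vibrato: a slow sinusoidal wobble in pitch, as the engine adds.
        freq = base * (1.0 + vibrato_depth * math.sin(2 * math.pi * vibrato_hz * t))
        phase += 2 * math.pi * freq / SAMPLE_RATE
        # Short attack/decay envelope so notes don't click at the joins.
        env = min(1.0, t / 0.02, (seconds - t) / 0.02)
        samples.append(env * math.sin(phase))
    return samples

def sing(score):
    """Concatenate rendered notes; score is a list of (phoneme, midi, secs)."""
    out = []
    for phoneme, midi_note, seconds in score:
        out.extend(render_note(phoneme, midi_note, seconds))
    return out

# "Type in the lyrics and melody, and the voice follows along."
melody = [("mi", 69, 0.25), ("ku", 72, 0.25), ("mi", 74, 0.5)]
waveform = sing(melody)
```

Swap the sine tones for actual voice-actor samples, add timbre shaping, and you have the rough shape of a singer in a box.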

I called it, right down to the use of Yamaha’s Vocaloid, back in 2004 at Tech Central Station, in an article titled “The Making of a Pop Star, 2010.”