Nippon TV’s ‘Tokyo Miko Ninja’ Shows What Human-AI Co-Creation Looks Like in Practice (EXCLUSIVE)


When Nippon TV premiered “Tokyo Miko Ninja” in January, the Japanese broadcaster wasn’t testing the waters on AI-integrated drama production. It was, by its own account, making a declaration.

Set in a stylized version of present-day Tokyo where the sensibility of the Edo period lingers, the series follows Sumu Shiraishi, a trainee shrine maiden played by Riko, whose kindness becomes both her greatest asset and her greatest vulnerability as she navigates a conflict with the heir to a rival ninja clan. The show deploys generative AI for creature design, in-camera VFX and virtual production to render a world described as “cyber-Edo” — a fully synthetic, high-quality environment that doesn’t exist in any physical form.

Suzuki Tsutomu, who served as both scriptwriter and producer, spoke to Variety about why Nippon TV moved early on AI, what happened when the technology surprised the production itself, and how the broadcaster is thinking about the model’s commercial future. Hara Hiroo, Nippon TV’s associate managing director for programming and platform strategy, addressed the longer-term strategic stakes.

For Suzuki, the decision to position AI not as an add-on but as the structural core of production design was rooted in how Nippon TV reads the arc of creative technology. “The creative evolution of visual media has always advanced through cycles in which technological progress creates new forms of visual experience,” he says. “We believe that AI is no exception.”

What distinguished the broadcaster’s approach was a refusal to treat AI as a cost-efficiency play. “This project is not a ‘cost-cutting method,’” Suzuki says flatly. “It is a way to create a world that cannot be produced through traditional techniques – an entirely original cityscape that does not physically exist.” The goal, as he frames it, was to present AI not as an experiment but as a viable production model: “New technology creates new experiences, and those experiences become the next standard. We intend to be the ones who proactively open that possibility.”

The most consistent anxiety in AI-assisted production is whether the technology diminishes the human performance it surrounds. Suzuki addresses this as a structural design problem, not an abstract value. The team’s foundational principle: “Actors remain the true leads of this project.”

In practice, that meant controlling color, lighting and depth of field to draw the viewer’s eye toward performance rather than environment; vetting AI-generated visuals against the emotional register of each scene; and sharing video storyboards and visual references with cast before filming so they could imaginatively inhabit the world. “Technology served to enhance their imagination, not constrain it,” Suzuki says.

Shooting in an LED studio environment also yielded an unexpected benefit. Several cast members told the production that the absence of typical on-set ambient sound allowed for deeper concentration than conventional shoots, and the fully indoor environment reduced physical fatigue. “The actors shared that the environment was significantly better for them both physically and mentally,” Suzuki notes.

In post-production, the team applied a single governing question to every editorial decision: “Who is the subject of this cut?” If AI-generated material drew too much attention, it was pulled back. “The final criterion is always whether the actor’s performance shines the most,” Suzuki says.

Suzuki is candid that the risk of technical spectacle overwhelming narrative discipline was “extremely high – and it actually happened.” The case in point: a fight sequence in which the protagonist summons a phoenix-like guardian spirit.

In the original script, the phoenix was conceived as a relatively modest visual flourish. What generative AI produced was something else entirely. “The visuals generated through generative AI possessed a scale and form far beyond what we had initially imagined,” Suzuki says. “Even a single beat of its wings seemed capable of distorting space, freezing time, and dominating the entire frame.”

Rather than reining the image back to fit the original conception, the director and producers chose to let the AI’s output reshape the scene. The team continued regenerating the phoenix’s movement through the offline stage, pushing intensity “within the limits of what would not break the storyline.” The experience shifted how Suzuki understands the production dynamic: “We realized that content was actually co-created by AI and humans.”

The series uses a workflow in which generative AI handles creature and environmental design, in-camera VFX enables real-time compositing within LED environments, and virtual production studios replace location shoots with controllable, data-built worlds. Ayakashi – supernatural creatures drawn from Japanese folklore – were designed using a division of labor Suzuki describes as deliberate: narrative meaning and symbolic function determined by the human creative team; visual form, texture and the subtle elements of distortion or otherness explored through AI. “The significance of the ayakashi – their symbolic relationship to human emotions, their narrative role – was defined through extensive discussions among the producer, screenwriter, director, and AI creators,” Suzuki says. “AI was used to expand the possibilities of form and design.” Final selection always rested with the director.

The workflow shifts cost concentration to pre-production, reduces unpredictable on-location expenses, and makes assets reusable. “Once an AI-generated asset is created, it can be reused any time,” Suzuki notes. In his view, the medium- and long-term calculus favors the model: production uncertainty falls significantly, even if short-term costs don’t necessarily decrease.

Nippon TV produces drama daily. The question of whether “Tokyo Miko Ninja’s” model can travel to that volume was central to the interview. Suzuki’s answer is structured around two distinct tracks. For mainstream serialized production, AI integration is most naturally introduced at the margins – previsualization, virtual background layers for location shoots, creature inserts, fully AI-generated sequences within otherwise conventional episodes. “Introducing technology in these ways has the potential to reliably elevate both the quality and the speed of traditional drama production,” he says.

For high-concept properties where the narrative world is built around AI from the ground up, the creative possibilities are of a different order. “When the very world of the narrative is inseparable from technology, the creative possibilities become especially compelling,” Suzuki says. “Projects like ‘Tokyo Miko Ninja,’ which are premised on forms of expression that could not exist without AI, represent an area we would like to continue exploring proactively.”

On AI’s labor and ethical implications, Suzuki points to internal policy development and proprietary tooling as Nippon TV’s framework. “We have built an environment that allows AI to be operated with confidence, safety, and quality control,” he says, adding that the company intends to “promote the internal use of AI within our company so that we can embrace this wave of innovation without falling behind.”

Hara, speaking to the broader corporate stakes, positions “Tokyo Miko Ninja” as more than a content experiment. “The know-how we gained through the success of this drama is something we aim to apply not only to drama production, but also across multiple areas of our business – including the creation of visual works in various genres, the expansion of IP-based content, and the potential commercialization of AI-related solutions,” he says. The implication is that the series functions as internal proof-of-concept for a company-wide AI posture, not just a programming decision. “The results of ‘Tokyo Miko Ninja’ have made it clear that the new visual experiences enabled by AI cannot be ignored,” Hara adds. “This may well become a catalyst for shifting our lineup.”

“Tokyo Miko Ninja” streams on TVer and Hulu Japan following its Nippon TV broadcast. The series was produced in cooperation with AOI Pro., with production and technical support from CyberAgent. Key credits include director Yosuke Goto, AI creative director Akihiro Miyagi and VFX supervisor Yuki Yamada of Tree Digital Studio.