Startup Fable Wants to AI-Remake a Lost Welles Film

Fable, a startup building AI tools for automated storytelling, announced plans to recreate the lost 43 minutes of Orson Welles’ The Magnificent Ambersons using a new long-form narrative model and face‑swap techniques. The project lacks studio rights and estate sign-off, raising questions about authorship, IP, and whether AI can truly restore what was lost.

Published September 6, 2025 at 06:10 PM EDT in Artificial Intelligence (AI)

Fable Announces Ambitious AI Reconstruction of a Lost Welles Film

On Friday, startup Fable—which dubs itself the “Netflix of AI” and counts Amazon’s Alexa Fund among its backers—revealed plans to recreate the missing 43 minutes of Orson Welles’ 1942 film The Magnificent Ambersons. The company has launched a new AI model designed to generate long, complex narratives; its platform already powers user-created cartoons and unauthorized fan episodes of shows like South Park.

Fable says the remake will be a hybrid: some scenes reshot with contemporary actors, then digitally altered to resemble the original cast; other sequences stitched together from AI-generated content. Filmmaker Brian Rose, who has spent years reconstructing Welles’ lost footage, plans to use Fable’s model over the next two years to assemble the missing material.

Crucially, Fable has not secured rights to the film and did not notify Welles’ estate. The estate characterized the plan as a publicity play and warned it would be a “purely mechanical exercise” lacking Welles’ creative spark. That reaction underscores the legal and moral friction when AI tackles culturally important works without provenance or permission.

Why this project? The Magnificent Ambersons carries a specific reputation: a studio‑mutilated film now remembered as much for what was lost as for what remains. For preservationists and cinephiles, that sense of loss is powerful—so powerful that a tech-driven attempt at reconstruction invites both curiosity and skepticism. Can code restore an auteur’s intent?

There are clear technical and commercial precedents: AI voice models, deepfake face swaps, and narrative generation each have practical uses and known risks. Fable’s platform shows those capabilities can be stitched together into longer-form outputs—but technical ability does not erase legal obligations or cultural stewardship.

  • Intellectual property risk: remakes without rights invite takedowns, lawsuits, and blocked distribution.
  • Authorship questions: AI-generated or AI-assembled sequences complicate who can claim creative credit.
  • Ethics and reputation: estates, audiences, and critics may reject synthetic reconstructions as inauthentic or exploitative.
  • Preservation vs. recreation: historians distinguish between restoring originals and creating new works inspired by lost ones.
  • Operational safeguards: provenance metadata, consent records, and model audits help platforms limit misuse.

Think about the real-world tradeoffs. A studio or archive might prefer a transparent, labeled reimagining co-created with rights holders. A consumer app might push viral clips without context. Regulators and platforms will increasingly need clear rules on labeling, licensing, and the reuse of likenesses and scripts.

What should organizations do next? Practical steps include: mapping IP and consent, requiring auditable model logs, embedding visible provenance markers in outputs, and negotiating fair compensation with estates and rights holders. These are not neutral compliance tasks—they shape trust, market access, and the long-term viability of AI-driven creativity.
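One of those steps—embedding provenance markers—can be made concrete. The sketch below is a minimal, hypothetical illustration (the field names and file paths are invented for this example, not a real standard; production systems would more likely emit C2PA-style manifests): it hashes a rendered asset and records the model version, a consent reference, and a synthetic-content label in a sidecar record.

```python
import hashlib
import json
from datetime import datetime, timezone

def build_provenance_record(asset_name, asset_bytes, model_version, consent_ref):
    """Build a minimal provenance record for an AI-generated media asset.

    Field names are illustrative only; real deployments would follow an
    established manifest format such as C2PA.
    """
    # Hash the rendered output so downstream copies can be verified against it
    content_hash = hashlib.sha256(asset_bytes).hexdigest()
    return {
        "asset": asset_name,
        "sha256": content_hash,
        "generator": {"model_version": model_version},
        "consent_record": consent_ref,  # pointer to rights/consent documentation
        "generated_at": datetime.now(timezone.utc).isoformat(),
        "label": "AI-generated or AI-assisted content",
    }

# Hypothetical usage: attach a sidecar record to a rendered scene
record = build_provenance_record(
    "scene_042.mp4",
    b"...rendered video bytes...",  # placeholder for the real file contents
    "narrative-model-v1",
    "consent/estate-agreement-001",
)
print(json.dumps(record, indent=2))
```

A record like this is only useful if it is auditable: the hash ties the claim to one specific output, and the consent reference makes the rights question checkable rather than implicit.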

Fable’s announcement is a useful stress test: it shows how far generative tools have come and exposes the gaps that remain—in law, ethics, and craft. Even if the project stays a demo, it forces a conversation about what counts as restoration, who gets to decide, and how technology firms should behave when treading on cultural touchstones.

At QuarkyByte we translate these debates into actionable roadmaps for studios, archives, and platforms: combining legal due diligence, provenance engineering, and human-in-the-loop workflows so organizations can explore AI creativity while protecting authorship and audience trust.

