Battlefield 6 Faces Backlash for Heavy Generative AI Use Without Steam Disclosure

Signs are mounting that generative AI had a meaningful hand in bringing Battlefield 6 to life, especially in areas players can actually see and hear. A recent business-focused report spotlighting innovative game makers points to AI-assisted character work in EA's upcoming shooter, including tools used for lip-sync and facial creation. That raises a question many PC players now ask of any major release: does this kind of AI involvement require disclosure on Steam?

Battlefield 6 has already drawn scrutiny over questionable downloadable content. One cosmetics bundle reportedly featured a weapon configuration that looked oddly assembled, while another item included a mask design uncomfortably close to older Call of Duty artwork. Some of these pieces were later pulled, but there was no clear public explanation of where the designs came from or how they were made, fueling speculation in a community already wary of AI-generated assets slipping into big-budget games.

The report specifically highlights a tool called Voice2Face, which takes recorded speech and produces matching facial animation. The most striking detail is the claim that around 30% of the final animated speech was generated through this process. For players, that is not a behind-the-scenes productivity trick like faster bug testing or quicker concept brainstorming. It directly shapes the on-screen performance in cutscenes and dialogue, which makes it feel more "player-facing" than other forms of AI use.

Artists were also reportedly supported by FaceRig, a system that makes it easier to adjust and iterate on character faces with far less manual work. While much of the process was still handled by humans, and the end result can look far more polished than rushed, cut-and-paste content, the workflow suggests machine learning played a visible role in the final game build.

So where does Steam fit into this? Valve’s rules generally don’t require a developer to announce every internal use of AI, especially when it’s limited to ideation, automation, or efficiency improvements. The key phrase in Valve’s policy has centered on “player-facing AI output.” In other words, if AI is producing content that ships to customers in a form they can experience—art, text, voices, animations, or other outputs—that’s where disclosure expectations become more relevant.

This gray area is exactly why AI in gaming keeps sparking controversy. Some publishers appear to flirt with the line, skipping disclosure when they believe the usage is "minor," only for players to discover questionable assets later. A recent example involved placeholder AI images that weren't fully removed before launch, frustrating fans who felt misled about the product they bought.

Ahead of Battlefield 6’s release, EA previously maintained that no machine-generated assets would be included in the retail version. Given the scale of the company’s broader investment in AI tools, that statement was always likely to be dissected by players. If systems like Voice2Face and FaceRig contributed directly to what ends up on screen, fans will continue debating what counts as “machine-generated,” what counts as “assisted,” and whether the spirit of disclosure rules is being met—even if the letter of the policy is harder to pin down.

For now, Battlefield 6 is shaping up to be another flashpoint in the wider conversation about generative AI in AAA games: how it’s used, what studios owe players in transparency, and where platforms draw the line when AI helps build the very content people pay to experience.