
In his article for Forbes, AI contributor John Winsor describes his experience with ‘vibe-coding’, a new methodology for creating digital content chiefly through AI tools rather than by coding directly. The article describes the author’s emotional high of creating a 3D platform game entirely via Claude’s Sonnet model, from developing the original concept to defining the game’s visual style and even dictating specific mechanics and controller inputs. Within half an hour, Winsor had created a game that wasn’t:
…anything close to cinematic quality but the model nailed the mechanics of the game. The game included a skier hurtling down a snowy mountain, maneuvering around obstacles, competing against rival skiers, and featuring multiple difficulty levels, comprehensive score tracking, and even camera controls to zoom in on the action.
The piece is framed as an interesting description of the kind of creative endeavour such tools can unlock and the efficiency gains they can provide for game developers.
I am, however, concerned by the logical conclusion that Winsor implicitly drives us towards in his piece: the idea that nobody needs to ‘know’ any underlying hard skills because AI can ‘wish’ the desired output into existence. We’re hurtling towards a kind of ‘Idiocracy’ future rather than a creative utopia, one where there is no need to develop underlying expertise because the LLMs will happily serve it up to us on a plate.
Without developing the underlying skillset, I think it’s unlikely that someone could serve as an effective ‘art director’ (as Winsor euphemistically calls it) for an AI-coded game because they won’t understand the principles of good game design. I like the idea of freeing people from limitations to allow them to create more effectively, but I think what will likely result is the proliferation of a lot of really bad games. This is disappointing because there are already so many really bad games.
Hard work makes an artist good at their craft. The doing of the thing, combined with substantial repetition, makes your output considered and, therefore, good. I often find myself writing about a topic I’m considering, and it’s in the writing that I uncover what I actually think about it – because my first impressions rarely stand up to the interrogation that engaging deeply with any subject entails. A senior game developer could undoubtedly use a tool like Claude effectively to code games, but doing so eliminates the need for junior developers. And if there are no pathways for junior developers to one day become senior developers, then the industry’s talent pipeline never gets built in the first place.
I don’t think Winsor is correct; the piece doesn’t demonstrate how the “lines between creativity and technology are blurring.” What I think it proves, perhaps without realising it, is that AI, combined with the imperative of maximising shareholder value, is on the precipice of potentially destroying an industry whose output I value.
There is no scenario under our current shareholder capitalism model in which the popularisation of this technology does not result in (further) mass layoffs of creatives from major game studios, which is now at risk of being hollowed out entirely.
Consider The Verge’s recent story about Ashly Burch’s character, Aloy, from the Horizon series. A video recently leaked from Guerrilla Games, which develops the series, showed Aloy being voiced and performed entirely by an AI that was not trained on any of Burch’s performances. However, the studio insists this is “not necessarily something that’s in production for actual games.” That ‘necessarily’ is doing so much heavy lifting in that sentence that I hope it’s bending its knees.

Burch, who performs and voices Aloy, said on TikTok:
I love this industry and this art form so much and I want there to be a new generation of actors. I want there to be so many more incredible game performances.
I want to be able to continue to do this job, and if we don’t win, then that future is really compromised.
Sure, this AI-enabled future is interesting, but I’d be lying if I said I wasn’t a little concerned.
I’m certainly not against the profusion of tools that democratise access. In principle, I think that’s generally a good thing. I have been noodling (unsuccessfully) with a podcast and a YouTube channel for years, something that only became possible once consumer-grade audio-visual tools filtered down to the public in the early 2000s. Before that, in my Dad’s day, for example, a video editing suite would run into tens of thousands of dollars and would (necessarily) only have commercial uses. Hell, I used Grammarly to help me refine this very article, and that’s essentially an AI tool at this point.
I would argue, however, that the vast (vast) majority of the creators on YouTube, and any other open platform, for that matter, are absolute trash. This is not an argument for prohibiting access to the tools; it is simply an acknowledgement that democratising these tools will enable the profusion of mountains of utter dross. There will be some diamonds in the rough. Still, given the absolute firehose of quantity over quality that’s certain to materialise, finding them will be even harder than it already is. The unfortunate consequence is that creative output will be less valuable to the market than ever.
We’ll have to use AI to filter out the AI dross. What a time to be alive.
Links:
https://www.theverge.com/news/630176/ashly-burch-sony-ai-horizon-aloy-tech-demo-sag-aftra-strike