Artificial intelligence, or AI, has come a long way in the past couple of years. Developments such as ChatGPT and Copilot have taken off, showcasing the incredible potential of this technology across many fields. AI has had a massive impact on video games, transforming everything from simple character behavior to complex game and world design methods. We have moved from the predictable movements of early NPCs in games such as Pac-Man to complex pathfinding and memory systems in titles such as The Last of Us. AI constantly pushes the boundaries of what is possible in interactive entertainment, and developers keep reaching for more.
As AI continues to evolve, it also brings potential downsides. Issues such as unemployment, bias, plagiarism, and privacy risks have emerged as prominent challenges across industries. Despite these concerns, AI continues to enhance gameplay and create more immersive experiences, and it remains a driving force in the game industry.
Early AI in Video Games
Games such as Pong (1972) and Space Invaders (1978) featured some of the earliest AI in video games, used to control the computer opponent. Limited technology meant that AI had restricted use and could only simulate basic movement patterns. For example, Pong had the computer move the opposing paddle to track the ball, and Space Invaders had the aliens march in a fixed pattern while their speed increased. The goal of this early AI was to provide a challenge to the player and lay a critical foundation for interactive gameplay. However, it was simple for players to learn and memorize the patterns and ultimately master these games.
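To make that limitation concrete, here is a minimal Python sketch (not the original code, which lived in dedicated hardware and assembly) of the kind of deterministic rule a Pong-style opponent could follow. Because the rule never varies, an observant player can quickly learn to exploit it.

```python
# A minimal sketch of a deterministic Pong-style opponent: simply move the
# computer's paddle toward the ball's vertical position every frame.
def update_paddle(paddle_y: float, ball_y: float, speed: float) -> float:
    if ball_y > paddle_y:
        return paddle_y + speed   # ball is below the paddle, so move down
    if ball_y < paddle_y:
        return paddle_y - speed   # ball is above the paddle, so move up
    return paddle_y               # already lined up with the ball
```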
Introduction to Non-Playable Characters (NPCs)
As AI continued to make its way into game development, the concept of NPCs came into view. NPCs added depth to gameplay by introducing further challenges, allies, and richer storylines for the player. Developers could build environments that players could interact with and immerse themselves in while playing.
A popular example is Pac-Man (1980) and its four ghosts that chase the player around the maze. Each ghost has a color that represents a personality: Blinky (red) is aggressive, Pinky (pink) strategically ambushes, Inky (cyan) works with Blinky to trick Pac-Man, and Clyde (orange) flees when Pac-Man is too close (Villain Wiki). These behaviors force the player to pay attention to both their own movement and the ghosts’ while navigating the maze, creating a more dynamic gameplay experience.
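Those four personalities can be described as four different targeting rules. The sketch below is an illustrative Python approximation rather than the arcade code (which also includes scatter phases, speed rules, and a few famous quirks); the function and parameter names are my own.

```python
# Illustrative approximation of the ghosts' chase-mode "personalities" as
# different target tiles on the maze grid (positions are (x, y) tile pairs).
def ghost_target(ghost, ghost_pos, pac_pos, pac_dir, blinky_pos, home_corner):
    px, py = pac_pos
    dx, dy = pac_dir
    gx, gy = ghost_pos
    if ghost == "blinky":                    # aggressive: chase Pac-Man's tile directly
        return pac_pos
    if ghost == "pinky":                     # ambusher: aim a few tiles ahead of Pac-Man
        return (px + 4 * dx, py + 4 * dy)
    if ghost == "inky":                      # teams up with Blinky: double the vector
        ax, ay = px + 2 * dx, py + 2 * dy    # from Blinky to a point ahead of Pac-Man
        bx, by = blinky_pos
        return (2 * ax - bx, 2 * ay - by)
    if ghost == "clyde":                     # shy: chase when far away, retreat when close
        if (gx - px) ** 2 + (gy - py) ** 2 > 8 ** 2:
            return pac_pos
        return home_corner
    raise ValueError(f"unknown ghost: {ghost}")
```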
The Legend of Zelda (1986) introduced NPCs differently from Pac-Man, in that the player could interact with them across the map. NPCs were used as a form of storytelling and moved the plot forward the more the player interacted with them. Enemies had varied attacks and movements, which required the player to devise strategies to defeat them. Friendly NPCs had specific roles and dialogue when the player chose to interact with them, which helped the game world feel more alive. Rather than simply following a pre-written script, NPCs had personalities the player could engage with.
While NPCs at this point lacked varying movement and grew repetitive, they showed the capabilities AI had for video games. They introduced a layer of interactivity not yet seen, providing narrative depth and companionship for players. These changes, while basic, set the stage for future AI integration and development.
AI Continues to Advance in the 1990s and 2000s
The 90s and 00s saw significant leaps in AI development. Players started experiencing more developed worlds and characters and had believable interactions with enemies and friendly NPCs. A popular example is Half-Life (1998), which uses finite-state machines. This means that enemies have dedicated tasks and paths, such as patrolling set areas of the map. They have environmental awareness and adjust their movements once the player is in range and sight. They would also adapt to the player’s actions as the game continued, increasing unpredictability and making combat more dynamic. Other NPCs like scientists and security guards would react to the player’s presence, adding further depth to the game world.
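A finite-state machine simply keeps an enemy in one state at a time (patrolling, chasing, attacking) and switches states when conditions change. The following is a minimal, hypothetical sketch of that idea in Python; Half-Life’s actual AI is written in C++ and is considerably more elaborate, with squad behaviors and many more states.

```python
# A minimal finite-state machine for an enemy: patrol until the player is
# spotted, chase until in range, attack until the player escapes.
class EnemyAI:
    def __init__(self):
        self.state = "patrol"

    def update(self, can_see_player, distance_to_player, attack_range=10.0):
        if self.state == "patrol" and can_see_player:
            self.state = "chase"              # player spotted: leave the patrol route
        elif self.state == "chase":
            if not can_see_player:
                self.state = "patrol"         # lost sight: go back to patrolling
            elif distance_to_player <= attack_range:
                self.state = "attack"         # close enough to engage
        elif self.state == "attack" and distance_to_player > attack_range:
            self.state = "chase"              # player retreated: close the gap again
        return self.state
```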
We can’t discuss the 2000s without talking about The Sims (2000). This game is built entirely on AI. Characters had needs and behaviors influenced by their unique personalities. AI managed characters’ various needs (hunger, hygiene, energy levels) and prompted autonomous actions such as eating and showering. This allowed for emergent gameplay, with unexpected scenarios occurring (such as starting a fire while cooking, an annoyance any Sims player knows). Players could influence their characters’ actions by directing their Sim to complete tasks, creating a balance between player control and AI autonomy.
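Systems like this are often described as needs-based or utility AI: every need decays over time, and the character autonomously picks whatever action addresses the most urgent one. Here is a toy, hypothetical sketch of the idea (the names, decay rates, and threshold are invented for the example, not taken from The Sims).

```python
# Toy needs-based AI: needs decay each tick, and the character autonomously
# chooses the action that fixes the most urgent need below a threshold.
NEEDS = {"hunger": 100.0, "hygiene": 100.0, "energy": 100.0}
ACTIONS = {"hunger": "eat", "hygiene": "shower", "energy": "sleep"}
DECAY_PER_MINUTE = {"hunger": 2.0, "hygiene": 1.0, "energy": 1.5}

def tick(needs, minutes=1.0):
    """Let every need decay a little as simulated time passes."""
    for name, rate in DECAY_PER_MINUTE.items():
        needs[name] = max(0.0, needs[name] - rate * minutes)

def choose_action(needs, threshold=40.0):
    """Address the lowest need if it has dropped below the threshold."""
    name, value = min(needs.items(), key=lambda item: item[1])
    return ACTIONS[name] if value < threshold else "idle"
```

In a sketch like this, a player-issued command would simply override whatever choose_action returns, which is roughly where the balance between player control and AI autonomy comes from.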
Advancements in AI during this era significantly enhanced player immersion and the realism of games. Players would experience more believable and engaging worlds, where characters and enemies would act in ways that felt natural and intelligent. This laid the groundwork for more sophisticated AI, and these systems are used as a base for modern games.
Modern AI: Adaptive and Fast-Learning Systems
Modern video games exhibit more responsive behavior thanks to the introduction of adaptive AI. Its core components are pattern recognition, behavioral analysis, and machine learning. This means the AI can memorize player patterns and behaviors and adjust itself accordingly, creating tougher opponents and experiences. NPCs and enemies shape their behavior around player input rather than a pre-written script.
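In its simplest form, this boils down to logging what the player tends to do and biasing enemy behavior against it. The sketch below is a hypothetical illustration of that loop, not a system from any shipped game; the tactic names and counters are invented for the example.

```python
from collections import Counter

# Hypothetical "adaptive director": record the tactics the player uses and
# nudge enemy behavior toward countering the most common one.
class AdaptiveDirector:
    def __init__(self):
        self.player_tactics = Counter()

    def observe(self, tactic):
        """Record a tactic the player just used, e.g. 'stealth' or 'assault'."""
        self.player_tactics[tactic] += 1

    def counter_strategy(self):
        """Pick an enemy behavior that counters the player's favorite tactic."""
        if not self.player_tactics:
            return "default"
        favorite, _ = self.player_tactics.most_common(1)[0]
        counters = {"stealth": "spread_out_and_sweep",
                    "assault": "hold_chokepoints",
                    "flank": "guard_the_flanks"}
        return counters.get(favorite, "default")
```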
The Last of Us Part II is a perfect example of adaptive AI. Enemies band together to hunt down and flank Ellie, calling out to each other when they notice someone is missing or when they find a body. Every NPC has a name and a relationship to those around them, which certainly pulls at the heartstrings of players at points. In the boss battle against Ellie while playing as Abby, Ellie’s AI fights using the behaviors the player relied on throughout the game while playing as her, forcing the player to counter their own habits with very little time to adjust.
Red Dead Redemption 2 also has an incredibly detailed NPC system. NPCs within towns all seem to know each other and react to the player’s every action, whether or not it is directed at them. They also react differently to Arthur depending on the honor path the player has chosen for him. If the player chooses to kill an NPC, that character remains dead for the rest of the game. RDR2 shoves this choice back in the player’s face by parading the dead NPC’s widow in front of them.
Summing It Up
Adaptive AI gives the players an unpredictable world every time they start a new playing session, handing them unique NPC interactions wherever they go. NPCs exhibit incredibly realistic behaviors and personalities, which can create moral dilemmas for players when they are placed in an awkward situation.
Beyond its role in video games, AI’s influence is growing rapidly. Companies like Nvidia are leading the charge with the GPUs that power this technology, and they have become one of “the fastest-growing companies ever” (The Guardian). However, this surge in popularity comes with consequences. AI requires immense computational power, consuming large amounts of energy and therefore contributing to carbon emissions. As the push for AI grows, we must also consider the environmental costs that come with its use.
Additionally, AI is not just being used to enhance games but also to create them. Developers are now experimenting with AI-generated voice lines, assets, and even blocks of code, leading to faster production times and cost cuts. As mentioned, this raises concerns about job displacement and the ethics of replacing human voice actors with AI. If this becomes the norm, where do we draw the line between drawing on human creativity and relying on AI?
AI has the potential to shape the video game industry for the better, but it must be used to enhance player experiences, not take opportunities away from those working within it. We’ve seen the potential it offers and the beautiful creations it can help produce, but how do we use it in moderation without disturbing those working to make these masterpieces? Those of us within the industry need to keep track of this technology and learn how to embrace this development, not compete against it.