Delta_Ghost

our mission: every world, more alive

Our mission is to revolutionize the gaming industry by transforming the way NPCs interact within game worlds. By empowering NPCs with intelligent, adaptive, and creative behaviors that go beyond pre-scripted actions, we aim to create dynamic, immersive environments where characters think, learn, and evolve in real time. This will usher in a new era of gaming in which emergent gameplay, infinite replayability, and deeply engaging experiences become the norm, fundamentally changing how developers create games and how players interact with virtual worlds.

our product: text-2-behavior

Our product is a text-2-behavior system: the developer prompts the agent in plain text during development, and the agent executes that behavior during the game. Behaviors can be simple (follow the player, take photos of interesting things in the environment) or complex (build an interesting architectural tower; reproduce and survive). First, this makes developing NPC behavior easier. Second, it makes NPC behavior more interesting, because the agent continuously and creatively recombines actions on the fly based on the state of the game (or world).

our technique: ml behavior trees

From a technical perspective, we are currently building a plugin that integrates the OpenAI agents (assistants) API and UE5 into a single system: a multi-modal, ML-based behavior tree. This technology allows an NPC to receive both visual and state information from the game agent’s point of view in UE5; the ML agent (with memory, reasoning, and long-term goal orientation) then decides on an appropriate course of action, which the NPC executes inside Unreal. This repeats in a loop as the NPC works to fulfill its core objective (its prompt).
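The sense-decide-act loop described above can be sketched as follows. This is a minimal illustration, not the plugin’s actual API: the names (`AgentMemory`, `decide_action`, `run_loop`) and the stubbed decision rule are assumptions standing in for the real ML agent call.

```python
# Hypothetical sketch of the NPC sense-decide-act loop.
# All names here are illustrative, not the actual plugin API.

from dataclasses import dataclass, field


@dataclass
class AgentMemory:
    """Short-term history of (observation, action) pairs."""
    events: list = field(default_factory=list)

    def remember(self, observation, action):
        self.events.append((observation, action))


def decide_action(observation, memory, objective):
    # Stand-in for the ML agent (in practice, an LLM call with memory,
    # reasoning, and the core objective in context).
    if "obstacle" in observation:
        return "turn_left"
    return "move_forward"


def run_loop(observations, objective="explore"):
    """Each tick: sense -> decide -> act, until the feed of observations ends."""
    memory = AgentMemory()
    actions = []
    for obs in observations:
        action = decide_action(obs, memory, objective)
        memory.remember(obs, action)
        actions.append(action)  # in-engine, this would fire the NPC action
    return actions


print(run_loop(["open field", "obstacle ahead", "open field"]))
# ['move_forward', 'turn_left', 'move_forward']
```

In the real system the decision step is the assistant call and the act step hands control back to Unreal; the loop structure is the same.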

The system isn’t so granular that the ML agent is responsible for the movement of the NPC’s fingers (technically impractical), nor so high-level that it’s effectively just a game director (not that novel). Rather, it has access to a bank of fundamental actions for the NPC, allowing it to produce truly interesting and creative chains of behavior. This is what leads to emergent gameplay, increased replayability, and dynamic, ever-changing scenarios.
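An action bank at this level of granularity might look like the sketch below: a small set of fundamental actions that the agent chains together. The specific actions and the `execute_chain` helper are illustrative assumptions, not the shipped action bank.

```python
# Illustrative action bank: fundamental actions the ML agent can
# recombine into behavior chains. Names and granularity are assumptions.

ACTION_BANK = {
    "move_forward": lambda pos: (pos[0], pos[1] + 1, pos[2]),
    "move_up":      lambda pos: (pos[0], pos[1], pos[2] + 1),
    "place_block":  lambda pos: pos,  # side effect in-engine; position unchanged
}


def execute_chain(start, chain):
    """Apply a chain of fundamental actions; return final position and placed blocks."""
    pos = start
    placed = []
    for name in chain:
        if name == "place_block":
            placed.append(pos)
        pos = ACTION_BANK[name](pos)
    return pos, placed


# A tiny "build" chain: step forward, place, rise, place.
final, blocks = execute_chain((0, 0, 0), ["move_forward", "place_block", "move_up", "place_block"])
print(final, blocks)
```

The point of this middle level of abstraction is that chains like the one above are cheap for the agent to generate yet expressive enough to yield building, exploring, and other emergent behavior.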

DEMO: gpt4o building a structure in unreal engine

In the video below, Unreal saves images from an NPC robot’s forward-facing and bird’s-eye-view cameras and sends them to the OAI assistant. The assistant analyzes the images and decides which action to issue based on the visual data (along with its short- and long-term memory and world information, including the actions available). Unreal then parses the response and fires the appropriate action. The agent below has been instructed to “build an interesting structure.” To do so, it recombines fundamental actions (move up, down, left, right, forward, backward, place blocks, etc.) and prints to the screen (a) what it sees out of both cameras and (b) why it’s choosing one action over another, helping the developer debug the system.
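The parsing step on the Unreal side can be sketched as below. The JSON schema (an `action` field plus a `reasoning` field for the on-screen debug output) is an assumption for illustration; the real plugin’s response format may differ.

```python
# Hypothetical sketch of parsing the assistant's reply into an engine
# action. The JSON schema here is an assumption, not the plugin's format.

import json

VALID_ACTIONS = {"move_up", "move_down", "move_left", "move_right",
                 "move_forward", "move_backward", "place_block"}


def parse_response(raw):
    """Extract the chosen action and the agent's stated reasoning."""
    data = json.loads(raw)
    action = data.get("action")
    if action not in VALID_ACTIONS:
        raise ValueError(f"unknown action: {action!r}")
    return action, data.get("reasoning", "")


raw = '{"action": "place_block", "reasoning": "Extending the tower upward."}'
action, why = parse_response(raw)
print(action)  # place_block
```

Validating against the known action bank before firing anything keeps a hallucinated action name from reaching the engine, and the `reasoning` string is what gets printed to the screen for debugging.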

fast video: agent building process

Changing Behaviors with a Prompt: find interesting stuff

In this example, we change the behavior of the agent by slightly modifying the prompt. The agent is now looking for interesting things in the environment; it finds the trees interesting, so it heads towards them.

Navigating a More Complex World

In the example above, the agent navigates a more complex world and performs a building procedure inside of it.

Continuous Learning? Social Learning? Self Programming?

R&D: improvements, continuous learning & social behavior

Our immediate R&D efforts focus on improving the current system: sensor expansion, action bank expansion, better sensor fusion and control, improved speed and cost, etc. From there, we intend to develop systems for NPCs to continue learning and adapting over time (e.g., rewriting their own prompts in game). We are also exploring social learning, enabling agents to learn from each other within the game environment and create new social structures to achieve goals (group behavior). We believe both capabilities are within reach. As a stretch goal, we are investigating ways for the AIs to “self-program” during gameplay (writing new modules for the action bank based on in-game conditions), as in the Voyager paper.
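The two R&D directions above can be sketched minimally: an agent that folds in-game lessons back into its own standing prompt, and that registers new modules in its action bank at runtime. Everything here is a speculative illustration of the idea, not an implemented system; all names are hypothetical.

```python
# Speculative sketch: continuous learning via prompt rewriting, and
# self-programming via runtime action-bank registration. Hypothetical.

class SelfImprovingAgent:
    def __init__(self, prompt):
        self.prompt = prompt
        self.action_bank = {}

    def rewrite_prompt(self, lesson):
        """Continuous learning: fold an in-game lesson into the standing prompt."""
        self.prompt += f"\nLesson: {lesson}"

    def register_action(self, name, fn):
        """Self-programming: add a new module to the action bank."""
        self.action_bank[name] = fn


agent = SelfImprovingAgent("Build an interesting structure.")
agent.rewrite_prompt("Towers collapse without a wide base.")
agent.register_action("widen_base", lambda: "placing perimeter blocks")
print(agent.prompt)
print(agent.action_bank["widen_base"]())
```

In a Voyager-style setup, the registered module would itself be code the agent wrote and verified in game, rather than a hand-supplied lambda.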

team: 70 years of experience in emerging technology

Our team is composed of experienced professionals: CEO Ari Kalinowski and CTO Arden Schager, both AI and machine learning experts in gaming; Ops Lead (contract) Hannah Scott, an expert in cross-functional, transdisciplinary teams; Lead UE Developer (contract) Logan Dye, an experienced UE5 developer; Lead AI Developer (contract) Haider Ali, with an AI research background; and Lead 3D Modeler (contract) Fariba Shaffie, specializing in high-fidelity asset creation. Our organizational structure is lean, supported by specialized contractors, with a growth plan to hire full-time staff as funding allows.

Business Model

Our revenue streams include plugin licensing with subscription-based access to our UE5 plugin, premium support offering optional technical assistance and customization, and custom development providing bespoke AI solutions for clients. Our pricing strategy employs tiered licensing: affordable pricing for indie developers, standard pricing with added features for studios, and premium pricing with full access and support for enterprise clients.

Go-to-market Strategy

Sales and distribution channels involve direct engagement with developers, distribution through the UE5 Marketplace as an official plugin, and partnerships with developer communities. Our marketing plan includes digital marketing with targeted advertising to developers, content creation such as tutorials and webinars showcasing our technology, and participation in industry events such as gaming conferences. To acquire customers, we offer incentives for early adopters, including discounts and exclusive features, and focus on community building by establishing a user community for support.

Financial Projections

There are currently 750,000 Unreal Engine developers (per Epic Games). If the plugin were sold at a price point of $50 per developer, potential Year 3 revenue with full market capture would be $37.5 million. This is only a baseline estimate and doesn’t account for the variety of licenses and bespoke offerings we could provide to larger clients. We believe we have a leg up on our early competitors because we are exclusively focused on Unreal, on games (not games and robotics), on 3D-native systems, on behavior (not dialogue and QA), and on supporting developers.
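The baseline figure checks out as simple arithmetic:

```python
# Back-of-envelope check of the baseline figure in the text.

developers = 750_000   # Unreal Engine developers (per Epic Games)
price = 50             # assumed flat price per developer
baseline = developers * price
print(f"Baseline Year 3 revenue at full capture: ${baseline:,}")
# Baseline Year 3 revenue at full capture: $37,500,000
```

Tiered licensing would shift the effective average price per developer, so the realized figure could land above or below this flat-price baseline.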

product timeline

For now, our ideal use of funds includes R&D salaries for two years of fundamental research, contractor fees for specialized development work, marketing and sales for early customer acquisition and market research, and operational expenses covering administrative costs. The funding should provide a 24-month runway for research, development, early client development, and iteration. We project minimal to moderate revenue during the R&D-focused years 1–2, an official product launch with sales in year 3, and an established market presence in year 4.

Conclusion

Our solution addresses a critical need in the gaming industry for dynamic and intelligent NPC behaviors. With a strong team, strong prototype, clear R&D focus, and innovative technology, we are poised to make a significant impact. The requested seed funding will enable us to conduct fundamental research, achieve key development milestones, and position us for future growth. We invite you to join us on this exciting journey and are available to discuss any questions or provide additional information.