Devblog
Today closed the loop on everything built over the past two days. The dialogue runtime went in, a real NPC was placed in the world, and the conversation UI went live — and by the end of the session, the full chain was running: walk up to a character, press E, have a conversation, make a choice, and watch the game respond.
The runtime went in first. Given a dialogue data asset — a graph of nodes, responses, conditions, and effects — the subsystem navigates it from start to finish. It tracks which node is active, filters which responses the player can see based on the current narrative state, applies the effects of a chosen response before moving forward, and ends cleanly when the last node is reached. Two kinds of nodes flow through the same runtime: monologue lines where the player presses continue, and choice nodes where the player picks a response.
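The shape of that runtime can be sketched in plain C++, outside the engine. Everything below is illustrative: the names (`DialogueNode`, `Response`, `NarrativeState`) and the lambda-based conditions and effects are assumptions standing in for the real data-asset types, not the shipped code.

```cpp
#include <cassert>
#include <functional>
#include <map>
#include <string>
#include <vector>

// Narrative state that response conditions read and effects write.
using NarrativeState = std::map<std::string, int>;

// A response the player can pick: gated by a condition, applies an
// effect, then jumps to the next node (-1 ends the conversation).
struct Response {
    std::string text;
    std::function<bool(const NarrativeState&)> condition;
    std::function<void(NarrativeState&)> effect;
    int nextNode = -1;
};

// A node is either a monologue line (player presses continue) or a
// choice node (player picks one of the visible responses).
struct DialogueNode {
    std::string speaker;
    std::string line;
    std::vector<Response> responses; // empty => monologue node
    int nextNode = -1;               // used when responses is empty
};

class DialogueRuntime {
public:
    DialogueRuntime(std::vector<DialogueNode> nodes, NarrativeState& state)
        : nodes_(std::move(nodes)), state_(state) {}

    bool Active() const { return current_ >= 0; }
    const DialogueNode& Current() const { return nodes_[current_]; }

    // Filter the responses the player is allowed to see right now.
    std::vector<Response> VisibleResponses() const {
        std::vector<Response> out;
        for (const auto& r : Current().responses)
            if (!r.condition || r.condition(state_)) out.push_back(r);
        return out;
    }

    // Monologue node: the player pressed continue.
    void Continue() { current_ = Current().nextNode; }

    // Choice node: apply the chosen response's effect, then advance.
    void Choose(const Response& r) {
        if (r.effect) r.effect(state_);
        current_ = r.nextNode;
    }

private:
    std::vector<DialogueNode> nodes_;
    NarrativeState& state_;
    int current_ = 0;
};
```

The convention that an empty response list marks a monologue node is one way to let both node kinds flow through a single runtime, as described above.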
Then the first NPC. A C++ base class that every character in the world will share — it implements the interactable interface, holds its assigned conversation, and when the player presses interact, hands off to the dialogue runtime. Walk up to it and "Talk" appears. Press E and the runtime fires.
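The hand-off can be sketched as an interface plus a base class, again in plain C++ with hypothetical names (`IInteractable`, `NpcBase`, and a stubbed `DialogueRuntime`); the actual classes live in the engine's own object model.

```cpp
#include <cassert>
#include <string>

// Hypothetical interactable interface: anything the player can target
// advertises a prompt and reacts when the interact key is pressed.
class IInteractable {
public:
    virtual ~IInteractable() = default;
    virtual std::string InteractionPrompt() const = 0; // e.g. "Talk"
    virtual void Interact() = 0;                       // player pressed E
};

// Stand-in for the dialogue runtime; the real one walks the graph.
class DialogueRuntime {
public:
    void Start(const std::string& conversationId) { active_ = conversationId; }
    const std::string& ActiveConversation() const { return active_; }
private:
    std::string active_;
};

// Base class every NPC shares: it holds its assigned conversation
// and, on interact, hands off to the dialogue runtime.
class NpcBase : public IInteractable {
public:
    NpcBase(std::string conversationId, DialogueRuntime& runtime)
        : conversationId_(std::move(conversationId)), runtime_(runtime) {}

    std::string InteractionPrompt() const override { return "Talk"; }
    void Interact() override { runtime_.Start(conversationId_); }

private:
    std::string conversationId_;
    DialogueRuntime& runtime_;
};
```

Because the NPC only knows its conversation and the runtime, placing a new character is just assigning a different dialogue asset to the same base class.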
The conversation UI completes the picture. The dialogue box shows the speaker's name, the current line of text, and the player's available responses as clickable buttons. Pick one, the conversation advances. Reach the end, the box disappears, and control returns. Mouse and gamepad both work.
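The UI's contract with the runtime can be reduced to a small sketch: show the current line, treat each response as a button, and hide the box when the conversation ends. The `DialogueBox` and `Line` types here are illustrative stand-ins, not the real widget classes.

```cpp
#include <cassert>
#include <string>
#include <vector>

// Minimal stand-in for a resolved conversation: a sequence of lines,
// each with a speaker, text, and the responses shown as buttons.
struct Line {
    std::string speaker;
    std::string text;
    std::vector<std::string> choices; // empty => just a continue prompt
};

// Sketch of the dialogue box: visible while the conversation runs,
// hidden (control returned to the player) once the last line passes.
class DialogueBox {
public:
    explicit DialogueBox(std::vector<Line> script)
        : script_(std::move(script)), visible_(!script_.empty()) {}

    bool Visible() const { return visible_; }
    const Line& Shown() const { return script_[index_]; }

    // A button click (mouse or gamepad confirm) advances the conversation.
    void ClickResponse(size_t /*choiceIndex*/) { Advance(); }
    void Continue() { Advance(); }

private:
    void Advance() {
        if (++index_ >= script_.size()) visible_ = false; // box disappears
    }
    std::vector<Line> script_;
    size_t index_ = 0;
    bool visible_;
};
```

Routing both mouse clicks and gamepad confirms through the same advance path is what keeps the two input methods equivalent.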