articy:draft 3 is a standalone dialogue and content management program with export capabilities. The official articy importer plugin for Unreal Engine lets users import their defined data types and dialogue. While the importer offers an impressive feature set, it falls short on some aspects of in-engine usability: although the official importer, which I worked on, gained new UI and UX features, configuration of runtime dialogue elements such as events, animations, and camera settings was lacking. That is why I started work on my own articy extension plugin.


The goal of the plugin was to use the official importer as its core and wrap around it to add new workflow options.

The plugin boils down to the following features:

a dialogue graph in articy
the automatically recreated graph within Unreal Engine

All graphs created inside articy:draft 3 are automatically recreated within Unreal Engine. This allows us to supercharge dialogue nodes with engine functionality and with additional data that is associated with the original articy element via its unique articy ID.

  • Rather than the typical “one dialogue per asset” approach that most dialogue plugins for Unreal Engine use, the plugin provides a centralized database of all dialogue. The search function makes it easy to find dialogue containing specific words or characters.
  • Hierarchy analysis between imports ensures that moved data does not get lost. The extension import pipeline can be automatically triggered when the official importer finishes importing, or can be manually triggered.
  • Extension data works on a per-asset basis to support source control systems with an ownership concept, such as Perforce. This way, multiple people can work on different parts of the dialogue database at once.
  • Tab support for keeping multiple dialogues open at once. Back-and-forth tab navigation is also supported.

Every node receives an extension data asset once it is modified. This includes options commonly used across games, such as skip timers and automatic skip timers, as well as an extendable set of dialogue events. This lets the user define data relating to their actual dialogue mechanics right inside Unreal Engine.
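To illustrate the idea, here is a minimal self-contained C++ sketch of per-node extension data keyed by the node's unique articy ID. All names (`NodeExtensionData`, `ExtensionDatabase`, the individual fields) are illustrative assumptions, not the plugin's actual API, and plain standard containers stand in for Unreal's asset types.

```cpp
#include <cassert>
#include <string>
#include <unordered_map>

// Hypothetical sketch: engine-side settings attached to a dialogue node,
// keyed by its articy ID so the data survives re-imports of the graph.
struct NodeExtensionData {
    float SkipTimerSeconds = 0.0f;  // manual skip allowed after this delay
    bool bAutoSkip = false;         // advance automatically when timer ends
    int EventCount = 0;             // stand-in for the list of dialogue events
};

class ExtensionDatabase {
public:
    // Creates the extension record on first modification, mirroring the
    // "every node receives an extension data asset once modified" rule.
    NodeExtensionData& FindOrAdd(const std::string& ArticyId) {
        return Extensions[ArticyId];
    }
    bool Has(const std::string& ArticyId) const {
        return Extensions.count(ArticyId) > 0;
    }

private:
    std::unordered_map<std::string, NodeExtensionData> Extensions;
};
```

Keying by articy ID rather than by graph position is what lets the import pipeline re-attach the engine-side data even when elements are moved inside articy.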

Disclaimer: some of these videos contain visual artifacts caused by the recording tool.

The extendable event system allows the user to define their own event blueprints with start, tick, and end functions that can be overridden. This way, wholly custom mechanics can be implemented and reused. Whether it’s spawning particles, setting cameras, or any other logic that is required for a game, it can be easily implemented here to suit anyone’s needs.
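The shape of such an event could be sketched as follows. This is a plain C++ approximation of the blueprint-facing interface described above; the class and method names (`DialogueEvent`, `Start`, `Tick`, `End`) follow the text, but everything else is an illustrative assumption rather than the plugin's real API.

```cpp
#include <cassert>

// Hypothetical sketch of the extendable event system: custom events
// override Start, Tick, and End to implement reusable dialogue mechanics.
class DialogueEvent {
public:
    virtual ~DialogueEvent() = default;
    virtual void Start() {}
    virtual void Tick(float DeltaSeconds) {}
    virtual void End() {}
};

// Example custom event: accumulates elapsed time. A real event might spawn
// particles or switch cameras in Start and clean up in End instead.
class TimerEvent : public DialogueEvent {
public:
    float Elapsed = 0.0f;
    bool bFinished = false;

    void Tick(float DeltaSeconds) override { Elapsed += DeltaSeconds; }
    void End() override { bFinished = true; }
};
```

Because every event shares this lifecycle, the dialogue runtime only needs to drive Start/Tick/End generically, while each blueprint decides what those phases mean for its own mechanic.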

In the above example, we check the events’ details and see that the first camera event sets the camera on the fire spirit character (represented by the fireplace actor in-game), whereas the second camera event switches back to the warrior’s camera. At the same moment we switch back, a firestorm particle effect spawns on top of the fire spirit.

This workflow is already much quicker than those of common dialogue systems, and it provides at-a-glance information when debugging dialogue events: nodes don’t have to be clicked to see which events they use.

To improve from there, drag & drop as well as copy & paste of events is also supported. If a particle event isn’t quite right on a particular dialogue line, we can simply move it without using the keyboard at all. By moving the firestorm particle event from the second node to the first, the firestorm now spawns as soon as the dialogue starts.

As you might have noticed, when choosing an actor camera, we aren’t actually choosing a camera, nor an actor that has a camera. Instead, we select a character defined inside articy:draft 3. But Unreal Engine has no understanding of what the “character” represents in the game. This is where the character-actor mapping comes in.

Any actor with a specific component that has a character property registers under that character identity with a subsystem. This subsystem can be asked for any actor registered under any character ID. This indirection allows us to define our game content while ignoring its technical implementation. Of course, some failure safety needs to be considered, since the mapped actor might not exist at the time of the request; this behavior will be configurable in the future, for example to ensure that missing actors are spawned.
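The registration-and-lookup flow can be sketched in a few lines of plain C++. In the actual plugin this would be an Unreal subsystem fed by an actor component; here, `CharacterRegistry`, `Actor`, and all method names are illustrative assumptions, and raw pointers stand in for engine actor references.

```cpp
#include <cassert>
#include <string>
#include <unordered_map>

// Stand-in for an engine actor.
struct Actor {
    std::string Name;
};

// Hypothetical sketch of the character-actor mapping subsystem: actors
// carrying a character component register under their articy character ID,
// and any other system can resolve that ID to the live actor.
class CharacterRegistry {
public:
    void Register(const std::string& CharacterId, Actor* InActor) {
        Registry[CharacterId] = InActor;
    }
    void Unregister(const std::string& CharacterId) {
        Registry.erase(CharacterId);
    }
    // Returns nullptr when no actor is mapped, so callers can decide how to
    // handle a missing actor (e.g. spawn one, as mentioned above).
    Actor* Resolve(const std::string& CharacterId) const {
        auto It = Registry.find(CharacterId);
        return It != Registry.end() ? It->second : nullptr;
    }

private:
    std::unordered_map<std::string, Actor*> Registry;
};
```

This is exactly the indirection from the example earlier: a camera event asks for the "fire spirit" character and receives whatever actor currently represents it, such as the fireplace.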

A visual display for inspecting the mapping is in the works at the time of writing. Since articy character identities also support copies (think “Red Goblin 1, 2, 3”), the mapping widget is planned as a grid listing each default character identity alongside its instances, with additional utility functionality like focusing an actor in PIE or accessing its details panel right from within the widget.

Example dialogue scene created with Articy Extension

The dialogue path marked above contains all the event data for the scene shown in the video. First, we switch to the side camera and attempt to sheathe our weapon, if needed. After we choose the dialogue option “Why not?”, our character plays a talking animation, mocking the fire spirit. The fire spirit then activates a firestorm particle event, and our character falls to the ground while the camera changes yet again to a bird’s-eye view. Finally, the camera changes back to our player camera, our character gets up again, and the dialogue ends.

While this event-driven approach doesn’t scale to AAA-quality cut scenes, it works fairly well for in-game dialogue scenes, which are typically less elaborately produced; cut scenes are what Sequencer is for. In an attempt to offer the best of both worlds, I tried to integrate the cut-scene tool Sequencer into my tool, but it became clear that Sequencer wasn’t written to support the workflow I had in mind. The idea was to host a preview level inside the tool and animate there, since one of the plugin’s features is level-independent scene creation. Sequencer, however, is written to work properly only in PIE. An integration might still be possible, but my implementation would need to adhere to Sequencer’s implicit assumptions, which aren’t documented officially or unofficially.

Upgrades to the workflow over default articy:draft 3

Epic Games released an article about Anshar Studios using articy:draft 3 for their game GameDec. Read up on it here:

The article nicely showcases the problems that the studio was facing. Addressing specific articy objects (characters, dialogues, dialogue lines, etc.) by ID to add required game logic is not a sustainable workflow. Switching to articy’s internal scripting system for external events implemented engine-side works, but those function definitions are interpreted from plain text without proper type information. This means only basic types such as strings, ints, bools, and generic UObjects are supported. While you can technically use these to drive an event system, it is still fairly inflexible and error-prone, and iteration times suffer from the constant exporting and importing required.
Who wants to define camera settings in plain text, anyways?

This is where the event system of the Articy Extension plugin shines. You can simply create your own event blueprints and then add these events to any dialogue node. If needed, you can alter timings via delay and duration parameters of each event right inside the engine.
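The delay and duration parameters mentioned here can be sketched as a simple timing window relative to the node's start. The struct and field names below are illustrative assumptions for this write-up, not the plugin's actual property names.

```cpp
#include <cassert>

// Hypothetical sketch: an event on a dialogue node becomes active
// DelaySeconds after the node starts and stays active for DurationSeconds.
struct EventTiming {
    float DelaySeconds = 0.0f;
    float DurationSeconds = 0.0f;

    // NodeTimeSeconds is the time elapsed since the node began playing.
    bool IsActiveAt(float NodeTimeSeconds) const {
        return NodeTimeSeconds >= DelaySeconds &&
               NodeTimeSeconds < DelaySeconds + DurationSeconds;
    }
};
```

Tuning these two values in the engine is what replaces the export/import round-trips that the plain-text scripting workflow would require.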

While articy’s dialogue traversal and data management system is solid, my articy extension plugin combines the best of both worlds by exposing the engine’s full functionality to the data created in articy:draft 3.
It’s still a work in progress that began as my bachelor’s thesis, so naturally UI, UX, features, and polish aren’t finished.