
Particle Editor

2025

Specifications

Engine: R.O.S.E.

Language: C++

Some of the VFX shown on the left was created by Alexander Thambert.


Introduction

Particle Editor is a tool built in R.O.S.E., our custom engine developed at The Game Assembly. It is a separate application that lets the user quickly iterate on particle systems at runtime. When the user is finished with their work, they can save the particle framework to a file, which can then be loaded and played in the game.


Description

The particle editor was developed during the later game projects at The Game Assembly, and started as the final examination task in the tool development course. I made it my goal to make it practical and easy to use before the 7th game project began. The aim is for our technical artists to use it to quickly iterate and add more cool particles to our games.

The editor is designed to mimic Unity's old particle editor as closely as possible, making it familiar to new users and easing the learning curve. It was later expanded to allow editing multiple particle effects at the same time and combining them, which is called a framework. As the GIFs above show, the framework window in the top left contains each particle effect (called ParticleSettings). On the right are the settings of the selected effect: the basic particle settings first, then the emitter, affectors, emitter shape, and finally rendering.

The top settings determine some common behaviour and look for the particles. The emitter decides how many particles are spawned each second, and bursts additionally allow a number of particles to be emitted at a specific point in time. Affectors change how particles behave after they have been emitted, e.g., lowering opacity over time to fade them out. The emitter shape defines the volume or surface from which particles are emitted, e.g., box, sphere, or cone. Rendering is where you change how the particle is drawn, for example by using a different material or mesh.

[Settings.gif]

The code below shows the two commonly used structs, ParticleFramework and ParticleSettings, that get serialized on save and load.
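The original code embed isn't reproduced here, but a minimal sketch of what such structs might look like follows. All field names are assumptions based on the settings described above, not the engine's actual layout:

```cpp
#include <string>
#include <vector>

// Hypothetical sketch of the serialized data; the real fields may differ.
struct ParticleSettings
{
    std::string name;                 // shown in the framework window
    float       lifetime     = 5.0f;  // seconds a particle stays alive
    float       startSize    = 1.0f;
    float       emissionRate = 10.0f; // particles spawned per second
    // ... emitter, affector, shape, and rendering settings follow
};

struct ParticleFramework
{
    std::string                   name;
    std::vector<ParticleSettings> settings; // one entry per effect
};
```

In practice, each struct would be mapped to JSON with nlohmann's built-in helpers (for example the NLOHMANN_DEFINE_TYPE_NON_INTRUSIVE macro), as described under Front-end below.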

Implementation

Front-end

The editor's interface is built with ImGui, which provides many handy and intuitive widgets, such as buttons, sliders, and menu bars, for the user to interact with the application.


To mimic Unity's editor, the application has its own viewport on the left, while most of the important UI is on the right. When the UI on the right is resized, the viewport changes its size accordingly. This was implemented by adding a feature to our graphics engine that resizes the viewport used when rendering, e.g., the GBuffer, ambient occlusion, and bloom passes.


A significant feature added to the front-end was the curve editor, which enables many new and interesting VFX instead of being limited to constant values. Smooth interpolation is achieved with Catmull-Rom splines. It is also possible to define a minimum and maximum curve, producing random values between the two.
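The uniform Catmull-Rom form used for smooth curve interpolation can be sketched as a standalone scalar function; the editor presumably evaluates this per key segment:

```cpp
// Evaluate a uniform Catmull-Rom segment between p1 and p2 at t in [0,1],
// with p0 and p3 as the neighbouring control points.
float CatmullRom(float p0, float p1, float p2, float p3, float t)
{
    const float t2 = t * t;
    const float t3 = t2 * t;
    return 0.5f * ((2.0f * p1) +
                   (-p0 + p2) * t +
                   (2.0f * p0 - 5.0f * p1 + 4.0f * p2 - p3) * t2 +
                   (-p0 + 3.0f * p1 - 3.0f * p2 + p3) * t3);
}
```

For the min/max range, one common approach is to evaluate both curves at the same time value and pick a random value between the two results.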

Another feature added later is color gradients, which lets the user place alpha or color markers on a spectrum. This feature was critical to implement, since our 8th game project will feature lots of flashy VFX, where it significantly speeds up iteration and makes it easier to add variety.
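Sampling such a gradient comes down to finding the two markers surrounding a time value and interpolating between them. A sketch, with a hypothetical marker layout (the editor likely stores alpha markers separately):

```cpp
#include <array>
#include <cstddef>
#include <vector>

// Hypothetical marker layout, not the editor's actual type.
struct ColorMarker { float time; std::array<float, 3> rgb; };

// Sample the gradient at t by lerping between the surrounding markers.
// Assumes markers is sorted by time and non-empty.
std::array<float, 3> EvaluateGradient(const std::vector<ColorMarker>& markers, float t)
{
    if (t <= markers.front().time) return markers.front().rgb;
    if (t >= markers.back().time)  return markers.back().rgb;

    for (std::size_t i = 1; i < markers.size(); ++i)
    {
        if (t <= markers[i].time)
        {
            const ColorMarker& a = markers[i - 1];
            const ColorMarker& b = markers[i];
            const float s = (t - a.time) / (b.time - a.time);
            return { a.rgb[0] + (b.rgb[0] - a.rgb[0]) * s,
                     a.rgb[1] + (b.rgb[1] - a.rgb[1]) * s,
                     a.rgb[2] + (b.rgb[2] - a.rgb[2]) * s };
        }
    }
    return markers.back().rgb;
}
```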


Finally, when the user saves or loads, the framework and settings are serialized using nlohmann's built-in features, which are incredibly easy to set up and adapt to one's needs. For reliability, a backup save is created every 5 minutes.

[ParticleEditor_Property.gif]
[Gradient.gif]

Back-end

Behind the scenes, many objects are involved in efficiently simulating particle systems while keeping the code modular and readable. The code is divided into several classes, each with a distinct responsibility: Particle, ParticleEmitter, ParticleSpawner, ParticleBurst, ParticleAffector, ParticleShape, ParticleSystem, and ParticleManager.


Particles belonging to a unique setting within a framework are given their own ParticleSystem, which handles their update, rendering, removal, and more. This is because different settings define different rules for how the update runs and how the particles are rendered, which makes it difficult for all particles to share one system. The ParticleManager simply holds all the systems and runs the update and render functions for each.

The code excerpt below briefly shows how the system updates and removes particles (and emitters). Notably, particle updates are parallelized using the std::execution::par_unseq execution policy from the standard library, which yields significant performance gains. This enables up to a million particles (depending heavily on the parameters) using only the CPU.
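A simplified sketch of such an update loop, with a parallel for_each followed by single-threaded erase-remove; the types and fields are illustrative, not the engine's actual code:

```cpp
#include <algorithm>
#include <execution>
#include <vector>

struct Particle
{
    float position[3] = {};
    float velocity[3] = {};
    float lifetime    = 1.0f; // seconds left before removal
};

// Integrate every particle in parallel, then erase the dead ones.
void UpdateParticles(std::vector<Particle>& particles, float dt)
{
    std::for_each(std::execution::par_unseq, particles.begin(), particles.end(),
        [dt](Particle& p)
        {
            for (int i = 0; i < 3; ++i)
                p.position[i] += p.velocity[i] * dt;
            p.lifetime -= dt;
        });

    // Removal stays single-threaded; erase-remove keeps the vector contiguous.
    particles.erase(
        std::remove_if(particles.begin(), particles.end(),
                       [](const Particle& p) { return p.lifetime <= 0.0f; }),
        particles.end());
}
```

Keeping particles in one contiguous vector is what makes the parallel, vectorizable policy pay off here.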

Particle Emitter

An emitter lives within a particle system and handles particle emission and its rules, e.g., creating particles and setting their start position, color, lifetime, and size. An emitter is created by injecting the entity that owns it, which allows particles to be spawned at a certain position in the world. When the entity is destroyed, the emitter is destroyed with it. Furthermore, a noteworthy emitter feature is particle bursts, which emit a number of particles at a specified time, with additional parameters such as burst probability, number of cycles, and the interval between cycles.
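Based on the parameters listed, a burst could be described roughly like this; the field names and the helper function are assumptions, not the engine's actual API:

```cpp
// Hypothetical burst description, mirroring the parameters above.
struct ParticleBurst
{
    float timePoint   = 0.0f;  // seconds into the effect when the burst fires
    int   count       = 30;    // particles emitted per cycle
    float probability = 1.0f;  // chance [0,1] that the burst fires at all
    int   cycles      = 1;     // how many times the burst repeats
    float interval    = 0.1f;  // seconds between cycles
};

// Returns how many particles the burst should emit during [prevTime, curTime).
// random01 is a uniform sample in [0,1) rolled once per burst.
int BurstCount(const ParticleBurst& b, float prevTime, float curTime, float random01)
{
    if (random01 > b.probability)
        return 0;

    int emitted = 0;
    for (int cycle = 0; cycle < b.cycles; ++cycle)
    {
        const float fireTime = b.timePoint + cycle * b.interval;
        if (fireTime >= prevTime && fireTime < curTime)
            emitted += b.count;
    }
    return emitted;
}
```

Checking against the frame's time window rather than an exact time stamp ensures a burst is never skipped when the frame rate fluctuates.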

Particle Affector

An affector changes a particle's properties after it has been emitted, for example, size over lifetime, force over lifetime, rotation over lifetime, damping, and noise. It is common for affectors to use curve editing, which varies a particle property over its remaining lifetime.


A special affector that was added is noise, which uses curl noise to move particles around in the world and create more liveliness. The curl is computed from Perlin noise by taking derivatives of the gradients at each of the 8 nearby lattice points. This computation is rather expensive to do every frame, which is why the user can select the number of dimensions to sample. With fewer dimensions, the samples are re-used across the axes and combined in a way that hides the re-use.
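The core idea of curl noise is to take derivatives of a scalar potential and swap and negate them, which yields a divergence-free (swirling, non-clumping) velocity field. A 2D sketch using central differences, with a stand-in analytic potential where the real implementation samples Perlin noise:

```cpp
#include <cmath>

// Stand-in potential field; the actual affector samples Perlin noise here.
float Potential(float x, float y)
{
    return std::sin(x) * std::cos(y);
}

// 2D curl of the scalar potential via central differences:
// velocity = (d(psi)/dy, -d(psi)/dx), divergence-free by construction.
void CurlNoise2D(float x, float y, float& outVx, float& outVy)
{
    const float eps  = 1e-3f;
    const float dpdx = (Potential(x + eps, y) - Potential(x - eps, y)) / (2.0f * eps);
    const float dpdy = (Potential(x, y + eps) - Potential(x, y - eps)) / (2.0f * eps);
    outVx =  dpdy;
    outVy = -dpdx;
}
```

The 3D case works the same way with a vector potential, which is why it needs several expensive noise derivatives per frame.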


To improve the modularity of the code and ease, for example, implementing new affectors, my goal has been to minimize the number of places that must be modified for a new affector to work. The developer inherits from the base class ParticleAffector and adds a new type to the ParticleAffectorType enum. The new affector must also be registered in the particle manager so that it can be serialized on save and load, and that is essentially all it takes to get it working. In the editor, there is some more work involved, since you additionally have to define the affector's name, which properties the user is allowed to modify, and which affector a factory creates from which enum value. I have tried to keep editor-related code out of the particle classes, since they would otherwise hold too much responsibility and become cluttered. Instead, dependency injection is used in the editor to define the functionality needed to edit each affector.
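The registration flow described above might be sketched like this; beyond ParticleAffector and ParticleAffectorType, all names are illustrative:

```cpp
#include <functional>
#include <memory>
#include <unordered_map>

enum class ParticleAffectorType { SizeOverLifetime, Noise /*, ... */ };

struct Particle { float size = 1.0f; float normalizedAge = 0.0f; };

class ParticleAffector
{
public:
    virtual ~ParticleAffector() = default;
    virtual void Apply(Particle& particle, float dt) = 0;
};

class SizeOverLifetime final : public ParticleAffector
{
public:
    void Apply(Particle& p, float /*dt*/) override
    {
        p.size = 1.0f - p.normalizedAge; // linear shrink, a stand-in for a curve
    }
};

// Factory keyed by the enum, so serialization can recreate affectors from saved data.
using AffectorFactory =
    std::unordered_map<ParticleAffectorType,
                       std::function<std::unique_ptr<ParticleAffector>()>>;

AffectorFactory MakeFactory()
{
    AffectorFactory factory;
    factory[ParticleAffectorType::SizeOverLifetime] =
        [] { return std::make_unique<SizeOverLifetime>(); };
    return factory;
}
```

Adding a new affector then means one subclass, one enum value, and one factory entry, with the editor-side UI registered separately.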

Particle Shape

ParticleShape is another abstract class, similar to ParticleAffector, that the concrete shapes inherit from: box, sphere, cone, and circle. In the emitter code, you can see that a shape is initially only used to set a particle's default position and rotation, as well as the direction it is headed. The shape also has a transform that lets the user offset or rotate the shape itself.

A tricky shape to implement was the cone, since the logic for generating a random position in its volume and setting the rotation wasn't obvious. Generating a position was solved by picking a random value along the Y-axis for the height, using it to compute the cone's radius at that height, and then generating a random point on the resulting circle; together these yield the position.
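That sampling scheme can be sketched as follows. The half-angle parameterization and the sqrt on the disc sample (for uniform density across the disc) are my assumptions; the actual implementation may differ:

```cpp
#include <cmath>

struct Vec3 { float x, y, z; };

// Random point inside a cone with its apex at the origin, opening along +Y.
// r1, r2, r3 are uniform samples in [0,1); angle is the half-angle in radians.
Vec3 RandomPointInCone(float height, float angle, float r1, float r2, float r3)
{
    const float y         = r1 * height;              // random height on the Y-axis
    const float maxRadius = std::tan(angle) * y;      // cone radius at that height
    const float radius    = maxRadius * std::sqrt(r2); // sqrt for uniform disc density
    const float theta     = r3 * 6.2831853f;          // random angle on the circle
    return { radius * std::cos(theta), y, radius * std::sin(theta) };
}
```

The emission direction can then be taken as the normalized vector from the apex through the sampled point.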

Rendering

Finally, a particle's rendering settings can be modified by changing the mesh, material, billboarding, and sort mode. The mesh can be anything the user desires, simply by dragging and dropping a .fbx file into the window. Materials are created with another editor in our engine, built by Axel Franzén, and can be dragged and dropped just like meshes. Billboarding makes particles always face the camera, and the sort mode changes the order in which particles are rendered.


Instanced rendering is used within each particle system to improve performance, with the help of structured buffers in the GraphicsEngine. While each system is drawn correctly on its own, this does result in transparency issues between different systems. For example, if a gunshot VFX plays from a rifle with a bullet impact behind it, there are no rules defining which should be drawn first. I currently do not know a practical solution that isn't too expensive or complicated. One idea is to also sort particles by their emitter's distance to the camera; however, while it sounds simple, adding another layer of complexity to a sort that is already a bottleneck could hurt performance. This could all be improved using compute shaders (and order-independent transparency), which is something I would love to explore given more time. Luckily, our games likely won't have millions of particles on screen anyway (well, hopefully).
