A ShaderEffect is a QML item that takes a GLSL shader program, allowing applications to render directly on the GPU. Using only property values as input, as with the Canvas in our previous article, we will show how a ShaderEffect can be used to generate a different kind of visual content, with even better performance. We will also see how we can use the fluidity it provides in user interface designs, again taking Google’s Material Design as a concrete example.
The fragment (pixel) shader
This can be a difficult topic, but all you need to know for now is that correctly typed QML properties end up in your shader’s uniform variables of the same name, and that the default vertex shader will output (0, 0) into the qt_TexCoord0 varying variable for the top-left corner and (1, 1) for the bottom-right. Since the vertex shader outputs are interpolated into the fragment shader inputs, each fragment will receive a different qt_TexCoord0 value, ranging from (0, 0) to (1, 1). The fragment shader will rasterize our rectangular geometry by running once for every viewport pixel it intersects, and the output value of gl_FragColor will then be blended onto the window according to its alpha value.
This article won’t be talking about the vertex shader; the default one will do fine in our situation. I also encourage you to eventually read the available tutorials about shaders and the OpenGL pipeline if you want to write your own.
A basic example
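A minimal reconstruction of the example this section describes might look like the following. The exact colors, sizes, and animation timing are guesses; only the animatedColor property, the green-clearing logic, and the disabled blending (mentioned later in this article) come from the text:

```qml
import QtQuick 2.2

ShaderEffect {
    width: 256; height: 256
    // The simple example outputs opaque colors, so blending can be disabled.
    blending: false

    // Any change to this property automatically triggers an update.
    property color animatedColor
    SequentialAnimation on animatedColor {
        loops: Animation.Infinite
        ColorAnimation { to: "cyan"; duration: 1000 }
        ColorAnimation { to: "black"; duration: 1000 }
    }

    fragmentShader: "
        uniform lowp vec4 animatedColor;
        varying highp vec2 qt_TexCoord0;
        void main() {
            lowp vec4 color = animatedColor;
            // Clear the green component on the right half, leaving
            // only blue to be animated in that area.
            if (qt_TexCoord0.x > 0.5)
                color.g = 0.0;
            gl_FragColor = color;
        }"
}
```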
This animates an animatedColor property through a regular QML animation. Any change to that property, through an animation or not, will automatically trigger an update of the ShaderEffect. Our fragment shader code then directly sets that color in its gl_FragColor output, for all fragments. To show something slightly more evolved than a plain rectangle, we clear the green component of some fragments based on their x position within the rectangle, leaving only the blue component to be animated in that area.
Parallel processing and reduced shared states
One of the reasons that graphics hardware can offer so much rendering power is that it offers no way to share or accumulate state between individual fragment draws. Uniform values are shared between all triangles in a GL draw call, and any per-fragment state first has to go through the vertex shader.
In the case of the ShaderEffect, this means that we are limited to qt_TexCoord0 to differentiate pixels. The drawing logic can only be based on that input, using mathematical formulas or texture sampling of an Image or a ShaderEffectSource.
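As a sketch of the texture sampling input, a ShaderEffect can expose an Image through a property and sample it per-fragment; the file name and the desaturation formula here are illustrative:

```qml
import QtQuick 2.2

ShaderEffect {
    width: 256; height: 256
    // Any texture provider works as a sampler2D uniform;
    // an Image is the simplest. "picture.png" is a placeholder.
    property variant source: Image { source: "picture.png" }

    fragmentShader: "
        uniform sampler2D source;
        uniform lowp float qt_Opacity;
        varying highp vec2 qt_TexCoord0;
        void main() {
            // The per-fragment decision is driven only by qt_TexCoord0
            // and the texel sampled at that coordinate.
            lowp vec4 c = texture2D(source, qt_TexCoord0);
            lowp float gray = dot(c.rgb, vec3(0.299, 0.587, 0.114));
            gl_FragColor = vec4(vec3(gray), c.a) * qt_Opacity;
        }"
}
```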
Using it for something useful
Even though this sounds like trying to render something on a graphing calculator, some people achieve incredibly good-looking effects with those limited inputs. Have a look at Shadertoy to see what others are doing with equivalent APIs in WebGL.
Design and implementation
Knowing what we can do with it allows us to figure out ways of using this in GUIs to give smooth and responsive feedback to user interactions. Using Android’s Material Design as a great example, let’s try to implement a variant of their touch feedback visual effect.
This is what the implementation looks like. The rendering is more complicated, but the concept is essentially the same as in the simple example above. The fragment shader will first set the fragment to the hard-coded backgroundColor, calculate whether the current fragment is within our moving circle according to the normTouchPos and animated spread uniforms, and finally apply the ShaderEffect’s opacity through the built-in qt_Opacity uniform.
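The fragment shader for such an effect could look roughly like this. The colors and the plain distance test are simplified stand-ins; a real implementation would also correct the distance for the item’s aspect ratio:

```qml
import QtQuick 2.2

ShaderEffect {
    width: 256; height: 256
    // Normalized (0..1) touch position and animated circle radius,
    // driven from input handlers elsewhere.
    property point normTouchPos
    property real spread: 0.0

    fragmentShader: "
        uniform lowp float qt_Opacity;
        uniform highp vec2 normTouchPos;
        uniform highp float spread;
        varying highp vec2 qt_TexCoord0;
        void main() {
            // Hard-coded background, as described in the text;
            // the actual color values are illustrative.
            lowp vec4 backgroundColor = vec4(0.0, 0.0, 0.0, 0.2);
            lowp vec4 circleColor = vec4(1.0, 1.0, 1.0, 0.2);
            highp float d = distance(qt_TexCoord0, normTouchPos);
            lowp vec4 color = d < spread ? circleColor : backgroundColor;
            gl_FragColor = color * qt_Opacity;
        }"
}
```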
Explicit animation control through start() and stop()
One particularity is that we are controlling Animations manually on input events instead of using states. This gives us more flexibility, letting us stop animations immediately when changing states.
The mighty Animators
Some might have noticed the use of UniformAnimator and OpacityAnimator instead of a general NumberAnimation. The major difference between Animator and PropertyAnimation derived types is that animators won’t report intermediate property values back to QML; the property is only updated once the animation is over. Property bindings or long IO operations on the main thread therefore won’t be able to prevent the render thread from computing the next frame of the animation.
When using ShaderEffects, a UniformAnimator will provide the quickest rendering loop you can get. Once your declaratively prepared animation is initialized by the main thread and sent over to the QtQuick render thread to be processed, the render thread will take care of computing the next animation value in C++ and trigger an update of the scene, telling the GPU to use that new value through OpenGL.
Apart from the possibility of a few delayed animation frames caused by the thread synchronization, Animators will take the same input and behave just like other Animations.
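The wiring could be sketched like this; the uniform name, durations, and input handling are illustrative, but the target/uniform form of UniformAnimator and the explicit stop()/restart() control match what the text describes:

```qml
import QtQuick 2.2

ShaderEffect {
    id: effect
    width: 256; height: 256
    property real spread: 0.0

    fragmentShader: "
        uniform lowp float qt_Opacity;
        uniform highp float spread;
        varying highp vec2 qt_TexCoord0;
        void main() {
            // Placeholder shader: a circle growing from the center.
            lowp float inCircle =
                1.0 - step(spread, distance(qt_TexCoord0, vec2(0.5)));
            gl_FragColor = vec4(0.0, 0.0, 0.0, 0.2) * inCircle * qt_Opacity;
        }"

    // Both animators run on the render thread; the QML property is
    // only written back once the animation is over.
    UniformAnimator {
        id: spreadAnimator
        target: effect; uniform: "spread"
        from: 0.0; to: 0.5; duration: 300
    }
    OpacityAnimator {
        id: fadeOut
        target: effect
        from: 1.0; to: 0.0; duration: 300
    }

    MouseArea {
        anchors.fill: parent
        onPressed: {
            fadeOut.stop()          // cancel any running fade immediately
            effect.opacity = 1.0
            spreadAnimator.restart()
        }
        onReleased: fadeOut.restart()
    }
}
```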
Resource costs and performance
ShaderEffects are often grouped with their resource-hungry sibling, the ShaderEffectSource, but when a ShaderEffect is used alone to generate visual content, as we’re doing here, it has very little overhead. Unlike the Canvas, ShaderEffect instances don’t each own an expensive framebuffer object, so they can be instantiated in large numbers without having to worry about their cost. All instances of a QML Component having the same shader source string will use the same shader program, and all instances sharing the same uniform values will usually be batched in the same draw call. Otherwise, the cost of a ShaderEffect instance is the little memory used by its vertices and the processing they require on the GPU. The complexity of the shader itself is the bottleneck that you are most likely to hit.
Selectively enable blending
Blending requires extra work from the GPU and prevents batching of overlapping items. It also means that the GPU needs to render the fragments of the Items hidden behind, which it could otherwise just ignore using depth testing.
It is enabled by default to make ShaderEffects work out of the box, and it’s up to you to disable it if you know that your shader will always output fully opaque colors. Note that a qt_Opacity < 1.0 will trigger blending automatically, regardless of this property. The simple example above disables it, but our translucent touch feedback effect needs to leave it enabled.
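Disabling it is a one-liner when the shader is known to be opaque; the gradient shader below is just a stand-in for any effect that always writes alpha = 1.0:

```qml
import QtQuick 2.2

ShaderEffect {
    width: 256; height: 256
    // Safe only because this shader always outputs alpha = 1.0.
    blending: false
    fragmentShader: "
        varying highp vec2 qt_TexCoord0;
        void main() {
            gl_FragColor = vec4(qt_TexCoord0, 0.0, 1.0);
        }"
}
```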
Should I use it?
The ShaderEffect is simple and efficient, but in practice you might find that it’s not always possible to do what you want with the default mesh and limited API available through QML.
Also note that using ShaderEffects requires OpenGL. Mesa llvmpipe supports them, and an OpenGL ES 2 shader will ensure compatibility with ANGLE on Windows, but you will need fallback QML code if you want to deploy your application with the QtQuick 2D Renderer.
If you need that kind of performance, you might already want to go a step further: subclass QQuickItem and use your shader program directly through the public scene graph API. It will involve writing more C++ boilerplate code, but in return you get direct access to parts of the OpenGL API. However, even with that goal in mind, the ShaderEffect will initially allow you to write a shader prototype in no time, and you can reuse the shader if you need a more sophisticated wrapper later on.
Try it out
Those animated GIFs aren’t anywhere near 60 FPS, so feel free to copy this code into a QML file (or clone this repository) and load it in qmlscene to experience it properly. Let us know what you think.