How to Practically Ship GLSL Shaders with your C++ Software
OpenGL supports pre-compiled binaries, but not portably. Unlike HLSL, which Microsoft's compiler translates into a standard bytecode format that the driver later converts into the GPU's native instruction set, OpenGL has no such intermediate format. Pre-compiled binaries are good for nothing more than caching compiled GLSL shaders on a single machine to speed up load times, and even then there is no guarantee that the compiled binary will still work after the driver version changes... much less after the actual GPU in the machine changes.
You can always obfuscate your shaders if you are really paranoid. The thing is, unless you are doing something truly one-of-a-kind, nobody is going to care about your shaders, and I mean that genuinely. This industry thrives on openness: all the big players regularly discuss the newest and most interesting techniques at conferences such as GDC, SIGGRAPH, etc. In fact, shaders are so implementation-specific that there is often not much you could gain from reverse-engineering them that you could not get just by listening to one of those conference talks.
If your concern is people modifying your software, then I would suggest implementing a simple hash or checksum test. Many games already do this to prevent cheating; how far you want to take it is up to you. But the bottom line is that binary shaders in OpenGL are meant to reduce shader compile time, not for portable redistribution.
With C++11, you can also use the new raw string literal feature. Put this source code in a separate file named shader.vs:
R"(
#version 420 core
void main(void)
{
    gl_Position = vec4(0.0, 0.0, 0.0, 1.0);
}
)"
and then import it as a string like this:
const std::string vs_source =
#include "shader.vs"
;
The advantage is that it's easy to maintain and debug, and you get correct line numbers in case of errors from the OpenGL shader compiler. And you still don't need to ship separate shaders.
The only disadvantages I can see are the extra R"( and )" lines at the top and bottom of the file, and the slightly strange syntax for getting the string into the C++ code.
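If you would rather not have a separate file at all, the same raw string literal can live directly in a source file; a minimal sketch (file layout and variable name are illustrative):

```cpp
#include <string>

// Shader source embedded directly in the executable via a C++11 raw string
// literal. You lose the standalone file, but keep the readable multi-line
// source with no escaping.
const std::string vs_source = R"(
#version 420 core
void main(void)
{
    gl_Position = vec4(0.0, 0.0, 0.0, 1.0);
}
)";
```

The resulting string is passed to glShaderSource exactly as a file-loaded one would be; the #include trick above is just a way to keep this literal in its own file.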
There is just "store them directly in the executable" or "store them in (a) separate file(s)", with nothing in between. If you want a self-contained executable, putting them into the binary is a good idea. Note that you can add them as resources, or tweak your build system to embed the shader strings from separate development files into source files, which makes development easier (with the possible addition of being able to load the separate files directly in development builds).
Why do you think shipping the shader sources would be a problem? There is simply no other way in the GL. The precompiled binaries are only useful for caching compilation results on the target machine. With the fast advances of GPU technology, changing GPU architectures, and different vendors with totally incompatible ISAs, precompiled shader binaries make no sense at all for distribution.
Note that putting your shader sources in the executable does not "protect" them, even if you encrypt them. A user can still hook into the GL library and intercept the sources you hand to the GL. And the GL debuggers out there do exactly that.
UPDATE 2016
At SIGGRAPH 2016, the OpenGL Architecture Review Board released the GL_ARB_gl_spirv extension. This allows a GL implementation to consume the SPIR-V binary intermediate language. This has some potential benefits:
- Shaders can be pre-"compiled" offline (the final compilation for the target GPU still happens by the driver later). You don't have to ship the shader source code, but only the binary intermediate representation.
- There is one standard compiler frontend (glslang) which does the parsing, so differences between the parsers of different implementations can be eliminated.
- More shader languages can be added, without the need to change the GL implementations.
- It somewhat increases portability to Vulkan.
With that scheme, GL becomes more similar to D3D and Vulkan in this regard. However, it doesn't change the bigger picture. The SPIR-V bytecode can still be intercepted, disassembled and decompiled. It does make reverse engineering a little bit harder, but not by much, actually. In a shader, you usually can't afford extensive obfuscation measures, since they dramatically reduce performance - which is contrary to what shaders are for.
Also keep in mind that this extension is not widely available right now (autumn 2016). And Apple has stopped supporting GL after 4.1, so this extension will probably never come to OSX.
MINOR UPDATE 2017
GL_ARB_gl_spirv is now an official core feature of OpenGL 4.6, so we can expect a growing adoption rate for this feature, but it doesn't change the bigger picture by much.