Hi,
is there a way to manage an emissive material in Spine with MonoGame?
Best
[MonoGame] Emissive material
Please take a look at the SpineEffectNormalmap.fx file and workflow, which uses an additional input texture (the normal map). You could create a similar effect file with an additional emissive color texture which is sampled accordingly, adding the read emissive color to the result color.
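For example, a rough sketch of what the added fragment-shader lines could look like (the sampler and variable names here are made up for illustration, not the actual names used in the Spine shader files):

// Additional emissive texture input (hypothetical name).
sampler2D u_emissiveTexture;

// ... inside the fragment (pixel) shader, after the lit output color
// has been computed as before:
float4 emissiveColor = tex2D(u_emissiveTexture, input.texCoord);
// Add the emissive contribution on top and clamp the result to [0, 1].
output = saturate(output + emissiveColor);
return output;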
Hi Harald,
I just use SpineEffectNormalmap.fx. I tried to understand the shader, but I'm really a noob at this.
I have quickly added the necessary lines to the SpineEffectNormalmap.fx file (and named it SpineEffectNormalmapEmissive.fx). I have not tested it, but basically that should be all that's needed, along with providing the emission map as an additional input on the C# side of the code, just as the normal map was added.
WOOOOWWW thank you very much Harald!!!!
Now I'm with a customer; as soon as I get back to the computer I'll try it immediately.
I tried it: the emissive texture covers the primary texture, but there is no emissive glow effect.
I attach a screenshot.
What does your emissive texture look like? The emissive texture will not act as a light source (which is not trivial, even more so when not using linear color space); it's just an additive texture layer.
I took a screenshot; it is a transparent PNG.
So, is the result right, or should it have the transparency faded like the texture?
Harald, noooooooo I was wrong, I shouldn't have used transparency but a black background! :clap:
Is it not possible to use transparency? I'm only asking for a practical reason.
So if I use a black background it works, but:
- If I use transparency, the texture shows up filled.
- If I use a completely black background it works, but the transparent areas of the primary texture become black.
I attach a second image that shows what happens if I use a fully black background.
I just noticed that the effect file is not using premultiplied alpha but straight alpha textures, so just add the line
emissiveColor.rgb *= emissiveColor.a;
before adding the emissive color to the output.
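In context, the relevant fragment-shader lines would then look roughly like this (again a sketch with assumed names, not the exact shader code):

// Sample the straight-alpha emissive texture.
float4 emissiveColor = tex2D(u_emissiveTexture, input.texCoord);
// Premultiply alpha so fully transparent texels contribute nothing.
emissiveColor.rgb *= emissiveColor.a;
// Add the emissive contribution and clamp the result to [0, 1].
output = saturate(output + emissiveColor);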
Please in the future look at the respective shader code and try to diagnose the situation yourself. Only three lines were added for the emissive effect, and just one line determines how the output color is affected. As a programmer you should be able to figure out what output = saturate(output + emissiveColor); does if you are working on your own game framework.
Harald NUMBER ONE, it all works!!!
You are right about shader programming; shader languages are a topic I still know very little about.
Now that I've seen your code, I understand, and it will help me in future situations. Thanks so much!
Glad to hear it helped, thanks for your kind words! :nerd:
Learning the basics of shader programming can be very worthwhile; you can achieve a lot with little knowledge already. Basically, shader programming boils down to the following:
- You always have one vertex shader and one fragment (pixel) shader.
- The vertex shader is applied for every input vertex of your mesh. One vertex in, one vertex out: it receives a single vertex's set of data (local position, texcoords, vertex color) and outputs a single vertex with a (different) set of data. Any local position data is transformed (by multiplying the local position with the model, view and projection matrices) to what's called clip-space (which is just some simple transformations away from screen space). This position output is required, since the pipeline will automatically rasterize the triangles and then pass each fragment (pixel) to the fragment shader for further processing. The fragment shader's input data values are just the (interpolated) output values of the vertex shader.
- The fragment (pixel) shader then needs to output (return) an rgba color value; if you return float4(1,0,0,1); you receive a red, opaque pixel on screen. The fragment shader can now sample textures at the provided input texture coordinates (which were output by the vertex shader). Since we don't always want fully-bright textures, it can use whatever lighting calculations to determine that the pixel is lit with e.g. intensity 0.5 at the given angle to the light. The light intensity is then combined with the diffuse texture, in our case by simple multiplication. Basically, lighting and shadows are always applied by multiplying a color value with something less than 1.0.
- One thing to note is that color values (sampled texture color, vertex color, etc.) are always normalized float values in the range 0.0-1.0, not the 0-255 integer rgb values of your texture file. To prevent any overflow of the output pixel color value above 1.0, the saturate(..) call was used, which returns the input value clamped to the range [0.0, 1.0]. A minimal example illustrating these points is sketched after this list.
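Here is a minimal, generic HLSL vertex/pixel shader pair along those lines (a simplified sketch with made-up names, omitting the .fx technique/pass declarations, and not the actual Spine runtime shader):

// Combined world-view-projection matrix set from the application side (hypothetical name).
float4x4 u_worldViewProjection;
// Diffuse texture sampler (hypothetical name).
sampler2D u_texture;

struct VertexInput
{
    float4 position : POSITION0;
    float2 texCoord : TEXCOORD0;
    float4 color    : COLOR0;
};

struct VertexOutput
{
    float4 position : POSITION0;
    float2 texCoord : TEXCOORD0;
    float4 color    : COLOR0;
};

// Vertex shader: one vertex in, one vertex out. Transforms the local
// position to clip-space and passes texcoords and vertex color through.
VertexOutput VertexShaderMain(VertexInput input)
{
    VertexOutput output;
    output.position = mul(input.position, u_worldViewProjection);
    output.texCoord = input.texCoord;
    output.color = input.color;
    return output;
}

// Fragment (pixel) shader: receives the interpolated vertex-shader outputs
// and returns a single rgba color value for this pixel.
float4 PixelShaderMain(VertexOutput input) : COLOR0
{
    // Sample the diffuse texture and modulate it by the (normalized 0.0-1.0) vertex color.
    float4 texColor = tex2D(u_texture, input.texCoord);
    return texColor * input.color;
}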
The following intrinsic functions are available in HLSL:
https://learn.microsoft.com/en-us/windows/win32/direct3dhlsl/dx-graphics-hlsl-intrinsic-functions
Harald for president! Thank you for this very useful info.
You're welcome, hope it helps.