threejs, I have questions


We are using three.js and need more control over the shader (material). Just supplying the parameters is not enough, because it prevents us from changing the material at runtime. What we would like is to be able to use our own material.

Looking at the code on GitHub has resulted in more questions than answers. Perhaps someone can help explain.

Q: The SkeletonMeshMaterial has two shaders (very simple shaders that basically just paint). Why isn't Spine using the GPU to do the calculations and gain a performance increase?

Q: The MeshBatcher is where the material is instantiated (in the constructor, using the materialCustomizer parameter). Since it is actually creating the exact same material every time, why isn't the material shared between meshes? Since you are not doing the Spine calculations on the GPU, I see no reason why you don't just create one material and share it between the meshes.

Q: The alphaTest is initialized to 0.5 in the SkeletonMeshMaterial, but the shaders used don't use the value. What gives?

This all leads to my primary question: how can I add our own shader?

orgCrisium wrote

Why isn't the spine using the gpu to do the calculations and gain performance increase?

Spine does a lot that would be difficult to do on the GPU. We have a lot more flexibility in what we can do by not being limited by the GPU. With 2D there are far fewer transformations than with 3D.

orgCrisium wrote

why isn't the material shared between meshes?

It looks like you are right that it could be. We'll look into improving it. To be honest, I think not a lot of people use Spine with three.js, so it doesn't see as much attention as some of the other Spine Runtimes.

orgCrisium wrote

The alphaTest is initialized to 0.5 in the SkeletonMeshMaterial, but the shaders used do not use the value?

Not sure, maybe it's left over from development.

orgCrisium wrote

how can I add our own shader?

Mario is the primary developer for the three.js runtime, but I can try to help until he comes along. It looks like when you create a SkeletonMesh you can specify a SkeletonMeshMaterialParametersCustomizer. When the material is created, your customizer is given a chance to change the default ShaderMaterialParameters, so you can set your own shader code, etc.
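To make that concrete, here is a minimal sketch of what using the customizer could look like. It assumes the constructor is exposed as spine.threejs.SkeletonMesh and takes the customizer as its second argument, and that skeletonData, scene, and myFragmentShaderSource already exist in your code; check this against the runtime version you are actually using.

const skeletonMesh = new spine.threejs.SkeletonMesh(skeletonData, (parameters) => {
    // parameters is the THREE.ShaderMaterialParameters object used to build the
    // SkeletonMeshMaterial, so anything set here ends up in the created material.
    parameters.side = THREE.FrontSide;                  // override the double-sided default
    parameters.alphaTest = 0.0;                         // or whatever your z-buffer setup needs
    parameters.fragmentShader = myFragmentShaderSource; // your own shader code
});
scene.add(skeletonMesh);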

Difficult does not mean impossible; think of it as a challenge 🙂

Perhaps a lot of people are not using Spine in conjunction with three.js because of stuff like this.

The alphaTest is something that is needed when using the z-buffer, but if I can get control of the material then I can handle this myself.

Your last comment is actually the problem: supplying only the parameters means we can only apply our changes at startup. We want to be able to change the material at runtime, which we can only do by having access to the material (or by being allowed to create our own material).


I would also like to add the following quirk about three.js and Spine:

Spine forces the three.js mesh object to be double-sided; it should be front-facing. I have checked this and it seems you are winding the triangles for the quads in the wrong direction (which probably explains why you forced it to be double-sided, which is wrong).

Nate already answered the "why no GPU side skinning" question, so let me elaborate on the others.

The alphaTest value isn't a parameter consumed by the shaders, but by the blend stage. It's responsible for actually letting us do transparency.

Using double sided is also on purpose, as flipping the skeleton would otherwise result in incorrect rendering.

Both these things, as well as the vertex and fragment shaders, and any other THREE.ShaderMaterialParameter, can be modified via the customizer.

Now, being able to set just any THREE.Material isn't trivial. We can't anticipate the vertex attributes of your material, which is why we went with our approach. I'm happy to consider a better solution to what we have, but due to the reason mentioned above, I can't think of anything better.

Finally, you are right that materials aren't shared. But again, the reason for this is how threeJS does materials. Instead of separating shaders and textures, threeJS unites them via THREE.Material (and transitively THREE.ShaderMaterial). Spine skeletons may share the same vertex and fragment shaders, but they might not share the same texture. As such, each skeleton has its own material instance(s).

Now, we aren't the experts on threeJS and I'm sure much more experienced folks can think of a better solution. I'd be grateful if you can make concrete suggestions on how to restructure things so that plugging in custom materials as well as sharing materials across SkeletonMesh instances is more in line with your needs.

Mario wrote

Nate already answered the "why no GPU side skinning" question, so let me elaborate on the others.

I don't think Nate's answer is valid for many reasons.

Of course a third dimension (or even more) would always make the computation greater, but that is only a small part of it. The cost is also affected by the amount of computation: more Spine objects means more computations, which again argues for doing it on the GPU.

JavaScript is single-threaded, which means your code blocks JavaScript execution until your calculations are done. If the computations were done on the GPU, they would run asynchronously and would not stall the CPU.

The alphaTest value isn't a parameter consumed by the shaders, but by the blend stage. It's responsible for actually letting us do transparency.

This is wrong. The blend stage has nothing to do with the alphaTest value. The alphaTest value is there to keep the fragment shader from filling the z-buffer where the alpha value is rejected by the alphaTest comparison. In three.js I don't think there is a configurable comparison function, so the logic is usually hardcoded in the fragment shader with something like:

if (alpha < alphaTest) {
    discard;
}

This prevents the transparent parts of the image from writing values into the z-buffer.

Using double sided is also on purpose, as flipping the skeleton would otherwise result in incorrect rendering.

This is not a valid reason. You are forcing double-sided rendering on everybody. This is a decision the user must make depending on what they want to achieve. And as I stated, your geometry is wound in the wrong direction!

Both these things, as well as the vertex and fragment shaders, and any other THREE.ShaderMaterialParameter, can be modified via the customizer.

This is only half the story. You can only modify at creation time, not after! I have no way of retrieving the material that was created. When I say no way of retrieving it, I am lying a little bit: we use TypeScript, just as you do, and TypeScript prevents us from accessing the material, but it all becomes JavaScript, and in JavaScript you can basically access anything. So I circumvent your material code, swap out your texture creation with our own, and at cleanup time I swap them back. This works, but it is definitely what I call a hack.
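For completeness, here is roughly what that workaround looks like, stripped down. The internal property names (batches, material) are assumptions about the runtime's internals, and ourCustomMaterial is our own THREE.ShaderMaterial; this is the hack, not a supported API.

// Widen to any so TypeScript lets us reach the internals it would otherwise hide.
const internals = skeletonMesh as any;
for (const batch of internals.batches) {
    batch.originalMaterial = batch.material; // remember the runtime's own material
    batch.material = ourCustomMaterial;      // swap in our material for rendering
}
// ...and at cleanup time we swap originalMaterial back in before disposing.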

Now, being able to set just any THREE.Material isn't trivial. We can't anticipate the vertex attributes of your material, which is why we went with our approach. I'm happy to consider a better solution to what we have, but due to the reason mentioned above, I can't think of anything better.

Ok, I don't know what you are getting at here, since you are not using the GPU to do the calculations. That means all the GPU attributes are the standard attributes needed to render an image (nothing amazing about that). This is also the reason I can use my own shader by circumventing yours.

Finally, you are right that materials aren't shared. But again, the reason for this is how threeJS does materials. Instead of separating shaders and textures, threeJS unites them via THREE.Material (and transitively THREE.ShaderMaterial). Spine skeletons may share the same vertex and fragment shaders, but they might not share the same texture. As such, each skeleton has its own material instance(s).

I was under the assumption that the textures were combined into a texture atlas, which would make your statement invalid, but if what you are saying is true then I understand this part. Just so we are clear, a Material doesn't unite textures; the material has a uniform that is bound to a texture. But if you have multiple textures then you are correct, a material would need to be created for each one. Still, there seem to be cases where that isn't necessary; using our project as an example, I could create one material and apply it to all batch meshes without any problems.

Now, we aren't the experts on threeJS and I'm sure much more experienced folks can think of a better solution. I'd be grateful if you can make concrete suggestions on how to restructure things so that plugging in custom materials as well as sharing materials across SkeletonMesh instances is more in line with your needs.

I wouldn't say you are not an expert; what you have written here clearly shows that you are at an expert level.

orgCrisium wrote

JavaScript is single-threaded, which means your code blocks JavaScript execution until your calculations are done. If the computations were done on the GPU, they would run asynchronously and would not stall the CPU.

Yes, we are aware of all that. However, GPU-side skinning puts limits on the number of bones that can influence a mesh, due to the available number of uniforms through which we can push the bone matrices. Spine also allows switching between meshes for a single slot as part of an animation, which again changes the bone matrices that have to be uploaded. Taken together, we end up with a lot more batches submitted to the GPU than if we calculate all vertex data on the CPU and upload it once (caveat: different blend modes/textures inside the same skeleton). We've benchmarked this and found it not to be an issue.
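To put a rough number on the uniform limit (an illustration based on WebGL's minimum guarantees, not a measurement of this runtime): the spec only guarantees MAX_VERTEX_UNIFORM_VECTORS >= 128 vec4s, and a 4x4 bone matrix occupies 4 of them, so a naive mat4-per-bone scheme tops out around 128 / 4 = 32 bones per draw call, before even counting the projection matrix and other uniforms.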

This is wrong. The blend stage has nothing to do with the alphaTest value. The alphaTest value is there to keep the fragment shader from filling the z-buffer where the alpha value is rejected by the alphaTest comparison. In three.js I don't think there is a configurable comparison function, so the logic is usually hardcoded in the fragment shader with something like:

if (alpha < alphaTest) {
    discard;
}

This prevents the transparent parts of the image from writing values into the z-buffer.

This is correct according to the threeJS docs, and according to the behaviour of rendering when unsetting this value:
https://threejs.org/docs/#api/en/materials/Material.alphaTest

Using double sided is also on purpose, as flipping the skeleton would otherwise result in incorrect rendering.

This is not a valid reason. You are forcing double-sided rendering on everybody. This is a decision the user must make depending on what they want to achieve. And as I stated, your geometry is wound in the wrong direction!

No, it also enables lighting on both sides of the skeleton, a use case users have. We opted to make this the default, and it can be customized by modifying the doubleSided and side attributes of the material parameters via the customizer. The winding of the vertices is the same in all our runtimes.

That means all the GPU attributes are the standard attributes needed to render an image (nothing amazing about that). This is also the reason I can use my own shader by circumventing yours.

Yes, we use standard attribute names, which will resolve correctly for many of the materials that come with threeJS. But that is not necessarily true for custom materials; for example, we lack normal attributes.

... textures and materials ...

So the problem with all of this is as follows. Here's how the whole thing is set up (a short sketch of the per-frame flow follows the list):

  • A Spine skeleton contains 0 or more attachments (==meshes).
  • Each attachment references one texture page from a texture atlas.
  • Different attachments in the same skeleton can reference different texture pages. Their materials will not be compatible.
  • Different attachments in the same skeleton can have different blend modes. Their materials will not be compatible.
  • Each of the attachments is ultimately converted to a MeshBatcher, which is really a THREE.Mesh attached to the SkeletonMesh (which is a THREE.Object3D), in SkeletonMesh.updateGeometry().
  • Subsequent attachments of a skeleton all go into the same MeshBatcher if the texture and blend mode they need is the same, and there's enough space in that MeshBatcher in terms of vertices/indices.
  • Every frame, you call SkeletonMesh.update(), which will advance the animation and reconstruct the MeshBatchers based on the latest animation data.
  • Then, you tell threeJS to render all meshes of all THREE.Object3Ds in the scene. A single SkeletonMesh will contribute 1 or more MeshBatchers for rendering, each representing one or more attachments.
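Put together, the per-frame flow from the list above boils down to something like this (a minimal sketch; clock, scene, camera, renderer, and skeletonMesh are your own three.js/Spine objects set up elsewhere):

function renderFrame() {
    requestAnimationFrame(renderFrame);
    const delta = clock.getDelta();
    skeletonMesh.update(delta);     // advance the animation and rebuild the MeshBatchers
    renderer.render(scene, camera); // three.js draws every MeshBatcher of every SkeletonMesh
}
renderFrame();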

Now, if we use the same Material for all MeshBatcher instances, we'll run into issues as soon as the texture or blend mode between two attachments of a skeleton differs. Imagine a skeleton for a character. Its limbs, torso, and head attachments reference one texture page. However, it also has an attachment for a halo effect on top of its head, which ended up on a different texture page in the texture atlas. The limbs, torso, and head attachments will thus go into one MeshBatcher, and the halo into another.

Now, if those two MeshBatcher instances shared the same Material, then the texture assigned in the uniform from the first batcher would get overwritten by the texture assignment for the second batcher.

When threeJS then starts rendering the batches, it will render the first batch with the wrong texture.

And this is why we can't share materials, neither within batches of the same skeleton, nor across skeletons. If you know that in your specific application, all skeletons and their attachments share the exact same texture, then yes, you can use a single material for everything. But we can not anticipate this in our code, nor check for it in a smart way. Instead, we need to instantiate our own Material inside SkeletonMesh as we run through the attachments of the skeleton, checking if the previous and current attachments are compatible. If they are, great, then all the attachments in your SkeletonMesh may end up in a single batch, with a single material. If they differ, you get two or more batches, with as many material instances.
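As a tiny illustration of that failure mode (using THREE.MeshBasicMaterial just to keep it short, and made-up names for the two batches and texture pages):

const shared = new THREE.MeshBasicMaterial();
bodyBatcher.material = shared;
haloBatcher.material = shared;
shared.map = bodyPageTexture; // intended for the body batch
shared.map = haloPageTexture; // overwrites it: both batches now sample the halo page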

This also explains why we use the customizer instead of letting you specify your own material. We have to instantiate materials internally on the fly. If you provide us with a material, we can't simply clone it (at least there is no such deep copying method on THREE.Material).

Now, having said all that, what I can offer is the following. The constructor of SkeletonMesh could take either a SkeletonMeshMaterialParametersCustomizer or a THREE.Material. If you provide nothing or a customizer, then the logic described above happens, with us instantiating SkeletonMeshMaterials as needed. If you provide your own THREE.Material, we'll simply set it on all batches and trust that you know what you are doing.
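In code, the proposal would amount to something like this at the call site (purely hypothetical usage, illustrating the offer rather than an existing API):

// Current behaviour: pass a customizer (or nothing) and let the runtime create materials.
const meshA = new spine.threejs.SkeletonMesh(skeletonData, customizer);
// Proposed: pass a THREE.Material and have it set on every batch as-is.
const meshB = new spine.threejs.SkeletonMesh(skeletonData, ourMaterial);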

How does that sound?

Hi again,

Thanks for the detailed explanation.

This is wrong. The blend stage has nothing to do with the alphaTest value. The alphaTest value is there to keep the fragment shader from filling the z-buffer where the alpha value is rejected by the alphaTest comparison. In three.js I don't think there is a configurable comparison function, so the logic is usually hardcoded in the fragment shader with something like:

if (alpha < alphaTest) {
    discard;
}

This prevents the transparent parts of the image from writing values into the z-buffer.

This is correct according to the threeJS docs, and according to the behaviour of rendering when unsetting this value:
https://threejs.org/docs/#api/en/materials/Material.alphaTest

The wording in the docs is not good, but as it states, there is a built-in alpha test (the comparison is < alphaTest), and the material is not applied for those fragments; they get rejected (discarded) by the shader. Please see three.js' own code here:
https://github.com/mrdoob/three.js/blob/dev/src/renderers/shaders/ShaderChunk/alphatest_fragment.glsl.js

About your solution: it would help, but at the same time it can cause problems if the artists start using different blend modes and textures, as you wrote. Since assuming a common texture can actually be bad and there is no way to catch it in the code, I would say it would be a bad option to add that solution.

I am going on vacation from tomorrow and will look into this further when I get back in 14 days.

Thank you very much for your comments.

Peter