I'm currently working on a Vertex Animation Texture (VAT) pipeline for a Three.js project. A custom vertex shader reads one horizontal pixel per vertex in my geometry (e.g. pixel (0, 0) for vertex id 0, (1, 0) for vertex id 1, etc.) and translates the vertex by that pixel's RGB value (RGB -> XYZ).
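For context, the CPU side of that mapping can be sketched like this (a minimal sketch; `vertexCount`, `writeOffset`, and the values are assumptions for illustration, not my actual bake code):

```javascript
// Sketch: baking one row of VAT data, one RGBA texel per vertex.
// RGB stores the XYZ translation for that vertex id; A is unused here.
const vertexCount = 4;
const data = new Float32Array(vertexCount * 4); // 4 floats (RGBA) per texel

// Hypothetical helper: write the offset for one vertex id into its texel.
function writeOffset(vertexId, x, y, z) {
  const i = vertexId * 4;
  data[i + 0] = x; // R -> X
  data[i + 1] = y; // G -> Y
  data[i + 2] = z; // B -> Z
  data[i + 3] = 1; // A unused
}

writeOffset(0, 0.25, 0.5, 0.0); // vertex 0 translated by (0.25, 0.5, 0)

// In Three.js this array would back a DataTexture of size vertexCount x 1,
// e.g. new THREE.DataTexture(data, vertexCount, 1, THREE.RGBAFormat, THREE.FloatType)
// with NearestFilter so adjacent vertex texels are never blended.
```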
uniform sampler2D animationAtlas;
uniform vec2 atlasSize;

void main() {
  float vertex_id = float(gl_VertexID);

  // Sample the center of texel (vertex_id, 0) in the first row.
  vec3 anim_data = texture2D(
    animationAtlas,
    vec2((vertex_id + 0.5) / atlasSize.x, 0.5)
  ).rgb;

  vec4 world_position = modelMatrix * vec4(position + anim_data, 1.0);
  // world_position already includes modelMatrix, so multiply by viewMatrix
  // here; modelViewMatrix would apply the model transform a second time.
  gl_Position = projectionMatrix * viewMatrix * world_position;
}
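The UV the shader builds can be replicated on the CPU to see why the `+ 0.5` matters: it targets the texel center, which only resolves to exactly one pixel if the texture uses nearest-neighbor filtering (the `atlasWidth` value below is an assumption for illustration):

```javascript
// Sketch: the shader's U coordinate, computed on the CPU.
// (id + 0.5) / width lands on the center of texel `id`.
const atlasWidth = 8;
const u = id => (id + 0.5) / atlasWidth;

console.log(u(0)); // 0.0625 -> center of pixel 0
console.log(u(7)); // 0.9375 -> center of pixel 7
```

With LinearFilter instead of NearestFilter, a lookup even slightly off a texel center would blend two vertices' offsets.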
Right now, with the code above, vertices are being translated according to the texture. The issue is that some of them don't end up at their desired positions, and the translations seem to open holes between faces (move the camera around in the fiddle and notice the black lines).
Things I verified:
The vertex ids in the .gltf match the horizontal pixels of my VAT: if I move one vertex via the texture, the correct vertex moves.
The normals on my vertices all point towards z-forward.
There are no holes in my model.
If I understand correctly, the GPU/shader does not read each vertex individually, but as triangles? Is the issue that the translation is applied every time the vertex appears in a triangle? That would kind of make sense, because the most problematic vertices seem to be the central ones (the ones that share the most triangles).
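A related thing worth checking (a sketch of the idea, with made-up data): if the exporter de-indexes the mesh, shared vertices get duplicated per triangle corner, and `gl_VertexID` then runs over the duplicated list rather than the original vertex ids the VAT was baked against.

```javascript
// Sketch: indexed vs de-indexed geometry and what it does to vertex ids.
// A quad as two triangles sharing vertices 0 and 2.
const positions = [
  [0, 0, 0], [1, 0, 0], [1, 1, 0], [0, 1, 0] // 4 logical vertices
];
const indices = [0, 1, 2, 0, 2, 3];

// De-indexing (what some export paths do) expands to one vertex per corner.
const deindexed = indices.map(i => positions[i]);

console.log(positions.length); // 4
console.log(deindexed.length); // 6
// After de-indexing, gl_VertexID runs 0..5 and no longer matches the
// original ids 0..3, so a VAT baked against the indexed mesh samples
// the wrong texels for the duplicated corners.
```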
Run the Snippet (Orbit controls are enabled)