Manually set UV texture coordinates on model

Is it possible to manually set/pass UV texture coordinates to a model that was loaded from a glb?

For example,

var entity = viewer.entities.add({
  name: 'mesh',
  position: position,
  orientation: orientation,
  model: {
    uri: 'model.glb',
    scale: 1.0,
  }
});

Then, let’s say I have arrays of UV texture coordinates that I want to update over time. Is there something I can set on entity.model to pass these coordinates for mapping an image as a texture?

I presume I will have to create a material from the image I have for the texture. Is that correct?

Welcome to the Cesium community @banesullivan!

I don’t think this is exposed in the public API. Is your goal to generate UVs dynamically? Can you explain more about your use case?

Yep, I need to be able to update the UVs on the fly, since I have custom code to compute them dynamically. So in Cesium, I just need a way to adjust the texture coordinates (UVs) as they change.

My use case:

I have the following data:

  • a static mesh (glb) with no UVs (if there is a better format I should use, please let me know; I am converting PLY to glb)
  • a series of images (JPEG/PNG frames from a video)
  • a series of camera models corresponding to each image that allow me to compute the texture coordinates (UVs) of the mesh on the fly. I have custom code to do this computation and generate the UVs; I am just not sure how to do two things (see below)

The data are from an aerial video where we map the video onto the mesh to show it in “real time”. I have all of the texture mapping implemented; I am just unsure how to dynamically adjust the texture coordinates (UVs) of the model in Cesium.

The two things I need from Cesium:

  1. Be able to access the XYZ coordinates of the model
  2. Dynamically set the UV coordinates

The data are from an aerial video where we map the video to the mesh to show it in “real time”

Ah yes, I remember discussing this a bit over email now!

I’m not sure that going through glTF models is going to be the easiest way. In a glTF model the texture coordinates are embedded in a buffer, and one of the accessors defines the byte offset and how to extract them from that buffer. This part happens here:

You could add this check here, for debugging, to skip uploading the texture coordinates:

        // Skip uploading the UVs so the model renders without them (debugging only)
        if (attributeName === 'TEXCOORD_0') {
          return;
        }

And confirm that it works. You could extend the model API to allow passing in your own texture coordinates here, but that would only work when the model is first created. To update them afterwards you would need to re-create these vertex arrays and re-upload them to the GPU, and I’m not sure there’s an easy mechanism for that.

What do the meshes you generate look like? If they’re not very complicated, I wonder if an easier path would be to use custom geometry. You could base it off of the PlaneGeometry (https://github.com/CesiumGS/cesium/blob/master/Source/Core/PlaneGeometry.js) class, for example (you can see where it generates the texCoords there). You’d still need to handle re-uploading to the GPU, but that could be accomplished just by recreating the primitive; this is already how dynamic polygons work in the Entity API in CesiumJS, where the geometry is recreated every frame it changes. That should be fast enough to do every frame.
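
To make that concrete, here is a minimal sketch of that approach. It assumes positions, uvs, and indices are flat typed arrays you already have (XYZ values, per-vertex UVs, and triangle indices) and frameUrl is the image for the current video frame; all of those names are placeholders, not part of any existing API:

function createTexturedPrimitive(positions, uvs, indices, frameUrl) {
  // Build a Geometry with a position attribute and an st (UV) attribute
  var geometry = new Cesium.Geometry({
    attributes: {
      position: new Cesium.GeometryAttribute({
        componentDatatype: Cesium.ComponentDatatype.DOUBLE,
        componentsPerAttribute: 3,
        values: positions,
      }),
      st: new Cesium.GeometryAttribute({
        componentDatatype: Cesium.ComponentDatatype.FLOAT,
        componentsPerAttribute: 2,
        values: uvs, // your dynamically computed UVs
      }),
    },
    indices: indices,
    primitiveType: Cesium.PrimitiveType.TRIANGLES,
    boundingSphere: Cesium.BoundingSphere.fromVertices(positions),
  });
  // The textured appearance expects normals, so derive them from the triangles
  Cesium.GeometryPipeline.computeNormal(geometry);

  return new Cesium.Primitive({
    geometryInstances: new Cesium.GeometryInstance({ geometry: geometry }),
    appearance: new Cesium.MaterialAppearance({
      material: Cesium.Material.fromType('Image', { image: frameUrl }),
      translucent: false,
    }),
    asynchronous: false, // build synchronously so it can be rebuilt every frame
  });
}

Then, each time you compute a new set of UVs (or switch to a new frame), remove the old primitive and add a fresh one:

viewer.scene.primitives.remove(currentPrimitive);
currentPrimitive = viewer.scene.primitives.add(
  createTexturedPrimitive(positions, newUvs, indices, frameUrl)
);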

This approach may be simpler than the glTF route, since there is a lot less internal engine code to dig through.

@banesullivan For this use case, I think a custom primitive may be suitable. The idea is that in Cesium a Scene object owns a list of Primitives. Each primitive has an update(frameState) method that gets called every frame. The role of the update(frameState) method is to provide a list of DrawCommands to the FrameState object. Each DrawCommand holds the ShaderProgram, VertexArray, UniformMap, Textures, and other GPU resources that are necessary to render the primitive to the framebuffer.

With this, you can parse all the attributes of the static mesh, upload them to a WebGL VertexArray, and create the shader accordingly at the beginning. Then, in the update() method, you can upload the UV array to the UV buffer each frame. Finally, assign all those GPU resources to the DrawCommand and add the command to the FrameState object. This Cesium blog post has more info about the process of creating a custom primitive for the engine, along with a code sample. Hopefully that helps. Please let me know if you have any questions.
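
As a very rough sketch of that structure (all names are placeholders; Cesium.DrawCommand, Cesium.Buffer, Cesium.VertexArray, Cesium.ShaderProgram, etc. are internal, undocumented classes, and the shaders use the older WebGL1-style GLSL, so treat this as a starting point rather than a recipe):

function DynamicUvPrimitive(positions, uvs, indices, image) {
  this._positions = positions; // Float32Array of XYZ triples
  this._uvs = uvs; // Float32Array of UV pairs
  this._indices = indices; // Uint16Array of triangle indices
  this._image = image; // HTMLImageElement (or canvas/video) for the current frame
  this._uvBuffer = undefined;
  this._command = undefined;
  this._uvsDirty = false;
  this._boundingSphere = Cesium.BoundingSphere.fromVertices(positions);
}

// Call this whenever new UVs have been computed; they are uploaded on the next frame
DynamicUvPrimitive.prototype.setUvs = function (uvs) {
  this._uvs = uvs;
  this._uvsDirty = true;
};

DynamicUvPrimitive.prototype.update = function (frameState) {
  var context = frameState.context;

  if (!Cesium.defined(this._command)) {
    // First frame: upload the static mesh and build the draw command
    var positionBuffer = Cesium.Buffer.createVertexBuffer({
      context: context,
      typedArray: this._positions,
      usage: Cesium.BufferUsage.STATIC_DRAW,
    });
    this._uvBuffer = Cesium.Buffer.createVertexBuffer({
      context: context,
      typedArray: this._uvs,
      usage: Cesium.BufferUsage.DYNAMIC_DRAW, // re-uploaded whenever the UVs change
    });
    var indexBuffer = Cesium.Buffer.createIndexBuffer({
      context: context,
      typedArray: this._indices,
      usage: Cesium.BufferUsage.STATIC_DRAW,
      indexDatatype: Cesium.IndexDatatype.UNSIGNED_SHORT,
    });
    var vertexArray = new Cesium.VertexArray({
      context: context,
      attributes: [
        { index: 0, vertexBuffer: positionBuffer, componentsPerAttribute: 3, componentDatatype: Cesium.ComponentDatatype.FLOAT },
        { index: 1, vertexBuffer: this._uvBuffer, componentsPerAttribute: 2, componentDatatype: Cesium.ComponentDatatype.FLOAT },
      ],
      indexBuffer: indexBuffer,
    });
    var shaderProgram = Cesium.ShaderProgram.fromCache({
      context: context,
      vertexShaderSource:
        'attribute vec3 position;\n' +
        'attribute vec2 st;\n' +
        'varying vec2 v_st;\n' +
        'void main() { v_st = st; gl_Position = czm_modelViewProjection * vec4(position, 1.0); }',
      fragmentShaderSource:
        'uniform sampler2D u_image;\n' +
        'varying vec2 v_st;\n' +
        'void main() { gl_FragColor = texture2D(u_image, v_st); }',
      attributeLocations: { position: 0, st: 1 },
    });
    var texture = new Cesium.Texture({ context: context, source: this._image });
    this._command = new Cesium.DrawCommand({
      vertexArray: vertexArray,
      shaderProgram: shaderProgram,
      uniformMap: { u_image: function () { return texture; } },
      renderState: Cesium.RenderState.fromCache({ depthTest: { enabled: true } }),
      primitiveType: Cesium.PrimitiveType.TRIANGLES,
      pass: Cesium.Pass.OPAQUE,
      modelMatrix: Cesium.Matrix4.IDENTITY,
      boundingVolume: this._boundingSphere,
    });
  } else if (this._uvsDirty) {
    // Upload the latest UVs into the existing GPU buffer
    this._uvBuffer.copyFromArrayView(this._uvs, 0);
    this._uvsDirty = false;
  }

  frameState.commandList.push(this._command);
};

You would hook it in with viewer.scene.primitives.add(new DynamicUvPrimitive(positions, uvs, indices, image)) and call setUvs() whenever your camera-model code produces new coordinates. A real implementation would also need isDestroyed()/destroy() methods to release the GPU resources, and you would want to handle precision (for example a local model matrix) rather than passing raw ECEF positions as 32-bit floats.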
