Hi. I’m having an issue where a DeveloperError arises on Firefox, and occasionally on Safari, but generally not on Chrome. It appears to be sensitive to how quickly tiles are rendered. The error is:
DeveloperError: Expected width to be greater than 0, actual value was 0
The stack trace isn’t very helpful, except that it contains a large number of FrameRequestCallback entries, suggesting that this is happening as a result of some chain of frames not being rendered on time. That tracks with the fact that I have my own ImageryProvider class, which does a bit of postprocessing on imagery data returned from the server. This postprocessing instantiates a separate WebGL2 instance. If I turn the postprocessing off, the error goes away, but I’m left without the processing I need.
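For reference, the postprocessing path looks roughly like this (heavily simplified; the function name and details are illustrative, not my actual code):

```javascript
// Heavily simplified sketch of the per-tile postprocessing path.
// Note that a fresh canvas and WebGL2 context are created on every call.
function postprocessTile(imageBitmap) {
  const canvas = document.createElement("canvas");
  canvas.width = imageBitmap.width;
  canvas.height = imageBitmap.height;
  // A new WebGL2 context per tile:
  const gl = canvas.getContext("webgl2");
  if (gl === null) {
    // If the browser can no longer hand out contexts, this is null.
    throw new Error("Could not create WebGL2 context");
  }
  // ... upload imageBitmap as a texture, run the shader pass, read back ...
  return canvas;
}
```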
Are there any steps I can take to prevent this from happening? Or at least, can I get some hints as to where to look deeper?
Thanks!
Hi @smarimc,
Offhand, it sounds like a texture of width 0 is somehow being created. It’s difficult to say exactly why this is happening without more context.
Is there any way you could create a Sandcastle example that reproduces the issue? That would be helpful for troubleshooting.
The DeveloperError is very generic, and it’s hard to say for sure what is causing the error in this case. When you say
This postprocessing instantiates a separate WebGL2 instance.
then one rather anecdotal observation: I’ve seen cases where this error (with this exact message) appeared when the system “ran out of resources” - i.e. when it ran out of GPU memory or some form of ‘handles’ - and could no longer create new contexts at some point.
Could it be that you are creating a new instance in each rendered frame, and not properly disposing the resources that are allocated in this instance?
If you are creating a new instance each frame, that could well be the cause of the error. For details on how to properly dispose of such an instance, I’d have to point to things like the Stack Overflow question “How do I clean up and unload a WebGL canvas context from GPU after use?”, or hope that someone else can chime in here.
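As a rough, untested sketch of what I mean (function names here are mine, not a Cesium API): rather than creating a context per tile, keep one offscreen context alive and reuse it; and when you truly need to tear a context down, delete its GPU resources explicitly and then force context loss via the WEBGL_lose_context extension so the browser can reclaim the handle immediately instead of waiting for garbage collection:

```javascript
// Sketch: reuse a single offscreen WebGL2 context for all postprocessing
// (assumes a browser environment; browsers cap the number of live contexts).
let sharedCanvas = null;
let sharedGL = null;

function getPostprocessContext() {
  if (sharedGL === null) {
    sharedCanvas = document.createElement("canvas");
    sharedGL = sharedCanvas.getContext("webgl2");
  }
  return sharedGL;
}

// Explicit teardown for a context you no longer need: delete the GPU
// resources you allocated, then ask the browser to lose the context.
function disposeContext(gl, resources = {}) {
  (resources.textures || []).forEach((t) => gl.deleteTexture(t));
  (resources.buffers || []).forEach((b) => gl.deleteBuffer(b));
  (resources.programs || []).forEach((p) => gl.deleteProgram(p));
  const ext = gl.getExtension("WEBGL_lose_context");
  if (ext !== null) {
    ext.loseContext();
  }
}
```

Reusing one context avoids the per-frame allocation entirely, which is usually the cleaner fix; the explicit disposal path is mainly for one-off contexts.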