I came up with a partial solution that measures the distance from the camera to each label and uses that both to generate a scale for the label and to determine how far apart labels should be. Labels that overlap are hidden. Zooming in results in more labels being displayed, as long as they don’t intersect at the new camera distance. Then, past a certain threshold, all labels are shown. It seems to work for now, but I would like to revisit this in the future, possibly with some kind of grouping based on location or some kind of force repulsion between labels. I wasn’t sure of a good way to do either of those things.
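A rough sketch of the approach described above, using plain objects instead of Cesium types. The field names (`screenX`, `distance`, etc.), the scale formula, and the thresholds are illustrative assumptions, not Cesium API:

```javascript
// Scale a label down as it gets farther from the camera
// (assumed linear falloff between near and far distances).
function scaleForDistance(distance, nearDist = 100, farDist = 10000) {
  const t = (distance - nearDist) / (farDist - nearDist);
  return Math.max(0.5, Math.min(1.0, 1.0 - 0.5 * t));
}

// Axis-aligned screen-space bounding box for a label at its current scale.
function labelRect(label) {
  const s = scaleForDistance(label.distance);
  return { x: label.screenX, y: label.screenY, w: label.width * s, h: label.height * s };
}

function rectsOverlap(a, b) {
  return a.x < b.x + b.w && b.x < a.x + a.w &&
         a.y < b.y + b.h && b.y < a.y + a.h;
}

// Greedy declutter: nearest labels win; anything overlapping an
// already-placed label is hidden. Once every label is closer than
// showAllThreshold, all labels are shown unconditionally.
function declutter(labels, showAllThreshold = 200) {
  if (labels.every((l) => l.distance < showAllThreshold)) {
    labels.forEach((l) => { l.show = true; });
    return labels;
  }
  const placed = [];
  const sorted = [...labels].sort((a, b) => a.distance - b.distance);
  for (const label of sorted) {
    const rect = labelRect(label);
    label.show = !placed.some((p) => rectsOverlap(rect, p));
    if (label.show) placed.push(rect);
  }
  return labels;
}
```

Re-running `declutter` after each camera move is what makes zooming in reveal more labels: the screen rects shrink and stop intersecting.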
If there’s any interest, I can share what I have done for the time being.
When I was working on this, there were two approaches we considered for handling intersections.
One was the spiral/circle idea: place one label, then try to place the next. If they overlap, look for a free spot on a circle around the original placement point; if there is no space on that circle, grow its radius and try again.
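The circle/spiral idea above could be sketched like this, with plain rectangles standing in for labels (the step size, sample count, and function names are all illustrative assumptions):

```javascript
function rectsOverlap(a, b) {
  return a.x < b.x + b.w && b.x < a.x + a.w &&
         a.y < b.y + b.h && b.y < a.y + a.h;
}

// Try the desired spot first; if it collides, walk candidate positions
// around a circle centered on it, growing the radius until a free spot
// is found or maxRadius is exceeded.
function placeLabel(rect, placedRects, { step = 10, maxRadius = 200, samples = 8 } = {}) {
  const fits = (r) => !placedRects.some((p) => rectsOverlap(r, p));
  if (fits(rect)) return rect; // original placement point is free
  for (let radius = step; radius <= maxRadius; radius += step) {
    for (let i = 0; i < samples; i++) {
      const angle = (2 * Math.PI * i) / samples;
      const candidate = {
        x: rect.x + radius * Math.cos(angle),
        y: rect.y + radius * Math.sin(angle),
        w: rect.w,
        h: rect.h,
      };
      if (fits(candidate)) return candidate;
    }
  }
  return null; // no free spot within maxRadius; caller may hide the label
}
```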
Is there any update on the declutter roadmap? Alternatively, could you let me know how I can find out whether the labels are overlapping at a particular zoom level?
There’s no update on the roadmap, but if you search the forum, I believe some developers have created partial solutions specific to their apps that could provide some ideas.
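One minimal way to answer the overlap question yourself: project each label to window coordinates (older Cesium releases expose `Cesium.SceneTransforms.wgs84ToWindowCoordinates(scene, position)` for this; check your version), build a pixel rectangle from the label's size, and intersect axis-aligned boxes. The `{x, y, width, height}` rect shape below is an assumption, not a Cesium type:

```javascript
// Standard axis-aligned bounding-box intersection test on screen rects.
function labelsOverlap(a, b) {
  return a.x < b.x + b.width && b.x < a.x + a.width &&
         a.y < b.y + b.height && b.y < a.y + a.height;
}
```

Because window coordinates change with the camera, the test has to be re-run after each zoom or pan to stay accurate for the current zoom level.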
I’ve posted this elsewhere, but just wanted to point out that server-side clustering and heat-mapping of point data can be handled by GeoServer quite easily (if you have control of the data) using rendering transformations, and consumed in Cesium as a simple WMS layer: http://docs.geoserver.org/stable/en/user/styling/sld-extensions/rendering-transform.html.