Gravity Dots: When Data Has Weight
When you land on our homepage, the first thing you see is a rolling terrain of glowing dots — a procedural landscape that feels alive. Some dots pulse brighter than others, rising briefly above the surface before settling back. It looks like something imagined. It isn’t. Each beacon represents a real client relationship, and the data behind them comes from the last place you’d expect: our time tracker.
This is the story of how spreadsheet data found its gravity.
The Idea
Every agency website needs a hero section. Most reach for a stock video or a particle effect. We wanted ours to hold a secret — that what you’re looking at is real, even if it doesn’t feel like it should be.
We didn’t want a wall of logos or a counter ticking up to “500+ projects.” We wanted something you could feel. A terrain where years of work stretch into the distance, with points of light marking where we’ve spent real time and energy. The kind of thing where you stop and think: “wait — is this actually data?”
It is. And building it was one of those projects where every technical constraint opened a door to something we didn’t expect.
The metaphor is simple: in a sea of possibilities, our influence is visible.
The Data Pipeline
The visualization starts not with shaders or Three.js, but with a Node script and the Toggl API.
Step 1: Harvesting Time Data
Toggl’s reporting API limits queries to one-year windows. So we walk backwards from today in yearly steps, all the way to 2017, our founding year, collecting project summaries from each window and merging them by project ID.
// Sliding 1-year windows from today back to 2017, our founding year
const toDateStr = (d) => d.toISOString().slice(0, 10);
const now = new Date();
const earliest = new Date('2017-01-01');
const windows = [];
let cursor = now;
while (cursor > earliest) {
  const windowStart = new Date(cursor);
  windowStart.setFullYear(windowStart.getFullYear() - 1);
  windows.push({ start: toDateStr(windowStart), end: toDateStr(cursor) });
  cursor = new Date(windowStart);
}
Three API calls run in parallel: project summaries (the hours), the project list (which maps projects to clients), and the client list (which gives us names).
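The three calls can fire inside a single Promise.all; the only subtle part is merging the per-window summaries by project ID so a multi-year project is counted once. A minimal sketch of that merge, assuming each entry carries the `project_id` and `tracked_seconds` fields used in the snippets below:

```javascript
// Merge per-window summary entries by project ID, so a project that
// spans several one-year windows appears once with its time combined.
// Field names mirror the aggregation snippet later in the post.
function mergeSummaries(windowedSummaries) {
  const byProject = new Map();
  for (const entry of windowedSummaries.flat()) {
    const merged = byProject.get(entry.project_id) ?? { ...entry, tracked_seconds: 0 };
    merged.tracked_seconds += entry.tracked_seconds;
    byProject.set(entry.project_id, merged);
  }
  return [...byProject.values()];
}
```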
Step 2: Aggregating by Client
Individual projects are interesting, but we care about relationships. A single client might have five projects spanning three years. We aggregate all tracked time under each client and find the earliest project creation date — that becomes the client’s “start date” in our timeline.
// project → client → aggregate
const clients = new Map();
for (const entry of summary) {
  const clientId = projectClientMap.get(entry.project_id);
  const clientName = clientMap.get(clientId);
  let client = clients.get(clientId);
  if (!client) {
    client = { name: clientName, tracked: 0, startDate: '9999-12-31' };
    clients.set(clientId, client);
  }
  client.tracked += entry.tracked_seconds;
  // Keep the earliest project creation date (from the project list)
  const date = projectStartMap.get(entry.project_id);
  if (date < client.startDate) client.startDate = date;
}
Internal projects are filtered out. What remains is a clean dataset: 16 client entries, each with a start date and total tracked minutes.
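The filter-and-strip step might look like this sketch. `isInternal` stands in for however the real script flags internal work, and the seconds-to-minutes conversion is our assumption from the fields above:

```javascript
// Drop internal clients and reduce each relationship to its shape:
// a start date and total tracked minutes, nothing identifying.
function toOutput(clients, isInternal) {
  return [...clients.values()]
    .filter((client) => !isInternal(client))
    .map((client) => ({
      startDate: client.startDate,
      totalTime: Math.round(client.tracked / 60), // seconds → minutes
    }))
    .sort((a, b) => a.startDate.localeCompare(b.startDate));
}
```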
Step 3: The Output
The pipeline produces a minimal JSON file — no names, no project details, just the shape of each relationship:
[
  { "startDate": "2017-05-09", "totalTime": 4099 },
  { "startDate": "2022-06-27", "totalTime": 86096 },
  ...
]
Privacy by design. The data reveals the texture of our work history without exposing who or what. You can see that we’ve had a long, deep engagement starting in mid-2022, and a cluster of new relationships in 2024 — but you’ll never know the names. We liked that tension: real data, zero exposure.
From Data to Terrain
The JSON feeds directly into a Three.js component. Each client becomes a beacon positioned on a procedural landscape:
- X-axis: the month of the year the relationship started — spreading clients across the terrain horizontally
- Z-axis: the year — older clients sit further from the camera, newer ones closer
- Height and size: total tracked time on a logarithmic scale — so a 61-hour engagement is a subtle point, while an 86,000-minute relationship (roughly 1,400 hours) towers above the surface
The logarithmic scale matters. Without it, one or two large clients would dominate and the rest would be invisible. With it, every relationship has visible presence while the larger ones still clearly stand out.
Try it yourself — change the start date to move the beacon across the landscape, and adjust the total time to see how logarithmic scaling affects its height and size. A 100-hour project barely rises above the surface; a 1,400-hour engagement towers over it.
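The three axes above reduce to a few lines of mapping code. This is an illustrative sketch, not the production values; the scale constants are assumptions:

```javascript
// Map a client record ({ startDate, totalTime }) to beacon placement.
// foundedYear and maxMinutes are illustrative assumptions.
function beaconPlacement(client, foundedYear = 2017, maxMinutes = 90000) {
  const start = new Date(client.startDate);
  const x = (start.getMonth() / 11) * 2 - 1;   // month spreads beacons across the width
  const z = start.getFullYear() - foundedYear; // year pushes older clients into the distance
  // log1p compresses the range so small engagements stay visible
  // next to relationships two orders of magnitude larger.
  const height = Math.log1p(client.totalTime) / Math.log1p(maxMinutes);
  return { x, z, height };
}
```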
The Animation
Beacons don’t all pulse at once. They take turns — one rises over 3 seconds following a sine curve, pauses for 1.5 seconds, then the next begins. It creates a rhythm, like orbits falling in and out of sync. The terrain itself shifts slowly through Perlin noise, two overlapping grids drifting at different speeds to create depth.
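The take-turns sequencing reduces to clock arithmetic on the elapsed time, with no per-beacon state. A sketch using the 3-second rise and 1.5-second pause from above:

```javascript
// Each beacon rises on a half-sine for RISE seconds, then the terrain
// rests for PAUSE seconds before the next beacon's turn begins.
const RISE = 3.0;
const PAUSE = 1.5;

function pulseAt(time, beaconCount) {
  const slot = RISE + PAUSE;
  const cycle = time % (slot * beaconCount);
  const index = Math.floor(cycle / slot); // which beacon is active
  const local = cycle - index * slot;     // time within that beacon's slot
  const pulse = local < RISE ? Math.sin((local / RISE) * Math.PI) : 0;
  return { index, pulse };                // pulse runs 0 → 1 → 0
}
```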
The Terrain
Two layers of point clouds rendered with custom GLSL shaders. The height at each point is computed entirely on the GPU using Fractional Brownian Motion — multiple octaves of Perlin noise layered on top of each other. Here’s the core of the vertex shader:
float hash2(float x, float y) {
  float n = sin(x * 127.1 + y * 311.7) * 43758.5453;
  return fract(n);
}

float vnoise2(float x, float y) {
  float xi = floor(x), yi = floor(y);
  float xf = x - xi, yf = y - yi;
  // Quintic Hermite curve — smoother than cubic, no visible grid artifacts
  float fx = xf*xf*xf*(xf*(xf*6.0-15.0)+10.0);
  float fy = yf*yf*yf*(yf*(yf*6.0-15.0)+10.0);
  return mix(
    mix(hash2(xi,yi), hash2(xi+1.0,yi), fx),
    mix(hash2(xi,yi+1.0), hash2(xi+1.0,yi+1.0), fx),
    fy
  );
}

float fbm2(float x, float y) {
  float v = 0.0, a = 0.55, freq = 1.0;
  for (int i = 0; i < 5; i++) {
    v += a * vnoise2(x*freq, y*freq);
    a *= 0.48; freq *= 2.07;
  }
  return v;
}
Each terrain point computes its own height in the vertex shader — no CPU involvement at all for 32,400 points per frame. The uTime uniform slowly shifts the noise input, giving the terrain its rolling, living motion:
void main() {
  float h = (fbm2(position.x*0.11 + uOffset + uTime*0.06,
                  position.z*0.11) - 0.5) * 5.5;
  vec3 pos = vec3(position.x, h, position.z);
  // ...
}
Here’s where it gets fun: the same noise function is implemented twice, once in JavaScript and once in GLSL. Why? Because the terrain heights are computed on the GPU, but beacon positions are calculated on the CPU (from client data). If the noise functions diverge even slightly, beacons float above or sink below the surface. So we ported the hash function, the quintic fade curve, the FBM loop — everything — so the two implementations agree to well within a point’s visual size.
Could we eliminate the JS noise entirely and do it all in GLSL? In theory, yes — you could use transform feedback or pixel readback to compute beacon heights on the GPU. But that would add async complexity and GPU pipeline stalls for a problem that a few lines of mirrored JavaScript solve cleanly. Sometimes the fun answer is also the pragmatic one.
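For the curious, the CPU-side twin is a line-for-line transliteration of the GLSL above. Reconstructed here from the shader, so treat it as a sketch rather than the production file:

```javascript
// JS mirror of the GLSL noise. Constants and the quintic fade must
// match the shader exactly, or beacons detach from the terrain.
function hash2(x, y) {
  const n = Math.sin(x * 127.1 + y * 311.7) * 43758.5453;
  return n - Math.floor(n); // fract()
}

function vnoise2(x, y) {
  const xi = Math.floor(x), yi = Math.floor(y);
  const xf = x - xi, yf = y - yi;
  const fx = xf * xf * xf * (xf * (xf * 6.0 - 15.0) + 10.0);
  const fy = yf * yf * yf * (yf * (yf * 6.0 - 15.0) + 10.0);
  const mix = (a, b, t) => a + (b - a) * t;
  return mix(
    mix(hash2(xi, yi), hash2(xi + 1.0, yi), fx),
    mix(hash2(xi, yi + 1.0), hash2(xi + 1.0, yi + 1.0), fx),
    fy
  );
}

function fbm2(x, y) {
  let v = 0.0, a = 0.55, freq = 1.0;
  for (let i = 0; i < 5; i++) {
    v += a * vnoise2(x * freq, y * freq);
    a *= 0.48;
    freq *= 2.07;
  }
  return v;
}
```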
The Highlight
Move your cursor over the terrain and you’ll see it react — points brighten and swell under the mouse. This is also pure GPU. The CPU sends a single world-space position each frame (via raycasting to the ground plane), and the vertex shader does the rest:
if (uHighlightActive > 0.5) {
  vec3 diff = pos - uHighlightPos;
  float d2 = dot(diff, diff);
  float r2 = 4.5 * 4.5;                   // highlight radius squared
  if (d2 < r2) {
    float t = 1.0 - sqrt(d2) / 4.5;
    float ease = t * t;
    col = mix(col, vec3(1.0), ease * 0.5); // brighten toward white
    sz = size * (1.0 + 1.5 * ease);        // enlarge at center
  }
}
No extra draw calls, no post-processing — just a distance check per vertex. The quadratic easing gives it a soft, organic falloff rather than a hard circle.
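The CPU side is a single ray-plane intersection. In the real component that ray comes from Three.js raycasting; the math underneath is just this:

```javascript
// Intersect the mouse ray with the ground plane y = 0 and return the
// hit point that becomes uHighlightPos in the shader.
function rayGroundHit(origin, dir) {
  if (Math.abs(dir.y) < 1e-6) return null; // ray parallel to the ground
  const t = -origin.y / dir.y;             // distance along the ray to y = 0
  if (t < 0) return null;                  // plane is behind the camera
  return { x: origin.x + t * dir.x, y: 0, z: origin.z + t * dir.z };
}
```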
Performance (or: Don’t Ship a Space Heater)
A full-viewport WebGL scene on a landing page is a bold choice. It has to earn its place by not destroying the user’s battery or making their fan spin up. We spent real time on this — and honestly, the constraint-driven problem solving was some of the most satisfying work in the project.
- Mobile: Grid resolution drops from 180 to 120, noise octaves from 5 to 4, frame rate caps at 30fps, DPR limited to 1.5x. The visual difference is minimal; the thermal difference is night and day.
- Visibility: An IntersectionObserver pauses rendering entirely when the component scrolls out of view. Zero GPU work for a section nobody’s looking at.
- Mouse interaction: The cursor creates a highlight on the terrain — points within a 4.5-unit radius brighten and enlarge. All computed in the vertex shader, so the CPU cost of interactivity is just one raycasted mouse position per frame.
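The visibility pause is the simplest of the three optimizations to sketch: an IntersectionObserver flips a single gate that starts or stops the requestAnimationFrame loop. `renderFrame` here stands in for the Three.js render call:

```javascript
// A tiny render-loop gate: while hidden, no rAF callback is scheduled,
// so an off-screen canvas costs zero CPU and GPU work.
function createRenderGate(renderFrame) {
  let rafId = null;
  const loop = () => {
    renderFrame();
    rafId = requestAnimationFrame(loop);
  };
  return {
    setVisible(visible) {
      if (visible && rafId === null) {
        rafId = requestAnimationFrame(loop);
      } else if (!visible && rafId !== null) {
        cancelAnimationFrame(rafId);
        rafId = null;
      }
    },
    get running() {
      return rafId !== null;
    },
  };
}

// Browser wiring (sketch):
// const gate = createRenderGate(() => renderer.render(scene, camera));
// new IntersectionObserver(([entry]) => gate.setVisible(entry.isIntersecting))
//   .observe(canvasElement);
```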
The Wonder of It
There’s a moment in projects like this — somewhere between the third refactor and the first time it actually works — where you forget you’re engineering and start just watching. A time-tracking API connected to a noise function connected to a vertex shader, and suddenly real data is glowing on screen, pulling, rising, alive. That moment is why we do this.
Every layer of the pipeline presented a small puzzle — how to handle Toggl’s one-year query limit, how to map time logarithmically so small clients aren’t invisible, how to keep JS and GLSL noise in sync — and each solution felt like a small discovery rather than a chore.
The best part? The visualization updates itself. Run the pipeline script, and new client relationships appear as new beacons. The terrain grows with the company. It’s not a static illustration — it’s a living record that keeps surprising us.