How it works
// 01 the frame loop
Every frame of `requestAnimationFrame` does three things. First, the canvas is cleared with a semi-transparent fill (`rgba(0,0,0,0.08)`) instead of a full `clearRect`, which leaves a short trail behind each particle and adds perceived motion smoothness. Second, each particle's velocity is updated by an acceleration vector pointed at the cursor (dx, dy normalized and scaled by a small constant), then multiplied by a friction factor (0.96) so the system converges instead of running away. Third, the particle is drawn as a single filled circle at its updated position.
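The per-particle math boils down to a few lines. A minimal sketch of the second step, where the 0.96 friction factor comes from the text but `PULL` and the function name are illustrative, not the widget's actual identifiers:

```javascript
const FRICTION = 0.96; // from the text: velocity damping per frame
const PULL = 0.5;      // assumed acceleration scale toward the cursor

// One physics step for one particle, pulled toward `target`.
function stepParticle(p, target) {
  const dx = target.x - p.x;
  const dy = target.y - p.y;
  const dist = Math.hypot(dx, dy) || 1; // guard against divide-by-zero at the cursor
  p.vx = (p.vx + (dx / dist) * PULL) * FRICTION; // accelerate, then damp
  p.vy = (p.vy + (dy / dist) * PULL) * FRICTION;
  p.x += p.vx;
  p.y += p.vy;
  return p;
}
```

The trail effect from the first step is just a translucent fill before the particles are drawn, along the lines of `ctx.fillStyle = 'rgba(0,0,0,0.08)'; ctx.fillRect(0, 0, width, height)`.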
// 02 cursor tracking
Cursor tracking uses `pointermove` for desktop and `touchmove` for touch devices, stored in the same `{x, y}` state object. When the pointer leaves the canvas, the target falls back to the canvas center, which keeps the field alive instead of freezing. That small detail matters. An idle cloud that slowly recenters reads as alive. A frozen cloud reads as broken.
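The fallback is easy to isolate. A sketch, where `resolveTarget` is a hypothetical helper name and the browser wiring (shown as comments) is an assumed shape, not the widget's exact code:

```javascript
// When no pointer is active, the attraction target falls back to the
// canvas center, so the field idles instead of freezing.
function resolveTarget(pointer, center) {
  return pointer ?? center;
}

// Browser wiring (assumed shape):
// let pointer = null;
// canvas.addEventListener('pointermove', e => { pointer = { x: e.offsetX, y: e.offsetY }; });
// canvas.addEventListener('pointerleave', () => { pointer = null; });
// Each frame: const target = resolveTarget(pointer, { x: width / 2, y: height / 2 });
```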
// 03 allocation and performance
The particle array is allocated once and mutated in place. No object allocations per frame, no garbage collection pauses, no frame drops on long runs. At 400 particles, one frame takes about 1.2 milliseconds on an iPhone 12 and about 0.4 milliseconds on a desktop. Plenty of headroom inside the 16.6 millisecond budget for 60 fps rendering.
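One way to guarantee zero per-frame allocation is a flat `Float32Array`, four floats per particle, filled once at startup. This is a sketch of that layout under the widget's 400-particle default; the actual component may use an array of plain objects instead:

```javascript
const COUNT = 400; // the widget's default particle count
const STRIDE = 4;  // x, y, vx, vy

function initParticles(width, height) {
  const data = new Float32Array(COUNT * STRIDE); // allocated once, mutated in place
  for (let i = 0; i < COUNT; i++) {
    data[i * STRIDE]     = Math.random() * width;  // x
    data[i * STRIDE + 1] = Math.random() * height; // y
    // vx and vy (slots 2 and 3) start at 0
  }
  return data;
}
```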
// why this exists
real-time interaction
A particle field is a classic piece of real-time graphics work. Each particle has a position, a velocity, maybe a color, maybe a lifetime. The draw loop updates every particle on every frame and paints the result. This widget runs a few hundred pink particles in canvas2D, gives them a light physics model, and pulls them toward your cursor or finger. The goal is not to be flashy. The goal is to show that real-time cursor interaction, with correct behavior on touch devices, is not a library problem. It is a loop problem.
The core trick is that every particle carries four floats: x, y, vx, vy. Each frame, the position updates by the velocity, the velocity updates by a small acceleration vector pointed at the cursor, and a friction coefficient slightly shrinks the velocity so the system does not run away. That is the whole physics model. Drop it into a 60 fps loop and you have a living cloud that feels like it is paying attention to you.
Count matters. Below 50 particles the field looks sparse. Above 2,000 particles the canvas2D rendering cost (a few microseconds per particle on modest hardware, consistent with the timings above, plus per-frame overhead) starts to eat into the frame budget. This widget ships with 400, which lands in the sweet spot: dense enough to read as a field, cheap enough for a mid-tier phone. The count is tunable from the control panel so you can feel the performance cliff yourself.
Touch is the thing most particle demos get wrong. A mouse event has exactly one active point. A touch event can have zero (finger lifted) or many (multi-touch). The widget binds `touchmove` and `touchstart` separately, reads `event.touches[0]` for the primary finger, and falls back to the mouse path if no touch is active. The result: it works on a phone without pretending the phone is a desktop.
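The primary-point logic works on plain objects, so it can be tested outside a browser. A sketch, with `primaryPoint` as an illustrative name:

```javascript
// Take the first active touch if one exists, otherwise fall back to
// mouse coordinates, otherwise report no active point.
function primaryPoint(evt) {
  if (evt.touches && evt.touches.length > 0) {
    const t = evt.touches[0];
    return { x: t.clientX, y: t.clientY };
  }
  if (evt.clientX !== undefined) {
    return { x: evt.clientX, y: evt.clientY };
  }
  return null; // e.g. touchend with every finger lifted
}
```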
I use this as a live demo of interaction work. It is here because it disproves the assumption that cursor-reactive animation needs a library. Three hundred lines of plain JavaScript, no dependencies, no build step, no performance concerns.
Frequently asked questions
How many particles can you render in canvas2D at 60 fps?
Roughly 2,000 to 4,000 on modern desktops, 1,000 to 2,000 on mid-tier phones. Beyond that, switch to WebGL and draw the particles with instanced rendering.
Why does my particle system stutter on scroll?
Usually the draw loop is running even when the canvas is off-screen. Gate the loop with an IntersectionObserver so it pauses when the canvas is not visible.
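A sketch of that gate. The scheduling decision is pulled into a small pure function; the observer wiring (browser-only, an assumed shape) is shown as comments:

```javascript
// Only (re)start the loop when the canvas is visible and no frame
// is already scheduled.
function shouldSchedule(visible, rafId) {
  return visible && rafId === null;
}

// Browser wiring (assumed shape):
// let visible = false, rafId = null;
// const io = new IntersectionObserver(([entry]) => {
//   visible = entry.isIntersecting;
//   if (shouldSchedule(visible, rafId)) rafId = requestAnimationFrame(frame);
// });
// io.observe(canvas);
// Inside frame(): rafId = visible ? requestAnimationFrame(frame) : null;
```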
What is the difference between requestAnimationFrame and setInterval for animation?
requestAnimationFrame syncs to the browser's paint schedule (usually 60 Hz) and pauses automatically when the tab is backgrounded. setInterval does neither, which wastes CPU in background tabs and produces uneven frame pacing because the timer drifts out of sync with the paint schedule.
Why use canvas2D instead of WebGL for particles?
For under 2,000 particles, canvas2D is simpler, has zero setup overhead, and performs fine. WebGL pays off above roughly 5,000 particles or when you need per-particle shaders.
How do you make the particles feel like they follow the cursor naturally?
Acceleration toward the cursor (not instant snap) plus friction. The particles overshoot, decelerate, and recenter. Without friction, they orbit forever. Without acceleration, they teleport.
Does this work with multi-touch?
This version tracks only the first touch. Multi-touch particle attraction is a reasonable extension: average the touch points, or split the field so each finger pulls its own subset.
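The "average the touch points" extension is a few lines. A sketch, with `centroid` as an illustrative name:

```javascript
// Reduce all active touch points to one attraction target.
function centroid(points) {
  if (points.length === 0) return null; // no fingers down, no target
  let sx = 0, sy = 0;
  for (const p of points) { sx += p.x; sy += p.y; }
  return { x: sx / points.length, y: sy / points.length };
}
```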
Can I change the particle color?
Yes. Each particle reads a color from the palette array on init. Swap the palette or give every particle a color sampled from a range.
Why the faint trails behind the particles?
Each frame clears with a semi-transparent fill instead of a hard clear, which preserves a ghost of the previous frame for a few ticks. It is a cheap motion blur.
How do I reuse this code?
Copy the component file. It has no dependencies beyond React. Replace the palette with your brand color and swap the draw call for any shape you want.