
Helmdar: 3D Scanning Brooklyn on Rollerblades
2025 May 4
One of my favorite joys is exploring a city late at night on rollerblades. I’ve been doing it fairly regularly for about
10 years now, first in Boston and then in NYC. Every time day turns to night and back again a city takes a breath. At
night people flow out or huddle up in buildings, leaving the streets clear for the people and machines that reset
the urban environment for the next day. Garbage trucks lumber about, workers unload fresh stock at stores, repairs
happen in subway tunnels and on roads. Without all the people it’s easier to see the naked form of the streets,
buildings, and infrastructure.
When you’re moving slow you spend more time taking in the details. When you’re moving fast the world becomes a blur. The
world around you is a brush that paints into your perception, and the speed of the brush strokes helps set the style of
the painting. I like rollerblading in particular because it gives those perceptual brush strokes a lot of range. You can
quickly stop and soak in a detail, change direction, occupy a tight space, or fly downhill in a long straight line.
Stickdar
Some years ago I picked up a 2D LiDAR scanner (RPLidar A1) without a particular purpose in mind. As it spins it tells
you its current angle and the distance to the environment in the direction it’s pointing. They’re often used in robot
vacuums to map walls and other obstacles. One night in 2021 I put it on the end of a stick and carried it around with my
laptop recording timestamped measurements from it. By putting each scan sweep on its own layer I could make
visualizations like this:
Taken outside my apartment at the time during heavy snowfall. You can see the path cut through the snow on the
sidewalk, the apartment building wall, windows, cars along the sidewalk, and points on a tree.
Walking along the perimeter wall outside New Lab. Just behind the character is scaffolding. There are rectangles from
windows. And if I remember right there was snow on the ground at that time too. This one is much more distorted because
I wasn’t so careful to keep the sensor stable while moving.
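The layering trick above is simple to sketch: each sweep is a list of (angle, distance) pairs, each pair converts to a Cartesian point, and stacking successive sweeps along a third axis produces the layered visualization. A minimal sketch, with hypothetical names (`sweep_to_points`, `layer_height` are mine, not from the post):

```python
import math

def sweep_to_points(sweep, layer_height, layer_index):
    """Convert one LiDAR sweep of (angle_deg, distance_m) pairs into
    3D points, placing the whole sweep on its own layer along z."""
    z = layer_index * layer_height
    points = []
    for angle_deg, distance_m in sweep:
        theta = math.radians(angle_deg)
        # Polar-to-Cartesian in the sensor's scan plane.
        points.append((distance_m * math.cos(theta),
                       distance_m * math.sin(theta),
                       z))
    return points

# A toy sweep: four returns at 1 m in the cardinal directions,
# placed on the third layer of a stack.
sweep = [(0, 1.0), (90, 1.0), (180, 1.0), (270, 1.0)]
layer = sweep_to_points(sweep, layer_height=0.05, layer_index=2)
```

Feeding each timestamped sweep a successive `layer_index` and rendering all the layers together gives the extruded corridor-of-walls effect in the images.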
I think about devices like this as “weird cameras” – imperfect leaky imaging systems that don’t faithfully reproduce
a precise representation of what they are pointed at. Instead they mix in side channels that the user or environment can
affect, and that frees them up for more interesting creative expression.
Helmdar
While I was experimenting with the stickdar I had the idea that it would be fun to make maps on a larger scale by
carrying it around while I was rollerblading. But when I tried that out the maps were very chaotic because it was harder
to keep it steady. I moved on to other projects, but then the other week this project thread popped back into mind.
Over the years I’ve done various projects which needed 6 DoF tracking through space (i.e. machine needs to know where it
is and where it is pointing). A very convenient way to achieve this nowadays for something quick and dirty is to use
mobile phones, because they come with AR frameworks that know how to use the sensors and cameras onboard to do very good
“inside-out” (no external sensors) 6 DoF tracking. On Android that’s even exposed to web browsers via the WebXR API so
you can bash together a web app to access that data, no app development required (hot tip: works on the Quest VR
headsets too, including hand tracking, which I’ve
had fun using for projects in the past).
So when the stickdar popped back into mind I started thinking about attaching the LiDAR to a phone to track its position
and orientation in space. Then I could work out the position of all the points returned by the sensor in 3D world space,
building up a point cloud map of the world as I moved through it. In general doing this is a problem known as
simultaneous localization and mapping (SLAM), and
nowadays there are plenty of very good ways to solve it. But in the spirit of weird cameras, out of curiosity about
how well this would work, and for fun, I decided to try it this way.
Here’s what I came up with:
I built a frame out of aluminum extrusion and attached the phone (a Pixel 6) and the LiDAR to the front with some laser
cut and 3D printed brackets. I affixed this to the helmet using flexible brackets printed in TPU and VHB double-stick
tape, with some white duct tape for insurance. The LiDAR plugs into the phone’s USB-C jack via a US
7 Comments
fshafique
You should post this on /r/Photogrammetry on Reddit:
https://www.reddit.com/r/photogrammetry/
condensedcrab
Very impressive! LiDAR and point clouds seem very promising, but the challenge of denoising point clouds and artifacts keeps the skill bar very high/time intensive.
timzaman
Just install polycam and walk around :)
amelius
Wouldn't this be cheaper with a stereo pair of cameras + software reconstruction instead?
pj_mukh
So cool! I wonder how the Lidar and ARCore poses were cross-calibrated?
Just to avoid this, I would just use a LiDAR equipped iPhone Pro, with industrial grade cross-calibration and still have all the visualization fun.
dllu
I once put an Ouster OS1 on a hat and walked around with it. Pic of me here: [1]
[1] https://x.com/ddetone/status/1141785696224477184?s=46
maeln
A slight tangent, but "rollerblades" is a case of proprietary eponym: Rollerblade is a brand of inline skates (often called skates – plural – for short) that became so famous that people started to use it to describe all inline skates, no matter the brand. Just like vacuum cleaner and hoover :)