Ask HN: How to track flying objects?
96 points by devlop on Feb 21, 2022 | hide | past | favorite | 63 comments
What technology can be used to track multiple flying objects in a space like a football field and up to 100 ft above it? Attaching something lightweight to each object is fine. I would like to visualise an FPV drone race with computer graphics in realtime.

Any ideas are welcome, thanks!



There are systems specifically for this purpose, typically called a local positioning system. Decawave has a product that might work for you, you’ll have to check on the range limitations however.

In general you’re going to want to go with a time difference of arrival (TDOA) system, as it can work one way to support many simultaneous locations. These generally require that you set up an array of anchors or bases that send synchronized ultra wideband radio pulses out. The “tags” are the individual receivers that calculate position based on when these are received. The RF behavior is different but ultimately it’s a similar type of system to GPS.
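To get a feel for the geometry involved, here's a toy TDOA solve in Python. The anchor layout, tag position, and use of scipy's least-squares are all illustrative assumptions; real systems additionally have to handle clock sync, noise, and multipath.

```python
# Toy TDOA position solve: recover a tag position from pulse-arrival
# time differences at a set of hypothetical anchors.
import numpy as np
from scipy.optimize import least_squares

C = 299_792_458.0  # speed of light, m/s

# Five made-up anchors around a football-field-sized volume
anchors = np.array([[0, 0, 0], [100, 0, 0], [0, 50, 0],
                    [100, 50, 10], [50, 25, 20]], float)

def tdoa_residuals(p, anchors, dt):
    # dt[i] = (arrival time from anchor i+1) - (arrival time from anchor 0)
    d = np.linalg.norm(anchors - p, axis=1)
    return (d[1:] - d[0]) - C * dt

# Simulate a tag at a known spot to generate ideal measurements
true_pos = np.array([40.0, 20.0, 15.0])
d = np.linalg.norm(anchors - true_pos, axis=1)
dt = (d[1:] - d[0]) / C

est = least_squares(tdoa_residuals, x0=[50, 25, 5], args=(anchors, dt)).x
print(np.round(est, 2))  # recovers roughly [40, 20, 15]
```

With noisy timing you'd feed the same residual function real measurements and let the extra anchors average the error down.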

Two-way ranging is a different technique that can work with fewer anchors and be more precise, but won’t scale nearly as well. Most commercial products will support both of these modes of operation. In addition, some products have a channel to support reasonably high-rate data transport as well.

If you search for ‘local positioning system TDOA UWB’ you’ll start getting in the right area. I would start small and test heavily in realistic venue situations, as the protocols used are incredibly simple and can be subject to noise, reflections, etc. Most of the ones I've seen have relatively low-power transmitters, so you may want to see if a licensed band is an option. You may also need to integrate onboard IMU/GPS streams with a Kalman filter or similar mechanism to patch over data loss and noise. GPS and/or visual failsafes will also be essential for safety. I'm sure there are plenty of regulations here as well if you want to go commercial.

Either way good luck!


Yes, every serious lab that researches drone control uses similar systems. They are the best way to go. Buy a kit, put up your sensors, calibrate, and enjoy precise tracking.


Tangentially, Kalman filters were first applied in... tracking flying objects:

https://en.wikipedia.org/wiki/Kalman_filter#History


Differential GPS is what drones use for light shows or mapping or anything requiring high fidelity. This gives them centimeter-level precision, at the cost of a large sensor array on both the ground and the drone. If you are adding this to a drone, you'd also need to include orientation and other parameters in your data stream, as you'd only be getting XYZ from GPS. Unfortunately, no simple hardware exists for your request.


Pixhawk exists. https://pixhawk.org/


Differential GPS should be eminently hackable and cheap, including the antennas in question, but it's an open question whether we even want that right now, because spoofing would be no more difficult.


GPS-RTK is not that hard or expensive: https://docs.centipede.fr/ (in French).

https://docs.centipede.fr/docs/make_rover/rover_v5_1.html lists a €180 GPS module, which is not dirt-cheap, but still affordable.


RTK GPS isn't hard in theory, only in practice.

To maintain a high-precision fix, you have to maintain particularly good tracking (a 'phase lock') of 5 or 6 satellites. That can be demanding - you practically need line of sight to the satellites.

That might be practical if you're flying drones outside in free space - but if you're flying in a stadium with a lot of seating? Or dodging in and around trees with lots of foliage? By all means test it out, but don't imagine it'll be easy :)

I've also heard second-hand reports that the PLL signal tracking built into many GPS modules isn't really tuned for >4g acceleration because 99.9% of GPS modules end up in cars and phones. With the consequence that trying to track acrobatic flight leads to more lost signals than you'd expect, based on the excellent view of the sky. I haven't seen that firsthand, though - it's just what I've been told.


I've been using UBlox F9P/F9R units recently and it's really quite amazing how good it is. The ground receiver doesn't require a large array of anything, the base station fits in the palm of my hand and mounts onto a regular photography tripod. The radios we use to send the RTCM3 stream to the drones are quite tiny too, the antenna's about as long as my finger.


How’s the accuracy on cloudy days? I’ve been playing with the uBlox ZED-F9P and accuracy degrades unless I have a clear sky.


Like the other replies said, I’d suspect you might have an issue with your setup. We’ve used 3 different antennas and they’ve all (to my knowledge) performed beautifully.

I see your email address is in your profile. Today’s a stat holiday in many parts of Canada, but I’ll look around tomorrow and see if I might have some data to support/refute my assertion. Our pilots, nominally, write down the weather for every flight, so I might actually have some clear sky/cloudy data that I could share from the base station.


Something is up with your setup. Check that the active antenna you are using meets the F9P's relatively high gain requirement for the LNA.

The actual impact on accuracy for outdoor, short-baseline RTK should be close to nonexistent.


I agree. We have these deployed in the field and have no issues with weather, since we use more than the L1 band. Something isn't configured correctly, or something is failing, such as the antenna.


I’ve seen a bunch of drone-detection computer vision projects. Usually they’re detecting drones from other drones though (e.g. for autonomous racing[1] or drone defense).

A challenge with doing it from the ground is that the drones will be quite small relative to the size of the image. But with sufficient compute and several cameras, a tiling-based approach[2] should work.

If you want to do unique-identification you’ll also need object tracking[3].

This is exactly the type of project Roboflow (our startup) is built to empower! Happy to chat/help further (e.g. we might be able to help source a good dataset to start from). And if it’s for non-commercial use it should be completely free.

[1] https://blog.roboflow.com/drone-computer-vision-autopilot/

[2] https://blog.roboflow.com/detect-small-objects/

[3] https://blog.roboflow.com/zero-shot-object-tracking/


> A challenge with doing it from the ground is that the drones will be quite small relative to the size of the image.

Bright IR LEDs on the drones could help there. Even a visible-light camera (despite its IR-cut filter) will pick up a strong NIR source, and an IR camera will pick up the LEDs even better. They'll likely stand out well against the sky and be straightforward to isolate with some image processing.
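As a rough illustration of how simple that image processing can be, here's a numpy-only sketch on a synthetic frame. The noise level, blob location, and threshold are made up; a real pipeline would run something like OpenCV's blob detector on actual camera frames.

```python
# Synthetic dark "sky" frame with one bright LED spot plus sensor noise:
# threshold it, then take the centroid of the remaining pixels.
import numpy as np

rng = np.random.default_rng(0)
frame = rng.integers(0, 20, (480, 640), dtype=np.uint8)  # dim background
frame[200:204, 300:304] = 255                            # hypothetical LED blob

mask = frame > 200                 # threshold well above the background
ys, xs = np.nonzero(mask)
cx, cy = xs.mean(), ys.mean()      # blob centroid in pixels
print(cx, cy)                      # 301.5 201.5
```

With several LEDs you'd label connected components instead of taking a single centroid, but the bright-against-sky assumption does most of the work.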


We did something very similar here, using computer vision, to track video feeds of moving trains containing graffiti. Much more complicated in one sense, but also much simpler.


> to track video feeds of moving trains containing graffiti.

What was the purpose of that?


Delayed response here.

I was trying to count the number of unique graffiti pieces on trains across the US, and measure the distance individual paintings traveled, and the duration they lived before being painted over.


Hey, interesting question! I think using ground-based equipment to accurately measure the position of fast-moving drones in a large outdoor volume might be quite difficult. A pretty common way to do this indoors is to attach infrared retroreflectors to the object. You then have infrared cameras in different locations. Each camera viewpoint restricts the possible location of the marker to a ray along its view axis. Using two or more viewpoints, you can estimate the actual 3D position. Two problems for your application: 1) the infrared light from the sun will add a lot of noise to your measurements, 2) it is generally difficult to identify which marker is which. [1] is an example of an off-the-shelf system that works quite well, though.

I think some onboard positioning system might work best for your application. kognate suggests using "inside out" tracking based on the features observed by a camera on the drone. A nice thing here is that most FPV drones are already transmitting realtime video. It would require significant computational power on the ground to localize drones from their camera feeds though. See [2] for some inspiration.

Another idea is inertial sensor fusion: use the data from the IMUs onboard the drones to reconstruct their trajectories in real time. However, this is quite a tricky business. The sensors would have to be characterized extremely well and be able to deal with the highly dynamic forces felt by a racing drone. It would probably make sense as a standalone module that accepts 5V from the drone's power system and has its own IMU(s) and telemetry radios.

[1] https://optitrack.com/

[2] https://matthewearl.github.io/2021/03/06/mars2020-reproject/


The sun gives off a heck of a lot of infrared, but the water in the atmosphere absorbs some frequency ranges. There's a pretty deep valley around 940 nm, and 940 nm illumination sources are cheap and readily available. Combined with a sufficiently narrow bandpass filter, sunlight may be less of a problem than you expect.


Won't be easy, drones are small and fast.

RGB-based detection will probably be too slow and error-prone. Rather, put active IR LEDs or similar markers on the drones which can be easily detected, and use high-framerate cameras which only let through IR. Then use computer vision to spot blobs, and finally compute 3D position by triangulation.

Active IR tracking is still pretty much state of the art for motion capture and the like.

Short googling leads to OptiTrack, where they even advertise exactly this use case of drone tracking:

https://optitrack.com/applications/robotics/


I'll never understand why computer vision is the first-choice tracker. It's easily the worst sensor for most tracking tasks, which is why it receives so much funding / publication / publicity when it works.

If you really want to track drones, you'd use radar, radio, and GPS combined with a good imu + transmitter on the drone. Off the shelf systems exist for this application when tracking your own drones / those you can "attach lightweight systems to"


Optical tracking suddenly becomes a very nice option if you cannot tinker with the object being tracked.

Source: my old employer does tracking, and, well, try getting a tag into a football, basketball, or ice hockey puck. It's hard.


OP specifically said that they can tinker with the drones in this case. But I hear you, when you can't and the environment is structured it can work.

Still, the first question should always be: can we instrument the object?


I think it's just humans having human biases.

"I know where things are mainly by looking at them continuously, so it seems logical that if a computer can capture images, it should be doing the same."


Most people can't actually look at something continuously. There are blinks and saccades our brain filters out.


You're not wrong, but I wasn't speaking using this level of specificity.

They also don't look at things continuously for psychological reasons like being bored, sensory reasons like being distracted by something else, physical reasons like having their line of sight blocked, social reasons such as the event ending, or biological reasons like being asleep or deceased.

And neither would computers be looking at things continuously, if anything the discontinuity would be even more obvious as computers use "frames" of processing, as well as camera shutter speeds, instead of just being a function of the overall system's response time.


Cameras are cheap and readily available, can detect most objects that humans are interested in and don't depend on having access to modify the objects you want to track.


There's a company called "Sports Media Technology" SMT that has some nice integrations with the NHL (Hockey). https://www.smt.com/hockey#techinfo

They put emitters/sensors in the hockey puck as well as on the players. The data gets processed and displayed on video for audiences as an "augmented reality" experience.

My understanding is that the puck has an infrared emitter that is tracked by sensors in various locations around the rink and this can locate the realtime position of the puck. The players also have sensors/transmitters and this makes it possible to have really responsive position tracking (the video in the link shows how it looks quite nicely).

I suppose the speed and erratic motion of a hockey puck is not unlike that of an FPV drone.


There's a big difference between tracking outdoors and indoors - depending on that, you can use different technologies. Another question is whether you can use GPS (no local receiver/transmitter posts needed) or something like UWB, where you need to deliver coverage for your sensors - which does not sound feasible for a NASCAR track.

The big players I remember right now are the SMT you mentioned, https://kinexon.com/sports-clients/ (where I worked) and Zebra.


SMT (currently) provides the telemetry for NASCAR, including 100Hz GPS location data, and those are regularly going >190mph at super speedways. It's not typically erratic but it works very well in any case.


If you are able to race at dusk or in the evenings with moderate external lighting, tracking differently colored indicator LEDs doesn't require extensive or expensive hardware. This will only really give you a 2D overview, but some altitude information can be gleaned from the amplitude of the light from each drone.

As a quick example of how well this works, here are a few Roombas bouncing around [1].

This is what that path integration can look like rendered in a video [2]

[1] https://transistor-man.com/PhotoSet/roomba_dance/animated/da...

[2] https://vimeo.com/645355520#t=30s


A few ideas: have a computer vision tool analyze each frame from one or more cameras above the field, look for the drones, and overlay your graphics. You might put a QR code or similar marker on each drone - even a simple color badge if it's just a handful of drones. Then your CV code just has to look for the big chunk of pink, blue, or green pixels. Look up color tracking, object tracking, etc., and https://en.wikipedia.org/wiki/FoxTrax as an old example. OpenCV and TensorFlow are possible tools now.
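A minimal sketch of the color-badge idea on a synthetic frame, numpy only. The badge color and channel thresholds are invented for illustration; in practice an HSV-space range (as OpenCV tutorials use) tends to be more robust to lighting.

```python
# Find a colored badge by masking pixels inside an (R, G, B) range,
# then taking the centroid of the surviving pixels.
import numpy as np

rng = np.random.default_rng(1)
frame = rng.integers(0, 80, (480, 640, 3), dtype=np.uint8)  # dim background
frame[100:110, 500:510] = (255, 64, 200)                    # "pink badge" (R, G, B)

# Keep only pixels whose channels all fall inside the badge's color range
lo = np.array([200, 0, 150])
hi = np.array([255, 120, 255])
mask = np.all((frame >= lo) & (frame <= hi), axis=-1)

ys, xs = np.nonzero(mask)
cx, cy = xs.mean(), ys.mean()
print(cx, cy)  # center of the pink patch: 504.5 104.5
```

One mask per badge color gives you one 2D track per drone; two such camera views are enough to lift the tracks into 3D.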

As you noted in another comment, GPS by itself probably isn't accurate enough, but there is GPS augmentation tech. You put a base station in the area that measures the drift and sends that over the air to use as correction. I'm thinking you'd take the raw GPS from the drones, apply the correction, and hopefully get sub-meter positioning. Look up DGPS, WAAS for options there.

The other idea that comes to mind is to triangulate based on their radios. You'd have base stations around the perimeter, each measuring the signal strength and direction of the target frequencies. Positioning would be a matter of fairly simple trig + error correction. I don't know if there's anything doing this off-the-shelf, but indoor positioning systems may be a rabbit hole to go down (even if used outdoors).
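The "fairly simple trig" for two stations in 2D could look something like this - the station positions and bearings are hypothetical, and a real system would add the error correction mentioned above:

```python
# Intersect two bearing rays from known base stations to locate a transmitter.
import math

def intersect(p1, theta1, p2, theta2):
    # Each station defines a ray p + t*(cos θ, sin θ); solve for the crossing.
    d1 = (math.cos(theta1), math.sin(theta1))
    d2 = (math.cos(theta2), math.sin(theta2))
    denom = d1[0] * d2[1] - d1[1] * d2[0]          # zero if rays are parallel
    t = ((p2[0] - p1[0]) * d2[1] - (p2[1] - p1[1]) * d2[0]) / denom
    return (p1[0] + t * d1[0], p1[1] + t * d1[1])

# Stations at two corners, drone at (60, 40): bearings computed exactly here
drone = (60.0, 40.0)
b1 = math.atan2(drone[1] - 0, drone[0] - 0)        # bearing from (0, 0)
b2 = math.atan2(drone[1] - 0, drone[0] - 120)      # bearing from (120, 0)
est = intersect((0, 0), b1, (120, 0), b2)
print(est)                                          # recovers (60.0, 40.0)
```

With more than two stations you'd least-squares the overdetermined intersections instead, which is where the error correction comes in.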

A final idea is to use the video feed from the drones. You'd place QR Codes throughout the course, process the video feeds, and use the codes seen in each feed to tell which ones are ahead. Or instead of QR Codes, build a point cloud of each point on the course to use as position.

Sounds fun!


I would suggest using the video feed from each drone to localize it (since it's in a known space). You'll need to do a few passes beforehand and record the background. Doing a puzzle-piece style match isn't terribly (computationally) difficult and then you can fuse the position data in your visualization.


If the drones have cameras and you can scatter AR markers around the space with known positions, you can probably just use a Kalman filter based on detecting the markers.


Agree. Kalman was designed for this use case (radar) and still performs admirably well for how simple it is.
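For reference, here is a minimal 1D constant-velocity Kalman filter in numpy. All the parameters (dt, noise covariances, target speed) are made-up tuning values, but the predict/update structure is the standard one:

```python
# Minimal constant-velocity Kalman filter: state = [position, velocity],
# measuring noisy position only, recovering velocity as a by-product.
import numpy as np

dt = 0.1
F = np.array([[1, dt], [0, 1]])   # state transition
H = np.array([[1.0, 0.0]])        # we only measure position
Q = np.eye(2) * 1e-3              # process noise (tuning knob)
R = np.array([[0.25]])            # measurement noise variance

x = np.zeros((2, 1))              # initial state estimate
P = np.eye(2)                     # initial covariance

rng = np.random.default_rng(0)
for k in range(100):
    true_pos = 2.0 * k * dt                     # target moving at 2 m/s
    z = true_pos + rng.normal(0, 0.5)           # noisy measurement
    x, P = F @ x, F @ P @ F.T + Q               # predict
    S = H @ P @ H.T + R                         # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)              # Kalman gain
    x = x + K @ (np.array([[z]]) - H @ x)       # update
    P = (np.eye(2) - K @ H) @ P

vel = float(x[1])
print(vel)  # velocity estimate, converges toward 2.0
```

For drones you'd extend the state to 3D position/velocity and feed the predict step with IMU data.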


RTK base station and zed-f9p if you own the drones. Haven’t heard the best about the TDOA systems but ymmv.


It would help to know which drones and/or the software/OS controlling them. Many drones already have APIs that can give you X/Y/Z positioning data. If this is indoors you may not be able to rely on GPS data, but you may have options for other coordinates based on triangulating the wifi control signals.

Doing it with machine vision would likely be challenging (and I say this having a fair bit of experience with AI/MV systems). The area you are covering requires a very large field of view, and drones are generally very small, relatively speaking.

If you can't do it with native APIs, I would probably look into an RF-style system with a small transmitter on each drone and then antennas placed around the stadium to detect and triangulate the signals into 3D space.


If there are multiple drones in the frame and you rely on object detection only, then you will have a hard time telling drones apart. Even if you use something like lidar and get a bunch of coordinates, you still need to know who is who.

You may want to predict future coordinates of drones to increase tracking accuracy.

Drones have inertia, and when a trajectory is split into small enough chunks, the path within each chunk can be expressed as a Bezier curve. Given a few past coordinates you can predict the future one, which helps with object detection and with keeping track of each individual drone.

When doing object detection, instead of scanning the whole frame searching for a drone, you will be scanning only the areas it is likely to be, meaning you can run at higher FPS and with higher frame resolution.

https://hsto.org/r/w1560/webt/vq/ga/at/vqgaat7sqymkhlro_8vef...
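A cheap stand-in for the chunk-wise curve idea above: fit a quadratic to the last few detections and only search a window around the extrapolated point. The coordinates and window size here are arbitrary.

```python
# Predict the next detection from recent history so the detector only
# scans a small window instead of the whole frame.
import numpy as np

history = np.array([[100, 200], [112, 205], [126, 212]], float)  # last 3 (x, y)

# Fit x(t) and y(t) as quadratics over t = 0, 1, 2 and evaluate at t = 3
t = np.array([0.0, 1.0, 2.0])
pred = [np.polyval(np.polyfit(t, history[:, i], 2), 3.0) for i in range(2)]
print(pred)  # extrapolated next position: [142.0, 221.0]

# Search only a window around the prediction
win = 40
x0, y0 = int(pred[0]) - win // 2, int(pred[1]) - win // 2
print(x0, y0)  # top-left corner of the search window
```

A Kalman filter does the same job with a principled handle on uncertainty, but a polynomial fit per chunk is often enough at high frame rates.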


Ultra-wideband is starting to make its appearance in some consumer-level products, such as AirTags. I am not sure if interference would be an issue when tracking multiple objects, but the technology seems like a good fit! From a simple search, this product might work: https://www.inpixon.com/technology/rtls/anchors/nanoanq-uwb

You could also go the custom hardware route and trilaterate signals from small embedded transmitters. That would require a lot of effort, but it should work using FPGAs and/or analog electronics.

Another approach could be to use radar and/or RFID: http://rfidradar.com/howworks.html


If the drones localize, you can get a constantly updating position over radio. Even with other tools, this will help disambiguate.

There are a few different parts here from robotics that can help.

  - Tracking lets you find how a patch of pixels moves. Look up "KLT" and "SIFT features". (Older.)
  - Recognition lets you find a given object. Look up "YOLO". (Newer.)
  - Motion modeling lets you predict where something should be, and can incorporate the transponder ego data. Look up "Kalman filter".
  - All three of the above should be available in existing libraries.
  - If possible, engineer the environment, e.g. by putting easy-to-spot LED patterns on the drones. This is always the easiest.


RTK GPS? Assuming you have a base station, you can correct the drones' positions to ~1 cm.


I've been working with motion capture systems for several years as a PhD student in robotics. The first thing that came to mind is a motion capture system, which someone already mentioned above. I worked in one of the largest indoor motion capture facilities in academia, the ASU Drone Studio, which could be a reference if you search for it. But the cost of such a setup is pretty high, since you basically have to cover every corner with cameras, each of which costs 4k-6k dollars. It can track tens of objects easily, with sub-mm accuracy at 360 frames per second.


Can you expand the gates (using many more of them) that work off the FPV video signal? You could add cameras detecting motion and not worry about which drone is which (getting that from the FPV signal as it passes the gate).

Then interpolate with accelerometer data from the drones telemetry?

Dunno, it seems hard any which way, but having more than one tech involved to cross-check seems like it might help.


Depends on how you want to do it.

The big boys use sensor blending and kinematic GPS to get cm-level accuracy.

However, that requires good GPS coverage (i.e. no roof).

Second to that is fiducial tracking: either attaching markers to each drone and using multiple cameras to work out the 6 degrees of freedom, or giving the drones enough horsepower to do it onboard.


I would try sound. Make each drone emit a distinct (ultrasonic) frequency and triangulate them that way. Maybe you could get velocity estimates too from the Doppler shift? I don't really know what I'm talking about, but it feels like this would be easier than a visual method.
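Back-of-envelope numbers for the Doppler part, assuming a hypothetical 40 kHz beacon and ~343 m/s speed of sound at room temperature - the shift from racing speeds is large enough to measure:

```python
# Doppler shift of an ultrasonic beacon moving toward a stationary microphone.
c = 343.0      # speed of sound, m/s (assumed)
f0 = 40_000.0  # beacon frequency, Hz (assumed)

def observed_freq(v_toward_mic):
    # Source approaching a stationary observer
    return f0 * c / (c - v_toward_mic)

f = observed_freq(20.0)              # drone closing at 20 m/s
print(round(f))                       # ~42477 Hz, a ~6% shift

# Invert to recover radial velocity from a measured frequency
v = c * (1 - f0 / f)
print(round(v, 1))                    # recovers 20.0 m/s
```

Note this only gives the radial velocity toward each microphone; position still comes from the time-of-arrival triangulation.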


Did you consider using Lidar? If you need high positional accuracy it might be better than cameras.


You might take a look at OpenCV or BoofCV:

http://boofcv.org/index.php?title=Example_Tracker_Object

BoofCV also has a great Android App to check out its features.


SteamVR's Lighthouse stuff is limited to 10x10m but if they let you hack it the drones would only need to carry a few LEDs. It would work in indoor arenas too if this is a traveling show.


If using Lighthouse, the drones would need to carry sensors; you're thinking of Oculus tech.

However, by using a lot of base stations, lighthouse should be able to cover a big area... provided there isn't too much infrared light interfering with the sensors.


There is a project out there to utilize HTC Vive lighthouses with a DIY sensor. Not sure if something like this is out there for the SteamVR stuff:

https://github.com/ashtuchkin/vive-diy-position-sensor

Note: the project isn't mine, I just found it cool.


I have a similar question. How would you track an animal like a rat in a one acre plot of land to something like 10cm accuracy?

Could decawave work for this? What if the rat goes behind a tree?


Multiview cameras.

Once you have the tracking data, how do you plan to view it?


OK, let's say I have cameras - how do I get x/y/z data for each flying object?

For viewing I'm thinking some kind of 3D engine. The view can look like a game; it doesn't need very beautiful graphics, it just needs to be clear about what is happening in realtime.


If you have 2 cameras observing the same object and you know their coordinates and orientations in 3D space there's a fairly simple algorithm to recover 3D coordinates from pairs of matching pixels between the two images. It should be covered in most computer vision textbooks, such as the Szeliski book: https://szeliski.org/Book/. Of course you also need an algorithm for matching pixels between the two images. There are a number of ways of doing that, also covered in the book. OpenCV probably has code that would help with some or all of this.
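A minimal linear (DLT) triangulation sketch along those lines, with made-up projection matrices - OpenCV's cv2.triangulatePoints does essentially this for you:

```python
# Linear (DLT) triangulation: recover a 3D point from matching pixels
# in two calibrated views with known 3x4 projection matrices.
import numpy as np

def triangulate(P1, P2, x1, x2):
    # Stack the four linear constraints x ~ P X and solve by SVD
    A = np.vstack([
        x1[0] * P1[2] - P1[0],
        x1[1] * P1[2] - P1[1],
        x2[0] * P2[2] - P2[0],
        x2[1] * P2[2] - P2[1],
    ])
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]
    return X[:3] / X[3]                  # homogeneous -> 3D point

# Two hypothetical cameras 10 m apart, both looking down +Z, f = 800 px
K = np.array([[800, 0, 320], [0, 800, 240], [0, 0, 1.0]])
P1 = K @ np.hstack([np.eye(3), np.zeros((3, 1))])
P2 = K @ np.hstack([np.eye(3), np.array([[-10.0], [0], [0]])])

def project(P, X):
    x = P @ np.append(X, 1.0)
    return x[:2] / x[2]

X_true = np.array([2.0, 1.0, 30.0])
X_est = triangulate(P1, P2, project(P1, X_true), project(P2, X_true))
print(np.round(X_est, 3))                # recovers [2, 1, 30]
```

The hard part in practice is the pixel matching and the camera calibration, not this solve.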

> For viewing I'm thinking some kind of 3D engine.

OK, that makes sense, I just wanted to be sure we were talking about the same thing.

EDIT: BTW I assume that you don't have access to the data from the drones themselves. If you did it would probably be much easier just to capture GPS coordinates from them.


Maybe I can get GPS data, but I don't think it's accurate enough. A few feet of error would show the drones in the wrong positions. It is important to be able to see who is currently number 1, 2, 3 in the race.


The biggest source of inaccuracy in GPS is atmospheric effects, meaning two receivers in the same vicinity see similar errors. If you take the drone's GPS position and compare it to the reported position from a stationary receiver planted nearby, you might be able to get enough accuracy to be useful (basically a poor man's differential GPS).
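A toy version of that correction, with invented coordinates: the base's reported minus surveyed position gives the shared error, which you subtract from the drone's fix.

```python
# Poor man's differential GPS: a stationary receiver at a surveyed point
# sees roughly the same atmospheric error as a nearby drone.
surveyed = (49.280000, -123.120000)       # hypothetical known base position
base_reported = (49.280012, -123.119985)  # what the base GPS actually reports

# Shared error = reported - truth
err = (base_reported[0] - surveyed[0], base_reported[1] - surveyed[1])

drone_reported = (49.280512, -123.118985)
drone_corrected = (drone_reported[0] - err[0], drone_reported[1] - err[1])
print(drone_corrected)  # roughly (49.280500, -123.119000)
```

Real DGPS/RTK corrects at the pseudorange or carrier-phase level rather than in lat/lon, which is why it does much better than this position-domain trick.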


> It is important to be able to see who is currently the number 1, 2, 3 in the race.

So in addition to finding the current position of the racers you'll also need some history to determine where they are on the track. You'll also need to define the path of the track in order to determine the rank.


What is your budget?


A very blurry camera, being far away, and at night - then you can call it an OVNI (a UFO).


I only wanted to add that I'm curious too.


Multiple cameras are probably the fastest (in latency & update rate) tracking sensor you can build.


ORB, SIFT, ICP, OpenCV



