Been playing around with a Ka band Doppler traffic speed measuring radar lately, and noticed that it doesn't perfectly agree with my GPS speed indications. It got me wondering... why? What basic errors could creep into each system to cause them to disagree?
Doppler radar operates on a very simple principle: an extremely high frequency microwave signal is sent out to a moving target, and the wavefront is either compressed or stretched by the target's approaching or receding motion. The apparent shift in frequency, aka Doppler shift, is zero-beat against the outgoing frequency, and the difference is directly proportional to the target velocity, based on a fairly simple formula. For example, at 35.500 GHz, the shift will be 65.786236 Hz per km/h of target velocity. (BTW, a very common online calculator for Doppler shift is wrong! It forgets the 2V factor in the calculation, and it defaults to some very strange value for c.) This Doppler return is measured, some maths applied, and a target speed is arrived at. When measuring the Doppler return, a crystal at a much higher frequency is typically used as the reference; let's say 4.0 MHz. Crystal oscillators rarely exhibit more than 100 Hz of error at this sort of frequency, so when scaled down to comparing and measuring a Doppler shift of a few kHz, the error is going to be very small: hundredths of a Hz. The simplicity of this system overall means that Doppler radar should be very accurate, given a nice stable reflection from the target, and an accurate, stable frequency reference to measure the Doppler shift against.
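The arithmetic above can be sketched in a few lines (a minimal sketch; the 35.500 GHz carrier is the example from the post, and the factor of two reflects the out-and-back path):

```python
# Two-way Doppler shift for a radar: f_d = 2 * v * f0 / c
C = 299_792_458.0  # speed of light, m/s

def doppler_shift_hz(v_kmh: float, f0_hz: float) -> float:
    """Doppler shift (Hz) for a target at v_kmh against a carrier f0_hz."""
    v = v_kmh / 3.6             # km/h -> m/s
    return 2.0 * v * f0_hz / C  # factor of 2: signal travels out and back

# Per-km/h sensitivity at 35.500 GHz, matching the figure quoted above
print(doppler_shift_hz(1.0, 35.5e9))    # ~65.786 Hz per km/h
print(doppler_shift_hz(100.0, 35.5e9))  # ~6578.6 Hz at 100 km/h
```

Omitting the factor of 2 (as the broken online calculator does) halves every result, which is why it stands out immediately when checked against a known speed.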
Then there's GPS. Outwardly simple: just get some timing information from four or more satellites, do some calculations, and it'll tell you where you are on the surface of the earth. If you're moving, the measuring device should also be able to work out in what direction and how fast. Sounds simple, huh? The reality behind GPS is actually quite complex. How do you get an atomic rubidium cell in orbit around the earth to be perfectly stable, despite insane temperature differences over the space of hours? How do you replicate that sort of accurate clocking capability in small handheld receiving devices, so they can accurately measure the differences between arriving signals? How do you compensate for the change in transmission medium as the signal transitions from the near vacuum of orbit to the denser atmosphere surrounding earth? All those questions and many more besides were considered when designing the GPS system. Even the relativistic effects caused by the motion of the hyper-accurate clocks on the GPS satellites are compensated for. It's a multi-billion dollar system, with some incredible attention to detail and well thought out engineering, and so it should be able to produce accurate results. Indeed, aircraft can navigate from one side of the world to the other and, with GPS-based augmentation, land completely blind while being off the centreline of the runway by mere centimetres.
And now we come to time-of-flight laser / LIDAR distance and speed measuring. This one has me scratching my head a bit. To achieve suitable accuracy, the system needs to measure ultra-short time periods: the interval between when a pulse of light is sent out and its reflected return. Over short distances, at the speed of light, we're talking picosecond-level timing. Yet the electronics driving all this has response times in the nanosecond region. So how are they achieving this magic?
I'd be interested to hear musings about the pros and cons of each system when it comes to measuring speed and/or distance, and the errors that might impact each. As for my Ka band Doppler radar being about 3% 'off' what GPS tells me: I suspect it's calculating speed with a margin of error on the low side, though I'm not giving the GPS a halo of perfection, either.
They have just one thing in common: the speed of light, and distance makes for a big variable as well. By the way, a lot of the GPS systems are talking to each other for corrections all the time.
Unless you're using a survey-grade GPS unit ($$$), GPS will provide the least accurate speed. There are a number of reasons for this. First, there are two GPS codes: "C/A" (coarse/acquisition, the civilian code) and "P" (precise). Consumer-level GPS units use the C/A code for cost reasons. The biggest source of error in GPS is ionospheric delay, which can be corrected for, but not in consumer units. Combine a C/A solution with that error, and your GPS position is only valid to within a couple of metres at best.
Consumer-level GPS units also only refresh once per second. For example, say you travel 10 metres over the ground in one second and each GPS position is ±1 m: the velocity shown on your GPS could range anywhere from 8 m/s to 12 m/s. Because of this, consumer GPS units become more "accurate" the faster you're going, because the same error is spread over a longer distance.
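The worst-case arithmetic above can be sketched like this (a minimal sketch, assuming each of the two fixes is independently off by up to ±1 m):

```python
# Worst-case speed error from per-fix position error at a 1 Hz update rate
def speed_bounds(distance_m: float, pos_err_m: float, dt_s: float = 1.0):
    """Min/max apparent speed when each of the two fixes is off by ±pos_err_m."""
    low = (distance_m - 2.0 * pos_err_m) / dt_s   # fixes err toward each other
    high = (distance_m + 2.0 * pos_err_m) / dt_s  # fixes err away from each other
    return low, high

print(speed_bounds(10.0, 1.0))  # (8.0, 12.0) -> ±20% at 10 m/s
print(speed_bounds(30.0, 1.0))  # (28.0, 32.0) -> under ±7% at 30 m/s
```

The same ±2 m uncertainty shrinks as a fraction of the distance covered, which is exactly why the displayed speed gets better at higher ground speeds.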
If you want more accurate velocity values from a GPS, you would have to upgrade to a survey-grade device that receives the P code and is capable of RTK corrections. The price goes way up, but the update rate of such receivers can be as high as 100 Hz. These devices can calculate positions to 1 cm in real time, depending on geographic location and the local correction services available.
Last edited by Q101ATFD; 09-03-20 at 03:06 AM.
Just a note, Q101ATFD: some consumer GPS units operate at 10 Hz (sampling 10 times per second) and are more accurate than the average 1 Hz models.
These are used in models for marine applications and in lap timing for racing applications.
Also, the number of satellites the unit can see and use is important.
Just adding the ability to pick up navigation satellites other than GPS, like GLONASS, Galileo and QZSS, can considerably enhance startup accuracy.
Cheers, Tiny
"You can lead a person to knowledge, but you can't make them think. If you're not part of the solution, you're part of the problem.
The information is out there; you just have to let it in."
Screenshot from my Huawei P30; I'm sitting in a brick office with a tin roof.
It works very well in planes as well.
It seems pretty accurate, and uses 6 different systems.
Everyone is talking GPS, but no one is talking about Doppler systems, Gunn diodes and such. So who has a comment on the OP's question? You know: transmit a microwave frequency in a specific direction, then count the return signal in wavelengths to see how far away the target is?
The OP asked about all 3 technologies, and I replied about the one I’m most familiar with.
Does anyone even know what Doppler is these days? Just because it's not on the phone doesn't mean it doesn't exist. Attitude? In my opinion, ToF is very similar to the Doppler effect and works on the same basic principle, just using a different signal basis. The advantage of Doppler systems is that there are no angles deflecting the signals. I'm of the opinion that people don't have a clue about the Doppler effect, but if you are trying to get accuracy, multi-Doppler is the best by far, as there are none of the ambiguities that are just accepted with GPS derivatives. There is always talk of GPS being the most accurate, which is just not true; there is always a margin of error with GPS, as stated in prior posts. No one mentions that GPS is the most accurate over very long distances but absolutely useless at short distances! Time of flight only has the disadvantage of the angles involved, because the receiver is not directly in line with the transmitter, and lenses are used, which introduce optical errors. So in conclusion, you have to make up your own mind, but I hope I have helped. Don
Oh don't you have some questions for us this week.
OK... it's a bit late on a Friday night, so my brain might not quite be in gear. Let's see what I can cough up.
Did you forget the phase noise of the local oscillator, and the delay time versus the probability of the associated frequency shift?
Normally I would expect this noise to be averaged out. Your radar gun probably uses a sigma-delta DSP chip, so there is also a little quantisation noise in there. It's a bit of a tradeoff: accuracy against a good lock on a target while there are multiple return signals and some extra noise.
I can explain this one. Let's start with problems that affect clocks.
[quote]How do you get an atomic rubidium cell in orbit around the earth to be perfectly stable, despite insane temperature differences over the space of hours?[/quote]
Vibration - None - you're floating in space. Nothing to bump into or shake the platform.
Temperature Variations - Tiny - While the external surfaces of the spacecraft might heat up and cool down, the internal parts of the spacecraft don't. They're well insulated from the chassis and each other. A vacuum is a wonderful insulator.
On top of this, a cryo-cooler is a very simple thing to add to the spacecraft. This cools down the components you want to keep cool. Added to this is a heater to warm the instrument up, keeping the clock very thermally stable.
And finally the clock can self calibrate. Multiple clocks can compare and external clocks can also be used to reference and correct.
[quote]How do you replicate that sort of accurate clocking capability in small receiving handheld devices to be able to accurately measure the differences between arriving signals?[/quote]
The carrier contains the information for that clocking reference.
[quote]How do you compensate for the difference in transmission medium as the signal transitions from the near vacuum of orbit to the more dense atmosphere surrounding earth?[/quote]
The density of air makes very little difference, but it's easily corrected for if you know what it is, just like the other factors.
The speed of the satellite introduces a doppler shift. If you know the vector of the spacecraft with reference to fixed ground stations then you can correct for the shift based on other positions.
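For a sense of scale, that carrier Doppler shift can be estimated in one line (a sketch: the L1 frequency is the real GPS carrier, but the 800 m/s radial velocity is an assumed, illustrative figure, since the full orbital speed of roughly 3.9 km/s is never entirely along the line of sight to a ground user):

```python
# First-order Doppler shift of the GPS L1 carrier seen by a stationary receiver
C = 299_792_458.0        # speed of light, m/s
F_L1 = 1_575_420_000.0   # GPS L1 carrier frequency, Hz

def carrier_doppler_hz(v_radial_ms: float) -> float:
    """One-way Doppler shift for a given line-of-sight (radial) velocity."""
    return v_radial_ms / C * F_L1

# An assumed ~800 m/s line-of-sight velocity gives a shift of a few kHz,
# which the receiver must track out before it can measure anything else:
print(carrier_doppler_hz(800.0))  # ~4.2 kHz
```

The receiver's tracking loops remove this shift continuously, and as a by-product the measured shift itself is a very clean velocity observable.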
The velocity also causes a relativistic effect: time dilation from the spacecraft's speed causes its clock to tick slower.
There is also a gravitational frequency shift (gravitational redshift): Earth's gravitational field blueshifts a signal going down and redshifts one going up.
All of which can be accounted for like you mentioned. Errors can also be introduced into the system and removed by external corrections like DGPS.
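The two relativistic effects above can be put on a back-of-the-envelope footing (a simplified sketch: the ground clock is placed at the mean Earth radius, and the Earth's rotation and oblateness are ignored):

```python
# Net relativistic clock offset for a GPS satellite vs. a ground clock
import math

GM = 3.986004418e14   # Earth's gravitational parameter, m^3/s^2
C = 299_792_458.0     # speed of light, m/s
R_EARTH = 6.371e6     # mean Earth radius, m
R_SAT = 2.6561e7      # GPS orbital radius (~20,200 km altitude), m
DAY = 86400.0

v = math.sqrt(GM / R_SAT)                  # circular orbital speed, ~3.87 km/s
sr = -(v**2) / (2 * C**2)                  # special relativity: clock runs slow
gr = GM * (1/R_EARTH - 1/R_SAT) / C**2     # general relativity: clock runs fast

print(sr * DAY * 1e6)         # ~ -7.2 microseconds/day
print(gr * DAY * 1e6)         # ~ +45.7 microseconds/day
print((sr + gr) * DAY * 1e6)  # net ~ +38.5 microseconds/day if uncorrected
```

Left uncorrected, a ~38 µs/day clock error would translate to kilometres of position error per day, which is why the satellite clocks are deliberately offset before launch.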
[quote]Yet, the electronics driving all this has response times in the nano-second region. So how are they achieving this magic?[/quote]
LIDAR: No, nanosecond resolution is fine. Light travels about 30 centimetres in a nanosecond. If you consider a target about 300 metres away, that gives you roughly 1 µs of flight time each way, about 2 µs out and back. At the next sample, 100 ms later, the return time has changed by about 7 nanoseconds (for a target closing at roughly 40 km/h). Resolving that difference needs a clock of around 150 MHz. Not at all unmanageable for a simple integration using a simple GAL.
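The timing arithmetic for this scenario can be sketched as follows (assuming a 10 Hz sample rate; the 40 km/h closing speed is an illustrative figure chosen to land near the ~7 ns difference mentioned):

```python
# Return-time change between successive LIDAR samples for a moving target
C = 299_792_458.0  # speed of light, m/s

def return_time_s(range_m: float) -> float:
    """Round-trip time of flight to a target at range_m."""
    return 2.0 * range_m / C

def delta_t_ns(speed_kmh: float, sample_interval_s: float = 0.1) -> float:
    """Change in round-trip time between two samples, in nanoseconds."""
    moved = speed_kmh / 3.6 * sample_interval_s  # metres closed per sample
    return 2.0 * moved / C * 1e9                 # round trip, in ns

print(return_time_s(300.0) * 1e6)  # ~2.0 us round trip at 300 m
print(delta_t_ns(40.0))            # ~7.4 ns shift at 40 km/h, 10 Hz sampling
```

The point is that the picosecond-sounding problem becomes a nanosecond one once you measure the *change* between samples rather than each absolute flight time in isolation.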
The real question is which system do you trust the most?
The real answer is: none of them. You can use all three to calibrate off each other and determine where their errors are and by how much.
Let's call it statistical phase noise. You have four noise sources: radar, LIDAR, GPS and the reference. I would also build myself a laser amphometer for shits and giggles, as another comparison.
Measure each multiple times (lots), plot the data in a graph with error bars, then correlate the data and compare it with a Gaussian bell curve.
You won't get an exact answer, but if you compare each system to that curve you will be able to make a good guess as to how far out you're likely to be.
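A toy version of that procedure might look like this (the bias and noise figures are invented purely for illustration; real values would come out of the repeated measurements themselves):

```python
# Sketch: characterise each instrument by repeated measurement of the same
# speed, then compare means (bias) and spreads (noise). All figures invented.
import random
import statistics

random.seed(1)
TRUE_SPEED = 60.0  # km/h, assumed reference value

def simulate(bias: float, sigma: float, n: int = 1000):
    """n readings from an instrument with a fixed bias and Gaussian noise."""
    return [TRUE_SPEED + bias + random.gauss(0.0, sigma) for _ in range(n)]

instruments = {
    "radar": simulate(bias=-1.8, sigma=0.3),  # ~3% low, but tight spread
    "gps":   simulate(bias=0.0,  sigma=1.0),  # unbiased, noisier per fix
    "lidar": simulate(bias=0.2,  sigma=0.5),
}

for name, data in instruments.items():
    mean = statistics.mean(data)
    spread = statistics.stdev(data)
    print(f"{name}: mean {mean:.2f} km/h, stdev {spread:.2f} km/h")
```

With enough samples, a consistent offset (like a radar reading 3% low) separates cleanly from random scatter, even though no single reading tells you which instrument is right.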
In terms of police speed checks. The cops aren't going to bother whacking you for trivial differences.
20 km/h over the speed limit? Ha... nope... you're well outside the error bars and the benefit of the doubt.
How accurate does each need to be considering the application?
Yes I am an agent of Satan, but my duties are largely ceremonial.