Why GPS isn't enough.
The first time I watched an aide spoof her location, I was sitting in a coffee shop in Tampa with a day's worth of ride-along data on my laptop. The aide had clocked into a patient's home in Lakeland — except the visit happened on a Tuesday, and on Tuesday her phone never left a parking lot in St. Petersburg.
The GPS coordinates on the EVV record were pristine. The check-in was on time, on address, on patient. Anyone reviewing the claim — including the state — would have seen a clean visit.
This is the part of home-health verification nobody wants to talk about: GPS is the easiest signal to fake. Free apps in the Play Store will spoof location with two taps. Developer mode toggles do it natively. Older Android handsets will accept a fake location from a Bluetooth GPS dongle without complaining. iOS makes it harder, but a jailbroken phone is a five-minute job.
What "GPS-only" EVV actually verifies
If your visit verification stack is just GPS — and most are — what you've actually verified is this: a phone, somewhere, sometime, said it was at an address. You haven't verified that the phone belongs to the aide of record. You haven't verified that the human holding the phone is the aide of record. You haven't verified that the location signal is real and not synthesized.
You've verified an assertion made by an app, by a phone, that can be tampered with by anyone holding that phone.
The four ways the geofence breaks
- Mock location apps. Free, abundant, hard to detect from server-side signals alone.
- Bluetooth GPS spoofing. An external dongle feeds a fake NMEA stream. The phone's OS happily reports the fake location upstream.
- Dev-mode toggles. A motivated aide enables Developer Options and "Allow mock locations" with one tap. No app required.
- Emulator-based fraud. Less common at the individual aide level, more common in coordinated agency-level schemes. The "phone" is a virtual machine running anywhere on earth.
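None of these breaks is reliably detectable from the server side, but coarse plausibility checks do catch the sloppier cases. Here is a minimal sketch in Python (all function and threshold names are my own, purely illustrative): flag any pair of consecutive check-ins whose implied travel speed is physically impossible, like the Lakeland/St. Petersburg pair above.

```python
import math
from datetime import datetime, timezone

EARTH_RADIUS_KM = 6371.0
MAX_PLAUSIBLE_KMH = 120.0  # assumption: aides travel by road, not by aircraft

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in kilometres."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * EARTH_RADIUS_KM * math.asin(math.sqrt(a))

def implied_speed_kmh(prev, curr):
    """Speed implied by two (lat, lon, timestamp) check-ins."""
    dist_km = haversine_km(prev[0], prev[1], curr[0], curr[1])
    hours = (curr[2] - prev[2]).total_seconds() / 3600.0
    return float("inf") if hours <= 0 else dist_km / hours

def is_implausible(prev, curr, max_kmh=MAX_PLAUSIBLE_KMH):
    """True when the pair of check-ins implies impossible travel."""
    return implied_speed_kmh(prev, curr) > max_kmh
```

A check-in near St. Petersburg followed ten minutes later by one near Lakeland (roughly 70 km apart) implies a speed of several hundred km/h and gets flagged. This catches careless spoofing, not careful spoofing: an attacker who fakes a consistent itinerary sails through, which is why the real fix is in the section below, not here.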
What a defensible verification stack looks like
The fix isn't more GPS. The fix is more sources of truth. A visit verification token that's worth anything to a state auditor needs to bind together at least three independent assertions:
- Identity. Is this the aide of record? Biometric, on-device, bound to the device key.
- Location. Is this the patient's address? GPS plus anti-spoof plus accuracy plus timing metadata.
- Time. Is the visit happening now? A signed timestamp from a trusted clock, not the phone's wall time.
Sign all three together with a key the aide cannot extract from the device. That's what a real visit token is. Anything less is a signature on a piece of paper that says "I promise I was here."
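The binding is the point: one signature over all three assertions, so tampering with any one of them invalidates the whole token. A minimal sketch in Python, with every field name illustrative. A production token would be signed with an asymmetric key held in the phone's hardware keystore, which the aide cannot extract; HMAC stands in here only so the sketch is self-contained and runnable.

```python
import hashlib
import hmac
import json

def mint_visit_token(device_key: bytes, identity: dict, location: dict,
                     timestamp: float) -> dict:
    """Bind identity, location, and time into one signed token (illustrative)."""
    claims = {
        "identity": identity,    # e.g. {"aide_id": ..., "biometric_ok": True}
        "location": location,    # GPS fix plus accuracy and anti-spoof metadata
        "timestamp": timestamp,  # from a trusted clock, not the phone's wall time
    }
    # Canonical serialization: same claims always produce the same bytes.
    payload = json.dumps(claims, sort_keys=True, separators=(",", ":")).encode()
    sig = hmac.new(device_key, payload, hashlib.sha256).hexdigest()
    return {"claims": claims, "sig": sig}

def verify_visit_token(device_key: bytes, token: dict) -> bool:
    """Re-derive the signature over the claims and compare in constant time."""
    payload = json.dumps(token["claims"], sort_keys=True,
                         separators=(",", ":")).encode()
    expected = hmac.new(device_key, payload, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, token["sig"])
```

Edit the location after signing and verification fails; the attacker can no longer spoof one factor in isolation, which is exactly what GPS-only EVV allows.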
The honest version
I'll close with the honest version: even three-factor verification can be defeated by a sufficiently motivated and technical attacker. The point isn't to be uncrackable. The point is to raise the cost of fraud high enough that the volume drops, the patterns become detectable, and the auditor has something signed and cryptographically bound to walk into court with.
GPS-only EVV doesn't clear that bar. It never did. The 21st Century Cures Act asked states to require EVV. It didn't define what EVV had to be — and the gap between "captured the data" and "verified the visit" is where the fraud lives.
That gap is what we built VisitLock to close.
Ben Adesina is co-founder and CTO of VisitLock. Before VisitLock, he led the biometric platform team at a major identity-verification vendor.