> Are we really approaching the reality where we need 2FA to trust the other person is really who they are?
Yes we are. We are approaching a world where people will not want to invest their time, energy, and emotion engaging with other supposed humans remotely unless they have verified their personhood / identity.
I'm actually surprised we have not yet seen an entire industry built around human authentication. Seems like Apple is the only company taking this seriously.
The standard of human interaction will either be meeting IRL or signing communications biometrically.
>I'm actually surprised we have not yet seen an entire industry built around human authentication. Seems like Apple is the only company taking this seriously.
Completely agree. But given how quickly AI voice cloning is improving, and how quickly image-generation tools are improving, how long will it be until someone can sample my voice well enough to command Siri to do something and Siri won't be able to tell the difference? How long will it be until an AI-generated photo of me is so realistic that it can unlock my phone via Face ID?
I have a feeling that's going to happen in a shorter timeframe than we all think it will. I mean, my voice is somewhat similar to my father's, and if I'm around his phone, I can say "Hey Siri" and his phone will think it's his voice, not mine.
We're going to see an arms race between AI and biometrics / authentication technology. My hope is that the latter outpaces the former, but it doesn't seem to be panning out that way.
In the future, we might be only able to unlock our phone or command Siri when hooked up to a continuous biometrics sensor.
You're probably not too far off from what will eventually become reality.
Perhaps that's Apple's angle with the AR/VR headset it supposedly has in the works - if you've got your headset on, you can unlock your devices, and if you don't, you have to manually unlock devices via a password or PIN or whatever.
The bulk of this remains a solvable security issue almost entirely centred around telecom. Spoofing caller IDs is way too easy. SIM swapping is also another issue that can be addressed with better security practices by phone companies.
Most young people aren't even using phone numbers/SMS these days except when forced to, so the threat will stay with the older generation and either die off with them, or persist until the phone companies solve this problem... Or some digital voice system replaces it.
Otherwise it's just email phishing. Anything involving money transfers is already locked down and mostly a matter of fooling people... Which again is mostly older people with a poor grasp of technology and common threats (modern street smarts).
Read the story. Notice that the call came from an unknown number, and the scammer was playing the role of a kidnapper calling from the kidnapper's phone, and the "authentication" was the daughter's faked voice. There's nothing the telecom system could have done about that. This particular attack circumvented any possibility of "telecom security" being useful. And if "telecom security" gets better, there's more incentive for scammers to keep finding ways to make it irrelevant.
If you want a technical solution, you have to demand that a person always contact you from their own device as "2FA", or at least that they have some kind of 2FA device on them... except there are a billion ways that somebody might lose access to such a device when they were in genuine trouble, and scammers are totally capable of making it look like one of those scenarios has happened.
You're really down to the point where people have to do "mind to mind" authentication with shared secret knowledge... while under extreme stress... in a case that's uncommon enough that most people will not have practiced it.
This is just what GP called "modern street smarts". People won't keep getting fooled like this for long. Just like people had to learn to stop trusting everything they heard on TV, and learn to stop trusting every pop-up on every website. We will develop new habits, such as what you call "mind to mind authentication" or verifying through a separate trusted channel.
So, here's the thing: I have a 15 year old daughter. If she were actually snatched by a kidnapper and threatened with rape/murder/whatever, I am not absolutely sure that she would remember and execute a "code word" protocol. Especially not a protocol that had the extra measures to help keep it from being subverted in various ways, but maybe not even a very simple protocol.
Not sure enough to feel really comfortable betting her life on it, anyway. Not if we hadn't drilled it on a daily basis for weeks and a weekly basis for months.
It's easy to blank on things when you're adrenalized, say if you've been kidnapped. And it's also easy to blank on things when you're adrenalized because you are hearing the person's voice saying they've been kidnapped.
... and if I asked a scammer pretending to have kidnapped her to let me call her on her phone, I would expect to get the obvious reply: "I threw her phone away. I'm not dumb enough to let you track me/her/us through it". Which is totally credible because that's what a kidnapper should do.
When you get the call, the strong prior probability is that the whole thing is a scam, but that's not so easy to hold onto in a situation like that. And even if you do hold onto it, you will be scared.
Oh, and on edit: Yes, I expect I would keep it together enough to call her on her phone to check, since if she hasn't been kidnapped there's nothing stopping her from answering it. I don't know if I'd expect that of others. But it's also true that if I call her, and nothing is actually wrong, I still expect about a 50-50 response rate because she doesn't hear the thing, has it on mute, or is in school and forced to keep it in her locker, or has let the battery run down, or whatever.
If millions of people start getting fake kidnapping calls every day, then I'm sure we will stop falling for it quite soon. There's a limit to how many days in a row you will keep sending money to every random AI-empowered guy calling to convince you he has taken your daughter, only for her to return home from school like normal just a few minutes later.
I don't think it will be long before video and audio is no more convincing than text is now. We will stop falling for the AI scams, just like we (most of us) stopped falling for the Nigerian princes, scam ads/virus popups on the web, and those fake emails from family members claiming they need us to wire money so they can pay for a flight home or whatever. Basically, my thesis is that people will start to get wise to any scam that is sufficiently common and harmful.
Real kidnappers might have to learn to work to convince the families that the kidnapping is real and not just another scam.
And what happens when someone surfaces a video of you spouting off racial slurs or running down a street naked, generated literally with a simple text prompt? Do you think your employer, colleague, or even spouse will take the reputational risk of sticking around even if the video is completely AI-generated?
Every piece of communication will have to be signed to verify authenticity.
Camera makers are already looking into embedding authentication technology at the hardware level.
All a digital signature can prove is possession of a secret, it can't prove that some process was followed in generating an image.
It's impractical to have authenticity enforced by millions of consumer devices. If any random manufacturer in a third-world country can generate millions of valid signing keys to embed in millions of cheap phones or cameras, then plenty of employees there can also leak a batch of valid keys if the price is right, and those keys can sign anything in a way that's indistinguishable from all those cheap cameras. And if you revoke the signatures of any "untrustworthy" manufacturers (really, any manufacturer can be and will be compromised, especially if some government wants to manufacture propaganda), people won't stop using those phones/cameras just because their signatures aren't considered kosher. You'll still have millions of people uploading genuine, valid images from those cameras, so either people will have to trust invalid signatures or refuse lots of benign content, and I'll bet most will choose the former.
And of course a camera doesn't know if it's taking a picture of reality or another picture - it would take a relatively simple optical setup to allow any camera to record (and sign) an image off of another screen, so if some authenticity-verification system was actually working and popular, any reasonable criminals would do it and have signed recordings of their deepfakes from a real major brand camera.
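The point that a signature proves possession of a key rather than provenance can be sketched in a few lines. This is a toy model (HMAC with a symmetric key standing in for a camera's hardware signing key; the key value and byte strings are made up): anyone who can feed bytes through the key, including pixels captured off a monitor showing a deepfake, gets a signature that verifies exactly as well as one over a genuine photo.

```python
import hashlib
import hmac

# Hypothetical: a signing key baked into a camera. A real device would
# use an asymmetric key in secure hardware, but the argument is the
# same either way: the signature binds bytes to a key, not to how the
# bytes were produced.
CAMERA_KEY = b"secret-key-baked-into-the-camera"

def sign(image_bytes: bytes) -> str:
    """Sign arbitrary bytes with the camera's key."""
    return hmac.new(CAMERA_KEY, image_bytes, hashlib.sha256).hexdigest()

def verify(image_bytes: bytes, signature: str) -> bool:
    """Check a signature; says nothing about where the bytes came from."""
    return hmac.compare_digest(sign(image_bytes), signature)

# A genuine photo of a real scene...
genuine_photo = b"pixels captured from a real scene"
# ...and a deepfake shown on a screen, then re-photographed by the
# same camera. The sensor just sees pixels in both cases.
rephotographed_deepfake = b"pixels captured off a monitor showing a deepfake"

sig_real = sign(genuine_photo)
sig_fake = sign(rephotographed_deepfake)

# Both verify identically: key possession is all that is proven.
assert verify(genuine_photo, sig_real)
assert verify(rephotographed_deepfake, sig_fake)
```

Swapping in Ed25519 or a TPM-backed key changes nothing about this limitation; that's why efforts like content-provenance metadata try to attest to the capture process as well, and why the screen-recapture attack above still undermines them.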
Where exactly does this video "surface"? A random person on the internet?
If it's someone you know, with a reputation that makes them believable, then that's basically a criminal act, and its impact depends heavily on that person's reputation. Even without that, people should give you enough benefit of the doubt that it amounts to no more than simple harassment (which is what you're describing, where you lose your job or relationships).
Otherwise I'm not sure why your spouse or boss would find some video coming from barelycartwheel48292@gmail.com more believable than the person themselves.
Anyway, I was talking about overt scams, not the wider cultural/social implications. This is a large new burden being imposed on society, but we will learn to adapt, and society will manage.
This is starting to happen when you engage with the state (small "s" in the US). Things like pictures of your driver's license and 180-degree videos of your face, further coupled with a quick FaceTime video with a real person reviewing the data.
It feels horrible and invasive, like the state has outsourced it to a third party you don't know and have no control over or direct engagement with.
Every single time in history that we've had this level of authentication, a 'strongman' type has used that information against the population. Whether it's simple, strong societal shame in smaller communities or a massive program to track and/or persecute a group of people.
> The standard of human interaction will either be meeting IRL or signing communications biometrically.
From a convenience standpoint, it seems like we've gotten so technologically advanced that we are starting to move backwards. Is there a name for this type of phenomenon?
The physical world is what we are fully integrated into as embodied beings along dimensions we do not yet and may never understand. It is the only space that has an inherently scarce quality, which seems to be something we need as humans. IMO technology should enable us to deepen our connection in the "real world," not isolate us in piss poor representations built on the belief that we are merely mind-bearing mechanical vessels to be manipulated.
AI is going to drive us back to fully-embodied living, and that's good.
>> The standard of human interaction will either be meeting IRL or signing communications biometrically.
Isn't Sam Altman involved in some crypto company trying to collect biometric data, using some orb thing, in exchange for worthless crypto? No idea if he still is, but I'm more and more convinced that some folks took every single cyberpunk story not as satire or a warning, but as a playbook and something to strive for.
> We are approaching a world where people will not want to invest their time, energy, and emotion engaging with other supposed humans remotely unless they have verified their personhood / identity.
We were past it years ago in some domains.
I used to do a lot of language exchanges, and there is/was a need to screen for people who are solely using machine translation. It's pointless correcting someone using machine-generated output.