Hacker News

Last time I applied for a credit card online, they asked me to take a video of myself and turn my head from side to side.


This sounds like a great way to get sufficient images/video of you to create a deepfake that could pass this test. Hmmm...


New mandatory security rule: Employees must never turn their heads side to side in a meeting.


Microsoft Teams developed a feature where, if you’re using a background and turn sideways, your nose and the back of your head are automatically cut off.

Bug closed, no longer an issue, overcome by events.


Interesting that you bring that up. The most egregiously invasive student and employee monitoring software requires that the subject always face the camera. That seems most ripe for bypassing with the current state of deepfakes. https://www.wired.com/story/student-monitoring-software-priv...


I work as a Digital Gardener[1] and we’re trained to NEVER use our real name.

- [1] https://youtu.be/XQLdhVpLBVE


My bank uses a much better system: they ask for a photo of you holding your ID and a bit of paper with a number the support person gave you for authorizing larger transactions. It's still not bulletproof, but since you already have to be logged in to the app to do this, I'd say it is sufficient.


Interesting anecdata! Do you think the photo is authenticated/validated automatically (by software), by a human, or a combination (software-assisted)?

And, if you are willing to share, what country/bank?


In this case I was on the bank's text support, requesting to make a transaction of $100,000 in one go, which the app would not let me do. So it was a real person on the other side. The bank was Up, in Australia.


This sounds like a good thing. An extra step in a $100,000 transaction to prevent accidents or crimes definitely feels justified if the account's not marked as normally moving heaps of money, like a billionaire's or something.


Yeah, this is quite common with fintech KYC (stock brokers and crypto, IME) nowadays, I've noticed.


May I ask what card/institution? This would be an immediate no for me.


I'd trust the data with a (real, not online) bank more than most other companies like Google.

I'd be more worried about people hacking into networked security camera DVRs at stores and cafes and extracting image data from there. Multiple angles. Movement. Some are very high resolution these days. Sometimes they're mounted right on the POS, in your face. Sometimes they're actually in the top bezel of the beverage coolers.

Banks are the hardest way to get this data, not the easiest one.


> Banks are the hardest way to get this data, not the easiest one.

Is this statement based on data or a hunch? A quick google turns up a lot of bank data breaches.


> A quick google turns up a lot of bank data breaches.

Because banks have to report data breaches. Do you think every neighborhood Gas-N-Blow is publicizing, or even knows, that it's been hacked?


Good point. I’m still wary of just assuming (if that’s what we’re doing here?) that old established organizations you’d expect to be secure are in fact secure. For example I would have expected credit rating agencies to be secure…

Mandatory reporting certainly helps IMO. Reporting should be mandatory for anyone handling PII.


No bank is going to run such a system in-house. It will be a contracted service whose data is one breach away from giving fraudsters a firehose of data to exploit their victims.


You would? You would trust a random number to call you and talk to you about your bank account?

(That's what Chase's fraud department tells you to do.. no joke)


"I trust you more than Google" is a pretty low bar in terms of personal data.


I want to know so that I can forward this to lawyers that specialize in biometric privacy law (in IL).

Fuck these biometric data farmers.


Yes, I believe this sideways-turning thing is mandatory when doing online identification.


What is an "online identification"? In what context would such a thing occur?


And now that scan could eventually end up out there someplace.


Agreed. Now they have the data to deep fake you turning your head.

I hope they delete the data immediately after use.


Frankly, of all the personally identifying data I share with my bank, a low resolution phone video of the side of my head is the least worrying. It's like worrying the government knows my mum's maiden name!

In the eventuality that robust deepfake technology to provide fluid real-time animation of my head from limited data sources exists and someone actually wants to use it against me, they can probably find video content involving the side of my head from some freely available social network anyway.


I've been looking to rent housing and get a new job the last few months. The amount of info I've sent strangers always worries me.

At least with housing, they don't ask me to re-enter the information I've already sent them into their crappy website.


And, if deepfake technology becomes so easy to use, video of your face will no longer serve to identify you.


The implementation I’ve seen only stores a hash derived from the image analysis.
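For the curious: one way to "store a hash instead of the image" is a perceptual hash such as average hash (aHash), which reduces an image to a compact fingerprint so similar images can be matched without retaining the original photo. This is only a hypothetical sketch of the general idea; real biometric systems use far more sophisticated feature embeddings, and nothing here reflects any particular bank's implementation. The 8x8 input and function names are made up for illustration.

```python
def average_hash(pixels):
    """Compute an average hash from a square grayscale image.

    pixels: 2D list of grayscale values (0-255), assumed already
    downscaled to a small grid (real code would resize a full image).
    Returns an integer whose bits mark pixels brighter than the mean.
    """
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    bits = 0
    for p in flat:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits


def hamming_distance(h1, h2):
    """Number of differing bits; a small distance means similar images."""
    return bin(h1 ^ h2).count("1")


# Two nearly identical 8x8 "images" hash to nearby values, so matching
# can happen without keeping the original photo around.
img_a = [[(r * 8 + c) * 4 for c in range(8)] for r in range(8)]
img_b = [row[:] for row in img_a]
img_b[0][0] = 255  # one-pixel change

print(hamming_distance(average_hash(img_a), average_hash(img_b)))  # → 2
```

The privacy win is that the stored fingerprint can't be inverted back into the face video, though a determined attacker with the hash function could still test candidate images against it.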



