Hacker News

Surely someone's going to figure out a way to "talk" to Alexa in a pitch that it can hear but humans cannot?
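This has in fact been demonstrated: the "DolphinAttack" research showed that a voice command amplitude-modulated onto an ultrasonic carrier (above ~20 kHz) is inaudible to humans, yet the nonlinearity in a MEMS microphone's front end demodulates it back into the audible band, where the assistant's speech recognizer picks it up. A minimal sketch of the modulation step (the carrier frequency, sample rate, and toy 400 Hz "voice" tone here are illustrative choices, not values from the paper):

```python
import math

SAMPLE_RATE = 192_000  # high rate needed to represent an ultrasonic carrier
CARRIER_HZ = 25_000    # above the ~20 kHz ceiling of human hearing

def ultrasonic_am(baseband):
    """Amplitude-modulate a baseband signal onto an ultrasonic carrier.

    Humans can't hear the result, but a microphone's nonlinear front end
    can demodulate the envelope back into the audible band.
    """
    out = []
    for n, s in enumerate(baseband):
        t = n / SAMPLE_RATE
        carrier = math.cos(2 * math.pi * CARRIER_HZ * t)
        # Standard AM: the DC offset keeps the envelope non-negative,
        # so the original waveform rides on the carrier's amplitude.
        out.append(0.5 * (1.0 + s) * carrier)
    return out

# A toy 400 Hz tone standing in for a recorded voice command.
n_samples = int(SAMPLE_RATE * 0.01)
tone = [math.sin(2 * math.pi * 400 * n / SAMPLE_RATE) for n in range(n_samples)]
modulated = ultrasonic_am(tone)
```

Played through a speaker capable of ultrasonic output, the `modulated` signal carries the command silently; the defense side is harder, since filtering ultrasound before the nonlinearity requires hardware changes.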

But even if humans can hear the fraudulent commands, what's the defense beyond a confirmation?


