
I'm having trouble getting worked up about this one. Yeah, a privacy breach happened, but it was only one person's data exposed, and only to one other person.

The only reason it made the news is because people are already paranoid about voice assistants.



Was the data supposed to be stored in the first place according to the privacy policy/user consent? If not, it'd mean that Amazon stored highly sensitive data (audio recordings from people's bedrooms) illegally and in breach of users' trust.


If you enable Alexa to tune to your voice, it retains recordings. You can listen to the recordings in your Alexa mobile app, so it shouldn't be that much of a surprise.

It shouldn’t be a shocker if you understand ML and infer what “tuning to your voice” implies. Most people aren’t sophisticated enough to infer that, however.

To me, this is a big demonstration of the unintended consequences of GDPR. It sounds like a great law, but forcing companies to share everything they know about you increases the risk that a nice package of everything is shared with the wrong party.


Was there something in the article to suggest that it wasn't collected and stored in accordance with policy?


The interesting fact is how much data they store for no apparent reason. They could perform speech-to-text on the fly and throw away the recording.
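
A minimal sketch of that transcribe-and-discard idea, assuming Python and the third-party SpeechRecognition package (plus PyAudio for the microphone), not anything Amazon actually runs: the raw audio only ever lives in memory and is dropped as soon as the text is extracted.

    # Transcribe-and-discard sketch: keep the transcript, never persist the audio.
    # Assumes: pip install SpeechRecognition pyaudio  (illustrative only)
    import speech_recognition as sr

    recognizer = sr.Recognizer()

    with sr.Microphone() as source:
        audio = recognizer.listen(source)          # one utterance, held only in memory

    try:
        text = recognizer.recognize_google(audio)  # speech-to-text on the in-memory buffer
    except sr.UnknownValueError:
        text = ""                                  # unintelligible audio yields nothing

    del audio                                      # raw recording discarded here
    print(text)                                    # only the transcript remains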

Machine learning? Do it with people paid to talk to Alexa.


1.) It's not actually scalable to pay enough people to talk to Alexa to train these models.

2.) That data would not be representative of what real users are doing, so it would bias the models.


I'm not necessarily disagreeing with the thrust of your argument (do you really need to store all that?), but constraining your sample to people paid to talk to Alexa can introduce huge amounts of bias. You'd need to make sure the people you pay also reflect all the accents and languages of the people who use Alexa. On top of that, without some amount of real voice data, how are you even to know what that accent breakdown looks like? That's a near-impossible task.


That's one way of looking at it. There's little we could expect Amazon to do to prevent this, given they already store the data.

On the other hand: there's little we could have expected Amazon to do to prevent this, given they already store the data.


The only other kind of data I could imagine provoking this kind of reaction would be Nest cameras or similar. I'm pretty accepting of voice assistants now, but even I would be pretty put out by the thought of video of me in my jammies going to a stranger without my permission.


I was under the impression this was already somewhat the case. Don't many doorbell cameras offer unfettered access to law enforcement?



