
This goes along with the news that Windows 10 backs up your drive encryption key by default, and that Microsoft can use it to decrypt your data. In "good faith", of course.


For most users, this protects them to a useful level. Most users don't think losing a password is a big deal and would be very upset to learn their data is lost because they forgot. That's an anti-feature.

The number of people that'll be protected from leaving their laptop in a taxi, or home burglary, or selling/trading-in a device, or just snoopy relatives or acquaintances, etc. is large and MS absolutely made the right call here. Otherwise, you'd have "experts" giving advice to disable this feature or suffer data loss.

Also, if they use OneDrive to back stuff up (like they should!), the security damage is already done as most juicy files will be unencrypted in MS's hosting and still subject to warrants.


> For most users, this protects them to a useful level. Most users don't think losing a password is a big deal and would be very upset to learn their data is lost because they forgot. That's an anti-feature.

"Would you like to store a backup for your drive encryption password on Microsoft OneDrive? If you choose not to do so, and you forget your password, all of your data will be lost. [Yes/No]"

And none of that warrants a ToS that says they can use that backup for anything other than helping you recover your data.

> Also, if they use OneDrive to back stuff up (like they should!), the security damage is already done as most juicy files will be unencrypted in MS's hosting and still subject to warrants.

Which is why client-side-encrypted backups are a good idea.
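As a minimal sketch of that idea (Python, using the third-party cryptography package; the file names and the omitted upload step are placeholder assumptions, not anything OneDrive actually exposes):

    # Encrypt locally before uploading, so the hosting provider only ever sees ciphertext.
    from cryptography.fernet import Fernet

    # Generate a symmetric key and keep it somewhere you control.
    key = Fernet.generate_key()
    with open("backup.key", "wb") as key_file:
        key_file.write(key)

    # Encrypt the file client-side...
    with open("tax-return.pdf", "rb") as plaintext_file:
        ciphertext = Fernet(key).encrypt(plaintext_file.read())

    # ...and upload only the encrypted blob (upload step omitted here).
    with open("tax-return.pdf.enc", "wb") as backup_file:
        backup_file.write(ciphertext)
The trade-off is the one discussed upthread: whoever holds backup.key holds the data, so losing the key means losing the backup.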


>"Would you like to store a backup for your drive encryption password on Microsoft OneDrive? If you choose not to do so, and you forget your password, all of your data will be lost. [Yes/No]"

You know when most people care about whether or not they can recover their data? It's not when someone asks them a Yes/No question; it's when they can't recover their data. And responding with "Well, remember two years ago when you clicked 'No'?" doesn't really help.


> Would you like to store a backup

You're quite right in your response to the GP.

It's dangerous to ask Joe Sixpack a question like this and accept a simple y/n answer. When people are installing software, they rush through without thinking.

What the software could do is put something like this on the screen:

   DO NOT STORE A BACKUP OF MY PASSWORD
and force the user to opt out by literally typing all those characters, exactly like that:

   DO NOT STORE A BACKUP OF MY PASSWORD
At least that way it isn't an unthinking, rote response. Maybe also force them to type:

   I UNDERSTAND ALL MY DATA WILL BE LOST
Anyway, that would probably only help about half of them. The other half won't care until, as you point out, they can't access their data.
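A rough sketch of that kind of opt-out flow (Python; the phrases are the ones above, and the function name is made up for illustration, this is obviously not what Windows setup does):

    # Default to backing up the key; opting out requires typing two exact phrases.
    OPT_OUT_PHRASE = "DO NOT STORE A BACKUP OF MY PASSWORD"
    CONFIRM_PHRASE = "I UNDERSTAND ALL MY DATA WILL BE LOST"

    def wants_key_backup() -> bool:
        """Return False only if the user deliberately types both phrases, character for character."""
        print("To opt out of password backup, type exactly: " + OPT_OUT_PHRASE)
        if input("> ") != OPT_OUT_PHRASE:
            return True  # anything else keeps the safe default
        print("Now type exactly: " + CONFIRM_PHRASE)
        return input("> ") != CONFIRM_PHRASE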


Then don't make it a [Yes/No] question; make it an [Okay] with a barely noticeable "Change Advanced Settings" link, much like the dialog that the TFA complains about.


Why show the non-prompt? The users who don't want it can turn it off later, as is the case now.


"Later", as in after the key is sent to Microsoft? Not very useful.


I want to encrypt my drive, but ensure that the encryption keys never leave systems that I physically control.

I sure as hell don't want to encrypt my drive a second time because the default setting (that I could only change later, when I'm actually using the computer) for the drive encryption software was to upload the drive crypto key to The Cloud.

Or am I misunderstanding your question?


Which is a great reason why we should be seeking better authentication systems than memorizing random passwords. Or provide people with another means of backup; for instance, offer options to both "back up via print" and "back up to USB device", in both cases producing something you could then store in a secure location.
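A toy sketch of what those two options might look like (Python; the key format is only loosely styled after BitLocker's 48-digit recovery password, and the USB mount point is an assumption):

    # Offer "back up to USB" and "back up via print" instead of cloud escrow.
    import secrets
    from pathlib import Path

    def make_recovery_key() -> str:
        # Eight groups of six random digits.
        return "-".join(f"{secrets.randbelow(1_000_000):06d}" for _ in range(8))

    def back_up_to_usb(key: str, mount_point: str = "E:/") -> None:
        # Assumes a removable drive is mounted at mount_point.
        Path(mount_point, "recovery-key.txt").write_text(key)

    def back_up_via_print(key: str) -> None:
        # Stand-in for a real print dialog: write a printable page to disk.
        Path("recovery-key-printout.txt").write_text(
            "Print this page and store it somewhere safe:\n\n" + key + "\n")

    key = make_recovery_key()
    back_up_to_usb(key)
    back_up_via_print(key)
Either way, the key ends up on something physical that stays in your hands rather than in anyone's cloud.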


Which the BitLocker stuff already does, when you run it manually. I suspect not with devices that come pre-encrypted - Surfaces, etc.

I was going to say that the most likely thing folks have done is try to back up their key to the drive they're encrypting - but the last time I tried that (because I wasn't on a network), I think the BitLocker wizard refused to let me.


I can see that one as a default for non-nefarious reasons. If you make it easy for non-specialist users to encrypt their machine, and there is no way for technical support to recover their data, you're going to inevitably end up with angry users. What's not as clear to me is whether you can change the default.


That's a new one on me.

BitLocker keys can be backed up to OneDrive if you want, but you can also store them in a TPM or a smartcard (physical or virtual).


See http://thenextweb.com/microsoft/2015/07/29/wind-nos/ ; "Windows 10 automatically encrypts the drive it's installed on and generates a BitLocker recovery key. That's backed up to your OneDrive account." Together with the ToS: "We will access, disclose and preserve personal data, including your content (such as the content of your emails, other private communications or files in private folders), when we have a good faith belief that doing so is necessary to protect our customers or enforce the terms governing the use of the services."


> We will access, disclose and preserve personal data, including [...] files in private folders

I don't see any language that restricts that to their cloud offerings. It's in the privacy statement that covers Windows too.

So unless I'm missing something, they're granting themselves the right to disclose your hard drive to government agencies or their own legal department on a good-faith basis.


Every company that has access to your encryption keys can be compelled to give them up with a warrant.

You can keep them from having the key; that's one way around it: using hardware of some kind (and there are multiple options).

You are also free to use another solution that might meet your strict requirement to personally review the encryption, filesystem, device driver, and memory management code of your operating system and verify it's operating to your specifications. There have literally never been so many options for the privacy-minded person with the time to pore through a metric ton of C code.


I'm not talking about encryption keys.

I'm talking about the data itself. Sitting on my harddrive, as it is.

As I understand it, Microsoft is saying that they could siphon data from my computer if they deemed it necessary.

Maybe that's an adversarial reading of their privacy statement[1]. But it clearly speaks of accessing files in private folders.

[1]: https://www.microsoft.com/en-us/privacystatement/default.asp...


I believe that you are mistaken. Could they turn over your BitLocker recovery key to the authorities, who would then use it to decrypt the HDD they have already taken from you? Yes.

Are they going to reach out over the internet and take your data? No. They are not going to do that. I follow this stuff really closely. I promise I haven't seen or heard of a capability where they can remotely take data from your machine and turn it over to the government.


The point is that they're granting themselves the permission to do so if anyone ever deems it necessary, and that the user has to agree to their terms to use Windows.

So you're basically signing away your rights to privacy. Not based on due process but on "good faith belief".

Someone at Microsoft thought there was a need to do that to cover their legal asses. They would only think that if they anticipated needing it in the future.


You are completely free to not use it. I'm not trying to be a smart-a here. There have never been more options for end users.

You aren't signing away your rights to privacy without due process...that's your part to evaluate. "Is this useful enough to me that it's worth agreeing to this?"

Also, this is version dependent. The TOS for an individual consumer is different from that for a developer with an MSDN license or a business with a volume agreement. Do you have different privacy requirements? Are you willing to pay for them? If they can't make money with the product they built in the manner they came up with, then it isn't illegal, or even remotely morally odious, for them to ask for a different payment arrangement.

Now. Do I like everything about life in a capitalist national security state? No way. But do I whine when some vendor doesn't do exactly what I want when I'm really not even scratching the surface of enough money to get their attention? Seriously, man.


You're going off on tangents here.

Obviously the premise of this discussion is that you install their software.

IF you install Windows 10 THEN you agree to their terms of service, which includes granting them access to your private files.


I'm going off on a tangent?

YOU installed their software. You didn't have to. No one forced you to. Don't like the TOS? Call them and schedule a meeting to talk about coming up with a different arrangement...they will want money for that, but you can certainly have it.

The truth is that there is jack all that I or anyone else can say to you that would change your mind about any of this.

Also, I'm not willing to grant that you are reading the TOS correctly...so there's that point. No offense, but it's pretty dense, and things that are probably pretty reasonable come across as a privacy invasion to people who are really sensitive on the subject.


> YOU installed their software.

I did? I never said so. I'm just looking at their Privacy Statement and finding questionable clauses there.

> Also, I'm not willing to grant that you are reading the TOS correctly

I did say from the start that that's a possibility. But as long as nobody shows that my reading is not how a lawyer or judge would read it, I remain deeply skeptical about it.


There are two approaches: give someone a pile of power over you and trust them not to abuse it, or never give them that power in the first place. Given the repeated demonstrations of what can and does go wrong with the former...


One other thing. A "pile of power over you"...that's not helping, man. They have some commercial legal arrangement that you don't particularly care for. They can't come and kick you in the shin and torture you. They can't beat you to death and plant a weapon on you or anything. We are talking about an issue that is squarely within middle and upper class privilege in an industry that literally could not exist without government defense funding.

Take Richard Stallman, for instance: he's an alumnus of Harvard and MIT. There literally can't be a place that is more establishment. So all of that "freedom" is about being able to use an expensive commercial product that was developed with R&D money from the DOD...but somehow it's morally wrong to not ship source code to a compiler? Can you see where I'm coming from here? The moralizing is pretty arbitrary.

Furthermore, if they did give your content to the Government because of a national security letter, how is that an abuse of power? Should they not comply with the law? I disagree with a lot of the laws that have been passed in support of the war efforts of the last decade, but that's kind of the way that democracy works. I lost, but I still have to live by the rules.

I just think that the privacy absolutism that everyone keeps bringing up isn't reasonable. Even Bruce Schneier says that the way that you actually change these things is through the political process.

Power is a boot on your neck. This is more of an inconvenience.


Leaving aside the tangent in your comment (we were talking about governments having access to your encrypted data, not about Free Software)...

> Furthermore, if they did give your content to the Government because of a national security letter, how is that an abuse of power? Should they not comply with the law?

I fully expect that they would have little choice in doing so if they received a warrant from a government with jurisdiction over them. (Though I'd also be unsurprised if they did so even if asked without a warrant.) I don't want them to have anything to give if asked.

> I just think that the privacy absolutism that everyone keeps bringing up isn't reasonable.

Different people value their privacy differently. If you don't value it as much, feel free to trade it for things you consider more valuable. Don't assume everyone else wants to make the same trade you do, though.

I'm not advocating absolutism. You should be able to have as much or as little privacy as you want, which may even mean different amounts of privacy in different contexts.

> Power is a boot on your neck. This is more of an inconvenience.

The government having full access to the contents of your encrypted drive is an "inconvenience"? I'd hate to know what you consider an abuse of privacy, then.

The whole point of encryption is to keep unauthorized people from having access to your data.


Those aren't even remotely the only two choices here. There have never been more options for an end user of technology.

You don't have to agree to it. It's a trade-off.

If you have different requirements they are more than willing to come up with a different arrangement with you. (Yes, for a fee.)

They aren't the government. They are an overblown bubble gum factory. It's up to you if you chew or not. And there have never been so many flavors!


What kind of fee do you have in mind for not using their contract of adhesion? Unless I'm buying thousands of copies of the OS I doubt I can even get that negotiation started.


Some of it's pay as you go, oddly enough. But you are largely correct that more money equals more access to these kinds of things.

There is a company in China that paid them to install Office 365 in their data center. There is an amount of money that will make them install it in your data center, too.

I just think that there has never been more choice for end users and a lot of this stuff about privacy is disingenuous. There's a group of people that wouldn't be happy even if MS released their own version of TAILS and hosted part of the Tor network. (It would be "embrace, extend, extinguish!"..."Tor is partly sponsored by the Navy...I bet MS gives your Tor traffic directly to the NSA."...It's really not hard to imagine the BS.)


OK. What I'm trying to say is that backing up to OneDrive is optional. You get the choice. You can protect the key with a TPM or a smart card...It's not an all or nothing thing. You have options there, if you are interested.

The other thing is that it sounds like a lot of privacy minded people can't trust BitLocker despite any number of assurances from MS or code reviews by third parties. AND THAT'S OK. Use something else.

EDIT: I forgot to mention that if you are an admin or just operate your own AD installation you can store the key in Active Directory. The behavior is version specific, I think.

EDIT EDIT: I believe that the TOS you are talking about is specifically referring to online services. I don't have time to stop and read it right now, but I think that you are misconstruing the intent.


> This privacy statement explains what personal data we collect from you and how we use it.

> It applies to Bing, Cortana, MSN, Office, OneDrive, Outlook.com, Skype, Windows, Xbox and other Microsoft services that display this statement.

> References to Microsoft services in this statement include Microsoft websites, apps, software and devices.

Seems to cover the Windows OS too.


> OK. What I'm trying to say is that backing up to OneDrive is optional. You get the choice. You can protect the key with a TPM or a smart card...It's not an all or nothing thing. You have options there, if you are interested.

Except that the default is both insecure and privacy-violating.


It's insecure by a standard that you are setting. If they can demonstrate an audit log of every admin who has escalated their permission to logon to the container of your data and access it, including the files they accessed, would that be good? (Because they do that.)

Again, privacy-violating by your (arguably very narrow) standard. I'm sorry, friend, but you are stating these things as if there's no question about what you say.

More accurately, you might say that there are higher privacy and auditability standards that you would require for your given situation or application. I wouldn't be able to argue with that at all.


> If they can demonstrate an audit log of every admin who has escalated their permission to logon to the container of your data and access it, including the files they accessed, would that be good? (Because they do that.)

They are legally prevented from showing you such an audit log if a National Security Letter is involved.


And -unless there have recently been great strides in the NSL gag order battle- they are legally prevented from indicating to you that you or your data has been targeted by an NSL.


I read this warning on release day, went to read it myself, and haven't found such a statement in either the ToS or the Privacy Policy.

This paragraph (about private communications and files in private folders) seems to be gone from their Privacy Policy. Google's cache confirms it was present (in the Privacy Policy, not the ToS), but I suppose MS spotted this insane statement and removed it in a hurry - or hid it somewhere else, deeper in the fine print and with different wording.

(Or maybe I totally missed something while scrolling through the document and my browser's search function malfunctioned.)


Did you expand the sections in the Privacy Statement [1]? Open the page in Firefox or Chrome, hit F12 to get to the browser console, then run this to expand all the sections:

    $('.learnMoreLabel').click()
If you search the page for "disclose", you'll see that that exact wording is no longer present, but very similar wording is in the "Reasons We Share Personal Data" and "Skype - Partner companies" sections.

[1]: https://www.microsoft.com/en-us/privacystatement/default.asp...


Thanks! Yes, now I see this.



