Microsoft faces criticism for handing over user encryption keys to the FBI
The FBI obtained a court order to access keys that could be used to decrypt the contents of laptops under investigation
Microsoft is drawing criticism after confirming that it handed over BitLocker recovery keys to the US government to unlock encrypted drives, something that for many users breaks the implicit promise that "my data is safe even if someone takes my laptop." In privacy terms, the key point is this: if your recovery key is stored in the Microsoft cloud, Microsoft can hand it over when it receives a valid legal order, and that opens a real door, both legal and technical, to your information.
What happened with Microsoft, BitLocker, and the FBI?
The case that sparked the controversy stems from a federal investigation: the FBI obtained a warrant for keys capable of decrypting information on three laptops, and Microsoft complied. A Microsoft spokesperson explained that the company provides recovery keys when there is a “valid legal order” and when those keys are stored on its servers (for example, if the user chose to save them in the cloud). Microsoft reportedly receives around 20 such requests per year, which suggests this is not a "unique and extremely rare" event but an existing mechanism within the system.

Here is where the reputational blow comes in: BitLocker is sold (and perceived) as a strong layer of security for the average user, but the detail that "the key is backed up in the cloud for convenience" completely changes the threat model. This is very different from someone "breaking" the encryption. The encryption worked; storing the key with a third party is what made it possible to open the door from above, with the proper paperwork.

What does this mean for your privacy?

BitLocker is full-disk encryption: if someone steals your computer while it is turned off or locked, in theory they cannot read your files because the disk is encrypted. The problem is that, by default or by design in many scenarios, the recovery key may end up backed up in the Microsoft cloud, which makes Microsoft a “custodian” with the ability to recover or deliver it. From a privacy standpoint, the question is not just “Are they spying on me?” but “Who can access my data if my device falls into the wrong hands?” If an agency obtains a court order and Microsoft has the key, encryption ceases to be a lock that only you control and becomes a lock with a copy in someone else's safe.
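The escrow problem described above can be sketched in a few lines of Python. This is a deliberately simplified, hypothetical model (a toy cipher, not BitLocker's actual AES-based format, and an invented `cloud_escrow` store): it only illustrates that when the same recovery key protecting the disk is also held by a custodian, anyone who can compel the custodian can read the data, even though the encryption itself was never broken.

```python
import hashlib
import secrets

def keystream_xor(key: bytes, data: bytes) -> bytes:
    # Toy stream cipher: SHA-256 in counter mode, XORed with the data.
    # For illustration only -- NOT BitLocker's scheme, not safe for real use.
    out = bytearray()
    for block in range(-(-len(data) // 32)):  # ceil(len/32) blocks
        ks = hashlib.sha256(key + block.to_bytes(8, "big")).digest()
        chunk = data[block * 32:(block + 1) * 32]
        out.extend(b ^ k for b, k in zip(chunk, ks))
    return bytes(out)

# The user encrypts the disk with a volume key...
volume_key = secrets.token_bytes(32)
disk = keystream_xor(volume_key, b"tax records, photos, mail")

# ...but a copy of the recovery key is escrowed with a cloud custodian
# (hypothetical store, standing in for a Microsoft-account backup).
cloud_escrow = {"user@example.com": volume_key}

# A thief who takes the laptop sees only ciphertext: the encryption works.
assert disk != b"tax records, photos, mail"

# A party with a valid legal order served on the custodian gets the key,
# and with it the plaintext -- no cryptography is broken along the way.
recovered = keystream_xor(cloud_escrow["user@example.com"], disk)
assert recovered == b"tax records, photos, mail"
```

The design point the sketch makes is that the lock is only as private as the least-protected copy of its key: deleting the escrowed copy (or never uploading it) changes the threat model entirely, while the on-disk cryptography stays identical.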
Organizations and critics, including Senator Ron Wyden, have argued that it is irresponsible to design products so that the vendor can turn around and hand over keys that unlock a person's entire digital life. Experts have also warned of an equally serious parallel risk: if those keys are centralized in cloud infrastructure, an attacker who compromises Microsoft's systems could attempt to access them (although they would still need physical access to the disks to use them). That combination of keys, cloud storage, and the industry's history of incidents is what makes the issue feel bigger than an isolated criminal case.

