Making USB even more dicey: encrypted drives


Imagine you are expecting documents from a business associate. Being reasonably concerned with opsec, they want to encrypt the information en route. Being old-fashioned, they also opt for snail-mailing an actual physical drive instead of using PGP or S/MIME to deliver an electronic copy. Is it safe to connect that USB drive that came out of the envelope? If this is bringing back memories of BadUSB, let’s take the exercise one step further: suppose this drive uses hardware encryption and requires installing custom applications before users can access the encrypted side. But conveniently there is a small unencrypted partition containing those applications, ready to install. Do you feel lucky today?

This is not hypothetical- it is how typical encrypted USB drives work.

Lost or stolen USB drives have been implicated in data breaches, and encryption-at-rest is a solid mitigation against such threats. But this well-meaning attempt to improve security against one class of risks ends up reducing operational security overall against a different one: users are trained to install random applications from a USB drive that shows up in their mailbox. (It’s not a stretch to extend that bad habit to drives found anywhere, based on recent research.) Meanwhile the vendors are not helping their own case with a broken software distribution model and unsigned applications. Here is a more detailed peek at the Kingston DataTraveler Vault.

On connection, the drive appears as a CD-ROM. This is not unusual; USB devices can expose multiple interfaces as a single composite device. In this case the encrypted contents are not visible yet because the host PC does not have the requisite software. Looking into the contents of that emulated “CD”, we find different solutions intended to cover Windows, OSX and Linux.
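For the curious, the composite layout is easy to confirm from the host side. Here is a minimal sketch using the pyusb library- the vendor/product IDs are hypothetical placeholders, to be replaced with the values reported for the actual drive (e.g. by lsusb):

    # Minimal sketch using pyusb to list the interfaces of a composite USB
    # device. Vendor/product IDs are hypothetical placeholders; substitute
    # the values reported for the actual drive.
    import usb.core

    CLASS_NAMES = {0x08: "Mass Storage"}  # subset of USB-IF class codes

    dev = usb.core.find(idVendor=0x0951, idProduct=0x1234)  # hypothetical IDs
    if dev is None:
        raise SystemExit("device not found")

    for intf in dev.get_active_configuration():
        name = CLASS_NAMES.get(intf.bInterfaceClass, hex(intf.bInterfaceClass))
        print("interface %d: class=%s subclass=%s" % (
            intf.bInterfaceNumber, name, hex(intf.bInterfaceSubClass)))

(The CD-ROM personality itself typically lives one level down, in the SCSI responses served behind the mass-storage interface, so it will not show up in the descriptor listing above.)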

Windows

Considering this is an enterprise product and most enterprises are still Windows shops, one would expect the most streamlined experience here. Indeed- Windows itself recognizes the presence of an installer as soon as the drive is connected and asks the user what to do. (Actually asking for user input is a major improvement over past Windows versions cursed with the overzealous autorun feature, which happily executed random binaries from any connected removable drive.)

If we decline this offer and decide to take a closer look at the installer, we can see that it has been digitally signed by the manufacturer using Authenticode, a de facto Windows standard for verifying the origin of binaries:

[Screenshot: Looking at the digital signature of the installer in Explorer]

Using the signtool utility:

[Screenshot: Using signtool to examine Authenticode signature details]

The signature includes a trusted timestamp hinting at the age of the binary. (Note the certificate is expired but the signature is still valid. This is possible only because the third-party timestamp provides evidence that the signature was originally produced at a point in time while the certificate was still valid.)

This particular signature uses the deprecated SHA1 hash algorithm, but we will give Kingston a pass on that. The urgency of migrating to better options such as SHA256 did not become apparent until long after this software had shipped. Authenticode also checks for certificate revocation, so we can be reasonably confident that Kingston is still in control of the private key used to sign their binaries and they did not experience a Stuxnet moment. (Alternative view: if they have been 0wned, they are not aware of it and have not asked Verisign to revoke their certificate.)
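These checks can also be scripted rather than clicked through. A minimal sketch driving signtool (which ships with the Windows SDK) from Python- the installer filename is a hypothetical placeholder:

    # Minimal sketch: invoke signtool to verify an Authenticode signature
    # and its certificate chain. The installer filename is a hypothetical
    # placeholder.
    import subprocess

    result = subprocess.run(
        ["signtool", "verify", "/pa", "/v", "DTVP_installer.exe"],
        capture_output=True, text=True)
    print(result.stdout)
    print("verified" if result.returncode == 0 else "verification FAILED")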

Overall there is reasonable evidence suggesting that this application is indeed the legitimate one published by Kingston, although it may be slightly outdated given timestamps.

OSX

OSX applications can be signed, and the codesign utility can be used to verify those signatures. Did Kingston take advantage of it? No:

[Screenshot: codesign failing to find a signature. What- me worry about code signing?]
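For reference, the check in the screenshot can be reproduced with a few lines; a minimal sketch, with a hypothetical application path:

    # Minimal sketch: ask codesign whether an application bundle carries a
    # valid signature. The application path is a hypothetical placeholder.
    import subprocess

    APP = "/Volumes/DTVP/Mac/DTVP_Launcher.app"  # hypothetical path

    check = subprocess.run(
        ["codesign", "--verify", "--verbose=2", APP],
        capture_output=True, text=True)
    # codesign exits 0 when the signature verifies; an unsigned bundle
    # produces "code object is not signed at all" on stderr.
    print("signed" if check.returncode == 0 else check.stderr.strip())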

Not that it matters; a support page suggests that installing that application would have been a lost cause anyway:

The changes Apple made in MacOS 10.11 disabled the functionality of our secure USB drives. It will cause the error ‘Unable to start the DTXX application…’ or ‘ERR_COULD_NOT_START’ when you attempt to log into the drive. We recommend that you update your Kingston secure USB drive by downloading and installing one of the updates provided in links below.

(As an aside, it is invariably Apple’s fault when OS updates break existing applications.)

Luckily it is much easier to verify the integrity of this latest version- the download link uses SSL (although not the support page linking to the download, which allows a MITM attack to substitute a different link). More importantly, Kingston got around to signing this one:

[Screenshot: the updated OSX application carries a valid signature]
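Scripting the safer path is straightforward too. A minimal sketch with a hypothetical download URL, leaning on the certificate verification that the requests library performs by default:

    # Minimal sketch: download the updater over HTTPS (requests verifies the
    # server certificate by default) and record a SHA-256 digest of the
    # payload for comparison against other sources. The URL is hypothetical.
    import hashlib
    import requests

    URL = "https://www.kingston.com/support/dtvp_update.zip"  # hypothetical

    resp = requests.get(URL, timeout=30)  # raises SSLError on a bad certificate
    resp.raise_for_status()
    print("sha256 =", hashlib.sha256(resp.content).hexdigest())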

Linux

It is a rare surprise when any vendor attempts to get an enterprise scenario working on Linux. At this point, asking for some proof of the authenticity of binaries might be too much, and the screenshot above confirms that.

In fairness, Linux does not have its own native standard for signing binaries. There are some exceptions- boot loaders are signed with Authenticode thanks to UEFI requirements, and some systems such as Fedora also require kernel modules to be signed when secure-boot is used. There are also higher-level application frameworks such as Java and Docker with their own primitive code-signing schemes. But the state of the art for code authentication in open source is PGP signatures. Typically there is a file containing hashes of the released files, which is cleartext-signed using one of the project developers’ keys. (As for establishing trust in that key? Typically it is published on key servers and also available for download over SSL on the project pages- thus paradoxically relying on the hierarchical certificate-authority model of SSL to bootstrap trust in the distributed PGP web-of-trust.) Naturally Kingston did not bother with any of that; the Linux directory just contains a handful of ELF binaries without the slightest hint of how to verify their integrity.
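In concrete terms, that pattern usually boils down to two commands. A minimal sketch with hypothetical file names, assuming the developer’s key has already been imported into the local GnuPG keyring:

    # Minimal sketch of the common open source pattern: verify the PGP
    # signature over a manifest of hashes, then verify the files against
    # the manifest. File names are hypothetical and the signing key is
    # assumed to be imported into the local GnuPG keyring already.
    import subprocess

    # Step 1: check the detached signature over the hash manifest
    sig = subprocess.run(["gpg", "--verify", "SHA256SUMS.sig", "SHA256SUMS"],
                         capture_output=True, text=True)
    if sig.returncode != 0:
        raise SystemExit("bad PGP signature: " + sig.stderr.strip())

    # Step 2: check each downloaded file against its entry in the manifest
    subprocess.run(["sha256sum", "--check", "SHA256SUMS"], check=True)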

On balance

Where does that leave this approach to distributing files?

  • Users on recent versions of Windows have a fighting chance to verify that they are running a signed binary. (Even that small victory comes with a long list of caveats: “signed” is not a synonym for “trustworthy”; there are dangerous applications that are properly signed, including remote-access tools.)
  • Users on OSX have to go out of their way to download the application from a less dubious source by hunting around for it on the vendor website.
  • Linux users should abandon hope and follow standard operating procedure: launch a disposable, temporary virtual machine to run dubious binaries and hope they did not include a VM-escape exploit.

Even considering 100% Windows environments, there is collateral damage: training users that it is perfectly reasonable to run random binaries that arrive on a USB drive. In the absence of additional controls such as group policy, users are one click away from bypassing the unsigned-binary warning to execute something truly dangerous. (In fact, given that Windows has a sophisticated plug-and-play framework capable of downloading drivers from Windows Update on demand, it is puzzling that any user action to install an application is required.)

Bottom line: encrypting data on removable drives is a good idea. But there are better ways to do it- ways that do not set bad opsec examples by encouraging users to install random applications of dubious provenance.

CP

One thought on “Making USB even more dicey: encrypted drives”

  1. A Look Into the “Secure” USB Technology

    DataTraveler “secure” USB is not actually secure. It likely uses the same technology as the iStorage datAshur “secure” USB device, with ClevX LLC’s similar secure drive technology as the basis.

    If you look at ClevX’s website on its partnership with Kingston to support the DataTraveler Vault Privacy [1][2][3], it is clear that the DataTraveler technology relies on ClevX’s secure drive technology.

    Referencing other implementers of ClevX’s DataLock technology [4], it is safe to say that the DataTraveler Vault Privacy may rely on the same technology. Inspecting the NIST FIPS 140-2 CMVP security policies [5][6][7] for the existing DataLock implementations makes it highly likely that the DataTraveler Vault Privacy uses the same design.

    From the three CMVP security policy documents [5][6][7], it is clear that the security lies in “potting” and “metallic shell protection” tactics, as there is no evidence of tamper-resistant measures aside from tamper-evident features. The chip at the heart of the security controller is a Microchip PIC16 series chip; one of the datasheets covers the PIC16F [8]. From the PIC16F datasheet, it is easy to notice that neither tamper-resistant nor cryptographic capabilities are present on the chips used, raising the question of how secure these supposedly “secure” self-encrypting USB drives are.

    Packaging unencrypted installation files on a secondary partition that emulates a read-only CD-R device is one of the common techniques in the industry, according to a confidential conversation I had with a provider in the past.

    If malware wants to mount an attack on the “secure” USB drive, from a logical standpoint the CD-R virtual disk would prevent it from infecting the installation files (assuming the packaged installation files are clean). That does not mean the malware could not issue arbitrary USB commands to look for a vulnerability in the USB controller firmware before embedding itself or doing whatever else it wants.

    The encryption offered does nothing more than scramble logical data without the security guarantees of a proper security IC, and since the USB controller firmware is a black box, there is additional uncertainty about its ability to resist the logical attacks described above.

    Does that mean we have to forgo secure data storage? The answer is likely no, as encryption is an integral part of our lives, protecting our privacy and intellectual property. Code signing is not enough to give us a definite answer on the integrity and security of this firmware and the packaged installation software. Devices should be assumed compromised to a certain degree, and a person must decide how much security he or she wants and create a hierarchy of security domains. One example is setting aside a fixed set of USB devices for each classification of machines they may interact with, and dedicating security tokens to a fixed set of machines for authentication. Security becomes tedious as one has to manage more tokens, USB devices and machines, but security comes with the price tag of due diligence, not laziness.

    In the event a cross-domain transfer is required, a middleman machine can be set up on the network that matches the checksums of files (among other security mechanisms) passing across it. This is typically a highly hardened machine running a very small trusted OS. The technical name for such a cross-domain data transfer machine, as used by militaries and governments, is a Cross-Domain Guard [9][10][11].
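    (To illustrate the checksum-matching step: a minimal sketch with hypothetical paths and digest, in which the guard recomputes a digest of the received file and compares it against the digest declared by the sending domain.)

        # Minimal illustration of a guard's checksum-matching step:
        # recompute the digest of the received file and compare it against
        # the digest declared by the sending domain before releasing the
        # transfer. Paths and the declared digest are hypothetical.
        import hashlib

        def file_sha256(path):
            h = hashlib.sha256()
            with open(path, "rb") as f:
                for chunk in iter(lambda: f.read(65536), b""):
                    h.update(chunk)
            return h.hexdigest()

        declared = "..."  # digest declared by the sending domain (hypothetical)
        received = file_sha256("/transfer/inbound/report.pdf")  # hypothetical
        if received != declared:
            raise SystemExit("digest mismatch: transfer rejected")
        print("digests match: transfer released")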

    References:
    [1] http://www.clevx.com/news.html
    [2] http://www.kingston.com/us/company/press/article/6447
    [3] http://www.kingston.com/us/usb/encrypted_security/antivirus_protection
    [4] http://www.clevx.com/datalock-fips.html
    [5] http://csrc.nist.gov/groups/STM/cmvp/documents/140-1/140sp/140sp1873.pdf
    [6] http://csrc.nist.gov/groups/STM/cmvp/documents/140-1/140sp/140sp1876.pdf
    [7] http://csrc.nist.gov/groups/STM/cmvp/documents/140-1/140sp/140sp1527.pdf
    [8] http://ww1.microchip.com/downloads/en/DeviceDoc/41440A.pdf
    [9] https://gdmissionsystems.com/cyber/products/trusted-computing-cross-domain/tactical-cross-domain-guards/
    [10] https://www.rockwellcollins.com/Products_and_Systems/Information_Assurance/Cross_Domain_Solutions.aspx
    [11] http://www.disa.mil/network-services/enterprise-connections/connection-process-guide/disn-service-appendices/cross-domain-solutions
