Legal Technology


Introduced yesterday, it’s just one page: http://www.lofgren.house.gov/images/stories/pdf/draft%20lofgren%20bill%20to%20exclude%20terms%20of%20service%20violations%20from%20cfaa%20%20wre%20fraud%20011513.pdf

This is in response to the prosecution and subsequent suicide of Aaron Swartz and the JSTOR academic documents hacking, and in response to criticism that the preexisting law is both vague and outdated. Read more here: http://www.latimes.com/business/technology/la-fi-tn-lofgren-floats-aarons-law-on-reddit-after-aaron-swartzs-death-20130116,0,929268.story

Last night, the Senate voted on cloture of S.3414 (The Cybersecurity Act), which failed 51-47; the bill had previously failed an August 2nd cloture vote, 52-46.

The OpenCongress.org bill summary is as follows:

This bill would establish a new National Cybersecurity Council to identify technology sectors as “critical infrastructure” and approve or reject the sectors’ plans to secure their networks from cyber attacks. Businesses would not be required to adopt Council-approved security plans, but if they do they would be rewarded with broad protection from lawsuits and other benefits. The bill would also bolster website owners’ authority to monitor their users’ information and block users’ access. Companies would be given special legal immunity for sharing user information with federal government agencies.

To the best of my knowledge, there is no plan to reconsider this issue in the Senate, although some discussion was observed between Lieberman and McCain prior to the vote, which may lead to an effort to reconsider.

Attention may now focus on a potential Executive Order (a PDF copy of the September 28th draft of which is here) and a rewrite of HSPD-7 (PDF here).

Let me share with you the interesting case of Blythe v. Bell, 2012 NCBC 42 (N.C. Super. Ct. 2012), http://www.ncbusinesscourt.net/opinions/2012_NCBC_42.pdf.  There, defendants engaged an outside “expert,” Tom Scott, owner of Computer Ants, for e-discovery work. Defendants’ counsel failed to conduct any intervening review of Scott’s work. Instead, defendants relied exclusively on Scott to conduct a privilege review, among other things.

Unfortunately for defendants, Scott had “never provided any forensic computer services in the context of a lawsuit,” and had never “been engaged as a computer expert or provided an opinion in any legal proceeding.” Rather, Scott had worked as a “truck driver, a Bass Pro Shop Security Manager, a respiratory therapist, and a financial auditor for a retail seller.” Put differently, Scott had no experience in e-discovery.

The Court found that Scott’s paucity of qualifications to serve as an e-discovery “expert” rendered the defendants’ actions particularly unreasonable. Consequently, the nearly 2,000 pages of otherwise privileged documents that defendants produced to the plaintiff lost their privileged status.

The lesson here is that, if you have an e-discovery or digital forensics issue, don’t call the computer dude or dudette you use to fix your slow computer or printer, or to set up your wireless network.  That person and a digital forensics expert or e-discovery consultant will rarely be one and the same.

_______________________________

The author, Sean L. Harrington, is a law student and digital forensics examiner, information security professional, and e-discovery, trial, and litigation consultant with the private practice firm of Attorney Client Privilege, LLC, and a risk management team lead for US Bank. Harrington holds the MCSE, CISSP, CHFI, CSOXP, and LexisNexis CaseMap support certifications, served on the board of the Minnesota Chapter of the High Technology Crime Investigation Association in 2011, is a member of Infragard, a member of Century College’s Computer Forensics Advisory Board and [erstwhile] Investigative Sciences for Law Enforcement Technology (ISLET) board, and is a council member of the Minnesota State Bar Association (MSBA) Computer & Technology Law Section.

Introduction

Over the last several years, I’ve posted a handful of short blog entries about the topic of compelling a criminal defendant to surrender a passphrase to an encrypted volume or hard drive.  These entries concern three cases: In re Grand Jury Subpoena Duces Tecum Dated March 25, 2011; United States v. Fricosu (D. Colo. 2012); and In re Grand Jury Subpoena (Boucher), 2009 U.S. Dist. LEXIS 13006 (D. Vt. 2009).

I have developed the opinion (admittedly, more on hunch than scholarly research) that a defendant should not be able to knowingly withhold a passphrase or password to an evidence trove any more than he should be permitted to hang on to a physical key that could be used to open a safe that the Government has a valid warrant to search, and which is believed to contain evidence.

Unfortunately, I have found myself on the wrong side of this issue.  My colleagues Sharon Nelson and Craig Ball disagree with me on some aspects of the issue.  And my position is seemingly at odds with the Eleventh Circuit in Grand Jury Subpoena Duces Tecum Dated March 25, above, a decision that Professor Orin Kerr described as mostly correct (although I note that the Eleventh Circuit did distinguish Boucher, and recognize exceptions).

Privilege

Setting aside, for the sake of this comment, the question of whether knowledge of the passphrase is both “testimonial” and “incriminating” for purposes of the Fifth Amendment (the very issues central to the aforementioned cases), or whether knowledge of the passphrase should be distinguished from possession of a physical key, my belief has been based on the principle that parties to either criminal or civil litigation should simply not be permitted to purposefully withhold admissible evidence from each other. Professor Arthur Miller, the great civil procedure scholar, calls this the “cardinal, basic, philosophical principle.” “Since we’re trying to get at the truth,” he continues, “you must give every litigant equal access to all relevant data . . . It’s as American as apple pie.” Arthur R. Miller, Civil Procedure, Sum & Substance audio lecture series (1999).

Now, before I continue, let’s recognize what I have purported to be an axiom for what it is: incorrect.  In fact, there are several bases under our system of law upon which a party is permitted to withhold otherwise relevant, admissible evidence.  We call it “privilege.”  Privilege is that annoying rule of law (I’m being facetious here) that, “to protect a particular relationship or interest, either permits a witness to refrain from giving testimony he otherwise could be compelled to give, or permits someone (usually one of the parties) to prevent the witness from revealing certain information.” Waltz & Park, Evidence, Gilbert Law Summaries, § 635. Perhaps the most common example is the attorney-client privilege. See Upjohn Co. v. United States, 449 U.S. 383, 389 (1981) (acknowledging the attorney-client privilege as “the oldest of the privileges for confidential communications known to the common law”).

But even the hallowed attorney-client privilege has its limits.  Under the civil fraud and criminal fraud exceptions, an otherwise privileged communication becomes discoverable. See, e.g., United States v. Zolin, 491 U.S. 554, 562–63 (1989) (stating goals of attorney-client privilege are not served by protecting communications made for purpose of getting advice for commission of crime or fraud). And see Deborah F. Buckman, Annotation, Crime-Fraud Exception to Work Product Privilege in Federal Courts, 178 A.L.R. FED. 87, § 2[a] (2002).

Encryption as Evidence Destruction

Craig Ball often reminds his audiences of the three ways to destroy electronically stored information: (1) overwrite the bytes with new data; (2) physically destroy the media upon which the data was written; or (3) use strong encryption on the data and forget the passphrase.  Thus, in my assessment, if an individual encrypts evidence while engaging in the commission of a crime, it is tantamount to flushing drugs down the toilet, throwing the murder weapon in a lake, or silencing a witness. These are independent criminal acts, separable from the underlying charges.  Likewise, a civil litigant who encrypts evidence after the duty to preserve has attached (articulated best in Zubulake v. UBS Warburg, 220 F.R.D. 212, 218 (S.D.N.Y. 2003) (“Once a party reasonably anticipates litigation, it must suspend its routine document retention/destruction policy and put in place a ‘litigation hold’ to ensure the preservation of relevant documents”)) engages in spoliation that may be punishable.  Therefore, I contend, by using encryption, a defendant or litigant may engage in spoliation of evidence (albeit undoable spoliation) which may be subject to independent criminal liability, civil sanctions, or an adverse jury instruction.

Notice that the qualifying phrases above (“while engaging in the commission of a crime”; “after the duty to preserve has attached”) establish mens rea, i.e., intent (purposeful or knowing conduct): that the actor was using encryption in the furtherance of a crime, or to destroy evidence to thwart a law enforcement investigation.  An instructive analog may be the safe harbor provision, Fed. R. Civ. P. 37(f), as applied to electronic discovery in civil cases. The provision shields a party who cannot produce evidence lost as a result of the routine, good-faith operation of an electronic information system. In other words, if an individual was using whole-disk encryption not to obfuscate criminal activity, but rather because he was trying to protect against identity theft, or because the system came with it by default, there is no intent, hence no criminal culpability. Another helpful analog might be found in Arizona v. Youngblood, 488 U.S. 51, 58 (1988), where the U.S. Supreme Court held that charges may be dismissed based upon evidence lost or destroyed by the Government that is deemed only potentially exculpatory (as opposed to apparently exculpatory) only if the defendant can show the evidence was destroyed in bad faith.

But perhaps the best authority addressing the mens rea requirement is that for 18 U.S.C. § 1503 (conduct that, among other things, corruptly endeavors to obstruct or impede the due administration of justice): “To sustain its burden of proof, the government must show that the defendant knowingly and intentionally undertook an action from which an obstruction of justice was a reasonably foreseeable result. Although the government is not required to prove that the defendant had the specific purpose of obstructing justice, it must establish that the conduct was prompted, at least in part, by a corrupt motive.” United States v. Barfield, 999 F.2d 1520, 1524 (11th Cir. 1993) (internal quotations omitted).  Unlike the duty to preserve in civil cases, which requires only reasonable anticipation of litigation, the federal criminal context requires there to have been a pending judicial proceeding known to the defendant at the time. See, e.g., U.S. v. Fineman, 434 F. Supp. 197 (E.D. Pa. 1977) (in applying the obstruction of justice statute to issues of destruction of documents, federal courts generally have not required that a subpoena have issued; rather, it is sufficient for an obstruction conviction that the defendant knew that a grand jury was investigating possible violations of federal law and intentionally caused destruction of the incriminating document). In fact, 18 U.S.C. § 1503 has even been applied to prosecute those who, in a civil case, were accused of willfully destroying documents subject to discovery. U.S. v. Lundwall, 1 F. Supp. 2d 249 (S.D.N.Y. 1998).

Note that my theory is not that the presence of encryption is somehow admissible as relevant in demonstrating defendant’s mental state or aptitudes, as it appears to have been in State v. Levie, 695 N.W.2d 619 (Minn.App. 2005) (“the existence of an encryption program on [defendant’s] computer was at least somewhat relevant to the state’s case against him,” and the jury was allowed to consider it). See also Jessica Murphy, Swiss Cheese That’s All Hole: How Using Reading Material To Prove Criminal Intent Threatens The Propensity Rule, 83 Wash. L. Rev. 317 (May 2008).  Rather, my theory is that, even if a court finds that a defendant cannot be compelled to aid in his prosecution by surrendering a passphrase (because doing so would be testimonial and incriminating), a defendant may nevertheless be criminally liable for evidence spoliation.  Further, when evidence is spoliated, a factfinder may be entitled to presume that the evidence was unfavorable to the spoliator. See Washington Gas Light Co. v. Biancaniello, 87 U.S. App. D.C. 164, 183 F.2d 982 (D.C. Cir. 1950) (Willful destruction of evidence by a party properly raises the inference that the materials destroyed were adverse to the party which brings about the destruction); Brown & Williamson Tobacco v. Jacobson, 827 F.2d 1119, 1134 (7th Cir. 1987) (“A court and a jury are entitled to presume that documents destroyed in bad faith while litigation is pending would be unfavorable to the party that has destroyed the documents.”); Dale A. Oesterle, A Private Litigant’s Remedies for an Opponent’s Inappropriate Destruction of Relevant Documents, 61 Tex. L. Rev. 1185, 1232-39 (1983) (“[A] party’s bad faith destruction of relevant documents is an admission by conduct that he believes his case is weak and cannot be won fairly.”). See generally 2 John Henry Wigmore, Evidence §291 (James H. Chadbourn rev. ed., 1979) (discussing evidence spoliation).

Conclusion

The right to privacy, as Justice Douglas recognized in Griswold v. Connecticut, arises from “penumbras, formed by emanations from those [specific] guarantees . . . in the Bill of Rights.”  And the Bill of Rights operates as a constraint on the Government.  But those penumbrae do not, in my view, confer a magical privileged status on file or disk encryption under the rubric of privacy when, in certain limited circumstances, such encryption is really just evidence spoliation.

As a forensics examiner, I am already seeing, and foresee, a higher frequency of criminal and civil investigations thwarted by the use of file or disk encryption and the privilege under the Fifth Amendment.  Absent new statutes addressing the misuse of encryption technology, a prosecutor should closely examine the Eleventh Circuit decision to see if his or her case falls under the limited exceptions that would require defendants to surrender the passphrase under penalty of remedial contempt. Alternatively or conjunctively, prosecutors should determine whether the use of encryption by defendants falls within the scope of an applicable federal or state statute for destroying evidence in the furtherance of a crime, or incident to a criminal investigation, where there is extrinsic evidence of a corrupt motive.

Introduction

In this post, I will provide some initial impressions and findings.  I do not endeavor to write a white paper, or to employ an industry-standard, scientific methodology to evaluate the tool (if for no other reason than that I am constrained by time).

 

PostgreSQL

First, I note that it appears no one has been able to get FTK to work with PostgreSQL, leading me to conclude that the product was shipped without being tested in this regard.  (If a reader has been able to get it working, I encourage you to post a comment here.)  I was not able to get it to work, and I wasted two valuable (otherwise billable) days I had set aside for a client, only to make this discovery.

My review of the AccessData forums indicates identical experiences, and I haven’t found one poster there who yet claims to have finished an evidence load using FTK with Postgres.  (Note: I am unable to determine whether excerpting comments from the private forums would violate AccessData’s terms of use, so I am proceeding with an abundance of caution by not doing so.)

Likewise, I conferred on Friday with another colleague, a lead examiner for a large company, and he replied:

I just had the same experience. I mistakenly upgraded to 4.0, removed Oracle completely, and installed PostgreSQL. That was a mistake . . . some of my run-of-the-mill cases that should only take a couple hours were taking days and had to be killed off. Then, after I removed PostgreSQL and re-installed Oracle I couldn’t get it to forget about the old connection and had all sorts of weirdness with it not finding Oracle some of the time. I eventually backed out FTK, Oracle, and PostgreSQL and did a complete manual cleanup of all garbage files and registry entries and then re-installed everything. I am back to Oracle with 4.0 and things are fine again, but what a mess to deal with this on 3 machines.

And the same experiences are reported on the ForensicFocus forums (e.g., http://www.forensicfocus.com/Forums/viewtopic/p=6557533/).  Thus, based solely on these numerous anecdotes, and on my understanding that new purchasers of FTK do not receive Oracle licensing, I have concluded that FTK 4.0 with Postgres is not merchantable (suitable for its intended purpose), although, as noted above, I may be incorrect, and would be pleasantly surprised to be proved wrong.

So I, too, reverted to Oracle.  Unfortunately, I couldn’t get the Oracle KFF library for v4.0 posted on AccessData’s FTP site to work.  Browsing through the AccessData forums, neither could anyone else.  AccessData made available to me, and certain others who complained, a working KFF, which (last time I checked) is not the one available for download at the AccessData FTP site.

Now, like many of the others who have posted to the AccessData forums and elsewhere, I am able to use v4.0 with Oracle.

 

Hardware

The three machines I used for testing are as follows:

(1) FTK & Oracle server (one box) – SuperMicro X8DTL-6F motherboard, LSI SAS2 2008 controller, two RAID-0 volumes each consisting of two OCZ Vertex 3 Max IOPS 120GB SSDs (SATA III; one volume for Oracle data, the other for the O/S and the adTemp directory), 24GB of DDR3 1333MHz ECC non-registered server memory, and two Intel Xeon X5650 hexacore processors.

(2) Distributed Processing Engine (“DPE”) #1 – Asus M4A89GTD Pro/USB3 motherboard, AMD Phenom II X6 1100T hexacore processor (watercooled, but not overclocked), 16GB of DDR3 memory, O/S residing on an OCZ Vertex 3 SSD (SATA III), the temp directory used by the AccessData distributed processing engine residing on a separate OCZ Vertex 3 Max IOPS edition SSD, and the pagefile residing on a Western Digital Raptor 10K RPM hard drive.

(3) DPE #2 – Hewlett-Packard DV6 laptop, Intel Core i7 720QM processor, O/S installed on an Intel SATA II SSD, temp directory used by the AccessData distributed processing service residing on a separate OCZ Vertex 2 SSD (SATA II), and 8GB of RAM.

 

Source Evidence Configuration

In an effort to find the fastest evidence load times, I experimented with various combinations of the foregoing.  As a test image, I used a 186GB DD image (ultimately consisting of 1,052,891 evidence items), hosted on a Western Digital 4TB My Book Studio Edition II (SATA II – up to 3 Gb/sec) configured as RAID-0.  I used both KFF alert and ignore, MD5 & SHA1 hashes (but not SHA256 or “fuzzy” hashes), expand compound files, flag bad extensions, entropy test, dtSearch index, create graphics thumbnails, data carve, meta carve, registry reports, include deleted files, and explicit image detection (X-DFT & X-FST).   Using oradjuster, I tweaked the SGA_TARGET parameter to use only 18% of available physical memory during evidence processing.

 

Distributed Processing

Before continuing, I’d like to mention a few things about the Distributed Processing Engine, which are hard-learned lessons from either failing to read the user guides and appendices, or from experimentation:

(1) To get DPE working, you must specify a system share as the path to the evidence in the Add Evidence dialog box.  Without it, no distributed processing will occur.  Likewise, you need to have the Oracle working directory on a public share, the FTK cases directory on a public share, and all systems using mirrored accounts (which Microsoft defines as “a matching user name and a matching password on two [or more] computers”).  You also need to disable any Windows firewalls or other firewalls.  Tip ►  An easy way to make certain the port on a DPE is reachable is to install PortQry v2 and run the command “PortQry -n {machineName} -p tcp -e 34097”, where “machineName” is the name of your DPE, and 34097 is the default port (configurable in the distributed processing configuration menu). AccessData ought to include a “test connection” button in the distributed processing configuration; it would probably save their help desk a lot of e-mails and calls.

(2) And, although the v4.0 System Specification Guide discusses how to configure the adTemp directory on the localhost processing engine (a directory that should be located on its own high-I/O-throughput drive, because it is the interface between the processing engine and Oracle), I have found no discussion of how to optimize the DPE machines.

Tip ►  On a Windows Vista or Windows 7 machine, if you are logged on as “farkwark,” the distributed processing engine will write its files to Users\farkwark\AppData\Local\Temp.   To relocate this \Temp directory to a different drive, you need to create a junction, as follows.  First, log off and log on as a different admin account user.  Next, move (not copy) the \Temp directory to the different drive (say, F:).  Rename it, if you like (e.g., “FTKtemp”).   Now, from a command prompt, type:

mklink /j "Users\farkwark\AppData\Local\Temp" "F:\FTKtemp"

From this point forward, the processing engine will, in fact, be writing its temporary files to the F: drive, thereby not competing with the O/S drive for i/o.
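Both tips above lend themselves to quick scripted sanity checks. The sketch below is illustrative Python, not part of FTK or AccessData’s tooling: dpe_port_reachable approximates the PortQry test from item (1) with a plain TCP connect, and temp_resolves_to confirms that a relocated Temp directory (junction or symlink) really points at the new drive. Host names and paths are placeholders.

```python
import os
import socket


def dpe_port_reachable(host: str, port: int = 34097, timeout: float = 3.0) -> bool:
    """Return True if a TCP connection to the DPE's processing port succeeds.

    34097 is FTK's default distributed-processing port; substitute whatever
    port you set in the distributed processing configuration menu.
    """
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False


def temp_resolves_to(temp_path: str, expected_target: str) -> bool:
    """Return True if temp_path (a junction or symlink) resolves to
    expected_target, i.e., the mklink relocation actually took effect."""
    resolved = os.path.realpath(temp_path)
    expected = os.path.realpath(expected_target)
    return os.path.normcase(resolved) == os.path.normcase(expected)
```

For example, dpe_port_reachable("dpe1") should return True from the FTK server before you attempt a distributed job, and temp_resolves_to(r"C:\Users\farkwark\AppData\Local\Temp", r"F:\FTKtemp") should return True once the junction is in place ("dpe1" and both paths being hypothetical names from the examples above).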

 

Hardware Configuration Experiment No. 1: FTK & Oracle Server + DPE #1 & DPE #2

With this three-machine configuration, I rarely saw the FTK & Oracle server’s 12 cores (24, if one counts hyperthreading) rise above a collective 15% load.  DPE #2 (the HP laptop) reached near 100% processing load several times, with up to 6 of its 8GB of available memory in use. DPE #1 reached between 50 and 80% CPU utilization, with extended periods of low utilization, and used about 8GB (of 16GB available physical memory). Total time was 11 hours, 24 minutes.

 

Hardware Configuration Experiment No. 2: FTK & Oracle server (one box implementation), alone

With this one-box configuration, the dual Xeon hexacore CPUs were pegged for extended periods at 99 or 100% (note, this differs from the experience of others, who have written, “We have multi cored, multi processor CPUs on our systems. What we’ve found is that typically, unless we are password cracking, that the I/O from the disks can’t keep up with resources available. Meaning our CPUs are never maxed out. So the CPUs are not the bottleneck for getting more speed”). The FTK/Oracle server used up to 20GB (of 24GB available) of physical memory (recall that SGA_TARGET was set to 18%).  Total time elapsed was 9 hours, 4 minutes, an improvement of 20.5% over the three-machine distributed processing configuration (no, that’s not a typo).

 

Hardware Configuration Experiment No. 3: FTK & Oracle server + DPE #1

With this two-machine configuration, the FTK server’s CPU utilization was rarely above 40%, only occasionally reached 60%, and was most usually between 5 and 25%; it used between 8 and 12GB (of 24GB available) of physical memory.  Meanwhile, DPE #1’s CPU utilization was pegged at 99 to 100% for extended periods, and it used up to 10GB (of 16GB available) of physical memory.  Total time elapsed was 8 hours, 33 minutes, a 25% improvement over the three-machine distributed processing configuration, but only a 5.7% improvement over the FTK/Oracle one-box solution.
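The percentage comparisons in these experiments follow directly from the elapsed times; a few lines of arithmetic (illustrative only) make the math explicit:

```python
def improvement_pct(baseline_min: int, candidate_min: int) -> float:
    """Percentage reduction in elapsed processing time versus a baseline."""
    return round((baseline_min - candidate_min) / baseline_min * 100, 1)


# Elapsed times from the three experiments, in minutes.
three_machine = 11 * 60 + 24   # Experiment 1: 684 minutes
one_box = 9 * 60 + 4           # Experiment 2: 544 minutes
two_machine = 8 * 60 + 33      # Experiment 3: 513 minutes

print(improvement_pct(three_machine, one_box))      # 20.5 (one box vs. three machines)
print(improvement_pct(three_machine, two_machine))  # 25.0 (two machines vs. three)
print(improvement_pct(one_box, two_machine))        # 5.7 (two machines vs. one box)
```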

 

Hardware Configuration Experiments Conclusion

Based on the type of hardware I am using, I found very little benefit (up to a 6% processing-time improvement) and, in fact, some detriment (over 20% processing-time loss) in stringing together numerous DPE workstations.  My experience is inconsistent with AccessData’s findings of processing-time differences between stand-alone boxes and distributed processing clusters (see http://accessdata.com/distributed-processing).

 

Add-on Modules: Data Visualization & Explicit Image Detection

Initially, I thought the Data Visualization module did not work. No matter whether I attempted to view a directory containing several score files, or the entire 1 million+ items, it never displayed more than a handful of results (sometimes zero or one, resulting in a pie graph that was just one big green circle).   It turns out (of course) that it was my fault for failing to “select” the appropriate date range.  Had I read the manual first, I would have noticed, “Information can only be displayed for the date that you have selected.” FTK 4.0 User Guide at 200.  Apparently, when the Data Visualization tool first opens, it defaults to one day (the first day of the oldest evidence in the list), which is not very intuitive.

AccessData claims that the Data Visualization add-on component “provides a graphical interface to enhance understanding and analysis of cases. It lets you view data sets in nested dashboards that quickly communicate information about the selected data profile and its relationships.”  Among other things, it purportedly provides “a complete picture of the data profile and makeup,” empowers the examiner to “Understand the file volume and counts through an interactive interface,”  and “Create a treemap of the underlying directory structure of the target machine for an understanding of relative file size and location” (similar to, but not as elegant as, WinDirStat).

In summation, the tool appears to work as designed, although I haven’t done any substantive reporting from it.  One user posted on the AccessData forum that there appears to be no way to export the graphs into a report, but this is easily remedied by taking a screen clipping using SnagIt, Microsoft’s OneNote, or a screen print.

Also, I have been experimenting with the EID. AccessData states, “This image detection technology not only recognizes flesh tones, but has been trained on a library of more than 30,000 images to enable auto-identification of potentially pornographic images . . . AccessData will continue to integrate more advanced image search and analysis functionality into FTK. Customers who have added the explicit ID option to their Forensic Toolkit® license and are current on their SMS will automatically receive those new capabilities as they become available.” Notwithstanding this commitment, it appears that no additional functionality has been added since its release with FTK 3.0.  I also note that the technology is unlike Microsoft’s “PhotoDNA,” which is reputed to process images in less than five milliseconds each and to detect target images accurately 98 percent of the time, while reporting a false alarm one in a billion times. Comparatively, AccessData’s EID has been found to achieve 69.25% effectiveness with 35.5% false positives. Marcial-Basilio, Aguilar-Torres, Sánchez-Pérez, Toscano-Medina & Pérez-Meana, “Detection of Pornographic Images,” 2 Int’l Journal of Computers 5 (2011).

My experience reveals many false positives (such as, “small_swatch_beige.png,” an image consisting of a plain beige coloured box, ranking at the very top of the list compiled by the X-ZFN algorithm, which is supposed to be the most accurate of the three algorithms), and seems to confirm that the algorithm is based on the presence of flesh tones (and nothing more, unless your system has one or more of the 30,000 images that became part of the library at the time the tool was introduced). Nevertheless, if one is short on time (and many law enforcement agencies’ examiners are), the tool does certainly help to reduce the data set that requires manual review.

Barely three weeks after I penned Another Judge Rules Encryption Passphrase not Testimonial Under Fifth Amendment Analysis, the Eleventh Circuit has held that a defendant’s “decryption and production of the hard drives’ contents would trigger Fifth Amendment protection because it would be testimonial, and that such protection would extend to the Government’s use of the drives’ contents.”

For the reasons set forth in my previous posts on this topic, and for the reasons more fully set forth below, I disagree, and I hope the Government petitions for a writ of certiorari on this issue.

In this case, captioned In re Grand Jury Subpoena Duces Tecum Dated March 25, 2011, law enforcement officials began an investigation of an individual using a YouTube.com account whom the Government suspected of sharing explicit materials involving underage girls.  During the course of the investigation, officers obtained several IP addresses from which the account accessed the internet.  Three of these IP addresses were then traced to hotels, whose guest registries revealed that the sole common registrant during the relevant times was the defendant.

Although probable cause was not raised as an issue in this case, it should be noted that the Government’s forensic investigator testified the Government believed that data existed on the still-encrypted parts of the hard drive and “introduced an exhibit with nonsensical characters and numbers, which it argued revealed the encrypted form of data.” Further, the Government’s forensic expert conceded that, although encrypted, it was possible the volumes contained nothing.  When defendant asked the forensic expert, “So if a forensic examiner were to look at an external hard drive and just see encryption, does the possibility exist that there actually is nothing on there other than encryption?  In other words, if the volume was mounted, all you would see is blank.  Does that possibility exist?,”  the expert replied: “Well, you would see random characters, but you wouldn’t know necessarily whether it was blank.” And, when pressed by defendant to explain why Government believed something may be hidden, the expert replied, “The scope of my examination didn’t go that far.”  In response to further prodding, “What makes you think that there are still portions that have data[?],” the expert explained, “We couldn’t get into them, so we can’t make that call.”  Finally, when asked whether “random data is just random data,” the expert concluded that “anything is possible.”

Of course, everything the expert said (taken in isolation) was true, but I fail to see why these explanations undermine the Government’s right to the unencrypted data.  Sure, the expert could or should have pointed to circumstantial trace evidence (such as registry data and link files that should exist had the defendant possessed and viewed the files as alleged).  Sure, the Government could or should have asked for an adverse inference as to the presence and use of a forensic wiping utility if such trace evidence was not present, as it should have been had the defendant possessed and viewed the files as alleged.  But the Government wasn’t required to have probable cause as to the encrypted volumes specifically, because probable cause as to the entire computing equipment had already been satisfied.

Indeed, some discussion of the Fourth Amendment is necessary here:  The Fourth Amendment provides that, “The right of the people to be secure in their persons, houses, papers, and effects, against unreasonable searches and seizures, shall not be violated, and no Warrants shall issue, but upon probable cause, supported by Oath or affirmation, and particularly describing the place to be searched, and the persons or things to be seized.”  As relevant here, the warrant must describe the place to be searched with particularity, and the things to be seized.  Note that the comma following the word “searched” limits the particularity requirement to the place to be searched. And, although warrants must establish probable cause and particularly name the place to be searched, the Supreme Court has rejected the argument that warrants must include “a specification of the precise manner in which they are to be executed.” Dalia v. United States, 441 U.S. 238, 257 (1979).

In this case, the place to be searched was the hotel where defendant was staying and, presumably, any computers found therein were identified as the “things to be seized.”  But, some urge that computer hard drives should be regarded as a “virtual home” or “virtual warehouse.” See, e.g., Orin Kerr, Searches and Seizures in a Digital World, 119 Harv. L. Rev. 531, 539, 542 (2005) (“While houses are divided into rooms, computers are more like virtual warehouses . . . While computers are compact at a physical level, every computer is akin to a vast warehouse of information.”). If so, the warrant may be construed to refer to the computers as among the “places to be searched” (in addition to the hotel room), as well as the things to be seized. See United States v. Ross, 456 U.S. 798, 821 (1982) (“When a legitimate search is under way, and when its purpose and its limits have been precisely defined, nice distinctions between closets, drawers, and containers, in the case of a home, or between glove compartments, upholstered seats, trunks, and wrapped packages, in the case of a vehicle, must give way to the interest in the prompt and efficient completion of the task at hand”).  My point here is that, if the warrant was sufficient to justify the search of the computers, that justification extended to all portions of each computer.

Assuming the Government has a right to inspect all portions of the hard-drives, based on probable cause to believe they were an instrumentality of a crime, then it is appropriate to begin the Fifth Amendment analysis. Under the Fifth Amendment, “[n]o person … shall be compelled in any criminal case to be a witness against himself.”  The courts have consistently interpreted this provision as “protect[ing] a person . . . against being incriminated by his own compelled testimonial communications.” Fisher v. United States, 425 U.S. 391, 409 (1976).  Thus, to be afforded the protection, the statement must be: (1) compelled, (2) testimonial in nature, and (3) serve to incriminate the declarant in a criminal proceeding. If these elements are met, the declarant has the right “not to answer questions put to him in any proceeding, civil or criminal, formal or informal, where the answers might incriminate him in future criminal proceedings.” Lefkowitz v. Turley, 414 U.S. 70, 77 (1973).

In this case, there was no dispute that defendant had care, custody, and control of the computers and hard-drives. As defendant was the sole owner, no one else could have created the encrypted volumes, and the Eleventh Circuit’s opinion does not indicate that defendant claimed someone else had created those volumes.  Therefore, it is not clear to me why defendant’s mere knowledge of the passphrase is an admission of guilt, any more than it would be to surrender a key hanging about his neck, or to surrender the combination code to a safe in a home, where that safe was properly within the scope of a valid search warrant (as these hard-drives were).  Knowledge of the passphrase is not an element of the crime; possession of child pornography is.  (Conversely, a murderer’s knowledge of the secret location of his victim’s grave would be incriminating, because only the murderer would know that location.) Therefore, although the court intoned, “the Government appears to concede, as it should, that the decryption and production are compelled and incriminatory,” I don’t agree that the act of decryption and production, by itself, is incriminatory (even though the fruits of that production could contain evidence that is incriminating).
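As a technical aside, the analogy between a memorized safe combination and a passphrase is quite close: in typical disk-encryption designs, the passphrase is merely an input from which the actual decryption key is mechanically derived. A minimal sketch using Python’s standard library (the salt and iteration count here are illustrative assumptions, not any particular product’s parameters):

```python
import hashlib

# A memorized passphrase is not itself the decryption key. Disk-encryption
# tools typically feed it through a deliberately slow key-derivation
# function (KDF) such as PBKDF2 to produce the key that unlocks the volume.
passphrase = b"open sesame"
salt = b"volume-header-salt"  # typically stored unencrypted in the volume header
key = hashlib.pbkdf2_hmac("sha256", passphrase, salt, 200_000)

print(len(key))  # 32: an AES-256-sized key
```

Like a safe combination, the passphrase carries no content of its own: anyone who knows it (and holds the volume) derives exactly the same key, which is why knowledge of the passphrase communicates nothing beyond control of the container.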

That leaves the question of whether the passphrase is testimonial.  The Court noted, “an act of production can be testimonial when that act conveys some explicit or implicit statement of fact that certain materials exist, are in the subpoenaed individual’s possession or control.” Yet, as noted above, it is uncontroverted that defendant had exclusive care, custody, and control of the encrypted volumes, and knows the passphrase, regardless of whether those volumes contain contraband.  Citing United States v. Hubbell, 530 U.S. 27 (2000) and Fisher v. United States, supra, the court relied upon the so-called “foregone conclusion” doctrine, which posits that an act of production is not testimonial—even if the act conveys a fact regarding the existence or location, possession, or authenticity of the subpoenaed materials—if the Government can show with “reasonable particularity” that, at the time it sought to compel the act of production, it already knew of the materials, thereby making any testimonial aspect a “foregone conclusion.” I contend that exception is here met, because it is not in dispute that the contraband was traced back to three separate IP addresses in different hotel rooms rented by defendant, and that there was no other plausible repository for those files to exist but his computer equipment, and this satisfies the “reasonable particularity” requirement.

I previously discussed, on a bar association section blog in 2007 (here) and 2009 (here), the case of In re Boucher, where a U.S. judge for the District of Vermont ruled that requiring a criminal defendant to produce an unencrypted version of his laptop’s hard-drive, which was believed to contain child pornography, did not constitute compelled testimonial communication.  Professor Orin Kerr posted a detailed discussion of the matter here.

The circuit court appeal in Boucher was dropped, but a new case has surfaced in the U.S. District Court for the District of Colorado, United States v. Fricosu.  There, a bank fraud defendant’s home was searched pursuant to a warrant, and a computer holding encrypted files was seized. Further, defendant provided evidence indicating her ownership of the computer, that she knew it was encrypted, and that it contained inculpatory evidence. The government subpoenaed defendant to produce an unencrypted version of the contents of the computer, offering her immunity precluding use of the act of production against her.  Defendant moved to quash, arguing that the subpoena violated her Fifth Amendment privilege against self-incrimination by requiring a testimonial act of production, compelling her to acknowledge her control over the computer and its contents.  Borrowing substantially from Judge Sessions’ decision in Boucher, Judge Robert E. Blackburn ordered defendant to reveal the decryption password to the contents of the hard-drive.

In Boucher, Judge Sessions observed that, although the Fifth Amendment privilege ordinarily applies to verbal or written communications, an act that implicitly communicates a statement of fact may be within the purview of the privilege as well (citing United States v. Hubbell, 530 U.S. 27, 36 (2000); Doe v. United States, 487 U.S. 201, 209 (1988)), and that, although the contents of a document may not be privileged, “the act of producing the document may be.”  United States v. Doe, 465 U.S. 605, 612 (1984).  Production itself acknowledges that the document exists, that it is in the possession or control of the producer, and that it is authentic. Hubbell, 120 S.Ct. at 2043.  In summation, an act is testimonial when it entails implicit statements of fact, such as admitting that evidence exists, is authentic, or is within a suspect’s control. Doe, 487 U.S. at 209.

In the current case (Fricosu), Judge Blackburn ruled that, where the existence and location of the documents are known to the government, no constitutional rights are touched, because these matters are a foregone conclusion, insofar as they add little or nothing to the sum total of the Government’s information. Likewise, defendant’s production wasn’t necessary to authenticate the computer drives, where she had already admitted possession of the computers. See United States v. Gavegnano, 2009 WL 106370, at *1 (4th Cir. Jan. 16, 2009) (where government independently proved that defendant was sole user and possessor of computer, defendant’s revelation of password not subject to suppression). Specifically, Judge Blackburn ruled that, “There is little question here but that the government knows of the existence and location of the computer’s files.  The fact that it does not know the specific content of any specific documents is not a barrier to production.”

Law Technology News quoted our colleague, Craig Ball, as saying that what disturbs him most about the ruling is “that Judge Blackburn points to Boucher and simply equates the two scenarios without noting that the government’s knowledge of the contents of the kiddy porn laptop was substantial, specific, and no way speculative.” He added that this case differs from Boucher because the government had at least already seen the questionable files on Boucher’s computer: “the LEOS saw the contraband with their own eyes and recognized its criminal character.” But, in the current case, officials likely don’t know what is on the computer, because “the cops never got past the log in screen, due to the computer’s encryption.”  Likewise, Professor Kerr characterizes Judge Blackburn’s ruling as “not a model of clarity.”

I see Craig’s point, and I recognize that he doesn’t want law enforcement to have carte blanche to peer into one’s encrypted volume based solely on reasonable suspicion (or, alternatively, for the suspect to suffer an adverse jury instruction, State v. Levie, 695 N.W.2d 619 (Minn. App. 2005)).

But, I wonder whether a helpful analogy (one I can’t claim credit for) is the locked safe, within which the police have probable cause to believe a stolen painting has been stored. They don’t want to blast the safe open, because the priceless painting may be destroyed.  Assume the safe is in defendant’s home, and the police have a search warrant, and, therefore, assume the safe belongs to defendant and he is aware of its presence. Can defendant be compelled to turn over the key to the safe?  The answer, I believe, is yes. See Schmerber v. California, 384 U.S. 757 (1966) (where blood test evidence, although it may be an incriminating product of compulsion, is neither testimony nor evidence relating to some communicative act or writing by a defendant, it is not inadmissible on privilege grounds).  Now, let’s assume that it’s not a key the cops need, but a combination code to the safe.  Defendant has this code memorized.  Can the defendant now be compelled to give up the code?  If not, why not?  Because it’s a memory?  But the Fifth Amendment doesn’t afford an unqualified privilege for memories; it affords a privilege to statements that are testimonial in nature. How could defendant’s very knowledge of the code to a safe in his own home be either incriminating or testimonial, regardless of whether the cops actually saw the painting deposited therein or not?

In the last few years, a number of ethics opinions concerning information technology have been aimed at the increasing entrustment of client data to third parties (viz., so-called “cloud computing”), and are trending along with proposed or enacted data privacy legislation across the country. For example, California’s proposed Formal Opinion 08-0002 requires a lawyer to evaluate information security and finds that “attorneys are faced with an ongoing responsibility of evaluating the level of security of technology that has increasingly become an indispensable tool in the practice of law.” State Bar of Cal. Standing Comm. on Prof’l Responsibility & Conduct, Formal Op. Interim No. 08-0002 (2010).  Alabama’s Ethics Committee Opinion 2010-02 requires attorneys to exercise reasonable care against unauthorized access, which includes becoming knowledgeable about a cloud provider’s storage and security. Arizona’s Ethics Opinion 09-04 provides, in pertinent part, that:

[W]hether a particular system provides reasonable protective measures must be informed by the technology reasonably available at the time to secure data against unintentional disclosure. As technology advances occur, lawyers should periodically review security measures in place to ensure that they still reasonably protect the security and confidentiality of the clients’ documents and information.

It is also important that lawyers recognize their own competence limitations regarding computer security measures and take the necessary time and energy to become competent or alternatively consult experts in the field.

State Bar of Ariz., Ethics Op. 09-04, Confidentiality: Maintaining Client Files; Electronic Storage; Internet (12/2009) (citations and quotations omitted) (citing N.J. Ethics Op. 701).

Likewise, Opinion 842 of the New York State Bar Association requires lawyers to “stay abreast of technological advances,” (New York State Bar, Ass’n Comm. on Prof’l Ethics, Op. 842 (2010) (quoting N.Y. State 782 (2004))), and Minnesota’s Rule 1.6 requires that “[a] lawyer must act competently to safeguard information relating to the representation of a client against inadvertent or unauthorized disclosure by the lawyer or other persons who are participating in the representation of the client or who are subject to the lawyer’s supervision.” Minn. R. Prof’l. Conduct 1.6 cmt. 15 (emphasis added). See also Minn. Lawyers Prof’l Responsibility Bd., Op. No. 22, A Lawyer’s Ethical Obligations Regarding Metadata (2010).

And, earlier this year, the ABA Commission on Ethics 20/20 released a proposal for comment regarding “lawyers’ growing use of technology, especially technology that stores or transmits confidential information.”  The Commission recommended amendments to Model Rules 1.6 and 5.3, adding a paragraph to the former requiring attorneys to “make reasonable efforts to prevent the inadvertent disclosure of, or unauthorized access to, confidential information, including information in electronic form.” The Commission is also concerned with the inadvertent transmission of information, and recommended that lawyers have a duty to notify the sender of both physical and electronic information under certain circumstances.

The Arkansas Supreme Court reversed and remanded a death row inmate’s murder conviction, ordering a new trial because one juror slept and another used the Twitter service during court proceedings.

Significantly, the trial court instructed the jurors prior to opening arguments, as follows:

When you’re back in the jury room, it’s fine with me to use your cell phone if you need to call home or call business. Just remember, never discuss this case over your cell phone. And don’t Twitter anybody about this case. That did happen down in Washington County and almost had a, a $15 million law verdict overthrown. So don’t Twitter. Don’t use your cell phone to talk to anybody about this case other than perhaps the length of the case or something like that.

In one message, Juror 2 wrote: “Choices to be made. Hearts to be broken…We each define the great line.” Less than an hour before the jury announced its verdict, he tweeted: “It’s over.”  Other tweets included references to the trial, such as, “the coffee sucks here” and “Court. Day 5. Here we go again.”

The supreme court observed:

Because of the very nature of Twitter as an … online social media site, Juror 2’s tweets about the trial were very much public discussions. Even if such discussions were one-sided, it is in no way appropriate for a juror to state musings, thoughts, or other information about a case in such a public fashion . . . More troubling is the fact that after being questioned about whether he had tweeted during the trial, Juror 2 continued to tweet during the trial.
