Introduced yesterday, it’s just one page: http://www.lofgren.house.gov/images/stories/pdf/draft%20lofgren%20bill%20to%20exclude%20terms%20of%20service%20violations%20from%20cfaa%20%20wre%20fraud%20011513.pdf

This is in response to the prosecution and subsequent suicide of Aaron Swartz over the JSTOR academic documents hacking, and in response to criticism that the preexisting law is both vague and outdated. Read more here: http://www.latimes.com/business/technology/la-fi-tn-lofgren-floats-aarons-law-on-reddit-after-aaron-swartzs-death-20130116,0,929268.story

Last night, the Senate voted on cloture of S. 3414 (the Cybersecurity Act), which failed 51-47; a previous cloture vote on August 2 had failed 52-46.

The OpenCongress.org bill summary is as follows:

This bill would establish a new National Cybersecurity Council to identify technology sectors as “critical infrastructure” and approve or reject the sectors’ plans to secure their networks from cyber attacks. Businesses would not be required to adopt Council-approved security plans, but if they do they would be rewarded with broad protection from lawsuits and other benefits. The bill would also bolster website owners’ authority to monitor their users’ information and block users’ access. Companies would be given special legal immunity for sharing user information with federal government agencies.

To the best of my knowledge, there is no plan to reconsider this issue in the Senate, although some discussion was observed between Lieberman and McCain prior to the vote, which may lead to a motion to reconsider.

Attention may now focus on a potential Executive Order (a PDF copy of the September 28th draft of which is here) and a rewrite of HSPD-7 (PDF here).

by Diane Vlassis, JD, guest contributor

On October 5, 2012, a one-hour CLE on the ethical implications of e-discovery was offered by the MSBA’s Solo and Small Firm Section at the MSBA Offices in Minneapolis. There were two presenters: Michael Arkfeld, J.D., Director of the Arkfeld eDiscovery and Digital Evidence Program (AEDEP) of the Arizona State University’s Sandra Day O’Connor College of Law in Phoenix, and Christine Chalstrom, J.D., proprietor of Shepherd Data Services®, a litigation support company in Minneapolis.

Mr. Arkfeld opened the session by alerting the audience to the ABA’s August 6, 2012 adoption of additional language to Comment [6] of ABA Model Rule of Professional Conduct 1.1: Competence. ABA Model Rule 1.1 requires an attorney to provide “competent representation to a client.” Comment [6]: Maintaining Competence was amended to include a lawyer’s obligation to keep abreast of “the benefits and risks associated with relevant technology.” http://www.abajournal.com/files/20120808_house_action_compilation_redline_105a-f.pdf

Mr. Arkfeld described this amendment as a strong reminder of the ethical duty attorneys already have, under the general competency requirement of Model Rule 1.1, to follow discovery rules for all evidence, digital or otherwise. As applied to e-discovery, Mr. Arkfeld listed certain attorney misconduct to avoid under Model Rule 1.1, including the “failure to inquire into and understand your client’s and adversary’s IT infrastructure and practices.” He cited Qualcomm Inc. v. Broadcom Corp., No. 05cv1958-B (BLM), 2008 WL 66932 (S.D. Cal. Jan. 7, 2008), vacated in part, 2008 WL 638108 (S.D. Cal. Mar. 5, 2008), to support this conclusion. In that case, Qualcomm was ordered to pay $8.3 million in sanctions for failing to turn over thousands of electronic documents that were responsive to defendant’s discovery requests. The federal magistrate found that Qualcomm had intentionally suppressed these documents, and also sanctioned six attorneys, all outside counsel representing Qualcomm, for failing to comply with Fed. R. Civ. P. 26(g), which mandates that the attorney conduct “a reasonable inquiry” to determine whether the discovery submission is sufficient and proper. Fed. R. Civ. P. 26(g); Fed. R. Civ. P. 26 Advisory Committee Notes (1983 Amendment). According to the Qualcomm court, the sanctioned attorneys should have reviewed Qualcomm’s records to verify that the correct computers had been searched and that the appropriate search terms were used.

How can a solo practitioner or small law firm conduct a “reasonable inquiry” under FRCP Rule 26(g) to ensure that a client’s electronic discovery submission is sufficient?


Mr. Arkfeld offers that Qualcomm Inc. v. Broadcom Corp., No. 05cv1958-B (BLM), 2010 WL 1336937 (S.D. Cal. Apr. 2, 2010), provides some guidance by indicating what the sanctioned counsel should have understood:


  1. “how Qualcomm’s computer system is organized”
  2. “where electronic mail is stored”
  3. “how often and to what location laptops and personal computers are backed up”
  4. “whether, when and under what circumstances data from laptops are copied into repositories”
  5. “what type of information is contained within the various databases and repositories”
  6. “what records are maintained regarding the search for, and collection of, documents for litigation”


Qualcomm, 2010 WL 1336937, at *2.


In addition, the 2010 Qualcomm opinion states that outside counsel is responsible for supervising and verifying that the necessary discovery, as planned, was indeed conducted. Id. Are there circumstances under which counsel needs to make an onsite visit to understand their client’s computer network? Mr. Arkfeld says yes.


At this point in the CLE presentation, some members of the audience expressed concern about how the average attorney could possibly become technology-savvy enough to meet the guidance offered in Qualcomm Inc. v. Broadcom Corp., No. 05cv1958-B (BLM), 2010 WL 1336937 (S.D. Cal. Apr. 2, 2010). The learning curve for understanding a “computer system” or network, such that an attorney can supervise and verify the adequacy of the e-discovery process, seems incredibly high. The second presenter, Christine Chalstrom, J.D., then proposed an approach in the context of the services her company offers to assist attorneys in complying with e-discovery obligations. Following Ms. Chalstrom’s presentation, a panel of attorney-practitioners with experience in e-discovery challenges answered questions.


What I took away from this CLE was an awareness of the need for attorneys to become educated and skilled enough to meet the ABA Model Rule 1.1 duty to keep abreast of the “benefits and risks associated with relevant technology” as applied to conducting e-discovery. Based upon the presentation, the reactions of the audience, and the attorneys I know, more training is necessary before the average attorney can conduct “a reasonable inquiry” to determine the adequacy of an e-discovery submission under Fed. R. Civ. P. 26(g) and, most likely, equivalent state and local court rules. According to Mr. Arkfeld, judges as well as practitioners could benefit from tutelage on the nature of electronically stored information (ESI) and the considerations that must be taken into account to properly collect, produce, and eventually offer and admit it in court. I could not help but think of the online Case Management/Electronic Case Filing (CM/ECF) training offered by the federal courts to practitioners and their staff. Perhaps free or low-cost comprehensive online training would help bridge the e-discovery technology gap and assist attorneys in meeting their competence obligation under ABA Model Rule 1.1 as applied to e-discovery.

———————————————————————-

Guest Contributor, Diane Vlassis, JD, AAS in Computer Networking

Let me share with you the interesting case of Blythe v. Bell, 2012 NCBC 42 (N.C. Super. Ct. 2012), http://www.ncbusinesscourt.net/opinions/2012_NCBC_42.pdf. There, defendants engaged an outside “expert,” Tom Scott, owner of Computer Ants, for e-discovery work. Defendants’ counsel failed to conduct any intervening review of Scott’s work. Instead, defendants relied exclusively on Scott to conduct a privilege review, among other things.

Unfortunately for defendants, Scott had “never provided any forensic computer services in the context of a lawsuit,” and had never “been engaged as a computer expert or provided an opinion in any legal proceeding.” Rather, Scott had worked as a “truck driver, a Bass Pro Shop Security Manager, a respiratory therapist, and a financial auditor for a retail seller.” Put differently, Scott had no experience in e-discovery.

The Court found that Scott’s paucity of qualifications to serve as an e-discovery “expert” rendered the defendants’ actions particularly unreasonable. As a consequence of relying on him, defendants produced nearly 2,000 pages of otherwise privileged documents to the plaintiff.

The lesson here is that, if you have an e-discovery or digital forensics issue, don’t call the computer dude or dudette you use to fix your slow computer or printer, or to set up your wireless network. That person and a digital forensics expert or e-discovery consultant will rarely be one and the same.

_______________________________

The author, Sean L. Harrington, is a law student and digital forensics examiner, information security professional, and e-discovery, trial, and litigation consultant with the private practice firm of Attorney Client Privilege, LLC, and a risk management team lead for US Bank. Harrington holds the MCSE, CISSP, CHFI, CSOXP, and LexisNexis CaseMap support certifications, served on the board of the Minnesota Chapter of the High Technology Crime Investigation Association in 2011, is a member of Infragard, a member of Century College’s Computer Forensics Advisory Board and [erstwhile] Investigative Sciences for Law Enforcement Technology (ISLET) board, and is a council member of the Minnesota State Bar Association (MSBA) Computer & Technology Law Section.

This post concerns a First Circuit ruling in which plaintiffs who had lost significant funds to fraudulent wire transfers authorized by a bank were permitted to assert causes of action against the bank because its security protocols may not have been commercially reasonable.

What reminded me of this decision was an upcoming CLE on September 12 entitled “Disclosure of Cybersecurity Risk: What You Need to Do and Say About It.” Amy C. Seidel (Faegre Baker Daniels LLP) will talk about the new guidance from the SEC Division of Corporation Finance regarding company disclosures of cybersecurity risk. In the presentation, Amy will suggest what attorneys should consider to be certain that clients are in compliance. Topics include: what the SEC guidance requires companies to disclose about cybersecurity risk; procedures companies are following to assess cybersecurity risk; the kinds of disclosures companies are including in SEC reports about cybersecurity risk; and what the SEC is saying about cybersecurity risk in comment letters.

Indeed, I’ve noticed an increasing volume of posts in the MSBA listservs over the past year regarding fraud and attempted or successful fraudulent transfers of funds. In fact, one listserv member contacted me after discovering that someone had hacked into a Hotmail account and was sending requests to a personal banker to transfer funds out of the country. Fortunately, the banker was astute enough to recognize that the requests were abnormal for the client, and she sought voice verification. Another recent post sought referrals for a cause of action regarding the same. And I recall a recent post about a pending claim by a law firm that was duped by a phishing scam or something of the sort and had initiated suit on that basis.

I also gave a couple of presentations in 2011 on “Corporate and Individual Liabilities of Releasing Vulnerable Code,” in which I discussed the concept of “Negligent Enablement of Cybercrime,” a cause of action advanced by Rustad & Koenig. See Michael L. Rustad & Thomas H. Koenig, The Tort of Negligent Enablement of Cybercrime, 20 Berkeley Tech. L.J. 1553 (2005), http://www.btlj.org/data/articles/20_04_03.pdf.

Although the First Circuit didn’t use the “negligent enablement” parlance in the PATCO decision (discussed below), the concept is the same.

Significantly, the PATCO case is based on the Uniform Commercial Code, adopted by most states, including Minnesota.

The Court explained that the commentary to the applicable UCC section governing electronic funds transfers does not, on its face, preclude an action for breach of contract or breach of fiduciary duty, because those common law claims are “not inherently inconsistent” with Article 4A. On the other hand, the Court perceived that a closer question is whether Article 4A, on these particular facts, precludes negligence claims insofar as such claims might be inconsistent with the duties and liability limits set forth in Article 4A. The Court suggested that it indeed does.

What the Court actually said is that the UCC establishes a particular set of obligations, remedies, and limitations on liability that were intended by the drafters to provide clarity to funds transfers because the preexisting common law was uncertain, and that this provision of the UCC appears to preclude plaintiffs from bringing a common law negligence claim for losses related to the fraudulent funds transfers. However, the UCC does not restrain the parties from agreeing to more stringent provisions above and beyond the Article, which would be governed by other common law causes of action (such as contract), if the UCC Article doesn’t conflict. And it does not.

The lesson here seems to be that the UCC does establish guidelines on who bears the loss in the event there is a breach of the duty to act reasonably (which duties are defined by the FFIEC, OCC, and generally accepted industry standards), but if the parties have agreed to something more robust, they can be held accountable for breach of loyalty or breach of contract.


Introduction

Over the last several years, I’ve posted a handful of short blog entries about the topic of compelling a criminal defendant to surrender a passphrase to an encrypted volume or hard drive. These entries concern three cases: In re Grand Jury Subpoena Duces Tecum Dated March 25, 2011 (11th Cir. 2012); United States v. Fricosu (D. Colo. 2012); and In re Grand Jury Subpoena (Boucher), 2009 U.S. Dist. LEXIS 13006 (D. Vt. 2009).

I have developed the opinion (admittedly, more on hunch than scholarly research) that a defendant should not be able to knowingly withhold a passphrase or password to an evidence trove any more than he should be permitted to hang on to a physical key that could be used to open a safe that the Government has a valid warrant to search, and which is believed to contain evidence.

Unfortunately, I have found myself on the wrong side of this issue. My colleagues Sharon Nelson and Craig Ball disagree with me on some aspects of the issue. And my position is seemingly at odds with the Eleventh Circuit in Grand Jury Subpoena Duces Tecum Dated March 25, above, a decision that Professor Orin Kerr described as mostly correct (although I note that the Eleventh Circuit did distinguish Boucher and recognized exceptions).

Privilege

Setting aside, for the sake of this comment, the question of whether knowledge of the passphrase is both “testimonial” and “incriminating” for purposes of the Fifth Amendment (the very issues central to the aforementioned cases), or whether the knowledge of the passphrase should be distinguished from possession of a physical key, my belief has been based on a principle that parties to either criminal or civil litigation should simply not be permitted to purposefully withhold admissible evidence from each other. Professor Arthur Miller, the great civil procedure scholar, calls this the “cardinal, basic, philosophical principle.” “Since we’re trying to get at the truth,” he continues, “You must give every litigant equal access to all relevant data . . . It’s as American as apple pie.” Arthur R. Miller, Civil Procedure, Sum & Substance audio lecture series (1999).

Now, before I continue, let’s recognize what I have purported to be an axiom for what it is: incorrect. In fact, there are several bases under our system of law upon which a party is permitted to withhold otherwise relevant, admissible evidence. We call it “privilege.” Privilege is that annoying rule of law (I’m being facetious here) that, “to protect a particular relationship or interest, either permits a witness to refrain from giving testimony he otherwise could be compelled to give, or permits someone (usually one of the parties) to prevent the witness from revealing certain information.” Waltz & Park, Evidence, Gilbert Law Summaries, § 635. Perhaps the most common example is the attorney-client privilege. See Upjohn Co. v. United States, 449 U.S. 383, 389 (1981) (acknowledging the attorney-client privilege as “the oldest of the privileges for confidential communications known to the common law”).

But even the hallowed attorney-client privilege has its limits. Under the civil fraud and criminal fraud exceptions, an otherwise privileged communication becomes discoverable. See, e.g., United States v. Zolin, 491 U.S. 554, 562–63 (1989) (stating goals of attorney-client privilege are not served by protecting communications made for purpose of getting advice for commission of crime or fraud); see also Deborah F. Buckman, Annotation, Crime-Fraud Exception to Work Product Privilege in Federal Courts, 178 A.L.R. Fed. 87, § 2[a] (2002).

Encryption as Evidence Destruction

Craig Ball often reminds his audiences of the three ways to destroy electronically stored information: (1) overwrite the bytes with new data; (2) physically destroy the media upon which the data was written; or (3) use strong encryption on the data and forget the passphrase. Thus, in my assessment, if an individual encrypts evidence while engaging in the commission of a crime, it is tantamount to flushing drugs down the toilet, throwing the murder weapon in a lake, or silencing a witness. These are independent criminal acts, separable from the underlying charges. Likewise, a civil litigant who encrypts evidence after the duty to preserve has attached (articulated best in Zubulake v. UBS Warburg, 220 F.R.D. 212, 218 (S.D.N.Y. 2003) (“Once a party reasonably anticipates litigation, it must suspend its routine document retention/destruction policy and put in place a ‘litigation hold’ to ensure the preservation of relevant documents”)) engages in spoliation that may be punishable. Therefore, I contend, by using encryption, a defendant or litigant may engage in spoliation of evidence (albeit undoable spoliation) that may be subject to independent criminal liability, civil sanctions, or an adverse jury instruction.

Notice that the qualifying phrases above (“while engaging in the commission of a crime” and “after the duty to preserve has attached”) establish mens rea (i.e., intent: purposeful or knowing conduct), that is, that the actor was using encryption in furtherance of a crime, or to destroy evidence to thwart a law enforcement investigation. An instructive analog may be the safe-harbor provision, Fed. R. Civ. P. 37(f), as applied to electronic discovery in civil cases. The provision shields a party who cannot produce evidence lost as a result of the routine, good-faith operation of an electronic information system. In other words, if an individual was using whole-disk encryption not to obfuscate criminal activity, but rather because he was trying to protect against identity theft, or because the system came with it by default, there is no intent and hence no criminal culpability. Another helpful analog might be found in Arizona v. Youngblood, 488 U.S. 51, 58 (1988), where the U.S. Supreme Court held that charges may be dismissed based upon evidence lost or destroyed by the Government that is deemed only potentially exculpatory (as opposed to apparently exculpatory) only if the defendant can show the evidence was destroyed in bad faith.

But perhaps the best authority addressing the mens rea requirement is that required for 18 U.S.C. § 1503 (conduct that, among other things, corruptly endeavors to obstruct or impede the due administration of justice): “To sustain its burden of proof, the government must show that the defendant knowingly and intentionally undertook an action from which an obstruction of justice was a reasonably foreseeable result. Although the government is not required to prove that the defendant had the specific purpose of obstructing justice, it must establish that the conduct was prompted, at least in part, by a corrupt motive.” United States v. Barfield, 999 F.2d 1520, 1524 (11th Cir. 1993) (internal quotations omitted). Unlike the duty to preserve in civil cases, which requires only reasonable anticipation of litigation, the federal criminal context requires there to have been a pending judicial proceeding known to the defendant at the time. See, e.g., United States v. Fineman, 434 F. Supp. 197 (E.D. Pa. 1977) (in applying the obstruction of justice statute to destruction of documents, federal courts generally have not required that a subpoena have issued; rather, it is sufficient for an obstruction conviction that the defendant knew that a grand jury was investigating possible violations of federal law and intentionally caused destruction of the incriminating document). In fact, 18 U.S.C. § 1503 has even been applied to prosecute those who, in a civil case, were accused of willfully destroying documents subject to discovery. United States v. Lundwall, 1 F. Supp. 2d 249 (S.D.N.Y. 1998).

Note that my theory is not that the presence of encryption is somehow admissible as relevant in demonstrating defendant’s mental state or aptitudes, as it appears to have been in State v. Levie, 695 N.W.2d 619 (Minn.App. 2005) (“the existence of an encryption program on [defendant’s] computer was at least somewhat relevant to the state’s case against him,” and the jury was allowed to consider it). See also Jessica Murphy, Swiss Cheese That’s All Hole: How Using Reading Material To Prove Criminal Intent Threatens The Propensity Rule, 83 Wash. L. Rev. 317 (May 2008).  Rather, my theory is that, even if a court finds that a defendant cannot be compelled to aid in his prosecution by surrendering a passphrase (because doing so would be testimonial and incriminating), a defendant may nevertheless be criminally liable for evidence spoliation.  Further, when evidence is spoliated, a factfinder may be entitled to presume that the evidence was unfavorable to the spoliator. See Washington Gas Light Co. v. Biancaniello, 87 U.S. App. D.C. 164, 183 F.2d 982 (D.C. Cir. 1950) (Willful destruction of evidence by a party properly raises the inference that the materials destroyed were adverse to the party which brings about the destruction); Brown & Williamson Tobacco v. Jacobson, 827 F.2d 1119, 1134 (7th Cir. 1987) (“A court and a jury are entitled to presume that documents destroyed in bad faith while litigation is pending would be unfavorable to the party that has destroyed the documents.”); Dale A. Oesterle, A Private Litigant’s Remedies for an Opponent’s Inappropriate Destruction of Relevant Documents, 61 Tex. L. Rev. 1185, 1232-39 (1983) (“[A] party’s bad faith destruction of relevant documents is an admission by conduct that he believes his case is weak and cannot be won fairly.”). See generally 2 John Henry Wigmore, Evidence §291 (James H. Chadbourn rev. ed., 1979) (discussing evidence spoliation).

Conclusion

The right to privacy, as Justice Douglas recognized in Griswold v. Connecticut, arises from “penumbras, formed by emanations from those [specific] guarantees . . . in the Bill of Rights.” And the Bill of Rights operates as a constraint on the Government. But those penumbrae do not, in my view, confer a magical privileged status on file or disk encryption under the rubric of privacy when, in certain limited circumstances, such encryption is really just evidence spoliation.

As a forensics examiner, I already see, and foresee, a higher frequency of criminal and civil investigations thwarted by the use of file or disk encryption and the privilege under the Fifth Amendment. Absent new statutes addressing the misuse of encryption technology, a prosecutor should closely examine the Eleventh Circuit decision to see if his or her case falls under the limited exceptions that would require a defendant to surrender the passphrase under penalty of remedial contempt. Alternatively or conjunctively, prosecutors should determine whether a defendant’s use of encryption falls within the scope of an applicable federal or state statute for destroying evidence in the furtherance of a crime, or incident to a criminal investigation, where there is extrinsic evidence of a corrupt motive.

Introduction

In this post, I will provide some initial impressions and findings. I do not endeavor to write a white paper, or to employ an industry-standard, scientific methodology to evaluate the tool (if for no other reason than that I am constrained by time).


PostgreSQL

First, I note that it appears no one has been able to get FTK to work with PostgreSQL, leading me to conclude that the product was shipped without being tested in this regard. (If a reader has been able to get it working, I encourage you to post a comment here.) I was not able to get it to work, and I wasted two valuable (otherwise billable) days I had set aside for a client, only to make this discovery.

My review of the AccessData forums indicates identical experiences, and I haven’t found one poster there who claims to have finished an evidence load using FTK with Postgres. (Note: I am unable to determine whether excerpting comments from the private forums would violate AccessData’s terms of use, so I am proceeding with an abundance of caution by not doing so.)

Likewise, I conferred on Friday with another colleague, a lead examiner for a large company, and he replied:

I just had the same experience. I mistakenly upgraded to 4.0, removed Oracle completely, and installed PostgreSQL. That was a mistake . . . some of my run-of-the-mill cases that should only take a couple hours were taking days and had to be killed off. Then, after I removed PostgreSQL and re-installed Oracle I couldn’t get it to forget about the old connection and had all sorts of weirdness with it not finding Oracle some of the time. I eventually backed out FTK, Oracle, and PostgreSQL and did a complete manual cleanup of all garbage files and registry entries and then re-installed everything. I am back to Oracle with 4.0 and things are fine again, but what a mess to deal with this on 3 machines.

And the same experiences are reported on the ForensicFocus forums (e.g., http://www.forensicfocus.com/Forums/viewtopic/p=6557533/). Thus, based solely on these numerous anecdotes, and based on my understanding that new purchasers of FTK do not receive Oracle licensing, I have concluded that FTK 4.0 with Postgres is not merchantable (suitable for its intended purpose), although, as noted above, I may be incorrect and would be pleasantly surprised to be proved wrong.

So, I, too, reverted to Oracle. Unfortunately, I couldn’t get the Oracle KFF library for v4.0 posted on AccessData’s FTP site to work. In browsing through the AccessData forums, neither could anyone else. AccessData made available to me, and certain others who complained, a working KFF, which (last time I checked) is not the one available for download at the AccessData FTP site.

Now, like many of the others who have posted to the AccessData forums and elsewhere, I am able to use v4.0 with Oracle.


Hardware

The three machines I used for testing are as follows:

(1) FTK & Oracle server (one box) – SuperMicro X8DTL-6F motherboard, LSI SAS2 2008 controller, two RAID-0 volumes each consisting of two OCZ Vertex 3 Max IOPS 120GB SSDs (SATA III; one volume for Oracle data, the other for the O/S and the adTemp directory), 24GB of DDR3 1333MHz ECC non-registered server memory, and two Intel Xeon X5650 hexacore processors.

(2) Distributed Processing Engine (“DPE”) #1 – Asus M4A89GTD Pro/USB3 motherboard, AMD Phenom II X6 1100T hexacore processor (watercooled, but not overclocked), 16GB of DDR3 memory, O/S residing on an OCZ Vertex 3 SSD (SATA III), the temp directory used by the AccessData distributed processing engine residing on a separate OCZ Vertex 3 Max IOPS edition SSD, and the pagefile residing on a Western Digital Raptor 10K RPM hard drive.

(3) DPE #2 – Hewlett-Packard DV6 laptop, Intel Core i7 720QM processor, O/S installed on an Intel SATA II SSD, temp directory used by the AccessData distributed processing service residing on a separate OCZ Vertex 2 SSD (SATA II), and 8GB of RAM.


Source Evidence Configuration

In an effort to find the fastest evidence load times, I experimented with various combinations of the foregoing.  As a test image, I used a 186GB DD image (ultimately consisting of 1,052,891 evidence items), hosted on a Western Digital 4TB My Book Studio Edition II (SATA II – up to 3 Gb/sec) configured as RAID-0.  I used both KFF alert and ignore, MD5 & SHA1 hashes (but not SHA256 or “fuzzy” hashes), expand compound files, flag bad extensions, entropy test, dtSearch index, create graphics thumbnails, data carve, meta carve, registry reports, include deleted files, and explicit image detection (X-DFT & X-FST).   Using oradjuster, I tweaked the SGA_TARGET parameter to use only 18% of available physical memory during evidence processing.


Distributed Processing

Before continuing, I’d like to mention a few things about the Distributed Processing Engine, which are hard-learned lessons from either failing to read the user guides and appendices, or from experimentation:

(1) To get DPE working, you must have a system share as the path to the evidence in the Add Evidence dialog box. Without it, no distributed processing will occur. Likewise, you need to have the Oracle working directory on a public share, the FTK-cases directory on a public share, and all systems using mirrored accounts (which Microsoft defines as “a matching user name and a matching password on two [or more] computers”). You also need to disable any Windows firewalls or other firewalls. Tip ► An easy way to make certain the port on a DPE is reachable is to install PortQry version 2 and run the command “PortQry -n {machineName} -p tcp -e 34097”, where “machineName” is the name of your DPE and 34097 is the default port (configurable in the distributed processing configuration menu). AccessData ought to include a “test connection” button in the distributed processing configuration; it would probably save their help desk a lot of e-mails and calls.

(2) And, although the v4.0 System Specification Guide discusses how to configure the adTemp directory on the localhost processing engine (which directory should be located on its own, high i/o throughput drive, because it is the interface between the processing engine and Oracle), I have found no discussion about how to optimize the DPE machines.

Tip ►  On a Windows Vista or Windows 7 machine, if you are logged on as “farkwark,” the distributed processing engine will write its files to Users/farkwark/appData/Local/Temp.   To relocate this /Temp directory to a different drive, you need to create a junction, as follows.  First, log off and log on as a different admin account user.  Next, move (not copy) the /Temp directory to the different drive (say F:).  Rename it, if you like (e.g., “FTKtemp”).   Now, from a command prompt, type:

mklink /j “Users\farkwark\appData\Local\Temp” “F:\FTKtemp”

From this point forward, the processing engine will, in fact, be writing its temporary files to the F:-drive, thereby not competing with the O/S drive for i/o.
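As a cross-platform alternative to the PortQry check in tip (1), the same reachability test can be sketched in a few lines of Python. This is purely illustrative; “dpe-hostname” is a placeholder, and 34097 is simply the default DPE port mentioned above:

```python
import socket

def port_reachable(host: str, port: int, timeout: float = 3.0) -> bool:
    """Return True if a TCP connection to host:port succeeds within timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:  # covers refused connections, timeouts, and DNS failures
        return False

# Usage (hypothetical DPE hostname; 34097 is the default distributed
# processing port, configurable in the distributed processing menu):
#   port_reachable("dpe-hostname", 34097)
```

If this returns False, check the shares, mirrored accounts, and firewall settings described in tip (1) before suspecting the DPE service itself.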


Hardware Configuration Experiment No. 1: FTK & Oracle Server + DPE #1 & DPE #2

With this three-machine configuration, I rarely saw the FTK & Oracle server's 12 cores (24, counting hyperthreading) rise above a collective 15% load. DPE #2 (the HP laptop) reached near 100% processing load several times, with up to 6GB of its 8GB of memory in use. DPE #1 ranged between 50% and 80% CPU utilization, with extended periods of low utilization, and used about 8GB of its 16GB of physical memory. Total time elapsed was 11 hours, 24 minutes.

 

Hardware Configuration Experiment No. 2: FTK & Oracle server (one box implementation), alone

With this one-box configuration, the dual hexacore Xeon CPUs were pegged at 99 or 100% for extended periods of time (note that this differs from the experience of others, who have written, "We have multi cored, multi processor CPUs on our systems. What we've found is that typically, unless we are password cracking, that the I/O from the disks can't keep up with resources available. Meaning our CPUs are never maxed out. So the CPUs are not the bottleneck for getting more speed"). The FTK/Oracle server used up to 20GB (of 24GB available) of physical memory (recall that SGA_TARGET was set to 18%). Total time elapsed was 9 hours, 4 minutes, an improvement of 20.5% over the three-machine distributed processing configuration (no, that's not a typo).

 

Hardware Configuration Experiment No. 3: FTK & Oracle server + DPE #1

With this two-machine configuration, the FTK server's CPU utilization was rarely above 40%, only occasionally reached 60%, usually ran between 5% and 25%, and used between 8 and 12GB (of 24GB available) of physical memory. Meanwhile, DPE #1's CPU utilization was pegged at 99 to 100% for extended periods of time, and it used up to 10GB (of 16GB available) of physical memory. Total time elapsed was 8 hours, 33 minutes: a 25% improvement over the three-machine distributed processing configuration, but only a 5.7% improvement over the FTK/Oracle one-box solution.

 

Hardware Configuration Experiments Conclusion

Based on the type of hardware I am using, I found very little benefit (up to 6% processing-time improvement) and, in fact, some detriment (over 20% in lost processing time) in stringing together numerous DPE workstations. My experience is inconsistent with AccessData's published findings on processing-time differences between stand-alone boxes and distributed processing clusters (see http://accessdata.com/distributed-processing).
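For the record, the percentage figures quoted in the three experiments follow directly from the elapsed times; a quick sketch of the arithmetic:

```python
def improvement(baseline_minutes: int, new_minutes: int) -> float:
    """Percent reduction in elapsed processing time relative to a baseline."""
    return round(100 * (baseline_minutes - new_minutes) / baseline_minutes, 1)

three_box = 11 * 60 + 24   # Experiment 1: server + 2 DPEs -> 684 minutes
one_box   = 9 * 60 + 4     # Experiment 2: one-box server  -> 544 minutes
two_box   = 8 * 60 + 33    # Experiment 3: server + 1 DPE  -> 513 minutes

print(improvement(three_box, one_box))   # 20.5 -- one box beats three machines
print(improvement(three_box, two_box))   # 25.0
print(improvement(one_box, two_box))     # 5.7
```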

 

Add-on Modules: Data Visualization & Explicit Image Detection

Initially, I thought the Data Visualization module did not work. Whether I attempted to view a directory containing several score files or the entire 1 million+ items, it never displayed more than a handful of results (sometimes zero or one, producing a pie graph that was just one big green circle). It turns out (of course) that it was my fault for failing to select the appropriate date range. Had I read the manual first, I would have noticed, "Information can only be displayed for the date that you have selected." FTK 4.0 User Guide at 200. Apparently, when the Data Visualization tool first opens, it defaults to one day (the first day of the oldest evidence in the list), which is not very intuitive.

AccessData claims that the Data Visualization add-on component "provides a graphical interface to enhance understanding and analysis of cases. It lets you view data sets in nested dashboards that quickly communicate information about the selected data profile and its relationships." Among other things, it purportedly provides "a complete picture of the data profile and makeup," empowers the examiner to "Understand the file volume and counts through an interactive interface," and "Create a treemap of the underlying directory structure of the target machine for an understanding of relative file size and location" (similar to, but not as elegant as, WinDirStat).

In summation, the tool appears to work as designed, although I haven't done any substantive reporting from it. One user posted on the AccessData forum that there appears to be no way to export the graphs into a report, but this is easily remedied by taking a screen clipping with SnagIt or Microsoft's OneNote, or by a screen print.

Also, I have been experimenting with the Explicit Image Detection (EID) add-on. AccessData states, "This image detection technology not only recognizes flesh tones, but has been trained on a library of more than 30,000 images to enable auto-identification of potentially pornographic images . . . AccessData will continue to integrate more advanced image search and analysis functionality into FTK. Customers who have added the explicit ID option to their Forensic Toolkit® license and are current on their SMS will automatically receive those new capabilities as they become available." Notwithstanding this commitment, it appears that no additional functionality has been added since the feature's release with FTK 3.0. I also note that the technology is unlike Microsoft's "PhotoDNA," which is reputed to process images in less than five milliseconds each, to accurately detect target images 98 percent of the time, and to report a false alarm only one in a billion times. Comparatively, AccessData's EID has been found to achieve 69.25% effectiveness with 35.5% false positives. Marcial-Basilio, Aguilar-Torres, Sánchez-Pérez, Toscano-Medina, Pérez-Meana, "Detection of Pornographic Images," 2 Int'l Journal of Computers 5 (2011).

My experience reveals many false positives (such as, “small_swatch_beige.png,” an image consisting of a plain beige coloured box, ranking at the very top of the list compiled by the X-ZFN algorithm, which is supposed to be the most accurate of the three algorithms), and seems to confirm that the algorithm is based on the presence of flesh tones (and nothing more, unless your system has one or more of the 30,000 images that became part of the library at the time the tool was introduced). Nevertheless, if one is short on time (and many law enforcement agencies’ examiners are), the tool does certainly help to reduce the data set that requires manual review.

Barely three weeks after I penned Another Judge Rules Encryption Passphrase not Testimonial Under Fifth Amendment Analysis, the Eleventh Circuit has held that a defendant’s “decryption and production of the hard drives’ contents would trigger Fifth Amendment protection because it would be testimonial, and that such protection would extend to the Government’s use of the drives’ contents.”

For the reasons set forth in my previous posts on this topic, and for the reasons more fully set forth below, I disagree, and I hope the Government petitions for a writ of certiorari on this issue.

In this case, captioned In re Grand Jury Subpoena Duces Tecum Dated March 25, 2011, law enforcement officials began an investigation of an individual using a YouTube.com account whom the Government suspected of sharing explicit materials involving underage girls. During the course of the investigation, officers obtained several IP addresses from which the account had accessed the internet. Three of these IP addresses were traced to hotels, whose guest registries revealed that the defendant was the sole registrant common to all of them during the relevant times.

Although probable cause was not raised as an issue in this case, it should be noted that the Government's forensic investigator testified that the Government believed data existed on the still-encrypted parts of the hard drive, and "introduced an exhibit with nonsensical characters and numbers, which it argued revealed the encrypted form of data." Further, the Government's forensic expert conceded that, although encrypted, it was possible the volumes contained nothing. When defendant asked the forensic expert, "So if a forensic examiner were to look at an external hard drive and just see encryption, does the possibility exist that there actually is nothing on there other than encryption? In other words, if the volume was mounted, all you would see is blank. Does that possibility exist?," the expert replied: "Well, you would see random characters, but you wouldn't know necessarily whether it was blank." And, when pressed by defendant to explain why the Government believed something might be hidden, the expert replied, "The scope of my examination didn't go that far." In response to further prodding, "What makes you think that there are still portions that have data[?]," the expert explained, "We couldn't get into them, so we can't make that call." Finally, when asked whether "random data is just random data," the expert concluded that "anything is possible."

Of course, everything the expert said, taken in isolation, was true, but I fail to see why these explanations undermine the Government's right to the unencrypted data. Sure, the expert could or should have pointed to circumstantial trace evidence (such as registry data and link files) that should have existed had the defendant possessed and viewed the files as alleged. Sure, the Government could or should have asked for an adverse inference as to the presence and use of a forensic wiping utility if such trace evidence were not present. But the Government wasn't required to have probable cause as to the encrypted volumes specifically, because probable cause as to the computing equipment as a whole had already been established.

Indeed, some discussion of the Fourth Amendment is necessary here. The Fourth Amendment provides that "The right of the people to be secure in their persons, houses, papers, and effects, against unreasonable searches and seizures, shall not be violated, and no Warrants shall issue, but upon probable cause, supported by Oath or affirmation, and particularly describing the place to be searched, and the persons or things to be seized." As relevant here, the warrant must describe with particularity the place to be searched and the things to be seized. Note that the comma following the word "searched" limits the particularity requirement to the place to be searched. And, although warrants must establish probable cause and particularly name the place to be searched, the Supreme Court has rejected the argument that warrants must include "a specification of the precise manner in which they are to be executed." Dalia v. United States, 441 U.S. 238, 257 (1979).

In this case, the place to be searched was the hotel where defendant was staying and, presumably, any computers found therein were identified as the “things to be seized.”  But, some urge that computer hard drives should be regarded as a “virtual home” or “virtual warehouse.” See, e.g., Orin Kerr, Searches and Seizures in a Digital World, 119 Harv. L. Rev. 531, 539, 542 (2005) (“While houses are divided into rooms, computers are more like virtual warehouses . . . While computers are compact at a physical level, every computer is akin to a vast warehouse of information.”). If so, the warrant may be construed to refer to the computers as among the “places to be searched” (in addition to the hotel room), as well as the things to be seized. See United States v. Ross, 456 U.S. 798, 821 (1982) (“When a legitimate search is under way, and when its purpose and its limits have been precisely defined, nice distinctions between closets, drawers, and containers, in the case of a home, or between glove compartments, upholstered seats, trunks, and wrapped packages, in the case of a vehicle, must give way to the interest in the prompt and efficient completion of the task at hand”).  My point here is that, if the warrant was sufficient to justify the search of the computers, that justification extended to all portions of each computer.

Assuming the Government has a right to inspect all portions of the hard-drives, based on probable cause to believe they were an instrumentality of a crime, then it is appropriate to begin the Fifth Amendment analysis. Under the Fifth Amendment, “[n]o person … shall be compelled in any criminal case to be a witness against himself.”  The courts have consistently interpreted this provision as “protect[ing] a person . . . against being incriminated by his own compelled testimonial communications.” Fisher v. United States, 425 U.S. 391, 409 (1976).  Thus, to be afforded the protection, the statement must be: (1) compelled, (2) testimonial in nature, and (3) serve to incriminate the declarant in a criminal proceeding. If these elements are met, the declarant has the right “not to answer questions put to him in any proceeding, civil or criminal, formal or informal, where the answers might incriminate him in future criminal proceedings.” Lefkowitz v. Turley, 414 U.S. 70, 77 (1973).

In this case, there was no dispute that defendant had care, custody, and control of the computers and hard-drives. Because he was the sole owner, no one else could have created the encrypted volumes, and the Eleventh Circuit's opinion does not indicate that defendant claimed someone else had created them. Therefore, it is not clear to me why defendant's mere knowledge of the passphrase is an admission of guilt, any more than surrendering a key hanging about his neck, or the combination to a safe in his home, would be, where the search was properly within the scope of a valid warrant (as these hard-drives were). Knowledge of the passphrase is not an element of the crime; possession of child pornography is. (Conversely, a murderer's knowledge of the secret location of his victim's grave would be incriminating, because only the murderer would know that location.) Therefore, although the court intoned, "the Government appears to concede, as it should, that the decryption and production are compelled and incriminatory," I don't agree that the act of decryption and production, by itself, is incriminatory (even though the fruits of that production could contain evidence that is incriminating).

That leaves the question of whether the passphrase is testimonial.  The Court noted, “an act of production can be testimonial when that act conveys some explicit or implicit statement of fact that certain materials exist, are in the subpoenaed individual’s possession or control.” Yet, as noted above, it is uncontroverted that defendant had exclusive care, custody, and control of the encrypted volumes, and knows the passphrase, regardless of whether those volumes contain contraband.  Citing United States v. Hubbell, 530 U.S. 27 (2000) and Fisher v. United States, supra, the court relied upon the so-called “foregone conclusion” doctrine, which posits that an act of production is not testimonial—even if the act conveys a fact regarding the existence or location, possession, or authenticity of the subpoenaed materials—if the Government can show with “reasonable particularity” that, at the time it sought to compel the act of production, it already knew of the materials, thereby making any testimonial aspect a “foregone conclusion.” I contend that exception is here met, because it is not in dispute that the contraband was traced back to three separate IP addresses in different hotel rooms rented by defendant, and that there was no other plausible repository for those files to exist but his computer equipment, and this satisfies the “reasonable particularity” requirement.

I previously discussed, on a bar association section blog in 2007 (here) and 2009 (here), the case of In re Boucher, in which a U.S. judge for the District of Vermont ruled that requiring a criminal defendant to produce an unencrypted version of his laptop's hard-drive, which was believed to contain child pornography, did not constitute compelled testimonial communication. Professor Orin Kerr posted a detailed discussion of the matter here.

The circuit court appeal in Boucher was dropped, but a new case surfaced in the U.S. District Court for the District of Colorado, United States v. Fricosu. There, a bank fraud defendant's home was searched pursuant to a warrant, and a computer holding encrypted files was seized. Further, defendant provided evidence indicating her ownership of the computer, that she knew it was encrypted, and that it contained inculpatory evidence. The government subpoenaed defendant to produce an unencrypted version of the computer's contents, offering immunity as to the act of production. Defendant moved to quash, arguing that the subpoena violated her Fifth Amendment privilege against self-incrimination by requiring a testimonial act of production, i.e., by compelling her to acknowledge her control over the computer and its contents. Borrowing substantively from Judge Sessions' decision in Boucher, Judge Robert E. Blackburn ordered defendant to reveal the decryption password to the contents of the hard-drive.

In Boucher, Judge Sessions observed that, although the Fifth Amendment privilege ordinarily applies to verbal or written communications, an act that implicitly communicates a statement of fact may be within the purview of the privilege as well (citing United States v. Hubbell, 530 U.S. 27, 36 (2000); Doe v. United States, 487 U.S. 201, 209 (1988)), and that, although the contents of a document may not be privileged, the act of producing the document may be. United States v. Doe, 465 U.S. 605, 612 (1984). Production itself acknowledges that the document exists, that it is in the possession or control of the producer, and that it is authentic. Hubbell, 120 S.Ct. at 2043. In summation, an act is testimonial when it entails implicit statements of fact, such as admitting that evidence exists, is authentic, or is within a suspect's control. Doe, 487 U.S. at 209.

In the current case (Fricosu), Judge Blackburn ruled that, where the existence and location of the documents are known to the government, no constitutional rights are touched, because these matters are a foregone conclusion, insofar as they add little or nothing to the sum total of the Government's information. Likewise, defendant's production wasn't necessary to authenticate the computer drives, because she had already admitted possession of the computers. See United States v. Gavegnano, 2009 WL 106370, at *1 (4th Cir. Jan. 16, 2009) (where government independently proved that defendant was sole user and possessor of computer, defendant's revelation of password not subject to suppression). Specifically, Judge Blackburn ruled that, "There is little question here but that the government knows of the existence and location of the computer's files. The fact that it does not know the specific content of any specific documents is not a barrier to production."

Law Technology News quoted our colleague, Craig Ball, as saying that what disturbs him most about the ruling is "that Judge Blackburn points to Boucher and simply equates the two scenarios without noting that the government's knowledge of the contents of the kiddy porn laptop was substantial, specific, and no way speculative." He added that this case differs from Boucher in that the government had already seen the questionable files on Boucher's computer: "the LEOS saw the contraband with their own eyes and recognized its criminal character." In the current case, by contrast, officials likely don't know what is on the computer, because "the cops never got past the log in screen, due to the computer's encryption." Likewise, Professor Kerr characterizes Judge Blackburn's ruling as "not a model of clarity."

I see Craig's point, and I recognize that he doesn't want law enforcement to have carte blanche to peer into one's encrypted volume based solely on reasonable suspicion (or, alternatively, for the suspect to suffer an adverse jury instruction, State v. Levie, 695 N.W.2d 619 (Minn. App. 2005)).

But I wonder whether a helpful analogy (one I can't claim credit for) is the locked safe, within which the police have probable cause to believe a stolen painting has been stored. They don't want to blast the safe open, because the priceless painting may be destroyed. Assume the safe is in defendant's home, the police have a search warrant, the safe belongs to defendant, and he is aware of its presence. Can defendant be compelled to turn over the key to the safe? The answer, I believe, is yes. See Schmerber v. California, 384 U.S. 757 (1966) (blood test evidence, although it may be an incriminating product of compulsion, is neither testimony nor evidence relating to some communicative act or writing by a defendant, and so is not inadmissible on privilege grounds). Now, let's assume it's not a key the police need, but the combination to the safe, which defendant has memorized. Can the defendant now not be compelled to give up the combination? Why not? Because it's a memory? But the Fifth Amendment doesn't afford an unqualified privilege for memories; it affords a privilege to statements that are testimonial in nature. How could defendant's very knowledge of the combination to a safe in his own home be either incriminating or testimonial, regardless of whether the police actually saw the painting deposited therein?
