Digital forensics research: The next 10 years
GABRIEL BASS MAT# 21826002

ABSTRACT: This article summarizes current forensic research directions and argues that, to move forward, the community needs to adopt standardized, modular approaches for data representation and forensic processing.

INTRODUCTION: Developments in forensic research, tools, and process over the past decade have been very successful, and many in leadership positions now rely on these tools on a regular basis—frequently without realizing it. Without a clear research agenda aimed at dramatically improving the efficiency of both our tools and our research process, our hard-won capabilities will be degraded and eventually lost in the coming years. This paper proposes a plan for achieving that dramatic improvement in research and operational efficiency through the adoption of systematic approaches for representing forensic data and performing forensic computation.

BRIEF HISTORY OF DIGITAL FORENSICS

EARLY DAYS: Digital forensics (DF) is roughly forty years old. What we now consider forensic techniques were developed primarily for data recovery. Forensics was largely performed by computer professionals who worked with law enforcement on an ad hoc, case-by-case basis. Evidence left on time-sharing systems frequently could be recovered without the use of recovery tools. The FBI started a "Magnetic Media Program" in 1984, but performed examinations in only three cases during its first year. Prior to the passage of the Computer Fraud and Abuse Act of 1984, computer hacking was not even a crime, further limiting the need to subject systems to forensic analysis. This early period shows the absence of formal process, tools, and training.

GOLDEN AGE OF DIGITAL FORENSICS: The years from 1999 to 2007 were a kind of "Golden Age" for digital forensics. Forensics became so widespread and reliable that it escaped from the lab and onto the TV screen, creating the so-called "CSI Effect." This Golden Age was characterized by the widespread use of Microsoft Windows, and specifically Windows XP. Universities around the world started offering courses in DF: today there are 14 schools offering certificate programs in DF, 5 offering associate's degrees, 16 bachelor's programs, 13 master's programs, and two doctoral programs, according to the Digital Forensics Association (2010). The period was also marked by rapid growth in DF research and professionalization.

DIGITAL FORENSICS IS FACING A CRISIS: Similar problems with diversity and data extraction exist with telecommunications equipment, video game consoles, and even eBook readers. Yet all of these systems have been used as instruments of crimes and may contain information vital to investigations. Our inability to extract information from devices in a clean and repeatable manner also means that we are unable to analyze these devices for malware or Trojan horses. Encryption and cloud computing both threaten forensic visibility—and both in much the same way.

TODAY'S RESEARCH CHALLENGES

EVIDENCE-ORIENTED DESIGN: There are two fundamental problems with the design of today's computer forensic tools. First, today's tools were designed to help examiners find specific pieces of evidence, not to assist in investigations. Second, today's tools were created for solving crimes committed against people where the evidence resides on a computer; they were not created to assist in solving typical crimes committed with computers or against computers. They were created for finding evidence where the possession of evidence is the crime itself. As a result, today's tools can (sometimes) work with a case that contains several terabytes of data, but they cannot assemble those terabytes into a concise report.

VISIBILITY, FILTER AND REPORT MODEL: Most of today's DF tools implement the same conceptual model for finding and displaying information. This approach may be termed the "Visibility, Filter and Report" model: the tool starts at a root, makes the data objects reachable from that root visible to the examiner, lets the examiner filter what is displayed, and reports on what is found. Examples of roots include the partition table of a disk, the root directory of a file system, a critical structure in kernel memory, or a directory holding evidence files. Examples of data objects include files, network streams, and application memory maps. Because files are recovered before they are analyzed, certain kinds of forensic analysis are significantly more computationally expensive than they would be with other models.

THE DIFFICULTY OF REVERSE ENGINEERING: DF engineering resources are dedicated to reverse engineering hardware and software artifacts, but researchers lack a systematic approach to reverse engineering: there is no standard set of tools or procedures.

NEW RESEARCH DIRECTION: DF research needs to become dramatically more efficient, better coordinated, and better funded if investigators are to retain significant DF capabilities in the coming decade. The key to improving research is the development and adoption of standards for case data.

CONTRIBUTION: Increase the number of annual competitions in open-source tool development to encourage practitioners to solve problems in emerging areas, and create a forensic tool marketplace where the latest tools, their known issues and bugs, tests, and testing datasets are shared.

CONCLUSION: Digital forensics research needs new abstractions for data representation and forensic processing. Funding agencies will need to adopt standards and procedures that use these abstractions. This is probably one of the few techniques at our disposal for surviving the coming crisis.
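The standards for case data called for here, combined with the Visibility, Filter and Report model of forensic processing, can be sketched in miniature. The sketch below is illustrative only: the `Artifact` record, its field names, and the three pipeline functions are assumptions for the sake of the example, not a published forensic standard or the API of any existing tool.

```python
from dataclasses import dataclass, asdict
import fnmatch
import json

# Hypothetical standardized record for a recovered data object.
# Field names are illustrative, not an established forensic schema.
@dataclass
class Artifact:
    path: str   # where the object was found under the chosen root
    kind: str   # e.g. "file", "network-stream", "memory-map"
    size: int   # size in bytes

def visibility(root: dict) -> list:
    """Visibility: enumerate the data objects reachable from a root."""
    return [Artifact(**rec) for rec in root["objects"]]

def filter_step(artifacts: list, pattern: str) -> list:
    """Filter: narrow the display to objects matching a pattern."""
    return [a for a in artifacts if fnmatch.fnmatch(a.path, pattern)]

def report(artifacts: list) -> str:
    """Report: emit matched objects in a standard, tool-neutral form."""
    return json.dumps([asdict(a) for a in artifacts], indent=2)

# Example: a tiny mock "root" standing in for a mounted file system.
root = {"objects": [
    {"path": "docs/ledger.xls", "kind": "file", "size": 48640},
    {"path": "pagefile.sys",    "kind": "file", "size": 1073741824},
]}

print(report(filter_step(visibility(root), "docs/*")))
```

The point of a shared record format like this is that one tool's report can become another tool's input, which is the kind of modular, standardized data representation the paper argues the community needs.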