
Proposed 1999 DARPA Off-line Intrusion Detection Evaluation Plans

Richard Lippmann
MIT Lincoln Laboratory
28 Jan 99
Overview of Last Year’s 1998 Evaluation

• Last Year’s 1998 Evaluation Was the First Comprehensive, Realistic Evaluation of Intrusion Detection Systems
• Extremely Successful
  – Results Focused Research and Highlighted Current Capabilities and Recent Advances
  – Procedures Worked Well; Many Participants Were Successful Using Different Approaches
  – We Now Have the First Large Baseline Corpus of Background Traffic and Attacks for Algorithm Development and Testing
• Only Minor Procedural Changes Are Necessary for 1999

Best Combination System from 1998
Evaluation Compared to Keyword Baseline
[ROC curve: attack detection rate (%, 0-100) versus false alarm rate (%, log scale from 0.001 to 100), comparing the best combination system to the keyword baseline]
• More Than Two Orders of Magnitude Reduction in False Alarm Rate with Improved Detection Accuracy; Most Errors Are on New Attacks
• Keyword Baseline Performance Is Similar to That of Commercial and Government Keyword-Based Systems
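As a rough illustration of how each point on a detection/false-alarm curve like the one above can be computed, here is a minimal Python sketch. The session IDs, scores, and threshold are invented for illustration and are not taken from the evaluation:

```python
def roc_point(alerts, attack_sessions, total_sessions, threshold):
    """Score (session_id, score) alerts against known attack sessions.

    Returns (detection rate %, false alarm rate %) at one threshold;
    sweeping the threshold traces out the full curve.
    """
    flagged = {sid for sid, score in alerts if score >= threshold}
    detections = len(flagged & attack_sessions)
    false_alarms = len(flagged - attack_sessions)
    normal_sessions = total_sessions - len(attack_sessions)
    return (100.0 * detections / len(attack_sessions),
            100.0 * false_alarms / normal_sessions)

# Invented example: 100 sessions, 2 of them attacks.
alerts = [("s1", 0.9), ("s2", 0.8), ("s3", 0.4), ("s4", 0.7)]
print(roc_point(alerts, {"s1", "s2"}, total_sessions=100, threshold=0.5))
```

Lowering the threshold flags more sessions, trading a higher detection rate for more false alarms, which is why results are reported as a curve rather than a single number.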
Best Systems in 1998 Evaluation Didn’t
Accurately Detect New Attacks
[Bar chart: percent of attacks detected (0-100) for old versus new attacks in each category - Probe (14 old, 3 new), DoS (34, 9), U2R (27, 11), R2L (5, 17)]

• Systems Generalize Well to New Probe and User-to-Root Attacks, but Miss New Denial-of-Service and Remote-to-Local Attacks
• Basic Detection Accuracy for Old Attacks Must Also Improve
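A per-category old-versus-new breakdown like the chart above can be tallied directly from labeled results. This Python sketch uses invented attack records, not the evaluation's data:

```python
from collections import defaultdict

def detection_rates(results):
    """results: iterable of (category, is_new, was_detected) records.

    Returns {(category, 'old' or 'new'): percent detected}.
    """
    counts = defaultdict(lambda: [0, 0])          # key -> [detected, total]
    for category, is_new, detected in results:
        key = (category, "new" if is_new else "old")
        counts[key][0] += int(detected)
        counts[key][1] += 1
    return {key: 100.0 * hit / total for key, (hit, total) in counts.items()}

# Invented records: (category, is_new, was_detected)
sample = [("PROBE", False, True), ("PROBE", True, True),
          ("R2L", False, True), ("R2L", True, False), ("R2L", True, False)]
print(detection_rates(sample))
```

Splitting the tally by the old/new flag is what exposes the generalization gap: a system can score well overall while detecting almost no new attacks in a category.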


Goals of 1999 Evaluation

• Measure Ability of Intrusion Detection Systems to Detect New Attacks
  – Extend Beyond Signature-Based Approaches
  – This Requires Many New Attacks
• Add NT Workstation Victim to the Simulation Net
  – NT Traffic and Attacks
• Add Insider Attacks
• Provide Selected File System Dumps
  – Provide Important Components from File Systems of Five Victims Each Night (Includes NT Audit Logs)
• Provide Inside Sniffing Data

Last Year’s 1998 Simulation Network
[Diagram of the 1998 simulation network: an “inside” network (172.16 - eyrie.af.mil) connected through gateways and a Cisco router to an “outside” network with many IP addresses. Outside machines emulate PCs, workstations, and web servers; inside victim hosts run Linux, SunOS, and Solaris on 486 and Sparc/Ultra hardware. An audit host provides disk dumps and audit data, and a sniffer collects sniffed traffic data.]
New Features in 1999 Evaluation
[Diagram of the 1999 simulation network, annotated with four new features: 1. NT systems, traffic, and attacks; 2. insider attacks launched from the inside network; 3. selected file system dumps from the victims; 4. inside sniffing data from a second sniffer, in addition to the existing outside sniffing data.]
Major New Features for 1999

• NT Workstation
  – NT Traffic and Attacks
• Insider Attacks
• Selected File System Dumps
  – Provide Important Components from File Systems of Five Victims Each Night (Includes NT Audit Logs)
  – We Will No Longer Provide Complete File System Dumps
• Provide Inside Sniffing Data

Minor Changes for 1999

• Simplify Scoring Procedure
  – No Longer Use List Files
  – Evaluate Attack Detection and Identification Separately
• Add a New Attack Category: “Data Compromise”
• Last Week of Training Data Contains No Attacks, to Assist in Training Anomaly Detectors
• No Longer Provide ASCII praudit BSM Audit Data
• No Longer Provide psmonitor Output
• Provide Security Policy for Eyrie Air Force Base
• Fewer Overlapping Attacks in Training/Test Data
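The separate detection and identification scoring mentioned above could work roughly as follows. This Python sketch is an illustrative assumption, not the official scoring code; the record format and attack names are invented for the example:

```python
def score(ground_truth, submission):
    """ground_truth / submission: dicts of attack instance id -> attack name.

    Detection credits any flagged true instance; identification also
    requires the submitted attack name to match the true one.
    """
    detected = ground_truth.keys() & submission.keys()
    identified = {i for i in detected if submission[i] == ground_truth[i]}
    n = len(ground_truth)
    return {"detection_pct": 100.0 * len(detected) / n,
            "identification_pct": 100.0 * len(identified) / n}

truth = {"a1": "neptune", "a2": "pod", "a3": "guest"}
# "a3" is missed entirely; "a2" is detected but labeled with the wrong attack.
print(score(truth, {"a1": "neptune", "a2": "smurf"}))
```

Separating the two measures credits a system that notices an attack without naming it correctly, which a single combined score would penalize.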

Proposed Time Line for 1999 Evaluation

[Timeline from July 1998 to January 2000, with important dates:]
• Collect Data on Hanscom and Create New Attacks: ongoing
• Deliver 4 Weeks of Training Data: May 1 - June 1
• Pre-Test: July 12
• Deliver 2 Weeks of Test Data: August 2
• Analyze Returned Data: September
• Evaluation Workshop / PI Meeting: October - November

Provide Feedback and Suggestions

• Send Email to jhaines@sst.ll.mit.edu
• For More Information, Look at the Lincoln Laboratory Intrusion Detection Evaluation Web Site
  – ADDRESS: ideval.ll.mit.edu
  – USER: ideval
  – PASSWORD: daRpa98!

