
Windows System Forensics

Module 0: Introduction to Windows

Learning Objectives
• Understanding some history of desktop computing
• Understanding technology divergence and convergence
• Understanding digital forensics in a rapidly changing space
• Understanding Microsoft Windows from then to now, and into the future
A Brief History of Desktop Computing
The Importance of History

• It’s hard to know where things are going if you don’t know anything
about where they’ve been.
• Looking back at trends and past trajectories can help us predict what
might be coming.
• This is particularly true in science and technology.
– Everything very overtly builds on discoveries and innovations of the past.
– Even obsolete technologies gave rise to something more modern.
The Rise of Desktop Computing

• Desktop computing is over 50 years old.


• Early adoption was low and slow:
– Expensive
– Difficult to use
– Mostly hobbyists at first
• Commercial viability changed this.
• It always does, it seems.
Disk Operating Systems (DOS)

• First evolution of computing away from mainframes and programs stored on physical punch cards (prior to the 1980s)
• Program data could be stored on readable/writable “disk drives”!
– 1974: Control Program for Microcomputers (CP/M) and floppy disks
– 1978: Intel releases the 8086 microprocessor
– 1981: The IBM Personal Computer (based on the 8088 variant)
– 1981: IBM/Microsoft and PC-DOS v1.0
• Incremental but steady improvements and competition throughout the
early 1980s
The Improvement Cycle

• It’s all driven by Layer 8!


– Consumers demand new features and usability improvements.
– This drives software developers to demand higher-performing hardware.
– This forces hardware developers to design/improve technologies.
– Technology revolutions enable software developers to revolutionize.
• And Layer 8 always wants more.
The Rise of the Graphical User Interface (GUI)

• Everyone can agree that for most consumers, pointing and clicking (or
tapping and thumbing) is more intuitive than typing at a terminal.
– Humans are very visual creatures.
– Typing commands to get textual output isn’t very useful for most people.
• Hence the rise of the GUI:
– 1983: Apple Lisa
– 1984: Apple Macintosh
– 1984: The X Window System for Unix
– 1985: Microsoft Windows 1.01
– 1985: Commodore Amiga
– 1987: IBM OS/2
– 1989: NeXTSTEP
• Notice how the cycle-time has compressed and competition increased?
Interlude: Divergence and Convergence
A Brief Look at CPU Architectures

• CPU architectures: 8-bit? 16-bit? 32-bit? 64-bit?


• The “bit-ness” refers to the size of the data and memory addresses the CPU’s instructions can operate on.
– The CPU has to do many things. Among them:
o It fetches instructions from RAM or cache.
o It then decodes and executes the instructions.
o It then stores the results back to RAM or cache.
– A single CPU core can only do one thing at a time.
– These are “cycles”, the speed of which is measured in megahertz (MHz) or gigahertz (GHz).
Pipelining for Performance

• However, with “pipelining” we get better performance.


• By breaking the tasks down into phases, a CPU core can do more per
cycle.
• While it’s executing an instruction it can also be:
– Storing the results of its last instruction
– Fetching/decoding the next instruction
• In practice this is broken down into many more sub-phases.
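The win from pipelining can be put in back-of-the-envelope terms: with a k-stage pipeline, a new instruction can enter the pipe every cycle once it is full, so n instructions take roughly k + (n − 1) cycles instead of k × n. A minimal sketch with illustrative numbers (not a model of any real CPU):

```shell
k=4     # simplified phases: fetch, decode, execute, store
n=100   # instructions to run

# Without pipelining: every instruction walks all k phases alone.
sequential=$((k * n))

# With pipelining: k cycles to fill the pipe, then one instruction
# completes per cycle.
pipelined=$((k + n - 1))

echo "sequential: $sequential cycles"
echo "pipelined:  $pipelined cycles"
```

Real pipelines are complicated by branches, stalls, and dependencies, so the ideal speedup above is an upper bound, not a guarantee.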
Divergence: CISC vs. RISC

• Historically CPU manufacturers had a choice to make.


• Complex Instruction Set Computer (CISC)
– If the set of instructions is large and complex, then executing them can take time
while the next instruction waits to be fetched and decoded.
– CPU performance is the bottleneck.
• Reduced Instruction Set Computer (RISC)
– If the set of instructions is small and simple, then the CPU can execute them
faster than they can be fetched from RAM.
– RAM I/O is the bottleneck.
• Different CPU vendors went in different directions:
– Intel and the x86 architecture are all about CISC.
– RISC is still alive as Advanced RISC Machines (ARM).
Convergence: CISC/RISC Doesn’t Matter

• Which approach is better has changed back and forth with technology improvements over time.
• But it doesn’t seem to matter much anymore:
– Apple migrated from RISC to CISC and back again.
o Moved from PowerPC (RISC) to x86 during the Mac OS X era
o Has since moved everything to Apple Silicon, which is ARM-based
– Windows and Linux both have x86 and ARM versions.
A Final Example of Convergence

• Consider the smartphone market:


– Apple’s iPhone revolutionized the pocket computer.
– At first Android had only a tiny slice of the market.
• The underlying technologies are vastly different!
– But from the consumer’s perspective, does it matter at all?
– Almost every app is available on both, just from different stores.
• And these days does it really matter if I use a Mac, you use Windows,
and our friend runs Linux?
Convergence as a Virtue

• For the consumer all these fluctuations don’t really matter much.
– Layer 8’s interest doesn’t go below OSI layer 9.
– With the modern exception of “is it encrypted or not” no one cares.
– “Please just make it work.”
• Convergence for the consumer is great for business.
– Consumer compatibility is all that matters for sales.
– Platform independence allows for faster/easier innovation.
Digital Forensics in a Rapidly Changing Space
Consumer Convergence and Digital Forensics

• This situation unleashes technology and development to diverge wildly and very rapidly.
– This is true between businesses.
– But it is also true within businesses.
• For the forensics analyst this poses a very difficult problem.
– We have to understand the actual internals at almost every level.
– The structures of memory, filesystems, network protocols, and more dictate how we
conduct our investigations.
• This forces most forensic analysts to have to specialize to keep up with
the rapid changes.
– Memory, mobile, network, and Windows/Linux/macOS forensics are all separate disciplines
for most.
Digital Forensics Convergence

• There are some positive developments in this for us.


• The open-source movement has allowed our tools to improve.
• Prior to the explosion of divergence all we had were narrow choices.
– Almost entirely expensive commercial tools
– Almost entirely platform-specific
• Now we have many better choices.
Free and Open-Source

• We can use most of our core tools across platforms:


– The Sleuth Kit
– Autopsy
– Volatility
• Forensics platforms:
– Tsurugi
– The SIFT Workstation
• Unix/Linux built in tools:
– dd/dcfldd
– filesystem handling
– strings, grep, awk, etc.
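Those built-in tools already cover the classic acquire–verify–triage loop. A minimal sketch (assuming GNU coreutils; the filenames and search term are illustrative — on a real case the input would be a write-blocked device such as /dev/sdb, not a file):

```shell
# Create a small stand-in "evidence" source so the sketch is self-contained.
printf 'boot sector junk\nlogin password=hunter2\nend\n' > evidence.bin

# 1. Acquire: bit-for-bit copy. On failing media you would add
#    conv=noerror,sync to skip bad blocks (padding them with zeros).
dd if=evidence.bin of=evidence.dd bs=4096 status=none

# 2. Verify: the image hash must match the source hash.
sha256sum evidence.bin evidence.dd

# 3. Triage: pull printable strings out of the raw image and search them.
strings evidence.dd | grep -i 'password'
```

dcfldd extends dd with forensics conveniences such as on-the-fly hashing, which collapses steps 1 and 2 into a single pass over the media.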
Microsoft Windows From Then to Now, and into the Future
Sorting Out the Different Versions
Sorting Out the Different Editions

• Even within the different Versions there are different Editions.


• The combination of Version/Edition dictates which features you get.
• Sometimes these don’t make sense at all.
• Windows 7 might have been the worst for this:
– Home Edition didn’t have the built-in backup utilities, though Enterprise did.
– Enterprises would be using a feature-rich 3rd-party infrastructure for this.
– Who needs that feature more than the home user?
• Which Enterprise Editions of Server have Bitlocker support?
• https://en.wikipedia.org/wiki/List_of_Microsoft_Windows_versions
Windows into the Future

• Microsoft is clearly banking on migration to the subscription model.


• This puts a nasty spin on traditional acquisition tools and techniques!
– No disks to image if everything is running out of a browser.
– Local RAM only, but all of it in Chrome’s memory space?
• Some evolving tools for acquiring enterprise data out of the cloud
– But if Azure is your AD?
– And if you forgot or didn’t know to pay for a subscription level to get emails far
enough back in time?
• Basically, we’re at Microsoft’s mercy, and there we will remain.
Module Quiz
