
Overview of TACC

Victor Eijkhout, Susan Lindsey

Fall 2019

Your instructors

Victor Eijkhout and Susan Lindsey work at the Texas Advanced Computing Center.

So where is TACC?

How do you get to TACC?

Pickle Campus
Formerly the Balcones Research Center, location of some of the best wildflowers in Austin.

James Jarrell ‘Jake’ Pickle

• 1913–2005; congressman, 1963–1995
• US Navy during WW II
• important role in the Civil Rights Act and Social Security reform

TACC

• Started in 2001 with 10-ish people, now 140.
• UT has had computing centers before; in 2001 TACC became an independent unit that falls under the VP for Research.
• First major supercomputer in 2008: Ranger.
• Currently: Frontera, #5 in the world and the largest academic computer anywhere, and Stampede2, #19 in the world.

TACC now

• 140-ish people, divided into Systems, High Performance Computing, Computational Biology, Big Data, Machine Learning, Visualization, Outreach (and more) groups
• 15 platforms
• 1000 projects, 200 public data collections
• 30 web portals with 35k users
• new 10 MW data center
• second new building in 10 years

Stampede1, now retired

(Want to guess how much a computer costs? How long it stays operational?)

Frontera

• 91 racks with 8008 nodes; each node has two 28-core Intel Cascade Lake processors (core-count sketch below)
• 60 Pbyte of storage, of which 3 Pbyte is flash
• Two GPU subclusters
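
For a sense of scale, the numbers above already fix the core count; a minimal Python sketch, using nothing beyond this slide's figures:

    # Core count for Frontera from the slide's figures:
    # 8008 nodes, each with two 28-core Cascade Lake processors.
    nodes = 8008
    sockets_per_node = 2
    cores_per_socket = 28
    total_cores = nodes * sockets_per_node * cores_per_socket
    print(f"Frontera compute cores: {total_cores:,}")  # -> 448,448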

Front view

Frontera processors

Network switch

Overhead cabling

Stampede2

• TACC's second biggest machine: cost $30M (per-node estimate below)
• 4000 nodes with Intel 'Knights Landing' Xeon Phi processors;
  1700 nodes with two Skylake server processors each
• 75 miles of cabling, up to 4.5 MW of power
• TACC's machines are popular and reliable:
  Stampede1 served 5000 users, was up 98% of the time, and ran 8 million jobs over its lifetime.
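
That also gives a rough answer to the earlier question of what a computer costs; a back-of-the-envelope sketch using only this slide's figures (the real price also covers storage, networking, and facilities):

    # Rough per-node cost for Stampede2:
    # $30M over 4000 KNL nodes + 1700 Skylake nodes.
    total_cost_usd = 30_000_000
    nodes = 4000 + 1700
    print(f"~${total_cost_usd / nodes:,.0f} per node")  # -> ~$5,263 per node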

Stampede2

Stampede cabling

Lonestar5
Our Cray

Maverick2
GPU machine

By the way, this picture is not sideways: the machine hangs in a bath of mineral oil (OK, slightly better than HEB's $1-a-bottle variety).

Hikari

Hikari water cooling

Big data

• Wrangler: big data machine with lots of SSDs
• Rustler: Hadoop cluster
• Stockyard: 20 Pbyte of spinning disk, shared between all clusters
• Ranch: 50 Pbyte of tape

Wrangler

Clouds

• Rodeo: mostly internal use
• Chameleon: cloud research
• Jetstream: for educational use

Jetstream

Catapult
Microsoft FPGA machine learning platform

Stockyard
Mass storage

Visualization lab (POB)

A picture is worth a thousand…
Our graphics people can help you understand your results (and sell your research) through high-quality visualization.

Typical academic customer

Non-typical academic customer

Typical non-academic customer

We’re very hands-on

Student activities: REU (Research Experiences for Undergraduates)

Student cluster competition

Outreach: Code at TACC

We keep growing

Our new building

TACC is a good place to be
