
SHOULDN'T THE WEB BE GETTING FASTER?

Better browsers
Faster fixed-line connections
Single-page apps in the browser

HTTP ARCHIVE
httparchive.org
Based on Alexa top 1,000,000 sites (since Nov '11)
IE8, empty cache, DSL connection, US location
Each URL loaded 3 times, average taken

PAGE LOAD TIMES

(chart: Nov '10 - Nov '12)

TOTAL PAYLOAD SIZE

(chart: Nov '10 - Nov '12)

JAVASCRIPT PAYLOAD SIZE

(chart: Nov '10 - Nov '12)

WHY SO MUCH JAVASCRIPT?


jQuery plugin culture
Polyfills for missing native HTML5 features
'Fat client' architecture - push all rendering duties to the browser
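
Each polyfill adds payload even in browsers that never run it. A minimal sketch of the pattern, assuming `String.prototype.trim` as the illustrative target (IE8, the HTTP Archive test browser, predates ES5's native version):

```javascript
// Extract the logic into a plain function so it can be reused and tested.
function trimPolyfill(s) {
  // Strip leading and trailing whitespace with a regex.
  return String(s).replace(/^\s+|\s+$/g, '');
}

// Only patch the prototype when the native method is missing -
// the code still ships to every browser either way.
if (!String.prototype.trim) {
  String.prototype.trim = function () { return trimPolyfill(this); };
}
```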

WHY 'FAT CLIENT'?


"for performance" - gives a responsive app
"to mimic native OS / store apps"
Transfer of Flash / Silverlight / server-side skillsets to HTML?
Clean separation between JSON services and a 'sealed front-end'
JSON services can be re-used by multiple client platforms - native app, HTML5, 3rd party etc.
"to save money on server costs"

PAGE LOAD SPEED


The number of end-to-end round trips is the main factor

LOADING SEQUENCE - FAT CLIENT


The following sequence has to complete before the user sees any content:
1. HTML
2. JavaScript download and execute
3. JSON download / parse
4. Client-side templating
5. I see it!

SERVER-SIDE RENDERED APP


1. HTML (can be flushed in sections)
2. See it!
3. JS-enhanced functionality downloads and bootstraps

SERVER SIDE RENDERED APP - NOTES


HTML can be flushed in sections (chunked encoding)
Perceived speed: content becomes visible quickly, and the user takes a moment to react, giving time for functionality to load
The initial JS payload is lighter because it's doing less; other JS can be loaded on demand or predictively

TWITTER.COM
Sept 2010 - Twitter relaunches with a fat client architecture:
"a new architecture almost entirely in JavaScript. We put special emphasis on ease of development, extensibility, and performance"
http://engineering.twitter.com/2010/09/tech-behind-new-twittercom.html

May 2012 - Twitter reverts to server-side rendering:


"... which dropped our initial page load times to 1/5th of what they were previously ... the raw parsing and execution of JavaScript [on the old site] had caused massive outliers in perceived rendering speed [across browsers] ... The bottom line is that a client-side architecture leads to slower performance because most of the code is being executed on our users' machines rather than our own."
http://engineering.twitter.com/2012/05/improving-performance-on-twittercom.html

RUNNING CODE IN THE BROWSER IS RISKIER


"Browsers are the most hostile software development environment imaginable" - Douglas Crockford

ENVIRONMENT
Huge variations in browser capabilities
Variations in device CPU / JavaScript performance
3rd-party code / extensions can mess with ours
Connections vary in speed and reliability
If there's an unhandled JS error - from our code or any other on the page - the user may see nothing

http://blogs.wsj.com/digits/2011/02/07/gawker-outage-causing-twitter-stir

CAN PROGRESSIVE ENHANCEMENT DO THE JOB?

Problem - templates duplicated on server and client
Solution - use a partial server template for each replaceable unit, then generate the client-side template by rendering that partial with {{}} placeholders instead of data
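
A minimal sketch of that solution, assuming Mustache-style {{}} placeholders; `tweetPartial` and `toClientTemplate` are hypothetical names:

```javascript
// One server-side partial for a replaceable unit. Render it with real
// data for the server-side page, or with placeholder tokens to emit
// the client-side template - no duplicated markup.
function tweetPartial(data) {
  return '<li class="tweet"><b>' + data.author + '</b> ' + data.text + '</li>';
}

// Substitute '{{field}}' tokens for every field the partial expects.
function toClientTemplate(partial, fields) {
  var placeholders = {};
  fields.forEach(function (f) { placeholders[f] = '{{' + f + '}}'; });
  return partial(placeholders);
}

var serverHtml = tweetPartial({ author: 'alice', text: 'hello' });
// -> '<li class="tweet"><b>alice</b> hello</li>'
var clientTemplate = toClientTemplate(tweetPartial, ['author', 'text']);
// -> '<li class="tweet"><b>{{author}}</b> {{text}}</li>'
```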

Challenge - we want JSON services that can be consumed by a native app, HTML app, Java client etc.
The same JSON services available to fat clients such as native apps can also be consumed by a lightweight server-side app that renders the data into templates

MYTHS ABOUT PROGRESSIVE ENHANCEMENT


"It's for content sites, not apps"
"It's slower and means repeated page loads" (use pushState!)
"It's outdated; with modern browsers it's redundant"
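
The pushState answer to "repeated page loads" can be sketched as below. This assumes a modern browser (`fetch`, `closest`); the names and the same-origin check in `shouldIntercept` are illustrative. Without JS, or in older browsers, the links simply trigger normal full page loads:

```javascript
// Pure helper: only intercept same-origin page links when the
// History API exists; otherwise fall back to a full page load.
function shouldIntercept(href, hasPushState) {
  return Boolean(hasPushState) && /^\/[^/]/.test(href);
}

// Browser-only wiring, guarded so the helper stays testable elsewhere.
if (typeof document !== 'undefined') {
  document.addEventListener('click', function (e) {
    var a = e.target.closest && e.target.closest('a');
    if (!a || !shouldIntercept(a.getAttribute('href'), window.history.pushState)) return;
    e.preventDefault();
    // Load just the changed fragment, then update the address bar.
    fetch(a.href)
      .then(function (res) { return res.text(); })
      .then(function (html) {
        document.querySelector('main').innerHTML = html;
        history.pushState({}, '', a.href);
      });
  });
}
```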

THE REALITY (I THINK)


Faster page loads
More robust
More device-independent
