The value of the Internet lies in
the information that can be found through it, and as everyone who has gone
online can attest, the Internet is a seemingly infinite fount of
information. When using our favorite websites it is easy to take this wealth of information for granted, but the sheer amount of data these sites have to process and store is staggeringly large.
To get a better understanding of
the scale of the data backing our favorite sites, we can look at the ubiquitous
Facebook photo. More than 250 million pictures are uploaded to Facebook every day [1], and there are approximately 140 billion photos stored there in total. To put that number into quasi-comprehensible form: that is 10,000 times the number of photos in all of the books in the Library of Congress, and 4% of all photographs ever taken since the dawn of photography [2]. And while we lazily flip through our friends' Facebook photos, Facebook's datacenters are serving well over 600,000 photos a second [3].
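A little back-of-the-envelope math makes these figures more concrete. The sketch below is plain Python using only the numbers quoted above; everything else about it is illustrative:

    # Back-of-the-envelope math using only the figures quoted above.
    SECONDS_PER_DAY = 24 * 60 * 60   # 86,400

    uploads_per_day = 250_000_000       # daily uploads, per source [1]
    total_photos    = 140_000_000_000   # total photos stored
    served_per_sec  = 600_000           # photos served per second, per source [3]

    uploads_per_sec = uploads_per_day / SECONDS_PER_DAY
    print(f"uploads per second:       ~{uploads_per_sec:,.0f}")   # ~2,894

    # How long would it take to rebuild the whole archive at today's upload rate?
    days_to_rebuild = total_photos / uploads_per_day
    print(f"days to rebuild archive:  ~{days_to_rebuild:,.0f}")   # ~560 (about 1.5 years)

    # Reads dwarf writes: every uploaded photo is served back out many times over.
    print(f"photos served per upload: ~{served_per_sec / uploads_per_sec:.0f}")  # ~207

In other words, even at nearly 3,000 uploads a second, it would take about a year and a half to refill the archive from scratch, and each uploaded photo gets served back out roughly two hundred times over.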
These are very big numbers, and the datacenters that store this data are correspondingly gigantic. Facebook's Prineville datacenter, for example, covers 307,000 square feet and houses 60,000 servers. A quick look at Wolfram Alpha will tell you that that is almost 5.5 football fields' worth of data storage capacity.
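That comparison is easy to sanity-check yourself. In the snippet below, the field size is my assumption (a full American football field, end zones included, is 360 ft by 160 ft); counting only the field of play would push the answer from roughly 5.3 up to about 6.4 fields:

    # Rough check of the football-field comparison. Field size is an assumption:
    # a full American football field, end zones included, is 360 ft x 160 ft.
    datacenter_sqft = 307_000        # Prineville floor area, from the post
    field_sqft      = 360 * 160      # 57,600 sq ft per field (assumed)

    print(f"football fields: ~{datacenter_sqft / field_sqft:.1f}")  # ~5.3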
However you count it, that's pretty ridiculous, and it's also the reason that America's largest photo library is no longer a building full of stacks but a warehouse full of servers.
So the next time you are looking at pictures on Facebook and complaining about how slowly they load (as they have been lately), take a moment to think about the sheer amount of data coursing through their system and the humongous datacenters that work tirelessly to provide you with something approximating a seamless user experience.
Sources:
[1] http://www.facebook.com/press/info.php?statistics
[2] http://www.datacenterknowledge.com/archives/2009/10/13/facebook-now-has-30000-servers/
[3] http://www.datacenterknowledge.com/archives/2009/10/13/facebook-now-has-30000-servers/