Websites like Facebook, Google, and Twitter are seeing unprecedented growth in concurrent active users and in volumes of data in the form of photos and videos. This level of demand for scalability defies our collective experience from the past.
For example, Facebook had 150 million active users in January 2009. By September of the same year it had reached 300 million, and it is currently over 400 million. Back in 2006, they had one data center in the Bay Area. In 2008, they added a new one in Virginia, and they will soon add a third in Oregon. Users download 3 billion photos a month, part of 200 terabytes of live data. They run over 60,000 servers to serve this rapidly growing community. All this was described by their engineer Tom Cook at Velocity 2010 last week in a talk titled “A Day in the Life of Facebook Operations”.
Operational aspects are often invisible to the user community. The assumption is that somehow it all works, until some breakdown or failure makes headlines. In reality, it takes a huge amount of innovation and discipline to run these operations. Unglamorous work like configuration management, version control, early optimization, failure management, instrumentation, and automated tooling requires tremendous focus. Google spends a great deal of money and talent keeping its operation efficient. So does Facebook. At the same conference, my friend James Hamilton (we were at IBM years back) gave an interesting talk on “Datacenter Infrastructure Innovation”. James is currently a VP and distinguished engineer at Amazon, after working at Microsoft for a number of years. He identifies the top cost components of a data center and where innovation can yield significant savings.
As more and more cloud service providers face these challenges, they would do well to study how these pioneers at Facebook, Amazon, and Google are charting new courses for extreme scalability.
I am quoting from a recent article:
“We have become fantastic multitaskers, well-suited for tasks like text-messaging seven people simultaneously or trading stock. Meanwhile, our attention spans are evaporating. Studies have shown that younger generations are getting worse at eye contact and detecting nonverbal cues. Are we unwittingly pushing ourselves down the autism spectrum?
The average American now watches more than 1,800 hours a year of television, yet 80% have not read a book in the last year. It’s beautiful irony that we created a culture that likes watching authors be interviewed on TV, yet doesn’t like reading the books they write.
Maybe deep thinking no longer matters. Maybe we are a culture that prefers to be entertained rather than informed. Maybe everything that is important can be said in 140 characters. Or, maybe we should stop and think about whether we want to live in the world we dreamed up.
Consider this: 42% of college graduates will never read another book for the rest of their lives.
As publishers, this will transform our businesses — and as a society, harm it irreparably.”
It is sobering to see the changing behavior of young people growing up with “always-on” devices and the Facebook culture. Gone are the simple pleasures of reading a book, writing your thoughts in longhand, or spending time with friends and family without constantly staring at your iPhone. Linda Stone called this phenomenon Continuous Partial Attention (CPA).
I have also heard that the sudden overflow of information among long-lost friends can get tiresome. There is some charm in being “unavailable” for a while; the eventual meeting becomes that much more interesting.
My hope is that this is cyclic and that we will return to old-fashioned habits like reading books and enforcing “disconnected” time for our sanity. The current euphoria of easy connectivity and social networking will correct itself from over-indulgence. There is a special charm in reading printed words on paper rather than staring at an LCD screen for hours.
I just read an insightful article by Bill Buxton of Microsoft Research. He talks about how long it takes for a new idea to mature and surface as a product. He cites the example of the mouse, which started as an idea back in 1965 and did not reach mass popularity until Windows 95 in 1995. In between, it went through various stages of augmentation and refinement. This process typically spans 20 to 30 years.
Take the example of the relational database, an idea first surfaced by the late Ted Codd back in 1969-70. It took roughly twenty years of refinement and productization before it became a billion-dollar business. The same is true of RISC technology, which took some 30 years.
I like his statement about appreciating this process of refinement, augmentation, and goldsmithing. This is what he says:
The heart of the innovation process has to do with prospecting, mining, refining, and goldsmithing. Knowing how and where to look and recognizing gold when you find it is just the start. The path from staking a claim to piling up gold bars is a long and arduous one. It is one few are equipped to follow, especially if they actually believe they have struck it rich when the claim is staked. Yet the true value is not realized until after the skilled goldsmith has crafted those bars into something worth much more than its weight in gold. In the meantime, our collective glorification of and fascination with so-called invention—coupled with a lack of focus on the processes of prospecting, mining, refining, and adding value to ideas—says to me that the message is simply not having an effect on how we approach things in our academies, governments, or businesses.
Hence a “new” technology of today, such as the touch-screen interface of the iPhone, is already ten years old and will take another ten years to reach maturation through various products. It may well become the new user interface, replacing the mouse.
Credit must go to the subsequent teams that take an idea through refinement and productization, a highly non-trivial task. Perhaps we should recognize such groups more visibly by highlighting their contributions.