Stacey Higginbotham wrote a thought-provoking article last week entitled “The Elephant in the Gigabit Network Room”. In it, she talks about how providers are starting to bring gigabit connectivity to residential areas at prices in the $200-$300 range. She also argues that this is overkill for most customers: many devices today can’t reach sustained transfer rates above 500 Mbps, and the majority of the content being consumed comes from low-speed, bandwidth-light services like Twitter. She goes on to say that while there may be applications for gigabit broadband, they are few and far between right now and don’t justify the cost when something like a 25 Mbps downstream cable modem would suffice just as well.
Allow me to disagree here.
I think one of the reasons this article sounded flawed to me is that it seems based on the idea that people still use one computer at a time. The more I thought about it, the more I realized that the supposition is correct as far as it goes: gigabit residential service for a single machine is indeed overkill. That’s where my opinion diverges, though. I would argue that today’s residential networks are starting to resemble small enterprise networks with regard to bandwidth usage.
Think about all the things that you are doing with your home networks right now. Sure, there’s a fair amount of low-bandwidth web surfing going on. We use Twitter and Facebook to post status updates. We check email. We look up things on Wikipedia to win Internet arguments. If that was it, I would say that even 100 Mbps or 25 Mbps service would be more than you’d ever need. But go deeper. We now use Netflix to stream movies to our televisions. We use iTunes to download content to all manner of devices. Hulu, Boxee, and Vudu are all clamoring for attention and bandwidth. Even simple BitTorrent transfers can suck up an entire pipe. Now imagine all of this coupled with the blah blah cloud services coming down the pipe. We even use cloud-ish services today. Gigabytes of pictures uploaded to Picasa and Flickr. Video uploaded to YouTube and Vimeo. Music streaming coming from Google, Amazon, Apple, and anyone else with a handheld device with a headphone jack. We can even run our household phone system over the Internet. Not to mention FaceTime, Telepresence, and all manner of real-time video communications. Sounds to me like that little cable modem is starting to get a bit crowded.
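To put some rough numbers on that crowding, here’s a back-of-the-envelope tally of what a busy household might demand at once. Every per-service rate below is an assumption for illustration, not a measurement:

```python
# Back-of-the-envelope tally of concurrent household bandwidth demand.
# All per-service rates are rough assumptions, not measurements.
services_mbps = {
    "Netflix HD stream": 5,
    "Second HD stream (Hulu/Vudu)": 5,
    "Music streaming": 0.3,
    "Photo upload to Flickr/Picasa": 10,
    "Video upload to YouTube": 10,
    "BitTorrent transfer": 20,
    "VoIP phone call": 0.1,
    "Video chat (FaceTime-style)": 1,
    "Web browsing / email / Twitter": 2,
}

total = sum(services_mbps.values())
for name, rate in services_mbps.items():
    print(f"  {name}: {rate} Mbps")
print(f"Concurrent demand: {total:.1f} Mbps")
```

Even with these modest guesses, the total comfortably blows past a 25 Mbps cable modem the moment a few things happen at once.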
Another argument against gigabit networking is the inability of devices to use the full bandwidth. Specifically, the lack of gigabit wireless networking is pointed out in the article. Right now, she’s right. However, with 802.11ac coming down the pipe and WiGig arriving in the 60 GHz spectrum, I think it’s better to have the broadband infrastructure in place sooner rather than later. In the article, it is stated that a generic laptop only hit 420 Mbps downstream in a test. Okay, so with a little optimization we could probably hit 600 Mbps easily. Did they test several sites to be sure it wasn’t a transit network issue? Did they pull from a close FTP server with a high-speed backbone? Or were they clocking Windows Update? Most machines will eat any amount of bandwidth you throw at them. Even if you peaked at 500 Mbps out of the box, that’s still 5 times faster than a 100 Mbps network. Think about what would happen in your enterprise if you granted users the ability to run gigabit all the way to the desktop. Files could be transferred faster internally. Content could be pushed with little effort. Imagine again what might happen if you then brought those same users back down to 100 Mbps. You’d have a mutiny on your hands. When driving on the highway, 80 MPH only seems fast when you first get going. Once you’ve been cruising there for a while, 60 MPH seems like a standstill. I think that even half a gigabit per machine is still amazingly fast, especially when that pipe starts getting crowded as I’ve outlined above.
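The “5 times faster” point is easy to make concrete with simple line-rate arithmetic. A quick sketch, using an assumed 4 GB download (roughly a DVD image) and idealized rates with no protocol overhead or congestion, so real transfers will be slower:

```python
# Idealized transfer times at different line rates.
# Assumes a 4 GB (decimal) file and zero protocol overhead.
def transfer_seconds(size_gigabytes: float, rate_mbps: float) -> float:
    bits = size_gigabytes * 8 * 1000**3   # decimal GB -> bits
    return bits / (rate_mbps * 1000**2)   # Mbps -> bits per second

file_gb = 4.0  # assumed file size for illustration
for rate in (25, 100, 500, 1000):
    minutes = transfer_seconds(file_gb, rate) / 60
    print(f"{rate:>5} Mbps: {minutes:.1f} minutes")
```

The same file that ties up a 25 Mbps cable modem for over twenty minutes clears a half-gigabit link in about a minute. Once you’re used to the latter, going back feels like that 60 MPH standstill.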
The final argument is that there is no killer app that necessitates paying such high fees for gigabit service. One service the author discusses is online backup. This, however, is dismissed as being too infrequent to be useful to a customer paying a monthly charge. Let me ask this of you out there: how crazy did the idea of downloading music over the Internet seem when the fastest connection we could muster was 56k? How about watching movies in our house solely over the Internet when 128k ISDN was the fastest kid on the block (and exorbitantly priced for its time, too)? Why code an app if you know it can’t work to its fullest potential today? What about continuous online backup? If you’ve already got the pipe to handle it, why not keep a running backup of your files out in the blah blah cloud? HD streaming video to multiple devices simultaneously? What about the burgeoning website designs that seem to be taking more and more bandwidth every day, with Flash landing pages, Flash ads, Shockwave menus, and more? If we start running gigabit to our houses, I can promise you that there will be apps written to take advantage of those big fat pipes.
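Continuous backup is a good example of an app that only becomes practical once the pipe exists. A rough sketch of how much data each line rate can move per hour (idealized, full line rate, no overhead):

```python
# How much data fits through a pipe in one hour at full line rate.
# Idealized: no protocol overhead, no sharing with other traffic.
def gb_per_hour(rate_mbps: float) -> float:
    bits_per_hour = rate_mbps * 1000**2 * 3600
    return bits_per_hour / 8 / 1000**3   # bits -> decimal gigabytes

for rate in (25, 100, 1000):
    print(f"{rate:>5} Mbps: {gb_per_hour(rate):.2f} GB/hour")
```

At 25 Mbps, seeding a few hundred gigabytes of photos and video to an online vault is a multi-day ordeal; at gigabit speeds it fits into an overnight window, and after that a continuous trickle of changes is barely noticeable.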
Yes, running a gigabit pipe into my house would probably be overkill right now. Despite my protestations to the contrary, my wife realizes that I don’t need the ability to instantly download anything and everything on the Internet. But I also see that as we start placing more and more content and information outside of our computers and in the blah blah cloud, we’re going to get very impatient to get that content quickly. HD video, 27-megapixel images, and enough MP3s to sink an aircraft carrier, all stored somewhere in an online vault, and we have to have it NAO! Just because 100 Mbps would do anyone just fine today doesn’t mean that there isn’t a market for gigabit residential service. It’s like saying that just because we can only drive 65-75 MPH on the highway, there’s no need for sports cars that can do 130. Someone out there will find a use for it if it’s available. If nothing else, the blah blah cloud providers should be championing the fastest available connections so that we start storing everything we have with them. That way, we don’t have to spend so much time worrying about where our stuff is being stored. We just click it and go.
This definitely touches on a debate I am having about getting 10 Gig at work. While I can’t get management to drop the money on 10 gig line cards and SFPs, they are willing to put in fiber until they turn blue in the face. So as a test I took a 3750X, did a 4 gig EtherChannel up to a core switch, and gave gig down to the desktop. We also have dual gig connections to the Internet. The point in all of this being that while we definitely don’t use the entire pipe at any given time, the latencies are incredibly low. Since network traffic is so bursty, the fact that one user can get their packets across the network that much faster has made a world of difference. Sure, local file transfers and such are great, but Internet traffic is where most people care, right? (As a network engineer, never break the Internet; break everything else first 😉 ) The overall effect is that the Internet “feels” much faster since the latencies stay incredibly low. That’s something else to factor in.
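The commenter’s point about low latency has a simple physical basis: serialization delay, the time it takes to clock a packet onto the wire, drops in direct proportion to link speed. A quick sketch for a standard 1500-byte Ethernet frame (serialization is only one component of end-to-end latency, but it is the one the faster edge link buys you):

```python
# Serialization delay: time to clock one packet onto the wire.
# Only one component of total latency, but it scales directly
# with link speed, which is why a faster edge feels snappier
# for bursty traffic even when the pipe is mostly idle.
def serialization_us(packet_bytes: int, rate_mbps: float) -> float:
    # Mbps == bits per microsecond, so bits / Mbps gives microseconds.
    return packet_bytes * 8 / rate_mbps

for rate in (100, 1000, 10000):
    delay = serialization_us(1500, rate)
    print(f"{rate:>5} Mbps: {delay:6.1f} us per 1500-byte frame")
```

At 100 Mbps each full-size frame occupies the wire for 120 microseconds; at gigabit it’s 12. Multiply that across every hop and every queued packet ahead of yours, and bursty interactive traffic genuinely does feel faster.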