Facebook Is Once Again Putting the $41 Billion Computer Network Industry to Shame
Facebook has produced yet another computer networking innovation that will floor the industry.
And Facebook will again share it with the world for free, putting commercial network tech vendors on notice. (We’re looking at you, Cisco).
The new innovation, revealed on Tuesday, is a second-generation computer switch called the Backpack, the successor to the one it released last year, the 6-Pack, which directly challenged tech made by market leader Cisco (and others, like Juniper).
The difference is, the Backpack is way, way faster.
The 6-Pack was a 40G switch, which means it could stream 40G worth of data around a data center network. The Backpack is a 100G optical switch, which means it’s 2.5 times faster, and it uses fiber optics (aka light) to move data around instead of the traditional and more limited copper wires.
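If you want to check the math, it’s simple division. Here’s a quick sketch in Python, using only the two line rates from this story:

```python
# Per-port line rates described in the article.
six_pack_gbps = 40    # 6-Pack: 40 gigabits per second
backpack_gbps = 100   # Backpack: 100 gigabits per second

speedup = backpack_gbps / six_pack_gbps
print(f"The Backpack is {speedup}x faster per port")  # -> 2.5x
```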
The Backpack is also a companion to another new switch Facebook announced, the Wedge 100. The Wedge 100 is what’s known as a “top of rack” switch, which connects a rack of servers to the network. The Backpack then connects all the Wedge 100 switches together. In network jargon, this is known as a “network fabric.”
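To picture how those pieces fit together, here’s a toy model of a two-tier fabric in Python. The shape (Wedge 100s on top of racks, Backpacks tying them together) comes from this story; the counts and device names are made up for illustration:

```python
# Toy two-tier fabric: every top-of-rack (Wedge 100) switch uplinks to
# every fabric (Backpack) switch. Counts and names are illustrative only.
racks = [f"rack-{i}" for i in range(4)]
tors = {rack: f"wedge100-{rack}" for rack in racks}   # one ToR switch per rack
fabric = ["backpack-0", "backpack-1"]                 # the fabric layer

# With full-mesh uplinks, traffic between any two racks crosses the
# fabric in at most two switch hops: ToR -> Backpack -> ToR.
uplinks = {tor: list(fabric) for tor in tors.values()}

for tor, links in uplinks.items():
    print(f"{tor} -> {', '.join(links)}")
```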
Facebook is attempting to build itself a fully 100G data center, and these two pieces get it much of the way there, along with the other network equipment it has announced.
Going on sale in 2017
There are two key things about this new switch. First, Facebook is turning it over to its game-changing Open Compute Project (OCP), which has upended the hardware industry in the few years since Facebook launched it.
OCP creates open source hardware, meaning engineers can freely take hardware designs and work on them together.
OCP offers designs for racks, servers, storage drives, and other hardware. Contract manufacturers stand by to build them. OCP has even inspired other internet players to build their own hardware completely from scratch.
Second, in the case of its switches, Facebook went the extra step of arranging for its contract manufacturer, Accton, to mass-produce the devices so anyone can buy them.
And Facebook also open-sourced the software that runs the switch, and worked with other network startups to get their software running on it.
Facebook plans to do all of this for the Backpack, too, Omar Baldonado, a member of Facebook’s network engineering team, tells us.
“We anticipate it will follow the same path. Later in 2017, people will be able to get a Backpack. We are working with the software ecosystem, too. That’s why we are contributing to OCP,” he said.
Mind-blowing technology
In order to create the Backpack, Facebook had to work with chip makers and optical materials vendors to do what had never been done before: create special chips and special optical fiber that bring the cost of such switches down.
The optical switches on the market today are not typically used in the data center to connect servers together. They are typically used in the network “backbone,” the part of a network that stretches between data centers or across cities.
And because they’ve been targeted for metro-scale networks and beyond, such switches tend to use a lot of power, throw off a lot of heat, and are very expensive.
Facebook helped design a switch that uses less power, generates less heat, and can operate at around 55 degrees Celsius, Baldonado says, which had never been done before. Folks in the network industry have told us Facebook’s 100G work is “mind-blowing.”
To bring costs down, this switch, like the other OCP switches, is modular, meaning you can pull it apart and swap out parts, using different chips, different network cards and different software.
At one point, a former Facebook OCP engineer named Yuval Bachar declared a goal that networks should cost as little as $1 per gigabit. This goal has not been achieved, and Baldonado is the first to admit it. But with this switch and all the other hardware, Facebook is bringing costs down, he says. In this case, even if the switch is still pricey to buy, it will cost less to operate.
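Bachar’s yardstick is easy to compute: divide what a switch costs by the total bandwidth it delivers. Here’s a sketch with entirely hypothetical numbers (the 100G port speed is the only figure from this story):

```python
# Cost-per-gigabit math with made-up numbers, not real Backpack pricing.
ports = 32                 # hypothetical port count
port_speed_gbps = 100      # the 100G spec from the article
switch_price_usd = 20_000  # hypothetical price

cost_per_gigabit = switch_price_usd / (ports * port_speed_gbps)
print(f"${cost_per_gigabit:.2f} per gigabit")  # -> $6.25 with these numbers
```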
Facebook is leading this charge into faster, cheaper, mind-blowing networks and data centers because one day we will all be using the social network to hang out in virtual reality, in addition to live-streaming more video.
“We are now creating more immersive, social, interactive 360 video sorts of experiences and that demands a much more scalable and efficient and quick network,” he says.