The final presentation for Network Field Day 3 came from Spirent Communications. This was the one company at NFD3 that I was completely in the dark about. Beyond knowing that they “test stuff”, I was unsure how that would translate into something a networker would be interested in using. By the time I walked out of their building, I had a new-found respect for the companies that build the devices we take for granted when reading reports.
We almost didn’t get the chance to show Spirent to the viewing audience. Spirent was unsure how some of their software would come across on a live stream. I can attest to the fact that software demos are sometimes not the best thing to showcase to the home audience. However, after watching the coverage of NFD3 from the previous day, Spirent was impressed by the amount of feedback and discussion going on between the delegates and the home audience. When we arrived at the Spirent offices, we grabbed a quick lunch while the video crew set up for the session. We got a quick introduction from Sailaja Tennati and Patrick Johnson about who Spirent is and what they do. Turns out that Spirent makes many of the tools that other networking vendors use to test their equipment. I liken it to the people that make the equipment that is used to test high performance cars. As impressive as the automobile might be, it’s equally (if not more) impressive to build a machine that can test that performance and even exceed it as needed. A famous quote says “Fred Astaire was a great dancer. But don’t forget Ginger Rogers did everything he did backwards in high heels.” To me, Spirent is like Ginger Rogers. They not only have to keep up with the equipment that Cisco puts out, they have to exceed it and provide that additional capacity to the vendor.
Ankur Chadda was the next presenter. He started off by telling us about the difficulties in testing equipment. For one, as soon as there is a problem, the testing equipment is the first thing to get blamed. It seems certain people are so sure their equipment is right that there’s no way anything could be wrong with it; instead, it must be the tester that’s at fault. Much of this comes down to how carefully the data used to test the equipment is chosen. Ask yourself how many times you’ve looked at “speed and feed” numbers on a data sheet or in a publication and said to yourself, “Yeah, but are those real numbers?” Odds are good that’s because those numbers are somewhat synthetic and generated with carefully crafted packets. Throughput tests are run with very small packet sizes. VPN tests are run with clients that just connect and never transfer data. And so on. Spirent uses their PASS methodology to test equipment – Performance, Availability, Security, and Scalability. This ensures that the numbers being generated are grounded in reality and useful to the customers who want to run the equipment in a production environment.
Jurrie van den Breekel introduced us to the data center testing arm of Spirent. I find it very interesting that many vendors like Alcatel, Avaya, and Huawei come to Spirent for objective interoperability testing. That says a lot about their capability, as well as the trust invested in a company to provide unbiased results. This is something I’ve said we’ve needed in networking for a very long time. Another key piece of testing methodology is ensuring that you’re comparing similar capabilities. The example Jurrie gave in the above video is comparing switching performance when the devices use cut-through forwarding versus store-and-forward. Based on an understanding of the way those methods work, cut-through should beat store-and-forward. However, Jurrie mentioned that there have been testing scenarios where the converse is true. The key is making sure that the tests match the specifications being tested. Otherwise, you end up with wacky results like those above. The other fun anecdote from Jurrie involved testing a Juniper QFabric implementation. One thing that most people tend to overlook when testing or installing equipment is simple cabling. While many might take it for granted, it becomes a non-trivial issue at a big enough scale. In the case of the QFabric test, it took two full days to cable the 1,500 ports. That’s something to keep in mind the next time someone wants you to quote hours for an installation.
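To see why cut-through should win on paper, a back-of-the-envelope calculation helps: a store-and-forward switch must buffer the entire frame before forwarding, while a cut-through switch can start transmitting once it has read the header. The sketch below uses illustrative numbers I picked (10GbE link, 1500-byte frame, 64-byte header lookahead), not figures from any Spirent test:

```python
# Serialization-delay comparison: store-and-forward vs cut-through.
# All numbers are illustrative assumptions, not measured results.

LINK_GBPS = 10       # assumed 10GbE line rate
FRAME_BYTES = 1500   # a full-size Ethernet frame
HEADER_BYTES = 64    # bytes a cut-through switch waits for (assumption)

def serialization_delay_ns(num_bytes, gbps):
    """Time to clock num_bytes onto the wire at the given line rate, in ns."""
    return num_bytes * 8 / gbps  # bits / (Gbit/s) comes out in nanoseconds

sf = serialization_delay_ns(FRAME_BYTES, LINK_GBPS)   # 1200.0 ns
ct = serialization_delay_ns(HEADER_BYTES, LINK_GBPS)  # 51.2 ns
print(f"store-and-forward waits {sf:.1f} ns, cut-through waits {ct:.1f} ns")
```

The gap is over 1 microsecond per hop for full-size frames, which is exactly why a test that accidentally mixes the two modes, or mismatches frame sizes, can produce the “wacky” inversions Jurrie described.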
Our last presenter for the streamed portion of NFD3 was Ameya Barve. He led his talk with a nice prediction – testing as we know it will shift from individual scenarios like application or network testing and instead converge on infrastructure testing. This is critical because most of these tests today occur completely independently of each other, which means the people doing the testing need to know exactly what to test for. That’s one of the things Spirent is moving towards. I think this kind of holistic testing is going to be critical as well. Too many times we find out after the fact that an application had some unforeseen interaction with a portion of the network in what is normally called a “corner case scenario”. Corner cases are extremely hard to test for in siloed testing because the interaction never happens. It’s only when you toss everything together and shake it all up that you start finding these interesting problems.
After we shut off the cameras, we got a chance to look at a tool that Spirent uses for more focused testing. It’s an Integrated Development Environment (IDE) tool called iTest. iTest lets you drive all kinds of tools to exercise every aspect of your network. You can have iTest SSH to a router to observe what happens when you pump a lot of HTTP traffic through it. You can also write regular expressions (regex) to pull all kinds of information out of log files and console output. There’s a ton of things you can do with iTest, and I’m just scratching the surface with it. I’m hoping to have a totally separate post up at some point covering some of the more interesting parts of iTest.
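To give a feel for the regex-scraping idea, here’s a minimal sketch of the sort of thing iTest automates: pulling counters out of captured console output. The sample text and the pattern are entirely hypothetical on my part – this is plain Python, not actual iTest syntax:

```python
import re

# Hypothetical captured console output from a "show interface"-style command.
console_output = """
GigabitEthernet0/1 is up, line protocol is up
  5 minute input rate 2840000 bits/sec, 312 packets/sec
  5 minute output rate 1260000 bits/sec, 145 packets/sec
"""

# Pattern to capture direction, bits/sec, and packets/sec from each rate line.
rate_re = re.compile(r"(input|output) rate (\d+) bits/sec, (\d+) packets/sec")

# Build {direction: (bps, pps)} from every match in the captured text.
rates = {direction: (int(bps), int(pps))
         for direction, bps, pps in rate_re.findall(console_output)}

print(rates)  # {'input': (2840000, 312), 'output': (1260000, 145)}
```

The value of a tool like iTest is doing this against live device sessions at scale, with the captures and assertions built into a repeatable test case rather than a one-off script.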
It’s always fun when I realize there is a whole world out there that I have no idea about. My trip to Spirent showed me that the industry built around testing is a world unto itself. I had no idea that so much went into the methodology and setup for generating the numbers we see in marketing slides. I’m really interested to see what Spirent will be bringing to market to help converge the siloed testing that we see today.
Tech Field Day Disclaimer
Spirent was a sponsor of Network Field Day 3. As such, they were responsible for covering a portion of my travel and lodging expenses while attending Network Field Day 3. In addition, they provided me with a gift bag containing a coffee mug, polo shirt, pen, scratchpad, USB drive containing marketing collateral, and a 1-foot long Toblerone chocolate bar. They did not ask for, nor were they promised, any kind of consideration in the writing of this review/analysis. The opinions and analysis provided within are my own and any errors or omissions are mine and mine alone.