It’s Probably Not The Wi-Fi

After finishing up Mobility Field Day last week, I got a chance to reflect on a lot of the information that was shared with the delegates. Much of the work in wireless now is focused on analytics. Companies like Cape Networks and Nyansa are trying to provide a holistic look at every part of the network infrastructure to help professionals figure out why there might be issues occurring for users. And over and over again, the resounding cry that I heard was “It’s Not The Wi-Fi.”

Building A Better Access Layer

Most of wireless is focused on the design of the physical layer. If you talk to any professional and ask them to show you their toolkit, they will likely pull out a whole array of mobile testing devices, USB network adapters, and diagramming software that would make AutoCAD jealous. All of these tools focus on the most important part of the equation for wireless professionals – the air. When the physical radio spectrum isn’t working, users will complain about it. Wireless pros leap into action with their tools to figure out where the fault is. Either that, or they are very focused on providing the right design from the beginning, with the tools validating that access point placement is correct and that coverage overlap provides redundancy without interference.

These aren’t easy problems to solve. That’s why wireless folks get paid the big bucks to build it right or fix it after it was built wrong. Wired networkers don’t need to worry about microwave ovens or water pipes. Aside from the errant fluorescent light or overly aggressive pair of cable pliers, wired networks are generally free from the kinds of problems that can plague a wire-free access layer.

However, the better question is this: how do the users know it’s the wireless network that’s behind the faults? To the users, the system is in one of three states: perfect, horribly broken, or slow. I think we can all agree that the first state almost never exists in reality. It might exist shortly after installation, when user load is low and actual application use is negligible. Most of the time, though, users are living in one of the latter two states. Either the wireless is “slow” or it’s horribly broken. Why?

No-Service Station

As it turns out, thanks to some of the reporting from companies like Cape and Nyansa, a large majority of the so-called wireless issues are in fact not wireless-related at all. Those designs that wireless pros spend so much time fretting over are removed from the equation. Instead, the issues are with services.

Yes, those pesky network services. The ones like DNS or DHCP that seem invisible until they break. Or those services that we pay hefty sums to every month like Amazon or Microsoft Azure. The same issues that plague wired networking exist in the wireless world as well and seem to escape blame.

DNS is invisible to the majority of users. I’ve tried to explain it many times with middling to poor results. The idea that computers on the internet don’t understand words and must rely on services to translate them to numbers never seems to click. And when you add in the reliance on this system and how it can be knocked out with DDoS attacks or hijacking, it always comes back to being about the wireless.
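
To make the invisible a little more concrete, here’s about the smallest illustration possible, a sketch in Python using only the standard library (the hostname is just a stand-in):

    import socket

    # What DNS does, in one call: translate the name a human types
    # into the numeric address that computers actually route to.
    hostname = "example.com"  # placeholder hostname
    address = socket.gethostbyname(hostname)
    print(f"{hostname} -> {address}")

    # If the resolver is unreachable or hijacked, this raises
    # socket.gaierror, and to the user, "the Wi-Fi is broken."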

It’s not hard to imagine why. The wireless is the first thing users see when they start having issues. It’s the new firewall. Or the new virus. Or the new popup. It’s a thing they can point to as the single source of problems. And if there is an issue at any point along the way, it must be the fault of the wireless. It can’t possibly be DNS or routing issues or a DDoS on AWS. Instead, the wireless is down.

And so wireless pros find themselves defending their designs and configurations without knowing that there is an issue somewhere else down the line. That’s why the analytics platforms of the future are so important. By giving wireless pros visibility into systems beyond the spectrum, they can reliably state that the wireless isn’t at fault. They can also engage other teams to find out why the DNS servers are down or why the default gateway for the branch office has been changed or is offline. That’s the kind of info that can turn a user away from blaming the wireless for all the problems and finding out what’s actually wrong.
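
As a rough sketch of what that kind of visibility means, imagine running a few service checks from the user’s side of the network before blaming the air. Everything below is a stand-in (the gateway address, the hostnames, the Linux-style ping flags), not any vendor’s actual product:

    import socket
    import subprocess

    def check_dns(name="example.com"):
        # Failure here is a DNS problem, not a Wi-Fi problem.
        try:
            socket.gethostbyname(name)
            return True
        except socket.gaierror:
            return False

    def check_gateway(gateway="192.168.1.1"):
        # Placeholder address; flags assume a Linux-style ping.
        result = subprocess.run(
            ["ping", "-c", "1", "-W", "2", gateway],
            capture_output=True)
        return result.returncode == 0

    def check_service(host="example.com", port=443):
        # Can we even open a TCP connection to the cloud service?
        try:
            with socket.create_connection((host, port), timeout=3):
                return True
        except OSError:
            return False

    checks = {"DNS": check_dns(), "Gateway": check_gateway(),
              "Cloud service": check_service()}
    for name, ok in checks.items():
        print(f"{name}: {'up' if ok else 'DOWN'}")

If DNS or the gateway shows DOWN while the radio link is clean, the conversation with the user changes completely.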


Tom’s Take

If I had a nickel for every problem that was blamed on the wireless network or the firewall or some errant virus when that actually wasn’t the case, I could retire and buy my own evil overlord island next to Larry Ellison. Alas, these are issues that are never going to go away. Instead, the only real hope that we have is speeding the time to diagnose and resolve them by involving professionals that manage the systems that are actually down. And perhaps having some pictures of the monitoring systems goes a long way to tell users that they should make sure that the issue is indeed the wireless before proclaiming that it is. Because, to be honest, it probably isn’t the Wi-Fi.


The History of The Wireless Field Day AirCheck

Mobility Field Day 2 just wrapped up in San Jose. It’s always a little bittersweet to see the end of a successful event. However, one thing that does bring a bit of joy to the end of the week is the knowledge that one of the best and longest running traditions at the event continues. That tradition? The Wireless/Mobility Field Day AirCheck.

The Gift That Keeps Giving

The Wireless Field Day AirCheck story starts where all stories start. The beginning. At Wireless Field Day 1 in March of 2011, I was a delegate and fresh off my first Tech Field Day event just a month before. I knew some wireless stuff and was ready to learn a lot more about site surveys and other great things. Little did I know that I was about to get something completely awesome and unexpected.

As outlined in this post, Fluke Networks held a drawing at the end of their presentation for a first-generation AirCheck handheld wireless troubleshooting tool. I was thrilled to be the winner. I took it home and immediately put it to work around my office. I found it easy to use, and it provided great information about wireless networks that I could use to make my life easier. I even loaned it out to some of my co-workers during troubleshooting calls, and they immediately told me they wanted one of their own.

As the rest of 2011 rolled forward, I found uses for my AirCheck but I didn’t do as much wireless as a lot of the other people out there. I knew that someone else could probably get more out of having it than I did. So, I hatched a plan. I told Stephen Foskett that if I had the chance to come back to Wireless Field Day 2, I would gladly give my AirCheck away to another worthy delegate. I wanted to keep the tool in use with the best and brightest people in the community and help them see how awesome it was.

Sure enough, I was invited to Wireless Field Day 2 in January 2012. I arrived with my AirCheck and waited until the proper moment. During the welcome dinner, Matt Simmons and I found a way to randomly draw a number and award the special prize to Matthew Norwood. He was just as thrilled to get the AirCheck as I was. I sent my prize from Wireless Field Day 1 on its way to a new home, content that I would help someone get more wireless knowledge.

But the giving didn’t stop there. Even though I wasn’t a delegate for Wireless Field Day 3 or Wireless Field Day 4, the AirCheck kept coming back. Matthew gave it to Dan Cybulskie. Dan gave it to Scott Stapleton. The AirCheck headed down under for half of 2013. When Wireless Field Day 5 rolled around, I was now a staff member for Tech Field Day and working behind the scenes. I had forgotten about the AirCheck until a box arrived from Australia with Scott’s postmark on it. He mailed it back to the US to continue the tradition!

And so, the AirCheck passed along to a new set of hands every event. Blake Krone got it at Wireless Field Day 5. Then Jake Snyder, followed by Richard McIntosh and Scott McDermott. Even when we changed the name of the event to Mobility Field Day in 2016, the AirCheck passed along to Rowell Dionicio.

Changing Of The Guard

In the interim, the AirCheck product moved over to Netscout. They developed a new version, the G2, that was released after Mobility Field Day 1 in 2016. The word also got around to the Netscout folks that there was a magical G1 AirCheck that was passed along to successive Wireless/Mobility Field Day delegates as a way of keeping the learning active in the community.

Netscout was a presenter during Mobility Field Day 2 in 2017. Chris Hinz contacted me before the event and asked if we still gave away the AirCheck during the event. I assured him that we did. He said that a tradition like that should continue, even if the G1 AirCheck was getting a bit long in the tooth. He told me that he might be able to help us all out.

After the Netscout presentation at Mobility Field Day 2, Chris presented me with his special surprise: a brand new G2 AirCheck! Since we hadn’t given the old unit to its new recipient just yet, we decided that it was time to “retire” the old G1 and pass along the G2 to the next lucky contestant. Shaun Neal was the lucky delegate this time and took the new and improved G2 home with him Wednesday night. I was happy to see it go to him knowing that he’ll get to put it through its paces and learn from it. And then he will get to bring it back to the next Mobility Field Day for it to pass along to a new delegate and continue the chain of sharing.


Tom’s Take

When I gave away my G1 AirCheck all those years ago, I never expected it would turn into something so incredible. The sharing and exchange of tools and knowledge at both Wireless Field Day and Mobility Field Day help remind me of why I do this job with Stephen. The community is an awesome and amazing place sometimes. The new G2 AirCheck will have a long life helping delegates troubleshoot wireless issues.

The old G1 AirCheck, my AirCheck, is in my suitcase. It’s ready to start its retirement in my office, having earned thousands of frequent flyer miles as well as becoming a very important part of Tech Field Day lore. I couldn’t be happier to get it back at the end of its life knowing how much happiness it brought to people along the way.

Will Dell Networking Wither Away?


The behemoth merger of Dell and EMC is nearing conclusion. The first week of August is the target date for the final wrap up of all the financial and legal parts of the acquisition. After that is done, the long task of analyzing product lines and finding a way to reduce complexity and product sprawl begins. We’ve already seen the spin out of Quest and Sonicwall into a separate entity to raise cash for the final stretch of the acquisition. No doubt other storage and compute products are going to face a go/no go decision in the future. But one product line which is in real danger of disappearing is networking.

Whither Whitebox?

The first indicator of the problems with Dell and networking comes from whitebox switching. Dell released OS 10 earlier this year as a way to capitalize on the growing market of free operating systems running on commodity hardware. Right now, OS 10 can run on Dell equipment. In the future, they are hoping to spread it out to whitebox devices. That means that soon you could see Dell-branded OSes running on switches purchased from non-Dell sources and booting with ONIE.

Once OS 10 pushes forward, what does that mean for Dell’s hardware business? Dell would naturally want to keep selling devices to customers. Whitebox switches would undercut their ability to offer cheap ports to customers in data center deployments. Rather than give up that opportunity, Dell is positioning itself to run some form of its own software on top of that hardware for management purposes, which has always been a strong point for the company. Losing the hardware sale means little to Dell if keeping it would mean giving up the profit margin anyway.

The second indicator of networking issues comes from comments from Michael Dell at EMCworld this year. Check out this short video featuring him with outgoing EMC CEO Joe Tucci:

Some of the telling comments in here involve Michael Dell’s praise for the NSX business model and how it is being adopted by a large number of other vendors in the industry. Also telling is their reaffirmation that Cisco is an important partnership in VCE and won’t be going away any time soon. While these two things don’t seem to be related on the surface, they both point to a truth Dell is trying hard to accept.

In the future, with overlay network virtualization models gaining traction in the data center, the underlying hardware will matter little. In almost every case, the hardware choice will come down to one of two options:

  1. Which switch is the cheapest?
  2. Which switch is on the Approved List?

That’s it. That’s the whole decision tree. No one will care what sticker is on the box. They will only care that it didn’t cost a fortune and that they won’t get fired for buying it. That’s bad for companies that aren’t making white boxes and aren’t named Cisco. Other network vendors are going to try to add value in some way, but the overlay sitting on top of those bells and whistles will make it next to impossible to differentiate in anything but software. Whether that’s superior management capabilities, an open plug-in model, or some other thing we haven’t thought of will make no difference in the end. Software will still be king, and the hardware will be either an inexpensive pawn or a costly piece that has been pre-approved.

Whither Wireless?

The other big inflection point that makes me worry about the Dell networking story is the lack of movement in the wireless space. Dell has historically been a company to partner first and acquire second. But with HPE’s acquisition of Aruba Networks last year, the dominoes in the wireless space are still falling. Brocade raced out to buy Ruckus. Meru offered itself on a platter to anyone that would buy them. Now Aerohive stands as the last independent wireless vendor without a dance partner. Yes, they’ve announced that they are partnering with Dell, but have you been to the Dell Wireless Networking page? Can you guess what the Dell W-series is? Here’s a hint: it rhymes with “Peruba”.

Every time Dell leads with a W-series deployment, they are effectively paying their biggest competitor. They are opening the door to allowing HPE/Aruba to come in and start talking about not only wireless but servers, storage, and other networking as well. Dell would do well at this point to start deemphasizing the W-series and start highlighting the “new generation” of Aerohive APs and how they are going to be the focus moving forward.

The real solution would be for Dell to buy a wireless company and take all the wireless expertise they are selling in-house. That would show they are serious about both the campus network of the future and the data center network needed to support their other server and storage infrastructure. Sadly, with Dell heavily leveraged from taking the company private just two years ago and with mounting debt from this mega merger, the company is looking to raise cash with spin-offs instead of spending it on yet another company to ingest and subsume. Which means a real non-partner wireless solution is still many years away.


Tom’s Take

Dell’s networking strategy is in maintenance mode. Make switches to support faster speeds for now, probably with Tomahawk support soon, and hope that this whole networking thing goes software sooner rather than later. Otherwise, the need to shore up campus wireless, along with the coming decision about throwing full support behind NSX and its partnerships, is going to be a bitter pill to swallow. Perhaps Dell Networking will exist as an option for companies wanting a 100% Dell solution? Or maybe they are waiting for a new offering from Dell/EMC in the data center to drive profits to research and development to keep pace with Cisco and Arista? One can only hope that their networking flower doesn’t wither on the vine.

HP Is Buying Aruba. Who’s Next?


Sometimes all it takes is a little push. Bloomberg reported yesterday that HP is in talks to buy Aruba Networks for their wireless expertise. The deal is contingent upon some other things, and the article made sure to throw up disclaimers that it could still fall through before next week. But the people that I’ve talked to (who are not authorized to comment and wouldn’t know the official answer anyway) have all said this is a done deal. We’ll likely hear the final official confirmation on Monday afternoon, ahead of Aruba’s big Atmosphere (née Airheads) conference.

R&D Through M&A

This is a shot in the arm for HP. Their Colubris-based AP lineup has been sorely lacking in current generation wireless technology, let alone next gen potential. The featured 802.11ac APs on their networking site are OEMed directly from Aruba. They’ve been hoping to play the OEM game for a while and see where the chips are going to fall. Buying Aruba gives them second place in the wireless market behind Cisco overnight. It also fixes the most glaring issue with Colubris – R&D. HP hasn’t really been developing their wireless portfolio. Some had even thought it was gone for good. This immediately puts them back in the conversation.

More importantly to HP, this acquisition cuts many of their competitors’ wireless plans off at the knees. Dell, Juniper, Brocade, Alcatel-Lucent, and many others OEM from Aruba or have a deep partnership agreement. By wrapping up the entirety of Aruba’s business, HP has dealt a blow to the single-source vendors that are playing in the wireless market. And this is going to lead to some big changes relatively soon.

The Startup Buzz

Dell is perhaps the most impacted by this announcement. A very large portion of their wireless offerings was Aruba. They sold APs, controllers, and even ClearPass through their channels (with the names filed off, of course). Now, they are back to square one. How are they going to handle the most recent deals? What are their support options?

A little thought exercise with my friend Josh Williams (@JSW_EdTech) turned up a few possibilities:

  1. Dell forces HP to buy out all the support contracts for Dell/Aruba customers. That makes sense for Dell, but it will turn a lot of customers against them, especially when HP lets those customers know the reasons why.
  2. Dell agrees to release the developments they’ve done on the platform to HP in return for HP taking the support business. Quiet and clean. Which is why it likely won’t happen.
  3. Dell pays HP an exorbitant amount of money to take the support contracts. This gives HP the capital to take on all those new support contracts and gives Dell an exit to rebuild. This is probably what HP wants, but could end up sinking the deal.

Dell got burned, plain and simple. They likely could have purchased Aruba months ago and solidified the relationship. Instead, they are now looking for a new partner. However, I don’t think they are going to get burned again. Rather than shopping for a friend, they are going to be shopping for an acquisition. My money has always been on Aerohive. They have an existing relationship. The Aerohive controller-less cloud model fits Dell’s new strategies. And they would be a much cheaper pickup than Aruba. There is precedent for Dell skipping the big name and picking up a smaller company that’s a better fit. It’s a hard pill to swallow, but it gives Dell the chance to move forward with a lasting relationship.

Softwarely Defined

Brocade is a line-of-business partner of Aruba. They’ve only recently gotten involved since Motorola shut down their WLAN business. This is a good sign for them. That means they can exit from their position and not be significantly affected. It does leave them with a quandary of where to go.

The first choice would be to go back to the Motorola relationship, now in the form of Zebra Technologies. Zebra inherited quite a large portion of the WLAN space from Motorola, but they’ve been keeping rather quiet about it. Are they angling to be more of a support organization for existing installs? Or are they waiting for a big splash announcement to get back in the game? Partnering with Brocade would give them that announcement given the elevated profile Brocade has today.

Brocade’s other option would be to go down the SDN road. The plan for a while has been to embrace SDN, OpenFlow, and all things software defined. The natural target for this would be Meru Networks. Meru has been embracing SDN of late as well. They had a nice event last year showcasing their advances in SDN. Brocade could bolster that SDN knowledge while obtaining a good wireless company that would give them the strength they need to augment their enterprise business.

Permission To Retire

The odd company out is Juniper. I’ve heard that they were involved at first in trying to acquire Aruba, but when you’re betting against HP’s pockets you will lose in the long run. Their other problem is Elliott Management, everyone’s new favorite “activist investor”.

Elliott has made no secret that they see the value in Juniper in the service provider market. As far back as last year, Elliott has been trying to get Juniper to carve off the ancillary businesses, including security, enterprise, and wireless. Juniper has already officially ended sales of Trapeze-based products. Why would Elliott let them buy another wireless company so soon after getting rid of the last one? Even as successful as Aruba is, Elliott would see it as another distraction. And when someone that active is calling the shots, you can’t go against them, lest you end up unemployed.

This is the end for Juniper’s wireless aspirations. That’s not a bad thing, necessarily. This gives them the impetus needed to focus on the service provider market. It also gives them a smaller enterprise switching portfolio to package up and sell off should that pound of flesh be necessary to sate Elliott as well. Time will tell.

Everyone Else

Any other companies with Aruba relationships are either dipping their toes in the wireless waters or don’t care enough to worry about the impact it will have. It will be an easy matter for companies like Alcatel-Lucent to go out and find a new OEM partner, likely with someone like Extreme Networks or Ruckus. Those companies are making great technology and will be happy to supply the APs that customers need. Showing off their technology will also give them great in-roads into customers that might not have been on their radar before.


Tom’s Take

It’s going to be an exciting time in the wireless space. HP’s acquisition is going to start the dominoes falling as other companies buy into the wireless space as well. When the dust settles, there will be new number twos and number threes in the market. It also clears the middle of the space for up-and-comers to grow. Cisco is going to stay number one for a while, and HP will be number two when this deal closes. But until we see the fallout from who will be purchased and who will be partnered with, it’s tough to say who will be a clear winner. But make sure you’ve got your popcorn ready. Because this isn’t over yet. Not by a long shot.


A Complicated World Without Wires


Another Field Day is in the books. Wireless Field Day 5 was the first that I’d been to in almost two years. I think that had more to do with the great amount of talent that exists in the wireless space. Of course, it does help that now I’m behind the scenes and not doing my best to drink from the firehose of 802.11ac transitions and channel architecture discussions. That’s not to say that a few things didn’t sink in.

Analysis is King

I’ve seen talks from companies like Fluke and Metageek before at Wireless Field Day. It was a joy to see them back again for more discussion about new topics. For Fluke, that involved plans to include 802.11ac in their planning and analysis tools. This is going to be important going forward to help figure out the best way to setup new high-speed deployments. For Metageek, it was all about showing us how they are quickly becoming the go-to folks for packet analysis and visual diagramming. Cisco has tapped them to provide analysis for CleanAir. That’s pretty high praise indeed. Their EyePA tool is an amazing peek into what’s possible when you take the torrent of data provided by wireless connections and visualize it.

Speaking of analytics, I was very impressed to see what 7signal and WildPackets were pulling out of the air. WildPackets has added 802.11ac capture to their OmniPeek tool, and a lot of the delegates were happy to see it arrive in the most recent release. 7signal has some crazy sensors that they can deploy into your environment to give you a very accurate picture of what’s going on. As the CTO, Veli-Pekka Ketonen, told me, “You can hope for about 5% assurance when you just walk around and measure manually. We can give you 95% consistently.”

It’s Not Your AP, It’s How You Use It

The other thing that impressed me from the Wireless Field Day 5 sponsors was the ways in which APs were being used. Aerohive took their existing AP infrastructure and started adding features like self-registration guest portals. I loved that you could follow a Twitter account and get your guest PPSK password via DM. It just shows the power of social media when it interacts with wireless. AirTight took the social integration to an entirely different level. They are leveraging social accounts through Facebook and Twitter to offer free guest wifi access. In a world where free wifi is assumed to be a given, it’s nice to see vendors figuring out how to make social work for them with likes and follows in exchange for access.

That’s not to say that software was king of the hill. Xirrus stepped up to the stage for a first-time appearance at Wireless Field Day. They have a very unique architecture, to say the least. Their CEO weathered the questions from the delegates and live viewers quite well compared to some of the heat that I’ve seen put on Xirrus in the past. I think the delegates came away from the event with a greater respect for what Xirrus is trying to do with their array architecture. Meru also presented for the first time and talked about their unique perspective with an architecture based on using single-channel APs to alleviate issues in the airspace. I think their story has a lot to do with specific verticals and challenging environments, as outlined by Chris Carey from Bellarmine College, who spoke about his experiences.

If you’d like to watch the videos from Wireless Field Day 5, you can see them on YouTube or Vimeo. You can also read through the delegates’ thoughts at the Wireless Field Day 5 page.


Tom’s Take

Wireless is growing by leaps and bounds. It’s no longer just throwing up a couple of radio bridges and offering a network to a person or two with laptops in your environment. The interaction of mobility and security has led to dense deployments with the need to keep tabs on what the users are doing through analytics like those provided by Meru and Motorola. We’ve now moved past focusing on protocols like 802.11ac and instead focus on how to improve the lives of the users via guest registration portals and self-enrollment like Aerohive and AirTight. And we can’t forget that the explosion of wireless means we need to be able to see what’s going on, whether it be packet capture or airspace monitoring. I think the group at Wireless Field Day 5 did an amazing job of showing how mature the wireless space has become in such a short time. I am really looking forward to what Wireless Field Day 6 will bring in 2014.

Disclaimer

Wireless Field Day 5 doesn’t happen without the help of the sponsors. They each cover a portion of the travel and lodging costs of the delegates. Some even choose to provide takeaways like pens, coffee mugs, and even evaluation equipment. That doesn’t mean that they are “buying” a review. No Wireless Field Day delegate is required to write about what they see. If they do choose to write, they don’t have to write a positive review. Independence means no restrictions. No sponsor ever asks for consideration in a review, and none is ever promised. What you read from me and the delegates is our honest and uninfluenced opinion.

Accelerating E-Rate


Right after I left my job working for a VAR that focused on K-12 education and the federal E-Rate program, a funny thing happened.  The president gave a speech where he talked about the need for schools to get higher speed links to the Internet in order to take advantage of new technology shifts like cloud computing.  He called for the FCC and the Universal Service Administration Company (USAC) to overhaul the E-Rate program to fix deficiencies that have cropped up in the last few years.  In the last couple of weeks a fact sheet was released by the FCC to outline some of the proposed changes.  It was like a breath of fresh air.

Getting Up To Speed

The largest shift in E-Rate funding in the last two years has been in applying for faster Internet circuits.  Schools are realizing that it’s cheaper to host servers offsite either with software vendors or in clouds like AWS than it is to apply for funding that may never come and buy equipment that will be outdated before it ships.  The limiting factor has been with the Internet connection of these schools.  Many of them are running serial T-1 circuits even today.  They are cheap and easy to install.  Enterprising ISPs have even started creating multilink PPP connections with several T-1 links to create aggregate bandwidth approaching that of fiber connections.
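
The arithmetic shows how generous the word “approaching” is here. A quick back-of-the-envelope calculation in Python, using nominal T-1 line rates and ignoring MLPPP framing overhead:

    # A T-1 carries a nominal 1.544 Mbps. Bundling several with
    # multilink PPP adds capacity roughly linearly (minus some
    # framing overhead, ignored here).
    T1_MBPS = 1.544

    for links in (2, 4, 8):
        print(f"{links} bonded T-1s ~ {links * T1_MBPS:.1f} Mbps")

    # Even 8 bonded circuits total about 12 Mbps, still a tiny
    # fraction of a single gigabit fiber handoff.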

Fiber is the future of connectivity for schools.  By running a buried fiber to a school district, the ISP can gradually increase the circuit bandwidth as a school’s needs increase.  For many schools around the country that could include online testing mandates, flipped classrooms, and even remote learning via technologies like Telepresence.  Fiber runs from ISPs aren’t cheap.  They are so expensive right now that the majority of funding for the current year’s E-Rate is going to go to faster ISP connections under Priority 1 funding.  That leaves precious little money left over to fund Priority 2 equipment.  A former customer of mine spent the Priority 1 money to get a 10Gbit Internet circuit and then couldn’t afford a router to hook up to it because of the lack of money leftover for Priority 2.

The proposed E-Rate changes will hopefully fix some of those issues.  The changes call for simplification of the rules regarding deployments, which should drive new fiber construction.  I’m hoping this means that they will do away with the “dark fiber” rule that has been in place for so many years.  Previously, you could only run fiber between sites if it was lit on both ends and in use.  This discouraged the use of spare fiber, or dark fiber, because it couldn’t be claimed under E-Rate if it wasn’t passing traffic.  This has led to a large amount of ISP-owned circuits being used for managed WAN connections.  A very few schools that were on the cutting edge years ago managed to get dedicated point-to-point fiber runs.  In addition, the order calls for prioritizing funding for fiber deployments that will drive higher speeds and long-term efficiency.  This should enable schools to do away with running multimode fiber simply because it is cheap and instead give preferential treatment to single mode fiber that is capable of running gigabit and 10gig over long distances.  It should also be helpful to VARs that are poised to replace aging multimode fiber plants.

Classroom Mobility

WAN circuits aren’t the only technology that will benefit from these E-Rate changes.  The order calls for a focus on ensuring that schools and libraries gain access to high speed wireless networks for users.  This has a lot to do with the explosion of personal tablet and laptop devices as opposed to desktop labs.  When I first started working with schools more than a decade ago, it was considered cutting edge to have a teacher computer and a student desktop in the classroom.  Today, tablet carts and one-to-one programs ensure that almost every student has access to some sort of device for research and learning.  That means that schools are going to need real enterprise wireless networks.  Sadly, many of them that either don’t qualify for E-Rate or can’t get enough funding settle for SMB/SOHO wireless devices purchased from office supply stores simply because they are inexpensive.  That forces the IT admins to spend entirely too much time troubleshooting these connections, distracting them from other, more important issues.  I think this focus on wireless will go a long way toward alleviating connectivity issues for schools of all sizes.

Finally, the FCC has ordered that the document submission process be modernized to include electronic filing options and that older technologies be phased out of the program. This should lead to fewer mistakes in the filing process as well as more rapid decisions for appropriate technology responses.  No longer do schools need to concern themselves with whether or not they need directory assistance on their Priority 1 phone lines.  Instead, they can focus on their problem areas and get what they need quickly.  There is also talk of fixing the audit and appeals process as well as speeding the deployment of funds.  As anyone that has worked with E-Rate will attest, the bureaucracy surrounding the program is difficult for anyone but the most seasoned professionals.  Even the E-Rate wizards have problems from year to year figuring out when an application will be approved or whether or not an audit will take place.  Making these processes easier and more transparent will be good for everyone involved in the program.


Tom’s Take

I posted previously that the cloud would kill the E-Rate program as we know it.  It appears I was right from a certain point of view.  Mobility and the cloud have both caused the E-Rate program to be evaluated and overhauled to address the changes in technology that are now filtering into schools from the corporate sector.  Someone was finally paying attention and figured out that we need to address faster Internet circuits and wireless connectivity instead of DNS servers and more cabling for nonexistent desktops.  Taking these steps shows that there is still life left in the E-Rate program and its ability to help schools.  I still say that USAC needs to boost the funding considerably to help more schools all over the country.  I’m hoping that once the changes in the FCC order go through that more money will be poured into the program and our children can reap the benefits for years to come.

Disclaimer

I used to work for a VAR that did a great deal of E-Rate business.  I don’t work for them any longer.  This post is my work and does not reflect the opinion of any education VAR that I have talked to or have been previously affiliated with.  I say this because the Schools and Libraries Division (SLD) of USAC, which is the enforcement and auditing arm, can be a bit vindictive at times when it comes to criticism.  I don’t want anyone at my previous employer to suffer because I decided to speak my mind.

Backdoors By Design

I was listening to the new No Strings Attached Wireless podcast on my way to work and Andrew von Nagy (@revolutionwifi) and his guests were talking about the new exploit in WiFi Protected Setup (WPS).  Essentially, a hacker can brute force the 8-digit setup PIN in WPS, which was invented in the first place because people needed help figuring out how to set up more secure WiFi at home.  Of course, that got me to thinking about other types of hacks that involve ease-of-use features being exploited.  Ask Sarah Palin about how the password reset functionality in Yahoo mail could be exploited for nefarious purposes.  Talk to Paris Hilton about why not having a PIN on your cell phone’s voice mail account when calling from a known number (i.e. your own phone) is a bad idea when there are so many caller ID spoofing tools in the wild today.
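
The WPS flaw deserves a quick sketch, because the arithmetic is what makes it so damning.  As the public write-ups of the attack describe, the eighth digit of the PIN is just a checksum of the first seven, and the protocol acknowledges each half of the PIN separately, so the search space collapses.  A rough illustration in Python, with the checksum routine following the algorithm published in the WPS specification:

    def wps_checksum(pin7: int) -> int:
        # Checksum digit for a 7-digit WPS PIN, per the WPS spec.
        accum = 0
        while pin7:
            accum += 3 * (pin7 % 10)
            pin7 //= 10
            accum += pin7 % 10
            pin7 //= 10
        return (10 - accum % 10) % 10

    # The oft-cited example PIN 12345670: seven digits plus checksum.
    assert wps_checksum(1234567) == 0

    naive = 10 ** 8       # what an 8-digit PIN appears to offer
    derived = 10 ** 7     # the checksum digit is not a secret
    # The access point confirms each half of the PIN on its own,
    # so an attacker searches 4 digits, then 3 digits plus checksum:
    split = 10 ** 4 + 10 ** 3
    print(naive, derived, split)   # 100000000 10000000 11000

Eleven thousand guesses is an afternoon’s work for a machine, which is why the standard advice became simply turning WPS off.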

Security isn’t fun or glamorous.  In the IT world, the security people are pariahs.  We’re the mean people that make you have strong passwords or limit access to certain resources.  Everyone thinks we’re a bunch of wet blankets.  Why is that exactly?  Why do the security people insist on following procedures or protecting everything with an extra step or two of safety?  Wouldn’t it just be easier if we didn’t have to?

The truth is that security people act the way we do because users have been trying for years to make things easy on themselves.  The issues with WPS highlight how a relatively secure protocol like WPA can be undermined by a minor convenience feature because we had to make things easy for the users.  We spend an inordinate amount of time taking a carefully constructed security measure and eviscerating it so that users can understand it.  We spend almost zero time educating users about why they should follow these procedures.  At the end of the day, users circumvent them because they don’t understand why they should be followed and complain that they are forced to do so in the first place.

Kevin Mitnick had a great example of this kind of exploitation in his book The Art of Intrusion.  All of the carefully planned security for accessing a facility through the front doors was invalidated because there was a side door into the building for smokers that had no guard or even a secure entrance mechanism.  They even left it propped open most of the time!  Given the chance, people will circumvent security in a heartbeat if it means their jobs are easier to do.  Can you imagine if the US military decided during the Cold War to move the missile launch key systems closer together so that one man could operate them in case the other guy was in the bathroom?  Or what if RSA allowed developers to access the seed code for their token system from a non-secured terminal?  I mean, what would happen if someone accessed the code from a terminal that had been infected with an APT trojan horse?  Oh, wait…

We have been living in the information age for more than a generation now.  We can’t use ignorance as an excuse any longer.  There is no reason why people shouldn’t be educated about proper security and why it’s so important to prevent not only exposure of our own information but possible exposure of the information of others as well.  In the same manner, it’s definitely time that we stop coddling users by creating hacking points in technology deemed “too complicated” for them to understand.  The average user has a good grasp of technology.  Why not give them the courtesy of explaining how WPA works and how to set it up on their router?  If we claim that it’s “too hard” to set up or that the user interface is too difficult to navigate to set up a WPA key, isn’t that more an indictment of the user interface design than of the user’s technical capabilities?

Tom’s Take

I resolve to spend more time educating people and less time making their lives easy.  I resolve to tell people why I’ve forced them to use a regular user account instead of giving them admin privileges.  I promise to spend as much time as it takes with my mom explaining how wireless security works and why she shouldn’t use WPS no matter how easy it seems to be. I look at it just like exercise.  Exercise shouldn’t be easy.  You have to spend time applying yourself to get results.  The same goes for users.  You need to spend some time applying yourself to learn about things in order to have true security.  Creating backdoors and workarounds does nothing but keep those that need to learn ignorant and make those that care spend more time fixing problems than creating solutions.

If you’d like to learn more about the WPS hack, check out Dan Cybulskie’s blog or follow him on Twitter (@simplywifi).