Certifications Are About Support

You may have seen this week that VMware announced they are removing the mandatory recertification requirement from their certification program. This is a big step for VMware. The VCP, VCAP, and VCDX are some of the most important certifications in the virtualization and server industry, and VMware has always wanted their partners and support personnel to be up to date on the latest and greatest software. But, as I will explain, dropping mandatory recertification shows that certifications are less about selling and more about supporting.

The Paper Escalator

Recertification is a big money maker for companies. Sure, you’re spending a lot of money on things like tests and books. But those fees aren’t usually tied to the company offering the certification. Instead, the testing fees go to the testing center, like Pearson, and the book fees go to the publisher.

The real money maker for companies is the first-party training. If the company developing the certification is also offering the training courses, you can bet they’re raking in the cash. VMware has done this for years with the classroom requirement for the VCP. Cisco has also started doing it with their first-party CCIE training. Cisco’s example also shows how quality first-party content can drive third parties out of the industry when the vendor never even suggests to prospective candidates that third-party materials are another option for classroom study.

I’ve railed against the VCP classroom requirement before. I think forcing your candidates to take an in-person class as a requirement for certification is silly and feels like it’s designed to make money, not good engineers. Thankfully, VMware seems to agree with me in their latest announcement. They’re allowing the upgrade path to be used for the recertification process, which doesn’t necessarily require attendance in a classroom offering. I’d argue the class is still worth taking, especially if you’re really out of date with the training. But not needing it for certification is a really nice touch.

Keeping the Lights On

The other big shift with this certification change from VMware is the tacit acknowledgement that people aren’t in any kind of rush to upgrade their software right after the newest version is released. Ask any system administrator out there and they’ll tell you to wait for a service pack before you upgrade anything. System admins for VMware are as cautious as anyone, if not more so. Too often, new software updates break existing functionality or cause issues that can’t be fixed without a huge time investment.

How is this affected by certification? Well, if I spent all my time learning VMware 5.x and got my VCP on it because my company was using it, you’d better believe that my skill set is based around VCP5. If my company doesn’t decide to upgrade to 6.x or even 7.x for several years, my VCP is still based on 5.x technology. It shouldn’t expire just because I never upgraded to 6.x. The skills that I have are focused on what I do, not what I’m studying. If my company finally does decide to move to 6.x, then I can study for and receive my VCP on that version. Not before.

Companies love to make sure their evangelists and resellers are all on the latest version of their certifications because they see certifications as a sales tool. People certified in a technology will pick that solution over any others because they are familiar with it. Likewise, the sales process benefits from knowledgeable salespeople that understand the details behind the solution. It’s a win-win for both sides.

What this picture really ignores is the fact that a much larger number of non-reseller professionals are actually using the certification as a study guide to support their organization. Perhaps they get certified as a way to get better support terms or a quicker response to support calls. Maybe they just learned so much about the product along the way that they want to show off what they’ve been doing. No matter the reason, these folks are not in a sales role. They’re the support team keeping the lights on.

Support doesn’t care about upgrading at the drop of a hat. Instead, they are focused on keeping the existing stuff running as long as possible. Keeping users happy. Keeping executives happy. Keeping people from asking questions about availability or services. That’s not something that looks good on a bill of materials. But it’s what we all expect. Likewise, support isn’t focused on new things if the old things keep running. Certification, for them, is more about proving you know something rather than proving you can sell something.


Tom’s Take

I’ve had so many certifications that I don’t even remember them all. I got some of them because we needed them to sell a solution to a customer. I got others to prove I knew some esoteric command on a forgotten platform. But, no matter what else came up, I was certified on that platform. Windows 2000, NetWare 6.x, you name it. I never rushed to get my certifications upgraded because I knew what the reality was. I got certified to keep the lights on for my customers. I got certified to help the people that believed in my skills. That’s the real value of a certification to me. Not sales. Just keeping things running another month.


Risking It All

When’s the last time you thought about risk? It’s something we have to deal with every day but hardly ever try to quantify unless we work in finance or another high-stakes field. When it comes to IT work, we take risks all the time. Some are little, like deleting files or emails thinking we won’t need them again. Some are bigger, like deploying software to production or making a change that could take a site down. But risk is a part of our lives. Even when we can’t see it.

Mitigation Revelations

Mitigating risk is the most common thing we have to do when we analyze situations where risk is involved. Think about all the times you’ve had to create a backout plan for a change that you’re checking in. Even having a maintenance window is a form of risk mitigation. I was once involved in a cutover for a metro fiber deployment that had to happen between midnight and 2 am. When I asked why, the tech said, “Well, we don’t usually have any problems, but sometimes there’s a hiccup that takes the whole network down until we fix it. This way, there isn’t as much traffic on it.”

Risk is easy to manage when you compartmentalize it. That’s why we’re always trying to push risk aside or contain the potential damage from it. In some cases, like a low-impact office that doesn’t rely on IT, risk is minimal. Who cares if we deploy a new wireless AP or delete some files? The impact is laughable if one computer goes offline. For other organizations, like financial trading or healthcare, the risks of downtime are far greater. Things that could induce downtime, such as patches or changes, must be submitted, analyzed, approved, and implemented in such a way as to ensure there is no downtime and no chance of failure.

Risk behaves this way no matter what we do. Sometimes our risks are hidden because we don’t know everything. Think about bugs in release code, for example. If we upgrade to a new code train to fix an existing bug or implement a new feature, we are assuming the code passed some QA checks at some point. However, we’re still accepting the risk that the new code will contain a bug that is worse or will force us to deal with new issues down the road. That’s why new code upgrades come with even more stringent risk mitigation, such as bake-in periods or redundancy requirements before being brought online. Those protections exist because the risk never fully goes away.

Invisible Investments In Problems

But what about risks that we don’t know about? What if those risks were quietly minimized before we ever had a chance to look for them?

For example, when I had LASIK surgery many years ago, I was handed a pamphlet that was legally required to be handed to me. I read through the procedure, which included a risk of possible side effects or complications. Some of them were downright scary, even with a low percentage chance of occurring. I was told I had to know the risks before proceeding with the surgery. That way, I knew what I was getting into in case one of those complications happened.

Now, legal reasons aside, why would the doctor want to inform me of the risks? It makes it more likely that I’m not going to go through with the procedure if there are significant risks. So why say anything at all unless you’re forced to? Many of you might say that the doctor should say something out of the goodness or morality of their own heart, but the fact that a law exists requiring disclosure should tell you something about the goodness in people’s hearts.

Medical providers are required to reveal risk. So are financial planners and companies that provide forward-looking statements. But when was the last time that a software company disclosed potential risk in their platform? After all, their equipment or programs could have significant issues if they go offline or are made to break somehow. What if there is a bug that allows someone to hack your system or crash your network? Who assumes the risk?

If your provider doesn’t tell you about the risks or tries to hide them in the sales or installation process, they’re essentially accepting the unknown risk on your behalf. If they know there is a bug in the code that could cause a hospital core network to melt down, or that a server needs to be rebooted every 180 days like we had to do with an unpatched CallManager 6.0 server, then they’ve accepted that risk silently and passed it along to you. And if you think you’re going to be able to sue them or get compensation back from them, you really need to read those EULAs that you agree to when you install things.

Risky Responsibility

The question now becomes about the ultimate responsibility. These are the “unknown unknowns”. We can’t ask about things we don’t know about. How could we? So it’s up to the people with knowledge of the risk to disclose it. In turn, that opens them up to some kinds of risk too. If my way of mitigating the risk in your code is choosing not to purchase your product, you have to understand that walking away was the less risky route for me. Sure, disclosing is riskier for you, but hiding the problem could lead to a much bigger exposure.

Risk is a balance. In order to have the best balance we need as much information as possible so we can put plans in place to mitigate it to our satisfaction. Some risks can’t be entirely mitigated. But failing to disclose risks because it might cost a sale or stall a technology rollout is a huge issue. And if you find out that it happened to you, then you absolutely need to push back on it. Because letting someone else accept risk on your behalf in secret will only lead to problems down the road.


Tom’s Take

Every change I checked into a network during production hours was a risk. Some of them were minor. Others were major. And still others were the kind that burned me. I accepted those risks for myself, but I always made sure to let the customer I was working for know about them. There was no use in hiding information about a change that could take down the network or delete data. Sure, it sometimes meant holding off on the change or finding an alternative method. But the alternative was hurt feelings at best and legal troubles at worst. Risk should never be a surprise.

Wi-Fi 6 Is A Stupid Branding Idea

You’ve probably seen recently that the Wi-Fi Alliance has decided to rebrand the forthcoming 802.11ax standard as “Wi-Fi CERTIFIED 6”, henceforth referred to as “Wi-Fi 6”. This branding decision happened late in 2018 and seems to be picking up steam in 2019 as 802.11ax comes closer to ratification later this year. With manufacturers shipping 11ax access points already and the consumer market poised to explode with the adoption of a new standard, I think it’s time to point out to the Wi-Fi Alliance that this is a dumb branding idea.

My Generation

On the surface, the branding decision looks like it makes sense. The Wi-Fi Alliance wants to make sure that consumers aren’t confused about which wireless standard they are using. 802.11n, 802.11ac, and 802.11ax are all usable and valid infrastructure that could be in use at any one time: 11n lives mostly in 2.4GHz, 11ac is 5GHz-only, and 11ax encompasses both bands. According to the alliance, there will be a number displayed on the badge of the connection to denote which generation of wireless the client is using.
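If it helps to see the scheme laid out, here’s a rough sketch of the generation-to-standard mapping (an illustrative summary, with band details simplified to common deployments):

```python
# Rough sketch of the Wi-Fi Alliance generation names and the IEEE
# standards behind them. Band details simplified to common deployments.
WIFI_GENERATIONS = {
    "Wi-Fi 4": {"standard": "802.11n",  "bands_ghz": (2.4, 5)},  # dual-band on paper, mostly 2.4GHz in practice
    "Wi-Fi 5": {"standard": "802.11ac", "bands_ghz": (5,)},      # 5GHz only
    "Wi-Fi 6": {"standard": "802.11ax", "bands_ghz": (2.4, 5)},  # both bands
}

for name, info in WIFI_GENERATIONS.items():
    bands = " and ".join(f"{b}GHz" for b in info["bands_ghz"])
    print(f"{name}: {info['standard']} in {bands}")
```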

Except, it won’t be that simple. Users don’t care about speeds. They care about having the biggest possible number. They want that number to be a 6, not a 5 or a 4. Don’t believe me? AT&T released an update earlier this month that replaced the 4G logo with a 5G logo even when 5G service wasn’t around. Just so users could say they had “5G” and tell their friends.

Using numbers to denote generations isn’t a new thing in software either. We use version numbers all the time. But using those version numbers as branding usually leads to backlash in the community. Take Fibre Channel, for example. Brocade first announced they would refer to 16Gb Fibre Channel as “Gen 5”, owing to the fifth generation of the protocol. Gen 6 was 32Gb and so on. But, as the chart on this Fibre Channel page shows, they worked themselves into a corner. Gen 7 is both 64Gb and 256Gb depending on what you’re deploying. Even Gen 6 was both 32Gb and 128Gb. It’s confusing because the name could mean many things depending on what you wanted it to mean. Branding doesn’t convey clear information.

Subversion of Versions

The Wi-Fi Alliance decision also doesn’t leave room for expansion or differentiation. For example, as I mentioned in a previous post on Gestalt IT, 802.11ax doesn’t make the new OWE spec mandatory. It’s up to the vendors to implement this spec as well as other things that have been made optional, as upstream MU-MIMO is rumored to become as well. Does that mean that if I include both of those optional features, my product is Wi-Fi 6.1? Or could I even call it Wi-Fi 7 since it’s really good?

Windows has had this problem going all the way back to Windows 3.0. Moving to Windows 3.1 was a huge upgrade, but the point release didn’t make it seem that way. After that, Microsoft started using branding names by year instead of version numbers. But that still caused issues. Was Windows 98 that much better than 95? Were they both better than Windows NT 4? How about 2000? Must be better, right? Better than Windows 99?¹

Windows even dropped the version numbers for a while with Windows XP (version 5.1) and Windows Vista (version 6.0) before coming back to versioning again with Windows 7 (version 6.1) and Windows 8 (version 6.2), before just saying to hell with it and making Windows 10 (version 10.0). Which, according to rumor, was chosen because too much legacy code assumed any consumer Windows version starting with ‘9’ was Windows 95 or 98.
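If the rumor is true, the offending checks would have looked something like this hypothetical sketch, where a would-be “Windows 9” matches the legacy test by accident:

```python
# Hypothetical sketch of the rumored legacy check: detecting Windows 95/98
# by testing whether the OS name starts with "Windows 9".
def is_legacy_consumer_windows(os_name: str) -> bool:
    return os_name.startswith("Windows 9")

print(is_legacy_consumer_windows("Windows 95"))  # True, as intended
print(is_legacy_consumer_windows("Windows 98"))  # True, as intended
print(is_legacy_consumer_windows("Windows 9"))   # True, by accident
```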

See the trouble that versioning causes when it’s not consistent? What happens if the next minor revision to the 802.11ax specification doesn’t justify moving to Wi-Fi 7? Do you remember how confusing it was for consumers when we would start talking about the difference between 11ac Wave 1 and Wave 2? Did anyone really care? They just wanted the fastest stuff around. They didn’t care what wave or what version it was. They just bought what the sticker said and what the guy at Best Buy told them to get.

Enterprise Nightmares

Now, imagine the trouble that the Wi-Fi Alliance has potentially caused for enterprise support techs with this branding decision. What will we say when users call in and say their wireless is messed up because they’re running Windows 10 but their Wi-Fi is still on 6? Or when their cube neighbor has a 6 on their Wi-Fi but their Mac doesn’t?²

Think about how problematic it’s going to be when you try to figure out why someone is connecting to Wi-Fi 5 (802.11ac) instead of Wi-Fi 6 (802.11ax). Think about the fights you’re going to have about why we need to upgrade when it’s just one number higher. You can argue power savings or better cell sizes or more security all day long. But the jump from 5 to 6 really isn’t that big, right? Can’t we just wait for 7 and make a really big upgrade?


Tom’s Take

I think the Wi-Fi Alliance tried to do the right thing with this branding. But they did it in the worst way possible. There are going to be tons of identity issues with 11ax and Wi-Fi 6 and all the things that are going to be made optional in order to get the standard ratified by the end of the year. We’re going to get locked into a struggle to define what Wi-Fi 6 really entails while trying not to highlight all the things that could potentially be left out. In the end, it would have been better to just call it 11ax and let users do their homework.


  1. You’d be shocked at the number of times I heard it called that on support calls ↩︎
  2. You’d better believe Apple isn’t going to mar the AirPort icon in the menu bar with any stupid numbers ↩︎

iPhone 11 Plus Wi-Fi 6 Equals Undefined?

I read a curious story this weekend based on a supposed leak about the next iPhone, currently dubbed the iPhone 11¹. There’s a report that the next iPhone will have support for the forthcoming 802.11ax standard. The article refers to 802.11ax as Wi-Fi 6, which is a catchy branding exercise that absolutely no one in the tech community is going to adhere to.

In case you aren’t familiar with 802.11ax, it’s essentially an upgrade of the existing wireless protocols to support better client performance and management across both 2.4GHz and 5GHz. Unlike 802.11ac, which was retroactively rebranded as Wi-Fi 5 and operates only in 5GHz, 802.11ax works in both bands, just as 802.11n (now Wi-Fi 4) did before it. There are a lot of great things on the drawing board for 11ax coming soon.

Why did I say soon? Because, as of this writing, 11ax isn’t a ratified standard. According to this FAQ from Aerohive, the standard isn’t set to be voted on for final ratification until Q3 of 2019. And if anyone wants to see the standard pushed along faster, it would be Aerohive. They were one of the first companies, if not the first, to bring an 802.11ax access point to the market. So they certainly want the equipment they’re shipping to be standard.

Making pre-standard access points isn’t anything new to the market. Manufacturers have been trying to get ahead of the trends for a while now. I can distinctly remember being involved in IT when 802.11n was still in the pre-standard days. One of our employees brought in a Belkin Pre-N AP and client card and wanted us to get it working because, in his words, “It will cover my whole house with Wi-Fi!”

Sadly, we ended up having to ditch this device once the 802.11n standard was finalized. Why? Because Belkin had rushed it to the market and tried to capitalize on the fervor of people wanting fast connection speeds. The AP only worked with the PCMCIA client card sold by Belkin. Once you started to see ratified 802.11n devices they were incompatible with the Belkin AP and fell back to 802.11g speeds.

Belkin wasn’t the only manufacturer that was trying to get ahead of the curve. Cisco also pushed out the Aironet 1250, which had detachable lobes that could be pulled off and replaced. Why? Because they were shipping a draft 802.11n piece of hardware. They claimed that anyone purchasing the draft-spec hardware could send in the lobes and get an upgrade to ratified hardware as soon as it was finalized. Except, as a rushed product, the 1250 also consumed lots of power, ran hot, and generally had very low performance compared to the APs that came out after the ratification process was completed.

We’re seeing the same rush again with 802.11ax. Everyone wants to have something new when the next refresh cycle comes up. Instead of pushing people toward the stable performance of 802.11ac Wave 2 with proper design they are going out on a limb. Manufacturers are betting on the fact that their designs are going to be software-upgradable in the end. Which assumes there won’t be any major changes during the ratification process.

Cupertino Doesn’t Guess

One of the major criticisms of 802.11ax is that there isn’t any widespread client adoption out there to push us toward needing 802.11ax APs. The client vs. infrastructure argument is always a tough one. Do you make the client adapter and hope that someone will eventually come out with hardware to support it? Or do you choose to instead wait for the infrastructure to jump up in speed and then buy a client adapter to support it?

I’m usually one revision behind. My home hardware is running 802.11ac Wave 2 currently, but my devices were 11ac capable long before I installed any Meraki or Ubiquiti equipment. So my infrastructure was playing catchup with my clients. But not everyone runs the same gear that I do.

One of the areas where this is more apparent is not in the Wi-Fi realm but instead in the carrier space. We’re starting to hear that carriers like AT&T are deploying 5G in many cities even though there aren’t many 5G capable handsets. And, even when the first 5G handsets start hitting the market, the smart money says to avoid the first generation. Because the first generation is almost always hot, power hungry, and low performing. Sound familiar?

You want to know who doesn’t bet on non-standard technology? Apple. Time and again, Apple has chosen to take a very conservative approach to introducing new chipsets into their devices. And while their Wi-Fi chipsets often see upgrades long before their cellular modems do, you can guarantee that they aren’t going to make a bet on non-standard technology that could potentially hamper adoption of their flagship mobile device.

A Logical Approach

Let’s look at it logically for a moment. Let’s assume that the standards bodies get off their laurels and kick into high gear to get 802.11ax ratified at the end of Q2. That’s just after Apple’s WWDC. Do you think Apple is going to wait until post-WWDC to decide what chipsets are going to be in the new iPhone? You bet your sweet bandwidth they aren’t!

The chipset decisions for the iPhone 11 are being made right now in Q1. They want to know they can get sufficient quantities of SoCs and modems by the time manufacturing has to ramp up to have them ready for stores in October. That means you can’t guess whether or not a standard is going to be approved in time for launch. Q3 2019 is during the iPhone announcement season. Apple is the most conservative manufacturer out there. They aren’t going to stake their connectivity on an unproven standard.

So, let’s just state it emphatically for the search engines: The iPhone 11 will not have 802.11ax, or Wi-Fi 6, support. And anyone trying to tell you differently is trying to sell you a load of marketing.

The Future of Connectivity

So, what about the iPhone XII or whatever we call it? That’s a more interesting discussion. And it hinges on something I heard in a recent episode of a new wireless podcast. The Contention Window was started by my friends Tauni Odia and Scott Lester. In Episode 1, they have their big 2019 predictions. Tauni predicted that 802.11ax won’t be ratified in 2019. I agree with her assessment. Despite the optimism of the working group these things tend to take longer than expected. Which means Q4 2019 or perhaps even Q1 2020.

If 802.11ax ratification slips into 2020 you’ll see Apple taking the same conservative approach to adoption. This is especially true if the majority of deployed infrastructure APs are still pre-standard. Apple would rather take an extra year to get things right and know they won’t have any bugs than to rush something to the market in the hopes of selling a few corner-case techies on something that doesn’t have much of an impact on speeds in the long run.

However, if the standards bodies prove us all wrong and push 11ax ratification through, we should see it in the iPhone X+2. A mature technology with proper support should be seen as a winner. But you should see that move telegraphed far in advance, with 11ax radios showing up in the MacBook Pro first. Once the bigger flagship computing devices get support, it will trickle down. This is just a practical concern. The MacBook has more room in the case for a first-gen 11ax chip. Looser thermal tolerances and space considerations mean more room to make mistakes.

In short: Don’t expect an 11ax (or Wi-Fi 6) chip before 2020. And if you’re betting the farm on the iPhone, you may be waiting a long time.


Tom’s Take

I like the predictions of professionals with knowledge over leaks with dubious marketing value. The Contention Window has lots of good information about why 802.11ax won’t be ratified any time soon. A report about a leaked report that may or may not be accurate holds a lot less value. Don’t listen to the hype. Listen to the people who know what they’re talking about, like Scott and Tauni. And don’t stress about having the newest, fastest wireless devices in your house. Odds are way better that you’ll be buying a new AP for Christmas this year than that your next iPhone will support 802.11ax. But the one thing we can all agree on: Wi-Fi 6 is a terrible branding decision!


  1. Or I suppose the XI if you’re into Roman numerals ↩︎

What Makes IoT A Security Risk?

IoT security is a pretty hot topic in today’s world. That’s because the increasing number of smart devices is causing headaches for security professionals everywhere. Consumer IoT devices are expected to top 20 billion by 2020. And each of these smart devices represents an attack surface. Or does it?

Hello, Dave

Adding intelligence to a device increases the number of ways that it can be compromised. Take a simple thermostat, for example. The most basic thermostat is about as dumb as you can get. It uses the expansion properties of metal to trigger switches inside of the housing. You set a dial or a switch and it takes care of the rest. Once you start adding things like programmability or cloud connection, you increase the number of ways that you can access the device. Maybe it’s a webpage or an app. Maybe you can access it via wireless or Bluetooth. No matter how you do it, it’s more available than the simple version of the thermostat.

What about industrial IoT devices? The same rule applies. In this case, we’re often adding remote access to Supervisory Control And Data Acquisition (SCADA) systems. There’s a big market among enterprise IT providers for secured equipment that allows access to existing industrial gear from centralized control dashboards. It makes these devices “smart” and easier to manage.

Industrial IoT has the same kind of issues that consumer devices do. We’re increasing the number of access avenues to these devices. But does that mean they’re a security risk? The question could be as simple as asking if the devices are any easier to hack than their dumb counterparts. If that is our only yardstick, then the answer is most assuredly yes, they are a security risk. My dumb fridge doesn’t give anyone a way to access it over the internet. Install an operating system on it and connect it to the wireless network in my house, and I’ve increased the attack surface.
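If you want to make the attack surface idea concrete, here’s a minimal sketch that counts the listening TCP ports on a device you own; every open port is one more avenue in. The address is a placeholder from the documentation range, not a real device:

```python
# Minimal sketch: enumerate open TCP ports on a device you own as a
# crude proxy for its network attack surface. More listeners, more doors.
import socket

def open_ports(host: str, ports=range(1, 1025), timeout: float = 0.5) -> list[int]:
    found = []
    for port in ports:
        try:
            # A successful connect() means something is listening there
            with socket.create_connection((host, port), timeout=timeout):
                found.append(port)
        except OSError:
            pass  # closed, filtered, or unreachable
    return found

if __name__ == "__main__":
    # 192.0.2.20 is a placeholder address (RFC 5737 documentation range)
    print(open_ports("192.0.2.20"))
```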

Another good example of this increasing attack surface is in home devices that aren’t consumer focused. Let’s take a look at the electrical grid. Our homes are slowly being upgraded with so-called “smart” electrical meters that allow us to have more control over power usage in our homes. They also allow the electric companies to monitor the devices more closely and read the meters remotely instead of needing to dispatch humans to read them. These smart meters often operate on Wi-Fi networks for ease of access. If all we do is add the meters to a wireless network, are we really creating security issues?

Bigfoot-Sized Footprints

No matter how intelligent the device, increasing the avenues of access to it creates security issues. A good example of this is the “hidden” diagnostic port on the original Apple Watch. Even though the port had no real use beyond internal diagnostics at Apple, it was a tempting target for people to try and get access to the system. Sometimes these hidden ports can dump hidden data or give low-level access to areas of the system that aren’t normally available. While the Apple Watch port didn’t offer this kind of access, other devices can.

Every avenue of access to a device is a potential way to attack it and reach data you’re not supposed to have. Sure, a smart speaker is a very simple device. But what if someone found a way to remotely capture its data stream? Or the recording buffer? Most smart speakers are monitoring audio, listening for their trigger word to take commands. Normally this data stream is dumped. But what if someone found a way to reconstruct it? Do you think that could qualify as a hack? All it takes is an enterprising person figuring out how to get low-level access. And before you say it’s impossible, remember that we allow access to these devices in other ways. It’s only a matter of time before someone finds a hole.

As for industrial machines, these are even more tempting. By gaining access to the master control systems, you can cause considerable havoc with their programming. You can shut down all manner of industrial devices. Stuxnet was a great example of a very specific piece of malware designed to cause problems for a specific kind of industrial equipment. Because of the nature of the program, it was very difficult to figure out exactly what was causing the issues. All it took was access to the systems, which was reportedly gained by hiding the program on USB drives and seeding them in parking lots, where they were picked up and plugged in at the target facilities.

IoT devices, whether consumer or enterprise, represent potential threat vectors. You can’t simply assume that a simple device is safe because there isn’t much to hack. The Mirai botnet exploited bad password hygiene to take over devices en masse. It wasn’t a complicated silicon-level hack or a coordinated nation-state effort. It was the result of someone exploiting hard-coded default passwords for their own needs. Smart devices can be made to make dumb decisions when used improperly.
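To see how low the bar was, here’s a minimal sketch of the kind of check Mirai automated, framed as a defensive audit against a device you own. The host address and the credential list are illustrative assumptions, not Mirai’s actual dictionary:

```python
# Illustrative sketch: test whether a telnet-exposed device you own still
# accepts factory-default credentials, the weakness Mirai automated.
import socket

DEFAULT_CREDS = [("admin", "admin"), ("root", "root"), ("root", "12345")]  # example pairs

def accepts_default_creds(host: str, port: int = 23, timeout: float = 5.0) -> bool:
    for user, password in DEFAULT_CREDS:
        try:
            with socket.create_connection((host, port), timeout=timeout) as s:
                s.recv(1024)                          # consume the login banner/prompt
                s.sendall(user.encode() + b"\r\n")
                s.recv(1024)                          # consume the password prompt
                s.sendall(password.encode() + b"\r\n")
                reply = s.recv(1024).lower()
                # Crude heuristic: another "login:" prompt or an "incorrect"
                # message means rejection; anything else likely means a shell.
                if b"incorrect" not in reply and b"login" not in reply:
                    return True
        except OSError:
            continue  # refused, reset, or timed out: try the next pair
    return False

if __name__ == "__main__":
    # 192.0.2.10 is a placeholder address (RFC 5737 documentation range)
    print(accepts_default_creds("192.0.2.10"))
```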


Tom’s Take

IoT security is both simple and hard at the same time. Securing these devices is a priority for your organization. They may never be compromised, but you have to treat them just like you would any other device that could potentially be hacked and turned against you. Zero-trust security models are a great way to account for this, but you need to make sure you’re not overlooking IoT when you build that model. Because the invisible devices helping us get our daily work done could quickly become the vector for hacking attacks that bring our day to a grinding halt.

2019 Is The King of Content

2018 was a year full of excitement and fun. And for me, it was a year full of writing quite a bit. Not only did I keep up my writing here for my audience, but I also wrote quite a few posts for GestaltIT.com. You can find a list of all the stuff I wrote right here. I took a lot of briefings from up-and-coming companies, talked to some other great companies, and wrote a couple of series about SD-WAN.

It was also a big year for the Gestalt IT Rundown. My co-host with the most, Rich Stroffolino (@MrAnthropology), and I had a lot of fun looking at news from enterprise IT and some other fun chipset and cryptocurrency news. And I’ve probably burned my last few bridges with Larry Ellison and Mark Zuckerberg to boot. I look forward to recording these episodes every Wednesday and I hope that some of you will join us on the Gestalt IT Facebook page at 12:30 EST as well.

Content Coming Your Way

So, what does that leave in store for 2019? Well, since I hate predictions on an industry scale, that means taking a look at what I plan on doing for the next year. For the coming 365 days, that means creating a lot of content for sure. You already know that I’m going to be busy with a variety of fun things like Networking Field Day, Mobility Field Day, and Security Field Day. That’s in addition to all the things that I’m going to be doing with Tech Field Day Extra at Cisco Live Europe and Cisco Live US in San Diego.

I’m also going to keep writing both here and at Gestalt IT. You probably saw my post last week about how hard it is to hit your deadlines. Well, there’s going to be a lot of writing coming out in both places, thanks to coverage of briefings that I’m taking from industry companies as well as a few think pieces about bigger trends going on in the industry.

I’m also going to experiment more with video. One of the inspirations that I’m looking at is none other than my good friend Ethan Banks (@ECBanks). He’s had some amazing video series that he’s been cranking out on his daily walks. He’s been collecting some of them in the Brain Spasms playlist. It’s a really good listen and he’s tackling some fun topics so far. I think I’m going to try my hand at some solo video content in the future at Gestalt IT. This blog is going to stay a written affair for the time being.

Creating Content Quickly

One of the other things that I’m playing around with is the idea of being able to create content much more quickly and on the spot versus sitting down for long form discussions. You may recall from a post in 2015 that I’ve embraced using Markdown. I’ve been writing pretty consistently in Markdown for the past three years and it’s become second nature to me. That’s a good thing for sure. But for 2019, I’m going to branch out a bit.

The biggest change is that I’m going to try to do the majority of my writing on an iPad instead of my laptop. This means that I can just grab a tablet and type out some words quickly. It also means that I can take notes on my iPad and then immediately translate them into thoughts and words. I’m going to do this using iA Writer as my content creation tool. It’s going to help me with my Markdown as well as help me keep all the content I write organized. I’m going to force myself to use this new combination unless there’s no way I can pull it off, such as with my Cisco Live Twitter list. That whole process still relies quite a bit on code and not on Markdown.

As I mentioned in my deadline post, I’m also going to try to move my posting dates back from Friday to Wednesday or Thursday at the latest. That gives me some time to play around with ideas and still have a cushion before I’m late with a post. On the big days I may still have an extra post here or there to talk about some big news that’s breaking. I’m hoping this allows me to get some great content out there and keep the creative juices flowing.


Tom’s Take

2019 is going to be a full year. But it allows me to concentrate on the things that I love and am really good at doing: writing and leading Tech Field Day. Maybe branching out into video is going to give me a new avenue as well, but for now that’s going to stay pretty secondary to the writing aspect of things. I really hope that having a more mobile writing studio also helps me get my thoughts down quickly and create some more compelling posts in the coming year. Here’s hoping it all works out and I’ve got some great things to look back on in 365 days!


Meeting Your Deadlines Is Never Easy

2018 has been a busy year. There’s been a lot going on in the networking world and the pace of things keeps accelerating. I’ve been inundated with things this last month, including endless requests for my 2019 predictions and where I think the market is going. Since I’m not a prediction kind of person, I wanted to take just a couple of moments to talk more about something that I did find interesting from 2018 – deadlines.

Getting It Out The Door

Long-time readers of this blog may remember that I’ve always had a goal set for myself of trying to get one post published every week. It’s a deadline I set for myself to make sure that I didn’t let my blog start decaying into something that is barely updated. I try to hold fast to my word and get something new out every week. Sometimes it’s simple, like reflections on one of the various Tech Field Day events that I’m working on that week. But there’s always something.

That is, until Cisco Live this year. I somehow got so wrapped up in things that I missed a post for the first time in eight years! Granted, this was the result of several things going on at once:

  1. I was running Tech Field Day Extra during Cisco Live. So I was working my tail off the entire time.
  2. I was at Cisco Live, which is always a hugely busy time for me. Even when I’m not doing something specific to the event it’s social hour every hour.
  3. This year I’ve normally written posts on Thursday afternoon to publish on Friday. Guess what happened on Thursday at Cisco Live after we all said goodbye? I went on vacation with my family to Disney World. So I kind of forgot that I hadn’t published anything until Sunday afternoon.

The perfect confluence of factors led to me missing a deadline. Now I’ve missed it once more this year, having totally forgotten to write something until the Monday following my deadline. And it’s even more frustrating because it’s something I totally could have controlled but didn’t.

Why the fuss? I mean, it’s not like all my readers are going to magically run away if I don’t put something out today or tomorrow. While that is very true, it’s more for me that I don’t want to forget to put content out. More than any other thing, scheduling your content is the key to keeping your readers around.

Think about network television. For years, they advertised their timeslots as much as they advertised their shows. Must-See Thursday. TGIF. Each of these may conjure images of friendly shows or of full houses. But you remember the day as much as you remember the shows, right? That’s because the schedule became important. If you don’t think that matters, imagine the shows that are up against big events or keep getting bumped because of sporting events. There’s a reason why Sunday evening isn’t a good time for a television show. Or why no one tries to put something up against the Super Bowl.

Likewise, schedules are important for blogging. I used to just hit publish on my posts whenever I finished them. That meant sending them out at 9pm on a Tuesday sometimes. Not the best time for people to want to dive into a technical post. Instead, I started publishing them in the mornings after I wrote them. That means more eyeballs and more time for people to reflect on them. I’ve always played around with the daily schedule of when to publish, but in 2018 it got pushed to Friday out of necessity. I kept running out of time. Instead of focusing on the writing, I would often wake up Friday morning with writer’s block and just churn something out to hit my deadline.

Writing because you have to is not fun. Wracking your brain to come up with some topic of conversation is stressful. Lee Badman has been posting questions every weekday morning to the wireless community for a long while and he’s decided that it’s run its course. I applaud Lee for stepping away from something like that before it became a chore. It’s not easy to leave something behind that has meant a lot to you.

Write Like The Wind

For me, blogging is still fun. I still very much enjoy sitting down in front of a computer keyboard and getting some great thoughts out there. I find my time at Tech Field Day events has energized my writing to a large degree because there is so much good content out there that needs to be discussed and indexed. I still enjoy pouring my thoughts out onto a piece of digital paper for everyone to read.

Could I cut back to simple reaction posts? Sure. But that’s not my style. I started blogging because I like the long-form of text. I’ve written some quick sub-500 word pieces because I needed to get something out. But those are the exceptions to the rule. I’d rather keep things thoughtful and encourage people to spend more time focusing on words.

I think the biggest thing that I need to change is the posting dates. I need to move back from Friday to give myself some headroom to post. I also need to use Friday as my last-ditch day to get things published. That may mean putting more thought into my posts earlier in the week for sure. It may also mean having two posts on weeks when big news breaks. But that’s the life of a writer, isn’t it?

Home Away From Home

The other big challenge for deadlines is all the other writing that I’m doing. I spend a lot of time taking briefings and such for Gestalt IT, which I affectionately refer to as my “Bruce Wayne” job. I get to hear a lot of fun stories and see a lot of great companies just starting out in the world. I write a lot over there because it’s how I keep up with the industry. Remember that year that I went crazy and wrote two posts every week for an entire year? Yeah, good times. Guess what? It’s going to be like that again!

Gestalt IT is going to be my writing outlet for most of my briefings and coverage of companies. It’s going to have a much different tone than this blog does. Here is where I’m going to spend more time pontificating and looking at big trends in technology. Or perhaps stirring the pot. But I still plan on getting out one post a week about some topic. And I won’t be posting it on Friday unless I absolutely have to.


Tom’s Take

It’s no stretch to say that writing is something I do better than anything else. It’s also something I love to do. I want to do my best to keep bringing good content to everyone out there that likes to read my blog. I’m going to spend some time exploring new workflows and trying to keep the hits coming along as 2019 rolls around. I’ll have more to say on that in my usual January 1 post to kick off the new year!