Gathering No MOS


If you work in the voice or video world, you’ve undoubtedly heard about Mean Opinion Scores (MOS). MOS is a rough way of ranking the quality of the sound on a call. It’s widely used to determine the overall experience for the user on the other end of the phone. MOS represents something important in the grand scheme of communications. However, MOS is quickly becoming a crutch that needs some explanation.

That’s Just Like Your Opinion

The first thing to keep in mind when you look at MOS data is that the second word in the term is opinion. Originally, MOS was derived by having selected people listen to calls and rank them on a scale of 1 (I can’t hear you) to 5 (We’re sitting next to each other). The idea was to see if listeners could distinguish when certain aspects of the call were changed, such as pathing or exchange equipment. It was an all-or-nothing ranking. Good calls got a 4 or, rarely, a 5. Terrible calls got a 2 or a 3. You take the average across all your subjects and that gives you the overall MOS for your system.


When digital systems came along, MOS took on an entirely different meaning. Rather than being used to subjectively rank call quality, MOS became a yardstick for tweaking the codecs used to transform analog speech into digital packets. Since this has to happen in order for the data to be sent, all digital calls must have a codec somewhere. The first codecs were trying to approximate the quality of a long distance phone call, which was the gold standard for quality. After that target was reached, providers started messing around with the codecs in question to reduce bandwidth usage.

G.711 is considered the baseline level of call quality from which all others are measured. It has a relative MOS of 4.1, which means very good voice quality. It also uses around 64 kbps of bandwidth. As developers started playing with encoding schemes and other factors, they started developing codecs which used significantly less bandwidth and had almost equivalent quality. G.729 uses only 8 kbps of bandwidth but has a MOS of 3.9. It’s almost as good as G.711 in most cases but uses an eighth of the resources.

MOS remained a purely subjective measure until VoIP system providers found that certain network metrics have a measurable impact on the quality of a call. Packet loss, delay, and jitter all degrade call quality. By measuring these values, a system can give an approximation of MOS for an admin without the hassle of making people actually listen to the calls. That data can then be provided through analytics dashboards as an input into the overall health of the system.
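If you’re curious what that approximation looks like under the hood, the sketch below is a rough Python take on the simplified E-model from ITU-T G.107: one-way delay and packet loss become impairment factors, those get subtracted from a base score, and the result is mapped onto the familiar 1–5 MOS scale. The codec constants (ie and bpl) are illustrative placeholders rather than values pulled from the standard, so treat this as a toy, not a monitoring tool.

```python
def estimate_r_factor(one_way_delay_ms, packet_loss_pct, ie=0.0, bpl=10.0):
    """Very simplified E-model in the spirit of ITU-T G.107.

    ie and bpl are codec-specific impairment constants; the defaults here
    are illustrative placeholders, not values from the standard.
    """
    r0 = 93.2  # commonly cited default base R-factor
    d = one_way_delay_ms
    # Delay impairment: a widely used simplification of the G.107 Id term
    i_d = 0.024 * d + (0.11 * (d - 177.3) if d > 177.3 else 0.0)
    # Packet-loss impairment, assuming random (non-bursty) loss
    i_e_eff = ie + (95.0 - ie) * packet_loss_pct / (packet_loss_pct + bpl)
    return r0 - i_d - i_e_eff


def r_to_mos(r):
    """Map an R-factor onto an estimated MOS using the standard conversion curve."""
    if r <= 0:
        return 1.0
    if r >= 100:
        return 4.5
    return 1.0 + 0.035 * r + r * (r - 60.0) * (100.0 - r) * 7e-6


# Example: 80 ms of one-way delay and 1% packet loss
r = estimate_r_factor(80, 1.0)
print(round(r, 1), round(r_to_mos(r), 2))
```

With these placeholder numbers, 80 ms of delay and 1% loss lands in the low 4s, which is the same neighborhood as the codec scores above.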

Like A Rolling Stone

The problem with MOS is that it has always been subjective. Two identical calls may have different MOS scores based on the listener. Two radically different codecs could have similar MOS scores because of simple factors like tonality or speech isolation. Using a subjective ranking matrix to display empirical data is unwieldy at best. The only reason to use MOS as a yardstick is because everyone understands what MOS is.

Enter R-values. R-values take inputs from the same monitoring systems that feed MOS and score them on a scale of 1 – 100. Those scores offer far more precision for determining call quality and VoIP network health. A call in the 90s is a great call. If things dip into the 70s or 60s, there are major issues to identify. R-values solve the problem of trying to bolt empirical data onto a subjective system.
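For voice, turning an R-value into the kind of simple dashboard rollup I mention below is straightforward. The cutoffs in this little sketch are just the rough bands described above, not anything pulled from a standard:

```python
def classify_r_value(r):
    """Bucket an R-value into a dashboard color; cutoffs are illustrative."""
    if r >= 90:
        return "green"   # great call
    if r >= 80:
        return "yellow"  # acceptable, but worth watching
    return "red"         # dipping into the 70s or 60s means major issues
```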

Now that communications is becoming more and more focused on things like video, the need for analytics around it is becoming more pronounced. People want to track the same kinds of metrics – codec quality, packet loss, delay, and jitter. But there isn’t a unified score that can be presented in green, yellow, and red to let people know when things are hitting the fan.

It has been suggested that MOS be adapted to reference video in addition to audio. While the idea behind using a traditional yardstick like MOS sounds good on the surface, the reality is that video is a much more complicated thing that can’t be encompassed by a 50-year-old ranking method like MOS.

Video calls can look horrible and sound great. They can have horrible sound and be crystal clear from a picture perspective. There are many, many subjective pieces that can go into ranking a video call. Trying to shoehorn that into a simple scale of 5 values is doing a real disservice to video codec manufacturers, not to mention the network operators that try and keep things running smoothly for video users.

R-value seems to be a much better way to classify analytics for video. It’s much more nuanced and capable of offering insight into different aspects of call and picture quality. It can still provide a ranked score for threshold monitoring, but each point on that scale is far more likely to carry real meaning than the coarse absolute values present in MOS.


Tom’s Take

MOS is an old-fashioned idea that tries valiantly to tie the telecom of old to the digital age. People who understood subjective quality tried to pair it with objective analytics in an effort to keep the old world and the new world matched. But communications itself is starting to outgrow those bounds. Phone calls have given way to email, texting, and video chats. Two of those are asynchronous and require no network reliability beyond up or down. Video, and all the other real-time digital communications, needs to have the right metrics and analytics to provide good feedback about how to improve the experience for users. And whatever we end up calling that composite metric or ranked algorithmic score, it shouldn’t be called MOS. Time to let that term grow some moss in the retirement bin.

 

CCIE Loses Its Voice

The world we live in is constantly adapting and changing to new communications methods.  I can still remember having a party line telephone when I was a kid.  I’ve graduated to using landlines, cellular phones, email, instant messaging, text messaging, and even the occasional video call.  There are more methods to contact people than I can count on both hands.  This change is being reflected in the workforce as well.  People who just a few years ago felt comfortable having a desk phone and simple voice mail are now embracing instant messaging with presence integration and unified voice mail as well as single number reach to their mobile devices.  It’s a brave new world that a voice engineer is going to need to understand in depth.

To that end, Cisco has decided to retire the CCIE Voice in favor of an updated track that will be christened the CCIE Collaboration.  Note that they aren’t merely changing the blueprint like they have in the past with the CCIE SP or the CCIE R&S.  This is like the CCIE Storage being moved aside for the CCIE Data Center.  The radical shift in content of the exam should be a tip-off to the candidates that this isn’t going to be the same old voice stuff with a few new bells and whistles.

Name That Tune

The lab equipment and software list (CCO account required) includes a bump to CUCM 9.1 for the call processor, as well as various 9.x versions of Unity Connection, Presence, and CUCME.  There’s also a UCS C460, which isn’t too surprising with CUCM being a virtualized product now.  The hardware is rounded out with 2921 and 3925 routers as well as a 3750-X switch.  The most curious inclusion is the Cisco Jabber Video for Telepresence.  That right there is the key to the whole “collaboration” focus on this exam.  There is a 9971 phone listed as an item.  I can almost guarantee you’re going to have to make a video call from the 9971 to the video soft client in Cisco Jabber.  That’s all made possible thanks to Cisco’s integration of video in CUCM in 9.1.  This has been their strategy all along.

The CCIE Voice is considered one of the hardest certifications to get, even among the CCIE family.  It’s not that there is any one specific task to configure that just wrecks candidates.  The real issue is the amount of tasks that must be configured.  Especially when you consider that a simple 3-point task to get the remote site dial plan up and running could take a couple of hours of configuration.  Add in the integrated troubleshooting section that requires you to find a problem after you’ve already configured it incorrectly and you can see why this monster is such a hard test.  One has to wonder what adding video and other advanced topics like presence integration into the lab is going to do to the amount of time the candidate has to configure things.  It was already hard to get done in 8 hours.  I’m going to guess it’s downright impossible to do it in the CCIE Collaboration.  My best guess is that you are going to see versions of the test that are video-centric as well as ones that are voice-centric.  There’s going to be a lot of overlap between the two, but you can’t go into the lab thinking you’re guaranteed to get a video lab.

Hitting the Wrong Notes

There also seems to have been a lot of discussion about the retirement of the CCIE Voice track as opposed to creating a CCIE Voice version 4 track with added video.  In fact, there are some documents out there related to the CCIE Collaboration that reference a CCIE Voice v4.  The majority of discussion seems to be around the CCIE Voice folks getting “grandfathered” into a CCIE Collaboration title.  While I realize that the change in the name was mostly driven by the marketing of the greater collaboration story, I still don’t think that there should be any automatic granting of the Collaboration title.

The CCIE Collaboration is a different test.  While the blueprint may be 75% the same, there’s still the added video component to take into account (as well as cluster configuration for multiple CUCM servers).  People want an upgrade test to let the CCIE Voice become a CCIE Collaboration.  They have one already: the CCIE Collaboration lab exam.  If the title is that important, you should take that lab exam and pass it to earn your new credential.  The fact that there is precedent for this with the migration of the Storage track to Data Center shows that Cisco wants to keep the certifications current and fresh.  While Routing & Switching and Security see content refreshes, they are still largely the same at the core.  I would argue that the CCIE Collaboration will be a different exam in feel, even if not in blueprint or technology.  The focus on IM, presence and video means that there’s going to be an entirely different tone.  Cisco wants to be sure that the folks displaying the credential are really certified to work on it according to the test objectives.  I can tell you that there was serious consideration around allowing Storage candidates to take some sort of upgrade exam to get to the CCIE Data Center, but it looks like that was ultimately dropped in favor of making everyone go through the curriculum.  The retirement of the CCIE Voice doesn’t make you any less of a CCIE.  Like it or not, it looks like the only way to earn the CCIE Collaboration is going to be in the trenches.

It Ain’t Over Until…

The sunsetting officially starts on November 20th, 2013.  That’s the last day to take the CCIE Voice written.  Starting the next day (the 21st) you can only take the Collaboration written exam.  Thankfully, you can use either the Voice written or the Collaboration written exam to qualify for either lab.  That’s good until February 13, 2014.  That’s the last day to take the CCIE Voice lab.  Starting the next day (Valentine’s Day 2014), you will only be able to take the Collaboration lab exam.  If you want to get an idea of what is going to be tested on the lab exam, check out the document on the Cisco Learning Network (CCO account required).

If you’d like to read more about the changes from professional CCIE trainers, check out Vik Malhi (@vikmalhi) on IPExpert’s blog.  You can also read Mark Snow’s (@highspeedsnow) take on things at INE’s blog.


Tom’s Take

Nothing lasts forever, especially in the technology world.  New gadgets and methods come out all the time to supplant the old guard.  In the world of communications and collaboration, Cisco is trying to blaze a trail towards business video as well as showing the industry that collaboration is more than just a desk phone and a voice mailbox.  That vision has seen some bumps along the way but Cisco seems to have finally decided on a course.  That means that the CCIE Voice has reached the apex of its potential.  It is high time for something new and different to come along and push the collaboration agenda to its logical end.  Cisco has already created a new CCIE to support their data center ambitions.  I’m surprised it took them this long to bring business video and non-voice communications to the forefront.  While I am sad to see the CCIE Voice fade away, I’m sure the CCIE Collaboration is going to be a whole new barrel of fun.

Cisco Telepresence – The White Glove Treatment

I’ve spent the last week or so working on and training people to use a new Cisco Telepresence Profile 55″.  I like the fact that the whole unit is bundled together and only needs to have a couple of cables routed around and plugged in along with 4-6 screws to join everything together.  One thing that did bother me was that the system shipped with two desk microphones and no microphone cables.  I’m still trying to sort out that mess, but upon further investigation, I uncovered something else that struck me as entirely strange.

The box contains all the accessories that were included in the bundle.  There are the usual microphones (sans cables) and power cables and even a microfiber cleaning cloth.  But what are those white things in the plastic bags?  I wondered that myself.  At first, I thought they may have been bags to keep the microphones in.  After opening one, I found that I was totally wrong…

Uh, Whiskey. Tango. Foxtrot.  Those are indeed white cotton gloves.  The kind that a butler might wear when checking the mansion to ensure that everything is nice and tidy.  They aren’t OEMed from anyone either, as you can tell by the Cisco logo on the tag.  Why on earth are these in the package?  I had to do a little searching to find the answer.

I can’t really tell if these are a holdover from the old Tandberg systems, but I have found references to them in the MX200 and MX300 installation posters.  According to the prose, it looks like you’re supposed to put them on when you begin installing the TV portion of the unit onto the base to ensure that you don’t transfer any oils or other strange things to the unit.  That’s a good idea in theory, but as we all know there is a world of difference between theory and practice.  If you’ve never picked up the monitor portion of a Profile 55″, it’s a 55″ TV surrounded by a metal cage and mounting bracket.  It weighs between 80 and 100 pounds.  It’s not a flimsy thing.  Plus, with all that metal, it’s a very slippery surface bare-handed, let alone if your hands are encased in soft, smooth cotton.  I could barely hold the cardboard box the gloves came in when I had one on.  I can just imagine the whole TV slipping out of my hands when I’m trying to secure it to the base.  Also suspect is the fact that the LCD screen comes out of the shipping container with a big plastic cover taped to the front.  There’s almost no chance of transferring anything onto the screen itself until the cover is peeled away.  Even if you do manage to smudge the metal case, there’s a microfiber cloth in the box too.  Why go to all the trouble of the white butler gloves?

I think more than anything else, this is mostly for appearances.  These things can’t serve any real purpose, and if the people responsible for wasting space in the accessories box feel differently, I invite them to come with me and do a couple of these assemblies and installations.  I can promise you that the gloves will get stripped off and thrown in the same pile as all those little static wrist straps.

Why Won’t AirPlay Work On My Macbook?

One of the major reasons why I decided to upgrade to OS X 10.8 Mountain Lion was for AirPlay mirroring.  AirPlay has been a nice function to have for people with an AirPlay receiver (basically an AppleTV) and an AirPlay source, like an iDevice.  I know of many people that like to watch a movie from iTunes on their iPad to start, then switch over to the big TV in the living room via AirPlay to the AppleTV.  That’s all well and good for those that want to stream movies or music.  However, my streaming needs are a little more advanced.  I’d rather be able to mirror my desktop to the AirPlay receiver instead, for things like presentations or demonstrations.  That functionality had only been available with software applications like AirParrot up until the release of Mountain Lion, which now has support for AirPlay mirroring on Macs.  Once the GM release of Mountain Lion came out, people started noticing that AirPlay mirroring was only supported on relatively new Apple hardware, even in cases where the CPU was almost identical to a later hardware release.  It seems a bit mind-boggling that Apple has a very limited specification list for AirPlay mirroring.  The official site doesn’t even list it, as a matter of fact.  Essentially, any Mac made in 2011 or newer should be capable of supporting AirPlay.  So why did the 2010 Macs get left out?  They’re almost as good as their one-year-newer cousins.

The real answer comes down to the processor.  Apple started shipping Macs with Intel’s Sandy Bridge processors in 2011.  This enabled all kinds of interesting things, like Thunderbolt for instance.  There was one little feature down at the bottom of the Sandy Bridge spec sheets that didn’t mean much at the time – Intel QuickSync.  QuickSync is an application-specific integrated circuit (ASIC) built into the Sandy Bridge line of processors to allow high-speed video encoding and decoding.  Rather than tying up the CPU or the GPU of a machine, Sandy Bridge can offload video encoding to this ASIC and reduce the amount of processor power consumed by video tasks.  Why would this be a boon?  Well, for most people the idea was that QuickSync could reduce the amount of time it took to do video production work on mid-range machines.  The problem was that QuickSync turned out lower quality video in favor of optimization for speed.  Where would you find an application that prioritized speed over quality?  If you guessed video streaming, you’d be spot on.  QuickSync supports high-speed encoding of H.264 video streams, which is the preferred format for Apple.  Mountain Lion can now access the QuickSync ASIC to mirror your desktop over to an AppleTV with almost no video lag.  The quality may not be the same as a Pixar rendering farm, but for 1080p video on a TV it’s close enough.

Any Mac made before the introduction of Sandy Bridge isn’t capable of running AirPlay mirroring, at least according to Apple.  Since they are missing the QuickSync ASIC, they aren’t capable of video encoding at the rate that Apple wants in order to preserve the AirPlay experience.  While on the surface it looks like the same i-series processors are present in 2010 and 2011 machines, the older Macs are using the Clarkdale-era processors, which do have a high-speed video decoder, but not an encoder.  Since the Mac is doing all the heavy lifting for the AppleTV in an AirPlay mirroring setup, having the onboard encoding ASIC is critical.  This isn’t the first time that Apple has locked out use of AirPlay.  If you want to AirPlay mirror from your favorite iDevice, you have to ensure that you’re running an iPhone 4S or an iPad 2 or iPad 3.  What’s different about them?  They’re all running the A5 dual-core chip.  Supposedly, the A5 helps with video-intensive tasks.  That says to me that Apple is big on using hardware to help accelerate video mirroring.  That’s not to say that you can’t do AirPlay mirroring with a pre-2011 Mac.  You’re just going to have to rely on a third party program to do it, like the aforementioned AirParrot.  Take note, though, that AirParrot is going to use your CPU to do all the encoding work for AirPlay.  While that isn’t going to be a big issue for simple presentations or showing your desktop, you should take care if you’re going to do any kind of processor-intensive activity, like firing up a bunch of virtual machines or compiling code.
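If you’re not sure which side of that line a given Intel Mac falls on, one rough trick (my own heuristic, not anything Apple documents) is to look at the CPU model number that sysctl reports.  Sandy Bridge and newer Core i3/i5/i7 parts carry four-digit model numbers like i5-2415M, while the 2010-era parts use three digits or a different naming format entirely.  A quick Python sketch:

```python
import re
import subprocess


def cpu_brand_string():
    """Read the CPU model string on an Intel Mac (sysctl ships with macOS)."""
    return subprocess.check_output(
        ["sysctl", "-n", "machdep.cpu.brand_string"], text=True
    ).strip()


def looks_like_sandy_bridge_or_newer(brand):
    """Rough heuristic: a four-digit Core iN model number implies Sandy Bridge
    or later (and therefore QuickSync); no match or three digits implies older."""
    match = re.search(r"i[357]-(\d{3,4})", brand)
    return bool(match) and len(match.group(1)) == 4


brand = cpu_brand_string()
verdict = "QuickSync-era CPU" if looks_like_sandy_bridge_or_newer(brand) else "pre-Sandy Bridge"
print(brand, "->", verdict)
```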

Tom’s Take

Yes, it’s very irritating that Apple drew the line for AirPlay mirroring support at Sandy Bridge.  As it is with all technology refreshes, being on the opposite side of that line sucks big time.  You’ve got a machine that’s more than capable, yet some design guy said that you can’t hack it any more.  Sadly, these are the kinds of decisions that aren’t made lightly by vendors.  Rather than risk offering incomplete support or producing the kind of dodgy results that make for bad Youtube comparison videos, Apple took a hard line and leaned heavily on QuickSync for AirPlay mirroring support.  In another year it won’t matter much as people will have either upgraded their machines to support it if it’s a crucial need for them, or they’ll let it lie fallow and unused like FaceTime.  If you find yourself asking whether or not your machine can support AirPlay mirroring, just look for a Thunderbolt port.  If you’ve got one, you’re good to go.  Otherwise, you should look into a software solution.  There are lots of good ones out there that will help you out.  Based on Apple’s track record with the iDevices, I wouldn’t hold out hope that they’re going to enable AirPlay mirroring on pre-2011 Macs any time soon.  So, if AirPlay mirroring is something important to you, you’re either going to need to spring for a new Mac or get to work installing some software.

VADD – Video Attention Deficit Disorder

While I was at Cisco Live, I heard a lot about video.  In fact, “video is the new voice” was the center square on my John Chambers Keynote Bingo card.  With the advances that Cisco has been making with the various Jabber clients across all platforms, Cisco really wants to drive home the idea that customers want to use video in every aspect of their life.  This may even be borne out when you think about all the social networks that have been adding video capabilities, such as Facebook or Skype.  Then there’s the new launch of AirTime from the guys that brought you Napster.  AirTime is a social network that is built entirely around video and how you can interact with complete strangers that share your interests.

I started thinking about video and the involvement that it has in everyone’s life today.  It seems that everything has a video-capable camera now.  Mobile phones, tablets, and laptops come standard with them.  They are built into desktop monitors and all-in-one computers.  It seems that video has become ubiquitous.  So too have the programs that we use to display video.  I can remember all the pain and difficulty of trying to set up programs like AIM and Yahoo! Messenger to work with a webcam not all that long ago.  Now we have Skype and Facetime and Google+ Hangouts.  On the business side we have things like Cisco Jabber for Telepresence (formerly Movi) and Webex.  I even have a dedicated video endpoint on my desk.  However, the more I thought about it, the more I realized that I hardly use video in my everyday life.  I’ve done maybe two Facetime calls with my family since my wife and I purchased Facetime-capable devices last year.  My Skype calls never involve video components.  My Webex sessions always have video features muted.  Even my EX90 gathers dust most of the time unless it gets dialed to test a larger Telepresence unit.  If video is so great, why does it feel so neglected?

For me, the key came in an article about AirTime.  In the press conference, the founders talked about how social media today consists of “asynchronous communications”.  We leave messages on walls and timelines.  We get email or instant messages when people try to communicate with us that sit there, beckoning to us to respond.  In some cases, we even have voicemail messages or transcriptions thereof that call for our attention.  The AirTime folks claim that this isn’t a natural method of communication and that video is how we really want to talk to people.  Nuances and body language, not text and typing.  That’s a good and noble goal, but when I thought about how many Facetime devices are out there and how many people I knew with the capability that weren’t really using it, something didn’t add up.  Why does everyone have access to video and yet not want to use it?  Why do we prefer to stick to things like Twitter timelines or instant messages via our favorite service?

I think it’s because people today have Video Attention Deficit Disorder (VADD).  People don’t like using video because it forces them to focus.  Now that all my communications can happen without direct dependence on someone else, I find my attention drifting to other things.  I can fire off emails or tweets aimed at people I want to communicate with and go on about my other tasks without waiting for an answer.  Think about how easy it is to just say something via instant message versus waiting for a response in real time.  Twitter doesn’t really have awkward silence during a conversation.  Twitter doesn’t require me to maintain eye contact with the person I’m talking to, and that’s when I can even figure out if I’m supposed to be looking at the camera or the eyes of the projected video image.  When I’m on a video camera, I have to worry about how I look and what I’m doing when I’m not talking to someone.  Every time I watch a Google+ hangout that consists of more than two or three people, I often see people not directly speaking having wandering attention spans.  They look around the room for something to grab their attention or get distracted by other things.  That’s why asynchronous communication is so appealing.  I can concentrate on my message and not on the way it’s delivered.  In real-time conversations, I often find myself subconsciously thinking about things like making eye contact or concentrating on the discussion instead of letting my focus drift elsewhere.  Sometimes I even miss things because I’m more focused on paying attention than what I should be paying attention to.  Video conversation is much the same way.  Add in the fact that most conversation takes place on a computer that provides significant distraction and you can see why video is not an easy thing for people like me.


Tom’s Take

I’ve wanted to have a video phone ever since I first watched Blade Runner.  The idea that I could see the person I’m talking to while I converse with them was so far out back then that I couldn’t wait for the future.  Now, that future is here and I find myself neglecting that amazing technology in favor of things like typing out emails and tweets.  I’d much rather spend my time concentrating on the message and not the presentation.  Video calling is a hassle because I can’t hide anymore.  For those that don’t like personal interaction, video is just as bad as being there.  While I don’t deny that video will eventually win out because of all the extra communication nuances that it provides, I doubt that it will be anytime soon.  I figure it will take another generation of kids growing up with video calling being ubiquitous and commonplace for it to see any real traction.  After all, it wasn’t that long ago that the idea of using a mobile phone not tied to a landline was pretty far fetched.  The generation prior to mine still has issues with fully utilizing those types of devices.  My generation uses them as if they’d always been around.  I figure my kids will one day make fun of me when they try to call me on their fancy video phone and their dad answers with video muted or throws a coat over the camera.  If they really want to talk to me, they can always just email me.  That’s about all the attention I can spare.

Upgrading to Cisco Unified Presence Server 8.6(4) – Caveat Jabber

With the Jabber for Everyone initiative that Cisco has been pushing as of late, compatibility between the Jabber client and Cisco Unified Presence Server (CUPS) has come into question more than once.  Cisco has been pretty clear on the matter since May – you must upgrade to CUPS 8.6(4) [PDF Link] to take advantage of Jabber for Everyone.  This version was released on June 16th, and being the diligent network engineer that I am, I had already upgraded to 8.6(3) previously.  This week I finally had enough down time to upgrade to 8.6(4) to support Jabber for Everyone as well as some of the newer features of the Jabber client for Windows.  Of course, that’s where all my nightmares started.

I read the release notes and found that 8.6(4) was a refresh upgrade.  I’ve already done one of these previously on my CUCM server so I knew what to expect.  I prepped the upgrade .COP file for download prior to installing the upgrade itself.  Luckily for me, 8.6(3) is the final version prior to the refresh upgrade, so it doesn’t require the upgrade .COP file to perform the upgrade.  The necessary schema extensions and notification fields are already present.  With all of the release note prerequisites satisfied, I fired up my FTP server and began the upgrade process.  As is my standard procedure, I didn’t let the server upgrade to the new version automatically.  I figured I’d let the upgrade run for a while and reboot afterwards.  After a couple of hours, I ordered the server to reboot and perform the upgrade.  Imagine my surprise when the server came back up with 8.6(4) loaded, but none of the critical services were running.  Instead, the server reported that only backups and restorations were possible.  I was puzzled at this, as the upgrade had appeared to work flawlessly.  After tinkering with things for a bit, I decided to revert my changes and roll back to 8.6(3).  After a quick reboot, the old version came back up.  Only this time, the critical services were stuck in the “starting” state.  Seemed that I was doomed either way.  After I verified the MD5 checksum of the upgrade file, I started the upgrade for the second time.  While I waited for the server to install the second time, I strolled over to the Internet to find out if anyone was having issues with this particular upgrade.
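Incidentally, if you want to double-check the file yourself before a second attempt, computing the MD5 only takes a few lines.  The filename and checksum below are made-up placeholders, so substitute the values shown on the Cisco download page:

```python
import hashlib


def md5_of_file(path, chunk_size=1024 * 1024):
    """Compute the MD5 of a large file without reading it all into memory."""
    digest = hashlib.md5()
    with open(path, "rb") as handle:
        for chunk in iter(lambda: handle.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()


# Hypothetical filename and hash -- use the real values from the download page
iso_path = "cups_8.6.4_upgrade.iso"
published_md5 = "0123456789abcdef0123456789abcdef"
print("OK" if md5_of_file(iso_path) == published_md5 else "Checksum mismatch, do not install")
```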

After some consulting, it turns out that Cisco made a bone-headed mistake with this upgrade.  Normally, one can be certain that any hardware-specific changes will be contained to major version upgrades.  For instance, upgrading from Windows XP to Windows 7 might entail hardware requirement changes, like additional RAM.  Point releases are a little more problematic.  Cisco uses the minor version to constitute bi-annual system releases.  So CUCM 8.0 had a certain set of hardware requirements, but CUCM 8.5 had different ones.  In that particular case, it was a higher RAM requirement.  However, for CUPS 8.6(4), the RAM requirement doubled to 4 GB.  For a sub-minor point release.  Worse yet, this information didn’t actually appear in the release notes themselves.  Instead, I had to stumble across a totally separate page that listed specific hardware requirements for MCS server types.  Even within that page, the particular model of server that I am using (MCS-7825-I3) is listed as compatible (with caveats).  Turns out that any 8.6(x) release is supposed to require at least 4 GB of RAM to function correctly.  Except I was able to install 8.6(3) with no issues on 2 GB of RAM.  Since I knew I was going to need to test 8.6(4), I rummaged around the office until I was able to dig up the required RAM (PC2-5300 ECC in case you’re curious).  Without the necessary amount of RAM, the server will only function in “bridge mode” for migrations to new hardware.  This means that your data is still intact on the CUPS server, but none of the services will start to begin processing user requests.  At least knowing that might prevent some stress.

For those of you that aren’t lucky enough to have RAM floating around the office and you’ve gotten as far as I have, reverting the server back to 8.6(3) isn’t the easiest thing to do.  Turns out that moving back to an earlier 8.6(x) release from 8.6(4) requires a little intervention.  As found on the Cisco Support Forums, rolling back can only be accomplished by installing the ciscocm.cup.pe_db_install.cop file.  But there are two problems.  First, this file is not available anywhere on Cisco’s website.  The only way you can get your hands on it is to request it from TAC during a support call.  That’s fortunate, because problem number 2 is that the file is unsigned.  That means that it will fail the installation integrity check when you try to install it on the CUPS server.  You have to have TAC remote connect to the server and work some support voodoo to get it working.  Now, I suppose if you have a way to gain root access to a Cisco Telephony OS shell, you could do something like the steps outlined in the forum post (as follows):

Here's what's required to temporarily install unsigned COP files:

cd /usr/local/bin/base_scripts
mv SIGNED_FILTER SIGNED_TEMP    # renaming the signature filter bypasses the integrity check on .cop files

Here's what's required in Remote Access to remove the temporary fix:

cd /usr/local/bin/base_scripts
mv SIGNED_TEMP SIGNED_FILTER    # put the filter back so signed-file checking is enforced again

Note: This is totally unsupported by me.  I’m putting it here for posterity.  Don’t call me if you blow up your server.  Also, I don’t have the TAC .COP file either, so don’t bother asking for it.

That being said, the above instructions should get you back up and running on 8.6(3) until you can buy some RAM from Newegg or your other preferred vendor.


Tom’s Take

Yes, I should have read the release notes a little more closely.  Yes, I should have verified the compatibility before I ran wild with this upgrade.  However, having fallen on my sword for my own mistakes, I think it’s well within my rights to call Cisco out on this one as well.  How do you not put a big, huge, blinking red line in the release notes warning people that they need to check the amount of RAM in the server before performing an upgrade?  You’d figure something like this would be pretty important to know.  Worse yet, why did you do this on a sub-minor point release?  When I install Windows 7 Service Pack 1 or OS X 10.7.4, I feel pretty confident that the system requirements for the original OS version will suffice for the minor service pack.  Why up the hardware requirements for CUPS for a minor upgrade at best?  Especially one that you’re driving all your people to be on to support your big Jabber initiative?  Why not hold off on the requirement until the CUCM 9 system release became final (which happened about a week later)?  If I’m moving from 8.6 to 9.0, I would at least expect a bunch of hardware to be retired and for things to not work correctly when moving to a new, big major version.  From now on I’m going to be a lot more careful when checking the release notes.  Cisco, you should be a lot more diligent in using the release notes to call attention to things that are important for that release.  The more caveats we know about up front, the less likely we are to jabber about them afterwards.

Why the Flip Didn’t Fail

Cisco announced today that it is restructuring its consumer line of products and closing down the Flip business.  R.I.P. Flip, it appears.  The Twitter is alive with the sound of people commenting about this move, ranging from “Wait…what?” to what I think are the sounds of champagne corks popping and party music starting up all over.  To many of my peers, Flip represented all that was wrong with Cisco’s decision to get in front of a new perceived market transition toward consumerization and video.  By chasing the golden calf of Flip, Cisco deserted its core business and drove customers toward alternatives such as HP and Juniper, or so this line of thinking goes.  These people will now point to the retirement of the Flip brand and say that they were right and that Cisco needs to kill off the other consumer products and get back to what they do best: moving packets around as fast as possible.  Not that the Flip was entirely bad.  It made for a great line of door prizes to be handed out by Cisco people at the events I attended.

I’m going to take a slightly different line of reasoning here.  I don’t think Cisco failed with the Flip.  I don’t consider something to be a failure so long as you learned something from it.  Apollo 13 wasn’t a failed moon landing.  It was a successful astronaut rescue.  We learned how to think on the fly when the pressure was on and bring people home safely when it counted.  In a slightly different way, I think Cisco learned a lot about what went wrong with the Flip and dissecting it over the coming months should yield a lot of information about how to avoid things like this in the future.

– Consumers don’t care about gadgets. Bold statement from someone that has both Fruit Company Mobile devices on his desk, right?  Funny thing about consumers is that they don’t really want a separate phone, still camera, video camera, and GPS receiver in their pocket.  If they can get all that functionality in one device, they’ll do it.  Even if it means that they won’t have the Super Whiz-bang 1080pqrstu Video.  I still have an iPhone 3GS, which was the first model to include video.  I don’t use the video camera all that much, only in cases where it’s convenient.  Others tend to use the video camera for anything and everything.  I don’t own a separate GPS receiver, I use Google Maps on my phone.  I don’t carry a point-and-shoot camera any more, I use my phone.  The ubiquity of having a video camera built into your phone has really hurt the low-end fixed-focus video market.  Not just Flip, but Kodak and others have really been pinched because people are now looking at buying a $200 video camera and saying to themselves “Why bother?  I’ve already got one on my phone that works almost as well.”  I figured Flip was headed for hard times when I saw many of the HD models on sale at deep discounts at my local office supply store.  Reducing inventory means they aren’t flying off the shelves like you wanted.

– Consumers want to do something with content. Once upon a time, people used to shoot video of their kids and then copy the tape and send it to Grandma and Grandpa for hours of watching over and over again.  Now, we just post it all to Youtube/Vimeo/Facebook and tell Grandma to check it out there.  In a world where Cisco says everyone wants video, people tend to fall back on these outlets to let the world know what they have to say.  I do it myself.  My kids riding a bike for the first time? Facebook.  My first ride on a Segway? Youtube.  My IPv6 presentation? Vimeo.  I want to share things with people.  Rather than coming up with a new and unique hardware device to capture these moments, what Cisco needs to do is focus on a way to expedite sharing this content.  Other than having a button that says “Upload to Youtube”, do you know how hard it is to get video off of a Fruit Company Mobile Device?  Not that easy.  iMovie exists to edit videos on these devices because they are such a captive platform.  Once you’ve edited the movie how you’d like it, you just upload it to the content aggregator of choice.  Imagine, though, that you want to share this content with someone and not necessarily have to send them to Youtube.  That’s where Cisco Digital Media Manager needs to come more into play, by allowing consumers to upload content to a…cloud-based version that can then be pushed down to a local digital signage endpoint.  Think of a school where users want to take video of a football game or a band concert and make short clips available at signage endpoints in the common areas.  How would you do that seamlessly today?  You’d have to force people to go to Youtube and play the video instead of making it instantly available at their fingertips.  Concentrate on writing the middleware to make the sharing process invisible to users, and I promise you’ll make more money than you would with another hardware device.

– People want to use hardware the way they want. This is probably the biggest gripe about Flip to me.  Shop the Flip accessories page (while you can).  What do you find?  Cables and tripods and image designers to make skins for your camera.  Where’s the external Bluetooth microphone so my presentations don’t sound like they’re in a tin can?  How about different lens options besides the single wide-angle one?  How about a wireless option to allow me to NOT have to plug my camera in every time I want to offload a video or I simply want to upload it to Youtube?  That last one is where the Flip really missed the boat.  People constantly complain that Apple won’t allow them to wirelessly sync their Mobile Devices.  They would much prefer just hitting a button once they’re within range to synchronize everything.  Now think about the Flip.  If I want to upload the cute video of my kids being chased by the AFLAC duck, I need to break out my laptop, download the video from my camera, find a wireless hotspot, and then upload that video to Youtube.  How would I do that on my phone?  Take video, push “Upload to Youtube”, done.  Quick and easy.  I didn’t need to think about how to accomplish the task, I just did it and let the hardware sort it out.  I have no doubt that Cisco would have eventually released a wireless Flip.  And I would have loved to have bought one.  It would have allowed me much more freedom to do things I wanted with my videos.

Tom’s Take

The Flip ultimately didn’t fail.  In my mind, the umi was a much bigger failure in terms of R&D-to-revenue.  Flip taught Cisco that the consumer market is a dog-eat-dog world of products made by the lowest bidder and people too focused on getting the most bang for their buck to care about cutesy things like graphics skins.  A few years ago, the Flip might have succeeded to the point of driving a different kind of video revolution.  Instead, Cisco tried to jump ahead of the transition and guessed wrong.  Rather than trying to provide the hardware to drive the transition, create the software to take advantage of it.  Integration is more important than manufacturing.  Think about the coup you could pull off if you could integrate Apple Facetime with CUCM or Telepresence.  That’s where Cisco needs to be headed, not to plastic video cameras.  Just as long as Cisco learns from what happened with Flip, it will never truly be a failure.

Flip MinoPRO – My Review

A while back, I talked about using a Cisco Flip video camera to document my walkthroughs and act as a visual note taking tool.  Little did I know that just after that post, I was going to get a chance to look at the new hotness to my old-and-busted Flip.  I present…the Flip MinoPRO:

Pretty, ain’t it?

The Cisco Flip MinoPRO is the “prosumer” offering that Cisco is targeting for the business/enterprise customer.  If you’ve been living under a rock for the last year or so, John Chambers is driving video like a herd of cattle in City Slickers.  From the Flip acquisition to Tandberg to the new Cisco Cius, Chambers believes that video is the future.  By putting a video camera in the hands of your employees, you can get them to capture things and share them with many other people.  Want to save the New Employee Orientation for posterity?  How about a lesson on BPDUGuard?  Chalk talk with the virtualization architect about Vmotion?  Chambers wants you to record all of this and let everyone see what you’ve done.

So how does the Flip MinoPRO fall into this?  One of the things that I noticed about the Flip Mino that I’ve been carrying is that it felt a little…well, flimsy.  It was made out of plastic and weighed next to nothing.  While this is a great thing for the consumer space, it always seemed a little disposable to me.  In fact, my Mino has already started to show a few cracks around the edges.  I’m sure that most of it is due to me being a little rougher on it than the average family of 4, but if this is a device that is going to be used day in and day out by your employees, it should probably be a little more rugged than your average CD jewel case.  The Flip MinoPRO is definitely sturdier.  When I first pulled it out of the box, I noticed the housing was made out of metal, not plastic.  It has a nice heft while staying light.  The PRO really feels solid.  I’m sure if I threw it at someone, I could cause a bruise.

The unit itself is fully capable of recording in 720p HD (1280 x 720).  This is wonderful for getting details on things in racks and noticing all the little nuances you might miss on a cursory glance.  In addition, the audio seems to be somewhat better, although if you’re expecting movie camera-quality you’ll be disappointed.  The unit can pick up lots of ambient noise, but it’s still a condenser microphone, so loud background noise will overwhelm it.  Another nice upgrade from my Mino is the storage size.  The old camera had 1GB of storage, which at its standard definition was good for about one hour of recording time.  The new camera has a staggering 16GB of storage space.  Even with the increased HD recording, that’s 4 hours of recording space available to me now.  Whereas before I had to be judicious about removing video after I was finished with it, I can now leave it on the camera until I’m ready to use it.  I don’t have to be concerned if I can’t capture a whole event or lecture, as I’m now able to get as much as I can.  In addition, the PRO can function as a USB memory stick.  While it’s a bit more unwieldy than the USB drive I carry in my pocket, it’s quite handy if I need to copy something back and forth and it’s already plugged in.  In addition to the HD recording, the output of the camera got a matching upgrade.  Gone are the three-pronged composite video/audio outputs of yore.  In their place is a shiny new HDMI output!  This means that when you’ve finished recording your video in its pixel-orgy quality, you can plug your camera into any HDTV with a simple cable and play away.  It also makes it much easier to find a cable instead of always forgetting the proprietary one that came with the old camera.

The software for the camera stays the same as the consumer version (FlipShare), but as businesses have different requirements for their video, Cisco has an answer there too.  Cisco FocalPoint is a video portal run by Cisco that allows you to upload video directly from either the FocalPoint software or from FlipShare.  Acting essentially as a corporate YouTube, FocalPoint gives you the additional options of keeping video secure as it is uploaded to the portal, as well as wiping the camera after the upload.  The hallmark of FocalPoint is the ability to easily search the video that you’ve fed into it, while at the same time securing the content so that only authorized users have access.  That way, your manager’s retreat videos don’t get downloaded and sent to Wikileaks.  FocalPoint is a licensed product, so you’ll need to have a license to create a workspace and attach users.

Now, just like every silver lining has a cloud, so too must this device have a drawback.  It’s not necessarily unique to the Flip MinoPRO, but to HD video in general.  I’ve noticed that my ‘notes’ are starting to take up much more space now that I’ve switched to the new-fangled HiDef stuff.  I’m clocking in at about 1 MB per second, which means even a five-minute clip weighs in around 300 MB.  That means that any of my longer videos can no longer be disseminated via e-mail.  I’m sure that this is what Cisco had in mind when they created FocalPoint, as the only other method available to me currently is burning the movies to a CD/DVD or uploading them to a shared internal storage location.  Oh well, if nothing else you can sell a lot more storage space with each new Flip MinoPRO.

If you’d like to get a look at what kind of video the Flip MinoPRO is capable of, take a look at this video of me on a Segway.  Be sure to select 720p to see it in all its…um, glory.