When the global economic outlook is uncertain, businesses prepare for a possible decrease in revenue by adopting strict cost-control measures such as travel and budget cuts. The most extreme organisations go into hibernation mode, freezing investments, which ultimately leaves more room for the competition to make aggressive moves and gain market share.
Others, hoping that the days ahead will be brighter, decide to invest in audio and video conferencing solutions in the hope of instantly reducing the costs, time, risk, and carbon emissions associated with travel. Unfortunately, this investment often becomes a waste of financial resources: once the storm has passed, people resume their previous travel habits. This is the result of a lack of internal policy combined with the absence of technology adoption programmes for executives and staff.
So what do the best companies do?
Click here to read the rest of this interesting and insightful blog…
Takeaway: Social media for the enterprise is about much more than Facebook or Twitter on the corporate cloud or server. It is poised to spur changes in CRM, BI, and collaboration, and those are just starting points.
When enterprises consider setting up their own social media networks, it’s to offer much more than just a place for idle chatter or a technological replacement for the watercooler. In fact, many people now think that using social processes within the business will be transformative across the enterprise. That’s because putting social into business processes leverages the untapped potential of employees, partners, and stakeholders, while also returning control of data to the brand and improving customer relationships.
Jonathan Frappier, director of technical operations at LightWire, Inc., and an independent consultant and blogger at VIRTXPERT, dove into the enterprise social media waters with one of the first companies he worked for, when it built a web-based community for CIOs. Ever since then he’s advocated their use. Frappier says the variety of names being used to describe these enterprise social media efforts really only denotes differences in complexity; at their heart, they’re all about new ways for people to have conversations instead of relying on email alone.
He cites the advantage of opening up communication to everyone in the organization, across all departments, and how empowering that can be for bringing in new perspectives and ideas. Other advantages he points to include:
- Creating a searchable knowledge base that everyone in the organization can benefit from even if they weren’t directly involved in the original conversations.
- Allowing companies to connect with their employees much like public social media allows individuals to do on the personal level.
- Offering a more modern document management system allowing collaboration on documents that used to be passed around in email or file sharing.
- Expanding business intelligence efforts so that every document, and perhaps even every conversation, is indexed.
The public side of social in the enterprise
There is also a public side to enterprise social media where companies set up communities allowing the general public to interact with their brands. Rob Howard, chief technology officer for Telligent, recently predicted that companies will be increasingly shifting their social media investments from the public social platforms to their own, on-domain communities. That’s because businesses recognize the content created by consumers is invaluable, that consumers want online customer service, and company websites are the number one sales resource. He said:
“What we’ve heard from a lot of our brands that we’ve been talking with is that, while they do agree with our philosophy that Facebook is great from a consumer point of view and from a personal relationship point of view, it’s not where a lot of these consumers are going to make decisions. Secondly, a lot of these organizations care very deeply about data ownership and as these customers are going through the process of sharing information — whether it’s asking questions, answering questions, providing data about their experiences with the brand — brands want more control and ownership over that.”
Howard espouses the Telligent philosophy as one where social is viewed not as a destination but as a set of experiences. He categorizes social into social media (Facebook, Twitter), social networks (LinkedIn, Chatter, Yammer), and social communities (the company interface with customers), and he says businesses will have to balance their investments across these multiple channels. He maintains that social communities are among the most valuable channels to invest in, because it is there that companies can control the brand and the data and better manage the customer experience.
He cites Forrester research on the customer experience management lifecycle, which holds that brands have traditionally thought of the customer lifecycle as a linear process. Companies would take each activity in the cycle and segment it out to different units in the business. Typically, though, disconnects with the customer occurred because the various business units often didn’t interoperate with each other.
Howard says new evidence suggests the customer lifecycle isn’t linear, but rather circular, and made up of repetitive processes where customers continuously come in and engage with the brand. Those different aspects of how the brand touches its customers through sales, marketing and support, are interrelated, and so a lot of brands see social as the way to connect all those interrelated experiences together.
Understanding enterprise social pinch points
As with anything there are always challenges. Frappier, speaking mainly about internal social media efforts aimed at employees, partners, and stakeholders, says just using the name “social media” creates hurdles since many people have a limited idea of its usefulness. Beyond that he cites the difficulty in tracking the benefits of the investment and finding a solution that meets the needs of all the various groups, although he says there are vendors now that offer an app-centric approach to building these networks.
Howard says the challenges he sees include:
- Dealing with negative reactions to the brand;
- Growing successful communities; and
- Getting organizational commitment.
Regardless of the name used to describe it, social media in the enterprise offers a very different way of collaborating, managing customer relationships, and finding new value in the infinite stream of data companies are creating and managing. Many are betting “the social way” will be the next big step in the continual evolution of how business is done.
Whether it’s a UC project, a collaboration project or whatever, to ensure project success there are some steps that need to be taken by the integrator and the customer IT department. If the steps aren’t taken, prepare for the worst – a failed project! As we considered the Cloud in today’s world, we realized that all of these steps are still basically valid, even the testing step, although the scope or complexity of some of the steps may be minimized by Cloud delivery. As a systems integrator, I learned some of these lessons the hard way. You don’t have to!
Key Step #1: Make sure there is senior management buy-in
Don’t make the mistake of being pulled into someone’s “plans” without making sure upper management is on board with the need and the overall plan. Make sure budget has been allocated and know the name of the senior manager who is supporting the project. If you’re the integrator, make sure that this isn’t just a pet project of someone in the customer’s IT department – with no support and no funding.
Key Step #2: Involve end-users and stakeholders early and often
No one understands their needs and their processes better than end-users, and no one can derail a project faster than end-users who see no value in it. Would we be wrong in assuming that end-users were part of the “needs analysis” and fact-finding that went into developing the solution that is the basis for the project? End-user representatives should also be involved in the project itself, to ensure their buy-in and championing when the project is rolled out.
Key Step #3: Set expectations early
All stakeholders and end-users need to be educated on what is going to change, what the change will look like, how it will affect their own work routine and where they can go for help if they have problems. This should NOT be left until right before going live. Change is never well-received and efforts to minimize the concern and provide help will go a long way in ensuring that end-users will actually use the new solution.
Key Step #4: Minimum specs usually mean headaches at some point
As the systems integrator or VAR, don’t be guilty of underselling the components necessary for the solution with a plan to go back during project implementation and “add on”. As the customer IT leader, don’t make the mistake of going after the strictly “low bid” or least costly solution unless you are 100% positive that you are getting exactly what is needed and that everything will integrate correctly and seamlessly with your infrastructure. Here’s a good adage: “You get what you pay for.”
Key Step #5: Create a detailed project plan
Formal project plans force the project manager, and everyone involved, to consider all the necessary phases and steps, and the order in which to proceed. In addition, they define accountability and responsibility: what are the integrator’s responsibilities and what are the customer’s? Ever seen the saying, "Failure to plan is planning to fail"?
Key Step #6: Schedule meetings only as needed and when key players are available
Meetings should be used expeditiously and should include all key players. Schedule them when everyone is available (can you say “group calendar”?) and have a specific agenda in place. Each meeting should have a specific goal or outcome – whether the goal is to resolve a problem with the project, assign additional responsibilities, or whatever. Never call a meeting to update on the status of the project – that can and should be done in writing, on a regular basis and shared with all stakeholders. We’re in the high-tech industry – use video conferencing and other technology tools to maximize communication and collaboration and minimize wasted time.
Key Step #7: Make sure adequate testing is included in the project timeline.
Testing is essential to project success. Advance testing should be done whenever possible at the integrator/VARs facility. Testing should be done again, and again as the implementation continues on the customer site. Once the project is complete, the customer should have a testing program of their own – using their own employees and a testing script.
Key Step #8: Have a plan in place in case the “go-live” or “cutover” fails
Heaven forbid a “go-live” doesn’t go as planned, but it happens every now and then. The integrator/VAR and the customer project leader need to have agreed ahead of time on what go-live success looks like -- and when it's time to admit failure and begin again another day. There always should be a backup plan in case a “go-live” fails and the failure issues can’t be resolved by the integrator and the IT department.
Key Step #9: Make sure the “go-live” or cutover is scheduled for minimum disruption and maximum support availability
OK – this sounds like an oxymoron because it generally means that the “go-live” will occur over a weekend (or heaven forbid, over a holiday). It is the responsibility of the integrator/VAR to make sure that their own support team and any additional support from vendors will be readily available if needed. It is the responsibility of the customer IT department to make sure that all relevant staff is available on site or easily accessible. Again, think “technology”. Video collaboration and conferencing? Presence to know exactly what expertise is available?
Key Step #10: Build in adequate training
While most integrators/VARs understand the importance of training, it is all too often deleted or skimped on in the proposal, or removed because the customer didn’t build it into the project budget. “Easy-to-use” just isn’t usually true when it comes to technology: what may be easy to use for someone with a technical background is not necessarily so for the average end-user. And nothing will label a project a “failure” faster than end-users not embracing and using the new solution. Adequate communication and training are vital to the success of any UC or collaboration project that will change how end-users perform their daily jobs.
So good luck, and remember: you can never do too much planning or communicating on a new project!
Today, communication with patients is fragmented at best, and family communications can only be described as dysfunctional. Recently, at one of the largest hospital systems in Maryland, I tried to find out the date, time, and location of an appointment for a relative. She was too doped up on pain meds to remember the details or where she put the reminder paperwork. It took a day and a half and four phone calls, and the matter was not resolved until I physically appeared at the doctor’s office, whose staff, by the way, had no recommendations for confirming appointments with other doctors beyond the path I had just taken. My concern is that I am certain this goes on every day. It makes you wonder how many of the people roaming the hallways of large hospitals are in the same predicament: not just about appointments, but in search of information for themselves or their families. Maybe they have a question about a new medication that they forgot to ask during their meeting with their doctor, or something they forgot to mention that might be relevant to their health. Few of us walk out of a doctor’s office without thinking of something we forgot while on the way to our next destination.
Many healthcare providers are considering centralizing their appointment process using 30-year-old contact center technology. This is admirable; however, it only solves one of the many communications problems that patients encounter every day. Coincidentally, I am aware of a Fortune 50 insurance company that is engaged in an effort to develop a patient collaboration interface. It has been trying to deliver this interface for years, but has been prevented from moving forward by the high cost of computer-related support calls from patients and the high cost of proprietary software. With Web Real-Time Communications (WebRTC), these barriers are rapidly being overcome.
At this point it is worth exploring who is best served by building a patient collaboration interface. It seems to me that patients are the prime beneficiaries; however, I am sure there is great debate over who should host such a service: payers or providers? One of the game-changing elements of WebRTC is that both can implement solutions that support patients’ needs, and patients can be supported collaboratively throughout the spectrum of the healthcare process. The applications will naturally dovetail to support the entire process, and they can accomplish this without the need for technology or corporate federation.
WebRTC is a new standard, backed by Google, for browser-to-browser communications. It supports these communications without the need to download an app or plugin, and it works on any smartphone, tablet or PC that can surf the web. WebRTC is transformational, but the big transformations will not be built by the technology providers or even start-ups; they will be built by intelligent business people who can harness the capability to transform their businesses and gain significant competitive advantage. In the early days of the web and browser, the early adopters that used new techniques to reduce transaction costs were able to gain market share, and across all industries the web changed business models. The webification of telecommunications with WebRTC will create the same opportunity, with richer interfaces that extend well beyond traditional enterprise communications boundaries. Further, there will be over one billion WebRTC-enabled devices in use by the end of 2013, so the innovation wave has already started.
WebRTC-based collaboration interfaces are secured with end-to-end encryption that is superior to telephone communication. Access is restricted with user-name and password requirements. Initiation of communications and display of web content is secured with Secure HTTP (HTTPS). Transport of communications for file transfer, text, audio and video are encrypted with Secure Real-Time Transport Protocol. Currently, encryption of communications ends at the edge of the enterprise. WebRTC extends it all the way to the user’s browser.
The availability of customizable patient communications directories is the first element of a patient collaboration interface that will make a difference. These directories can include legacy 10-digit numbers and/or hyperlinks to communicate browser-to-browser or browser-to-telephone. This way, doctors can always be available without the need to disclose their cell phone numbers. These directories can be systematically gleaned from the patient’s medical record or manually updated by staff, patients or family members. They are not limited to hospital employees; they can include ambulance services, physical therapists, claims adjusters, clergy or even contact centers that support tasks like managing appointments.
WebRTC supports screen sharing and file transfer from the browser on the patient’s device of choice. This means that test results, financial paperwork and/or images can be shared between patients, family, health workers and insurance professionals. Legacy collaboration applications require technology and corporate federation in advance of sharing files or screens in order to traverse corporate boundaries; WebRTC does not. The ability to conduct more thorough communications is supported by the richness of the content that patients and healthcare professionals can share with others. One day a patient may collaborate with a financial professional about the completion of a government form, and the next they may share a photo of a sore on their foot with a nurse.
In-home care is greatly enhanced by the real-time nature of these communications and the availability of inexpensive Bluetooth devices to monitor the health of the patient. Further, patients can be prompted to score their pain or comfort level on a systematic basis. Based on business rules, these events can be automatically escalated to a communications session, audio or video, in the event that intervention is necessary.
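The escalation described here amounts to a simple business rule mapping a self-reported score to an action. A minimal sketch, with purely illustrative thresholds and action names:

```python
def escalate(pain_score: int, video_capable: bool) -> str:
    """Map a 0-10 self-reported pain score to a communications action.

    The thresholds (8 for severe, 5 for moderate) are assumptions for
    illustration; a real deployment would make them clinically configurable.
    """
    if pain_score >= 8:
        # Severe: intervene immediately over the richest channel available.
        return "video" if video_capable else "audio"
    if pain_score >= 5:
        # Moderate: a nurse follows up with an audio call.
        return "audio"
    return "none"  # log the score only, no session needed

print(escalate(9, video_capable=True))   # video
print(escalate(6, video_capable=True))   # audio
print(escalate(2, video_capable=True))   # none
```

In practice the prompt would arrive in the patient's browser, and the resulting session would be a WebRTC call to an on-call clinician from the patient's directory.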
Big Data is being used to analyze the behavior of patients to determine risk and possible treatment options. This has been done in the past, but the results were often delayed by days, weeks or months; now these calculations can be made in real time. Calls from patients to healthcare workers can be accompanied by statistical recommendations based on the patient’s web browsing history, medical record, current treatment, and the web page they were looking at when they decided to communicate.
Seamless integration with legacy telecommunications systems and wireless devices is available. Further, these systems can be configured in a duplicated architecture to support the fault-tolerant needs of the healthcare business.
While there are other benefits for payers and for internal communication within the healthcare community, patient collaboration is the real game changer. Within the next 12 months, several products will come to market to support patient collaboration. Pricing for these solutions will be an order of magnitude lower than current proprietary systems; the numbers will be more Magic Jack than AT&T. The question for healthcare providers and payers is not if, but how: build a solution, or contract with a cloud-based service provider?
According to a new study released by MarketsandMarkets, a global market research and consulting company based in the U.S., the market for the cloud version of UC (UCaaS, or unified communications as a service) is expected to grow from $2.52 billion in 2013 to $7.62 billion by 2018, at an estimated CAGR of 24.8%. Telephony is the most used technology for now and will remain so over the next few years. The global UCaaS telephony market is expected to grow from $0.87 billion in 2013 to $2.48 billion by 2018, at an estimated CAGR of 23.3%. This is great news for channel partners offering UC solutions from the cloud.
Interestingly, the most significant growth comes from the collaboration area, the study reports. UCaaS collaboration application revenue is expected to grow from $540.74 million in 2013 to $1.75 billion by 2018, at an estimated CAGR of 26.5%. Companies across all verticals are using UCaaS to integrate web conferencing, video conferencing, messaging, VoIP and presence. Cloud delivery and integration help decrease up-front capital cost, as the applications are offered on a per-seat basis, which enables businesses to scale communications easily and effectively, with the end result of reducing travel time and creating leaner business processes.
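Those growth rates follow from the standard compound-annual-growth-rate formula, CAGR = (end / start)^(1/years) - 1. A quick sketch checking that the study's quoted percentages match its dollar figures over the five-year 2013-2018 window:

```python
def cagr(start: float, end: float, years: int) -> float:
    """Compound annual growth rate, as a fraction."""
    return (end / start) ** (1 / years) - 1

# Overall UCaaS: $2.52B (2013) -> $7.62B (2018)
print(f"UCaaS overall:       {cagr(2.52, 7.62, 5):.1%}")     # ~24.8%
# Telephony: $0.87B -> $2.48B
print(f"UCaaS telephony:     {cagr(0.87, 2.48, 5):.1%}")     # ~23.3%
# Collaboration: $540.74M -> $1.75B
print(f"UCaaS collaboration: {cagr(0.54074, 1.75, 5):.1%}")  # ~26.5%
```

All three reproduce the study's figures, confirming the projections are compounded over five years.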
Most of the major UCaaS players identified in the report come as no surprise: Avaya, Cisco, Microsoft, Alcatel-Lucent, Interactive Intelligence, Siemens Enterprise Communications, Mitel, and NEC. The inclusion of Panterra Networks and CSC in the list did raise eyebrows, though; perhaps reading the report will bring clarity on why these two organizations were included.
According to its press release, MarketsandMarkets has the report available for purchase at http://www.marketsandmarkets.com/Purchase/purchase_report1.asp?id=893
VARs, integrators and telecom dealers may not sell smartphones or provide the carrier services to make them work, but there are some amazing revenue opportunities created by BYOD in the enterprise. Best of all, the majority of these opportunities are in the services area, which brings higher margins than product sales. According to the Gartner CIO Agenda 2012 study, mobile technology and solutions are very high on the agendas of a majority of CIOs, even higher than UC and collaboration.
Whether it’s implementing mobile UC for their end-users or addressing the challenges of BYOD in their own enterprise, the topic of mobility can be an excellent “conversation starter” when meeting with IT staffs or CIOs. And consider this, as BYOD continues to grow at the enterprise level, it should pull mobile UC along with it. For an employee who is now primarily communicating on his smartphone, how does a customer or another employee reach him effectively and efficiently? Mobile UC!
Where are those opportunities for the channel? Think creatively and strategically and you’ll find them!
Already offering a VoIP product that has mobile UC capabilities, either in a client environment or inherent in the VoIP product itself? Learn what that VoIP product can do with mobility and then visit your existing customer base and have a “mobility” discussion. Is there additional revenue available by adding mobile UC capabilities to their existing voice system? This could be a good source of easy incremental revenue.
The need for a solid mobile UC solution could also lead to a communication system upgrade or an entirely new system for an existing or new customer.
Along these same lines comes the potential for network and Wi-Fi assessments (professional services) as well as projects to upgrade network infrastructure to accommodate increased voice traffic or Wi-Fi for internal smartphone users.
For those in the channel who are services focused, consider developing a set of policies and procedures for managing BYOD in the enterprise. From the perspective of the customer’s IT department, controlling BYOD – to protect company data as well as control mobile spending – is a growing issue. Both existing and new customers could be candidates for this service that would not only provide high margins but be a competitive differentiator as well.
Are you an MSP? What about offering mobile device management (MDM)? MDM software from companies like MobileIron is now readily available to secure and manage mobile applications, documents, and devices. As BYOD continues to grow across enterprises, MDM sales and services will grow as well.
Thinking “outside the box”, security is one of the most serious concerns with BYOD. Company data now resides on personal smartphones, which can be lost, and data residing on company servers is at risk of being hacked through those same personal smartphones. Companies that have already taken major steps to secure their information from internet intrusion are now finding it vulnerable via smartphones. Consider the industries for which security is vitally important (government and healthcare, to name the most obvious). Develop expertise in this area and reach out not only to existing and new customers but also to other channel partners that need to add “security expertise” to their portfolio but don’t have the training or knowledge to do it themselves.
Historically, the carriers – AT&T, Verizon, Sprint, etc. – focused on consumer, personal smartphone sales. Today, with BYOD and mobile UC growing, they are actively engaged in finding ways to capture the growing business customer. The agent model is their immediate best bet to reach that customer and the agent relationship can provide a lucrative recurring revenue stream for little effort or financial commitment.
VARs/MSPs, integrators, and telecom dealers – don’t let these opportunities slip away. This is a relatively new area where customers are plentiful and competitors are few!
I hate talking about topics of the week, such as the debate around Yahoo's new CEO, Marissa Mayer, telling her staffers to stop working from home.
First, in my opinion, CEOs are allowed to make such statements to their employees, and you can't judge unless you work there or own stock. Second, it probably won't help Yahoo one bit.
However, what is relevant about this issue is the use of cloud computing by a remote workforce. What are those synergies? That's worth discussing.
The work-at-home movement drives a great deal of interest in cloud computing. Public cloud platforms are typically better at providing IT services over the open Internet than enterprise IT is capable of doing. Thus, the public cloud can better serve a workforce that's as likely to work at the local Starbucks as the corner conference room because they can push processing, storage, and enterprise applications to a middle tier between the company and the user. In other words, connectivity, security, capacity management, and resiliency become somebody else's problem.
Indeed, the more distributed your workforce, the more public cloud computing can benefit the support of that workforce. Innovative enterprises are adopting Dropbox or Box.net for file sharing services, taking up Google Apps for office automation and collaboration, accessing SaaS-based solutions such as Salesforce.com for CRM, and beginning to migrate large portions of operational data to public IaaS providers such as Amazon Web Services. If you add mobile computing and BYOD to the equation, the public cloud becomes even more compelling.
Of course, some companies push back on public cloud computing with the usual excuses, including security, privacy, ownership, and so on. But those businesses typically don't offer work-at-home options to their employees, I've found.
While a remote workforce issue is typically not the only benefit that drives business to the cloud, it's often on the radar. Moreover, companies innovative enough to create a strong remote workforce are typically the organizations that accept cloud computing. If they trust people to work poolside, then trusting public clouds is not much of a stretch.
Google recently launched its high-end Chromebook Pixel, and like previous Chromebooks this notebook computer makes a distinctly 21st-century assumption: that users' data, work and play belong mostly online, not on their own computers. Google isn't alone in pushing this notion, but it's the most powerful evangelist for the shift to what tech people call the "cloud" and away from "local" storage.
Call me unconvinced. Deeply unconvinced.
The cloud evangelists have an alluring pitch. First, they say, we can now count on being connected as much of the time as necessary. Second, these computing and data services are becoming a utility like electricity, easier and safer to run from remote servers than on our local systems.
Like almost everyone else, I use lots of cloud services. They start with everything I do from a browser, such as search, microblogging (Twitter), multiuser games, etc. They also include my email (I store a few weeks' worth of messages in an online system that shows me the same inbox and folder structure no matter what computer I'm using) and calendars, but in those cases I'm synchronizing the data to the local machine. And I use several online sites to back up my music and important documents.
But move everything to the cloud, and use it in an on-demand way? No chance, at least not now – and probably not ever.
For one thing, web-based applications simply can't match the power and flexibility of native desktop software, at least not yet. Google Docs does many things well enough for non-complex tasks, but that's not good enough when I need, say, the track-changes feature in Microsoft Word or its open-source counterpart, LibreOffice Writer. Online applications are getting better, and they can do some things the offline ones can't, of course; there are tradeoffs that over time will make the online offerings more compelling. And as Google and other web-based software companies make it possible to work offline – you can do that now with Google Docs – one more advantage of local computing will be mooted.
It's harder for me to imagine cloud computing ever being fully trustworthy. The idea that data is like electricity is only partly true. The electron that comes to me from the power grid is identical to the electron that goes to someone else. This isn't true for data, except at the most basic level, where all information can be reduced to zeros and ones. Put a bunch of electrons together and you still have just a bunch of electrons. Put a bunch of bits together in different orders, and they are completely different.
The promoters of the live-in-the-cloud vision tend to minimize the downsides. Online databases are vulnerable to hacking; hardly a day goes by anymore when we don't hear of yet another breach. Outages on networks or individual services are all too common. Centralized databases, owned and operated by big companies, are one-stop shops for government snoops.
One reason the cloud has become so useful is the same reason we should have a "local storage" backup as well: The companies that make disk drives and solid state storage (SSDs) keep improving their technologies, making storage cheaper and with vastly more capacity all the time. You can buy a portable hard disk with 2 terabytes (2 million megabytes) of storage for under $150. The micro-SD card, smaller than a fingernail, now holds 64GB for about $50; eventually it'll hold 2 terabytes at a comparable cost. In fact, the storage industry has outpaced everyone else in tech with its exponential improvements.
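A quick back-of-the-envelope check of the cost per gigabyte implied by those prices shows just how far apart the two form factors still are:

```python
# $150 for a 2 TB portable hard disk (~2048 GB)
portable_hdd = 150 / 2048
# $50 for a 64 GB micro-SD card
micro_sd = 50 / 64

print(f"Portable HDD: ${portable_hdd:.3f}/GB")  # ~$0.073/GB
print(f"micro-SD:     ${micro_sd:.2f}/GB")      # ~$0.78/GB
print(f"Ratio: ~{micro_sd / portable_hdd:.0f}x")
```

At roughly a tenfold price premium per gigabyte for flash, the spinning disk remains the economical choice for bulk local backup, which is exactly the role argued for here.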
There are dangers in local storage, too. The chief one is disk failure, but other mishaps can occur as well, including physical loss of the backup. I made a terrible mistake last fall that cost me weeks of work on a project, because I bungled my backups. I was creating full and incremental backups to several external disk drives, rotating among them to ensure that nothing would be too old. But I made two crucial mistakes. First, I didn't back up several key folders to my normal online services, because I'd moved them on my laptop to a part of the drive where they were no longer automatically added to the online folders. Worse, I failed to test the "restore" function of my backup software, which was encrypting the files; when I needed it most, it didn't work. I kicked myself for a couple of weeks, and moved on – with a different and (I believe) much safer routine.
I can't – and don't want to – avoid using the cloud for many tasks. But I won't rely solely on it for backups and working documents. My approach is to use both, and to encrypt my files in both places.
Giving further credence to the growing use of the iPad in the enterprise, ShoreTel recently announced the availability of two new releases of its mobility and collaboration products, optimized for the Apple iPad, enabling integrated collaboration capabilities, increased accessibility and improved communications regardless of the user’s location. Can we deny that BYOD is here to stay?
ShoreTel Mobility 6 makes it easier to use mobility features on an enterprise user’s iPad. Imagine sitting in an airport and using your iPad to place calls that appear to come from your office desktop phone (your “business persona”).
ShoreTel Conferencing for iOS offers application collaboration capabilities. Users “easily share presentations controlled by their iPad or iPhone with remote participants; or can view shared desktops of their colleagues’ PC and Macs”, according to the press release.
From the press release: “The Apple iPad has quickly become the most popular tablet brought by users into the workplace,” said Peter Blackmore, chief executive officer, ShoreTel. “ShoreTel transforms the iPad into a true multi-modal business communications device – for placing and receiving calls just like a desk phone, for sending and responding to instant messages, and for easily collaborating with other PC, Mac and iPad users. By combining these applications together in a manner that is brilliantly simple for employees to use, businesses can feel comfortable supporting a BYOD policy to drive effective communications and enhance productivity.”
For those VARs, integrators, and telecom dealers who haven’t yet seen the opportunities that BYOD can bring to their business, this announcement should serve as a wake-up call.
Something both strange and unfortunate is happening in the rush to embrace new communications technologies. The world seems to be moving relentlessly toward less efficient communications modalities.
Maybe I’m in a small minority who thinks this way, but I find it annoying when I’m on some news website, find a story of interest, click on it, and the screen pops up with a video of some guy reading a script off a teleprompter. What I’d like to see is the script itself. I want to scan it quickly and find the parts of the story that interest me. Scanning for relevant content is a heck of a lot more efficient than listening to someone drone on – not to mention sitting through the 15-second video advertisement that precedes the “news.” Yet video is replacing text in many communications. Certainly there are places where video adds an important dimension to a story, but the trend toward one-directional distribution of content by video seems to me to be a leap backwards.
Information delivery by video is just one example. The process of creating email messages or documents is another. We used to sit down at our laptops and use a full-sized keyboard to get thoughts into digital form. Keyboards started shrinking as we moved to netbooks. They got much smaller when we went to the BlackBerry, and two thumbs replaced eight fingers (at least for those of us who learned touch-typing). The next step was the smartphone, where the keyboard sits on the screen, so that it’s both small and without any tactile feedback. This is progress?
I just finished an IM chat on Skype with a colleague. It took 25 minutes to exchange his questions and my suggestions for his upcoming vacation to my corner of the world. A voice conversation – merely a click away on Skype – would have covered the same ground in seven minutes, and probably in greater detail than was practical to type out.
I raise these examples to add to the frequently discussed concern that all of us are burning too many hours sorting out spam, reading too many emails on which we are needlessly copied, being buried in too much data and too little information, and other time-sinks of modern life.
So are the tools the problem? Yes, partially. Muddled concepts, poor design, and form-factor compromises all contribute to the communications inefficiencies we see in our workplaces. But the other problem is not understanding how best to use the tools we have. The great tendency is to apply new tools to existing processes. Because we are accustomed to the current way work gets done, using new tools to automate today’s methods is much easier than figuring out how we can change what we do to take advantage of new technology capabilities.
That is what’s happened in many UC deployments. The array of new tools that unified communications brings has been applied to yesterday’s processes. In many cases, that brings some improvements, but misses what can be real breakthroughs in how work gets done.
Presence, which shows up on our “buddy lists,” is one example. We all have circles of contacts in our business lives, and it’s useful to add a presence capability to see whether a colleague we know is available. But a far more useful capability would be to have lists also automatically organized around expertise. Then, when seeking advice about a particular business issue, we could look for someone who may be an expert but who is not necessarily in our buddy circle. Some social network systems tout this sort of solution, but so far few enterprises are using the systems this way.
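As a rough illustration, an expertise-aware presence directory could be modeled as a simple lookup over contacts tagged with skills. The names, expertise tags, and three-state presence model below are hypothetical; a real system would pull presence from the UC platform rather than hard-code it:

```python
from dataclasses import dataclass, field

@dataclass
class Contact:
    name: str
    presence: str                       # e.g. "available", "busy", "offline"
    expertise: set = field(default_factory=set)

def available_experts(directory, topic):
    """Return contacts who are both available and tagged with the topic,
    regardless of whether they are on the caller's buddy list."""
    return [c for c in directory
            if topic in c.expertise and c.presence == "available"]

# Hypothetical directory: organized by expertise, not by buddy circle.
directory = [
    Contact("Ana", "available", {"SIP trunking", "QoS"}),
    Contact("Raj", "busy", {"QoS"}),
    Contact("Mei", "available", {"contact center"}),
]

print([c.name for c in available_experts(directory, "QoS")])  # ['Ana']
```

The interesting shift is in the query: instead of asking “is my colleague free?”, the user asks “who that knows QoS is free right now?” – which is exactly the capability the buddy-list model leaves on the table.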
This tendency to underuse innovative capabilities (like presence) is exacerbated by the way UC is often introduced into organizations. In many firms, UC was thought of as a side-benefit to introducing a new IP-PBX, or as an adjunct to some other system brought into the company such as e-mail. The result is a “try it out and see how great it is” approach to introducing these new capabilities. With that kind of an introduction, it’s not surprising that people tend to find their way to applying the new tools to existing ways of working—automating and adding functionality to current ways of doing things, but often missing really innovative ways of getting things done.
The approach promoted here at UCStrategies.com and used in our consulting practice is to start not with the technology, but with the business processes, identifying the communications bottlenecks and breakdowns in current workflows. The new UC capabilities are then introduced to fix specific problems or to address specific opportunities. In some cases, that means augmenting existing processes or introducing new ones. Some see that as a barrier, but our consistent experience is that people readily embrace the changes, because they get immediate feedback on how the new approach is better, easier, or more efficient.
The way to stem the drift toward inefficient communications is to link “how to use the tools” directly to tasks that people want or need to accomplish. That means identifying the use cases and the usage profiles in the organization, and matching communications capabilities with the work to be done. This is an important part of implementing new systems, and also of figuring out what capabilities and what technologies are really needed in the first place.
Of course the “inefficiency” trend started with keyboards themselves. The familiar QWERTY design was purposefully not the most efficient for typists. The goal was to slow down typing so the mechanical actions in manual typewriters wouldn’t jam. Dvorak keyboards are designed for better efficiency and less strain, but are rarely used.
We’ve been doing this a long time. The original voicemail systems didn’t have the technical capability to “answer the telephone.” They were non-integrated, verbal email systems. You logged into your mailbox and sent and received messages with others. That was a new process, and required changes in thinking about how communications works. A few years later voicemail was integrated with the PBX, and that automatic telephone answering function started replacing secretaries. We automated what was a manual process and saved some headcount. But a more innovative communications concept, “verbal email,” was lost. Few people today even know it’s still possible.