Contact Centers Get Smarter

Contact centers have evolved steadily over the past two decades and always seem to run on a mix of old-school voice technologies and newer solution elements. My exposure to contact centers has come as a customer, as a product manager for related connectivity products, and as a contributor to an IETF SIP-based standard for collecting user information. I also hear war stories on a regular basis from people I know who work in call centers.

One of the newest trends is adding Artificial Intelligence (AI) to the mix. Google recently announced its Cloud Contact Center AI solution, which is described in some detail in a blog post on the Google web site.

Google itself isn't in the contact center solution market (yet!), and this offering is designed to complement solutions from other providers. There is a rich history of solutions from companies such as Avaya, Genesys, Cisco and many others that were originally all premises-based, but contact center solutions have increasingly been moving to the Cloud in recent years. A review of the blog post noted above shows an interesting mix of how AI is injected into the fray. Contact centers are a people-intensive business: agents take incoming calls, and customers are queued until an agent is available. Google's Dialogflow development tool enables contact center providers to create automated virtual agents that can take incoming calls and use a combination of Interactive Voice Response (IVR) tools and access to databases to start interacting with callers. There are limits at this point, but tools such as Virtual Agent (shown as being in Beta) can begin analyzing the caller's needs, answer some questions and determine whether a handoff to a live agent is needed, as the sketch below illustrates.
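
To make the handoff idea concrete, here is a minimal Python sketch of the kind of decision logic a virtual-agent front end might apply once speech recognition has produced an intent and a confidence score. The intent names, threshold and helper function are hypothetical illustrations for this post, not part of Google's Dialogflow API.

    # Hypothetical sketch of a virtual agent deciding whether it can handle a
    # caller's request or should hand off to a live agent. The intent labels,
    # confidence threshold and helper functions are illustrative assumptions,
    # not part of any Google API.
    SELF_SERVICE_INTENTS = {"check_order_status", "reset_password", "store_hours"}
    HANDOFF_CONFIDENCE = 0.6

    def lookup_answer(intent_name, caller_id):
        # Stand-in for a database or CRM lookup behind the IVR.
        return f"Canned answer for {intent_name} (caller {caller_id})"

    def route_call(intent_name, confidence, caller_id):
        """Return ('virtual', answer) or ('live_agent', context) for one utterance."""
        if intent_name in SELF_SERVICE_INTENTS and confidence >= HANDOFF_CONFIDENCE:
            return ("virtual", lookup_answer(intent_name, caller_id))
        # Escalate, passing along what was learned so the agent doesn't start cold.
        return ("live_agent", {"caller": caller_id, "intent": intent_name,
                               "confidence": confidence})

    print(route_call("check_order_status", 0.92, "+15551234567"))
    print(route_call("billing_dispute", 0.41, "+15551234567"))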

Another new tool is called Agent Assist. Assuming that some incoming calls do eventually need to reach a human agent, this tool (shown as being at Alpha level) can help the agent move through the conversation by surfacing tips such as relevant knowledge-base articles or other shortcuts. A simple sketch of that idea follows.
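
The fragment below is a toy Python illustration of the "suggest a relevant article" step, scoring articles against the live transcript by keyword overlap. Real agent-assist systems use trained language models; the article titles and keywords here are invented purely to show the shape of the idea.

    # Hypothetical sketch of an agent-assist suggestion step: score knowledge-base
    # articles against the conversation transcript by keyword overlap.
    ARTICLES = {
        "Resetting your router": {"router", "reset", "offline"},
        "Understanding your invoice": {"invoice", "charge", "billing"},
    }

    def suggest_articles(transcript, top_n=1):
        words = set(transcript.lower().split())
        scored = [(len(words & keywords), title) for title, keywords in ARTICLES.items()]
        scored.sort(reverse=True)
        return [title for score, title in scored[:top_n] if score > 0]

    print(suggest_articles("I was double charged on my last invoice"))
    # ['Understanding your invoice']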

The big picture here is fascinating. There's been a long-term debate about whether AI should replace human roles or augment human capabilities. Walter Isaacson's book The Innovators includes some interesting discussion from AI experts on both sides of this argument. Google's AI effort pursues both directions. At the business level, contact centers employ a lot of people and need to assist many customers via tools that can include voice, chat, speech recognition and much more. Customers want answers, or perhaps want to make a purchase, so whether AI deals directly with the customer's needs or helps an agent get to answers more quickly, it's a win for customers. For the companies that deploy contact centers, AI offers another way to get more productivity out of the investment they have already made in contact center solutions and in the agents who use them. Human agents still add value, particularly when issues are complex or emotions come into play, so don't expect virtual assistants to eliminate those roles; over time, though, the trend is likely to be toward making more use of AI at various stages of the customer interaction.

If you or your company is active in the contact center eco-system, feel free to weigh in with your comments.  If your company would like advice on how trends like AI will affect strategies for providing contact center solutions, you can reach me on LinkedIn or at our web site.

Leveraging Industry Standards for Success – A Case Study

Telecom is an example of an industry that has created national and international standards for communications in ways that benefit companies large and small. As a consultant, I’ve frequently advised companies about specific standards and how they can be aligned with business strategies. Let’s consider an example.

During the last 15 years, facsimile communications have been dramatically affected by technological forces. The circuit-switched network is being replaced by IP networks throughout the world, as I noted in a previous post. As a result, every fax communications company has had to develop a strategy for the transition to IP. The vendor community anticipated this in the late Nineties, and key new standards for sending fax messages over IP were developed by the Internet Engineering Task Force (IETF) and the International Telecommunication Union (ITU). The IETF focused on integrating fax with Internet email, and the ITU split its efforts between supporting the IETF Internet Fax standards by reference (T.37) and devising a new standard for real-time fax communications (T.38).

Standards adoption often takes time, and such was the case for IP fax. There were some early adopters of the email-based approach (for example, Panafax and Cisco), but despite backing by both the ITU and IETF, the market didn't take off. One big reason was the emergence of voice communications over IP (VoIP), primarily based upon the IETF's Session Initiation Protocol (SIP), which gained increasing momentum during the first decade of the 21st century.

Several of us in the ITU and IETF took a small but critical step that allowed T.38 IP fax to ride this wave. In the year 2000, we completed an annex to T.38 which specified how it could be used with SIP. As a result, when implementors wanted to add fax support to their SIP-based Voice over IP solutions, the steps required for a Voice over IP session to spawn a T.38 fax session had already been specified in a T.38 annex (the illustrative session description below shows roughly what that switchover looks like on the wire). During this same period, Voice over IP gateways were emerging as the preferred approach for connecting the existing circuit-based network to the emerging IP network based on SIP. Cisco and other gateway manufacturers such as AudioCodes and Cantata (later renamed Dialogic) cut over to T.38 as their favored solution for fax over IP. The fax board manufacturers such as Brooktrout (later Dialogic) followed suit, and T.38 became the most widely adopted solution for fax over IP. The use of T.38 for IP fax was also supported by the Third Generation Partnership Project (3GPP) for 4th-generation mobile networks and by the SIPconnect initiative for SIP Trunking driven by the SIP Forum.
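
For readers who like to see the mechanics: when fax tones are detected during a SIP call, one endpoint typically sends a re-INVITE whose SDP offer replaces the audio stream with a T.38 image stream carried over UDPTL. The fragment below is an illustrative offer of that kind; the addresses, port and attribute values are typical examples I have chosen for this post, not values taken from any particular product or from the standard's own examples.

    v=0
    o=- 1827 1827 IN IP4 192.0.2.10
    s=T.38 fax session (illustrative)
    c=IN IP4 192.0.2.10
    t=0 0
    m=image 16384 udptl t38
    a=T38FaxVersion:0
    a=T38MaxBitRate:14400
    a=T38FaxRateManagement:transferredTCF
    a=T38FaxMaxDatagram:320
    a=T38FaxUdpEC:t38UDPRedundancy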

When I was advising my fax industry clients in the late Nineties, I suggested they keep a close eye on trends in both fax over IP and Voice over IP when deciding on their product directions. At that time, the IETF standards for Internet Fax via email had the early momentum, but in the standards community we kept working on both the email and real-time IP fax solutions. As noted above, ensuring that T.38 could eventually be used with SIP in a standards-based solution became very important as Voice over IP became a much bigger industry trend than fax over IP. As a result, fax solutions that work over the emerging Voice over IP networks became successful and are still being sold by many communications vendors today. The story didn't stop there: other important trends have emerged in recent years, such as the need for enhanced security and the transition from physical products to software-based solutions in the Cloud, which communications vendors need to bake into their strategies going forward.

If you have been in a business scenario where leveraging industry standards helped your company's products gain success, please feel free to weigh in with your comments. If you'd like to explore strategies for evolving your company's solutions and leveraging current or potential industry standards, you can reach me on LinkedIn or on our web site.

Business Disruption in Document Communications – What Happened?

In the late 1990s, the Internet and the World Wide Web created massive technical disruption in the worlds of document communications and messaging. Now, nearly twenty years later, business communications looks much different than it did going into the Millennium, and once-major businesses, such as the marketing of enterprise fax machines, are deep into their long-tail phase. In my last post, I noted several trends in both fax and email as the related standards communities pushed to transform these technologies for the new IP world. Let's look at what happened.

One major driver of the success of fax in the Nineties was the classic network effect postulated by Ethernet inventor Robert Metcalfe. In essence, Metcalfe stated that a network becomes much more compelling as the number of connected devices increases (the small sketch below shows how quickly that value grows, and shrinks, under that assumption). In the Nineties, the fax machine vendors and computer fax companies were often on opposing sides in technical battles, but all of these companies benefited from Metcalfe's network effect as it applied to the overall fax network. But as we crossed into the 21st century, fax machines designed to run on the circuit-switched phone network (aka the Public Switched Telephone Network, or PSTN) had much less utility in an increasingly IP-connected world. As a result, physical fax machines began to disappear from larger enterprise offices, and in smaller offices they were often replaced by less expensive multi-function peripherals (MFPs), which were basically printers that also included fax and scanning features. This meant that the total number of Group 3 fax devices at first plateaued and then began to decline. In essence, Metcalfe's network effect played out in reverse. The fax machines and MFPs of the Nineties did not evolve to use the new IP fax standards, so as document communications moved to IP, these physical fax and MFP devices still only sent faxes over the PSTN and became less connected as IP communications grew more prevalent.
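
As a rough illustration of that reverse network effect, Metcalfe's observation is often approximated by saying a network's value grows with the number of possible connections, n(n-1)/2. The Python sketch below applies that simplification to a shrinking device count; the device counts are made up and only meant to show how steeply perceived value falls as endpoints leave the network.

    # Toy illustration of Metcalfe's network effect: value grows roughly with the
    # number of possible pairwise connections, n * (n - 1) / 2. Device counts are
    # invented solely to show how steeply that value falls as endpoints leave.
    def network_value(devices):
        return devices * (devices - 1) // 2

    for label, devices in [("peak fax install base", 40_000_000),
                           ("after decline", 10_000_000)]:
        print(f"{label}: {devices:,} devices -> {network_value(devices):,} possible connections")
    # A 4x drop in devices cuts the possible connections (and, under this
    # simplification, the network's usefulness) by roughly 16x.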

If we consider the trends in computer-based fax, they played out differently. Companies like Brooktrout sold fax boards to independent software developers, and the boards were incorporated into local area network solutions that typically included tight integration with email. By 2004, Fax over IP enabling technology started to be commercialized, using the ITU-T T.38 IP fax standard. T.38 had some technical issues, but it could use the same call control protocols (SIP, H.323 and H.248) that were being adopted by the new Voice over IP networks, so it became a popular choice for conveying fax over those VoIP networks. By contrast, the T.37 approach of Internet fax over email did not get much adoption, most likely because it didn't mesh well with Voice over IP. The computer-based fax solutions that ran on local area networks continued to see healthy growth in the first decade of the 2000s, in large part due to the continued validity of fax as a legal document, perceived security advantages compared to email over the Internet, a slow ramp-up in the use of digital signatures on other electronic documents, and regulations such as the Health Insurance Portability and Accountability Act of 1996 (HIPAA), which meshed well with receiving fax documents in electronic form (rather than on a paper tray).

During the same period, email use continued to grow, but rising issues such as lack of security and massive amounts of spam made using email outside of corporate networks subject to a number of hassles. As noted above, electronic signatures started to become available as a legal alternative to fax signatures, but didn't gain widespread use until the past few years. As a result, enterprises tended to standardize on a particular commercial email package and to communicate whenever possible over secured private IP networks, making use of security tools such as Virtual Private Networks (VPNs).

Now, in 2018, the messaging world is highly fragmented. Large enterprises have tended to choose unified communications eco-systems from large players like Microsoft, Cisco and Avaya, but even these solutions are rapidly evolving as momentum shifts toward pushing enterprise communications into the Cloud. Hence, Microsoft has shifted its emphasis from Lync to Skype for Business and now on to Teams, and other vendors such as Cisco are doing much the same. Upstarts such as Slack started by offering cloud-based team communications and have forced reactions from the traditional Unified Communications players. As messaging has evolved, voice has become less important and fax is now more of a niche play. One thing I don't see much of is business communications that can effectively cross the boundaries between organizations. In theory, Cloud-based communications could get us there, but the late-Nineties vision of being able to communicate documents and other types of media effectively across the entire Internet has been hobbled by security, privacy and spam issues. We'll have to see whether the Cloud and better cross-network security mechanisms can form the foundation for approaches that are superior to today's highly balkanized communications landscape.

If you or your company have participated in the massive changes to the communications eco-system since the 1990s, feel free to weigh in with comments. If you’d like to explore strategies on how to evolve your application solutions or other communications products and services to better address the rapidly changing business environment, you can reach me on LinkedIn or on our web site.

A Tale of Business Disruption in Document Communications

In the middle of the 1990s, the Internet and its associated IP protocols were like a huge wave off the shore of the business world, poised to come in and cause massive disruption. At that time, I ran a consulting business for telecom clients (Human Communications) and was active on several fronts in preparing for what was coming. In the TR-29 fax standards committee, we started work on how fax communications could take place over the Internet. A small group began work on an initiative called Group 5 Messaging, whose goal was to take the best ideas of fax, email and telex and spin up the next generation of business communications. In late 1996, the Internet Engineering Task Force (IETF) held an informal Birds of a Feather (BOF) session on Internet fax. In meetings of Study Group 8 of the International Telecommunication Union (ITU), discussions began on how to extend fax protocols to work over the Internet or on private IP networks.

On the business side, fax was very hot, and even very small businesses such as pizza parlors had purchased fax machines. Corporations had been adopting fax over Local Area Networks, and companies like RightFax, Omtool, Optus and Biscom had very healthy businesses selling into this space. Brooktrout Technology had introduced multi-channel fax boards and drivers for Windows NT, and had built up market momentum that enabled the company to go public. But all of this fax technology was based on sending faxes over circuit-switched networks. What would be the impact of the Internet and its technology on fax and business communications?

By 1999, the business communications landscape had changed dramatically. On the standards front, the IETF had created several standards for providing fax services via email, and the ITU had referenced these standards in T.37. The ITU had also independently created a new T.38 standard, which essentially extended the T.30 Group 3 fax protocol into the IP packet world. The Group 5 initiative had lost momentum, as the fax and other communications players lined up to support the new IP-based standards from the IETF and ITU, which appeared to solve the problem of how to send faxes over IP. Related standards work continued, and I was active in making sure that the new T.38 fax protocol was supported under both the existing H.323 call control protocol and the new SIP and Megaco (later H.248) protocols.

On the business side, fax was still doing well but now had new competition. The advent of the World Wide Web had totally wiped out the fax-on-demand business that had done well in the early Nineties. Various pundits were saying that email was the future of business communications and that new portable document formats like Adobe's PDF would be used in place of fax. Curiously, the email experts who participated in the IETF Internet fax work weren't so sure. Fax had business quality-of-service elements that were hard to duplicate in email: notably, instant confirmation of delivery at the end of a session, negotiation between the endpoints on which document formats were acceptable, and the legal status of fax, since fax messages sent over the circuit network were accepted as legal documents for business purposes. The IETF working group tried to upgrade email protocols to address the technical elements, but the work was hard and the path to adoption slow.

I also shifted my own career, suspending my consulting business to join Brooktrout Technology and help them participate in the new Voice over IP business. Just before I left my business, I advised my fax clients and newsletter subscribers to diversify and not put all of their eggs in the fax communications basket. I saw both challenges and opportunities ahead. A large number of new startups had attempted to ride IP fax to success in the late Nineties, but most of them crashed and burned within a couple of years. E-Fax had introduced "free" IP fax mailboxes, an approach quickly emulated by competitors, but the business model for "free" wasn't obvious. I'd helped form a new industry association, the Internet Fax and Business Communications Association, in early 1999, but we had difficulty getting fax and other communications industry vendors to sign on. The times were turbulent and the way forward was less than obvious.

In my next post, I’ll talk about how the trends toward IP Fax and its communications competitors played out and which related business communications issues still need to be addressed.

If your organization has participated in the evolution of fax or other business communications during this transition from the circuit-switched phone network to IP, please feel free to comment. If you'd like to explore strategies for evolving your application solutions or other communications products and services in this rapidly changing business environment, you can reach me on LinkedIn or on our web site.

Paradigm Shift: Virtual to the Cloud

We live in a world where communication solutions can be hardware-based, run in a virtual machine on a local server or be situated in the Cloud. The paradigm for communications solutions has been shifting from hardware to software to virtualization as I’ve discussed in my recent posts. Once a solution is virtual, in principle, customers have the flexibility to control their own destiny. They can run solutions on their own premises, in the Cloud, or with a hybrid model that uses both approaches.

Let's consider an example. Dialogic has followed this type of evolution in its SBC products. In 2013, the company positioned two products as SBCs. The BorderNet™ 2020 IMG provided both SBC and media gateway capabilities and found an audience that wanted an IP gateway between different networks or an enterprise edge device. The BorderNet™ 4000 was a new product focused on SBC interconnect functions, running on an internally-managed COTS platform. Five years later, both products have changed significantly. The IMG 2020 continues to run its core functions on a purpose-built platform, but its management can be either virtual or web-based. The BorderNet™ 4000 has morphed into the re-branded BorderNet™ SBC product offering, evolving from its initial hardware focus into a more portable software offering. Customers can now run the software on a hardware server, in a choice of virtual machines, or by deploying on the Amazon Web Services (AWS) cloud. Whereas the original BorderNet 4000 only supported signaling, the BorderNet SBC can optionally also support transcoding of media, either in hardware (using a COTS platform) or in software. The journey of these products has offered customers more choices. The original concepts of both products are still supported, but the products now have elements of virtualization that have enhanced their portability. As a result, the full functionality of the BorderNet SBC can run in the Amazon cloud as well as in the other deployment models.

Once a product has been virtualized, it can be deployed in numerous ways and under a variety of business models. As customers move solutions to the Cloud, being able to run one or more instances of software in virtual machines is essential. The term Cloud tends to be used generically, but in telecom there are multiple ways the evolution to the cloud is playing out. One example is the OpenStack movement, where open source has helped drive many private cloud deployments. Public clouds have also been popular, with variations offered by Amazon, Microsoft, Google, Oracle, IBM and others.

In my next post, we’ll consider how the technical changes we’ve been describing here have also been coupled with changes to business models.

If you participated in the evolution described here, please feel free to weigh in with your comments. If you’d like to explore strategies on how to evolve your application solutions or other communications products / services in this rapidly changing technical and business environment, you can reach me on LinkedIn.

Following the Path to Virtualization

A number of years back, my product team engaged with a Tier 1 solution provider. They wanted to use our IMG media gateway as part of their solution, but with a condition: they had limited rack space, so they wanted to use an existing server to manage our device. Up until then, we had required customers to load our element management system (EMS) software onto a dedicated Linux server. Instead, our customer asked us to package the EMS software to run on a virtual machine. Our team investigated and was able to port both the underlying Linux layer and the EMS application to a Xen virtual machine. Voila! Our software was now virtualized, and our customer was able to re-use their existing server to manage the IMG gateway.

That was my introduction to virtualization, but the approach quickly became much more important. Just a few months later, other customers asked us to port our EMS software to run in the VMware virtual machine environment. There were immediate benefits. The EMS running directly on server hardware required qualification of new servers roughly every two years, an arduous process that became more difficult over time. By contrast, the virtual EMS (which we shortened to the VEC) ran on a VMware virtual machine, so we were isolated from any server changes the customer might make. The VEC was also a software-based product, so we could offer it at a retail price well under $1,000, versus the $3,000+ price point of the server-based version. Over the next several years, more and more customers moved to the virtualized version of the software, and demand for the server version declined.

A couple of years ago, I was asked to take over a new software-based load balancer (LB) product developed by a Dialogic software team in the United Kingdom. The back story had some similarities to my earlier experience. The team was working with a major customer who really liked their software-based media resource broker (MRB) but had issues with the LB product offered by a major market player. The team built the software load balancer so that it could run either directly on a server or on a virtual machine. When we launched the product for use by all customers, our Sales Engineering team loaded the software onto their laptops using a commonly available virtualization program and were immediately able to set up prototype sessions and adjust configurations via the software's graphical user interface. So the LB software was virtualized from the beginning. This was part of an overall trend within Dialogic, as more and more of the software-based components of products were converted for use in virtual environments.

In the early days, virtualization in telecom was mainly used for software tools like user interfaces and configuration, but that is now changing in a major way. The LB product from Dialogic runs in a totally virtual mode, supporting everything from configuration to balancing streams of protocols as different as HTTP and SIP, along with very robust security. Across the telecom industry, virtualization is being used in several ways as part of a sea change in which the new approach to scalability builds additional capacity and resiliency by adding new instances of software. In turn, this drives the need for new types of orchestration software that can manage operations in a world where the paradigm is creating, managing and deleting instances based on real-time needs; the small sketch below shows the shape of that scale-out logic.
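
To make the orchestration idea concrete, here is a minimal Python sketch of scale-out and scale-in logic driven by load. It is a simplification under assumed thresholds and hypothetical start_instance/stop_instance hooks, not the behavior of any particular orchestrator; real systems add health checks, placement decisions, draining of live sessions and many other concerns.

    # Minimal sketch of scale-out orchestration: add a software instance when
    # average load is high, remove one when it is low. Thresholds, names and the
    # start/stop hooks are hypothetical, not any particular orchestrator's API.
    SCALE_OUT_AT = 0.75    # average utilization that triggers a new instance
    SCALE_IN_AT = 0.30     # average utilization that allows removing one
    MIN_INSTANCES = 2      # keep a floor for resiliency

    def reconcile(instances, avg_utilization, start_instance, stop_instance):
        """Return the new instance list after one orchestration pass."""
        if avg_utilization > SCALE_OUT_AT:
            instances = instances + [start_instance()]      # spin up one more copy
        elif avg_utilization < SCALE_IN_AT and len(instances) > MIN_INSTANCES:
            stop_instance(instances[-1])                     # drain and delete it
            instances = instances[:-1]
        return instances

    # One example pass with stub hooks standing in for a real cloud or VIM API.
    new_ids = iter(range(2, 100))
    pool = ["lb-0", "lb-1"]
    pool = reconcile(pool, 0.82, lambda: f"lb-{next(new_ids)}", lambda inst: None)
    print(pool)    # ['lb-0', 'lb-1', 'lb-2']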

In my next post, I’ll talk about other ways that virtualization is being used as a key principle for building out telecom operations in a variety of Cloud environments. Virtualization is still a relatively young technological movement, but it has already helped spawn some surprising developments.

If you participated in the evolution described here, please feel free to weigh in with your comments. If you’d like to explore strategies on how to evolve your application solutions or other communications products in this rapidly changing business environment, you can reach me on LinkedIn.

Reshaping Enterprise Communications: A Tale of Two Companies

In my last few posts, I’ve described several factors which have encouraged communications solution providers to transition away from hardware and focus on software-based application solutions.

Let's consider two companies and how they adjusted their technical and business models to address these directions. Avaya is an example of a company whose solutions had a substantial amount of proprietary hardware around the time it split off from Lucent in the year 2000. Avaya had a leading market share in multiple markets targeted at enterprises, including PBXs, which provided telephone infrastructure for enterprises, and call centers, which used Avaya components to meet customer needs for highly scalable inbound and outbound communications. But the advent of IP-based technology and new protocols such as SIP began to change all of that. The mantra of IP-based communications was that voice was just another application running on an IP stack. This massive technical change was a major challenge for Avaya, since it had built its business on selling PBX and call center solutions based on its own hardware, and the cost of sustaining this business model was high.

So starting around 2002, Avaya executed a pivot to adjust to the new situation. First, they introduced new IP-based versions of their PBX technology, ranging from IP phones to an IP-based PBX and a suite called IP Office for small to medium-sized businesses. Second, they told potential partners that they wanted to move out of the hardware business and focus on the value provided by their software. Third, they created a partner program, the Avaya DeveloperConnection program (later shortened to DevConnect), and encouraged partners to either build on or connect to Avaya solutions. As a result, Avaya was able to cultivate relationships with hardware appliance companies for products like media gateways and focus more on building out their application software. The DevConnect program also fit well with Avaya's increased role as an integrator: solutions for customers could be built using not only Avaya technology but also DevConnect-certified products. So Avaya had an approach to building out software-based solutions using IP, but they also had a large installed base of hardware-based solutions, so they were not as nimble as some of their competitors.

The advent of SIP helped to encourage new market entrants into the communications software space. A prominent example was Microsoft. Starting around 2007, Microsoft introduced its new communications solution, Office Communications Server 2007, or OCS. OCS used SIP as its backbone protocol and touted the ability for enterprises to eliminate the cost of a PBX and replace it with software running on Commercial Off the Shelf (COTS) servers. Enterprises still needed to connect to the telephone networks run by service providers, which were heavily based on circuit-switched technologies, so Microsoft started its own partner and certification program to qualify third-party products such as media gateways. Microsoft also had a lot of marketing muscle, since applications such as Microsoft Office were widely used within enterprises, so it had a ready audience among information technology managers at customers. In 2010, Microsoft re-branded the offering as Microsoft Lync. Microsoft quickly became a big player in the new Unified Communications market and began to take market share away from traditional PBX vendors such as Avaya. Microsoft also continued to be aggressive in cultivating relationships with third-party hardware partners, who added support for Lync-compatible IP phones and newer IP-based products such as Session Border Controllers (SBCs). Microsoft has since re-branded Lync as Skype for Business, but the underlying technology and business model are an evolution of Lync.

The market battle for leadership in enterprise communications continues, but the momentum has shifted heavily to software-based solutions, with most hardware components provided by other vendors. One exception to this direction is Cisco. Cisco has maintained a strong presence on the hardware side of communications by virtue of its leading market position in routers and has incorporated additional functions such as media gateways and SBCs into its routers. However, Cisco has also built its own software-based Unified Communications suites and Contact Center solutions, so it uses the software-based application model but pairs it with Cisco network components to create its solutions.

In summary, the advent of SIP is one of several factors which have radically changed the landscape for communications solutions. In this post, we’ve considered how Avaya and Microsoft built their business strategies based on the strong move to IP-based software solutions over the last decade. In my next post, I’ll talk about another important technology development, virtualization, which is in the process of re-shaping how both application software and communications infrastructure products are being developed and brought to market today.

If you participated in the evolution described here, please feel free to weigh in with your comments. If you’d like to explore strategies on how to evolve your application solutions or other communications products, you can reach me on LinkedIn.