Faxed: A Book Review – Ruminations

In my last post, I talked about the book written by historian and professor Jonathan Coopersmith entitled Faxed: The Rise and Fall of the Fax Machine.  I left off as fax entered the late Eighties and became wildly popular.  As Coopersmith recounts, this confounded the experts, who were expecting electronic messaging, videotex or a variety of other technologies to supersede fax.

In my own work life, I’d worked for a fax company for a decade by then, but didn’t get close to the technology until I joined Product Line Management and wrote the business case for a fax modem product.  Like many companies, Fujitsu sold fax machines, but we also started developing computer-based products.  Around 1989, we released the dexNet 200 fax modem and accompanying software called PC 210, which ran on IBM-compatible computers.  A year later, my boss sent me to the TR-29 fax standards committee, and I discovered that this group was writing standards that would have a big impact on most companies in the wild-west activity known as computer fax.  I also joined an upstart industry group, which became the International Computer Fax Association (ICFA), and started reporting to it on the standards being developed at TR-29.  Fax was hot, but Fujitsu was focused on its mainframe computer business and shut down the US-based subsidiary that employed me, Fujitsu Imaging Systems of America (FISA).  After a month of soul searching, I decided to start a consulting business called Human Communications, which advised clients on fax and related technologies.  The ICFA was one of my first clients, and I continued to attend TR-29 and gradually built up a client list among fax machine and computer fax companies.


By late 1993, the business was going well, and that’s when I first met Jonathan Coopersmith.  In his book, he talks about this period as the heyday of fax.  Fax did extremely well in the US, as pizza parlors installed fax machines and offices of every size had one.  But it became even more popular in Japan.  The Japanese fax manufacturers competed fiercely, but also cooperated to ensure interworking between their machines.  I started attending the meetings of ITU-T Study Group 8 around this time, as we were building new extensions to the very popular Group 3 fax standard.  There was a newer digital standard called Group 4, but Group 3 took on its best attributes and basically shut Group 4 out of the market.

In the mid-Nineties, the Internet and the World Wide Web exploded and began a massive transformation in the way the world communicated.  In the fax community, it was obvious to many of us that the Internet would have a huge impact, so we started a very aggressive effort to write Fax over IP standards.  Dave Crocker, a co-author of the standard for electronic mail in the Internet Engineering Task Force (IETF), came to TR-29 and asked for volunteers to begin the work of Internet Fax in the IETF.  A similar effort began in the ITU.  The work proceeded from ground zero to completed standards by 1998, which was unusually fast for standards groups.

I left the fax consulting business in late 1999 and joined the Voice over IP industry.  By then, there were already signs that fax would lose its dominance. The World Wide Web totally took over the information access role that had been played by Fax on Demand.  The chip companies stopped focusing on fax and by the time a new version of the T.38 standard was written in 2004 to accommodate the faster V.34 modem speeds for fax over IP, the VoIP chips didn’t support it.

In Japan, as Coopersmith explains, fax had been even more dominant than in the US.  The visual aspects of Japanese characters such as kanji meant that computer keyboards were much slower to develop in Japan than in the US market.  By the time I met Jonathan again in 2004, fax had begun its next phase, becoming more of a niche business both in the US and in Japan.  It still sells well in some market segments, and there has been a bit of a renaissance as the T.38 fax standard has kicked in to accompany Voice over IP, but the arc of technological history is now in the long-tail phase for fax.

Fax is a classic example of a technology that had many false starts — the first half of the book shows just how many there were — but eventually caught fire as all of the pieces came together for massive success. This book offers some good context on all of this and has many useful lessons learned for technologists. Great technology is never enough by itself, but when the right combination of market needs and technology come together, amazing things can happen. Faxed, the book, tells that story for fax.

 

 


Faxed: A Book Review – Part 1

In 1993, I visited the city of San Antonio to participate in a speaking engagement on fax at a conference on electronic commerce.  While there, I had dinner with a professor from Texas A&M University named Jonathan Coopersmith.  We had an engaging conversation about facsimile technology, and he told me that he was writing a history of the subject.  The fax business was in full ferment at the time, and I’d been busy during the past several years working on the TR-29 fax committee, which prepared US fax standards and also submitted contributions to the International Telecommunication Union (ITU), the group which defines standards for fax and other telecom technologies.

Fast forward about ten years.  Jonathan visited Needham, Massachusetts to interview the executives of Brooktrout Technology and discovered that I also worked at the company.  He invited me to lunch, and we talked about how much fax had changed in the prior ten years, going from the world’s hottest communications technology to one of many ways of communicating in a world now dominated by Internet-based tech.  He also said that yes, he was still working on the book, but he had taken time out to raise his family and had been sidetracked on that long-running project.  We continued to exchange messages over the next several years, notably when he visited Japan to interview sources there in person.  Around 2010, he sent me a draft of a chapter on computer fax and fax during the Nineties, and I offered some feedback.

In 2015, Jonathan got in touch.  Great news: the book was done and published.  The result is called Faxed: The Rise and Fall of the Fax Machine.  He sent me a copy, and I recently sat down and read it over a few weeks.  Jonathan is an historian specializing in the history of technology.  He discovered fax as a user, finding that the fax machine was a technology even his mother could use effectively.  He’d also discovered that the existing books on fax were not written from an historical perspective, so he decided to write one.


Fax has a fascinating history.  It was invented in 1843 by Alexander Bain, a Scottish physicist, during the era when the telegraph was the king of communications technology.  Bain was one of several notable figures in the early days of fax; as Coopersmith notes, the idea attracted a diverse group of inventors who worked not only on fax, but also on improvements to the telegraph.  I’d been aware of Bain, but Coopersmith digs in and finds many others who advanced fax in one way or another during its first seventy years.  The technology was promising but difficult, involving aspects of mechanics, optics and electronic synchronization which tended to exceed the state of the art at the time.  The early markets for fax sprang up around World War I and its aftermath, as newspapers began to supplement written words with photographs transferred via fax and competing technologies.

As Coopersmith recounts, fax moved forward in fits and starts and consumed a great deal of financial capital in the process, but did not actually result in a successful commercial market until the Sixties, when new technologies such as the photocopier from Xerox made it easier for faxed documents to be copied and exchanged within businesses and other organizations.  Even in this period, there was a lack of standards, and the main markets were the US and Japan.  Xerox appeared to have all of the pieces to dominate the market, but it invested elsewhere, and startups began to compete for the burgeoning market of fax machines targeted to offices.

Two developments changed the landscape in a dramatic way.  First, the Carterfone decision forced AT&T to allow third-party devices to connect to the phone network and opened the way for telecom technology to advance outside of the monopolistic Bell System.  Coopersmith notes that NTT was forced to open its network in Japan just a few years later, which also encouraged a number of companies in Japan to jump into fax.  The second development was the hard set of compromises that resulted in the first widely accepted fax standard, Group 3, which was agreed within the International Telegraph and Telephone Consultative Committee (CCITT) in 1980.  With the advent of Group 3, the factories in Japan were able to standardize mass production of fax machines, and Japan became the supplier of fax machines for the world.

In the late Eighties, the sub-$1000 fax machine debuted and the fax explosion was in full motion.  Around this time, a court in New York State accepted that fax documents could be used in legal proceedings and fax gained a stature which pushed other technologies like Telex aside.

During this period, I worked for Fujitsu Imaging Systems of America (FISA) and was a product line manager for a new technology called computer fax modems.  FISA had bought one of the early fax success stories from the Sixties, Graphic Sciences, from Burroughs Corporation in 1986.  This is where my story begins to intertwine with the fax history which Coopersmith recounts.  I’ll continue this review in my next post.

Secure IP Fax – Now Standard

Last fall, I blogged about a pending standard for securing facsimile communications over IP networks, and I spoke about this progress at the SIPNOC conference. Since that time, the standard, known as RFC 7345, has been approved by the Internet Engineering Task Force. The availability of a standard is very good news. There’s a common perception that fax isn’t used anymore, but there are a number of business-to-business (B2B) and consumer applications where fax is still common, including real estate, insurance, health care and legal applications. There are also a number of companies which provide fax by selling equipment, fax-enabling technology, software or a hosted service.

So why should people or companies care about securing IP fax? Increasingly, most of our real-time communications, whether by voice, fax, text or video, are transported over IP networks. Very often, they travel over the Internet for a portion of their journey. The Internet is ubiquitous, but fundamentally insecure unless the application or transport layers provide security. Security can mean many different things, but it often refers to solutions for needs such as privacy, authentication and data integrity. The new RFC 7345 is designed to support these types of requirements by applying a standard known as Datagram Transport Layer Security (DTLS). One of the key reasons the Fax over IP RFC uses DTLS is that the T.38 IP fax protocol typically formats its signals and data using the User Datagram Protocol Transport Layer (UDPTL), unlike most real-time media, which use the Real-time Transport Protocol (RTP).  DTLS was designed to provide security services for datagram protocols, so it’s a good fit for T.38 IP fax.  The current version of DTLS is 1.2, which is defined in RFC 6347.
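In practical terms, RFC 7345 works by defining a new SDP transport token, so the secured fax session can be negotiated in the usual SIP offer/answer exchange. As a rough sketch, the media section of such an offer might look something like the following (the port, fingerprint and attribute values here are illustrative placeholders, not taken from the RFC):

```
m=image 54111 UDP/TLS/UDPTL t38
a=setup:actpass
a=fingerprint:sha-256 3C:A8:D2:9B:...:4F
a=T38FaxVersion:0
a=T38FaxRateManagement:transferredTCF
```

The fingerprint attribute binds the DTLS handshake to a certificate exchanged in signaling, following the same general pattern already used to secure WebRTC media with DTLS-SRTP.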

Getting a standard approved is really only the beginning. In order to get traction in the marketplace, there need to be implementations. For example, T.38 was originally approved in 1998 by the International Telecommunication Union, but implementations did not become common until many years later, starting around 2005. In the time since, T.38 has become the most common way to send fax over IP networks, and it’s been adopted by most of the fax ecosystem.  On the plus side, a key advocate for the new standard is the Third Generation Partnership Project (3GPP), the standards group that drives standardization of services running over mobile networks, such as the emerging Long Term Evolution (LTE) network.  The SIP Forum is also continuing work on its SIPconnect interworking agreements, and there is potential for including the new standard in a future version of SIPconnect.

I’ll continue to track what’s happening with respect to implementation of the standard.  As I noted in some of my previous posts, the current work on standardizing WebRTC is helping implementors gain experience with important new standards for security, codecs and Network Address Translation (NAT) traversal. This WebRTC “toolkit” is also available in open source form.  The inclusion of DTLS in RFC 7345 joins the pending RTCWeb standards in providing new applications and use cases for these emerging standards. This is good news for the user community, as features which were previously available only in proprietary solutions get implemented in a variety of products and services.  If you know of any plans in motion or want to learn more, please feel free to comment or get in touch with me.  You can also learn more by checking out my presentation on Securing IP Fax.

On the Road Again – SIPNOC 2014

I’ll be speaking next week at the SIPNOC conference in Herndon, Virginia.  SIPNOC is sponsored by the SIP Forum and covers a wide variety of topics related to SIP — the Session Initiation Protocol — with a particular focus on the needs of service providers.   It runs from June 9 – 12.

WebRTC continues to be a hot topic in the telecom industry, and I’ll be on a panel with several other participants to discuss the relationship between SIP and WebRTC.  SIP has been the primary protocol for Voice over IP and is widely deployed.  WebRTC is much newer, but it offers an interesting mix of audio, video and data capabilities, and it can be accessed via popular browsers from Google and Mozilla.  WebRTC also has a rapidly growing ecosystem.  Are SIP and WebRTC complementary standards which work well together, or are they going in totally different directions?  Come to the panel and find out!

I am also delivering a presentation on a very exciting development in IP fax communications over SIP.  The presentation is entitled: Securing IP Fax – A New Standard Approach.  It’s been a long time coming, but there will soon be a new security standard for implementors of IP Fax over SIP networks.  In particular, the Internet Engineering Task Force is working on using an existing security standard known as DTLS and adding this as a security layer for T.38 fax.    I’ll be talking about the pending standard, why it’s needed and what kind of benefits can be expected for the many users of T.38 IP fax once the new standard is deployed.

I’ve attended SIPNOC as a speaker since its beginning four years ago.  It’s an excellent conference and offers an in-depth perspective on the latest news in SIP, as delivered by an all-star cast of speakers.  I hope you’ll be able to join us.

Securing Fax over IP for Business Communications

The recent controversy regarding NSA tracking of phone conversations has elevated concerns about security and privacy for business communications. Enterprises generally want to keep their communications private. Use of techniques such as private networks, firewalls and secured tunnels can help to protect internal communication from eavesdroppers, but there are also many exchanges which entail communication with third parties over public networks.

Facsimile is best known as a method of communicating images of printed pages over the Public Switched Telephone Network (PSTN) and many fax companies touted the PSTN as being much more secure than the public Internet, hence reducing the need for formal security approaches. But the circuit-switched network is rapidly being replaced by hybrid and all-IP networks, and a portion of business fax traffic is now sent over the Internet.

During the Nineties, the fax standards experts in the International Telecommunication Union (ITU-T) added annexes to the Group 3 fax T.30 protocol to protect against a variety of security threats. However, there was a lack of consensus on how to proceed, so two different approaches were standardized. As attention turned to standardizing fax over higher-speed V.34 links and over IP networks, the initial efforts to implement fax security using the new standard approaches fizzled out and never got traction in the marketplace.

Fast forward to 2013. Security and privacy now have a much higher profile. The NSA exposé and other security glitches like the Wikileaks exposures of government and corporate documents have increased awareness of the down side of unsecured documents and communication. In the meantime, as the phone network is being replaced by IP technology, most new sales of fax to the enterprise are for Fax over IP and the T.38 standard from the ITU is frequently used. Most applications of T.38 use a transport protocol called UDPTL (User Datagram Protocol Transport Layer) which is currently an unsecured protocol.
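To make the exposure concrete, a typical unsecured T.38 session is offered in SDP with UDPTL named directly in the media line, along these lines (the port and attribute values are illustrative, not from any particular capture):

```
m=image 49170 udptl t38
a=T38FaxVersion:0
a=T38FaxRateManagement:transferredTCF
a=T38FaxUdpEC:t38UDPRedundancy
```

Nothing in this exchange provides encryption or integrity protection for the fax payload itself; anyone with access to the packet path can reconstruct the document.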

The conventional wisdom might have a “who cares?” attitude, since there’s a common perception that nobody uses fax anymore. However, fax is still used a great deal in a wide variety of business settings, including healthcare, financial and legal organizations, and fax is integrated into a variety of business processes. Fax is also used to transmit many normally confidential documents such as insurance claims, real estate transactions and legal notices, plus there are regulations, such as HIPAA in the health care domain, which require protection of documents from third parties.

For all of these reasons, the need for better security solutions for IP-based facsimile is becoming clear. In another realm of standardization, WebRTC is attracting a lot of attention as a next-generation method for performing a wide variety of real-time communications, such as video and voice, over web protocols. The original applications of the Session Initiation Protocol (SIP) were often implemented with little attention paid to security. The WebRTC standards activities have therefore examined the best approaches for addressing matters such as security, and they recommend use of a relatively new security protocol known as Datagram Transport Layer Security (DTLS) to secure real-time media within WebRTC.

One advantage of DTLS is that it is relatively protocol agnostic and can be applied as a security layer for various different protocols. So this is a good time to consider how protocols planned for use in WebRTC might also have other applications. The Third Generation Partnership Project (3GPP) has recognized that IP fax is still an important application and wants a standard approach for securing faxes which are transported over IP networks. As a result, there is now an Internet Draft being circulated for comments within the MMUSIC (Multiparty Multimedia Session Control) working group of the Internet Engineering Task Force (IETF), which proposes that DTLS be established as a transport layer that can be used to secure T.38 IP fax sessions when running over the SIP protocol.

I’m personally enthusiastic about this direction and have made comments on the current draft. I find it ironic that the IETF is looking at adding security layer support to an ITU protocol, but in the world of standards, it’s useful for the work to be done by the experts who have the right domain expertise. In this case, the IETF created DTLS and there is interest in the combination of UDPTL and T.38 from the Fax over IP task group of the SIP Forum, so there is probably enough participation by the Internet and fax communities to produce a useful standard. At this writing, MMUSIC is considering adoption of this draft as an official working group item.

Stay tuned on this one. WebRTC is training a generation of engineers to use a new toolkit of various protocols, so the potential adoption of DTLS by the IP fax community may be a harbinger of a trend to re-purpose various components of the WebRTC initiative in innovative and surprising ways.

WebRTC – Solution for Over The Top Communications?

WebRTC offers an intriguing mix of web-based access and real-time communications.   Part of the excitement has been due to the aggressive approach which has been taken by browser companies such as Google and Mozilla in adding WebRTC to recent versions of their production browsers. 

As a result, any user of these browsers could potentially be connected to other users of WebRTC applications. One example where this could come into play is in Over the Top (OTT) applications. The term Over the Top usually means that an application runs over a broadband IP network and is usually not a packaged service sold by the Internet service provider (ISP). For example, Skype provides a way to do audio and video communications over IP networks. Its base level of service allows for connection to other Skype users at no charge for both audio and video communications. Skype also includes sophisticated features like encryption of calls. For ISPs, Skype potentially competes with a bundled voice offering and a user might elect to use the combination of Skype and a mobile phone for all of their voice communications. This means the ISP gets to sell the customer a broadband IP connection, but may not get any other bundled service revenue.

Let’s suppose you’re an ISP that would like to offer an alternative to Skype for your customer community. What does WebRTC bring to the table? On the media side, WebRTC can support both audio and video communications. It also has built-in security methods for authentication and securing of sessions. For the application, the ISP can create this from scratch or layer this onto a WebRTC enabled browser and automatically take advantage of the WebRTC “hooks” which are built into a browser such as Chrome or Firefox. To truly complete the OTT application, there is still more to do such as determine which signaling to use, and what addressing scheme should be used to interconnect users. For a good analysis of the signaling side, see this recent blog post from webrtchacks.
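Since WebRTC standardizes the media and security layers but deliberately leaves signaling to the application, an ISP building this kind of service has to define its own message framing between clients. A minimal sketch of what that framing might look like follows; all of the names here are hypothetical, and JSON over a WebSocket is just one common choice, not anything mandated by the WebRTC standards:

```javascript
// Frame a signaling message for transport (e.g., over a WebSocket).
// 'type' is one of 'offer' | 'answer' | 'candidate'; 'payload' carries
// the SDP or ICE candidate produced by the browser's RTCPeerConnection.
function makeSignal(type, from, to, payload) {
  if (!['offer', 'answer', 'candidate'].includes(type)) {
    throw new Error('unknown signal type: ' + type);
  }
  return JSON.stringify({ type, from, to, payload });
}

// Parse an incoming message back into an object the application can route
// to the right peer connection.
function parseSignal(raw) {
  const msg = JSON.parse(raw);
  if (!msg.type || !msg.to) {
    throw new Error('malformed signal');
  }
  return msg;
}

// In a browser, this framing would be wired to the standard API roughly like:
//   const pc = new RTCPeerConnection({ iceServers: [/* STUN/TURN servers */] });
//   const offer = await pc.createOffer();
//   await pc.setLocalDescription(offer);
//   ws.send(makeSignal('offer', myId, peerId, offer.sdp));
```

The point of the sketch is that the hard parts (codecs, encryption, NAT traversal) come from the browser for free; the ISP’s application only supplies this thin routing layer plus an addressing scheme for its user community.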

So, let’s assume the ISP completes the OTT application using WebRTC. What is the potential value-add compared to an application like Skype? One potential benefit is the capability for the user to communicate with other users that have WebRTC-enabled applications. One limitation of Skype is that it is a closed community and uses proprietary technologies. As a result, Skype users can currently only communicate with other Skype users unless they go off the network. By contrast, WebRTC provides a standards-based interface built on JavaScript APIs, so the ISP could structure their application so that it can talk to other WebRTC-enabled applications. There are also a wide variety of WebRTC-to-SIP gateways that have already been brought to market, so this offers the potential to interconnect the WebRTC-enabled application with the existing base of SIP applications. Hence, WebRTC offers the potential to help break down the silos which currently dominate multimedia communications and enable different applications to communicate either directly via WebRTC or indirectly through WebRTC-to-SIP gateways.

One way to look at WebRTC is that it offers a very robust “toolkit” of multimedia communications capabilities that can run over web interfaces. The example we have discussed in this blog of an OTT application is just one possibility of how a developer or ISP might use this toolkit. As the web development community learns to take advantage of WebRTC, there will no doubt be a wide range of applications which will emerge. On the business side, WebRTC is a disruptive technology, so we can also anticipate a wide array of different business models to emerge which will build on its open standards hooks.

WebRTC – New Communications Paradigm?

About two years ago, Google brought a new communications initiative called WebRTC to the two best known Internet standards organizations.  WebRTC is short for Web Real Time Communications and the intent is to enable complex real time communications of voice, video and data using web clients, web servers and related applications.  Google has been advancing the work both through contributions to open source libraries and by contributions to standards organizations.  As you may know, once work is accepted by standards organizations, lots of people can get involved, so this work is no longer strictly a Google initiative and has gained support and participation from many companies both large and small. 

The breakdown of work between the standards organizations has played to the strengths of two of them.  The Internet Engineering Task Force (IETF) is contributing Internet protocols to the work, and the World Wide Web Consortium (W3C) is preparing an application program interface (API) based on JavaScript.

By the second half of last year, the drumbeats promoting WebRTC sounded loudly and in recent weeks, there was an industry conference dedicated strictly to WebRTC, with more to come later this year.   I spoke at the SIPNOC conference on a WebRTC panel a couple of months back and there was lots of interest from telecom industry participants who have been busy in recent years building out real time communications using the Session Initiation Protocol (SIP).  Some articles have even touted WebRTC as the “savior” for the telecom industry, whereas other pundits have said that WebRTC is very high on the hype scale.   

One of the goals of this blog will be to cut through marketing spin and look at what is really happening in the world of communications.  In my view, WebRTC has no shortage of hype, but there is also real technical substance in the initiative, and many companies are making serious investments in WebRTC, even though many of the technical elements are nascent and the standards are not yet baked.  One key thing to keep in mind is that WebRTC is the latest attempt to bring real-time multimedia communications into the web infrastructure and make it relatively easy for web developers to add real-time communications to their applications, without having to master the intricacies of SIP.  The telecom industry has made several attempts to integrate with web developers in the past five years, but the WebRTC initiative seems more promising, since it is centered on web protocols, not telecom protocols, and much of the “plumbing” will be buried beneath the same kind of JavaScript APIs that web developers have been using for many years.

If you want a deep dive into the technical side of WebRTC, I can recommend the book “WebRTC: APIs and RTCWEB Protocols of the HTML5 Real-Time Web,” written by Alan Johnston and Dan Burnett.  They have just released a second edition, which I have not read yet, but the first edition offered a good technical overview and a nice distillation of the many standards that are being extended or developed as part of the overall initiative.  (Disclosure: I know Alan well from his work in the IETF, and we are co-authors on a current Internet Draft.)  Since this is open standards work, you can also dive even deeper and sign up for the various IETF and W3C standards lists if you want to fill up your mailbox with emails.

Circling back to the title of this post, will WebRTC truly be a new communications paradigm?   In my view, it’s too early to tell, but stay tuned and hold on tight.  This promises to be quite a ride.