Faxed: A Book Review – Ruminations

In my last post, I talked about Faxed: The Rise and Fall of the Fax Machine, written by historian and professor Jonathan Coopersmith.  I left off as fax entered the late Eighties and became wildly popular.  As Coopersmith recounts, this confounded the experts, who had expected electronic messaging, videotex or a variety of other technologies to supersede fax.

In my own work life, I’d worked for a fax company for a decade by then, but didn’t get close to the technology until I joined Product Line Management and wrote the business case for a fax modem product.  Like many companies, Fujitsu sold fax machines, but we also started developing computer-based products.  Around 1989, we released the dexNet 200 fax modem and accompanying software, called PC 210, which ran on IBM-compatible computers.  A year later, my boss sent me to the TR-29 fax standards committee, and I discovered that this group was writing standards that would have a big impact on most companies in the wild-west activity known as computer fax.  I also joined an upstart industry group, which became the International Computer Fax Association (ICFA), and started reporting to it on the standards being developed at TR-29.  Fax was hot, but Fujitsu was focused on its mainframe computer business and shut down the US-based business that employed me, Fujitsu Imaging Systems of America (FISA).  After a month of soul-searching, I decided to start a consulting business called Human Communications, which advised clients on fax and related technologies.  The ICFA was one of my first clients, and I continued to attend TR-29 and gradually built up a client list among fax machine and computer fax companies.


By late 1993, the business was going well, and that’s when I first met Jonathan Coopersmith. In his book, he talks about this period as the heyday of fax. Fax did extremely well in the US, as pizza parlors installed fax machines and offices of every size had one. But it became even more popular in Japan. The Japanese fax manufacturers competed fiercely, but also cooperated to ensure interworking between their machines.  I started attending the meetings of ITU-T Study Group 8 around this time, as we were building new extensions to the very popular Group 3 fax standard.  There was a newer digital standard called Group IV, but Group 3 took on its best attributes and basically shut Group IV out of the market.

In the mid-Nineties, the Internet and the World Wide Web exploded and began a massive transformation in the way the world communicated.  In the fax community, it was obvious to many of us that the Internet would have a huge impact, so we started a very aggressive effort to write Fax over IP standards.  Dave Crocker, a co-author of the standard for electronic mail in the Internet Engineering Task Force (IETF), came to TR-29 and asked for volunteers to begin work on Internet Fax in the IETF.  A similar effort began in the ITU.  The work proceeded from ground zero to completed standards by 1998, which was unusually fast for standards groups.

I left the fax consulting business in late 1999 and joined the Voice over IP industry.  By then, there were already signs that fax would lose its dominance. The World Wide Web completely took over the information-access role that had been played by Fax on Demand.  The chip companies stopped focusing on fax, and by the time a new version of the T.38 standard was written in 2004 to accommodate the faster V.34 modem speeds for fax over IP, the VoIP chips didn’t support it.

In Japan, as Coopersmith explains, fax had been even more dominant than in the US.  The visual nature of Japanese characters such as kanji meant that computer keyboards were much slower to develop in Japan than in the US market.  By the time I met Jonathan again in 2004, fax had begun its decline and had become more of a niche business both in the US and in Japan.  It still sells well in some market segments, and there has been a bit of a renaissance as the T.38 fax standard has kicked in to accompany Voice over IP, but the arc of technological history is now in the long-tail phase for fax.

Fax is a classic example of a technology that had many false starts — the first half of the book shows just how many there were — but eventually caught fire as all of the pieces came together for massive success. The book offers good context on all of this and many useful lessons for technologists. Great technology is never enough by itself, but when the right combination of market needs and technology comes together, amazing things can happen. Faxed, the book, tells that story for fax.


Faxed: A Book Review – Part 1

In 1993, I visited San Antonio to speak about fax at a conference on electronic commerce.  While there, I had dinner with a professor from Texas A&M University named Jonathan Coopersmith.  We had an engaging conversation about facsimile technology, and he told me that he was writing a history of the subject.  The fax business was in full ferment at the time, and I’d been busy during the past several years working on the TR-29 fax committee, which prepared US fax standards and also submitted contributions to the International Telecommunication Union (ITU), the group that defined standards for fax and other telecom technologies.

Fast forward about ten years.  Jonathan visited Needham, Massachusetts to interview the executives of Brooktrout Technology and discovered that I also worked at the company. He invited me to share lunch with him, and we talked about how much fax had changed in the prior ten years, going from the world’s hottest communications technology to one of many ways of communicating in a world now dominated by Internet-based technology.  He also said that yes, he was still working on the book, but he had taken time out to raise his family and had been sidetracked from that long-running project.  We continued to exchange messages over the next several years, notably when he visited Japan to interview sources there in person.  Around 2010, he sent me a draft of a chapter on computer fax and fax during the Nineties, and I offered some feedback.

In 2015, Jonathan got in touch.  Great news: the book was done and published.  The result is called Faxed: The Rise and Fall of the Fax Machine.  He sent me a copy, and I recently sat down and read it over the period of a few weeks. Jonathan is a historian specializing in the history of technology.  He discovered fax as a user, finding the fax machine to be a technology that even his mother could use effectively.  He’d also discovered that the existing books on fax were not written from a historical perspective, so he decided to write one.


Fax has a fascinating history.  It was invented in 1843 by Alexander Bain, a Scottish clockmaker and inventor, during the era when the telegraph was the king of communications technology.  Bain was one of several notable figures in the early days of fax; as Coopersmith notes, the idea attracted a diverse group of inventors who worked not only on fax, but also on improvements to the telegraph.  I’d been aware of Bain, but Coopersmith digs in and finds many others who advanced fax in one way or another during its first seventy years.  The technology was promising but difficult, involving aspects of mechanics, optics and electrical synchronization that tended to exceed the state of the art at the time.  The early markets for fax sprang up around World War I and its aftermath, as newspapers began to supplement written words with photographs transferred via fax and competing technologies.

As Coopersmith recounts, fax moved forward in fits and starts and consumed a great deal of financial capital in the process, but it did not produce a successful commercial market until the Sixties, when new technologies such as the Xerox photocopier made it easier for faxed documents to be copied and exchanged within businesses and other organizations.  Even in this period, there was a lack of standards, and the main markets were the US and Japan.  Xerox appeared to have all of the pieces needed to dominate the market, but it invested elsewhere, and startups began to compete for the burgeoning market of fax machines targeted at offices.

Two developments changed the landscape in a dramatic way.  First, the Carterfone decision forced AT&T to allow third-party devices to connect to the phone network and opened the way for telecom technology to advance outside of the monopolistic Bell system.  Coopersmith notes that NTT was forced to open its network in Japan just a few years later, which also encouraged a number of companies in Japan to jump into fax.  The second development was the hard set of compromises that resulted in the first widely accepted fax standard, Group 3, which was agreed within the International Telegraph and Telephone Consultative Committee (CCITT) in 1980.  With the advent of Group 3, the factories in Japan were able to standardize mass production of fax machines, and Japan became the supplier of fax machines for the world.

In the late Eighties, the sub-$1,000 fax machine debuted and the fax explosion was in full swing.  Around this time, a court in New York State accepted that faxed documents could be used in legal proceedings, and fax gained a stature that pushed other technologies like Telex aside.

During this period, I worked for Fujitsu Imaging Systems of America (FISA) and was a product line manager for a new technology called computer fax modems.  FISA had bought one of the early fax success stories from the Sixties, Graphic Sciences, from Burroughs Corporation in 1986.  This is where my story begins to intertwine with the fax history which Coopersmith recounts.  I’ll continue this review in my next post.

Need to Manage a Career Change? Try SCRUM

One of the major challenges of a career change is to manage all of the details. Whether you are looking for a new full time position or would prefer consulting assignments, you’ll need to have a clear direction and an execution strategy. Some of the tasks are to identify prospective companies and potential roles, make networking contacts within the companies, apply for positions, conduct company research and prepare for interviews.  And the list of potential activities goes on.

I found my local career center was a great resource for learning the "how-tos" of a job search in 2014, but managing and executing the activities was a full-time job in itself. A few months ago, I took training courses in two of the leading project management methodologies: PMP and SCRUM. PMP covers the classic methodology for managing large, complex projects and includes 47 different processes in the latest (fifth) edition of the Project Management Body of Knowledge (PMBOK). It's very thorough and well regarded, but it's really best for truly complex projects with lots of interactions between the steps. I also took a course in SCRUM, which is one of the Agile methodologies for managing projects. What's the difference? SCRUM is much more lightweight, has fewer processes and is designed to enable very rapid responses to change.

A few weeks after taking the courses, I decided to put these skills to work. I looked at what I needed to do in my job search and decided that SCRUM was probably a better fit for the task than PMP. Why did I choose SCRUM?  First, I liked its lightweight approach. I already had a pretty clear idea of my goal — looking for a full-time position that used my product management, marketing or project management skills. I also had lots of potential tasks every week — identifying companies, networking, creating cover letters and tailored resumes, making follow-up contacts and so on. Plus, depending upon what happened from week to week, I might need to change the emphasis — for example, doing company research for upcoming interviews and reducing the amount of prospecting for new companies. SCRUM is also useful for promoting action. I wanted to track my activities and be able to monitor progress with some visible metrics. With SCRUM, you can assess progress day by day and week by week, as the sketch below illustrates.
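To make this concrete, here is a minimal, hypothetical sketch (in Python) of how a one-week job-search "sprint" might be tracked. The task names, point values and the simple burndown metric are my own illustration, not a prescribed SCRUM artifact and not my actual backlog.

```python
# Hypothetical one-week job-search sprint backlog with a simple visible metric.
from dataclasses import dataclass

@dataclass
class Task:
    name: str
    points: int          # rough effort estimate for the task
    done: bool = False

sprint_backlog = [
    Task("Identify five target companies", 3),
    Task("Tailor resume for a product management role", 2),
    Task("Follow up with two networking contacts", 2),
    Task("Research company ahead of Friday's interview", 5),
]

def burndown(backlog):
    """Visible metric: effort points remaining vs. total for the week."""
    total = sum(t.points for t in backlog)
    remaining = sum(t.points for t in backlog if not t.done)
    return f"{remaining}/{total} points remaining this sprint"

# Mark progress during the week and check the metric day by day.
sprint_backlog[0].done = True
print(burndown(sprint_backlog))   # prints "9/12 points remaining this sprint"
```

At the end of each week, whatever is left can be re-prioritized into the next sprint, which is exactly the kind of rapid re-planning that drew me to SCRUM in the first place.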

If you’re in the process of making a change in your career, what approaches are you taking? Have you considered using project management methodologies such as SCRUM or PMP?  In my next post, I’ll talk about the steps I took to put SCRUM to work to help manage my job search.

Secure IP Fax – Now Standard

Last fall, I blogged about a pending standard for securing facsimile communications over IP networks here, and I spoke about this progress at the SIPNOC conference. Since that time, the standard, known as RFC 7345, has been approved by the Internet Engineering Task Force. The availability of a standard is very good news. There's a common perception that fax isn't used anymore, but there are a number of business-to-business (B2B) and consumer applications where fax is still common, including real estate, insurance, health care and legal applications. There are also a number of companies which provide fax by selling equipment, fax-enabling technology, software or a hosted service.

So why should people or companies care about securing IP fax? Increasingly, most of our real-time communications, whether by voice, fax, text or video, are transported over IP networks. Very often, they will travel over the Internet for a portion of their journey. The Internet is ubiquitous, but fundamentally insecure unless the application or transport layers provide security. Security can mean many different things, but it often refers to solutions for needs that include privacy, authentication and data integrity. The new RFC 7345 is designed to support these types of requirements by applying a standard known as Datagram Transport Layer Security (DTLS). One of the key reasons that the Fax over IP RFC uses DTLS is that the T.38 IP fax protocol most typically formats its signals and data using the UDP Transport Layer (UDPTL), unlike most real-time media, which use the Real-time Transport Protocol (RTP).  DTLS was designed to provide security services for datagram protocols, so it's a good fit for T.38 IP fax.  The current version of DTLS is 1.2, which is defined in RFC 6347.
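To give a flavor of what this looks like in practice, here is a simplified, illustrative sketch (in Python) of the kind of SDP media description RFC 7345 enables, using the UDP/TLS/UDPTL transport identifier so that the UDPTL packets are carried inside DTLS. The port, fingerprint and T.38 attribute values are placeholders of my own, not taken from any real implementation.

```python
# Illustrative only: build a simplified SDP media section offering T.38 fax
# over DTLS, per the UDP/TLS/UDPTL transport defined in RFC 7345.
# The port, fingerprint and T.38 attribute values below are placeholders.

def build_secure_t38_media(port, fingerprint, setup_role="actpass"):
    lines = [
        f"m=image {port} UDP/TLS/UDPTL t38",     # DTLS-protected UDPTL instead of plain "udptl"
        f"a=setup:{setup_role}",                 # which side initiates the DTLS handshake
        f"a=fingerprint:sha-256 {fingerprint}",  # certificate fingerprint used to authenticate the handshake
        "a=T38FaxVersion:0",
        "a=T38FaxRateManagement:transferredTCF",
        "a=T38FaxUdpEC:t38UDPRedundancy",
    ]
    return "\r\n".join(lines) + "\r\n"

if __name__ == "__main__":
    placeholder_fp = ":".join(["AB"] * 32)       # dummy SHA-256 fingerprint for illustration
    print(build_secure_t38_media(6800, placeholder_fp))
```

The point of the example is simply that the familiar T.38 attributes stay the same; what changes is the transport identifier, plus the DTLS setup and fingerprint attributes that let the endpoints authenticate each other before any fax data flows.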

Getting a standard approved is really only the beginning. In order to get traction in the marketplace, there need to be implementations. For example, T.38 was originally approved in 1998 by the International Telecommunication Union, but implementations did not become common until many years later, starting around 2005. In the time since, T.38 has become the most common way to send fax over IP networks, and it's been adopted by most of the fax ecosystem.  On the plus side, a key advocate for the new standard is the Third Generation Partnership Project (3GPP), which is the standards group that drives standardization of services that will run over mobile networks, such as the emerging Long Term Evolution (LTE) network.  The SIP Forum is also continuing work on its SIPconnect interworking agreements, and there is potential for including the new standard in a future version of SIPconnect.

I'll continue to track what's happening with respect to implementation of the standard.  As I noted in some of my previous posts, the current work on standardizing WebRTC is helping implementors gain experience with important new standards for security, codecs and Network Address Translation (NAT) traversal. This WebRTC "toolkit" is also available in open source form.  The inclusion of DTLS in RFC 7345 joins the pending RTCWeb standards in providing new applications and use cases for these emerging standards. This will be good news for the user community, as features which were previously available only in proprietary products get implemented in a variety of products and services.  If you know of any plans in motion or want to learn more, please feel free to comment or get in touch with me.  You can also learn more by checking out my presentation on Securing IP Fax.

On the Road Again – SIPNOC 2014

I’ll be speaking next week at the SIPNOC conference in Herndon, Virginia.  SIPNOC is sponsored by the SIP Forum and covers a wide variety of topics related to SIP — the Session Initiation Protocol — with a particular focus on the needs of service providers.   It runs from June 9 – 12.

WebRTC continues to be a hot topic in the telecom industry, and I'll be on a panel with several other participants to discuss the relationship between SIP and WebRTC.   SIP has been the primary protocol for Voice over IP and is widely deployed.  WebRTC is much newer, but it offers an interesting mix of audio, video and data capabilities and can be accessed via popular browsers from Google and Mozilla.  WebRTC also has a rapidly growing ecosystem.  Are SIP and WebRTC complementary standards that work well together, or are they headed in totally different directions?  Come to the panel and find out!

I am also delivering a presentation on a very exciting development in IP fax communications over SIP.  The presentation is entitled: Securing IP Fax – A New Standard Approach.  It’s been a long time coming, but there will soon be a new security standard for implementors of IP Fax over SIP networks.  In particular, the Internet Engineering Task Force is working on using an existing security standard known as DTLS and adding this as a security layer for T.38 fax.    I’ll be talking about the pending standard, why it’s needed and what kind of benefits can be expected for the many users of T.38 IP fax once the new standard is deployed.

I've attended SIPNOC as a speaker since its beginning four years ago.  It's an excellent conference and offers an in-depth perspective on the latest news in SIP, delivered by an all-star cast of speakers.  I hope you'll be able to join us.

Lean Six Sigma – Taking it Forward

This is the second post in a two-part review of the discipline called Lean Six Sigma.  The first part of the discussion can be found here.

When I was in college at RPI, I segued from the core Engineering curriculum into a degree called Management Engineering. This degree had elements of Industrial Engineering, but it also dove very deep into computers, statistics and operations research. As a result, graduates could pursue a variety of career paths, including manufacturing, software development and various quantitative careers. I chose the software development direction, but also used the problem-solving skills I'd learned. Later, a manager noticed those problem-solving skills and offered me a position in operations management. I worked with teams to solve problems, organize processes and eliminate waste, but we did it on an ad hoc basis and had nothing like a Lean Six Sigma methodology to guide us. I'm proud of the work we did, but I soon got interested in products and moved into product management and R&D. In software project management, I tracked defects using databases and graphics, but never dove back into the statistics that I'd enjoyed so much in college.

I later shifted into product management for other products. In one particular case, we had a multi-million-dollar customer who was very upset because the production process for an embedded voice mail system had fallen out of control. I was asked to solve the problem and took over program management for the product. I worked closely with our customer, manufacturing, sales and our quality department. We listened carefully to the customer and improved both the quality and efficiency of the production process. Within a year, the customer awarded our team a quality award in recognition of our progress. Once again, I'd worked with a team to solve problems, this time for an external customer. Our quality team used a number of statistical techniques to demonstrate our improved process quality, and we used Pareto charts to identify and solve major problems. Sounds a lot like Lean Six Sigma.  We listened carefully to our customer and let them guide us on which problems were the most important to solve.

Fast forward to this year and my participation in this course. The cool thing about Lean Six Sigma for me is that it takes ALL of the skills I learned in my university studies, plus lots of hard-earned lessons from different points in my career, and weaves them together into a coherent methodology that is great for solving problems.  The review of statistics and other analytical tools in the course was an excellent refresher — I'd studied most of these techniques at RPI — and the techniques are highly relevant in today's business environment. Analytics and quantitative analysis are hot in a wide variety of fields today, including politics, social media, medical devices, telecom and marketing. Companies like people with experience, but they like it even more if people can analyze data and use it to back up their ideas. The course also offered a variety of approaches for getting information from customers, including data-driven approaches such as surveys and interviews. This ties well into contemporary marketing, where listening to the Voice of the Customer is critical and analytical tools such as A/B testing of new marketing campaign ideas are increasingly common.

In short, I’m really glad I took the Lean Six Sigma course. It’s equipped me with a useful philosophy for process and quality improvement and the tools we studied should be useful for numerous business situations. Lean Six Sigma. Check it out.

Lean Six Sigma – First Take

I just finished a two-week course and now hold a certification known as the Lean Six Sigma Green Belt. I'd been running into a few people with Lean Six Sigma backgrounds while networking over the last several months, but didn't really understand what it was all about until I took this course. I now have a much better appreciation for what I've been missing, and I'm amazed by the degree to which this particular cluster of methodologies winds like a river through many different elements of my education and career.

Lean Six Sigma is a combination of two movements. Lean is an approach to improving processes by analyzing and removing various types of waste. But that's not all. It can also be used to assess a product or process and determine which elements provide value for customers. I'd been thinking Lean Six Sigma was just a manufacturing thing — a common misconception — but here they were talking about customer value and the Voice of the Customer. So Lean is relevant to customers and, therefore, can be highly useful for people in marketing and product positions like me. What about Six Sigma?

Six Sigma dates back to the Eighties, when Dr. Mikel Harry of Motorola put together a variety of quality and statistical approaches aimed at helping organizations greatly improve the quality of their processes. The term Six Sigma comes from the statistical world, where sigma is another word for standard deviation. A six sigma process is highly accurate, producing on average only 3.4 defects per million opportunities (a figure that assumes the conventional 1.5 sigma long-term shift in the process mean). At one time, Six Sigma and Lean were separate movements, but organizations soon saw the value in using Six Sigma techniques to improve the quality of their processes and Lean to reduce waste, eliminate unnecessary process costs and, in general, have much more efficient processes.
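For readers who want to see where that 3.4 comes from, here is a small Python sketch of my own (not from the course material): it computes the upper tail of the normal distribution at six sigma minus the assumed 1.5 sigma shift, i.e. at 4.5 sigma.

```python
# Where the "3.4 defects per million" figure comes from: the upper tail of a
# standard normal distribution at (6 - 1.5) = 4.5 sigma, using the
# conventional 1.5-sigma long-term shift assumed in Six Sigma practice.
from math import erfc, sqrt

def upper_tail(z):
    """P(Z > z) for a standard normal random variable."""
    return 0.5 * erfc(z / sqrt(2))

def dpmo(sigma_level, shift=1.5):
    """Defects per million opportunities for a one-sided specification limit."""
    return upper_tail(sigma_level - shift) * 1_000_000

print(f"Six sigma with 1.5-sigma shift: {dpmo(6.0):.1f} DPMO")      # about 3.4
print(f"Six sigma with no shift:        {dpmo(6.0, 0.0):.4f} DPMO")  # about 0.001
```

Without the shift, a true six sigma process would produce only about one defect per billion opportunities; the widely quoted 3.4 figure is the more conservative, shifted version.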

It turns out there’s a lot to learn. Green Belts get introduced to the smorgasbord of Lean and Six Sigma techniques, but true mastery of the tools takes more learning and experience — hence the use of the term Black Belt.  The overall Lean Six Sigma philosophy and collection of tools strikes me as being valuable for people in a wide variety of disciplines.  I’ll talk more about how Lean Six Sigma relates to my own background and today’s business needs in my next post.